

It is all about integration - quarterly update

System integration in the future part 2


Are integration requirements really dead, or even declining? Some might say that our traditional ways of integrating applications are dead, but the truth is that the need for integration is increasing at a very high pace as we digitize our work and processes, connect more and more diverse types of equipment, and innovate at an ever-increasing speed.

However, the way we work within the IT business, and therefore also within the integration area, is changing. We must complement our traditional integration methods, processes and techniques with new ones. I say complement because we still need most of the building blocks we already have: our total IT landscape will not change overnight, and many of the concerns addressed by traditional integration technologies are still valid. Isolating applications from each other, letting applications develop at their own pace, and having an integration layer that reduces complexity when applications use different formats, protocols and techniques are still good practices.

The modern integration landscape is built on a number of key areas. A big part of this is how different parts of the organization work together and how we do application development and integration; we then complement this with a few additional techniques. Doing this right puts us in a better position for future challenges.

  • IC4E
  • Microservices
  • Containers and serverless
  • Event streams
  • API Management


IC4E

The very first step in the transition to a more modern integration approach is to look at more agile methods and to empower application development teams to do more. Instead of having an Integration Competence Center (ICC) that does the integration for them, we shift the focus to an Integration Center for Enablement (IC4E) that supports agile application integration methods.

This trend has been going on for some time, but we are now fortunate that technologies truly support this direction. By using containers, or going one step further with serverless computing, we can package our applications so that they can be deployed without worrying about dependencies outside the container or function. Application teams become much more efficient, as a lot of automation can be applied using, for instance, Jenkins or a cloud-native CI/CD pipeline such as Tekton.
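As a minimal sketch of this packaging idea, a container image can bundle a service together with all of its dependencies. The base image and file names below are illustrative, not from a specific project:

```dockerfile
# Hypothetical Python service packaged with its dependencies inside the image,
# so it can be deployed anywhere a container runtime is available.
FROM python:3.12-slim
WORKDIR /app

# Install the service's dependencies into the image itself.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and define how the container starts.
COPY . .
CMD ["python", "service.py"]
```

A CI/CD pipeline such as Jenkins or Tekton can then build, test and push this image automatically on every commit.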

When application teams become more efficient, they will not have the patience to wait for an integration to be developed for them. This implies that more integration needs to be part of the application itself, for instance by using events and APIs.



Microservices

The application can also be assembled from smaller parts, which can be managed individually. Each part has its own life cycle and can be operated on its own, which makes it easier to create new functionality, make corrections, find problems and handle operational tasks such as scalability, performance and availability. Taken to its full extent, this breaks the application down into microservices, where each service is managed separately. This is not suitable for all applications but very useful for others. In an integration context, it can sometimes be useful to think of an integration as a microservice.

It may very well be that an integration doesn’t fully comply with every definition of a microservice, but to me that doesn’t really matter, as the same useful approach still applies.


Containers and serverless

All major cloud providers, such as AWS, Azure, Google Cloud and IBM Cloud, support containers and serverless computing. If you want to run, or end up running, a multi-cloud or hybrid cloud solution, a better choice is the Red Hat OpenShift platform, which gives you the freedom to run your applications anywhere you want and prevents vendor lock-in.

So far we have not discussed integration techniques in much detail, but by now you probably see that the changes described above affect which integration techniques to use. We therefore need to add a couple of newer technologies: event streams and API management. These technologies are well documented and easily understood, they can be consumed directly by developers, and the integration is considered part of the application. This helps application development teams become more efficient.


Event streams

Event streaming platforms such as Kafka are ideal for a number of use cases. The first that comes to mind: when you have divided your application into smaller parts or microservices, you do not want synchronous relations between every service, as those are hard to manage and scale. A better choice is to publish an event that the other services can consume when they have the need and capacity for it. Kafka is very well suited for high-volume event streaming and comes in many different packages, such as IBM Event Streams and Red Hat AMQ Streams, and it is also available as an open source project. Other use cases for event streaming include reacting to real-time events, capturing all incoming events that you want to land in your data lakes, and connecting your legacy applications with new services in the cloud, just to mention a few.
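The decoupling idea can be sketched with an in-memory queue standing in for a Kafka topic. In a real system you would use a Kafka client library instead; the service names here are hypothetical:

```python
import queue
import threading

# In-memory stand-in for a Kafka topic; in production this would be a
# real topic accessed through a Kafka producer/consumer client.
orders_topic = queue.Queue()

def order_service():
    # Publishes an event instead of calling the shipping service synchronously.
    orders_topic.put({"order_id": 42, "status": "created"})

def shipping_service(results):
    # Consumes the event when it has the need and capacity to do so.
    event = orders_topic.get(timeout=1)
    results.append(f"shipping order {event['order_id']}")

results = []
order_service()  # the producer publishes and moves on immediately
consumer = threading.Thread(target=shipping_service, args=(results,))
consumer.start()
consumer.join()
print(results)
```

Because the producer never waits for the consumer, each service can be scaled and operated independently, which is exactly the property that synchronous calls between microservices lack.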


API management

Not everything can be handled through events, though. Most of the IT industry has adopted the concept of modern APIs based on the OpenAPI standard, which specifies how RESTful APIs should be described. If you look at most services or products today, they offer APIs that you can use to consume their services.
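For illustration, a minimal OpenAPI 3.0 description of a hypothetical Orders API might look like this (the API itself is an invented example):

```yaml
openapi: 3.0.3
info:
  title: Orders API          # hypothetical example API
  version: 1.0.0
paths:
  /orders/{id}:
    get:
      summary: Retrieve a single order
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: integer
      responses:
        "200":
          description: The requested order
```

Consumers can discover the operations, parameters and response shapes from this one document, which is a large part of what makes such APIs easy to consume.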

Also, when building your own applications, you may want to expose APIs for others to consume. APIs should be, and normally are, very easy to use, as they are designed with usability in focus. This has made the technique very popular.

With this, the risk of losing control over your integration landscape, and even worse, over how your company’s valuable information is used, increases if not managed properly. Good practice is to have a strategy for how APIs should be used, policies and directives that define what is allowed, and an API management platform to manage all aspects of API usage. Such a platform covers how to secure the APIs, how to manage traffic to them, portals with a catalogue of the APIs so that users can find information about them and manage their usage, and finally analytics tools to monitor usage.
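One of the traffic-management policies such a platform typically enforces is rate limiting per consumer. A minimal token-bucket sketch of the idea (an illustrative implementation, not any specific product’s API):

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter, the kind of traffic policy an
    API management gateway might enforce for each API consumer."""

    def __init__(self, capacity, refill_per_second):
        self.capacity = capacity
        self.tokens = capacity
        self.refill_per_second = refill_per_second
        self.last_refill = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at the bucket capacity.
        self.tokens = min(
            self.capacity,
            self.tokens + (now - self.last_refill) * self.refill_per_second,
        )
        self.last_refill = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True  # request is within the consumer's allowance
        return False     # request is throttled

# Allow bursts of up to 3 requests, refilling one token per second.
bucket = TokenBucket(capacity=3, refill_per_second=1)
decisions = [bucket.allow() for _ in range(5)]
print(decisions)  # first 3 requests allowed, the rest throttled
```

A real gateway applies such a policy per API key or consumer plan, alongside the security, catalogue and analytics features mentioned above.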

Stay tuned and follow the next blog in this series, covering IaaS & Cloud based Integration, which will be published soon.

You can also get deeper insights by reading our information about API management.

Mats Andersson
Head of IBM Integration