Three architectural patterns using Kafka

In this blog post I explain some of the most important architectural patterns and use cases for Kafka.

Author: Mario Maric

What sets this technology apart is its versatility. Beyond realizing event-driven architectures (EDA), a variety of integration patterns can be implemented with Kafka. Event Sourcing, Event Distribution and Event Processing are typical forms of EDA. In this article, however, I focus on the lesser-known application areas of master data integration, mainframe offloading and audit trails.

1) Master Data Integration

Master data integration is a well-known problem in enterprise integration, and in modern application landscapes it is becoming increasingly important: less agile backend applications need to be connected with fast-moving frontend applications. This can be achieved by making the data centrally available via event streaming. Replicating data is no longer frowned upon; Kafka is now mature enough that data can be replicated almost as quickly and as often as required.
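One common building block for this is a compacted Kafka topic keyed by the master-data entity's ID, so that the latest state of every record stays centrally available. The following is a minimal sketch using the standard Java producer; the broker address, the topic name `customer-master` and the payload are illustrative, not taken from a real system:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class MasterDataPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // illustrative broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key by the master-data entity ID so that log compaction keeps
            // the latest state of each record available to all consumers.
            producer.send(new ProducerRecord<>("customer-master",
                    "customer-4711",
                    "{\"id\":\"customer-4711\",\"name\":\"Example AG\",\"status\":\"active\"}"));
        }
    }
}
```

For this pattern the topic would be created with `cleanup.policy=compact`, so Kafka retains at least the most recent value per key, which is exactly what a master-data lookup needs.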

The biggest challenge with this type of integration is designing the right transformation logic: it must provide an easily maintainable integration interface while giving the frontend as much flexibility as possible. Put simply, the aim is to expose the standardized backend process to the frontend in a flexible form and to enrich it with intelligence. Context-aware integration via event streaming is well suited to this, as the sketch below shows.
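As a minimal sketch of such transformation logic, a Kafka Streams application can read the standardized backend topic and publish a frontend-friendly view. The topic names, broker address and the trivial mapping step are placeholders for a real enrichment:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class FrontendProjection {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "frontend-projection");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // illustrative
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read the standardized backend feed and publish a frontend-friendly view.
        KStream<String, String> backend = builder.stream("customer-master");
        backend
                .mapValues(value -> value.toUpperCase()) // placeholder for real mapping/enrichment
                .to("customer-frontend-view");

        new KafkaStreams(builder.build(), props).start();
    }
}
```

The key design choice is that the transformation lives in its own deployable stream processor, so the backend interface can stay stable while the frontend view evolves independently.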

[Figure Stammdatenintegration.png: Master data integration is gaining in importance. Less agile backend applications need to be connected with fast-moving frontend applications.]

2) Mainframe Offloading

Mainframes are complex and expensive to operate, and their integration with other systems is usually inflexible. Kafka can remedy this by making mainframe data centrally available, so that read access no longer has to hit the mainframe itself. Since mainframe costs are typically driven by consumed processing time, offloading reads in this way can save quite a bit of money.

Both master data integration and mainframe offloading ensure that data, and where necessary services, are available around the clock, because the applications behind them are decoupled by Kafka. During a downtime of the source application, the data remains buffered in Kafka and is still available, at least for reading, as the consumer sketch below illustrates. As soon as the underlying application is back, the data can be synchronized again.
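Here is a minimal sketch of read access against such an offloaded dataset, assuming a hypothetical `mainframe-accounts` topic that a replication pipeline keeps filled: the consumer reads the full replicated state from Kafka and never touches the mainframe itself. Broker address and group ID are illustrative:

```java
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class OffloadedReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // illustrative broker address
        props.put("group.id", "reporting-service");        // illustrative consumer group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        // Start from the beginning of the topic: the full replicated state
        // is readable even while the mainframe itself is down.
        props.put("auto.offset.reset", "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("mainframe-accounts"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                records.forEach(r -> System.out.printf("%s -> %s%n", r.key(), r.value()));
            }
        }
    }
}
```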

Combined, mainframe offloading and master data integration can, in the best case, allow the functionality of legacy applications to be replaced step by step.

[Figure Mainframe-Offloading.png: Mainframes are complex and expensive to operate, and their integration with other systems is usually inflexible. Kafka can help here.]

3) Audit Trail

Kafka is ideally suited for building audit trails that are sound both from a business and from a technical perspective. Applications, sensors, user interactions, infrastructure components, databases and so on can send very fine-grained, frequent and specific events to an audit topic in Kafka, for example as sketched below.
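A minimal sketch of such an emitter, using the standard Java producer; the topic name `audit-trail`, the source name and the event payload are hypothetical:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Instant;
import java.util.Properties;

public class AuditEmitter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // illustrative broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // acks=all: an audit event counts as recorded only once it has been
        // replicated, so nothing is silently lost.
        props.put("acks", "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            String event = String.format(
                    "{\"source\":\"billing-service\",\"action\":\"invoice.read\",\"user\":\"u123\",\"at\":\"%s\"}",
                    Instant.now());
            // Keying by source keeps all events of one emitter in order
            // within their partition.
            producer.send(new ProducerRecord<>("audit-trail", "billing-service", event));
        }
    }
}
```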

Kafka provides the technical performance to make this possible at scale: imagine monitoring every application and every data access in the company. Kafka's properties also ensure that the audit trail meets the technical requirements. The sequence of events is retained, the data is available almost immediately, and downstream systems can be connected seamlessly, e.g. long-term storage such as Hadoop, stream processing frameworks such as Spark or Flink, or monitoring and alerting tools such as Prometheus, Grafana, Kibana or Splunk. A suitable audit topic could be provisioned as sketched below.
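The following sketch provisions such a topic with Kafka's AdminClient; the partition count, replication factor and the decision to keep events indefinitely (`retention.ms=-1`) are assumptions that would be tuned per use case:

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.List;
import java.util.Map;
import java.util.Properties;

public class AuditTopicSetup {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // illustrative broker address

        try (AdminClient admin = AdminClient.create(props)) {
            NewTopic audit = new NewTopic("audit-trail", 6, (short) 3)
                    // retention.ms=-1 keeps audit events indefinitely; in practice
                    // they would also be tiered off to long-term storage such as Hadoop.
                    .configs(Map.of("retention.ms", "-1"));
            admin.createTopics(List.of(audit)).all().get();
        }
    }
}
```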

In times of increasing regulation and the interconnection of countless services and devices, traceable auditing is indispensable. With Kafka, a technology is available that meets the necessary technical requirements.

[Figure Audit-Trail.png: Kafka is ideally suited for creating audit trails that are sound both from a business and from a technical perspective.]

Conclusion

Kafka is a valuable technology that has already proven itself in numerous use cases across a wide range of industries. Besides enabling flexible, future-oriented architectures and cost savings, it above all helps bring more flexibility and agility into the organization. As with any technology, however, this value is only realized when the implementation is properly tailored to the use case.