Enterprise-level integration service Event Mesh at a Big 4 bank
A Tier 1 Australian bank provides banking and financial products and services to over 8.5 million retail and business customers, operating across 33 markets.
The integration services team is responsible for providing enterprise-grade integration capabilities for all systems across the bank.
One of the bank's core data systems is the Event Mesh: a Confluent Kafka cluster used as the backbone for real-time eventing between on-premises, GCP, and AWS environments.
Eliiza data engineering was asked to provide Kafka expertise, offering guidance to the existing team and designing and implementing solution accelerators.
The Event Mesh Framework: a set of lightweight libraries to speed up development and maintenance of Kafka applications. These utilities included:
- Adapters and data-format converters to produce data (JSON, Protobuf, fixed-length, XML) into the Event Mesh (Kafka platform).
- Integration with the Kafka Streams API to create stream-processing applications.
- Integration with the Confluent Schema Registry, plus utilities for schema handling and testing.
- A stream processor bridging the bank's innovation arm and its mainframe fraud-detection system.
- A Notification Broker: a hybrid of an adapter and a stream-splitting processor.
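The framework's converter utilities themselves are internal to the bank, but as an illustrative sketch, a fixed-length record adapter in this style might slice a mainframe-style record into named fields that a producer could then serialise (e.g. to JSON) before publishing to Kafka. All class, field, and layout names below are hypothetical:

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Illustrative sketch of a fixed-length data-format converter, similar in
 * spirit to the framework's adapters (names and layout are hypothetical).
 */
public class FixedLengthConverter {

    /** Field name -> field width in characters, in record order. */
    private final Map<String, Integer> layout;

    public FixedLengthConverter(Map<String, Integer> layout) {
        this.layout = layout;
    }

    /** Slices the record into trimmed field values according to the layout. */
    public Map<String, String> convert(String record) {
        Map<String, String> fields = new LinkedHashMap<>();
        int offset = 0;
        for (Map.Entry<String, Integer> field : layout.entrySet()) {
            int end = Math.min(offset + field.getValue(), record.length());
            fields.put(field.getKey(), record.substring(offset, end).trim());
            offset = end;
        }
        return fields;
    }

    public static void main(String[] args) {
        // Hypothetical payment-record layout: 10-char account, 4-char type,
        // 8-char amount in cents.
        Map<String, Integer> layout = new LinkedHashMap<>();
        layout.put("accountId", 10);
        layout.put("txnType", 4);
        layout.put("amountCents", 8);
        FixedLengthConverter converter = new FixedLengthConverter(layout);
        System.out.println(converter.convert("0012345678DBIT00015000"));
        // → {accountId=0012345678, txnType=DBIT, amountCents=00015000}
    }
}
```

In a real adapter, the resulting field map would be handed to a serialiser and a Kafka producer; that wiring is omitted here to keep the sketch self-contained.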
The creation of these solution accelerators has helped lower the barrier to entry to the Event Mesh platform for engineering teams.
- The bank's integration services team is now opening up the Stream Processor Framework (a layer on top of the Kafka Streams API) for other engineering teams to use.
- Engineering onboarding to the Event Mesh has been simplified through framework samples and documentation delivered as code.
- The fraud detection and notification broker solutions are in production and delivering value to those engineering feature teams.
- The next step is to deliver ksqlDB as a self-service capability, allowing engineers who lack the capacity or skills for Java/Kafka Streams to develop their stream processors in SQL.
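To illustrate what self-service ksqlDB could offer such teams, a stream processor that would otherwise require a Java application can be expressed in a few SQL statements. This is a sketch only; the topic and column names below are hypothetical, not taken from the bank's platform:

```sql
-- Declare an existing Kafka topic as a ksqlDB stream
CREATE STREAM payments (
    account_id VARCHAR KEY,
    amount DECIMAL(12, 2),
    channel VARCHAR
) WITH (KAFKA_TOPIC = 'payments', VALUE_FORMAT = 'JSON');

-- Continuous query: route high-value payments to their own topic
CREATE STREAM large_payments WITH (KAFKA_TOPIC = 'large-payments') AS
    SELECT account_id, amount, channel
    FROM payments
    WHERE amount > 10000
    EMIT CHANGES;
```

The second statement runs as a persistent query inside the ksqlDB cluster, so the owning team operates no JVM application of its own.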