This article covers how Domino Technologies recently implemented an HTTP Event Collector (HEC) solution for a banking client.
Project Overview
The HTTP Event Collector (HEC) project was designed to create a highly scalable API platform that serves as a single point of entry for event data. The system was built to manage the high-velocity, real-time flow of data into the Kafka streaming platform. The HEC microservice exposes a REST endpoint through which both internal and external consumers and applications can publish data and events seamlessly.
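For illustration, here is a minimal sketch of what such an ingestion endpoint could look like using Spring WebFlux and reactor-kafka. The topic name, header, and payload shape are assumptions for the example, not the bank's actual schema or implementation.

```java
// A minimal sketch of an HEC-style ingestion endpoint using Spring WebFlux
// and reactor-kafka. Names below ("customer-events", "X-Consumer-Id") are
// illustrative assumptions, not the actual implementation.
import org.apache.kafka.clients.producer.ProducerRecord;
import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestHeader;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;
import reactor.kafka.sender.KafkaSender;
import reactor.kafka.sender.SenderRecord;

@RestController
@RequestMapping("/events")
public class EventCollectorController {

    // A KafkaSender bean is assumed to be configured elsewhere in the application.
    private final KafkaSender<String, String> sender;

    public EventCollectorController(KafkaSender<String, String> sender) {
        this.sender = sender;
    }

    // Consumers POST a JSON event; it is forwarded to Kafka without blocking a thread.
    @PostMapping
    @ResponseStatus(HttpStatus.ACCEPTED)
    public Mono<Void> publish(@RequestHeader("X-Consumer-Id") String consumerId,
                              @RequestBody String eventJson) {
        SenderRecord<String, String, String> record = SenderRecord.create(
                new ProducerRecord<>("customer-events", consumerId, eventJson),
                consumerId); // correlation metadata returned with the send result
        return sender.send(Mono.just(record)).then();
    }
}
```

Responding with 202 Accepted fits the fire-and-forget nature of event collection: the endpoint completes once Kafka acknowledges the record, keeping latency low for publishers.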
Client Information
The client is a bank aiming to improve its data analytics capabilities. A key challenge it faced was capturing and understanding customer actions after login. This data would enable the bank to identify popular customer actions, informing improvements to its services and products.
Role and Responsibilities
API Architect Rajesh Yantrapati played a pivotal role in the project. His core responsibilities included:
- Leading the design and implementation of scalable, high-performance cloud solutions on the Azure platform for the bank. This included overseeing cloud migrations and crafting solutions to enhance performance and cost efficiency.
- Architecting and implementing distributed systems and event-driven architectures. This encompassed building robust backend systems and optimizing data storage solutions.
- Designing and implementing CI/CD pipelines using tools like Azure DevOps, GitHub Actions, and Jenkins. This streamlined the deployment process and accelerated time-to-market.
- Developing microservices for critical functions such as logging, error handling, auditing, and authentication using technologies like Spring Reactive, REST, and GraphQL.
- Integrating advanced logging and monitoring tools like Elasticsearch, Splunk, and AppDynamics to enable proactive system diagnostics and insights.
- Engineering Hazelcast-based solutions for event messaging microservices, facilitating efficient distributed caching and improving system performance (a simplified sketch follows this list).
- Providing overall technical leadership in microservices design and cloud development.
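As one illustration of the Hazelcast work mentioned above, the sketch below shows a cluster-wide deduplication check that an event messaging microservice might perform before forwarding an event. The map name and TTL are assumptions for the example.

```java
// A minimal sketch of Hazelcast-backed deduplication for an event messaging
// microservice. The map name and 10-minute TTL are illustrative assumptions.
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.map.IMap;

import java.util.concurrent.TimeUnit;

public class EventDeduplicator {

    private final IMap<String, Boolean> seenEvents;

    public EventDeduplicator(HazelcastInstance hazelcast) {
        // IMap is partitioned across the cluster, so every service instance
        // shares the same view of recently seen event IDs.
        this.seenEvents = hazelcast.getMap("seen-event-ids");
    }

    /** Returns true if this event ID has not been seen in the last 10 minutes. */
    public boolean firstSeen(String eventId) {
        // putIfAbsent is atomic cluster-wide; the TTL evicts old entries
        // so the map does not grow without bound.
        return seenEvents.putIfAbsent(eventId, Boolean.TRUE, 10, TimeUnit.MINUTES) == null;
    }

    public static void main(String[] args) {
        HazelcastInstance hz = Hazelcast.newHazelcastInstance();
        EventDeduplicator dedup = new EventDeduplicator(hz);
        System.out.println(dedup.firstSeen("evt-123")); // true
        System.out.println(dedup.firstSeen("evt-123")); // false (duplicate)
        hz.shutdown();
    }
}
```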
Project Execution
The project was executed in several key phases:
- API Development and Deployment: The initial stage involved developing and deploying the APIs to the bank’s on-premises infrastructure and onboarding the first set of consumers.
- Cloud Migration: Subsequently, the APIs were migrated to the Azure cloud, where new consumers continued to be onboarded.
- Consumer Integration: The final phase focused on integrating and onboarding both internal and external consumers to leverage the API for real-time event publishing (a sample publish call is sketched after this list).
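To illustrate the consumer integration phase, the sketch below shows how an onboarded publisher might POST an event to the HEC endpoint using Java's built-in HTTP client. The URL, header, and payload are hypothetical.

```java
// A minimal sketch of a consumer publishing an event to the HEC REST endpoint.
// The URL, header, and payload below are illustrative assumptions.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class EventPublisher {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        String event = """
                {"action": "view_statement", "customerId": "c-42", "timestamp": "2024-01-15T10:32:00Z"}
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://hec.example-bank.internal/events"))
                .header("Content-Type", "application/json")
                .header("X-Consumer-Id", "mobile-banking-app") // identifies the publisher
                .POST(HttpRequest.BodyPublishers.ofString(event))
                .build();

        HttpResponse<Void> response = client.send(request, HttpResponse.BodyHandlers.discarding());
        System.out.println("HEC responded with status " + response.statusCode()); // expect 202 Accepted
    }
}
```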
The project drew on microservices architecture design patterns and a range of technologies, including Java, Spring Boot, Kubernetes, Kafka, Spring Reactive, and Hazelcast.
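To show how these pieces fit together downstream, here is a minimal sketch of a Kafka consumer that tallies post-login actions from the event topic, the kind of aggregation that underpins the insights described in the next section. The topic name, group ID, and "action" field are assumptions for the example.

```java
// A minimal sketch of a downstream analytics consumer. The topic name,
// group ID, and "action" JSON field are illustrative assumptions.
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ActionCounter {

    private static final Pattern ACTION = Pattern.compile("\"action\"\\s*:\\s*\"([^\"]+)\"");

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "action-analytics");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        Map<String, Long> actionCounts = new HashMap<>();

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("customer-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                records.forEach(record -> {
                    // A real service would parse the JSON with Jackson or similar;
                    // a regex keeps this sketch dependency-free.
                    Matcher m = ACTION.matcher(record.value());
                    if (m.find()) {
                        actionCounts.merge(m.group(1), 1L, Long::sum);
                    }
                });
                if (!records.isEmpty()) {
                    System.out.println("Action counts so far: " + actionCounts);
                }
            }
        }
    }
}
```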
Impact and Outcomes
The project delivered substantial benefits to the client. The bank gained the ability to collect detailed insights into customer behavior, including the most common actions performed after login. These insights empowered the bank to make data-driven decisions to enhance its products and services.
The project was a significant achievement for the bank, establishing a robust framework for collecting and analyzing customer event and action data. Moreover, it provided invaluable technical knowledge that can be applied to future implementations of similar event-driven applications.
Key Learnings
The project underscored several key learnings:
- The critical importance of scalable, cloud-based solutions for managing high-volume, real-time data.
- The effectiveness of event-driven architectures in capturing and processing customer interaction data.
- The necessity of advanced logging and monitoring tools for maintaining system health and real-time issue diagnosis.
- The transferability of the technical insights gained to a wide array of event-driven applications, enabling the creation of scalable, robust backend systems.