Case Study

Real-Time Data Streaming, Routing, and Processing Using Kafka and Snowflake

Overview

A top-tier bank’s legacy messaging infrastructure posed multiple challenges in handling growing data volumes – Transaction Data, Customer Data, New Loan Application Requests, KYC Data, etc. Hence, activating any new digital experience using the existing legacy infrastructure meant enabling high volumes of asynchronous data processing. Traditional messaging middleware like Message Queues (MQs), Enterprise Service Buses (ESBs), and Extract, Transform and Load (ETL) tools were unable to provide the necessary support that modern applications demand.

Modern Applications Require Asynchronous, Heterogeneous Data Processing

What is Asynchronous Data Processing?

Asynchronous processing allows the system to handle multiple loan applications simultaneously without waiting for each application to complete. This means that while one application is being reviewed, others can continue to be processed in parallel.

For example, when a borrower applies for a mortgage loan through an online lending platform, the backend must be capable of collecting required documents and information, such as income statements, tax returns, credit reports, property details, and employment history.

When the borrower submits their application, the system immediately acknowledges receipt and starts the process. Meanwhile, in the background, the system also asynchronously verifies employment history, orders a credit report, and assesses property value.
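The flow above can be sketched with Python's asyncio: acknowledge the application immediately, then run the independent checks concurrently instead of one after another. This is a minimal illustration only; the service names and delays are hypothetical stand-ins for real back-office calls.

```python
import asyncio

# Hypothetical verification steps for a submitted loan application.
# Each coroutine stands in for an independent back-office service call.
async def verify_employment(app_id: str) -> str:
    await asyncio.sleep(0.1)  # placeholder for an HR/payroll service call
    return f"{app_id}: employment verified"

async def order_credit_report(app_id: str) -> str:
    await asyncio.sleep(0.1)  # placeholder for a credit-bureau API call
    return f"{app_id}: credit report ordered"

async def assess_property_value(app_id: str) -> str:
    await asyncio.sleep(0.1)  # placeholder for a valuation-service call
    return f"{app_id}: property value assessed"

async def process_application(app_id: str) -> list[str]:
    # Acknowledge receipt right away, then run all checks concurrently
    # rather than waiting for each one to finish before starting the next.
    print(f"{app_id}: application received")
    return await asyncio.gather(
        verify_employment(app_id),
        order_credit_report(app_id),
        assess_property_value(app_id),
    )

results = asyncio.run(process_application("LOAN-1001"))
```

Because the three checks run concurrently, total latency is close to the slowest single check rather than the sum of all three, and other applications can be processed in parallel the same way.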

Why Enable Data Streaming, Routing, and Processing Using APIs?

With the implementation of a Digital API Hub, the legacy messaging middleware is integrated with modern event-streaming automation tools. It can then be rolled out enterprise-wide to enable new services and functionalities using the existing data.

How are Reusable Microservices Built Using a Modern Messaging layer?

The new messaging layer helps create reusable components from existing topics and data, so any new digital service or feature can consume them instead of rebuilding its own pipelines. A topic here is a named stream of events; its definition can be kept in a reusable Terraform module and applied in multiple places throughout an application.
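The reuse described above can be sketched as a simple publish/subscribe fan-out: several services subscribe to one existing topic rather than each building its own data feed. This is a pure-Python illustration; the topic name and handlers are hypothetical, and in the real platform the registry role is played by Kafka itself.

```python
from collections import defaultdict
from typing import Callable

# Minimal stand-in for a topic registry: each topic maps to the
# handlers of every service that reuses it.
subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

def subscribe(topic: str, handler: Callable[[dict], None]) -> None:
    subscribers[topic].append(handler)

def publish(topic: str, event: dict) -> None:
    # One published event fans out to every service built on the topic.
    for handler in subscribers[topic]:
        handler(event)

notifications, fraud_checks = [], []

# Two independent services consuming the same existing topic.
subscribe("loan-applications", lambda e: notifications.append(e["id"]))
subscribe("loan-applications", lambda e: fraud_checks.append(e["id"]))

publish("loan-applications", {"id": "LOAN-1001", "amount": 250_000})
```

Launching a new feature here means adding one more subscriber to an existing topic; nothing upstream changes.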

Why Choose Kafka and Snowflake for Real-Time Data Streaming?

Snowflake was chosen as the data warehouse and Kafka as the streaming platform to automate the different data stream lanes. Our developers used Snowflake's Snowpipe to enable event-driven consumption. By integrating this cloud-based system, we were able to provide easy access to more cloud-based applications for different banking processes and teams.

  • We set up a Java application for data-producing teams to scrape an API and integrated it with the data routing platform.
  • Using Kafka as a buffer between data producers and Snowflake decoupled the ingestion and processing layers, providing flexibility and resilience.
  • Information on different topics is then pushed into further processing for sending out event-driven notifications.
  • We also set up different event-driven data streams that achieve near real-time fraud detection, transaction monitoring, and risk analysis.
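The buffering idea in the bullets above can be sketched with a plain in-process queue: producers only write to the buffer, and a separate loader drains it toward the warehouse on its own schedule. This is an illustrative stand-in only; in the real pipeline the queue is a Kafka topic and the loader role is played by Snowpipe.

```python
import json
import queue
import threading

# Stand-in for the Kafka buffer between producers and Snowflake.
events: queue.Queue = queue.Queue()
loaded: list[dict] = []

def produce(event: dict) -> None:
    # Producers know only the buffer, not the warehouse -- this is
    # what decouples the ingestion layer from the processing layer.
    events.put(json.dumps(event))

def snowflake_loader() -> None:
    # Drains the buffer independently of the producers' pace.
    while True:
        raw = events.get()
        if raw is None:  # shutdown sentinel
            break
        loaded.append(json.loads(raw))

loader = threading.Thread(target=snowflake_loader)
loader.start()

produce({"topic": "transactions", "amount": 120.50})
produce({"topic": "kyc", "customer_id": "C-42"})

events.put(None)  # signal the loader to stop
loader.join()
```

If the loader slows down or restarts, producers keep writing to the buffer unaffected, which is the resilience property the Kafka layer provides at scale.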

Our Solution: Enabling Modern Experiences Using APIs for Asynchronous Data Processing

At Sun Technologies, we bring you the expertise to integrate event-driven automation that works well with traditional messaging middleware or iPaaS.

  1. Integrated Intelligent Automation Plugins: Document AI for customer onboarding and underwriting
  2. Integrated Gen AI in Workflows: Workbots capture data from Excel spreadsheets, ERP systems, chat messages, folders, and attachments.
  3. Configured Approval Hierarchy & Controls: Faster data access and cross-departmental decisioning for lending
  4. Automated Customer Support Workflows: Streamlined borrower relationship and account management

Challenge: Building a system that can handle up to 2 million messages per day

  • Legacy data is run on software and hardware housed in monolithic and tightly coupled environments
  • Massive cost incurred in hosting, managing, and supporting legacy messaging infrastructure
  • Hard-to-find IT skills prevent non-technical staff from participating in automating workflows
  • Legacy messaging services pose challenges of platform retirement and end-of-life
  • Legacy messaging systems built on batch-based architectures do not support complex workflow routing
  • Legacy architecture is designed for executing simple point-to-point (P2P) request/reply patterns
  • The tightly coupled architecture does not support creation of new workflow patterns

How Our Solution Helped

  1. Our Cloud and Data architects examined the legacy data landscape to see how it can be made compatible with modern Intelligent Automation (IA) integrations
  2. We not only identified the right data pipelines but also launched them using No-Code App development
    • Replacing Legacy Messaging using Kafka or a similar event routing platform
    • Building and deploying applications that are always available and responsive
    • Integrating with multiple event brokers to enable new routing decisions
  3. Replaced manual processes with automated workflows in Trade Finance, Guarantee Management, Information Security, and Regulatory Compliance
  4. Our No-Code Change Management Consulting completely automates the building of asynchronous, heterogeneous data pipelines

The Possible Impact

  • 3X Speed of new event streaming adoption and workflow pipeline creation
  • Simple event streaming and publishing set-up takes 1 hour
  • New data pipelines can handle up to 2 million messages per day
  • New messaging layer capable of handling 5,000 messages per second
  • Cloud-Agnostic data streaming saving millions in licensing cost


Download More Case Studies

Get inspired by some real-world examples of complex data migration and modernization undertaken by our cloud experts for highly regulated industries.

Contact Your Solutions Consultant!
