Boopathy Gopalsamy
2 min read · Sep 22, 2021


Mainframe Batch Transformation to Real-Time Event-Driven Serverless Architecture Using AWS — Pattern #1

Mainframe Batch

Traditionally, most large-scale enterprises have used mainframe batch for their large data processing workloads. The primary reasons are to avoid contention with online processing and to save mainframe cost (MIPS savings) by running work in scheduled batch mode during non-peak hours.

Mainframe Batch to Cloud

As part of cloud migration, customers with mainframe batch workloads in particular are evaluating options for converting batch jobs into near-real-time, cloud-native workloads, which improves the end-user experience by providing real-time insight into the data.

Event Driven Serverless Strategy

Most mainframe batch workloads (especially reporting applications) have three predominant functions: 1. processing data, 2. aggregating data, and 3. summarizing data. AWS provides serverless options such as AWS Lambda, Step Functions, and DynamoDB that can be utilized to implement an end-to-end batch processing pipeline. Below is a sample architecture for reference, based on the reporting use case.

  1. Batch files from the on-premises mainframe are transferred to S3 through various options (SFTP, the AWS CLI, third-party tools, etc.); a minimal upload sketch follows this list
  2. An S3 event trigger invokes a Lambda function, which in turn starts the Step Functions state machine (see the handler sketch below)
  3. Business logic can be coded in the Step Functions workflow, or the workflow can invoke AWS services such as AWS Lambda, AWS Batch, etc. to implement the business logic
  4. Data persistence can be handled with DynamoDB, which is a good fit for serverless application scaling (see the aggregation sketch below)
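For step 1, the transfer from the mainframe landing zone into S3 can be as simple as an AWS CLI copy or a boto3 upload. A minimal sketch, assuming a hypothetical landing bucket and file name:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket, file path, and key names for illustration only.
s3.upload_file(
    Filename="/landing/daily_transactions.dat",  # batch file received from the mainframe
    Bucket="mainframe-batch-landing",            # S3 landing bucket
    Key="incoming/daily_transactions.dat",
)
```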
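For step 2, the S3 event notification invokes a Lambda function, which starts the Step Functions state machine. A minimal handler sketch, assuming the state machine ARN is supplied through an environment variable (a hypothetical name):

```python
import json
import os
import urllib.parse

import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical environment variable name for illustration.
STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]


def handler(event, context):
    """Triggered by an S3 ObjectCreated event; starts one execution per uploaded batch file."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Pass the file location to the state machine as the execution input.
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )

    return {"statusCode": 200}
```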
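For steps 3 and 4, a worker Lambda invoked from the state machine can read the batch file, aggregate it, and persist the summary to DynamoDB. A simplified sketch, assuming a CSV-style input with `account_id` and `amount` columns and a hypothetical table named `batch-report-summary`:

```python
import csv
import io
from collections import defaultdict
from decimal import Decimal

import boto3

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")

# Hypothetical table name; partition key "account_id" is assumed for illustration.
table = dynamodb.Table("batch-report-summary")


def handler(event, context):
    """Reads the batch file referenced in the state input, aggregates amounts per account,
    and writes one summary item per account to DynamoDB."""
    obj = s3.get_object(Bucket=event["bucket"], Key=event["key"])
    body = io.StringIO(obj["Body"].read().decode("utf-8"))

    # Aggregate: total amount per account (assumes a header row: account_id,amount,...).
    totals = defaultdict(Decimal)
    for row in csv.DictReader(body):
        totals[row["account_id"]] += Decimal(row["amount"])

    # Summarize: persist one item per account for downstream reporting.
    with table.batch_writer() as writer:
        for account_id, total in totals.items():
            writer.put_item(Item={"account_id": account_id, "total_amount": total})

    return {"accounts_summarized": len(totals)}
```

This keeps each stage of the reporting pipeline independently scalable: the trigger Lambda only routes events, while the aggregation logic scales with the number of Step Functions executions.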


Boopathy Gopalsamy

Sr. Architect — Mainframe Capacity Management, Modernization and Cloud Migration