Experienced data engineer with over 10 years of expertise in software development. Specializes in ETL development, DWH design, production support, and project maintenance. Adept at leveraging technical skills to optimize data workflows and ensure seamless data integration. Proven track record of delivering high-quality solutions that meet business objectives.
Overview
15
years of professional experience
Work History
Senior ETL Developer
The Hartford Insurance
07.2023 - Current
Designing and optimizing ETL workflows to support insurance data processing and analytics for Group Benefits
Implemented robust data validation processes to maintain high data quality, accuracy, and consistency across the entire data pipeline
Collaborated closely with data analysts, business stakeholders, and IT teams to gather requirements, provide technical expertise, and deliver data solutions that align with business objectives
Designed and developed ETL workflows using Informatica PowerCenter, Informatica Data Exchange (IDX), and Informatica IICS, enabling seamless data integration across enterprise systems
Built and optimized scalable ETL pipelines to extract, transform, and load financial and insurance data from multiple sources into AWS S3 and Oracle databases
Implemented automation scripts using Python and Unix Shell Scripting, streamlining data ingestion, validation, and reporting processes, reducing manual efforts by 40%
Integrated AWS S3 as a staging area for high-volume structured and unstructured data, improving data processing efficiency and scalability
Collaborated with business and analytics teams to define data strategies, KPIs, and reporting frameworks, ensuring alignment with insurance regulatory requirements
Performed performance tuning on ETL workflows by optimizing SQL queries, indexing, and Informatica transformations, reducing query execution time by 30%
Automated and scheduled ETL workflows using TIDAL, Autosys, and Informatica Workflow Manager, ensuring 99.9% uptime and operational stability
Provided production support by troubleshooting ETL failures, resolving data discrepancies, and improving error-handling mechanisms to ensure data accuracy and integrity
Designing and optimizing ETL workflows to support financial data processing and analytics
Responsible for developing scalable ETL pipelines, integrating cloud-based data solutions, and ensuring high data quality and performance efficiency
Collaborate with cross-functional teams to drive data-driven decision-making and maintain compliance with financial regulations
Designed and optimized ETL workflows to support the enterprise data warehouse and financial analytics, enabling better data-driven decision-making
Developed scalable and efficient ETL pipelines using Informatica PowerCenter and SQL-based transformations, improving data processing speed and reliability
Implemented end-to-end ETL solutions with Informatica PowerCenter, ensuring seamless data integration and transformation across multiple financial systems
Collaborated with business stakeholders to define data strategies, KPIs, and reporting solutions, aligning data processes with business objectives
Performed performance tuning and optimization of ETL workflows, reducing query execution time by 30%, enhancing overall system performance
Ensured data governance and regulatory compliance by implementing best practices for data security, quality, and integrity in alignment with financial industry regulations
Automated several business reports, eliminating manual efforts and improving operational efficiency, allowing teams to focus on strategic analysis
Toronto, Canada
Environment: Informatica PowerCenter 10.5, Oracle SQL Developer, Zena, Postgres, PuTTY
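The data validation and automation work described in the roles above typically boils down to a few reusable checks: row-count reconciliation between source and target, null checks on business keys, and duplicate detection. A minimal Python sketch of that pattern follows; the table and column names (`policy_id`) are hypothetical, not taken from the actual pipelines.

```python
# Minimal sketch of ETL data-validation checks: row-count
# reconciliation plus null/duplicate checks on a business key.
# All names here are illustrative, not real pipeline objects.

def validate_load(source_rows, target_rows, key="policy_id"):
    """Return a list of validation failures (empty list = pass)."""
    failures = []
    # 1. Row-count reconciliation: every extracted row should be loaded.
    if len(source_rows) != len(target_rows):
        failures.append(
            f"row count mismatch: source={len(source_rows)} target={len(target_rows)}"
        )
    # 2. Completeness: the business key must never be null.
    nulls = sum(1 for r in target_rows if r.get(key) is None)
    if nulls:
        failures.append(f"{nulls} null value(s) in {key}")
    # 3. Uniqueness: the business key must not be duplicated.
    keys = [r[key] for r in target_rows if r.get(key) is not None]
    if len(keys) != len(set(keys)):
        failures.append(f"duplicate value(s) in {key}")
    return failures

src = [{"policy_id": 1}, {"policy_id": 2}, {"policy_id": 3}]
tgt = [{"policy_id": 1}, {"policy_id": 2}, {"policy_id": None}]
print(validate_load(src, tgt))
```

In production these checks would run as post-session tasks or standalone jobs against the staging and target tables rather than in-memory lists.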
Informatica Developer
Aviva Canada
04.2020 - 06.2022
Involved in an MDM cloud migration project, extracting MDM data from the Informatica-hosted cloud MDM database (Oracle) to the EDH (Hive) and creating mappings to provide extracts to multiple downstream systems
Understood the source systems and analyzed user requirements
Created low-level design documents based on the HLD
Created complex mappings and implemented Slowly Changing Dimension (SCD) types as required
Built and followed infrastructure design best practices for Informatica 10.x deployments
Used transformations such as Source Qualifier, Expression, Aggregator, Connected and Unconnected Lookup, Filter, Router, Sequence Generator, Sorter, Joiner, and Update Strategy
Used Workflow Manager for workflow and session management, database connection management, and scheduling of batch jobs
Imported data from various sources, then transformed and loaded it into data warehouse targets using Informatica
Created technical designs and worked on Informatica upgrades, patching, workflow migration, and service continuity
Adopted Agile methodologies for project deliverables and day-to-day activities
Worked with the project team to develop the application migration build and deployment
Developed Informatica mappings and tuned them when necessary
Created complex Informatica mappings, sessions, and workflow designs for ETL transformations
Built code in Informatica and performed performance tuning using the Debugger
Verified and compared target data (Oracle database) against source data
Ensured data integrity after executing Informatica workflows
Performed end-to-end performance testing, live data load generation, and performance bottleneck diagnosis in business-critical applications
Optimized queries using various methodologies and minimized database calls
Worked on performance improvements to achieve optimal performance
Automated and scheduled enterprise jobs in ZENA
Environment: Informatica BDM 10.4, Informatica 10.2, Oracle SQL Developer, Hive, MS SQL Server, DB2, Zena, Postgres, PuTTY
Informatica Developer
Aviva Canada
03.2019 - 04.2020
Involved in redesigning and developing the ETL process for Aviva Finance to extract Guidewire BillingCenter billing data from Oracle financial staging tables, creating multiple ETL batches for payment disbursements such as cheque, credit card, and EFT disbursements
Involved in the design and development of a data warehousing project to improve the Guest Education Centre (GEC)
Developed mappings, reusable objects, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter 9.5
Developed data integration architecture (sources, staging, data and process flows, business rules and transformations, and targets) for client engagements
Integrated data between multiple systems using Informatica PowerCenter 9.5
Applied ER modeling for OLTP and dimensional modeling for OLAP in line with each design's goals
Created work tables and indexes for intermediate processing as part of the ETL process
Created and updated the Informatica parameter files used during workflow processing
Developed and deployed UNIX shell scripts to work with flat files, define parameter files, and create pre- and post-session commands
Created and installed packages to move code across environments
Troubleshot technical issues and service requests on the production platform
Environment: Informatica PowerCenter 9.5 (Designer, Workflow Manager, and Monitor), Oracle RDBMS, MS SQL Server, MS Excel, UNIX shell, TIDAL, Cyberduck
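The parameter-file work described above can be sketched briefly. PowerCenter parameter files use INI-style sections per workflow with `$$` mapping parameters; the folder, workflow, and parameter names below are hypothetical, used only to show the shape of a generated file.

```python
# Sketch of generating an Informatica parameter file before a workflow
# run. Sections follow the "[folder.WF:workflow]" convention and each
# mapping parameter is written as "$$NAME=value". All names are
# illustrative.
from datetime import date

def build_param_file(folder, workflow, params):
    """Render one workflow section of a PowerCenter parameter file."""
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

content = build_param_file(
    "FIN_DW", "wf_daily_load",
    {"RUN_DATE": date(2024, 1, 15).strftime("%Y-%m-%d"), "SRC_SYSTEM": "GL"},
)
print(content)
```

A scheduler (ZENA, TIDAL, etc.) would typically run a script like this pre-session, write the file to the server's parameter directory, and then invoke the workflow with `pmcmd` pointing at that file.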
Informatica Developer
Bank of America
03.2016 - 02.2017
Involved in the design and development of a data warehousing project to improve the Account Management System
The system provided managers with the reports needed to identify potential customers who could afford a loan
Coordinated with source system owners, monitored day-to-day ETL progress, and designed and maintained the data warehouse target schema (star schema)
Created mappings with heterogeneous sources such as flat files, MS Access, and Oracle databases, and created targets in the Oracle data warehouse using Informatica PowerCenter 9.5.1
Built reusable transformations for recurring business logic using mapplets and reused them across multiple mappings
Designed and developed test cases for unit and system testing
Modified UNIX shell scripts for executing Informatica workflows
Created parameters and variables for effective incremental data loading using Informatica Workflow Manager
Created Event-Raise and Event-Wait tasks to maintain dependencies between workflows
Environment: Informatica PowerCenter 9.5/9.1 (Designer, Workflow Manager, and Monitor), Oracle RDBMS, MS SQL Server, MS Excel, UNIX shell, Autosys
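The incremental loading mentioned above usually follows a high-water-mark pattern: persist the maximum modified timestamp from the last run (e.g. as a workflow variable or parameter), and pull only rows newer than it on the next run. A minimal sketch under that assumption, with illustrative row data:

```python
# Sketch of the high-water-mark pattern behind incremental loading:
# keep the max change timestamp from the previous run and extract
# only rows newer than it. Row and field names are illustrative.

def incremental_extract(rows, last_watermark):
    """Return (new_rows, new_watermark) for rows changed since last run."""
    new_rows = [r for r in rows if r["updated_at"] > last_watermark]
    # Advance the watermark only if something new arrived.
    new_watermark = max((r["updated_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "updated_at": "2016-05-01"},
    {"id": 2, "updated_at": "2016-05-03"},
    {"id": 3, "updated_at": "2016-05-05"},
]
batch, wm = incremental_extract(rows, "2016-05-02")
print(len(batch), wm)
```

In PowerCenter terms, the watermark is typically a mapping variable set with SETMAXVARIABLE and referenced in the Source Qualifier filter, so a re-run after a successful load extracts nothing.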
ETL Developer
ICICI PVT LTD
08.2012 - 12.2015
The project involved the development and implementation of incremental data loads from the CBS OLTP system to an Oracle database in the business intelligence environment
Designed data models based on a star schema (facts and dimensions)
Involved in performance tuning, query optimization, and query rewrites
Used Informatica Workflow Manager and Workflow Monitor to create, schedule, and monitor sessions and to send pre- and post-session emails communicating session success or failure
Designed the database schema and ER diagrams depicting relationships between tables in the database
Wrote PL/SQL procedures and triggers to carry out database maintenance tasks
Created and used different types of joins, indexes, and synonyms
Environment: Oracle 10g, Oracle SQL and PL/SQL, SQL Server, TOAD, Microsoft Visio, MS Excel
Report Developer
BAJAJ AUTO LTD
07.2010 - 06.2011
Participated in system analysis and data modelling, including creating tables, views, indexes, synonyms, triggers, functions, procedures, cursors, and packages
Wrote queries and stored procedures in PL/SQL to fetch data from the OLTP system, executed at regular intervals
Developed PL/SQL scripts to validate and load data into interface tables
Environment: Oracle 10g, MS Excel, MS PowerPoint, SQL*Loader
Education
MBA -
SMU
01.2015
B.Tech -
UPTU
07.2010
Skills
ETL Tools: Informatica PowerCenter 10.x/9.x
Informatica BDM 10.2/10.4
IDQ
Informatica DX
Informatica Intelligent Cloud Services (IICS)
Reporting Tools: SSRS
Business Objects
Other Tools: JIRA
TOAD
RDBMS: SQL Server 2008/2005
Oracle 11g/10g/9i
MS Access 2003/2007/2010
Hive
Impala
Postgres
DB2
Applications: Cyberduck
Microsoft Word
Excel
Outlook
Visio
PowerPoint
CyberArk
Operating Systems: Windows 7/XP/2003/2000
MS-DOS
Linux
Scheduling Tool: ZENA
TIDAL
Autosys
Technical Summary
Hands-on experience with Informatica for over 10 years
Hands-on experience with Informatica BDM for over 3 years
Extensively worked on ETL and DWH projects
Hands-on experience with the Teradata database for over 5 years
Hands-on experience with Oracle SQL Developer for over 3 years
Hands-on experience with Hive/Impala for over 3 years
Hands-on experience with the Postgres database for over 3 years
Extensively followed Agile methodology in day-to-day activities
Hands-on experience with RDBMS
Hands-on experience with UNIX/Linux shell scripting for over 9 years
Extensively worked on Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer
Hands-on experience with dimensional modeling techniques: star and snowflake schemas
Experience with job schedulers: ZENA, Autosys, TIDAL, and CA Workload Automation
Experience in query optimization using various methodologies and minimizing database calls
End-to-end performance testing to improve performance in business-critical applications