Professional Summary
Detail-oriented IT professional with over 15 years of experience strategizing, designing, and delivering Enterprise Data Warehouse, Business Intelligence, and Analytics systems in the financial industry. Strong expertise in Big Data enablement, including Data Architecture, Data Sourcing, Data Cataloging, Data Curation, Data Preparation, Data Blending, Data Provisioning, and Data Analysis & Consumption. Adept at planning, developing, and executing strategies that have launched new products, opened lucrative channels, and driven significant revenue growth. Domain experience spans banking, healthcare, and retail, with a recent focus on AI and integration with large language models.
Key Achievements
Credit Card Fraud Detection: Collaborated with subject matter experts to analyze credit card fraud scenarios, identify issues, and design solutions that optimized alert generation for regulatory reporting.
Enterprise Data Platform (EDP) Services: Designed Spark-based configurable data services for the EDP, encompassing data ingestion, control validation, quality assurance, standardization, surrogate key management, CDC Type 2, job orchestration, audit logging, job lineage, operational control, and data provisioning.
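Of the EDP services listed above, CDC Type 2 (slowly changing dimension, Type 2) is the most mechanical; a minimal illustrative sketch in plain Python follows. The field names (`eff_from`, `eff_to`, `is_current`) and the function signature are assumptions for illustration, not the actual EDP implementation, which was Spark-based and configuration-driven.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional open-ended expiry date

def apply_scd2(current, incoming, key, tracked, as_of):
    """Illustrative CDC Type 2 merge: expire changed rows, insert new versions.

    current  - history rows (dicts) carrying eff_from / eff_to / is_current
    incoming - new snapshot rows (dicts) with business attributes only
    key      - business-key field name
    tracked  - attributes whose change triggers a new version
    """
    by_key = {r[key]: r for r in current if r["is_current"]}
    result = list(current)
    for row in incoming:
        active = by_key.get(row[key])
        if active and all(active[c] == row[c] for c in tracked):
            continue                      # unchanged: keep the active version
        if active:
            active["eff_to"] = as_of      # expire the superseded version
            active["is_current"] = False
        new_row = dict(row)
        new_row.update(eff_from=as_of, eff_to=HIGH_DATE, is_current=True)
        result.append(new_row)            # insert the new current version
    return result
```

In a production Spark job the same expire-and-insert logic is typically expressed as a merge over the key columns with a hash comparison of the tracked attributes.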
Data Quality Engine Development: Developed and deployed an inline Data Quality Engine for a major financial institution, conducting daily scans of billions of records to facilitate regulatory audits. This engine provided actionable evidence of data quality, streamlining compliance processes and enhancing regulatory adherence.
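The core of such an engine is a rule catalog applied record-by-record, with failure counts retained as audit evidence. The sketch below is a simplified stand-in, with hypothetical rule names and columns; the deployed engine ran inline at billions-of-records scale, which this toy loop does not attempt to show.

```python
import re

# Hypothetical rule catalog: (rule_id, column, predicate) triples.
RULES = [
    ("not_null_acct", "account_id", lambda v: v not in (None, "")),
    ("valid_amount", "amount", lambda v: isinstance(v, (int, float)) and v >= 0),
    ("iso_date", "posting_date",
     lambda v: bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", str(v)))),
]

def scan(records, rules=RULES):
    """Apply each rule to each record; return per-rule failure counts
    suitable for persisting as daily data-quality evidence."""
    failures = {rule_id: 0 for rule_id, _, _ in rules}
    for rec in records:
        for rule_id, col, pred in rules:
            if not pred(rec.get(col)):
                failures[rule_id] += 1
    return failures
```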
Cloud-Native Architectures: Designed and implemented cloud-native architectures that meet business requirements with minimal risk, including a security model, data anonymization, multi-tenant support, and a secured API strategy for both client-facing and internal consumers.
Real-Time Messaging Architecture: Created a real-time messaging architecture for operational auditing that captures mainframe-originated messages via AWS DMS and processes them with Spark Streaming.
Reusable Purge Strategy: Designed and implemented a reusable purge strategy across the bank’s infrastructure to ensure efficient data management and compliance.
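A reusable purge strategy generally reduces to a retention policy per dataset plus a shared predicate that decides which rows fall outside the window. The sketch below is an illustrative assumption (dataset names, field names, and the legal-hold exemption are invented for the example), not the bank's actual implementation.

```python
from datetime import date, timedelta

# Hypothetical retention policies, in days, keyed by dataset name.
RETENTION_DAYS = {"txn_history": 7 * 365, "session_logs": 90}

def purge_candidates(rows, dataset, today, policies=RETENTION_DAYS):
    """Split rows into (keep, purge) by comparing each row's business date
    to the dataset's retention window; a legal-hold flag exempts a row."""
    cutoff = today - timedelta(days=policies[dataset])
    keep, purge = [], []
    for row in rows:
        if row.get("legal_hold") or row["business_date"] >= cutoff:
            keep.append(row)
        else:
            purge.append(row)
    return keep, purge
```

Because the policy table and predicate are shared, the same routine can be pointed at any dataset rather than re-implementing purge logic per system.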
Python Framework for Snowflake Integration: Designed and implemented a robust Python framework that executes SQL queries, writes the results as 15 MB files, and loads them into Snowflake staging for subsequent processing into permanent tables.
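The file-splitting step of such a framework can be sketched as below; the Snowflake load itself (e.g. staging and a COPY into the target table) is omitted, and the CSV serialization and function names are illustrative assumptions rather than the framework's actual design.

```python
import csv
import io

def stage_files(rows, max_bytes=15 * 1024 * 1024):
    """Serialize query rows to CSV, cutting a new staging file whenever the
    current one would exceed max_bytes; returns the list of file payloads."""
    files, buf = [], io.StringIO()
    writer = csv.writer(buf)
    for row in rows:
        mark = buf.tell()                  # size before this row
        writer.writerow(row)
        if buf.tell() > max_bytes and mark > 0:
            files.append(buf.getvalue()[:mark])  # close the full file
            buf = io.StringIO()                  # start the next one
            writer = csv.writer(buf)
            writer.writerow(row)                 # carry the row over
    if buf.tell():
        files.append(buf.getvalue())
    return files
```

Keeping staged files near a fixed size lets Snowflake parallelize the subsequent load across files of roughly equal work.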
ETL Framework Development: Developed and implemented an ETL framework that triggers DataStage and Informatica jobs from a UNIX server, performs comprehensive quality-control checks, and produces detailed DataStage (DS) log files.