Design, build, and maintain robust, scalable ETL processes for data acquisition, transformation, and delivery
Collaborate with Data Analysts, Architects, and Compliance stakeholders to translate requirements into efficient ETL solutions
Ensure data quality, integrity, lineage, and traceability across large-scale datasets
Optimize ETL workflows for performance, scalability, and maintainability
Perform data profiling, troubleshooting, and root cause analysis to resolve pipeline and data quality issues
Deploy, monitor, and support ETL jobs across development, QA, and production environments
Contribute to the development of technical standards, documentation, and operational best practices
Hands-on ETL development experience in enterprise data environments
Proven expertise in designing and optimizing complex data pipelines
Strong understanding of data modeling, data integration, and data warehousing concepts
Experience with modern ETL tools (e.g., Informatica, Talend, DataStage, or equivalent)
Advanced SQL skills and familiarity with scripting (Python, Shell, etc.) for workflow automation
Knowledge of data governance, quality, and lineage frameworks
Experience working in regulated industries (financial services or similar) is a strong plus
Excellent performance tuning and troubleshooting skills
Strong communication skills and proven success in agile, cross-functional teams
ETL development, DataStage, Informatica, Talend, data pipeline design, ETL workflow optimization, data modeling, data integration, data warehousing, advanced SQL, scripting (Python, Shell), data quality, data governance, data lineage, performance tuning, troubleshooting, regulated industry experience (financial services), communication, agile teamwork, cross-functional collaboration