
ETL Developer - AI/ML Integration (Hybrid)
Broadridge Financial Solutions
- New York City, NY
- $115,000 - $130,000 per year
- Permanent
- Full-time
- Architect, build, and optimize scalable ETL workflows and data models for high-volume, high-performance analytics environments.
- Integrate AI/ML solutions and automation tools to streamline and accelerate data mapping, transformation, and ETL processes across diverse enterprise projects.
- Collaborate closely with clients, business stakeholders, and cross-functional technical teams to document data mapping requirements, ensure data accuracy, and optimize workflows.
- Author and maintain comprehensive documentation, including data architecture diagrams, data flow lineage, mapping specifications, and reusable component repositories.
- Institute standardization practices for data mapping, enforce naming conventions, and promote reuse of modular ETL assets.
- Partner with data engineering and AI/ML teams to ensure effective integration of ETL pipelines with machine learning workflows and feature stores.
- Continuously evaluate and optimize ETL processes for efficiency, reliability, scalability, and alignment with evolving data strategies and technologies.
- Provide expert guidance for system integration, data migration, troubleshooting, and performance tuning to maintain data accuracy, quality, and operational excellence.
- Stay abreast of emerging trends in cloud data platforms (especially AWS), data modeling practices, and AI/ML innovations to drive continuous improvement across the data landscape.
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or a closely related field.
- 5+ years’ experience in data engineering, data modeling, or database design for enterprise environments.
- Advanced proficiency with data modeling tools (e.g., Erwin, ER/Studio, dbt) and strong command of SQL and relational databases (Oracle, SQL Server, MySQL, PostgreSQL).
- Hands-on experience developing and maintaining ETL pipelines using industry-standard tools (Informatica, Talend, SSIS) and scripting (Python, Scala).
- Experience with cloud-based data warehouses (Snowflake, Redshift, Azure Synapse, BigQuery) and exposure to big data technologies.
- Solid foundation in data governance, data quality management, and metadata management principles.
- Exceptional analytical, organizational, and communication skills, with a proven ability to bridge technical and business domains.
- Demonstrated experience supporting AI/ML initiatives (e.g., enabling feature stores, integrating data science workflows, supporting model serving/pipelines).
- Experience with NoSQL databases and semi-structured data (e.g., MongoDB, Cassandra, DynamoDB).
- Familiarity with cloud-native ETL and orchestration tools.
- Proficiency in data integration scripting (Python, Scala) and ML-related data processing libraries.
- Understanding of Agile/Scrum methodologies and practices.
- Relevant certifications in data modeling, cloud platforms (e.g., AWS Database Specialty, Snowflake), or data management.
- Exposure to MLOps, feature engineering, and end-to-end AI/ML solution delivery.