
Lead Data Engineer - AWS & Python
- Columbus, OH
- Permanent
- Full-time
- Provide direction, oversight, and coaching for a team of entry-level to mid-level engineers working on basic to moderately complex tasks.
- Execute solution design, development, and technical troubleshooting, thinking beyond routine or conventional approaches to build solutions and break down technical problems.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into effective visual solutions.
- Work in an Agile development environment with team members, including Product Managers and Site Reliability Engineers (SREs).
- Develop secure, high-quality production code, review and debug code written by others, and drive decisions influencing product design, application functionality, and technical operations.
- Serve as a subject matter expert in one or more areas of focus and actively contribute to the engineering community as an advocate of firmwide frameworks, tools, and practices across the end-to-end Development Life Cycle.
- Influence peers and project decision-makers to consider the use and application of leading-edge technologies.
- Stay current with industry trends and emerging technologies in Data Management, Artificial Intelligence, and Machine Learning.
- Formal training or certification on data engineering concepts and 5+ years applied experience.
- 5+ years of demonstrated coaching and mentoring experience.
- Hands-on experience writing Python code with libraries such as Pandas, Boto3, and PySpark, working in Jupyter Notebooks, along with 3+ years of experience with AWS services (including Glue and S3), Kafka, and Kubernetes.
- Hands-on practical experience delivering system design, application development, testing, and operational stability.
- Collaborate with various stakeholders and independently tackle design and functionality challenges with minimal oversight.
- Proficient in automation and continuous delivery methods.
- Skilled in resolving code issues and proficient in Git for managing repositories and team collaboration.
- Experience and proficiency across the data lifecycle.
- Evaluate and report on access control processes to determine the effectiveness of data asset security with minimal supervision.
- Advanced understanding of agile methodologies, Application Resiliency, and Security.
- Bachelor's degree in Data Science, Computer Science, Information Systems, Statistics, or a related field.
- Strong Python experience, especially in the context of developing solutions for large financial platforms.
- Strong experience with AWS serverless services, including expertise in AWS Step Functions, Lambda, and NoSQL database services such as DynamoDB.
- Experience using AWS Lake Formation service.
- Familiarity with Snowflake/Databricks or other PaaS offerings.
- Experience with metadata management and machine learning.
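As a flavor of the hands-on Pandas work this role describes, the sketch below shows a typical data-cleanup step in a pipeline: de-duplicating records, filling missing values, and deriving a flag column. The column names, threshold, and sample data are invented purely for illustration and are not part of this posting.

```python
import pandas as pd

def clean_trades(raw: pd.DataFrame) -> pd.DataFrame:
    """Illustrative cleanup step: drop duplicate trades, fill missing
    notionals with 0, and derive a large-trade flag (hypothetical rule)."""
    df = raw.drop_duplicates(subset="trade_id").copy()
    df["notional"] = df["notional"].fillna(0.0)
    df["is_large"] = df["notional"] > 1_000_000  # invented threshold
    return df

# Hypothetical sample input with one duplicate row and one missing notional.
raw = pd.DataFrame({
    "trade_id": [1, 1, 2, 3],
    "notional": [2_500_000.0, 2_500_000.0, None, 50_000.0],
})
cleaned = clean_trades(raw)
print(cleaned)
```

In practice, `raw` would come from a source such as S3 (read via Boto3 or a Glue job) rather than an inline DataFrame, and the cleaned output would be written back downstream.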