Senior Staff Data Engineer - Hybrid
The Hartford
- Charlotte, NC
- $132,400-198,600 per year
- Permanent
- Full-time
- Demonstrate expertise in Snowflake's cloud-native architecture and Microsoft SQL Server technology.
- Ability to create, troubleshoot, and enhance complex code in Snowflake and SQL Server.
- Experience building data pipelines (ELT) on the Snowflake cloud data platform using AWS compute (EC2) and storage (S3) layers.
- Experience building the Snowflake SQL data warehouse using virtual warehouses according to best practices.
- Hands-on experience with Talend or SSIS as an ELT tool for Snowflake and SQL Server data integration.
- Implement and leverage materialized views, data sharing, the clone feature, and dynamic data masking.
- Solid understanding of delivery methodology (SDLC); lead teams in implementing solutions according to the design/architecture.
- Hands-on experience with SnowSQL, stored procedures, JavaScript UDFs, Snowpipe, and other Snowflake utilities.
- Experience migrating data from RDBMS to the Snowflake cloud data warehouse.
- Experience in data security, data access controls, and their design.
- Design solutions for data loading and unloading to and from Snowflake.
- Experience working with data lakes and loading disparate data sources: structured and semi-structured data (flat files, XML, JSON, Parquet) as well as unstructured data.
- Experience building data pipelines using Talend and automating data ingestion, including change data capture (CDC).
- Integrate data pipelines with a source control repository and build CI/CD pipelines and DevOps processes.
- Experience in performance tuning of Talend / SQL Agent jobs to reduce CPU time and load times.
- Deep knowledge of the Snowflake licensing model and its continuous data protection life cycle.
- Architect reusable Talend components, such as job audit and reconciliation.
- Research and evaluate alternative solutions and recommend the most efficient and cost-effective solution for the system design.
- Support and quickly respond to production issues and requirement clarifications.
- Coordinate as needed among multiple disciplines, such as architects, business analysts, Scrum Masters, and developers, to gain the technical clarity needed to design, develop, and implement business solutions.
- Oversee the quality and completeness of detailed technical specifications, solution designs, and code reviews, as well as adherence to non-functional requirements.
- Experience delivering technical solutions in an iterative, agile environment (Scrum/Kanban).
- Participate as an active agile team member to help drive feature refinement, user story completion, code reviews, etc.
- Identify, document, and communicate technical risks, issues, and alternative technical solutions discovered during projects.
- Collaborate with a high-performing, forward-focused team, the Release Train Engineer, Product Owner(s), and business stakeholders.
- Ability to work on innovative and new projects with a "fail-fast" approach to provide optimal solutions that bring the most value to the business.
- Passion for learning new skills and the ability to adjust priorities on multiple projects based on changing demands/needs.
- Bachelor’s degree
- 5+ years with Snowflake on AWS and Talend Data Integration or other BI/ETL tools.
- 7+ years of hands-on experience in data warehousing and data integration (ELT/ETL).
- 7+ years of proficiency in ETL with Microsoft Business Intelligence tools (SSIS, SSRS) and other tools.
- 2+ years of hands-on experience with Data Visualization (preferably Tableau).
- Strong background and problem-solving skills in enterprise data warehousing, ETL/ELT development, database replication, metadata management, and data quality.
- Hands-on experience in all phases of the SDLC, developing ETL solutions using T-SQL code, stored procedures, and SSIS.
- Strong knowledge of data warehouse applications, preferably in the financial/insurance domain, is required.
- Knowledge of version control, CI/CD pipelines, and DevOps tools such as GitHub, Jenkins, Nexus, and uDeploy.
- Knowledge of data profiling, data modeling, and database design is key to this role.
- Must be authorized to work in the US without company sponsorship.
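
For candidates reviewing the Snowflake features named in this posting (materialized views, cloning, dynamic data masking, Snowpipe), the following Snowflake SQL sketch shows what day-to-day use of each looks like. All object names (`claims`, `ssn_mask`, `@claims_s3_stage`, the `COMPLIANCE_ROLE` role) are hypothetical examples, not part of the role description.

```sql
-- Illustrative Snowflake SQL; all object and role names are assumptions.

-- Materialized view over a hypothetical claims table
CREATE MATERIALIZED VIEW mv_claims_by_state AS
SELECT state, COUNT(*) AS claim_count
FROM claims
GROUP BY state;

-- Zero-copy clone, e.g. to stand up a test copy without duplicating storage
CREATE TABLE claims_test CLONE claims;

-- Dynamic data masking: hide SSNs from non-privileged roles
CREATE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('COMPLIANCE_ROLE') THEN val
       ELSE '***MASKED***' END;

ALTER TABLE claims MODIFY COLUMN ssn SET MASKING POLICY ssn_mask;

-- Snowpipe for continuous ingestion from an external (S3) stage
CREATE PIPE claims_pipe AUTO_INGEST = TRUE AS
  COPY INTO claims
  FROM @claims_s3_stage
  FILE_FORMAT = (TYPE = 'JSON');
```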