Senior Data Engineer
Evolve Vacation Rental
- Denver, CO
- $141,000-172,000 per year
- Permanent
- Full-time
- Collaborate as a trusted partner with business stakeholders, data analysts, data engineers, analytics engineers, and data architects to build a solid data foundation
- Mentor data engineers, analytics engineers, and data analysts across the organization to support their growth and to ensure that best practices and consistent business rules are applied when turning data into information
- Translate ambiguous or complex business logic into technical solutions
- Build, support, and optimize data pipelines using tools like Fivetran, dbt, Prefect, and Python to move data to/from Snowflake, SaaS APIs, and other data stores
- Design, modify, and implement data structures in Snowflake to support data ingestion, integration, and analytics
- Curate and transform data into appropriate structures for analytics and data science purposes using SQL, Python, Snowflake scripting, and data transformation tools like Matillion and dbt
- Design and implement processes to automate monitoring and alerting on source data quality, data ingestion and transformation processes, and the overall health of our data infrastructure
- Develop a deep understanding of the data you are working with, relevant business processes, strategies, and goals
- Maintain and optimize Evolve's cloud data platform, environment, and infrastructure by solving problems and tuning performance for underlying data structures, systems, and processes
- Manage the deployment and monitoring of scheduled data ingestion and transformation processes
- Research, recommend, and implement new and enhanced tools and methods that support Evolve's data ecosystem
- Lead the definition of quality standards for ELT, Python, Prefect, Snowflake, Fivetran, dbt, and AWS, and document and train teammates on these standards
- Collaborate with peers through code reviews and technical documentation
- Provide advanced data ingestion and pipeline support
- Partner with stakeholders to develop scalable solutions for new and modified data sources
- Prioritize multiple tasks and projects efficiently, and clearly communicate progress and status
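The monitoring-and-alerting responsibility above can be illustrated with a minimal sketch. This is not Evolve's actual tooling: the field name `listing_id`, the checks chosen, and the pass/fail thresholds are all hypothetical, and a real pipeline would route the warning to an alerting system rather than a logger.

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingestion_monitor")


@dataclass
class QualityReport:
    """Summary of basic quality checks for one ingested batch."""
    total_rows: int
    null_key_rows: int
    duplicate_keys: int

    @property
    def passed(self) -> bool:
        # Illustrative thresholds: any null or duplicate key fails the batch.
        return self.null_key_rows == 0 and self.duplicate_keys == 0


def check_batch(rows: list, key: str = "listing_id") -> QualityReport:
    """Run source-data quality checks and emit a warning on failure."""
    keys = [row.get(key) for row in rows]
    null_key_rows = sum(1 for k in keys if k is None)
    non_null = [k for k in keys if k is not None]
    duplicate_keys = len(non_null) - len(set(non_null))
    report = QualityReport(len(rows), null_key_rows, duplicate_keys)
    if not report.passed:
        # Hook point for real alerting (PagerDuty, Slack, etc.).
        log.warning("Quality check failed: %s", report)
    return report
```

In practice a check like this would run after each scheduled ingestion step, with failures blocking downstream transformations.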
- 8+ years in a developer, architect, engineer, or DBA role working with large data sets
- Subject matter expert in data ingestion concepts and best practices
- Subject matter expert in data pipeline design, development, and automation
- Comfortable working with DevOps teams to optimize CI/CD pipelines
- Advanced SQL skills are required
- Experience coding with Python is required
- Experience with Snowflake, Fivetran, dbt, Tableau, and AWS is preferred
- Experience with Git version control and repository management in GitLab
- Experience with advanced ELT tool administration (code deployment, security, setup, configuration, and governance)
- Experience with enterprise ELT tools like Fivetran, dbt, Matillion or other similar ETL/ELT tools
- Expertise with one or more cloud-based data warehouses, such as Snowflake, is required
- Expertise in extracting raw data from APIs using industry-standard ingestion techniques
- Ability to explain complex information and concepts to technical and non-technical audiences
- Enjoy supporting team members by sharing technical knowledge and helping solve problems
- Enjoy a connected, collegial environment even though our teammates are remote, hybrid, and on-site
- Familiarity with documenting data definitions and code
- Driven by a fast-paced, energetic, results-oriented environment
- Exemplary organizational skills with the ability to manage multiple competing priorities
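The API-extraction expertise above typically means handling pagination and transient failures. A minimal sketch, assuming a hypothetical page-oriented API: `fetch_page` stands in for a real HTTP call (e.g. via `requests`), and the `{"records": [...], "next": bool}` payload shape is an assumption for illustration.

```python
import time
from typing import Callable, Iterator


def extract_pages(
    fetch_page: Callable[[int], dict],
    max_retries: int = 3,
    backoff_seconds: float = 0.1,
) -> Iterator[dict]:
    """Yield records from a paginated source, retrying transient failures.

    `fetch_page(page)` is a stand-in for a real HTTP call; it is assumed
    to return {"records": [...], "next": bool}.
    """
    page = 0
    while True:
        for attempt in range(max_retries):
            try:
                payload = fetch_page(page)
                break
            except ConnectionError:
                if attempt == max_retries - 1:
                    raise
                # Exponential backoff before retrying the same page.
                time.sleep(backoff_seconds * 2 ** attempt)
        yield from payload["records"]
        if not payload["next"]:
            return
        page += 1
```

A production version would add rate limiting and checkpointing so an interrupted extraction can resume, but the retry-then-page loop is the core of the technique.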