
EverCommerce - Head of Data Infrastructure & Engineering
- Denver, CO
- $180,000 per year
- Permanent
- Full-time
- Design and implement scalable, reliable, and efficient data infrastructure solutions to support the organization's data processing, storage, and analytics needs.
- Collaborate with data engineers, software engineers, and data scientists to understand data requirements and translate them into technical specifications and architecture designs.
- Develop and maintain data pipelines for ingesting, processing, and transforming large volumes of data from various sources, ensuring data quality and integrity.
- Optimize data storage and retrieval processes, including database schema design, indexing strategies, and query optimization, to enhance performance and reduce latency.
- Implement monitoring, alerting, and logging mechanisms to proactively identify and troubleshoot issues in the data infrastructure, ensuring high availability and reliability.
- Stay updated on emerging technologies, tools, and best practices in data engineering and infrastructure management, and recommend adoption as appropriate.
- Work closely with stakeholders to understand business requirements and priorities.
- Technical Expertise: Lead the design and implementation of scalable data infrastructure for EverCommerce using cloud-based platforms (e.g., data lake/lakehouse, Redshift, Databricks, Snowflake, BigQuery).
- Data Pipelines: Build the overall strategy for the data lake/lakehouse, data ingestion, and data processing; develop and manage ETL processes and data integration solutions using modern tools (e.g., dbt, Fivetran, Airflow, Glue, AWS Data Pipeline).
- Optimization: Continuously monitor and improve the performance of data systems, ensuring high availability and security. This includes building engineering tools and frameworks that use automation to optimize storage, compute, and ingestion.
- Monitoring & Observability: Evaluate, test, implement, and roll out data monitoring and observability tools to proactively catch data pipeline and infrastructure issues.
- Team Leadership: Provide technical leadership and mentorship to junior members of the team, fostering a culture of collaboration, learning, and continuous improvement.
- Innovation: Stay updated on the latest trends and technologies in data engineering and implement best practices.
- Collaboration: Work closely with data architects and other stakeholders to ensure data systems align with business needs.
- Vendor Management: Build and maintain strong relationships with data platform vendors (AWS, Azure, GCP, Fivetran, Databricks, Snowflake, etc.).
- Bachelor’s or Master’s degree in computer science, engineering, business analytics, data science, or a related field, and relevant experience.
- 12+ years of overall experience in implementing and managing data infrastructure and platforms.
- 7+ years of experience in a data & analytics leadership role managing cross-functional teams that create, engineer, build, and maintain data infrastructure and platforms.
- 5+ years of strong experience in data engineering or infrastructure roles, with a focus on designing and building scalable modern data platforms (data lakes, lakehouses, etc.) and a deep understanding of cloud-based data platforms.
- Proficiency in programming languages such as Python, Java, or Scala, and experience with data processing frameworks such as Apache Spark, Snowflake, Apache Flink, or Hadoop.
- Strong understanding of distributed systems, cloud computing platforms (e.g., AWS, GCP, Azure), and containerization technologies (e.g., Docker, Kubernetes).
- Deep experience with relational databases and data warehousing technologies (e.g., PostgreSQL, MySQL, Redshift, Snowflake) and NoSQL databases (e.g., Cassandra, MongoDB).
- Hands-on experience with data pipeline orchestration tools such as Apache Airflow, Luigi, or Prefect.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed environment.
- Strong communication skills and the ability to collaborate effectively.
- Experience with Agile methodologies and DevOps practices is a plus.
- Proven ability to manage and lead a team of data engineers and data ops engineers.
The EverCommerce team is distributed globally, with teams in the U.S., Canada, the U.K., Jordan, New Zealand, and Australia. With a widely distributed team, we are used to working remotely across different time zones. This role can be based anywhere in the United States or Canada – if you’re close to one of our offices, we can set you up in-office or you can work 100% remotely. Please note that you must be eligible to work without sponsorship to qualify for this position, and this role may require travel to our Corporate Headquarters in Denver, Colorado, or to other office locations around North America.
Benefits & Perks:
- Flexibility to work where/how you want within your country of employment – in-office, remote, or hybrid
- Continued investment in your professional development
- Day 1 access to a robust health and wellness benefits package, including an annual wellness stipend
- 401k with up to a 4% match and immediate vesting
- Flexible and generous time off (FTO)
- Employee Stock Purchase Program