
Senior Data Warehouse Engineer
Bill.com
- Draper, UT
- $121,000-145,200 per year
- Permanent
- Full-time
- Build and manage robust data pipelines that support scalable and efficient operations across various data platforms.
- Work closely with different teams to translate business requirements into sustainable technical solutions, facilitating effective data usage and integration.
- Participate in designing, implementing, and refining data models and database schemas to effectively support business functionalities and operations.
- Collaborate on migration projects to modern data platforms like Trino, using Iceberg as the table format, and enhance data flow and architecture to improve data reliability, efficiency, and quality.
- Engage in continuous optimization of data models and pipelines, contributing to infrastructure migrations and improvements in CI processes and Airflow orchestrations.
- Develop reusable classes, components, and modular scripts to automate and enhance daily tasks and workflows, thereby improving efficiency for both stakeholders and team operations.
- BS/BA in Computer Science, Information Systems, Mathematics, or a related technical field, or equivalent practical experience.
- At least 3 years of experience in data warehousing roles, demonstrating expertise in large-scale data architecture design, implementation, and maintenance.
- Proficient in advanced SQL and familiar with database management practices, including experience with cloud data warehouses like Snowflake, Redshift, or similar platforms.
- Experience working in financial services and/or SaaS companies is a plus, along with a strong understanding of industry-specific data requirements and compliance issues.
- Python: Must be adept at scripting in Python for data manipulation and integration tasks, with experience in Object-Oriented Programming (OOP).
- SQL, dbt, and Data Modeling
- Must be adept with advanced SQL techniques for querying, data transformation, and performance optimization.
- Familiarity with dbt (data build tool) for managing data transformation workflows.
- Must have a strong understanding of data modeling best practices, with expertise in normalization and denormalization techniques to optimize analytical queries and database performance.
- ETL/ELT Processes: Extensive experience in designing, building, and optimizing ETL/ELT data pipelines, including both batch and streaming data processing.
- Version Control: Extensive experience with version control, branching, and collaboration on GitHub/GitLab.
- Data Visualization: Familiarity with Tableau or similar tools.
- Collaboration and Communication: Excellent documentation skills and the ability to work closely with diverse teams to translate business requirements into technical solutions.
- DevOps Practices: Knowledge of unit testing, CI/CD, and repository management.
- Technologies: Familiarity with Docker and cloud technologies such as AWS.
- Prompt Engineering for LLMs: Experience with crafting and refining prompts for LLMs like GPT is a plus.
- 100% paid employee health, dental, and vision plans (choose HMO, PPO, or HDHP)
- HSA & FSA accounts
- Life Insurance, Long & Short-term disability coverage
- Employee Assistance Program (EAP)
- 11+ observed holidays, wellness days, and flexible time off
- Employee Stock Purchase Program with employee discounts
- Wellness & Fitness initiatives
- Employee recognition and referral programs
- And much more