Senior Data Engineer
DPR Construction
- Washington DC
- Training
- Full-time
- Participate in and collaborate with cross-functional Workgroups and functional teams to align Data Engineering efforts and resources with business goals and objectives.
- Drive strategic conversations with stakeholders to fully understand and document pain points and business requirements, define the key deliverables needed to improve business processes, and build the required integrations and other data engineering solutions.
- Develop and maintain relationships with business stakeholders and a deep understanding of their processes, tools, and goals.
- Design, build, and maintain robust data pipelines and architectures, ensuring high availability and reliability.
- Develop complex data models and algorithms to extract insights from large Supply Chain, Procurement, and Construction datasets.
- Utilize Snowflake for efficient data storage and data warehousing, and Azure Data Factory for orchestrating and automating data workflows.
- Script and program in Python and use DBT for transformation tasks to optimize data processes.
- Implement solutions with a cloud-first mindset, using Agile, Scrum, and DataOps practices.
- Assemble data sets that meet functional/non-functional business requirements.
- Design and implement process improvements such as automating manual processes, optimizing data pipelines from source to consumption, and scaling the data infrastructure.
- Collaborate closely with stakeholders to understand data requirements and translate these into technical solutions.
- Maintain data integrity and compliance, adhering to industry standards and best practices.
- Stay on top of emerging trends and technologies in data engineering and construction tech.
- Bachelor’s/Master’s in Computer Science or a related technical field.
- 4+ years of prior experience as a Data Engineer/Integrations Engineer in a fast-paced, technical, problem-solving environment (internship experience does not apply).
- 2+ years of experience with a public cloud (AWS/Microsoft Azure/Google Cloud)
- 2+ years of data warehousing experience (Redshift or Snowflake)
- 2+ years of experience with Agile engineering practices
- Experience with enterprise data lakes, data warehouses, data marts, and big data.
- Expert-level proficiency in Azure Data Factory, Integration Platforms, Python programming, DBT, and data modeling.
- Demonstrated experience with API development and management for data integration.
- Advanced SQL knowledge, including query authoring and hands-on experience with relational databases, plus working familiarity with a variety of database systems.
- Strong analytic skills related to working with unstructured datasets.
- Strong project management and organizational skills.
- Experience with SQL, JSON, XML, and LookML
- Knowledge of APIs, REST, and GraphQL.
- Experience with programming languages like Python.
- Exposure to the Construction Industry is a huge plus.
- Excellent communication skills to ask questions, clarify requirements, and engage with the team and stakeholders.
- Strong logic, reasoning, and critical thinking skills to solve problems as they arise.
- Must be an independent problem solver who can evaluate a situation and build solution options.