
Data Engineer
Location: Indianapolis, IN (Onsite)
Job Type: Contract (Full-time)
Skillset Required: Databricks + Microsoft Fabric

We are currently seeking experienced professionals with expertise in Databricks and Microsoft Fabric for an upcoming project in Indianapolis, IN. This is a contract opportunity.
1. Role Objective
- Build, operate, and govern production-grade data and analytics solutions that span Databricks (Pipelines, Delta Live Tables, Genie, Agent Bricks) and Microsoft Fabric (Data Engineering, Lakehouse, Data Warehouse, Power BI).
- Deliver fast, reliable, and cost-optimized data flows while maintaining enterprise-grade security and observability.
2. Architecture & Design
- Design end-to-end ingestion, transformation, and serving layers across Databricks and Fabric.
- Define data model standards (star schema, CDC, semi-structured data handling).
- Implement CI/CD-ready pipelines using the Databricks Pipelines/Jobs API and Fabric pipelines (Spark SQL, notebooks).
- Enable real-time streaming (Event Hubs/Kafka → Structured Streaming → Fabric Lakehouse); a minimal streaming sketch follows this list.
- Register assets in Unity Catalog and the Fabric Lakehouse catalog; enforce row-level security, data masking, and Purview lineage (see the row-filter sketch below).
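
To illustrate the streaming path above, here is a minimal PySpark Structured Streaming sketch that reads from an Event Hubs Kafka endpoint and appends to a Lakehouse Delta table. The broker address, topic, schema, table name, and checkpoint path are all illustrative assumptions, and authentication (SASL) options are omitted for brevity.

```python
# Minimal sketch: Event Hubs/Kafka -> Structured Streaming -> Lakehouse Delta table.
# All names (broker, topic, table, checkpoint path) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("event-ingest").getOrCreate()

event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Read from the Event Hubs Kafka surface (or any Kafka broker); SASL config omitted.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "mynamespace.servicebus.windows.net:9093")
       .option("subscribe", "device-events")
       .option("startingOffsets", "latest")
       .load())

# Parse the JSON payload and keep only the modeled columns.
events = (raw
          .select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

# Append continuously to a Delta table, with a checkpoint for exactly-once recovery.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "Files/checkpoints/bronze_events")
         .outputMode("append")
         .toTable("bronze_events"))

query.awaitTermination()
```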
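For the governance item, the following sketch shows Unity Catalog row-level security and column masking issued as SQL from PySpark. The catalog, schema, table, column, and group names are hypothetical and would depend on the project's own governance model.

```python
# Minimal governance sketch: Unity Catalog row filter and column mask.
# Catalog/schema/table/column/group names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Row-filter function: members of 'emea_analysts' only see EMEA rows.
spark.sql("""
  CREATE OR REPLACE FUNCTION main.sales.region_filter(region STRING)
  RETURN IF(is_account_group_member('emea_analysts'), region = 'EMEA', TRUE)
""")

# Attach the filter to the governed table.
spark.sql("""
  ALTER TABLE main.sales.orders
  SET ROW FILTER main.sales.region_filter ON (region)
""")

# Mask a sensitive column for everyone outside a privileged group.
spark.sql("""
  CREATE OR REPLACE FUNCTION main.sales.mask_email(email STRING)
  RETURN IF(is_account_group_member('pii_admins'), email, '***REDACTED***')
""")
spark.sql("""
  ALTER TABLE main.sales.orders
  ALTER COLUMN customer_email SET MASK main.sales.mask_email
""")
```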