Technical Lead - Data Warehouse & Data Integration
Blick Art Materials
- Highland Park, IL
- $120,000-$150,000 per year
- Permanent
- Full-time
- Lead and guide engineers (primarily offshore) responsible for ETL development, data pipelines, and data warehouse loads.
- Build and maintain workflows for data ingestion, cleansing, transformation, and transfer using tools such as SSIS, Python, Azure Data Factory (ADF), and Microsoft Fabric.
- Support both legacy systems (on-prem SQL Server and SSRS) and modern platforms (Redshift, MS Fabric, Power BI), with a focus on enabling the transition to Azure-based infrastructure.
- Be highly hands-on: capable of coding, debugging, reviewing, and delivering alongside your team.
- Drive sprint planning, backlog grooming, and progress reporting using Scrum practices and tools such as JIRA and dashboards.
- Lead end-to-end delivery of data initiatives, from requirement intake to production deployment and stakeholder sign-off.
- Proactively manage risks, remove blockers, and ensure delivery timelines and quality standards are met.
- Establish and enforce clean data architecture principles and practices for all pipeline development.
- Take ownership of data validation, cleansing, and transformation standards, especially where tools are currently lacking.
- Bring in practical experience or advocate for frameworks (e.g., Great Expectations or custom tooling) to elevate data reliability and observability.
- Collaborate cross-functionally with Product Owners, Architects, Developers, Analysts, and Business Leaders to ensure that data delivery aligns with business needs.
- Act as a key liaison to stakeholders, representing the technical roadmap, delivery plans, and status updates.
- Partner with peers such as the Sr. BI Developer and Lead DBA to ensure alignment across reporting, infrastructure, and data modeling.
- Be a driving force in modernizing our DW/ETL stack, championing new tools, clean architecture, and cloud-first principles.
- Contribute to the evaluation and implementation of platforms and tooling such as Fabric, Azure Synapse, Redshift, and Terraform, as appropriate.
- Explore and implement AI-enabled tooling or automation to improve data engineering velocity, testing, and monitoring.
- Bachelor’s degree in Computer Science, Information Systems, or equivalent experience.
- 7 to 10 years of experience in data warehousing, including 3+ years in a technical leadership role.
- Expert-level proficiency in at least one leading ETL tool such as SSIS, Azure Data Factory (ADF), Talend, Informatica, or AWS Glue.
- Strong experience with cloud platforms (AWS, Azure, or GCP), especially their data engineering and analytics services.
- Strong experience with:
  - SQL (T-SQL, PL/SQL) and Python
  - Cloud data platforms like Redshift, as well as on-prem SQL Server
  - Power BI and SSRS
  - Agile delivery using Scrum and tools like JIRA
- Proven ability to lead delivery, manage priorities, and report progress in a structured, business-facing manner.
- Deep understanding of data quality, cleansing, and architecture best practices.
- Strong communication skills and the ability to work independently in a fast-paced, cross-functional environment.
- High ownership mindset: able to lead projects, coach team members, and make informed technical decisions.
- Experience with the following is preferred, but not required:
  - Modern data platforms like Fabric, BigQuery, or Databricks.
  - DevOps/CI-CD practices for data pipeline deployment.
  - Data security, privacy, and governance frameworks.
  - DataOps practices or AI-assisted data validation/testing frameworks.
  - E-commerce or retail business intelligence use cases.
- $120,000-$150,000 per year + Incentives
- Medical/Dental/Vision Insurance
- 401(k) & Profit Sharing Plan
- Incentive Bonus Plans
- Paid Holidays & Paid Time Off
- Paid Parental Leave
- Short-Term/Long-Term Disability
- Training Opportunities
- Basic & Optional Life Insurance
- Employee Discount