Data Engineer / QA Analyst
Aptive
- Alexandria, VA
- Permanent
- Full-time
- Design and maintain cloud-native data pipelines. Ingest, cleanse, and transform diverse datasets using AWS services and infrastructure-as-code.
- Implement automated test suites (unit, integration, regression, performance, accessibility) and embed them in the CI/CD pipeline for every build.
- Validate data quality and integrity. Build checks, anomaly detection, and reconciliation reports to keep accuracy and completeness above contract thresholds.
- Conduct performance and load testing; analyze results and partner with engineers to tune latency, throughput, and scalability.
- Embed compliance controls. Apply Section 508/WCAG, FedRAMP Moderate, and NIST RMF requirements to data workflows and test criteria.
- Monitor production health. Use dashboards, logs, and alerts to track pipeline uptime, test coverage, and key quality metrics; act quickly on incident signals.
- Collaborate daily with DevSecOps, application engineers, designers, and product managers to refine data models, testable requirements, and release plans.
- Document schemas, data flows, and test plans; provide clear status updates and quality reports to technical and non-technical stakeholders.
- Support release and hyper-care windows, performing root-cause analysis and driving continuous improvements in reliability and security.
- Coach teammates on data-engineering and QA best practices, fostering a culture of automation, measurement, and user-centered quality.
- 5–8 years of combined experience in data engineering, QA automation, or closely related roles, including 2+ years on cloud-hosted systems.
- Bachelor’s degree in computer science, engineering, data science, or a similar discipline (or equivalent practical experience).
- Hands-on experience building ETL/ELT pipelines on AWS (e.g., Glue, Lambda, Redshift, S3) with Python, SQL, or PySpark.
- Proven ability to design and run automated test suites (unit, integration, regression, performance) integrated into CI/CD workflows such as GitHub Actions or Jenkins.
- Experience monitoring pipeline health and application performance with logging, metrics, and alerting tools (CloudWatch, Grafana, or equivalent).
- Working knowledge of Section 508/WCAG accessibility checks and FedRAMP Moderate / NIST RMF security requirements.
- Strong analytical, problem-solving, and communication skills; comfortable collaborating with engineers, designers, and product managers.
- Ability to obtain and maintain a Public Trust clearance.
- Legal authorization to work in the U.S.
- Prior experience supporting Federal or DoD cloud programs.
- AWS certifications such as AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect – Associate.
- QA or testing credentials: ISTQB Advanced Test Analyst, Certified Agile Tester, or DHS Trusted Tester v5 for Section 508.
- Hands-on expertise with big-data and analytics tools (PySpark, Glue ETL, Redshift, Athena) and data-quality frameworks (Great Expectations, dbt tests).
- Experience shepherding systems through FedRAMP Moderate / DoD IL-4 ATO processes and authoring security-control evidence.
- Familiarity with DORA/Accelerate metrics and SRE practices to improve pipeline reliability and lead time.