SRE Data Engineer
GEICO
- Chevy Chase, MD
- $76,000-236,500 per year
- Permanent
- Full-time
- Utilize programming languages such as SQL, C# or other object-oriented languages, Java, JS, Python, various SQL/NoSQL databases and data management tools
- Utilize various analytics and data processing tools to cleanse, stage, transform, prepare, and curate data to gain in-depth insight and modeling capability
- Engage in cross-functional collaboration throughout the entire software lifecycle
- Build product definition and leverage technical skills to drive towards the right solution
- Lead design sessions and code reviews with peers to elevate the quality of engineering across the organization
- Define, create, and support reusable application components/patterns from a business and technology perspective
- Mentor other engineers
- Consistently share best practices and improve processes within and across teams
- Analysis and estimation skills
- Strong problem-solving ability
- Strong oral and written communication skills
- Ability to excel in a fast-paced, startup-like environment
- Knowledge of CS data structures and algorithms
- Programming experience with one or more languages such as Python, Java, or Go
- Advanced experience with enterprise reporting tools such as Power BI or Grafana
- Familiarity with developing data pipelines (ETL/ELT) using big data platform tooling such as data lakes, Synapse, Airflow, or Snowflake
- Familiarity with HTML5, XML, and JSON
- Understanding of web service APIs with technologies such as REST and GraphQL
- Understanding of security protocols and products such as Active Directory, Windows Authentication, SAML, OAuth
- Understanding of the architecture and design (architecture, design patterns, reliability, and scaling) of new and current systems
- Understanding of Datacenter structure, capabilities, and offerings, including the Azure platform, and its native services
- Advanced understanding of DevOps concepts including Azure DevOps framework and tools
- Advanced understanding of infrastructure as code
- Knowledge of Git, CI/CD, and developer tooling across the software development life cycle (task management, source code, building, deployment, operations, real-time communication)
- Knowledge of open-source frameworks
- Advanced experience with Application Insights, Grafana, Splunk, Dynatrace, etc.
- Advanced understanding of Vulnerability reports (Veracode scans)
- Advanced understanding of monitoring concepts and tooling
- Advanced understanding of security protocols and products
- Advanced experience with shell scripting
- Solid understanding of container orchestration services including Docker and Kubernetes, and a variety of Azure tools and services
- Experience with relational and non-relational database technologies such as SQL, Oracle, Cosmos DB, NoSQL, Cassandra, MongoDB
- Experience with data ingestion/streaming tooling such as ADF (Azure Data Factory), Kafka, Fivetran, Spark, or similar technologies
- Advanced experience with metadata management, data ingestion, data management, data quality, and data lineage services and technologies
- Experience with SQL query language and relational database concepts
- Advanced understanding of API development and integration, the Spring Boot framework, and other Spring technologies
- Design and implement enterprise data governance solutions
- Design and implement data quality and data lineage solutions
- Scope, design, and build scalable, resilient distributed systems
- Experience with version control such as Git
- Experience with big data technologies such as HBase, Hive, Spark, Kafka, graph databases, and Cassandra
- Experience with Load test tooling (Gatling or equivalent)
- Proven understanding of microservices-oriented architecture and extensible REST and GraphQL APIs
- 4+ years of professional software or data analytics engineering experience
- 3+ years of experience with data visualization tools
- 3+ years of experience with AWS, GCP, Azure, or hybrid data center
- 2+ years of experience in open source frameworks
- Bachelor’s degree in Computer Science, Information Systems, or equivalent education or work experience
- Premier Medical, Dental and Vision Insurance with no waiting period**
- Paid Vacation, Sick and Parental Leave
- 401(k) Plan
- Tuition Reimbursement
- Paid Training and Licensures