Are you looking for an opportunity to develop a data platform that will have an impact on the rapid exploitation and sharing of multi-INT information across the intelligence community? Solid platform development is a critical part of any program’s success, and you know how to do it right – scalable design with baked-in security. That’s why we need you, a developer with the skills to build a platform that will transform the integrated intelligence mission.
As a data platform developer on our team, you’ll design and develop the core data integration platform for our project from end to end. You’ll work with customers and end users to understand their mission, current architecture, and security requirements. With a focus on the customer’s goals, you’ll build a design that will scale to meet their evolving needs. Your technical expertise will be vital as you recommend tools and capabilities based on your research of the current environment and new technology. Your design will set the standard for future development, so you’ll craft an architecture that works smoothly with existing infrastructure without compromising security. As a technical leader, you’ll identify new opportunities to build platform-based solutions that help your customers meet their toughest challenges. This is a chance to use your deep OS knowledge and broaden your skill set into areas like cloud computing; large-scale data ingestion and processing; multiple data store types, including graph, time series, spatial, and other NoSQL stores; high-performance data streaming; data science; and the integration of novel mission applications and analytics. Join us as we develop software-based solutions to make a difference for the integrated intelligence mission.
Empower change with us.
Basic Qualifications:
- 4+ years of experience with Java and ETL engineering
- 2+ years of experience working in data ingestion, processing, and distribution
- 1+ years of experience developing and deploying data ingestion, processing, and distribution systems on AWS technologies
- Experience working with IC data sets and NoSQL databases, including Elasticsearch and HBase
- Experience with AWS data stores, including RDS for PostgreSQL, S3, or DynamoDB
- Experience working with pub/sub messaging technologies, including Apache Kafka
- Experience with Agile software development practices
- Top Secret clearance required
- HS diploma or GED
- Ability to obtain Security+ CE, SSCP, CCNA-Security, or GSEC Certification within 6 months of start date
Additional Qualifications:
- 4+ years of experience with Scala, MapReduce, Spark, or Hive
- 2+ years of experience with cognitive computing, data integration, data mining, Natural Language Processing, Hadoop platforms, or automating machine learning components
- 1+ years of experience with data mining using current methods and tools
- Experience with performing ETL activities using Apache NiFi
- Experience with graph data stores, time series databases, and other NoSQL technologies
- Knowledge of one or more of the following: Jira, Git, Kafka, Kubernetes, Rancher, or Docker
- Knowledge of data science tools and their integration with big data stores
- Knowledge of data security policies and guidelines
- TS/SCI clearance with a polygraph preferred
- BA or BS degree preferred
- AWS Certification
- Security+ CE, SSCP, CCNA-Security, or GSEC Certification
Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information; Top Secret clearance is required.
Build Your Career:
A challenging and dynamic work environment isn’t all we have to offer. When you join Booz Allen, you’ll have access to:
- experts in virtually every field
- a culture that focuses on supporting our employees
- opportunities that provide stability while offering variety
We’re an equal opportunity employer that empowers our people—no matter their race, color, religion, sex, gender identity, sexual orientation, national origin, disability, veteran status, or other protected characteristic—to fearlessly drive change.
Booz Allen Hamilton