
Senior Data Engineer
- Oklahoma City, OK
- Permanent
- Full-time
- Create and maintain an optimal data architecture and data pipelines designed to scale with current and future enterprise needs.
- Organize and assemble large, complex data sets from multiple sources (Jack Henry SilverLake, Salesforce, etc.) into a data warehouse, data marts, and data cubes that meet business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, improving data quality controls, and designing data infrastructure for scalability and resilience.
- Build the infrastructure and tools required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL, Python, AWS Glue, and other technologies.
- Work with team members to support data analytics and forecasting that utilize our data pipelines to provide actionable insights into operational efficiency, customer behavior, and key performance indicators.
- Work with stakeholders to resolve data-related technical issues and support data infrastructure needs across departments.
- Ensure that data processes are properly segmented, encrypted, and secured across network boundaries (on-prem and cloud) through AWS S3, Aurora PostgreSQL, and AWS Glue workflows.
- Collaborate with analytics, engineering, and business subject matter experts to support system enhancements and improved business reporting.
- Develop and support deployment operations (DevOps) and data operations (DataOps) principles and workflows.
- Ensure compliance with industry regulations and with bank policies and procedures.
- 4+ years of experience in a Data Engineer, Data Architect, and/or Data Analyst role.
- 4+ years of experience in the banking industry.
- 2+ years of experience in programming with Python.
- Advanced working knowledge of SQL and experience with relational databases, complex query authoring, and graph databases.
- Experience building and optimizing scalable data pipelines, data architectures, and data sets.
- Strong analytic skills related to working with structured and semi-structured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency tracking, and workload management.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Excellent interpersonal skills for working with technical and non-technical colleagues across the organization.
- Experience with the following is preferred:
- Core Banking Platforms: Jack Henry SilverLake
- Data Integration: Salesforce, use of REST and SOAP APIs
- Databases: PostgreSQL, SQL Server, AWS Neptune or Neo4j
- Programming Languages: Python, PowerShell Scripting
- Query Languages: SQL & T-SQL, PostgreSQL & PL/pgSQL, Cypher, Gremlin, or SPARQL
- Cloud Platforms: AWS (Aurora PostgreSQL, Redshift, Neptune, Glue, S3, EC2)
- BI & Visualization: DOMO, Power BI
- Must be able to work within a routine office environment.
- Ability to travel from one office location to another.