
Data Engineer - PGIM Global Services (Tampa, FL - Hybrid)
- Tampa, FL
- Permanent
- Full-time
- Build and optimize data pipelines, logic, and storage systems using current coding practices, industry standards, modern design patterns, and architectural principles; remove technical impediments.
- Develop high-quality, well-documented, and efficient code adhering to all applicable Prudential standards.
- Conduct complex data analysis and report on results, prepare data for prescriptive and predictive modeling, and combine raw information from different sources.
- Collaborate with data analysts, scientists, and architects on data projects to improve data acquisition, transformation, and organization processes as well as data reliability, efficiency, and quality.
- Write unit tests, integration tests, and functional automation; research problems discovered by quality assurance or product support and develop solutions to address them (see the pytest sketch after the qualifications list below).
- Bring a strong understanding of relevant and emerging technologies, provide input and coach team members, and embed learning and innovation in the day-to-day.
- Work on complex problems in which analysis of situations or data requires an evaluation of intangible variables.
- Use programming languages including but not limited to Python, R, SQL, Java, Scala, PySpark/Apache Spark, and shell scripting.
- Bachelor's degree in Computer Science or Engineering, or equivalent experience in a related field.
- Experience working with DevOps automation tools and practices; knowledge of the full software development life cycle (SDLC).
- Ability to perform with minimal guidance and to effectively leverage diverse ideas, experiences, thoughts, and perspectives for the benefit of the organization.
- Knowledge of business concepts, tools, and processes that are needed for making sound decisions in the context of the company's business.
- Ability to learn new skills and knowledge on an ongoing basis through self-initiative and tackling challenges.
- Excellent problem-solving, communication, and collaboration skills, paired with enthusiasm for learning new ones!
- Advanced experience and/or expertise with several of the following:
- Programming Languages: Python, R, SQL, Java, Scala, PySpark/Apache Spark, shell scripting.
- Data Ingestion, Integration & Transformation: Moving data of varied sources, formats, and volumes into analytics platforms using various tools; preparing data for further analysis by transforming, mapping, and wrangling raw data to generate insights (a minimal PySpark sketch follows this list). Extensive knowledge of Microsoft Fabric.
- Database Management Systems: Storing, organizing, managing, and delivering data using relational, NoSQL, and graph databases and data warehouse technologies, including Azure SQL Database and Azure Synapse Analytics.
- Database Tools: Data architecture to store, organize, and manage data; experience with SQL- and NoSQL-based databases for storing and processing structured, semi-structured, and unstructured data.
- Real-Time Analytics: Azure Stream Analytics, Azure Event Hubs.
- Data Buffering: Azure Event Hubs, Azure Service Bus.
- Workflow Orchestration: Azure Data Factory, Azure Logic Apps.
- Data Visualization: Power BI, MS Excel.
- Data Lakes & Warehousing: Building Data Models, Data Lakes, and Data Warehousing using Azure Data Lake Storage and Azure Synapse Analytics.
- Data Protection and Security: Knowledge of data protection, security principles, and services; data loss prevention, role-based access controls, data encryption, data access capture, and core security services.
- Common Infrastructure as Code (IaC) Frameworks: Azure Resource Manager (ARM) templates, Terraform.
- Cloud Computing: Knowledge of fundamental Azure architectural principles and services; strong ability to write and deploy code using Azure services; end-to-end knowledge of Microsoft Fabric.
- Testing/Quality: Unit, interface, and end-user testing concepts and tooling, inclusive of non-functional requirements (performance, usability, reliability, security/vulnerability scanning, etc.), including how testing is integrated into DevOps; accessibility awareness.
- Serverless data pipeline development using Azure Functions and Azure Logic Apps (see the Azure Functions sketch after this list).
- Relevant certifications in Azure technologies.
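
To make the testing expectation above concrete, here is a minimal pytest sketch of the unit-test style described. The `transform_amount` helper and its behavior are hypothetical examples, not part of the role description.

```python
# Minimal pytest sketch; transform_amount is a hypothetical helper.
import pytest

def transform_amount(raw: str) -> float:
    """Hypothetical helper: parse a currency string into a float."""
    return float(raw.replace("$", "").replace(",", ""))

def test_transform_amount_parses_currency():
    # Happy path: formatted currency string becomes a numeric value.
    assert transform_amount("$1,234.50") == 1234.50

def test_transform_amount_rejects_garbage():
    # Malformed input should fail loudly rather than return a bad value.
    with pytest.raises(ValueError):
        transform_amount("not-a-number")
```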
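The ingestion and transformation work described in the qualifications might look like the following minimal PySpark sketch; the file paths, column names, and "orders" dataset are hypothetical.

```python
# Minimal PySpark ingestion/transformation sketch; paths and columns are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Ingest: read raw CSV files from a landing zone.
raw = spark.read.option("header", True).csv("/landing/orders/*.csv")

# Transform: normalize types, derive a partition column, drop malformed rows.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropna(subset=["order_id", "amount"])
)

# Load: write a partitioned Parquet layout for downstream analytics.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/curated/orders")
```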
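The serverless pipeline item could be sketched with the Azure Functions Python v2 programming model as below; the container path and connection setting name are assumptions for illustration.

```python
# Minimal Azure Functions (Python v2 model) sketch of a serverless pipeline step.
import logging
import azure.functions as func

app = func.FunctionApp()

@app.blob_trigger(arg_name="blob", path="landing/{name}",
                  connection="AzureWebJobsStorage")
def ingest_blob(blob: func.InputStream):
    # Fires when a new file lands in the (hypothetical) "landing" container;
    # a real pipeline would parse, validate, and forward the data onward,
    # e.g., to Event Hubs or a curated store.
    logging.info("Processing %s (%d bytes)", blob.name, blob.length)
```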