Architect, Data Scientist, Data & AI Team (North America Remote)
JAGGAER
- Durham, NC
- Permanent
- Full-time
- Data & Feature Engineering
- Build scalable ingestion, ETL/ELT, and feature store pipelines across OpenSearch, Snowflake, Redshift, and Redis.
- Design semantic layers and vector indexes (Pinecone, OpenSearch) that power retrieval-augmented generation (RAG) and Agentic AI workflows.
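To illustrate the retrieval step such vector indexes support, here is a minimal, framework-agnostic sketch of RAG-style retrieval using cosine similarity over toy embeddings. The `embed()` stub and the sample documents are hypothetical placeholders, not JAGGAER APIs; in production the index would live in Pinecone or OpenSearch and the embeddings would come from a hosted model.

```python
import numpy as np

# Hypothetical stand-in for a real embedding model (e.g., one hosted on SageMaker).
def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.normal(size=384)

# Toy "vector index": in production this would be a Pinecone or OpenSearch index.
documents = [
    "Supplier onboarding checklist for new vendors",
    "Quarterly spend analysis by commodity category",
    "Contract renewal terms and negotiation playbook",
]
index = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the top-k documents by cosine similarity to the query."""
    q = embed(query)
    sims = index @ q / (np.linalg.norm(index, axis=1) * np.linalg.norm(q))
    return [documents[i] for i in np.argsort(sims)[::-1][:k]]

print(retrieve("How do we onboard a new supplier?"))
```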
- Model Development & Experimentation
- Prototype, train, and evaluate predictive, prescriptive, and generative models in Amazon SageMaker (plus open source frameworks).
- Implement automated A/B tests and champion/challenger experiments; translate findings into product requirements.
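As a sketch of the champion/challenger evaluation described above: a two-sided, two-proportion z-test comparing conversion rates of a champion and a challenger model, using only the standard library. All counts below are made-up illustration data, not real experiment results.

```python
from math import erf, sqrt

# Made-up outcome counts for a champion/challenger experiment.
champion_successes, champion_trials = 420, 10_000
challenger_successes, challenger_trials = 465, 10_000

def two_proportion_ztest(s1, n1, s2, n2):
    """Two-sided z-test for equality of two proportions."""
    p1, p2 = s1 / n1, s2 / n2
    pooled = (s1 + s2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

z, p = two_proportion_ztest(champion_successes, champion_trials,
                            challenger_successes, challenger_trials)
print(f"z = {z:.2f}, p = {p:.3f}")  # promote the challenger only if p is small
```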
- ML / LLM Ops
- Own CI/CD, monitoring, drift detection, and scalable inference for classical ML and LLM pipelines.
- Package models and agents into reusable microservices with Terraform, Docker, and Kubernetes.
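One common drift-detection technique (not necessarily the one used here) is the Population Stability Index (PSI) between a feature's training and serving distributions. A minimal numpy sketch, with synthetic data standing in for real feature logs:

```python
import numpy as np

def population_stability_index(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """PSI between a training (expected) and serving (actual) sample of one feature."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Avoid division by zero / log(0) in sparse bins.
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 50_000)   # feature at training time
serve = rng.normal(0.3, 1.1, 50_000)   # same feature in production, drifted
print(f"PSI = {population_stability_index(train, serve):.3f}")  # rule of thumb: > 0.2 suggests drift
```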
- Agentic Platform Integration
- Orchestrate multi-agent task flows (LangGraph, CrewAI, or equivalent) that call JAGGAER and third-party APIs.
- Collaborate with front-end teams to embed real-time analytics and AI insights into customer-facing apps.
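A framework-agnostic sketch of the kind of multi-agent task flow meant here: a router dispatches a request to specialist agents that, in a real system, would wrap LLM calls plus JAGGAER or third-party API tools. All agent names and routing logic below are hypothetical; LangGraph or CrewAI would replace this hand-rolled loop.

```python
from typing import Callable

# Hypothetical specialist "agents" -- in production these would call LLMs and
# JAGGAER / third-party APIs rather than return canned strings.
def spend_analyst(task: str) -> str:
    return f"[spend-analyst] analyzed spend for: {task}"

def contract_reviewer(task: str) -> str:
    return f"[contract-reviewer] reviewed contract terms for: {task}"

AGENTS: dict[str, Callable[[str], str]] = {
    "spend": spend_analyst,
    "contract": contract_reviewer,
}

def router(task: str) -> str:
    """Toy routing step; an LLM-based planner would make this decision in practice."""
    return "contract" if "contract" in task.lower() else "spend"

def run_flow(task: str) -> str:
    agent = AGENTS[router(task)]
    return agent(task)

print(run_flow("Summarize contract renewal risks for supplier ACME"))
```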
- Insight Generation & Storytelling
- Diagnose customer data issues; deliver visual analyses (Tableau, Superset, Streamlit, or R/Python) for executives and non-technical stakeholders.
- Champion data-driven decision making across Product, Services, and Go-to-Market teams.
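As a small example of the stakeholder-facing visual analysis named above, a minimal Streamlit sketch over made-up spend data (run with `streamlit run app.py`). The column names and figures are illustrative only.

```python
# app.py -- run with: streamlit run app.py
import pandas as pd
import streamlit as st

# Made-up spend data standing in for a real customer dataset.
spend = pd.DataFrame({
    "category": ["IT", "Facilities", "Logistics", "Marketing"],
    "annual_spend_usd": [1_200_000, 450_000, 780_000, 310_000],
})

st.title("Spend by Category")
st.bar_chart(spend.set_index("category"))
st.dataframe(spend)
```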
- Required Qualifications:
- Bachelor’s or Master’s in Computer Science, Statistics, Math, Data Science, or a related field.
- 10+ years designing and deploying production-grade ML or data engineering solutions.
- Proficiency in Python (Pandas, PySpark, scikit-learn, TensorFlow/PyTorch) and SQL.
- Hands-on work with at least two of the following platforms: OpenSearch, Snowflake, Redshift, Redis, Pinecone, SageMaker.
- Solid grounding in statistical modeling, supervised/unsupervised ML, and evaluation metrics.
- Experience with Linux, Git, CI/CD, Docker, and at least one orchestration framework (Airflow, Prefect, Kubeflow, or Dagster).
- Clear, concise communicator able to present complex analyses to senior leadership.
- Preferred Qualifications:
- Prior exposure to LLM fine-tuning, prompt engineering, or RAG pipelines.
- Experience deploying ML services on AWS (S3, ECS/EKS, Bedrock), Azure, or GCP.
- Knowledge of procurement, supply chain, IoT sensor, or ERP data domains.
- Familiarity with Agentic AI frameworks (LangChain Agents, CrewAI, Haystack, etc.).
- Track record of hackathon wins, open-source contributions, or published research.
- Work directly with the CDAO’s innovation team to shape the future of enterprise AI. Your models and agents will be embedded into JAGGAER’s enterprise procurement platform.
- Collaborate with world-class talent in a fast-paced, impact-driven culture.
- Enjoy flexibility, purpose-driven work, and competitive compensation.