
GenAI Engineer



Hartford, United States | Saransh | Full time

Role: GenAI Engineer

Location: Hartford, CT (or) Charlotte, NC (Onsite from Day 1)

Job Type: Contract

Job Description:

  • The Generative AI team comprises multiple cross-functional groups that work in unison to ensure a smooth transition from research activities to scalable solutions.
  • You will collaborate closely with our cloud, security, infrastructure, enterprise architecture, and data science teams to design and deliver essential functionality.

  • Design and build fault-tolerant infrastructure to support the Generative AI reference architecture (RAG, summarization, agents, etc.); a minimal retrieval sketch follows this list.
  • Ensure code is delivered without vulnerabilities by enforcing engineering practices, code scanning, etc.
  • Build and maintain IaC (Terraform/CloudFormation) and CI/CD (Jenkins) scripts, CodePipeline, uDeploy, and GitHub Actions.
  • Partner with shared-service teams such as Architecture, Cloud, and Security to design and implement platform solutions.
  • Collaborate with the data science team to develop a self-service internal Generative AI developer platform.
  • Design and build the data ingestion pipeline for fine-tuning LLM models.
  • Create Architecture-as-Code templates that implement the reference architecture's application topology.
  • Build a human-in-the-loop (HITL) feedback system for supervised fine-tuning.
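
To make the RAG responsibilities above concrete, here is a minimal, illustrative sketch of the retrieval step only. It is not the employer's actual stack: the embedding function is a toy hashing stand-in for a real embedding model (e.g., one hosted on SageMaker or Bedrock), the documents are made-up examples, and the in-memory list plays the role of a vector store such as OpenSearch.

```python
# Minimal sketch of the retrieval step in a RAG flow, for illustration only.
import hashlib
import math


def toy_embed(text: str, dim: int = 64) -> list[float]:
    """Hash each token into a fixed-size vector (placeholder for a real embedding model)."""
    vec = [0.0] * dim
    for token in text.lower().split():
        idx = int(hashlib.md5(token.encode()).hexdigest(), 16) % dim
        vec[idx] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))


# "Ingestion": embed documents once and keep them alongside their vectors.
# These documents are hypothetical examples.
documents = [
    "Claims summarization guidelines for adjusters",
    "Underwriting FAQ for commercial property policies",
    "Incident response runbook for the GenAI platform",
]
index = [(doc, toy_embed(doc)) for doc in documents]


def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (brute-force KNN)."""
    q = toy_embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]


if __name__ == "__main__":
    for doc in retrieve("how do I summarize a claim?"):
        print(doc)
```

In a production version of this pipeline, the brute-force KNN loop would typically be replaced by an ANN index (e.g., HNSW in OpenSearch), and the retrieved passages would be passed to an LLM for generation.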

Requirements:

  • Bachelor's degree in Computer Science, Computer Engineering, or a related technical field.
  • 4+ years of experience with AWS cloud.
  • At least 8 years of experience designing and building data-intensive solutions using distributed computing.
  • 8+ years building and shipping software and/or platform infrastructure solutions for enterprises.
  • Experience with CI/CD pipelines, Automated Testing, Automated Deployments, Agile methodologies, Unit Testing and Integration Testing tools.
  • Experience building scalable serverless applications (real-time/batch) on the AWS stack (Lambda + Step Functions); a minimal handler sketch follows this list.
  • Knowledge of distributed NoSQL database systems.
  • Experience with data engineering, ETL technology, and conversational UX is a plus.
  • Experience with HPCs, vector embeddings, and hybrid/semantic search technologies.
  • Experience with AWS OpenSearch, Step/Lambda Functions, SageMaker, API Gateway, and ECS/Docker is a plus.
  • Proficiency in customization techniques across the stages of the RAG pipeline, including model fine-tuning, retrieval re-ranking, and hierarchical navigable small world (HNSW) graphs, is a plus.
  • Strong proficiency in embeddings, ANN/KNN search, vector stores, database optimization, and performance tuning.
  • Extensive programming experience with Python and Java.
  • Experience with LLM orchestration frameworks such as LangChain and LlamaIndex.
  • Foundational understanding of Natural Language Processing and Deep Learning.
  • Excellent problem-solving skills and the ability to work in a collaborative team environment.
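
As a companion to the serverless requirement above, here is a minimal AWS Lambda handler sketch for a real-time, API Gateway-style endpoint. The event shape (a JSON "body" with a "query" field) and the summarize() helper are hypothetical stand-ins for whatever integration and model invocation a real deployment would use.

```python
# Minimal AWS Lambda handler sketch for a real-time inference-style endpoint.
import json


def summarize(text: str) -> str:
    """Placeholder for a call to an LLM endpoint (e.g., SageMaker or Bedrock)."""
    return text[:100]


def handler(event, context):
    try:
        body = json.loads(event.get("body") or "{}")
        query = body["query"]
    except (KeyError, json.JSONDecodeError):
        return {"statusCode": 400, "body": json.dumps({"error": "missing 'query'"})}

    result = summarize(query)
    return {"statusCode": 200, "body": json.dumps({"summary": result})}


if __name__ == "__main__":
    # Local smoke test with a fake API Gateway-style event.
    print(handler({"body": json.dumps({"query": "Summarize this claim note."})}, None))
```

For batch workloads, the same handler pattern is typically orchestrated by Step Functions, which fans work out across many Lambda invocations and handles retries.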