
MLOps Architect

3 months ago


Saint Louis, United States · Aurotekcorp · Full time
Job Description

Title: MLOps Architect (Machine Learning / AI Architect)

Location: Saint Louis, MO

Job Type: Contract

Job Description:

Required skills: AWS, Python, and a workflow management tool such as Airflow, Kedro, or Luigi

Preferred/desired skills: Hadoop, Spark, or similar big data frameworks. Experience with graph databases is a plus.

Detailed Job Description:

1. Designing Cloud Architecture:

• As an AWS Cloud Architect, you will be responsible for designing cloud architectures, preferably on AWS, but potentially also on Azure or in multi-cloud environments.

• Your architecture should enable seamless scalability, flexibility, and efficient resource utilization for MLOps implementations.

2. Data Pipeline Design:

• Develop data taxonomy and data pipeline designs to ensure efficient data management, processing, and utilization across the AI/ML platform.

• These pipelines are critical for ingesting, transforming, and serving data to machine learning models.
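
For illustration only, a minimal sketch of this kind of ingest/transform/serve pipeline using Airflow (one of the required workflow tools, assuming Airflow 2.4+); the task bodies and storage locations are placeholders, not details of the actual platform:

```python
# Minimal Airflow DAG sketch: ingest -> transform -> publish features.
# Task bodies and S3 locations are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw_data():
    # e.g. pull raw events from a source system into a raw data zone
    print("ingesting raw data")


def transform_features():
    # e.g. clean and aggregate raw data into model-ready features
    print("building feature tables")


def publish_features():
    # e.g. write features where training and serving jobs can read them
    print("publishing features")


with DAG(
    dag_id="ml_feature_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_raw_data)
    transform = PythonOperator(task_id="transform", python_callable=transform_features)
    publish = PythonOperator(task_id="publish", python_callable=publish_features)

    ingest >> transform >> publish
```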

3. MLOps Implementation:

• Collaborate with data scientists, engineers, and DevOps teams to implement MLOps best practices.

• This involves setting up continuous integration and continuous deployment (CI/CD) pipelines for model training, deployment, and monitoring.
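
As one hedged example of what a stage in such a CI/CD pipeline might do, the sketch below is a stand-alone quality gate a build job could run before promoting a model; the metrics file path and threshold are illustrative assumptions, not the actual pipeline:

```python
# Illustrative CI/CD quality gate: fail the pipeline if the newly trained
# model does not meet a minimum accuracy threshold. The metrics file path
# and the threshold are hypothetical.
import json
import sys
from pathlib import Path

METRICS_FILE = Path("artifacts/metrics.json")  # written by the training job
MIN_ACCURACY = 0.85                            # example promotion threshold


def main() -> int:
    metrics = json.loads(METRICS_FILE.read_text())
    accuracy = metrics["accuracy"]
    if accuracy < MIN_ACCURACY:
        print(f"FAIL: accuracy {accuracy:.3f} below threshold {MIN_ACCURACY}")
        return 1
    print(f"PASS: accuracy {accuracy:.3f} meets threshold {MIN_ACCURACY}")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```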

4. Infrastructure as Code (IaC):

• Use tools like AWS CloudFormation or Terraform to define and provision infrastructure resources.

• Infrastructure as Code allows you to manage your cloud resources programmatically, ensuring consistency and reproducibility.
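
As a minimal Python-side sketch of the CloudFormation route, the snippet below provisions a single (hypothetical) artifact bucket from an inline template via boto3; stack and resource names are placeholders:

```python
# Minimal Infrastructure-as-Code sketch: create a CloudFormation stack that
# provisions one versioned S3 bucket for ML artifacts. Names are placeholders;
# real templates would live in version control.
import boto3

TEMPLATE_BODY = """
AWSTemplateFormatVersion: "2010-09-09"
Resources:
  ModelArtifactBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled
"""

cloudformation = boto3.client("cloudformation")
cloudformation.create_stack(
    StackName="mlops-artifact-storage",  # hypothetical stack name
    TemplateBody=TEMPLATE_BODY,
)
cloudformation.get_waiter("stack_create_complete").wait(
    StackName="mlops-artifact-storage"
)
print("stack created")
```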

5. Security and Compliance:

• Ensure that the MLOps architecture adheres to security best practices and compliance requirements.

• Implement access controls, encryption, and monitoring to protect sensitive data and models.
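
For instance, a hedged sketch of the kind of baseline controls this implies: default encryption and a public-access block on a (hypothetical) artifact bucket via boto3:

```python
# Illustrative security baseline for an ML artifact bucket: enforce default
# server-side encryption and block all public access. Bucket name is a
# placeholder.
import boto3

BUCKET = "mlops-artifact-storage-bucket"  # hypothetical bucket name
s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
        ]
    },
)

s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
print(f"hardened {BUCKET}")
```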

6. Performance Optimization:

• Optimize cloud resources for cost-effectiveness and performance.

• Consider factors like auto-scaling, load balancing, and efficient use of compute resources.
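
As a hedged illustration (assuming models are served on SageMaker endpoints, which the posting does not specify), target-tracking auto-scaling for an endpoint variant might look like this; the endpoint and variant names are placeholders:

```python
# Illustrative auto-scaling setup for a model endpoint. Assumes SageMaker
# hosting; resource names are placeholders.
import boto3

autoscaling = boto3.client("application-autoscaling")
resource_id = "endpoint/churn-model-endpoint/variant/AllTraffic"  # hypothetical

autoscaling.register_scalable_target(
    ServiceNamespace="sagemaker",
    ResourceId=resource_id,
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    MinCapacity=1,
    MaxCapacity=4,
)

autoscaling.put_scaling_policy(
    PolicyName="invocations-target-tracking",
    ServiceNamespace="sagemaker",
    ResourceId=resource_id,
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,  # invocations per instance before scaling out
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "SageMakerVariantInvocationsPerInstance"
        },
    },
)
```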

7. Monitoring and Troubleshooting:

• Set up monitoring and alerting for the MLOps infrastructure.

• Be prepared to troubleshoot issues related to infrastructure, data pipelines, and model deployments.
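
A minimal example of the alerting side, again assuming SageMaker-hosted models: a CloudWatch alarm on endpoint 5xx errors that notifies an assumed SNS topic (the endpoint name and topic ARN are placeholders):

```python
# Illustrative CloudWatch alarm: alert when a model endpoint returns 5xx
# errors. Endpoint name and SNS topic ARN are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")
cloudwatch.put_metric_alarm(
    AlarmName="churn-model-endpoint-5xx",
    Namespace="AWS/SageMaker",
    MetricName="Invocation5XXErrors",
    Dimensions=[
        {"Name": "EndpointName", "Value": "churn-model-endpoint"},
        {"Name": "VariantName", "Value": "AllTraffic"},
    ],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:mlops-alerts"],  # hypothetical
)
```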

8. Collaboration and Communication:

• Work closely with cross-functional teams, including data scientists, software engineers, and business stakeholders.

• Effective communication is essential to align technical decisions with business goals.

Activities

Strong experience in Python

Experience in data product development, analytical models, and model governance

Experience with AI workflow management tools such as Airflow, Kedro, or Luigi

Exposure to statistical modeling, machine learning algorithms, and predictive analytics.

Highly structured and organized work planning skills

Strong understanding of the AI development lifecycle and Agile practices

Proficiency in big data technologies like Hadoop, Spark, or similar frameworks (see the illustrative PySpark sketch after this list). Experience with graph databases is a plus.

Extensive experience working with cloud computing platforms, particularly AWS.

Proven track record of delivering data products in environments with strict adherence to security and model governance standards.
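
As referenced above, a minimal PySpark sketch of the kind of big-data feature preparation this calls for; the input path, columns, and output location are illustrative assumptions, not details of the actual environment:

```python
# Minimal PySpark feature-preparation sketch. Paths and column names are
# placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("feature-prep").getOrCreate()

# Read raw events (hypothetical location and schema).
events = spark.read.parquet("s3://raw-zone/events/")

# Aggregate per customer into simple model-ready features.
features = (
    events.groupBy("customer_id")
    .agg(
        F.count("*").alias("event_count"),
        F.avg("amount").alias("avg_amount"),
        F.max("event_ts").alias("last_event_ts"),
    )
)

# Persist features for downstream training and serving jobs.
features.write.mode("overwrite").parquet("s3://feature-zone/customer_features/")
spark.stop()
```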