MLOps Engineer

3 weeks ago


Nashville, United States | Diverse Lynx | Full time
Technical/Functional Skills
  • Proficient in writing Spark code in the following languages: Python, Scala, or SQL
  • Well versed in the concepts of Data/Delta Lake
  • Experience with Big Data components such as PySpark, Spark SQL, and DataFrames
  • Implement end-to-end data pipelines.
  • Implement ML pipelines to train, build, test, and deploy predictive models.
  • Run tests, perform statistical analysis, and interpret test results.
  • Experience developing MLOps solutions using cloud-native services.
  • Working experience in an Agile delivery model
  • Experience with cloud-based architectures
  • Experience working with large-scale data processing services (Hadoop)
  • Automotive and manufacturing domain knowledge


Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.
  • Sr. MLOps Engineer

    4 weeks ago


    Nashville, United States | Diverse Lynx | Full time

    Role: MLOps Engineer. Location: Chicago, IL. The main goal is to create and deploy a complete MLOps framework based on the Azure Machine Learning platform. The goal is NOT to build a model, but to build and document an MLOps framework for how to build, deploy, and monitor models in the AML platform. Azure ML development...

  • ML Ops Engineer

    1 week ago


    Nashville, United States | Diverse Lynx | Full time

    Role: MLOps Engineer (Contract). Location: Nashville, TN. Relevant experience: 10+ years. Technical/Functional Skills: proficient in writing Spark code in the following languages: Python, Scala, or SQL; well versed in the concepts of Data/Delta Lake; experience with Big Data components such as PySpark, Spark SQL, DataFrames...