Current jobs related to Sr. Data Engineer - Dearborn - EPITEC

  • Sr Software Engineer

    4 weeks ago


    Dearborn, United States Tekvivid Inc Full time

    W2 Requirement! Looking for a Sr. Software Engineer / Data Engineer. Location: Dearborn, MI (Hybrid). GCP Certification Mandatory. Experience Required: 8 years of professional experience in: data engineering, data product development, and software product launches; at least three of the following languages: Java, Python, Spark, Scala, SQL, and experience performance...

  • Data Engineer

    4 weeks ago


    Dearborn, United States Tekvivid Inc Full time

    W2 Requirement! Looking for a Sr. Software Engineer / Data Engineer. Location: Dearborn, MI (Hybrid). Experience: 7 years. Skills Required: Python, SQL; 3 years of experience with ETL solutions; 3 years of experience with SQL/Oracle/Teradata; ability to lead and mentor associate engineers. Skills Preferred: Experience with Google Cloud Platform (GCP) ...

  • Sr Data Modeler

    4 weeks ago


    Dearborn, United States Tanisha Systems, Inc. Full time

    Job Title: Sr Data Modeler. Location: Dearborn, MI. Position Type: Full Time. Experience: 10+ Years. The candidate will be part of the Information / Data Architecture Practice supporting major initiatives in Marketing, Sales and Service, plus Portfolio. The position requires a strong architect with comprehensive knowledge of data...

  • Sr Data Architect

    4 weeks ago


    Dearborn, United States Tanisha Systems, Inc. Full time

    Job Title: Sr Data Architect. Location: Dearborn, MI. Position Type: Full Time. Experience: 10+ Years. The candidate will be part of the Information / Data Architecture Practice supporting major initiatives in Marketing, Sales and Service, plus Portfolio. The position requires a strong architect with comprehensive knowledge of data architecture, hands-on experience...

  • Sr. Data Scientist

    5 months ago


    Dearborn, United States Carhartt us Full time

    Summary: The Sr. Data Scientist will drive Carhartt’s analytics direction to help us gain a deeper understanding of every stage of our business, from demand to supply and the optimization of everything in between. This position will partner with business leaders and project owners to identify the appropriate analytics solution to answer their...

  • Senior Data Engineer

    2 weeks ago


    Dearborn, Michigan, United States Ford Motor Company Full time

    Job Description: Ford Motor Company is seeking a highly skilled Data Engineer to join our team. As a key member of our Data Platform and Engineering team, you will be responsible for designing and building data engineering solutions that support our business goals. Key Responsibilities: Collaborate with cross-functional teams to understand data engineering...


  • Senior Data Engineer

    Dearborn, Michigan, United States Apex Systems Full time

    Job Title: Senior Data Engineer. Job Summary: We are seeking a highly skilled Senior Data Engineer to join our team at Apex Systems. As a Senior Data Engineer, you will be responsible for designing, developing, and deploying big data solutions on Google Cloud Platform (GCP), integrating native GCP services and 3rd-party data technologies. Key...

  • Azure Data Engineer

    1 month ago


    Dearborn, United States Info Services Full time

    Job Summary: As a cloud engineer for Azure, you will be responsible for designing and implementing scalable, high-performance data engineering solutions using Azure Databricks and Azure Data Factory. Essential Job Functions / Responsibilities: Design and implementation of complex cloud solutions using Microsoft Azure, focusing on integration with Databricks...

  • Cloud Data Engineer

    1 month ago


    Dearborn, United States JRD Systems Full time

    Job Title: Cloud Data Engineer (Azure & Databricks). Base Location: Dearborn, Michigan (Hybrid). Duration: 15+ months. Interview Mode: Video call and in-person interview. Candidates must be local to Michigan. Job Description: As a cloud engineer for Azure, you will be responsible for designing and implementing scalable, high-performance data engineering...

  • Cloud Data Engineer

    7 days ago


    Dearborn, United States JRD Systems Full time

    Job Title: Cloud Data Engineer (Azure & Databricks). Base Location: Dearborn, Michigan (Hybrid). Duration: 15+ months. Interview Mode: Video call and in-person interview. Candidates must be local to Michigan. Job Description: As a cloud engineer for Azure, you will be responsible for designing and implementing scalable, high-performance data engineering solutions...

  • Data Engineer Lead

    1 month ago


    Dearborn, Michigan, United States Damco Solutions Full time

    Job Title: Data Engineer Lead. At Damco Solutions, we are seeking an experienced Data Engineer Lead to join our team. As a key member of our Data Factory Enablement Team, you will play a crucial role in enabling teams to build their solutions in the GCP Data Factory Platform. Key Responsibilities: Work as part of an implementation team from concept to...

  • Data Engineer General

    1 month ago


    Dearborn, Michigan, United States Ciber Full time

    Job Title: Data Engineer General. We are seeking a highly skilled Data Engineer General to join our team at HTC Global Services. As a Data Engineer General, you will be responsible for designing, building, and maintaining scalable and robust data pipelines on GCP using tools such as Apache Airflow, Cloud Composer, and Cloud Dataflow. Key Responsibilities: Design...

  • Azure Data Engineer

    3 weeks ago


    Dearborn, United States Droisys Full time

    About the Company: Droisys is an innovation technology company focused on helping companies accelerate their digital initiatives from strategy and planning through execution. We leverage deep technical expertise, Agile methodologies, and data-driven intelligence to modernize systems of engagement and simplify human/tech interaction. Amazing things happen when we...

  • GCP Data Engineer

    2 weeks ago


    Dearborn, Michigan, United States kyyba Full time

    Job Title: GCP Data Engineer. Job Summary: We are seeking a highly skilled GCP Data Engineer to join our team. As a GCP Data Engineer, you will be responsible for designing and implementing cloud solutions and Hadoop applications, and providing technical leadership to our software engineers. Key Responsibilities: Design and implement cloud solutions and Hadoop...

  • GCP Data Engineer

    1 month ago


    Dearborn, Michigan, United States Stefanini Group Full time

    Job Title: GCP Data Engineer. We are seeking an experienced GCP Data Engineer to join our team at Stefanini Group. As a GCP Data Engineer, you will be responsible for designing and building cloud analytics platforms to meet the ever-expanding business requirements of our clients. Key Responsibilities: Design and implement data pipelines using Google Cloud...

  • GCP Data Engineer

    3 weeks ago


    Dearborn, Michigan, United States Stefanini Group Full time

    Job Title: GCP Data Engineer. We are seeking an experienced GCP Data Engineer to join our team at Stefanini Group. As a GCP Data Engineer, you will be responsible for designing and building cloud analytics platforms to meet ever-expanding business requirements with speed and quality using lean Agile practices. Key Responsibilities: Analyze and manipulate large...


  • Data Software Engineer

    Dearborn, Michigan, United States Mindlance Full time

    Data Software Engineer Position: The Data Software Engineer position at Mindlance involves the design, implementation, testing, and launch of new applications for loading dealer data and generating analytical insights. This role requires a strong foundation in Big Data, with expertise in Python and SQL query language. Key Responsibilities: Design and implement...

  • Data Engineer Lead

    7 days ago


    Dearborn, Michigan, United States Damco Solutions Full time

    Job Description: The Data Engineer Lead will be responsible for providing consultative services to the Software Development and Database Engineering teams. This person will work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successfully deploying Ford's Data Platform. Key...


  • Senior Data Engineer

    Dearborn, Michigan, United States Damco Solutions Full time

    Senior Data Engineer: The Senior Data Engineer role at Damco Solutions involves creating pipelines and transformations on GCP. The ideal candidate will have expertise in all combinations of data sources, including streaming, batch, and relational data. Key skills include Dataflow, Airflow, Python, Terraform, and exposure to Looker. This is a 12-month contract...

Sr. Data Engineer

2 months ago


Dearborn, United States EPITEC Full time

Position Description:

Materials Management Platform (MMP) is a multi-year initiative aimed at transforming our Materials Requirement Planning & Inventory Management capabilities, and is part of a larger Industrial Systems IT Transformation effort. This position is responsible for designing and deploying a data-centric architecture in GCP for the Materials Management Platform, which will exchange data with multiple modern and legacy applications across Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.


Skills Required:

  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and GCP APIs.
  • Build ETL pipelines to ingest data from heterogeneous sources into our system (a minimal pipeline sketch follows this list).
  • Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
  • Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
  • Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
  • Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments.
  • Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures.
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Promptly address code quality issues using SonarQube, Checkmarx, Fossa, and Cycode throughout the development lifecycle.
  • Implement security measures and data governance policies to ensure the integrity and confidentiality of data.
  • Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives.
  • Develop and maintain documentation for data engineering processes, ensuring knowledge transfer and ease of system maintenance.
  • Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.
  • Provide mentorship and guidance to junior team members, fostering a collaborative and knowledge-sharing environment.
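
For context only (this sketch is not part of the posting): below is a minimal example of the kind of streaming ETL pipeline the list above describes, reading JSON messages from Pub/Sub and appending rows to BigQuery with Apache Beam, as typically run on Dataflow. The project, topic, table, and field names are hypothetical placeholders.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(message: bytes) -> dict:
        # Decode a Pub/Sub payload into a flat dict matching the target table schema.
        record = json.loads(message.decode("utf-8"))
        return {
            "material_id": record.get("material_id"),  # hypothetical field names
            "plant": record.get("plant"),
            "quantity": record.get("quantity"),
            "event_ts": record.get("event_ts"),
        }


    def run() -> None:
        # streaming=True keeps the pipeline running and consuming messages continuously.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/material-events")  # hypothetical topic
                | "ParseJson" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="my-project:materials.material_events",  # hypothetical table
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()

In practice such a job would be submitted with the DataflowRunner and the project, region, and temporary storage options set for the target GCP environment.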


Experience Required:

  • 8 years of professional experience in:
      • Data engineering, data product development, and software product launches.
      • At least three of the following languages: Java, Python, Spark, Scala, SQL, plus performance tuning experience.
  • 4 years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using:
      • Data warehouses like Google BigQuery.
      • Workflow orchestration tools like Airflow (a minimal DAG sketch follows this list).
      • Relational database management systems like MySQL, PostgreSQL, and SQL Server.
      • Real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
      • Microservices architecture to deliver large-scale, real-time data processing applications.
      • REST APIs for compute, storage, operations, and security.
      • DevOps tools such as Tekton, GitHub Actions, Git, GitHub, Terraform, and Docker.
      • Project management tools like Atlassian JIRA.
  • Automotive experience is preferred.
  • Support in an onshore/offshore model is preferred.
  • Excellent at problem solving and problem prevention.
  • Knowledge of and practical experience with Agile delivery.
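
Again for context only (not part of the posting): a minimal sketch of the batch-orchestration pattern referenced above, an Airflow DAG that loads a daily CSV extract from Cloud Storage into BigQuery. It assumes Airflow 2.4+ with the Google provider installed; the bucket, dataset, and table names are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    # One run per day; each run loads that day's extract into a BigQuery table.
    with DAG(
        dag_id="daily_material_load",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        load_materials = GCSToBigQueryOperator(
            task_id="load_materials_to_bq",
            bucket="my-landing-bucket",  # hypothetical bucket
            source_objects=["materials/{{ ds }}/*.csv"],  # objects partitioned by run date
            destination_project_dataset_table="my-project.materials.daily_inventory",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_TRUNCATE",
            autodetect=True,
        )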


Experience Preferred:

  • Experience in IDoc processing, APIs, and SAP data migration projects.
  • Experience working in an SAP S/4HANA environment.


Education Required:

  • Requires a bachelor’s or foreign equivalent degree in computer science, information technology, or a technology-related field.


Education Preferred:

  • Master's degree preferred.

Additional Information :

  • GCP Certification preferred
  • Hybrid with up to 4 days a week on site.