Current jobs related to Hadoop ETL Developer - Charlotte, North Carolina - Motion Recruitment


  • Charlotte, North Carolina, United States Motion Recruitment Full time

    Senior Data Integration Developer: We are seeking a highly skilled Senior Data Integration Developer to join our team in Charlotte, NC. As a key member of our data integration team, you will be responsible for designing, developing, and deploying data integration solutions that meet the needs of our business stakeholders. Key Responsibilities: Design and develop...

  • Hadoop Developer

    4 weeks ago


    Charlotte, North Carolina, United States Pronix Inc Full time

    Job Title: Oracle/Hadoop Developer. Job Summary: We are seeking an experienced Oracle/Hadoop Developer to join our team at Pronix Inc. The successful candidate will be responsible for designing, developing, and maintaining software for the ECR R&A team, collaborating with the development team and business partners to ensure successful delivery and...


  • Charlotte, North Carolina, United States The Dignify Solutions LLC Full time

    Job Summary: We are seeking a highly skilled Senior Hadoop Developer to join our team at The Dignify Solutions LLC. The ideal candidate will have a strong background in Hadoop/Big Data technologies and experience working with Cloudera's Distribution including Apache Hadoop (CDH) and the Cloudera Data Platform (CDP). Key Responsibilities: Design and develop large-scale data processing pipelines...


  • Charlotte, North Carolina, United States The Dignify Solutions LLC Full time

    Hadoop Developer Job Description: We are seeking a highly skilled Hadoop Developer to join our team at The Dignify Solutions LLC. The ideal candidate will have a strong background in Hadoop/Big Data technologies, including HDFS, Sqoop, Hive, Pig, Spark, Impala, and Oozie. Key Responsibilities: Design and develop large-scale data processing pipelines using Hadoop...

  • Senior ETL Developer

    4 weeks ago


    Charlotte, North Carolina, United States Motion Recruitment Full time

    Key Responsibilities: Collaborate with Data Modelers to design data requirements for reporting and analytics; develop and debug ETL integration code to meet defined source-to-target mappings; handle complex ETL requirements and design; work with business stakeholders to build database objects to meet desired output; build and govern the Hadoop data layer to ensure...


  • Charlotte, North Carolina, United States Aloden, Inc. Full time

    Job Title: Big Data Hadoop Developer. Job Summary: We are seeking a highly skilled Big Data Hadoop Developer to join our team at Aloden, Inc. As a key member of our software engineering team, you will be responsible for designing and implementing automated Spark-based frameworks to facilitate data ingestion, transformation, and consumption. Key...

  • Hadoop Engineer

    3 weeks ago


    Charlotte, North Carolina, United States The Dignify Solutions LLC Full time

    Job Description: We are seeking a highly skilled Hadoop Developer to join our team at The Dignify Solutions LLC. Key Responsibilities: Design and develop large-scale data processing systems using Hadoop and related technologies; work with Cloudera's Distribution including Apache Hadoop (CDH) and the Cloudera Data Platform (CDP) to ensure seamless data processing and integration; implement data...


  • Charlotte, North Carolina, United States TechnoGen Full time

    Job Title: Senior Java Developer with Hadoop Experience. Job Summary: We are seeking a highly skilled Senior Java Developer with Hadoop experience to join our team at TechnoGen. Key Responsibilities: Develop, support, and maintain the Enterprise Credit Risk ETL platform; design, develop, and test software for the ECR RCF team; collaborate with the development...


  • Charlotte, North Carolina, United States TechnoGen Full time

    Job Title: Senior Java Developer with Hadoop Experience. We are seeking a highly skilled Senior Java Developer with experience in Hadoop and Agile development to join our team at TechnoGen. Key Responsibilities: Design, develop, maintain, and test software for the ECR RCF team; collaborate with the development team and business partners to ensure successful...


  • Charlotte, North Carolina, United States Lorven technologies Full time

    Job Title: Hadoop Data Engineer. Location: Charlotte, NC. Duration: Long term. Job Description: Responsibilities: Design, develop, and maintain Hadoop-based data processing applications; build, operate, monitor, and troubleshoot Hadoop infrastructure; develop tools and libraries to support data access and processing; write and optimize MapReduce programs; implement data...

  • Data Engineer

    2 weeks ago


    Charlotte, North Carolina, United States Randstad US Full time

    Job Title: Data Engineer - Hadoop/Google Cloud. Job Summary: We are seeking an experienced Data Engineer to join our team at Randstad US. The successful candidate will have a strong background in Hadoop and Google Cloud Platform, with a focus on designing and optimizing complex SQL queries and ETL processes. Responsibilities: Migrate data to Google Cloud...


  • Charlotte, North Carolina, United States Motion Recruitment Partners LLC Full time

    SQL SSIS ETL Developer Job Description: We are seeking a highly skilled SQL SSIS ETL Developer to join our team at Motion Recruitment Partners LLC. As a key member of our technology team, you will be responsible for designing, coding, testing, debugging, and documenting complex technology solutions for our clients. Key Responsibilities: Lead complex technology...

  • Data Engineer

    2 weeks ago


    Charlotte, North Carolina, United States Randstad Digital Full time

    Job Title: Data Engineer - Hadoop/Google Cloud. Job Summary: We are seeking an experienced Data Engineer to join our team at Randstad Digital. The successful candidate will have a strong background in Hadoop and Google Cloud Platform, with a proven track record of designing and implementing large-scale data processing systems. Key Responsibilities: Migrate data...

  • Data Engineer

    7 days ago


    Charlotte, North Carolina, United States Randstad Full time

    Job Title: Data Engineer - Hadoop/Google Cloud. Job Summary: We are seeking an experienced Data Engineer to join our team at Randstad Digital. The ideal candidate will have a strong background in Hadoop and Google Cloud Platform, with a proven track record of designing and optimizing complex data systems. Responsibilities: Migrate data to Google Cloud...

  • Data Engineer

    4 days ago


    Charlotte, North Carolina, United States Randstad Full time

    Job Title: Data Engineer - Hadoop/Google Cloud Professional. Job Summary: We are seeking an experienced Data Engineer to join our team at Randstad Digital. The ideal candidate will have a strong background in Hadoop and Google Cloud Platform, with a proven track record of designing and optimizing complex data systems. Responsibilities: Migrate data to...

  • Senior Data Engineer

    2 weeks ago


    Charlotte, North Carolina, United States Randstad Digital Full time

    Job Summary: We are seeking an experienced Data Engineer to join our team at Randstad Digital. The ideal candidate will have a strong background in data engineering, with expertise in Hadoop, Google Cloud Platform, and SQL. Key Responsibilities: Migrate data to Google Cloud Platform; design and optimize complex SQL queries; develop and implement ETL processes; work...


  • Charlotte, North Carolina, United States raag solutions Full time

    Job Description: We are seeking a highly skilled Hadoop Developer with expertise in SQL and Java to join our team at Raag Solutions. Key Responsibilities: Design, build, and test complex data processing systems using Hadoop and Java; develop and maintain large-scale data warehouses using Hadoop, SQL, and Unix; work with multiple database platforms, including SQL...


  • Charlotte, North Carolina, United States raag solutions Full time

    Job Description: We are seeking a highly skilled Hadoop Developer with expertise in SQL and Java to join our team at Raag Solutions. Key Responsibilities: Design, build, and test complex data processing systems using Hadoop and Java; develop and maintain large-scale data warehouses using SQL and Hadoop; collaborate with cross-functional teams to identify and...


  • Charlotte, North Carolina, United States Aloden, Inc. Full time

    Job Summary: We are seeking a highly skilled Senior Data Integration Developer to join our team in Charlotte, NC. The successful candidate will have a strong understanding of data management, dimensional data modeling, and data integration solutions. The role will involve working closely with Data Modelers to structure data requirements into a data warehouse...

  • Big Data Developer

    4 days ago


    Charlotte, North Carolina, United States Collabera Full time

    Hadoop Developer Job Description: We are seeking a skilled Hadoop Developer to join our team at Collabera. The ideal candidate will have experience working in MapR, Hadoop, Scala, PySpark, and Java, with a strong background in application development and implementation. Key Responsibilities: Design and develop big-data architecture using Hadoop; implement ETL,...

Hadoop ETL Developer

3 months ago


Charlotte, North Carolina, United States Motion Recruitment Full time

Outstanding long-term contract opportunity: a well-known financial services company is looking for a Hadoop ETL Engineer in Charlotte, NC (hybrid).

Work with the brightest minds at one of the largest financial institutions in the world.

This is a long-term contract opportunity that includes a competitive benefits package. Our client has been around for over 150 years and is continuously innovating in today's digital age.

If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.

Contract Duration: 12 months plus

Job Expectations:

  • Design and implement an automated Spark-based framework to facilitate data ingestion, transformation, and consumption (a minimal sketch follows this list).
  • Implement security protocols such as Kerberos authentication, encryption of data at rest, and data authorization mechanisms such as role-based access control using Apache Ranger.
  • Design and develop an automated testing framework to perform data validation.
  • Enhance existing Spark-based frameworks to overcome tool limitations and/or to add features based on consumer expectations.
  • Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, MongoDB, Kafka, and object storage architecture.
  • Work with Infrastructure Engineers and System Administrators as appropriate in designing the big-data infrastructure.
  • Collaborate with application partners, Architects, Data Analysts and Modelers to build scalable and performant data solutions.
  • Effectively work in a hybrid environment where legacy ETL and data warehouse applications co-exist with new big-data applications.
  • Support ongoing data management efforts for Development, QA, and Production environments.
  • Provide tool support and help consumers troubleshoot pipeline issues.
  • Utilize a thorough understanding of available technology, tools, and existing designs.
  • Leverage knowledge of industry trends to build best-in-class technology that provides a competitive advantage.
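
To make the first expectation concrete, here is a minimal PySpark sketch of the kind of Spark-based ingest-transform-publish step the role describes. It is an illustration only; the paths, the "trades" dataset, and the column names are hypothetical and not taken from the posting.

    # Minimal sketch of a Spark-based ingestion and transformation step.
    # All paths, dataset names, and columns below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("example-ingestion-job")
        .getOrCreate()
    )

    # Ingest: read raw files landed in HDFS or object storage.
    raw = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("hdfs:///landing/trades/2024-06-01/")
    )

    # Transform: basic cleansing and derivation of a partition column.
    cleaned = (
        raw.dropDuplicates(["trade_id"])
        .withColumn("trade_date", F.to_date("trade_timestamp"))
        .filter(F.col("notional") > 0)
    )

    # Publish: write columnar Parquet, partitioned for downstream consumers.
    (
        cleaned.write
        .mode("overwrite")
        .partitionBy("trade_date")
        .parquet("hdfs:///curated/trades/")
    )

    spark.stop()

In practice a job like this would be packaged and launched with spark-submit; on a Kerberized YARN cluster that typically means supplying a principal and keytab (spark-submit's --principal and --keytab options), which ties into the security expectations listed above.
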
Required Qualifications

  • 5+ years of software engineering experience
  • 5+ years of experience delivering complex enterprise-wide information technology solutions
  • 5+ years of experience delivering ETL, data warehouse and data analytics capabilities on big-data architecture such as Hadoop
  • 5+ years of Apache Spark design and development experience using Scala, Java, or Python, with DataFrames or Resilient Distributed Datasets (RDDs) and Parquet or ORC file formats
  • 6+ years of ETL (Extract, Transform, Load) programming experience
  • 2+ years of Kafka or equivalent experience
  • 2+ years of experience with a NoSQL database such as Couchbase or MongoDB
  • 5+ years of experience working with complex SQL and performance tuning
Desired Qualifications

  • 3+ years of Agile experience
  • 2+ years of reporting experience, analytics experience or a combination of both
  • 2+ years of operational risk, credit risk, or compliance domain experience
  • 2+ years of experience integrating with RESTful APIs
  • 2+ years of experience with CI/CD tools