Current jobs related to Databricks Migration Engineer - McLean - Technology Ventures
- Data Engineer (1 month ago) | McLean, United States | Harmonia Holdings Group, LLC | Full time
  Harmonia Holdings Group, LLC is an award-winning, rapidly growing federal government contractor committed to providing innovative, high-performing solutions to our government clients and focused on fostering a workplace that encourages growth, initiative, creativity, and employee satisfaction. Responsibilities: Design, implement, and maintain MongoDB data...
- Senior Data Engineer (4 weeks ago) | McLean, United States | Infinitive Inc | Full time
  Job Title: Senior Data Engineer. We are seeking a highly skilled Senior Data Engineer to join our dynamic team at Infinitive Inc. About the Role: The ideal candidate will have a strong background in data engineering, with expertise in Databricks, DevOps tools (Jenkins/Terraform), and data modeling concepts (3NF, Dimensional, Data Vault). As a Senior Data...
- Senior Data Engineer (2 months ago) | McLean, United States | Infinitive Inc | Full time
  About the Role: We are seeking a highly skilled Senior Data Engineer to join our dynamic team at Infinitive Inc. The ideal candidate will have a strong background in data engineering, with expertise in Databricks, DevOps tools (Jenkins/Terraform), and data modeling concepts (3NF, Dimensional, Data Vault). Key Responsibilities: Data Engineering: Design, build, and...
- Senior Data Engineer (2 days ago) | McLean, United States | Infinitive Inc | Full time
  About the Role: We are seeking a highly skilled Senior Data Engineer to join our dynamic team at Infinitive Inc. The ideal candidate will have a strong background in data engineering, with expertise in Databricks, DevOps tools, and data modeling concepts. Key Responsibilities: Design, build, and maintain scalable data pipelines and ETL processes using Databricks...
- Senior Data Engineer (3 months ago) | McLean, United States | Capital One | Full time
  Plano 1 (31061), United States of America, Plano, Texas. Senior Data Engineer (Python, Kafka, Databricks, AWS). Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers,...
- Data Engineer (3 weeks ago) | McLean, United States | Idexcel | Full time
  Job Title: Data Engineer (Python to PySpark migration). Location: McLean, VA / Richmond, VA / Plano, TX (only locals). Duration: Contract. Required: Work is Python to PySpark migration. Manager is looking for 2-4 consultants and is open to a team-in-a-box kind of setup. The project needs to be completed by end of 2024 and there is not going to be any extension. Due to...
- Data Engineer (4 weeks ago) | McLean, United States | Idexcel | Full time
  Job Title: Data Engineer (Python to PySpark migration). Location: McLean, VA / Richmond, VA / Plano, TX (only locals). Duration: Short-term contract. Required: Work is Python to PySpark migration. Manager is looking for 2-4 consultants and is open to a team-in-a-box kind of setup. The project needs to be completed by end of 2024 and there is not going to...
- Data Engineer (7 days ago) | McLean, United States | Idexcel | Full time
  Job Title: Data Engineer (PySpark). Location: McLean, VA / Richmond, VA / Plano, TX / Chicago, IL (only locals). Duration: Contract. Work is Python to PySpark migration. Top skills: Python, PySpark, SQL, AWS, Snowflake, Databricks, ETL, Kafka, Unix (command/shell script), and ETL/DWH concepts.
- Senior Lead Software Engineer, Back End (4 weeks ago) | McLean, United States | Capital One | Full time
  Center 3 (19075), United States of America, McLean, Virginia. Senior Lead Software Engineer, Back End. Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers...
- Data Engineer (2 weeks ago) | McLean, United States | Idexcel | Full time
  Job Title: Data Engineer (PySpark). Location: McLean, VA / Richmond, VA / Plano, TX (only locals). Duration: Contract. Required: Work is Python to PySpark migration. Top skills: Python, PySpark, SQL, AWS, Snowflake, Databricks, ETL, Kafka, Unix (command/shell script), and ETL/DWH concepts.
- Cloud Migrations Consultant, WWPS FedCiv (4 weeks ago) | West McLean, VA, Fairfax County, VA; Virginia, United States | Amazon Web Services, Inc. | Full time
  Are you an experienced commercial or open-source database platform specialist? Are you an Oracle or SQL Server platform specialist? Have you worked on open-source database platforms like PostgreSQL, MySQL, MariaDB, or Cassandra? Have you migrated applications from one database engine to another? Do you like to engage with senior leadership and determine ways...
- Senior Software Engineer (1 month ago) | McLean, Texas, United States | Cloud BC Labs | Full time
  Job Opportunity. Position: Lead / Full Stack Engineer. Locations: Hybrid. Duration: Contract position. Job Summary: This senior engineer role focuses on full-stack development, driving innovation and efficiency in our projects. We are developing a content-driven UI (React, Next.js, TypeScript, AWS) to enhance product offerings for partners and clients. The project aims...
Databricks Migration Engineer
3 months ago
Responsibilities
- Architect and design solutions to meet functional and non-functional requirements.
- Lead the design, implementation, and optimization of our Databricks platform.
- Work closely with our data engineering team to ensure that our Databricks platform is optimized for performance, scalability, and reliability.
- Develop and maintain a comprehensive understanding of our data pipeline and data architecture.
- Collaborate with other teams to ensure that our Databricks platform is integrated with our other systems and technologies.
- Develop and maintain documentation for our Databricks platform, including architecture diagrams, deployment guides, and operational procedures.
- Provide guidance and support to our data engineering team on Databricks-related issues.
- Create and review architecture and solution design artifacts.
- Enforce adherence to architectural standards/principles, global product-specific guidelines, usability design standards, etc.
- Proactively guide engineering methodologies, standards, and leading practices.
- Guide engineering staff and review as-built configurations during the construction phase.
- Provide insight and direction on roles and responsibilities required for solution operations.
- Identify, communicate and mitigate Risks, Assumptions, Issues, and Decisions throughout the full lifecycle.
- Demonstrate strong analytical and technical problem-solving skills.
- Grow the Data Engineering business by helping customers identify opportunities to deliver improved business outcomes, then designing and driving the implementation of those solutions.
- Support and develop our people, including learning & development, certification, and career development plans.
- Provide technical governance and oversight for platform design and implementation.
- Apply technical foresight to evaluate new technologies and advancements.
- Lead the team in defining best practices and repeatable methodologies in Cloud Data Engineering, including data storage, ETL, data integration and migration, data warehousing, and data governance (a brief PySpark sketch follows this list).
- Bring hands-on technical experience with Azure, AWS, and GCP cloud data engineering services and solutions.
- Contribute to sales and pre-sales activities, including proposals, pursuits, demonstrations, and proof-of-concept initiatives.
- Develop go-to-market and service offering definitions for Data Engineering.
- Expand the business within existing accounts and help clients by building and sustaining strategic executive relationships, acting as their trusted business technology advisor.
- Position differentiated, custom solutions to clients based on market trends, their specific needs, and the supporting business cases.
- Build new data capabilities, solutions, assets, accelerators, and team competencies.
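
To give a concrete flavor of the pipeline and platform work these responsibilities describe, the following is a minimal, illustrative PySpark sketch of a bronze-to-silver Delta Lake step. It is a sketch only; the paths, table layout, and column names are hypothetical placeholders, not details taken from this posting.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative only: paths and column names below are hypothetical placeholders.
spark = (
    SparkSession.builder
    .appName("bronze_to_silver_orders")  # hypothetical job name
    .getOrCreate()
)

# Read raw (bronze) events previously landed as a Delta table.
bronze = spark.read.format("delta").load("/mnt/datalake/bronze/orders")

# Basic cleansing and conforming for the silver layer:
# deduplicate, normalize types, derive a partition column, drop bad records.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

# Write the curated table back as Delta, partitioned for downstream reads.
(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("/mnt/datalake/silver/orders")
)
```

In a production Databricks workspace, logic like this would typically run as a scheduled job or pipeline rather than a standalone script.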
Minimum qualifications
- Excellent technical architecture skills, enabling the creation of future-proof, complex global Platform solutions on Databricks.
- Excellent interpersonal communication and organizational skills are required to operate as a leading member of global, distributed teams that deliver quality services and solutions.
- Ability to rapidly gain knowledge of the organizational structure of the firm to facilitate work with groups outside of the immediate technical team.
- Familiar with solution implementation/management, service/operations management, etc.
- Maintains close awareness of new and emerging technologies and their potential application for service offerings and products.
- Bachelor’s Degree or equivalency (CS, CE, CIS, IS, MIS, or engineering discipline) or equivalent work experience.
- Experience in a Platform architecture role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms.
- Experience in architecting and designing technical solutions for cloud-centric solutions based on industry standards using IaaS, PaaS, and SaaS capabilities.
- Must have strong hands-on experience with cloud services such as ADF/Lambda and ADLS/S3, as well as security, monitoring, governance, and compliance.
- Must have experience designing platforms on Databricks.
- Experience with a cloud data warehouse such as Redshift or Snowflake.
- Previous experience leading an enterprise-wide Cloud Data Platform migration, with strong architectural and design skills.
- Hands-on experience designing and building Databricks-based solutions on any cloud platform.
- Hands-on experience designing and building solutions powered by dbt models integrated with Databricks (a minimal dbt model sketch follows this list).
- Must be very good at designing end-to-end solutions on a cloud platform.
- Must have good knowledge of data engineering concepts and related cloud services.
- Must have good experience in Python and Spark.
- Must have good experience in setting up development best practices.
- Good to have: knowledge of Docker and Kubernetes.
- Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO etc.
- Knowledge of cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
- Experience building and supporting mission-critical technology components with DR capabilities.
- Experience with multi-tier system and service design and development for large enterprises
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Exposure to infrastructure and application security technologies and approaches
- Familiarity with requirements gathering techniques.
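
As a hedged illustration of the dbt-with-Databricks experience called out above, the sketch below shows a minimal dbt Python model of the kind supported by the dbt-databricks adapter. The model name, upstream reference, and columns are hypothetical placeholders.

```python
# models/silver_orders.py -- minimal dbt Python model (dbt-databricks adapter).
# Model, reference, and column names are hypothetical placeholders.
import pyspark.sql.functions as F


def model(dbt, session):
    # Materialize this model as a table in the target schema.
    dbt.config(materialized="table")

    # Reference an upstream dbt model; on Databricks this resolves to a
    # Spark DataFrame at run time.
    bronze_orders = dbt.ref("bronze_orders")

    # Light transformation: keep valid rows and derive a date column.
    return (
        bronze_orders
        .filter(F.col("order_id").isNotNull())
        .withColumn("order_date", F.to_date("order_ts"))
    )
```

A model like this would normally sit alongside SQL models in the same dbt project and be built from CI with `dbt build` against a Databricks cluster or SQL warehouse.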
Preferred qualifications
- Must have designed end-to-end platform architecture on Databricks covering all aspects of the data lifecycle, from data ingestion and transformation through serving and consumption.
- Must have excellent coding skills in either Python or Scala, preferably Python.
- Must have experience in the data engineering domain.
- Must have designed and implemented at least 2-3 projects end-to-end in Databricks.
- Strong expertise in Apache Spark, Delta Lake, and other Databricks components for data processing and analytics (a Unity Catalog governance sketch appears at the end of this section), including:
  - Delta Lake
  - Databricks REST API 2.0
  - SQL Endpoint (Photon engine)
  - Unity Catalog
  - Security management
  - Platform governance
  - Data security
- Proficiency in AWS services including, but not limited to, S3, EC2, IAM, VPC, EKS, Lambda, Glue, PrivateLink, KMS, CloudWatch, and EMR.
- Must have knowledge of new features available in Databricks, their implications, and the use cases they enable.
- Strong expertise in designing SOX-compliant platform architecture.
- Must know how to manage multiple Databricks workspaces and their integration with other applications.
- Proficient in designing and implementing everything as code (see the job-as-code sketch at the end of this section), including:
  - Infrastructure as code
  - Configuration as code
  - Security configuration as code
- Must have strong expertise in designing platforms with strong observability and monitoring standards.
- Proficient in setting best practices for DevSecOps activities, including CI/CD.
- Must have knowledge of Databricks cluster optimization and its integration with various cloud services.
- Must have strong performance optimization skills to improve efficiency and reduce cost.
- Must have strong communication skills and experience working with cross-platform teams.
- Must have a great attitude toward learning new skills and upskilling existing ones.
- Responsible for setting best practices around Databricks CI/CD.
- Must understand composable architecture to take full advantage of Databricks capabilities.
- Good to have: REST API knowledge.
- Good to have: an understanding of cost distribution.
- Good to have: experience on a migration project building a unified data platform.
- Good to have: knowledge of dbt.
- Knowledge of full-lifecycle software development methodologies, patterns, frameworks, libraries, and tools.
- Knowledge of programming and scripting languages such as JavaScript, PowerShell, Bash, SQL, Java, Python, etc.
- Experience in distilling complex technical challenges to actionable decisions for stakeholders and guiding project teams by building consensus and mediating compromises when necessary.
- Experience coordinating the intersection of complex system dependencies and interactions
- Experience in solution delivery using common methodologies especially SAFe Agile but also Waterfall, Iterative, etc.
- Demonstrated knowledge of relevant industry trends and standards
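
To make the "everything as code" and Databricks API items above more concrete, here is a hedged sketch that declares a Databricks job in code and creates it through the Jobs REST API (version 2.1 here, the successor to the 2.0 API referenced above). The host, token handling, notebook path, and cluster settings are hypothetical placeholders; a real deployment would more likely drive this from Terraform, the Databricks CLI, or the official SDK inside a CI/CD pipeline.

```python
# job_as_code.py -- hedged sketch: declare a Databricks job in code and create
# it via the Jobs REST API. Host, token, notebook path, and cluster settings
# are hypothetical placeholders.
import os

import requests

DATABRICKS_HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
DATABRICKS_TOKEN = os.environ["DATABRICKS_TOKEN"]  # injected as a CI/CD secret

# The job definition lives in version control, so changes go through code review.
job_definition = {
    "name": "silver-orders-refresh",  # hypothetical job name
    "tasks": [
        {
            "task_key": "bronze_to_silver",
            "notebook_task": {
                "notebook_path": "/Repos/data-platform/etl/bronze_to_silver"  # hypothetical
            },
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",  # placeholder runtime version
                "node_type_id": "i3.xlarge",          # placeholder node type
                "num_workers": 2,
            },
        }
    ],
}

response = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
    json=job_definition,
)
response.raise_for_status()
print("Created job_id:", response.json()["job_id"])
```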
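
Similarly, the Unity Catalog, security management, and data security items can be treated as code. The sketch below applies catalog grants through Spark SQL; the catalog, schema, table, and group names are hypothetical, and in practice such statements would usually be applied by an automated deployment step rather than run ad hoc.

```python
# governance_as_code.py -- hedged sketch: apply Unity Catalog grants via Spark SQL.
# Catalog, schema, table, and principal names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

# Grants are declared as data so they can be reviewed and versioned like any other code.
grants = [
    ("USE CATALOG", "CATALOG", "main", "data_engineers"),
    ("USE SCHEMA", "SCHEMA", "main.sales", "data_engineers"),
    ("SELECT", "TABLE", "main.sales.silver_orders", "analysts"),
    ("MODIFY", "TABLE", "main.sales.silver_orders", "data_engineers"),
]

for privilege, securable_type, securable, principal in grants:
    # Unity Catalog accepts standard SQL GRANT statements on catalogs,
    # schemas, and tables; group names are quoted with backticks.
    spark.sql(f"GRANT {privilege} ON {securable_type} {securable} TO `{principal}`")
```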