GCP Architect

4 weeks ago


Pensacola, United States · Cat America · Full time
Job Description

Key Responsibilities:

  • Design and implement solutions for data center migrations, encompassing both applications and databases, to the cloud or alternative data centers.
  • Configure and provision cloud resources, including virtual machines, storage, networking, and security services, to support migration activities and post-migration operations.
  • Implement automation and orchestration solutions to streamline provisioning, deployment, and management of cloud infrastructure components.
  • Design and implement security controls, policies, and procedures to protect cloud-based assets and data, ensuring compliance with industry standards and regulatory requirements.
  • Optimize cloud infrastructure performance, scalability, and cost-effectiveness through resource right-sizing, load balancing, and auto-scaling configurations.
  • Document cloud architecture designs, migration plans, implementation procedures, and operational best practices, ensuring knowledge transfer and continuity of operations.
  • Provide guidance, mentorship, and technical expertise to cross-functional teams, fostering a culture of collaboration, innovation, and continuous improvement.
  • Develop scalable solutions on the Google Cloud Platform (GCP), leveraging its suite of services and tools.
  • Collaborate closely with stakeholders to gather requirements and define architecture blueprints for data engineering projects on GCP.
  • Lead the implementation of data engineering projects on GCP, including building pipelines, data lakes, and data warehouses.
  • Execute migrations from on-premises environments to GCP, ensuring a seamless transition and minimal disruption to operations.
  • Provide technical expertise and guidance to project teams, ensuring adherence to best practices and architectural standards.
  • Conduct performance optimization and tuning activities to enhance the efficiency and reliability of GCP-based solutions.

Must-Have Skills:

  • Demonstrated experience with GCP data engineering, including the implementation of data engineering projects on GCP/BigQuery.
  • Understanding of DevOps practices within GCP infrastructure.
  • Proven ability to design and architect scalable solutions in GCP, leveraging its suite of services such as Compute Engine, Bigtable, Pub/Sub, Dataflow, and others.
  • Hands-on experience in building data pipelines and ETL processes on GCP, using tools like Apache Beam, Apache Spark, or similar technologies.
  • Experience with migration initiatives from on-premises environments to GCP, including data migration strategies and best practices.
  • 13+ years of total experience in data engineering and architecture, with at least 3-4 years in an architectural role.
  • Healthcare experience (Payer/Provider).

Nice-to-Have Skills:

  • GCP Data Engineer certification, demonstrating proficiency and expertise in GCP data technologies.
  • Familiarity with other cloud platforms such as Azure or AWS, as well as hybrid cloud environments.
  • Strong understanding of modern data architecture principles, including data governance, data modeling, and metadata management.