
GCP Data Engineer

Capgemini

Capgemini, a global business and technology transformation partner, is seeking a GCP Data Engineer to contribute to the migration of legacy data warehouses to a Google Cloud-based data warehouse. The role involves designing, implementing, and delivering data solutions while collaborating with Data Product Managers and Data Architects.

Responsibilities

  • Contribute to the migration of legacy data warehouses to a Google Cloud-based data warehouse
  • Collaborate with Data Product Managers and Data Architects to design, implement, and deliver successful data solutions
  • Help architect data pipelines for the underlying data warehouse and data marts
  • Design and develop complex ETL pipelines in Google Cloud data environments
  • Work across our legacy tech stack (Teradata) and our new tech stack, which includes GCP data technologies such as BigQuery and Airflow, with SQL and Python as the primary languages
  • Maintain detailed documentation of your work and changes to support data quality and data governance
  • Support QA and UAT data testing activities
  • Support deployment activities to higher environments
  • Ensure high operational efficiency and quality of your solutions to meet SLAs and support our commitment to internal customers (Data Science and Data Analytics teams)
  • Be an active participant in and advocate of Agile/Scrum practices to ensure team health and drive process improvements

Skills

  • 8+ years of data engineering experience developing large data pipelines in complex environments
  • Very strong SQL skills and the ability to build complex data transformation pipelines using a custom ETL framework in a Google BigQuery environment
  • Exposure to Teradata and the ability to understand complex Teradata BTEQ scripts
  • Strong Python programming skills
  • Strong skills in building Airflow jobs and debugging issues
  • Ability to optimize queries in BigQuery
  • Hands-on experience with Google Cloud data technologies (GCS, BigQuery, Dataflow, Pub/Sub, Data Fusion, Cloud Functions)
  • Experience with cloud data warehouse technology, specifically BigQuery
  • Nice to have: experience with additional GCP technologies (GCS, Dataproc, Pub/Sub, Dataflow, Data Fusion, Cloud Functions)
  • Solid experience with job orchestration tools like Airflow and the ability to build complex jobs
  • Experience writing and maintaining large data pipelines using a custom ETL framework
  • Ability to automate jobs using Python
  • Familiarity with data modeling techniques and data warehousing standard methodologies and practices
  • Strong experience with code version control repositories like GitHub
  • Good scripting skills, including Bash and Python
  • Familiarity with Scrum and Agile methodologies
  • Problem solver with strong attention to detail and excellent analytical and communication skills
  • Ability to work in an onsite/offshore model and to lead a team

Benefits

  • Paid time off based on employee grade (A-F), as defined by policy: vacation (12-25 days, depending on grade), company-paid holidays, personal days, and sick leave
  • Medical, dental, and vision coverage (or provincial healthcare coordination in Canada)
  • Retirement savings plans (e.g., 401(k) in the U.S., RRSP in Canada)
  • Life and disability insurance
  • Employee assistance programs
  • Other benefits as provided by local policy and eligibility

Company Overview

  • Capgemini is a global company that provides consulting, technology, and digital transformation services. It was founded in 1967 and is headquartered in Paris, Île-de-France, France, with a workforce of more than 10,000 employees. Its website is https://www.capgemini.com.

Company H1B Sponsorship

  • Capgemini has a track record of offering H1B sponsorships: 2,856 in 2025, 3,012 in 2024, 3,424 in 2023, 4,392 in 2022, 3,311 in 2021, and 5,871 in 2020. Please note that this does not guarantee sponsorship for this specific role.

Job Type

Full Time

Location

New Jersey
