Sr. GCP Data Engineer
Irving, TX

Job Type: Contract (C2C)
Visa: Any visa status
Rate: $50/hr (C2C)


Project Duration: 12 months

 

KEY RESPONSIBILITIES:

•     Lead and guide the data engineering team to deliver projects from concept to completion.

•     Collaborate with business stakeholders to understand and translate business requirements into technical solutions.

•     Redesign and optimize existing data pipelines and architectures as needed to ensure high performance and scalability.

•     Oversee the end-to-end data delivery process, from ingestion to transformation and reporting.

•     Perform code reviews, enforce best practices, and ensure the quality and consistency of the codebase.

•     Manage workflows and scheduling with tools like Apache Airflow to ensure smooth execution of pipelines.

•     Use GCP technologies like Dataflow, Apache Beam, BigQuery, Dataproc, and other services for data transformation, storage, and analysis.

•     Develop scalable solutions with Apache Spark, Hadoop, and other distributed systems on GCP.

•     Monitor and optimize the performance of data pipelines and processes.

•     Collaborate with DevOps and Cloud teams for CI/CD integration and infrastructure optimization.

•     Stay updated on the latest developments in GCP services and recommend improvements to existing processes.

•     Provide mentorship to junior engineers and manage the team’s performance effectively.

 

REQUIRED SKILLS AND EXPERIENCE:

•     12+ years of experience in data engineering, including 3-5 years on Google Cloud Platform (GCP).

•     Solid understanding of network/telecom domains, including relevant data types and use cases.

•     Expertise in GCP tools and related technologies, including:
      -  BigQuery for data warehousing and analytics
      -  Dataproc for managing Apache Spark and Hadoop clusters
      -  Airflow for orchestration of workflows and pipelines
      -  Data streaming with Java

•     Strong experience in Apache Spark and Hadoop ecosystems.

•     Ability to design, develop, and optimize ETL/ELT pipelines for large-scale data.

•     Excellent leadership skills with experience managing teams and performing code reviews.

•     Hands-on experience in data modeling and data architecture design.

•     Familiarity with CI/CD processes and best practices for cloud environments.

•     Strong problem-solving skills and ability to redesign existing solutions if needed.

•     Excellent communication and interpersonal skills to collaborate with both technical and business teams.

 

PREFERRED SKILLS:

•     Experience in data governance and data security on cloud platforms.

•     Familiarity with real-time data streaming technologies.

•     Experience working in agile environments and using tools like Jira or Confluence.

•     Google Cloud certifications such as Professional Data Engineer are a plus.

 

Educational Qualifications:

•     Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.

 

To apply for this job email your details to vineets@tsourceinc.net
