
Position: Senior Data Engineer / Data Architect (Databricks & Snowflake)
Location: Boston, MA – Hybrid
Job Type: Contract (C2C)
Duration: Long term
Job Summary
We are seeking a highly experienced Senior Data Engineer / Data Architect with deep expertise in Databricks, Snowflake, and Azure cloud data platforms. The ideal candidate will have extensive experience designing and implementing scalable data pipelines, Lakehouse architectures, and real-time data processing solutions, particularly in regulated domains such as Life Sciences or Healthcare.
This role requires strong proficiency in Spark (PySpark), Delta Lake, Medallion architecture, and cloud-native data engineering practices, along with a solid background in data warehouse modernization and performance optimization.
________________________________________
Key Responsibilities
• Design and implement end-to-end data engineering pipelines using Azure Databricks, ADLS Gen2, and Snowflake.
• Develop scalable ETL/ELT pipelines using PySpark, Spark SQL, Python, and Talend.
• Build and maintain Lakehouse architecture using Delta Lake and Medallion (Bronze, Silver, Gold) layers.
• Implement real-time and batch data ingestion pipelines, including streaming with Spark Structured Streaming.
• Design and enforce data governance, access control, and lineage using Unity Catalog.
• Optimize Spark workloads through partitioning, caching, broadcast joins, and cluster tuning to improve performance and reduce cloud costs.
• Architect and manage CI/CD pipelines using Azure DevOps, Jenkins, and Git for automated deployments.
• Integrate multiple data sources and systems, ensuring high-quality, reliable, and scalable data delivery.
• Collaborate with cross-functional teams including data analysts, scientists, and business stakeholders to support analytics and reporting needs.
• Support data warehouse modernization initiatives, including migration from legacy systems to cloud platforms.
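For candidates less familiar with the Medallion (Bronze, Silver, Gold) layering named above, the data flow can be sketched in plain Python. This is a toy illustration only; the record fields, cleaning rules, and aggregate below are invented for the example, and a production pipeline on this team would use PySpark with Delta Lake tables rather than in-memory lists.

```python
# Toy sketch of Medallion (Bronze/Silver/Gold) layering.
# In production this would be PySpark + Delta Lake; plain Python stands
# in here so the layer-to-layer data flow is visible. All fields and
# rules are invented for illustration.

# Bronze: raw records ingested as-is (duplicates and bad rows included).
bronze = [
    {"patient_id": "p1", "reading": "98.6"},
    {"patient_id": "p1", "reading": "98.6"},   # duplicate row
    {"patient_id": "p2", "reading": "bad"},    # unparseable value
    {"patient_id": "p3", "reading": "101.2"},
]

def to_silver(records):
    """Silver: deduplicated, type-enforced, cleaned records."""
    seen, silver = set(), []
    for r in records:
        key = (r["patient_id"], r["reading"])
        if key in seen:
            continue                    # drop exact duplicates
        seen.add(key)
        try:
            silver.append({"patient_id": r["patient_id"],
                           "reading": float(r["reading"])})
        except ValueError:
            pass                        # quarantine/log in a real pipeline
    return silver

def to_gold(records):
    """Gold: business-level aggregate ready for analytics."""
    readings = [r["reading"] for r in records]
    return {"count": len(readings),
            "avg_reading": sum(readings) / len(readings)}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)
```

The same Bronze → Silver → Gold progression applies on Databricks, where each layer is a Delta table and the transforms are Spark jobs.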
To apply for this job, email your details to mahesh.p@tekgence.com.