
C2C requirements
Role: Big Data Engineer
Location: New York, NY / ONSITE
Duration: Long Term Project
Experience: 10+ years
Rate: $55-$60/hr on C2C
Job Role and Responsibilities:
· Design and develop a new automation framework for ETL processing
· Support the existing framework and serve as the technical point of contact for all related teams
· Enhance the existing ETL automation framework per user requirements
· Performance-tune Spark and Snowflake ETL jobs
· Build proofs of concept (POCs) for new technologies and analyze their suitability for cloud migration
· Optimize processes through automation and new utility development
· Collaborate across teams on issues and new features
· Support batch issues
· Support application teams with any queries
Required Skills:
· Must have strong UNIX shell and Python scripting knowledge
· Must be strong in Spark
· Must have strong knowledge of SQL
· Hands-on knowledge of how HDFS, Hive, Impala, and Spark work
· Strong logical reasoning skills
· Working knowledge of GitHub, DevOps, CI/CD, and enterprise code-management tools
· Strong collaborator and team player with excellent written and verbal communication skills
· Ability to create and maintain a positive environment of shared success.
· Ability to execute and prioritize tasks and resolve issues without aid from a direct manager or project sponsor
· Good to have: working experience with Snowflake and any data integration tool, e.g., Informatica Cloud
Primary skills:
· Apache Hadoop
· Apache Spark
· UNIX shell scripting
· Python
· SQL
Good to have skills:
· Any cloud: Snowflake, Azure, or AWS
· IDMC or any ETL tool
To apply for this job email your details to shahid.m@wonese.com