Position: Big Data Hadoop Developer with GCP BigQuery
Location: Phoenix, AZ
Duration: Full-time, Permanent
(Day 1 Onsite) (USC / GC / GC EAD / H1B Transfer Workable)
Responsibilities:
· Minimum 10 years of experience in Big Data Hadoop technologies and software development
· Hands-on experience with GCP BigQuery in Big Data (7+ years)
· Hands-on experience with Big Data platforms using Hadoop components such as HBase, Hive, Pig, and Oozie, and Apache Spark components such as Spark SQL and Spark Core
· Hands-on experience with the Hadoop MapR framework
· Hands-on experience with scheduling tools such as Event Engine and Control-M
· Hands-on experience in Unix shell scripting.
· Strong skills in relational SQL and NoSQL databases
· Basic knowledge of Core Java and object-oriented/functional scripting languages such as Python
· Working knowledge of advanced Java, such as web services using REST or Spring
· Good communication skills for day-to-day interaction with the client