Position : Hadoop Scala Developer with Java Spark Experience
Location : Hybrid (NYC, NY)
Duration : Long term
Mode of Interview : Telephonic/Video
Job Description: Requires 11+ years of experience.
Required Skills: Hadoop-Spark, Hadoop-MapReduce, Java
• 4+ years of experience in data processing and software engineering, with the ability to build high-quality, scalable, data-oriented products
• Experience with distributed data technologies (e.g., Hadoop, MapReduce, Spark, EMR) for building efficient, large-scale data pipelines
• Experience with the Test-Driven Development (TDD) approach (e.g., Cucumber, FitNesse)
• Strong software engineering experience with an in-depth understanding of Python, Scala, Java, or an equivalent language
• Strong understanding of data architecture, modeling, and infrastructure
• Experience building workflows (ETL pipelines)
• Problem solver with attention to detail who can see complex problems in the data space through, end to end
• Willingness to work in a fast-paced environment
• MS/BS in Computer Science or relevant industry experience