Qualifications
• Minimum of 15 years of relevant experience in applications development
• Ability to architect and design large-scale, distributed big data solutions in Java to handle high-volume data processing and analytics
• Expertise in the big data ecosystem (Cloudera distribution) using Spark and MapReduce
• Core Java 1.8 or above - hands-on experience with advanced concepts of data structures, memory management, and design patterns
• Apache Spark - hands-on experience (preferably with the Java API)
• Big data ecosystem - good understanding of Hadoop (preferably the Cloudera distribution) and exposure to Hive, Impala, YARN, and Kafka
• Good data analysis and programming skills and an understanding of large datasets (non-SQL joins)
• Good knowledge of Java and Spark architecture and design principles
• Good experience with JUnit and other testing frameworks
• Unix shell and Python scripting experience is a big plus
• Experience managing global technology teams
• Working knowledge of industry practices and standards
• Good communication skills