**Job Summary**:
• **Salary**
S$5,000 - S$7,100 / Monthly
• **Job Type**
• **Seniority**
Mid
• **Years of Experience**
At least 5 years
• **Tech Stacks**
Shell Script, HDFS, Shell, UNIX, Java, Hive, Spark, SQL, Scala, Hadoop, Python
• **The Opportunity**
Key Responsibilities
- Working with the latest tools and techniques
- Hands-on coding, usually in a pair programming environment
- Working in highly collaborative teams and building quality code
• **Essential Skills & Prerequisites**:
- Hands-on development experience with Spark, Scala Spark, and distributed computing.
- 4 to 6 years' experience designing and developing in Python.
- 4 to 6 years' experience with the Hadoop platform (Hive, HDFS, and Spark).
- 3 to 5 years' experience with Unix shell scripting.
- 3 to 5 years' experience with SQL.
- 2 to 3 years' experience with Spark programming.
- Knowledge of microservices architecture and cloud will be an added advantage.
- Knowledge of Java and Scala will be an added advantage.