**Job Description & Requirements**
• **Responsibilities**
(a) Big Data Ingestion
- Extract (taking the data from its current location).
- Transform (cleansing and normalizing the data).
- Load (placing the data in a database where it can be analysed).
(b) Big Data Aggregation.
(c) Application Deployment and Testing.
(d) Security Vulnerability Fixing.
• **Big Data Ingestion**
- Transform: analyse the differing data structures from each platform and ingest them into a common structure so that data aggregation becomes possible.
- Load: develop scheduled jobs/tasks to run the daily ingestion of each platform at its own time.
- Performance tuning is required to balance system resources against the MongoDB database load.
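As a rough illustration of the scheduled-ingestion responsibility, the sketch below shows a per-platform daily schedule and the lookup a daily trigger might use. All names (platform names, hours, the `platformsDueAt` helper) are illustrative assumptions, not part of this job specification.

```javascript
// Hypothetical per-platform ingestion schedule: each platform's daily
// ETL run fires at a different hour to spread the load.
const ingestionSchedule = [
  { platform: "platformA", runAtHour: 1 }, // ingest platform A at 01:00
  { platform: "platformB", runAtHour: 3 }, // ingest platform B at 03:00
];

// Returns the platforms whose ingestion job is due at the given hour.
function platformsDueAt(hour) {
  return ingestionSchedule
    .filter((job) => job.runAtHour === hour)
    .map((job) => job.platform);
}

console.log(platformsDueAt(3)); // logs [ 'platformB' ]
```

In a real deployment these entries would be registered with a scheduler (e.g. cron or a Node.js scheduling library) rather than looked up by hand; the point here is only that each platform gets its own daily slot.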
• **Big Data Aggregation**
- High proficiency and in-depth knowledge of MongoDB is required.
- Knowledge of the MongoDB aggregation pipeline is a must, in order to aggregate big data from various sources.
- Complex BI business logic is to be translated into MongoDB aggregation pipelines.
- Performance tuning is needed to fine-tune the time taken to compile the daily aggregated data from the big data.
- Data accuracy is to be ensured for the compiled aggregated data.
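To give a concrete sense of what "translating BI logic into an aggregation pipeline" looks like, here is a minimal sketch of a daily roll-up pipeline. The collection and field names (`events`, `day`, `platform`, `amount`) are assumptions for illustration only.

```javascript
// Hypothetical pipeline: restrict to one day, roll events up per platform,
// and sort the summary by total amount. Stage names ($match, $group, $sort)
// are standard MongoDB aggregation stages.
const dailySummaryPipeline = [
  { $match: { day: "2024-01-01" } }, // keep only one day's events
  {
    $group: {
      _id: "$platform",                  // one summary row per platform
      totalAmount: { $sum: "$amount" },  // sum the amount field
      eventCount: { $sum: 1 },           // count the events
    },
  },
  { $sort: { totalAmount: -1 } },        // largest platforms first
];

// With the official Node.js driver this would run as something like:
//   db.collection("events").aggregate(dailySummaryPipeline).toArray();
```

Real BI logic would add further stages (`$lookup`, `$project`, `$facet`, etc.), but the shape, an ordered array of stage documents, stays the same.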
• **Application Deployment and Testing**
- Performance and functional testing are to be carried out. Any issues discovered are to be fixed or fine-tuned before the launch.
- Deployment is to be carried out on AWS, our cloud solution provider.
• **Security Vulnerability Fixing**
- All security vulnerabilities are to be fixed before the actual launch.
• **Requirements**:
The requirement specification comprises back-end development and big data aggregation pipeline development.
Programming skillset required:
- Node.js
- Bootstrap
- AngularJS/ReactJS
- HTML5/CSS3
- MongoDB
• **Salary**: $7,000.00 - $8,000.00 per month
Schedule:
- Monday to Friday
Work Location: In person