Typical task breakdown:
- Identify, investigate, and obtain resolution commitments for platform and data issues to maintain and improve the quality and performance of assigned digital product data.
- Issue Identification: Reports in all forms from customers, dealers, industry representatives, and subsidiaries.
- Issue Investigation: Statistical analysis, data triage, and infrastructure problem-solving.
- Issue Resolution: Identify root causes, create SageMaker scripts to fix data, and perform break/fix tasks on data pipeline code.
- Develop scripts and automation tools to better detect and correct data issues.
- Develop monitoring and alerting capabilities to proactively detect data issues.
- Work directly on complex application and technical problem identification and resolution, including responding to off-shift and weekend support calls.
- Communicate with end users and internal customers to help direct the development, debugging, and testing of application software for accuracy, integrity, interoperability, and completeness.
- Employee is also responsible for performing other job duties as assigned by CLIENT management from time to time.
Interaction with team:
- Liaise with designers, engineers, and support teams to improve data pipeline performance and reliability.
Work environment:
Chicago or Peoria office (hybrid schedule, 2 days per week in office; could go to 5 days in office in the future)
Education & Experience Required:
- Bachelor's degree with 5+ years' experience in this capacity, or
- Master's degree with 4+ years' experience in this capacity.
- Candidates without a degree but with technical certifications and 8+ years' experience in this capacity are also welcome.
Required Technical Skills:
- 2-4 years of Python and SQL experience
- Experience developing and delivering microservices using serverless AWS services (S3, CloudWatch, RDS, Aurora, DynamoDB, Lambda, SNS, SQS, Kinesis, IAM)
- Background in data management, data engineering, or data operations
- Familiarity with the Azure DevOps (ADO) pipeline framework, or CI/CD experience (e.g., Jenkins)