Hadoop Expert

Chicago · Full-time · External
Negotiable
Please note: This is NOT a Data Engineer role.

Position: Hadoop Subject Matter Expert
Industry: Public Sector
Type: 12-month contract
Location: Chicago, IL (on-site)
Pay Rate: $70 - $73/hour

Day to day:
• Provide subject matter expertise in managing Hadoop and Data Science Platform operations, focusing on Cloudera Hadoop, Jupyter Notebook, OpenShift, and Docker-container cluster management and administration.
• Integrate solutions with other applications and platforms outside the framework.
• Manage platform operations across all environments, including upgrades, bug fixes, deployments, metrics/monitoring for resolution and forecasting, disaster recovery, and incident/problem/capacity management.
• Serve as a liaison between client partners and vendors, in coordination with project managers, to provide technical solutions that address user needs.
• Manage day-to-day operations for platforms built on Hadoop, Spark, Kafka, Kubernetes/OpenShift, Docker/Podman, and Jupyter Notebook.
• Support and maintain AI/ML platforms such as Cloudera, DataRobot, C3 AI, Panopticon, Talend, Trifacta, Selerity, ELK, KPMG Ignite, and others.
• Perform cluster management, upgrades, bug fixes, deployments, and disaster recovery activities.
• Own incident response, problem management, and capacity planning.
• Implement monitoring, alerting, and forecasting solutions to ensure system reliability.
• Collaborate closely with development teams, vendors, and stakeholders to support agile software delivery.
• Automate platform tasks using tools like Ansible, shell scripting, and Python.

Must haves:
• Strong knowledge of Hadoop architecture, HDFS, Hadoop clusters, and the Hadoop administrator's role.
• Intimate knowledge of fully integrated AD/Kerberos authentication.
• Experience setting up optimal cluster configurations.
• Expert-level knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Kafka, Impala, SOLR, Hue, Spark, Hive, YARN, Zookeeper, and Postgres.
• Hands-on experience analyzing various Hadoop log files, compression, encoding, and file formats.
• Scripting and automation skills: Python, Shell, Ansible.
• Experience with monitoring, alerting, and job scheduling systems.
• Knowledge of databases: SQL, Cassandra, Postgres.
• Strong proficiency with Unix/SQL scripting.
• Experience in agile development environments.

Pluses:
• Previous experience with Snowflake, Databricks, or cloud platforms (Azure).
• Previous financial industry experience.

Salary and Compensation:
The hourly rate for this position is between $70 and $73 per hour. Factors which may affect pay within this range may include geography/market, skills, education, experience, and other qualifications of the successful candidate.

Benefits:
The Company offers the following benefits for this position, subject to applicable eligibility requirements: medical insurance, dental insurance, vision insurance, 401(k) retirement plan, life insurance, long-term disability insurance, short-term disability insurance, paid parking/public transportation, paid time off, paid sick and safe time, paid vacation time, paid parental leave, and paid holidays annually (as applicable).