Hadoop Data Engineer

Job Title: Hadoop Data Engineer
Duration: 12 months, with possible extension to 18 months
Location: Chicago, IL (3 days per week onsite)
Other Approved Locations: Charlotte, NC
Pay Scale: $60-65/hr W2 (cannot subcontract or C2C)

Job Description:
Matlen Silver has partnered with a leading global financial services firm to recruit a Hadoop Engineer with a Site Reliability or DevOps background to support NextGen platforms built around Big Data technologies (Hadoop, Spark, Kafka, Impala, HBase) as well as container and automation technologies such as Docker and Ansible. This role requires experience in cluster management of vendor-based Hadoop and Data Science (AI/ML) products such as Cloudera, Databricks, Snowflake, Talend, Greenfield, ELK, and KPMG Ignite. The Hadoop Engineer is involved in the full life cycle of an application as part of an agile development process, and must be able to interact, develop, engineer, and communicate collaboratively at the highest technical levels with clients, development teams, vendors, and other partners. The following sections serve as a general guideline for the dimensions of project complexity, responsibility, and education/experience within this role.

Responsibilities:
- Works on complex, major, or highly visible tasks in support of multiple projects that require multiple areas of expertise
- Provides subject matter expertise in managing Hadoop and Data Science platform operations, with a focus on Cloudera Hadoop, Jupyter Notebook, OpenShift, and Docker container cluster management and administration
- Integrates solutions with other applications and platforms outside the framework
- Manages platform operations across all environments, including upgrades, bug fixes, deployments, metrics/monitoring for resolution and forecasting, disaster recovery, and incident/problem/capacity management
- Serves as a liaison between client partners and vendors, in coordination with project managers, to provide technical solutions that address user needs

Desired Experience:
- Hadoop, Kafka, Spark, Impala, Hive, HBase, etc.
- Strong knowledge of Hadoop architecture, HDFS, and Hadoop cluster management
- Knowledge of fully integrated AD/Kerberos authentication
- Experience setting up optimal cluster configurations
- Debugging knowledge of YARN
- Hands-on experience analyzing Hadoop log files, compression, encoding, and file formats
- Expert-level knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Kafka, Impala, Solr, Hue, Spark, Hive, YARN, ZooKeeper, and Postgres
- Strong technical knowledge of some of the following: Unix/Linux; databases (Sybase/SQL/Oracle); Java, Python, Perl, and shell scripting
- Experience with monitoring, alerting, and job scheduling systems
- Strong grasp of automation/DevOps tools: Ansible, Jenkins, SVN, Bitbucket

About Matlen Silver
Experience Matters. Let your experience be driven by our experience. For more than 40 years, Matlen Silver has delivered solutions for complex talent and technology needs to Fortune 500 companies and industry leaders. Led by hard work, honesty, and a trusted team of experts, we can say that Matlen Silver technology has created a solutions experience and legacy of success that is the difference in the way the world works.
Matlen Silver is an Equal Opportunity Employer and considers all applicants for all positions without regard to race, color, religion, gender, national origin, age, sexual orientation, veteran status, the presence of a non-job-related medical condition or disability, or any other legally protected status. If you are a person with a disability needing assistance with the application or at any point in the hiring process, please contact us by email or phone: info@matlensilver.com // 908-393-8600.