Hadoop OR Snowflake OR Teradata Development LEADS - Sydney
These are 6-12 month contracts in Financial Services.
Development LEAD - Hadoop, Big Data
• Hadoop * Spark * Hive * Python & PySpark * ETL tools * Teradata * Unix * Oracle / SQL Server / MySQL * AWS * Snowflake
• Experience in building data pipelines
• Experience on the Hadoop / Teradata platforms, using Spark, TCF (Teradata Control Framework) and various ETL frameworks
• Strong scripting knowledge in Java or Python
• Ability to read and interpret data-processing algorithms
• Utilizing Big Data technologies to solve our customers' hardest data-centric problems.
• Working closely with the Solution Designers and Business Analysts to perform Data Ingestion, Enrichment and Egress
• Working across the Group Data Warehouse, Big Data Platform, Ab Initio and DataStage.
• Designing and building group data products by integrating diverse data from hundreds of internal and external sources
• Adopting tools, programming languages and templates to improve our data quality and efficiency
• Building and optimizing Big Data pipelines, architecture and data sets
• End-to-end ownership of and accountability for business requirements, design and code reviews, and identifying and resolving technical / performance issues.
• Learning and adapting to new technologies to solve problems and improve existing solutions
• Mentoring and providing support to junior Data Engineers within the team
Development LEAD - Snowflake
• Snowflake * Python & PySpark * ETL tools * Big Data * Teradata * Hadoop * Unix * Oracle / SQL Server / MySQL * AWS
• Experience in building highly scalable Data Warehouse systems.
• Must have experience with Unix
• Experience with the Snowflake data warehouse
• Deep understanding of Snowflake architecture and processing
• Hands-on experience in Snowflake cloud development
• Experience in Snowflake administration and management
• Experience with SQL and PL/SQL, SQL query tuning and database performance tuning
• Experience in writing complex SQL, analyzing query performance, tuning queries, working with database indexes and partitions, and developing stored procedures
• Experience with data ingestion into Snowflake, for example via Snowpipe
• Experience with Hadoop to Snowflake / Databricks migration is nice to have
Development LEAD - Teradata
• Teradata * TCF (Teradata Control Framework) * GCFR (Teradata Global Control Framework) * SQL * ETL tools * Big Data
• Experience in designing solutions and improving the existing architecture
• Proven experience as a Teradata Engineer / Developer, with experience in advanced data processing frameworks such as TCF (Teradata Control Framework) or GCFR (Teradata Global Control Framework), and the Teradata MultiLoad or FastLoad tools.
• Experience with the Teradata BTEQ utility, using advanced SQL scripting to run DDL, DML, macros and stored procedures.
• Shell scripting experience in a Unix or Linux environment.
• Strong skills in Teradata performance tuning, creating stored procedures and macros, and running Teradata utilities.
• Experience with data modeling concepts such as Data Vault and enterprise data models.
• Experience with ETL tools such as DataStage or Informatica is desirable.
• Must have hands-on experience with performance tuning and design patterns
• Prior experience mentoring junior Data Engineers within a team
• Exposure to Hadoop, Spark and various ETL frameworks is a plus
Please send your detailed resume to peter@klareconsulting.com