Sr. Big Data Architect to extract and migrate existing data from Maximo V6/V7 to an Azure data lake for our Public Sector client
Location: Toronto (Hybrid, 2-3 days per week on-site)
Duration: 6 months (possibility of extension)
Responsibilities:
• Analyze the current data sets, standard Maximo version 6 and version 7 data tables, and use cases.
• Develop internal requirements for the future state, including holding discussions with RF users and examining outputs to determine the future data model.
• Develop the data requests, detailing the data sets, formats, and other variables required.
• Develop a data quality process to verify and quality-assure that the provided data sets meet the requirements set out above and approved by RF.
• Support and coordinate closely with RF users and other contractors to help extract the required data from the Maximo system(s).
• Prepare and review the data to be migrated to the new data lake. Liaise with third-party or contractor IT staff to coordinate data transfer requirements and the eventual transfer of data.
• Validate the finalized database after migration to the data lake, testing to ensure that the new system works as designed.
• Support other data-related activities, e.g., enterprise data governance, training, and other activities not related to Maximo data.
Must have skills:
• Experience using Maximo MMS versions 6 and 7 (tables, data, relationships between tables, ERD)
• Data integration – Moving data into the Azure Data Lake from source systems in the cloud (Azure) and on-premises data centers.
• Extensive experience with Microsoft cloud solutions, including designing, developing, and testing technologies such as Azure Data Lake, Azure Data Factory (ADF), Synapse, Power BI, Databricks, etc.
• Updating and migrating SQL databases (RDBMS)
• Proven experience with deployments to the cloud, preferably in Platform as a Service (PaaS) or Database as a Service (DBaaS) modes.
• Experience using JIRA and Confluence