Data Integration Lead

Global ERP Solutions

Mississauga, Canada

Job Description

We're looking for a highly experienced Senior Azure Data Engineer to join our team and contribute to the next phase of enterprise data modernization. This role focuses on cloud-native engineering, data modeling, and building scalable data products using the Microsoft Azure ecosystem.

You will design and deliver solutions that power analytics, reporting, regulatory needs, and long-term data product strategies, particularly within US Wealth data domains.


Core Skills:

12+ years in Data Engineering, Data Warehousing, and ETL development

Strong hands-on experience with:

  • Azure Cloud Services (Data Lake, Synapse, ADF, Key Vault, Storage)
  • Azure Databricks (Notebooks, Clusters, Delta, Lakehouse)
  • PySpark & Python for large-scale data pipelines
  • SQL for transformations, optimization, and performance tuning
  • Unix/Linux for scripting and operations

Proven ability to build complex ETL pipelines, optimize performance, and convert legacy codebases into modern cloud-native architectures
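
For illustration, here is a minimal sketch of the kind of Azure Databricks / PySpark pipeline work described above: landing raw lake data into a curated Delta table. This is a hedged example only; the storage path, column names, and table names are hypothetical and not taken from this posting.

  # Minimal, illustrative PySpark/Delta sketch; all paths, columns, and
  # table names below are hypothetical.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.appName("wealth-positions-etl").getOrCreate()

  # Ingest raw files landed in the data lake (hypothetical ADLS path)
  raw = spark.read.parquet(
      "abfss://raw@examplelake.dfs.core.windows.net/wealth/positions/"
  )

  # Deduplicate, standardize types, and derive a business measure
  curated = (
      raw.dropDuplicates(["account_id", "instrument_id", "as_of_date"])
         .withColumn("as_of_date", F.to_date("as_of_date"))
         .withColumn("market_value", F.col("quantity") * F.col("price"))
  )

  # Persist to a partitioned Delta table in the curated zone
  (curated.write.format("delta")
          .mode("overwrite")
          .partitionBy("as_of_date")
          .saveAsTable("curated.wealth_positions"))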


Data Modeling & Architecture:

Experience building Conceptual, Logical, and Physical Data Models within Azure

Ability to translate business logic into clear Source-to-Target Mappings (STTM)

Strong understanding of data product design, metadata, lineage, and reusable models

Experience supporting Data Product strategies for US Wealth or similar financial domains (bonus)


Delivery & Engineering Practices:

Experience with CI/CD pipelines for data workloads

Strong proficiency with Jira and Confluence for Agile delivery

Ability to create technical specification documents, design flows, and development standards

Ability to mentor junior engineers and contribute to engineering best practices


Good to have:

Relevant certifications (e.g., Microsoft Certified: Azure Data Engineer Associate (DP-203), Databricks certifications)

Financial Services background in:

  • Risk & Regulatory Data
  • Credit Cards / Retail Banking
  • Commercial or Wealth Data Platforms


What you will do:

  • Design, build, and maintain end-to-end Azure-based data pipelines
  • Develop optimized Databricks & PySpark workloads for batch and near-real-time processing
  • Perform code conversion and modernization as part of cloud transformation initiatives
  • Build and maintain scalable data models supporting analytics and data product use cases
  • Work closely with cross-functional teams, including architecture, product, engineering, and QA
  • Ensure high standards of data quality, lineage, governance, and performance


