Role: Data Integration Engineer
Contract Term: 12 months, renewable
Location: Hybrid, 3 days remote / 2 days in-office (Vaughan, Ontario)
Hours: 35 per week
Rate: Negotiable
Our Utilities client is one of Ontario’s leading energy providers, committed to building a smarter, more sustainable future. A technology-driven organization, they focus on building scalable, secure, and intelligent data ecosystems, with a mission to empower business decisions through seamless data integration and cloud innovation. We are searching for a Data Integration Engineer to help unlock the full potential of their data ecosystem. If you thrive in collaborative environments, love solving complex problems, and want to make a real impact in the utilities sector, this role is for you.
🧠 Role Overview
As a Data Integration Engineer, you’ll be responsible for designing, developing, and maintaining enterprise-grade data pipelines using Informatica on Azure. You’ll work closely with cross-functional teams to ensure data flows reliably across systems, enabling analytics, reporting, and operational efficiency.
🔧 Key Responsibilities
- Develop and maintain ETL/ELT workflows using Informatica Cloud or PowerCenter
- Build scalable data pipelines on Azure, leveraging services like Azure Data Lake, Blob Storage, and Azure SQL
- Integrate data across cloud and on-prem systems, ensuring high performance and reliability
- Collaborate with business analysts, data architects, and application teams to understand integration requirements
- Monitor, troubleshoot, and optimize data flows for latency, throughput, and accuracy
- Document integration architecture, data mappings, and transformation logic
- Support data governance, quality, and compliance initiatives
🧰 Required Skills & Experience
- ETL Workflow Development: Proven experience designing and implementing end-to-end data pipelines using Informatica, including:
- Extracting data from diverse sources such as relational databases, APIs, flat files, and cloud platforms
- Applying complex transformation logic to cleanse, enrich, and standardize data
- Loading data into target systems like Azure SQL, Data Lake, or Synapse Analytics
- Building reusable mappings, workflows, and parameterized tasks to support scalable integration patterns
- Data Cleansing & Quality Assurance: Strong understanding of data profiling, cleansing, and validation techniques to ensure high-quality, trustworthy data. Responsibilities include:
- Identifying and resolving data anomalies, duplicates, and inconsistencies
- Implementing business rules for data standardization and enrichment
- Collaborating with data stewards and governance teams to align with quality benchmarks
- Leveraging Informatica’s built-in tools for data quality monitoring, error handling, and exception reporting
- Performance Optimization: Ability to tune ETL jobs for efficiency and scalability, including:
- Managing large volumes of data with partitioning and parallel processing
- Monitoring job execution and troubleshooting bottlenecks
- Implementing best practices for resource utilization and error recovery
- Documentation & Collaboration: Comfortable documenting technical specifications, data flow diagrams, and transformation logic. Able to work cross-functionally with business analysts, data architects, and application teams to translate requirements into actionable solutions.
- Hands-on expertise with Informatica (Cloud or PowerCenter)
- Strong understanding of Azure cloud services and data architecture
- Proficiency in SQL, JSON/XML, and REST APIs
- Experience with data modeling, data warehousing, and metadata management
- Excellent problem-solving and communication skills
✅ Nice to Have
- Experience working in or integrating systems within the utilities sector (e.g., energy, water, infrastructure)
- Familiarity with Azure Data Factory, Logic Apps, or API Management
- Knowledge of data governance frameworks and compliance standards