Metro Supply Chain is a strategic supply chain solutions partner to some of the world’s fastest growing and most reputable organizations. For 50 years, it has excelled at tailoring integrated, data-driven solutions, fueled by advanced systems and technology, that fulfill complex and challenging distribution needs. Managing 19 million square feet operating out of more than 175 sites across North America and Europe with a team of 9,000, it is one of Canada’s largest privately owned supply chain solutions companies. Metro Supply Chain is a 2024 winner of the Canada’s Best Managed Companies program, recognized for its strategic expertise, culture of innovation and commitment to its people and local communities.
SUMMARY
The Data Engineer is primarily responsible for developing and maintaining the data products and pipelines that support BI initiatives across the organization. This role designs, develops, and maintains Snowflake-based data infrastructure, including building, testing, and deploying scalable, reliable, and secure data pipelines, data lakes, and data warehouses on the Snowflake cloud platform.
RESPONSIBILITIES
- Design, develop, and maintain scalable ELT/ETL pipelines using Snowflake, Matillion, and orchestration tools (Airflow, dbt Cloud).
- Own the Matillion development lifecycle including modeling, tests, documentation, versioning, and deployment.
- Build reusable and modular data transformation frameworks supporting analytics, BI, and machine learning use cases.
- Design and implement Snowflake data models using dimensional modeling, Data Vault, 3NF, and enterprise modeling practices.
- Optimize compute performance through query tuning, clustering, micro-partitioning, warehouse optimization, caching, and Snowpipe; monitor pipeline performance and apply cost-control strategies.
- Establish, maintain, and optimize the data infrastructure to support the evolving needs of the cloud (Snowflake) platform.
- Data Integrity: Uphold data accuracy through systematic quality checks, and work closely with the data engineering team to resolve data inconsistencies.
- Data Quality Assurance: Implement mechanisms that guarantee data accuracy and ensure quick turnaround on data requests.
- Write complex, optimized SQL for large datasets and multi-layered transformations.
- Troubleshoot ingestion, transformation, and modeling issues end-to-end across the data ecosystem.
- Implement automated data quality checks, validation rules, and CI/CD workflows.
- Conduct quality control checks to guarantee the accuracy and precision of deliverables.
- Troubleshoot data visualization issues and propose solutions to rectify errors.
- Participate in team meetings and provide project progress updates.
- Documentation: Document end-to-end data ingestion development across all data pipelines, and uphold meticulous documentation standards for all data procedures and architectural choices.
- Work with both internal and external stakeholders.
- Collaborate with fellow team members and business stakeholders to understand their reporting and analytical needs; analyze business requirements, translate them into technical specifications, and develop data models and metrics to meet those requirements.
- Work closely with cross-functional teams, including data engineers, the WMS IT team, and Operations, to ensure seamless data integration and alignment of data-driven initiatives. Provide limited supervision of other data analysts' projects, including technical support and peer review.
EXPERIENCE
- 5+ years of experience in development, data warehousing, and end-to-end implementation of Snowflake cloud data warehouses or on-premises data warehouses.
- Production experience building high-volume data pipelines and ETL/ELT systems using Matillion, dbt, Azure Data Factory, or similar tools.
- Deep expertise with modern data warehouses (Snowflake strongly preferred) including advanced optimization techniques.
- Advanced SQL skills, including complex query authoring, experience with relational databases, and working familiarity with a variety of database platforms.
- Professional experience with APIs and bulk-load patterns to capture data from multiple data sources.
- Deep understanding of data warehousing concepts, including complex data structures, data transformations and data analytics.
- Experience with the end-to-end design and build of near-real-time and batch data pipelines, along with expertise in SQL and data modeling.
- Relevant experience managing engineering teams to deliver enterprise data solutions using data lake, data warehousing, and business intelligence capabilities.
EDUCATION
- University/College degree in Data Science/Analytics, Engineering, or related field
- Cloud certified: Azure Certification preferred
WHY JOIN US
- Work in an environment where safety is our first priority
- The opportunity to build a career with a growing company
- Medical, dental, and vision coverage for you and your family
- Life and disability insurance
- Wellness programs to support your family’s well-being
- A Retirement Savings Program with a company match
- Company team wear allowance
- Employee Appreciation Day
- Company sponsored social events
- Community volunteering