We’re looking for a highly technical Microsoft Fabric Data Engineer to design, build, and maintain scalable data solutions across our organization. You will develop pipelines, engineer Lakehouse architectures, and integrate data from key business systems (ERP platforms, operational applications, and historians) into a unified Fabric-based data environment.
The ideal candidate has hands-on experience with Fabric Lakehouse, Delta tables, MSSQL, APIs, event-driven integrations, and industrial/manufacturing data flows.
Key Responsibilities
- Build and maintain Fabric Pipelines, Dataflows, Lakehouses, and semantic models.
- Integrate data from ERP, MES, historians, and manufacturing equipment.
- Develop ETL/ELT pipelines using SQL, Delta Lake, APIs, and event-based patterns.
- Engineer data models optimized for high-volume manufacturing workloads.
- Ensure data reliability, lineage, quality, and governance across environments.
- Troubleshoot ingestion, integration, schema, and pipeline failures.
- Support modernization initiatives to digitize legacy/manual processes.
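To give a flavor of the transform logic behind these responsibilities, here is a minimal, self-contained sketch of one ELT step: aggregating machine downtime from raw MES-style event records. The record shape and field names are illustrative assumptions, not a real MES schema; in practice this kind of logic would typically run against Delta tables in a Fabric notebook or pipeline.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical MES event records; field names are illustrative only.
RAW_EVENTS = [
    {"machine": "press-01", "state": "DOWN", "start": "2024-01-05T08:00:00", "end": "2024-01-05T08:45:00"},
    {"machine": "press-01", "state": "RUN",  "start": "2024-01-05T08:45:00", "end": "2024-01-05T12:00:00"},
    {"machine": "cnc-02",   "state": "DOWN", "start": "2024-01-05T09:10:00", "end": "2024-01-05T09:40:00"},
]

def downtime_minutes(events):
    """Aggregate DOWN-state duration per machine (a typical ELT transform step)."""
    totals = defaultdict(float)
    for ev in events:
        if ev["state"] != "DOWN":
            continue
        start = datetime.fromisoformat(ev["start"])
        end = datetime.fromisoformat(ev["end"])
        totals[ev["machine"]] += (end - start).total_seconds() / 60
    return dict(totals)

print(downtime_minutes(RAW_EVENTS))  # {'press-01': 45.0, 'cnc-02': 30.0}
```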
Key Skills
- Microsoft Fabric (Pipelines, Lakehouse, OneLake, Dataflows)
- Azure Storage / MSSQL
- SQL, Delta Lake, ETL/ELT design
- APIs, REST, webhooks, system integrations
- Manufacturing/MES/ERP data flows (e.g., production, yield, downtime, scheduling, traceability)
- Agile delivery experience (Jira)
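As an example of the API/system-integration skills above, the sketch below shows a generic offset-based pagination loop for pulling all records from a paged REST endpoint. The `fetch_page` callable and the in-memory "API" are stand-ins for a real ERP or MES endpoint; endpoint names and pagination style are assumptions, since paging conventions vary by system.

```python
def fetch_all(fetch_page, page_size=100):
    """Pull every record from a paged source.

    `fetch_page(offset, limit)` is any callable returning a list of records;
    a short (or empty) page signals the end, a common REST pagination pattern.
    """
    records, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        records.extend(page)
        if len(page) < page_size:
            return records
        offset += page_size

# Hypothetical in-memory "API" standing in for a real ERP endpoint.
DATA = [{"order_id": i} for i in range(250)]

def fake_page(offset, limit):
    return DATA[offset:offset + limit]

print(len(fetch_all(fake_page)))  # 250
```

In a real pipeline the same loop would wrap an authenticated HTTP client, with retries and incremental watermarks layered on top.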