Position Name – Senior Data Modeler Specialist – GCP Cloud
Type of hiring – Subcontract / Full-time
Location – Toronto, ON (Hybrid, partially onsite)
Job Description:
Senior Data Modeler Specialist – GCP Cloud (10+ Years)
We are seeking a seasoned Senior Data Modeler Specialist with over 10 years of experience in enterprise data modeling and a strong command of Google Cloud Platform (GCP). This role is critical to designing scalable, secure, and high-performance data models that power analytics, AI/ML, and operational systems across the organization.
Key Responsibilities
- Lead the design and implementation of conceptual, logical, and physical data models for cloud-native and hybrid environments
- Collaborate with data architects, engineers, and business analysts to translate business requirements into optimized data structures
- Develop and maintain data models for GCP-native services including BigQuery, Cloud SQL, Firestore, and Dataplex
- Define and enforce data modeling standards, naming conventions, and metadata management practices
- Conduct data profiling, lineage mapping, and source-to-target transformations for ingestion pipelines
- Partner with governance teams to ensure models align with data privacy, security, and compliance standards (e.g., PIPEDA, GDPR)
- Optimize models for performance and cost-efficiency in GCP, leveraging partitioning, clustering, and materialized views (a brief sketch follows this list)
- Support semantic layer design for BI tools like Looker, Tableau, or Power BI
- Document data definitions, relationships, and business rules using modeling tools and GCP-native metadata catalogs
- Mentor junior data modelers and contribute to the evolution of enterprise data architecture
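To make the partitioning and clustering responsibility above concrete, here is a minimal sketch using the google-cloud-bigquery Python client. The project, dataset, table, and column names (example-project, sales.orders, order_ts, customer_id) are hypothetical placeholders, not details of this role's actual environment.

```python
# Minimal sketch: materializing a model as a date-partitioned, clustered
# BigQuery table. All project, dataset, table, and column names are
# hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumed project id

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("order_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("example-project.sales.orders", schema=schema)

# Partition on the event timestamp so date-bounded queries prune whole
# partitions instead of scanning the full table.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_ts",
)

# Cluster on the most common filter/join key to further reduce bytes scanned.
table.clustering_fields = ["customer_id"]

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}")
```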
Required Skills & Qualifications
- 10+ years of hands-on experience in data modeling across large-scale, complex environments
- Deep expertise in relational, dimensional, and NoSQL modeling
- Strong proficiency in GCP services: BigQuery, Cloud SQL, Dataplex, Pub/Sub, Dataflow, and Looker
- Experience with data modeling tools such as Erwin, PowerDesigner, or SQL Developer Data Modeler
- Advanced SQL skills and understanding of query optimization and cost control in BigQuery (see the dry-run sketch after this list)
- Familiarity with data lakehouse architectures, data mesh, and modern ELT pipelines
- Solid understanding of data governance, metadata management, and data quality frameworks
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
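As an illustration of the BigQuery cost-control expectation above, the following is a minimal sketch of estimating a query's scan cost with a dry run via the google-cloud-bigquery Python client; the query, project, and table names are hypothetical.

```python
# Minimal sketch: estimating BigQuery scan cost with a dry run before
# executing a query. Query and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumed project id

query = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `example-project.sales.orders`
    WHERE order_ts >= TIMESTAMP('2024-01-01')  -- partition filter prunes scans
    GROUP BY customer_id
"""

# dry_run=True validates the query and reports the bytes it would scan
# without running it or incurring query charges.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(query, job_config=job_config)

gib = job.total_bytes_processed / 1024**3
print(f"This query would scan approximately {gib:.2f} GiB")
```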
Preferred Qualifications
- GCP certifications (e.g., Professional Data Engineer, Cloud Architect)
- Experience in regulated industries such as finance, healthcare, or telecom
- Exposure to Agile/Scrum methodologies and CI/CD practices for data pipelines
- Knowledge of Python, dbt, and infrastructure-as-code tools such as Terraform
- Familiarity with AI/ML data preparation and feature store modeling