Required Skills & Qualifications:
• Strong experience with Snowflake (data warehousing, performance tuning, security).
• Hands-on expertise in Kafka for real-time data streaming and event-driven architectures.
• Proficiency in Azure Data Factory (ADF) for orchestration and integration.
• Solid knowledge of PySpark for big data processing and transformations.
• Advanced SQL skills for query optimization and data manipulation.
• Experience in ETL/ELT development and data modeling (star schema, normalization).
• Familiarity with cloud platforms (Azure) and cloud data platforms (Snowflake).
• Understanding of Data Governance & Security principles.
• Knowledge of stream processing and real-time analytics.
• Experience with CI/CD tools (Azure DevOps, Git, Jenkins).