| Position | Data Engineer |
| Posted | 2025 December 14 |
| Expired | 2026 January 13 |
| Company | Yochana |
| Location | Vancouver, CA |
| Job Type | Full Time |
Latest job information from Yochana for the position of Data Engineer. If the Data Engineer vacancy in Vancouver matches your qualifications, please submit your up-to-date application or CV directly through the Jobkos job portal.
Please note that all candidates must meet the qualifications and requirements set by the company. We hope the career opportunity at Yochana for the position of Data Engineer below matches your profile.
Role: Data Engineer
Location: Vancouver, BC, Canada
6+ Months Contract
Rate: CAD 52/hr on T4
Role Overview:
We are looking for a Data Engineer with strong expertise in modern data platforms and engineering practices. The ideal candidate will design, build, and optimize data pipelines and solutions leveraging Snowflake, Kafka, Azure Data Factory (ADF), PySpark, and other cutting-edge technologies. This role requires proficiency in ETL/ELT development, data modeling, and ensuring data governance and security across cloud environments.
Key Responsibilities:
• Design, develop, and maintain scalable ETL/ELT pipelines for structured and unstructured data.
• Implement data integration solutions using ADF, Kafka, and stream processing frameworks.
• Build and optimize data models in Snowflake and other cloud platforms for analytics and reporting.
• Ensure data quality (DQT), governance, and security compliance across all data processes.
• Develop and maintain CI/CD pipelines for data workflows and deployments.
• Collaborate with data architects, analysts, and business teams to deliver high-quality data solutions.
• Monitor and troubleshoot data pipelines for performance and reliability.
• Document technical designs, processes, and best practices.
Required Skills & Qualifications:
• 8-11 years of overall experience, with strong expertise in Snowflake (data warehousing, performance tuning, security).
• Hands-on expertise in Kafka for real-time data streaming and event-driven architectures.
• Proficiency in Azure Data Factory (ADF) for orchestration and integration.
• Solid knowledge of PySpark for big data processing and transformations.
• Advanced SQL skills for query optimization and data manipulation.
• Experience in ETL/ELT development and data modeling (star schema, normalization).
• Familiarity with Cloud Platforms (Azure and Snowflake).
• Understanding of Data Governance & Security principles.
• Knowledge of stream processing and real-time analytics.
• Experience with CI/CD tools (Azure DevOps, Git, Jenkins).
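As a brief illustration of the star-schema modeling mentioned in the qualifications above, the toy sketch below denormalizes a fact table against its dimension tables, the same shape of join an ETL/ELT step would express in SQL or PySpark. All table and column names here are hypothetical, and plain Python dictionaries stand in for warehouse tables purely for brevity:

```python
# Toy star-schema example: a fact table referencing two dimension tables.
# All names are hypothetical; dicts stand in for Snowflake tables.
fact_sales = [
    {"sale_id": 1, "product_id": 10, "store_id": 100, "amount": 250.0},
    {"sale_id": 2, "product_id": 11, "store_id": 100, "amount": 75.5},
]
dim_product = {10: {"name": "Widget"}, 11: {"name": "Gadget"}}
dim_store = {100: {"city": "Vancouver"}}

def denormalize(facts, products, stores):
    """Enrich each fact row with attributes looked up from its dimensions."""
    rows = []
    for f in facts:
        rows.append({
            "sale_id": f["sale_id"],
            "product": products[f["product_id"]]["name"],
            "city": stores[f["store_id"]]["city"],
            "amount": f["amount"],
        })
    return rows

report = denormalize(fact_sales, dim_product, dim_store)
```

In a warehouse this lookup would be a pair of key joins; keeping dimensions small and facts narrow is what makes the star layout efficient for analytics.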
After reviewing the criteria and minimum qualifications for the Data Engineer position in Vancouver above, prepare your application documents (cover letter, CV, copies of your diploma and transcript, and any other supporting materials) and submit them via the Next Page link below.
Next Page »