Posted: 22 hours ago
Job Description
<b>Data Engineer – Snowflake & DBT (Remote)</b><br><br>
Design and optimize cutting-edge data pipelines and warehouse solutions using Snowflake and DBT in a fully remote role across Québec or Ontario. This permanent opportunity offers a salary of $82–90K (negotiable based on experience) and the chance to work in a dynamic, cloud-based environment with strategic impact.<br>
<br>
<b>What is in it for you:</b><br>
<br>
<br>• Salary of $82–90K, negotiable based on experience.<br>
<br>• Annual bonus based on individual performance and company profitability, paid in late fall.<br>
<br>• Permanent full-time position (40 hours/week), Monday to Friday, between 8 am and 5 pm.<br>
<br>• 3 weeks of vacation per year, depending on seniority.<br>
<br>• Comprehensive benefits package available after 90 days: dental and medical insurance, massage therapy, chiropractic care, and more.<br>
<br>• Retirement savings plan: voluntary contribution of up to 3% of salary, with matching employer contribution.<br>
<br>
<b>Responsibilities:</b><br>
<br>
<br>• Design, build, and maintain data pipelines, warehouses, and data models using Snowflake and DBT.<br>
<br>• Collaborate with cross-functional teams to gather data requirements and develop efficient data architectures.<br>
<br>• Implement and manage ETL/ELT processes across structured and unstructured data sources using tools such as Azure Data Factory and SQL.<br>
<br>• Enforce data governance protocols including quality, lineage, metadata management, and security compliance.<br>
<br>• Monitor system performance, conduct tuning, and proactively address bottlenecks.<br>
<br>• Maintain documentation of data processes, architecture, and technical specifications.<br>
<br>• Contribute to team knowledge by supporting peers and staying current on data engineering trends.<br>
<br>
<b>What you will need to succeed:</b><br><br>• Bachelor's or graduate degree in computer engineering, data science, mathematics, or a related discipline.<br>
<br>• Relevant certifications in Azure Data Services or Snowflake are considered an asset.<br>
<br>• 4–6 years of experience in data engineering or a related field.<br>
<br>• Proficient in SQL and familiar with both relational and NoSQL databases (e.g., MS SQL Server, Snowflake, PostgreSQL, Cosmos DB).<br>
<br>• Hands-on experience with Snowflake and DBT for warehousing and data transformation.<br>
<br>• Skilled in designing and optimizing data pipelines and ETL/ELT workflows.<br>
<br>• Experience with cloud platforms, particularly Azure, and cloud-based storage systems.<br>
<br>• Familiarity with data pipeline and orchestration tools such as Azure Data Factory, Airflow, Azkaban, or Luigi.<br>
<br>• Experience leveraging REST APIs for data integration.<br>
<br>• Comfortable working in multidisciplinary teams to address complex data processing challenges.<br>
<br>
<b>Why Recruit Action?</b><br>
<br>
Recruit Action (agency permit: AP) provides recruitment services through quality support and a personalized approach for job seekers and businesses. Only candidates who match the hiring criteria will be contacted.<br>
<br>
# GE220725<br>