Role Overview

We're looking for a Data Engineer with experience building scalable data pipelines and modeling workflows to help us manage the ingestion and transformation of data from various source systems into Snowflake. The ideal candidate will design and lead all data engineering aspects of our analytics product, bring a solid foundation in SQL, data modeling, and ELT orchestration tools, and be comfortable working with cloud infrastructure and modern data stack technologies.

Responsibilities

- Design, develop, and maintain robust ELT pipelines to ingest data from multiple systems into Snowflake (a minimal orchestration sketch appears at the end of this posting).
- Model raw data into clean, reliable, analytics-ready datasets for internal teams and stakeholders.
- Implement and maintain data quality checks, monitoring, and alerting to ensure pipeline reliability (a quality-check sketch also appears at the end of this posting).
- Collaborate with software engineers, analysts, and business users to understand data needs and deliver scalable solutions.
- Write and optimize complex SQL queries in Snowflake for analytics and reporting use cases.
- Manage and document schema changes, data lineage, and data contracts across systems.
- Use tools such as dbt, Airflow, or similar to orchestrate and version data workflows.
- Support infrastructure-as-code practices using tools like Terraform where relevant.

Required Qualifications

- Conversational to professional English.
- 3+ years of experience in data engineering or backend engineering with a focus on data workflows.
- Strong SQL skills, particularly in cloud data warehouses such as Snowflake or BigQuery.
- Hands-on experience designing and maintaining ELT pipelines (dbt, Airflow, custom scripts).
- Familiarity with modern data modeling best practices (e.g., dimensional modeling, star schemas).
- Proficiency in a scripting language such as Python for data transformations and automation.
- Experience working with cloud environments (Azure preferred; AWS or GCP acceptable).
- Understanding of data governance, privacy, and access control principles.

Nice to Have

- Experience integrating data from enterprise systems such as Salesforce, HubSpot, or NetSuite.
- Familiarity with real-time data processing or event-based architectures.
- Exposure to CI/CD tools for data infrastructure (e.g., GitHub Actions, Azure DevOps).
- Experience with monitoring/logging tools for data pipelines.
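To make the orchestration responsibility above concrete, here is a minimal sketch of an Airflow DAG that loads a staged extract into Snowflake and then runs dbt transformations. Every identifier in it (the orders_elt DAG, the raw.orders table, the snowflake_default connection, the dbt project path) is a hypothetical placeholder, not a description of our actual pipelines.

```python
# Minimal Airflow DAG sketch: ingest a staged extract into Snowflake,
# then transform it with dbt. All identifiers are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="orders_elt",              # placeholder pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the raw extract from a cloud-storage stage into Snowflake.
    load_orders = SnowflakeOperator(
        task_id="load_orders",
        snowflake_conn_id="snowflake_default",  # assumed connection ID
        sql="COPY INTO raw.orders FROM @raw_stage/orders;",
    )

    # Build analytics-ready models on top of the raw table with dbt.
    run_dbt = BashOperator(
        task_id="run_dbt",
        bash_command="cd /opt/dbt/analytics && dbt run",  # assumed path
    )

    load_orders >> run_dbt  # ingest first, then transform
```

Keeping the load and transform steps as separate tasks lets a failed dbt run be retried without re-ingesting the raw data.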
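Similarly, the data quality responsibility above usually reduces to automated assertions run on a schedule against warehouse tables. Below is one way such a check might look in Python with the Snowflake connector; the table name, credentials, and alerting behavior are assumptions for illustration only.

```python
# Sketch of a simple data quality check against Snowflake: fail the
# pipeline if the primary key column contains NULLs. All connection
# parameters and table names are hypothetical placeholders.
import snowflake.connector


def count_null_keys(conn, table: str, key_column: str) -> int:
    """Return how many rows in `table` have a NULL `key_column`."""
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
        (null_count,) = cur.fetchone()
    return null_count


def main() -> None:
    # In practice these would come from a secrets manager, not literals.
    conn = snowflake.connector.connect(
        account="my_account",      # placeholder
        user="etl_user",           # placeholder
        password="change_me",      # placeholder; prefer key-pair auth
        warehouse="transforming",  # placeholder
        database="analytics",      # placeholder
    )
    try:
        nulls = count_null_keys(conn, "analytics.orders", "order_id")
        if nulls:
            # A production check would notify an alerting channel
            # (Slack, PagerDuty, etc.) in addition to failing the task.
            raise ValueError(f"{nulls} NULL order_id rows in analytics.orders")
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```

Running a check like this as its own orchestrated task keeps failures visible and stops downstream models from building on bad data.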