[KL-377] | TRANSFORMATION SPECIALIST

Bebee | Data Engineer


Unlock Business Insights

We are passionate about empowering organizations to make informed decisions. As a nearshore IT partner, we build high-performing teams that help clients innovate faster and more efficiently.

Join Our Culture of Excellence

We believe great culture drives great outcomes. We connect talent and technology to deliver measurable value for clients and meaningful career paths for our people.

What You Can Expect:
- Comprehensive benefits and wellness support
- Flexible work model: hybrid, remote, or in-office
- Real growth opportunities and leadership visibility
- An inclusive, respectful culture that blends innovation with expertise

About the Role

We are seeking a hands-on Data Engineer with proven expertise in building business intelligence environments, designing and maintaining data marts, and handling historical dimensions in Snowflake. The ideal candidate demonstrates technical ownership, clarity of responsibility, and step-by-step reasoning when solving problems. This role suits someone who thrives in fast-paced, high-impact data transformation work and brings both technical depth and attention to data accuracy.

Key Responsibilities:
- Design and implement scalable data marts and BI data models in Snowflake using dimensional modeling (star/snowflake schemas)
- Build and manage ETL/ELT pipelines using tools such as dbt and Azure Data Factory (ADF)
- Handle slowly changing dimensions (SCD Types 1, 2, and 3) with precision, ensuring no duplicates, no gaps, and proper historical tracking (a minimal sketch of the Type 2 pattern appears at the end of this posting)
- Work closely with BI teams and business stakeholders to turn requirements into data solutions
- Optimize Snowflake performance and cost-efficiency
- Document data architecture, lineage, and transformations clearly
- Ensure data pipeline reliability, quality, and compliance with governance standards

Requirements:
- 4+ years of hands-on experience as a Data Engineer in business intelligence environments
- Deep experience with Snowflake and SQL, including table design, optimization, and data modeling
- Expertise in building and maintaining data marts and working with historical data and dimensions
- Experience with tools such as dbt and ADF, and with fast-paced transformation pipelines
- Ability to give direct, step-by-step answers to technical questions, reflecting ownership and clarity
- Familiarity with data integration and orchestration tools (e.g., Apache Airflow, Talend, Fivetran)
- Solid understanding of data governance, security best practices, and version control (e.g., Git)

Nice to Have:
- Experience with Power BI, semantic layers, or similar BI platforms (Tableau, Looker)
- Exposure to cloud platforms such as Azure, AWS, or GCP
- Knowledge of streaming frameworks (e.g., Kafka, Snowpipe, Kinesis)
- Scripting experience in Python or Bash
- Background in Agile/Scrum environments

Equal Opportunity Employer

We celebrate diversity and strive to build a culture of inclusion where all individuals can thrive. We encourage applicants from all walks of life to join our team and make a lasting impact.
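For context on the SCD responsibility above, the sketch below shows one common way to get gap-free Type 2 history in Snowflake using dbt's snapshot feature, which the posting names among its tools. The snapshot name dim_customer_snapshot, the crm.customers source, and the updated_at column are illustrative assumptions, not details from this posting.

```sql
-- snapshots/dim_customer_snapshot.sql
-- Minimal dbt snapshot sketch for SCD Type 2 in Snowflake (assumed names).
-- dbt adds dbt_valid_from / dbt_valid_to columns so each business key keeps
-- a contiguous, non-overlapping history: no duplicate current rows, no gaps.
{% snapshot dim_customer_snapshot %}

{{
    config(
      target_schema='snapshots',
      unique_key='customer_id',
      strategy='timestamp',
      updated_at='updated_at'
    )
}}

-- 'crm'/'customers' is a hypothetical source; point this at the real staging model.
select
    customer_id,
    customer_name,
    customer_segment,
    updated_at
from {{ source('crm', 'customers') }}

{% endsnapshot %}
```

Each `dbt snapshot` run closes out the prior row (filling dbt_valid_to) and inserts a new current row whenever updated_at advances for a given customer_id; Type 1 overwrites and Type 3 "previous value" columns are typically handled in downstream dimension models built on top of this history.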
