Job Description

We are seeking a skilled Data Engineer to join our team. As a key member of our data engineering team, you will design and implement efficient, reliable, and scalable data pipelines. Your primary focus will be building and maintaining data pipelines and ETL/ELT processes on GCP to support analytics and reporting needs. You will work closely with cross-functional teams, including data scientists and product managers, to understand requirements and deliver high-quality data solutions.

Key Responsibilities:
- Design and implement data pipelines using GCP tools such as BigQuery, Dataflow, Pub/Sub, or Cloud Composer
- Collaborate with data scientists and product managers to gather requirements and deliver high-quality data solutions
- Implement data models, schemas, and transformations to support analytics and reporting needs
- Ensure data quality, integrity, and performance by monitoring and optimizing data pipelines

Requirements

To be successful in this role, you will need the following skills and qualifications:
- 3+ years of experience in data engineering, with hands-on expertise in building data pipelines on Google Cloud Platform (GCP)
- Proficiency with GCP tools such as BigQuery, Dataflow, Pub/Sub, or Cloud Composer
- Strong programming skills in Python or Java, plus advanced SQL for data processing
- Experience with data modeling, schema design, and data warehousing concepts

What We Offer

As a valued member of our team, you can expect a range of benefits and opportunities for growth and development:
- A dynamic and multicultural environment that fosters collaboration and innovation
- Growth opportunities and professional development
- A diverse benefits package to support your financial and emotional well-being