GFK-773 DATA ARCHITECTURE ENGINEER

Bebeedataengineer


Job Description

As a key member of our data team, you will play a vital role in data-driven decision-making by developing and maintaining modern data infrastructure. Your primary objective will be to advance Analytics and Data Science efforts by designing and deploying data platforms and pipelines. You'll collaborate with cross-functional teams to innovate data tools and systems supporting Business Intelligence and Data Analysis in the dynamic freight industry.

Key Responsibilities:
- Build and maintain robust ETL/ELT pipelines, ensuring efficient extraction, transformation, and loading of data from diverse source systems into the data warehouse.
- Develop and manage data pipelines tailored to feature engineering for machine learning systems, facilitating data-driven analytics and insight generation.
- Collaborate closely with data engineering colleagues to review code, enhance design patterns, and implement new data systems using SQL and Python for optimal performance.
- Author and update technical and functional documentation for systems owned by Data Engineering, ensuring clarity and accessibility for team members.
- Establish data quality frameworks aimed at improving the observability and reliability of data pipelines and engineering systems.
- Work alongside analysts and business stakeholders to refine and optimize systems supporting Business Intelligence, dashboarding, and reporting tools.
- Set up secure integrations between engineering and analytics platforms interfacing with the Data Platform.
- Participate in an on-call rotation, addressing and resolving data pipeline issues promptly to maintain system integrity.

Required Skills & Qualifications
- 5+ years of experience in a technical, data-focused role, with practical knowledge of building and deploying ETL/ELT pipelines.
- Advanced proficiency in SQL, demonstrated by 2+ years of hands-on experience.
- Intermediate experience with Python for data transformation, analytics, and automation.
- Experience with modern data tooling, including Snowflake, Airflow, dbt, and Azure Data Factory (ADF).
- Intermediate skills in using Business Intelligence (BI) tools.
- Familiarity with agile frameworks for effective project management.
- Bachelor's degree in Computer Science, IT, Analytics, or a related field.
- Strong written and oral communication and presentation abilities.

Benefits
- Join a powerful tech workforce and help us change the world through technology.
- Professional development opportunities with international customers.
- Collaborative work environment.
- Career path and mentorship programs that will lead you to new levels.

We celebrate diversity and are committed to creating an inclusive environment for all employees.
