**YOUR NEXT STEP IS AT RAPPI!**

Rappi is one of the first Latin American unicorns and a start-up that remains focused on growing and making life easier for our users. As a company, we aim to keep improving the services we already offer, add more to our offering, and continue expanding across Latin America.

In this role you will ensure that data is available and up to date in Snowflake by building and maintaining ETL pipelines that integrate sources such as PostgreSQL, MySQL, and Google Sheets, loading terabytes of data from hundreds of sources into our Snowflake Data Warehouse. Fast execution and prioritization are key for our team.

**Main Responsibilities**:
- Write highly proficient SQL and NoSQL queries.
- Administer Snowflake and optimize its cost; make data accessible to different user groups in the right way and securely.
- Administer Astronomer-hosted Airflow: set up new environments and instances, and configure environment variables.
- Set up and maintain ETL pipelines with Fivetran connectors over SSH or PrivateLink connections.
- Develop Python automations for JIRA ticket management, monitor and fix Fivetran connectors, and run automated Snowflake queries for administration and monitoring.
- Design and implement different types of projects using AWS services and third-party tools.
- Support issues reported in the Slack channel related to Snowflake, Fivetran, or Astronomer/Airflow, and resolve JIRA tickets.
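As a flavor of the monitoring automations described above, here is a minimal Python sketch that flags unhealthy Fivetran connectors so an alert can be posted to Slack or a JIRA ticket opened. The field names (`schema`, `paused`, `status.sync_state`, `status.setup_state`) are modeled loosely on Fivetran's REST API connector payload but should be treated as illustrative assumptions, not the exact schema:

```python
# Hypothetical sketch: flag Fivetran connectors that are paused or failing,
# as a first step before alerting in Slack or opening a JIRA ticket.
# Field names are illustrative, loosely based on Fivetran's connector payload.

def broken_connectors(connectors):
    """Return the schema names of connectors that need attention."""
    flagged = []
    for conn in connectors:
        status = conn.get("status", {})
        if (conn.get("paused")
                or status.get("sync_state") == "rescheduled"
                or status.get("setup_state") == "broken"):
            flagged.append(conn["schema"])
    return flagged

if __name__ == "__main__":
    sample = [
        {"schema": "postgres_orders", "paused": False,
         "status": {"sync_state": "syncing", "setup_state": "connected"}},
        {"schema": "mysql_users", "paused": True,
         "status": {"sync_state": "paused", "setup_state": "connected"}},
        {"schema": "gsheets_budget", "paused": False,
         "status": {"sync_state": "rescheduled", "setup_state": "broken"}},
    ]
    print(broken_connectors(sample))  # → ['mysql_users', 'gsheets_budget']
```

In practice the connector list would come from Fivetran's API and the flagged names would feed the JIRA/Slack automations mentioned in the responsibilities.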
**Key Requirements**:
- Advanced Python, SQL, and NoSQL skills
- Experience working with Linux systems
- Experience working with Snowflake
- Experience working with Astronomer + Airflow
- Experience working with AWS (DMS, S3, EC2, DocumentDB)
- Experience with CDC (Change Data Capture) and schema evolution, building ETL/ELT pipelines, data modeling, and data analytics
- English B1+

**Desired Requirements**:
- Experience working with Fivetran
- Experience working with Atlassian tools (JIRA, Opsgenie, Confluence, Bitbucket)
- Experience working with Slack
- Experience working with PySpark
- Experience working with Databricks

I have read and accept the Authorization of Personal Data from Rappi S.A.S., in accordance with the Personal Data Treatment Policy.