Data Engineer

Location: Remote in Latin America
Job Type: Monthly Contractor

For many U.S. companies, the challenge of attracting and retaining affordable, highly skilled software developers within their borders remains significant. Meanwhile, across Latin America, a wealth of talented developers seeks opportunities to work on stable, innovative projects with industry-leading companies.

At AgilityFeat, we’ve spent over a decade solving this puzzle. Since 2010, our U.S.-based company, with offices in Panamá and Colombia, has been the bridge connecting these two worlds. We’ve helped countless U.S. companies scale their development teams while opening doors for Latin American tech professionals to access career-defining opportunities.

If you’re a Latin American developer looking to work with U.S. teams while enjoying the flexibility of remote work, we’d love to hear from you.

Note: Fluent English is essential for seamless communication within our teams and with international stakeholders.

About You

- You have a strong ability to design and develop scalable data pipelines and cloud solutions.
- You thrive in a collaborative environment, working closely with analysts, dashboard developers, and technical project managers.
- You have a solid understanding of business needs and how to translate them into robust data solutions.
- You are comfortable learning new data tools and best practices to enhance your expertise.

Key Responsibilities

- Cloud Data Engineering: Design and develop cloud ELT and data pipelines using various technologies.
- DataOps & CI/CD: Develop CI/CD pipelines and implement DataOps best practices.
- Business & Technical Communication: Communicate effectively with business and technology teams to understand needs and pain points.
- Innovation & Optimization: Identify creative solutions that align with business objectives and improve system performance.
- Documentation & Design: Develop core design documents, including data flows, source-to-target mappings, data lineage, and data dictionaries.

Technical Requirements

- Strong experience designing and developing cloud ELT and data pipelines with technologies such as Python, SQL, Airflow, Talend, Matillion, dbt, and Fivetran.
- Experience working with cloud data warehouse technologies, including Snowflake, AWS Redshift, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics.
- Deep understanding of data modeling methodologies, including Kimball, Inmon, and Data Vault design approaches.
- Expertise in developing performant ELT jobs for data movement and transformation, as well as performing exploratory data analysis, data cleansing, and aggregation.
- Strong experience scaling and automating data preparation techniques.
- Proficiency in developing and operating CI/CD pipelines and DataOps fundamentals.

Soft Skills

- Strong analytical and problem-solving skills.
- Ability to communicate technical concepts to non-technical stakeholders.
- Ability to work both independently and collaboratively with a team.
- Strong organizational and time management skills.

Preferred Qualifications

- Experience with big data technologies such as Hadoop, Spark, and MongoDB.
- Certifications in Microsoft Azure, AWS, or Snowflake (Azure Data Engineer Associate, AWS Data Analytics Specialty, SnowPro Core, etc.).
- Certifications in Agile methodologies (Scrum, SAFe).
- Undergraduate degree in a relevant field such as Computer Science, Data Engineering, or Business.

Fluent English is mandatory. All information must be submitted in English.