(BW-299) - SENIOR DATA ENGINEER

Senior Data Engineer (requisition BW-299)

Agentic Dream


At Agentic Dream, we're accelerating the transformation of global data landscapes. We're seeking highly experienced Senior Data Engineers to help deliver a robust, scalable data architecture across our global ERP systems. If you're passionate about building end-to-end pipelines, enabling AI/BI solutions, and thriving in fast-paced, high-stakes environments, this is your challenge. Learn more about us at: https://www.agenticdream.com/solutions

Requirements

Technical Requirements:
- 7+ years of experience in enterprise-scale data engineering.
- Strong proficiency in:
  - SQL (advanced) and Python for data processing.
  - Spark or Databricks for distributed data workflows.
  - Cloud platforms such as Azure Data Lake/Blob, Synapse, or equivalents.
  - ETL orchestration tools such as Azure Data Factory (ADF), Airflow, or dbt.
  - API integrations and data ingestion from ERP systems (e.g., NetSuite, QuickBooks, Salesforce, RF Smart).
- Demonstrated experience with:
  - Master data frameworks, unit conversion, and ERP-to-warehouse mapping.
  - Handling both structured and unstructured data.
  - Data modeling best practices (star schema, snowflake schema, etc.).

Soft Skills & Work Commitment
- Fluent English (C1 level), required for daily client calls and clear technical documentation.
- Strong interpersonal and collaboration skills for working with cross-functional teams (BI, QA, DevOps, Business Analysts).

Nice to Have
- Experience standardizing data across global manufacturing or supply chain environments.
- Familiarity with Power BI datasets, alert triggers, and integration with Microsoft Teams/email.
- Exposure to AI/ML pipelines, including data preparation for machine learning models or anomaly detection systems.

Responsibilities

Data Integration & Pipeline Development
- Design, build, and maintain scalable ETL/ELT pipelines from diverse ERP sources into centralized data lakes and warehouses.
- Develop connectors for structured and semi-structured data using Python, SQL, APIs, or middleware solutions.

Data Lake & Warehouse Engineering
- Implement bronze, silver, and gold layers for ingestion, cleaning, and curated datasets.
- Organize data structures for optimized use in Power BI and AI systems.

Data Standardization & Cleansing
- Align global units of measure (lbs, kg, packaging, linear feet) across products and regions.
- Execute data deduplication, enrichment, and harmonization across disparate systems.

Architectural Collaboration
- Work closely with the Data Architect Lead on schema definitions, partitioning strategies, and infrastructure design.
- Set up and maintain sandbox/staging environments for safe testing.

Power BI & AI Enablement
- Provide ready-to-use, clean datasets to support BI dashboards and AI/ML use cases.

Documentation & Governance
- Document pipeline architectures, data transformation logic, and integration points clearly.
- Ensure adherence to data governance policies and assist with metadata management.

trabajosonline.net © 2017–2021