DATA ENGINEERING

13-14 USD/Hour
Lancesoft


Job Description: Designs and develops complex software that processes, stores, and serves data for use by others. Designs and develops complex, large-scale data structures and pipelines to organize, collect, and standardize data to generate insights and address reporting needs. Writes complex ETL (Extract / Transform / Load) processes, designs database systems, and develops tools for real-time and offline analytic processing. Ensures that data pipelines are scalable, repeatable, and secure. Improves data consistency and integrity. Integrates data from a variety of sources, ensuring that they adhere to data quality and accessibility standards. Has knowledge of large-scale search applications and building high-volume data pipelines. Knowledge of Java, Hadoop, Hive, Cassandra, Pig, Postgres, Snowflake, Domo, Iceberg.

Complexity & Problem Solving:
- Learns routine assignments of limited scope and complexity.
- Follows practices and procedures to solve standard or routine problems.

Autonomy & Supervision:
- Receives general instructions on routine work and detailed guidance from more senior members on all new tasks.
- Work is typically reviewed in detail at frequent intervals for accuracy.

Communication & Influence:
- Builds stable internal working relationships.
- Communicates with and seeks guidance/feedback regularly from more senior members of the team.
- Primarily interacts with supervisors, project leads, mentors, or other professionals in the same discipline.
- Explains facts, policies, and practices related to the discipline.

Key Responsibilities:
•Creating and engineering data pipelines for end-to-end delivery of raw and streamed datasets from cloud platforms into data warehouses like Snowflake
•Data modeling to facilitate efficient use of the data warehouse and high availability of operational reports and dashboards
•Dashboard and report design and creation in tools such as Domo or Tableau, based on business requirements
•Implement and manage production support processes around data lifecycle, data governance, master data management, data quality, coding utilities, storage, reporting, and other data integration points
•Assist and work closely with business stakeholders, platform development teams, and data/research scientists by delivering data-driven solutions that leverage company data to drive business outcomes, thereby driving the growth of the Yum platform and all the brands our platform supports
•Participate in and lead design sessions, demos, prototype sessions, testing, and training workshops with business users and other IT associates

You have:
•A bachelor's degree or equivalent in Computer Science / Information Systems or a related field
•5+ years of hands-on experience coding SQL and PL/SQL and working with databases like Postgres, Mongo, Couchbase
•5+ years of experience with tools like Domo or Tableau
•2+ years' experience with Kafka and Snowflake/Snowpipe, hands-on coding skills in languages like Golang, JavaScript, Python, Docker / shell scripting, and hands-on experience with Debezium or equivalent CDC tooling
•3+ years' experience working in and modeling within a massively parallel processing (MPP) database such as Snowflake
•A collaborative attitude

We prefer experience with:
•Postgres
•Debezium
•Kafka
•Snowflake
•Domo
•JSON data structures
•Agile methodologies
•DevOps technologies (Docker, AWS, GitLab)
•Git
•Microservice architecture
•Ecommerce
