SENIOR DATA DEVOPS ENGINEER - UJ482

EPAM Systems


Join EPAM Systems as a Senior Data DevOps Engineer and play a crucial part in defining and implementing the architecture of our projects. In this role, you will work closely with customers, peers, and vendors to resolve complex issues, develop strategic solutions, and maintain technical standards.

About the Role
- Lead the design, deployment, and management of data infrastructure in the cloud using major cloud platforms such as AWS, Azure, or GCP.
- Develop and maintain robust CI/CD pipelines for data infrastructure and applications.
- Automate and streamline data-related processes to ensure scalability, reliability, and efficiency.
- Ensure the security, availability, and optimal performance of data platforms.
- Provide technical leadership, guidance, and mentorship to engineering teams.
- Install, configure, and maintain data tools such as Apache Spark, Apache Kafka, the ELK Stack, Apache NiFi, Apache Airflow, or similar, in both on-premises and cloud environments.
- Monitor and troubleshoot data systems, proactively identifying and resolving performance, scalability, and reliability challenges.

About Us
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture, collaborating with multinational teams and contributing to innovative projects that deliver creative, cutting-edge solutions.

Requirements
- Minimum of 3 years of relevant professional experience.
- Proven expertise in designing and implementing Data DevOps solutions.
- Strong proficiency in cloud platforms such as Azure, GCP, or AWS.
- Extensive experience with Infrastructure as Code tools such as Ansible, Terraform, or CloudFormation.
- Demonstrated ability to set up and manage CI/CD pipelines using tools such as Jenkins, Bamboo, TeamCity, GitLab CI, or GitHub Actions.
- Proficiency in scripting languages and automation tools such as Python, PowerShell, or Bash.
- Solid understanding of containerization and orchestration technologies such as Docker and Kubernetes.
- In-depth knowledge of network protocols and mechanisms, including TCP, UDP, ICMP, DHCP, DNS, and NAT.
- Experience installing, configuring, and optimizing data tools such as Apache Spark, Apache Kafka, the ELK Stack, Apache NiFi, Apache Airflow, or similar.
- Professional mastery of the Linux operating system.
- Strong SQL skills.
- Excellent collaboration and communication skills.
