Job Description

At our organization, we create innovative software solutions for leading brands and startups. Our company is recognized as a top performer in the industry, with multiple Best Place to Work awards. We are seeking an experienced DevOps Engineer to join our team. This individual will design, deploy, and operate scalable Kubernetes environments supporting data and analytics workloads.

Key Responsibilities
- Design and deploy scalable Kubernetes environments (EKS or similar) supporting data and analytics workloads
- Build, automate, and maintain complex data pipelines using Argo Workflows for orchestration, scheduling, and workflow automation
- Lead or support the migration of source code repositories and CI/CD pipelines to GitLab or other Git-based platforms
- Develop and manage infrastructure with Terraform and related tools, implementing infrastructure automation and repeatable deployments in AWS and Kubernetes
- Support high-availability S3-based data lake environments and associated data tooling, ensuring robust monitoring, scalability, and security
- Instrument, monitor, and create actionable alerts and dashboards for Kubernetes clusters, Argo workflows, and data platforms to quickly surface and resolve operational issues
- Participate in incident, problem, and change management processes, proactively driving improvements in reliability KPIs (MTTD/MTTR/availability)
- Collaborate with Data Engineering, SRE, Product, and Business teams to deliver resilient solutions and support key initiatives such as Git migration and cloud modernization
- Apply best practices in networking (Layer 4-7), firewalls, VPNs, IAM, and data encryption across the cloud/data stack
- Engage in capacity planning, forecasting, and performance tuning for large-scale cloud and Kubernetes-based workloads
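For candidates unfamiliar with the reliability KPIs named above, a minimal illustrative sketch of how MTTD (mean time to detect) and MTTR (mean time to repair) might be derived from incident records is shown below; all timestamps and incident data are hypothetical:

```python
from datetime import datetime, timedelta

def mean_minutes(deltas):
    """Average a list of timedeltas, expressed in minutes."""
    total = sum(deltas, timedelta())
    return total.total_seconds() / 60 / len(deltas)

# Hypothetical incident records: (fault occurred, detected, resolved)
incidents = [
    (datetime(2024, 1, 5, 10, 0), datetime(2024, 1, 5, 10, 10), datetime(2024, 1, 5, 11, 0)),
    (datetime(2024, 1, 9, 2, 0),  datetime(2024, 1, 9, 2, 4),   datetime(2024, 1, 9, 2, 34)),
]

# MTTD: mean time from fault occurrence to detection.
# MTTR: mean time from detection to resolution.
mttd = mean_minutes([detected - occurred for occurred, detected, _ in incidents])
mttr = mean_minutes([resolved - detected for _, detected, resolved in incidents])
print(f"MTTD: {mttd:.0f} min, MTTR: {mttr:.0f} min")  # MTTD: 7 min, MTTR: 40 min
```

In practice these figures would be aggregated from monitoring and incident-management tooling rather than hand-entered records; the sketch only shows the arithmetic behind the KPIs.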
Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience
- 5+ years of production experience operating and managing Kubernetes clusters (preferably in AWS, EKS, or similar environments)
- Strong hands-on experience with AWS cloud services
- Deep hands-on experience with Argo Workflows, including developing, deploying, and troubleshooting complex pipelines
- Experience with Git, GitLab, and CI/CD, including leading or supporting migration projects and the adoption of GitOps practices
- Effective at developing infrastructure as code with Terraform and related automation tools
- Practical experience automating data workflows and orchestration in a cloud-native environment
- Proficiency in SQL and basic scripting (Python or similar)
- Sound understanding of networking (Layer 4-7), security, and IAM in cloud environments
- Proficiency in Linux-based systems administration (RedHat/CentOS/Ubuntu/Amazon Linux)
- Strong written and verbal communication skills
- Ability to collaborate in cross-functional environments
- Track record of delivering reliable, secure, and scalable data platforms in rapidly changing environments
- Experience working with S3-based data lakes or similar large, cloud-native data repositories
- Upper-Intermediate English level

Benefits
- Professional growth: Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps
- A selection of exciting projects: Join projects developing modern solutions for top-tier clients, including Fortune 500 enterprises and leading product brands
- Flextime: Tailor your schedule for an optimal work-life balance, with the option of working from home or from the office, whichever makes you the happiest and most productive