We are seeking a highly proficient remote Senior Big Data Engineer with extensive experience in Databricks, Python, AWS, and Data Warehousing. The ideal candidate is well versed in software development lifecycle methodologies and proficient in advanced SQL coding. As a Senior Big Data Engineer, you will develop and implement scalable, resilient systems, using data lake solutions and stream processing engines to meet business goals.

Responsibilities
- Develop and implement scalable, resilient systems using data lake solutions
- Participate in the full software development lifecycle (SDLC) using Agile methodologies
- Build and maintain data pipelines using stream processing engines such as Kafka and Spark
- Perform advanced data analysis and modeling using relevant programming languages
- Ensure data quality and integrity through proper testing and validation techniques

Requirements
- Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent
- At least 3 years of relevant experience in data software engineering
- Proficiency with software development lifecycle (SDLC) methodologies such as Agile and test-driven development
- Extensive experience with Databricks, Python, AWS, and data warehousing
- Advanced SQL coding skills
- B2+ English level

Nice to have
- Experience with Apache Airflow and Apache Spark
- Expertise in relational database design

We offer/Benefits
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn