




Summary: Join us as a Senior Data DevOps Engineer with AWS expertise to architect and optimize scalable cloud data infrastructure and lead automation strategies.

Highlights:

1. Architect and deploy scalable data infrastructure using AWS services
2. Automate infrastructure workflows leveraging Infrastructure as Code tools
3. Guide and mentor team members while promoting best practices

Join our team as a Senior Data DevOps Engineer with AWS expertise to architect and optimize scalable cloud data infrastructure. You will lead infrastructure automation strategies, mentor teammates, and work closely with cross-functional groups to deliver robust cloud data solutions. We invite you to apply and contribute your expertise to our innovative projects.

**Responsibilities**

* Architect and deploy scalable data infrastructure using AWS services including ECS, RDS, Athena, Glue, S3, EBS, CloudFormation, IAM, and Redshift
* Automate infrastructure workflows using Infrastructure as Code tools such as Terraform and AWS CloudFormation
* Collaborate with data engineering and analytics teams to optimize large-scale data pipelines with Python and AWS Glue
* Define best practices for and manage CI/CD pipelines adapted to AWS environments using GitHub Actions, AWS CodePipeline, Jenkins, or similar tools
* Improve cloud data platform performance, reliability, scalability, and cost efficiency
* Integrate and evaluate advanced data processing tools such as Apache Airflow and Apache Spark on AWS EMR
* Diagnose and resolve complex issues in AWS-hosted systems, performing root cause analysis for critical problems
* Implement strong data security measures, including IAM policies, encryption, and AWS compliance best practices
* Conduct capacity planning, resource tuning, and cost management for data platform components
* Guide and mentor team members while promoting best practices

**Requirements**

* At least 3 years of professional experience in Data Engineering, DevOps, or related areas focused on cloud infrastructure and automation
* Advanced Python programming skills for automation and workflow optimization
* Strong SQL skills for large-scale data management
* Extensive experience with AWS data infrastructure tools such as Redshift, Glue, EMR, and Athena
* Proficiency with Infrastructure as Code tools including Terraform, AWS CloudFormation, or Ansible
* Experience implementing CI/CD pipelines using Jenkins, GitHub Actions, AWS CodePipeline, or similar
* Hands-on expertise with distributed data processing and orchestration using Apache Spark and Apache Airflow
* Advanced Linux administration and tuning experience in cloud settings
* Solid understanding of networking protocols such as TCP, UDP, ICMP, DNS, and NAT in cloud environments
* Experience automating large-scale infrastructure deployment with Terraform or an equivalent tool
* Proven skills in installing and managing data platforms such as Apache Kafka and Apache NiFi
* Ability to lead complex projects and support junior and mid-level team members
* Professional English proficiency at B2 level or higher

**Nice to have**

* Experience with other cloud platforms such as Azure or GCP and with hybrid cloud architectures
* Knowledge of statistical and data visualization tools such as R, Tableau, or Power BI
* Familiarity with additional CI/CD tools such as Bamboo
* Experience with hybrid cloud and on-premises data solutions, ensuring secure and scalable data flows
* Background in container orchestration technologies including Kubernetes, Docker, or AWS ECS/EKS


