




We are seeking a Lead Data DevOps Engineer with AWS expertise to lead the design and management of secure, scalable cloud data infrastructure. You will guide technical strategy, oversee critical projects, and foster team development while collaborating with stakeholders to deliver reliable data solutions. If you are ready to lead in a dynamic environment, apply now.

**Responsibilities**

* Lead the architecture, deployment, and enhancement of AWS data platforms, including ECS, RDS, Athena, Glue, S3, and Redshift
* Drive the adoption of Infrastructure as Code using Terraform and AWS CloudFormation
* Collaborate with data engineering and analytics teams to develop efficient data pipelines using Python and AWS Glue
* Define and implement best practices for CI/CD pipelines with GitHub Actions, Jenkins, and AWS CodePipeline
* Improve system scalability, reliability, performance, and cost-effectiveness
* Evaluate and integrate Apache Airflow and Apache Spark on EMR for data processing
* Manage incident response and root cause analysis for AWS infrastructure issues
* Develop and enforce IAM configurations, security policies, and compliance measures
* Lead resource planning, capacity management, and cost optimization
* Mentor team members and provide technical leadership on AWS data infrastructure
* Work with stakeholders to prioritize projects and set delivery timelines
* Champion continuous improvement and a culture of engineering excellence

**Requirements**

* 5+ years in cloud-focused Data Engineering or DevOps roles
* At least 1 year in a leadership role
* Advanced Python programming skills for automation
* Strong SQL expertise for managing large datasets
* Deep experience with AWS data services, including Redshift, Glue, EMR, Athena, and S3
* Proficiency with Terraform and AWS CloudFormation for Infrastructure as Code
* Experience building and maintaining CI/CD pipelines with Jenkins, GitHub Actions, or AWS CodePipeline
* Knowledge of distributed data processing and orchestration frameworks such as Apache Spark and Apache Airflow
* Strong Linux and container orchestration skills (Docker, ECS/EKS)
* Comprehensive understanding of AWS networking and troubleshooting
* Experience with streaming platforms such as Apache Kafka
* Proven leadership and mentoring experience
* English proficiency at B2+ level

**Nice to have**

* Experience with Azure or GCP cloud platforms
* Knowledge of data analysis and visualization tools such as R, Tableau, or Power BI
* Familiarity with advanced CI/CD platforms and configuration management
* Experience designing secure hybrid cloud and on-premises solutions
* Hands-on Kubernetes cluster management in AWS EKS/ECS environments


