Senior Data DevOps Engineer with Azure expertise
Indeed
Full-time
Onsite
No experience requirement
No degree requirement
Description

**Summary**

Lead the design and optimization of data infrastructure and workflow automation on Azure, ensuring high pipeline efficiency, stability, and performance.

**Highlights**

1. Lead data infrastructure design and optimization on Azure.
2. Automate data operations with Python for enhanced reliability.
3. Collaborate with teams to optimize data system scalability.

We invite a Senior Data DevOps Engineer to join our team and lead the design and optimization of data infrastructure and workflow automation on Azure. This role involves creating scalable data solutions and collaborating closely with data engineering and other teams to ensure high pipeline efficiency, stability, and performance. Apply now to become part of our innovative data team.

**Responsibilities**

* Design, deploy, and maintain data infrastructure using Azure services, including Data Lake, Databricks, Synapse, and Data Factory
* Collaborate with data engineering to build and sustain efficient data pipelines and workflows
* Automate data operations with Python to enhance reliability and efficiency
* Manage CI/CD pipelines using Jenkins, GitHub Actions, or similar technologies
* Work with cross-functional teams to optimize data system scalability, reliability, and performance
* Install and configure data tools such as Apache Spark and Kafka in cloud and on-premises environments
* Continuously monitor data systems to identify and resolve performance and scalability issues
* Troubleshoot complex challenges across data platforms and pipeline processes

**Requirements**

* Minimum of 3 years' experience in data engineering or similar roles
* Strong Python programming expertise in batch processing workflows
* Advanced SQL skills for querying and managing large datasets
* Deep experience with Azure cloud services for data infrastructure
* Hands-on knowledge of Infrastructure as Code tools such as Ansible, Terraform, or CloudFormation
* Ability to set up and maintain CI/CD pipelines via Jenkins, GitHub Actions, or similar
* Experience with data tools such as Spark, Airflow, or R for workflow management
* Advanced Linux skills, including scripting and system administration
* Solid understanding of network protocols, including TCP, UDP, ICMP, DHCP, DNS, and NAT
* English communication proficiency at B2+ level, both written and spoken

**Nice to have**

* Familiarity with additional cloud platforms such as AWS or GCP
* Experience with Kubernetes for container orchestration in data workflows
* Knowledge of monitoring and observability tools, including Prometheus, Grafana, or Azure Monitor
* Exposure to Big Data technologies and complex analytics workflows
* Hands-on experience with cloud data governance and security best practices

Source: Indeed
Valentina Rodríguez
Indeed · HR

Company

Indeed
Active today