We are seeking a DataOps Engineer to support our Tech Delivery and Infrastructure Operations teams by ensuring the reliability, automation, and performance of our analytics and data platforms. The role centers on DataOps, blending DevOps and SRE practices to sustain and optimize data environments across global business units. You will oversee end-to-end data operations, from SQL diagnostics and pipeline reliability to the automation, monitoring, and deployment of analytics workloads on cloud platforms. Working closely with Data Engineering, Product, and Infrastructure teams, you will help maintain scalable, secure, and high-performing systems.
Key Responsibilities
- Manage and support data pipelines, ETL processes, and analytics platforms, ensuring reliability, accuracy, and accessibility
- Execute data validation, quality checks, and performance tuning using SQL and Python/Shell scripting
- Implement monitoring and observability using Datadog, Grafana, and Prometheus to track system health and performance
- Collaborate with DevOps and Infrastructure teams to integrate data deployments into CI/CD pipelines (Jenkins, Azure DevOps, Git)
- Apply infrastructure-as-code principles (Terraform, Ansible) for provisioning and automation of data environments
- Support incident and request management via ServiceNow, ensuring SLA adherence and root cause analysis
- Work closely with security and compliance teams to maintain data governance and protection standards
- Participate in Agile ceremonies within Scrum/Kanban models to align with cross-functional delivery squads