Main Tasks
- Operate and optimize Azure resources (ADF, Key Vault, Monitor, Event Hub)
- Administer Databricks workspace access and cluster configs
- Apply Infrastructure-as-Code (Terraform/Bicep)
- Manage CI/CD for Scala- and PySpark-based data pipelines
- Integrate build steps (e.g., Maven/SBT, Python wheels) into automated deployments
- Enforce DevSecOps and IaC standards
- Monitor Spark job execution, analyze failures and stage-level issues using Spark UI and logs
- Configure alerts, metrics, and dashboards for pipelines and infrastructure
- Lead post-incident reviews and reliability improvements
- Administer Power BI tenant configuration, workspace access, and usage monitoring
- Operate and monitor on-premises or VM-hosted enterprise data gateways
- Troubleshoot dataset refreshes and hybrid data integration
- Support runtime execution of production pipelines and ensure SLA adherence
- Collaborate with engineers to resolve Spark performance issues or deployment errors
- Participate in schema evolution and environment transitions
- Enforce platform policies (tagging, RBAC, audit logging)
- Maintain credential and secrets security using Key Vault and managed identity
- Conduct audits across Azure, Databricks, and Power BI environments
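The tagging-policy enforcement mentioned above can be sketched as a simple compliance check. This is a minimal illustration only: the required tag names and the resource dictionary are assumptions, and real enforcement would typically run through Azure Policy or the Azure SDK rather than standalone code.

```python
# Hypothetical tagging policy; actual required tags would come from
# the organization's governance standards.
REQUIRED_TAGS = {"owner", "environment", "cost-center"}

def missing_tags(resource_tags: dict) -> set:
    """Return the required tags absent from a resource's tag map."""
    return REQUIRED_TAGS - set(resource_tags)

# Example: a resource missing its cost-center tag fails the check.
resource = {"owner": "data-platform", "environment": "prod"}
print(sorted(missing_tags(resource)))
```

A check like this could run in a CI/CD stage or a scheduled audit job, flagging non-compliant resources before they reach production.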