Job Overview
The Data Quality Engineer will ensure the accuracy, consistency, and reliability of data across the Microsoft Fabric Data Warehouse, its integrations, and connected services. The role focuses on testing and validating data pipelines, event-driven processes, and Azure-based integrations such as Logic Apps and Service Bus. The engineer will collaborate closely with data engineers, architects, and business units to maintain high-quality, trustworthy data for analytics and reporting.
Key Responsibilities
- Design, develop, and execute data quality and validation tests for Microsoft Fabric Data Warehouse objects (tables, views, semantic models).
- Validate ETL/ELT pipelines, including Fabric Dataflows and integrated source systems, for accuracy and completeness.
- Test and monitor event-driven data processes, including Azure Service Bus messages, queues, and topics.
- Validate Azure Logic Apps workflows and integrations that move or transform data between systems.
- Perform end-to-end testing of data ingestion, transformations, and delivery to downstream consumers or BI systems.
- Develop and maintain automated data quality checks, reconciliation scripts, and test frameworks.
- Identify, analyze, and report data quality issues, including root cause analysis and impact assessment.
- Collaborate with Data Engineers, BI teams, and business units to define data quality rules, metrics, and acceptance criteria.
- Monitor data quality KPIs, implement controls to prevent defects, and ensure reliability in production pipelines.
- Document test cases, workflows, data quality rules, and testing outcomes clearly for technical and business stakeholders.
- Apply unit testing, automated data validation, and integration testing tools such as dbt, Great Expectations, tSQLt, and pytest, along with Azure Logic Apps and Service Bus testing scripts.
- Validate and test API endpoints and integration workflows using tools such as Postman, ensuring data is transmitted accurately between systems.
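To illustrate the kind of automated data quality and reconciliation checks this role involves, here is a minimal sketch in plain Python. The table names, fields, and in-memory datasets are hypothetical stand-ins for warehouse objects; in practice these checks would run via pytest or a framework such as Great Expectations against live Fabric tables.

```python
# Hypothetical data quality checks: completeness, validity, and
# reconciliation between a source system and a warehouse target.

def row_count_matches(source_rows, target_rows):
    """Completeness check: every source row should land in the target."""
    return len(source_rows) == len(target_rows)

def null_check(rows, required_fields):
    """Validity check: required fields must be populated.
    Returns a list of (row_index, field) failures."""
    failures = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append((i, field))
    return failures

def reconcile_sums(source_rows, target_rows, field):
    """Reconciliation check: aggregate totals should match end to end."""
    source_total = sum(r[field] for r in source_rows)
    target_total = sum(r[field] for r in target_rows)
    return source_total == target_total

# Stand-in datasets (in practice, query results from source and warehouse).
source = [
    {"order_id": 1, "amount": 100.0},
    {"order_id": 2, "amount": 250.5},
]
target = [
    {"order_id": 1, "amount": 100.0},
    {"order_id": 2, "amount": 250.5},
]

assert row_count_matches(source, target)
assert null_check(target, ["order_id", "amount"]) == []
assert reconcile_sums(source, target, "amount")
```

Checks like these are typically scheduled after each pipeline run, with failures logged for root cause analysis and surfaced to the data quality KPIs described above.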