
Job Title: Data Quality Engineer

Company: NOBL Q

Location: Gurgaon, Haryana

Created: 2026-01-17

Job Type: Full Time

Job Description:

Job Summary:
We are seeking a skilled Data Quality Engineer to ensure the accuracy, reliability, and integrity of our data pipelines and workflows. The ideal candidate will have hands-on experience with data engineering concepts and a strong focus on quality testing, validation, and pipeline orchestration.

Key Responsibilities:
- Design, develop, and execute data quality test cases to validate data pipelines and ETL/ELT processes
- Monitor and trigger data pipelines, ensuring smooth execution and timely data delivery
- Run and maintain data quality scripts to identify anomalies, inconsistencies, and data integrity issues
- Perform data profiling and validation across multiple data sources and targets
- Collaborate with data engineers to implement data quality checks at various stages of the pipeline
- Perform root cause analysis (RCA) for data anomalies and pipeline failures
- Troubleshoot pipeline failures and data quality issues, working to resolve them efficiently
- Document data quality standards, testing procedures, and validation results
- Generate data quality reports and communicate findings to engineering teams
- Develop automated testing frameworks to improve data quality validation efficiency
- Focus primarily on validating and assuring the quality of existing pipelines (not building full pipelines)

Required Technical Skills:
- Strong understanding of data engineering concepts, including ETL/ELT processes, data warehousing, and data modeling
- Proficiency in SQL for complex data validation and querying
- Experience with scripting languages such as Python or Shell for automation
- Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow, Azure Data Factory, AWS Glue)
- Knowledge of data quality frameworks and tools (e.g., Great Expectations, Deequ, custom validation scripts)
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services
- Understanding of data formats (JSON, Parquet, Avro, CSV) and data storage systems
- Exposure to logging/monitoring tools (CloudWatch, Datadog, ELK, etc.) is a plus

Preferred Skills:
- Experience with big data technologies (Spark, Hadoop, Kafka)
- Knowledge of CI/CD practices for data pipelines
- Familiarity with version control systems (Git)
- Understanding of data governance and compliance requirements
- Experience with data visualization tools for quality reporting

Soft Skills:
- Strong analytical and problem-solving abilities
- Excellent attention to detail
- Good communication skills for collaborating with cross-functional teams
- Ability to work independently and manage multiple priorities
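For illustration only, the sketch below shows the kind of data quality check this role involves, written in plain Python with pandas rather than a specific framework such as Great Expectations or Deequ; all table, column, and allowed-value names are hypothetical.

# Minimal data quality check sketch (pandas only; column names and values are invented).
# Covers common validations for this role: null checks, uniqueness, range and
# allowed-value checks, and a basic row-count check, with a simple pass/fail report.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return a mapping of check name -> bool (True means the check passed)."""
    return {
        # A primary key should be present and unique
        "order_id_not_null": df["order_id"].notna().all(),
        "order_id_unique": df["order_id"].is_unique,
        # Amounts should never be negative
        "amount_non_negative": (df["amount"] >= 0).all(),
        # Status values must come from a known set
        "status_in_allowed_set": df["status"].isin({"NEW", "SHIPPED", "CANCELLED"}).all(),
        # An empty extract usually signals an upstream failure
        "row_count_positive": len(df) > 0,
    }

if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 2, 3],
        "amount": [10.5, 0.0, 99.9],
        "status": ["NEW", "SHIPPED", "CANCELLED"],
    })
    report = run_quality_checks(sample)
    for check, passed in report.items():
        print(f"{check}: {'PASS' if passed else 'FAIL'}")
    # A real pipeline would alert or block downstream loads on failure
    assert all(report.values()), "Data quality checks failed"

In practice, checks like these would typically be wired into an orchestration tool (for example, as a task in an Apache Airflow DAG) so that failures stop downstream loads and trigger alerts.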
