
Data Quality Engineer (dbt) in India

Braintrust

Data Science, Quality Assurance
Uttar Pradesh, India
Posted on Feb 6, 2025
Job Description

Job Title: Data Quality Engineer (dbt)

Job Type: Freelance, ad-hoc basis

Location: Remote in India

Job Summary:

We are seeking a highly skilled Data Quality Engineer to ensure the accuracy, completeness, consistency, and reliability of data used within our systems and processes. This part-time freelance role requires a proactive individual with strong technical expertise in data quality standards, validation, and resolution of data issues. The ideal candidate will work closely with engineering and data teams to develop robust data quality frameworks and monitoring solutions.

Key Responsibilities:

Develop and enforce data quality standards to identify and resolve issues.

Perform data profiling, validation, monitoring, and reporting to ensure data integrity.

Utilize advanced SQL for querying large datasets and performing complex data operations.

Leverage PySpark for data processing, transformation, validation, and analysis.

Use Python for scripting and automation of data quality checks.

Work with Databricks to collaborate, scale, and automate data pipelines.

Assist in designing and maintaining data governance frameworks for quality assurance.

Monitor, troubleshoot, and improve existing data pipelines and workflows.
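To give a concrete flavor of the responsibilities above, a basic data-quality check combines SQL profiling with Python automation. The sketch below is purely illustrative (it is not the company's actual framework) and uses Python's built-in sqlite3 in place of a production warehouse; the table and column names are hypothetical:

```python
import sqlite3

def run_quality_checks(conn, table, key_column):
    """Report row count, null rate, and duplicate-key count for one table."""
    cur = conn.cursor()
    total = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL"
    ).fetchone()[0]
    duplicates = cur.execute(
        f"SELECT COUNT(*) FROM ("
        f"  SELECT {key_column} FROM {table}"
        f"  WHERE {key_column} IS NOT NULL"
        f"  GROUP BY {key_column} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    return {
        "row_count": total,
        "null_rate": nulls / total if total else 0.0,
        "duplicate_keys": duplicates,
    }

# Demo: an in-memory table with one null key and one duplicated key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 9.99), (2, 5.00), (2, 5.00), (None, 1.25)],
)
report = run_quality_checks(conn, "orders", "id")
print(report)
```

In practice the same pattern scales up: the profiling queries run against a warehouse or a PySpark DataFrame, and the returned metrics feed monitoring and alerting rather than a print statement.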

Required Skills & Experience:

Advanced SQL expertise for data querying, validation, and performance tuning.

Hands-on experience with PySpark for large-scale data processing.

Strong programming skills in Python for data analysis and automation.

Experience with Databricks for scalable data analytics and data pipeline automation.

Ability to collaborate effectively with data engineers, analysts, and business teams.

Nice to Have (Not Essential):

Experience with Google Cloud Platform (GCP) including BigQuery, Google Composer, and Spark Serverless.

Familiarity with dbt (Data Build Tool), an emerging framework in our environment.

Exposure to Great Expectations, our current tool for performing data quality checks.
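For context on the dbt tooling mentioned above, declarative data-quality tests in dbt are typically defined in a model's YAML schema file. This is a generic hypothetical fragment, not taken from this posting; the model and column names are illustrative:

```yaml
# models/schema.yml -- hypothetical dbt schema tests
version: 2

models:
  - name: orders
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: status
        tests:
          - accepted_values:
              values: ['placed', 'shipped', 'returned']
```

Running `dbt test` compiles each of these declarations into a SQL query that fails if any violating rows are found, which complements tools like Great Expectations for pipeline-level validation.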

This role requires a highly competent and proactive individual with the ability to lead data quality initiatives and maintain high standards of data reliability.