Senior Data Quality Specialist (R-18814)
Eyeota
Key Responsibilities:
- Execute a comprehensive data quality monitoring strategy that aligns with the organization's Data Quality Standards and business objectives.
- Develop a deep understanding of Dun & Bradstreet’s inventory data and its end‑to‑end lifecycle.
- Perform baseline data quality assessments to proactively identify and address potential issues.
- Apply advanced data analysis and profiling techniques to uncover anomalies, trends, and root causes.
- Collaborate with business stakeholders to ensure requirements and use cases are accurately captured, clarified, and documented.
- Automate data quality monitoring processes and internal workflows to improve efficiency and consistency.
- Create or update data models to ensure data is structured, organized, and optimized for accurate reporting and analysis.
- Use Power BI and/or Looker to design, build, connect, and manage dashboards that provide actionable insights from data quality metrics.
- Implement a robust data validation framework supported by automated testing and continuous monitoring (a minimal sketch follows this list).
- Communicate effectively with globally distributed stakeholders using tools such as JIRA and Confluence.
- Recommend enhancements to internal processes within the data quality team to drive continuous improvement.
- Generate regular reports highlighting key data quality metrics, trends, and areas requiring attention.
- Review data to identify patterns or inconsistencies that may indicate processing errors.
- Develop comprehensive documentation of data quality processes, procedures, and findings, while ensuring junior team members maintain proper documentation.
- Ensure compliance with all data governance policies and procedures.
- Stay current with industry best practices and emerging technologies related to data quality.
- Provide mentorship and guidance to junior data quality engineers, supporting their professional development and technical growth.
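
To make the assessment and validation responsibilities above concrete, here is a minimal Python sketch of a baseline data quality check of the kind this role would automate. The thresholds, column names, and sample data are illustrative assumptions, not Eyeota's actual pipeline or schema.

```python
# Minimal illustrative sketch of an automated baseline data quality check.
# All column names, thresholds, and sample data are hypothetical placeholders.
import pandas as pd

NULL_RATE_THRESHOLD = 0.05  # hypothetical: flag columns with >5% nulls

def baseline_quality_report(df: pd.DataFrame, key_columns: list[str]) -> dict:
    """Compute simple baseline metrics: null rates and duplicate-key counts."""
    null_rates = df.isna().mean()  # fraction of nulls per column
    duplicate_rows = int(df.duplicated(subset=key_columns).sum())
    return {
        "row_count": len(df),
        "duplicate_key_rows": duplicate_rows,
        "columns_over_null_threshold": sorted(
            null_rates[null_rates > NULL_RATE_THRESHOLD].index
        ),
    }

if __name__ == "__main__":
    # Tiny inline sample standing in for a real inventory extract.
    sample = pd.DataFrame(
        {
            "record_id": [1, 2, 2, 4],
            "segment": ["auto", None, None, "travel"],
        }
    )
    print(baseline_quality_report(sample, key_columns=["record_id"]))
    # e.g. {'row_count': 4, 'duplicate_key_rows': 1,
    #       'columns_over_null_threshold': ['segment']}
```

In practice a check like this would feed the dashboards and recurring reports described above, with breaches routed to alerting rather than printed.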
Key Requirements:
- Bachelor’s degree in Business Analytics, Computer Science, Information Technology, or a related field.
- 8+ years of experience, with strong, demonstrated expertise in data analysis, query languages, data modeling, and the software development life cycle.
- Advanced proficiency in SQL, preferably with BigQuery.
- Skilled in Python for data processing, automation, and analytical workflows.
- Strong agile mindset with deep knowledge of Scrum, Kanban, and agile project management practices.
- Hands‑on experience in database design, data modeling, and implementing best‑practice data architecture.
- Experience working with cloud technologies, preferably Google Cloud Platform (GCP).
- Exposure to Firebase Studio or similar application development platforms.
- Experience using modern AI‑assisted development tools such as Copilot Studio, Gemini Code Assist, or Claude Code.
- Familiarity with workflow orchestration tools such as Airflow or GCP Cloud Composer, and with infrastructure‑as‑code using Terraform (a minimal DAG sketch follows this list).
- Proficiency with the Microsoft Office Suite, including Excel, Word, Outlook, and Teams.
- Ability to mentor and provide guidance to junior team members.
- Strong commitment to meeting deadlines and adhering to release schedules.
- Strong analytical, process improvement, and problem‑solving skills.
- Excellent communication skills with the ability to clearly articulate data issues and recommended solutions.
- Proven experience collaborating with global teams across multiple time zones.
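
As an illustration of the orchestration familiarity listed above, here is a minimal Airflow 2.x DAG sketch that schedules a daily data quality check. The DAG id, task id, and check logic are hypothetical; the `schedule` keyword assumes Airflow 2.4+ (older 2.x versions use `schedule_interval`).

```python
# Minimal illustrative Airflow 2.x DAG scheduling a daily data quality check.
# The DAG id, task id, and check logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_quality_checks() -> None:
    # Placeholder: in practice this might query BigQuery, compare metrics
    # against thresholds, and alert on breaches.
    print("running baseline data quality checks")

with DAG(
    dag_id="daily_data_quality_checks",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    tags=["data-quality"],
) as dag:
    PythonOperator(
        task_id="run_quality_checks",
        python_callable=run_quality_checks,
    )
```

In a Cloud Composer environment this file would simply be dropped into the DAGs bucket; the same pattern extends to multi-task pipelines with validation, reporting, and alerting steps.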

