Data Quality

Analysts devote significant effort to addressing data quality issues, because sound data quality underpins any successful data-driven strategy. Butterfly's data quality service provides our clients with the support needed to design, plan, and execute an effective data quality strategy.

We help organizations develop the business rules and reports that deliver quality control and assurance. Once implemented, client data can be assessed, cleansed, enhanced, and enriched, benefiting all downstream consumers.


Features

  • Experience – on-site consultants with extensive data management experience

  • Profiling – evaluating the completeness and consistency of data sources

  • Parsing – breaking complex data elements into logical parts

  • Standardization – consolidating related data values

  • Enrichment – combining data with reference sources

  • Matching – creating associations that provide a complete view of data

  • Rules – defining the allowable data structures, attributes, and values

  • Actions – defining how non-compliant data is handled

  • Remediation – services that cleanse and enhance data

  • Expertise – specialized tools (e.g. SAS DataFlux Data Management Studio)
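To make the profiling, parsing, standardization, and matching activities above more concrete, here is a minimal sketch in plain Python. The records, field names, and normalization rules are illustrative assumptions for this example only, not Butterfly's actual tooling or rule set.

```python
import re
from collections import Counter

# Hypothetical customer records; field names and values are illustrative only.
records = [
    {"name": "Ada Lovelace", "phone": "+44 20 7946 0958", "country": "UK"},
    {"name": "ada  lovelace", "phone": "02079460958", "country": "United Kingdom"},
    {"name": "Grace Hopper", "phone": None, "country": "USA"},
]

# Profiling: measure the completeness of a field across the data set.
def completeness(rows, field):
    filled = sum(1 for r in rows if r.get(field))
    return filled / len(rows)

# Parsing: break a free-text name into logical parts.
def parse_name(name):
    parts = name.split()
    return {"given": parts[0], "family": " ".join(parts[1:])}

# Standardization: consolidate related values onto one canonical form
# (an example mapping, not a complete reference table).
COUNTRY_CANON = {"uk": "GB", "united kingdom": "GB", "usa": "US"}

def standardize_country(value):
    return COUNTRY_CANON.get(value.strip().lower(), value)

# Matching: a crude match key (whitespace-normalized, lower-cased name)
# used to associate records that likely describe the same entity.
def match_key(record):
    return re.sub(r"\s+", " ", record["name"]).strip().lower()

print(completeness(records, "phone"))        # share of records with a phone
print(standardize_country("United Kingdom"))
duplicates = [k for k, n in Counter(match_key(r) for r in records).items() if n > 1]
print(duplicates)                            # candidate duplicate groups
```

In practice each of these steps would be driven by configurable rules in a data quality tool rather than hand-written functions, but the underlying operations are the same.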

Benefits

  • Realization of outcomes from technology investment

  • Competitive advantage through use of innovative and emerging technologies

  • Reduced data duplication, fewer silos, and enhanced matching

  • A consistent, single view of customer data across the organization

  • Defined standards for data ingestion, driving continuous improvement

  • Metrics to measure and compare data quality

  • Less time spent investigating reconciliation errors

  • Lower costs for processing dirty data

  • Reduced model development time and improved accuracy

  • Support for regulatory compliance (e.g. GDPR)
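The metrics mentioned above can be as simple as per-batch scores that are tracked and compared over time. The sketch below computes three common ones: completeness, validity, and uniqueness. The records, the email-format rule, and the choice of metrics are illustrative assumptions, not a prescribed metric set.

```python
import re

# Illustrative batch of records; the duplicate id is deliberate.
batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "not-an-email"},
    {"id": 2, "email": "b@example.com"},
]

# A simplistic example validity rule, not a full email validator.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def quality_metrics(rows):
    n = len(rows)
    return {
        # Completeness: share of records with a non-empty email.
        "completeness": sum(1 for r in rows if r.get("email")) / n,
        # Validity: share of emails matching the example rule.
        "validity": sum(1 for r in rows if EMAIL_RE.match(r.get("email") or "")) / n,
        # Uniqueness: share of distinct ids among all records.
        "uniqueness": len({r["id"] for r in rows}) / n,
    }

print(quality_metrics(batch))
```

Scores like these give a baseline for the continuous-improvement loop: each new batch can be measured against the previous ones, and a falling score flags an upstream problem.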

Description

The service can be customized to integrate with the tools and services in your data pipeline, and we can deploy either on-premises or to your chosen cloud provider.

It can be delivered as:

  • a one-off exercise, supported by our consultancy team

  • an ongoing batch service

  • an ongoing transaction service

Butterfly has a standard set of procedures for initiating the service:

  1. Agree the service model and scope

  2. Agree the security model and service standards

  3. Agree the data transfer and/or system integration methods

  4. Configure the necessary infrastructure

  5. Conduct an initial assessment/review

  6. Refine the service parameters as appropriate

  7. Implement and test the service

  8. Confirm service acceptance and commence the service

The off-boarding process depends on the service model chosen, and will be agreed as part of the acceptance criteria.

For consultant-led services, we will ensure a successful handover of the knowledge and documentation to the relevant team. Any infrastructure put in place for the agreed service period will be decommissioned, and any data retained will be securely destroyed.

Planning

Planning a data quality project is often more complex than expected. Business rules and logic are often implemented across multiple disparate systems and are rarely well documented, so analysis can surface unexpected inconsistencies and ambiguities.

It is important to recognize the need for feedback loops, and for ongoing resources to maintain any solution.

You will also need to understand and manage impacts to downstream systems and users. 

Butterfly’s planning service builds on the trusted CRISP-DM methodology to provide our clients with practical business analysis and cloud architecture support.

We offer tailorable support to most effectively assist you in planning a data quality project using cloud-first technologies, including:

  • Defining your objectives, capabilities, and refining your business case

  • Assessing the challenge, identifying needs, dependencies, and options

  • Preparing for change, engaging users, and managing stakeholders

  • Developing effective solutions, proofs of concept, and viable products

  • Evaluating options for optimal suitability and effectiveness

  • Deploying change efficiently, migrating data and models

  • Monitoring results and decommissioning legacy solutions

We are practiced in Agile project delivery, and recognize the need to balance value and control with flexibility, without compromising on our Service Standards.