Put data behind every decision

We empower organisations with data management and analysis services, turning complex data ecosystems into strategic capabilities that deliver competitive results. By combining the deep expertise of our UK-based specialists with global-scale technology and innovation, we help you make every decision with confidence, knowing your data is accurate, reliable and ready to drive impact.

Our trusted clients and partners

Who we are

We are a B-Corp–certified, end-to-end data consultancy with over 20 years of experience, helping organisations turn complex data challenges into meaningful solutions.

By combining deep expertise with a technology-agnostic approach, we design solutions that use the right tools for each situation - supported by globally trusted partners such as SAS, Snowflake, Informatica and Databricks.

Our experience in highly secure environments ensures that sensitive data is handled safely and in line with rigorous compliance standards.

Recognised across multiple Government Commercial Agency and other public sector frameworks, we support organisations in delivering value, enabling citizen-focused projects and generating insights that drive smarter decisions.

From improving data quality and cloud adoption to advanced analytics and AI/ML, we guide both private and public sector organisations through every stage of the data journey, whilst always remaining focused on ethical, practical and impactful outcomes.

Our services

Turn untapped potential into continuous improvement

Data quality, governance, and privacy

Ensure your data is accurate, well-governed, and safeguarded for evolving privacy standards, whilst establishing a trusted foundation for AI.

Data engineering, integration, and cloud adoption

Design and implement scalable data platforms that enable seamless integration, automation and cloud-based operations to support modern analytics and AI solutions.

Data analytics and visualisation

Transform complex datasets into clear, interactive visual insights that support smarter, faster decision-making.

Data science and AI solutions

Apply advanced AI and machine learning to unlock predictive insights, automate workflows, and drive measurable business value.

Our experience

Why Butterfly Data?

Proven expertise

With over 20 years’ experience, our dedicated team of data scientists, engineers and technologists, well versed in secure and compliant data practices, brings unrivalled expertise and adds real value without the overhead costs associated with larger firms.

Innovative technology

We use cutting-edge technologies from leading vendors like SAS, Databricks, and Snowflake to boost performance and accelerate business transformation.

A personalised approach

Every organisation is unique, and so is its data. We build close relationships with your team, tailoring our services to align with your business objectives and solve your challenges.

Data for good

As a proud B-Corp, we use the power of data for good – partnering and collaborating with organisations that align with our core values to create a positive impact.

Measurable results

Chosen by industry leaders for our agility and commitment to excellence, we let the data speak for itself.

Simple procurement

Easily procure our services, either directly or via key public sector frameworks, including G-Cloud, DOS, Spark, ACE, and NVfI.

“The invaluable work that Butterfly Data have undertaken with a key collaborator of mine will feed directly into my work, making it both simpler and faster and enabling me to better identify data gaps. Incredibly useful. Thank you."

Butterfly Data guide

Everything you need to know about Butterfly Data

Download our guide here.

Resources

Insights to power better decisions

Butterfly Data grows the way good things often do—organically, through finding the right people at the right time. This year so far, we have welcomed six new team members whose backgrounds span solar physics, cruise ship IT operations, digital forensics, army cadets and at least one Venice-to-Athens cycling adventure. What brings them together is a genuine enthusiasm for data and a belief that the work we do at Butterfly Data matters.

We are pleased to introduce them properly.

Harvey Thomas

Harvey joins us from a background in full-stack application development, having built software across web, desktop and mobile platforms using Microsoft .NET and React. He spent nearly three years at Swansea Council – progressing from GIS application developer apprentice to junior applications developer – before moving into digital forensic analysis at EX1 Digital Forensic Services. His Level 4 IT apprenticeship included a distinction in SQL, and he was head boy of his comprehensive school.

I get satisfaction from taking hard-to-read data and turning it into something useful.

Harvey is most excited about data engineering and is motivated by the simple but powerful idea of turning hard-to-read data into something genuinely useful for people, not just organisations. When he isn't at work, you can find him watching anime, playing badminton or gaming with his cat Nox close by.

Wildcard fact: Harvey was a Lance Corporal in the Army Cadet Force.

Remy Furtado

Remy brings over 15 years of experience across the insurance, banking and financial services sectors, with a career that has taken him from Henderson Global Investors and AXA PPP Healthcare to Dufrain Consulting, Volante Global and Lockton. He holds a dual master's — including an Erasmus Mundus MSc across three European universities — and certifications from Microsoft, SAS, Databricks and Collibra, among others.

Working for a data consultancy allows me to apply my extensive industry experience to solve complex data challenges while learning continuously and driving business transformation through client partnerships.

His specialisms lie in data governance, data quality and regulatory compliance, and he has a particular talent for translating technical complexity into clear business insight. He also volunteers for his local parish, where he manages everything from Gift Aid administration to the parish website.

Before all of that, Remy spent several years living aboard cruise ships, managing IT operations for vessels sailing through the Caribbean, Alaska, the Mediterranean and across the Panama Canal – an experience that taught him to troubleshoot critical systems under pressure and adapt to constantly changing environments.

Wildcard fact: Remy has overseen onboard technology systems, from point-of-sale to navigational infrastructure, while sailing some of the world's most iconic waterways.

Chris Willcock

Chris joins the Butterfly Data team after six years at EDF Energy, where he progressed from data and technology graduate to senior data engineer. In that time he developed a strong specialism in end-to-end data platform design — building scalable, enduring solutions that balance engineering rigour with real business value. He holds an AWS Associate Developer certification and is expert-level in both Python and MySQL.

His goal at Butterfly Data is to continue developing reusable, best-practice engineering patterns that can serve clients across a range of industries, bringing reliability and discipline to every engagement. Outside of work, Chris solves Rubik's Cubes (currently sub-20 seconds, with a personal best of 14.3 seconds), walks the Malvern Hills, cooks, and games.

I wanted to work in data consultancy to continue to deliver reliable data solutions across a number of different industries and clients, gaining exposure to new and interesting data and business problems.

Wildcard fact: Chris cycled from Venice to Athens with no training whatsoever. He describes this as foolish. We think it is extraordinary.

Daniel Johnson

Daniel comes to Butterfly Data from academia, where his career has been anything but conventional. His PhD at the Jeremiah Horrocks Institute focused on modelling and simulating rotating sunspots using high-performance computing, producing a space weather forecasting model that he presented to audiences ranging from the general public to policymakers at the Houses of Parliament. He went on to work as a postdoctoral research fellow at the University of St Andrews, engineering solutions to outstanding research problems in computational physics and collaborating with scientists around the world.

He brings to Butterfly Data a generalist skill set honed across the full technical project lifecycle: data acquisition, engineering, analysis, science and stakeholder delivery. He is a self-described continuous learner who chose data consultancy specifically for the variety of problems it presents.

I find that working on varied projects is the best route to personal development for me; data consultancy offers this variety.

On most Saturdays, Daniel is still actively working on his solar physics research and writing academic papers — because some habits are worth keeping.

Wildcard fact: Daniel is Butterfly Data's most northerly employee.

Louise Palin

Louise arrives with 15 years of experience in team leadership, process optimisation and stakeholder management, with much of it gained at Coleg Llandrillo Rhyl, where she managed operational databases, led financial and procurement processes across curriculum teams, facilitated senior leadership meetings and coordinated learner panels and industry focus groups. She is also a qualified orienteering coach and Level 2 fitness instructor.

She joins Butterfly Data at an exciting point in her own professional journey – developing her technical and agile project management skills as a Scrum master. She is most interested in data quality and approaches every working day by tackling the hardest task first.

I wanted to join Butterfly Data to combine technical problem-solving with strategic business impact — facilitating the transformation of data into actionable insights for clients whilst accelerating my own learning.

When she isn't working or training for endurance events, Louise is spending time with family and friends.

Wildcard fact: Louise can fit in a holdall bag.

Ayub Hassan

Ayub joins our DevOps team, bringing deep expertise in cloud-native infrastructure, Kubernetes, automation and CI/CD. His professional background includes building and maintaining Kubernetes platforms on AWS EKS and Azure AKS, implementing Infrastructure as Code with Terraform and CloudFormation, developing CI/CD pipelines across Jenkins, GitHub Actions, and GitLab CI and leading cloud migrations from on-premise into AWS and Azure. He holds certifications as an AWS Solutions Architect Associate, Kubernetes and Cloud Native Associate and HashiCorp Terraform Associate and completed a software development bootcamp with Le Wagon to ground his DevOps instincts in strong engineering fundamentals.

Ayub's ambition is to design and lead large-scale, resilient cloud-native platforms that improve developer experience and operational efficiency at scale, while continuing to bridge the gap between software engineering, DevOps and modern data platforms. He is most interested in data integration and rates his excitement about the work Butterfly Data does at five out of five.

What attracted me to data consultancy was the opportunity to work on technically challenging, large-scale environments where cloud engineering, infrastructure, automation and data platforms directly contribute to solving real operational and business problems.

Outside of work, Ayub plays football, trains in karate, goes to the gym and tutors maths and physics.

Wildcard fact: Ayub holds a black belt in karate.

A team worth growing

Each of these individuals chose Butterfly Data as the place to do their best work. As we continue to grow organically, we do so by attracting people who care about data, client impact and each other. We are proud of that, and proud of the team building the company with us.

You can discover more about our team here. If you are interested in joining or working with us, visit our Join Us page or get in touch.

Your legacy system isn’t just slowing you down; it’s holding you back from what’s next. For many organisations, legacy technology migration is no longer optional; it is becoming critical to staying operational.

These systems have delivered resilience over many years, but continued modernisation is becoming increasingly important to managing cost and risk in the public sector.

Central governments, local authorities, healthcare and defence—wherever you look, people still rely on vital services running on platforms that never anticipated today’s data needs. Think about SAP stacks from decades ago. SAS 9 systems driving fraud checks and benefits analysis. Bespoke on-premise builds that have evolved over many years, often supported by long-standing, experienced employees.

It is understandable why many organisations choose to keep these systems for now: the budget argument for replacement is a tough one to make. So another year goes by, often without a clear data migration strategy in place.

But delaying migration keeps getting more expensive. Technical debt rarely causes immediate disruption until suddenly it does. McKinsey estimates that technical debt can account for 20-40% of the value of an organisation’s entire technology estate before remediation efforts even begin. Addressing the issues proactively can help organisations reduce the risk of disruption and maintain resilience during critical periods. 

So what does “legacy risk” actually look like in everyday life?

Quite often, the challenges associated with legacy infrastructure emerge gradually rather than through a single incident. Analysts may find themselves spending longer than expected reconciling exports because older systems don’t connect smoothly with newer reporting tools. Security teams may need additional steps to complete audits where platforms weren’t originally designed with today’s compliance expectations in mind. And in cloud programmes, progress can slow when one part of the estate requires extra care to integrate safely with more modern environments.

You also see situations where important operational knowledge sits with a small number of people who have worked with these systems for a long time. That creates a natural focus on continuity and knowledge sharing, ensuring that understanding is retained as teams evolve.

These are not unusual or unexpected situations. They are a reflection of how complex, long-lived systems and services evolve over time, particularly in large organisations. And they sit alongside the day-to-day reality of teams working with highly sensitive and important data, doing their best to maintain reliable services while gradually modernising the underlying technology.

A practical way to understand what “legacy risk” looks like at scale is to look at how common it is across the public sector. Recent UK government analysis shows that essential services still run on technology that is decades old, with around 28% of central government systems classified as legacy in 2024, up from 26% the previous year. The picture is uneven across organisations, with legacy systems making up anywhere from 10% to as much as 60–70% in some NHS trusts and police forces. In many cases, organisations also don’t yet have a complete inventory of these systems, which means the full extent of dependency isn’t always clearly visible.

What this means in practice becomes clearer when you look at how some services still operate day to day. In the Home Office, for example, reporting has highlighted that parts of asylum casework continue to rely on systems that have been in place for decades, including a 25-year-old case database, despite most operations having been migrated. In practical terms, that kind of environment means information doesn’t always sit in one place. Instead, it can be distributed across multiple systems that evolved at different times, requiring data to be joined up after the fact rather than flowing through a single, unified record.

Taken together, this doesn’t point to isolated system failure but rather a structural reality that builds up over time. Public services tend to evolve in layers, and the data infrastructure evolves with them. So day-to-day work often involves moving between systems, reconciling records and bridging gaps between older platforms and newer tools. It is less visible than a single outage or incident, but it shapes how work actually gets done behind the scenes, while wider modernisation continues in parallel.

The way we talk about legacy system migration has changed

For years, “legacy migration” meant costly, time-consuming programmes that too often fell short of their goals, which has led some organisations to take a more cautious approach.

But migration isn’t what it used to be. Modern cloud data migration for the government now uses proven frameworks, automation and phased delivery to reduce risk, including iterative and phased migration approaches recommended in UK government guidance. All these factors mean migration can get done—safely, reliably and with less disruption—if you approach it with discipline.

For many organisations, the real question is no longer whether to migrate, but when and how to do it safely.

How we handle legacy migration

Butterfly Data doesn’t believe in forcing everyone into a one-size-fits-all template. Every old data environment is its own puzzle. What we do offer is a clear, repeatable process to cut through uncertainty at each stage, plus deep platform expertise to handle truly complex jobs.

Here is how we break down our approach:

1. Discovery and assessment

We start by mapping your landscape in detail: apps, data stores, dependencies, performance roadblocks, quality issues and technical limits. This gives us the solid foundation we need for every decision that follows.

2. Legacy migration strategy and planning

With discovery done, we define a clear data migration strategy: which platform to pick, which order to do things in, how to phase delivery, and how to measure real success, all aimed at minimal business disruption.

3. Data exploration and source-to-target design

We dig into your data, find legacy structures and relationship quirks, and spotlight quality gaps. Then we chart source-to-target mappings with rules and transformations – the blueprint before we write a line of migration code.
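
As an illustration, a source-to-target mapping can be captured directly alongside its transformation rules. The sketch below is purely hypothetical: every table and field name is invented, and a real engagement would hold these rules in a governed mapping document rather than a script.

```python
# Illustrative source-to-target mapping for a legacy migration.
# All table and field names here are hypothetical.
MAPPINGS = [
    {
        "source": "LEGACY_CUST.CUST_NM",
        "target": "customers.full_name",
        "transform": lambda v: v.strip().title(),  # normalise casing
    },
    {
        "source": "LEGACY_CUST.DOB",
        "target": "customers.date_of_birth",
        # legacy system stores dates as DDMMYYYY strings; emit ISO 8601
        "transform": lambda v: f"{v[4:8]}-{v[2:4]}-{v[0:2]}",
    },
]

def apply_mappings(row: dict) -> dict:
    """Apply each mapping rule to one legacy record."""
    migrated = {}
    for m in MAPPINGS:
        source_field = m["source"].split(".")[1]
        target_field = m["target"].split(".")[1]
        migrated[target_field] = m["transform"](row[source_field])
    return migrated
```

Keeping mappings as data rather than hand-written conversion code makes the blueprint reviewable before any migration code runs.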

4. Data migration development and execution

We create automated workflows for extraction, transformation, filtering and loading, designed to be repeatable, trackable and auditable. Data stays secured through it all.
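
To make “repeatable, trackable and auditable” concrete, here is a minimal sketch of a batch step that emits an audit record on every run. It is illustrative only: the record structure and the filter rule are invented for the example.

```python
import json
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("migration")

def run_batch(records: list[dict], keep) -> list[dict]:
    """Extract -> filter -> load one batch, logging an audit record."""
    extracted = len(records)
    loaded = [r for r in records if keep(r)]
    audit = {
        "extracted": extracted,
        "loaded": len(loaded),
        "filtered_out": extracted - len(loaded),
    }
    log.info(json.dumps(audit))  # the audit trail: counts at every stage
    return loaded

# Toy batch: migrate only active records.
batch = [{"id": 1, "active": True}, {"id": 2, "active": False}]
result = run_batch(batch, keep=lambda r: r["active"])
```

Because every run logs what went in and what came out, any two runs over the same batch can be compared, which is what makes the workflow repeatable and auditable rather than a one-off script.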

5. Validation, testing and pilot migration

We run thorough parallel tests to check the data's accuracy, quality and value in the new setting. Pilots reveal real-world impact so we can solve problems before the big cutover.
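
One simple check commonly used in parallel testing is to reconcile row counts and an order-independent checksum between the legacy extract and its migrated copy. A minimal sketch, with toy data standing in for real tables:

```python
import hashlib

def table_fingerprint(rows: list[dict]) -> tuple[int, str]:
    """Return (row count, order-independent checksum) for a dataset."""
    digests = sorted(
        hashlib.sha256(repr(sorted(r.items())).encode()).hexdigest()
        for r in rows
    )
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return len(rows), combined

# The migrated copy holds the same rows, loaded in a different order.
legacy = [{"id": 1, "name": "Jane"}, {"id": 2, "name": "Amir"}]
migrated = [{"id": 2, "name": "Amir"}, {"id": 1, "name": "Jane"}]

counts_match = table_fingerprint(legacy) == table_fingerprint(migrated)
```

Checksum reconciliation flags silent data loss or corruption cheaply; row-level comparison is still needed to diagnose any mismatch it finds.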

6. Production migration and transition

We deploy with control, keeping downtime low. And, when it is time, we help retire old systems—scrubbing out technical debt instead of just shifting it around.

7. Post-migration support and optimisation

Right after migration, we stick around to sort early issues, tune performance and make sure everything’s humming in the new environment. We don’t leave until it works, not just when it’s “done.”

Why this matters now

You only have so much time for a controlled, careful migration. As legacy systems age, the number of people with in-depth knowledge of them shrinks. Security holes grow. And the gap between your systems and what the cloud can do just keeps widening.

HMRC’s upcoming programme to exit legacy data centres and migrate hundreds of critical services to cloud infrastructure reflects a broader government push to reduce technical debt and improve operational resilience.

Organisations that invest in legacy migration and the right foundations gain more than resilience—they unlock scalable data platforms that support everything that comes next, such as advanced analytics, AI and ML, real-time data handling and the kind of decision support today’s services demand.

We are an experienced employee-owned, B-Corp-certified consultancy with a vetted team who are committed to delivering the most effective solutions for our clients. We don’t push unnecessary migrations. We care about making your organisation stronger: technically, operationally and strategically, with the right technology for you.

If you are carrying legacy risk you know you need to fix—or you don’t know where to start—we would love to talk.

Ready to transform your data?

Book a free discovery call to explore how our tailored data solutions can help you manage complex datasets, gain actionable insights and drive measurable results.