Artificial Intelligence in Healthcare

Two areas of machine learning (ML) have undergone rapid advances over the last few years: Natural Language Processing (NLP) and Image Recognition.

The vast quantity of text available on the internet has driven the first, enabling autocompletion and machine translation. Moore’s law and innovation in graphics processing units (GPUs) have driven the second, transforming everyday technology: automatic number plate recognition at your local car park; expense-scanning apps that make dealing with your receipts a thing of the past (no more complex spreadsheets to fill out manually); and face recognition that unlocks your phone and PC, and even helps get you through the airport faster.

So why are some areas of our lives lagging behind? Why are hospitals still full of paper charts, scribbled whiteboard rotas, and Post-it notes?

Well, I am pleased to say that some really fantastic projects have been kicked off right here in the UK, and Butterfly Data has, so far, completed two for different parts of the NHS. Many of the projects described below are part of the National Institute for Health Research’s long-term plan for a £250m NHS AI Lab, including an Accelerated Access Collaborative (AAC). In Round One, Optos PLC received an AI Award for their Optomap, an ultra-widefield imaging technology for detecting diabetic retinopathy. The awards cover different phases of development, from feasibility through real-world testing and, eventually, real-world usage. Back in September 2020, Optos were in Phase Four – initial health system adoption, which involves use in three or more NHS sites over a 12-36 month period. Working with their parent company, Nikon, as well as Google and Verily Life Sciences, they announced their product launch on 19 April 2022 as the first CE-marked AI-based solution in this field.

To train the AI model, the data scientists used thousands of labelled images hand-graded by close to 100 ophthalmologists, with each image graded several times by different experts to ensure consistency. Once trained, the algorithm powers an application called Automated Retinal Disease Assessment (ARDA), which returns results instantly.
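The idea of having several experts grade each image and then reconciling their answers can be sketched in a few lines. This is only an illustration of the general approach (a simple majority vote), not Optos’s actual labelling pipeline; the grades shown are hypothetical.

```python
from collections import Counter

def consensus_grade(grades):
    """Return the most common grade given to one image by several experts.

    `grades` is a list of labels (e.g. retinopathy severity levels)
    assigned to the same image by different graders.
    """
    counts = Counter(grades)
    label, count = counts.most_common(1)[0]
    return label

# Hypothetical gradings of a single retinal image by three experts
print(consensus_grade(["moderate", "moderate", "severe"]))  # prints "moderate"
```

In practice, images where the graders disagree heavily would usually be flagged for adjudication rather than resolved by a blind vote.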

Another Phase Four project I have been following with interest is CogStack, a Natural Language Processing application being developed at King’s College London. Back in February, Butterfly Data gathered together stakeholders from NHS Improvement, researchers, privacy experts, and NLP developers for a workshop about ‘the art of the possible’ when it comes to innovation in redacting or de-personalising unstructured text data for use by researchers. In the session where we asked participants to list the tools currently in use, several mentioned CogStack. It has been built using open-source components such as Apache NiFi, Python, and Jupyter notebooks - more details are available on GitHub for those interested. This is a fantastic example of ‘coding in the open’ and encouraging a collaborative approach.
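To give a feel for what redacting unstructured text involves, here is a deliberately minimal sketch using pattern matching. Real de-identification tools (CogStack included) rely on trained named-entity-recognition models and far broader rule sets; the patterns, placeholder tags, and clinical note below are all illustrative assumptions.

```python
import re

# Illustrative patterns only - a production system would cover names,
# addresses, postcodes, and many more date and identifier formats.
PATTERNS = {
    "NHS_NUMBER": re.compile(r"\b\d{3}[ -]?\d{3}[ -]?\d{4}\b"),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def redact(text):
    """Replace each match of each pattern with a placeholder tag."""
    for tag, pattern in PATTERNS.items():
        text = pattern.sub(f"[{tag}]", text)
    return text

note = "Patient (NHS no. 943 476 5919) seen on 12/03/2021."
print(redact(note))
# prints: Patient (NHS no. [NHS_NUMBER]) seen on [DATE].
```

The hard part in practice is not the substitution but the recall: a single missed identifier re-identifies the record, which is why rule-based approaches are usually combined with statistical models and human review.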

The tool is in use in five NHS Trusts and has been used to analyse 12 million free-text documents. Unlocking the hidden insights and value in this unstructured data is one of the complex challenges facing the NHS (and all health authorities globally) as they modernise their processes and systems.

There is still a lot of opportunity and challenge in this area, but real progress is being made and will be coming to a hospital near you, soon!
