Computational Biology Salon: Calculating the incalculable

By Julia Hawkins

06 Jun 2018


Last week LocalGlobe hosted a computational biology salon bringing together life sciences investors, tech investors and Andrew Steele, a computational biologist from the Crick Institute.

What is computational biology?

It applies methods from a wide range of mathematical and computational fields to the study of biology. These include mathematical modelling, machine learning and computational simulation; medical applications include drug discovery and development, personalised medicine, diagnostics and prevention.

Why did we host it?

We have a unique health tech ecosystem developing between King’s Cross and Cambridge that is accelerating the UK’s expertise in this area:

— The Francis Crick Institute in King’s Cross — Biomedical discovery institute dedicated to understanding the fundamental biology underlying health and disease. 1500 scientists based on the same road as our new office!

— The Wellcome Sanger Institute in Cambridge — Conceived as a large-scale DNA sequencing centre to participate in the Human Genome Project, it now employs 900 people with a mission to understand “the role of genetics in health and disease”.

— Cambridge and Babraham — The Babraham Research Campus is one of the UK’s leading campuses supporting early-stage bioscience enterprise.

— London, and King’s Cross in particular, was recently dubbed “the global powerhouse of AI”; it is home to world-leading AI companies such as DeepMind, ASI and Benevolent AI, to name a few.

— Tech giants — both in London and in Cambridge — are growing increasingly interested in health and data. Google’s DeepMind set up DeepMind Health to bring its technology to patients, nurses and doctors, and Apple visited the FDA more times than any other company in 2017.

Why is computational biology of interest?

There have been huge advances in this field over the last 15 years.

The Human Genome Project released its first draft sequence of a human genome in 2000; in total the project took 13 years and cost $2.7bn.

By 2013 Illumina, a leading US gene technology company, had reduced the cost of sequencing a human genome from $1m in 2007 to $4,000.

Illumina sequencers at Wellcome Sanger Campus

The price to sequence a genome continues to fall. The UK’s Oxford Nanopore has opened up DNA analysis to researchers who don’t have direct access to sequencing technologies, freeing them up to perform analyses in their own labs or in the field in real time. Their MinION, a handheld device, costs $1,000 and includes materials to run initial sequencing experiments.

There is also great interest in this field from governments who want to use technology to accelerate medicine and disease research. Just last week Jeremy Hunt, the secretary of state for health and social care in the UK, announced ambitious plans for the deployment of artificial intelligence software across the NHS, with a particular focus on cancer detection. James Wise, partner at Balderton, wrote an excellent piece in The Sunday Telegraph last weekend urging the NHS to develop its own AI software.

At LocalGlobe, we’re very excited about this area and have made two investments in this space so far, both based in Israel. Zebra is transforming patient care with AI: it has developed an automated radiology assistant that reads and diagnoses medical imaging studies, helping radiologists increase productivity and achieve higher levels of accuracy. MyHeritage is a world-leading online family history and DNA testing platform.

A computational biologist’s view

Andrew Steele, a computational biologist at the Crick with a background in physics who is also a science presenter and writer, joined us to give an overview of the Crick and some of his most recent work.

— Steele works on understanding genomic data and medical records using machine learning techniques such as neural networks as part of the Luscombe Lab

— His PhD was in physics, examining magnetism and superconductivity. However, he switched fields after realising that ageing, and all the diseases it causes, is the single largest cause of contemporary human suffering

— He feels that by understanding the underlying ageing process, we could treat or even prevent many of these diseases simultaneously

— His work at the Crick involves using machine learning models to examine electronic health records. He presented one of his studies, which showed that machine learning can outperform conventional survival models for predicting patient mortality (a minimal sketch of this kind of comparison follows this list)

— His study looked at 100k coronary heart disease patients over their lifetimes, including 3m hospital diagnoses, 60m prescriptions and 29m GP records

— You can find a preprint about this work on bioRxiv
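
To make that comparison concrete, here is a minimal sketch on synthetic data of the kind of head-to-head the study describes: a conventional Cox proportional hazards model versus a machine learning classifier, both scored by concordance index. The features, numbers and libraries (lifelines, scikit-learn) are illustrative choices of ours, not the study’s actual pipeline.

```python
# Minimal sketch: Cox proportional hazards vs. a machine learning classifier
# on synthetic EHR-style data (NOT the Crick study's actual pipeline).
# Requires: pip install lifelines scikit-learn pandas numpy
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5000

# Hypothetical patient features standing in for diagnosis/prescription counts
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "n_diagnoses": rng.poisson(30, n),
    "n_prescriptions": rng.poisson(600, n),
})

# Simulate survival times that depend on the features, with random censoring
risk = 0.03 * (df["age"] - 65) + 0.01 * (df["n_diagnoses"] - 30)
time_to_event = rng.exponential(10 * np.exp(-risk))
censor_time = rng.uniform(0, 15, n)
df["T"] = np.minimum(time_to_event, censor_time)      # observed follow-up time
df["E"] = (time_to_event <= censor_time).astype(int)  # 1 = death observed

train, test = df.iloc[: n // 2], df.iloc[n // 2 :]
features = ["age", "n_diagnoses", "n_prescriptions"]

# Baseline: Cox proportional hazards (higher partial hazard = shorter survival)
cph = CoxPHFitter().fit(train, duration_col="T", event_col="E")
cox_c = concordance_index(test["T"], -cph.predict_partial_hazard(test), test["E"])

# ML alternative: random forest predicting death within a 5-year horizon
train5 = train[(train["E"] == 1) | (train["T"] >= 5)]   # drop early-censored rows
y = (train5["E"] == 1) & (train5["T"] < 5)              # died within 5 years
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(train5[features], y)
rf_c = concordance_index(test["T"], -rf.predict_proba(test[features])[:, 1], test["E"])

print(f"Cox concordance: {cox_c:.3f}  |  RF concordance: {rf_c:.3f}")
```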

Key takeaways from the discussion:

We are at a unique moment in time due to current access to vast amounts of longitudinal health data, and there is tremendous value in this data for understanding clinical outcomes.

We have more data than we know what to do with, and pharmaceutical companies need help asking the right questions. And with the rise of genetic testing, patients will expect medical professionals to be more knowledgeable about genetic data, and we will need more genetic counsellors to provide guidance.

We continue to see opportunities for machine learning to enable and assist healthcare professionals in decision-making, starting with images and image recognition software. For instance, in dermatology, algorithms now match the performance of dermatologists in their ability to diagnose skin lesions.
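
As a flavour of how such image recognition systems are typically built, here is a minimal transfer-learning sketch: a pretrained ResNet-18 with its final layer replaced and retrained to separate two lesion classes. The “lesions” folder layout and the two-class setup are hypothetical; this shows the generic pattern, not any specific published system.

```python
# Minimal transfer-learning sketch for skin-lesion classification
# (generic pattern; folder layout and classes are hypothetical).
# Requires: pip install torch torchvision
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Standard ImageNet preprocessing for a pretrained backbone
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: lesions/benign/*.jpg, lesions/malignant/*.jpg
dataset = datasets.ImageFolder("lesions", transform=preprocess)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

# Pretrained ResNet-18: freeze the backbone, replace the head for 2 classes
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # new head is trainable

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:   # one pass over the data, for brevity
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```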

Indeed, national health systems currently have a competitive advantage here due to the wealth of longitudinal health data that is available.

But healthcare companies could do more to explain the role of patient data in research, in order to encourage people to consent to sharing their data.

There is potentially a positive role for tech giants to play in this field. Rare disease communities, for example, already use Facebook to build strong support groups. Could tech giants build, support and fund patient registry data, and possibly even therapies?

There are some positive trends for the 350m people around the world who are affected by rare diseases, starting with good news for monogenic diseases (NB: there are more than 7k recognised rare diseases, only 5% of which currently have a therapy).

Gene editing tools are becoming increasingly boilerplate, allowing for faster and cheaper prototyping and iteration; the timeline to a clinical trial is coming down from an average of 10–15 years to 3–4 years in the best cases, where the single causative gene is known.

Even though patient populations are small, NICE (the National Institute for Health and Care Excellence) has approved gene therapies costing up to £150k per patient, based on QALYs (Quality-Adjusted Life Years, a measure of disease burden that captures both the quality and the quantity of life lived), where a single treatment “cures” the disease.
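
As a back-of-envelope illustration of that arithmetic (the numbers below are ours, not NICE’s appraisal figures): a one-off £150k therapy that adds ten years of life at a quality weight of 0.8 yields 8 QALYs, or about £18,750 per QALY.

```python
# Back-of-envelope QALY arithmetic; illustrative numbers, not NICE's model.
def cost_per_qaly(cost_gbp: float, years_gained: float, quality_weight: float) -> float:
    """Incremental cost divided by quality-adjusted life years gained."""
    return cost_gbp / (years_gained * quality_weight)

# A hypothetical £150k one-off gene therapy adding 10 years at 0.8 quality
print(f"£{cost_per_qaly(150_000, 10, 0.8):,.0f} per QALY")  # £18,750 per QALY
```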

A key highlight from the discussion was that biotech and life science VCs used to operate very separately from tech VCs (entirely different ecosystems, different people making different types of investments). But with the advent of boilerplate (bio) technologies and the increasing use of healthcare data, there is huge scope for these two worlds to start merging and collaborating. At LocalGlobe, we think very interesting opportunities will lie at the intersection.

Thank you to everyone who came including Azeem Azhar, Andrew Elder, Andy Richards, Arthur Stril, Rebecca Todd, team LG, and to Platoon for hosting.

We look forward to continuing the discussion!