Recursion Pharmaceuticals teamed up with AI biotech Exscientia last year in a deal worth almost $700 million. The matchup marked the biggest life sciences AI merger to date and came amid a “surge in AI partnerships and acquisitions in the past five years [that] signals the opportunities the technology offers to life sciences companies,” according to an EY report.
Now, successfully integrating these companies into the more traditional drugmaking processes will take creative thinking and a full commitment to the changes ahead, said Recursion CFO and president of Recursion UK, Ben Taylor, who compared the current moment to “when biologics were growing and coming of age” about a decade ago.
“You had every different strategy come out of pharma, from going big to acquire assets, or going smaller to grow it from there, or not acquiring at all to grow entirely in house,” Taylor said. “You’re seeing the same thing happen, because most companies haven’t had a lot of validation coming through on AI platforms yet, and that’ll be a triggering part for a lot of pharma companies to start thinking about more.”
While Exscientia was struggling under the weight of losing a big Bayer partnership and leadership changes in the last few years, Recursion was building up its pipeline of AI-discovered drugs and securing high-profile partnerships. But the companies had a lot in common, and together they now have 10 programs in the pipeline and hundreds of millions of dollars in potential milestones lined up with Big Pharma partners like Sanofi and Roche, Taylor said.
"You’re not going to solve all of the different aspects to create a new molecule with a single algorithm — you need to integrate them."

Ben Taylor
CFO, Recursion Pharmaceuticals
While the new Recursion’s technology is promising and backed by world-leading computer chip maker NVIDIA, the company’s pipeline is early in clinical stages. Its most advanced program is in the rare disease cerebral cavernous malformation and has crossed the phase 2 threshold, while its programs in oncology and other rare diseases are earlier in the process or preclinical.
Although the supersized AI marriage is still a novel concept, Taylor believes life sciences leaders will start to see more of these deals down the road, bringing new technologies into the fold and changing the way drugs are made.
Here, Taylor discusses the AI disruption in biopharma, what companies should be looking for as they explore partnerships and acquisitions, as well as the challenges associated with collecting and using big data to drive further shifts in the industry.
This interview has been edited for brevity and style.
PHARMAVOICE: AI is clearly becoming a disruptor in the life sciences — why is that happening at this point in time?
BEN TAYLOR: The right way to think about AI is as a far better tool than [what] has been available previously, one that gives you functionality that wasn’t there before. I would liken it to when tools like Excel first came along, and it was suddenly so much easier to do something on a spreadsheet rather than using a calculator. We’re seeing a similar jump in computation, going from the traditional methods we’ve been using for the last 20 to 30 years to something where we can perform highly efficient, multi-parameter functions.
In biology and chemistry, there are no simple problems — everything is a complex, multi-parameter problem. If, for instance, NVIDIA weren’t able to provide the world with their processors, you couldn’t even do the analysis, but beyond that, the architecture of the analysis is fundamentally changed to allow us to perform deep learning on big data problems. We’re finding a needle in a haystack, and there’s a huge variety of algorithms to optimize and, for example, figure out where this atom should be placed on this molecule. You’re not going to solve all of the different aspects to create a new molecule with a single algorithm — you need to integrate them.
Speaking of integration, why did the Recursion and Exscientia deal make sense from your perspective?
Both companies were founded around the same time 12 years ago with a very similar mission: drug discovery. At the time, there were new technologies, but they weren’t predictive, and so even with some of the smartest people in the world, we still haven’t been able to solve the very basic, fundamental problem of a failure rate that’s over 95%. Why is that happening? It’s because we didn’t have predictive ways of understanding what’s going to happen later on in a clinical trial. You actually have to predict how a molecule should work, and both companies had that vision, with Recursion focusing on the biology and Exscientia focusing on the chemistry. It’s been fascinating to watch the integration because it’s been much easier than you would imagine. Now the chemists have all these amazing tools to play with in biology, and the biologists have these amazing tools to play with in chemistry, and it came together powerfully.
What should drugmakers be looking for as they scan the horizon for AI partners or acquisitions?
Look for the use case. Look for the validation. We’re in an industry where lots of people have lots of ideas on how to do both, and the ideas all sound like they’re going toward the same goal, or they use the same language behind it. So the only way you can really differentiate is by defining the actual product you produce with a platform or technology. It’s funny, because I’ve been deep in this for years, and I read a press release, and I’m like, ‘Oh, it sounds like they do exactly what we do.’ And then I dig in, and it’s like, ‘Oh, they do nothing like what we do.’ The only way you can really get your arms around that is to see what’s being produced.
On the flipside, what should AI companies be looking at when they team up with the life sciences?
The No. 1 thing is commitment, because it’s always easy to find a partner or even some money around new and exciting technologies. In fact, I think a lot of partners have a part of their budget that’s set aside to say, ‘Hey, go and spend this on new, crazy ideas.’ The partners that make a difference, like our Roche and Sanofi partnerships with multiple programs each, come in as committed as we are, and that comes from the top of the organization all the way through. When relationships haven’t progressed as quickly, it’s almost always because you just don’t have that commitment and that sense of doing whatever it takes to get this done. You’ve got to invest the time. If you’re not bought in on the process changing, it’s like splicing an analog cable into the middle of a fiber optic line: either way, you’re limited by whichever is the lowest quality point. So if a large pharma says they’re going to run everything traditionally, then insert AI for three months and go back to the traditional method, they’re going to lose most of the benefits of the more novel platform. In our relationships with Sanofi and Roche, and all of our relationships, we’re able to break down a lot of silos into an AI-first environment.
If you think about the pharma decision making process, it takes a lot to move through it. There are a lot of different signoffs, committees, groups — all sorts of things. I can always tell the companies that are committed to AI because you see that smoothness of decision making, and that engagement comes from all levels inside the organization. If you don’t have that, getting the milestone across can be really tough, because it always requires another level of approval.
If we look at data as currency in the life sciences, is there a limit to the amount available for AI systems to make use of?
I don’t think there’s a limit. But most of the data isn’t worth anything. A lot of the data that has been collected historically isn’t in the right format, or is very specific and not that useful. We use public datasets, or when we’re in a partnership we use their data and dig into everything we can, but honestly it only gets us to a very rough starting position. More useful is creating a fit-for-purpose dataset, one built directly around the questions you’re asking. Since we work in proprietary drugs, if you actually want to come up with novel concepts, you have to start with something that’s going to answer very specific questions.
So you're going to have to create your own data on that as well, right? That's the No. 1 concern. We once had a discussion with a pharma partner who was asking us to come in, take a look at their old data and help them build models with it. The conclusion we came to was that it was going to be far more efficient, and probably more predictive, to actually recreate all of the data they had been generating in a much smaller context. You could get to it without trying to recreate 30 years of data, right? Because all of their data was in different formats and based on very specific projects, you would have needed an army of people working night and day to wrangle it into shape to where you could actually use it. So that's something I think a lot of people miss.