What Faculty is planning next

The AI firm’s links to Dominic Cummings made it headline news. Now it’s putting AI to work on businesses and the NHS

As the deadly first wave of the coronavirus pandemic hit the UK in March 2020, Marc Warner stopped going to his usual work meetings. The NHS had put out a plea for help from technology firms, and Warner, the CEO of London-based artificial intelligence firm Faculty, rang his investors and told them he would be focussing on the pandemic response. A series of government and NHS contracts ensued – five of them directly linked to pandemic management. Warner says he threw himself into the work and didn’t attend company meetings for months. “One of the jokes we make is that 2020 was so bad I had to code again,” he says of the pandemic’s earliest days. “No one in Faculty wants that – I was never a great coder.”

Faculty’s NHS work focused on two major areas: helping to bring together unruly sprawls of health information, and building AI for the NHS to forecast where supplies are needed. The first of these projects, called the NHS Covid-19 Data Store, is a collection of health datasets, including everything from populations and NHS staff levels to Covid-19 death data and calls to emergency services. The system, which Faculty created alongside Google, Amazon, Microsoft and Palantir, organises and makes data available to decision-makers. “Keeping data in a good state is an incredibly tricky problem in a normal situation,” Warner says. “It was very clear, fairly early on, that it was going to be incredibly powerful to have a really good view of the system, [of] what's going on now.”

The second stage was predicting what would happen next. This took the form of an AI model – based on previous bed use and NHS 111 call volumes, among other data – that lets hospitals predict potential outbreaks up to three weeks ahead. After being used nationally by NHS England, the model has been given to local NHS Trusts, and it forecasts where ventilators, beds and other resources will be needed most. As Faculty’s pandemic work progressed, Warner, who has a PhD in quantum computing and a degree in physics, moved to “red teaming” its output: digging into the AI models, trying to break things.

The decision to focus on the pandemic had repercussions, he says: some of Faculty’s work became “unaligned” without the CEO overseeing it. “The other set of consequences that truthfully, we never anticipated going into it, [were] the media repercussions of me being more directly involved,” Warner says. The company’s early work in politics and its government connections have made it, to some, one of the most controversial AI companies around.

In 2020, The Guardian reported that Faculty had been given 14 government contracts in two years – including for its pandemic work – and that Warner’s brother and former employee Ben, now a Downing Street advisor, was sitting in meetings of the Scientific Advisory Group for Emergencies (Sage). Faculty’s original NHS Covid-19 Data Store contract said it could retain intellectual property rights to train its AI models with NHS data – something that would have allowed the company to profit from access to usually inaccessible information. However, Faculty says the contract was later rewritten at its request to make clear it wasn’t using the data.

This is on top of questions around the work Faculty did for the Vote Leave campaign during the 2016 EU referendum, and also about £260,000 of work for controversial former government advisor Dominic Cummings. (Details of the work have not been disclosed.) Faculty denies receiving any preferential treatment during the pandemic response and says it stopped working in politics in 2019. “If you want to help solve important problems, you will be exposed to scrutiny and that is a good thing in general,” Warner says. He adds that 80 per cent of the company’s contracts are with private companies, rather than the government.

Two former members of staff say that, internally, Faculty is highly focused on the ethics of the work it does. They say Warner’s academic background shapes the company’s culture and that individual data scientists can refuse to be involved in projects they find ethically objectionable. This is in addition to an ethics board that assesses the projects the company takes on.

Warner founded Faculty in 2014 – originally under the name ASI Data Science – as a fellowship and mentoring company. From the outset, he decided it would train PhD graduates in data science and AI, with a final practical placement taking place at an established company. Since then, more than 300 PhDs and 200 companies have taken part. While the fellowship sits at the heart of the company, it doesn’t make the company a profit, the CEO says. Instead, it allows Faculty to hire PhDs – more than 50 of its 120-plus staff hold one – and make connections with companies.

Faculty’s focus, Warner says, is deploying custom AI into businesses and helping them make the technology useful. To do this it has built its own data science software. The tool allows data scientists to import large datasets, run AI against them and then visualise the results. Warner says it’s like Excel, but for data scientists.

The real trick, though, is getting more companies to use AI – to this end, Faculty deploys its AI within companies focusing on consumer tech, health, engineering and government. Helping a variety of companies use AI yields lessons that can be applied across all businesses. Warner stresses that Faculty’s clients own any AI model that has been trained on their data. However, as most AI currently in use matches patterns in data, it’s possible for the algorithms and code libraries to be reused with different clients. Warner says: “A customer has data, we help them find patterns in it, and we help them use that understanding to sort of push the world in a direction that they want it to.”

The hype around AI, which ballooned following DeepMind’s AI victory against Go champion Lee Sedol in 2016, has largely subsided, and businesses are faced with the harsh reality that getting returns on investment in AI isn’t guaranteed. Research from the Boston Consulting Group and MIT Sloan Management Review in 2020 found only 11 per cent of firms using AI are getting a “sizeable” return on their investments.

“The typical problem is: it's too broad as a proposition,” says Richard Sargeant, Faculty’s chief operating officer. One unnamed shipping company asked Faculty to predict the price of containers for all of its routes for the next year – a task Sargeant says would have been like predicting exactly what the stock market is going to do. Instead of a broad-brush approach, Sargeant and Warner say it is best to deploy AI in narrow situations, where it can excel and perform better than humans. To do this they need to understand how a business operates and how the data it holds can be used. They cite examples of helping a retailer reduce the number of printed catalogues it sends to customers, and of the UK’s Network Rail using cameras on the front of trains and AI to identify trees and plants encroaching onto railways.

For Warner and Sargeant, the secret to companies effectively using AI is building out a whole approach to it. It’s not just having the right data and knowing what can be done with it. The business value of AI comes from the entire system and not how advanced the AI model is, Sargeant says. “There will definitely be some small firms that just want to buy a black box off the shelf,” he says. “But my hunch is that for most enterprises… they're going to need to think in terms of the combination of technology and expertise.”

Updated 06.05.21, 09:10 GMT: The headline and standfirst of this article have been updated

This article was originally published by WIRED UK