
Why Alibaba is betting big on AI chips and quantum computing

Meet the man behind Alibaba’s gamble on emerging tech.
September 25, 2018

During the opening ceremony of Alibaba’s 2018 computing conference last week, Simon Hu, president of Alibaba Cloud, invited the MC to taste some tea on the stage—but, first, to distinguish between tea roasted by hand and by machine.

While the MC stared helplessly at two saucers filled with nearly identical-looking tea leaves, Hu pulled out his smartphone. He took a photo of each saucer and fed the images into an app developed by Tmall, one of Alibaba’s e-commerce platforms. Using an algorithm specially trained to distinguish among different kinds of tea leaves, the app solved the problem.

It was a small example of the interplay between Alibaba’s research on fundamental technologies and the demands of its business. Later on in the four-day conference, the company announced more plans to develop technologies that could be turned into useful things in everyday life, including an AI-chip subsidiary called Pingtou Ge and a team that’s developing a quantum processor.

The person leading all these research efforts is Jeff Zhang, Alibaba’s chief technology officer and head of its DAMO Academy research lab. Zhang sat down with MIT Technology Review at the event to discuss his company’s plans.

TR: Why did Alibaba create a chip company?

Zhang: This gives us an advantage. Chips are becoming more specialized. Many companies are developing AI chips, but each company has different data loads and business needs. This means each company needs customized chips optimized for its own software. This is something traditional chip companies can’t do.

But of course, we will also utilize some already widely used chip designs, and combine these with our own designs to create more value.

What business model will the chip company pursue?

There are three routes. The first is IP licensing. We can license our chip designs for other companies to use. The second is providing processor solutions for various industries through IP aggregation. For example, we can provide customized processors for home appliances and cars. The third is making chips for our own cloud and data centers.

What about Alibaba’s road map for quantum computing?

This is going to play out over a longer term, because we don’t have a processor for it yet. Once we have a processor, we’ll need to answer the question of what to use it for. Quantum computers might be good for applications in cryptography or large simulations of materials, but they will not be suitable for all kinds of tasks. And the more qubits you have, the more challenges there are, because you need to make sure all these qubits can work together seamlessly and reliably.

The number of qubits is not our only goal. We want to solve the engineering problems of quantum computing. How do you run existing programs on quantum processors? You’re not going to run every program on quantum processors, right? So you’ll need to determine which computation tasks are for quantum computers and which are for classical ones. We are building a superconducting quantum computer in our Hangzhou headquarters, and we want to reach a point where quantum computing is scalable.

How do you prioritize DAMO Academy’s research?

Right now all of our work revolves around data and intelligence. We’ll have more and more data, so how do you collect, store, and process it? We process almost all our data using AI, which is the intelligence part of DAMO’s research. And this includes algorithms and processors.

How does Alibaba attract technical talent?

Most of Alibaba’s businesses, such as e-commerce, cloud computing, and logistics, are basic services fundamental to all of China. The fact that China wants to keep growing will provide a lot more space for Alibaba. People want to come here because they feel what they do can make a difference for many people, or even for a whole country’s progress in a particular area. This is how Alibaba is unique as a platform: every bit of work you do here can be amplified. I think that kind of social and economic value is very attractive to technical experts.

 
