Quantum Computing Boom: How Are Tech Giants Disrupting the AI Industry?

AI vs. quantum computing from a purely scientific perspective (long post, proceed with caution).

I work in AI, while my husband works in quantum computing. We both have PhDs from top universities and work as frontline researchers. Quantum computing is seen as the next big thing after AI, but for now that may be true mostly in the stock market... and the stock market is just the stock market. Quantum computing stocks may keep skyrocketing, but scientifically speaking, real application opportunities will only appear if the field develops well over the next 5-10 years (I don't dare say 20 years: three years ago we thought AGI was 10 years away, and look where it is now).
Just as everyone now regrets not buying Bitcoin over a decade ago, if quantum computing is eventually realized, many people 20 years from now may regret not having invested in certain stocks.
I really want to short these stocks, but my husband disagrees (so I won't). The stock market doesn't seem to care whether something is currently profitable: what matters is the big picture, and whether investors believe in it. Otherwise, why would NVIDIA still be plummeting even though no AI practitioner can do without the A100? To quote a top researcher: quantum computing research is still very interesting, with unsolved problems everywhere and papers to publish endlessly; in terms of research efficiency and value, it beats working on AI models.
On the scientific level, I recommend understanding a few concepts: quantum advantage, NISQ (the noisy intermediate-scale quantum era), physical qubits, logical qubits, and single-/two-qubit gate fidelity.
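To make the physical-versus-logical distinction concrete, here is a minimal back-of-the-envelope sketch. It assumes the common surface-code heuristics (roughly 2d² physical qubits per logical qubit at code distance d, and a logical error rate falling like 0.1·(p/0.01)^((d+1)/2) for physical error rate p); the exact constants vary from paper to paper, so treat the numbers as illustrative only, not a hardware design.

```python
# Rough surface-code overhead estimate (illustrative constants, not a design tool).
import math

def distance_needed(p_phys, p_logical_target, p_threshold=1e-2, prefactor=0.1):
    """Smallest odd code distance d whose estimated logical error rate meets the target.

    Uses the common heuristic p_L ~ prefactor * (p_phys / p_threshold) ** ((d + 1) / 2).
    """
    d = 3
    while prefactor * (p_phys / p_threshold) ** ((d + 1) / 2) > p_logical_target:
        d += 2
    return d

def physical_qubits_per_logical(d):
    # Data qubits plus measurement ancillas: roughly 2 * d^2.
    return 2 * d * d

if __name__ == "__main__":
    for p_phys in (1e-3, 1e-4):  # physical (two-qubit) error rates
        d = distance_needed(p_phys, p_logical_target=1e-12)
        print(f"p_phys={p_phys:.0e}: distance {d}, "
              f"~{physical_qubits_per_logical(d)} physical qubits per logical qubit")
```

The point of the sketch is the ratio: at today's gate fidelities, hundreds to thousands of physical qubits go into a single good logical qubit, which is why raw qubit counts in press releases say little on their own.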
Firstly, not all quantum algorithms are better than classical ones! Humanity has mathematically proven that problems with quantum advantage exist, but a practically useful quantum advantage has not truly been demonstrated yet (whoever achieves it will be publishing in Nature or Science at a minimum). Quantum advantage means exponential speedup on specific problems; for something like linear regression, quantum algorithms are of no help.
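For a rough sense of what "exponential speedup on specific problems" means, factoring is the textbook case. The sketch below compares the standard asymptotic operation counts of the classical number field sieve against Shor's algorithm for a few key sizes; constants and error-correction overhead are ignored, so the absolute numbers are meaningless and only the scaling trend matters.

```python
# Asymptotic scaling comparison for factoring a b-bit integer (illustrative only).
import math

def classical_gnfs_ops(bits):
    """General number field sieve: exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3))."""
    ln_n = bits * math.log(2)
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

def shor_ops(bits):
    """Shor's algorithm: polynomial in the key size, roughly O(b^3) gate operations."""
    return bits ** 3

if __name__ == "__main__":
    for bits in (512, 1024, 2048, 4096):
        print(f"{bits}-bit key: classical ~10^{math.log10(classical_gnfs_ops(bits)):.0f} ops, "
              f"Shor ~10^{math.log10(shor_ops(bits)):.0f} gate ops")
```

The exponential-versus-polynomial gap is the whole story; for problems like linear regression, where classical algorithms are already fast, there is no such gap to exploit.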
Furthermore, quantum advantage only shows up at scale. The current hardware scale (number of physical qubits) and accuracy (single-/two-qubit gate fidelity) are not sufficient to run a practically valuable workload on quantum hardware. This is also why, a few years ago, IBM published a random number generator study in Nature... Gate noise and shot noise in quantum hardware are also significant issues. Of course, AI can help reduce noise, as reflected in DeepMind's recent AlphaQubit paper. In reality, though, AI is still the more powerful tool, especially when applied to quantum computing itself.
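As a toy illustration of why gate noise and shot noise bite at scale: if each two-qubit gate succeeds with fidelity F, a circuit containing G such gates succeeds with probability roughly F^G, and estimating an observable from S measurement shots carries statistical error on the order of 1/√S. The fidelity and gate counts below are made up for illustration.

```python
# Why gate noise and shot noise bite at scale (toy estimate with made-up numbers).
import math

def circuit_success_probability(two_qubit_fidelity, num_two_qubit_gates):
    """Crude estimate: every gate must succeed, errors assumed independent."""
    return two_qubit_fidelity ** num_two_qubit_gates

def shot_noise(num_shots):
    """Statistical error of an expectation value estimated from num_shots measurements."""
    return 1.0 / math.sqrt(num_shots)

if __name__ == "__main__":
    fidelity = 0.999                      # an optimistic two-qubit gate fidelity
    for gates in (100, 1_000, 10_000):    # circuit size in two-qubit gates
        p = circuit_success_probability(fidelity, gates)
        print(f"{gates:>6} gates at F={fidelity}: success probability ~{p:.2e}")
    print(f"shot-noise error with 10,000 shots: ~{shot_noise(10_000):.3f}")
```

Even at 99.9% gate fidelity, a ten-thousand-gate circuit almost never runs cleanly, which is roughly where the practically interesting algorithms start.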
Lastly, I am actually quite bullish on quantum computing, but it is important to note that hardware and software are different things, and quantum hardware companies and quantum algorithm companies are currently somewhat out of sync. Most quantum algorithms are tested on classical computers, and simulating a quantum computer on a classical machine usually overestimates an algorithm, because real hardware has many problems the simulation ignores. In addition, quantum algorithms themselves have many hard theoretical open problems (similar to AI...). There are actually not many quantum algorithms that run as expected on real quantum hardware.
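One concrete reason classical simulation is both how most algorithms get tested and why that stops scaling: a full statevector simulation stores 2^n complex amplitudes. The quick calculation below (assuming 16 bytes per amplitude) shows why such simulations top out around 40-50 qubits even on large clusters, and a noiseless simulation also hides every hardware imperfection mentioned above.

```python
# Memory needed for a full statevector simulation of n qubits (16 bytes per amplitude).
def statevector_memory_bytes(num_qubits, bytes_per_amplitude=16):
    return (2 ** num_qubits) * bytes_per_amplitude

if __name__ == "__main__":
    for n in (30, 40, 50):
        gib = statevector_memory_bytes(n) / 2 ** 30
        print(f"{n} qubits: ~{gib:,.0f} GiB of amplitudes")
```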
Two common questions:
Will quantum machine learning replace deep learning? Not for now. Honestly, QML is still a purely academic research direction, and a fairly theoretical one. Go read the papers: all math formulas and long derivations. I have never seen a quantum machine learning algorithm in practice (even simulated on classical computers) that outperforms classical/deep learning algorithms.
Quantum decryption to crack cryptocurrencies? Bitcoin enthusiasts don't need to worry too much; that is still a long way off (see the rough qubit estimate sketched below).
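For a sense of how far off: one published resource estimate (Gidney & Ekerå, 2019) puts factoring a 2048-bit RSA key at roughly 20 million noisy physical qubits running for about 8 hours, and estimates for the elliptic-curve keys Bitcoin actually uses land in a similar ballpark of millions of physical qubits. Today's largest devices have on the order of a thousand physical qubits. The comparison below just divides those two numbers; it is a crude illustration, not a forecast.

```python
# Crude gap between a published estimate for breaking RSA-2048 and today's hardware.
import math

REQUIRED_PHYSICAL_QUBITS = 20_000_000   # Gidney & Ekera (2019) estimate for RSA-2048
TODAYS_LARGEST_DEVICE = 1_000           # rough order of magnitude for current chips

if __name__ == "__main__":
    gap = REQUIRED_PHYSICAL_QUBITS / TODAYS_LARGEST_DEVICE
    doublings = math.log2(gap)
    print(f"Hardware gap: ~{gap:,.0f}x more physical qubits needed "
          f"(~{doublings:.0f} doublings of qubit count), before even counting fidelity.")
```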
Overall, the current stage of quantum computing is at best comparable to neural networks around the time backpropagation was introduced, in the late 1980s and early 1990s. Following that analogy, perhaps in 30 years quantum computing will be where AI is today.
Finally, I wish the industry healthy development. With both quantum computing and AI doing well, we can apply for funding, do better research, and publish more papers. But I also hope everyone remains cautiously optimistic and gives science and technology more time and patience.
  • 72674030 : Will quantum computing really reach the level of today's AI in 30 years? Is this optimistic speculation or the result of rational inference?

  • 雪宝宝 OP 72674030 : That's how scientific development usually goes, and 30 years pass quickly! Other fields take 50-100 years: when were Einstein's gravitational waves predicted, and when were they finally verified? AI was proposed by Turing in the 1950s; from the late 1970s through the 1980s there was an AI winter, when people could not find a practical path forward. It was the emergence of backpropagation that made everyone realize the potential of neural networks. You can Google the history of AI. AI is the fastest-developing field in the history of science: this year, in AI for science (my research direction), the AlphaFold work won a Nobel Prize less than 4 years after publication, whereas in other fields that usually takes over 10 years.

  • 雪宝宝 OP 72674030 : As for the 30-year figure: judging purely from the history of AI, the timescale is definitely more than 10 years, and the exact number is hard to estimate. It depends on whether extraordinary talents emerge to solve the core problems. The article also mentions AGI; after the emergence of transformers, progress toward AGI took a big step forward.
    If scientists could estimate timing accurately, we would have switched careers long ago. Now, with AI, the pace of scientific development has accelerated by an order of magnitude; AI can basically help crack a major scientific problem within a few years (hence the flood of Nature and Science papers).