Quantum computing

Harvard scientists claim breakthrough, ‘advent of early error-corrected quantum computation’

The team’s results, once reviewed, could represent a significant milestone in quantum computing research.

When industry insiders talk about a future where quantum computers are capable of solving problems that classical, binary computers can’t, they’re referring to something called “quantum advantage.”

To achieve this advantage, quantum computers need to be stable enough to scale in size and capability. By and large, quantum computing experts believe the largest impediment to scalability in quantum computing systems is noise.

Moody’s launches quantum-as-a-service platform for finance

The platform allows clients to compare and benchmark quantum and classical algorithms for a given task.

According to the Moody’s Analytics website, QFStudio is a software-as-a-service (SaaS) offering that will serve as “a continuous integration, benchmark, and delivery platform” for quantum computing solutions.

The burgeoning quantum computing sector is expected to experience steady growth over the next few decades. A recent forecast predicted that the U.S. quantum computing market would grow from about $138 million in 2022 to $1.2 billion by 2030.

Currently, most of the focus in quantum computing is on research and development. Companies such as IBM, Microsoft, Google, D-Wave and Rigetti have quantum, cloud-based quantum and hybrid quantum computing offerings on the market, but most of these solutions are geared toward early movers exploring use cases.

Meta’s AI boss says there’s an ‘AI war’ underway and Nvidia is ‘supplying the weapons’

The outspoken executive also said that Meta isn’t pursuing quantum computing because it isn’t currently useful.

Meta AI boss Yann LeCun sounded off on the industry-wide state of artificial intelligence and quantum computing during a recent event celebrating the 10th anniversary of Meta’s Fundamental Artificial Intelligence Research (FAIR) team.

LeCun commented on Nvidia’s current stranglehold on the AI hardware industry, the likelihood that human-level AI will emerge in the near future, and why Meta isn’t currently pursuing quantum computing alongside its competitors.

The artificial intelligence war

LeCun’s skepticism about the imminence of so-called human-level AI is well-documented.

By comparison, Elon Musk recently gave the bold prediction that a “Digital God” would arrive within the next 3 to 5 years.

In the middle, perhaps, lies Nvidia CEO Jensen Huang. He recently stated that AI would be able to complete tests in a manner “fairly competitive” with humans in the next five years.

IBM brings ‘utility-scale’ quantum computing to Japan as China and Europe struggle to compete

Experts predict the global quantum computing sector will have grown from about $930 million in 2023 to $6.5 billion by 2030, but some global markets may be better poised for growth than others.

IBM announced the completed installation of a 127-qubit quantum computing system at the University of Tokyo on Nov. 27. According to the company, this marks the arrival of the first “utility-scale” quantum system in the region.

The system, dubbed a “Quantum System One” by IBM and featuring the company’s Eagle processor, was installed as part of an ongoing research partnership between Japan and IBM. According to a blog post from IBM, it will be used to conduct research in various fields, including bioinformatics, materials science and finance.

Per Hiroaki Aihara, executive vice president of the University of Tokyo:

“For the first time outside North America, a quantum computer with a 127-qubit processor is now available for exclusive use with QII members… By promoting research in a wide range of fields and realizing social implementation of quantum-related technologies, we aim to make a broad contribution to a future society with diversity and hope.”

While Japan and the University of Tokyo reap the benefits of working with a U.S. quantum computing partner, China’s second-largest technology firm, Alibaba, has decided to shutter its own quantum computing laboratory and will reportedly donate its equipment to Zhejiang University.

Local media reports indicate the Alibaba move is a cost-cutting measure and that dozens of employees associated with the quantum research lab have been laid off. This follows the cancellation of a planned cloud computing spinoff earlier this month, with Alibaba stating that the partial United States export ban on computer chips to China has contributed to “uncertainty.”

Related: US official confirms military concerns over China’s access to cloud technology

The quantum computing sector is expected to grow by more than $5.5 billion between 2023 and 2030, according to estimates from Fortune Business Insights. This has led some experts to worry over the state of quantum computing research in areas outside of the U.S. and China.

Koen Bertels, founder of quantum computing accelerator QBee and a professor at the University of Ghent in Belgium, recently opined that Europe had already lost the artificial intelligence race and couldn’t afford to lose at quantum computing.

“In addition to being behind in funding, talent, and strategy,” wrote Bertels, “Europe isn’t only competing against the US.”

Microsoft enters $100M partnership with Canadian firm after quantum breakthrough

Photonic’s founder and Chief Quantum Officer says the company can bring a quantum computer to market within the next five years.

Canadian quantum computing firm Photonic has emerged from stealth to raise $100 million for its all-silicon quantum computing platform. Among the investors is new partner Microsoft, which will co-develop quantum networking solutions with the startup.

The investment and partnership come as numerous experts in the industry laud Photonic’s novel approach to quantum computing as a “breakthrough” for the field.

Photonic’s technique involves building quantum computers using silicon spin qubits with a spin-photon interface — in other words, a computer whose qubits live in silicon hardware and use photons (light) to communicate.

Related: IBM, Microsoft, others form post-quantum cryptography coalition

In quantum computing, a qubit is analogous to a classical computer’s bit. However, whereas a binary, or classical, computer can only perform calculations using ones and zeros, a qubit can tap into exotic features of quantum physics called “superposition” and “entanglement.” These quantum states allow a qubit to compute as if a bit could be one, zero, both at once, neither, or other even less intuitive combinations.
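The idea of superposition can be made concrete with a toy state-vector simulation. This is a plain-NumPy illustration of the standard textbook model, not tied to any hardware described in these articles:

```python
import numpy as np

# Computational basis states, written as 2-component state vectors.
ZERO = np.array([1, 0], dtype=complex)  # |0>
ONE = np.array([0, 1], dtype=complex)   # |1>

# The Hadamard gate puts a basis state into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def measure_probabilities(state):
    """Born rule: the probability of each outcome is |amplitude|^2."""
    return np.abs(state) ** 2

superposed = H @ ZERO  # (|0> + |1>) / sqrt(2)
probs = measure_probabilities(superposed)
print(probs)  # equal chance of reading out 0 or 1
```

Until it is measured, the qubit is not "one or the other" but a weighted combination of both amplitudes, which is what lets quantum algorithms explore many classical states at once.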

A spin qubit encodes quantum information in the spin of an electron. And, by developing a spin qubit with a photon interface in an all-silicon hardware solution, Photonic believes it has found the missing piece of the quantum computing puzzle.

Stephanie Simmons, founder and Chief Quantum Officer of Photonic, says the company expects to bring a fault-tolerant, fully functional quantum networking system to market as early as within the next five years.

Per Simmons, the partnership with Microsoft will help to facilitate that timeline:

“We’re incredibly excited to be partnering with Microsoft to bring forth these new quantum capabilities. Their extensive global infrastructure, proven platforms, and the remarkable scale of the Azure cloud make them the ideal partner to unleash the transformative potential of quantum computing and accelerate innovation across the quantum computing ecosystem.”

EU to assess export controls on AI tech and semiconductor chips

The European Commission outlined four critical technology areas, including AI technology and semiconductor chips, on which it plans to run assessments to consider export controls.

The European Commission is conducting risk assessments and considering export controls on “critical technology areas,” including artificial intelligence (AI) and semiconductor technologies, according to a press release from the Commission.

On Oct. 3, European Union officials said they identified four areas that need assessment regarding technology risk and risk of technology leakage: AI, advanced semiconductor technologies, quantum technologies and biotechnologies.

According to the announcement, these technologies were chosen based on their transformative nature, the risk of civil-military fusion and the risk that the technology could be used to violate human rights.

Thierry Breton, commissioner for the internal market of the European Union, called the move an important step for EU resilience, adding:

“We need to continuously monitor our critical technologies, assess our risk exposure and – as and when necessary – take measures to preserve our strategic interests and our security.”

He continued, “Europe is adapting to the new geopolitical realities, putting an end to the era of naivety and acting as a real geopolitical power.”

The risk assessments will be carried out by the end of the year. Any results or initiatives based on the risk assessments will be presented by spring 2024.

Related: US and Vietnam make deals on AI chips and tech worth billions

The commission says the next steps include engaging with the 27 EU member states to begin collective assessments of the four areas.

This development follows the European Commission’s June 20 enactment of the Joint Communication on European Economic Security Strategy, a multi-pillar initiative that includes “protection against risks” and promoting European competitiveness in specific markets.

The United States has also been focusing on assessing the export risks of its own technology in similar sectors. Recently, it banned the export of high-level AI semiconductor chips to China.

Many lawmakers in the U.S. have also supported legislation that would mandate companies to report investments in Chinese technologies.

The decisions stemming from the U.S. have sparked countries overseas to consider their own course of action regarding AI technologies.

IBM, Microsoft, others form post-quantum cryptography coalition

The coalition includes Google sibling company SandboxAQ and the University of Waterloo.

IBM Quantum and Microsoft have formed a coalition to tackle post-quantum cryptography alongside not-for-profit research tank MITRE, U.K.-based cryptography firm PQShield, Google sibling company SandboxAQ, and the University of Waterloo.

Post-quantum cryptography (PQC) addresses the potential threat posed by the quantum computers of the future. Current cryptography schemes rely on hard mathematical problems to stymie decryption attempts.

Cracking or bypassing such encryption with a classical computer would be close to impossible. Some experts estimate that it would take a binary computer system roughly 300 trillion years to break a 1,024-bit or 2,048-bit RSA key.

RSA, named for its creators Rivest, Shamir and Adleman, is widely considered the standard for public-key encryption.

Theoretically speaking, however, a quantum computer with sufficient hardware and architecture should be able to break RSA and similar encryption schemes within a matter of weeks, days, or even hours.
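The scale of that gap can be sketched with rough, back-of-the-envelope arithmetic. The formula below is the standard heuristic cost of the general number field sieve (the best known classical factoring attack), and the ~n³ gate-count figure for Shor’s algorithm is a simplified assumption rather than a precise engineering estimate:

```python
import math

def gnfs_ops(bits):
    """Heuristic operation count for the general number field sieve
    (classical factoring of an n-bit RSA modulus)."""
    ln_n = bits * math.log(2)  # ln N for an n-bit modulus
    exponent = (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return math.exp(exponent)

def shor_ops(bits):
    """Very rough gate-count scaling for Shor's algorithm, ~O(n^3)."""
    return bits ** 3

for bits in (1024, 2048):
    ratio = gnfs_ops(bits) / shor_ops(bits)
    print(f"{bits}-bit RSA: classical/quantum operation ratio ~ 10^{math.log10(ratio):.0f}")
```

The point is not the exact numbers but the shape of the curves: classical cost grows sub-exponentially in the key size while Shor’s cost grows only polynomially, which is why "300 trillion years" could collapse to days or hours on sufficiently capable quantum hardware.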

According to a press release from MITRE:

“Preparing for a PQC transition includes developing standards for the algorithms; creating secure, reliable, and efficient implementations of those algorithms; and integrating the new post-quantum algorithms into cryptographic libraries and protocols.”

Technologies such as blockchain and cryptocurrency, which rely on mathematical encryption, could be particularly vulnerable to decryption attacks by the theoretical quantum computers of the future. However, it’s currently unclear how long it could be before such threats come to fruition.

Related: Scientists warn the ‘quantum revolution’ may stagnate economic growth

One study, conducted in 2022, determined that it would take a quantum computer with 300 million qubits (a very generalized measure of the potential processing power of a quantum system) to crack the Bitcoin blockchain fast enough to do any damage. By comparison, today’s most advanced quantum computers average a little over 100 qubits.

However, per the architecture described in that paper, it’s possible that more advanced qubit arrangements, chipsets and optimization algorithms could significantly change the calculus and dramatically lower the theoretical 300-million-qubit requirement. For this reason, the global technology community is turning to quantum-safe encryption.

The National Institute of Standards and Technology (NIST) chose four proposed post-quantum encryption algorithms in 2022 — CRYSTALS-Kyber, CRYSTALS-Dilithium, SPHINCS+ and Falcon — as candidates for a PQC-safe encryption standard.

On Aug. 24, 2023, NIST announced that three of the algorithms had been accepted for standardization with the fourth, Falcon, expected to follow suit in 2024.

Now that the algorithms have been accepted and (mostly) standardized, the coalition is set to begin its mission: using the deep knowledge and hands-on experience amassed by its members to ensure that key institutions, such as government, banking, telecommunications and transportation services, can transition from current encryption to post-quantum encryption.

New research shows how brain-like computers could revolutionize blockchain and AI

A CMOS-compatible neuromorphic computing chip could be on the horizon thanks to breakthrough research out of Technische Universität Dresden.

Researchers from Technische Universität Dresden in Germany recently published breakthrough research showcasing a new material design for neuromorphic computing, a technology that could have revolutionary implications for both blockchain and AI.

Using a technique called “reservoir computing,” the team developed a method for pattern recognition that uses a vortex of magnons to perform algorithmic functions near instantaneously.

It looks complicated because it is. Image source: Korber et al., “Pattern recognition in reciprocal space with a magnon-scattering reservoir,” Nature.

Not only did they develop and test the new reservoir material, they also demonstrated the potential for neuromorphic computing to work on a standard CMOS chip, something that could upend both blockchain and AI.

Classical computers, such as the ones that power our smartphones, laptops, and the majority of the world's supercomputers, use binary transistors that can either be on or off (expressed as either a “one” or “zero”).

Neuromorphic computers use programmable physical artificial neurons to imitate organic brain activity. Instead of processing binaries, these systems send signals across varying patterns of neurons with the added factor of time.
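The role of time can be illustrated with a leaky integrate-and-fire neuron, the standard textbook model of a spiking neuron. This is a generic toy sketch, not the Dresden team’s magnon-based design:

```python
# Minimal leaky integrate-and-fire neuron: membrane potential decays
# ("leaks") over time, and the neuron fires only when accumulated
# input crosses a threshold.
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leak, then integrate input
        if potential >= threshold:
            spikes.append(1)   # fire a spike...
            potential = 0.0    # ...and reset the membrane
        else:
            spikes.append(0)
    return spikes

# A steady weak input accumulates until the neuron fires, while a single
# strong input fires immediately: the timing of inputs carries information.
print(simulate_lif([0.3, 0.3, 0.3, 0.3, 0.3]))  # [0, 0, 0, 1, 0]
print(simulate_lif([1.2, 0.0, 0.0]))            # [1, 0, 0]
```

Unlike a logic gate, the same input value can produce different outputs depending on what arrived before it, which is the temporal dimension the article describes.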

This matters for blockchain and AI, specifically, because neuromorphic computers are fundamentally suited to pattern recognition and machine learning algorithms.

Binary systems use Boolean algebra to compute. For this reason, classical computers remain unchallenged when it comes to crunching numbers. However, when it comes to pattern recognition, especially when the data is noisy or missing information, these systems struggle.

This is why it takes a significant amount of time for classical systems to solve complex cryptography puzzles and why they’re entirely unsuited for situations where incomplete data prevents a math-based solution.

In the finance, artificial intelligence, and transportation sectors, for example, there’s a never-ending influx of real-time data. Classical computers struggle with occluded problems — the challenge of driverless cars, for example, has so far proven difficult to reduce to a series of “true/false” compute problems.

However, neuromorphic computers are purpose-built for dealing with problems that involve incomplete information. In the transportation industry, for example, a classical computer cannot feasibly predict the flow of traffic because there are too many independent variables. A neuromorphic computer can react to real-time data continuously because it doesn’t process data points one at a time.

Instead, neuromorphic computers run data through pattern configurations that function somewhat like the human brain. Our brains flash specific patterns in relation to specific neural functions, and both the patterns and the functions can change over time.

Related: How does quantum computing impact the finance industry?

The main benefit of neuromorphic computing is that, relative to classical and quantum computing, its power consumption is extremely low. This means neuromorphic computers could significantly reduce the time and energy costs of operating a blockchain and mining new blocks on existing blockchains.

Neuromorphic computers could also provide significant speedup for machine learning systems, especially those that interface with real-world sensors (self-driving cars, robots) or those that process data in real-time (crypto market analysis, transportation hubs).

Cerebras Systems secures $100M AI supercomputer deal with UAE’s G42

Alongside G42, Cerebras aims to expand the supercomputer, with plans to establish 36 exaflops of AI computing in the coming year.

Cerebras Systems has announced a deal worth around $100 million with G42, a technology group based in the United Arab Emirates (UAE). The agreement entails providing the initial installment of an artificial intelligence (AI) supercomputer, with the possibility of delivering up to nine more units, the company said in a statement on July 20.

Cerebras, headquartered in Silicon Valley in the United States, said G42 has committed to acquiring three of its Condor Galaxy systems — an innovative network comprising nine interconnected supercomputers. The first supercomputer in this network, known as Condor Galaxy 1 (CG-1), has a performance of 4 exaflops and comprises 54 million cores.

These systems will be manufactured in the United States to expedite their deployment. The first system is scheduled to be operational this year, while the remaining two, CG-2 and CG-3, are expected to be online in early 2024, the company said.

Visual representation of the supercomputer features. Source: Cerebras

This agreement emerges amid a global quest among cloud computing providers to find alternatives to Nvidia chips, the current market leader in AI computing. Nvidia’s products are facing shortages due to the soaring demand from AI services like ChatGPT and others. In this context, Cerebras and several other startups strive to challenge Nvidia’s dominance in the AI computing sector.

According to Cerebras CEO Andrew Feldman, discussions are underway for the potential acquisition of up to six more supercomputers by late 2024. Alongside G42, the company aims to expand the supercomputer, with plans to establish 36 exaflops of AI computing in the coming year.

Related: OpenAI launches ‘custom instructions’ for ChatGPT so users don’t have to repeat themselves in every prompt

In the announcement, Feldman expressed his intention to relocate to the UAE for three months, collaborating closely with G42 to advance its computing service using the systems. He described the endeavor as a “rare opportunity to revolutionize a massive market.” G42, headquartered in Abu Dhabi, said it planned to leverage the Cerebras systems to offer AI computing services to healthcare and energy companies.

Cointelegraph reached out to Cerebras for more information on the terms of the deal and its future plans but has not yet received a response.

Scientists warn the ‘quantum revolution’ may stagnate economic growth

There are “traps” lying in wait for innovators at the vanguard of fintech and quantum computing, according to researchers.

Quantum computing technologies are slowly beginning to trickle out of the laboratory setting and into commercial industries. While it remains to be seen when mainstream adoption will occur, a number of companies are currently engaged in experiments and trials with paying clients to develop quantum computing solutions. 

According to a pair of researchers from the University of Cambridge and Bandung Institute of Technology, respectively, this represents a critical period wherein the world still has the opportunity to prepare itself for what they’re deeming “the quantum revolution.”

In a recently published commentary in Nature, researchers Chander Velu and Fathiro Putra describe the “productivity paradox” and explain how the mainstream adoption of quantum computing could slash economic growth for a decade or more.

Per their commentary:

“The digital revolution took decades and required businesses to replace expensive equipment and completely rethink how they operate. The quantum computing revolution could be much more painful.”

The productivity paradox is a business and finance term that explains why the introduction of new, better technology doesn’t usually result in an immediate increase in productivity.

We’ve seen this in nearly every aspect of the nascent blockchain and cryptocurrency industries. As the requirements for mining increase, for example, so do the costs associated with entering the space in any competitive capacity.

Less than a decade ago, it was fashionable to mine cryptocurrency with a desktop PC’s spare compute. As adoption has risen, so have corporate interest and the costs of entry.

Chart: mining hashrates over time. Source: Blockchain.com

And, as fintech is one of the industries experts predict will see immediate disruption from quantum computing, we’ll likely see direct integration with mining, blockchain and cryptocurrency technologies early on.

Related: Researchers demonstrate ‘unconditionally secure’ quantum digital payments

To explain the productivity paradox, the researchers cite the period from 1976 through 1990, when labor productivity growth (a measure of how productive individuals are at work over time) slowed to a crawl. The stagnation coincided with the onset of the computer era.

Essentially, the costs of the global switch from paper to computers, combined with the need to retrain the entire workforce and build entirely new solution ecosystems and workflows, stalled productivity growth until the integration was finally complete in the mid-1990s.

The researchers see a similar predicament occurring as quantum computers go from brushing up against usefulness to, potentially, becoming a backbone technology for business.

The two main roadblocks to a smooth transition into the quantum age, according to the researchers, are a lack of general understanding of the technology among leaders and risk aversion.

While businesses with a clear use case, such as shipping or pharmaceutical companies, may be quick to adopt quantum solutions, the rate of return might not appeal to risk-averse businesses looking for immediate impact.

To mitigate these concerns and accelerate the adoption of quantum computing, the researchers suggest that governments and researchers renew their focus on illustrating the technology’s potential benefits. They also call for developing the language and terminology needed to explain the underlying concepts to the business community and the general public.

The researchers conclude by stating that the first order of business when it comes to preparing for the quantum computing future is to ensure that the “quantum internet” is ready for secure networking.
