We have introduced IBM Condor, a 1,121 superconducting qubit quantum processor based on our cross-resonance gate technology. Condor pushes the limits of scale and yield in chip design with a 50% increase in qubit density, advances in qubit fabrication and laminate size, and over a mile of high-density cryogenic flex IO wiring within a single dilution refrigerator.
So, it sounds like this is actually another fridge-sized system.
I may be mistaken, but the fridge-sized copper monstrosity is the system that cools the quantum chip, so unless they miniaturized the cooling system, that hasn't changed.
From what I remember, the chip itself is pretty small; the size is all due to the cooling hardware.
Also keep in mind you've probably seen a development version of a quantum computer, where things are set up to be easily accessible for fixing and tinkering, without regard for size or space optimization.
It's faster than Moore's law, but I don't know whether it can be sustained.
For years, IBM has been following a quantum-computing road map that roughly doubles the number of qubits every year. The chip unveiled on 4 December, called Condor, has 1,121 superconducting qubits arranged in a honeycomb pattern. It follows on from IBM's other record-setting, bird-named machines, including a 127-qubit chip in 2021 and a 433-qubit one last year.
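For a sense of the pace, here's a quick back-of-envelope on the counts quoted above (nothing more than arithmetic):

```python
# Qubit counts quoted above: 127 (2021), 433 (2022), 1,121 (2023).
counts = {2021: 127, 2022: 433, 2023: 1121}
years = sorted(counts)
for a, b in zip(years, years[1:]):
    print(f"{a} -> {b}: x{counts[b] / counts[a]:.2f}")
# x3.41 then x2.59 per year -- well ahead of the ~1.41x/year
# (doubling every two years) usually attributed to Moore's law.
```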
For now they're only being used for research purposes, for example simulating quantum effects in many-body physics and implementing error correction for future quantum computers. Any real applications still need some time, but the pace of development is really quite something.
Currently, there's basically only one real-world application we know of for sure: factoring numbers into their prime factors. And we can't even be certain there will be more.
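For the curious, that factoring application is Shor's algorithm, and its quantum part is really just period finding. A toy sketch of the underlying reduction (the period loop is brute-forced here, which is exactly the step a quantum computer replaces):

```python
from math import gcd

def factor_via_period(N, a):
    # Shor's reduction: knowing the period r of f(x) = a^x mod N
    # lets you split N. Finding r fast is the quantum part; here
    # we brute-force it classically just to show the idea.
    g = gcd(a, N)
    if g != 1:
        return g, N // g  # lucky: a already shares a factor with N
    r, y = 1, a % N
    while y != 1:         # the step Shor does with a quantum Fourier transform
        y = (y * a) % N
        r += 1
    if r % 2 or pow(a, r // 2, N) == N - 1:
        return None       # unlucky choice of a; retry with another
    p = gcd(pow(a, r // 2, N) - 1, N)
    return p, N // p

print(factor_via_period(15, 7))  # -> (3, 5)
```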
Wasn't there a study suggesting that, with the current approach of evaluating an average to break it down to a few finite states, they might never be able to do what they were developed for: cracking passwords?
If by "cracking passwords" you mean reversing password hashes in a database, quantum computers aren't going to make a big dent there. The standard industry ways of doing that wouldn't be affected much by QCs. Breaking encryption, OTOH, with QCs is a concern, but also vastly overrated. It would take orders of magnitude more qubits to pull off than what's been worked on so far, and it may not be feasible to juggle that many qubits in a state of superposition.
I get really annoyed when people focus on breaking encryption with QCs. They are far more interesting and useful than that.
QC can make logistics more efficient. Have you ever seen photos of someone unpacking a giant Amazon box holding one little microSD card? Amazon isn't dumb about these things, but our best methods for packing an entire truck are educated guesses. Exact packing algorithms would take too long to compute a perfect packing, so in practice you settle for a solution that seems OK, and that leads to a few "filler" boxes that are unnecessarily large, among other inefficiencies. QC could solve this problem without taking the age of the universe to come up with an answer.
The order in which that truck delivers those packages can also be made more efficient with QC.
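For a feel of the classical "good enough" approach being described, here's the textbook first-fit-decreasing heuristic for one-dimensional bin packing (a toy stand-in for the real 3-D truck-loading problem):

```python
def first_fit_decreasing(items, bin_size):
    # Classic "seems OK" heuristic: sort items big-to-small, drop each
    # into the first bin with room. Fast, but in the worst case it can
    # use ~22% more bins than an (intractable-to-find) optimal packing.
    bins = []
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= bin_size:
                b.append(item)
                break
        else:
            bins.append([item])  # nothing fits; open a new bin
    return bins

print(first_fit_decreasing([7, 5, 5, 5, 4, 2, 2], bin_size=10))
# -> [[7, 2], [5, 5], [5, 4], [2]]
```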
Then there are molecular simulations, which hold the promise of medications that are more effective, more likely to pass trials, and have fewer side effects. That kind of simulation can be done far faster on a QC.
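The reason simulation is such a natural fit: exactly representing n interacting quantum degrees of freedom on a classical machine takes 2^n amplitudes. A rough illustration of how fast that blows up:

```python
# Exact classical state-vector storage for n two-level systems:
# 2**n complex amplitudes, ~16 bytes each (complex128).
for n in (30, 40, 50):
    gib = (2 ** n) * 16 / 2 ** 30
    print(f"n={n}: {gib:,.0f} GiB")
# n=30: 16 GiB; n=40: 16,384 GiB; n=50: 16,777,216 GiB.
# A quantum device holds that state natively in n qubits.
```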
I don't think the point is that the only use of quantum computers is password cracking, rather that it's one of the types of workloads that's much easier on a quantum computer.
It's worth noting that the laser was much the same way. It was described early on as a solution in search of a problem, and lasers have had an incredible impact on technology.
It takes about a billion qubits to break 2048-bit encryption, so it'll be a while. I saw something recently about reducing that to about 20 million qubits, but it's still a ways off.
IIRC, those several-million-qubit computers out there right now aren't really comparable, either. They're using a ton of qubits, expecting a lot of them to fall out of superposition, but hoping enough survive to get a useful result. IBM's approach is to try to get the most out of every qubit. Either approach is valid, but IBM's 1,000-odd qubits can't be directly compared to the millions used elsewhere.
I think it's closer to 20,000,000, and that's outside the Noisy Intermediate-Scale Quantum (NISQ) regime: modern chips would need many times more qubits for error detection and error correction in order to run even basic algorithms. That's not to mention that they'd need to stay supercooled for up to eight hours, holding superposition without decohering into their ground states, while performing Shor's algorithm.
TL;DR: We need an improvement of over 20,000x, and better tech, to break RSA, but this is a good step forward!
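Taking the ~20 million figure at face value (it matches the Gidney and Ekerå 2019 estimate for factoring RSA-2048 in about 8 hours), the gap from Condor works out like this:

```python
# Back-of-envelope: physical qubits estimated for RSA-2048
# (Gidney & Ekera, 2019) vs. Condor's qubit count.
needed, condor = 20_000_000, 1_121
print(f"gap: ~{needed / condor:,.0f}x")  # -> ~17,841x, i.e. order-20,000x
```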
So, basically, we're still in the ENIAC stage of quantum computers. They're cool and all, and can do some awesome stuff, but they're nowhere near the potential they could reach.
It's actually impossible to do a direct comparison of flops to what I guess we'd call "quflops," because the algorithms aren't directly comparable. Quantum computers are good at quantum algorithms, which can do operations in a single time step that a classical computer couldn't; likewise, simulating a classical computer on a quantum computer would be very resource-intensive.
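A small illustration of the mismatch, sketched in plain NumPy (the "quantum step" here is one layer of Hadamard gates):

```python
import numpy as np

# One quantum "time step" -- a Hadamard gate on each of n qubits --
# transforms all 2**n amplitudes at once. Emulating that classically
# means explicitly touching every amplitude, which is why flops and
# "quflops" aren't measuring the same thing.
n = 3
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
layer = H
for _ in range(n - 1):
    layer = np.kron(layer, H)              # build H (x) H (x) H
state = np.zeros(2 ** n)
state[0] = 1.0                             # start in |000>
print(layer @ state)                       # all 8 amplitudes = 1/sqrt(8)
```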