What is a qubit?

April 5, 2010

Quantum computing is based on different mathematics and different hardware from conventional, binary computing. Still in development, it will have unique advantages when it becomes a reality.

Let’s go back to the 1940s and the first machines that worked with binary numbers, ran a stored “program” that controlled their operations, and had an internal architecture that would still be familiar to us today: a central processing unit with its own temporary memory, a main storage unit, and input and output devices around them.

These were Turing machines, the model Alan Turing had defined in his 1936 paper on computable numbers. His definition was, very generally, that of an automatic machine for carrying out any definite procedure. The Church-Turing thesis captures the same idea: any computation that can be performed at all can be performed by a Turing machine.

All well and good, and obvious, really. But the seemingly circular Church-Turing thesis says nothing about how long a computation will take, and that is where the practical limitations appear. Conventional computers running a binary number system are slow, especially when processing the enormously complicated maths and algorithms common in a wide range of fields.

It’s like the old applied maths problem of a chess game. The winning outcome of any finite zero-sum game of perfect information can be determined by a tree search, a simple yes/no diagram of every possible sequence of moves. A tree search for noughts and crosses is simple and quick. You could do it in your head, and experienced players effectively do. A full tree search for chess would keep a modern supercomputer busy far longer than a human lifetime. It would give you the right answer, the perfect winning chess game, but you wouldn’t be around anymore to play it.
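To make the tree-search idea concrete, here is a minimal sketch in plain Python (purely illustrative; the function names are my own, and real chess engines work very differently) of exhaustive minimax over the noughts-and-crosses board. With perfect play it reports a draw, and the same brute-force idea applied to chess is what explodes beyond any practical time scale.

    # Exhaustive tree search (minimax) of noughts and crosses.
    # Board: list of 9 cells, each 'X', 'O' or ' ' (empty).
    WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
                 (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
                 (0, 4, 8), (2, 4, 6)]              # diagonals

    def winner(board):
        for a, b, c in WIN_LINES:
            if board[a] != ' ' and board[a] == board[b] == board[c]:
                return board[a]
        return None

    def minimax(board, player):
        """Game value for 'X' with perfect play: +1 win, 0 draw, -1 loss."""
        w = winner(board)
        if w == 'X':
            return 1
        if w == 'O':
            return -1
        if ' ' not in board:
            return 0                                # board full, no winner: draw
        values = []
        for i in (i for i, cell in enumerate(board) if cell == ' '):
            board[i] = player                       # try the move...
            values.append(minimax(board, 'O' if player == 'X' else 'X'))
            board[i] = ' '                          # ...and take it back
        return max(values) if player == 'X' else min(values)

    print(minimax([' '] * 9, 'X'))                  # prints 0: perfect play is a draw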

This really becomes an issue when you move on to more complex problems like stock market analyses, weather and climate models, astrophysics, nuclear physics and so on. Even for a simple linear equation, a binary machine performs thousands or millions of floating point operations to reach an answer you could work out in your head, just not as quickly.

Present supercomputers can run at a few petaFLOPS, that is, a few quadrillion floating point operations per second. It’s still not fast enough for the really big problems. The other limitation on conventional machines is physical, rather than mathematical.

We are approaching the upper limit of Moore’s Law, as long as we are still using silicon. Yes, we have steadily doubled the number of transistors on chips roughly every 18 months. That isn’t exactly what Moore said, by the way (his revised prediction was a doubling about every two years), but the 18-month version has become common usage.

However, we are now working with layers approaching a thickness of only three atoms, at which point electron leakage renders the devices unreliable. Not to mention the cooling problems of having so many transistors in such a small volume.

A new direction

The mathematical limitations are being tackled with a new maths that will be familiar only to quantum physicists and to the “quants” in the stockbroking field who use very advanced statistical models and analyses.

This is the core of quantum computing and it is as “new” as binary systems were when they replaced previous, unwieldy machines that were designed to work in decimal.

Quantum computers already exist, in one form or another, and they work with qubits, not the usual 1/0 bits. A qubit is, mathematically, the state vector of a two-level quantum system: a vector of two complex numbers. Putting it simply and practically, a qubit can have the value 1, 0 or a blend of both at once. In quantum language the simultaneous 1/0 state is called a superposition.
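As a rough illustration of that definition, here is how a qubit looks as plain linear algebra (a sketch using NumPy, which is my choice of tool rather than anything prescribed by quantum hardware): two complex amplitudes, one attached to 0 and one to 1, whose squared magnitudes add up to one.

    # A qubit as a two-component complex state vector.
    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)          # the ordinary bit value 0
    ket1 = np.array([0, 1], dtype=complex)          # the ordinary bit value 1

    psi = (ket0 + ket1) / np.sqrt(2)                # an equal superposition: "both at once"

    print(np.abs(psi) ** 2)                         # [0.5 0.5] -> probabilities of reading 0 or 1
    print(np.isclose(np.sum(np.abs(psi) ** 2), 1))  # True: the amplitudes are normalised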

It seems impossible to compute with something that doesn’t have a rigidly fixed and measurable value. It conjures up scary memories of the Heisenberg uncertainty principle and Schrödinger’s cat.

In fact, we do know the maths that enables us to work with qubits; it is basically a matter of probabilities. If you picture the traditional 1/0 values as a sphere, with 0 at the north pole and 1 at the south pole, then every possible state of a qubit, described by its probability amplitudes, corresponds to a point on the surface of that sphere (known as the Bloch sphere).
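Here is a small sketch of that picture, again in NumPy and purely illustrative: a point on the sphere, given by two angles, maps to a pair of amplitudes, and the squared amplitudes give the odds of reading out 0 or 1.

    # From a point on the Bloch sphere to measurement probabilities.
    import numpy as np

    def bloch_state(theta, phi):
        """Qubit state at polar angle theta, azimuth phi on the Bloch sphere."""
        return np.array([np.cos(theta / 2),
                         np.exp(1j * phi) * np.sin(theta / 2)])

    psi = bloch_state(np.pi / 3, 0.0)               # a point between the two poles
    p0, p1 = np.abs(psi) ** 2                       # Born rule: probabilities of 0 and 1
    print(p0, p1)                                   # 0.75 0.25

    # Repeating the measurement many times reproduces those probabilities.
    samples = np.random.choice([0, 1], size=10_000, p=[p0, p1])
    print(samples.mean())                           # roughly 0.25 (the fraction of 1s)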

Quantum states also have the unique property of entanglement. This is a non-local property of quantum systems (pairs of qubits, for example) whereby two of them can show correlations stronger than classical probability allows. Simply and somewhat inaccurately expressed, this is part of what allows quantum machines to work quickly on problems that a conventional machine could solve only on impossible time scales.
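A minimal sketch of entanglement, once more in NumPy and for illustration only: a Hadamard gate followed by a CNOT turns two ordinary qubits, both starting at 0, into a Bell pair whose measurement outcomes always agree.

    # Building the Bell state (|00> + |11>)/sqrt(2) with two standard gates.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard: creates superposition
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],                  # flips the second qubit
                     [0, 1, 0, 0],                  # when the first one is 1
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state00 = np.kron([1, 0], [1, 0])               # both qubits start as 0
    bell = CNOT @ np.kron(H, I) @ state00

    probs = np.abs(bell) ** 2                       # outcomes 00, 01, 10, 11
    print(probs.round(3))                           # [0.5 0. 0. 0.5]: only 00 or 11, never 01 or 10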

Just on a “fun” note, the fact that entanglement is a non-local property underpins quantum teleportation, which transfers a quantum state (not matter) from one place to another, and which is the closest physics currently gets to something akin to the Star Trek transporter. One day, perhaps.

A quick example is code breaking – which is where we started with the binary machines of the 1940s.

Let’s take a typical public-key code: RSA with a 2048-bit key. Using the best known classical method, the general number field sieve, a supercomputer would need something like 2000 times the lifetime of the universe to crack that key. A sufficiently large quantum machine running Shor’s algorithm could, in principle, crack it in a matter of hours, using the (to most of us) incomprehensible mathematics that govern probability amplitudes and quantum entanglement.
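The heart of Shor’s algorithm is period finding. The toy sketch below (illustrative only, with a deliberately tiny modulus and function names of my own) brute-forces the period classically to show why knowing it cracks the factoring problem; the quantum machine’s contribution is finding that period exponentially faster for numbers of real cryptographic size.

    # Toy factoring via period finding -- the classical skeleton of Shor's algorithm.
    from math import gcd

    def find_period(a, N):
        """Smallest r > 0 with a**r = 1 (mod N), found here by brute force."""
        x, r = a % N, 1
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def toy_factor(N, a):
        if gcd(a, N) != 1:
            return gcd(a, N)                        # lucky: a already shares a factor with N
        r = find_period(a, N)
        if r % 2 == 1:
            return None                             # odd period: try a different a
        candidate = gcd(pow(a, r // 2) - 1, N)
        return candidate if 1 < candidate < N else None

    print(toy_factor(15, 7))                        # 3, since 7**4 = 1 (mod 15) and gcd(7**2 - 1, 15) = 3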

Scientists have been creating and manipulating superposition and entanglement in particles for decades. Quantum computers are just looking to do exactly the same with information.

Apart from being more efficient on many problems, quantum computers have unique abilities to handle complexity, thanks again to the opaque mathematics of probability that govern their operation.

This didn’t happen overnight. The original work on the number theory goes back to the 1970s. In the early 1980s, Richard Feynman pointed out that a conventional Turing machine cannot efficiently simulate quantum systems, and proposed a quantum machine that could.

During the 1990s there were a number of breakthroughs. Artur Ekert proposed a secure communications scheme based on entanglement. Peter Shor published his famous factoring algorithm in 1994. In 1995, the US Department of Defense arranged the first workshop on quantum computing; these were the same people who sponsored ARPANET, which became the global internet.

By 1998, the first working machines were demonstrated, using only two or three qubits, but a start nonetheless.

This new century has seen literally hundreds of breakthroughs in the physical systems that could enable quantum computing. These include optical systems, new laser technology, molecular magnets, controlled electron states and many others.

In 2009, the first universal programmable quantum computer was demonstrated and, this year, a quantum machine simulated a hydrogen molecule with accurate results. Google has demonstrated a quantum algorithm that could be used for very fast searches over enormous data volumes. Microsoft Research is also working on quantum computing.

Quantum computing is not just theoretical – it is a commercial opportunity and the major players are already moving on this.

Storage

In late 2008, researchers at Lawrence Berkeley National Laboratory showed coherent transfer of state from an electron-spin qubit (a “processing” qubit) to a nuclear-spin qubit (a “memory” qubit). This lays the foundations of quantum storage.
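In the abstract, that kind of transfer is a SWAP operation: the state of the “processing” qubit is moved onto the “memory” qubit without being measured or copied. The NumPy sketch below shows only the linear algebra and makes no attempt to model the actual spin hardware.

    # Transferring a qubit state with a SWAP operation.
    import numpy as np

    SWAP = np.array([[1, 0, 0, 0],                  # exchanges the states of two qubits
                     [0, 0, 1, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1]])

    psi = np.array([np.sqrt(0.8), np.sqrt(0.2)])    # arbitrary "processing" qubit state
    memory = np.array([1.0, 0.0])                   # "memory" qubit starts as 0

    before = np.kron(psi, memory)                   # joint state: processing first, memory second
    after = SWAP @ before

    # The memory qubit now carries psi and the processing qubit is left at 0.
    print(after.reshape(2, 2))                      # rows: processing qubit, columns: memory qubit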

Whatever the hardware involved, quantum storage has the potential for greater information density than can be achieved with binary systems and it would be faster to search and access.

This is a possible way to get past the imminent collapse of Moore’s Law, at least as it applies to silicon-based hardware.

There are a number of “escape routes” being worked on: carbon-based electronics (graphene and nanotubes), DNA computing and various nanotechnologies. These may well be deployed first in binary systems, but they will in all probability be components of future quantum machines as well.

Another technology aspect is that quantum machines don’t just need cooling, they need cooling to near absolute zero. This is one way to deal with quantum decoherence. If a qubit loses its state, it cannot be recovered. There are several ways this can happen, but the obvious one is exposure to heat. Coincidentally, superconducting quantum interference devices (SQUIDs) are themselves part of the quantum computing hardware possibilities.
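A toy model gives a feel for decoherence. In the sketch below (illustrative only, with an assumed coherence time of 100 microseconds), the off-diagonal terms of a qubit’s density matrix, which are what carry the superposition, decay away exponentially; once they are gone the qubit is just a noisy classical bit.

    # Pure dephasing: the superposition fades with a characteristic time T2.
    import numpy as np

    plus = np.array([1, 1]) / np.sqrt(2)            # equal superposition of 0 and 1
    rho0 = np.outer(plus, plus)                     # its density matrix

    T2 = 100e-6                                     # assumed coherence time: 100 microseconds
    for t in (0.0, 50e-6, 200e-6, 1e-3):
        rho = rho0.copy()
        rho[0, 1] *= np.exp(-t / T2)                # off-diagonal ("coherence") terms decay
        rho[1, 0] *= np.exp(-t / T2)
        print(f"t = {t * 1e6:7.1f} us  coherence = {rho[0, 1]:.3f}")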

This is not as scary as it sounds. Semiconductors had the same problem with data corruption, as did the various forms of magnetic storage technology that we still use. Tweaking the engineering of the hardware and building safety procedures into the controlling software resolved that for all but the extreme cases – systems can still be taken down by cooling system failures, solar flares or cosmic ray impacts. Quantum computing will be as reliable and secure. The list of necessary technologies to make it so is rapidly being ticked off.

There are two key points to quantum storage. Firstly, the maths behind quantum computing allows for storage that is more efficient by orders of magnitude. Secondly, quantum computing depends on hardware that is, almost by definition, new nanotechnology with scalability beyond what we can achieve with silicon.
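On the first point, a back-of-the-envelope sketch shows where the orders of magnitude come from: writing down the general state of n qubits classically takes 2 to the power n complex amplitudes, so the classical description doubles with every qubit added. (This is the state-space argument, not a design for an actual quantum memory.)

    # How quickly the classical description of n qubits grows.
    for n in (30, 40, 50):
        amplitudes = 2 ** n                         # one complex amplitude per basis state
        gib = amplitudes * 16 / 2 ** 30             # 16 bytes per complex number
        print(f"{n} qubits -> {amplitudes:.2e} amplitudes, about {gib:,.0f} GiB to store classically")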

Apart from Moore’s Law and what happens when it reaches its limit, we face the problem of expanding storage needs. In business, the amount of information stored, as required by legislation around the world, represents an enormous overhead, whether it is stored in-house or outsourced. In engineering and science, the amount of storage required for complex problems can literally increase exponentially.

The future? 

Yes, quantum computing is still new, still in development – and any discussion of its future will have a question mark attached.

Because it is based on the mathematics of probability, a quantum machine is very good at solving certain classes of problem that cannot practically be solved by conventional machines. This doesn’t mean that we will be working with “qWord” documents in the future – many everyday computer uses will continue to depend on binary systems, either because of legacy issues or just because it works better for the simple stuff. But future Word documents will probably be stored and searched in quantum systems.

For specialised science, practical quantum computing is probably five years away. Specialised business applications will take a bit longer. It might be ten years before quantum computing is commonplace. Even then, it will be invisible to the end user, running data centres, server farms and, probably, a significant amount of internet infrastructure.

Don’t expect to have a supercooled desktop on your desk any time soon.

The ultimate potential is unguessable – we just don’t know until we start using it and developing it further.

But, to get very blue-sky for a moment, stop and think about what it says above – quantum machines can solve problems that binary machines will also solve, but only given impossible amounts of time for the processing. Now think about AI.

Artificial intelligence has been an elusive breakthrough for 60 years. It is absolutely the Holy Grail of computer science. As happened with steam-powered railway engines, electricity supply grids, nuclear technology or travelling to the moon, there are many who say it can’t be done.

There are many more who think it can be done and will be done. And quantum computing might offer the key. Its ability to handle complexity recommends it as a new approach to the fuzzy logic and complicated interconnection models that we know are the basis of natural intelligence – the basic way our brains work. Our brains are not built with client/server architecture – nor do they function in binary code.

People have built very basic AI systems – not quite as smart as a bacterium – by modelling them in binary. Anything beyond that spends so much time coming up with an answer it just doesn’t work.

So, apart from the key issue of building software that can edit and modify its own code without ending up in a total mess, we need a whole new level of computing to get closer to AI. Quantum systems might be a big step in this direction.

For the rest of us, quantum computing will generally remain an invisible, enabling technology. For highly specialised users in science, engineering and complex financial analyses, it will become essential.

It also goes hand-in-hand with some radical innovations in hardware and technology that will have a huge impact on communication systems and control systems of almost any kind as these breakthroughs move from the laboratory to the IT world and beyond.

EngineerIT
