Quantum Computing- II
Which technology to choose?

26th November 2017


"The quantum phenomena does not occur in  Hilbert space. They occur in a laboratory."

- Asher Peres, quoted in "Topological quantum computation", Sankar Das Sarma et al., Physics Today, July 2006.


"As you will see, the entry fee is pretty steep,
which provides at least one good reason why all sorts of people aren’t already
putting together their quantum processors."

- D. P. DiVincenzo


    The limitations of existing digital technology in performing quantum computations were discussed earlier.  It was clear that we had to think beyond the traditional silicon-based technologies used in a wide range of computational devices.  The immediate problem, then, is how to define the criteria a technology must satisfy to be considered suitable for quantum computing.

    In quantum computing, the classical bits or Cbits (either |0> or |1>) are replaced by quantum bits or Qubits, which can exist in a superposition of the states |0> and |1>; two Qubits together span the basis states |00>, |01>, |10>, and |11>.  Quantum gates acting on Qubits have the important property that their corresponding matrices must be unitary, i.e. they must conserve total probability.  What this means is that as a calculation progresses in a quantum computer, the probability amplitudes assigned to its various internal states must not "leak" or change their values due to interaction with the laboratory environment.  In other words, the physical systems storing the Qubits must not interact with external environmental factors before the measurement on the final state of the quantum computer is made.  It is estimated that at room temperature (300 K) and at macroscopic scales, quantum coherence is destroyed in about 10⁻²³ seconds.  This is the problem of "decoherence" affecting the realization of quantum computing.
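The unitarity requirement above can be made concrete with a small sketch (illustrative only, not drawn from the original text): a quantum gate is a matrix U satisfying U†U = I, so applying it to a qubit state leaves the total probability unchanged. The Hadamard gate is used here as a standard example.

```python
import numpy as np

# Hadamard gate: maps |0> and |1> into equal superpositions.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

# Unitarity check: U-dagger times U must be the identity matrix.
assert np.allclose(H.conj().T @ H, np.eye(2))

# Apply H to |0>: the squared amplitudes of the resulting state
# still sum to 1, i.e. the probability space is conserved.
ket0 = np.array([1.0, 0.0])
state = H @ ket0
print(np.sum(np.abs(state) ** 2))  # 1.0
```

Decoherence, in this picture, is precisely what breaks the assumption of purely unitary evolution: uncontrolled interaction with the environment changes the amplitudes in a non-unitary way.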

    In a carefully controlled laboratory environment, with ultra-low temperature, ultra-high vacuum and magnetic shielding, the decoherence time can be improved at the atomic scale.  Note that by mentioning the atomic scale we are moving away from conventional digital technology as we know it.  The decoherence time for a Qubit in a quantum computer is characterized by the following parameters [1]:

  • The relaxation time, the average time the system takes to decay from |1> to |0>,
  • The phase coherence time, the average time over which the Qubit energy-level differences do not vary, and
  • The time required to execute one quantum gate.
Based on these parameters, the Coherence Quality Factor (CQF) can be calculated.  For a technology to be viable for building Qubits, the corresponding CQF value must be 10⁴ or higher.  The CQF values for the leading technologies are summarized below:
    Ion traps provide the highest CQF, but building a system with a large number of Qubits is a problem.  Photonic qubits (cavity-QED optical) have similar scalability issues despite their high CQF.  NMR-based systems have limited scalability, since the number of states that can be stored on each molecule is itself limited.
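A minimal sketch of the threshold just quoted, under the common reading that the CQF is the number of gate operations that fit within the coherence time. The numerical values below are illustrative placeholders, not measured figures for any particular technology.

```python
def coherence_quality_factor(coherence_time_s, gate_time_s):
    """Approximate number of quantum gates executable before decoherence."""
    return coherence_time_s / gate_time_s

# Hypothetical example: a 100 microsecond coherence time
# and a 10 nanosecond gate time.
cqf = coherence_quality_factor(100e-6, 10e-9)
print(cqf)          # 10000.0
print(cqf >= 1e4)   # True: meets the 10^4 viability threshold
```

Read this way, the threshold simply says a useful technology must complete on the order of ten thousand gate operations before its Qubits decohere.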

    The SQUID (Superconducting Quantum Interference Device) technology is currently used by IBM and D-Wave.  SQUID has the advantage that it uses most of the techniques and equipment available in microelectronics manufacturing.  Hence there is a smaller learning curve and the end product is reliable.  The drawback is a low CQF, and therefore stringent environmental controls are required.

    Quantum dots are based on the principle that if an electron gas is confined in three dimensions to within the Fermi wavelength (~nm) in a perfect lattice configuration in compound semiconductors, its energy levels are quantized.  The real advantage is that by changing the arrangement of atoms in the lattice, the energy levels can be precisely fine-tuned.  If the technology can be improved to increase the CQF values for quantum dots, they could provide an effective solution within existing solid-state technology.
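The quantization from nanometre-scale confinement can be illustrated with the textbook particle-in-a-box model, E_n = n²h²/(8mL²), used here as a simple stand-in for the discrete levels of a quantum dot (the 5 nm width is a hypothetical example, not a value from the text).

```python
# Particle-in-a-box sketch of quantum-dot energy quantization.
h = 6.62607015e-34      # Planck constant, J*s
m_e = 9.1093837015e-31  # electron mass, kg
eV = 1.602176634e-19    # joules per electron-volt

def energy_level(n, box_width_m):
    """Energy (in eV) of level n for an electron in a 1-D infinite well."""
    return n**2 * h**2 / (8 * m_e * box_width_m**2) / eV

# A ~5 nm confinement width: the levels are discrete and scale as n^2,
# and shrinking the width raises every level, which is how geometry
# tunes the spectrum.
for n in (1, 2, 3):
    print(n, round(energy_level(n, 5e-9), 3))
```

For a real dot the confinement potential and the material's effective mass modify the numbers, but the qualitative point stands: geometry at the nanometre scale sets the energy spectrum.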

    The technologies discussed so far have their limitations as far as true quantum computing is concerned.  They provide probabilistic solutions, i.e. the same problem is run multiple times to get multiple answers that fall within a certain range, and we then pick the most probable solution.  However, this is still a high-entropy procedure, since we have to make multiple measurements to obtain the most probable answer.  Ideally we want a single measurement, a zero-entropy procedure, and a deterministic solution.  How do we accomplish this rather idiosyncratic objective?  To do so we must move beyond relativistic limits and eliminate the time axis completely.  We will discuss some topological concepts in j-space next.
1. Quantum Computing Devices: Principles, Designs and Analysis, by Goong Chen, David A. Church, Berthold-Georg Englert, Carsten Henkel, Bernd Rohwedder, Marlan O. Scully, and M. Suhail Zubairy.
To be continued..



Information on www.ijspace.org is licensed under a Creative Commons Attribution 4.0 International License.