Quantum Computing
Often mistaken for the next iteration of binary computers, quantum computing is a fundamentally different method of computation with its own use cases. Since the rise of the personal computer in the late 1970s, classical machines have commonly depended on components including a mainboard, central processing unit (CPU), dedicated graphics processing unit (GPU) where present, random access memory (RAM), long-term storage (HDD, SSD, etc.), input/output devices, a power supply, and an enclosure [1]. While each has been developed and refined over time, together they allow code to execute the calculations needed to process and display information.
As the name implies, quantum computing exploits the principles of superposition, entanglement, decoherence, and interference, which requires vastly different technology to function. A qubit, somewhat like a combination of a binary bit and Schrodinger's cat, is not restricted to a fixed one or zero: it can exist in a superposition of both states at once, which lets a collection of qubits represent an exponentially large space of possibilities [2]. A measurement, however, still yields only a single classical answer; quantum algorithms work by steering interference so that the most likely outcomes correspond to correct results.
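The idea that a superposition still yields only a single classical answer can be sketched in plain Python. This is a minimal illustration, not a quantum simulator: the function `measure` and its amplitudes are hypothetical names chosen here, and the only physics assumed is the Born rule (reading 0 with probability |alpha|² and 1 with probability |beta|²).

```python
import math
import random

def measure(alpha: complex, beta: complex, rng: random.Random) -> int:
    """Collapse a qubit state alpha|0> + beta|1> to one classical bit.

    The probability of reading 0 is |alpha|^2 and of reading 1 is
    |beta|^2; a normalized state has these summing to 1.
    """
    p0 = abs(alpha) ** 2
    return 0 if rng.random() < p0 else 1

# An equal superposition: both amplitudes are 1/sqrt(2).
alpha = beta = 1 / math.sqrt(2)

rng = random.Random(42)  # fixed seed so the experiment is repeatable
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta, rng)] += 1

print(counts)  # roughly an even split between 0 and 1
```

Each individual run returns one fixed bit; the superposition only shows up statistically, across many repetitions.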
With respect to physical differences, classical computers execute tasks on CPU cores within a processor architecture, while quantum computers use various types of qubits, including superconducting qubits, trapped ions, and quantum dots. Superconducting qubits are the most common, but they require the most controlled environment, operating at extremely cold temperatures with only short windows of useful coherence. Trapped-ion qubits, by contrast, can function in less demanding environments, stay coherent longer, and give more precise results, but they suffer from constrained scalability. (Qubits and CPU cores follow the same calculation principle in that adding more allows additional simultaneous calculations.) Lastly, quantum dots are small-scale semiconductors that use the state of an electron as a qubit and can be manufactured at scale for lower cost because they share fabrication technology with modern solar research. (In a solar cell, rather than using the electron's state for calculation, the electron moves across junctions using energy absorbed from sunlight, thereby generating usable electricity.)
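The parenthetical comparison between qubits and CPU cores can be made concrete: describing n qubits classically takes 2ⁿ complex amplitudes, which is why each added qubit matters so much. Below is a minimal sketch under that standard assumption; `equal_superposition` is a name invented here for illustration.

```python
import math

def equal_superposition(num_qubits: int) -> list[float]:
    """State vector for n qubits all placed in equal superposition.

    The vector holds one amplitude per basis state, so its length is
    2**n -- the exponential growth that makes qubits hard to simulate.
    """
    dim = 2 ** num_qubits
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

for n in (1, 2, 10, 20):
    state = equal_superposition(n)
    total = sum(a * a for a in state)  # probabilities must sum to 1
    print(f"{n} qubits: {len(state)} amplitudes, norm {total:.3f}")
```

Adding one CPU core adds a fixed amount of throughput; adding one qubit doubles the size of the state being manipulated.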
Due to current design limitations, operational requirements, computing specialties, manufacturing costs, and feasibility, mobile devices have no present use for qubit processors, though silicon will eventually reach its own limits. For now, technology manufacturers such as Apple, Google, and Microsoft should focus on quantum-resistant encryption that works with established digital financial service standards rather than on shipping quantum products.
Looking past the challenges that accompany every technological breakthrough, it is important to do good with what is given while working to prevent the worst. Used correctly, quantum computers could enable previously impossible advances in materials science, medicine, climate restoration, and more.
[1] L. Morgan, "The Complete Computer Processor History." https://www.hardwarecentral.com/processor-history/ (accessed January 24, 2025).
[2] J. Schneider and I. Smalley, "What is quantum computing?" https://www.ibm.com/think/topics/quantum-computing (accessed January 25, 2025).