
What Is Quantum Computing And Why Are We Waiting For A Computational Revolution?

Quantum computing can be described as a gateway to a world in which the most complex processes are performed in a fraction of a second. It is a great revolution that may still be some way off, but we are approaching it fast!

Fortunately, we are at a point in history when technology is more in the spotlight than ever before, and billions of people worldwide deal directly and indirectly with fascinating, sophisticated, and constantly evolving technologies. We have already witnessed several major milestones in this field, such as the personal computer and smartphone revolutions, which completely changed how people work and live. Yet these achievements may turn out to be merely the prelude to a larger revolution called quantum computing.

In the 1950s, people only had access to giant computers housed in large, specially ventilated rooms. In the late 1970s and 1980s, the microcomputer revolution brought computers into people’s homes, and by the 1990s people were carrying laptops in their bags. We now carry smartphones in our pockets that are thousands of times faster than those original computers. However, it seems this cycle is about to stall: despite years of significant progress and the creation of a modern and exciting era, classical computers face problems they are simply unable to solve. This is exactly where quantum computing comes into play.

One of the most important limitations of classical computers is how far transistors can keep shrinking. Chipmakers have already reduced transistors to nearly the size of atoms, a dramatic and almost unbelievable achievement; to put it in perspective, billions of transistors can now be packed onto a coin-sized piece of silicon. But even though a classical computer helps us do amazing things, under the hood it is really just a calculator operating on sequences of bits.

How Do Conventional Computers Work?


There are probably billions of people worldwide dealing with all kinds of computers every day, from smartphones to supercomputers. Simply put, today’s computers are useful devices that let users send emails, shop online, interact with friends on social networks, or immerse themselves in the rich world of gaming. Still, as mentioned earlier, these advanced machines, with their multi-dozen-core processors, powerful graphics cards, and so on, contain at their core a tired, old-fashioned calculator that dates back decades and follows a set of predefined instructions called a program.

Today’s computers can be likened to a magic trick that performs strange and unbelievable feats in front of an audience, while in reality the magician relies on a few basic principles hidden up a sleeve or under a hat. Conventional computers have two basic tricks that underpin everything they do: they can store numbers in memory, and they can process stored numbers with simple mathematical operations (such as addition and subtraction). They can perform more complex tasks by chaining simple operations together into a sequence called an algorithm. Both key tricks of classical computers, storage and processing, are carried out using switches called transistors.

Even the most advanced computers today are calculators, and their computing power is approaching saturation.

Today’s computers use a processing unit called the bit. A bit is a stream of electrical or optical pulses representing a 1 or a 0. All our digital activity, from tweets and emails to songs and video content, is essentially a long string of these binary digits. Transistors are responsible for storing and processing these binary numbers. These tiny switches can be compared to everyday light switches that are either on or off: the on state stores a binary 1 and the off state stores a binary 0.

In other words, each zero or one is a binary digit, and an eight-bit string can represent 256 different values, enough to encode characters such as A-Z, a-z, 0-9, and the most common symbols. Computers calculate using circuits called logic gates, each made up of several connected transistors. Logic gates compare patterns of bits stored in temporary memory called registers and convert them into new patterns of bits, the equivalent of what our brain calls addition, subtraction, or multiplication. Physically, an algorithm that performs a specific calculation takes the form of an electronic circuit made up of many logic gates, with the output of one gate feeding into the input of the next.
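
To make the idea of chaining gates into a calculation concrete, here is a minimal Python sketch (purely illustrative, not tied to any real chip) of a one-bit half adder built from AND and XOR gates, the same kind of building block that real circuits assemble from transistors:

```python
# Minimal sketch: logic gates operating on bits, combined into a half adder.
# Purely illustrative; in real hardware the gates are built from transistors.

def AND(a: int, b: int) -> int:
    """Output 1 only when both input bits are 1."""
    return a & b

def XOR(a: int, b: int) -> int:
    """Output 1 when exactly one input bit is 1."""
    return a ^ b

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two single bits: XOR gives the sum bit, AND gives the carry bit."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={carry}")
```

Wiring many such gates together, with outputs feeding the next inputs, is all that "running an algorithm" physically means inside a classical processor.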

What has been said may sound impressive, but this is also where the computing power of conventional computers approaches saturation. The more information we need to store, the more bits and transistors we need; transistors therefore play a central role in today’s computers, and we are approaching the limit of how small they can be made.

Currently, chipmakers such as TSMC are researching and developing one-nanometer-class chips. The main challenge for chip manufacturers is finding the right transistor structure and the right transistor material. Meanwhile, the contacts that deliver power to a transistor are critical to its operation. With most of the technologies used in the semiconductor industry, shrinking these contacts increases their resistance and limits performance; TSMC and other chipmakers must therefore find a contact material that has very little resistance, carries large currents, and, most importantly, is cost-effective for mass production. This may become the industry’s Achilles heel in the coming years.

In general, we are approaching the limits of energy efficiency using classical methods, and according to a report from the Semiconductor Industry Association, by 2040 we will no longer have the energy needed to power all the computing machines around the world. This is exactly why the computer industry is trying to make quantum computers work on a commercial scale. Creating useful quantum computers will not be easy at all, but the result could take the computing world to new heights.

What is quantum computing?

Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles within them. Quantum computing is based on principles that scientists have observed for years in nature’s smallest particles, such as atoms, photons, and electrons. "Quantum computing is our way of imitating nature to solve challenging but solvable problems," says Bob Sutor, IBM’s head of quantum technology.

You might think that atoms are simply miniature copies of the everyday objects around them and behave exactly the same way, but you would be wrong. At the atomic scale the rules change, and many of the classical laws of physics we take for granted in our everyday world no longer apply or even have meaning. The quantum world is at once very strange and oddly familiar, and it has confused and fascinated many scientists.

At present, many quantum principles remain poorly understood, but it can be said with certainty that the field has enormous potential to transform many industries. Successfully controlling these particles in a quantum computer could cause computing power to explode, driving innovation in areas that require complex computation, such as drug discovery, climate modeling, and financial optimization.

What is a quantum computer?

The basic idea behind quantum computers is that the properties and laws of quantum physics can be used to store data and perform operations on it. Quantum computers rely on the quantum properties of subatomic particles to perform some calculations faster than classical computers. In traditional computers, calculations are performed in base 2, with each bit either on or off. The unit of quantum information, the qubit, can instead exploit quantum properties to be on, off, or in a blend of both at once, a condition called superposition (explained in detail below).

The main difference between a classical supercomputer and a quantum computer is that the latter uses some of the properties of quantum mechanics to manipulate data in ways that defy intuition.

The main principles behind quantum computers were laid out about four decades ago, when the American physicist Paul Benioff described a theoretical model of computation built on elements of quantum mechanics. Many researchers have devoted R&D and theorizing to the idea since then, though parts of the scientific community regard David Deutsch as the real founder and driving force of quantum computer research.

Researchers such as Richard Feynman, David Deutsch, and Yuri Manin developed the idea of a quantum-mechanical model of a Turing machine and argued that a quantum computer could simulate physical systems that a classical computer cannot easily simulate. In 1994, for example, Daniel Simon showed that a quantum computer could be exponentially faster than a classical computer on a specific problem. Quantum computers are not intended to replace classical computers; they are different, expensive, and very complex tools that will help us solve problems beyond a classical computer’s capabilities.

Basically, as we enter the big-data world, the amount of information to be stored keeps growing, and ever more bits and transistors are needed to process it. In most cases, classical computers are limited to one task at a time, so the more complex the problem, the more time processing takes; a problem that requires more power and time than today’s computers can provide is called an intractable problem. These are the problems quantum computers are expected to solve by exploiting their strange and very unusual properties. To better understand how quantum computers work, it helps to first become familiar with two fundamental quantum phenomena: superposition and entanglement.

What is quantum superposition?

Qubits are the basic elements from which quantum computers derive their processing power. In fact, qubits are the quantum counterparts of the bits in traditional computers introduced in the previous sections. One of the big differences between traditional and quantum computers has to do with the concept of superposition. A classical bit must be in state A or state B (one or zero, in binary terms); a qubit, on the other hand, can be in a combination of the two.

People who work with optics and light beams may already have some intuition for this, because a beam of light sometimes behaves as if it were made of particles, like a continuous stream of tiny cannonballs, and sometimes like waves of energy rippling through space. This is called "wave-particle duality" and is one of the ideas that comes from quantum theory. Grasping that something can be two things at once, both a particle and a wave, is difficult and completely alien to our everyday experience.

In our everyday world, objects seem to move continuously: water fills a tub gradually, and a rocket climbs smoothly into the sky. In the quantum realm, though, everything jumps and jitters. An electron bound to an atom can jump instantly from one orbit to another, effectively disappearing from the original orbit and reappearing in the new one. In the world we know, a car is not simultaneously a bicycle and a bus.

In quantum theory, however, exactly this kind of craziness can happen. The most famous example is the puzzling thought experiment known as "Schrödinger’s cat." In short, in the strange world of quantum theory we can imagine a situation in which a cat is alive and dead at the same time!

When you enter the world of atomic and subatomic particles, everything starts behaving unexpectedly: these particles can exist in more than one state at a time. This is the ability quantum computers exploit, and qubits are what bring it to them. To picture the difference, imagine a sphere. A bit (the processing unit of a conventional computer) can sit only at one of the sphere’s two poles, but a qubit (the processing unit of a quantum computer) can exist anywhere on its surface.

If that is still unclear, here is a simpler example. In the everyday world, a skateboarder can be in only one place at a time, say the left side of a ramp (zero) or the right side (one); superposition, however, lets the skater behave like an atom and sit somewhere between zero and one, effectively present in both places at once.

In general, superposition means that a computer using qubits can store vast amounts of information while using less energy than a conventional computer. By entering a realm of computation that no longer obeys the restrictive laws of classical physics, we could in principle build processors that, for certain problems, are more than a million times faster than today’s.
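
As a rough illustration of what "anywhere on the sphere" means, the following toy simulation in plain NumPy (not a real quantum device or any particular quantum SDK) represents one qubit as a pair of complex amplitudes and shows that an equal superposition only resolves to 0 or 1, with 50/50 odds, at the moment it is measured:

```python
import numpy as np

# Toy model of a single qubit as a 2-element complex state vector:
# |psi> = alpha*|0> + beta*|1>, with |alpha|^2 + |beta|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)   # definitely "0"
ket1 = np.array([0, 1], dtype=complex)   # definitely "1"
plus = (ket0 + ket1) / np.sqrt(2)        # equal superposition of 0 and 1

def measure(state: np.ndarray, shots: int = 1000) -> dict:
    """Sample measurement outcomes; probabilities are the squared amplitudes."""
    probs = np.abs(state) ** 2
    outcomes = np.random.choice([0, 1], size=shots, p=probs)
    return {0: int(np.sum(outcomes == 0)), 1: int(np.sum(outcomes == 1))}

print(measure(ket0))   # always 0
print(measure(plus))   # roughly half 0s and half 1s
```

The point of the sketch is only this: between measurements the qubit carries both amplitudes at once, which is what the sphere picture above is describing.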

What is quantum entanglement?

Another feature of quantum computers is the phenomenon of entanglement, perhaps the strangest and most unsettling feature of quantum mechanics. It is a kind of quantum connection between qubits. Imagine that one ion acts as a qubit and sits in the superposition of zero and one described above, and that you entangle it with a second ion. The two entangled ions will then maintain a special relationship: changes to one ion affect the other, and this remains true even when the ions are separated by enormous distances, such as the distance between two planets.

Our everyday assumption is that objects have their own definite properties and that changing one object cannot instantly affect another. Quantum entanglement, which Albert Einstein famously dismissed as "spooky action at a distance," overturns this assumption, and the evidence for it is now overwhelming. Entanglement is a phenomenon observed at the quantum scale in which entangled particles are connected in such a way that actions performed on one particle affect the other, regardless of the distance between them.

No one really knows why or how entanglement works, so we will not confuse you with more complicated textbook explanations. It is enough to know that doubling the number of classical bits merely doubles their processing power, whereas thanks to entanglement, each extra qubit added to a quantum computer increases its power exponentially. Quantum computers then link their qubits in a kind of quantum daisy chain to perform their seemingly impossible magic.
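
To see the "special relationship" between entangled qubits in action, here is another small NumPy sketch (again a toy simulation, not real hardware) of the two-qubit Bell state: each qubit on its own looks like a fair coin flip, yet the two measurement results always agree:

```python
import numpy as np

# Toy model of the Bell state (|00> + |11>) / sqrt(2).
# The 4 amplitudes are ordered as the joint outcomes |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

labels = ["00", "01", "10", "11"]
probs = np.abs(bell) ** 2              # probability of each joint outcome

shots = 1000
samples = np.random.choice(labels, size=shots, p=probs)

# Each qubit by itself is random (about half the time it reads 0)...
print("first qubit read 0:", sum(s[0] == "0" for s in samples), "times")
# ...but the two results never disagree: only "00" and "11" ever occur.
print("mismatched pairs  :", sum(s[0] != s[1] for s in samples))
```

This perfect correlation between individually random outcomes is exactly the behavior that no arrangement of ordinary, independent bits can reproduce.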

How does a quantum computer work?

In the previous sections we covered half of how quantum computers work (superposition and entanglement); but if you had to describe a quantum computer, what would come to mind? You might picture an ordinary computer that is simply bigger and hides some mysterious physics inside; but forget about laptops, desktops, and supercomputers. A quantum computer is fundamentally different in appearance and, most importantly, in how it processes information.

There are currently several ways to build a quantum computer and its most important component, the qubit, and companies are experimenting by trial and error to find a better formula than their competitors. Google and IBM, for example, use superconducting circuits cooled to near absolute zero. Honeywell, on the other hand, builds trapped-ion qubits from electrically charged atoms. Intel’s qubits are single electrons distinguished by their quantum-mechanical properties, and Xanadu uses photons for its qubits, so its quantum processors operate at room temperature. Let’s start with one of the most prominent designs, superconducting circuits, to explain how these machines work.

Imagine a string of light bulbs hanging upside down that is also the most intricate object you have ever seen. Instead of thin strands of wire, bundles of silver cables are neatly woven around a central core. They are arranged in layers that narrow toward the bottom, and golden plates divide the structure into sections.

This assembly is nicknamed the "chandelier" because of the undeniable resemblance. The extreme refrigerator it forms uses a special liquid-helium mixture to cool the computer’s quantum chip to near absolute zero (the coldest temperature theoretically possible). At such low temperatures, the tiny superconducting circuits in the chip acquire their quantum properties, the very properties used to perform computational tasks that are practically impossible on classical computers. Quantum devices often look nothing like their older counterparts, but with one familiar exception: at the core of some of the most advanced quantum computers there is still a chip. The difference is that it is made not of silicon but of superconducting materials.

Superconductors are not exotic materials: aluminum is one of the most important, and niobium, a rarer metal, is another commonly used one. What makes superconductors special is that, when cooled below a certain temperature, they conduct electricity without any resistance and thus waste far less energy; this matters enormously in a world that needs to reduce energy consumption more than ever.

You may be wondering why superconductors are a good candidate for quantum technologies. The answer is fairly predictable: superconductivity is itself a macroscopic quantum phenomenon. The charge carriers in a superconductor first pair up and then condense into a single quantum state, as if they formed one large atom. Using small junctions between superconductors, known as Josephson junctions, researchers can engineer a wide variety of quantum circuits and run quantum algorithms on them.

Superconductors thus reveal quantum properties at the scale of everyday objects, which makes them attractive candidates for building computers that can perform some tasks better than today’s best supercomputers. As a result, leading technology companies such as IBM, Google, and Microsoft are pushing to build quantum computers at industrial scale using superconductors.

As noted earlier, traditional computer processors work in binary: the billions of transistors that handle the information on your laptop or smartphone are either on (1) or off (0). Using sets of circuits called logic gates, computers perform logical operations based on the state of those switches. Classical computers are designed to follow rigid, inflexible rules, which makes them very reliable but also ill-suited to certain problems. Their quantum counterparts, by substituting qubits for conventional bits, can overcome these limitations and bring more flexibility to computation.

If you ask a typical computer to find its way out of a maze, it will test each branch and path in turn, ruling them out one by one until it finds the right route. A quantum computer, however, can in effect travel all the promising paths in parallel at once, a capability that stems from superposition.

The classic way to demonstrate quantum mechanics is to shine light through a barrier with two slits. Some light passes through the upper slit and some through the lower, and the light waves interfere with each other to create an interference pattern. Now dim the light until individual photons are fired one at a time. Logically, each photon should pass through a single slit and have nothing to interfere with; yet an interference pattern still builds up.

This is quantum mechanics at work: until a photon is detected on the screen, it is in a superposition, as if it were traveling every possible path at once, and only when it is measured does the superposition collapse and a single dot appear. Properties such as entanglement allow scientists to put many qubits into a joint quantum state without even touching them, and while each individual qubit is a combination of just two states, the number of states the whole register can represent grows exponentially as qubits are added.
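
That exponential growth can be made concrete: describing the state of n qubits classically requires 2^n complex amplitudes. The back-of-the-envelope sketch below assumes 16 bytes per amplitude (an assumption chosen purely for illustration) and shows how quickly a classical simulation runs out of memory:

```python
# How much classical memory it takes just to write down the state of n qubits,
# assuming one complex amplitude = 16 bytes (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16

for n in (1, 2, 10, 30, 50):
    amplitudes = 2 ** n
    size_bytes = amplitudes * BYTES_PER_AMPLITUDE
    print(f"{n:>2} qubits -> {amplitudes:>20,} amplitudes "
          f"(~{size_bytes / 1e9:,.3f} GB to store classically)")
```

Around 30 qubits the state no longer fits in a typical laptop’s memory, and by 50 qubits it would take tens of petabytes, which is roughly why even modest quantum processors are so hard to simulate.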

Making quantum computers

If you’ve been following this article, you’ve probably noticed the enormous potential of quantum computing and the role it could play in the future of computing. However, this is just the beginning, and putting these ideas to work in real environments and situations remains a major challenge. In other words, despite the exciting proofs of concept, it is fair to say that the world will not adopt this technology any time soon and that we are still far from realizing the enormous potential and power of quantum computing.

The power of a quantum computer lies in the fact that the system can be placed in a superposition of an enormous number of states. This fact is sometimes used to argue that it is impossible to build or control a quantum computer, the argument being that the number of parameters required to describe its state is simply too vast. Controlling a quantum computer and ensuring that its state is not corrupted by various sources of error is indeed an engineering challenge, but the problem lies not in the complexity of the quantum state itself; it lies in ensuring that the control signals do what they are supposed to do and that the qubits behave as expected.

Conventional computers based on transistors and familiar architectures have been built for decades; in other words, we have accumulated deep expertise in building and developing these processing machines. Building quantum machines, on the other hand, means reinventing the whole idea of the computer from scratch. Naturally, this brings many problems, such as making qubits more durable and controlling them precisely enough to do useful work. Then there is the big problem of the errors inherent in any quantum system, technically called "noise."

Environmental noise seriously jeopardizes any quantum computation. There are ways to deal with these problems, such as error correction, but they are usually complicated in their own right. There is also the fundamental issue of getting data into and out of the quantum computer, which is itself a hard computational problem. Some critics argue that these issues can never be resolved, while others acknowledge the problems but believe they can be solved over time.

What is quantum decoherence?

The interaction of qubits with their environment in a way that degrades them until they eventually lose their quantum behavior is called "quantum decoherence." The quantum state of qubits is very fragile, and the slightest vibration or change in temperature, disturbances known as quantum noise, can render them useless. That is why researchers go to such lengths to protect qubits inside ultra-cold refrigerators and vacuum chambers.

Despite steady progress toward low-error quantum computing, researchers have not yet managed to eliminate the errors of two-qubit gates, one of the building blocks of quantum computation, and numerous problems persist; hence they turn to error correction to deal with this noise. Quantum error correction is essential for achieving fault-tolerant quantum computation: it can deal not only with noise in stored quantum information but also with faulty quantum gates, faulty state preparation, and faulty measurements.

In other words, because truly isolating a quantum system is so difficult, quantum error correction schemes have been developed. Qubits are not digital data bits, so conventional error correction methods, such as triple redundancy, cannot be used directly. Instead, scientists typically use clever quantum algorithms and add extra qubits to correct errors in quantum computers.

In traditional computers, such problems are often handled by adding parity bits. A parity bit is a single extra bit appended to a binary string; it records whether the number of 1s in the string is even or odd, giving a simple way to check for errors later. This method does not work in quantum computers, because qubits are fundamentally different and measuring them to check for errors destroys the data. Previous research has suggested that one possible solution is to group physical qubits into clusters that act as so-called logical qubits.
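
For comparison, this is what the classical parity trick described above looks like in code, a deliberately simple even-parity check that can detect a single flipped bit but cannot say which bit flipped or catch two simultaneous flips:

```python
def add_parity_bit(bits: str) -> str:
    """Append an even-parity bit so the total number of 1s becomes even."""
    parity = bits.count("1") % 2
    return bits + str(parity)

def check_parity(word: str) -> bool:
    """Return True if the word still contains an even number of 1s."""
    return word.count("1") % 2 == 0

word = add_parity_bit("1011001")
print(word, check_parity(word))            # no error detected

corrupted = word[:3] + ("0" if word[3] == "1" else "1") + word[4:]  # flip one bit
print(corrupted, check_parity(corrupted))  # single-bit error detected
```

The reason this exact trick fails for qubits is the measurement step: reading a qubit to count its "1s" collapses the very superposition the computation depends on, which is why quantum error correction has to infer errors indirectly through extra qubits.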

However, thousands of physical qubits are likely needed to create one of these integrated, highly reliable units known as logical qubits, and that overhead consumes much of a quantum computer’s computing power. A logical qubit is a physical or abstract qubit that behaves as specified by a quantum algorithm or quantum circuit and maintains coherence long enough for quantum logic gates to act on it. A single logical qubit may require on the order of a thousand physical qubits, and important quantum computations, such as Shor’s algorithm for breaking current encryption, require thousands of logical qubits.

In fact, addressing the error mechanisms is like peeling an onion, one layer after another. Using logical qubits consumes much of a quantum computer’s capacity, the extra qubits added for correction are themselves prone to error, and as more are added the problems multiply. Another point is that researchers have so far not managed to produce machines with more than roughly 128 physical qubits, so we are still many years away from genuinely useful quantum computers.

Quantum computer applications

One of the most promising applications of quantum computers is simulating the behavior of matter down to the molecular level. Automakers such as Volkswagen and Daimler are using quantum computers to simulate the chemical composition of electric-vehicle batteries in search of new ways to improve their performance. Pharmaceutical companies, meanwhile, use them to analyze and compare compounds that could lead to new drugs.

These machines are also well suited to optimization problems, because they can rapidly sift through a vast number of potential solutions. Airbus, for example, uses quantum computers to calculate the most cost-effective take-off and landing routes, and Volkswagen has unveiled a service that calculates optimal routes for buses and taxis in cities to minimize congestion. Some researchers also believe these machines could be used to accelerate artificial intelligence.

In general, it will take several more years for quantum computers to reach their full potential. Universities and businesses working on them face a shortage of skilled researchers in the field and a shortage of suppliers for some key components. But if these strange computing machines live up to their promise, they could transform entire industries and supercharge global innovation. This is why many governments and technology companies are striving for quantum supremacy.

What is quantum supremacy?

Quantum supremacy is the milestone at which a quantum computer completes a calculation that lies significantly beyond the capabilities of even the most powerful supercomputers. It is not yet clear exactly how many qubits are needed to achieve this, because researchers keep finding new algorithms that boost the performance of classical computers, and classical hardware continues to improve. For many years, quantum computing was developed exclusively by scientists in the laboratory, but recent advances are pushing this revolutionary technology toward practical applications. Achievements such as stronger cooling systems, more advanced chips, increased processing capacity, and improvements in error correction suggest that it may not be long before this type of computer becomes established in certain industries and businesses.

There is much debate about how important this milestone really is. Rather than waiting for a declaration of supremacy, companies are already experimenting with quantum computers made by IBM, Rigetti, and D-Wave, and Chinese companies such as Alibaba also offer access to quantum computers. Some businesses buy these machines outright; others prefer to use quantum computing through the cloud.

In this article we have tried to present the basics of quantum computing in an understandable way, and if you got a little lost in places, do not worry: you are definitely not alone. The whole field is still largely abstract and theoretical, but if its promises are fulfilled and its many challenges are overcome, quantum computing could spark a new renaissance across the future of industry: in how we do business, invent new drugs and materials, protect data, explore space, forecast the weather, and much more.

 
