Global Trend Radar
Web: www.ibm.com (US web search, 2026-05-02 01:59)

What is a qubit?

Original title: What is a qubit? - IBM


Analysis

Category
AI
Importance
72
Trend score
36
Summary
A qubit (quantum bit) is the basic unit of information in quantum computing, used to encode data. Whereas a classical bit holds a state of either 0 or 1, a qubit can, through the principle of superposition, hold both 0 and 1 at the same time. This property makes it possible for quantum computers to achieve far greater computational power than classical computers.
Keywords
What is a qubit? | IBM Quantum
By Josh Schneider, Ian Smalley
Published 28 February 2024, updated 02 April 2026

A qubit, or quantum bit, is the basic unit of information used to encode data in quantum computing. It is best understood as the quantum equivalent of the traditional bit used by classical computers to encode information in binary. The term "qubit" is attributed to American theoretical physicist Benjamin Schumacher.

Qubits are generally, although not exclusively, created by manipulating and measuring quantum particles (the smallest known building blocks of the physical universe), such as photons, electrons, trapped ions, superconducting circuits and atoms.

Enabled by the unique properties of quantum mechanics, quantum computers use qubits to store more data than traditional bits and to vastly improve cryptographic systems. They can also perform advanced computations that would take classical supercomputers thousands of years, or that would be impossible for them outright. Powered by qubits, quantum computers might soon prove pivotal in addressing many of humanity's greatest challenges, including cancer and other medical research, climate change, machine learning and artificial intelligence (AI).

Understanding quantum computing

Representing the next generation in computing power, quantum computing uses specialized technology, including computer hardware and algorithms that take advantage of the principles of quantum mechanics.
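The superposition idea introduced above can be sketched numerically: a qubit's state is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. The snippet below is a minimal plain-Python illustration, not anything from IBM's quantum stack; the equal-superposition amplitudes are chosen purely as an example.

```python
import math

# A qubit's state can be written as two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. A measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)  # equal superposition

p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
assert math.isclose(p0 + p1, 1.0)  # the state is normalized

print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # → P(0) = 0.50, P(1) = 0.50
```

Unlike a classical bit, which simply is 0 or 1, the information here lives in the continuous pair (alpha, beta) until a measurement collapses it to a single binary outcome.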
This capability enables it to solve complex problems that classical computers or supercomputers cannot solve, or cannot solve quickly enough.

Quantum computers, first conceptualized in the 1980s, have evolved significantly, advancing from theoretical ideas to real-world hardware implementations. Today, IBM Quantum® makes real quantum hardware, a tool scientists only began to imagine three decades ago, available to hundreds of thousands of developers.

When physicists and engineers encounter difficult problems, they turn to supercomputers. However, even supercomputers are binary-code-based machines reliant on 20th-century transistor technology, and they struggle to solve highly complex problems. These classical computers are also subject to material restrictions, such as overheating, which put hard limits on their ability to process information. There are some complex problems, such as modeling the individual atoms in a molecule, that we do not know how to solve with classical computers at any scale.

The laws of quantum mechanics dictate the order of the natural world. Computers that make calculations by using the quantum states of quantum bits should, in many situations, be our best tools for understanding and solving our most complex problems.

When studying quantum computers, it is important to understand that quantum mechanics is not like traditional physics. Describing the behavior of quantum particles presents a unique challenge, as most common-sense paradigms for the natural world lack the vocabulary to capture their seemingly counterintuitive behavior.
Qubits versus bits

There are many different types of bits and qubits, but all qubits must adhere to the laws of quantum physics and be able to exist in a quantum superposition. A classical bit can exist in either a 0 state or a 1 state. A qubit, however, can also occupy a third kind of state known as a superposition, which represents 0, 1 and all the weightings in between at once. Although qubits can encode this additional state, they are still used to convey information through a binary system. In such systems, the term bit can refer to either the material or the process used to represent a 0 or 1; it can also refer to the measurement of that bit, meaning whether it reads as a 0 or a 1.

Understanding bits

In traditional or classical computing, a single bit is a piece of binary information, notated as either a 0 or a 1. Modern computers typically represent bits as an electrical voltage or current pulse (or as the electrical state of a flip-flop circuit). In these systems, when no current is flowing, the circuit is considered off and the state is represented as a 0; when current is flowing, the circuit is considered on and the state is represented as a 1.

The term "bit" is a portmanteau of "binary digit," and binary bits are the foundation of all computing. Whether recording a digital video, animating a 3D model or operating a calculator, all data, from operating systems to applications, is built out of binary code, which is a collection of bits. A computer byte consists of eight bits, the minimum number of bits needed to convey a single text character in binary. Bits can be represented electrically, for example by running (or not running) current through a silicon chip. They can also be represented physically, as a hole or the absence of a hole in a piece of paper, as in antiquated punch-card computing.
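The byte-and-character claim above is easy to verify: eight bits are exactly what a classic ASCII character needs. A quick stdlib-Python check:

```python
# A byte is eight bits, the minimum needed to encode one text character.
char = "A"
bits = format(ord(char), "08b")  # code point of "A" as an 8-bit binary string
print(bits)       # → 01000001
print(len(bits))  # → 8

assert int(bits, 2) == 65  # the bit pattern round-trips to the code point of "A"
```

Each position in the string plays the same role as a physical bit: a 1 where current flows (or a hole is punched), a 0 where it does not.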
Any two-level system, meaning one whose state can be described as occupying only one of two possible positions (for example up or down, left or right, on or off), can be used to represent a bit.

Understanding qubits

While quantum technologies do use binary code, the quantum data derived from a quantum system, such as a qubit, encodes data differently from traditional bits, with a few remarkable advantages. Researchers have established various ways to either create qubits or use naturally occurring quantum systems as qubits. In nearly all instances, however, quantum computers require extreme refrigeration to isolate their qubits and prevent interference.

Theoretically, any two-level quantum system can be used to make a qubit. A quantum system is described as two-level when certain of its properties can be measured in binary positions, such as up or down. Multilevel quantum systems can also be used to create qubits when two aspects of the system can be effectively isolated to produce a binary measurement.

Just as traditional computers can realize bits in multiple physical forms, such as electrical current, electrical charge or holes punched (or not punched) in a punch card, quantum computers can use multiple types of qubits. Certain types are better suited to certain functions, and an advanced quantum computer will likely use a combination of qubit types to carry out different operations.

Because each bit can represent either a 0 or a 1, pairing two binary digits of information yields up to four unique combinations:

00
01
10
11

While each bit can be either a 0 or a 1, a single qubit can be a 0, a 1 or a superposition. A quantum superposition can be described as both 0 and 1, or as all the possible states between 0 and 1, because it encodes the probabilities of the qubit's measurement outcomes. On the quantum level, a qubit's state is described by a wave function.
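The contrast above can be made concrete: two classical bits enumerate exactly four joint values, while a single qubit carries a continuous pair of amplitudes. A small stdlib-Python sketch (the specific biased amplitudes are an arbitrary example, not from the article):

```python
import itertools
import math

# Two classical bits can take exactly four joint values.
combos = list(itertools.product([0, 1], repeat=2))
print(combos)  # → [(0, 0), (0, 1), (1, 0), (1, 1)]

# A single qubit, by contrast, carries a pair of amplitudes whose squared
# magnitudes give the probability of reading 0 or 1 on measurement.
alpha, beta = math.cos(math.pi / 8), math.sin(math.pi / 8)  # a biased superposition
assert math.isclose(abs(alpha) ** 2 + abs(beta) ** 2, 1.0)  # normalized state

print(f"P(0) = {abs(alpha) ** 2:.3f}, P(1) = {abs(beta) ** 2:.3f}")
```

The four classical combinations are discrete points; the qubit's (alpha, beta) pair ranges over a continuum, which is what the "all the possible states between 0 and 1" phrasing refers to.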
The probability amplitudes of a qubit can be used to encode more than one bit of data and, when combined with other qubits, to perform complex calculations. When processing a complex problem, such as factoring a large number into its prime components, traditional bits become bound up holding large quantities of information. Quantum bits behave differently: because qubits can hold a superposition, a quantum computer that uses qubits can work through a larger volume of data.

As a helpful analogy for understanding bits versus qubits, imagine you are standing in the center of a complicated maze. To escape, a traditional computer must "brute force" the problem, trying every possible combination of paths to find the exit, using bits to explore new paths and remember which ones are dead ends. A quantum computer, figuratively speaking, might instead derive a bird's-eye view of the maze at once, testing multiple paths simultaneously and revealing the correct solution. However, qubits do not literally "test multiple paths" at once. Instead, quantum computers measure the probability amplitudes of qubits to determine an outcome. Because these amplitudes behave like waves, they overlap and interfere with one another. When out-of-phase waves overlap, they cancel, effectively eliminating wrong answers to a complex problem, and the coherent wave or waves that remain present the solution.

What is quantum entanglement?
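The wave-like cancellation described above can be demonstrated with the smallest possible example: applying a Hadamard transform twice. The first application creates an equal superposition; the second makes the two branches interfere so that the amplitude of 1 cancels exactly. This is a plain-Python sketch; the `hadamard` helper is written here just for illustration.

```python
import math

s = 1 / math.sqrt(2)

def hadamard(state):
    """Apply the 2x2 Hadamard transform to a (amp0, amp1) pair."""
    a, b = state
    return (s * (a + b), s * (a - b))

state = (1.0, 0.0)        # start in the definite state 0
state = hadamard(state)   # equal superposition: (~0.707, ~0.707)
state = hadamard(state)   # the two branches interfere

# The |1> branch cancels destructively; the qubit is back in state 0.
print(state)
assert math.isclose(state[0], 1.0)
assert math.isclose(state[1], 0.0, abs_tol=1e-12)
```

The second application is pure interference at work: the two paths leading to outcome 1 arrive with opposite signs and cancel, while the paths to outcome 0 reinforce. Quantum algorithms are engineered so that wrong answers cancel this way and right answers reinforce.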
Einstein first referred to quantum entanglement as “spooky action at a distance.” Quantum entanglement occurs when two qubits—or more generally, quantum particles—become linked. In this state, the properties of one cannot be defined without reference to the other, regardless of the distance between them. When two qubit
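The linkage can be sketched with the simplest entangled state, the Bell state (|00⟩ + |11⟩)/√2, in which only the agreeing outcomes 00 and 11 carry any amplitude, so measurements of the two qubits always match. The plain-Python sampling below is an illustration of that statistics, not a physical simulation.

```python
import math
import random

# Bell state: amplitude only on the agreeing outcomes 00 and 11.
amps = {"00": 1 / math.sqrt(2), "01": 0.0, "10": 0.0, "11": 1 / math.sqrt(2)}
probs = {k: abs(v) ** 2 for k, v in amps.items()}
assert math.isclose(sum(probs.values()), 1.0)  # a valid probability distribution

random.seed(0)  # fixed seed so the sample is reproducible
outcomes = random.choices(list(probs), weights=list(probs.values()), k=1000)

# The two measured bits agree in every single sample:
assert all(o[0] == o[1] for o in outcomes)
print({o: outcomes.count(o) for o in set(outcomes)})  # roughly 50/50 between 00 and 11
```

Each individual result is random (about half 00, half 11), yet the two bits never disagree; that perfect correlation regardless of separation is the "spooky" part Einstein objected to.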

Similar articles (nearest vectors)