The frenzy triggered by Computex Taipei 2024, which took place early this month, has made the arrival of the artificial intelligence (AI) era unmistakable. Now that the bustling exhibition is over, a question we should consider is: What would be the biggest change in the AI era?
The importance of computing power has been widely recognized, and the development of AI applications is thriving. However, AI’s deeper impact on the world is that it marks technology’s progression from the “bit” to the “token.” For society as a whole, the impact would be a shift from digitalization to tokenization.
In fact, this idea has appeared in many of Nvidia CEO Jensen Huang’s (黃仁勳) speeches and interviews. He has repeatedly emphasized the importance of floating-point computation and tokens, saying that in the AI era, vast numbers of tokens would be produced, and massive amounts of AI computing power would be turned into “AI factories.”
This would drive the world to invest trillions of dollars in innovative computing infrastructure, creating new economic value worth hundreds of trillions of dollars. This is the core of the AI revolution.
In this wave of the AI gold rush, if we liken graphics processing units (GPUs) and AI computing power to “shovels” for digging gold, then the economic value generated by these tokens is the “gold mine” to be dug.
When we see the world’s Internet giants rushing to get their hands on GPUs, what we should really pay attention to is not only the “shovels” themselves, but the real target of their huge investments: the new global economic value created by the tokens in the AI wave.
In the digital era, the bit is the most basic computing unit. In the AI era, the most basic computing unit would be the token.
If you look up the definition of “token” on the Internet, the answer would be: In the field of AI, a “token” usually refers to the smallest unit in text processing.
“Tokenization” is the process of breaking a continuous sequence of words into tokens. These tokens can be words, phrases, sentences or other smaller units of text. “Token” seems like a very technical term, but why is it so important? Because it is the smallest unit of computing in AI.
In text-based AI, tokens are like the words contained in an AI dictionary. All language input must first be tokenized, by finding the appropriate tokens in this dictionary, to let the AI know what you want to express.
The result of the AI’s computation would also be output in tokens, which would then be translated back to human language through the process of de-tokenization.
The number of tokens in the AI dictionary is one factor that determines the range of the AI’s capabilities.
Having the right tokens to express itself can greatly increase the AI’s capabilities.
Without the proper tokens of expression, the AI would be poor in words.
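To make the tokenization and de-tokenization cycle concrete, here is a minimal sketch in Python of a toy tokenizer with a tiny, invented dictionary. Real AI systems use vocabularies of tens of thousands of subword tokens learned from data, but the principle is the same: a word missing from the dictionary falls back to an “unknown” token, which is exactly the “poor in words” problem described above.

    # A toy tokenizer with a tiny, invented dictionary (real systems use
    # tens of thousands of learned subword tokens).
    VOCAB = {"Taiwan": 0, "Asia": 1, "is": 2, "in": 3, "<unk>": 4}
    ID_TO_TOKEN = {i: t for t, i in VOCAB.items()}

    def tokenize(text):
        # Look each word up in the dictionary; unknown words fall back to "<unk>".
        return [VOCAB.get(word, VOCAB["<unk>"]) for word in text.split()]

    def detokenize(ids):
        # Translate token IDs back into human-readable words.
        return " ".join(ID_TO_TOKEN[i] for i in ids)

    print(tokenize("Taiwan is in Asia"))    # [0, 2, 3, 1]
    print(detokenize([0, 2, 3, 1]))         # Taiwan is in Asia
    print(tokenize("Taiwan is in Europe"))  # [0, 2, 3, 4] -- "Europe" is missing from the dictionary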
The biggest difference between tokens and bits is that tokens are not just numeric expressions; they carry implicit meaning, and that meaning can itself be computed.
For example, the tokens “Taiwan,” “US,” “Asia” and “North America” contain far more meaning than simple numeric zeroes and ones.
Training an AI model is about learning the meanings of, and connections between, tokens by studying large amounts of data.
So, when we ask the AI: “The US is to North America as Taiwan is to what?” a well-trained AI system would correctly identify the relation between the tokens and answer “Asia.”
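This kind of reasoning can be illustrated with a small sketch in Python. The word vectors below are made up purely for illustration; real models learn vectors with hundreds of dimensions from data, but the idea that relations between tokens can be computed numerically is the same.

    import math

    # Made-up, purely illustrative word vectors; each token's meaning is
    # encoded as numbers so that relations between tokens can be computed.
    vectors = {
        "US":            [0.9, 0.1, 0.8],
        "North America": [0.9, 0.1, 0.1],
        "Taiwan":        [0.1, 0.9, 0.8],
        "Asia":          [0.1, 0.9, 0.1],
    }

    def cosine(a, b):
        # Cosine similarity: how closely two vectors point in the same direction.
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm

    # "The US is to North America as Taiwan is to what?"
    # Apply the same relation to Taiwan: North America - US + Taiwan.
    query = [na - us + tw for na, us, tw in
             zip(vectors["North America"], vectors["US"], vectors["Taiwan"])]

    # The closest remaining token is the answer.
    answer = max((t for t in vectors if t != "Taiwan"),
                 key=lambda t: cosine(query, vectors[t]))
    print(answer)  # Asia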
Tokens are not limited to the field of text. Many other types of signals, such as images, video and audio, robot movements, weather information, factory data, environmental perception for autonomous driving, DNA and protein structures, as well as physical and chemical signals, can also be converted into tokens, allowing AI systems to carry out computation and produce results.
Therefore, AI computing in the future will deal with huge numbers of tokens.
The large amount of data in human history — from ancient times to the present, including text, video, knowledge and measurement records — would be converted into tokens to train powerful AI models.
All kinds of queries and external inputs are also converted into tokens to drive the AI system.
The AI-generated tokens are then translated into words, images, sounds, robot movements, weather forecasts, factory simulations, physics and mathematics answers or drug structures that can be understood by the outside world to further influence the world.
In fact, from a historical point of view, this wave of AI-driven tokenization is the latest advancement of civilization.
Human civilization has gone through several important stages in processing signals from the natural world, from “human observation signals,” “physical signals,” “analog signals” and “digital signals” to the latest “AI token signals.”
During the Renaissance, science, mathematics, astronomy and medicine began to flourish.
Natural phenomena observed by the human senses, in astronomy, physics, chemistry and medicine, began to be systematized through science and mathematics: observational data were described and organized in the objective formulas of physics and mathematics.
In the first industrial revolution, as scientific knowledge based on Newtonian mechanics matured, the power of machines, such as steam engines, trains and ships, drove the development of civilization.
More importantly, the invention of various types of machinery allowed the mass production of precision devices such as clocks, watches, gears and textile machines.
Since this period, human beings have been able to control and process “physical signals,” such as temperature, pressure and speed, through the power of machinery.
In the second industrial revolution, through Scottish physicist James Clerk Maxwell’s equations of electromagnetism, mankind gained an understanding of the abstract forces of electricity and magnetism.
This led to telephones, radio, electricity and motors. From there, humans were able to utilize electricity and radio waves to process and transmit signals in the form of “analog signals.”
In recent decades, the third industrial revolution, also known as the digital revolution, took place, bringing the emergence of semiconductors, integrated circuits, computers, the Internet, mobile communications, smartphones and many other technologies.
Since this period, human beings have converted signals into “digital signals” expressed as zeroes and ones, thus dramatically increasing the accuracy and complexity of signal processing.
The computation, communication and storage of digital information built up the present technological civilization.
In this wave of AI progress, with the evolution of machine learning, neural network architectures and large language models, “AI token signals” allow the implicit relations and meanings within information to be learned and reasoned about by AI systems, creating more intelligent capabilities.
AI is still developing, and if we can successfully unleash the huge potential of AI, it would become the fourth industrial revolution.
In the AI gold rush, Taiwan’s ability to provide high-quality semiconductors and computing mainframes is as crucial as the must-have shovels for gold mining.
The world’s current computing mainframes are worth about US$1 trillion, and the demand for AI computing power could even double to US$2 trillion, Huang said.
Yet the higher value of the “gold mine” is hidden in the huge AI applications based on tokens.
He said that in the future, the products and services created by AI tokens would be valued at more than US$100 trillion. This is the core of this AI boom.
Therefore, we are now in a critical period in the evolution of human history and civilization. Taiwan’s position as a key player in the world’s semiconductor and information and communications industry chain has attracted global attention.
We should not stop there. We should grasp the trend of AI’s technological evolution, and the world’s broader shift from digitalization to tokenization, to advance overall technological, economic and social progress.
Liang Bor-sung is senior director of MediaTek Inc’s Corporate Strategy and Strategic Technology division, a visiting professor in National Taiwan University’s Department of Computer Science and Information Engineering and Graduate School of Advanced Technology, and a professor-ranked specialist at National Yang Ming Chiao Tung University’s Institute of AI Innovation, Industry Academia Innovation School.
Translated by Lin Lee-kai