
Just a few short years ago, AI was a novel concept generating uncanny, sloppy photos and videos that appeared across your social media feeds. Today, it’s seemingly ubiquitous. New models are popping up almost every month. There’s AI integration in pockets of Hollywood. And even if it’s so far failing to boost your productivity at the office, AI has most likely already appeared in your workplace. That sprawling expansion requires enormous infrastructure investment. And Nvidia CEO Jensen Huang said his company is expecting to deliver those building blocks at a massive scale.
During his keynote address Monday at Nvidia’s GTC conference in San Jose, Huang said the company has doubled its demand forecast for the coming year. “I see through 2027 at least $1 trillion,” he said. “In fact, we are going to be short. I am certain computing demand will be much higher than that.”
And he’s already preparing for that reality with an unusual incentive to attract top talent and wring more computing power from his workforce: offering engineers AI tokens worth nearly half their salary.
The AI boom is pushing infrastructure investments to new heights. Tech companies are investing a staggering $700 billion into the data center buildout, a sum that rivals the GDP of developed economies like Sweden, and is more than double the total inflation-adjusted cost of the Apollo missions—projects that sent humans to the moon. Nvidia is a critical supplier in that buildout, providing the processors that power AI factories. The $1 trillion demand figure is further proof that the buildout is all gas, no brakes, even as competitors like Advanced Micro Devices (AMD) struggle to close the gap. All of this comes despite looming fears of an AI bubble, as flagged by business leaders like Microsoft CEO Satya Nadella and “Big Short” investor Michael Burry.
Huang made the prediction alongside claims that AI agents could soon run the world, as well as announcements around space-based computing designed to launch AI into orbit, a concept Elon Musk has spotlighted as a potential solution to the energy demands of expanding data centers.
“We are completely resetting and starting the largest buildout of human history,” Huang said. “Most of the world’s industries building AI factories, building chip plants, building computer plants are represented here today.”
The company’s recent earnings reports have added credibility to Huang’s claims. Last month, Nvidia posted $215.9 billion in revenue for fiscal 2026, up 65% from a year ago, the highest annual result ever. Data center revenue alone rose 75% from a year ago, reaching $62.3 billion.
AI tokens: the future of pay?
As business leaders aim to harness AI to boost worker productivity, Huang offered a glimpse at how Nvidia plans to operationalize that ambition: paying engineers in tokens—the currency of AI—to amplify their output.
“I could totally imagine in the future every single engineer in our company will need an annual token budget,” he said. “They’re going to make a few hundred thousand dollars a year as their base pay. I’m going to give them probably half of that on top of it as tokens so that they could be amplified 10 times.”
Tokens are the basic units of data or words that AI models use to process language and recognize patterns, making them critical to the future of AI deployment. AI company OpenAI estimates that one token is equal to approximately four characters, with a single one-to-two sentence prompt requiring about 30 tokens. “Fortune Magazine,” for example, may be broken down into five tokens: “For” “tune” “Mag” “az” “ine.”
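The four-characters-per-token rule of thumb above can be sketched in a few lines of Python. This is only a heuristic estimate, not a real tokenizer — production models use learned subword vocabularies (e.g. BPE), so actual counts differ, as the five-token "Fortune Magazine" split shows:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters-per-token rule of thumb.

    Real tokenizers split on learned subwords, so actual counts vary.
    """
    return max(1, round(len(text) / 4))

# "Fortune Magazine" is 16 characters, so the heuristic predicts 4 tokens;
# an actual tokenizer may produce 5, as in the split shown above.
print(estimate_tokens("Fortune Magazine"))
```

The `max(1, ...)` guard just ensures any non-empty text counts as at least one token.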
At the allowance levels Huang described, engineers would have access to billions of tokens annually, unleashing a torrent of compute power. In Huang’s scenario, tokens would be an added employment perk for engineers at his firm, arming them with the power needed to conduct deep research for the company.
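The "billions of tokens" arithmetic can be made concrete, though every number below is an illustrative assumption rather than a reported figure: a $200,000 base salary, an allowance worth half of it per Huang's sketch, and a hypothetical blended price of $10 per million tokens (actual inference prices vary widely by model and provider):

```python
# All figures are illustrative assumptions, not reported numbers.
base_salary_usd = 200_000
allowance_usd = base_salary_usd / 2             # half of base pay, per Huang's sketch
price_per_million_tokens_usd = 10.0             # assumed blended inference price

tokens_per_year = allowance_usd / price_per_million_tokens_usd * 1_000_000
print(f"{tokens_per_year:,.0f} tokens per year")  # 10,000,000,000 tokens per year
```

At these assumed rates the allowance buys 10 billion tokens a year — the order of magnitude the article describes.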
The Nvidia CEO said other tech firms will quickly follow suit and use tokens as a recruiting tool to attract top industry talent.
“It is now one of the recruiting tools in Silicon Valley: how many tokens come along with my job,” he said. “The reason for that is very clear because every engineer that has access to tokens will be more productive.”