
C3.ai soars in its market debut; Silicon Valley legend Tom Siebel talks about the future

JEREMY KAHN
2020-12-14

C3.ai's revenue surged 71% in the 12 months through April 2020.




Nine out of 10 companies fail when implementing artificial intelligence software, says Tom Siebel, the billionaire founder and chief executive of C3.ai. And in that failure Siebel has found opportunity.

C3, which sells software that enables large companies to implement and manage large A.I. applications, saw its shares soar more than 150% in their first day of trading on Nasdaq Wednesday. The company raised $651 million in the initial public offering that saw the business valued at more than $10 billion.

“Virtually every one of our customers failed one, two, or three times” trying to build their own A.I. systems, Siebel told Fortune hours after the IPO. He says this is the same pattern he had seen earlier in his career, first selling database software in the 1980s at Oracle, and later building Siebel Systems, the sales-force automation and customer relationship software company he sold to Oracle for $5.85 billion in 2005. “History doesn’t repeat itself, but it sure rhymes,” he says. “These companies try a few times. They fire the CIO, and then they get serious and get the job done.”

C3, which Siebel founded in 2009, saw its revenues leap 71% to $157 million in the 12 months through April 2020. But the company’s expenses, particularly its research and development costs and its sales and marketing spending, are growing even faster. As a result, the company lost $70 million in the same period.

“We are building a structurally profitable, structurally cash positive business,” Siebel says. But, he notes, investing in the company’s growth means that it will continue to lose money on an operating basis for the next few years. He says that the cash flow should turn positive three to four years from now and that the business should be able to generate profit margins in excess of 20% in the long run.

The company has some 50 customers but generates a large portion of its sales, 44%, from just three of them: oil services firm Baker Hughes, French energy company Engie, and industrial equipment maker Caterpillar. Siebel says that the firm is rapidly diversifying and that he knows of at least 50 additional customers the company is working to close in the next six months.

The CEO says he sees the biggest market for A.I. software in health care, where he predicts it will help usher in a revolution in personalized medicine, helping doctors to determine which patients are most likely to develop certain diseases and intervene earlier to prevent them. He says it will also help with more targeted treatments for cancer and other conditions.

Siebel, who is nothing if not outspoken, says the incoming Biden administration should set guidelines for A.I. companies, particularly around A.I. ethics. “There are many cases where A.I. is being used, particularly by social media, to, I think, enormous social detriment,” he says. He points to mental health issues in young people that are caused or exacerbated by social media, as well as political polarization and disinformation, as areas where the government should step in and regulate technology. He also says regulation is needed on how companies can use personally identifiable information in training A.I. software.

In light of China’s multiyear, multibillion-dollar investment in these technologies, “Make no mistake, we’re at war with China in artificial intelligence,” he says.

C3.ai has major contracts with the U.S. Air Force, for which it has built a system that predicts when aircraft parts will need to be replaced, helping the force keep more of its planes ready to fly, as well as with Raytheon, a major defense contractor. And while providing A.I. systems to the U.S. military has proved controversial for some tech companies—with Google pulling out of the Pentagon’s Project Maven in 2018 following an uproar among its employees—Siebel says C3 has no issue providing A.I. technology to the U.S. military, so long as a human remains “in the loop” in any system the company helps deploy. “We’re proud to serve democratic governments and governments that support human rights and individual liberty, and we will continue to do so,” he says.

And while some A.I. researchers have also objected to providing systems to oil companies that continue to extract hydrocarbons, Siebel tells Fortune he has no problem providing A.I. software to large energy companies to help them become more efficient. “What we’re doing for some of the largest utilities in the world, and some of the largest oil and gas companies in the world, is that we’re allowing them to reinvent themselves,” he says. “To help them convert themselves to safe energy, secure energy, lower-cost energy, and much, much higher reliability clean energy.”

In general, Siebel says, A.I. ethics is too important to delegate to an A.I. ethics officer or an A.I. ethics department. “That’s just a cop-out,” he says. “The CEO needs to own this, the whole management team needs to own this, and the board.”
