Google's search engine and Maps get major updates: three takeaways


JONATHAN VANIAN 2022-05-14
Google announced the updates Wednesday during its annual Google I/O developer conference.

What stands out in Alphabet's updates to Google search, Google Maps, and its A.I. assistant? GEERT VANDEN WIJNGAERT — BLOOMBERG/GETTY IMAGES

Google's search engine as well as Google Maps and voice-activated Google Assistant are getting major upgrades.

Google announced the updates Wednesday during its annual Google I/O developer conference, held virtually this year due to the COVID-19 pandemic.

The tweaks are intended to help the company keep up with changing user habits and make searching more intuitive and easier, Google senior vice president Prabhakar Raghavan told Fortune. Children, young adults, and others coming online for the first time don't necessarily know to type two keywords into the search box, he said, and therefore "We cannot be serving the same queries and the same needs we did 20 years ago."

Here are three takeaways from Google's announcements:

Search will be more visual and local

People will be able to enter more complicated queries into Google's core search using their smartphone cameras sometime later this year. With the new feature, people will be able to take photos or capture screenshots online of goods like clothing, food, and home appliances, and retrieve a list of nearby restaurants and stores that sell those items.

Google had debuted a similar function via its Google Lens product. But Raghavan characterized that as experimental; the Lens technology is now good enough to be incorporated into Google's core search.

"Now we feel good enough about it so that every search bar, whether on your iPhone or Android, has of course a keyword bar, but also a camera and a microphone," Raghavan said.

Google also said it is developing a new feature, which does not yet have a release date, called "scene exploration" that will let people more quickly retrieve information about objects they view through their smartphone cameras. The company said that users of the new feature will be able to scan an entire store shelf, for instance, and retrieve details about every item rather than having to focus on just one item at a time.

Now you can go inside restaurants using Google Maps

Google Maps is getting a feature called "immersive view" that will let people visit stores, iconic buildings, and attractions in cities like San Francisco and Los Angeles in 3D. Unlike Google's traditional 3D map view, immersive view is much more detailed, and uses a combination of aerial imagery captured by drone, satellite imagery, and on-the-ground visuals. In a demonstration prior to the conference, executives showed how the new feature will let users virtually enter certain stores like cafes, so they can inspect the interiors before deciding whether to visit in real life.

People will also be able to see what popular world attractions like London's Big Ben look like at different times of the day and in different weather conditions.

The immersive view feature will debut later this year in a few cities, including Los Angeles, San Francisco, London, New York City, and Tokyo.

Talking to an A.I. assistant should become more natural

Similar to competitors like Amazon, Google is trying to make interacting with its voice-activated Google Assistant a more natural experience, akin to conversing with an actual human rather than a chatbot that frequently misunderstands what people tell it.

The company's "Look and Talk" feature will let customers of Google's Nest Hub Max smarthome display activate Google Assistant by merely looking at the device's digital screen and then asking a question, so they don't have to activate the assistant by saying "Hey Google." Google executives said the feature only works for people who have consented to have their voices and faces analyzed by the device, "so some random person cannot walk into your house and turn on the lights or whatever," Raghavan said.

Nest Hub Max users will also be able to command Google Assistant to turn off Internet-connected home lights or set timers without having to say "Hey Google" to initiate the action. The goal is to reduce the number of times people need to say "Hey Google," which can be annoying.

Eventually, Google plans to update Google Assistant so that the software doesn't stumble when people talk naturally to it, such as when they take long pauses or utter "ums." (Fortune China)

Translator: Liu Jinlong

Proofreader: Wang Hao


