Waking up to the news, Lenovo had made several big moves: it showed off its first AI PC and joined hands with chip heavyweights such as NVIDIA and AMD.

At Lenovo Tech World 2023, held in the United States, Lenovo demonstrated its first AI PC and declared that "the personal computer is greeting a brand-new dawn." With handset makers such as Huawei and Xiaomi also scrambling for the AI-driven upgrade wave, a future in which everyone runs their own model on their own device may become the new trend.

On the road to delivering computing power "from pocket to cloud," Lenovo's circle of friends took the stage. On the same day, Lenovo announced strategic partnerships with NVIDIA, AMD and other giants. Most notably, Yang Yuanqing and Jensen Huang jointly unveiled a major plan to launch hybrid AI solutions, and in their on-stage dialogue Huang laid out his views on automotive computing, generative AI and related topics in fuller detail.
"The PC of the future will be the AI PC"
On October 24, US time, Lenovo Tech World 2023 opened under the theme "AI for All," showcasing Lenovo's determination to go all-in on artificial intelligence and to bring it to scale.

Yang Yuanqing, Chairman and CEO of Lenovo Group, demonstrated on stage the AI PC that Lenovo regards as a "revolutionary product." By building a personalized local knowledge base and applying model-compression techniques, the machine can run a personal large language model and support natural AI interaction.
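The article does not spell out how Lenovo's model-compression technology works. As a purely illustrative sketch, the snippet below shows one common compression approach, post-training int8 weight quantization, which cuts storage roughly fourfold and is one reason a large model can fit on a personal device; the function names and the small example matrix are hypothetical, not Lenovo's implementation.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric post-training quantization: float32 weights -> int8 values plus one scale."""
    scale = float(np.abs(weights).max()) / 127.0
    scale = scale if scale > 0 else 1.0                      # guard against an all-zero tensor
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximate reconstruction used at inference time."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    w = np.random.randn(4, 4).astype(np.float32)             # stand-in for one weight matrix
    q, s = quantize_int8(w)
    print("bytes before:", w.nbytes, "bytes after:", q.nbytes)
    print("max reconstruction error:", float(np.abs(w - dequantize(q, s)).max()))
```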
In Yang's view, smart devices are like race cars: they are the ultimate vehicle through which artificial intelligence reaches end users. "Lenovo's large-model compression technology gives users' own smart terminals and devices the ability to run personal-scale large models. The PC of the future will be the AI PC, the phone of the future will be the AI phone, and the workstation of the future will be the AI workstation."

Luca Rossi, Lenovo Senior Vice President and President of the Intelligent Devices Group, said Lenovo began putting AI algorithms into its PCs a year ago. Going forward, Lenovo PCs will add an NPU alongside the CPU to better manage AI workloads, improve performance and control battery consumption; for personal models, the NPU also brings a substantial boost in AI capability.
What changes can an AI PC actually bring?

For example, drawing on a personal knowledge base it can plan itineraries intelligently, book flights for a trip, and help create spreadsheets and documents. Because all of this runs on the device rather than in someone else's software, it offers stronger privacy and security; the AI PC thereby safeguards the user's privacy and data.
How does it work in practice?

Lenovo's personal AI Twin is delivered through AI interaction on the keyboard, the user's natural language, and a new concept feature called AI NOW, a personal AI assistant solution designed for the AI PC.

Behind the scenes, these AI-capable devices and edge devices will host a local knowledge base so they can understand the user better. The personal large model runs inference on personal data stored on the device or on a home server; unless the user authorizes it, that data is never shared or sent to the public cloud, which protects personal privacy and data security. The system can even anticipate tasks from the individual's patterns of thinking and work out solutions on its own. The device thus becomes a digital extension of the user, like a twin (the AI Twin).
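The report describes this privacy model only at a high level. The toy sketch below, with entirely hypothetical class and method names, captures the basic contract it implies: personal data is stored and queried locally, and nothing can be exported to the cloud without an explicit user authorization flag.

```python
from dataclasses import dataclass, field

@dataclass
class PersonalKnowledgeBase:
    """Toy stand-in for the on-device store described above; nothing leaves the machine
    unless the user has explicitly authorized cloud sharing."""
    notes: list = field(default_factory=list)
    cloud_sharing_authorized: bool = False            # off by default

    def add(self, note: str) -> None:
        self.notes.append(note)                       # stored locally only

    def answer_locally(self, question: str) -> str:
        # Placeholder for on-device inference with a compressed personal model:
        # here we just do naive keyword matching against local notes.
        words = {w.strip("?.,").lower() for w in question.split()}
        hits = [n for n in self.notes if words & {w.strip("?.,").lower() for w in n.split()}]
        return hits[0] if hits else "No local answer found."

    def export_to_cloud(self) -> list:
        if not self.cloud_sharing_authorized:
            raise PermissionError("User has not authorized sending personal data to the cloud.")
        return list(self.notes)

kb = PersonalKnowledgeBase()
kb.add("Flight to Austin on Nov 3, seat 14C")
print(kb.answer_locally("When is my flight to Austin?"))   # answered without any cloud call
```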
Lenovo's AI PC is not just hardware AI; software will also play a major role. On one hand, Lenovo is developing its own large model; on the other, large models from other vendors may also be brought in. Responding to e公司 reporters, the company said it has not yet considered how the AI PC might reshape its profit model.

Compared with PCs that do not carry AI, the AI PC is positioned as a premium product. On pricing, the company gave no specific figure, saying only that "the price will depend on the configuration."

Despite the live demonstration, Lenovo made clear that the AI PC will not go on sale this year; it is expected to reach the market around September next year and to keep evolving after launch.
In terms of industry logic, the PC market is very mature. Consumer electronics move in cycles: during the pandemic the global PC market hit its peak of recent years, but it then fell for several consecutive quarters, and shipments at leading players such as Apple also slipped.

The market consensus now is that the PC market is turning toward recovery. PC shipments have grown quarter-on-quarter in each of the past two quarters, including 11% sequential growth in the third quarter. According to Luca Rossi, Lenovo's PC channel inventory has already been worked down, and although the AI PC will not become mainstream in the short term, it is expected to push average selling prices higher, potentially giving the PC market a lift in both volume and price.
Consumer electronics makers are racing to adopt large models
Generative AI and large language models offer a "lighthouse" for many industries, but putting AI to work in the real world depends on hardware and devices to carry it.

Beyond the AI PC, the smartphone, as the smart terminal with the largest user base and the highest user stickiness, has become the first carrier of the AI boom. Over the past year, Chinese smartphone makers including Huawei, Xiaomi, OPPO and HONOR have all made high-profile moves into large models.

Take Xiaomi as an example: its large-model effort centers on lightweight models and on-device deployment. Its self-developed 1.3-billion-parameter on-device model already runs as a demo on phones, and in some scenarios its results rival those of a 6-billion-parameter model running in the cloud.

This reflects a new "device-cloud integration," or "device-cloud collaboration," trend among consumer electronics makers. A researcher at the China Academy of Information and Communications Technology previously told e公司 reporters that if a capability or function can be handled by the on-device model, there is no need to push it up to the cloud; only when a request needs more information or functionality than the local model can provide should cloud capabilities be brought in.
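As a rough illustration of that device-cloud collaboration idea, the sketch below routes a request to a local model when the task is one it supports and the input is small enough, and escalates to a cloud model otherwise. The capability set, the size limit and both model calls are hypothetical placeholders, not any vendor's actual API.

```python
ON_DEVICE_MAX_WORDS = 2048                       # assumed capacity of the local compressed model
LOCAL_SKILLS = {"summarize", "translate", "schedule"}

def run_on_device(task: str, text: str) -> str:
    return f"[on-device model] {task}: {text[:40]}..."

def call_cloud_model(task: str, text: str) -> str:
    return f"[cloud model] {task}: {text[:40]}..."

def route(task: str, text: str) -> str:
    """Prefer the local model for supported, small requests; otherwise fall back to the cloud."""
    fits_locally = task in LOCAL_SKILLS and len(text.split()) <= ON_DEVICE_MAX_WORDS
    return run_on_device(task, text) if fits_locally else call_cloud_model(task, text)

print(route("summarize", "Lenovo Tech World 2023 introduced the company's first AI PC..."))
print(route("generate-report", "Draft a 30-page market analysis covering five regions..."))
```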
Yang Yuanqing argues that building panoramic AI requires computing power "from pocket to cloud" and applications in many forms, with solutions for different industries equally critical. At the same time, people want the question-answering power of public large models, yet often wish that both the questions and the answers stay on their own devices or inside their own companies. His answer to this "have it both ways" problem is to combine public large models with personal and enterprise large models.
A "circle of friends" of top players
The CEOs of the world's leading AI technology companies, including Microsoft, NVIDIA, Intel, AMD and Qualcomm, attended the conference the same day. Around devices, infrastructure and solutions, Lenovo announced a series of strategic collaborations with them.

In intelligent solutions, Lenovo is building its vision of personal AI twins and enterprise AI twins into future solutions with Microsoft. In infrastructure, Lenovo and NVIDIA launched a new hybrid AI initiative.

Lenovo is also working with Intel to scale AI across all workloads on the client, edge, network and cloud, and will continue close cooperation with AMD across smart devices, infrastructure and solutions. Qualcomm, meanwhile, is pushing generative AI models with billions of parameters, enabling Lenovo products to offer users more support for productivity and creativity.
The most closely watched item was the major plan announced by Yang Yuanqing and Jensen Huang: hybrid AI solutions. As an NVIDIA go-to-market partner, Lenovo will deliver new enterprise AI solutions built on the NVIDIA MGX architecture, Lenovo hybrid AI services and more. These solutions let enterprises take a hybrid-cloud approach: build custom AI models with NVIDIA AI Foundation cloud services, then run them on Lenovo on-premises systems powered by NVIDIA's latest hardware and software designed for generative AI.

Yang said that, working closely with NVIDIA, Lenovo will offer fully integrated systems that bring AI-powered computing to wherever data is generated, from edge to cloud, helping enterprises in every industry easily deploy customized generative AI applications and drive innovation and transformation.

Also worth noting, Yang and Huang held an in-depth on-stage conversation covering automotive computing, generative AI and related topics. Huang said that after 25 years of acquaintance and cooperation, NVIDIA and Lenovo are preparing for the next 25 years of partnership.

In the conversation, Huang said the transportation industry is about to become a computer industry for the very first time; on generative AI, he argued that the infrastructure and solution stack must be created so that every company can benefit from artificial intelligence.
Appendix: full transcript of the conversation between Yang Yuanqing (YY) and Jensen Huang
YY: So Jensen, I know our media friends here have a lot of questions for you. But today, I want to try to ask you some questions on their behalf, and they can tell me whether I'm a good interviewer or not. How about that?
So my first question is about vehicle computing. We have worked on that for quite a while to capture the opportunity. Although it's not what we want to focus on today, we really want to hear from you: what's your perspective on the future of vehicle computing?
Jensen Huang: The largest industry in the world is transportation, and it is about to become a computer industry for the very first time. So what's inside the computer? What's inside the car used to be engines and electronic parts, but in the future it is going to be a computer. So this is going to be one gigantic new market for the computer industry. And it's terrific that Lenovo is going to be part of it, with your heritage of building extraordinary computers. You'll be able to bring that to the automotive industry.
YY: Thank you, Jensen. Beyond vehicle computing, we collaborate broadly on workstations, gaming PCs and high-performance computing, now extending into AI and smart infrastructure. Recently, the hottest topic has definitely been ChatGPT. NVIDIA has played such an important role in enabling this new era, and you have been a leader in AI for more than a decade. Could you share your insight on generative AI with us, particularly the prospects for its applications?
Jensen Huang: We saw some pretty cool examples of generative AI just now in the demos. But when you take a step back and think about what happened in the last decade, several very important inventions were discovered. For the very first time, a computer could write software that no human can, and that requires AI infrastructure, a new type of computing infrastructure. This is the largest expansion in the history of our partnership. We've been working together now for 25 years. Could you imagine YY and me 25 years ago? It was a very, very long time ago.
Over the years, we've worked on workstations and laptops and servers and supercomputers. The most energy-efficient supercomputers in the world are powered by NVIDIA and Lenovo. Now there's a brand-new type of computer. I call it the AI factory. This AI factory is a dedicated computing infrastructure, built from the supercomputers we make, dedicated to optimizing artificial intelligence. It has a singular purpose: it takes the raw material, data, that comes into it, processes and refines it with a great deal of computation, and outputs intelligence.
Now, a decade ago, the kind of intelligence we were able to produce was understanding the meaning of data: understanding speech, understanding images, understanding what the data means. But now, for the very first time, as you saw in the examples just a moment ago, we are able to generate data.
Computers can now understand data of all kinds, unstructured data of all kinds. It could be words and sounds and pixels and images; it could be proteins and chemicals; it could be motion, whatever information can be digitized. Computers now have the ability to understand the meaning embedded in that data, and now we can generate it. This revolution started in the cloud, but one of the biggest opportunities now is to bring it to the enterprise. The reason is that the vast majority of the world's data sits inside enterprises. That data is confidential and proprietary, it is intensely sensitive, and for some it is regulated and cannot move to the cloud.
So what we need to do is create the infrastructure as well as the solution stack that makes it possible for every company to take advantage of artificial intelligence. It includes four parts, and we have a graphic back there showing the four parts. The first part is the pre-trained models. We call that AI Foundation. These foundation models are in our cloud, and they're optimized to deliver the most performance, to be as interactive as possible, and to be as cost-efficient as possible. These models are quite large to process, and the faster we can process them, the lower the cost.
The second thing is how we take this AI model, which fundamentally is kind of like a brain, and turn it into an application, because it doesn't really do anything until you do. One of the things you'll hear plenty about in the near future, and surely at this conference, is the notion called retrieval-augmented generation. It's another way of saying you create a chatbot, and this chatbot's abilities are augmented by data that is yours. We now have the capability to vectorize, to turn a database into a semantic database: not a relational database, not an unstructured database, but a semantic database, a database that understands meaning. You can talk to this database and ask questions, because it understands the words you're using and it understands the meaning of the data inside the storage and the vector database. It can now respond in a way that makes sense to you.
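To make the retrieval-augmented generation flow Huang sketches more concrete, here is a deliberately tiny, self-contained example: documents are "vectorized" with a toy bag-of-words embedding, the closest one to the question is retrieved by cosine similarity, and it is prepended to the prompt handed to a stand-in generator. None of this is NVIDIA's or Lenovo's actual stack; a real deployment would use a learned embedding model, a vector database and a large language model.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: lower-cased word counts stand in for a learned vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

documents = [
    "Q3 channel inventory was reduced to normal levels across all regions.",
    "The fraud team flagged 12 suspicious transactions last week.",
]
index = [(doc, embed(doc)) for doc in documents]          # the "semantic database"

def generate(prompt: str) -> str:
    return f"[generated answer grounded in]\n{prompt}"     # stand-in for a large language model

def rag_answer(question: str) -> str:
    q_vec = embed(question)
    best_doc = max(index, key=lambda item: cosine(q_vec, item[1]))[0]   # retrieval step
    prompt = f"Context: {best_doc}\nQuestion: {question}\nAnswer:"      # augmentation step
    return generate(prompt)                                             # generation step

print(rag_answer("How is our channel inventory?"))
```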
That second component, RAG, retrieval-augmented generative models, is the new form of enterprise application, and it is about assembling AIs. This is the way applications will be built in the future: instead of writing code or writing SQL queries, we're going to assemble AIs the way we assemble teams. These AIs will do all kinds of things. They'll handle customer service, they'll help you create marketing campaigns, they'll monitor all the activity in and out of your company to detect fraud. One of my favorite examples is talking to data: you'll be able to talk to your storage, whatever you have on your PC. You'll be able to talk to data.
RAG makes it possible to create all kinds of applications, and it now needs to run on-prem. This is where our partnership with Lenovo really comes in. The first part is the operating system. We've been working with VMware; VMware is an enterprise operating system, and 500,000 enterprises around the world use it. We've been working with VMware to turn that operating system, originally designed for virtualization, into one designed for artificial intelligence. It's called VMware Private AI, and we've been working on it for some four years. That operating system runs on a runtime called NVIDIA AI Enterprise, which handles all of the data processing and all of the deep learning, everything from training and fine-tuning to guardrailing, all the way to inference and deployment of the models. That entire stack runs on a Lenovo Think Server that we've been working on together. These are the four components, the four parts of our partnership, taking generative AI to the world's enterprises.
YY: Yeah, Jensen, I 100% agree with you. I believe more generative AI applications will happen in the enterprise space. My second question is: how can we partner to bring generative AI to more vertical industries and enterprises?
Jensen Huang: There are four components. There's the pre-trained model; the knowledge of how to build these applications, how you vectorize the databases, how you put semantic features into databases, how you create these applications out of generative AI; and how you deploy it, installing and implementing the operating system, knowing how to orchestrate these workloads and then running them in your data center. I can't imagine a better partnership than the two of us, combining that knowledge and technology with the reach into the world's markets, from transportation to health care to financial services. With the Lenovo team, the reach into the world's markets is so broad that together we can bring this technology to just about everybody.
YY: Yeah, so Jensen, I'm so happy to take our partnership to the next level. Today we are very excited to announce a new joint hybrid AI initiative. We want to bring the next generation of cloud AI technology to enterprises everywhere. That means, as your time-to-market partner, Lenovo will deliver new enterprise AI solutions based on the NVIDIA MGX architecture, Lenovo hybrid AI services and so much more.
Jensen Huang: That's right. This is one of the major pillars of Lenovo's vision and strategy of AI for All. It's terrific.
YY: Thank you, Jensen, for your trust and great partnership. Let's continue to unleash the power of AI to drive innovation and transformation, starting by celebrating this hybrid AI initiative.
Jensen Huang: Thank you, YY. Here's to the next 25 years.
Editor in charge: Ye Shujun
Proofreader: Ran Yanqing