A hearty good morning, afternoon, or evening, as the case may be, and welcome to our ChatGPT special event. It's a real pleasure to be here, it's an honor to share the stage with Tim, and I am just super excited to hear what our experts and practitioners have to say today.

Just to set the stage a little bit for this event: we've tried to put together today's program in a way that doesn't just stand back and admire the gleam and shine of this shiny new object that we all have in ChatGPT and generative AI. We obviously think it's super interesting, and every day it seems like we're marveling at the accelerating pace of development here. But we also think that holding a purely celebratory, surface-level event would be doing you, our customers, a disservice. This is a complicated technology, and its implications are fundamental and profound. A lot of things might change, and they might change in ways that we just don't anticipate. Now, that has of course always been true of technology, and part of O'Reilly's mission over the past few decades has been to figure out where the current is flowing and where the innovators are leading us, and then to amplify their signals out to the rest of the world. So this event is part of that journey for us.
We need to think, overall, not just about the possibilities and the promise of ChatGPT and generative AI, and the outcomes and results that their continued progress will bring; we also need to be thinking about some of the pitfalls. What do you need to be careful of if you're on the leading edge right now? And especially as you get deeper into production with this technology, what are some of the things you will need to watch for?

In search of those answers, today we'll hear from one of the folks leading the charge for generative AI at one of the world's leading cloud and productivity software companies. We'll hear from the CEO of the company behind one of the most popular software development tools on how ChatGPT and its future iterations will change software development. We'll hear from a partner of ours on actually using ChatGPT in production and circumventing some of its challenges. We'll learn from one of our esteemed trainers about building apps using ChatGPT. We'll also hear from a research vice president at a global search giant about how they're thinking about intelligence and artificial general intelligence, or AGI. And we'll learn about a large financial advisory company and their efforts in using ChatGPT to enhance the advisory services they provide to their customers.
And then we'll wrap up today with a look inside O'Reilly at how we're using large language models to enhance personalized learning.

So whether you are a champion of generative AI and you're eager to use ChatGPT and its ilk in every task, every tool, and every project that you have, or you're a generative AI skeptic and you think a lot of the pomp and circumstance around ChatGPT is smoke but no fire, I hope you'll find content and points of view today that will expand your view, make you think, maybe even rethink your position, and leave you feeling a little better prepared for whatever the future might hold.

Let me take one moment here to direct you to a couple of resources our team has put together for you. On the landing page for the event there's some recommended reading highlighting some of the best content on the O'Reilly platform, pulled together to help set the stage for today's event and to whet your appetite. There's also a ChatGPT expert playlist that gathers smaller chunks of content from across our relevant library; we'll break that out into fundamentals, first steps,
and next steps. And keep your eyes peeled, because we have a rich pipeline of books, live events, on-demand courses, and live courses, all around generative AI, coming in the months ahead, and we're super excited to roll those out.

So that's enough from me; let's go into the first session, on reshaping search with Bing, with Jordi Ribas and Tim O'Reilly. Tim, obviously, is our fearless leader: founder, CEO, and chairman of O'Reilly Media, a company that has been providing the picks and shovels of learning to the Silicon Valley gold rush for the past 35 years. The company has a history of convening conversations that reshape the computer industry: if you've heard the terms open source software, Web 2.0, the maker movement, government as a platform, or the WTF economy, he's had a hand in framing each of those big ideas. We also have Jordi Ribas, corporate vice president of search and AI at Microsoft, leading the product and engineering teams responsible for Bing search worldwide, as well as Microsoft Search and Bing for enterprise. Since 2008, Jordi has led multiple product and engineering teams focused on AI products and services such as search, speech, vision, and digital assistants.
He was on point for Bing product and growth in 2014, took on all of Bing in 2020, and in February of 2023 he and his team released the new Bing, the first large-scale search engine with search-grounded GPT answers and multi-turn, chat-based search, launching a new era of AI-powered search. So, without further ado, let's hear from Jordi and Tim.

Good morning. I'm so glad to be here with Jordi Ribas, who's the corporate VP for search and AI at Microsoft. Welcome, Jordi.

Thank you, Tim. It's great being here.

You've been in charge of Bing for a while, and suddenly Bing is in the news, and we're going to be talking with Jordi about all the reasons why, and in particular about AI. So I guess I'd start with the question of: what have you learned from users since the release of AI-powered Bing? And how is that shaping your future strategy?

Thanks, Tim. It's great talking to you. Well, we've learned quite a lot. We have generally learned that users are really enjoying the product, and that's really what makes us excited about what we've done.
We get a lot of positive feedback about how they enjoy the ability to transition smoothly from the search mode to the chat mode. Depending on the query, they go into one mode or the other, or sometimes both, going back and forth. We've also learned that they're using the product in ways that we did not anticipate. Primarily we built a search product, and when we look at, for example, the chat mode, we see that about 60% of all the chats have to do with search: information retrieval, information extraction, search in general. And we had thought that percentage would be higher. We notice that a number of users, about 15 to 20 percent, are using it to create; they are not using it essentially as a search engine.
Rather, they use it to create letters, songs, poems; basically they cut and paste the output, and they use it to do some source coding. We expected some of that, but it was maybe a higher percentage than we thought. And then the remaining 20% or so are using it just to chat, to basically chitchat with the search engine, not necessarily to do anything creative, not necessarily to do anything formal in search. And so that was also a higher percentage than we expected.

Does that seem like it's just tire kicking? Or do you think it's "I want to experience the AI"? Or is there something there about people just enjoying this idea of conversing with this program that has unexpected responses? It's kind of like exploring the web in the early days, when you were just clicking on things for the hell of it. You wanted to find out what's happening here. It's like, wow, what's it going to say?

Yes. In fact, it's hard to tell exactly what people have in mind sometimes when they're using the product. But our intuition is that they definitely are surprised, and maybe impressed, with the capabilities of the chat mode, meaning the conversational capabilities, and they're having fun with it.
And if you look at the logs, you see people having all sorts of conversations, and so it's definitely great to see that people are enjoying it. Even though I gave you some rough percentages, you see the same user using the product in different ways: sometimes they just want to have some fun and chat a little bit, sometimes they want to create something, and most of the time, as we were hoping, they are actually searching the web.

Yeah. Although, from my point of view, the other behaviors are really good news, because, of course, you already have GitHub Copilot, and you've announced that Copilot is going to be coming to Office. So, in effect, people are going to be using that: "hey, I want to create something." I forget who said this originally, but these large language models are like having a thousand interns at your disposal, because suddenly you can ask them to do first drafts of virtually anything. And it may not be right, but as I think Satya said, it can even be usefully wrong.
I think the beauty of these models is that they can inspire you. Of course, if you are searching for something on the web and you have a complex question that generally would not be answered by a traditional search engine, they've got this ability to extract the information from multiple sources and synthesize a new answer for you that you wouldn't find anywhere else, and that's like magic. But at the same time, there is magic, too, in being able to ask for inspiration when you're writing an essay, writing a letter, writing some source code. And this ability to transition to that scenario in a very smooth way is one of the things that I think makes the product really fun.

Yeah. I want to unpack a little bit more this notion that it's changing the nature of search. You're not just getting a result that, in the old days, you guys would have decided was the kind of synthetic result people want; now you're really generating it on the fly in response to very, very specific requests. Can you comment on that change?
Not only the nature of search; it also gives you insight into the changing nature of all future applications, where you're just going to be able to be much more specific, in natural language, about what you want, and the program responds with this kind of synthetic response that's just for you.

That's right. And I think that is the beauty of the combination of search with this large language model technology: the fact that it can create new answers that you wouldn't find anywhere. The answers are out there; it's just that sometimes you would need to go through a lot of sources and spend a lot of time to find that information on your own, and we can save you all that time. I'll give you an example. I don't know if you are a runner, but my wife is, and she ran the Seattle Marathon, and she was wondering: okay, I'd like to run the New York City Marathon someday, but what should I do differently to prepare for it? If you ask this in the natural way, typing that query into a traditional search engine, well, often it'll break, because it's a long, natural query.
Search engines are generally used to queries that are two to three words, max; if you look at the vast majority of searches, that's what they are. But now you can type this long, complex, natural query, and what the new Bing will do is look at all the results on the web. You will find some results that have to do with the Seattle Marathon and others that have to do with the New York City Marathon, but you won't really find a specific result that says, "I ran the Seattle Marathon and then went and ran the New York City Marathon, and these are the tidbits of information you should know." That result doesn't happen to exist, but it will be created for you by Bing, which is able to extract that knowledge and reason over all these documents, so you can get the answer you're looking for.

Did your wife actually try that query? And was it correct, or was it usefully wrong?

It was correct.
In fact, there is a bit of a misconception there: "oh, wow, these models make stuff up." And that is true, and I think that is one thing we learned when we saw GPT-4 for the first time last summer. We knew it was special; it was a breakthrough in large language models. It was able to reason, synthesize, and extract information in ways that we hadn't seen before, and so we were very excited about it. But it did have two problems. One, as you probably know, is that it was trained with data through 2021, so it was not able to answer questions beyond that date. And the second is what we were just talking about: the fact that it would make stuff up; it would hallucinate, which is the term people use. Obviously, when you want to build a search engine, those two are big problems right off, and for a lot of other applications as well. You've seen what's happened with ChatGPT: people are using it for a lot of other things, and it's really, really helpful. But if you want search, you really want to be able to answer questions up to right now, and you also want to make sure that the information is accurate. And so what we did, we didn't just put,
And so what we did we didn't just put 240 00:16:00,300 --> 00:16:03,500 say the GPT model gbd4 inside being 241 00:16:03,900 --> 00:16:07,900 What we actually did, we combine search and be 242 00:16:08,100 --> 00:16:12,900 in a process that we call grounding. We basically have a model on top 243 00:16:12,900 --> 00:16:16,400 of the GT Model that we call Prometheus, which what it 244 00:16:16,400 --> 00:16:20,600 does, it takes the party from the user, with the context, if it's in a 245 00:16:20,600 --> 00:16:24,900 conversation and then it actually leverages 246 00:16:25,000 --> 00:16:29,900 both search and GPT to break the query into 247 00:16:29,900 --> 00:16:33,800 pieces. And that's if you've used a new being, you'll notice that sometimes 248 00:16:33,900 --> 00:16:37,200 Tess searching for searching for Tina. Can do several 249 00:16:38,100 --> 00:16:42,400 additional searches. Well, we're actually doing, we're going to the being 250 00:16:42,400 --> 00:16:46,300 index and then we're selecting the information from the 251 00:16:46,300 --> 00:16:50,800 web that's fresh that's comprehensive and we package it. 252 00:16:51,100 --> 00:16:55,700 And then we provide that information into the gbt model and the model reasons 253 00:16:55,800 --> 00:16:59,400 over it. So we're grounding the model with 254 00:16:59,400 --> 00:17:03,700 fresh accurate search information. That's why if you look at a lot of the 255 00:17:03,800 --> 00:17:07,700 The shootouts out there when people are comparing being with 256 00:17:08,600 --> 00:17:12,900 chat gbd or with barn, a lot of the times, you know, 257 00:17:12,900 --> 00:17:16,900 the conclusion is also, if you've got a recent query a 258 00:17:16,900 --> 00:17:20,500 fresh worry, you if you want more accurate results and 259 00:17:20,600 --> 00:17:24,700 and you know you want to reduce the risk of host 260 00:17:24,700 --> 00:17:28,700 nation. 
the new Bing does a really great job, and it's because of this search grounding process that is part of what we call the Prometheus model.

We've been talking about the user interface and the opportunities there. What about the business model? The business model of search has been advertising. What's your sense of how this is going to change? It's obviously a big challenge for anybody who suddenly has a whole new game. What's your thinking about how ads will fit into this, or whether they will fit in?

We think ads will play an important role. In fact, even today, if you use the new Bing and you go to the chat mode, you'll notice that we have references for the result, what we call the chat answer. If you hover on the references, you'll notice that sometimes we actually include ads in the list. And sometimes, for certain types of queries (for example, a hotel query), you will also see product ads, which are images with some text that are part of the conversation; we attach them at the bottom of the chat response.
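To make the grounding flow Jordi describes more concrete, here is a rough sketch, not Microsoft's actual code or API: retrieve fresh results from a search index, package them as numbered evidence, and only then ask the language model to answer with citations. All names here (`Snippet`, `build_grounded_prompt`, the example URLs) are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Snippet:
    """One retrieved search result: where it came from and what it says."""
    url: str
    text: str

def build_grounded_prompt(question: str, snippets: list) -> str:
    """Package retrieved search results as numbered evidence the model must cite."""
    lines = [
        "Answer the question using ONLY the numbered sources below.",
        "Cite sources inline as [1], [2], ...",
        "",
    ]
    for i, s in enumerate(snippets, start=1):
        lines.append(f"[{i}] {s.url}: {s.text}")
    lines += ["", f"Question: {question}", "Answer:"]
    return "\n".join(lines)

# Hypothetical evidence for the marathon example from the talk; a real
# system would pull these from the live search index, not hard-code them.
hits = [
    Snippet("https://example.com/seattle-marathon",
            "The Seattle course is hilly and often rainy."),
    Snippet("https://example.com/nyc-marathon",
            "The NYC course crosses five bridges and draws huge crowds."),
]
prompt = build_grounded_prompt(
    "I ran the Seattle Marathon; what should I do differently for New York?",
    hits,
)
```

The resulting `prompt` would then be sent to the model (e.g. something like `llm.complete(prompt)`, a hypothetical call); because the model is told to answer only from the packaged, fresh evidence, the approach addresses both problems Jordi names: the training-data cutoff and hallucination.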
So we already have these text ads embedded into the chat answer with the citations, plus the product ads, and so we definitely think ads will continue to play a role. But there might be some evolution as well in terms of the way we monetize. Like the plug-in architectures we were talking about, where companies can include their own mini-applications, if you like, that get connected to the large language model: that may lead to other ways of monetizing. But I think, at the heart of it, it's still going to be driven by advertising.

I want to come back to the plug-in discussion, but let's go a little further on this monetization model. Clearly, in the early web, search engines were still kind of doing the in-your-face display advertising, and then we got pay-per-click, which, for the better part of a decade, was pretty well aligned with user intent: users are searching for something, there's a parallel stream of ads, and people get paid only when
So both the query and the ad 302 00:20:11,800 --> 00:20:15,700 were were relevant to the user and in 303 00:20:15,700 --> 00:20:19,500 this you know scenario I do we have to 304 00:20:19,500 --> 00:20:23,800 reinvent add relevant so I think to make them useful to 305 00:20:23,800 --> 00:20:27,200 the user but there's also I guess a question 306 00:20:28,200 --> 00:20:32,700 yeah. What Larry and Sergey a Google originally called the problem of mixed motives when you have 307 00:20:32,700 --> 00:20:36,700 advertising because your your goal that your 308 00:20:36,700 --> 00:20:37,600 goal is to 309 00:20:37,700 --> 00:20:41,800 Get the user. The best result, you might come up with one 310 00:20:41,800 --> 00:20:45,500 thing. If you start injecting ads to have Ali, 311 00:20:45,500 --> 00:20:49,500 which are not the goals of the user. But the goals of the advertiser, 312 00:20:49,900 --> 00:20:52,600 how do you think about that in this AI context? 313 00:20:54,400 --> 00:20:58,900 Well, I'd seen search have always been a balance, right? I think 314 00:20:58,900 --> 00:21:02,800 it's well known that. Obviously, you used to components to 315 00:21:02,800 --> 00:21:06,800 decide whether to show an ad in search results. One is 316 00:21:07,200 --> 00:21:11,800 the relevance of the ad. And obviously, the more relevant, the add, the 317 00:21:11,800 --> 00:21:15,500 more clicks, the ad will get and the more 318 00:21:15,500 --> 00:21:19,900 conversion. So we'll provide to The Advertiser. But then there is also the bit. How 319 00:21:19,900 --> 00:21:22,600 much is The Advertiser bidding for the 320 00:21:23,200 --> 00:21:27,200 The words in the query and this, you know, 321 00:21:27,200 --> 00:21:31,600 competition sort to speak in the heating mechanism of 322 00:21:31,800 --> 00:21:35,900 and it creates the, you know what, we call the Marketplace, right? 
And so we definitely know that the more relevant the ad, the better it is for everyone, and that's usually the pivot for how we start when it comes to showing ads. The same will be true for chat, right? Ultimately, a chat response is composed of sentences that come from different references, or from those product ads I was mentioning. So if the user hovers on one of those annotations and sees that it is relevant (and basically we will have marked and labeled what's an ad and what's not, much like in regular search), then we'll see clicks on it, and, for the advertiser, the key is for those clicks to become conversions.

Yeah. I think the point that I would want to make is that many of the people in the audience currently do search advertising, social media marketing, and so on, and this is a new frontier which is going to have new rules. There's going to be a mutual process of discovery, by platforms and, basically, by the companies that are trying to use those platforms, about what
So there's I guess we're headed for a, I guess a period 344 00:23:01,900 --> 00:23:05,700 where people will need to be learning a lot about what's new and out 345 00:23:05,800 --> 00:23:08,100 out out goes that make sense to you. 346 00:23:10,100 --> 00:23:14,900 They'll definitely be song. Some changes in terms 347 00:23:14,900 --> 00:23:16,300 of of the 348 00:23:17,600 --> 00:23:21,600 The details. But if high level, I think that the 349 00:23:21,600 --> 00:23:25,800 concepts that I was mentioning that adds being 350 00:23:25,800 --> 00:23:29,500 relevant bits being taken into account the marketplace 351 00:23:29,500 --> 00:23:33,700 working. I think it's gonna operate with similar 352 00:23:33,700 --> 00:23:37,300 fundamentals. We get the same question about SEO. 353 00:23:37,700 --> 00:23:41,700 Yeah, we need to SEO differently but a lot of 354 00:23:41,700 --> 00:23:45,700 it is going to be similar because remember, 355 00:23:45,900 --> 00:23:46,600 one of the 356 00:23:46,900 --> 00:23:50,600 The key differences. Yeah. That the new being has say, Chachi PT 357 00:23:50,800 --> 00:23:54,400 is this grounding mechanism and the grounding mechanism is just from search 358 00:23:54,400 --> 00:23:58,900 results. And and ultimately is all David on the search results as 359 00:23:58,900 --> 00:24:02,600 well. You're right, that there will be some details 360 00:24:02,600 --> 00:24:06,800 that people will need to think through, you know, in terms of, you know how 361 00:24:06,800 --> 00:24:10,600 you SEO on your bid and so on, I think it's going to be more of an 362 00:24:10,600 --> 00:24:14,900 evolution right now and saying, Revolution, are you familiar with 363 00:24:14,900 --> 00:24:16,800 what adapt dot a is doing? 364 00:24:16,900 --> 00:24:17,100 Ting. 365 00:24:17,300 --> 00:24:21,500 I am not I mean, I heard a little bit. Yeah, 366 00:24:21,600 --> 00:24:25,900 nothing. 
I'm just so curious because they're 367 00:24:25,900 --> 00:24:29,500 envisioning building, you know, 368 00:24:29,500 --> 00:24:33,700 conversational interfaces to virtually every app. Now, of course, 369 00:24:33,700 --> 00:24:37,700 Microsoft has already announced they're going to build a lot of this technology into 370 00:24:37,700 --> 00:24:41,900 Office, for example, but I do wonder 371 00:24:41,900 --> 00:24:45,800 just how deep this, you 372 00:24:45,800 --> 00:24:47,300 know, change is going to be. 373 00:24:47,300 --> 00:24:51,500 You know, you think about the change from DOS to Windows, 374 00:24:52,000 --> 00:24:56,800 from, you know, Windows to the web, and now we're really... 375 00:24:56,900 --> 00:25:00,900 you know, and yeah, web to mobile 376 00:25:00,900 --> 00:25:04,800 was a change in, you know, the type of applications, but it 377 00:25:04,800 --> 00:25:08,800 didn't really change the UI that much. Yeah, it had to be 378 00:25:08,800 --> 00:25:12,100 more compact, but it was still, you know... 379 00:25:13,300 --> 00:25:15,600 it assumed touch screens and so on. 380 00:25:17,200 --> 00:25:21,500 But it was still very information-rich, and so here we have a whole new 381 00:25:21,500 --> 00:25:25,500 paradigm, and to the extent that it becomes not just about 382 00:25:25,500 --> 00:25:29,700 information but about, you know, being able to control 383 00:25:30,100 --> 00:25:34,900 operating systems, apps, whatever. How do you see that progressing, or is 384 00:25:34,900 --> 00:25:38,400 it just too early to tell? Well, 385 00:25:40,000 --> 00:25:44,800 definitely, I'd love to be able to go five years 386 00:25:44,800 --> 00:25:46,800 from now and see where we are. 387 00:25:46,900 --> 00:25:50,600 Honestly, we haven't seen 388 00:25:50,600 --> 00:25:54,800 anything yet of what's possible.
I think there is 389 00:25:54,800 --> 00:25:58,700 a lot of capability in these models 390 00:25:58,700 --> 00:26:02,600 that is going to really surprise us, 391 00:26:02,600 --> 00:26:06,900 you know, even years from now. I think people are even 392 00:26:06,900 --> 00:26:10,300 today using the capabilities in ways we did not anticipate, 393 00:26:10,300 --> 00:26:14,600 and it's really exciting. 394 00:26:14,600 --> 00:26:16,800 I think you gave several 395 00:26:16,900 --> 00:26:20,900 examples in terms of, you know, PC, web, you know, 396 00:26:20,900 --> 00:26:24,800 mobile. I think this is definitely another of those big steps 397 00:26:24,800 --> 00:26:28,300 where basically the UX, the way people 398 00:26:28,300 --> 00:26:32,700 experience products, is fundamentally going to 399 00:26:32,700 --> 00:26:36,200 change. It's going to be more natural, is what I'd say. 400 00:26:38,300 --> 00:26:42,700 It will be a lot easier to do essentially anything, whether it's... 401 00:26:42,800 --> 00:26:46,500 now, we talk a lot about writing and how the 402 00:26:47,000 --> 00:26:51,500 LLMs are helping you write essays, letters, right? 403 00:26:51,500 --> 00:26:55,700 They're out there helping in search, and that's why 404 00:26:55,800 --> 00:26:59,400 you see us at Microsoft integrating it beyond Bing into 405 00:26:59,400 --> 00:27:03,800 Office. But one of the things that is also 406 00:27:03,800 --> 00:27:07,800 impressive, and I unfortunately cannot say too much about it, is that we have 407 00:27:07,800 --> 00:27:11,800 all sorts of companies coming to Azure, where we're hosting the 408 00:27:11,800 --> 00:27:15,600 GPT models, and there are all sorts of startups or more 409 00:27:15,600 --> 00:27:16,600 mature companies 410 00:27:16,900 --> 00:27:20,600 that have all these different use cases, and 411 00:27:20,900 --> 00:27:24,700 they range from, you name it, anything 412 00:27:24,700 --> 00:27:28,300 that has to do with customer support to creating 413 00:27:28,400 --> 00:27:32,500 innovations that we hadn't even thought about. And so, definitely, it's going to be a new 414 00:27:32,500 --> 00:27:36,300 world. I think it's going to happen a lot 415 00:27:36,500 --> 00:27:40,900 faster than we think. Like, I've been reading articles... well, even, you 416 00:27:40,900 --> 00:27:44,900 know, back in the internet time, there 417 00:27:44,900 --> 00:27:46,600 was an article, I think 418 00:27:47,600 --> 00:27:51,900 in The New York Times, talking about how, well, these revolutions, you know, everybody thinks 419 00:27:51,900 --> 00:27:55,400 they're happening tomorrow, but look, even 420 00:27:56,000 --> 00:28:00,800 it took a long time for engines to replace horses, and they were giving all these 421 00:28:00,800 --> 00:28:04,800 different examples, right? But this is different because this is already on the 422 00:28:04,800 --> 00:28:08,800 web; the applications are really exploding as we speak. 423 00:28:09,600 --> 00:28:12,500 And so even if you just look at, 424 00:28:13,800 --> 00:28:16,400 more generically, how people have 425 00:28:17,200 --> 00:28:21,900 embraced ChatGPT, how quickly it reached 100 million 426 00:28:21,900 --> 00:28:25,900 users, right? If you look at how long it 427 00:28:25,900 --> 00:28:29,400 took Facebook to get to 100 million users, or 428 00:28:29,400 --> 00:28:33,800 Netflix, you know, you see these curves that are 429 00:28:33,800 --> 00:28:37,500 like this, and then the ChatGPT curve is like 430 00:28:37,700 --> 00:28:41,900 this, right? And so the same thing is happening with the new Bing; like, we're seeing, you 431 00:28:41,900 --> 00:28:45,900 know, a big spike in daily users. I think that's 432 00:28:46,100 --> 00:28:46,900 ultimately what's going on.
433 00:28:47,200 --> 00:28:51,700 And these tools are so useful and delightful that, 434 00:28:52,300 --> 00:28:56,800 you know, when people find out what they can do, they definitely want to be part of it. 435 00:28:56,800 --> 00:29:00,700 And so that's why I think it's definitely going to happen faster. 436 00:29:01,400 --> 00:29:05,600 Jordi, you mentioned that connection to Azure. And, 437 00:29:07,000 --> 00:29:11,900 you know, do you think this is going to be a boost for Azure, that, you know, you've 438 00:29:11,900 --> 00:29:15,300 got that as the hosting platform for 439 00:29:15,700 --> 00:29:16,900 both, you know, 440 00:29:17,300 --> 00:29:21,500 OpenAI and the new Bing, as 441 00:29:21,500 --> 00:29:25,600 well? Yeah, yeah, we definitely think so. Because 442 00:29:26,400 --> 00:29:30,800 obviously we've been working closely with OpenAI, and we're hosting all their 443 00:29:30,800 --> 00:29:34,700 models in Azure, and by scaling 444 00:29:34,700 --> 00:29:38,800 Bing, we've learned a lot on how to scale these models. We fixed a 445 00:29:38,800 --> 00:29:42,400 lot of bugs, a lot of problems, you know, how do you get 446 00:29:42,900 --> 00:29:46,900 tens of millions of users on it in a reliable way? 447 00:29:47,100 --> 00:29:50,900 And so definitely we think 448 00:29:52,100 --> 00:29:55,600 that we are ahead, and it gives us an opportunity to really 449 00:29:57,000 --> 00:30:01,600 be at the forefront of enabling all these other companies 450 00:30:02,300 --> 00:30:06,800 that are interested in leveraging these capabilities to come to Azure and 451 00:30:06,900 --> 00:30:10,500 basically, you know, be able to 452 00:30:10,700 --> 00:30:14,600 take advantage of the latest technology that is 453 00:30:14,800 --> 00:30:16,400 best tested and 454 00:30:17,000 --> 00:30:21,700 best run, given everything that we've done.
Do you think the OpenAI plugins will 455 00:30:22,200 --> 00:30:26,200 require Azure, or work better with Azure, or 456 00:30:28,100 --> 00:30:29,800 are there any dependencies? 457 00:30:33,300 --> 00:30:37,400 To be honest, I'm not sure about the plugins, what the restrictions are going to be. 458 00:30:39,800 --> 00:30:43,900 I do think that given that the GPT models themselves will be 459 00:30:43,900 --> 00:30:47,600 in Azure, there will be benefits if people build 460 00:30:47,900 --> 00:30:51,900 on Azure, and definitely I think that probably a 461 00:30:51,900 --> 00:30:55,800 lot of companies will start there because it'll just be simpler. 462 00:30:57,000 --> 00:31:01,800 At the same time, to be fully frank with you, I'm not fully sure. 463 00:31:02,200 --> 00:31:06,900 Okay, that's fine. Hey, you talk about how fast 464 00:31:06,900 --> 00:31:10,800 this is going to go, and you probably heard about the open 465 00:31:10,800 --> 00:31:14,500 letter by Elon Musk and others calling for an AI pause. 466 00:31:14,900 --> 00:31:16,200 What do you think of that? 467 00:31:18,300 --> 00:31:22,700 Well, I think it's good that there is a healthy debate about these technologies. 468 00:31:23,100 --> 00:31:27,600 I think, like any new technology, there are always 469 00:31:28,900 --> 00:31:32,600 questions, and positive uses of it or 470 00:31:32,800 --> 00:31:36,900 negative uses of it. But when I look at what we've done, and look 471 00:31:36,900 --> 00:31:40,500 at the user delight, that's really what I 472 00:31:40,500 --> 00:31:44,000 see. I mean, obviously you can always 473 00:31:44,800 --> 00:31:47,900 hypothesize a lot of different things about how the 474 00:31:48,000 --> 00:31:52,900 technologies can be used, but if you look at the millions of users that are using 475 00:31:52,900 --> 00:31:56,900 the products today, they are really using them to make their 476 00:31:56,900 --> 00:32:00,800 lives better, and they're finding that they are more productive, that they can 477 00:32:00,800 --> 00:32:04,600 really get inspired. And that's really what I think is the 478 00:32:04,600 --> 00:32:08,800 core of what's driving this 479 00:32:08,800 --> 00:32:12,600 revolution: the fact that people are finding the tools 480 00:32:12,600 --> 00:32:16,800 really useful, and at the end of the day, that's what I'm focused on. I 481 00:32:16,800 --> 00:32:17,800 want to make the new 482 00:32:18,000 --> 00:32:22,800 Bing as helpful and as delightful 483 00:32:22,800 --> 00:32:26,800 for users as it can be. And I think that at the end of the day, 484 00:32:27,200 --> 00:32:31,800 this is what's going to prevail. At the same time, we do need to be thoughtful, and we do 485 00:32:31,800 --> 00:32:35,600 make sure, not just with LLMs but with AI 486 00:32:35,600 --> 00:32:39,500 technology in general, and we've said this at Microsoft from the 487 00:32:39,500 --> 00:32:43,900 beginning, that there needs to be responsible use; there needs to be 488 00:32:44,100 --> 00:32:47,800 some regulation that, you know, makes 489 00:32:47,900 --> 00:32:51,700 sure that the use... much like with any other 490 00:32:52,100 --> 00:32:56,300 product, really, there's got to be regulation to make sure it's used the proper way. 491 00:32:56,600 --> 00:33:00,800 Yeah, my concern is there's been a lot of fear-mongering by 492 00:33:00,800 --> 00:33:04,600 people who bought into this idea of, you know, rogue 493 00:33:04,600 --> 00:33:07,900 AI, you know.
And this is, you know, 494 00:33:09,900 --> 00:33:13,900 just, you know, I think people have been talking about it like it's a nuclear 495 00:33:13,900 --> 00:33:17,800 bomb. My sense is we're nowhere near that point, you know. And 496 00:33:18,000 --> 00:33:22,700 they cite surveys of people afraid of very, very bad results, but this was a 497 00:33:22,700 --> 00:33:26,600 survey of people imagining artificial general intelligence in 498 00:33:26,600 --> 00:33:30,900 2059, you know, and yet it's presented as though we're at 499 00:33:30,900 --> 00:33:34,900 that point today, and I don't think we're there at all. I mean, these 500 00:33:34,900 --> 00:33:38,300 are tools that are invoked by 501 00:33:38,300 --> 00:33:42,800 humans, and the kinds of risks that we have to worry about are things 502 00:33:42,800 --> 00:33:46,800 like, you know, will they accelerate the ability 503 00:33:46,800 --> 00:33:47,800 to create misinformation; 504 00:33:47,900 --> 00:33:51,900 will they... you know, there will be a lot of frontiers 505 00:33:51,900 --> 00:33:52,200 of, 506 00:33:56,000 --> 00:33:59,800 you know, by which bad actors can, for example, generate, 507 00:34:00,900 --> 00:34:04,400 you know, deepfakes; it's becoming easier and easier. 508 00:34:05,600 --> 00:34:09,600 But those are not, you know, humanity- 509 00:34:09,600 --> 00:34:13,700 ending kinds of problems. They are problems 510 00:34:13,700 --> 00:34:17,500 like the ones that we've been facing on the internet for 511 00:34:18,000 --> 00:34:22,900 decades, magnified perhaps, but the fact is the countermeasures are also going 512 00:34:22,900 --> 00:34:24,300 to be magnified, and 513 00:34:24,900 --> 00:34:28,600 there may even be some opportunities where you go, wow, if people 514 00:34:28,600 --> 00:34:30,500 learn to not trust what they 515 00:34:32,700 --> 00:34:36,900 see and hear, that could be a good thing, you know. It's like you go,
516 00:34:36,900 --> 00:34:39,400 Oh wait, I really have to verify this. And 517 00:34:40,700 --> 00:34:44,900 anyway, that's... wow. Yeah, I know, that's a great 518 00:34:44,900 --> 00:34:48,100 reflection, and I think you're right. 519 00:34:49,200 --> 00:34:53,800 Obviously, the way we look at these tools, you 520 00:34:53,800 --> 00:34:54,600 defined it very well: 521 00:34:54,700 --> 00:34:58,300 they are accelerators. We know when people are using 522 00:34:58,600 --> 00:35:02,600 Copilot in GitHub that they basically get 523 00:35:02,700 --> 00:35:06,800 roughly a 2x, you know, productivity 524 00:35:06,800 --> 00:35:10,800 boost in terms of the amount of code that they can write, and the 525 00:35:10,800 --> 00:35:14,900 same is going to happen when you have these tools in Microsoft Word, 526 00:35:15,500 --> 00:35:19,900 PowerPoint. You are going to see how people can be basically a 527 00:35:19,900 --> 00:35:23,900 lot more productive than they are today. They'll be able to get answers 528 00:35:23,900 --> 00:35:24,600 to questions 529 00:35:24,900 --> 00:35:28,900 that they were never able to get before, and that's the 530 00:35:28,900 --> 00:35:32,900 beauty of the integration that we've done in the new Bing. And so I feel like, 531 00:35:32,900 --> 00:35:36,500 at the end of the day, that's what's going to be the core 532 00:35:36,800 --> 00:35:40,700 of the value that people will see. Unfortunately, there 533 00:35:40,700 --> 00:35:44,900 is a lot of fear because, you 534 00:35:44,900 --> 00:35:48,500 know, some of it is all these Hollywood movies of AI taking over the 535 00:35:48,500 --> 00:35:52,800 world, and we are far from an AGI. I mean, these are, 536 00:35:53,000 --> 00:35:54,600 these are tools. I mean, this is not, 537 00:35:54,700 --> 00:35:57,700 you know, the 538 00:35:58,600 --> 00:36:02,300 artificial general intelligence that people talk about.
No, 539 00:36:03,300 --> 00:36:07,600 no. I mean, there are so many predictions on when, and if, we'll get there, 540 00:36:07,600 --> 00:36:11,600 right? And so I think that there is, unfortunately, a lot of that. 541 00:36:11,600 --> 00:36:15,900 Now, there are potential problems that we need to be very aware 542 00:36:15,900 --> 00:36:19,800 of, and you mentioned a couple that I think, you know, are important 543 00:36:19,800 --> 00:36:23,000 problems. One is the ability to proliferate 544 00:36:23,000 --> 00:36:24,600 misinformation, which will 545 00:36:24,800 --> 00:36:28,800 be enhanced by these tools, and that's true. But you said something very 546 00:36:28,800 --> 00:36:32,600 important: our ability to protect from it will be enhanced 547 00:36:32,600 --> 00:36:36,800 too, and the reality is that we as search engines, we've 548 00:36:36,800 --> 00:36:40,800 dealt with spam, junk, misinformation 549 00:36:40,800 --> 00:36:44,700 for years, and we're constantly developing technologies 550 00:36:44,700 --> 00:36:48,500 to try to, you know, down- 551 00:36:48,500 --> 00:36:52,900 rank this type of information for queries. And so 552 00:36:53,100 --> 00:36:54,500 that will continue. 553 00:36:55,300 --> 00:36:59,500 The other one that you mentioned is deepfakes. Definitely, I think there should be 554 00:36:59,800 --> 00:37:03,600 some way to regulate, like, if 555 00:37:03,900 --> 00:37:07,900 somebody creates a fake video of you and I doing 556 00:37:07,900 --> 00:37:11,700 an interview saying crazy things, there should be 557 00:37:12,000 --> 00:37:16,700 some way to say, okay, well, this is not right, you know, and so 558 00:37:16,700 --> 00:37:20,100 exactly, obviously regulators need to step in. 559 00:37:21,200 --> 00:37:24,600 But at the same time, the ability to create videos 560 00:37:24,700 --> 00:37:28,600 now just from a prompt...
Or now, even in the new 561 00:37:28,600 --> 00:37:32,800 Bing, we've got the DALL-E OpenAI model, where you can 562 00:37:32,800 --> 00:37:36,600 type a few words and you can create images; 563 00:37:37,000 --> 00:37:41,800 that's really powerful. And we see users loving it and 564 00:37:41,800 --> 00:37:45,400 finding inspiration with it and being a lot more productive in their 565 00:37:45,400 --> 00:37:49,900 jobs with it. And so you've obviously got to have a balance, 566 00:37:49,900 --> 00:37:53,200 and we need to be thoughtful, but I also worry that the 567 00:37:53,200 --> 00:37:54,100 fear-mongering 568 00:37:54,700 --> 00:37:57,800 is exaggerated, and we need to, you know, 569 00:37:58,800 --> 00:38:02,800 be balanced and think through this, you know, with pros and cons. 570 00:38:02,800 --> 00:38:06,600 But I think at the end of the day, the pros will clearly 571 00:38:06,600 --> 00:38:10,600 outweigh the cons, will clearly outweigh the 572 00:38:10,600 --> 00:38:14,400 concerns. We're already seeing that. Yeah, a lot of people are 573 00:38:14,400 --> 00:38:18,800 speculating that there will be lots of job losses because 574 00:38:19,000 --> 00:38:22,900 it'll make certain kinds of jobs obsolete. What do you think about that? 575 00:38:24,100 --> 00:38:28,800 Well, I think that there will be some changes, and much like with 576 00:38:28,800 --> 00:38:32,300 any technology, some jobs might be 577 00:38:32,300 --> 00:38:36,200 lost; at the same time, others will be created. And 578 00:38:37,200 --> 00:38:41,900 I don't know if this is a great reference, but I 579 00:38:41,900 --> 00:38:44,600 do remember when Excel was 580 00:38:45,500 --> 00:38:49,400 created, a lot of accountants were thinking, oh my goodness, I'm going to lose my 581 00:38:49,400 --> 00:38:53,700 job. And it is true that today, one accountant with 582 00:38:53,900 --> 00:38:57,900 Excel can do what maybe 50 accountants were able 583 00:38:57,900 --> 00:39:01,700 to do, you know, maybe 50 years ago, right? But 584 00:39:01,700 --> 00:39:05,900 accountants have not lost their jobs. They have been a lot more 585 00:39:05,900 --> 00:39:09,800 productive. There are a lot of accountants out there, right? 586 00:39:09,800 --> 00:39:13,600 And so I kind of envision the same evolution 587 00:39:13,900 --> 00:39:17,700 with this technology, making everybody more productive. And 588 00:39:17,700 --> 00:39:21,900 yes, maybe there will be a given team in a company 589 00:39:21,900 --> 00:39:23,700 that will say, oh, wow, before we 590 00:39:23,800 --> 00:39:27,900 needed 10 people to do this job, maybe now we only need, you 591 00:39:27,900 --> 00:39:31,700 know, a couple, but there will be other 592 00:39:31,700 --> 00:39:35,700 jobs for those other folks, who will be able to leverage these 593 00:39:35,700 --> 00:39:39,500 LLMs and ultimately make society more productive overall. 594 00:39:39,800 --> 00:39:43,400 Yeah, I think there's a fundamental law that 595 00:39:43,400 --> 00:39:47,500 economists talk about, which is that when things become 596 00:39:47,500 --> 00:39:51,300 cheaper, people tend to use more of them. So, 597 00:39:51,800 --> 00:39:53,700 you know, all of a sudden there's a 598 00:39:53,800 --> 00:39:57,700 lot of capabilities that, yes, you know, 599 00:39:57,700 --> 00:40:01,900 we were paying people to do, and now maybe they'll be done, 600 00:40:02,200 --> 00:40:06,700 they will be done by an AI. But it's not like there's not enough work to do. 601 00:40:06,900 --> 00:40:10,700 Yeah. I mean, there's so much that needs doing that this becomes a 602 00:40:10,700 --> 00:40:14,800 superpower. I think companies that really understand this will 603 00:40:14,800 --> 00:40:18,400 tackle new problems and go,
Wow, we now have new 604 00:40:18,400 --> 00:40:22,800 capabilities that let us solve things that we couldn't afford to solve 605 00:40:22,800 --> 00:40:23,700 at all before, and 606 00:40:23,800 --> 00:40:27,800 now we can afford to, you know? 607 00:40:28,400 --> 00:40:32,900 Yeah, no, no, I think... anyway, I think 608 00:40:33,900 --> 00:40:37,700 your insight there is absolutely correct. 609 00:40:37,700 --> 00:40:41,500 And in fact, another angle of this is that these 610 00:40:41,500 --> 00:40:45,800 tools will be able to not only make you more productive 611 00:40:46,000 --> 00:40:50,800 but also save you time on doing things that are not that much fun, which is what 612 00:40:50,800 --> 00:40:53,700 Satya often talks about, like the drudgery 613 00:40:54,100 --> 00:40:58,700 of work, right? The less interesting and maybe less creative 614 00:40:58,700 --> 00:41:02,900 aspects of your job, you can easily just 615 00:41:03,100 --> 00:41:07,900 get done with these tools, and I'll give you an example. In fact, you know, a few 616 00:41:07,900 --> 00:41:11,700 months ago, obviously, we were talking a lot about the 617 00:41:12,100 --> 00:41:16,800 large language models, and my boss asked me to go to this 618 00:41:16,800 --> 00:41:20,900 offsite and write a document about it. And, you know, I was working 619 00:41:20,900 --> 00:41:23,600 really hard on the new Bing, and I was like, oh my goodness, 620 00:41:23,800 --> 00:41:27,900 I've got to do this. And it sounds like, you know, intuitively I could write it, I 621 00:41:27,900 --> 00:41:31,800 could take a few hours to do it, but I just went to the new Bing and I 622 00:41:31,800 --> 00:41:35,000 said, okay, well, tell me, you know, 623 00:41:35,700 --> 00:41:39,700 write a few paragraphs about what's a large language model. 624 00:41:40,100 --> 00:41:44,800 Now write a few paragraphs about the benefits. What are some of the 625 00:41:45,200 --> 00:41:49,700 challenges or drawbacks, right?
Literally, something that would probably have taken 626 00:41:49,700 --> 00:41:53,700 me, you know, three to four hours, in half an hour 627 00:41:53,900 --> 00:41:57,700 I was done with the paper. Of course, there was a section of that paper that 628 00:41:58,000 --> 00:42:02,800 I had to write myself; you know, it was more my own 629 00:42:02,800 --> 00:42:06,600 insights, so to speak, but a lot of the background, a lot of 630 00:42:06,600 --> 00:42:10,900 the information that's out there, you know, I was able to get 631 00:42:10,900 --> 00:42:14,600 that done, you know, super fast. And what was interesting is that I 632 00:42:14,600 --> 00:42:18,400 posted this document very late at night, 633 00:42:19,800 --> 00:42:23,600 and my boss was like, wow, that was very quick, 634 00:42:23,800 --> 00:42:27,900 Jordi, and the document is very well written. Well, 635 00:42:33,000 --> 00:42:37,600 I think we're out of time, and it's just wonderful to 636 00:42:37,600 --> 00:42:41,900 hear that optimism, that this is an accelerator 637 00:42:41,900 --> 00:42:45,800 for productivity, and 638 00:42:45,800 --> 00:42:49,900 that the benefits so far outweigh 639 00:42:49,900 --> 00:42:53,700 the risks. I'm with you on that. Thank 640 00:42:53,800 --> 00:42:54,600 you very much. 641 00:42:56,100 --> 00:42:56,700 Thank you.