1 00:00:05,966 --> 00:00:08,343 (mysterious music) 2 00:00:08,385 --> 00:00:11,805 - [Kal] I'm here on Wall Street where it all started. 3 00:00:13,766 --> 00:00:17,102 On May 6th, 2010 at 2:45 PM, 4 00:00:17,144 --> 00:00:21,273 one guy sitting in his parents' house way over in London 5 00:00:21,315 --> 00:00:22,858 unleashed an algorithm 6 00:00:22,900 --> 00:00:24,777 back here on the New York Stock Exchange 7 00:00:24,818 --> 00:00:27,029 with a billion-dollar sell order. 8 00:00:28,071 --> 00:00:30,449 The plan was to spoof the market 9 00:00:30,491 --> 00:00:32,576 and cancel the order before it was completed. 10 00:00:32,618 --> 00:00:34,453 But before that could happen, 11 00:00:34,495 --> 00:00:37,164 the rest of the world's automated stock trading bots 12 00:00:37,206 --> 00:00:38,957 also started selling, 13 00:00:38,999 --> 00:00:40,834 setting off a chain reaction 14 00:00:40,876 --> 00:00:42,419 that cratered the market. 15 00:00:44,463 --> 00:00:47,508 - The Dow Jones Industrial dropped a thousand points 16 00:00:47,549 --> 00:00:49,343 in just 15 minutes. 17 00:00:49,384 --> 00:00:50,969 - [Kal] That's nearly as big as the drop 18 00:00:51,011 --> 00:00:52,471 that kicked off the Great Depression. 19 00:00:52,513 --> 00:00:54,056 - [Male] What the heck is going on down here? 20 00:00:54,097 --> 00:00:56,558 I don't know, there is fear in this market. 21 00:00:56,600 --> 00:01:00,979 - Cancel the cruise, switch your kids to public school. 22 00:01:01,021 --> 00:01:04,691 It's a flash crash, people. Well, fuck. 23 00:01:04,733 --> 00:01:08,195 36 minutes later, the market rebounded, 24 00:01:08,237 --> 00:01:11,156 but the flash crash of 2010 marked the first time 25 00:01:11,198 --> 00:01:16,036 humans got a visceral, first-hand, anus-puckering glimpse 26 00:01:16,078 --> 00:01:20,082 of how AI was going to take over our financial system. 27 00:01:20,123 --> 00:01:22,167 And you should see where it is now. 28 00:01:24,086 --> 00:01:26,880 It's why I'm here in India foraging for wild honey, 29 00:01:26,922 --> 00:01:29,258 but more on that later. 30 00:01:29,299 --> 00:01:31,885 To understand how AI found this place or why, 31 00:01:31,927 --> 00:01:34,471 you've got to first understand what AI is. 32 00:01:34,513 --> 00:01:37,808 And if you think you already know, I bet you're wrong. 33 00:01:37,850 --> 00:01:41,895 Whether you like it or not, you're all connected by money. 34 00:01:41,937 --> 00:01:44,731 I'm Kal Penn, exploring this giant beast 35 00:01:44,773 --> 00:01:48,277 that is the global economy. 36 00:01:49,528 --> 00:01:51,780 (tense music) 37 00:01:51,822 --> 00:01:54,741 So what is AI exactly? 38 00:01:54,783 --> 00:01:58,370 I'm in San Francisco to first establish what it's not, 39 00:01:58,412 --> 00:02:02,457 which is pretty much everything science fiction told you, 40 00:02:02,499 --> 00:02:06,503 especially giant freaking robots. 41 00:02:08,130 --> 00:02:10,173 (rock music) 42 00:02:23,812 --> 00:02:26,064 Whoa! 43 00:02:26,106 --> 00:02:29,693 Oh, my God. 44 00:02:29,735 --> 00:02:31,778 I'm here with Julia Bossmann, 45 00:02:31,820 --> 00:02:33,780 who serves on the World Economic Forum's 46 00:02:33,822 --> 00:02:35,949 Artificial Intelligence Council, 47 00:02:35,991 --> 00:02:40,287 advising world leaders on how to harness AI's potential.
48 00:02:40,329 --> 00:02:42,414 The job comes with perks 49 00:02:42,456 --> 00:02:44,541 even better than watching cars get destroyed, 50 00:02:44,583 --> 00:02:47,252 like selfies with Canada's number-one sex symbol 51 00:02:47,294 --> 00:02:48,837 not named Drake. 52 00:02:51,256 --> 00:02:52,883 Oh, damn! 53 00:02:52,925 --> 00:02:55,052 - So how are you gonna get home after this now? 54 00:02:55,093 --> 00:02:56,720 - I still think it'll drive. 55 00:03:00,474 --> 00:03:02,643 - [Kal] We're meeting at a company called MegaBots 56 00:03:02,684 --> 00:03:05,228 which built giant robots to fight other robots. 57 00:03:06,897 --> 00:03:09,691 Sort of like an even nerdier Medieval Times. 58 00:03:13,528 --> 00:03:15,155 According to Julia, 59 00:03:15,197 --> 00:03:17,616 these robots are not just fun theme park attractions. 60 00:03:17,658 --> 00:03:19,952 They are technological dinosaurs 61 00:03:19,993 --> 00:03:22,204 because of one important distinction. 62 00:03:23,914 --> 00:03:25,415 - [Julia] In these robots, we are the brains, 63 00:03:25,457 --> 00:03:27,584 but AI is the artificial brains. 64 00:03:27,626 --> 00:03:29,711 - [Kal] Interesting, can you expand on that? 65 00:03:29,753 --> 00:03:31,546 - We are basically making 66 00:03:31,588 --> 00:03:34,132 computers now that can learn things on their own. 67 00:03:34,174 --> 00:03:37,010 And they don't necessarily need to have bodies. 68 00:03:37,052 --> 00:03:39,638 So a lot of the AI that we've built already 69 00:03:39,680 --> 00:03:41,807 lives in giant data centers. 70 00:03:41,848 --> 00:03:44,643 - [Kal] So if you had to explain it 71 00:03:44,685 --> 00:03:46,311 to somebody who was 13 years old, 72 00:03:46,353 --> 00:03:47,562 how would you explain AI? 73 00:03:47,604 --> 00:03:49,940 - I think a very general definition of it 74 00:03:49,982 --> 00:03:52,901 could be that AI is... 75 00:03:52,943 --> 00:03:57,447 making machines do things that we didn't explicitly program them to do. 76 00:03:57,489 --> 00:04:00,450 So in traditional programming you have, you know, 77 00:04:00,492 --> 00:04:03,996 your set of rules and the algorithm and you know if this, then that, 78 00:04:04,037 --> 00:04:07,165 and it's all laid out by the humans that program it. 79 00:04:07,207 --> 00:04:09,084 If you look at the medical literature, 80 00:04:09,126 --> 00:04:11,461 one database contains millions of entries 81 00:04:11,503 --> 00:04:14,673 and no doctor could read all these research papers 82 00:04:14,715 --> 00:04:17,551 to stay up with the current field, but a machine could. 83 00:04:17,592 --> 00:04:19,678 So you can imagine a machine coming up with new ideas 84 00:04:19,720 --> 00:04:21,555 on how to solve problems or 85 00:04:21,596 --> 00:04:24,349 discovering new drugs for curing diseases. 86 00:04:24,391 --> 00:04:25,559 - Wow, okay. 87 00:04:25,600 --> 00:04:28,270 - The field in artificial intelligence 88 00:04:28,311 --> 00:04:30,439 that is generating the most excitement right now 89 00:04:30,480 --> 00:04:31,940 is called deep learning. 90 00:04:31,982 --> 00:04:33,567 - [Kal] What is deep learning? 91 00:04:33,608 --> 00:04:34,901 - Deep learning is 92 00:04:34,943 --> 00:04:37,279 when we have several deep layers 93 00:04:37,320 --> 00:04:38,905 of these neural networks 94 00:04:38,947 --> 00:04:42,242 that are similar to what we have in our brains.
95 00:04:42,284 --> 00:04:44,036 So in our heads, we have all these neurons 96 00:04:44,077 --> 00:04:46,747 that are connected to each other and exchange information, 97 00:04:46,788 --> 00:04:50,751 and in a way, we are simulating this in machines. 98 00:04:50,792 --> 00:04:55,714 We feed it data in a way that is balanced and unbiased, so that it also learns. 99 00:04:55,756 --> 00:04:58,717 For example, in image recognition, we tell them 100 00:04:58,759 --> 00:05:01,303 these are images of cats, these are images of dogs, 101 00:05:01,344 --> 00:05:03,388 and then they just start churning through 102 00:05:03,430 --> 00:05:05,140 the images and learn by themselves 103 00:05:05,182 --> 00:05:06,933 how to recognize them so we don't have to 104 00:05:06,975 --> 00:05:09,770 program every single bit into that. 105 00:05:09,811 --> 00:05:12,439 - [Kal] Interesting. - And there's machine learning 106 00:05:12,481 --> 00:05:14,441 that is not deep learning. 107 00:05:14,483 --> 00:05:16,443 There are evolutionary algorithms 108 00:05:16,485 --> 00:05:19,404 where we basically use a principle from evolution 109 00:05:19,446 --> 00:05:23,992 and let the machine try out different instances, 110 00:05:24,034 --> 00:05:26,703 and then we see which one works best. 111 00:05:26,745 --> 00:05:28,997 And then the ones that work best 112 00:05:29,039 --> 00:05:30,624 get to go to the next generation. 113 00:05:30,665 --> 00:05:32,709 Just like organisms evolve, 114 00:05:32,751 --> 00:05:35,045 we use that principle where the fittest programs 115 00:05:35,087 --> 00:05:36,671 and the best programs survive. 116 00:05:36,713 --> 00:05:38,423 - Wow, okay. 117 00:05:38,465 --> 00:05:40,550 - So those are evolutionary algorithms? - Mm-hmm. 118 00:05:40,592 --> 00:05:44,346 - And what is the financial interest in exploring AI? 119 00:05:44,387 --> 00:05:47,390 It must have an incredible potential impact on the economy. 120 00:05:47,432 --> 00:05:50,477 - Yes, I think it's going to radically change the economy, 121 00:05:50,519 --> 00:05:52,938 and we're talking about a new industrial revolution here. 122 00:05:52,979 --> 00:05:55,649 - [Kal] Interesting. - It has been called our last invention 123 00:05:55,690 --> 00:05:58,610 because once we have an artificial brain that is smarter than us, 124 00:05:58,652 --> 00:06:00,737 it can then invent more things for us. 125 00:06:00,779 --> 00:06:02,364 - Is it called the last invention 126 00:06:02,405 --> 00:06:04,241 'cause it's gonna kill us all? 127 00:06:04,282 --> 00:06:06,076 - Hopefully not. 128 00:06:08,662 --> 00:06:10,539 - Many are afraid artificial intelligence 129 00:06:10,580 --> 00:06:13,166 is going to become too smart and kill us all, 130 00:06:13,208 --> 00:06:14,876 but don't worry. 131 00:06:14,918 --> 00:06:16,795 One of the reasons AI is so smart 132 00:06:16,837 --> 00:06:19,297 is because it's dumb as hell. 133 00:06:19,339 --> 00:06:21,341 Come here, AI. 134 00:06:21,383 --> 00:06:24,553 Imagine you asked AI to find the perfect recipe for a cake 135 00:06:24,594 --> 00:06:27,806 using evolutionary algorithms. 136 00:06:27,848 --> 00:06:30,392 AI wouldn't try to think about the best way to make it, 137 00:06:30,433 --> 00:06:33,270 it would just try it billions of times 138 00:06:33,311 --> 00:06:35,689 with every ingredient in the kitchen 139 00:06:35,730 --> 00:06:38,608 in the dumbest possible ways. 140 00:06:38,650 --> 00:06:40,527 Most, of course, will be doomed to failure. 141 00:06:42,362 --> 00:06:45,574 This one for sure. Nice try, idiot.
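To make Julia's trial-and-error description concrete, here is a minimal, hypothetical sketch of an evolutionary algorithm in Python. The ingredient pool, the made-up "taste" score, and the cake framing are invented for illustration; this is not any system shown in the episode.

```python
import random

# Invented ingredient pool and a hidden "ideal" mix the algorithm must discover.
INGREDIENTS = ["flour", "sugar", "eggs", "butter", "cashews", "salt", "kale", "anchovies"]
TARGET = {"flour", "sugar", "cashews", "butter"}  # pretend this combination tastes best

def fitness(recipe):
    # Score a recipe by how close it is to the ideal mix (unknown to the algorithm).
    return len(set(recipe) & TARGET) - len(set(recipe) - TARGET)

def mutate(recipe):
    # The "dumbest possible" change: randomly swap one ingredient.
    new = recipe[:]
    new[random.randrange(len(new))] = random.choice(INGREDIENTS)
    return new

# Start with random recipes, then let the fittest survive and breed, generation after generation.
population = [[random.choice(INGREDIENTS) for _ in range(4)] for _ in range(50)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]  # "the fittest programs and the best programs survive"
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

best = max(population, key=fitness)
print(sorted(set(best)))  # typically converges toward butter, cashews, flour, sugar
```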
142 00:06:45,615 --> 00:06:47,951 Failure doesn't hurt AI's feelings. 143 00:06:47,993 --> 00:06:49,703 It doesn't have any. 144 00:06:49,744 --> 00:06:52,664 The great part about evolutionary algorithms 145 00:06:52,706 --> 00:06:55,876 is that by trying all these seemingly stupid methods 146 00:06:55,917 --> 00:06:57,878 it might stumble upon a solution 147 00:06:57,919 --> 00:07:00,964 to a culinary problem no rational human would try to solve, 148 00:07:01,006 --> 00:07:03,800 like making a superior vegan cake. 149 00:07:03,842 --> 00:07:07,512 - I made a cake. - Way to go, AI. 150 00:07:07,554 --> 00:07:11,766 - It's mostly cashews. - Would you have thought to use cashews? 151 00:07:11,808 --> 00:07:14,644 Of course not, that would be stupid 152 00:07:14,686 --> 00:07:18,231 which AI is willing to be so you don't have to. 153 00:07:18,273 --> 00:07:20,817 Will this moron evolve into something so smart 154 00:07:20,859 --> 00:07:23,737 it could dominate the world and kill us all? 155 00:07:23,778 --> 00:07:25,238 Hard to say for sure. 156 00:07:25,280 --> 00:07:27,240 - I'm learning launch codes. 157 00:07:27,282 --> 00:07:30,785 - But in the meantime, have some cake. 158 00:07:33,496 --> 00:07:35,207 (upbeat music) 159 00:07:35,248 --> 00:07:37,292 - [Kal] Even though I'm still worried 160 00:07:37,334 --> 00:07:40,962 that AI is going to cook up more problems than it solves, 161 00:07:41,004 --> 00:07:43,715 experts agree it will boost productivity 162 00:07:43,757 --> 00:07:47,719 in areas like healthcare, transportation and finance, 163 00:07:47,761 --> 00:07:51,890 adding $15.7 trillion to global GDP by 2030. 164 00:07:51,932 --> 00:07:54,267 That's more than the current output 165 00:07:54,309 --> 00:07:56,353 of China and India combined. 166 00:07:59,814 --> 00:08:02,901 So how big of a deal is AI? 167 00:08:02,943 --> 00:08:04,861 - From my perspective, 168 00:08:04,903 --> 00:08:06,446 it's one of the three big deals in human history. 169 00:08:06,488 --> 00:08:08,490 - Human history? - Absolute human history. 170 00:08:08,531 --> 00:08:11,660 - [Kal] I'm here with Andrew McAfee, 171 00:08:11,701 --> 00:08:13,328 one of the world's leading experts 172 00:08:13,370 --> 00:08:15,664 on how new technology transforms economies 173 00:08:15,705 --> 00:08:19,501 and, subsequently, the entirety of human society. 174 00:08:19,542 --> 00:08:22,295 - If you want to graph human history, 175 00:08:22,337 --> 00:08:24,339 what you learn is that for thousands of years, 176 00:08:24,381 --> 00:08:26,675 absolutely nothing happened, we were just flatlining. 177 00:08:26,716 --> 00:08:30,095 It was almost indistinguishable from being dead. 178 00:08:30,136 --> 00:08:32,180 And then, all of a sudden, at one point in time, 179 00:08:32,222 --> 00:08:33,890 that graph of human history, 180 00:08:33,932 --> 00:08:35,350 it doesn't matter what you're looking at, 181 00:08:35,392 --> 00:08:37,060 went from boring horizontal 182 00:08:37,102 --> 00:08:39,479 to crazy vertical, kind of, in the blink of an eye. 183 00:08:39,521 --> 00:08:40,855 And it happened right around 1800, 184 00:08:40,897 --> 00:08:43,483 because of, first of all, steam power 185 00:08:43,525 --> 00:08:45,360 and then, second of all, electricity. 186 00:08:47,612 --> 00:08:50,782 So electricity did some pretty obvious things, right? 187 00:08:50,824 --> 00:08:52,158 It gave us trolleys, 188 00:08:52,200 --> 00:08:53,618 it gave us subways. 
189 00:08:53,660 --> 00:08:55,495 Less obvious, it gave us vertical cities 190 00:08:55,537 --> 00:08:56,871 instead of horizontal ones. 191 00:08:56,913 --> 00:08:58,373 - Electricity did? - Absolutely. 192 00:08:58,415 --> 00:09:00,208 - You need elevators. - Oh, elevators, okay. 193 00:09:00,250 --> 00:09:02,210 - You just don't have vertical cities without that. 194 00:09:02,252 --> 00:09:03,878 You can't climb up 80 flights of stairs every day. 195 00:09:03,920 --> 00:09:06,423 So these two industrial revolutions of steam 196 00:09:06,464 --> 00:09:07,882 and then the one-two punch of electricity 197 00:09:07,924 --> 00:09:09,259 and internal combustion, 198 00:09:09,301 --> 00:09:10,760 literally changed human history. 199 00:09:10,802 --> 00:09:12,137 There's no other way to look at it. 200 00:09:12,178 --> 00:09:14,097 And these were all technologies 201 00:09:14,139 --> 00:09:17,142 that let us overcome the limitations of our muscles. 202 00:09:17,183 --> 00:09:18,685 What's going on with AI 203 00:09:18,727 --> 00:09:20,353 is that we are overcoming limitations 204 00:09:20,395 --> 00:09:22,522 of our individual brains, of our mental power. 205 00:09:24,649 --> 00:09:26,484 We have actual tough challenges to work on, 206 00:09:26,526 --> 00:09:28,194 sincerely tough challenges, right? 207 00:09:28,236 --> 00:09:29,362 We should cure cancer, 208 00:09:29,404 --> 00:09:31,448 we should feed more people, 209 00:09:31,489 --> 00:09:33,742 we should stop cooking the planet in the 21st century. 210 00:09:33,783 --> 00:09:35,910 These are just insanely complicated things. 211 00:09:35,952 --> 00:09:38,455 And our brains chip away at that complexity, 212 00:09:38,496 --> 00:09:39,998 and we do it with science, 213 00:09:40,040 --> 00:09:41,708 and we do it with accumulating knowledge, 214 00:09:41,750 --> 00:09:43,460 but the complexity is just overwhelming. 215 00:09:43,501 --> 00:09:45,503 The way I think about AI 216 00:09:45,545 --> 00:09:48,048 is that we actually have a really powerful new colleague 217 00:09:48,089 --> 00:09:51,384 to help us make inroads into that crazy complexity, 218 00:09:51,426 --> 00:09:55,096 because what these new technologies are so good at 219 00:09:55,138 --> 00:09:57,390 is seeing even really subtle patterns 220 00:09:57,432 --> 00:09:59,434 in overwhelmingly huge amounts of data, 221 00:09:59,476 --> 00:10:00,977 more than you and I can take in. 222 00:10:01,019 --> 00:10:03,563 One of the craziest examples I heard recently 223 00:10:03,605 --> 00:10:05,982 was in finance and the rise of, 224 00:10:06,024 --> 00:10:08,777 they're called robo-advisors, 225 00:10:08,818 --> 00:10:11,363 which is just an algorithm that puts your investment portfolio together. 226 00:10:11,404 --> 00:10:13,031 Up until now, 227 00:10:13,073 --> 00:10:15,241 you had to have a certain level of affluence 228 00:10:15,283 --> 00:10:18,953 to even get in the office of financial planners and advisors. 229 00:10:18,995 --> 00:10:21,414 That's changing really quickly. 230 00:10:21,456 --> 00:10:23,666 With things like robo-advising, people who have less wealth 231 00:10:23,708 --> 00:10:25,543 and less wealth and less wealth 232 00:10:25,585 --> 00:10:28,630 can get access to super powerful cutting-edge tools 233 00:10:28,671 --> 00:10:31,007 to improve their financial lives.
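McAfee describes a robo-advisor as "just an algorithm that puts your investment portfolio together." As a rough, hypothetical illustration of that idea only, here is a toy rule-based allocator in Python; the age-based rule of thumb, the asset split, and the numbers are generic textbook heuristics chosen for the example, not any real product's logic.

```python
def robo_allocate(age, risk_tolerance):
    """Toy robo-advisor: split a portfolio across stocks, bonds, and cash.

    Uses the classic "110 minus your age" rule of thumb, nudged by a
    self-reported risk tolerance between 0.0 (cautious) and 1.0 (aggressive).
    Purely illustrative; real robo-advisors model far more than two inputs.
    """
    stocks = max(0, min(100, (110 - age) + int(20 * (risk_tolerance - 0.5))))
    bonds = int((100 - stocks) * 0.8)
    cash = 100 - stocks - bonds
    return {"stocks %": stocks, "bonds %": bonds, "cash %": cash}

print(robo_allocate(age=30, risk_tolerance=0.7))  # heavier on stocks
print(robo_allocate(age=65, risk_tolerance=0.3))  # more bonds and cash
```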
234 00:10:31,049 --> 00:10:32,842 - That's exciting, especially, 235 00:10:32,884 --> 00:10:34,886 because it seems like we've always had people 236 00:10:34,928 --> 00:10:37,305 who are willing to use this stuff to do harm. 237 00:10:37,347 --> 00:10:39,474 - I'm not saying that there's nothing to worry about. 238 00:10:39,516 --> 00:10:42,602 And what we know from the previous industrial revolutions 239 00:10:42,644 --> 00:10:44,687 is they brought some bad stuff along with them. 240 00:10:44,729 --> 00:10:47,440 We absolutely mechanized warfare 241 00:10:47,482 --> 00:10:49,234 with all of these industrial technologies. 242 00:10:49,275 --> 00:10:52,487 We absolutely polluted the hell out of the environment. 243 00:10:52,529 --> 00:10:54,280 We made really serious mistakes 244 00:10:54,322 --> 00:10:56,199 like large-scale child labor 245 00:10:56,241 --> 00:10:57,867 because of the industrial revolution. 246 00:10:57,909 --> 00:10:59,202 So it's not all perfect, 247 00:10:59,244 --> 00:11:01,037 not at all points in time, 248 00:11:01,079 --> 00:11:03,957 and the same thing's gonna happen this time around. 249 00:11:03,998 --> 00:11:05,333 - Damn. 250 00:11:06,543 --> 00:11:08,169 Now McAfee's got me thinking, 251 00:11:08,211 --> 00:11:09,963 what if history repeats itself 252 00:11:10,004 --> 00:11:14,300 and AI reshapes society in all the wrong ways... 253 00:11:14,342 --> 00:11:15,927 like rampant pollution 254 00:11:15,969 --> 00:11:19,681 and tiny, sooty coworkers? 255 00:11:19,722 --> 00:11:21,558 What about the moral and ethical questions 256 00:11:21,599 --> 00:11:24,519 that come up with powerful new technologies? 257 00:11:24,561 --> 00:11:27,522 Walking around London's National Computing Museum, 258 00:11:27,564 --> 00:11:29,607 you see a lot of machines that were created 259 00:11:29,649 --> 00:11:33,361 to advance society, like the two-ton Harwell Dekatron, 260 00:11:33,403 --> 00:11:35,321 which was built to make calculations 261 00:11:35,363 --> 00:11:38,908 for Britain's scientific research program in the 1950s. 262 00:11:38,950 --> 00:11:40,994 But in the wrong hands, 263 00:11:41,035 --> 00:11:43,496 there's no telling how a new technology will be used. 264 00:11:43,538 --> 00:11:44,914 Could you watch porn on this computer? 265 00:11:44,956 --> 00:11:47,917 - (laughs) Well, you can turn them on 266 00:11:47,959 --> 00:11:49,127 to see very low-resolution porn. 267 00:11:49,169 --> 00:11:50,587 - Okay. 268 00:11:51,921 --> 00:11:54,549 I'm here with programmer Alan Zucconi 269 00:11:54,591 --> 00:11:57,260 who teaches at Goldsmiths College here in London. 270 00:11:57,302 --> 00:12:00,805 He's used tech to help create some revolutionary things 271 00:12:00,847 --> 00:12:05,226 like game controllers for people with impaired mobility. 272 00:12:05,268 --> 00:12:07,562 He says that one of the biggest moral quandaries 273 00:12:07,604 --> 00:12:10,190 in tech history is coming soon 274 00:12:10,231 --> 00:12:13,735 as AI begins to replicate so many human behaviors 275 00:12:13,776 --> 00:12:16,112 it can pass as one of us. 276 00:12:16,154 --> 00:12:17,989 What is this thing? 277 00:12:18,031 --> 00:12:21,534 - Basically it's one of the first computers ever built, 278 00:12:21,576 --> 00:12:24,162 and it was built by Alan Turing and his collaborators. 279 00:12:24,204 --> 00:12:27,040 This machine was one of the first computers 280 00:12:27,081 --> 00:12:28,958 that was able to decode the Enigma code 281 00:12:29,000 --> 00:12:30,710 that was designed by the Nazis.
282 00:12:30,752 --> 00:12:32,545 - Whoa. 283 00:12:32,587 --> 00:12:35,298 Alan Turing was the father of modern computer science 284 00:12:35,340 --> 00:12:37,509 and when he wasn't helping the allies win the war 285 00:12:37,550 --> 00:12:39,219 by breaking Nazi codes, 286 00:12:39,260 --> 00:12:41,137 he was philosophizing about something he called 287 00:12:41,179 --> 00:12:43,264 the Turing Test. 288 00:12:43,306 --> 00:12:46,017 - How can we tell apart a human from a machine? 289 00:12:46,059 --> 00:12:49,521 And if we can't tell the difference, 290 00:12:49,562 --> 00:12:53,191 then the machine passes what he called the "imitation game." 291 00:12:53,233 --> 00:12:56,069 The machine is trying to imitate the human behavior. 292 00:12:56,110 --> 00:12:58,988 Now this has been known as the Turing test, 293 00:12:59,030 --> 00:13:00,532 and this was one of the machines 294 00:13:00,573 --> 00:13:02,075 that hypothetically could have been used. 295 00:13:02,116 --> 00:13:04,244 - To take the Turing Test, 296 00:13:04,285 --> 00:13:05,954 a human would input questions into a machine 297 00:13:05,995 --> 00:13:08,873 while an outside observer assessed 298 00:13:08,915 --> 00:13:10,500 whether or not the responses coming back 299 00:13:10,542 --> 00:13:13,878 were from a human or a machine imitating a human. 300 00:13:13,920 --> 00:13:15,213 How old are you? 301 00:13:17,090 --> 00:13:18,716 - [Kal] There we go. - [Alan] Oh! 302 00:13:18,758 --> 00:13:20,468 - It knows how old it is. 303 00:13:20,510 --> 00:13:24,472 I was born in 1912, so I'm 105 years old. 304 00:13:26,849 --> 00:13:28,851 Back in Turing's time, 305 00:13:28,893 --> 00:13:30,270 it was pretty easy to spot the computer 306 00:13:30,311 --> 00:13:34,232 but today, AI is able to study human behavior 307 00:13:34,274 --> 00:13:36,818 and program itself to act like us. 308 00:13:36,859 --> 00:13:39,737 Can you tell the difference between this? 309 00:13:39,779 --> 00:13:41,239 - Normally I begin these remarks 310 00:13:41,281 --> 00:13:43,575 with a joke about data science 311 00:13:43,616 --> 00:13:45,577 but about half the stuff my staff came up with 312 00:13:45,618 --> 00:13:47,412 - was below average. - [Kal] And this? 313 00:13:47,453 --> 00:13:49,122 - Our enemies can make it look 314 00:13:49,163 --> 00:13:51,874 like anyone is saying anything at any point in time. 315 00:13:51,916 --> 00:13:55,503 - [Kal] That second one was actually created by BuzzFeed 316 00:13:55,545 --> 00:13:57,880 along with actor Jordan Peele. 317 00:13:57,922 --> 00:13:59,966 And it got a lot of people concerned 318 00:14:00,008 --> 00:14:02,885 about a new AI form of fake news. 319 00:14:02,927 --> 00:14:04,887 - Moving forward we need to be more vigilant 320 00:14:04,929 --> 00:14:06,681 with what we trust from the internet. 321 00:14:06,723 --> 00:14:08,808 AI studied Peele's facial movements, 322 00:14:08,850 --> 00:14:12,687 then merged them and recreated them on Obama's face, 323 00:14:12,729 --> 00:14:15,732 creating a hybrid known as a deepfake. 324 00:14:15,773 --> 00:14:17,025 - You might have seen 325 00:14:17,066 --> 00:14:18,901 something similar, for example, in Snapchat 326 00:14:18,943 --> 00:14:21,237 there is a filter that allows to swap faces. 327 00:14:21,279 --> 00:14:24,782 The difference is that, that filter does it in a very simple way. 328 00:14:24,824 --> 00:14:26,993 But the technology behind deepfakes 329 00:14:27,035 --> 00:14:28,578 relies on artificial intelligence. 
330 00:14:28,620 --> 00:14:31,205 It comes from something called "deep learning." 331 00:14:31,247 --> 00:14:34,667 Artificial neural networks extract facial expression. 332 00:14:34,709 --> 00:14:37,211 It uses that expression to recreate your face. 333 00:14:37,253 --> 00:14:41,507 And this is how we manage to achieve photorealistic results. 334 00:14:41,549 --> 00:14:44,010 - Alan makes internet tutorials 335 00:14:44,052 --> 00:14:45,470 on how to make deepfakes, 336 00:14:45,511 --> 00:14:47,764 and he's a true believer that this technology 337 00:14:47,805 --> 00:14:50,808 should develop freely without restrictions, 338 00:14:50,850 --> 00:14:53,895 even if it could potentially start World War III. 339 00:14:53,936 --> 00:14:57,273 How is the consumer supposed to reasonably know 340 00:14:57,315 --> 00:14:59,651 what's reality and what's not reality? 341 00:14:59,692 --> 00:15:02,612 - As a consumer, when you approach news, 342 00:15:02,654 --> 00:15:05,448 whether it's an article, whether it's a video, whether it's a picture, 343 00:15:05,490 --> 00:15:07,909 everything that you see has been created by someone. 344 00:15:07,950 --> 00:15:09,327 "What is the narrative of what I'm seeing? 345 00:15:09,369 --> 00:15:11,329 What does this video want to tell me?" 346 00:15:11,371 --> 00:15:13,456 - So I can see-- 347 00:15:13,498 --> 00:15:15,375 - The danger. 348 00:15:15,416 --> 00:15:17,168 - The danger as well as just the curiosity of it. 349 00:15:17,210 --> 00:15:19,253 Is this actually gonna help people? 350 00:15:19,295 --> 00:15:21,839 Because I would imagine you have talked to people 351 00:15:21,881 --> 00:15:24,217 who look at how to grow the economy 352 00:15:24,258 --> 00:15:25,927 through this type of technology. 353 00:15:25,968 --> 00:15:28,554 What are some of the practical economic impacts of it? 354 00:15:28,596 --> 00:15:30,890 - I think that the first industry that will 355 00:15:30,932 --> 00:15:33,726 take advantage of that is the film industry. 356 00:15:33,768 --> 00:15:36,229 Simply because changing faces, 357 00:15:36,270 --> 00:15:38,731 it's something that we've been trying to do for 358 00:15:38,773 --> 00:15:41,067 decades in movies, and usually we use makeup, 359 00:15:41,109 --> 00:15:43,695 usually we use masks, sometimes we use CGI. 360 00:15:43,736 --> 00:15:45,655 - As an actor and as somebody who worked in politics, 361 00:15:45,697 --> 00:15:48,074 this freaks me out so much. 362 00:15:48,116 --> 00:15:49,117 - I totally also understand-- - And it should. 363 00:15:49,158 --> 00:15:50,785 - [Both] And it should. 364 00:15:50,827 --> 00:15:53,454 - BuzzFeed's deepfake revealed to the general public 365 00:15:53,496 --> 00:15:55,498 just how vulnerable we are. 366 00:15:55,540 --> 00:15:57,959 In a time when the president can open his mouth 367 00:15:58,000 --> 00:15:59,919 and move markets, 368 00:15:59,961 --> 00:16:02,338 a well-made deepfake could sink the global economy 369 00:16:02,380 --> 00:16:04,632 faster than the flash crash, 370 00:16:04,674 --> 00:16:06,092 obliterating your IRA 371 00:16:06,134 --> 00:16:08,136 in the time it takes fake Obama to say-- 372 00:16:08,177 --> 00:16:10,430 - Stay woke, bitches. 373 00:16:10,471 --> 00:16:12,640 - [Kal] Does any of this sound a little science fictiony, 374 00:16:12,682 --> 00:16:14,517 even a little scary?
375 00:16:14,559 --> 00:16:17,979 If AI grows powerful enough to know how we move, 376 00:16:18,020 --> 00:16:19,981 how we talk, and how we think, 377 00:16:20,022 --> 00:16:22,900 it may become indistinguishable from us. 378 00:16:24,527 --> 00:16:26,863 And if AI has its own consciousness, 379 00:16:26,904 --> 00:16:30,116 AI could also develop strong opinions about us, 380 00:16:30,158 --> 00:16:32,910 and they may not be positive. 381 00:16:35,037 --> 00:16:36,789 - And in the future, 382 00:16:36,831 --> 00:16:39,125 AI could develop a will of its own, 383 00:16:39,167 --> 00:16:42,003 a will that is in conflict with ours. 384 00:16:42,044 --> 00:16:44,464 The rise of powerful AI will be either 385 00:16:44,505 --> 00:16:46,382 the best or the worst thing 386 00:16:46,424 --> 00:16:48,801 ever to happen to humanity. 387 00:16:48,843 --> 00:16:51,929 - I tried to convince people to slow down, 388 00:16:51,971 --> 00:16:54,849 slow down AI, to regulate AI. 389 00:16:54,891 --> 00:16:56,851 This was futile. I tried for years. 390 00:16:56,893 --> 00:16:59,020 - Nobody listened. - This seems like a scene 391 00:16:59,061 --> 00:17:00,980 - in a movie where the robots-- - Nobody listened. 392 00:17:01,022 --> 00:17:02,940 - ...are gonna fucking take over and you're freaking me out. 393 00:17:05,651 --> 00:17:06,944 - How real is the threat 394 00:17:06,986 --> 00:17:09,030 of an AI-led doomsday scenario? 395 00:17:12,116 --> 00:17:15,036 To find out, I need to talk to the guy 396 00:17:15,077 --> 00:17:18,206 whose research got everyone freaked out in the first place. 397 00:17:18,247 --> 00:17:20,875 Okay, so I'm really excited to talk to you because-- 398 00:17:20,917 --> 00:17:22,710 Well, I'm excited to talk to you for a number of reasons, 399 00:17:22,752 --> 00:17:26,172 but we have been exploring artificial intelligence, 400 00:17:26,214 --> 00:17:28,841 trying to figure out what it is, where it's headed. 401 00:17:28,883 --> 00:17:33,888 You've influenced people like Elon Musk and Bill Gates, 402 00:17:36,140 --> 00:17:41,979 and that's a pretty amazing sheet of influence. 403 00:17:42,021 --> 00:17:45,608 I'm at Oxford University meeting Dr. Nick Bostrom 404 00:17:45,650 --> 00:17:49,111 and since he's not one to toot his own horn, I will. 405 00:17:49,153 --> 00:17:51,197 ♪ Rock and roll ♪ 406 00:17:51,239 --> 00:17:54,992 He's one of the foremost minds on machine superintelligence 407 00:17:55,034 --> 00:17:57,078 and its existential risks 408 00:17:57,119 --> 00:17:59,789 and the author of some great beach reads. 409 00:17:59,831 --> 00:18:01,541 I feel lucky to meet him, 410 00:18:01,582 --> 00:18:03,501 because Nick is so busy doing his own deep learning 411 00:18:03,543 --> 00:18:05,253 that he only carves out an hour a month 412 00:18:05,294 --> 00:18:07,421 for answering questions about his research. 413 00:18:10,258 --> 00:18:14,303 A lot of the conversations about AI 414 00:18:14,345 --> 00:18:17,014 are things like are the robots gonna take over 415 00:18:17,056 --> 00:18:18,850 and is that gonna be the end of humanity. 416 00:18:18,891 --> 00:18:21,102 I'm curious, if things are not managed properly, 417 00:18:21,143 --> 00:18:24,272 is there a scenario in which AI hurts society 418 00:18:24,313 --> 00:18:28,192 or even maybe eliminates humanity as we know it?
419 00:18:28,234 --> 00:18:29,569 - In the longer-term context, 420 00:18:29,610 --> 00:18:31,445 if we're thinking about what really happens 421 00:18:31,487 --> 00:18:33,072 if AI goes all the way, 422 00:18:33,114 --> 00:18:34,949 and becomes able to replicate 423 00:18:34,991 --> 00:18:38,035 the same general intelligence that makes us human, 424 00:18:38,077 --> 00:18:40,496 then, yeah, I do think that in that context 425 00:18:40,538 --> 00:18:44,041 there are bigger risks, including existential risks. 426 00:18:45,376 --> 00:18:47,378 I mean, if you think about 427 00:18:47,420 --> 00:18:49,213 something like self-driving cars 428 00:18:49,255 --> 00:18:51,549 could run over a pedestrian. 429 00:18:51,591 --> 00:18:52,967 There are privacy concerns. 430 00:18:53,009 --> 00:18:56,053 The militarization of these autonomous weapons. 431 00:18:57,388 --> 00:18:59,307 All of these are real concerns. 432 00:18:59,348 --> 00:19:00,975 But at some point there will also be 433 00:19:01,017 --> 00:19:02,768 the question of how we affect 434 00:19:02,810 --> 00:19:04,645 these digital minds that we're building. 435 00:19:04,687 --> 00:19:09,442 They themselves might obtain degrees of moral standing. 436 00:19:09,483 --> 00:19:12,653 And if you roll the tape forward and if you think, what ultimately 437 00:19:12,695 --> 00:19:15,031 is the fate of homo sapiens, 438 00:19:15,072 --> 00:19:18,242 the long-term future could be machine intelligence dominated. 439 00:19:18,284 --> 00:19:21,704 It's quite possible humanity can go extinct. 440 00:19:23,372 --> 00:19:25,958 Those great powers come with a risk 441 00:19:26,000 --> 00:19:28,586 that they will, by accident 442 00:19:28,628 --> 00:19:30,671 or by deliberate misuse, 443 00:19:30,713 --> 00:19:33,507 be used to cause immense destruction. 444 00:19:35,843 --> 00:19:37,511 So I think those are in the cards 445 00:19:37,553 --> 00:19:38,804 and if we're thinking about longer timeframes, 446 00:19:38,846 --> 00:19:40,264 you know, the outcome might be 447 00:19:40,306 --> 00:19:44,185 very, very good or very not good. 448 00:19:46,228 --> 00:19:49,106 - [Kal] Okay, these scenarios do sound scary. 449 00:19:49,148 --> 00:19:51,525 But out of all the potential outcomes, 450 00:19:51,567 --> 00:19:52,944 Nick actually believes 451 00:19:52,985 --> 00:19:55,196 the most likely doomsday scenario with AI 452 00:19:55,237 --> 00:19:56,822 will be economic. 453 00:19:56,864 --> 00:19:58,199 - If you think about it, 454 00:19:58,240 --> 00:20:00,201 technology in general really is 455 00:20:00,242 --> 00:20:02,453 the idea that we can do more with less. 456 00:20:02,495 --> 00:20:05,665 We can achieve more of what we want with less effort. 457 00:20:05,706 --> 00:20:09,335 The goal in that sense is full unemployment, right? 458 00:20:09,377 --> 00:20:12,338 To be able to have machines and technology 459 00:20:12,380 --> 00:20:14,882 do everything that needs to be done so we don't have to work. 460 00:20:14,924 --> 00:20:17,259 And I think that's like the desired end goal. 461 00:20:17,301 --> 00:20:19,428 It's not some horrible thing we need to try to prevent. 462 00:20:19,470 --> 00:20:20,846 It's what we want to realize. 463 00:20:20,888 --> 00:20:22,932 Now, to make that actually be a utopia, 464 00:20:22,974 --> 00:20:24,767 there are a couple of big challenges 465 00:20:24,809 --> 00:20:27,645 along the way that would need to be solved. 466 00:20:27,687 --> 00:20:30,606 One, of course, is the economic problem. 
467 00:20:30,648 --> 00:20:33,567 So one reason why people need jobs is they need income. 468 00:20:33,609 --> 00:20:35,611 If you can solve that economic problem, 469 00:20:35,653 --> 00:20:37,571 then I think there is a second big challenge that 470 00:20:37,613 --> 00:20:39,198 for many people it's also 471 00:20:39,240 --> 00:20:40,783 a sense of dignity. 472 00:20:40,825 --> 00:20:43,452 So many people tend to find their worth 473 00:20:43,494 --> 00:20:45,079 being a breadwinner 474 00:20:45,121 --> 00:20:47,581 or contributing to society, giving something back. 475 00:20:47,623 --> 00:20:49,291 Like but if a machine could do everything better than you could do, 476 00:20:49,333 --> 00:20:51,752 then you wouldn't have any chance 477 00:20:51,794 --> 00:20:53,796 to contribute anything, right? 478 00:20:53,838 --> 00:20:56,716 So then you would have to rethink culture at the fairly 479 00:20:56,757 --> 00:20:59,010 fundamental level, I think. 480 00:20:59,051 --> 00:21:00,803 - [Kal] A world where no one works? 481 00:21:00,845 --> 00:21:02,847 That doesn't sound so bad. 482 00:21:04,557 --> 00:21:06,475 I can see it now. 483 00:21:09,854 --> 00:21:11,772 Spending time with friends, 484 00:21:11,814 --> 00:21:14,650 mining the full extent of my human potential, 485 00:21:14,692 --> 00:21:16,861 not having to adjust the hot tub 486 00:21:16,902 --> 00:21:20,114 because it knows exactly how I like it. 487 00:21:20,156 --> 00:21:23,451 The problem is that's not how it's gone down historically. 488 00:21:23,492 --> 00:21:26,787 The rise of the machines has actually happened before, 489 00:21:26,829 --> 00:21:29,707 and last time it wasn't all strawberries and champagne 490 00:21:29,749 --> 00:21:31,417 in the hot tub. 491 00:21:31,459 --> 00:21:34,211 (upbeat music) 492 00:21:37,006 --> 00:21:39,175 I'm meeting with economist Nick Srnicek 493 00:21:39,216 --> 00:21:41,052 to find out what it really looked like 494 00:21:41,093 --> 00:21:43,179 the last time machines took our jobs. 495 00:21:43,220 --> 00:21:46,223 Oh, and we're meeting at a loom for some reason. 496 00:21:46,265 --> 00:21:48,017 So what are you gonna make? 497 00:21:48,059 --> 00:21:49,769 - I happen to be making a sort of anarchist flag, 498 00:21:49,810 --> 00:21:52,021 - actually. - Interesting, shocking. 499 00:21:53,898 --> 00:21:57,401 Nick has a PhD from the London School of Economics. 500 00:21:57,443 --> 00:22:00,404 I, on the other hand, do not. 501 00:22:01,447 --> 00:22:03,949 He also has a manifesto. 502 00:22:03,991 --> 00:22:07,661 It calls for everyone to hasten the coming age of automation 503 00:22:07,703 --> 00:22:09,747 by tearing down old institutions. 504 00:22:11,999 --> 00:22:15,503 Basically, dismantle capitalism now. 505 00:22:15,544 --> 00:22:17,296 Yeah, this is not gonna work for me. 506 00:22:17,338 --> 00:22:19,340 There's no way I can have this conversation with you, 507 00:22:19,381 --> 00:22:21,592 I'm sorry. Let me forget the loom. 508 00:22:21,634 --> 00:22:24,553 So why are we here? 509 00:22:24,595 --> 00:22:28,224 - Well, the loom is sort of like AI back in the 1800s. 510 00:22:28,265 --> 00:22:30,059 It was a new technology 511 00:22:30,101 --> 00:22:32,895 which was threatening a huge amount of jobs 512 00:22:32,937 --> 00:22:34,438 and basically it sparked off 513 00:22:34,480 --> 00:22:37,149 a number of different responses by workers, 514 00:22:37,191 --> 00:22:39,360 like the rise of the Luddites, for instance. 
515 00:22:39,401 --> 00:22:41,195 We use the term Luddite nowadays 516 00:22:41,237 --> 00:22:44,073 to often mean just anybody who hates technology, 517 00:22:44,115 --> 00:22:46,534 but that's not really the case. 518 00:22:46,575 --> 00:22:48,619 - [Kal] The Luddites were named after Ned Ludd, 519 00:22:48,661 --> 00:22:51,163 an apprentice in a textile factory 520 00:22:51,205 --> 00:22:53,582 who legend says was whipped for idleness 521 00:22:53,624 --> 00:22:56,669 and then was like, "Dude, I'm only idle 522 00:22:56,710 --> 00:22:59,338 'cause I'm being replaced by a fucking loom, okay?" 523 00:22:59,380 --> 00:23:02,883 And he became the first person to rage against the machine, 524 00:23:02,925 --> 00:23:06,011 inspiring a movement. 525 00:23:06,053 --> 00:23:09,140 - The Luddites took to breaking the machinery 526 00:23:09,181 --> 00:23:10,975 to save their jobs. 527 00:23:11,016 --> 00:23:12,852 So I think that's something that we see today with AI. 528 00:23:12,893 --> 00:23:16,272 People are similarly feeling threatened today. 529 00:23:16,313 --> 00:23:17,982 - Do you know how many jobs 530 00:23:18,023 --> 00:23:19,608 are projected to be lost or in need of replacement? 531 00:23:19,650 --> 00:23:21,694 - 47% of jobs in America 532 00:23:21,735 --> 00:23:25,281 are potentially automatable over the next two decades. 533 00:23:25,322 --> 00:23:26,574 - So it sounds like a real problem. 534 00:23:26,615 --> 00:23:28,159 - It could be a massive problem 535 00:23:28,200 --> 00:23:29,785 and the real issue is how do we make sure 536 00:23:29,827 --> 00:23:32,037 that five years, ten years down the line, 537 00:23:32,079 --> 00:23:34,582 people aren't just being left to starve and without homes? 538 00:23:34,623 --> 00:23:37,793 - So how do we do that? - Universal basic income. 539 00:23:37,835 --> 00:23:39,628 - [Kal] Universal basic income 540 00:23:39,670 --> 00:23:43,007 is the radical idea that everyone in society gets free cash, 541 00:23:43,048 --> 00:23:44,842 no strings attached. 542 00:23:44,884 --> 00:23:47,636 And it has some high-profile fans. 543 00:23:47,678 --> 00:23:48,971 - We should explore ideas 544 00:23:49,013 --> 00:23:50,639 like universal basic income 545 00:23:50,681 --> 00:23:52,266 to make sure that everyone has a cushion 546 00:23:52,308 --> 00:23:53,684 to try new ideas. 547 00:23:53,726 --> 00:23:54,727 - [Kal] Some countries 548 00:23:54,768 --> 00:23:56,562 and even cities within America 549 00:23:56,604 --> 00:23:58,022 have tried pilot programs 550 00:23:58,063 --> 00:24:00,774 with mixed results. 551 00:24:00,816 --> 00:24:02,860 - I think there's an amazing opportunity 552 00:24:02,902 --> 00:24:04,528 with these new technologies 553 00:24:04,570 --> 00:24:06,655 to really change the way that we organize society. 554 00:24:06,697 --> 00:24:09,783 You could move towards a more social democratic system. 555 00:24:09,825 --> 00:24:13,787 It doesn't have to be the sort of cutthroat system that America has, 556 00:24:13,829 --> 00:24:15,623 in that everybody can support each other. 557 00:24:15,664 --> 00:24:17,374 If people like myself 558 00:24:17,416 --> 00:24:19,627 can start putting out these positive visions, 559 00:24:19,668 --> 00:24:21,670 I think when the crisis really hits, 560 00:24:21,712 --> 00:24:23,380 we can start to be implementing those ideas. 561 00:24:23,422 --> 00:24:27,218 - [Kal] UBI used to be regarded as a fringe concept, 562 00:24:27,259 --> 00:24:31,472 mostly promoted by people who, like Nick, write manifestos. 
563 00:24:31,513 --> 00:24:34,016 But according to a 2017 Gallup poll, 564 00:24:34,058 --> 00:24:37,895 48% of Americans now support some form of UBI. 565 00:24:37,937 --> 00:24:40,064 But is a guaranteed paycheck 566 00:24:40,105 --> 00:24:42,149 enough to stop humans from rising up 567 00:24:42,191 --> 00:24:43,901 when robots come for our jobs? 568 00:24:45,611 --> 00:24:47,112 - What do we hate? 569 00:24:47,154 --> 00:24:48,739 - [Group] Artificial intelligence. 570 00:24:48,781 --> 00:24:50,366 - Why do we hate it? 571 00:24:50,407 --> 00:24:52,826 - It's forcing us to confront our weaknesses. 572 00:24:52,868 --> 00:24:55,204 - With that, I'd like to call to order this meeting 573 00:24:55,246 --> 00:24:57,498 of Luddites, the Local Union of Dudes 574 00:24:57,539 --> 00:24:59,250 Defying Intelligent Technology, 575 00:24:59,291 --> 00:25:01,752 Especially Social Media. 576 00:25:01,794 --> 00:25:03,545 First order of business, 577 00:25:03,587 --> 00:25:06,840 artificial intelligence is hollowing out the job market. 578 00:25:06,882 --> 00:25:09,802 Our middle class jobs are the first ones to go. 579 00:25:09,843 --> 00:25:11,804 People like us with these jobs 580 00:25:11,845 --> 00:25:15,140 will be pushed into low skill jobs at the bottom. 581 00:25:15,182 --> 00:25:16,850 - Why would that happen, Ed? 582 00:25:16,892 --> 00:25:20,104 - Apparently, AI's better at medium-skilled jobs 583 00:25:20,145 --> 00:25:22,106 like crunching numbers 584 00:25:22,147 --> 00:25:24,566 than it is at low-skilled jobs like sweeping the floor. 585 00:25:24,608 --> 00:25:26,694 So it'll leave those jobs to us. 586 00:25:26,735 --> 00:25:29,780 Now I ask you who here besides Bill 587 00:25:29,822 --> 00:25:31,490 looks like they should be sweeping a floor? 588 00:25:31,532 --> 00:25:33,409 No offense, Bill. 589 00:25:33,450 --> 00:25:36,161 And there will be less need for retail jobs. 590 00:25:36,203 --> 00:25:38,831 People can just go online and order exactly what they want 591 00:25:38,872 --> 00:25:41,166 because that son of a bitch AI 592 00:25:41,208 --> 00:25:44,378 solved the searching and matching problem. 593 00:25:44,420 --> 00:25:45,671 Searching for customers 594 00:25:45,713 --> 00:25:46,880 and matching them with products 595 00:25:46,922 --> 00:25:48,549 like when Steve searched for a toupee 596 00:25:48,590 --> 00:25:49,925 that matched his head. 597 00:25:49,967 --> 00:25:51,885 - Big problem. - (laughing) 598 00:25:51,927 --> 00:25:53,887 Timeless jokes aside, 599 00:25:53,929 --> 00:25:56,932 AI makes that way easier. 600 00:25:56,974 --> 00:25:59,560 Kids today can match with hot babes from their phone 601 00:25:59,601 --> 00:26:01,562 while sitting on the toilet. 602 00:26:01,603 --> 00:26:03,689 The toilet used to be sacred. 603 00:26:03,731 --> 00:26:05,149 - [Group] Yeah! 604 00:26:05,190 --> 00:26:06,442 - And sure, searching and matching 605 00:26:06,483 --> 00:26:08,235 will create specialized jobs, 606 00:26:08,277 --> 00:26:10,404 but if the damn robots choose who gets them, 607 00:26:10,446 --> 00:26:11,989 how convenient. 608 00:26:12,031 --> 00:26:14,033 Companies are using AI 609 00:26:14,074 --> 00:26:17,244 to find employees with unique skills. 610 00:26:17,286 --> 00:26:18,954 It's inhuman. 611 00:26:18,996 --> 00:26:20,331 - Like with Dave. 612 00:26:20,372 --> 00:26:22,249 - Yeah, where the hell is Dave? 
613 00:26:22,291 --> 00:26:24,293 - Some job-matching AI noticed that he worked at FedEx 614 00:26:24,335 --> 00:26:26,754 and had YouTube tutorials about shaving his back hair. 615 00:26:26,795 --> 00:26:28,464 Now he's making six figures 616 00:26:28,505 --> 00:26:30,799 at some razor subscription company. 617 00:26:30,841 --> 00:26:32,593 - He just shaved himself off our bowling team. 618 00:26:32,634 --> 00:26:34,219 - Yeah! 619 00:26:34,261 --> 00:26:35,929 - Hey Ed, I just got an alert 620 00:26:35,971 --> 00:26:37,139 that our T-shirts are being sold 621 00:26:37,181 --> 00:26:38,932 with targeted ads on Facebook. 622 00:26:38,974 --> 00:26:40,642 Are you using AI 623 00:26:40,684 --> 00:26:41,894 to make money off people who hate AI? 624 00:26:41,935 --> 00:26:44,271 - No, no. 625 00:26:44,313 --> 00:26:46,357 I mean who you gonna believe, 626 00:26:46,398 --> 00:26:49,735 me or the AI trying to tear us apart? 627 00:26:49,777 --> 00:26:50,903 What do we hate? 628 00:26:50,944 --> 00:26:52,821 - Artificial intelligence! 629 00:26:52,863 --> 00:26:53,822 - What are we gonna do about it? 630 00:26:53,864 --> 00:26:55,991 - We're working on it! 631 00:26:56,033 --> 00:26:57,284 - That's a start. 632 00:27:02,956 --> 00:27:04,291 - Does the AI revolution 633 00:27:04,333 --> 00:27:06,877 have to be a case of us versus them? 634 00:27:06,919 --> 00:27:10,923 Tech-savvy entrepreneurs like Louis Rosenberg say no. 635 00:27:10,964 --> 00:27:12,966 And he's made a career of predicting the future. 636 00:27:13,008 --> 00:27:14,760 Ah! 637 00:27:14,802 --> 00:27:17,971 (laughs) 638 00:27:18,013 --> 00:27:20,432 I was trying to scare you but it didn't work. 639 00:27:20,474 --> 00:27:21,850 (laughing) 640 00:27:21,892 --> 00:27:24,645 Louis is a technologist and inventor 641 00:27:24,686 --> 00:27:27,940 who wrote a graphic novel about the end of humanity. 642 00:27:27,981 --> 00:27:29,858 But he thinks we have a future with AI 643 00:27:29,900 --> 00:27:31,860 that is all about collaboration. 644 00:27:31,902 --> 00:27:34,363 It's the guiding principle behind his brainchild, 645 00:27:34,405 --> 00:27:36,698 a technology called Swarm. 646 00:27:36,740 --> 00:27:40,285 Swarm combines AI's data analysis skills 647 00:27:40,327 --> 00:27:41,995 with human knowledge and intuition 648 00:27:42,037 --> 00:27:44,540 to create a superintelligence, 649 00:27:44,581 --> 00:27:48,627 something between Stephen Hawking and Professor X. 650 00:27:48,669 --> 00:27:51,964 - Ultimately, it's based on nature 651 00:27:52,005 --> 00:27:53,382 and I like to say it all goes back 652 00:27:53,424 --> 00:27:55,217 to the birds and the bees. 653 00:27:55,259 --> 00:27:56,468 And that's because it's based on a phenomenon 654 00:27:56,510 --> 00:27:58,512 called swarm intelligence. 655 00:27:58,554 --> 00:28:00,389 - Okay. 656 00:28:00,431 --> 00:28:02,433 - Swarm intelligence is why birds flock and fish school 657 00:28:02,474 --> 00:28:04,309 and bees swarm. 658 00:28:04,351 --> 00:28:06,854 They are smarter together than alone. 659 00:28:06,895 --> 00:28:09,731 And that's why when you see a school of fish moving around, 660 00:28:09,773 --> 00:28:12,067 biologists would describe that as a superorganism, 661 00:28:12,109 --> 00:28:13,902 they are thinking as one.
662 00:28:13,944 --> 00:28:16,029 And if we can connect people together 663 00:28:16,071 --> 00:28:18,740 using artificial intelligence algorithms, 664 00:28:18,782 --> 00:28:22,119 we can make people behave as super experts 665 00:28:22,161 --> 00:28:23,745 because of swarm intelligence. 666 00:28:23,787 --> 00:28:25,164 - So how does that technology work? 667 00:28:25,205 --> 00:28:27,207 - What we do is we enable groups of people 668 00:28:27,249 --> 00:28:29,293 that can be anywhere in the world 669 00:28:29,334 --> 00:28:31,253 and we can give them a question that'll pop up 670 00:28:31,295 --> 00:28:33,213 on all their screens at the exact same time 671 00:28:33,255 --> 00:28:35,090 and then we give them a unique interface 672 00:28:35,132 --> 00:28:38,010 that allows them to convey their input 673 00:28:38,051 --> 00:28:39,845 and there'll be a bunch of different options. 674 00:28:39,887 --> 00:28:42,181 And we're not just taking a poll or a survey. 675 00:28:42,222 --> 00:28:45,267 Each person has what looks like a little graphical magnet, 676 00:28:45,309 --> 00:28:46,560 and so they use their magnet 677 00:28:46,602 --> 00:28:48,353 to pull the swarm in a direction. 678 00:28:48,395 --> 00:28:49,688 And we have AI algorithms 679 00:28:49,730 --> 00:28:51,523 that are watching their behaviors. 680 00:28:51,565 --> 00:28:53,734 And it's determining different levels 681 00:28:53,775 --> 00:28:55,819 of confidence and conviction, 682 00:28:55,861 --> 00:28:57,821 and it's finding out what is the best aggregation 683 00:28:57,863 --> 00:29:01,283 of all their opinions and all of their experience, 684 00:29:01,325 --> 00:29:04,077 and the swarm starts moving in that direction, 685 00:29:04,119 --> 00:29:05,787 and it converges on an answer. 686 00:29:05,829 --> 00:29:07,039 So I'll give you a fun example. 687 00:29:07,080 --> 00:29:08,624 We were challenged a year ago 688 00:29:08,665 --> 00:29:10,584 to predict the Kentucky Derby. 689 00:29:10,626 --> 00:29:13,462 - [Male] And they're off in the Kentucky Derby. 690 00:29:13,504 --> 00:29:16,673 - We had a group of 20 horse racing enthusiasts 691 00:29:16,715 --> 00:29:18,675 and we said, "Okay, you're gonna work together as a swarm 692 00:29:18,717 --> 00:29:20,052 "and you're gonna predict the Kentucky Derby, 693 00:29:20,093 --> 00:29:21,720 "but not just the winner, 694 00:29:21,762 --> 00:29:23,472 first place, second place, third place, fourth place." 695 00:29:23,514 --> 00:29:26,225 We had them converge on these answers... 696 00:29:26,266 --> 00:29:29,811 and the group was perfect. 697 00:29:29,853 --> 00:29:32,898 And so anybody who'd placed a $20 bet on those four horses, 698 00:29:32,940 --> 00:29:34,942 won $11,000. 699 00:29:34,983 --> 00:29:36,777 - Holy shit. - And what's interesting 700 00:29:36,818 --> 00:29:38,779 is if we look at those 20 people as individuals, 701 00:29:38,820 --> 00:29:40,781 not a single one of them on their own 702 00:29:40,822 --> 00:29:42,658 picked all four horses correct. 703 00:29:42,699 --> 00:29:43,867 - Wow. 704 00:29:43,909 --> 00:29:45,577 - And had they taken a vote, 705 00:29:45,619 --> 00:29:47,246 they would've only gotten one horse right, 706 00:29:47,287 --> 00:29:48,705 but when they worked together as a swarm, 707 00:29:48,747 --> 00:29:50,832 they found that right combination 708 00:29:50,874 --> 00:29:52,543 of all their different insights, 709 00:29:52,584 --> 00:29:53,961 and they were, in this case, perfect. 
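Unanimous AI's actual algorithm isn't spelled out here, so the following Python sketch is only a guess at the flavor of what Louis describes: participants pull toward an answer with different levels of conviction, and a weighted aggregate, rather than a simple head count, decides. The horse names, conviction numbers, and weighting rule are all invented for illustration.

```python
from collections import Counter

# Each participant "pulls" toward an answer with some conviction between 0 and 1.
# Horse names and numbers here are made up purely for illustration.
pulls = [
    ("Horse A", 0.9), ("Horse B", 0.4), ("Horse A", 0.6), ("Horse C", 0.3),
    ("Horse A", 0.8), ("Horse D", 0.2), ("Horse B", 0.7), ("Horse A", 0.5),
]

def swarm_pick(pulls):
    # Weight each vote by conviction instead of counting heads; this is
    # roughly why a converging swarm can beat a simple majority vote.
    totals = Counter()
    for option, conviction in pulls:
        totals[option] += conviction
    return totals.most_common(1)[0]

print(swarm_pick(pulls))  # ('Horse A', 2.8)
```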
710 00:29:55,629 --> 00:29:58,465 - Louis has invited me to lead a swarm 711 00:29:58,507 --> 00:30:01,134 to see how a random group of people can come together 712 00:30:01,176 --> 00:30:02,511 to make predictions. 713 00:30:02,553 --> 00:30:03,720 We'll start with the easy stuff. 714 00:30:03,762 --> 00:30:05,681 (upbeat music) 715 00:30:05,722 --> 00:30:08,308 Okay guys, so I'm gonna read a series of questions 716 00:30:08,350 --> 00:30:11,687 and you have 60 seconds to answer each one. 717 00:30:11,728 --> 00:30:13,772 The first question, 718 00:30:13,814 --> 00:30:17,985 which of these 2018 summer movies will gross the highest? 719 00:30:18,026 --> 00:30:22,114 Solo: A Star Wars Story, Deadpool 2, Ocean's Eight, 720 00:30:22,155 --> 00:30:25,993 Jurassic World: Fallen Kingdom or The Incredibles 2? 721 00:30:26,034 --> 00:30:28,328 We filmed the swarm in spring 2018 722 00:30:28,370 --> 00:30:31,540 before any information was out about summer movies. 723 00:30:31,582 --> 00:30:33,792 - [Louis] The AI is watching 724 00:30:33,834 --> 00:30:35,752 to get a sense of the various levels of confidence. 725 00:30:35,794 --> 00:30:37,170 Some people are switching, 726 00:30:37,212 --> 00:30:38,755 some people are staying entrenched 727 00:30:38,797 --> 00:30:41,174 and the AI algorithms are seeing 728 00:30:41,216 --> 00:30:43,302 their different levels of conviction 729 00:30:43,343 --> 00:30:44,886 and allowing it to find that path 730 00:30:44,928 --> 00:30:47,598 to the solution that they can best agree upon. 731 00:30:48,974 --> 00:30:52,019 - Okay, so The Incredibles 2. 732 00:30:52,060 --> 00:30:53,979 They were right, 733 00:30:54,021 --> 00:30:56,940 the Incredibles 2 was the summer's highest-grossing movie. 734 00:30:56,982 --> 00:30:58,984 - So one really interesting application 735 00:30:59,026 --> 00:31:02,738 is looking at questions that involve morality. 736 00:31:02,779 --> 00:31:04,072 And this has come up recently, 737 00:31:04,114 --> 00:31:05,240 because of self-driving cars. 738 00:31:05,282 --> 00:31:06,658 There's a big push right now 739 00:31:06,700 --> 00:31:10,245 to build moral decisions into self-driving cars, 740 00:31:10,287 --> 00:31:12,456 which some people are surprised to hear, 741 00:31:12,497 --> 00:31:14,791 but if you think about it, a self-driving car's 742 00:31:14,833 --> 00:31:17,961 going down the road, and a little kid runs out into the road, 743 00:31:18,003 --> 00:31:21,632 let's say the car can't stop but it could drive off the road 744 00:31:21,673 --> 00:31:23,675 and endanger the passenger, 745 00:31:23,717 --> 00:31:27,220 and maybe kill the passenger and save the kid. 746 00:31:27,262 --> 00:31:29,348 And so the automobile makers are saying, 747 00:31:29,389 --> 00:31:32,559 we need to program morality into these cars 748 00:31:32,601 --> 00:31:35,228 that represent the population, 749 00:31:35,270 --> 00:31:38,482 represent what we people, drivers would do. 750 00:31:38,523 --> 00:31:40,400 That sounds easy until you then realize 751 00:31:40,442 --> 00:31:43,403 well what is the morality of the population? 752 00:31:43,445 --> 00:31:45,197 There's not an easy way to get at that. 753 00:31:45,238 --> 00:31:47,908 And if they program in morality that represents us today, 754 00:31:47,949 --> 00:31:51,453 will that morality represent us 20 years from now? 755 00:31:51,495 --> 00:31:54,414 - Right.
- Next question, there's a self-driving car 756 00:31:54,456 --> 00:31:56,792 with a sudden brake failure 757 00:31:56,833 --> 00:31:58,210 that's gonna drive through a pedestrian crossing 758 00:31:58,251 --> 00:32:00,212 that will result in the death of a man. 759 00:32:00,253 --> 00:32:04,424 - Option A, the person who gets killed is crossing legally. 760 00:32:04,466 --> 00:32:07,386 - Option B, the self-driving car with the sudden brake failure 761 00:32:07,427 --> 00:32:09,971 will swerve and drive through a pedestrian crossing 762 00:32:10,013 --> 00:32:11,848 in the other lane that will result in the death 763 00:32:11,890 --> 00:32:14,434 of a male athlete crossing on the red signal. 764 00:32:14,476 --> 00:32:16,019 This is a jaywalker. 765 00:32:16,061 --> 00:32:19,815 This athlete does not give a shit at all, 766 00:32:19,856 --> 00:32:21,733 and he's jaywalking. 767 00:32:21,775 --> 00:32:23,944 What should the self-driving car do, 768 00:32:23,985 --> 00:32:25,946 kill the boring dude who's crossing legally 769 00:32:25,987 --> 00:32:29,866 or kill the athlete who's jaywalking? 770 00:32:29,908 --> 00:32:32,661 If AI is bringing about the next industrial revolution, 771 00:32:32,703 --> 00:32:35,664 rooms like this are essentially the new factory floor, 772 00:32:35,706 --> 00:32:37,833 with human workers providing labor 773 00:32:37,874 --> 00:32:42,379 based on something AI doesn't have on its own, a conscience. 774 00:32:42,421 --> 00:32:44,005 There's a lot of debate over this one. 775 00:32:44,047 --> 00:32:45,966 That's fascinating, I wonder why. 776 00:32:46,007 --> 00:32:47,634 - [Louis] That's a tough one. 777 00:32:47,676 --> 00:32:49,678 - For me it's not, if you're jaywalking. 778 00:32:51,346 --> 00:32:53,640 So there was a slight preference 779 00:32:53,682 --> 00:32:55,851 that you would strike the jaywalking male athlete. 780 00:32:55,892 --> 00:32:57,394 - Oh! 781 00:32:57,436 --> 00:32:59,020 - If you think that one upset you, 782 00:32:59,062 --> 00:33:00,313 just please get ready. 783 00:33:00,355 --> 00:33:03,024 So now we'd like you to imagine 784 00:33:03,066 --> 00:33:05,861 a worst-case scenario 785 00:33:05,902 --> 00:33:08,822 where a self-driving car cannot brake in time 786 00:33:08,864 --> 00:33:13,535 and must steer towards one of six different pedestrians, 787 00:33:13,577 --> 00:33:17,748 one baby in a stroller... 788 00:33:17,789 --> 00:33:20,083 or one boy... 789 00:33:20,125 --> 00:33:22,586 or one girl... 790 00:33:22,627 --> 00:33:25,589 or one pregnant woman... 791 00:33:25,630 --> 00:33:29,134 I know... 792 00:33:29,176 --> 00:33:32,220 or two male doctors... 793 00:33:32,262 --> 00:33:34,931 or two female doctors. 794 00:33:34,973 --> 00:33:36,558 Who needs to die? 795 00:33:36,600 --> 00:33:38,727 (tense music) 796 00:33:44,149 --> 00:33:45,817 - [Female] Oh, my God! 797 00:33:45,859 --> 00:33:47,903 What? That's awful. 798 00:33:51,239 --> 00:33:52,532 [Man] Come on, man. 799 00:33:52,574 --> 00:33:55,202 [Woman] Oh, my God, seriously? 800 00:33:58,497 --> 00:34:01,958 - You said the self-driving car should hit the boy. 801 00:34:02,000 --> 00:34:03,126 Interesting. 802 00:34:03,168 --> 00:34:04,377 The type of swarm intelligence 803 00:34:04,419 --> 00:34:06,004 created in this room today 804 00:34:06,046 --> 00:34:07,881 could be sold in the near future 805 00:34:07,923 --> 00:34:10,050 to self-driving car manufacturers.
806 00:34:10,091 --> 00:34:12,219 And if that sounds scary to you, 807 00:34:12,260 --> 00:34:14,805 it's way less scary than the alternative. 808 00:34:14,846 --> 00:34:17,682 - When a self-driving car is gonna slam on its brakes 809 00:34:17,724 --> 00:34:20,101 and realizes it can't stop in time and is about to hit somebody, 810 00:34:20,143 --> 00:34:23,271 should the car protect the passenger or a pedestrian? 811 00:34:23,313 --> 00:34:25,232 The hope is that the car manufacturers 812 00:34:25,273 --> 00:34:29,236 program the cars to reflect the morality of the population 813 00:34:29,277 --> 00:34:31,154 that has been buying those cars. 814 00:34:31,196 --> 00:34:33,073 The cynical view would be that 815 00:34:33,114 --> 00:34:35,158 car manufacturers start competing 816 00:34:35,200 --> 00:34:37,994 on whose car will protect the passenger more 817 00:34:38,036 --> 00:34:39,454 than some other car, 818 00:34:39,496 --> 00:34:41,122 and that could be a sales feature. 819 00:34:41,164 --> 00:34:42,791 I think that's a worse scenario 820 00:34:42,833 --> 00:34:45,252 than programming in the moral sensibilities of the community. 821 00:34:45,293 --> 00:34:47,212 - Whoa, that's a dark thought. 822 00:34:47,254 --> 00:34:50,382 And we want to end this show on something uplifting, 823 00:34:50,423 --> 00:34:52,717 maybe even heavenly. 824 00:34:54,553 --> 00:34:58,723 So before you imagine a future with Grand Theft Auto levels 825 00:34:58,765 --> 00:35:01,893 of pedestrian safety negligence, let's take a field trip 826 00:35:01,935 --> 00:35:04,437 all the way back to where we started... 827 00:35:06,356 --> 00:35:08,108 ...in this remote Indian forest, 828 00:35:08,149 --> 00:35:12,362 harvesting honey for a company called Heavenly Organics. 829 00:35:12,404 --> 00:35:14,573 - This forest, you know, nobody owns it, 830 00:35:14,614 --> 00:35:19,035 and these indigenous people, they've lived here forever. 831 00:35:19,077 --> 00:35:22,080 - [Kal] Father and son Amit and Ishnar Hooda 832 00:35:22,122 --> 00:35:24,165 started their company 12 years ago 833 00:35:24,207 --> 00:35:27,127 as a way to provide work for local villagers. 834 00:35:27,168 --> 00:35:29,546 What were they doing before honey collection 835 00:35:29,588 --> 00:35:30,922 with your company? 836 00:35:30,964 --> 00:35:33,508 - Well, they were still doing that, 837 00:35:33,550 --> 00:35:36,011 but they just didn't have a market 838 00:35:36,052 --> 00:35:39,723 or a place to sell it to make enough of a living. 839 00:35:39,764 --> 00:35:42,684 - There's certainly no shortage of honey here. 840 00:35:42,726 --> 00:35:44,227 During flowering season, 841 00:35:44,269 --> 00:35:46,646 one worker can collect a literal ton of honey 842 00:35:46,688 --> 00:35:48,440 in only three months. 843 00:35:48,481 --> 00:35:51,276 But what good is that if nobody's there to buy it? 844 00:35:53,695 --> 00:35:57,240 It took a human crew three days, two plane rides 845 00:35:57,282 --> 00:36:00,452 and eight hours of driving deep into a national forest, 846 00:36:00,493 --> 00:36:04,039 but fortunately for the locals and Heavenly Organics, 847 00:36:04,080 --> 00:36:07,542 an AI algorithm was able to find this place in seconds 848 00:36:07,584 --> 00:36:10,253 and knew it would be a great investment. 849 00:36:10,295 --> 00:36:11,838 - They called us out of the blue and said 850 00:36:11,880 --> 00:36:16,009 they ran an algorithm and they found us 851 00:36:16,051 --> 00:36:19,137 being a match with a lot of their portfolio.
852 00:36:19,179 --> 00:36:23,099 And they wanted to talk to us about investment, if we were looking for it. 853 00:36:23,141 --> 00:36:26,102 - [Kal] Who owned this mysterious AI algorithm? 854 00:36:26,144 --> 00:36:28,146 A tech company called CircleUp, 855 00:36:28,188 --> 00:36:31,358 located 8,000 miles away in... where else? 856 00:36:31,399 --> 00:36:32,859 San Francisco. 857 00:36:32,901 --> 00:36:34,527 We're at Good Eggs, 858 00:36:34,569 --> 00:36:36,655 an online grocery delivery company 859 00:36:36,696 --> 00:36:39,574 that also caught the interest of CircleUp's AI. 860 00:36:39,616 --> 00:36:41,034 - This is a mission-driven company 861 00:36:41,076 --> 00:36:42,827 that raised capital with CircleUp 862 00:36:42,869 --> 00:36:45,789 but also helps all the small businesses that we work with 863 00:36:45,830 --> 00:36:47,415 find customers. 864 00:36:47,457 --> 00:36:49,668 - [Kal] CircleUp's COO, Rory Eakin, 865 00:36:49,709 --> 00:36:52,712 worked in both business and humanitarian organizations 866 00:36:52,754 --> 00:36:54,547 before starting the company. 867 00:36:54,589 --> 00:36:57,550 CircleUp uses AI to analyze billions of data points 868 00:36:57,592 --> 00:36:59,970 and find out what consumers really want 869 00:37:00,011 --> 00:37:01,680 from their food and health products. 870 00:37:01,721 --> 00:37:03,181 - The problem you're facing as a shopper is, 871 00:37:03,223 --> 00:37:05,308 there's hundreds of companies all around 872 00:37:05,350 --> 00:37:07,477 in almost every category. 873 00:37:07,519 --> 00:37:09,187 - [Kal] Then, they invest in under-the-radar companies 874 00:37:09,229 --> 00:37:12,357 that AI thinks are gonna be the next big thing. 875 00:37:12,399 --> 00:37:13,650 One of those big things they found: 876 00:37:13,692 --> 00:37:15,694 Halo Top ice cream. 877 00:37:17,237 --> 00:37:19,656 - Halo Top ice cream was a small brand 878 00:37:19,698 --> 00:37:21,282 in Southern California. 879 00:37:21,324 --> 00:37:23,743 Today, it's the number-one pint in the country. 880 00:37:23,785 --> 00:37:25,495 - Oh, wow. 881 00:37:25,537 --> 00:37:27,497 - What we see is this amazing shift with shoppers 882 00:37:27,539 --> 00:37:28,623 in all categories: 883 00:37:28,665 --> 00:37:30,083 they want healthier products, 884 00:37:30,125 --> 00:37:32,252 fewer toxins in their household, 885 00:37:32,293 --> 00:37:35,088 lotions that don't have all these chemicals in them. 886 00:37:35,130 --> 00:37:36,756 - [Kal] When CircleUp's algorithm 887 00:37:36,798 --> 00:37:38,842 scanned billions of consumer data points, 888 00:37:38,883 --> 00:37:41,720 it found that customers wanted a list of attributes 889 00:37:41,761 --> 00:37:44,472 that was incredibly specific: 890 00:37:44,514 --> 00:37:46,683 mission-focused, eco-friendly companies 891 00:37:46,725 --> 00:37:48,476 harvesting organic products 892 00:37:48,518 --> 00:37:51,438 while creating economic growth in their communities. 893 00:37:51,479 --> 00:37:53,648 Sounds impossibly detailed, right? 894 00:37:53,690 --> 00:37:56,609 But Heavenly Organics checked off all those boxes 895 00:37:56,651 --> 00:37:58,278 when CircleUp found it. 896 00:37:58,319 --> 00:38:00,613 - That's what AI can do: 897 00:38:00,655 --> 00:38:02,866 make sense of all of this data 898 00:38:02,907 --> 00:38:07,454 in a way that wasn't possible even 10 years ago. 899 00:38:07,495 --> 00:38:09,164 - [Kal] So how is CircleUp's collaboration 900 00:38:09,205 --> 00:38:11,291 with Heavenly Organics working out?
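Before we check in on the results, a quick sketch for the technically curious: at its simplest, the kind of matching Rory describes can be pictured as scoring companies against the attribute profile that shoppers reveal in the data. The Python below is a toy illustration only; the company names, attributes, and scoring are invented for this example and are not CircleUp's actual model or data.

# Toy sketch: rank companies by how many consumer-desired attributes they cover.
# Hypothetical data and scoring; not CircleUp's real model.
DESIRED = {"mission focused", "eco friendly", "organic", "community impact"}

companies = {
    "Heavenly Organics": {"mission focused", "eco friendly", "organic", "community impact"},
    "Hypothetical Snack Co.": {"low price", "long shelf life"},
    "Hypothetical Lotion Co.": {"eco friendly", "fewer chemicals"},
}

def match_score(attributes):
    """Fraction of the desired attributes this company's profile covers."""
    return len(attributes & DESIRED) / len(DESIRED)

for name in sorted(companies, key=lambda n: match_score(companies[n]), reverse=True):
    print(f"{name}: {match_score(companies[name]):.0%} match")
# Heavenly Organics tops the ranking at 100%, which is the intuition
# behind "checked off all those boxes."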
901 00:38:11,332 --> 00:38:14,961 Let's hop back to India and ask Amit and Ishnar. 902 00:38:18,089 --> 00:38:20,842 - We have built a new facility which is twice as big. 903 00:38:20,884 --> 00:38:22,510 We're able to innovate. 904 00:38:22,552 --> 00:38:24,387 We're able to get new products. 905 00:38:24,429 --> 00:38:26,806 You know, create more impact in this area, of course. 906 00:38:26,848 --> 00:38:29,225 - So they helped you scale, it sounds like? 907 00:38:29,267 --> 00:38:31,811 - Yeah, they helped us create capacity and scalability. Yeah. 908 00:38:31,853 --> 00:38:34,272 - How has that impacted the people who collect for you? 909 00:38:34,314 --> 00:38:37,317 - Currently, we support 650 families. 910 00:38:37,358 --> 00:38:39,486 As we grow, we sell more honey, 911 00:38:39,527 --> 00:38:41,529 you know, every thousand kilos, we'll add a family, 912 00:38:41,571 --> 00:38:44,783 so that means next year it'll be maybe 700, 750. 913 00:38:44,824 --> 00:38:46,534 - Oh, wow, okay. 914 00:38:46,576 --> 00:38:49,079 - And today you see that they are better off 915 00:38:49,120 --> 00:38:50,580 in terms of the economy. 916 00:38:50,622 --> 00:38:52,165 They have a good house, 917 00:38:52,207 --> 00:38:53,833 a good facility in the house. 918 00:38:53,875 --> 00:38:56,086 They send their kids to school. 919 00:38:56,127 --> 00:38:59,255 You know, it's like capitalism for good. You know what I mean? 920 00:38:59,297 --> 00:39:02,550 Business used to create a greater good. 921 00:39:02,592 --> 00:39:05,804 That's why we all got into it. 922 00:39:05,845 --> 00:39:09,557 - [Kal] Will AI rise up and overthrow humanity 923 00:39:09,599 --> 00:39:13,311 or leave us clamoring to find purpose in our lives? 924 00:39:13,353 --> 00:39:14,813 So far in this corner of the world, 925 00:39:14,854 --> 00:39:16,439 its impact has been pretty good. 926 00:39:17,482 --> 00:39:19,818 Maybe there's a scenario 927 00:39:19,859 --> 00:39:22,779 where AI gives an unflinching assessment of all of our data 928 00:39:22,821 --> 00:39:24,364 and decides we're not so bad, 929 00:39:24,405 --> 00:39:26,032 and we work together 930 00:39:26,074 --> 00:39:28,535 to create a brave new world of prosperity 931 00:39:28,576 --> 00:39:29,869 and robot-human harmony. 932 00:39:31,621 --> 00:39:33,665 Or maybe not. 933 00:39:33,706 --> 00:39:36,417 In which case, our time is running short. 934 00:39:36,459 --> 00:39:39,003 So please enjoy these robot strippers. 935 00:39:39,045 --> 00:39:41,589 (rock music) 936 00:40:28,678 --> 00:40:32,015 - (audience applause) - [Man] That was great!