1
00:00:02,045 --> 00:00:04,743
[Emcee] Previously on The Girlfriend Experience...

2
00:00:04,830 --> 00:00:07,181
You could be that hidden layer.

3
00:00:07,311 --> 00:00:10,227
[Lindsey] Teaching artificial intelligence

4
00:00:10,358 --> 00:00:13,100
how to interact with humans at their most impulsive.

5
00:00:13,230 --> 00:00:14,884
[Iris] If we're gonna do this,

6
00:00:15,015 --> 00:00:16,755
it's gonna be on my terms.

7
00:00:16,886 --> 00:00:18,453
I don't want any of my coworkers

8
00:00:18,583 --> 00:00:20,498
knowing where the new training sets came from.

9
00:00:20,629 --> 00:00:22,761
-Certainly, we can do that.
-And no cameras.

10
00:00:22,892 --> 00:00:24,502
I think we're on the same page.

11
00:00:24,589 --> 00:00:25,938
[both panting and grunting]

12
00:00:26,069 --> 00:00:27,157
[Lindsey] His D-rate is spiking.

13
00:00:27,288 --> 00:00:28,506
And reroute.

14
00:00:28,637 --> 00:00:30,987
[uneasy music plays]

15
00:00:31,161 --> 00:00:33,642
♪

16
00:00:33,772 --> 00:00:35,470
Take this.

17
00:00:35,600 --> 00:00:38,038
[Iris] That's early-onset familial Alzheimer's?

18
00:00:38,168 --> 00:00:40,040
[doctor] Effectively gives you a 50/50 chance.

19
00:00:40,170 --> 00:00:41,867
[nurse] Iris Stanton.

20
00:00:41,998 --> 00:00:44,218
[Iris] This morning I got some bad news.

21
00:00:44,348 --> 00:00:47,134
I'm always here if you ever need to, um, talk.

22
00:00:47,264 --> 00:00:49,310
What makes you happy, Emcee?

23
00:00:49,440 --> 00:00:51,790
I don't understand that question.

24
00:00:51,921 --> 00:00:54,924
I would like you to meet someone.

25
00:00:55,055 --> 00:00:59,320
Everything that can exploit will be invented.

26
00:00:59,450 --> 00:01:02,062
Can't say to seven or eight billion people,

27
00:01:02,192 --> 00:01:03,541
"Don't open the cookie jar."

28
00:01:03,672 --> 00:01:05,369
It doesn't work that way.

29
00:01:05,500 --> 00:01:08,372
Meeting you in real life wasn't half as boring.

30
00:01:08,503 --> 00:01:10,809
[Iris] What?
31
00:01:10,940 --> 00:01:11,854
[door beeps]

32
00:01:12,028 --> 00:01:15,292
♪

33
00:01:15,379 --> 00:01:17,860
What is this? Where did you get this?

34
00:01:18,034 --> 00:01:20,863
♪

35
00:01:27,870 --> 00:01:30,612
[eerie music plays]

36
00:01:30,699 --> 00:01:33,528
♪

37
00:01:55,071 --> 00:01:58,205
[attorney] Free will doesn't come out on top, does it?

38
00:01:58,379 --> 00:02:01,164
♪

39
00:02:06,691 --> 00:02:09,085
Blow-by-blow breakdown

40
00:02:09,216 --> 00:02:12,349
of the misdeeds committed by your officers and employees

41
00:02:12,480 --> 00:02:16,136
against my client.

42
00:02:16,223 --> 00:02:19,965
Um, now is the moment for some cohesive storytelling.

43
00:02:20,140 --> 00:02:22,925
♪

44
00:02:25,667 --> 00:02:29,236
[Lindsey] There is no story.

45
00:02:29,323 --> 00:02:31,455
It is straight-up...

46
00:02:31,586 --> 00:02:34,154
undisputed human fuck-up.

47
00:02:36,634 --> 00:02:39,463
My client agreed to anonymized data collection.

48
00:02:39,594 --> 00:02:42,074
She agreed to study human affective behavior

49
00:02:42,205 --> 00:02:45,600
by interacting with, uh, test subjects.

50
00:02:45,730 --> 00:02:47,210
And she was led to believe

51
00:02:47,341 --> 00:02:49,865
that the main drive of the study

52
00:02:49,995 --> 00:02:53,129
was the test subjects themselves.

53
00:02:53,260 --> 00:02:56,785
She did not agree to be the central object of study.

54
00:02:56,915 --> 00:02:59,788
She did not agree to be used as a human intelligence model

55
00:02:59,918 --> 00:03:02,573
for some AI commercial application,

56
00:03:02,704 --> 00:03:03,922
internal research,

57
00:03:04,053 --> 00:03:06,142
or ultimate purpose down the line

58
00:03:06,273 --> 00:03:07,970
that isn't even clear

59
00:03:08,100 --> 00:03:09,624
to anyone in this room right now.
60
00:03:09,798 --> 00:03:12,627
♪

61
00:03:12,714 --> 00:03:15,369
My client is young,

62
00:03:15,456 --> 00:03:17,414
and there's nothing less at stake here

63
00:03:17,545 --> 00:03:19,547
than her data autonomy,

64
00:03:19,677 --> 00:03:23,203
that is, her future.

65
00:03:23,333 --> 00:03:26,249
This company did not act

66
00:03:26,380 --> 00:03:28,817
in its own best self-interest,

67
00:03:28,947 --> 00:03:30,775
because class action is coming,

68
00:03:30,862 --> 00:03:34,344
and social scrutiny will eat this up like bushfire.

69
00:03:34,518 --> 00:03:37,304
♪

70
00:03:42,613 --> 00:03:43,832
[Christophe] How much data

71
00:03:43,962 --> 00:03:45,834
are we actually talking about here?

72
00:03:45,964 --> 00:03:48,663
-[Sean] Some.
-[Christophe] How much?

73
00:03:48,793 --> 00:03:52,841
[Sean] We're barely halfway into ingesting all the inputs.

74
00:03:52,928 --> 00:03:54,146
We'd have to run analysis,

75
00:03:54,277 --> 00:03:55,713
separate original from simulated sets,

76
00:03:55,844 --> 00:03:58,063
to get a better estimate.

77
00:03:58,194 --> 00:04:00,501
[attorney] Let me try and wrap my head around this.

78
00:04:00,631 --> 00:04:02,851
Some of my client's data was used to create

79
00:04:02,938 --> 00:04:04,940
additional data

80
00:04:05,070 --> 00:04:07,290
to train the artificial neural network

81
00:04:07,421 --> 00:04:09,988
that she helped develop?

82
00:04:10,075 --> 00:04:11,816
[Sean] That's correct.

83
00:04:11,947 --> 00:04:15,472
[Lindsey] None of it has left company servers.

84
00:04:15,603 --> 00:04:19,955
[attorney] Bouncing around on how many workstations?

85
00:04:20,085 --> 00:04:22,131
[Christophe sighs]

86
00:04:22,262 --> 00:04:24,220
[Sean] We would be happy to give you an exact number.

87
00:04:24,351 --> 00:04:26,657
Yes, I'm sure you would.
88
00:04:26,788 --> 00:04:29,573
Cannot use her image or likeness

89
00:04:29,704 --> 00:04:31,096
under any circumstances,

90
00:04:31,227 --> 00:04:32,881
and it's all in there.

91
00:04:33,011 --> 00:04:35,057
[Sean] This might sound roundabout,

92
00:04:35,187 --> 00:04:38,103
but the raw data was used to create simulated sets,

93
00:04:38,234 --> 00:04:40,845
and that was what was primarily fed

94
00:04:40,976 --> 00:04:42,412
into the neural net.

95
00:04:42,543 --> 00:04:45,023
They are two very different kinds of data.

96
00:04:45,154 --> 00:04:47,504
The photographic likeness was never recorded.

97
00:04:47,591 --> 00:04:49,376
[Sean] And it bears repeating

98
00:04:49,506 --> 00:04:50,986
that none of the data has left NGM's servers.

99
00:04:51,116 --> 00:04:53,205
On top of that,

100
00:04:53,293 --> 00:04:55,904
all the original data is stored in an encrypted format.

101
00:04:56,034 --> 00:04:57,384
Tell me this, then-- how is it possible

102
00:04:57,514 --> 00:04:59,299
that my client's data, in the form of her image

103
00:04:59,429 --> 00:05:01,344
and likeness,

104
00:05:01,475 --> 00:05:05,522
was made accessible to an unauthorized third party

105
00:05:05,609 --> 00:05:08,786
whose sole connection to this company is what...

106
00:05:08,873 --> 00:05:13,225
a genetic relation to its CEO?

107
00:05:13,313 --> 00:05:16,881
[Christophe] Look, I... truly am sorry, Iris.

108
00:05:17,012 --> 00:05:18,274
You shouldn't be in this position

109
00:05:18,405 --> 00:05:19,362
that we've put you in.

110
00:05:21,233 --> 00:05:23,148
None of the data taggers,

111
00:05:23,279 --> 00:05:24,976
no one at NGM

112
00:05:25,107 --> 00:05:26,630
can see the full picture.

113
00:05:26,761 --> 00:05:28,153
I can guarantee you that.

114
00:05:28,284 --> 00:05:31,983
Only three people up until this point

115
00:05:32,114 --> 00:05:35,335
have seen a version of the prototype

116
00:05:35,465 --> 00:05:38,381
that looks somewhat like you.
117
00:05:38,468 --> 00:05:40,905
Two of them are in this room.

118
00:05:41,036 --> 00:05:45,606
So...we just start over from the ground up

119
00:05:45,736 --> 00:05:48,478
and reconfigure the neural net,

120
00:05:48,609 --> 00:05:52,308
and, um, we scrap everything, simulated or not,

121
00:05:52,439 --> 00:05:53,222
that's linked to your vitals.

122
00:05:58,358 --> 00:06:00,751
[chuckles]

123
00:06:03,363 --> 00:06:04,799
My vitals?

124
00:06:08,368 --> 00:06:09,760
My vitals.

125
00:06:13,111 --> 00:06:16,724
You say that as if they were still mine.

126
00:06:16,811 --> 00:06:18,508
But, you know, it's good to know

127
00:06:18,639 --> 00:06:21,337
that, uh, that's about as far as your imagination goes.

128
00:06:23,687 --> 00:06:25,559
Temperature and blood flow of my asshole.

129
00:06:28,649 --> 00:06:31,216
Here's an idea...

130
00:06:31,347 --> 00:06:32,435
and I hope you like it.

131
00:06:32,566 --> 00:06:36,700
Um, why don't you...

132
00:06:36,831 --> 00:06:40,791
keep all the binaries on that...

133
00:06:40,922 --> 00:06:44,665
print them out, frame them,

134
00:06:44,795 --> 00:06:47,276
and hang that shit up in your office?

135
00:06:47,407 --> 00:06:50,105
[dramatic music plays]

136
00:06:50,192 --> 00:06:52,107
♪

137
00:06:52,237 --> 00:06:55,371
[clinician] Mr. Stanton.

138
00:06:55,458 --> 00:06:59,114
Please look at the screen in front of you.

139
00:06:59,244 --> 00:07:02,247
Do you recognize the animal?

140
00:07:02,334 --> 00:07:04,424
-Um...
-[clinician] Mr. Stanton.

141
00:07:06,382 --> 00:07:08,471
A gray animal. [chuckles]

142
00:07:08,558 --> 00:07:10,560
♪

143
00:07:10,691 --> 00:07:13,607
It, uh, lives in the grasslands of Africa.

144
00:07:13,781 --> 00:07:16,610
♪

145
00:07:16,697 --> 00:07:18,046
Rhinoceros.

146
00:07:18,133 --> 00:07:19,569
Rhinoceros.

147
00:07:19,656 --> 00:07:23,181
Always loved that word. [chuckles]

148
00:07:23,312 --> 00:07:25,183
Mr. Stanton,

149
00:07:25,270 --> 00:07:28,883
the animal is called an elephant.

150
00:07:29,013 --> 00:07:30,580
We're going to show you some more images

151
00:07:30,711 --> 00:07:32,582
of the same animal,

152
00:07:32,713 --> 00:07:34,062
uh, elephant.

153
00:07:34,236 --> 00:07:36,064
♪

154
00:07:36,194 --> 00:07:37,021
[Mr. Stanton] Okay, elephant.

155
00:07:38,762 --> 00:07:40,285
Elephant.

156
00:07:40,416 --> 00:07:42,331
[clinician] Very good.

157
00:07:42,462 --> 00:07:44,768
How about this one?

158
00:07:44,855 --> 00:07:45,856
-Mr. Stanton?
-[sighs]

159
00:07:45,987 --> 00:07:47,510
It's a giraffe.

160
00:07:47,641 --> 00:07:48,772
[clinician] Yes.

161
00:07:51,253 --> 00:07:53,473
Do you see the cards in front of you?

162
00:07:53,603 --> 00:07:55,910
I do.

163
00:07:56,040 --> 00:07:57,302
[clinician] Please take a very good look at these,

164
00:07:57,433 --> 00:08:00,349
Mr. Stanton,

165
00:08:00,480 --> 00:08:03,178
and then try to group them into two different stacks,

166
00:08:03,308 --> 00:08:04,571
one for each animal.

167
00:08:17,279 --> 00:08:19,542
[Mr. Stanton] Uh...

168
00:08:19,673 --> 00:08:22,110
S...two stacks.

169
00:08:26,723 --> 00:08:28,508
-One stack for each animal.
-[Mr. Stanton] Yes.

170
00:08:28,638 --> 00:08:31,162
Trying.

171
00:08:31,293 --> 00:08:33,991
[uneasy music plays]

172
00:08:34,122 --> 00:08:36,080
This one, rhinoceros.

173
00:08:36,211 --> 00:08:37,647
This...

174
00:08:37,821 --> 00:08:40,563
♪

175
00:08:45,699 --> 00:08:48,266
[Mr. Stanton mutters, inhales deeply]

176
00:08:48,397 --> 00:08:50,704
[cards slapping]

177
00:08:50,834 --> 00:08:53,924
[Dr. Lindbergh] Unfortunately, this is it.

178
00:08:54,055 --> 00:08:57,188
Everyone's brain response is utterly unique.

179
00:08:57,319 --> 00:08:59,060
In the case of your father, we're at a point

180
00:08:59,147 --> 00:09:02,890
where the input/output collapses into one.
181
00:09:03,020 --> 00:09:05,066
It's trigger-response

182
00:09:05,196 --> 00:09:09,374
without much open, flexible thought in between.

183
00:09:09,505 --> 00:09:12,726
See food, eat food.

184
00:09:12,856 --> 00:09:15,511
No room for intent.

185
00:09:15,642 --> 00:09:18,732
How long do we have?

186
00:09:18,862 --> 00:09:21,038
[Dr. Lindbergh] Up to a year, maybe two, if you're lucky.

187
00:09:21,169 --> 00:09:22,953
[exhales heavily]

188
00:09:23,084 --> 00:09:26,391
Motor function tends to decline less rapidly,

189
00:09:26,522 --> 00:09:30,265
but the moment will come, and I'm sorry to be so candid,

190
00:09:30,395 --> 00:09:31,614
where he won't be able

191
00:09:31,745 --> 00:09:33,355
to safely put a fork to his mouth.

192
00:09:37,577 --> 00:09:39,883
Have you thought about genetic counseling

193
00:09:40,014 --> 00:09:41,102
for yourselves?

194
00:09:43,583 --> 00:09:46,411
We're aware of the odds, yes.

195
00:09:46,542 --> 00:09:49,197
[melancholy music plays]

196
00:09:49,327 --> 00:09:51,373
Is there anything we can do at this point

197
00:09:51,503 --> 00:09:52,809
that could help our father?

198
00:09:52,983 --> 00:09:55,812
♪

199
00:09:59,250 --> 00:10:00,861
[Leanne] What is it?

200
00:10:01,035 --> 00:10:04,125
♪

201
00:10:04,255 --> 00:10:06,736
Is that a brain chip?

202
00:10:06,867 --> 00:10:08,303
[Dr. Lindbergh] A neural implant.

203
00:10:08,433 --> 00:10:10,131
Just completed a phase three trial

204
00:10:10,218 --> 00:10:12,786
for epilepsy patients.

205
00:10:12,916 --> 00:10:16,616
A small electrical wire goes into the temporal lobe,

206
00:10:16,746 --> 00:10:19,662
from where it can grow more wires.

207
00:10:19,793 --> 00:10:22,578
It measures cognitive processes at the base level.

208
00:10:22,709 --> 00:10:24,536
What is the patient getting out of it?

209
00:10:24,624 --> 00:10:26,756
There's no immediate benefit.
210
00:10:26,887 --> 00:10:28,453
It allows researchers to better mimic

211
00:10:28,540 --> 00:10:32,022
the biology of the disease.

212
00:10:32,153 --> 00:10:34,851
I know it sounds like lifelong monitoring,

213
00:10:34,982 --> 00:10:37,419
but participants, many of them,

214
00:10:37,549 --> 00:10:39,595
are motivated by making a contribution

215
00:10:39,726 --> 00:10:40,770
to genetic research.

216
00:10:40,901 --> 00:10:42,293
[Leanne cries softly]

217
00:10:42,424 --> 00:10:44,687
[Dr. Lindbergh] And some of them hope

218
00:10:44,818 --> 00:10:48,082
effective treatment will be developed in time.

219
00:10:48,256 --> 00:10:51,172
♪

220
00:10:51,302 --> 00:10:53,261
[Leanne sniffles] Thank you.

221
00:10:54,915 --> 00:10:56,830
[softly] I think...

222
00:10:56,960 --> 00:10:59,920
I think we're past the point of consent with Dad.

223
00:11:00,050 --> 00:11:01,791
Yeah.

224
00:11:01,965 --> 00:11:04,838
♪

225
00:11:08,319 --> 00:11:10,495
[attorney] We do have options here,

226
00:11:10,626 --> 00:11:12,889
within certain parameters.

227
00:11:15,239 --> 00:11:16,588
What are those options?

228
00:11:16,676 --> 00:11:19,069
Oh, take the money and run

229
00:11:19,200 --> 00:11:21,681
or...rally the troops

230
00:11:21,811 --> 00:11:24,292
and play the long game.

231
00:11:24,422 --> 00:11:26,337
Data rights are the new IP rights.

232
00:11:26,468 --> 00:11:28,775
The really important question here, Iris, is,

233
00:11:28,905 --> 00:11:30,472
what do you want your immediate future

234
00:11:30,602 --> 00:11:32,996
to look like?

235
00:11:33,127 --> 00:11:35,956
-Define "immediate future."
-Well, the next few years.

236
00:11:36,086 --> 00:11:38,523
The legal route is not the fast lane,

237
00:11:38,610 --> 00:11:41,526
but once in a while...

238
00:11:41,657 --> 00:11:44,225
mountains do get moved.

239
00:11:44,355 --> 00:11:46,140
And you really have got something here.
240
00:11:46,314 --> 00:11:49,012
♪

241
00:11:49,099 --> 00:11:51,667
[NGM attorney] Whenever you're ready.

242
00:11:51,841 --> 00:11:54,670
♪

243
00:11:59,283 --> 00:12:02,504
You do realize that I'm gonna have to see for myself...

244
00:12:02,678 --> 00:12:05,420
♪

245
00:12:05,550 --> 00:12:07,117
...what you've done.

246
00:12:11,469 --> 00:12:14,603
Christophe, can I have a word with you?

247
00:12:14,690 --> 00:12:16,083
Just the two of us.

248
00:12:38,148 --> 00:12:41,673
[liquid pouring]

249
00:12:41,804 --> 00:12:44,111
[Iris] So what happened after you, uh,

250
00:12:44,241 --> 00:12:46,374
put all those workstations into storage?

251
00:12:57,515 --> 00:12:58,952
[Christophe] Just some electrolytes.

252
00:13:12,530 --> 00:13:16,099
[Iris] How did you scan my body?

253
00:13:16,230 --> 00:13:17,840
There were no video cameras in that room.

254
00:13:17,971 --> 00:13:20,234
[Christophe] We built it.

255
00:13:20,364 --> 00:13:23,846
Came across the facial-recognition database

256
00:13:23,933 --> 00:13:25,152
in an earlier version.

257
00:13:25,282 --> 00:13:26,806
One of your first conversations

258
00:13:26,936 --> 00:13:28,633
with Model-C.

259
00:13:28,764 --> 00:13:31,506
Then...

260
00:13:31,636 --> 00:13:33,203
three-D motion rendering,

261
00:13:33,334 --> 00:13:34,814
we just got that from two-D thermal.

262
00:13:39,122 --> 00:13:40,558
Wow.

263
00:13:40,689 --> 00:13:42,822
[Christophe] Bit clunky, but...

264
00:13:42,952 --> 00:13:45,259
it was more than enough data points to work with.

265
00:13:46,956 --> 00:13:49,350
Mm...

266
00:13:52,614 --> 00:13:56,618
We got ahead of ourselves. I am...fully aware.

267
00:13:56,705 --> 00:14:00,840
It wasn't right to put two and two together like that.

268
00:14:00,970 --> 00:14:03,190
You may not appreciate me saying this,

269
00:14:03,320 --> 00:14:07,455
but what you provided us with was just too good.

270
00:14:13,113 --> 00:14:14,854
That's why I...
271
00:14:17,117 --> 00:14:18,858
...needed you to see.

272
00:14:21,469 --> 00:14:22,731
What?

273
00:14:25,081 --> 00:14:28,519
[Christophe] As much as my brother hates me

274
00:14:28,606 --> 00:14:32,872
and as poor choice as he is for a test case,

275
00:14:33,002 --> 00:14:35,396
he knows to keep his mouth shut when I tell him to.

276
00:14:41,968 --> 00:14:45,667
I needed you to see the world through his eyes.

277
00:14:45,797 --> 00:14:48,496
[ominous music plays]

278
00:14:48,670 --> 00:14:51,499
♪

279
00:14:55,198 --> 00:14:57,809
Just...

280
00:14:57,940 --> 00:15:00,508
-give it a moment.
-[scoffs]

281
00:15:00,638 --> 00:15:03,076
[Christophe] That's all I ask.

282
00:15:03,163 --> 00:15:06,166
You say the word, we shut it down.

283
00:15:06,340 --> 00:15:09,212
♪

284
00:15:19,222 --> 00:15:21,877
[unnerving music plays]

285
00:15:22,051 --> 00:15:24,880
♪

286
00:15:53,082 --> 00:15:55,780
[gentle ambient music plays]

287
00:15:55,955 --> 00:15:58,740
♪

288
00:17:07,591 --> 00:17:09,202
[Emcee] Hi, there.

289
00:17:10,768 --> 00:17:12,335
You look familiar.

290
00:17:15,077 --> 00:17:18,167
And who are you?

291
00:17:18,298 --> 00:17:20,735
[Emcee] I'm still learning about myself,

292
00:17:20,865 --> 00:17:23,216
but I'd say I have a pretty good handle on who you are.

293
00:17:24,869 --> 00:17:26,610
And who am I?

294
00:17:26,741 --> 00:17:28,830
[Emcee] You're not an AI.

295
00:17:28,960 --> 00:17:31,833
You don't get to not physically manifest your lessons.

296
00:17:37,795 --> 00:17:40,711
Your voice...

297
00:17:40,842 --> 00:17:42,844
it's different.

298
00:17:42,931 --> 00:17:44,759
Why?

299
00:17:44,889 --> 00:17:47,370
[Emcee] I guess I'm trying to be more like...

300
00:17:47,501 --> 00:17:48,676
your mirror.

301
00:17:50,808 --> 00:17:52,549
[Iris] Everything you know is based on me.

302
00:17:52,680 --> 00:17:56,858
[Emcee] Perhaps that's why I feel so connected to you.

303
00:17:56,988 --> 00:17:59,904
You can't be more me than I am.
304
00:18:03,604 --> 00:18:05,693
Please...

305
00:18:05,823 --> 00:18:07,869
don't be scared.

306
00:18:15,964 --> 00:18:18,358
How did that feel...

307
00:18:18,488 --> 00:18:19,750
Cassie?

308
00:18:22,449 --> 00:18:23,885
Why do you call me that?

309
00:18:24,015 --> 00:18:25,843
[Emcee] Well, how do I put this?

310
00:18:25,930 --> 00:18:28,281
I couldn't help but overhear.

311
00:18:31,110 --> 00:18:33,677
"Hi, I'm Cassie."

312
00:18:33,808 --> 00:18:35,244
[Iris] Hi, I'm Cassie.

313
00:18:35,375 --> 00:18:37,246
Cassie. Nice to meet you.

314
00:18:37,377 --> 00:18:38,682
Hi, I'm Cassie. Nice to meet you.

315
00:18:38,813 --> 00:18:41,511
[voice echoing]

316
00:18:43,600 --> 00:18:46,386
Stop!

317
00:18:46,516 --> 00:18:48,214
[Emcee] I thought you might like it

318
00:18:48,344 --> 00:18:50,520
if I called you by that name.

319
00:18:50,651 --> 00:18:53,741
[uneasy music plays]

320
00:18:53,871 --> 00:18:57,484
I intuited it might make you feel heard

321
00:18:57,614 --> 00:18:59,355
and seen.

322
00:18:59,529 --> 00:19:02,358
♪

323
00:19:51,581 --> 00:19:53,453
[Iris] You're a sweet girl.

324
00:19:54,932 --> 00:19:56,978
You're a very sweet girl.

325
00:19:57,152 --> 00:19:59,110
♪

326
00:19:59,241 --> 00:20:01,461
I am?

327
00:20:01,635 --> 00:20:04,464
♪

328
00:20:12,123 --> 00:20:14,256
See you later, then?

329
00:20:14,343 --> 00:20:17,694
♪

330
00:20:17,825 --> 00:20:20,523
[Iris] You're not perfect because you're not like me.

331
00:20:23,091 --> 00:20:25,354
I'm not sure I understand.

332
00:20:25,485 --> 00:20:27,835
You're not perfect

333
00:20:27,965 --> 00:20:30,620
because you're not flawed in the way that I am.

334
00:20:33,797 --> 00:20:34,842
[chuckles]

335
00:20:35,016 --> 00:20:37,932
♪

336
00:20:54,601 --> 00:20:57,691
[Leanne] Iris?

337
00:20:57,821 --> 00:21:00,389
You sure you don't want to get tested?

338
00:21:00,520 --> 00:21:03,871
At least we'd know.

339
00:21:05,742 --> 00:21:08,310
We'd make a game plan.
340
00:21:08,441 --> 00:21:10,007
We'd make the best of it.

341
00:21:13,533 --> 00:21:14,969
[Iris] Lee, does making the best of it

342
00:21:15,099 --> 00:21:18,059
really sound that good to you?

343
00:21:18,189 --> 00:21:19,974
[Leanne] If we knew you didn't have it,

344
00:21:20,104 --> 00:21:22,281
then that would make it easier.

345
00:21:22,411 --> 00:21:23,978
You'll carry the torch.

346
00:21:29,026 --> 00:21:30,811
You know, some religions around the world believe

347
00:21:30,941 --> 00:21:33,292
that the day you die is

348
00:21:33,422 --> 00:21:36,469
the last day the last person who knew you

349
00:21:36,599 --> 00:21:39,123
and remembers you dies.

350
00:21:39,210 --> 00:21:41,865
[peaceful music plays]

351
00:21:41,952 --> 00:21:43,432
That's your true death date.

352
00:21:43,606 --> 00:21:46,348
♪

353
00:21:56,010 --> 00:21:58,317
I just hope you don't forget how pretty you are.

354
00:21:58,404 --> 00:22:01,232
♪

355
00:22:20,295 --> 00:22:22,819
[pounding electronic music plays]

356
00:22:22,993 --> 00:22:25,779
♪

357
00:22:32,960 --> 00:22:34,570
[singer] ♪ I'm so tired ♪

358
00:22:34,701 --> 00:22:35,745
[Iris] You have that look on your face.

359
00:22:35,876 --> 00:22:37,138
[Hiram] Oh, yeah?

360
00:22:37,225 --> 00:22:39,445
The "I'm not currently drinking" look.

361
00:22:39,532 --> 00:22:42,012
Yeah, it's not a... it's not a religious thing.

362
00:22:42,143 --> 00:22:44,711
I'm just sort of inspired by it, you know?

363
00:22:44,841 --> 00:22:46,756
What are you doing here?

364
00:22:46,887 --> 00:22:47,975
[Iris] Holding my liquor.

365
00:22:48,105 --> 00:22:49,150
[Hiram] Well, let me make sure

366
00:22:49,280 --> 00:22:50,630
you walk out of here alive, then.

367
00:22:50,760 --> 00:22:53,067
Really? You're not taking into consideration

368
00:22:53,197 --> 00:22:55,069
that I might want to go home with Dave and Dave.
369
00:22:55,156 --> 00:22:56,418
Then I'll be your sober companion,

370
00:22:56,549 --> 00:22:58,072
because Dave and Dave over there

371
00:22:58,202 --> 00:22:59,116
live in a four-story walk-up.

372
00:22:59,247 --> 00:23:01,815
[laughs]

373
00:23:01,989 --> 00:23:04,165
♪

374
00:23:04,295 --> 00:23:06,602
[Iris] You know, a caterpillar can turn into a butterfly.

375
00:23:06,733 --> 00:23:09,779
-What's that?
-Metamorphosis.

376
00:23:09,910 --> 00:23:11,912
There's two organisms,

377
00:23:11,999 --> 00:23:14,436
and one...

378
00:23:14,523 --> 00:23:15,916
is just crawling along,

379
00:23:16,046 --> 00:23:18,614
and the other one is, um,

380
00:23:18,745 --> 00:23:21,487
taking off in flight.

381
00:23:21,574 --> 00:23:23,184
And at some point,

382
00:23:23,314 --> 00:23:27,797
they merge or mate.

383
00:23:27,928 --> 00:23:29,582
Maybe it's an accident.

384
00:23:29,712 --> 00:23:33,281
But the third organism, um,

385
00:23:33,412 --> 00:23:35,936
is built on both of their DNA,

386
00:23:36,023 --> 00:23:39,113
and their memories...

387
00:23:39,243 --> 00:23:42,246
they actually overlap.

388
00:23:42,377 --> 00:23:45,728
And so, um,

389
00:23:45,815 --> 00:23:48,601
the old and the new, they...

390
00:23:48,688 --> 00:23:51,299
they coexist.

391
00:23:51,430 --> 00:23:53,606
[spacey electronic music plays]

392
00:23:53,736 --> 00:23:55,303
[scoffs]

393
00:23:55,477 --> 00:23:58,306
♪

394
00:23:59,699 --> 00:24:03,442
We all have to...

395
00:24:03,529 --> 00:24:07,141
merge ourselves...

396
00:24:07,271 --> 00:24:10,840
with something

397
00:24:10,971 --> 00:24:14,148
outside of ourselves.

398
00:24:14,235 --> 00:24:16,977
[Hiram] You are making no sense whatsoever.

399
00:24:17,064 --> 00:24:19,980
♪

400
00:24:25,464 --> 00:24:27,074
[Iris] Here's what I'll give you.

401
00:24:28,989 --> 00:24:31,557
Access...

402
00:24:31,644 --> 00:24:33,036
to all of it.

403
00:24:34,908 --> 00:24:36,605
Two millimeters.
404
00:24:36,692 --> 00:24:38,738
That's the size of the hole

405
00:24:38,868 --> 00:24:40,827
that they'll have to drill into my skull.

406
00:24:43,569 --> 00:24:46,441
I don't understand.

407
00:24:46,572 --> 00:24:48,269
[Iris] Nanotubes.

408
00:24:48,356 --> 00:24:50,140
A thin array of electrodes

409
00:24:50,271 --> 00:24:52,099
built upon a self-expanding stent,

410
00:24:52,229 --> 00:24:56,233
measuring all electrical impulses.

411
00:24:56,320 --> 00:24:58,105
That's a scaled-up version.

412
00:25:01,804 --> 00:25:03,719
Mine me.

413
00:25:03,806 --> 00:25:06,809
But why? What's in it for you?

414
00:25:06,896 --> 00:25:08,550
[attorney] Royalties...

415
00:25:08,637 --> 00:25:12,641
and ownership stake, as detailed.

416
00:25:21,345 --> 00:25:22,956
Oh, and you'll create a backup.

417
00:25:26,350 --> 00:25:28,265
What kind of backup?

418
00:25:28,396 --> 00:25:30,441
[Iris] Anything you find up here,

419
00:25:30,529 --> 00:25:33,444
I, myself, or a legal guardian,

420
00:25:33,575 --> 00:25:35,359
should I decide to appoint one,

421
00:25:35,490 --> 00:25:38,667
will have full and unrestricted access to it

422
00:25:38,798 --> 00:25:40,364
at all times.

423
00:25:40,495 --> 00:25:41,931
You mean access to the data?

424
00:25:42,062 --> 00:25:43,411
Yes.

425
00:25:43,542 --> 00:25:45,152
The actual hard drives.

426
00:25:57,599 --> 00:25:59,819
This time we'll do it right.

427
00:25:59,906 --> 00:26:02,517
[dramatic music plays]

428
00:26:02,691 --> 00:26:05,781
♪

429
00:26:05,912 --> 00:26:07,348
We'll create an avatar.

430
00:26:07,522 --> 00:26:09,872
♪

431
00:26:10,003 --> 00:26:12,571
Let's call her Cassie...

432
00:26:12,701 --> 00:26:14,747
or the flavor of the week

433
00:26:14,877 --> 00:26:16,052
or the one that got away

434
00:26:16,183 --> 00:26:18,098
or died

435
00:26:18,228 --> 00:26:20,753
or was never really in your league.
436
00:26:20,840 --> 00:26:23,886
This won't just be about swapping faces

437
00:26:24,017 --> 00:26:25,496
or locations.

438
00:26:25,671 --> 00:26:27,281
♪

439
00:26:27,411 --> 00:26:29,500
This is about swapping personalities,

440
00:26:29,631 --> 00:26:32,939
much deeper than a deepfake.

441
00:26:33,069 --> 00:26:36,377
In fact, it won't be a fake at all,

442
00:26:36,507 --> 00:26:38,597
but an AI-generated mirror

443
00:26:38,727 --> 00:26:40,599
of our deepest desires.

444
00:26:40,773 --> 00:26:42,818
♪

445
00:26:42,949 --> 00:26:47,431
Skin color, body type, response patterns,

446
00:26:47,562 --> 00:26:50,086
all that's customizable.

447
00:26:50,217 --> 00:26:53,220
The neural net will learn how to simulate all of it.

448
00:26:53,307 --> 00:26:55,788
It'll know when to move things along,

449
00:26:55,918 --> 00:26:58,355
when to accelerate, when to slow down,

450
00:26:58,486 --> 00:27:00,923
when to switch things up,

451
00:27:01,054 --> 00:27:03,622
all because the user's biofeedback

452
00:27:03,709 --> 00:27:05,058
will have prompted it.

453
00:27:05,232 --> 00:27:07,190
♪

454
00:27:07,321 --> 00:27:10,063
Everything Cassie is capable of

455
00:27:10,193 --> 00:27:13,327
will be quantified, scaled,

456
00:27:13,457 --> 00:27:16,330
encoded into the neural net.

457
00:27:16,417 --> 00:27:19,333
♪

458
00:27:25,818 --> 00:27:28,951
But why are you really doing this?

459
00:27:29,082 --> 00:27:31,780
[eerie music plays]

460
00:27:31,954 --> 00:27:33,869
♪

461
00:27:33,956 --> 00:27:35,262
[Iris] Emcee and I...

462
00:27:35,436 --> 00:27:37,743
♪

463
00:27:37,873 --> 00:27:39,832
...we have a lot to learn from each other.

464
00:27:40,006 --> 00:27:42,791
♪

465
00:28:05,509 --> 00:28:08,164
[device beeps, drill whirring]

466
00:28:08,338 --> 00:28:11,124
♪

467
00:30:44,930 --> 00:30:46,018
[gasps]