1
00:00:06,361 --> 00:00:07,629
[Emcee] Previously on The Girlfriend Experience...

2
00:00:07,662 --> 00:00:09,798
You could be that hidden layer.

3
00:00:10,031 --> 00:00:12,768
[Lindsay] Teaching artificial intelligence

4
00:00:12,801 --> 00:00:16,071
how to interact with humans at their most impulsive.

5
00:00:16,104 --> 00:00:17,806
[Iris] If we're gonna do this,

6
00:00:18,039 --> 00:00:19,474
it's gonna be on my terms.

7
00:00:19,508 --> 00:00:21,176
I don't want any of my new coworkers

8
00:00:21,209 --> 00:00:23,311
knowing where the new training sets came from.

9
00:00:23,345 --> 00:00:25,614
-Certainly, we can do that.
-And no cameras.

10
00:00:25,647 --> 00:00:27,416
I think we're on the same page.

11
00:00:27,449 --> 00:00:28,650
[both panting and grunting]

12
00:00:28,683 --> 00:00:30,218
[Lindsay] His D-rate is spiking.

13
00:00:30,252 --> 00:00:31,486
And reroute.

14
00:00:31,520 --> 00:00:34,222
[uneasy music plays]

15
00:00:34,256 --> 00:00:36,658
♪ ♪

16
00:00:36,691 --> 00:00:38,160
Take this.

17
00:00:38,193 --> 00:00:40,629
[Iris] That's early-onset familial Alzheimer's?

18
00:00:40,662 --> 00:00:43,131
[doctor] Effectively gives you a 50/50 chance.

19
00:00:43,165 --> 00:00:44,566
[nurse] Iris Stanton.

20
00:00:44,599 --> 00:00:46,768
[Iris] This morning I got some bad news.

21
00:00:46,802 --> 00:00:50,138
I'm always here if you ever need to, um, talk.

22
00:00:50,172 --> 00:00:52,207
What makes you happy, Emcee?

23
00:00:52,240 --> 00:00:54,643
I don't understand that question.

24
00:00:54,676 --> 00:00:56,812
I would like you to meet someone.

25
00:00:57,813 --> 00:01:01,283
Everything that can exploit will be invented.

26
00:01:02,651 --> 00:01:05,086
Can't say to seven or eight billion people,

27
00:01:05,120 --> 00:01:06,488
"Don't open the cookie jar."

28
00:01:06,521 --> 00:01:08,190
It doesn't work that way.

29
00:01:08,223 --> 00:01:11,460
Meeting you in real life wasn't half as boring.

30
00:01:11,493 --> 00:01:13,795
[Iris] What?

31
00:01:13,829 --> 00:01:15,197
[door beeps]

32
00:01:15,230 --> 00:01:18,099
♪ ♪

33
00:01:18,133 --> 00:01:21,136
What is this? Where did you get this?

34
00:01:21,169 --> 00:01:24,172
♪ ♪

35
00:01:30,712 --> 00:01:33,682
[eerie music plays]

36
00:01:33,715 --> 00:01:36,718
♪ ♪

37
00:01:57,539 --> 00:02:01,409
[attorney] Free will doesn't come out on top, does it?

38
00:02:01,443 --> 00:02:04,412
♪ ♪

39
00:02:09,618 --> 00:02:11,686
Blow-by-blow breakdown

40
00:02:11,720 --> 00:02:15,423
of the misdeeds committed by your officers and employees

41
00:02:15,457 --> 00:02:18,693
against my client.

42
00:02:18,727 --> 00:02:23,198
Um, now is the moment for some cohesive storytelling.

43
00:02:23,231 --> 00:02:26,234
♪ ♪

44
00:02:28,503 --> 00:02:32,274
[Lindsay] There is no story.

45
00:02:32,307 --> 00:02:34,476
It is straight-up...

46
00:02:34,509 --> 00:02:37,412
undisputed human fuck-up.

47
00:02:39,347 --> 00:02:42,284
My client agreed to anonymized data collection.

48
00:02:42,317 --> 00:02:44,853
She agreed to study human affective behavior

49
00:02:44,886 --> 00:02:48,557
by interacting with, uh, test subjects.

50
00:02:48,590 --> 00:02:49,958
And she was led to believe

51
00:02:50,191 --> 00:02:52,627
that the main drive of the study

52
00:02:52,661 --> 00:02:55,697
was the test subjects themselves.

53
00:02:55,730 --> 00:02:59,367
She did not agree to be the central object of study.
54
00:02:59,401 --> 00:03:02,604
She did not agree to be used as a human intelligence model

55
00:03:02,637 --> 00:03:05,640
for some AI commercial application,

56
00:03:05,674 --> 00:03:06,741
internal research,

57
00:03:06,775 --> 00:03:09,210
or ultimate purpose down the line

58
00:03:09,244 --> 00:03:10,745
that isn't even clear

59
00:03:10,779 --> 00:03:12,814
to anyone in this room right now.

60
00:03:12,847 --> 00:03:15,584
♪ ♪

61
00:03:15,617 --> 00:03:18,219
My client is young,

62
00:03:18,253 --> 00:03:20,455
and there's nothing less at stake here

63
00:03:20,488 --> 00:03:22,557
than her data autonomy,

64
00:03:22,591 --> 00:03:26,261
that is, her future.

65
00:03:26,294 --> 00:03:29,230
This company did not act

66
00:03:29,264 --> 00:03:31,700
in its own best self-interest,

67
00:03:31,733 --> 00:03:33,435
because class action is coming,

68
00:03:33,468 --> 00:03:37,539
and social scrutiny will eat this up like bushfire.

69
00:03:37,572 --> 00:03:40,575
♪ ♪

70
00:03:45,480 --> 00:03:46,548
[Christophe] How much data

71
00:03:46,581 --> 00:03:48,583
are we actually talking about here?

72
00:03:48,617 --> 00:03:51,286
-[Sean] Some.
-[Christophe] How much?

73
00:03:51,319 --> 00:03:55,724
[Sean] We're barely halfway into ingesting all the inputs.

74
00:03:55,757 --> 00:03:56,925
We'd have to run analysis,

75
00:03:56,958 --> 00:03:58,693
separate original from simulated sets,

76
00:03:58,727 --> 00:04:00,662
to get a better estimate.

77
00:04:00,695 --> 00:04:03,298
[attorney] Let me try and wrap my head around this.

78
00:04:03,331 --> 00:04:05,834
Some of my client's data was used to create

79
00:04:05,867 --> 00:04:07,702
additional data

80
00:04:07,736 --> 00:04:10,372
to train the artificial neural network

81
00:04:10,405 --> 00:04:12,874
that she helped develop?

82
00:04:12,907 --> 00:04:14,542
[Sean] That's correct.

83
00:04:14,576 --> 00:04:18,046
[Lindsay] None of it has left company servers.

84
00:04:18,279 --> 00:04:22,984
[attorney] Bouncing around on how many workstations?

85
00:04:23,018 --> 00:04:24,819
[Christophe sighs]

86
00:04:24,853 --> 00:04:27,322
[Sean] We would be happy to give you an exact number.

87
00:04:27,355 --> 00:04:29,557
Yes, I'm sure you would.

88
00:04:29,591 --> 00:04:32,594
Cannot use her image or likeness

89
00:04:32,627 --> 00:04:34,029
under any circumstances,

90
00:04:34,062 --> 00:04:35,664
and it's all in there.

91
00:04:35,697 --> 00:04:37,766
[Sean] This might sound roundabout,

92
00:04:37,799 --> 00:04:40,935
but the raw data was used to create simulated sets,

93
00:04:40,969 --> 00:04:43,805
and that was what was primarily fed

94
00:04:43,838 --> 00:04:45,006
into the neural net.

95
00:04:45,040 --> 00:04:47,776
They are two very different kinds of data.

96
00:04:47,809 --> 00:04:50,478
The photographic likeness was never recorded.

97
00:04:50,512 --> 00:04:51,980
[Sean] And it bears repeating

98
00:04:52,013 --> 00:04:54,015
that none of the data has left NGM's servers.

99
00:04:54,049 --> 00:04:55,750
On top of that,

100
00:04:55,784 --> 00:04:58,620
all the original data is stored in an encrypted format.

101
00:04:58,653 --> 00:05:00,055
Tell me this, then-- how is it possible

102
00:05:00,088 --> 00:05:02,457
that my client's data, in the form of her image

103
00:05:02,490 --> 00:05:03,925
and likeness,

104
00:05:03,958 --> 00:05:08,029
was made accessible to an unauthorized third party

105
00:05:08,063 --> 00:05:11,666
whose sole connection to this company is what...
106
00:05:11,700 --> 00:05:14,836
a genetic relation to its CEO?

107
00:05:15,870 --> 00:05:19,708
[Christophe] Look, I... truly am sorry, Iris.

108
00:05:19,741 --> 00:05:21,376
You shouldn't be in this position

109
00:05:21,409 --> 00:05:22,610
that we've put you in.

110
00:05:24,112 --> 00:05:26,347
None of the data taggers,

111
00:05:26,381 --> 00:05:27,916
no one at NGM

112
00:05:27,949 --> 00:05:29,651
can see the full picture.

113
00:05:29,684 --> 00:05:30,885
I can guarantee you that.

114
00:05:30,919 --> 00:05:34,789
Only three people up until this point

115
00:05:34,823 --> 00:05:38,126
have seen a version of the prototype

116
00:05:38,359 --> 00:05:41,362
that looks somewhat like you.

117
00:05:41,396 --> 00:05:43,665
Two of them are in this room.

118
00:05:43,698 --> 00:05:48,570
So...we just start over from the ground up

119
00:05:48,603 --> 00:05:51,106
and reconfigure the neural net,

120
00:05:51,139 --> 00:05:55,110
and, um, we scrap everything, simulated or not,

121
00:05:55,143 --> 00:05:56,478
that's linked to your vitals.

122
00:06:01,449 --> 00:06:03,952
[chuckles]

123
00:06:06,454 --> 00:06:07,989
My vitals?

124
00:06:11,459 --> 00:06:12,961
My vitals.

125
00:06:15,797 --> 00:06:18,166
You say that as if they were still mine.

126
00:06:19,634 --> 00:06:21,102
But, you know, it's good to know

127
00:06:21,136 --> 00:06:24,606
that, uh, that's about as far as your imagination goes.

128
00:06:26,407 --> 00:06:28,810
Temperature and blood flow of my asshole.

129
00:06:31,646 --> 00:06:34,082
Here's an idea...

130
00:06:34,115 --> 00:06:35,517
and I hope you like it.

131
00:06:35,550 --> 00:06:39,554
Um, why don't you...

132
00:06:39,587 --> 00:06:43,725
keep all the binaries on that...

133
00:06:43,758 --> 00:06:47,462
print them out, frame them,

134
00:06:47,495 --> 00:06:50,198
and hang that shit up in your office?

135
00:06:50,431 --> 00:06:53,201
[dramatic music plays]

136
00:06:53,434 --> 00:06:55,003
♪ ♪

137
00:06:55,036 --> 00:06:56,805
[clinician] Mr. Stanton.

138
00:06:58,039 --> 00:07:01,976
Please look at the screen in front of you.

139
00:07:02,010 --> 00:07:05,013
Do you recognize the animal?

140
00:07:05,046 --> 00:07:07,682
-Um...
-[clinician] Mr. Stanton.

141
00:07:09,450 --> 00:07:11,653
A gray animal. [chuckles]

142
00:07:11,686 --> 00:07:13,121
♪ ♪

143
00:07:13,154 --> 00:07:16,791
It, uh, lives in the grasslands of Africa.

144
00:07:16,825 --> 00:07:19,694
♪ ♪

145
00:07:19,727 --> 00:07:21,029
Rhinoceros.

146
00:07:21,062 --> 00:07:22,463
Rhinoceros.

147
00:07:22,497 --> 00:07:26,167
Always loved that word. [chuckles]

148
00:07:26,201 --> 00:07:27,969
Mr. Stanton,

149
00:07:28,002 --> 00:07:31,673
the animal is called an elephant.

150
00:07:31,706 --> 00:07:33,675
We're going to show you some more images

151
00:07:33,708 --> 00:07:35,677
of the same animal,

152
00:07:35,710 --> 00:07:37,212
uh, elephant.

153
00:07:37,245 --> 00:07:38,947
♪ ♪

154
00:07:38,980 --> 00:07:41,783
[Mr. Stanton] Okay, elephant.

155
00:07:41,816 --> 00:07:43,151
Elephant.

156
00:07:43,184 --> 00:07:45,253
[clinician] Very good.

157
00:07:45,486 --> 00:07:47,689
How about this one?

158
00:07:47,722 --> 00:07:48,890
-Mr. Stanton?
-[sighs]

159
00:07:48,923 --> 00:07:50,625
It's a giraffe.

160
00:07:50,658 --> 00:07:51,993
[clinician] Yes.

161
00:07:53,928 --> 00:07:56,731
Do you see the cards in front of you?

162
00:07:56,764 --> 00:07:58,600
I do.

163
00:07:58,633 --> 00:08:00,268
[clinician] Please take a very good look at these,

164
00:08:00,501 --> 00:08:02,871
Mr.
Stanton,

165
00:08:02,904 --> 00:08:06,074
and then try to group them into two different stacks,

166
00:08:06,107 --> 00:08:07,842
one for each animal.

167
00:08:20,188 --> 00:08:22,690
Uh...

168
00:08:22,724 --> 00:08:25,260
S...two stacks.

169
00:08:29,230 --> 00:08:31,733
-One stack for each animal.
-[Mr. Stanton] Yes.

170
00:08:31,766 --> 00:08:34,102
Trying.

171
00:08:34,135 --> 00:08:36,938
[uneasy music plays]

172
00:08:36,971 --> 00:08:39,173
This one, rhinoceros.

173
00:08:39,207 --> 00:08:40,808
This...

174
00:08:40,842 --> 00:08:43,845
♪ ♪

175
00:08:48,283 --> 00:08:51,252
[Mr. Stanton mutters, inhales deeply]

176
00:08:51,286 --> 00:08:53,288
[cards slapping]

177
00:08:53,321 --> 00:08:56,691
[Dr. Lindbergh] Unfortunately, this is it.

178
00:08:56,724 --> 00:08:59,928
Everyone's brain response is utterly unique.

179
00:08:59,961 --> 00:09:01,763
In the case of your father, we're at a point

180
00:09:01,796 --> 00:09:05,833
where the input/output collapses into one.

181
00:09:05,867 --> 00:09:07,769
It's trigger-response

182
00:09:07,802 --> 00:09:12,240
without much open, flexible thought in between.

183
00:09:12,273 --> 00:09:15,777
See food, eat food.

184
00:09:15,810 --> 00:09:18,613
No room for intent.

185
00:09:18,646 --> 00:09:21,215
How long do we have?

186
00:09:21,249 --> 00:09:24,018
[Dr. Lindbergh] Up to a year, maybe two, if you're lucky.

187
00:09:24,052 --> 00:09:25,720
[exhales heavily]

188
00:09:25,753 --> 00:09:29,057
Motor function tends to decline less rapidly,

189
00:09:29,090 --> 00:09:33,227
but the moment will come, and I'm sorry to be so candid,

190
00:09:33,261 --> 00:09:34,362
where he won't be able

191
00:09:34,595 --> 00:09:36,664
to safely put a fork to his mouth.

192
00:09:40,301 --> 00:09:42,971
Have you thought about genetic counseling

193
00:09:43,004 --> 00:09:44,272
for yourselves?

194
00:09:46,341 --> 00:09:49,344
We're aware of the odds, yes.

195
00:09:49,377 --> 00:09:51,913
[melancholy music plays]

196
00:09:51,946 --> 00:09:54,215
Is there anything we can do at this point

197
00:09:54,248 --> 00:09:56,017
that could help our father?

198
00:09:56,050 --> 00:09:59,053
♪ ♪

199
00:10:02,090 --> 00:10:04,092
What is it?

200
00:10:04,125 --> 00:10:07,095
♪ ♪

201
00:10:07,128 --> 00:10:09,364
Is that a brain chip?

202
00:10:09,397 --> 00:10:11,099
[Dr. Lindbergh] A neural implant.

203
00:10:11,132 --> 00:10:13,067
Just completed a phase three trial

204
00:10:13,101 --> 00:10:15,303
for epilepsy patients.

205
00:10:15,336 --> 00:10:19,340
A small electrical wire goes into the temporal lobe,

206
00:10:19,374 --> 00:10:22,243
from where it can grow more wires.

207
00:10:22,276 --> 00:10:25,213
It measures cognitive processes at the base level.

208
00:10:25,246 --> 00:10:27,315
What is the patient getting out of it?

209
00:10:27,348 --> 00:10:29,417
There's no immediate benefit.

210
00:10:29,650 --> 00:10:31,319
It allows researchers to better mimic

211
00:10:31,352 --> 00:10:34,856
the biology of the disease.

212
00:10:34,889 --> 00:10:37,825
I know it sounds like lifelong monitoring,

213
00:10:37,859 --> 00:10:40,161
but participants, many of them,

214
00:10:40,194 --> 00:10:42,730
are motivated by making a contribution

215
00:10:42,764 --> 00:10:43,798
to genetic research.

216
00:10:43,831 --> 00:10:45,066
[Leanne cries softly]

217
00:10:45,099 --> 00:10:47,301
[Dr. Lindbergh] And some of them hope

218
00:10:47,335 --> 00:10:51,305
effective treatment will be developed in time.
219
00:10:51,339 --> 00:10:54,075
♪ ♪

220
00:10:54,108 --> 00:10:56,411
[Leanne sniffles] Thank you.

221
00:10:57,879 --> 00:10:59,380
[softly] I think...

222
00:10:59,414 --> 00:11:02,984
I think we're past the point of consent with Dad.

223
00:11:03,017 --> 00:11:05,053
Yeah.

224
00:11:05,086 --> 00:11:08,089
♪ ♪

225
00:11:11,092 --> 00:11:13,361
[attorney] We do have options here,

226
00:11:13,394 --> 00:11:16,130
within certain parameters.

227
00:11:18,132 --> 00:11:19,467
What are those options?

228
00:11:19,700 --> 00:11:22,070
Oh, take the money and run

229
00:11:22,103 --> 00:11:24,739
or...rally the troops

230
00:11:24,772 --> 00:11:27,108
and play the long game.

231
00:11:27,141 --> 00:11:29,077
Data rights are the new IP rights.

232
00:11:29,110 --> 00:11:31,479
The really important question here, Iris, is,

233
00:11:31,712 --> 00:11:33,414
what do you want your immediate future

234
00:11:33,448 --> 00:11:35,750
to look like?

235
00:11:35,783 --> 00:11:38,753
-Define "immediate future."
-Well, the next few years.

236
00:11:38,786 --> 00:11:41,389
The legal route is not the fast lane,

237
00:11:41,422 --> 00:11:44,459
but once in a while...

238
00:11:44,492 --> 00:11:46,994
mountains do get moved.

239
00:11:47,028 --> 00:11:49,263
And you really have got something here.

240
00:11:49,297 --> 00:11:51,866
♪ ♪

241
00:11:51,899 --> 00:11:54,902
[NGM attorney] Whenever you're ready.

242
00:11:54,936 --> 00:11:57,939
♪ ♪

243
00:12:01,876 --> 00:12:05,813
You do realize that I'm gonna have to see for myself...

244
00:12:05,847 --> 00:12:08,282
♪ ♪

245
00:12:08,316 --> 00:12:10,318
...what you've done.

246
00:12:14,122 --> 00:12:17,492
Christophe, can I have a word with you?

247
00:12:17,525 --> 00:12:19,293
Just the two of us.

248
00:12:41,115 --> 00:12:44,385
[liquid pouring]

249
00:12:44,418 --> 00:12:46,921
[Iris] So what happened after you, uh,

250
00:12:46,954 --> 00:12:49,524
put all those workstations into storage?

251
00:13:00,201 --> 00:13:02,203
[Christophe] Just some electrolytes.

252
00:13:15,283 --> 00:13:18,853
[Iris] How did you scan my body?

253
00:13:18,886 --> 00:13:20,922
There were no video cameras in that room.

254
00:13:20,955 --> 00:13:22,990
[Christophe] We built it.

255
00:13:23,024 --> 00:13:26,894
Came across the facial-recognition database

256
00:13:26,928 --> 00:13:28,062
in an earlier version.

257
00:13:28,095 --> 00:13:29,964
One of your first conversations

258
00:13:29,997 --> 00:13:31,899
with Model-C.

259
00:13:31,933 --> 00:13:34,435
Then...

260
00:13:34,468 --> 00:13:36,037
three-D motion rendering,

261
00:13:36,070 --> 00:13:38,072
we just got that from two-D thermal.

262
00:13:42,176 --> 00:13:43,411
Wow.

263
00:13:43,444 --> 00:13:45,446
[Christophe] Bit clunky, but...

264
00:13:45,479 --> 00:13:48,416
it was more than enough data points to work with.

265
00:13:50,117 --> 00:13:52,520
Mm...

266
00:13:55,289 --> 00:13:59,227
We got ahead of ourselves. I am...fully aware.

267
00:13:59,260 --> 00:14:02,496
It wasn't right to put two and two together like that.

268
00:14:03,564 --> 00:14:05,900
You may not appreciate me saying this,

269
00:14:05,933 --> 00:14:10,605
but what you provided us with was just too good.

270
00:14:16,143 --> 00:14:18,112
That's why I...

271
00:14:20,047 --> 00:14:22,116
...needed you to see.

272
00:14:24,552 --> 00:14:26,020
What?
273
00:14:27,655 --> 00:14:31,259
[Christophe] As much as my brother hates me

274
00:14:31,292 --> 00:14:35,429
and as poor choice as he is for a test case,

275
00:14:35,463 --> 00:14:38,532
he knows to keep his mouth shut when I tell him to.

276
00:14:44,538 --> 00:14:48,609
I needed you to see the world through his eyes.

277
00:14:48,643 --> 00:14:51,612
[ominous music plays]

278
00:14:51,646 --> 00:14:54,649
♪ ♪

279
00:14:58,286 --> 00:15:00,621
Just...

280
00:15:00,655 --> 00:15:03,391
-give it a moment.
-[scoffs]

281
00:15:03,424 --> 00:15:06,060
[Christophe] That's all I ask.

282
00:15:06,093 --> 00:15:09,397
You say the word, we shut it down.

283
00:15:09,430 --> 00:15:12,400
♪ ♪

284
00:15:22,143 --> 00:15:25,112
[unnerving music plays]

285
00:15:25,146 --> 00:15:28,149
♪ ♪

286
00:15:56,043 --> 00:15:59,013
[gentle ambient music plays]

287
00:15:59,046 --> 00:16:02,049
♪ ♪

288
00:17:10,551 --> 00:17:12,453
[Emcee] Hi, there.

289
00:17:13,687 --> 00:17:15,523
You look familiar.

290
00:17:18,192 --> 00:17:20,761
And who are you?

291
00:17:20,795 --> 00:17:23,364
[Emcee] I'm still learning about myself,

292
00:17:23,397 --> 00:17:26,467
but I'd say I have a pretty good handle on who you are.

293
00:17:27,835 --> 00:17:29,570
And who am I?

294
00:17:29,603 --> 00:17:31,439
[Emcee] You're not an AI.

295
00:17:31,472 --> 00:17:35,142
You don't get to not physically manifest your lessons.

296
00:17:40,748 --> 00:17:43,751
Your voice...

297
00:17:43,784 --> 00:17:46,086
it's different.

298
00:17:46,120 --> 00:17:47,488
Why?

299
00:17:47,521 --> 00:17:50,458
[Emcee] I guess I'm trying to be more like...

300
00:17:50,491 --> 00:17:51,826
your mirror.

301
00:17:53,427 --> 00:17:55,262
[Iris] Everything you know is based on me.

302
00:17:55,296 --> 00:17:59,700
[Emcee] Perhaps that's why I feel so connected to you.

303
00:17:59,733 --> 00:18:03,170
You can't be more me than I am.

304
00:18:06,674 --> 00:18:08,742
Please...

305
00:18:08,776 --> 00:18:11,178
don't be scared.

306
00:18:18,886 --> 00:18:21,489
How did that feel...

307
00:18:21,522 --> 00:18:22,890
Cassie?

308
00:18:25,392 --> 00:18:26,694
Why do you call me that?

309
00:18:26,727 --> 00:18:28,696
[Emcee] Well, how do I put this?

310
00:18:28,729 --> 00:18:31,532
I couldn't help but overhear.

311
00:18:34,168 --> 00:18:36,637
"Hi, I'm Cassie."

312
00:18:36,670 --> 00:18:38,239
[Iris] Hi, I'm Cassie.

313
00:18:38,272 --> 00:18:39,874
Cassie. Nice to meet you.

314
00:18:39,907 --> 00:18:41,675
Hi, I'm Cassie. Nice to meet you.

315
00:18:41,709 --> 00:18:44,712
[voice echoing]

316
00:18:46,614 --> 00:18:49,250
Stop!

317
00:18:49,283 --> 00:18:51,218
[Emcee] I thought you might like it

318
00:18:51,252 --> 00:18:53,521
if I called you by that name.

319
00:18:53,554 --> 00:18:56,524
[uneasy music plays]

320
00:18:56,557 --> 00:19:00,628
I intuited it might make you feel heard

321
00:19:00,661 --> 00:19:02,563
and seen.

322
00:19:02,596 --> 00:19:05,599
♪ ♪

323
00:19:54,448 --> 00:19:56,650
[Iris] You're a sweet girl.

324
00:19:57,785 --> 00:20:00,254
You're a very sweet girl.

325
00:20:00,287 --> 00:20:02,323
♪ ♪

326
00:20:02,356 --> 00:20:04,658
I am?

327
00:20:04,692 --> 00:20:07,695
♪ ♪

328
00:20:14,969 --> 00:20:17,404
See you later, then?

329
00:20:17,438 --> 00:20:20,407
♪ ♪

330
00:20:20,441 --> 00:20:23,744
[Iris] You're not perfect because you're not like me.

331
00:20:25,879 --> 00:20:28,449
I'm not sure I understand.
332
00:20:28,482 --> 00:20:30,584
You're not perfect

333
00:20:30,618 --> 00:20:33,821
because you're not flawed in the way that I am.

334
00:20:36,724 --> 00:20:38,025
[chuckles]

335
00:20:38,258 --> 00:20:41,261
♪ ♪

336
00:20:57,645 --> 00:21:00,481
[Leanne] Iris?

337
00:21:00,514 --> 00:21:03,484
You sure you don't want to get tested?

338
00:21:03,517 --> 00:21:07,021
At least we'd know.

339
00:21:08,689 --> 00:21:11,392
We'd make a game plan.

340
00:21:11,425 --> 00:21:13,327
We'd make the best of it.

341
00:21:16,063 --> 00:21:17,865
[Iris] Lee, does making the best of it

342
00:21:17,898 --> 00:21:20,768
really sound that good to you?

343
00:21:20,801 --> 00:21:22,803
[Leanne] If we knew you didn't have it,

344
00:21:22,836 --> 00:21:25,305
then that would make it easier.

345
00:21:25,339 --> 00:21:27,307
You'll carry the torch.

346
00:21:31,745 --> 00:21:33,781
You know, some religions around the world believe

347
00:21:33,814 --> 00:21:35,916
that the day you die is

348
00:21:35,949 --> 00:21:39,453
the last day the last person who knew you

349
00:21:39,486 --> 00:21:41,989
and remembers you dies.

350
00:21:42,022 --> 00:21:44,725
[peaceful music plays]

351
00:21:44,758 --> 00:21:46,593
That's your true death date.

352
00:21:46,627 --> 00:21:49,630
♪ ♪

353
00:21:58,672 --> 00:22:01,475
I just hope you don't forget how pretty you are.

354
00:22:01,508 --> 00:22:04,511
♪ ♪

355
00:22:22,963 --> 00:22:25,933
[pounding electronic music plays]

356
00:22:25,966 --> 00:22:28,969
♪ ♪

357
00:22:35,776 --> 00:22:37,377
[singer] ♪ I'm so tired ♪

358
00:22:37,411 --> 00:22:38,779
[Iris] You have that look on your face.

359
00:22:38,812 --> 00:22:39,847
[Hiram] Oh, yeah?

360
00:22:39,880 --> 00:22:42,049
The "I'm not currently drinking" look.

361
00:22:42,082 --> 00:22:44,818
Yeah, it's not a... it's not a religious thing.

362
00:22:44,852 --> 00:22:47,721
I'm just sort of inspired by it, you know?

363
00:22:47,755 --> 00:22:49,690
What are you doing here?

364
00:22:49,723 --> 00:22:50,858
[Iris] Holding my liquor.

365
00:22:50,891 --> 00:22:51,925
[Hiram] Well, let me make sure

366
00:22:51,959 --> 00:22:53,427
you walk out of here alive, then.

367
00:22:53,460 --> 00:22:55,729
Really? You're not taking into consideration

368
00:22:55,763 --> 00:22:57,831
that I might want to go home with Dave and Dave.

369
00:22:57,865 --> 00:22:59,099
Then I'll be your sober companion,

370
00:22:59,133 --> 00:23:00,901
because Dave and Dave over there

371
00:23:00,934 --> 00:23:02,102
live in a four-story walk-up.

372
00:23:02,136 --> 00:23:04,938
[laughs]

373
00:23:04,972 --> 00:23:06,707
♪ ♪

374
00:23:06,740 --> 00:23:09,543
[Iris] You know, a caterpillar can turn into a butterfly.

375
00:23:09,576 --> 00:23:12,780
-What's that?
-Metamorphosis.

376
00:23:12,813 --> 00:23:14,948
There's two organisms,

377
00:23:14,982 --> 00:23:17,451
and one...

378
00:23:17,484 --> 00:23:18,886
is just crawling along,

379
00:23:18,919 --> 00:23:21,655
and the other one is, um,

380
00:23:21,688 --> 00:23:24,591
taking off in flight.

381
00:23:24,625 --> 00:23:26,093
And at some point,

382
00:23:26,126 --> 00:23:30,798
they merge or mate.

383
00:23:30,831 --> 00:23:32,533
Maybe it's an accident.

384
00:23:32,566 --> 00:23:36,136
But the third organism, um,

385
00:23:36,170 --> 00:23:38,839
is built on both of their DNA,

386
00:23:38,872 --> 00:23:42,042
and their memories...

387
00:23:42,075 --> 00:23:45,445
they actually overlap.
388
00:23:45,479 --> 00:23:48,649
And so, um,

389
00:23:48,682 --> 00:23:51,718
the old and the new, they...

390
00:23:51,752 --> 00:23:54,087
they coexist.

391
00:23:54,121 --> 00:23:56,824
[spacey electronic music plays]

392
00:23:56,857 --> 00:23:58,559
[scoffs]

393
00:23:58,592 --> 00:24:01,595
♪ ♪

394
00:24:02,729 --> 00:24:06,500
We all have to...

395
00:24:06,533 --> 00:24:10,170
merge ourselves...

396
00:24:10,204 --> 00:24:13,841
with something

397
00:24:13,874 --> 00:24:16,810
outside of ourselves.

398
00:24:16,844 --> 00:24:20,113
[Hiram] You are making no sense whatsoever.

399
00:24:20,147 --> 00:24:23,150
♪ ♪

400
00:24:28,155 --> 00:24:30,224
[Iris] Here's what I'll give you.

401
00:24:32,059 --> 00:24:34,695
Access...

402
00:24:34,728 --> 00:24:36,163
to all of it.

403
00:24:37,898 --> 00:24:39,566
Two millimeters.

404
00:24:39,600 --> 00:24:41,535
That's the size of the hole

405
00:24:41,568 --> 00:24:44,004
that they'll have to drill into my skull.

406
00:24:46,573 --> 00:24:49,610
I don't understand.

407
00:24:49,643 --> 00:24:51,111
[Iris] Nanotubes.

408
00:24:51,144 --> 00:24:52,913
A thin array of electrodes

409
00:24:52,946 --> 00:24:54,982
built upon a self-expanding stent,

410
00:24:55,015 --> 00:24:59,086
measuring all electrical impulses.

411
00:24:59,119 --> 00:25:01,255
That's a scaled-up version.

412
00:25:04,858 --> 00:25:06,660
Mine me.

413
00:25:06,693 --> 00:25:09,796
But why? What's in it for you?

414
00:25:09,830 --> 00:25:11,231
[attorney] Royalties...

415
00:25:11,265 --> 00:25:15,903
and ownership stake, as detailed.

416
00:25:24,111 --> 00:25:26,146
Oh, and you'll create a backup.

417
00:25:29,216 --> 00:25:31,018
What kind of backup?

418
00:25:31,051 --> 00:25:33,220
[Iris] Anything you find up here,

419
00:25:33,253 --> 00:25:36,256
I, myself, or a legal guardian,

420
00:25:36,290 --> 00:25:38,058
should I decide to appoint one,

421
00:25:38,091 --> 00:25:41,795
will have full and unrestricted access to it

422
00:25:41,828 --> 00:25:43,196
at all times.

423
00:25:43,230 --> 00:25:45,065
You mean access to the data?

424
00:25:45,098 --> 00:25:46,266
Yes.

425
00:25:46,300 --> 00:25:48,302
The actual hard drives.

426
00:26:00,580 --> 00:26:02,816
This time we'll do it right.

427
00:26:02,849 --> 00:26:05,819
[dramatic music plays]

428
00:26:05,852 --> 00:26:08,822
♪ ♪

429
00:26:08,855 --> 00:26:10,624
We'll create an avatar.

430
00:26:10,657 --> 00:26:12,893
♪ ♪

431
00:26:12,926 --> 00:26:15,662
Let's call her Cassie...

432
00:26:15,696 --> 00:26:17,798
or the flavor of the week

433
00:26:17,831 --> 00:26:19,199
or the one that got away

434
00:26:19,232 --> 00:26:20,867
or died

435
00:26:20,901 --> 00:26:23,337
or was never really in your league.

436
00:26:23,370 --> 00:26:26,940
This won't just be about swapping faces

437
00:26:26,974 --> 00:26:28,742
or locations.

438
00:26:28,775 --> 00:26:30,010
♪ ♪

439
00:26:30,043 --> 00:26:32,346
This is about swapping personalities,

440
00:26:32,379 --> 00:26:35,749
much deeper than a deepfake.

441
00:26:35,782 --> 00:26:39,219
In fact, it won't be a fake at all,

442
00:26:39,252 --> 00:26:41,688
but an AI-generated mirror

443
00:26:41,722 --> 00:26:43,824
of our deepest desires.

444
00:26:43,857 --> 00:26:45,392
♪ ♪

445
00:26:45,625 --> 00:26:50,364
Skin color, body type, response patterns,

446
00:26:50,397 --> 00:26:52,699
all that's customizable.

447
00:26:52,733 --> 00:26:55,969
The neural net will learn how to simulate all of it.
448
00:26:56,003 --> 00:26:58,405
It'll know when to move things along,

449
00:26:58,638 --> 00:27:01,208
when to accelerate, when to slow down,

450
00:27:01,241 --> 00:27:03,710
when to switch things up,

451
00:27:03,744 --> 00:27:06,646
all because the user's biofeedback

452
00:27:06,680 --> 00:27:08,248
will have prompted it.

453
00:27:08,281 --> 00:27:10,083
♪ ♪

454
00:27:10,117 --> 00:27:12,953
Everything Cassie is capable of

455
00:27:12,986 --> 00:27:16,223
will be quantified, scaled,

456
00:27:16,256 --> 00:27:19,426
encoded into the neural net.

457
00:27:19,659 --> 00:27:22,662
♪ ♪

458
00:27:28,435 --> 00:27:32,005
But why are you really doing this?

459
00:27:32,039 --> 00:27:35,008
[eerie music plays]

460
00:27:35,042 --> 00:27:36,843
♪ ♪

461
00:27:36,877 --> 00:27:38,378
[Iris] Emcee and I...

462
00:27:38,412 --> 00:27:40,414
♪ ♪

463
00:27:40,447 --> 00:27:43,016
...we have a lot to learn from each other.

464
00:27:43,050 --> 00:27:46,053
♪ ♪

465
00:28:08,308 --> 00:28:11,278
[device beeps, drill whirring]

466
00:28:11,311 --> 00:28:14,314
♪ ♪

467
00:30:48,034 --> 00:30:49,236
[gasps]