Our discussion about the future of the workforce would not be complete without an exploration of the impact that current innovations in technology, specifically artificial intelligence, will have. Over the past three years, there have been significant innovations in big data, database architecture, and artificial intelligence that are enabling new business models and products. In the simplest terms, the innovations in artificial intelligence are equipping algorithms to make smarter decisions about tasks and problems that have, so far, been thought to be reserved for humans. The accompanying innovations in, and falling cost of, the hardware that enables artificial intelligence algorithms to read and process incredibly large data sets have motivated more entrepreneurs and technologists to innovate with artificial intelligence.

Many think that the implications for the workplace will hurt workers, similar to how the Industrial Revolution displaced many workers through the creation of technology that was cheaper and, overall, more efficient than people. People believe that AI will replace not just blue-collar workers but also white-collar workers. However, the fact is that the impact on workers is under our control: the control of the entrepreneurs, technologists, and everyday citizens who are part of the workforce. Business owners and workers can harness AI to augment human intelligence in the workplace and creatively improve decision making instead of replacing it. Here's Rob High, the CTO of IBM Watson, one of the primary R&D leaders in the AI space, making this exact point at the 2016 MIT Technology Conference.

If we become experts in everything, we're experts in nothing. And so the same phenomenon occurs within these cognitive systems as well. They can actually begin to get a little bit confused in how to ration-- this would be a good question. So a couple of points to be made about that.
That said, because each of these is distinct, we can actually set the system up based on whatever is appropriate for the application we're trying to solve for. For the case of oncology treatment advice, the goal is actually about trying to identify the best treatment based on outcomes, based on standard-of-care practices and clinical expertise, based on the similarity of this patient to other patients. And all of those can contribute to help Watson come back with an ordered list of potential treatments. But again, we're not trying to make the decision for the human. We're not trying to think for the human. What we're trying to do is do the research for the human, so the human can then go make the decision better. In this case, the doctor. So we're not going to take-- Watson doesn't make the decision about what treatment to give for the doctor. It presents the doctor with a set of relevant treatments that has been rationalized based on all this other information, at a speed that a doctor could not match on their own, including finding all the literature that's relevant to support why they should believe in that treatment or not.
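To make the decision-support pattern High describes a little more concrete, here is a minimal sketch in Python. It is illustrative only, not IBM Watson's implementation: the evidence signals (outcome data, standard-of-care fit, patient similarity), the weights, and the treatment names are all hypothetical, assumed purely to show how a system could rank options and surface supporting literature while leaving the final decision to the doctor.

```python
# Illustrative sketch only: NOT IBM Watson's actual implementation.
# Hypothetical evidence scores are combined to rank treatments; the ranked,
# evidence-backed list is handed to the clinician, who makes the decision.
from dataclasses import dataclass, field

@dataclass
class Treatment:
    name: str
    outcome_score: float      # assumed score from outcome data, 0..1
    guideline_score: float    # assumed fit with standard-of-care guidelines, 0..1
    similarity_score: float   # assumed similarity of this patient to prior cases, 0..1
    citations: list = field(default_factory=list)  # supporting literature

def rank_treatments(treatments, weights=(0.4, 0.3, 0.3)):
    """Order treatments by a weighted combination of the evidence signals."""
    w_out, w_guide, w_sim = weights
    def score(t):
        return (w_out * t.outcome_score
                + w_guide * t.guideline_score
                + w_sim * t.similarity_score)
    return sorted(treatments, key=score, reverse=True)

# Hypothetical candidates; the output is a ranked list with citations,
# not a chosen treatment.
candidates = [
    Treatment("Regimen A", 0.82, 0.90, 0.75, ["Smith 2015"]),
    Treatment("Regimen B", 0.88, 0.70, 0.80, ["Lee 2014", "Chan 2016"]),
]
for t in rank_treatments(candidates):
    print(t.name, t.citations)
```

The design point High emphasizes is visible in the sketch: the system's output is a ranked, evidence-backed list for the doctor to review, not an automated choice.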
So what specific types of disruptions can we expect AI to cause for employers and employees? Let's take a look at what David Autor, a professor of economics at MIT, said on the Future of Work panel at the Nobel Week Dialogue of 2015.

In terms of employment, as was said earlier, people have been worried for centuries about the displacement of labor. And labor has been vastly displaced. At the start of the 20th century, 40% of US employment was on farms. Now it's under 2%. We had no idea what was coming. But it came. We find uses for ourselves. And that's partly because the technology augments us; it doesn't just replace us. It makes the rest of what we do more valuable. So, you know, I can make more as an Uber driver than as a rickshaw driver, not just because I'm in a rich world, but because I have a tool that makes me able to transport people much faster and further, and more safely, and more comfortably, in a given amount of time. In many, many ways, the tools we make make our time more valuable, because there's always some piece that we still have to supply, and that becomes a scarce factor that raises our labor value. So there's a challenge-- So now let me say, but let's say I'm wrong, and we're all going to be replaced by robots. That they can do all the work we want. Is that a problem? Well, if it's a problem, it's a very unusual historical problem. Most problems, economic problems, societal problems, are problems of scarcity, of not having enough of something: enough food, enough power, enough safety, enough shelter.

In addition to changing the kinds of roles that workers can take on in the workplace, AI-based innovation can also impact social inequality, for better or worse, to be determined by how we choose to behave. For example, women and people of color are very under-represented among the technology entrepreneurs who receive funding from venture capitalists for their enterprises. If venture capital investment goes towards a more diverse set of entrepreneurs, then we will see social inequality improve. Here's Amanda Kahlow, CEO of 6Sense, a predictive intelligence platform for marketing and sales, expanding on the current under-representation, based on her experiences.

The short answer is yes, I feel supported. But I do feel-- so I go to a lot of CEO events and sit on panels like this. The last one, I was in Hawaii just a little while ago: 250 CEOs, and I was the only woman. And in enterprise tech, I'm less than 1% in this world. And it's crazy. And it doesn't feel right.
And I think it's because we're missing that confidence, and the passion to go forward and want to dream big. Do I feel supported by the men? I think yes, I'm supported. I think, pretty much, I feel very supported. Especially my male co-founders, who are ridiculously supportive. And actually, I think that they have more faith and confidence in me than I do myself, which is amazing. I think I have had to, maybe, guide and teach some of my VCs and other people in the world what's important. And it's not always just about-- I think we were talking earlier about the end goal of the business, and growing the business-- but creating a sustainable company is also about the culture and the people. And I think that what we're missing in today's workforce is putting an emphasis on that, in addition to all of the great technology that we can build. And that's the thing that the women bring to the table. I'm super proud of our lack of attrition at 6Sense. We don't have people leave, and it's because they feel supported and heard, and they can also be creative and do amazing things in the work that we're providing. But I think it's not intentional when men aren't supporting. I was actually sitting on a plane the other day talking to a gentleman. And he asked me what I did. And I never come out and say, I'm the CEO of this company. I say I work for a software company. And I started talking about the stuff that we're doing. And then, two hours into the conversation, he's like, well, what do you do there? Are you in marketing? And I was like, no, I'm the CEO. And he got up from his seat, and he had this visceral reaction. And he didn't mean to. And then he sat down. He's like, I'm so sorry, I didn't mean to react like that. And he didn't intend to--

[LAUGHTER]

It was just a lack of filter, which we all have sometimes.
When he realized who I was in the company-- and I wear braids and pigtails, and I don't always dress the part of something else. I always struggle with being my authentic woman self. I want to maintain that while dominating in the male world that I'm in. So I think there are levels of education that we can give others. Thank you.

Income and social inequality between genders will not improve unless we choose to make fairer decisions about who gets opportunities to create and run companies in the tech industry. If we do not actively monitor our biases, we will not be able to take advantage of this period of innovation and wealth creation to move towards a more equitable future. There is also an opportunity gap in who gets to work within companies in the tech industry. Leila Janah, founder and CEO of Sama Group, an organization that lifts people out of poverty by enabling them to do work in the digital economy, discusses this opportunity gap and how we can mitigate it.

Now, interestingly, traditionally in the informal sector, if you're doing, say, construction work, and you go and find your job by standing on the side of the road and waiting for someone to hire you as a labor contractor, there is no such thing as getting rewarded for good work. You might put in a wonderful 10-hour shift, but you will get no five-star rating on a website. You go back to zero the next day, standing on the side of the road waiting for your next job. So I think that technology actually brings a lot more transparency in areas like wages, and reviews, and feedback, which is very good for low-income workers. And I'll say one last thing on this, which is that-- David and I were talking about this just before the panel-- we so often assume that technology has some kind of morality built into it. Technology, in fact, is agnostic. It's as agnostic as roads and waterways. It's all about what we do with technology.
We can choose to inject morality into the technology we build. We can choose to build systems that enfranchise the poor. Or we can choose not to. But we're not victims of technological progress. It's not happening in a way that we can't control. At least until the singularity happens, at which point all bets are off.

[LAUGHTER]

The first is, job training needs to completely change. In the US, which is the example I know best, it's abysmal. We are still training people for jobs that are disappearing at an alarming rate. We are not training them in how to be entrepreneurial and market themselves. The skill that you need for 21st-century jobs is marketing yourself. If Kim Kardashian can teach us anything, it's probably only that. But interestingly, all of these new platforms require you to be able to set up a profile, to choose the right kind of photograph that will attract customers, to send a follow-up email, to have customer service skills. And you're absolutely right that many low-income people-- And just to give you an idea of how profound this is, in the US we have one of our centers less than a mile from the Facebook headquarters in Silicon Valley, in East Palo Alto, which is a very poor community. And fully 25% of the people in our incoming class for one of our trainings last year had zero internet access at home and no smartphone. A mile from the Facebook campus. So we can talk all we want about how technology is going to make everyone's lives better, but the reality is that many people are disconnected. And if you didn't grow up having the internet at home, how can you possibly understand how to market yourself? It's not easy to be an autodidact, and to learn how to teach yourself things on YouTube, if you didn't grow up with the internet.

The advancements in technology taking place right now are incredibly exciting, and we all stand to gain from them as consumers.
However, as with other periods of societal change, we, citizens and the government, have to ensure that the opportunities to launch, run, and work in these new enterprises are made available to all members of society.