Blade Runner and the Treasury Secretary

I went to see Blade Runner last night. A re-run; indeed, for me, a multi-run. It's actually been 35 years since this epochal movie first jolted our minds. Way before Google, before Facebook, before cellphones, before much of anything technological save wheelbarrows and telex machines and those huge basement "computer rooms" with ridiculously large tapes and white-coated operators (my first ever job was to feed one with forms). Back when Harrison Ford was young, as were some of us.

Blade Runner, briefly put, is about "replicants": humanoid robots who go rogue; so humans, of course (in this case, the aforesaid young Harrison Ford), need to bring them to heel by shooting them.

But it's worth recalling that Michael Crichton's Westworld antedated Blade Runner by nearly a decade: another humanoid drama focused on the profound ambiguities we are creating, which, despite Yul Brynner's going-rogue gunslinger, never quite managed to break the "cult movie" barrier. Its revival as a series on binge-available TV, together with the derivative Humans, is re-focusing the question - perhaps the very greatest of our human questions - of the meaning, distinctiveness, and putative uniqueness of our nature vis-a-vis machines. And all this in a world equally fixated on the significance of human rights, and on the unbounded good offered by emerging technologies.

I'm very glad I was able to spend time with Mike Crichton on a couple of occasions before his untimely death in 2008. One was a conference we co-sponsored at the Salk Institute that brought together leaders from science and technology and the arts. On another occasion, we invited him to address members of the United States Senate on intellectual property over-reach in biotechnology (he began, "I own a lot of intellectual property"!). A man of prodigious intellect and creativity (one week, I'm told, he was responsible for the top-watched TV show, the highest-grossing movie, and the best-selling novel), he was also a Harvard MD (and, for what it's worth, 6 foot 9) - and a man of conscience and humility. As we sat waiting for the Senate meeting he was fumbling nervously with his notes.

I have never met Secretary Mnuchin, who declared this week that it would be 50-100 years before the impact of AI/robotics on human employment would be evident. It was perhaps the single most ridiculous statement from a government official that I have ever heard, and of course, like you, I have heard plenty! And it epitomized the cleavage between the best thinking in science and technology and the top thinking in government, a gap that has emerged in the past generation as a leading, and deeply uninformed, source of risk in the developed economies. (In case you've not been following the debate, PwC just came out with a report suggesting that nearly 40% of U.S. jobs could be defunct within 15 years. There have been others.)

Of course, that needs some clarification. U.S. governments of both parties have appointed much-respected advisers and assorted acclaimed advisory groups. And OSTP (the White House Office of Science and Technology Policy) has long been a center of exceedingly smart yet underfunded effort with depressingly limited influence. As we at C-PET have persistently pointed out over the past decade, the impact on government has been minimal, if measurable at all. Which is not to say that living memory offers another statement as ridiculously ignorant as the one that just issued from the Treasury Secretary. But it does underline the disjunction between what the U.S. government likes to portray itself as doing and what it actually does. One example: President Obama's decision, late in his term, to set up a blue-ribbon commission on cybersecurity. The private comment from one of the best-placed cyber experts: blah blah. Of course, this need not be a remotely partisan observation, as the current administration's entirely ridiculous (and anti-American) efforts to cut federal R&D funding demonstrate.

The jobs issue offers us a keyhole view of the future. As I argue in my upcoming book, Will Robots Take Your Job? A Plea for Consensus, reliance on "conventional wisdom" may prove disastrous. The question of the 21st century goes deep, and poses the human question in the face of truly extraordinary technological advance. In policy terms, what is government's role as tech issues come to dominate every aspect of our human experience? (And when our top officials seem to have so little feel for what is going on.) Is the 21st century essentially more of the 20th, or is tech prompting a fundamental step-change in human possibility, opportunity, and risk? I sat through every single one of the presidential debates, as many of you did, and heard not one word on these questions....
The Mnuchin statement may be an outlier in its ridiculousness - savvier officials have simply ignored the question - but it is also a wake-up call to those of us who are properly sensitized to change, disruption, and the future.

Go see Blade Runner; go reflect on technological change, a la Moore's Law, over the past 35 years; read the PwC report; and let the deep ignorance of Secretary Mnuchin's remarks disturb you. Then work for an America that is actually focused on the future.

Best regards,

Nigel Cameron
President and CEO
Center for Policy on Emerging Technologies
Washington, DC