
Why We Don't Need a Department of Technology Policy

Donald Trump meets with tech leaders at Trump Tower, December 14, 2016. (Polaris/Newscom)


There seem to be embarrassing new "Internet of Things" failures every week now. Sometimes they are on the humorous side, like when a "smart toilet" was hacked to randomly flush at startled bathroom-goers. Other times they can be disturbing, as in the case of critical vulnerabilities in St. Jude Medical's implantable cardiac devices that could put users' lives in the hands of hackers. But in all cases, these failures tend to grab headlines and inflame calls for government regulation.

It's not hard to see why. When faced with some kind of public dilemma, many people immediately assume that only the government can solve the problem. And when you throw in futuristic fears about losing control of the everyday things around us, the prospect of a savior from above seems all the more appealing. But we must take care that such "solutions" don't create more problems than they supposedly solve. That would almost certainly be the case with one recent proposal: a "Department of Technology Policy."

A 'World-Size Robot'

Recently, Bruce Schneier, a veteran of information security and a leading voice in technology policy, penned a long article for New York magazine in which he argues for the creation of a new federal agency, a "Department of Technology Policy," that would consolidate control of technology regulation in a single body. Schneier explains how the incredible rate of "smart"-device adoption has created new and unprecedented security challenges.

Few people realize just how quickly IoT devices have saturated the world around them, and the trend will only accelerate. Schneier likens the rise of IoT technologies to building a "world-size robot," with all of the sensors, commands, and computations to match. And with an expanded connected reality comes an expanded set of digital threats. Computer bugs and software vulnerabilities no longer merely endanger personal data and hardware; they can potentially shut down connected home devices, hijack moving cars, and even cause us physical harm.

Indeed, there have been considerable security problems with connected devices. Often the issues are theoretical: Security researchers warn the public at conferences and in journals about major vulnerabilities they discover in popular consumer routers, printers, or security cameras, vulnerabilities that may or may not end up getting patched.

But sometimes these vulnerabilities are actually exploited. In October 2016, some of the Internet's most popular websites, including Twitter, Amazon, GitHub, and Reddit, were knocked offline thanks to insecure IoT devices. A malicious actor infected an army of DVRs, cameras, baby monitors, and printers with malware called Mirai, directing these devices to launch a distributed denial-of-service (DDoS) attack on Dyn, the DNS provider those websites relied on. While the attack was short and the fallout was mostly limited to inconvenience and lost sales, it was a major warning signal for security researchers, who could envision how such an attack might have been far more devastating.

The main problem, as Schneier sees it, is that many companies developing and selling connected devices lack the security chops to make sure their products are safe before people buy them. Technology companies like Google and Apple have large dedicated teams to locate and patch software vulnerabilities as quickly as possible, and even this process is imperfect. Now companies with no such software experience may put IoT products on the market without the necessary testing, which could create major unexpected problems down the road. And the home consumers who buy such devices are seldom equipped to evaluate the security settings on their own.

Whose Failure?

While Schneier's essay does an excellent job of describing the new security challenges that smart devices create, it falls short on solutions. "The market can't fix this," Schneier suggests, "because neither the buyer nor the seller cares … There is no market solution, because the insecurity is what economists call an externality: It's an effect of the purchasing decision that affects other people. Think of it kind of like invisible pollution."

Like many who make "market failure" arguments, Schneier believes that the government alone can intervene to fix the problem. Specifically, he thinks an entirely new federal agency is needed, fearing that without a Department of Technology Policy nothing could compel device manufacturers to internalize the externalities of poor digital security.

But behind every suspected market failure is usually an existing government failure. Schneier himself says as much when he discusses the many laws that inhibit security research and contribute to smart-device insecurity. In particular, laws like the Digital Millennium Copyright Act (DMCA) and Computer Fraud and Abuse Act (CFAA) penalize computer scientists who try to test or report certain software vulnerabilities. These laws should be amended before we do anything else.

Perhaps more importantly, when considering how best to address market failures, we must not succumb to what economist Harold Demsetz called the "Nirvana fallacy." If you compare an imperfect existing situation with the perfect ideal of government intervention, of course the government solution will be tempting. But government bodies operate in an imperfect reality, and once created, they will generate their own set of unintended consequences, which will be very hard to undo.

Tech Policy Touches Everything

Beating back even a single federal regulation often requires a rare combination of years of scholarly attention, unwavering political will, and random chance. Unfortunately, bad government policy can afflict society for decades through sheer institutional inertia. If a newly created "Department of Technology Policy" proved useless, incompetent, or corrupt, it would be very hard, if not impossible, to set it right or shut it down. And since "technology" now touches so much of our lives, any new Department of Technology Policy runs the risk of becoming entrenched in most of the things that we do.

Often, social problems can seem unprecedented or intractable when new technologies are involved. Yet again and again, society has adapted legal and social solutions developed for earlier problems and applied them to the new technology. The "security pollution" that Schneier describes, for example, could be addressed through common-law precedents in the courts, voluntary standards-setting bodies, or third-party audits.

In other situations, new technological solutions may be appropriate. Entrepreneurs and researchers are hard at work on new IoT security solutions; after all, where there is a great social need, there is a great profit opportunity. But if a "Department of Technology Policy" preemptively blocks such research, or mandates that companies dedicate their resources elsewhere, these solutions may never be discovered.

Security researchers like Schneier provide a great service in bringing attention to the newest technological problems that arise. Markets are not perfect, and they certainly don't immediately fix problems in the exact ways that we might want. But where government regulation is inflexible and susceptible to capture, market processes are adaptive and biased toward improvement. Rather than wishing for a Department of Technology Policy, we should focus on overturning the bad existing government policies that undermine security. With patience and humility, we may find that what we thought was a "market failure" was in actuality a market opportunity all along.
