The scandal over Uber’s Greyball program is causing some people to look back to an earlier scandal over the permissions requested by the Uber app on Android. Besides the likelihood that Uber’s problems are about to get a lot worse, that earlier scandal may well end up changing the rules for the whole technology industry.
The elevator pitch for Uber is that they are in the business of regulatory arbitrage. For a variety of reasons, many towns and cities throughout the world restrict the supply of taxi licenses, leaving far fewer than are needed to meet demand. For example, it is difficult to get a cab in many parts of NYC because the limited pool of licensed drivers sticks to the most profitable runs. Uber provides a technical infrastructure that allows its drivers to meet that unmet demand.
One consequence of Uber’s business plan is that it is a defendant in ongoing and expanding litigation throughout the world. Faced with the cost and complexity of bringing a case against the Uber corporation and its legions of high-priced lawyers, most local taxi regulators have focused their enforcement efforts on the drivers.
As Uber approaches what is expected to be the hottest IPO of 2017, questions about its business model and practices are mounting.
The first major crack in Uber’s public image came when Susan Fowler, a former engineer at the company, wrote about the sexual discrimination and harassment she had faced during her year at the firm. After the usual and predictable PR pushback, Uber responded with the very public firing of a senior vice president of engineering who had resigned from his previous employer after an employee there accused him of sexual harassment. But this SVP is certainly not the person who harassed Fowler: she quit Uber at least a month before he joined.
The complications of honeypots
News of the Greyball program broke in the wake of the harassment scandal. In a nutshell, Greyball was a set of tools designed to identify likely law enforcement officers so they were not offered Uber service—serving as a real-life version of a tool security researchers know as a ‘honeypot’ or ‘jail’ program. In the security world, potential attackers are identified and redirected to a simulation of the real system. The purpose goes beyond protecting the real system from harm: it allows the behaviour of the attackers to be observed and analysed.
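The redirect-and-observe pattern can be sketched in a few lines. This is purely illustrative—the service names, heuristics and addresses below are hypothetical, not anything from Uber’s actual code:

```python
# Honeypot routing sketch: suspicious callers never reach the real
# service; a decoy answers them and records their behaviour instead.

def real_service(request):
    return {"status": "ok", "data": "real response"}

def decoy_service(request, observed):
    observed.append(request)          # record attacker behaviour for analysis
    return {"status": "ok", "data": "plausible fake response"}

def is_suspicious(request):
    # Toy heuristics; real systems fingerprint far more signals.
    return (request.get("client_ip", "").startswith("10.0.0.")
            or request.get("user_agent") == "scanner/1.0")

def route(request, observed):
    if is_suspicious(request):
        return decoy_service(request, observed)
    return real_service(request)

observed = []
print(route({"client_ip": "203.0.113.7", "user_agent": "normal"}, observed)["data"])
print(route({"client_ip": "10.0.0.5", "user_agent": "scanner/1.0"}, observed)["data"])
print(len(observed))  # one suspicious request captured for analysis
```

The key design point is that the decoy returns plausible answers, so the attacker has no immediate signal that they have been identified.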
Use of honeypots against computer criminals has many potential legal complications. Use of honeypots to detect and trap law enforcement creates complications of a different magnitude. Whether or not the Greyball program was legal is probably beside the point at this juncture. If Uber is to ever file for an IPO, its S-1 will have to explore those complications in exhaustive and almost certainly ruinous detail.
The VW emissions scandal demonstrates the disastrous impact discovery of a secret program such as Greyball can have on a company. The VW incident came close to shuttering one of the largest automotive companies in the world. If news of deceptive software can cause such extensive damage to an established and profitable company like VW, what can it do to a newcomer that has never made a profit?
Serious though the Greyball revelation is in itself, the much bigger problem might lie in a much earlier Uber scandal that many dismissed as unimportant at the time. In November 2014, a blog post claimed “Uber's Android app is literally malware.” The app requested access to practically every piece of personal information on the phone including location, contacts, calendar, camera and the microphone.
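The mismatch that alarmed the blog post’s author amounts to a failed least-privilege audit, which can be expressed very simply. The permission names below follow Android’s conventions, but the ‘needed’ set is my own illustrative assumption about what a ride-hailing app requires:

```python
# Least-privilege audit sketch: compare the permissions an app requests
# against what its stated function actually needs.

NEEDED_FOR_RIDE_HAILING = {"ACCESS_FINE_LOCATION", "INTERNET"}

requested = {
    "ACCESS_FINE_LOCATION", "INTERNET",      # plausibly needed
    "READ_CONTACTS", "READ_CALENDAR",        # why?
    "CAMERA", "RECORD_AUDIO",                # why?
}

# Everything requested beyond what the app's function requires.
excess = requested - NEEDED_FOR_RIDE_HAILING
print(sorted(excess))
# ['CAMERA', 'READ_CALENDAR', 'READ_CONTACTS', 'RECORD_AUDIO']
```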
Based on what was known at the time, the fact that the app was asking for much more than it needed for its purpose was generally dismissed as needless worrying. An article by Owen Williams expressed the consensus view: “There’s no reason for Uber to collect data beyond what it needs; it’s certainly not in the company’s best interest.”
What we know now is that one of the reasons the app was asking for all that data was to enable Greyball to identify and honeypot law enforcement. Now that the bland assurances that Uber could not possibly be doing anything bad have proved unfounded, it is time to ask, ‘what else might the people with access to that data have done with it?’
Computers don’t commit crimes, nor do corporations. It is always people who commit crimes. And the reasons why individuals commit computer crimes go far beyond the limited agendas of Silicon Valley startups chasing a hot IPO.
The biggest problem with a company developing an ethically questionable system like Greyball is that it needs employees willing to develop and operate it. The sexual harassment described by Fowler is additional evidence of a corporate culture in which any corner can be cut if it achieves the desired result.
More than an ethically challenged startup
What could an employee do with the access permissions granted to the Uber app? It is rather easier to ask what they could not. Insider trading? Well, the first person who introduced me to Uber was a lawyer at a white shoe NYC firm, and the app has access to the microphone, camera and calendar. How about some international espionage? It’s less profitable, but Cold War 2.0 is undeniably well-funded. How about something worse? Well, the same company that didn’t fire employees for sexual harassment built a cyberstalking infrastructure and gave those employees continuous access to their victims’ GPS location.
‘Wait!’ I hear you saying, ‘you can’t prove any of this happened.’ Which is precisely my point and the reason that the Uber case is about much more than one ethically challenged startup chasing a hot IPO.
Under the old rules, it was assumed that companies did the right thing until it was proven otherwise. Those rules changed when VW was found to have written software to fake emissions testing.
As explained at the start, Uber is in the business of regulatory arbitrage. But arbitrage works both ways. Uber operates in the European Union and is subject to EU data protection law. Under the incoming General Data Protection Regulation, mishandling of personally identifying information can lead to penalties of up to 4 per cent of a company’s global annual turnover.
In security, absence of evidence of protection is now going to be taken as evidence of its absence, and that shift will require a completely different approach to developing systems. It is no longer enough to use cryptography to secure confidential information; under the new rules it will be necessary to prove that the protections were effective and could not have been bypassed by insiders.
It has long been best practice for system administrators to keep logs of service transactions. Under the new rules, it will be necessary to track all movements of personal information. All system logs and all code affecting any aspect of the system operation will need to be tracked using tamper-evident techniques similar to those applied in the Bitcoin blockchain.
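The chaining idea is simple enough to sketch with nothing more than a hash function. This is a minimal illustration of the tamper-evidence property, not a production audit log:

```python
import hashlib
import json

# Tamper-evident log sketch: each entry's hash covers the previous
# entry's hash, so altering any record invalidates every hash that
# follows it -- the same chaining used in the Bitcoin blockchain.

GENESIS = "0" * 64

def entry_hash(prev_hash, record):
    payload = prev_hash + json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(log, record):
    prev = log[-1]["hash"] if log else GENESIS
    log.append({"record": record, "hash": entry_hash(prev, record)})

def verify(log):
    prev = GENESIS
    for entry in log:
        if entry["hash"] != entry_hash(prev, entry["record"]):
            return False                 # chain broken: tampering detected
        prev = entry["hash"]
    return True

log = []
append(log, {"user": "alice", "action": "read_location"})
append(log, {"user": "bob", "action": "read_contacts"})
print(verify(log))                       # True
log[0]["record"]["action"] = "none"      # tamper with the first entry
print(verify(log))                       # False
```

A real deployment would also need to anchor the head of the chain somewhere the insiders cannot reach, since whoever can rewrite the whole chain can re-hash it consistently.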
Phillip Hallam-Baker, VP and principal scientist, Comodo
Image Credit: Melies The Bunny / Flickr