Governments play an essential role in overseeing strong, effective and even-handed regulation to ensure open, competitive markets and to prevent market abuses. Yet as our traditional economy moves rapidly towards a digital, software-driven economy, our regulators have fallen ever further behind the times.
This is hardly a new problem. Back in the mid-1990s, as head of IT at the City of London regulator, I struggled to understand how any meaningful regulation was taking place. The City had become unrecognisable, transitioning from the Victorian-era spectacle of potty-mouthed market traders in colourful jackets shouting and gesticulating at one another to largely computerised operations.
Despite this transformation, the paper-oriented processes of city regulation trundled on much as they had done in the past. Most of my colleagues looked at me with suspicion when I suggested we needed to scrutinise the code and data of these new systems, putting telemetry in place to monitor, track and guard against intentional or malicious abuse.
Some 20 years after I left the working museum of regulation, stories emerged of how Volkswagen apparently gamed emissions tests for its diesel vehicles. Quelle surprise. It made me wonder just what regulators have been doing in the intervening years. Given that almost every business process is now defined and operated in software, how do regulators fulfil their remit if they're not requiring access to and full transparency of source code?
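The unsettling lesson of the VW affair is how little code such gaming requires. The sketch below is purely illustrative (it is not Volkswagen's actual logic, and every name and threshold is invented): it shows how a handful of lines that recognise lab-test conditions can quietly switch a system's behaviour, which is precisely why regulators who never see the source will never spot it.

```python
# Hypothetical "defeat device" sketch. Not real VW code; all names and
# thresholds are invented for illustration only.

def looks_like_emissions_test(steering_angle_deg: float,
                              wheel_speed_kmh: float,
                              duration_s: float) -> bool:
    """On a lab dynamometer the wheels turn but the steering never moves."""
    return (wheel_speed_kmh > 0
            and steering_angle_deg == 0
            and duration_s > 60)

def emissions_control_level(steering_angle_deg: float,
                            wheel_speed_kmh: float,
                            duration_s: float) -> str:
    """Switch behaviour depending on whether a test is suspected."""
    if looks_like_emissions_test(steering_angle_deg,
                                 wheel_speed_kmh, duration_s):
        return "full"     # clean mode: pass the test
    return "reduced"      # road mode: more power, more pollution
```

Three conditionals are enough; no amount of inspecting the vehicle's visible behaviour in the lab would reveal them.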
Having failed two decades ago to modernise regulation for those earlier transformed markets, the problem is now compounded by inadequate regulation of digital economy businesses. If regulators were on top of their game, surely any digital business operating in a regulated market should expect its software to be routinely inspected? After all, if code remains unseen, and hence not understood, how will regulators ever know if companies are manipulating markets and abusing their power?
Let's take the example of taxis, traditionally a regulated business. However, new market entrants are playing by different rules, as the debate about Uber demonstrates. A rogue "innovator" taxi company could, by accident or intent, develop software algorithms that distort the market and manipulate pricing by holding back drivers from a particular area until demand, and profits, can be increased. How would regulators know whether this is happening if they do not inspect and regulate source code?
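To make the worry concrete, here is a minimal sketch of the kind of dispatch logic described above. It is entirely hypothetical: the pricing formula, the function names and the surge target are invented for illustration, not drawn from any real company's system.

```python
# Hypothetical dispatch sketch: how withholding supply can push up prices.
# All names, formulas and thresholds are invented for illustration.

def surge_multiplier(riders_waiting: int, drivers_available: int) -> float:
    """Toy pricing rule: price scales with demand over supply, capped at 3x."""
    if drivers_available == 0:
        return 3.0
    return min(3.0, max(1.0, riders_waiting / drivers_available))

def drivers_to_dispatch(riders_waiting: int, drivers_available: int,
                        target_surge: float = 1.5) -> int:
    """A fair dispatcher sends every available driver. A manipulative one
    holds drivers back until the surge multiplier reaches a target."""
    dispatched = drivers_available
    while (dispatched > 0 and
           surge_multiplier(riders_waiting, dispatched) < target_surge):
        dispatched -= 1   # withhold one more driver to push prices up
    return dispatched
```

With six riders and six drivers, this dispatcher quietly sends only four cars, engineering a 1.5x surge from a market that was actually in balance. From the outside it looks like a shortage of drivers; only the source code reveals the intent.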
Regulation should long ago have become as much about code, about analysing and understanding software, as about visible business practices and behaviours. Yet the VW case suggests that regulators still lack the skills and competencies required to regulate the digital economy. Governments need to modernise regulation and introduce digitally competent, computer-literate regulators. The continued failure to do so is likely to make the previous costly and damaging failures of regulation appear trivial in comparison.
For our digital economy to be trusted and take flight, we need independent assurance that the best software engineering practices are being applied, and that the makers of next-generation devices and business models are not quietly gaming the system and skewing markets. It's time to bring our Victorian regulators into the 21st century.