Has technology rendered open standards unnecessary?

September 3, 2015

Interesting things happen when the price of a technology approaches zero.

It makes possible new products and services, changes consumer behaviour and renders old problems and priorities obsolete.

Take, for example, the plummeting cost of computer memory, which – combined with the internet and world wide web – has given us things like email, cloud computing and social media. In the physical world we buy and dispose. In our virtual lives we’ve all become digital hoarders. We curate rather than cut.

Recently, I have started to wonder whether the plummeting cost of a different type of technology might shake up debate around open standards, too.

The Need for Open Standards

Over the past two years I have repeatedly beaten the drum for the UK public sector to adopt open standards for data: agreeing common formats and conventions for recording information.

For those not familiar with their purpose, a brief analogy might help.

Imagine a room full of people who all speak a different language, with none in common. They will find it very hard to communicate without hiring expensive translators to mediate every conversation. They may even decide it’s too much hassle to communicate at all.

The same applies in the world of IT.

Computer systems can use different formats for recording data (e.g. .xls, .xml, .pdf, .docx) – not to be confused with programming languages – or follow different conventions for recording a piece of information. This can be as mundane as a date being recorded in the UK style (DD/MM/YY) or the US style (MM/DD/YY), or as consequential as an individual being represented by a completely different identifier in each system. This diversity is evident across the UK public sector. Over several decades its organisations have acquired many thousands of – often bespoke – IT systems, built to myriad standards, that simply don’t talk to each other.
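To make that concrete, here is a small illustrative sketch (the records, identifiers and field names are invented for the example) of how the same raw data can mean different things to different systems:

```python
from datetime import datetime

# The same raw string means two different dates under two conventions.
raw = "03/09/15"
uk_reading = datetime.strptime(raw, "%d/%m/%y")  # 3 September 2015
us_reading = datetime.strptime(raw, "%m/%d/%y")  # 9 March 2015
print(uk_reading.date(), us_reading.date())      # 2015-09-03  2015-03-09

# Two systems describing the same person with different field names and
# different identifiers (purely hypothetical records). Nothing in the data
# itself tells a computer that these refer to the same individual.
housing_record = {"tenant_ref": "H-102394", "dob": "03/09/15"}
social_care_record = {"client_id": "SC-77120", "date_of_birth": "2015-09-03"}
```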

The result is that it is very hard to share and combine information held in different systems.

This is not just a headache for CIOs.

As I discussed in Small Pieces Loosely Joined, it creates an obstacle to public sector teams sharing data so they can work together efficiently to tackle problems or support a particular family or individual. It makes it hard to run predictive data analytics to prevent rather than cure problems. It hinders using data to target scarce resources at areas of greatest need. All these things are badly needed at a time when public finances are tight and smarter ways of working need to be found.

Happily, it is possible to make different IT systems communicate with each other, using the digital equivalent of a translator, known as middleware. Yet like their human equivalents, middleware translators have traditionally been expensive – sometimes very expensive. Middleware projects (such as Electronic Data Interchange, or EDI, implementations) to connect entire networks of systems can run into hundreds of thousands of pounds.

Connecting systems in this way also brings complications.

Sometimes, when data is sent from system A to system B, it can fail to reach its destination – literally lost in translation – making it necessary to investigate where things have gone wrong. That might be in system A; in the connection from system A to the middleware; in the middleware itself; in the connection from the middleware to system B; or finally in system B. It’s like working out which one of the many bulbs in a string of Christmas lights has blown.

Advocates of open standards argue that a better solution for public sector IT is the same as for the room of people: they just need to agree on a common language. That way, everyone can communicate freely.

The government has taken this idea seriously. In March 2015, Mike Bracken, the outgoing head of the Government Digital Service (GDS), was appointed as the government’s first Chief Data Officer with a remit to define open standards for the whole public sector. Policy debate has focused on what those standards should look like, who should adopt them and by when, and – critically – who should pay for the transition.

After all, adopting open standards may not come cheap. For some organisations it may require making updates to their current IT systems and software or replacing them entirely. Harder still, it may entail retraining personnel or putting in place entirely new processes to collect and record different information. Converting the whole public sector is therefore likely to take years.

But the result – seamless communication between different systems and the teams that use them – will surely be worth it.

However.

What happens if the cost of translation approaches zero?

This is essentially what is now happening. A range of companies are offering products that virtualise the entire middleware process, creating software that can connect and translate from anything, to anything, and back again, at a fraction of the cost that was previously possible. Call it data-as-a-service. Additionally, where previous generations of middleware required armies of consultants to design, implement and maintain, the new generation can be used and manipulated by people with non-technical backgrounds.
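To give a sense of what such a translation layer involves, here is a minimal sketch in Python (the field names, formats and mapping are assumptions made for illustration, not taken from any real product): it renames fields and normalises values so that a record from one system can be read by another.

```python
from datetime import datetime

# Declarative mapping from one system's field names to a common schema
# (all names here are illustrative assumptions).
FIELD_MAP = {
    "tenant_ref": "person_id",
    "dob": "date_of_birth",
}

def normalise_date(value: str) -> str:
    """Convert a UK-style DD/MM/YY date to ISO 8601 (YYYY-MM-DD)."""
    return datetime.strptime(value, "%d/%m/%y").date().isoformat()

# Per-field value transformations applied after renaming.
TRANSFORMS = {"date_of_birth": normalise_date}

def translate(record: dict) -> dict:
    """Rename fields and normalise values so another system can read them."""
    out = {}
    for source_field, target_field in FIELD_MAP.items():
        if source_field in record:
            value = record[source_field]
            transform = TRANSFORMS.get(target_field)
            out[target_field] = transform(value) if transform else value
    return out

print(translate({"tenant_ref": "H-102394", "dob": "03/09/15"}))
# -> {'person_id': 'H-102394', 'date_of_birth': '2015-09-03'}
```

The point is not the code itself, but that mappings of this kind can now be configured cheaply – often without writing code at all – rather than built as bespoke, consultant-heavy projects.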

That may change the ball game.

When the cost of translation is negligible, it becomes increasingly hard to justify the expense of moving to a new system that uses open standards. The realist in me knows that adopting them at scale will take a very long time.

In the interim, then, perhaps what we need is a wholesale commitment to putting in place the translation layers needed to make public sector data sharing a reality.

This has been the pragmatic response of cities in the USA, not least New York, where Mike Flowers, the city’s first Chief Analytics Officer, has described asking teams to change their systems as a ‘non-starter’. In his experience, it just gives organisations one more excuse not to engage in data sharing at all.

Cheap translation should sound the death knell of that excuse.

Whether we opt for open standards or cheap middleware should be a strictly empirical question of what works best, or which approach is most cost-effective. The answer is likely that there is room for both – each will be the preferred option in different spheres. (Open standards are likely to remain vital for the success of initiatives such as Government as a Platform. In other areas, greater benefit will be gained by letting specialist teams continue to use the specialist standards that suit them best.)

Which brings us to the bigger point.

The real barriers to data sharing are not technical, but political and cultural.

Government and public sector bodies need to be willing to share their data where doing so enables them to work in partnership with other teams to better achieve their mutual goals. It requires government to proactively support public sector bodies in finding responsible, legal and ethical ways to combine, analyse and act upon the datasets they hold. At a city level, this could be achieved by putting in place an Office of Data Analytics – the primary recommendation in my report Big Data in the Big Apple.

Again to quote Mike Flowers, ‘Being data-driven is not primarily a challenge of technology; it is a challenge of direction and organizational leadership.’ That is as true for the UK as it is for the USA.

For the UK public sector, translation technology has never been cheaper.

As departments seek to make savings of 20-40% of their budgets by 2020 as part of the upcoming Spending Review, the cost of failing to share data to enable smarter ways of working has never been higher.
