Friday 30 October 2015

Back to Basics?


Another of the issues we look into in December’s BIR is the challenge of implementing policy effectively: why it matters, and what the solutions might look like.  On reviewing the subject I discovered that poor implementation of policy is not specific to any one sector; the same challenges face everyone.  Of particular interest was an Oracle-sponsored Economist Intelligence Unit report – Enabling Efficient Policy Implementation (2010).  The research investigated both the challenges and the opportunities facing organisations today, and found that poor implementation of policy could be catastrophic, leading to lawsuits, prosecutions or fines; yet these consequences had little effect on how policy was created, communicated and implemented.

We have seen the consequences of poor information security policy implementation at first hand, with a new incident reported in the news almost every week.  The latest story to hit the headlines is the security breach at British Gas, but perhaps the biggest has been TalkTalk.  The TalkTalk case illustrates the close link between IT security policy and information security policy, and also the lack of clear standards on what levels of security are needed for the different types of information held (http://www.theguardian.com/technology/2015/oct/23/talktalk-criticised-for-poor-security-and-handling-of-hack-attack ).

There is a definite need to be proactive in policy implementation, from first-stage communications through to effective monitoring, all of which needs to be properly resourced: a real challenge in many of today’s leaner organisations.  A challenge, yes, but a vital one; as the EIU report nicely puts it, “policy cannot enact itself”!

But if resourcing is important, so too is finding effective ways to ensure that those affected by a policy see its importance and adhere to it.  That sounds obvious, but it seems to be easier said than done.


At the start of our exploration of this area, two of our articles look at policy, considering the need for information security management and the importance of information asset management.  Read more in December's issue, and follow us on this subject throughout 2016.

Friday 23 October 2015

On the brink of a digital doomsday?

A couple of weeks ago the Daily Telegraph reported the threat of an emerging digital dark age (http://www.telegraph.co.uk/news/science/science-news/11922192/Vital-information-could-be-lost-in-digital-dark-age-warns-professor.html). According to Professor David Garner, former president of the Royal Society of Chemistry, technological obsolescence endangers the future of digital information and underlines the necessity of paper back-ups. Professor Garner cited the BBC’s Digital Domesday project from 1986 as an example of digital obsolescence.

You could be forgiven for experiencing a vague sense of déjà vu on reading the above. It is an article that could have been published at any time over the past twenty years. Indeed, the specific example of the BBC Digital Domesday has almost become a cliché of such concerns. While I don’t want to suggest that no challenges remain, the Digital Domesday happens to be a really bad example of the problem of digital preservation, and bad in precisely the right way to highlight why the problem isn’t quite as catastrophic as it may appear.

The attraction of the example derives from the contrast between the vellum of the surviving copies of the original Domesday Book and the laserdisc of its 1980s equivalent. But this association with an important historical artefact confers a spurious significance on the digital Domesday. In fact the BBC project has no great historical value and is something of a cultural curiosity.

More importantly, the BBC Digital Domesday is a bad example of the issues associated with digital preservation because many of the reasons for its rapid obsolescence do not apply to much digital information today. It was obsolete almost before it was complete because of an unfortunate technological framework: the laser video disc, which was already virtually obsolete outside of educational contexts, and the unsuccessful BBC Master computer. This tied the data both to its storage medium and to its proprietary computing environment. The 1980s witnessed a clash of competing proprietary systems and standards in the microcomputer marketplace, but that situation has now all but disappeared. In its place we have a suite of agreed and open standards and data formats which not only function in principle, but underpin contemporary digital architecture in a real and largely profitable way. And despite the anxieties of Zittrain (The Future of the Internet, 2008) about tethered devices, open systems, open standards and generative computing devices are winning the battle.

Finally, it is a bad example of an emerging dark age because the digital Domesday project has not been lost. It never really was. You can access it right now at http://www.bbc.co.uk/history/domesday. If anything it is a very good example of how robust digital data really is.

This is not to imply that digital information presents no preservation issues. The best format for creating archival records is still paper (especially in those legal contexts where records are required essentially in perpetuity).  But while this remains best practice, it has to be recognised that it is a defensive position geared to future archival purposes, and does not reflect the probable future survival of most digital information currently in existence. We are on the brink of an age of limitless and virtually cost-free storage, where the default position will be to migrate and retain data precisely because the potential future commercial and cultural value of that information can never be precisely estimated in advance.

What remains, however, is the problem of data migration and intellectual property rights. It is still the case that information systems do not talk to one another as politely as we might like, or often at all. This is not a problem that can be overcome with standards and agreements alone, because the semantic structure of a data set is a significant part of its meaning, and can never be entirely standardised. More importantly, intellectual property rights prevent the automatic archiving of materials. In February, Vint Cerf raised this issue (http://www.theguardian.com/technology/2015/feb/13/google-boss-warns-forgotten-century-email-photos-vint-cerf), suggesting that:

“the rights of preservation might need to be incorporated into our thinking about things like copyright and patents and licensing. We’re talking about preserving them for hundreds to thousands of years."

Cerf also suggested a way to manage data migration issues: “the solution”, he suggested, “is to take an X-ray snapshot of the content and the application and the operating system together, with a description of the machine that it runs on, and preserve that for long periods of time. And that digital snapshot will recreate the past in the future” (http://www.bbc.co.uk/news/science-environment-31450389). This is an example of software emulation techniques in digital preservation, which have been widely discussed.
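
To make the migration problem a little more concrete, here is a minimal, purely illustrative sketch in Python. The field names, status codes and date convention are all invented for the purpose of the example; the point is simply that every mapping from a legacy schema to a new one embodies a semantic judgement (what a status code means, which date convention applies) that no universal standard can make on our behalf.

```python
# Hypothetical example: migrating one record from a legacy system to a new schema.
# All field names, codes and conventions below are invented for illustration.

from datetime import datetime

# A legacy export: terse field names, local codes, implicit conventions.
legacy_record = {
    "CUST_NM": "Acme Holdings Ltd",
    "ACCT_ST": "3",           # status code: only the old documentation says what "3" means
    "OPEN_DT": "07/03/96",    # day-first or month-first? Someone has to decide
}

# Each mapping rule encodes a semantic decision, not just a rename.
STATUS_CODES = {"1": "active", "2": "suspended", "3": "closed"}

def migrate(record: dict) -> dict:
    """Translate one legacy record into the target schema."""
    return {
        "customer_name": record["CUST_NM"].strip(),
        "account_status": STATUS_CODES.get(record["ACCT_ST"], "unknown"),
        # We assume the legacy system used day-first dates: a judgement call.
        "opened": datetime.strptime(record["OPEN_DT"], "%d/%m/%y").date().isoformat(),
    }

print(migrate(legacy_record))
# {'customer_name': 'Acme Holdings Ltd', 'account_status': 'closed', 'opened': '1996-03-07'}
```

Multiply those small judgements across millions of records and dozens of fields, and it becomes clear why migration from legacy systems remains a human problem as much as a technical one.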

This all comes to mind as we’re preparing December’s issue of Business Information Review (http://bir.sagepub.com), which includes an interesting article on data migration from legacy information systems and suggestions for managing the issue.

Luke Tredinnick