Thursday, February 2, 2012

The importance of data standards in 2012

2012 promises to be a watershed year for new technologies reaching maturity, particularly cloud computing. Larry Ellison was way ahead of his time back in the late '90s in trumpeting the Web PC and the demise of the local PC, with all of its associated issues and costs.

Like almost all revolutions, the cloud will be additive, helping everything else work better, faster, and longer. PCs will interact with phones and tablets, and an increasing number of automated sensors will feed cloud-based applications across a wide range of functions.

Standardizing the data "bullets" will make the question of whose platform and technology largely moot, and will allow greater interoperability across partners, government, vendors, competitors, and international borders. There are plenty of issues - security, scalability, resiliency, and total cost of ownership are going to be with us for a long time to come.

You've gotten a taste of how this can work with Twitter - for person-to-person transactions, keeping it to 140 characters (though with links, people reference huge data sets, including the Library of Congress) makes brevity a necessity.

Google has embraced CAP (Common Alerting Protocol) for its newly launched public alerting system, allowing a wide range of alerts to be consolidated onto one map. For the private-sector security market, Pinkerton/Securitas has launched a product called Vigilance that heavily embraces CAP as a means of integrating public, private, premium, open-source, sensor, and internal data into one interface.
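To make the CAP idea concrete, here is a minimal sketch of reading a CAP 1.2 alert with nothing but Python's standard library. This is not Google's or Vigilance's actual code; the sample alert's identifier, sender, and event text are made up for illustration, though the element names come from the OASIS CAP 1.2 schema.

```python
# Minimal sketch: parse a CAP 1.2 alert with the Python standard library.
# The sample alert below is hypothetical; only the element names and the
# namespace URN are from the OASIS CAP 1.2 specification.
import xml.etree.ElementTree as ET

CAP_NS = "{urn:oasis:names:tc:emergency:cap:1.2}"

sample_alert = """<?xml version="1.0" encoding="UTF-8"?>
<alert xmlns="urn:oasis:names:tc:emergency:cap:1.2">
  <identifier>EXAMPLE-2012-0001</identifier>
  <sender>alerts@example.org</sender>
  <sent>2012-02-02T12:00:00-05:00</sent>
  <status>Actual</status>
  <msgType>Alert</msgType>
  <scope>Public</scope>
  <info>
    <category>Met</category>
    <event>Winter Storm Warning</event>
    <urgency>Expected</urgency>
    <severity>Severe</severity>
    <certainty>Likely</certainty>
    <headline>Heavy snow expected this evening</headline>
  </info>
</alert>"""

def summarize(cap_xml):
    """Pull out the fields a consolidated alert map might display."""
    root = ET.fromstring(cap_xml)
    info = root.find(CAP_NS + "info")
    return {
        "id": root.findtext(CAP_NS + "identifier"),
        "event": info.findtext(CAP_NS + "event"),
        "severity": info.findtext(CAP_NS + "severity"),
        "headline": info.findtext(CAP_NS + "headline"),
    }

print(summarize(sample_alert))
```

Because every CAP producer uses the same element names, the same few lines of parsing code can consume alerts from weather services, police, sensors, or internal feeds - which is exactly what makes consolidating them onto one map tractable.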

It will take time, but this is a growing trend! Big data with no structure is tough to use. Big data with structure is far more searchable by mere mortals.
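A toy illustration of that last point, using hypothetical CAP-style records: once alerts carry standardized fields, a query can filter on those fields directly instead of string-matching free text.

```python
# Hypothetical structured alert records (fields modeled on CAP's
# event/severity vocabulary; the data itself is invented).
alerts = [
    {"event": "Flood Watch", "severity": "Moderate", "area": "County A"},
    {"event": "Tornado Warning", "severity": "Extreme", "area": "County B"},
    {"event": "Heat Advisory", "severity": "Minor", "area": "County C"},
]

# A structured query is unambiguous and machine-checkable: pick out
# every alert whose severity field is Severe or Extreme.
urgent = [a["event"] for a in alerts if a["severity"] in ("Severe", "Extreme")]
print(urgent)  # ['Tornado Warning']
```

Doing the same search over unstructured text would mean guessing which phrasings count as "severe" - the structure is what makes the data usable by ordinary tools and ordinary people.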
