
Data: It's gotta have meaning, man

A context for everything and everything in its context


Opinion How often have you heard blown project budgets blamed on unanticipated systems integration costs? For good reason: nobody wants to do customised point-to-point integrations if they can help it - it's difficult, if not impossible, to leverage the work.

But in one respect, such integrations sidestepped a potentially messy issue. When working with a designated source and target, you became all too familiar with the data you were trying to integrate, and therefore didn't have to worry about the context or meaning of the data you were trying to exchange.

Nonetheless, when you think about reusing software assets, context stares you in the face. For instance, what if you want to reuse a process for tracking customer preferences in another entity, only to learn that privacy laws prevent the use of some portions of that data? And if another part of your business has a different definition of what constitutes a customer, the divergent meanings become show stoppers.

Admittedly, given the difficulty of attaining software reuse, concerns about context or the meaning of data remained academic. eBizQ's Beth Gold-Bernstein recalled being at the event where IBM announced SNA and told everybody to start building their enterprise data dictionaries. "I worked with organisations that did that. They had the books on their shelves, but it didn't do anything. They were just books on the shelves."

And in fact, thinking about systems that can automatically deduce meaning or context from data conjures up some of the original goals of Artificial Intelligence, which was supposed to produce software that could think. Japan mounted a fifth-generation computing project back in the 1980s that was supposed to leapfrog the west with AI software, replicating its successes with lean manufacturing. We're not terribly sure if the Japanese effort actually got as far as generating shelfware.

About a decade ago, web pioneer and W3C director Tim Berners-Lee began pushing the idea of a Semantic Web: a web searchable not only by keywords, but by real meaning. Along the way, the W3C developed several standards, including the Resource Description Framework (RDF) and the Web Ontology Language (OWL), that specify how to represent entity relationships or meanings using XML. But today, we're still on Web 2.0, which is a more dynamic, interactive, but hardly semantic place.
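RDF's core construct is the subject-predicate-object triple: every statement names an entity, a relationship, and a value, which is what lets software query for meaning rather than keywords. As a rough illustration (the entity names and namespace prefixes below are invented for the example; real deployments would use actual RDF tooling rather than hand-rolled code), here is how such statements can be stored and pattern-matched:

```python
# Sketch of RDF's core idea: knowledge as subject-predicate-object
# triples that a machine can pattern-match. All entity names and
# prefixes ("acme:", "schema:") are hypothetical examples.
triples = [
    ("acme:Customer42", "rdf:type",         "schema:Person"),
    ("acme:Customer42", "schema:name",      "Jane Doe"),
    ("acme:Customer42", "schema:purchased", "acme:Widget7"),
    ("acme:Widget7",    "rdf:type",         "schema:Product"),
]

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the pattern; None acts as a wildcard."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Everything we know about Customer42:
print(query(subject="acme:Customer42"))
# Which entities are Persons?
print(query(predicate="rdf:type", obj="schema:Person"))
```

The point of the triple model is that the second query works without anyone having written a "find customers" routine - the relationship itself is data, which is what distinguishes semantic search from keyword search.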

The emergence of SOA has made the possibility of software reuse less academic. According to IT architectural consultant Todd Biske, a consistent semantic model is critical to SOA if your services are going to be adequately consumed. Without such a model, suggests Biske, it'll be harder for users to figure out if the service is what they're looking for.

While falling short of semantics in the true sense, the use of metadata has exploded through integration middleware and SOA registries/repositories that provide descriptors to help you, or some automated process, find the right data or service. There are also tools from providers like Software AG that are starting to infer relationships between different web services. This is all tactical semantics with a lower-case "s": it provides descriptors that present, at best, a card catalog of "what" information is out there and, from a technical standpoint, "how" to access it.

It may be a lower-case "semantic web", but it's a useful one. That's similar to the lower-case "ai" that spawned modest pieces of functionality which didn't make machines smarter per se, but made them more convenient (e.g., context-based menus).

Our sense is also that we're ages away from a Semantic Web, or Semantic Services, with a capital "S". Current Analysis principal analyst and longtime Network World contributor Jim Kobielus characterised the challenge as a "boil the ocean" initiative during a recent Dana Gardner podcast.

Few have covered the topic as extensively. In a recent Network World column, Kobielus summarised the prospects: most vendors are taking a wait-and-see attitude. For instance, Microsoft, which is sponsoring a project code-named Astoria to extend ADO.NET with a new entity data model that would implement some of the W3C semantic web standards, has yet to commit to implementing any of the technology in SQL Server.

Kobielus believes it will take at least another decade before any of this is commercialised. While our gut tells us he's being optimistic, we find it hard to argue with his facts. Besides, he adds, it took a full half-century for hypertext to advance from "Utopian Vision" to something taken for granted on the web today.

This article originally appeared in onStrategies.

Copyright © 2007, onStrategies.com

Tony Baer is the principal with analyst onStrategies. With 15 years in enterprise systems and manufacturing, Tony specialises in application development, data warehousing and business applications, and is the author of several books on Java and .NET.


