The Quieter Side of Semantic Technology
January is always a good time to look back and to look forward.
Right now, I am preparing a short talk on semantic technology and the public sector. In researching the topic, it occurred to me that "semantic technology" had a different ring to it in 2008 than it does now. At the time, there was a lot of buzz as people began to think about how interpreting unstructured content distributed across the web required machines to get at the meaning of that content. The quest for the semantic web was born, and it popularized the term "semantic technology." Today, when I checked Google AdWords, "semantic technology" gets 6,600 monthly searches. That's not much. "SharePoint" gets over 7 million.
Although it may seem that the shine is gone, what we have is another case of visionaries boiling the ocean while engineers follow behind with focused attention to tractable problems. Of course, as IBM's Watson demonstrated by winning Jeopardy this year (see my column "Is IBM Watson Technology Practical for the Enterprise"), conceptual maps can be quite powerful when focused on specific areas of knowledge. IBM is now developing even deeper but narrower uses of Watson for healthcare applications.
Although there was a big splash around Watson, what is more interesting is the steady progress of semantic technology in 2011, particularly with regard to two important trends: growing interest in standard ontologies for modeling the meaning of data within specific domains, and the use of ontologies to frame complex business rules for data analysis.
One of the most exciting areas of ontology development is the growing momentum around NIEM (www.niem.gov), the National Information Exchange Model. NIEM, a government-supported national program, addresses information exchange across ten important public-sector communities, including justice, human services, and healthcare. NIEM supports the development of common vocabularies for the exchange of information.
NIEM is indicative of a major trend: content findability improves significantly when we give users highly relevant perspectives on related content. This is something many search engines have not yet caught up with. In working with our partner, Smartlogic, however, we are optimizing search to surface highly relevant related content across a range of industry applications. On a recent content strategy engagement, for example, we helped a public-sector utility company see how it could use a taxonomy to bring multiple sources of information together so that its field organization could more effectively solve complex service and maintenance issues.
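To make the idea concrete, here is a minimal sketch of taxonomy-driven related content. The taxonomy, document names, and tags are all invented for illustration; they are not the utility company's data or any vendor's API. The point is simply that a broader/narrower concept hierarchy lets two documents relate even when their tags differ.

```python
# Minimal sketch: surface related content via a shared taxonomy.
# All concepts, document IDs, and tags below are illustrative only.

# Child concept -> broader (parent) concept
TAXONOMY = {
    "transformer maintenance": "field service",
    "outage repair": "field service",
    "meter installation": "field service",
}

# Each document is tagged with taxonomy concepts
DOCS = {
    "service-manual-12": {"transformer maintenance"},
    "crew-report-88": {"outage repair"},
    "billing-faq": {"customer accounts"},
}

def broaden(concepts):
    """Expand a set of concepts with their broader (parent) terms."""
    expanded = set(concepts)
    for concept in concepts:
        parent = TAXONOMY.get(concept)
        if parent:
            expanded.add(parent)
    return expanded

def related(doc_id):
    """Find documents sharing a concept (or broader concept) with doc_id."""
    target = broaden(DOCS[doc_id])
    return [other for other, tags in DOCS.items()
            if other != doc_id and target & broaden(tags)]

print(related("service-manual-12"))  # -> ['crew-report-88']
```

Note that the manual and the crew report share no tag directly; they connect only through the broader concept "field service," which is exactly the kind of relationship a flat keyword search misses.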
With regard to the second trend, many companies are quietly starting to build ontologies that allow them to interact more intelligently with their suppliers and customers. In retail, these applications range from using taxonomy-enabled business logic to harmonize supplier data with the retailer's item master to using ontologies to support cross-selling and personalization of websites. There are other applications as well. For example, we are currently working with a retailer to plan ways to leverage an ontology in customer data mining and the tailoring of relevant marketing messages.
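A small sketch shows what taxonomy-enabled harmonization can look like in practice. The category names and synonym rules here are assumptions for illustration, not any retailer's actual item master: the idea is simply that supplier labels resolve to the retailer's own taxonomy nodes, with unresolved labels routed to review.

```python
# Minimal sketch: map supplier category labels onto a retailer's
# item-master taxonomy via synonym rules. All names are illustrative.

# Retailer taxonomy node -> accepted supplier synonyms
SYNONYMS = {
    "footwear/athletic": {"sneakers", "trainers", "running shoes"},
    "footwear/formal": {"dress shoes", "oxfords"},
}

def harmonize(supplier_category):
    """Resolve a supplier's category label to the retailer's taxonomy."""
    label = supplier_category.strip().lower()
    for node, synonyms in SYNONYMS.items():
        if label in synonyms:
            return node
    return None  # no match: route to manual review

print(harmonize("Trainers"))       # -> footwear/athletic
print(harmonize("garden gnomes"))  # -> None
```

Real harmonization adds hierarchy, attribute rules, and confidence scoring on top of this, but the core move is the same: business logic keyed to a shared vocabulary rather than to each supplier's labels.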
Retail is not alone in using ontologies to support data analysis.
We have been working with a healthcare company to map patient data to a conceptual model that will drive a business application for identifying patient safety risks. Healthcare applications are moving ahead quickly, with attention focused on developing a standard electronic medical record (EMR) ontology that can support both the exchange of information and the analysis of care requirements and health trends.
So, overall, the prospects for semantic technology in 2012 look quite good, even if the buzz is gone. As in most areas of engineering progress, we now have a lot of tools; what we need is to focus them on solving tractable, high-payoff problems in business and the public sector.