
Making the Best of AI – What Executives Need to Know

Beginning the journey with the right preparation will lead to success instead of disappointment 

For many enterprises, AI capabilities are already moving into core operations. Others are still early in the journey, seeking out applications that will provide meaningful, measurable returns. Still others have scratched the surface with initial capabilities but realize they have far to go to build true competitive advantage.

The usual traps of new technology are out in force: over-hyped capabilities, aspirational functionality, and unrealistic expectations. On the business side, enterprises face the pitfalls that any technology adoption is susceptible to: insufficient resourcing, unclear objectives or scope, lack of business justification, and inadequate supporting processes, among others.

Here are the top issues that executives need to keep in mind when trying to get the most from their AI investments.

First Rule of AI:  Forget AI

It is important to take a step back and ask what the organization needs to accomplish with a given process or initiative.  AI is a tool in your toolkit, and like any tool, it should not be the focus, but should be in service to the problem that needs to be solved.  

To identify the problem, ask the following questions: What process requires attention or intervention? Customer service? New product development? Understanding of risk patterns? Where does the organization need to optimize to meet customer expectations, respond to market demands and competitive threats, or take advantage of new value propositions?

AI does not replace entire processes or people – it is an intervention at specific points of a process.  AI should be considered, as my colleague Dan Turchin likes to say, as “augmented” intelligence, rather than “artificial” intelligence. It supports people doing their jobs rather than replacing their jobs.  AI removes tedium from repetitious tasks that bore humans.  It also speeds the ability to perform complex analyses and uncover insights in large amounts of data. 

Starting with Business Outcomes

Once the right questions have been asked, begin with the objective – what does the business need to do in order to solve the problems that have been identified?  What can be automated, and what can be made more efficient by helping an analyst access information more readily?  For a claims processor, it might be consolidating prior history data for similar claims from multiple systems, including the large and growing collection of unstructured content.  Text analytics and semantic search can lead to tremendous productivity gains by making knowledge and content available in context. 

The first step in clarifying business outcomes is mapping end-to-end processes. You cannot automate a mess, and you cannot automate what you don’t understand. Articulating the current state and ideating on future-state business scenarios is done through libraries of use cases: testable, measurable tasks that your customers or employees need to accomplish in the course of their work.

The more detailed and finely grained the use case, the more effectively needs can be met through machine learning and AI-powered personalization and contextualization. Personalization can then be tailored according to a rich understanding of the user and their context (background, interests, role, title, industry, objectives, equipment configuration: everything and anything you can understand about the user). Certain prerequisites need to be in place, but beginning with scenarios and use cases, testing and iterating functionality, and measuring baselines and impact will maximize the value of AI programs.
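As a rough illustration of what a finely grained use case and a rich user context might look like in practice, the sketch below models them as simple records. All field names and values are hypothetical, not a prescribed schema:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a use case is a testable, measurable task with a
# baseline and a target, and a user context captures attributes that
# personalization can draw on. Field names are illustrative only.

@dataclass
class UserContext:
    role: str
    industry: str
    interests: list
    equipment: list = field(default_factory=list)

@dataclass
class UseCase:
    task: str            # the job the user needs to accomplish
    success_metric: str  # how impact is measured against a baseline
    baseline: float      # measured before the AI intervention
    target: float        # the measurable goal for the intervention

claims_lookup = UseCase(
    task="Consolidate prior claim history from multiple systems",
    success_metric="average minutes per claim reviewed",
    baseline=42.0,
    target=25.0,
)

analyst = UserContext(
    role="claims processor",
    industry="insurance",
    interests=["fraud patterns"],
)
```

The point of the structure is that each use case carries its own baseline and target, so impact can be measured rather than asserted.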

Data is More Important than the Algorithm 

AI runs on data. The popular analogy is that data is the new oil. Training data can be as simple as a knowledge base of reference documents or as complex as a billion historical transactions. In either case, it is important to align the data with the use case. Some vendors claim that AI will “fix your data” with no muss and no fuss, but that is rarely the situation. Data needs to be architected in a way that informs the algorithm about what is important to the business: products, services, solutions, processes, customer characteristics, employee tasks and more. This takes the form of what is referred to as an “ontology” – a way of describing the organization as multiple categories and taxonomies.

For a manufacturer, these taxonomies would include products, industries, competitors, markets, manufacturing processes, applications, problems, solutions, tasks, customer types, roles, and document types. They represent all of the concepts that describe the business, as well as the relationships among them. The relationships might be products for a solution, applications for a process, solutions for a problem, and so on. This set of information forms the “knowledge scaffolding” of the enterprise.
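A minimal sketch of that knowledge scaffolding, using invented manufacturer concepts: taxonomies enumerate the concepts, and relationships connect concepts across taxonomies.

```python
# Hypothetical sketch of an ontology for a manufacturer: taxonomies list
# the concepts that describe the business, relationships connect them.
# All names are invented for illustration.

taxonomies = {
    "products": ["Torque Wrench X", "Sealant Z"],
    "industries": ["Aerospace", "Automotive"],
    "problems": ["Fastener loosening"],
    "solutions": ["Vibration-resistant assembly"],
}

# Each relationship is (subject, relation, object), e.g. a product that
# serves a solution, or a solution that addresses a problem.
relationships = [
    ("Torque Wrench X", "product_for", "Vibration-resistant assembly"),
    ("Vibration-resistant assembly", "solution_for", "Fastener loosening"),
]
```

In production this structure would live in a taxonomy or ontology management tool rather than code, but the shape is the same: named categories plus typed relationships among their members.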

When data is accessed using the structure of an ontology, it is frequently presented in the form of a “knowledge graph.” An example is IMDb, the movie database: one can look up an actor, navigate to the movies that actor has been in, then connect the directors to other movies, and so on. The corporate analogy would be navigating from a particular customer to that customer’s industry, then looking at other customers in the industry and considering other products and services that may also be of interest. This kind of functionality could be part of a cross-sell recommendation system for salespeople.
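The navigation described above can be sketched in a few lines. This is a toy, in-memory graph with invented customers and products, assuming the simplest possible cross-sell rule: recommend products that industry peers have bought but this customer has not.

```python
# Hypothetical knowledge-graph sketch: customers relate to industries,
# and purchase history links customers to products. Cross-sell candidates
# come from navigating customer -> industry -> peer customers -> products.

graph = {
    ("Acme Corp", "in_industry"): "Aerospace",
    ("Beta LLC", "in_industry"): "Aerospace",
}
customer_products = {
    "Acme Corp": {"Torque Wrench X"},
    "Beta LLC": {"Torque Wrench X", "Sealant Z"},
}

def cross_sell(customer: str) -> set:
    """Products bought by industry peers that this customer lacks."""
    industry = graph[(customer, "in_industry")]
    peers = [c for (c, rel), ind in graph.items()
             if rel == "in_industry" and ind == industry and c != customer]
    peer_products = set().union(*(customer_products[p] for p in peers))
    return peer_products - customer_products[customer]

print(cross_sell("Acme Corp"))  # {'Sealant Z'}
```

A real implementation would use a graph database or triple store rather than dictionaries, but the traversal logic (entity to relationship to related entities) is the same idea.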

Cultural Requirements 

The organizational culture must be open to experimentation and able to accept failure. In some enterprises, politics makes it more difficult to embrace AI and the associated process change. AI is difficult, and there are inevitably failures and setbacks while traversing the learning curve. If the culture is one of “success theatre,” where digital transformations are touted at the executive level and ridiculed or diminished at the operational level, real progress will be difficult to achieve. A culture of learning, experimenting and failing needs to be part of innovation. A philosophy of being a fast follower, with less tolerance for experimentation, is perfectly reasonable, as long as the objective is aligned with the level of maturity.

Leadership and Social Capital 

Programs with significant impact require that leadership take chances, and moving into new areas entails risk that many may not be prepared to take.  A leader with the vision of a new way of operating needs to understand the nuances and mechanics of making the innovation an operational reality.  That requires a track record and organizational social capital.  If the program is risking social capital, execs need to be sure that they are doing all they can to reduce risk and ensure success. Being realistic around organizational capacity and capabilities is a prerequisite.  

Adequate Resourcing 

Many programs are funded based on ROI projections. Frequently, projects are funded to address identified gaps in capability; however, these are sometimes only the tip of a capability-gap iceberg. At the surface, certain things appear straightforward, but during a project those surface issues are peeled back to reveal bigger challenges that were not anticipated. This is where current-state maturity models can provide a higher-fidelity understanding of what needs to be in place to operationalize.

Maturity models show the organization what is achievable and how much needs to be done to build fluency and capability.  For example, a proof of concept (PoC) can afford to carefully cleanse, structure and enrich data.  However, production data sources are not afforded the same degree of curation and attention.  Getting that production data in shape for full deployment can require resources that were not anticipated or budgeted.

Supporting Processes 

Along the lines of maturity, upstream processes also need careful evaluation. One organization put substantial resources into the content and data models that would support personalization. When it came time to deploy, however, the marketing function could not identify differentiated messaging to use for personalization. The infrastructure was there, but the supporting process to produce the appropriate messaging was not.

Measuring Results 

Even when embarking on supporting foundation projects around data quality or completeness, the organization needs to show a linkage to measurable results. Data quality can be scored. Data supports a process, and it is critical to instrument the process that will be impacted and gather baselines. Processes support business outcomes, which are also measured, and outcomes in turn support the enterprise strategy. That linkage from data to process to outcome to enterprise strategy will retain executive attention and funding.
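As a concrete illustration of scoring data quality and tying it to the process it supports, here is a minimal sketch. The completeness measure, field names, and metric are all hypothetical:

```python
# Hypothetical sketch: score one dimension of data quality (completeness)
# and record it alongside the baseline process metric it should move,
# keeping the data -> process -> outcome linkage explicit.

def completeness(records, required_fields):
    """Share of records with every required field populated (0.0 to 1.0)."""
    if not records:
        return 0.0
    ok = sum(all(r.get(f) is not None for f in required_fields)
             for r in records)
    return ok / len(records)

claims = [
    {"id": 1, "history": "prior claims attached", "amount": 1200},
    {"id": 2, "history": None, "amount": 900},  # incomplete record
]

score = completeness(claims, ["id", "history", "amount"])  # 0.5

# Report the quality score next to the process baseline it is expected
# to improve, so executives see the connection, not just the score.
scorecard = {
    "data_quality_completeness": score,
    "process_metric": "avg_minutes_per_claim",
    "process_baseline": 42.0,
}
```

Real programs would track multiple quality dimensions (accuracy, consistency, timeliness) the same way, each linked to an instrumented process metric.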

To recap, success with AI requires:

  • Clarity of business purpose
  • Detailed understanding of processes
  • The correct, quality data sources structured for the application
  • A culture that is open to new ways of working
  • An understanding of which aspects of a process can be improved or automated using AI technology
  • A strong sponsor with social capital 
  • Adequate resources and funding 
  • The right supporting processes 
  • A way of measuring results – upstream and downstream  

Acting on these guidelines will improve the likelihood of success. These principles apply to many types of enterprise projects, and AI is no different. Today’s technology initiatives have more dependencies and complexities, and success requires attention to the basic blocking and tackling. AI will not understand the needs of the business by itself. It needs enterprise support from the ground up to the C-suite.


This article originally appeared on CXO Outlook.

Earley Information Science Team
We're passionate about enterprise data and love discussing industry knowledge, best practices, and insights. We look forward to hearing from you! Comment below to join the conversation.
