Does artificial intelligence propel digital transformation forward, or does transformation create conditions enabling AI adoption? Do these initiatives reinforce each other synergistically, or do they create conflicting demands that undermine both?
Research consistently reveals troubling success rates for large-scale transformation initiatives. Studies from BCG and McKinsey place success rates near 30%. The pandemic accelerated certain transformation aspects as remote work necessitated new tools, technologies, and infrastructure. Yet whether this acceleration translates to genuine transformation success beyond enabling distributed work remains unclear.
Frequently, acceleration merely meant rapid program initiation and deployment while accumulating technical debt and neglecting foundational processes and data quality. A technology leader from a major global professional services firm told me their organization accomplished two years of transformation work within weeks. This sounds impressive until one recognizes that such compression is impossible without significant compromises.
AI program success statistics prove equally concerning. IBM research from 2021, cited in MIT Sloan Management Review, indicates just over 20% of surveyed organizations have deployed AI enterprise-wide. Additional research shows 40% of organizations making significant AI investments have failed to realize business benefits, meaning at most 60% have achieved some value. Some commentators argue AI accelerates and enhances transformation success. However, McKinsey's 2018 study reported that only 23% of organizations with successful transformations leveraged AI technology. Most successful transformations apparently succeeded without explicitly deploying AI.
The Survey Reliability Problem
I recently encountered a white paper celebrating AI successes in B2B transformations. One featured organization reportedly achieved tremendous value from AI-driven B2B communication personalization. This company happens to be a former client of my firm, and I maintain contact with the personnel responsible for B2B personalization there. According to my contacts, the organization in fact continues to struggle with its personalization implementation.
This illustrates what researchers call the "front line paradox": frontline employees typically sense impending change first yet remain the least heard within organizations. They directly manage challenges and witness the impacts of newly deployed technologies and work processes. To paraphrase Stanford professor Robert Burgelman and Andy Grove, Intel's late CEO: frontline employees feel the winds of change first because they spend their time outdoors, where the storm clouds of disruption gather.
The implication: the information senior leadership communicates to external audiences, or derives from surveys and interviews, may not align with the experiences of those interacting with customers or using new solutions to serve them. In this instance, ground-level reality differed dramatically from the messaging emanating from the top of the organizational hierarchy.
The Data Organizing Imperative
A Towards Data Science article from November 2020 asserts that digital transformation is one of the most critical drivers of how companies will continue delivering customer value in highly competitive, rapidly evolving business environments. The article identifies artificial intelligence as among the central enablers of digital transformation across multiple industries.
Given low transformation success rates combined with AI operationalization obstacles, the logic becomes questionable: how can one high-risk, low-success program type critically enable another high-risk, low-success program type? This resembles strategies combining two money-losing ventures hoping to create one profitable operation. Possible? Certainly. Probable? No.
The article's author articulated the importance of data organization and structure for supporting both broad transformations and specific AI initiatives, noting that serious discussions of artificial intelligence prove pointless without organized data. The same obviously applies to overall digital transformations. Digital transformation is fundamentally data transformation. Organizations must decelerate and address fundamental data issues before accelerating AI programs or expecting broader transformation success.
Scope Tensions and Fragmentation Risks
An interesting dichotomy emerges: digital transformation typically has broad scope, cutting across multiple departments and processes, while successful AI projects have narrow scope, addressing specific processes and incremental improvements. Digital transformation demands a holistic value chain perspective, yet applying AI to individual departments and processes can produce fragmented efforts disconnected from larger enterprise information flows. Extracting maximum value from distributed AI experiments requires the capability to centralize lessons learned and standardize the approaches that have realized value.
Balancing these perspectives demands that leaders maintain a holistic transformation view while zooming into the details of dozens or hundreds of individual processes supporting business objectives and outcomes. Moving fluidly from macro views down to micro levels and back characterizes successful transformations of any type. Incorporating additional technologies into a transformation where organizational maturity is limited adds complexity and increases dependencies on foundational processes, data architecture, and data quality. For example, investing in personalization technologies is pointless without understanding customer needs across segments.
Managing Uncertainty and Validating Assumptions
The fundamental principle: avoid adding excessive unknowns to transformation programs. AI projects demand iterative testing and supporting process evolution. Clean, consistent, well-architected data is the price of admission. Don't assume data exists in usable condition for target processes. Don't accept as reality vendor promises or status reports from program leaders far removed from the front lines. The most effective methods for determining whether supporting processes and data meet success requirements include competitive benchmarking, internal benchmarking, heuristic evaluations, and maturity assessments. Objective metrics reveal data adequacy.
Heuristic evaluations—collections of best practices and rules of thumb—provide snapshots of organizational performance on current efforts. What capabilities does the organization possess? Are foundational processes and data quality robust? Or does foundation strengthening require significant time and effort? Maturity assessments cut across multiple dimensions that may appear beyond domain scope yet would impact downstream processes. For example, product data maturity includes understanding whether vendor and supplier service level agreements incorporate data quality measures and remediation processes. Vendors constitute part of the information supply chain; source problems can negatively impact numerous downstream processes and dramatically increase costs.
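As a minimal sketch of the kind of objective metric such an audit might produce, consider simple completeness and uniqueness checks over a product catalog extract. The field names, sample records, and any service-level threshold here are illustrative assumptions, not a standard:

```python
# Illustrative data-quality metrics for a product catalog extract.
# Field names and sample records are hypothetical.

def completeness(records, field):
    """Share of records with a non-empty value for `field`."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def uniqueness(records, field):
    """Share of non-empty values for `field` that are distinct."""
    values = [r.get(field) for r in records if r.get(field)]
    return len(set(values)) / len(values)

catalog = [
    {"sku": "A-100", "description": "Widget", "supplier": "Acme"},
    {"sku": "A-101", "description": "", "supplier": "Acme"},
    {"sku": "A-100", "description": "Gadget", "supplier": None},
]

print(f"description completeness: {completeness(catalog, 'description'):.0%}")
print(f"sku uniqueness: {uniqueness(catalog, 'sku'):.0%}")
# Fields falling below an agreed service level (say, 95% complete)
# flag foundation work that must precede AI deployment.
```

Even a sketch this small makes vendor claims testable: a duplicate SKU rate or a missing-description rate is a number leadership can track, not an opinion relayed up the hierarchy.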
AI can deliver competitive advantages, accelerate value realization, reduce costs, and enhance customer experiences. These represent exciting prospects. However, achieving these impressive outcomes requires addressing enabling steps often considered mundane: upstream supporting processes, data quality, architecture design, governance frameworks, change management. Too frequently, these are assumed present or relegated to someone else's responsibility. Worse, they're dismissed as insufficiently important to warrant attention.
According to a 2020 PwC survey, over 40% of executives intended to deploy AI to improve productivity and efficiency, yet only 13% identified standardizing, labeling, and cleansing data for AI systems as a top priority for the coming year. This telling statistic reveals an inadequate understanding of the critical role of standardized, labeled (well-architected), clean data. Because dependencies multiply once AI is embedded in a digital transformation program, investment in foundational work cannot be deferred or relegated to later phases.
Seeking Truth from Operational Realities
Above all, obtain accurate accounts from the organizational front lines regarding AI project progress and digital transformation program benefits while course correction remains possible. Too many organizations engage in "success theater," where the realities of failures and setbacks never reach the C-suite. In some cases, the career impact on leaders proves insignificant because, by the time failures manifest in company performance, these individuals have moved on to their next opportunities. Consequently, the motivations for transparency and accountability don't always run strong.
The success or failure of AI and digital transformation is existential for many organizations, making truth-seeking essential. Yet disconnects frequently exist between stated goals and realistic assessments of what achieving them requires. In one enterprise, digital transformation funding cutbacks meant key roles remained unfilled and programs operated under-resourced, yet timelines and program objectives didn't change. The front lines saw the approaching disaster while leadership remained oblivious. Neither AI nor digital transformation is a magical solution that supersedes the fundamentals of our digital world. Rather, both are solutions that depend on developing those fundamentals, a critical process organizations cannot ignore if they expect to flourish.
Ten Essential Principles for Success
1. Front-Load Data Planning
Consider data requirements before initiating programs. Engage data governance and quality specialists to outline the data needed to support the transformation. If migrations or significant quality efforts will be required, they must begin early rather than await completion of technical development.
2. Amplify Frontline Voices
Pay attention to frontline worker reports about program performance and day-to-day work impacts. Develop channels gathering unbiased feedback.
3. Balance Macro and Micro Perspectives
Maintain awareness of larger enterprise objectives while drilling into effort details—map upstream and downstream impacts and dependencies at least at high levels. When optimizing product-related content for SEO on e-commerce sites, consider how that content must serve other audiences: customer self-service, call centers, syndication partners, embedded product service content. Focusing solely on SEO proves short-sighted.
4. Establish Process Baselines
Identify baselines for processes that will be impacted, including existing practices and compliance with known industry practices and standards. Ensure processes and tools for monitoring impact are operational.
5. Communicate Continuously
Build communications plans keeping people engaged—promoting wins, reinforcing need for "boring parts" like process improvement and intentional change management. Bring appropriate people to day-to-day operational decision-making and communicate outcomes to executives.
6. Monitor Technical Debt
Remain mindful of where technical debt accumulates through undocumented work or launch shortcuts. Items deferred until later frequently never happen. Excessive technical debt hobbles programs.
7. Distinguish Technical from Process Problems
Don't expect technical solutions for non-technical problems: a manual, chaotic customer onboarding process, for example. Technical components may exist, but start with a human understanding of the issues and of why certain processes exist before applying technology.
8. Calibrate to Organizational Maturity
Align program plans with organizational maturity. Upstream dependencies aren't always apparent; certain maturity levels will be needed for success. Personalization at scale requires foundational capabilities in content operations, customer journey modeling, knowledge processes, and product information management. Capability gaps in one area impact overall programs.
9. Resource Information Preparation Adequately
Plan for adequate resources preparing content, data, and knowledge for transformation. This demands deep dives into information processes and lifecycles including sources, uses, provenance, rights, quality, purpose, systems of record, and consuming systems—both internal and external.
10. Maintain Adaptive Roadmaps
Continually update and refine roadmaps as new dependencies and resourcing constraints arise. Plans should evolve continually as programs encounter new issues, solve problems, and adapt to changing customer needs. If program elements lose funding, capture that impact and adjust timelines if necessary.
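The information-preparation deep dive in principle 9 can be made concrete with a lightweight inventory of information assets. The record structure below is a hypothetical sketch, not a prescribed schema; the asset, system, and field names are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class InformationAsset:
    """Illustrative inventory record for one content or data domain."""
    name: str
    source: str                # originating process or vendor
    system_of_record: str
    consuming_systems: list = field(default_factory=list)
    rights: str = "unknown"    # licensing / usage rights
    quality_reviewed: bool = False

product_specs = InformationAsset(
    name="Product specifications",
    source="Supplier feeds",
    system_of_record="PIM",
    consuming_systems=["e-commerce", "call center", "syndication"],
)

# Assets with unknown rights or no quality review signal preparation
# work that belongs on the roadmap, not in a later phase.
inventory = [product_specs]
gaps = [a for a in inventory if not a.quality_reviewed or a.rights == "unknown"]
print(f"{len(gaps)} asset(s) need preparation before transformation work")
```

Even a minimal inventory like this surfaces the upstream and downstream dependencies that principles 3 and 9 call for mapping, and it gives the roadmap in principle 10 something concrete to track.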
Looking Forward
Coming years will see less discussion of AI as a standalone program or project; it will increasingly weave into organizational infrastructure. Digital transformations will continue as ongoing initiatives, with increasing focus on cognitive technologies and intelligent assistants that improve information access for stakeholders of all kinds. AI can differentiate and accelerate digital transformations provided attention remains focused on fundamentals.
Organizations succeeding at the AI-transformation convergence recognize that neither represents a shortcut around basic disciplines. Data quality, process maturity, governance frameworks, change management, stakeholder engagement—these unglamorous fundamentals determine whether ambitious technology initiatives deliver transformative value or merely accumulate technical debt while burning resources. The path to intelligent, digitally transformed organizations runs through patient, disciplined attention to information foundations. No algorithm can compensate for data chaos, and no platform can rescue transformation programs built on organizational dysfunction.
Note: A version of this article appeared on CustomerThink and has been revised for Earley.com.
