This article is part of a continuing series on the evolving role of the chief data officer (CDO) and how CDOs work across the executive leadership team. For this installment, I sat down with Ursula Cottone, CDO of Citizens Bank, whose path to the role ran not through IT but through nearly every corner of the banking business. Our conversation explored how she approached a complex organizational transformation -- and why success in this role depends far less on technical expertise than most people assume.
This article originally appeared in the May/June 2017 issue of IT Pro, published by the IEEE Computer Society.
Why the Role Existed in the First Place
Cottone joined Citizens Bank in 2015 as the organization's first CDO, having previously held the same distinction at another institution. Her background was anything but a conventional technology career path. She had managed branches, worked in credit and investment banking, and led a large-scale enterprise system implementation from the business side. That breadth of experience gave her a fundamental insight that shaped everything that followed: the technology side of banking is primarily about understanding business requirements, not about data structures or platform functionality.
Citizens Bank created the CDO role in response to a specific and familiar problem. The organization had invested substantially in enterprise data management over several years, had gone through the difficulty of deploying new tools and platforms, and was not seeing the returns those investments were supposed to generate. A consulting engagement concluded that the data program's heavy technology orientation needed to be rebalanced toward business outcomes. The recommendation was to bring in a dedicated executive whose mandate would be extracting genuine business value from the organization's data and technology investments.
The First Order of Business: Listen
In any complex organizational environment, the temptation is to arrive with a plan. Cottone's approach was the opposite. Her first priority was understanding -- the people, the culture, the pain points, and the institutional history that explained how the organization had arrived at its current state.
She began at the top, meeting with the CEO and the full set of direct reports: business line heads covering consumer banking, business banking, and wealth; chief operating officers for individual divisions; and CFOs and finance leaders across the organization. Those executive conversations established a high-level picture of priorities and frustrations. The subsequent meetings with their direct reports revealed something equally important: the operational reality beneath the strategic framing, where actual capability gaps lived and where people were actively building new ways to serve customers and improve efficiency.
Over five to six months, Cottone and her team conducted more than 50 of these conversations, tracking themes, stakeholder names, key takeaways, and referrals to additional people worth engaging. Running in parallel was a technical inventory of everything already in place -- what technologies had been built or purchased, who owned them, what functions they supported, and what was or was not working. The goal was a complete picture of the technology landscape, the data environment, and the processes that connected them.
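The tracking discipline Cottone's team applied -- themes, stakeholder names, key takeaways, and referrals per conversation -- can be illustrated with a minimal sketch. The names, roles, and themes below are hypothetical; the article does not describe the actual tooling, so this is simply one way such an interview log might be structured and mined for recurring themes.

```python
from dataclasses import dataclass, field

@dataclass
class InterviewRecord:
    """One stakeholder conversation: who, what surfaced, who to talk to next."""
    stakeholder: str
    role: str
    themes: list = field(default_factory=list)
    takeaways: list = field(default_factory=list)
    referrals: list = field(default_factory=list)  # people suggested for follow-up

def theme_frequency(records):
    """Count how often each theme surfaces across all interviews."""
    counts = {}
    for rec in records:
        for theme in rec.themes:
            counts[theme] = counts.get(theme, 0) + 1
    return counts

# Hypothetical entries, for illustration only.
records = [
    InterviewRecord("A. Smith", "Head of Consumer Banking",
                    themes=["data quality", "single customer view"]),
    InterviewRecord("B. Jones", "Division COO",
                    themes=["data quality", "reporting lag"],
                    referrals=["C. Lee"]),
]
print(theme_frequency(records))
```

Aggregating across 50-plus such records is what turns individual anecdotes into the organization-wide pattern a strategy can be built on.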
Understanding the Team She Inherited
Alongside the stakeholder mapping and technical audit, Cottone needed to understand the team she had taken on. The enterprise information services group, as it was known, comprised 72 people who were, by her account, dedicated and talented. The problem was not their capability. It was their organization.
The team had evolved in a way that produced significant siloing. Different groups served different functions, and because those functions operated independently, there was little reason or opportunity for cross-team collaboration. The work supported siloed systems. And critically, there was limited visibility into how any individual team's output connected to business outcomes.
This dynamic is not unusual in technical data functions, and it points to one of the persistent challenges of the CDO role. Data modelers in particular work at high levels of abstraction. The downstream value of that work -- to customers, to business lines, to the organization's strategic position -- is often difficult to articulate and even harder to make visible. At a metadata conference I attended around that time, a team from a major enterprise asked for my help gaining organizational support for their work. They spent 45 minutes explaining what they did. I kept asking who their customer was, what business value they were delivering, what outcomes their work produced. I walked away without a clear answer -- and I am considered an expert in this space. If the work cannot be explained to someone looking for a reason to fund it, it will not receive the resources it needs.
Aligning the Team to Business Priorities
After working through the full discovery process, Cottone identified which team members and activities were generating meaningful business value and which were not clearly tied to Citizens Bank's strategic priorities. Some stakeholder needs that had been anticipated were going unaddressed. Some work simply was not connected to where the organization needed to go.
Rather than redirect the entire group, she made a deliberate and difficult decision: she significantly reduced the team's size, stopping activities that were not contributing to organizational goals and retaining the staff and work streams that were. Importantly, her analysis concluded that the problem was structural, not personal. The team members were passionate and committed to their work. They were simply not aligned with the problems the business most needed to solve. The discontinued activities drew no organizational objection, and that absence of pushback confirmed that nothing critical had been removed.
Two individuals proved especially valuable through this period of restructuring. One was a long-tenured team member whose deep institutional memory and broad internal relationships allowed Cottone to navigate issues from multiple directions within the organization. The other was a senior technical resource with comprehensive knowledge of the enterprise technology landscape. Both represented a form of organizational knowledge that cannot be documented or transferred quickly -- the kind of tacit expertise that makes transformation programs either succeed or stall.
Building the Strategy and Getting Buy-In
With a foundation of stakeholder relationships, a mapped data environment, and a restructured team in place, the work of developing a formal strategy could begin. That strategy then had to be socialized with the same people whose input had shaped it -- translated from the abstract to the concrete, connected to the pain points that had surfaced during the listening phase, and presented clearly enough to sustain executive confidence through a long and uncertain road.
Cottone was direct with the executive committee. The organization was at a decision point. Prior investments had been made, results had not fully materialized, and the question was whether to absorb that sunk cost and stop, or commit to the next round of investment needed to actually deliver business value. Leadership chose to move forward, accepting that this commitment required a longer planning horizon than a typical quarterly earnings cycle could accommodate. Reaching that decision demanded honest communication about what was working, what was not, and what realistic expectations looked like for when meaningful returns would appear.
Governance as an Organizational System
Large-scale data programs require sustained coordination across business functions, and Citizens Bank built that coordination deliberately. An advisory steering group was established early, drawing participation from marketing, analytics, lines of business, and technology. This group grew to approximately 15 members who were responsible not just for oversight but for carrying program communications back into their respective organizations and keeping their own stakeholders current.
The challenge with this kind of structure, particularly in the early stages, is maintaining engagement while the foundational work proceeds and there is relatively little to show. Agendas need to remain relevant. Expectations need to be calibrated honestly from the start. If participants expect quick results and that expectation is not corrected early, the credibility of the entire program is at risk. Cottone framed the journey explicitly as a shared one -- a "together" effort rather than a series of status reports delivered from above.
An extended advisory group of approximately 25 participants worked in a more collaborative mode, enabling the kind of one-on-one conversation that steering committee formats rarely allow. About 18 months into the program, two additional groups were formed: customer data stewards who helped design and test the customer master data environment, and source data stewards aligned to each of the bank's 75 data sources feeding the master data management system and the data lake, both of which went live in 2016.
A further role was introduced in mid-2016: data trustees, one per business unit, positioned between the stewards and the advisory committee. These individuals prioritized issues raised by their business stakeholders and managed the details of data sources entering the data lake -- covering ownership, usage rights, provenance, privacy and security considerations, quality standards, and structural characteristics. Counting liaisons, advisors, subject matter experts, stewards, trustees, and extended group members, the total stakeholder community engaged with the program reached approximately 150 people.
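The trustee responsibilities described above amount to maintaining a register of metadata for each source entering the data lake. The sketch below is an illustration of what one register entry might capture, mapped to the attributes named in the article (ownership, usage rights, provenance, privacy, quality, structure); the field names, source names, and review logic are hypothetical, not Citizens Bank's actual design.

```python
from dataclasses import dataclass

@dataclass
class DataSourceEntry:
    """One register entry per source feeding the MDM system and data lake."""
    source_name: str
    business_unit: str
    owner: str               # accountable owner (ownership)
    usage_rights: str        # who may use the data, and how
    provenance: str          # where the data originates
    contains_pii: bool       # privacy and security flag
    quality_standard: str    # agreed quality threshold
    schema_notes: str        # structural characteristics

def sources_needing_privacy_review(register):
    """A trustee might prioritize PII-bearing sources for privacy review."""
    return [e.source_name for e in register if e.contains_pii]

# Hypothetical entries, for illustration only.
register = [
    DataSourceEntry("deposit_core", "Consumer", "J. Doe", "internal analytics",
                    "core banking platform", True, "99% field completeness",
                    "fixed-width nightly extract"),
    DataSourceEntry("branch_metrics", "Consumer", "K. Roe", "internal reporting",
                    "branch operations system", False, "daily refresh",
                    "CSV extract"),
]
print(sources_needing_privacy_review(register))  # ['deposit_core']
```

With 75 sources and one trustee per business unit, a structure like this is what lets prioritization questions be answered from the register rather than re-investigated each time.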
Building the Technical Foundation
The bank's customer data had historically lived in application silos, making it difficult to consolidate records or analyze patterns across business lines. Rather than build a traditional data warehouse, Cottone chose to develop warehouse-like analytical capability alongside more advanced data ingestion tools in a unified environment -- a data lake that could serve as the foundation for more sophisticated capabilities without requiring additional technology purchases.
The customer master became the cornerstone of this architecture. Every downstream application and analytical capability depended on a customer record that was accurate, complete, and current. Getting that foundation right was a prerequisite for everything that followed.
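The core consolidation problem a customer master solves can be sketched in a few lines: the same customer appears in multiple application silos under slightly different records, and the master merges them into one golden record. The matching key and survivorship rule below (normalized name plus date of birth, newest value wins) are deliberately crude illustrations; production MDM systems use far more sophisticated fuzzy matching and survivorship logic, and nothing here reflects Citizens Bank's actual implementation.

```python
def normalize_key(record):
    """Crude match key: normalized name plus date of birth."""
    return (record["name"].strip().lower(), record["dob"])

def build_customer_master(silo_records):
    """Consolidate per-application records into one golden record per customer,
    letting the most recently updated value survive for each field."""
    master = {}
    for rec in sorted(silo_records, key=lambda r: r["updated"]):
        key = normalize_key(rec)
        golden = master.setdefault(key, {})
        golden.update(rec)  # later (newer) records overwrite older field values
    return list(master.values())

# Hypothetical silo records for one customer, for illustration only.
silo_records = [
    {"name": "Pat Chen", "dob": "1970-01-01", "phone": "555-0100",
     "updated": "2015-03-01", "source": "deposits"},
    {"name": "PAT CHEN ", "dob": "1970-01-01", "phone": "555-0199",
     "updated": "2016-06-01", "source": "mortgage"},
]
masters = build_customer_master(silo_records)
print(len(masters), masters[0]["phone"])  # 1 555-0199
```

Two silo records collapse into one golden record with the newer phone number, which is exactly the accurate, complete, current customer view that downstream applications depend on.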
Since the program's formal approval at the end of 2015, the team has met every milestone it committed to. By October 2016, more than 20 organizational projects were projected to derive direct benefit from the enterprise data infrastructure being built, with those benefits expected to materialize through 2017 and into 2018.
The Harder Problem: Measuring and Communicating Value
The business case for foundational data infrastructure does not fit neatly into traditional ROI models. Leaders naturally want to know when investment will produce returns and how large those returns will be. The problem is that a customer master or a data lake does not generate revenue on its own -- it enables other programs to do so. When multiple initiatives are all contributing to a shared goal such as cross-selling or customer retention, attributing specific outcomes to the foundational data layer is not analytically possible.
Cottone presented this reality directly, without softening it. The business case she brought forward in the summer of 2015 identified cost-avoidance benefits across six initial projects spanning risk, compliance, consumer, commercial, sales, and marketing -- a linkage-based approach that showed how data capabilities would support business-driven outcomes rather than produce standalone returns. Executive updates occurred every six months, covering progress against commitments and the evolving picture of value accumulation across the broader program.
Transparency, in her view, is what sustains organizational patience. Telling stakeholders honestly where a program stands, what the realistic timeline looks like, and how progress will be visible -- without over-promising -- keeps leadership willing to remain on a journey whose payoff does not fit on a quarterly earnings slide.
The Organizational Case for Getting Data Right
The Citizens Bank experience illustrates something that applies well beyond financial services. Data governance, stewardship, and infrastructure programs are long-term organizational commitments. They require executive patience, honest communication, and the willingness to measure progress through milestones and linked business outcomes rather than direct attribution.
The payoff is equally durable. Sound data practices reduce friction throughout the organization, accelerate information flows, enable adaptability to competitive pressure, and improve the organization's capacity to serve customers effectively. These are not soft benefits. They compound over time in ways that short-cycle technology investments typically cannot.
The role of the CDO, at its core, is to build and sustain the conditions under which data can become a genuine organizational asset. That work begins not with platforms or pipelines, but with the people who produce, govern, and depend on the data those systems carry.
This article was originally published in IT Pro by the IEEE Computer Society and has been revised for Earley.com.

