ANALYTICS IN ACTION
By: Kimberly Nevala, Director of Business Strategies for SAS Best Practices
I recently had the opportunity to moderate a session examining the
rewards and realities of the analytics journey in the public sector. The executive
panel was overwhelmingly positive about the opportunity for data and analytics in
both the government of Alberta and the public sector in general. In fact, most
agreed that Alberta has some of the best data in the world, particularly in the
health, energy and natural resource sectors. They were equally frank about the
real and perceived barriers to making the most of this rich resource.
Here are some of the key considerations and cautions they shared
about what it will take to create a culture of evidence and make analytics a
core decision-making tool in government.
Start With WHY (in their terms)
Why analytics? Too often, we answer this question generically,
without explicitly identifying the need, pain or problem to be solved.
Why analytics or an integrated data ecosystem? Not because the
data is in siloes, duplicated, hard to access and understand. The real why comes
from the application of information to improve the business. In the case of
health and human services, it’s literally to save lives. For policy makers, analytics arms deputy
ministers with tangible, tactical evidence to drive good policy decisions while
also establishing a body of evidence and record of why key decisions were made.
Of course, the why isn’t the same for every constituent. Making
the case requires linking analytics to the intrinsic and extrinsic motivations and
outcomes which define their success.
Begin with the end in mind
Although the public sector still largely operates top-down,
panelists suggested an organic, bottom-up approach is required for analytical
innovation. Front-line staff understand the micro-climate in which they
operate. In the case of health delivery, this involves the clinicians who are
intimately aware of the pain points and opportunities in the incumbent care
delivery process. They are also the
people who ultimately must change clinical pathways and models to incorporate new
insights.
That said, the panel cautioned against over-engineering data-driven
improvement programs. There can be a tendency to try to predetermine the system
from soup to nuts: here’s what we’ll measure, here’s what you will do to change
the process based on the data, here’s what we’ll… ad infinitum. A better approach
is to expose the evidence and engage those in the know to determine how to
respond. In fact, the earlier the engagement the better. Build a solution
without your customers’ involvement and not only will they not come, they may
actively oppose the solution on principle.
Go to the Light
Rather than worrying about universal acceptance, educate audiences
on the art of the possible. Engage with those who see the potential and have a
problem they are willing and ready to solve. Finding them is easy: they will
raise their hands. Focusing on early adopters and small, tangible problems allows
for early wins that make the case with less enthusiastic constituents.
Worried people won’t engage? Evidence to the contrary abounds. Clinicians
exposed to departmental quality measures have proactively asked for data on
their own performance. One large
US-based health-care provider reports that quality improved – without any
systematic program intervention – after key metrics were routinely published.
Evidence is the START of the Story
With the advent of digital everything and always-on communication
channels, the decision-making cycle has sped up exponentially. Have answer,
will act? Not so fast. Buckets of data do not evidence make. Nor do buckets of
evidence create knowledge.
Time and space are required to not only cultivate but appropriately
consider the evidence. The onus here is largely on the senior tier to slow down
the decision-making cascade enough to allow for such deliberation. Management
must also be trained to demand and appropriately interpret information once
found. One panelist recounted how a new
perspective introduced at the eleventh hour led to a heated, last-minute
rethink of a proposed policy. In the end, the right answer, rather than the
convenient one, emerged.
Evaluate, Implement, Adapt
The best analytics will only create expensive trivia if the
organization isn’t prepared to act on found information. That requires a
disciplined approach to analytic discovery and information delivery. This
starts with asking not just what we are looking for, but what we will do with the information once we find it.
It must be followed by a systematic process that allows hypotheses to be tested and interventions
to be implemented, monitored and adapted based on discovered results.
Organizations must also mindfully architect and design future
systems based on found insights. This includes proactively identifying
information requirements during business process and system design, thereby ensuring
the data required to monitor performance and adapt is created and captured from the
start, as opposed to treating data as a happy byproduct or derivative of
running the business.
Practice Open Data
While data access is often cited as a primary barrier to analytic
progress, some suggest this is really a cultural and managerial issue, not a
legislative one. It is based on a historical legacy that errs on the side of
not sharing data, for fear of what someone might do with it once it leaves the
nest.
Panelists suggested the onus must shift from measuring data producers
on creation and access to measuring how effectively information is shared, and
from judging data consumers on access to judging them on usage. What do they do
with the information available to them?
Such a shift would make data producers obligated and accountable
for sharing data while holding consumers accountable for appropriate use, thereby
governing value and risk based on information use, not the mere presence of
data.
Make the Facts Known
The points above make a simple assumption: that evidence is
published. While this sounds a bit tongue-in-cheek, panelists pointed to a common
reticence to share not just raw assets, but found insights. Perhaps that’s from
fear of being proven wrong or provoking strong opposition, particularly when
the evidence belies standard beliefs or operating practices.
However, the alternative is arguing on shifting ground, rather than
using information to support a particular position or counter the arguments of
those armed with fervently held beliefs, but few facts.
Exposing the data comes with risk. Too often, though, practitioners view
evidence and facts as a means to persuade or change another’s mind, rather than
as a basis to engage others in substantive discussion, even when the discussion
turns to the basis of the evidence itself.
Facts alone may not change the opinion of a dedicated naysayer. Facts
do, however, provide a common basis for discussion. And isn’t that the real
point?
Great Artists Steal – Let Them
When it comes to cultivating a shared purpose, cross-pollinating
key skills and spawning new perspectives, the importance of communities of
practice for analytics was top of mind. These need not be over-architected. Creativity
spawns creativity. While innovation can’t be mandated, the right environment
can encourage out-of-the-box thinking.
Creating time and a safe space for analysts from different
departments, within and among ministries, to collaborate has multiple benefits. Companies that
have adopted this approach have reported increased returns on analytic
investments and more engaged and motivated analysts, not a small victory given
the current premium on core data science resources.
Such collaborative teaming models become particularly important as
new business and service models transcend historical departmental or ministry
boundaries. Simple examples include the integration of different episodes of
care to account for a holistic patient experience and treatment and the interplay
of adult, child welfare and juvenile justice systems.
Partner Up
Of course, the need for partnership isn’t limited to the data
science or analytic community. To maximize shared assets it will be imperative for
like-minded ministries to work together on common problems rather than
reinventing the wheel each time. Areas such as fraud and risk are often great
starting points for this type of initiative, as they are most effective with
broad data sets culled from across ministry boundaries.
And while communities of interest help maximize incumbent
analytic/data science skill sets, third-party partners can also play a role. Universities
in particular are fertile grounds for collaboration. Engaging aspiring data
scientists to work on projects with live data provides two-fold benefits. Students
gain real-world experience and more interesting projects while helping bridge
the talent gap (be it lack of available resources or specific skill sets
required). Not to mention, it’s never too early to plant seeds with the up-and-coming
generation of leaders and data scientists about the power of using data for
good in public service.
Conclusion
This broad-ranging and thought-provoking discussion also touched
on the need for a collaborative information governance model, cultivation of
analytical not just statistical literacy, creation of a shared data
infrastructure and analytic lab environments, and the need to incorporate data
and analytic competencies into the GOA’s workforce development models.
While the challenges appear daunting, participants unanimously
agreed that the opportunity far outweighs these impediments. The GOA in
particular and the public sector in general are well-positioned to create a
rich data resource that is properly governed and available for public good. But
as Zig Ziglar once opined: “the key to getting ahead is getting started”. Why
not now?
As the Director of Business Strategies for SAS Best Practices,
Kimberly Nevala balances forward-thinking
with real-world perspectives in business analytics, data governance, analytic
cultures and change management.
IPAC invited Ms. Nevala as a guest blogger, in advance of her panel at the 68th IPAC Annual Conference in Toronto, June 26-29th. To find out more about the panel, please visit: http://www.ipac.ca/2016-Program
If you are interested in learning more about SAS Canada's work in the Public Sector or to organize a similar Public Sector Breakfast Series panel in your city or province, please contact IPAC at manderson@ipac.ca
IPAC received this very thoughtful commentary response from one of the attendees of the SAS Breakfast, Rachelle Foss, and would like to share it with the wider Network.
Thanks to Rachelle, and also to Carla-Jeanne Johnson and Tessia Williams from the IPAC Edmonton Regional Group, for providing their comments to IPAC about the pertinence of such events to public sector professionals.
IPAC has reproduced below a submission from Ms. Rachelle Foss.
Putting Analytics to Work
By Rachelle Foss, Freelance Writer
We are producing data at an ever-increasing rate, so understanding effective uses of data can help us overcome industry challenges and direct our businesses towards success.
I had the pleasure of attending a presentation hosted by the Edmonton IPAC Regional Group, in partnership with SAS Canada, that talked about how to do exactly that. Held at Edmonton’s World Trade Centre, the panel of Alberta industry leaders taught me a few important things:
A simplified definition of analytics:
1. Descriptive Analytics: The start of the analytics process, helping us understand what has already happened.
2. Forecasting: Once we understand what has already happened, we can start to make predictions about what might happen next.
3. Prescriptive Analytics: Using our knowledge of the past and our prediction for the future, we can start to get ahead of the data and direct our business decisions towards the best services, products, and interactions.
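As a minimal sketch, the three stages above can be walked through in a few lines of Python. The numbers and the capacity threshold are illustrative assumptions, not figures from the session, and the one-step trend forecast stands in for whatever forecasting model an organization would actually use:

```python
# Hypothetical monthly service-request counts (illustrative data only).
history = [120, 132, 128, 141, 150, 158]

# 1. Descriptive: summarize what has already happened.
average = sum(history) / len(history)
latest_change = history[-1] - history[-2]

# 2. Forecasting: naive prediction -- extend the recent trend one period ahead.
forecast = history[-1] + latest_change

# 3. Prescriptive: turn the prediction into a recommended action.
capacity = 155  # assumed monthly staffing capacity
action = "add staff" if forecast > capacity else "hold steady"

print(f"average={average:.1f}, forecast={forecast}, action={action}")
```

The point of the sketch is the progression, not the model: each stage builds on the previous one, ending in a concrete decision rather than a report.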
Basically, it’s looking at our data and asking ourselves these questions:
• What it is;
• Where it is;
• What quality it is; and
• Who can and who can’t have access to it.
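Those four questions map naturally onto a data inventory. The sketch below is one possible shape for such an inventory; all field names and example values are hypothetical, chosen only to illustrate the idea:

```python
# One inventory entry per data set, answering: what, where, quality, access.
# All names and values here are hypothetical examples.
inventory = [
    {
        "what": "monthly service-request counts",
        "where": "departmental reporting database",
        "quality": {"rows": 7200, "missing": 0},
        "access": {"allowed": ["analysts", "managers"]},
    },
    {
        "what": "clinic wait-time logs",
        "where": "regional health data warehouse",
        "quality": {"rows": 50000, "missing": 12000},
        "access": {"allowed": ["clinicians"]},
    },
]

def usable(entry, max_missing_ratio=0.05):
    """An entry is usable if only a small share of its values are missing."""
    q = entry["quality"]
    return q["rows"] > 0 and q["missing"] / q["rows"] <= max_missing_ratio

# List the data sets that pass the quality check.
ready = [entry["what"] for entry in inventory if usable(entry)]
print(ready)
```

Even a simple registry like this forces the four questions to be answered explicitly for each data set before anyone builds on it.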
There can be a few challenges to consider, but being aware of these challenges in advance is a great way to overcome them.
1. Presenting data effectively to decision makers: ensure data is readily available for easy access on a tight deadline, and train an effective team to fill the required roles so you can get the most from the data and present it in the best way.
2. Overcoming management and executives who insist on doing things a particular way because "that’s the way they’ve always been done": ask why things are done a certain way and what might happen if we make changes.
3. Disabling data over-protection to enable responsible information sharing: create communities of practice.
It all comes down to using the information we have to take us from insight into action.
[Rachelle Foss is an independent writer and blogger. She lives in Alberta, Canada.]