Defining business analytics: an empirical approach
D. J. Power, C. Heavin, J. McDermott & M. Daly

ABSTRACT

Searches of the Web using Google, and database searches of the academic and practitioner literature, return a large number of varied definitions of the concept of business analytics. This article reviews the growing literature on Business Analytics (BA) using traditional and qualitative research tools. Our searches included using Google Search to identify examples of business analytics applications, and a focused keyword search of the available practitioner and academic literatures. Text analytics techniques identified frequently used terms in prior definitions of business analytics. Our empirical, inductive approach provides a basis for proposing and explaining a formal sentence definition for Business Analytics. The analysis provides a starting point for operationalising a measure for the business analytics construct. Additionally, understanding business analytics can help managers assess skill deficiencies and evaluate claims about the relevance of tools and techniques. Finally, carefully defining the Business Analytics concept should provide stimulus for new research ideas.

To cite this article: D. J. Power, C. Heavin, J. McDermott & M. Daly (2018) Defining business analytics: an empirical approach, Journal of Business Analytics, 1:1, 40-53, DOI: 10.1080/2573234X.2018.1507605

To link to this article: https://doi.org/10.1080/2573234X.2018.1507605
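The text analytics step the abstract describes, identifying frequently used terms across prior definitions, can be illustrated with a simple term-frequency count. This is a minimal sketch in Python, not the authors' actual method; the sample definitions and stop-word list below are placeholders.

  from collections import Counter
  import re

  # Placeholder excerpts standing in for collected definitions of business analytics.
  definitions = [
      "Business analytics is the use of data and statistical methods to inform decisions.",
      "Business analytics applies quantitative analysis to business data for decision making.",
      "Business analytics combines data, models, and technology to support managerial decisions.",
  ]

  STOP_WORDS = {"is", "the", "of", "and", "to", "for"}  # illustrative only

  tokens = []
  for text in definitions:
      # Lowercase, keep alphabetic tokens, drop stop words.
      tokens.extend(w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOP_WORDS)

  # The most frequently used terms across the prior definitions.
  print(Counter(tokens).most_common(5))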


http://www.b-eye-network.com/blogs/power/archives/2018/08/defining_busine.php Wed, 29 Aug 2018 08:24:25 MST
Data Interpreter: A Crucial Component of Successful Analytics

Recently I attended an excellent conference, Enterprise Data World in San Diego. One of the event's keynote speakers was Claudia Imhoff, an internationally recognized expert on analytics, business intelligence and the architectures to support those initiatives. She presented "The Data Interpreter - How to Become THE Trusted Advisor for Your C-Suite."

What I found most interesting about her presentation was that there is indeed a need in every organization for someone who can translate the analytics and visualizations into a format that is understandable to everyone, especially the C-suite executives.

Claudia explained that many members of an organization are likely to be unfamiliar with the complex types of visualizations that are now possible. The message conveyed by those visualizations is often difficult for anyone other than the data scientists who created them to understand. For that reason, she shared the need for the role of the data interpreter and provided advice on how to become that trusted advisor.

A compelling component of Claudia's presentation was her discussion of the data that is the foundation of today's complex analytics and visualizations. The most important task that falls to the data interpreter, according to Claudia, is vetting the analytics. All credibility is lost if the numbers or the trends are incorrect. The foundation beneath all analytics is, of course, the data. It is critical that the organization's data be trusted and readily available for analysis. When an organization has a data catalog where everyone in the organization can find the data they need to collaborate, the result is improved productivity and data that can be used with confidence.

I found it clear from Claudia's keynote address that a data catalog like Alation is a necessity for data interpreters. Alation automatically indexes your data by source. It also automatically gathers knowledge about your data. Like Google, Alation uses machine learning to continually improve human understanding.
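Alation's internals are proprietary, so the following is only a hypothetical Python sketch of the general idea a data catalog implements: index table metadata by source and make it searchable so users can find the data they need.

  # Hypothetical sketch of a searchable catalog index; not Alation's actual API.
  from dataclasses import dataclass

  @dataclass
  class CatalogEntry:
      source: str          # e.g., "warehouse", "crm"
      table: str
      columns: list
      description: str = ""

  entries = [
      CatalogEntry("warehouse", "sales_orders",
                   ["order_id", "customer_id", "amount"],
                   "Vetted daily sales orders"),
      CatalogEntry("crm", "accounts", ["account_id", "owner"], "Customer accounts"),
  ]

  def search(term):
      # Match the term against table names, descriptions, and column names.
      term = term.lower()
      return [e.table for e in entries
              if term in e.table.lower() or term in e.description.lower()
              or any(term in c.lower() for c in e.columns)]

  print(search("sales"))   # -> ['sales_orders']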



http://www.b-eye-network.com/blogs/powell/archives/2018/05/data_interprete.php Fri, 25 May 2018 07:37:43 MST
Gartner Insights: Building a World-Class Analytics Strategy
According to Gartner, "It's time for data and analytics leaders to become the epicenter of business value." This statement really hit home for me because I have always been a proponent of continual learning. We must always strive to keep up with the advances in technology or our organizations won't enjoy the success made possible through digital transformation.

This year's Data & Analytics Summit in Grapevine, Texas, presented a golden opportunity to learn about advances in artificial intelligence, analytics, and business intelligence. There were countless opportunities to hear from Gartner analysts about how to build and execute a world-class analytics strategy.

With more than 3,800 attendees from all over the world, the Summit also was a great place to network with others and hear presentations from successful enterprises on a variety of analytics/business intelligence topics. Additionally, Gartner named four key dimensions of scale that need to be mastered in order to accelerate analytical discovery. They are: trust, diversity, complexity and literacy. To read a more comprehensive review of the Summit along with key vendors providing the advanced technologies, please view the full article I wrote about this event.



http://www.b-eye-network.com/blogs/powell/archives/2018/03/gartner_insight.php Mon, 26 Mar 2018 13:01:54 MST
New Front End for Data Science and AI
Want to learn more about Jupyter, the open source tool for collaborative data science and AI? If so, I recommend you attend JupyterCon at the New York Hilton Midtown from August 23 to August 25. This is the first Jupyter conference, and it will provide you with real-world examples of how leading data-driven companies are benefiting from this powerful platform.

http://www.b-eye-network.com/blogs/powell/archives/2017/08/new_front_end_f.php Wed, 2 Aug 2017 13:22:28 MST
GPU Technology Advances Real-Time Analytics
We are seeing a sea change in how enterprises harness big data through machine intelligence and deep learning. Graphics processing units (GPUs) excel at workloads that require large amounts of mathematical calculation: simulations, streaming data and machine learning. This has resulted in a paradigm shift, enabling a whole new range of applications for autonomous vehicles, the Industrial Internet of Things (IIoT) and advanced analytics.
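As a minimal illustration of why GPUs suit these workloads, the same elementwise computation can be moved from CPU (NumPy) to GPU (CuPy) with almost no code changes; on large arrays the GPU version typically runs many times faster. This generic sketch assumes a CUDA-capable GPU with CuPy installed and is not tied to any particular vendor's stack.

  import numpy as np

  # A large elementwise workload: millions of independent calculations,
  # exactly the SIMD-style parallelism GPUs are built for.
  x_cpu = np.random.rand(10_000_000).astype(np.float32)
  y_cpu = np.sqrt(x_cpu) * np.sin(x_cpu)          # runs on the CPU

  try:
      import cupy as cp                           # requires a CUDA-capable GPU
      x_gpu = cp.asarray(x_cpu)                   # copy the data to GPU memory
      y_gpu = cp.sqrt(x_gpu) * cp.sin(x_gpu)      # same expression, executed on the GPU
      # Bring the result back and check it matches the CPU computation.
      assert np.allclose(cp.asnumpy(y_gpu), y_cpu, atol=1e-5)
  except ImportError:
      pass                                        # no GPU stack available; CPU result stands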

I recently interviewed Amit Vij, CEO of Kinetica, whose company has pioneered a database that uses general-purpose graphics processing units to run SIMD operations in a highly parallel fashion. Its GPU-accelerated analytics database provides real-time insights into large and streaming datasets.

Kinetica was incubated within the U.S. Army Intelligence Command and the NSA. It started as a geospatial and temporal computational engine and gradually evolved into a highly available, distributed, in-memory database accelerated by GPUs.

Kinetica can scale to meet the needs of computationally intensive applications such as risk mitigation and fraud detection in the financial industry. Its technology can displace systems that rely on scale-up hardware because Kinetica leverages scale-out architectures.

According to Amit, a major value proposition is converging machine learning and deep learning with the database by running in-database analytics: registering, for example, a Google TensorFlow model with Kinetica, running it against a data table of over one billion rows, and persisting the results to a new data table. "We're seeing that as an easy workflow for an organization to use," said Amit.
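Kinetica's actual model-registration API is not shown here; the Python sketch below only mirrors the shape of the workflow Amit describes (load a TensorFlow model, score a large table in batches, persist the scored rows). The model and the I/O helpers read_batches and write_batch are hypothetical stand-ins for in-database calls.

  import numpy as np
  import tensorflow as tf

  def read_batches(table, batch_size, n_batches=3, n_features=16):
      # Placeholder for an in-database table scan; yields synthetic feature rows.
      for _ in range(n_batches):
          yield np.random.rand(batch_size, n_features).astype(np.float32)

  scored_table = []
  def write_batch(table, rows):
      # Placeholder for persisting scored rows to a new database table.
      scored_table.append(rows)

  # A tiny untrained model standing in for a registered TensorFlow model.
  model = tf.keras.Sequential([
      tf.keras.Input(shape=(16,)),
      tf.keras.layers.Dense(1, activation="sigmoid"),
  ])

  for features in read_batches("transactions", batch_size=1_000):
      scores = model.predict(features, verbose=0)   # score one batch
      write_batch("transactions_scored", scores)    # persist the scored rows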

GPUs are accelerating the use of analytics across a broad range of use cases and having a profound impact across a wide variety of industries. Organizations that are trying to achieve real-time analytics with large and streaming datasets should check out Kinetica.

To read my full interview with Amit Vij, click here.


http://www.b-eye-network.com/blogs/powell/archives/2017/06/gpu_technology.php Tue, 20 Jun 2017 07:49:00 MST
Using Cloud BI to Give Greater Insights to Sales Operations

At the recent Gartner Data & Analytics Summit in Grapevine, Texas, I had the opportunity to speak with Jim Slagle, vice president of Business Intelligence at Apria Healthcare. A leading provider of home respiratory services and medical equipment including oxygen therapy, inhalation therapies, sleep apnea treatment, enteral nutrition and negative pressure wound therapy, Apria Healthcare has 375 branches throughout the United States and employs more than 8,000 people.


I learned that over the last 10 years, Apria has had various BI initiatives with mixed results. Most recently, Jim and his team implemented a full cloud-based BI solution with Domo, focusing on sales operations. The intent was not to replace their legacy BI, but to take the next step and facilitate specific sales-oriented use cases.


At Apria, the platform for sales is designed to provide visibility and insights into sales dimensions and manage base KPIs across the sales team. Domo allows each KPI owner to see his or her KPIs in real time. This enables them to see where they need to course correct and to identify opportunities to step in and provide direction where needed. The result is a substantial improvement in operational efficiency: the Apria sales team can now be agile and adapt on the fly instead of waiting until the end of the month or quarter to review sales performance in hindsight.


Jim spoke about the importance of governance. Before they put anything into Domo, his team is meticulous about the accuracy and the sourcing of the data, making sure that the functional people who own the data are part of the process. Before the data even gets to Domo, it is governed, federated, and verified as repeatable and accurate. Jim states that data integrity is non-negotiable because second-guessing the data is counterproductive.


With regard to the ROI of their implementation, Jim points to the fact that with Domo, Apria hit their sales goals for 12 straight months - for the first time in Apria's history. Jim says, "With Domo, we can comparatively know how we're doing at any point of time."


Jim believes Domo has helped them be successful because it has all the ideal features of a modern BI solution: Domo is aligned to corporate goals, actionable, simple and intuitive.


Domo gives Apria the ability to actively track KPIs and provides the insight to know where to make adjustments to help meet budget forecasts, and this, Jim says, differentiates Apria from many of its peers in the industry. Additionally, executive-level adoption of BI makes the entire sales budgeting process easier and more effective. In short, Jim feels that after many years of searching for the right BI solution, they have found it with Domo.




http://www.b-eye-network.com/blogs/powell/archives/2017/06/using_cloud_bi.php Tue, 6 Jun 2017 06:00:00 MST
Cloud BI Really Works

Colony American Finance was able to increase profitability by taking their business intelligence (BI) to the cloud. The importance of empowering business users was critical, and my interview with Matthew March, CIO at Colony American Finance, highlights how efficiently he was able to work with his executive team to create an effective BI platform in just two months.

Over the years I have been discussing several keys to effective BI. These include time to market, integration with operational business processes and timely acquisition of data. In my interview with Matthew, which includes several use cases, he shares how he was able to effectively address all of these key areas by creating a BI infrastructure that encompasses the cloud. He can now provide his business users with safe and secure access to 27 different data sources so they can do their own ad hoc analysis without involving IT.

This is one of the best BI implementations I have seen, and it clearly shows how this company has reached the goal that every BI implementation hopes to achieve. By eliminating stovepipe BI and enabling business users to effectively do ad hoc analysis themselves, Colony American Finance has realized the promise of BI -- to empower users and do it with minimal IT involvement.



http://www.b-eye-network.com/blogs/powell/archives/2016/11/xyz.php Mon, 14 Nov 2016 08:15:49 MST
Big Brother can watch us
Free download:

Power, D. J. (2016), "'Big Brother' can watch us," Journal of Decision Systems, Volume 25, Supplement 1 (Special Issue: Proceedings of the 2016 Open Conference of the IFIP WG 8.3, "Big Data, Better Decisions, Brighter Future," edited by David Sammon, Frada Burstein, Ciara Heavin, Gloria Philips-Wren, Frederic Adam and Ana Respicio), pp. 578-588. Published online June 16, 2016 at http://www.tandfonline.com/doi/full/10.1080/12460125.2016.1187420

Abstract
Privacy, surveillance, and government abuse of data are concerns of many people in our complex digital world. 'Big Brother' in the title of this article is a metaphorical warning about the consequences if government uses modern technologies to maintain power and control people. Issues related to the abuse of data and surveillance are not new in the academic literature and mass-market media; the current threat, however, is greater. Technology has advanced to the point where George Orwell's dystopian 'Big Brother' vision of a totalitarian state is possible. Because of technology advances, barriers associated with collecting and processing real-time data about many millions of individuals have been removed. This article explores how the capture and use of new data streams, and processing with AI and predictive analytics, can support government control of its citizens. Some components of a system for thought control and real-time surveillance are already in use. These components, such as cameras, sensors, NoSQL databases, predictive analytics, and artificial intelligence, can be connected and improved. Decision support researchers must understand the issues and resist attempts to use information technologies to support current or future totalitarian governments.



http://www.b-eye-network.com/blogs/power/archives/2016/06/big_brother_can.php Tue, 28 Jun 2016 09:07:15 MST
Data science: supporting decision-making

Power, D.J., "Data science: supporting decision-making," Journal of Decision Systems, published online April 25, 2016 at URL http://www.tandfonline.com/doi/full/10.1080/12460125.2016.1171610

Abstract

Data science is a new academic trans-discipline that builds on 60 years of research about supporting decision-making in organisations. It is an important and potentially significant concept and practice. Contemplating the need for data scientists encourages academics and managers to examine issues of decision-maker rationality, data and data analysis needs, analytical tools, job skills and academic preparation. This article explores data science and the data professionals who will use new data streams and analytics to support decision-making. It also examines the dimensions that are changing in the data stream and the skills needed by data scientists to analyse the new data streams. Organisations need data scientists, but academics need to understand the new data science jobs to prepare more people to support decision-making.




http://www.b-eye-network.com/blogs/power/archives/2016/06/data_science_su.php Tue, 14 Jun 2016 23:23:55 MST
Going Beyond Business Sponsorship to Achieve Program Sustainability and Success

When implementing data governance (DG), enterprise information management (EIM) or any similar business program, strong sponsorship is essential. Without it, a new program is all but guaranteed to fail.

But sponsorship can mean different things to different organizations. Often there is no clear understanding of what the role of Business Sponsor entails, and this can present a very real challenge. When success eludes a new program, it may be because the sponsor has missed the mark and failed to progress the program to a point of sustainability. We have observed this in many of the organizations we have worked with.

Let's consider a different term: engagement. Engagement means there is leadership embracing accountability for success. Engaged leaders go beyond just buy-in and typical sponsorship activities in order to effect change. Programs and initiatives like DG and EIM are more likely to succeed--and the success more likely to be sustainable--when business leadership prevails.

That's not to say that the role of Business Sponsor--or the concept of sponsorship in general--should necessarily be eliminated from the equation. Indeed, a very powerful Sponsor will be able to secure needed buy-in and engagement. But when that is not the case, you may need to shift from a sponsorship model to one of leadership to be effective.

This transition should occur in three stages.

Stage 1: Align the Program to the Business

Look at enterprise needs--not user wants--when setting up the program. An enterprise need, for example, might be "to increase brand awareness." The program should then be structured so that it supports these kinds of overall business needs.

Indeed, it is common for stakeholders to think alignment is about granting access to all transaction details. But that is not what is meant by alignment in this context.

Stage 2: Develop a Vision

Create a vision for the program and a purpose for why the program is being implemented and resourced. But be practical: the vision must support that of the enterprise, showing where data assets add business value.

Stage 3: Pivot and Operate the Program

Once program goals and objectives have been aligned with those of the enterprise, and armed with your vision, you can then start to engage the organization in rallying behind the plan. You'll want to be sure to identify who needs to participate, when and how. Be as specific as possible--details are important here.

There are multiple ways to approach this. You might hold an orientation or a series of meetings, or you might establish operational groups. You may also decide to hold a Pivot Workshop, which can help you pull together cross-functional leaders in the organization.

The key is to plan the pivot from being developed to being operational--and from being business sponsored to truly business led.

And don't forget to follow through: Summarize results and activities and follow up to ensure continued accountability and program sustainability.


This post was written jointly with First San Francisco Partners' John Ladley and drawn from our January 7, 2016 CDO Webinar, The Difference between Business Sponsored and Business Led. For an expanded discussion of this topic, watch the video replay or view the slide deck.


http://www.b-eye-network.com/blogs/oneal/archives/2016/02/going_beyond_business_sponsorship.php Mon, 22 Feb 2016 22:00:00 MST
What factors indicate a special study is more appropriate than a DSS? by Daniel J. Power
Editor, DSSResources.com

Personal and organizational factors, cultural factors, decision factors, information factors and psychological factors vary among decision situations. Not every decision situation can or should be supported with a computerized decision support system (DSS). A major alternative is to conduct a special study using some computer-based analyses. Given that some decision situations are better supported by preparing a one-time special study, what factors indicate that it is more appropriate to prepare a special study than to build a decision support system?

Continue reading this classic column at http://dssresources.com/faq/index.php?action=artikel&id=26

Please cite as:

Power, D. J. "What factors indicate a special study is more appropriate than a DSS?" Decision Support News, Vol. 16, No. 25, December 27, 2015 at URL http://dssresources.com/newsletters/410.php


http://www.b-eye-network.com/blogs/power/archives/2015/12/what_factors_in.php Sun, 27 Dec 2015 11:41:50 MST
Three Predictions for Performance Management in 2016

The trends for 2015 we identified a year ago have largely come to pass and continue to be a factor in the performance management space. We believe the three new trends described below will have a significant impact on performance management in 2016.

Combination of BPM and BI

This is a natural progression as users of performance management (BPM) systems are demanding more and better analytics and starting to request predictive analytics specifically. Their needs now go beyond just dashboards and data visualization. In addition, IT executives involved in BPM vendor selection projects are looking for BI to address custom use cases related to the core data in the system. What this leads to is a preference for performance management vendors that also offer integrated and robust business intelligence capabilities. While all vendors offer some BI-like functionality and most are built on a platform of BI tools, the need is for access to more complete and exposed BI functionality.

The vendors that benefit most from this trend are those that have acquired or partnered as resellers with BI vendors. These include Adaptive Insights (with Adaptive Discovery based on their myDials acquisition), Centage (with Analytics Maestro based on their BI-Metrix acquisition), Host Analytics with their Birst partnership, Longview with their arcplan merger, and Tagetik with their Qlik relationship. Of course IBM, Oracle, and SAP have significant BI offerings they can bundle in with their performance management solutions as well.

Cloud as the Standard

We have now reached the point where cloud has become the default deployment option in most performance management deals. At BPM Partners we did not encounter a single deal this year where the cloud option was off the table. Every customer we worked with was open to the cloud, and the majority had it as their preference. In some instances they would only consider an on-premise solution if it had significantly better pricing or functionality when compared to the cloud-based alternatives. Recognizing this trend, a number of vendors have been re-architecting for the cloud, those with multiple deployment options have been leading with the cloud, and even the largest vendors have introduced new cloud-based performance management offerings. We doubt there will be any vendors left that don't offer a cloud option by the end of 2016.

Vendors that benefit most from this trend include the cloud-based performance management pioneers Adaptive Insights and Host Analytics, as well as the newer cloud vendors focused on large enterprises - Anaplan and Tidemark, and XLerant focused on smaller organizations. Tagetik and Vena Solutions have largely moved away from their on-premise options (unless pressed) and now lead with their cloud versions. SAP and Oracle Hyperion have introduced new native cloud planning solutions in the past year as well.

Availability of Pre-Built Solutions

This is a trend that has really been picking up steam. The idea is to pre-package specific processes or industry-specific needs so as to reduce the time and effort required to configure your new performance management system. The offerings are referred to by many different names, including blueprints, shell systems, templates, best practices, apps, starter kits, accelerators, and solutions. They have one primary goal: to demonstrate how to set up the system to accomplish a specific task or series of tasks, and in so doing to reduce the time and cost involved in implementation, shorten the time to payback, and ultimately lower the total cost of ownership. Those benefits help to explain why numerous vendors have recently introduced their own versions of these pre-packaged solution sets.

Vendors that benefit most from this trend include IBM Cognos, who led the way with their blueprints; Anaplan, with their application models; deFacto Global, by including pre-built functionality unique to particular industries in the core product itself; Longview, with their Smart Client apps and pre-packaged content; OneStream Software, with their self-service app store; and Tidemark, with their packaged processes. There are three other vendors with unique approaches worth noting. While most of these packages are free, Adaptive Insights charges for theirs. The reason is that, to make their solution more useful, they bundle in some best practice consulting time. AxiomEPM does something similar by bundling in embedded best practices in the form of management consulting, delivering what they call 'software with a point-of-view'. On the other end of the spectrum from these pre-packaged solution sets is Altius. Instead of providing templates for you to follow or bundling in some expert consulting, they go quite a bit further: they bundle in all the consulting you may require to customize their product for your particular needs at no additional cost.

While there were many other trends we observed in 2015 that will continue into 2016, we think these three are the most significant. They will help reshape the vendor landscape once again, creating new leaders and laggards, and therefore should be taken into consideration when you evaluate solutions.



http://www.b-eye-network.com/blogs/schiff/archives/2015/12/three_predictio_1.php Tue, 22 Dec 2015 10:10:54 MST
What is data mining and how is it related to DSS? by Daniel J. Power
Editor, DSSResources.com

Data mining is a data analysis innovation first discussed in the 1990s. "Big data" and analytics have led to a renewed and expanded interest in data mining technologies. Academics tend to use the related terms Knowledge Discovery and Intelligent Decision Support Methods (Dhar and Stein, 1997), or more derogatory terms like data surfing or data dredging. In general, data mining is a group of analytical methods, such as neural networks, genetic algorithms, and decision trees, that help people conduct computerized searches for patterns in a data set. Data mining is both a process and a set of tools.
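For a concrete taste of one of the methods named above, the sketch below fits a decision tree to a small synthetic dataset and prints the pattern it discovers as human-readable rules. It uses scikit-learn purely for illustration; it is not drawn from the column itself.

  from sklearn.datasets import make_classification
  from sklearn.tree import DecisionTreeClassifier, export_text

  # Synthetic records standing in for historical data (e.g., past transactions).
  X, y = make_classification(n_samples=500, n_features=4, random_state=0)

  # Fit a shallow decision tree: a computerized search for rules that
  # separate the two classes in the data set.
  tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

  # The pattern data mining found, expressed as if/else rules.
  print(export_text(tree, feature_names=[f"x{i}" for i in range(4)]))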

Continue reading at http://dssresources.com/faq/index.php?action=artikel&id=39

Please cite as:

Power, D. J. "What is data mining and how is it related to DSS?" Decision Support News, Vol. 16, No. 23, November 29, 2015 at URL http://dssresources.com/newsletters/408.php


http://www.b-eye-network.com/blogs/power/archives/2015/11/what_is_data_mi_1.php Sun, 29 Nov 2015 10:12:50 MST
The Importance of a Communication Plan to Maintain Stakeholder Commitment
Whether your data governance program has been recently deployed or has already matured into a going concern in your organization, consistent and impactful communication plays a critical role in translating data value into business value.

A communication plan lays out a strategy to help an organization achieve its awareness goals. It describes the What, When, Where, Why and How of a communication program and is meant to create a bi-directional conversation.

With a solid communication plan, you can keep stakeholders informed of your program's progress and accomplishments, fostering executive buy-in and ongoing commitment.

Your communication plan can also:

  • Give the working team a day-to-day work focus
  • Help stakeholders and the working team set priorities
  • Provide stakeholders with a sense of order and control
  • Provide a demonstration of value to the stakeholders and other business folks
  • Help stakeholders support the data program
  • Help to protect the data program against last-minute demands from stakeholders

The key to communication that resonates is to ensure the metrics and measurements map to stakeholders' defined professional and personal goals.

Here are some starter questions to help as you develop your plan:

  • Who needs to be communicated to?
  • What information is important to them?
  • How frequently should they be updated?
  • What is the method of communication?
  • Who should be communicating the message?

For further inspiration, here are some components of a communication plan you'll want to tailor to specific stakeholder groups.

Components of Communication Plan



http://www.b-eye-network.com/blogs/oneal/archives/2015/11/the_importance_of_a_communication_plan_to_maintain_stakeholder_commitment.php Sun, 22 Nov 2015 04:15:15 MST
Best Practices to Address Common Pitfalls to Data Governance

Having previously identified five common pitfalls to data governance, let's now look at some best practices you'll want to adopt in each case to help you get back on track or avoid falling into that trap in the first place.


Pitfall #1: Governing data from within IT

Best Practice: Identify and recruit a change leader on the Business side

Though IT is often the first to identify the need for data governance, Business is generally the primary creator, fixer and user of that data. Not surprisingly, data governance tends to be much more successful when the control of data occurs from within Business.

When looking for an executive sponsor for your data governance program, you'll want to consider the following qualities:

  • Ability to lead cross-functional initiatives
  • Ability to manage multiple political functions simultaneously
  • High regard as a respected leader
  • Self-confidence and flexibility
  • Ability to communicate effectively and inspire others

You'll also want to consider the impact data has on the potential sponsor's business unit, ensuring there is a high level of interest. The right business-focused change leader can subsequently help drive implementation of other best practices.


Pitfall #2: Governing data in silos

Best Practice: Establish enterprise data governance so people "Think globally and act locally"

While a data governance program confined to an individual business unit or line of business will likely help that unit or line, problems arise because data is shared across different business groups--each group defines a given data element according to its own needs and perspectives, which can lead to inconsistent information and inefficient or suboptimal decisions.

Data governance can only be successful when an organization governs data as an enterprise asset. The cross-functional steering committee, for example, can ensure shared definitions are consistent across groups and that value is created across the enterprise. It is important for people to "think globally but act locally." Data governance existing in silos is not without value, however, as what is already in place can be leveraged and built out to the enterprise.


Pitfall #3: Assuming everyone understands and appreciates the value of data

Best Practice: Communicate early and often about the impact of inaccurate and inconsistent data and the benefit of data governance

While some stakeholders are involved in or at least aware of all that goes into cleaning data and appreciate the value of that data to the organization, others who only see the data after it has been cleaned may have little appreciation for its true value. Thus it is important to communicate this value and the benefit of data governance from the start and repeatedly after that. The change leader and sponsor identified in the first best practice should play a central role in this communication.


Pitfall #4: Using meaningless metrics

Best Practice: Measure impact as well as progress

Borrowing from the pitfall addressed above, we can understand how, for example, a process metric reflecting a decrease in data errors is meaningful to a group that actively scrubs the data and addresses problems with it, but that metric may mean little to another group that only sees the data in its clean state.

That's why it is important to measure (and, of course, communicate) both impact and progress and to translate metrics into business value. In the above example, we want to understand how the progress metric (a reduction in data errors) translates into an improvement in the business, expressed as a KPI.
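As a hypothetical illustration of that translation (all figures invented for the example), a progress metric can be restated in the currency the business tracks:

  # Hypothetical figures, purely illustrative.
  errors_before, errors_after = 1_200, 300   # data errors per month
  rework_cost_per_error = 45.00              # average cost to find and fix one error

  monthly_savings = (errors_before - errors_after) * rework_cost_per_error
  print(f"Progress metric: errors down by {errors_before - errors_after} per month")
  print(f"Business value:  ${monthly_savings:,.0f} saved per month")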


Pitfall #5: Treating data governance as a project

Best Practice: Embed data governance into your operations

When implementing a data governance initiative, organizations will often approach it as they would a project, with a distinct beginning, middle and end, and funding allocated accordingly. However, data governance is an ongoing program that cannot be sustained without continued resources and support.

It is critical for organizations to ensure data governance is fully integrated into their operations and that it continues to receive necessary funding and attention. Ensuring from the start that your operating model fits the culture of your company facilitates embedding data governance into your operations and aligning your strategy for long-term sustainability.


http://www.b-eye-network.com/blogs/oneal/archives/2015/11/best_practices_to_address_common_pitfalls_to_dg.php Sat, 21 Nov 2015 04:00:59 MST