Blog: Jill Dyché

Jill Dyché

There you are! What took you so long? This is my blog and it's about YOU.

Yes, you. Or at least it's about your company. Or people you work with in your company. Or people at other companies that are a lot like you. Or people at other companies that you'd rather not resemble at all. Or it's about your competitors and what they're doing, and whether you're doing it better. You get the idea. There's a swarm of swamis, shrinks, and gurus out there already, but I'm just a consultant who works with lots of clients, and the dirty little secret - shhh! - is my clients share a lot of the same challenges around data management, data governance, and data integration. Many of their stories are universal, and that's where you come in.

I'm hoping you'll pour a cup of tea (if this were another Web site, it would be a tumbler of single-malt, but never mind), open the blog, read a little bit and go, "Jeez, that sounds just like me." Or not. Either way, welcome on in. It really is all about you.

About the author

Jill is a partner and co-founder of Baseline Consulting, a technology and management consulting firm specializing in data integration and business analytics. Jill is the author of three acclaimed business books, the latest of which is Customer Data Integration: Reaching a Single Version of the Truth, co-authored with Evan Levy. Her blog, Inside the Biz, focuses on the business value of IT.

Editor's Note: More articles and resources are available in Jill's BeyeNETWORK Expert Channel. Be sure to visit today!

November 2009 Archives

By Mary Anne Hopper, Senior Consultant

A Pleasant Evening by Gabriel Villena (via Flickr)

When not at work, I enjoy racing sailboats - sometimes day racing and sometimes offshore racing. Last year, while racing from San Diego, CA to Puerto Vallarta, Mexico, we spent a considerable amount of time drifting off Cabo San Lucas in absolutely no wind. In the middle of the night with no wind, conversation tends to drift to stories among the team on watch. Call it a lack of sleep: some of my stories made complete sense to me but not to my two watch-mates. So they told me, “Mary Anne, every story has a beginning, a middle, and an end. Now why don't you start over from the beginning?”

That line has become something of a joke since that day and night spent off Cabo. But one of the interesting places it plays in my mind is the BI requirements process. That process can be broken down into three steps: business, data, and functional requirements. A beginning, a middle, and an end. Starting at the beginning with business requirements sets the stage for what business questions need to be answered, ensuring value is delivered. Data requirements can then be gathered to support those business questions. In the end, functional requirements detail how to access that data.

Oftentimes, I see clients start with a set of functional requirements with no idea of whether they can be delivered with available data or what business questions they will answer. This approach can take the project down one of two paths. The first is back-tracking through the requirements process, which inevitably takes the project team longer to deliver because they start over with each step backwards. The other path delivers the functional requirement, but only with work-arounds (logic built into reporting, extra spreadsheets, new desktop applications, and so on), also a costly alternative when others try to support or extend the functionality. Neither of these approaches delivers the needed business value with the agility required.

So when you're planning your next BI deliverable, remember that every story has a beginning, a middle, and an end. Start at the beginning with the business requirements, work forward to the data requirements in the middle, then finish with the functional requirements at the end, and you'll enjoy the success of a story well told.
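For teams that like to make that chain explicit, here is a minimal sketch in Python (purely illustrative; the class and field names are my own, not part of any formal methodology) of how a business question can be traced forward to the data and functional requirements that serve it:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FunctionalRequirement:
    description: str              # the "end": how users access the data

@dataclass
class DataRequirement:
    source: str                   # the "middle": where the data comes from
    elements: List[str]           # attributes needed to answer the question
    functional: List[FunctionalRequirement] = field(default_factory=list)

@dataclass
class BusinessRequirement:
    question: str                 # the "beginning": the business question
    data: List[DataRequirement] = field(default_factory=list)

# Beginning: the business question that frames the value
req = BusinessRequirement(question="Which regions missed their monthly sales targets?")

# Middle: the data needed to answer it (hypothetical source and columns)
req.data.append(DataRequirement(
    source="sales_mart.orders",
    elements=["region", "order_month", "net_revenue", "sales_target"],
))

# End: how users will actually get at that data
req.data[0].functional.append(FunctionalRequirement(
    description="Monthly variance report, filterable by region",
))
```

The point is simple traceability: if a functional requirement cannot be walked back to a business question, it is a candidate for the costly back-tracking path described above.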

photo by Gabriel Villena via Flickr Creative Commons License



Mary Anne has 15 years of experience as a data management professional in all aspects of successfully delivering data solutions to support business needs. She has worked in the capacity of both project manager and business analyst to lead business and technical project teams through data warehouse/data mart implementation, data integration, tool selection and implementation, and process automation projects.


Posted November 20, 2009 6:00 AM

Below is an excerpt from the white paper Planning Your BI Program: A Portfolio Management Approach by senior consultant Kimberly Nevala. In it, Kimberly discusses how your BI program is, in many ways, similar to your personal financial portfolio, and how, by applying some of the same principles you apply to your own finances, you can create a sustainable approach to optimizing and capitalizing on your corporate information. You can download the full paper here.


Portfolio Management and BI

In a very real sense, your BI program is an investment akin to your corporate or personal financial portfolio. It is multifaceted, dynamic, and subject to sometimes unpredictable internal and external pressures. Just like a financial portfolio, BI represents a significant, strategic investment and requires a similar level of discipline relative to its ongoing, deliberate management and control.


Figure 1: Balancing BI Investments

As seen in Figure 1, taking a portfolio management approach to BI implies we must continuously:

  • Understand business strategies and objectives
  • Monitor internal business performance, as well as external industry and economic trends
  • Identify business opportunities and needs
  • Balance opportunities against available capital, enterprise resources, and risk/reward ratios (aka ROI); a simple scoring sketch follows this list
  • Deploy appropriate business intelligence capabilities to confirm and/or capitalize upon selected opportunities
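To make the balancing act concrete, here is a small, purely hypothetical sketch in Python; the candidate opportunities, dollar figures, and risk weights are invented for illustration and do not come from the white paper:

```python
# Illustrative only: opportunities, amounts, and risks are made up
# to show one way of balancing candidate BI investments.
opportunities = [
    # (name, expected_benefit, cost, risk 0-1)
    ("Churn prediction model",       500_000, 150_000, 0.4),
    ("Self-service sales reporting", 200_000,  60_000, 0.2),
    ("Real-time inventory feed",     350_000, 300_000, 0.6),
]

def score(benefit, cost, risk):
    """Crude risk-adjusted ROI: reward per dollar spent, discounted by risk."""
    return (benefit - cost) / cost * (1 - risk)

ranked = sorted(opportunities, key=lambda o: score(*o[1:]), reverse=True)
for name, benefit, cost, risk in ranked:
    print(f"{name:32s} score={score(benefit, cost, risk):.2f}")
```

A real portfolio review would also weigh strategic alignment and resource constraints, but even a crude risk-adjusted score forces the conversation about which opportunities earn the next increment of capital.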

In investment parlance, portfolio management can be active or passive. Immature BI programs tend to focus on passive, or reactive, capabilities: basic reporting on events that have already occurred and simple analytic extrapolations. They are usually driven by ad hoc requests and hampered by users' limited prowess with the toolsets. Mature BI programs include (pro)active capabilities such as forecasting, predictive analysis, and data mining. In addition, these capabilities, or their outputs, are integrated into the business's decision-making and management processes.

Continue reading here.


Posted November 12, 2009 6:00 AM

In which Jill predicts the future, thus inducing knowing smirks among her loyal (and patient) readership.


Just putting the luggage away from TDWI's World Conference in Orlando. This was a surprisingly successful conference. I say "surprisingly" because across its repertory of quarterly conferences, TDWI's Orlando event isn't as "hot" as Vegas or San Diego. But this week's conference was better-attended than San Diego in August. What gives?

The general theory is that people are spending their leftover budgets before 2009 slips away. Makes sense. But my observation from talking to attendees is that as they budget for the next fiscal year, they're anticipating some important technology investments and doing their due diligence now. This certainly showed in the questions attendees asked during the week's workshops.

In my BI from Both Sides: Aligning Business and IT workshop on Sunday, there were more questions than usual about a best-of-breed approach to BI tools. It seems that CIOs are driving standardization--often, if you ask me, without considering business requirements.  If you're in IT, fight the blind urge to standardize on a single BI tool vendor. Instead, regularly present new technology solutions to business users who might be ready to evolve their information usage. This gives you the opportunity to transcend the "IT as cost center" mentality and serve as Trusted Advisor to the business.

In our Data Governance for BI Professionals class on Tuesday, my co-instructor Kimberly Nevala and I got lots of after-session questions on data stewardship dashboards and workflow automation tools. It seems that as companies mature their decision rights, they want data anomalies and other discrepancies "pushed" to the desktops of appropriate decision owners, be they data stewards, subject matter experts on the business side, or the data governance council. MDM vendors are leading the charge with such capabilities, and Kalido, Initiate Systems, Siperian, and DataFlux all offer workflow functionality.
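None of those vendors' workflow engines are shown here; the snippet below is only a bare-bones Python sketch (subject areas, owners, and addresses invented) of the underlying idea: pushing a detected anomaly to the decision owner for its subject area.

```python
# Purely illustrative routing of data anomalies to decision owners;
# subject areas, owners, and rules are made up for the example.
ROUTING = {
    "customer": "customer_data_steward@example.com",
    "product":  "product_sme@example.com",
    "finance":  "data_governance_council@example.com",
}

def route_anomaly(subject_area: str, description: str) -> str:
    """Return the inbox that should receive this anomaly."""
    owner = ROUTING.get(subject_area, "data_governance_council@example.com")
    # In a real workflow tool this would open a task or dashboard alert;
    # here we simply report where the issue would be pushed.
    print(f"[{subject_area}] {description} -> {owner}")
    return owner

route_anomaly("customer", "1,200 records missing postal code")
route_anomaly("product", "Duplicate SKUs across two source systems")
```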

I sat in on Evan Levy's Introduction to MDM for BI Professionals course on Wednesday morning just as someone was asking about hierarchy management. It seems his company, a name-brand electronics firm, had been writing code to manage B2B hierarchies in the data warehouse, but realized that multiple hierarchies were exponentially more difficult to manage. Evan explained how MDM tools manage multiple hierarchies and groupings, often leveraging external data providers not only to help resolve relationships but also to enrich the data in the process.
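To see why hand-coded hierarchy logic gets painful, consider a toy Python sketch (my own structure, not any vendor's data model) in which the same account participates in two independent hierarchies, a sales-territory rollup and a corporate-parent rollup:

```python
# Toy example: the same account appears in two independent hierarchies.
# Structures and names are invented for illustration.
hierarchies = {
    "sales_territory": {              # child -> parent within this hierarchy
        "Acme West": "Western Region",
        "Acme East": "Eastern Region",
        "Western Region": "North America",
        "Eastern Region": "North America",
    },
    "corporate_parent": {
        "Acme West": "Acme Holdings",
        "Acme East": "Acme Holdings",
        "Acme Holdings": "Global Conglomerate Inc.",
    },
}

def rollup(node, hierarchy):
    """Walk a node up a single hierarchy to its top-most parent."""
    path = [node]
    parents = hierarchies[hierarchy]
    while path[-1] in parents:
        path.append(parents[path[-1]])
    return path

# The same account rolls up differently depending on the hierarchy asked for.
print(rollup("Acme West", "sales_territory"))   # territory view
print(rollup("Acme West", "corporate_parent"))  # ownership view
```

Each additional hierarchy multiplies the rollup paths, versions, and exceptions that warehouse code has to carry, which is the burden the MDM tools Evan described are built to take on.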

Evan also did a good job fielding questions from an engaged group in his workshop, Gathering MDM Requirements: A Different Formula. Just when you thought it was safe to engage business users with a structured requirements gathering process, MDM invites a different set of requirements steps, focused again on OLTP-type processing. It was Evan's birthday and the audience of engaged attendees was a nice gift.

I caught snippets of talks by Cindi Howson and John O'Brien--shout-out to Cindi, too, for an inspirational Monday keynote--and got updates from vendors like Sybase, Talend and Paraccel. Some fresh "wins" were evident on the vendor side of things, confirming my theory that 2010 is going to be a Big Year. Hey, 2010, hurry up already!

Technorati Tags: TDWI, business intelligence, business IT alignment, data governance, MDM, master data management, Sybase, Paraccel, Talend, Initiate Systems, Kalido, Siperian, DataFlux


Posted November 6, 2009 3:16 PM

By Carol Newcomb, Senior Consultant


Mind The Gap by limaoscarjuliet via Flickr (Creative Commons)

Somewhere along the way to understanding the whole data governance thing, someone forgot to mention that data management really is central. Did I say Mention? I really meant Emphasize. Urgently Emphasize. Data management processes are critical to the establishment and maintenance of a data governance program.

The thing is, data governance is quite impossible without a fundamental data management process that provides tactical direction. For example, if you have a bunch of people running around acting as data stewards, asking people questions about metadata, source systems, definitions, the use of data, and the implications of sharing it, what happens when they have to write a policy? And when data stewards meet as a group, how can they describe the magnitude of the issues to discuss with their users? How do issues get prioritized? How can anybody decide who is accountable if due diligence hasn't been applied to the fundamental source data issues, and how can decisions about governance enforcement be made?

Data governance should never be undertaken as a project. It is a long-term program that needs to start with a foothold in core data problems, like the inability to use or share data. To get that foothold, stewards need to identify data quality and consistency issues that resonate with end users. Find some tangible pain. This requires a sound set of data quality profiling activities to prove where conflicts or inconsistencies are most problematic, and a quantifiable statement of their impact. The problem needs to be quantifiable to get people's attention, as well as to compare one set of problems against another. The data management program should have the capacity to routinely analyze and reveal data quality issues, and to keep bringing them back to end users to build a sense of urgency toward taking restorative and preventive action.
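As a deliberately simplified illustration of that kind of profiling, the sketch below (Python with pandas; the customer extract and column names are hypothetical) counts nulls, distinct values, and duplicates, then quantifies one business-facing impact:

```python
import pandas as pd

# Hypothetical customer extract; in practice this comes from a source system.
customers = pd.DataFrame({
    "customer_id": [101, 102, 102, 104, None],
    "email":       ["a@x.com", None, "b@x.com", "b@x.com", "c@x.com"],
    "state":       ["CA", "ca", "NY", None, "TX"],
})

# Basic profile: how dirty is each column?
profile = pd.DataFrame({
    "null_pct":      customers.isna().mean().round(2),
    "distinct_vals": customers.nunique(),
    "duplicates":    customers.apply(lambda col: col.duplicated().sum()),
})
print(profile)

# Quantify one specific, business-facing issue: records unusable for outreach.
unreachable = customers["email"].isna().sum()
print(f"{unreachable} of {len(customers)} customers cannot be emailed")
```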

Data governance is all about People, Processes, Decision Rights and Controls. But before you can even begin, you will need to have identified the problems through hardcore data management practices. This involves understanding the data architecture, examining privacy and compliance mandates around data visibility and sharing, exploring metadata definitions, and having the ability to make changes through sound data administration practices. Without these systems and roles in place, data governance is just a hollow shell.

photo by limaoscarjuliet (via Flickr)


Carol Newcomb is a Senior Consultant with Baseline Consulting. She specializes in developing BI and data governance programs to drive competitive advantage and fact-based decision making. Carol has consulted for a variety of health care organizations, including Rush Health Associates, Kaiser Permanente, OSF Healthcare, the Blue Cross Blue Shield Association and more. While working at the Joint Commission and Northwestern Memorial Hospital, she designed and conducted scientific research projects and contributed to statistical analyses.


Posted November 5, 2009 6:00 AM