Blog: Jill Dyché

There you are! What took you so long? This is my blog and it's about YOU.

Yes, you. Or at least it's about your company. Or people you work with in your company. Or people at other companies that are a lot like you. Or people at other companies that you'd rather not resemble at all. Or it's about your competitors and what they're doing, and whether you're doing it better. You get the idea. There's a swarm of swamis, shrinks, and gurus out there already, but I'm just a consultant who works with lots of clients, and the dirty little secret - shhh! - is my clients share a lot of the same challenges around data management, data governance, and data integration. Many of their stories are universal, and that's where you come in.

I'm hoping you'll pour a cup of tea (if this were another Web site, it would be a tumbler of single-malt, but never mind), open the blog, read a little bit and go, "Jeez, that sounds just like me." Or not. Either way, welcome on in. It really is all about you.

About the author

Jill is a partner and co-founder of Baseline Consulting, a technology and management consulting firm specializing in data integration and business analytics. Jill is the author of three acclaimed business books, the latest of which is Customer Data Integration: Reaching a Single Version of the Truth, co-authored with Evan Levy. Her blog, Inside the Biz, focuses on the business value of IT.

Editor's Note: More articles and resources are available in Jill's BeyeNETWORK Expert Channel. Be sure to visit today!

December 2007 Archives

In which Jill and a friend ponder the cultural impact of disruptive technologies while--yes, you guessed it--eating lunch.

"Why does it take us so long to get all these emerging technologies right?" asked my friend and erstwhile client as we tucked into a hearty New York lunch at the Longwood Gourmet.

"Maybe we're supposed to test it first," I said, "before we do anything Big with it. You know. Sort of co-exist with it."

"Don't get all Zen on my ass," he said, and bit into his Number 29 (Grilled Chicken with Mozzarella and Basil on a Hoagie).

But he's right. Once management approves the investment and we write our favorite vendor the check, there's a built-in expectation of quick delivery. The proof-of-concept has become a given with most technology acquisitions, irrespective of its appropriateness. But then what? Often, it's business as usual.

My friend's company needed master data management and he had the budget for it. He realized that CDI was a disruptive technology, that it would be perceived as a threat by some of his IT colleagues, and that it would require changes to both organization and infrastructure. He knew that CDI would help his company in its M&A frenzy: as it gobbled up smaller competitors, the myriad newfound customer lists languished on various servers. Lost cross-selling opportunities were costing the company untold revenues. My friend also understood CDI's promise in addressing poor data quality, a problem that had come to pervade the corporate culture. He hoped, too, that he could use CDI to justify the role of Data Steward for corporate customer information, and to start formalizing data management development processes.

Unlike many of my clients, who face off with colleagues, business cases their weapons and the IT budget the spoils, my friend had the money. He had a vision for what could be accomplished, for the battles that could be fought and won for the greater good of the firm. He had the artillery; he just needed a few more weapons in his arsenal. The problem was that his leadership, namely IT management, refused to fall on its sword for change.

Technorati tags: master data management, MDM, CDI, customer data integration, business-IT alignment


Posted December 29, 2007 9:47 AM

In which Jill goes in for Europe and comes out with a steaming bowlful of Asia. And some kick-ass shu mai.


So I'm on my way to speak at a conference in Asia and I have a 3-hour layover in Taipei. I make my way to the Dynasty Lounge to plug in, turn on, and scarf down the customary shortbread biscuit. I've found that shortbread biscuits are a staple in business class lounges worldwide, so it would be wrong not to have one. Or two.

When I enter the lounge, my expectations are exceeded: the lounge has a Dim Sum bar. Spread out are all sorts of Chinese delicacies from little steamed dumplings to plump meat pies to piping hot noodle soup with meatballs. It's a veritable cavalcade of Asian cuisine! I grab a pair of chopsticks and a spoon, ladle some soup into a bowl, load on some chili sauce, and dig in.

This is how companies feel right after they finally get to buy a data quality tool. They are delighted with the number of options they have, but at the same time conflicted. After all, consider the choices! Where to begin?

Many of our clients begin the automation of data cleansing with a subset of customer data. This is largely because customer data has the most management support, and receives the lion's share of executive attention. If cleaning up customer data means generating more accurate predictive models, which in turn drive higher marketing hit rates, most managers will happily jump on the bandwagon.

Other companies focus more on business needs. A client of ours that recently re-engineered its supply chain started its data cleansing with product item data, planning a horizontal expansion to suppliers, then eventually to customers. It's a requirements-driven approach, and there's nothing wrong with that.

The point is, there's no one right answer, but the old adage of "start small, think big" is an apt one here. Viewing a new data quality tool as a black box that will process all the data in the data warehouse like some sort of meat grinder--in goes the data steak, out come the spicy Chinese meatballs delicately flavored with cumin and chili, oops, sorry--can do more harm than good.

A deliberate approach to prioritizing data cleansing projects will serve you well. Understand which data subject areas or subsets will yield the highest improvements or drive the best business decisions, then chunk up your data cleansing accordingly. There's plenty of data to go around, and you can always go back for seconds!
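If it helps to see the idea in something other than food metaphors, here's a minimal sketch in Python of what "chunking up" the cleansing might look like. The subject areas, the scores, and the clean_subset helper are all made up for illustration; they'd come from your own assessment and your own tool.

```python
# Illustrative only: prioritize data cleansing by expected business payoff,
# then work through the warehouse one manageable subset at a time.
# Subject areas, scores, and clean_subset() are hypothetical.

def clean_subset(subject_area: str) -> None:
    # Placeholder for the actual cleansing run (profiling, standardization,
    # matching) scoped to a single subject area.
    print(f"Cleansing {subject_area} data ...")

# Candidate subject areas, scored 1-10 for business impact and for current
# data quality pain.
candidates = [
    {"area": "customer", "impact": 9, "pain": 8},
    {"area": "product",  "impact": 7, "pain": 9},
    {"area": "supplier", "impact": 5, "pain": 6},
]

# Start small, think big: rank the subsets and tackle them in order,
# instead of treating the tool as a black box for the entire warehouse.
for c in sorted(candidates, key=lambda c: c["impact"] * c["pain"], reverse=True):
    clean_subset(c["area"])
```

The scoring formula doesn't matter much; the point is that the work is scoped and sequenced deliberately rather than aimed at everything at once.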

Technorati tags: data quality, data cleansing, data governance, data warehouse, data management


Posted December 11, 2007 4:37 PM

In which Jill hangs out with a bunch of bikers and learns something new.

My dog Lu and I went hiking last Sunday after the rains stopped. At the top of the trail, there's an abandoned missile site, a relic from the Cold War. The site has been converted by the Santa Monica Mountains Conservancy into a picnic area, and is a regular meeting place for weekend warriors, especially hardcore mountain bikers.

I was sitting with a group of them comparing protein bars when a disheveled-looking biker arrived. He didn't sport the standard-issue biking gear of helmet, quick-dry bike shorts, and clip-in shoes, but instead wore knee-length cargo pants and a tank top. His bike looked more like a 10-speed than a mountain bike. His head was helmet-free.

"Guys, can I get to Sullivan Canyon from up here?" he asked the rough-hewn throng of diehards.

"Not on that, you can't," replied one, gesturing toward the 10-speed.

The group went on to explain that Sullivan Canyon had been washed out. In some places the mud was a foot deep and you'd have to walk your bike. There were rockslides to navigate. The novice's bike clearly wouldn't hold up on Sullivan. There were suggestions of easier ways down the trail.

The novice politely thanked the group and took off in the direction of Sullivan Canyon. I admired the bikers' forbearance as they watched him ride off.

The experience reminded me of a conversation I had a while ago with an ETL developer. He was trying to reconcile customer data from his client's different distributors and business partners. Of course, the data varied wildly by source system, and he was using the ETL tool to write custom data cleansing code. It was brutal work, and when he explained that his client was running out of money for the project, I wasn't surprised.

I politely suggested that a data quality tool could significantly reduce his manual efforts. A tool would let him profile the incoming data from the different sources and then establish rules to handle anomalies. Over time, he could refine the rules and save himself a lot of work. "We're too far into it," was all he would say.
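For the curious, here's a rough sketch of that profile-then-rules pattern in plain Python rather than any particular data quality product. The records, field names, and standardization rules are hypothetical; a real tool generates and manages this sort of thing for you.

```python
import re
from collections import Counter

# Illustrative only: the profile-then-rules workflow a data quality tool
# automates. Records, field names, and rules below are hypothetical.
records = [
    {"name": "ACME Corp.", "phone": "212-555-0101"},
    {"name": "Acme Corp",  "phone": "(212) 555-0101"},
    {"name": "ACME CORP",  "phone": ""},
]

def profile(records, field):
    """Summarize a field: how often it's missing and what shapes it takes."""
    values = [r.get(field, "") for r in records]
    missing = sum(1 for v in values if not v)
    # Reduce each value to a pattern (digits -> 9, letters -> A) to spot formats.
    shapes = Counter(re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", v)) for v in values if v)
    return {"missing": missing, "shapes": shapes.most_common(5)}

# Step 1: profile the incoming data to see what you're actually dealing with.
print(profile(records, "phone"))

# Step 2: establish rules for the anomalies the profile surfaces, and refine
# them over time instead of hand-coding a one-off fix for each source.
def standardize_phone(value):
    digits = re.sub(r"\D", "", value)
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}" if len(digits) == 10 else None

def standardize_name(value):
    return re.sub(r"\s+", " ", value).strip().rstrip(".").title()

for r in records:
    r["phone"] = standardize_phone(r["phone"])
    r["name"] = standardize_name(r["name"])

print(records)
```

A real tool does far more than this, of course--fuzzy matching, survivorship, reference data--but even this little loop beats writing custom cleansing code for every anomaly the sources throw at you.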

I couldn't help thinking that this was a misinformed choice. Had my friend known about the data quality tools but rejected them in order to perpetuate his contract? Did he need an update on the features of these tools? Or did he, like the novice bike rider, assume the conditions simply weren't that bad?

Makes me wonder if the novice biker made it to the bottom of Sullivan Canyon and, if he did, what cuts and bruises he'd have to show for the ride. And whether he'd think they were worth it.

Technorati tags: data quality, ETL, business intelligence, data governance


Posted December 1, 2007 4:08 PM