Blog: William McKnight

William McKnight

Hello and welcome to my blog!

I will periodically be sharing my thoughts and observations on information management here in the blog. I am passionate about the effective creation, management and distribution of information for the benefit of company goals, and I'm thrilled to be a part of my clients' growth plans and connect what the industry provides to those goals. I have played many roles, but the perspective I come from is benefit to the end client. I hope the entries can be of some modest benefit to that goal. Please share your thoughts and input to the topics.

About the author

William is the president of McKnight Consulting Group, a firm focused on delivering business value and solving business challenges utilizing proven, streamlined approaches in data warehousing, master data management and business intelligence, all with a focus on data quality and scalable architectures. William functions as strategist, information architect and program manager for complex, high-volume, full life-cycle implementations worldwide. William is a Southwest Entrepreneur of the Year finalist, a frequent best-practices judge, has authored hundreds of articles and white papers, and given hundreds of international keynotes and public seminars. His team's implementations from both IT and consultant positions have won Best Practices awards. He is a former IT Vice President of a Fortune company, a former software engineer, and holds an MBA. William is author of the book 90 Days to Success in Consulting. Contact William at wmcknight@mcknightcg.com.

Editor's Note: More articles and resources are available in William's BeyeNETWORK Expert Channel. Be sure to visit today!

Recently in Market Category

PivotLink is a pure SaaS play for midmarket or enterprise departmental needs. They make a clear claim to target companies with BI shelfware, and they have a number of case studies in which customers' BI needs were met with PivotLink after failures with other solutions.

PivotLink delivers data in various forms, but mostly built around the concept of a pivot table, providing ad hoc data access across a high level of dimensionality.
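As an aside, the pivot-table style of ad hoc analysis across dimensions can be sketched with pandas; the data and column names below are my own hypothetical example, not PivotLink's.

```python
import pandas as pd

# Hypothetical retail sales data with several dimensions
sales = pd.DataFrame({
    "region":  ["West", "West", "East", "East", "East"],
    "product": ["Bikes", "Tents", "Bikes", "Bikes", "Tents"],
    "quarter": ["Q1", "Q1", "Q1", "Q2", "Q2"],
    "revenue": [1200, 800, 950, 1100, 700],
})

# Pivot: regions as rows, quarters as columns, summed revenue in the cells
pivot = sales.pivot_table(index="region", columns="quarter",
                          values="revenue", aggfunc="sum", fill_value=0)
print(pivot)
```

Swapping `index` and `columns` choices is the "ad hoc" part: any dimension can become a row or column axis on demand.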

[Screenshot: pivotlink.jpg]

Interestingly, I have learned that PivotLink uses a proprietary columnar, in-memory database management system. This DBMS is not sold separately. As with any columnar database, the compression ratio is high.
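To see why columnar stores compress so well, here is a minimal run-length encoding sketch in Python. This is my own illustration of the general principle, not PivotLink's actual algorithm: because a column holds values of one type, often low-cardinality and sorted, long runs of repeats collapse dramatically.

```python
def rle_encode(column):
    """Run-length encode a column as [value, count] pairs."""
    runs = []
    for value in column:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([value, 1])   # start a new run
    return runs

# A dimension column as a columnar store would lay it out: many repeats
state = ["CA"] * 4 + ["NY"] * 3 + ["TX"] * 2
print(rle_encode(state))  # [['CA', 4], ['NY', 3], ['TX', 2]]
```

Nine stored values become three pairs; on real columns with millions of rows, this is where the high compression ratios come from.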

And here you can see how the PivotLink data model is customizable through an abstracted view.

 

[Screenshot: pivotlink2.jpg]

From a technology perspective, PivotLink is interesting for its combination of in-vogue approaches like a columnar DBMS and SaaS. It claims access to a variety of source systems and supports customers with billions of rows of data, such as REI.


Posted April 3, 2009 11:36 AM
Permalink | No Comments |

I have received some questions on my article "Information Management and the Financial Meltdown," so I thought I'd address them here. The article was written in September, after the meltdown of Fannie Mae and Freddie Mac and Lehman Brothers' filing for bankruptcy. AIG had suffered a liquidity crisis, but had not yet received the government loans to come. Goldman Sachs and Morgan Stanley had yet to be converted to bank holding companies, Washington Mutual had yet to be seized, Wachovia had yet to be acquired, and so on. And now we see the crisis spreading to the auto industry, and probably the airline industry will eventually be front and center. In other words, it was, and continues to be, a moving target.

It is really difficult to tell the depth of the deleveraging and decoupling the world economy will go through. The economy is wound up pretty tight and must let out the built-up pressure. Questions remain about the approach and the timing, but there is no avoiding that pain has occurred, and will continue to.

One point I made is that financial companies were motivated to get mortgages out the door, and that they then sold off their toxicity. This was true, but why were they so motivated? Some point to the Community Reinvestment Act (CRA) of 1977, which pressed institutions to lend to less-qualified borrowers. The Riegle-Neal Act of 1994 compounded the CRA's effect by rewarding banks with high CRA ratings with the ability to bank across state lines. And still more compounding became possible in 1999 with Gramm-Leach-Bliley, which allowed banks to combine investment and commercial operations.

There was also incentive to take undue risks once the firms went public and executive accountability was diluted, the executives' stakes becoming ever-smaller minority interests in the entities. This started with Salomon Brothers, important in Citi's heritage, going public in 1981. While I'm at it, the rating agencies' presentation of their business intelligence left something to be desired. And home equity loans at over 100 percent of value, combined with a real estate downturn, tossed more toxicity on the fire.

Another point is that the mortgages were put into complex packaging that business intelligence did not keep up with. So, in the context of business intelligence, did the financial companies know what they were buying? I think business intelligence has some room to grow there, as pointed out in the article. A better question may be: did they care? In some respects they did, but in others business intelligence was relegated to secondary consideration, given that the institutions were not incented purely by profitability and good business. As I said, "full visibility into exposure and liquidity is going to be a must." Visibility, and rewarding only good business, are part of the "executive sponsorship" I mention as required.
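To make "full visibility into exposure" concrete, a toy roll-up might look like the following. The positions, counterparties and figures are entirely hypothetical, and real exposure models are far more involved; the point is simply that concentrations only become visible once data from every desk is aggregated in one place.

```python
import pandas as pd

# Hypothetical positions held across several desks (amounts in $MM)
positions = pd.DataFrame({
    "counterparty": ["A", "A", "B", "C"],
    "instrument":   ["MBS", "CDS", "MBS", "Bond"],
    "exposure_mm":  [120.0, 45.5, 80.0, 10.0],
})

# Roll up exposure by counterparty so concentrations stand out at a glance
by_counterparty = positions.groupby("counterparty")["exposure_mm"].sum()
print(by_counterparty.sort_values(ascending=False))
```

Without this kind of consolidated view, each desk sees only its own slice, and a 165.5 concentration in counterparty A looks like two unremarkable positions.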

I had an MBA professor who went through some of the early lineage above with his students and predicted a dire outcome. I took his notes (early 1990s) and extrapolated the more recent events for this entry. Many probably could have seen this coming, but when times are good, nobody wants to stop the music. Executive sponsorship and business intelligence will be critical to mending the markets as painlessly as possible.

What are your thoughts?

Technorati tags: data, Business Intelligence, financial crisis, Information Management, Community Reinvestment Act, Gramm Leach Bliley


Posted November 30, 2008 5:34 PM
Permalink | 4 Comments |

I had a chance to review Lyza a few times in the last couple of weeks - both before and after its launch on Sept. 22. The biggest reason I like it is that I found immediate applicability to both a client situation and a personal situation. That is, I've actually used it. Perhaps another reason, by way of disclosure, is that I know and like the team at Lyzasoft, and I know their goal of providing a strong value proposition to the market. The extent of the focus-group work that went into the product's development is amazing.

With dynamic connections to the underlying data sets you define to Lyza (refreshed with a click), I find that it extends the functionality of the desktop. Think of it as a functional way of enabling joins and analysis across file types. You will mostly use this with Excel, Word, Access and text files. You pick the cell that starts the connection, or the range of cells that defines it; it does not have to be the entire file. Then again, there will also be enterprise uses for its ODBC/JDBC connectivity, which is probably its ultimate destination.

My favorite feature is the ability to put all these data types on an equal footing and establish the joins.
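A rough analogue of that cross-source join can be sketched in pandas. The tables, column names and commission rate here are my own invention, not Lyza's; imagine the orders coming from a spreadsheet and the rep roster from a text file, each loaded into a frame.

```python
import pandas as pd

# Hypothetical stand-ins for two heterogeneous sources
orders = pd.DataFrame({"rep_id": [1, 2, 1], "amount": [500, 300, 200]})
reps = pd.DataFrame({"rep_id": [1, 2], "name": ["Avery", "Blake"]})

# Join the two sources on the shared key, putting them on equal footing
joined = orders.merge(reps, on="rep_id")

# Then roll up sales and compute a 5% commission per rep
commissions = joined.groupby("name")["amount"].sum() * 0.05
print(commissions)
```

The value of a tool like Lyza is that a business analyst gets this kind of join and derivation without writing the code, while the metadata records where each number came from.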

It also has the ability to store data you may want to derive from the underlying data sources in its own (column-oriented) data store. So, in effect, Lyza itself can become one of the data stores used in the analysis. And you can publish complex worksheets that contain the logic, drawn from the underlying files, to determine sales commissions, vendor rankings, promotion effectiveness, etc. Worksheets can also effectively serve as data sets and connect dynamically. The metadata makes tracking your way back very easy.

Though not yet at the level of a Tableau Software in terms of charts and display options, the conditional logic, rich function library and ability to subdivide a data set (e.g., the first 10,000 rows, or a random 500 rows) make it pretty rich for a version one.
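Those subdivision options map onto familiar operations; a pandas sketch of the two examples (the data frame here is a hypothetical stand-in):

```python
import pandas as pd

# A hypothetical 100,000-row data set
df = pd.DataFrame({"row": range(100_000)})

first_10k = df.head(10_000)                      # the first 10,000 rows
random_500 = df.sample(n=500, random_state=42)   # a reproducible random 500

print(len(first_10k), len(random_500))
```

The fixed `random_state` makes the random sample repeatable, which matters when an analysis built on a subset has to be rerun later.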

Unlike more complex tools that follow the gather-requirements, out-of-sight-development, launch-to-users model, many Lyza applications can be developed in front of the user - or by the user.

Lyza doesn't categorize easily, but I think it's going to find a fit in the large gap between Excel capabilities and data integration - the lair of the true business analyst. With its quasi-EII capabilities to understand source data from its metadata, Lyza fits the unstructured nature of the analyst's work in a modern, heterogeneous corporate information environment.

lyza.gif

Technorati tags: lyza, lyzasoft, Business Intelligence


Posted September 27, 2008 2:33 PM
Permalink | No Comments |

In a very strategic move for Microsoft's enterprise goals, they have just announced the purchase of data warehouse appliance vendor Datallegro!

While Microsoft has significantly expanded SQL Server's scale over the past few years, the perception has been that it sits somewhere below the "big guys," Oracle and IBM. And wherever you believe SQL Server's scalability has grown to, the scale of Microsoft solutions now undoubtedly goes beyond 100 terabytes. This is the scale that many, myself included, believe accessible data management capabilities need to reach in order to manage the future of telecommunications, retail, healthcare and other transactions and make them available.

Look for Microsoft, and others like myself, to publish reference architectures and guidance on the changeover point from SQL Server to Datallegro (or should we start calling it Microsoft MPP?) as well as integration points.

I have found Microsoft's integration of its acquisitions to be well above average in terms of making the most of the acquired products. There are too many data warehouse appliances on the market, and Datallegro was caught up in that frenzy. It has now found its way to a long-term appliance play.

The open source DBMS that Datallegro was using, Ingres, will be scrapheaped over time and replaced by SQL Server. This will take some time, but Microsoft has that. Its customers can now see a plan in action, and that will hold them over for a while. Many customers have settled into the "Microsoft zone" of pricing, which is more than open source (duh) but less than its big competitors. Look for Datallegro, likewise, to sit at the low (but not "no") cost points for its capabilities.

Congratulations to the respective teams.


Posted July 24, 2008 1:25 PM
Permalink | No Comments |

I was looking forward to this presentation. However, I must admit, with the plethora of appliance vendors who have hit the market lately and made their way onto client short-cum-long lists, I was more than happy to dismiss NeoView if this data point did not move the story forward several paces. However, Greg Battas addressed NeoView's lack of market penetration and their 'soft roll out' up front. They spent a full year with customers before the announcement in 2007.

HP, as a company, was losing big deals to IBM and Oracle because those two had full suites. Back in 2004/2005, Tandem (now part of HP) had built an earlier form of NeoView but ultimately didn't go to market with it because they didn't want to compete with Oracle. That's not an issue now.

The first place to test NeoView was at HP itself, where they have, according to Greg, shut down 500 internal databases in a consolidation project.

HP still lacks in the data access space. Obviously, they were looking at BO and Cognos, just as SAP and Oracle were. They are working closely with Ab Initio for ETL, although their philosophy is less "load and analyze" and more "ingest and do things inline." The philosophy, supposedly manifested in the architecture, is very operational-BI-centric.

NeoView is meant to be a "Teradata killer." However, as Greg pointed out, the road is littered with those who claimed to be "better than Teradata" and still, there's Teradata.

Technorati tags: Business Intelligence, Independent Analyst Platform, HP, NeoView


Posted July 14, 2008 7:44 AM
Permalink | 1 Comment |

