Blog: William McKnight

William McKnight

Hello and welcome to my blog!

I will periodically be sharing my thoughts and observations on information management here in the blog. I am passionate about the effective creation, management and distribution of information for the benefit of company goals, and I'm thrilled to be a part of my clients' growth plans and to connect what the industry provides to those goals. I have played many roles, but the perspective I come from is benefit to the end client. I hope these entries can be of some modest benefit to that goal. Please share your thoughts and input on the topics.

About the author

William is the president of McKnight Consulting Group, a firm focused on delivering business value and solving business challenges utilizing proven, streamlined approaches in data warehousing, master data management and business intelligence, all with a focus on data quality and scalable architectures. William functions as strategist, information architect and program manager for complex, high-volume, full life-cycle implementations worldwide. William is a Southwest Entrepreneur of the Year finalist and a frequent best-practices judge; he has authored hundreds of articles and white papers and given hundreds of international keynotes and public seminars. His team's implementations from both IT and consultant positions have won Best Practices awards. He is a former IT Vice President of a Fortune company, a former software engineer, and holds an MBA. William is the author of the book 90 Days to Success in Consulting. Contact William at wmcknight@mcknightcg.com.

Editor's Note: More articles and resources are available in William's BeyeNETWORK Expert Channel. Be sure to visit today!

April 2012 Archives

The story goes that Willie Sutton robbed banks because "that's where the money is." While this attribution appears to be an urban legend, it's no myth that Oracle has the lion's share of databases - both transactional and analytic.

IBM started the land grab for Oracle customer conversions early by building a high degree of PL/SQL compatibility into the DB2 database.

Now, Teradata has invested resources in facilitating the migration away from Oracle. With the Teradata Migration Accelerator (TMA), structures and SQL (PL/SQL) code can be converted to Teradata structures and code. This is a different philosophy from IBM's, which requires few code changes for the move but also doesn't immediately optimize that code for DB2.

While data definition language (DDL) changes only slightly from DBMS to DBMS - for example, in how keywords are quoted - Teradata's key activity and opportunity in the migration is converting Oracle cursors to Teradata set-based SQL.
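To make that concrete, here is a rough, hypothetical sketch of the kind of conversion involved (the table and column names are mine, and this is not actual TMA output). An Oracle cursor that processes rows one at a time:

    -- Oracle PL/SQL: row-at-a-time cursor processing
    BEGIN
      FOR r IN (SELECT order_id, amount FROM orders WHERE status = 'OPEN') LOOP
        UPDATE orders
           SET amount = r.amount * 0.9
         WHERE order_id = r.order_id;
      END LOOP;
    END;

becomes a single set-based statement on the Teradata side:

    -- Set-based equivalent, which is what the migration aims for
    UPDATE orders
       SET amount = amount * 0.9
     WHERE status = 'OPEN';

The set-based form lets the optimizer and the parallel architecture do the work, rather than shipping one row at a time through procedural logic.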

"Rule sets" - for how to do conversions - can be applied selectivity across the structure and code in the migration.  TMA supports selective data movement, if desired, with WHERE clauses for the data.  TMA also supports multiple users doing a coordinated migration effort.

TMA also works for DB2 migrations.

While it will not do the trick on its own, having tools like these - which can convince a shop that the move could be more pain-free than originally thought - will support DBMS migrations.


Posted April 30, 2012 10:15 AM

Teradata Aster demonstrates its graphical "pathing" capabilities very nicely by showing the relationships between tweeters and their tweets at events, like the Teradata Third-Party Influencers Event I attended last week. 

The demonstration shows how to produce some measure of sentiment for the event, but more importantly it demonstrates relationships and influence power. Customer relationships and influence power are becoming part of the set of derived data needed to fully understand a company's customers. This leads to the identification of engagement models and the early identification of patterns of activity that lead to certain events - desired or otherwise.

One important point noted by Stephanie McReynolds, Director of Product Marketing at Teradata Aster, was that the sphere of relevant influence depends on the situation. You can retweet hundreds of tweets, many from tweeters you do not even know. However, when buying a car, only a handful of people would influence you.

One would need to take more heed of an influencer's opinion - or that of someone with a relationship to the influencer. It can become quite a layered analysis, and influence power is hard to measure. Grabbing various digital breadcrumbs is relatively easy, but is it indicative of influence? Likewise, is a tweetstream indicative of the sentiment of an event? I'm not sure. It may not even be indicative of the sentiment of the tweeters. Digital is only a start. The worlds of third-party data, real sentiment analysis and possibly sensor data are coming together.
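As a rough sketch of what grabbing those breadcrumbs looks like (a hypothetical retweets table and plain SQL, not Aster's graph functions), a first-cut "influence" score is just a count of the distinct people who amplify an author:

    -- Hypothetical table: retweets(retweeter_id, original_author_id, tweet_id, event_tag)
    SELECT original_author_id,
           COUNT(DISTINCT retweeter_id) AS amplifier_count,
           COUNT(*)                     AS total_retweets
    FROM retweets
    WHERE event_tag = 'influencers_event'   -- hypothetical event tag
    GROUP BY original_author_id
    ORDER BY amplifier_count DESC;

Whether a number like amplifier_count actually measures influence is exactly the open question above.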


Posted April 24, 2012 11:17 AM

Teradata rolled out Teradata Data Labs (TDL) in Teradata 14. Though it is not a high-profile enhancement, it is worth understanding not only for Teradata data warehouse customers but for all data warehouse programs, as a signal of how program architectures now look. Teradata Data Labs supports how customers are actually playing with their resource allocations in production environments in an effort to support more agile practices under more control by business users.

TDL is part of Teradata Viewpoint, a portal-based system management solution. TDL is used to manage "analytic sandboxes" for these non-traditional builders of data systems. Companies can allocate a percentage of overall disk and other resources to the lab area, and the associated authorities can be managed with TDL. By creating "data labs" and assigning them to requesting business users, TDL minimizes the potential dangers of the can of worms that has long been open: production create, alter and delete activity - not just select activity - by business users.

These sandboxes must be managed since resources are limited. Queries can be limited, various defaults can be set and, obviously, disk space is capped for each lab. Expiration dates can be placed on the labs, not unlike how a public library works; timeframes span from a week to a year. Users may also send a "promotion" request to the labs manager, asking that entities within the lab be moved out of labs and into production.

Data labs can be joined to data in the regular data warehouse.  One Teradata customer has 25% of the data warehouse space allocated to TDL.

TDL can support temporary processing needs with strong resources - not what is usually found in development environments.  I can also see TDL supporting normal IT development.  Look into TDL, or home-grow the idea within your non-Teradata data warehouse environment.  It's an idea whose time has come.
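For the home-grown route, a minimal sketch in Teradata-style SQL might look like the following (database, user and registry names are illustrative; TDL itself manages this through Viewpoint rather than raw DDL): carve out a capped lab database, grant the requesting user rights inside it, and record an expiration date.

    -- Carve a capped sandbox out of the warehouse's space (size is illustrative).
    CREATE DATABASE sandbox_marketing FROM dw_prod
      AS PERM = 500000000000;              -- roughly 500 GB lab allocation

    -- Let the requesting business user create and drop objects inside the lab only.
    GRANT CREATE TABLE, DROP TABLE ON sandbox_marketing TO jsmith;
    GRANT SELECT ON dw_prod TO jsmith;     -- labs can still join to warehouse data

    -- Track an expiration date so the lab can be reclaimed, library-style.
    INSERT INTO dw_admin.lab_registry (lab_name, owner, expires_on)
    VALUES ('sandbox_marketing', 'jsmith', ADD_MONTHS(CURRENT_DATE, 3));

Promotion out of the lab is then a deliberate, reviewed copy of the lab's objects into the production schema rather than something that happens by default.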

TDL is backward-compatible to Teradata 13.


Posted April 17, 2012 9:38 AM

