

Blog: Steve Dine

If you're looking for a change from analysts, thought leaders and industry gurus, you've come to the right place. Don't get me wrong, many of the aforementioned are colleagues and friends who provide highly intelligent insight into our industry. However, there is nothing like a view from the trenches. I often find that there is what I like to call the "conference hangover": the headache incurred after trying to implement the "best" practices preached to your boss at a recent conference. It is the gap between how business intelligence (BI) projects, programs, architectures and toolsets should work in an ideal world versus the realities on the ground. It's that space between relational and dimensional, or ETL and ELT. This blog is dedicated to sharing experiences, insights and ideas from inside BI projects and programs: what works, what doesn't and what could be done better. I welcome your feedback on what you observe and experience, as well as topics you would like to see covered. If you have a specific question, please email me at sdine@datasourceconsulting.com.

About the author

Steve Dine is President and founder of Datasource Consulting, LLC. He has more than 12 years of hands-on experience delivering and managing successful, highly scalable and maintainable data integration and business intelligence (BI) solutions. Steve is a faculty member at The Data Warehousing Institute (TDWI) and a judge for the Annual TDWI Best Practices Awards. He is the former director of global data warehousing for a major durable medical equipment manufacturer and former BI practice director for an established Denver-based consulting company. Steve earned his bachelor's degree from the University of Vermont and an MBA from the University of Colorado at Boulder.


I recently decided to check out the SaaS BI offering called Bime (http://businessintelligence.me). According to their website, they are focused on delivering a simple-to-use business intelligence product built on the latest innovations in data visualization and cloud computing. More than 25 vendors now offer SaaS BI solutions, ranging from dashboards to predictive analytics. They offer the prospect of a BI solution that is faster to implement, externally managed and lower in upfront cost than traditional enterprise BI suites.

Getting started was surprisingly easy. I created an account by providing a login (my email address), a password and a self-chosen domain. They offer a free account, which limits the user to 2 connections, 2 dashboards and 10MB of storage, or an enterprise account, which allows 20 connections, 20 dashboards and 10GB of storage. For testing purposes, the free account was fine. Once my account was created, I was off and running. Short video tutorials make it easy to get started.

Like most front-end BI applications, development is a three-step process: create a connection to a data source, define a semantic layer and create reports/analyses. Bime simplifies this by creating an initial dimensional semantic layer based on the structure of your data. You can source data from a relational database, MS Excel file, Google spreadsheet, Amazon SimpleDB, XML, Lighthouse, Salesforce or Composite. However, to create a connection to a relational database or Excel spreadsheet, the desktop version of Bime is required, an Adobe Air application that runs the same code as the web version. You can then synchronize the connection to push it to the web-based version.
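To make that auto-generated semantic layer concrete, here is a rough sketch in Python of how a dimensional model can be derived from a table's structure. This is my illustration of the general pattern based on the behavior I observed (text columns become dimensions, numeric columns become measures), not Bime's actual code, and the table and names are hypothetical.

    # Hypothetical sketch: derive a dimensional semantic layer from a
    # table's structure. Text columns become dimensions, numeric columns
    # become measures. Illustrative only -- not Bime's implementation.
    import sqlite3

    def build_semantic_layer(conn, table):
        """Classify each column of `table` as a dimension or a measure."""
        columns = conn.execute(f"PRAGMA table_info({table})").fetchall()
        layer = {"dimensions": [], "measures": []}
        for _cid, name, col_type, *_rest in columns:
            if col_type.upper() in ("INTEGER", "REAL", "NUMERIC"):
                layer["measures"].append(name)
            else:
                layer["dimensions"].append(name)
        return layer

    # Demo against an in-memory sample table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, product TEXT, units INTEGER, revenue REAL)")
    print(build_semantic_layer(conn, "sales"))
    # -> {'dimensions': ['region', 'product'], 'measures': ['units', 'revenue']}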

When you upload data, you have the choice of loading it into Deja Vu, a distributed cache stored in the Amazon cloud. That cache can be updated on a scheduled basis and ensures that each user has access to the same data. It also improves performance by eliminating the need to upload data each time you access a dashboard or analysis. Once the data is cached, the response is very quick.
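For readers unfamiliar with the pattern, the sketch below shows the general idea behind a scheduled, shared cache like Deja Vu. This is an assumption about the common approach, not Bime's documented design: a snapshot is refreshed on a schedule, and every user reads the same snapshot instead of re-uploading data.

    # Minimal sketch of a scheduled, shared cache -- the general pattern
    # behind a feature like Deja Vu, not Bime's actual design.
    import time

    class ScheduledCache:
        def __init__(self, fetch, refresh_seconds):
            self.fetch = fetch                    # callable that pulls fresh data
            self.refresh_seconds = refresh_seconds
            self.snapshot = fetch()               # all users share this snapshot
            self.loaded_at = time.time()

        def get(self):
            # Refresh only when the schedule says so; otherwise serve the
            # cached snapshot, avoiding an upload on every dashboard view.
            if time.time() - self.loaded_at >= self.refresh_seconds:
                self.snapshot = self.fetch()
                self.loaded_at = time.time()
            return self.snapshot

    # Example: refresh at most once per hour.
    cache = ScheduledCache(lambda: [("East", 1200), ("West", 950)], 3600)
    rows = cache.get()  # served from the shared snapshot, so it comes back quickly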

The strengths of Bime are its low barrier to entry, ease of use, advanced visualizations, multi-tenant caching mechanism, web 2.0 interface, collaboration features and low cost. They make it easy for business users to upload data, perform an analysis, save it to a dashboard and share it with others. The data is secure, and you can choose who has access. They also integrate Google Maps into the application to create interactive geospatial charts, a feature that should appeal to business users.

The weaknesses of Bime are its lack of maturity and the 10GB data limit. It lacks ETL capabilities, which can make uploading complex data a challenge. To upload data from a relational database, you need to select a table, select a view or write custom SQL. Once the connection is defined, the attributes in the semantic layer are ordered alphanumerically and cannot be custom ordered. Also, if your underlying database changes, the generated schema is blown away and any modifications are lost. In addition, I ran into a number of bugs in the application that made it difficult to create meaningful analyses. For example, the custom SQL feature didn't seem to work, and column aliases did not import correctly. I also received error messages that the data source had changed and the schema needed to be recreated.
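To show what I mean by the alias problem, here is the kind of custom SQL I'm referring to (a hypothetical query, not the exact one I tested). The aliased names, sales_region and total_revenue, are what should become attribute names in the semantic layer, but they did not import correctly.

    import sqlite3

    # Hypothetical example of the kind of aliased custom SQL that tripped
    # up the import; the alias names are what the semantic layer should use.
    custom_sql = """
        SELECT region AS sales_region, SUM(revenue) AS total_revenue
        FROM sales
        GROUP BY region
    """

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
    cursor = conn.execute(custom_sql)
    print([col[0] for col in cursor.description])
    # -> ['sales_region', 'total_revenue']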

My overall opinion of Bime is favorable. They make it easy to create meaningful analyses and dashboards at a low cost. It is visually appealing, and the collaboration features are a step in the right direction. It is a great option for the mid-market but will likely need time to mature before appealing to larger companies. Features like LDAP integration, ETL capabilities, group-based security, data lineage, web service APIs and drill-through will likely be required before large companies jump on board.


Posted January 20, 2010 1:47 PM