I took a few days off last week and read my first science fiction book in maybe twenty years... and the first computer-science fiction I've ever read. The book was briefly recommended by an attendee at my presentation at DW2013 in Zurich a few weeks ago, a thought inspired (I imagine) by some of my comments about the possibilities that social collaboration could offer to innovation in business. The book in question is "The Circle" by Dave Eggers, which explores the logical and apparently inevitable conclusion of current developments in social networking and search in the world at large. Set in a post-Facebook, post-Google world, the Circle is the all-encompassing company that delivers and controls all aspects of social interaction on the Web. It is moving towards a vision encapsulated by the slogan "Secrets are Lies. Sharing is Caring. Privacy is Theft", crafted by the somewhat clueless heroine, Mae, who goes Transparent (recording everything she sees, hears and says on a live Web feed) but seems totally unaware of the delicious irony of using "bathroom breaks" to conceal activities strictly unrelated to the toilet.
While Mae's unquestioning embrace of all things social may seem a tad naive to many of us old-timers, there is little doubt that many people and organizations believe implicitly in one or more aspects of what the Circle proposes. One example is the explosion of video surveillance, both overt and covert, that is purported to improve citizens' behavior on the basis that increasing the risk of getting caught decreases our propensity to misbehave. Dan Ariely, in "The (Honest) Truth about Dishonesty", begs to differ. According to his research, the Simple Model of Rational Crime, devised by University of Chicago economist and Nobel laureate Gary Becker, which proposes that people commit crimes based on a rational analysis of each situation, is too simplistic. Rather, he believes that our behavior is driven by two conflicting motivations: the desire to see ourselves as honest, honorable people, and the desire to benefit from cheating and gain as much money as possible. However, our cognitive flexibility allows us to rationalize that if we cheat only a little, we can benefit from cheating and still view ourselves as honest. The extent of that "little bit" differs from person to person and determines how we actually behave. Personally, I suspect that this model of human motivation is also too simplistic, but, not being a professor at either Duke or MIT, I don't have the resources to test that hypothesis... My fundamental concern is that the erosion of privacy inherent in ongoing surveillance is being ignored in the misguided belief that society is benefiting from improved behavior.
In my last post, I discussed similar privacy issues that seem to go unnoticed when advances in technology allow our behavior to be monitored while we watch TV, in the interest of targeting advertising or improving monetization of pay-per-view programming.
Of course, exactly the same ethical issues arise as we consider the use of informal information within the enterprise to improve insight into what motivates decision makers and increases their ability to innovate. In my book, I define informal information as information generated as part of every process, during both formal activities (project kick-off and plan review meetings, shareholder and board meetings, court proceedings, etc.) and informal activities (chats at the water cooler, ad hoc meetings, phone calls, conferences attended, and so on), that is increasingly captured and stored digitally. A brief look at the two lists of activities above will show just how many are or could easily be recorded. My belief is that such records can be analyzed to understand how innovative thinking emerges, and is often suppressed, in teams working together on any creative project. In video-conferenced meetings, for example, recording and analysis of facial micro-expressions could show when, how and by whom particular ideas are approved or rejected by team members, and how this affects their proponents. Combine this with spoken and written comments, project success or failure and other information, and we can build over time an understanding of how to construct innovative teams by blending skills and attitudes in the optimal proportions to encourage invention and balance it with the constructive criticism and enthusiastic support needed to convert it to innovation.
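To make the idea concrete, here is a minimal sketch of how such signals might be combined. Everything in it is hypothetical: the data structure, the two signals (a micro-expression score and a comment-sentiment score, each assumed to be normalized to the range -1 to +1), and the equal weighting are all illustrative assumptions, not a description of any real product or method.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Reaction:
    """One team member's reaction to one idea (hypothetical schema)."""
    idea: str
    member: str
    micro_expression_score: float  # assumed -1 (negative) .. +1 (positive)
    comment_sentiment: float       # assumed -1 .. +1, from spoken/written comments

def idea_reception(reactions):
    """Average team reception per idea, blending the two signals equally.

    The 50/50 weighting is an arbitrary illustrative choice; in practice
    weights would be learned from outcomes such as project success.
    """
    scores = defaultdict(list)
    for r in reactions:
        scores[r.idea].append(0.5 * r.micro_expression_score + 0.5 * r.comment_sentiment)
    return {idea: sum(vals) / len(vals) for idea, vals in scores.items()}

# Toy data: two members reacting to two ideas in a recorded meeting.
reactions = [
    Reaction("idea-A", "alice", 0.8, 0.6),
    Reaction("idea-A", "bob", -0.2, 0.0),
    Reaction("idea-B", "alice", -0.5, -0.7),
]
print(idea_reception(reactions))
```

Aggregates like these, tracked over many meetings and joined with project outcomes, are the kind of raw material from which the team-composition insights described above might be derived, which is precisely why the privacy questions matter.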
As a proponent of decision support software, you can probably see the benefits of this, if you really want to look beyond traditional BI. The above thinking is the logical conclusion of analytics, using yet further sets of big data. We get to see not just how participants behave but also gain insight into their subconscious reactions and motivations. And if you think this is science fiction, a recent New York Times article, "When Algorithms Grow Accustomed to Your Face", will convince you otherwise. But be careful how your expression may reveal your reaction, especially if a Webcam is pointing at you!
In terms of ethics and privacy, we are now treading on very dangerous ground. Increasingly, technology is allowing us to record and analyze human motivation and intention. It can be used for good or ill. We must apply some of the innovative thinking that created this technology to figuring out how to control and manage its use.
I will be further exploring the themes and messages of "Business unIntelligence: Insight and Innovation Beyond Analytics and Big Data" over the coming weeks. You can order the book at the link above; it is now available. Also, check out my presentation at the BrightTALK Business Intelligence and Big Data Analytics Summit, recorded Sept. 11 and "Beyond BI is... Business unIntelligence" recorded Sept. 26. Read my interview with Lindy Ryan in Rediscovering BI.
Posted December 4, 2013 2:37 AM