

Blog: Barry Devlin

Barry Devlin

As one of the founders of data warehousing back in the mid-1980s, a question I increasingly ask myself over 25 years later is: Are our prior architectural and design decisions still relevant in the light of today's business needs and technological advances? I'll pose this and related questions in this blog as I see industry announcements and changes in the way businesses make decisions. I'd love to hear your answers and, indeed, questions in the same vein.

About the author

Dr. Barry Devlin is among the foremost authorities in the world on business insight and data warehousing. He was responsible for the definition of IBM's data warehouse architecture in the mid '80s and authored the first paper on the topic in the IBM Systems Journal in 1988. He is a widely respected consultant and lecturer on this and related topics, and author of the comprehensive book Data Warehouse: From Architecture to Implementation.

Barry's interest today extends to the wider field of a fully integrated business, covering informational, operational and collaborative environments and, in particular, how to present the end user with a holistic experience of the business through IT. These aims, and a growing conviction that the original data warehouse architecture struggles to meet modern business needs for near real-time business intelligence (BI) and support for big data, drove Barry's latest book, Business unIntelligence: Insight and Innovation Beyond Analytics, now available in print and eBook editions.

Barry has worked in the IT industry for more than 30 years, mainly as a Distinguished Engineer for IBM in Dublin, Ireland. He is now founder and principal of 9sight Consulting, specializing in the human, organizational and IT implications and design of deep business insight solutions.


Recently in Internet of Things Category

Thoughts on the societal impact of the Internet of Things inspired by a unique dashboard product.

A newcomer to the BBBT on 2nd May, Kerry Gilger, founder of VisualCue, took the members by storm with an elegant, visually intuitive and, to me at least, novel approach to delivering dashboards. VisualCue is based on the concept of a tile that represents a set of metrics as icons, colored according to their state relative to defined threshold values. The main icon in the tile shown here represents the overall performance of a call center agent, with the secondary icons showing other KPIs, such as total calls answered, average handling time, sales per hour worked, customer satisfaction, etc. Tiles are assembled into mosaics, which function rather like visual bar charts that can be sorted according to the different metrics, drilled down to related items and displayed in other formats, including tabular numbers.
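
To make the tile idea concrete, here is a minimal sketch of the threshold-to-color mapping it describes. The metric names, values and thresholds are invented for illustration; this is not VisualCue's actual data model or API, just the general technique.

```python
# Hypothetical sketch: map each metric to a colored state via thresholds.
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    value: float
    good: float  # at or above this threshold -> "green"
    warn: float  # at or above this (but below good) -> "amber", else "red"

    def state(self) -> str:
        if self.value >= self.good:
            return "green"
        return "amber" if self.value >= self.warn else "red"

# One call center agent's tile: a headline KPI plus secondary metrics.
# (Lower-is-better metrics, e.g. average handling time, would invert the test.)
tile = [
    Metric("sales_per_hour", 3.2, good=4.0, warn=2.5),
    Metric("calls_answered", 95, good=90, warn=70),
    Metric("customer_satisfaction", 0.78, good=0.85, warn=0.70),
]

for m in tile:
    print(f"{m.name}: {m.state()}")
# A mosaic is then simply a collection of tiles, sortable on any metric.
```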

The product seems particularly useful in operational BI applications, with Kerry showing examples from call centers, logistics and educational settings. The response of the BBBT members was overwhelmingly positive. @rick_vanderlans described it as "revolutionary technology", while @gildardorojas asked "why we didn't have before something as neat and logical?" @marcusborba opined "@VisualCue's capability is amazing, and the data visualization is gorgeous!"

So, am I being a Luddite, or even a curmudgeon, to have made the only negative comments of the call? My concern was not about the product at all, but rather around the power it unleashes simply by being so good at what it does. Combine this level of ease-of-use in analytics with big data and, especially, data from the Internet of Things, and we take a quantum leap from measurement to invasiveness, from management to Big-Brother-like control.

Each of the three use cases described by Gilger provided wonderful examples of real and significant business benefit; but, taken together, they also opened up appalling possibilities of abuse of privacy, misappropriation of personal information and disempowerment of the people involved. I'll briefly explore the three examples, realizing that in the absence of the full story, I'm undoubtedly imagining some aspects. Nor is this about VisualCue (who tweeted that "Privacy is certainly a critical issue! We focus on presenting data that an organization already has--maybe we make it obvious") or the companies using it; it's meant as a warning that we who know some of the possibilities--positive and negative--offered by big data analytics must consider the unintended consequences in advance.

Detailed monitoring of call center agents' performance is nothing new. Indeed, it is widely seen as best practice and key to improving both individual and overall call center results. VisualCue, according to Gilger, has provided outstanding performance gains, including one center where agents in competition with peers have personally sought out training to improve their own metrics, something that is apparently unheard of in the industry. Based on past best practices and detailed knowledge of where the agent is weak, VisualCue can provide individually customized advice. In a sense, this example illustrates the pinnacle of such use of monitoring data and analytics to drive personnel performance. But within it lie the seeds of its own destruction. As the agent's job is broken down more and more into repeatable tasks, each measurable by a different metric, human innovation and empathy are removed and the job is prepared for automation. In fact, a 2013 study puts at 99% the probability that certain call center jobs, particularly telemarketing, will soon be eliminated by technology.

The old adage "what you can't measure, you can't manage" is at the heart of traditional BI. In an era when data was scarce and often incoherent, this focus made sense. However, applying it to all aspects of life today is, to me, ethically problematic. The example of monitoring the entire scope of an educational institution in a single dashboard--from financials through administration to student performance--is a case where our ability to analyze so many data points leads to the illusion that we can manage the entire process mechanically. The Latin root of "educate" means "to draw forth" from the student, the success of which simply cannot be gauged through basic numerical measures, and is certainly not correlated with the business measures of the institution.

The final example, tracking the operational performance of a waste management company's routes, trucks and drivers, emphasizes our growing ability to measure and monitor the details of real life minute by minute. By continuously tracking the location and engine management signals from its trucks, the dashboard created by this company enabled it to make significant financial savings and improvements to its operational performance. However, it also enables supervisors to drill into the ongoing behavior of the company's drivers: deviations from planned routes, long stops with the engine running, extreme braking, exceeding the speed limit, etc. While presumably covered by their employment contract, such micromanagement of employees is at best disempowering and at worst open to abuse by increasingly all-seeing supervisors. Of much greater concern is the fact that these sensors are increasingly embedded in private automobiles and that such tracking capability is already being applied without owners' consent to smartphones. As long as a year ago, Euclid Analytics had already tracked about 50 million devices in 4,000 locations, according to a New York Times blog.
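
As an illustration of the mechanics, the sketch below flags the kinds of driver events mentioned above from a stream of telemetry readings. The field names and thresholds are assumptions made up for this example, not those of any actual fleet management product.

```python
# Hypothetical rule-based flagging of events in truck telemetry.
SPEED_LIMIT_KMH = 80.0    # assumed route speed limit
IDLE_LIMIT_S = 300        # assumed cap on stationary engine-running time
HARSH_BRAKE_MS2 = -6.0    # assumed deceleration threshold for extreme braking

def flag_events(readings):
    """Yield (event, reading) pairs; each reading is a dict of sensor values."""
    for r in readings:
        if r["speed_kmh"] > SPEED_LIMIT_KMH:
            yield ("speeding", r)
        if r["accel_ms2"] <= HARSH_BRAKE_MS2:
            yield ("extreme_braking", r)
        if r["idle_s"] > IDLE_LIMIT_S:
            yield ("long_idle", r)

sample = [
    {"speed_kmh": 92.0, "accel_ms2": -1.2, "idle_s": 0},   # speeding
    {"speed_kmh": 45.0, "accel_ms2": -7.5, "idle_s": 0},   # extreme braking
    {"speed_kmh": 0.0, "accel_ms2": 0.0, "idle_s": 420},   # long idle stop
]
for event, reading in flag_events(sample):
    print(event, reading)
```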

I'm grateful to Kerry Gilger for sharing the use cases that inspired my speculations above. Of course, my point is beyond the individual companies involved and products used. At issue is the range of social and ethical dilemmas raised by the rapid advances in sensor technology, data gathered and the power of analytic software. Our every action online is already monitored by the likes of Google and Facebook for profit and by organizations like the NSA allegedly for security and crime prevention. The level of monitoring of our physical lives is now rapidly increasing. Anonymity is rapidly disappearing, if not already extinct. Our personal privacy rights are being usurped by the data gathering and analysis programs of these commercial and governmental organizations, as eloquently described by Shoshana Zuboff of Harvard Business and Law schools in a recent article in Frankfurter Allgemeine Zeitung.

It is imperative that those of us who have grown up with and nurtured business intelligence over the past three decades--from hardware and software vendors, to consultants and analysts, to BI managers and implementers in businesses everywhere--begin to deeply consider the ethical, legal and societal issues now being raised. We must then act to guide the industry and society appropriately, through the development of new codes of ethical behavior and use of information, and through input to national and international legislation.


Posted May 4, 2014 6:26 AM
Parts 1, 2, 3, 4 and 4A of this series explored the problem as I see it. Now, finally, I consider what we might do if my titular question actually makes sense.

To start, let's review my basic thesis. Mass production and competition, facilitated by ever improving technology, have been delivering better and cheaper products and improving many people's lives (at least in the developed world) for nearly two centuries. Capital, in the form of technology, and people--labor--work together in today's system to produce goods that people purchase using earnings from their labor. As technology grows exponentially better, an ever greater range of jobs is open to displacement. When technology displaces some yet-to-be-determined percentage of labor, this system swings out of balance; there are simply not enough people with sufficient money to buy the products made, no matter how cheaply. We have not yet reached this tipping point because, throughout most of this period, the new jobs created by technology have largely offset the losses. However, employment trends in the past 10-15 years in the Western world suggest that this effect is no longer operating to the extent that it was, if at all.

In brief, the problem is that although technology produces greater wealth (as all economists agree), without its transfer to the masses through wages paid for labor, the number of consumers becomes insufficient to justify further production. The owners of the capital assets accumulate more wealth--and we see this happening in the increasing inequality in society--but they cannot match the consumption of the masses. Capitalism, or perhaps more precisely, the free market then collapses.
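
The dynamic can be sketched as a toy simulation. Every number below is a loudly invented assumption, chosen only to show the direction of the feedback, not a calibrated economic model.

```python
# Toy model: automation displaces jobs; lost wages cut consumption;
# producers respond to falling demand with further job cuts.
employment = 1.0          # fraction of the workforce employed
AUTOMATION_RATE = 0.03    # share of jobs automated away per period (assumed)
NEW_JOB_RATE = 0.01       # share of new jobs created by technology (assumed)
FEEDBACK = 0.5            # secondary cuts per unit of lost consumption (assumed)

for period in range(1, 21):
    displaced = employment * (AUTOMATION_RATE - NEW_JOB_RATE)
    secondary = FEEDBACK * displaced   # demand falls, so production is cut too
    employment -= displaced + secondary
    print(f"period {period:2d}: employment share {employment:.3f}")
# While job creation lags automation, employment--and with it consumption
# and production--ratchets downward: the imbalance described above.
```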

Let's first look at the production side of the above equation. What can be done to prevent job losses outpacing job creation as a result of technological advances? Can we prevent or put a damper on the great hollowing out of middle-income jobs that is creating a dumbbell-shaped distribution of a few highly-paid experts at one end and a growing swathe of lower-paid, less-skilled workers at the other? Can (or should) we move to address the growing imbalance of power and wealth between capital (especially technologically based) and labor? Let's be clear at the start, however: turning off automation is not an option I consider.

My suggestions, emerging from the thinking discussed earlier, are mainly economic and social in nature. An obvious approach is to use the levers of taxation--as is done in many other areas--to drive a desired social outcome. We could, for example, reform taxation and social charges on labor to reduce the cost difference between using people and automating a process. In a similar vein, shifting taxation from labor to capital could also be tried. I can already hear the Tea Party screaming to protect the free market from the damn socialists. But, if my analysis is correct, the free market is about to undergo, at best, a radical change if employment drops below some critical level. Pulling these levers soon and fairly dramatically is probably necessary, though such an approach can only delay the inevitable. Another approach is for industry itself to take steps to protect employment. Mark Bonchek, writing in a recent Harvard Business Review blog, describes a few "job entrepreneurs" who maximize jobs instead of profits (but still make profits as well), including one in the Detroit area aimed at creating jobs for unemployed auto workers.

Moving from the producer's side to the consumer's view, profit aside, why did we set off down the road of the Industrial Revolution? To improve people's daily lives, to lessen the load of hard labor, to alleviate drudgery. The early path was not clear. Seven-day labor on the farm was replaced by seven-day labor in the factory. But, by the middle of the last century, working hours were being reduced in the workplace and in the home, food was cheaper and more plentiful; money and time were available for leisure. In theory, the result should have been an improvement in the human condition. In practice, the improvement was subverted by the mass producers. They needed to sell ever more of the goods they could produce so cheaply that profit came mainly through volume sales. Economist Victor Lebow's 1955 proclamation of "The Real Meaning of Consumer Demand" sums it up: "Our enormously productive economy demands that we make consumption our way of life... that we seek our spiritual satisfaction and our ego satisfaction in consumption... We need things consumed, burned up, worn out, replaced and discarded at an ever-increasing rate". Of course, some part of this is human nature, but it has been driven inexorably by advertising. We've ended up in the classic race to the bottom, even to the extent of products being produced with ever shorter lifespans to drive earlier replacement. Such consumption is becoming increasingly unsustainable as the world population grows, finite resources run out and the energy consumed in both production and use drives increasing climate change. As the president of Uruguay, Jose Mujica, asked of the Rio+20 Summit in 2012, "Does this planet have enough resources so seven or eight billion can have the same level of consumption and waste that today is seen in rich societies?"

My counter-intuitive suggestion here, and one I have not seen raised by economists (surprisingly?), is to ramp down consumerism, mainly through a reinvention of the purposes and practices of advertising. Reducing over-competition and over-consumption would probably drive interesting changes in the production side of the equation, including reduced demand for further automation, lower energy consumption, product quality being favored over quantity, higher savings rates by (non-)consumers, and more. Turning down the engine of consumption could also enable changes for the better in the financial markets, reducing the focus on quarterly results in favor of strategically sounder investment. Input from economists would be much appreciated.

But, let's wrap up. The title of this series asked: will automation through big data and the Internet of Things drive the death of capitalism? Although some readers may have assumed that this was my preferred outcome, I am more of the opinion that capitalism and the free market need to evolve rather quickly if they are to survive and, preferably, thrive. But this would mean some radical changes. For example, a French think-tank, LH Forum, suggests the development of a positive economy that "reorients capitalism towards long-term challenges. Altruism toward future generations is a much more powerful incentive than [the] selfishness which is supposed to steer the market economy". Other fundamental rethinking comes from the British historian Niall Ferguson, who takes a wider view of "The Great Degeneration" of Western civilization. In a word, this is a topic that requires broad, deep and urgent thought.

For my more IT-oriented readers, I suspect this blog series has taken you far from familiar ground. For this, I do not apologize. As described in Chapter 2 of "Business unIntelligence", I believe that the future of business and IT is to be joined at the hip. The biz-tech ecosystem declares that technology is at the heart of all business development. Business must understand IT. IT must be involved in the business. I suggest that understanding the impact of automation on business and society is a task for IT strategists and architects as much as, if not more than, it is for economists and business planners.

Image from: www.moonbattery.com/archives/2010/07/prehistoric-cli.html. All elephant photos in the earlier posts are my own work!


Posted March 11, 2014 6:47 AM
As seen in Parts 1, 2 and 3, mainstream economists completely disagree with my thinking. Am I alone? It turns out that I'm not...

Surely I'm not the first to think that today's technological advances have the potential to seriously disrupt the current market economy? I eventually found that Martin Ford, founder of an unnamed software development company in Silicon Valley, wrote a book in 2009: "The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future". It turns out that his analysis of the problem is exactly the same as mine... not to mention more than four years ahead of me! Ford explains it very simply and compellingly: "the free market economy, as we understand it today, simply cannot work without a viable labor market. Jobs are the primary mechanism through which income--and, therefore, purchasing power--is distributed to the people who consume everything the economy produces. If at some point, machines are likely to permanently take over a great deal of the work now performed by human beings, then that will be a threat to the very foundation of our economic system."

For the more visually minded, Ford graphs capability to perform routine jobs against historical time for both humans and computer technology. I reproduce this simple graph here. Ford posits that there was a spurt in human capability after the Industrial Revolution as people learned to operate machines, but that this has now largely leveled off. As a general principle, this seems reasonable. Computer technology, on the other hand, is widely accepted (via Moore's Law) to be on a geometric growth path in terms of its general capability. Unless one or both trends change dramatically, the cross-over of these two lines is inevitable. When technology becomes more efficient than a human at any particular job, competitive pressure in the market will ensure that the former replaces the latter. At some stage, the percentage of human jobs and their associated income that is automated away will be enough to disrupt the consumption side of the free market. Even increased uncertainty about income is often sufficient to cause consumers to increase savings and reduce discretionary spending. This behavior occurs in every recession and affects production in a predictable manner: production is cut back, often involving further job losses. A positive feedback cycle results: reduced jobs drive reduced spending, which drives further job losses. In today's cyclical economy, the trend is eventually reversed, sometimes through governmental action, or at times by war--World War II is credited by some with ending the Great Depression. However, the graph above shows no such cyclical behavior: this is a one-way, one-time transition.
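
Ford's two curves invite a back-of-the-envelope calculation. The sketch below assumes a flat human capability line and machine capability doubling every 18 months (the Moore's Law pacing discussed in the next paragraph); the starting values are arbitrary assumptions, so only the shape of the conclusion matters.

```python
import math

# Assumed capabilities in arbitrary units; only the ratio matters.
human = 100.0             # flat human capability line (assumed)
machine0 = 10.0           # machine capability today (assumed)
DOUBLING_MONTHS = 18      # assumed doubling period for machine capability

# Solve machine0 * 2**(t / DOUBLING_MONTHS) = human for t.
t_cross = DOUBLING_MONTHS * math.log2(human / machine0)
print(f"lines cross in about {t_cross:.0f} months (~{t_cross / 12:.1f} years)")
# Whatever the starting gap, a flat line and an exponential must cross;
# the assumptions change only the date, not the outcome.
```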

Of course, the 64 million dollar questions (assuming you agree with the reasoning above) are: where are we on this timeline, and how steep is the geometric rise in technological capability? It is likely that both aspects differ depending on the job involved. For some jobs, we are far from the inflection point in the technology curve, while others are much closer. For information-based jobs, the rate of growth in capability of computers may be very close to the Moore's Law formulation: a doubling in capacity every 18 months. Physical automation may grow more slowly. But the outcome is assured: the lines will cross. Ford felt that in some areas we were getting close to the inflection point in 2009. The presumed approximate quadrupling of technological ability since then has not yet, however, tipped us over the edge of the job cliff, although few would dispute the extent of the technological advances in the interim. Of course, Ford--and I--may be wrong.

It would seem that the hypothesis put forward by Ford should be amenable to mathematical modeling, but I have found only one attempt to do so, in an academic paper, "Economic growth given machine intelligence", published in 1999 by Robin Hanson. Given the title, I hoped that this paper might provide some usable mathematical models capable of answering my questions. Unfortunately, I was disappointed. My mathematical skills are no longer up to the equations involved! More importantly, however, Hanson's framing assumptions seem strong on historical precedent (surely favoring continuation of the current situation) and fail to address the fundamental issue (in my view) that consumption is directly tied to income and its distribution among the population. Furthermore, Hanson has taken a largely dismissive attitude to Ford's thesis, as demonstrated by his advice to Ford in an online exchange: "he needs to learn some basic economics".

So, the hypothesis that I and, previously, Ford have put forward so far remains both intuitively reasonable and formally unproven. For now, I ask: can any data scientist or economics major take on the task of producing a useful model of our basic hypothesis?

In the final part of the series, I look at what a collapse of the current economic order might look like and ask what, if anything, might be done to avert it.

For a broader and deeper view of the business and technological aspects of this topic, please take a look at my new book: Business unIntelligence: Insight and Innovation Beyond Analytics and Big Data.  

A bonus Part 4A follows!


Posted February 25, 2014 3:34 AM
Part 1 introduced the elephant in the room: a few of the ways that current technological advances affect jobs and society.  Part 2 explored the economics. Here I dissect some further recent economic thinking.

As we saw in my previous blog, the topic of the impact of technology on jobs has generated increased interest in the past few months. Tyler Cowen's Sept. 2013 "Average is Over" was one of the earlier books (of this current phase) that addressed this area. The pity is that, in the long run, he avoids the core of the problem as I see it: if technology is replacing current jobs and failing to create new ones in at least equal numbers, who will be the consumers of the products of the new technology?

In the early part of his book, Cowen builds the hypothesis that technology has been impacting employment for some decades now, creating a bifurcation in the job market. As the jobs that can be partially or fully automated expand in scope and number, people with strong skills that support, complement or expand technology will find relatively rich pickings. Those with more average technologically oriented skills do less well. If their skills are relatively immune to technological replacement--from building laborers to masseurs--they will continue to be in some demand, although increased competition will likely push down wages. However, the key trend is that middle-income jobs susceptible to physical or informational automation have been disappearing and continue to do so. 60% of US job losses in the 2007-2009 recession (choose your dates and name) were in the mid-income categories, while three-quarters of the jobs added since then have been low-income. The switch to low-paying jobs dates back to 1999; it's a well-established trend.

The largely unmentioned category in Cowen's book, beyond a nod to the figures on page one, is the growing number of young, long-term unemployed or underemployed, even among college graduates. I suggest we are actually seeing a trifurcation in the jobs market, with the percentages of this third category growing in the developed world and the actual numbers--due to population growth--exploding in the emerging economies. I don't have easy access to the statistics. However, given that the income of the first two categories, whether directly or indirectly, must support the third (as well as the young, the elderly and the ill), parts of the developed world seem pretty close already to the relative percentages at which the system could break down. Current ongoing automation of information-based jobs may well tip the balance.

Cowen's prescription for the future is profoundly depressing, even in the US context, at which the book is squarely aimed. To paraphrase, tax the wealthy and capital more. Just a little. Reduce benefits to the poor. Maybe a little more. And let both the poorly paid and unemployed move to cheaper accommodation far from expensive real estate. "...in a few parts of... the warmer states... [w]e would build some 'tiny homes'... about 400 square feet and cost in the range of $20,000 to $40,000. ... very modest dwellings, as we used to build in the 1920s. We also would build some makeshift structures, similar to the better dwellings you might find in a Rio de Janeiro favela. The quality of the water and electrical infrastructure might be low by American standards, though we could supplement the neighborhood with free municipal wireless (the future version of Marie Antoinette's famous alleged phrase will be 'Let them watch internet!')."

If your response is "look what happened to Marie Antoinette", Cowen declares that, given the age demographic of the US, revolution is an unlikely outcome. Consider, however, that this is a US view. In much of the developing world, the great transition from Agriculture to Manufacturing is only now underway. Some of that is supported by off-shoring from the West, some by local economic and population growth. However, robotic automation is already on the increase even in those factories. Foxconn, long infamous as Apple's iPhone hardware and assembly "sweatshop", announced as far back as 2011 that it planned to install up to 1 million robots in its factories by this year. A year ago this week, it announced: "We have canceled hiring entry-level workers, a decision that is partly associated with our efforts in production automation."

Last week, the Wall Street Journal reported that Foxconn is working with Google with the aim of bringing Google's latest robotics acquisitions to the assembly lines to reduce labor costs. Google also benefits: it gains a test bed for its purported new robotic operating system as well as access to Foxconn's mechanical engineering skills. According to PCWorld, Foxconn's chairman, Terry Gou, told investors in June last year: "We have over 1 million workers. In the future we will add 1 million robotic workers. Our [human] workers will then become technicians and engineers." The question of how many of the million human workers would become technicians and engineers seemingly went unaddressed. Echoing comments in Part 2 of this series, Foxconn is also reported to be looking to next-shore automated manufacturing to the US. To complete the picture, Bloomberg BusinessWeek meanwhile reported that Foxconn was also off-shoring jobs from China to Indonesia. A complex story, but the underlying driver is obvious: reduce labor, and then distribution, costs by any and every means possible.

Cowen's comments above about favelas, Marie Antoinette and revolutions should thus be seen in a broader view. The Industrial era middle class boom of the West may pass quickly through the emerging economies or perhaps bypass the later arrivals entirely. Shanty towns are already widespread in emerging economies; they house the displaced agricultural workers who cannot find or have already lost employment in manufacturing. The age demographic in these countries is highly compatible with revolution and migration/invasion. For reasons of geography, the US may be less susceptible to invasion, but Europe's borders are under increasing siege. The Roman Empire's latter-day bread and circuses were very quickly overrun by the Vandals and the Visigoths.

The numbers and percentages of new jobs vs. unemployed worldwide are still up for debate among the economists. But their belief, echoed by the larger multinationals, that the consumer boom in the West of the 20th century will be replicated in the emerging economies stands perhaps on shaky ground.

For a broader and deeper view of the business and technological aspects of this topic, please take a look at my new book: Business unIntelligence: Insight and Innovation Beyond Analytics and Big Data.

Part 4 follows.


Posted February 18, 2014 7:40 AM
Part 1 introduced the elephant in the room: a few of the ways that current technological advances affect jobs and society. Here, I probe deeper into the economics.

We in the world of IT and data tend to focus on the positive outcomes of technology. Diagnoses of illness will be improved: Wired reports that, in tests, IBM Watson's successful diagnosis rate for lung cancer is 90%, compared to 50% for human doctors, according to WellPoint's Samuel Nussbaum. We celebrate our ability to avoid traffic jams based on smartphone data collection. We imagine how we can drive targeted advertising; a trial (since discontinued) in London in mid-2013 of recycling bins that monitored passing smartphone WiFi addresses to eventually push product is but one of the more amusing examples. Big data can drive sustainable business, reducing resource usage and carbon footprint both within the enterprise and to the ends of its supply chains. Using a combination of big data, robotics and the IoT, Google already has prototype driverless cars on the road. Its recent acquisitions of Boston Dynamics (robotics), Nest (IoT sensors) and DeepMind Technologies (artificial intelligence), to name but a few, indicate the direction of its thinking. The list goes on. Mostly, we see the progress but are largely blind to the downsides.

The elephant in the room seems particularly perplexing to politicians, economists and others charged with taking a macro-view of where human society is going. Perhaps they feel some immunity to the excrement pile that is surely coming. But an important question must be: what will happen in economic and societal terms when significant numbers of workers in a wide range of industries are displaced fully or partially by technology, both physical and information-based? To a non-economist like me, the equation seems obvious: fewer workers means fewer consumers means less production means fewer workers. A positive feedback loop of highly negative factors for today's business models. Sounds like the demise of capitalism. Am I being too simplistic?

A recent article in the Economist, "The Future of Jobs: The Onrushing Wave", explores the widely held belief among economists that technological advances drive higher living standards for the population at large. The belief is based on historical precedent, most particularly the rise of the middle classes in Europe and the US during the 20th century. Anyone who opposes this consensus risks being labeled a Luddite, after the craft workers in the 19th century English textile industry who attacked the machines that were destroying their livelihoods. And perhaps that is why the Economist, after exploring the historical pattern at some length and touching on many of the aspects I've mentioned earlier, concludes rather lamely, in my opinion: "[Keynes's] worry about technological unemployment was mainly a worry about a 'temporary phase of maladjustment' as society and the economy adjusted to ever greater levels of productivity. So it could well prove. However, society may find itself sorely tested if, as seems possible, growth and innovation deliver handsome gains to the skilled, while the rest cling to dwindling employment opportunities at stagnant wages."

 "Sorely tested" sounds like a serious understatement to me. The Industrial Revolution saw a mass movement of jobs from Agriculture to Manufacturing; the growth of the latter largely offset the shrinkage in Agricultural employment due to mechanization. In the Western world, the trend in employment since the 1960's has been from Manufacturing to Services. Services has compensated for the loss of Manufacturing jobs to both off-shoring and automation. But, as Larry Summers, former US Treasury Secretary, mentioned in the debate on "Rethinking Technology and Employment" at the World Economic Forum in Davos in January, the percentage of 25-54 year old males not working in the US will have risen from 5% in 1965 to an estimated near 15% in 2017. This trend suggests strongly that the shrinkage in Manufacturing is not being effectively taken up elsewhere. The Davos debate itself lumbered to a soporific draw on the motion that technological innovation is driving jobless growth. Prof. Erik Brynjolfsson, speaking with Summers in favor of the motion, offered that "off-shoring is just a way-station on the road to automation", a theme echoed by the January 2014 McKinsey Quarterly "Next-shoring: A CEO's guide". Meanwhile, Brynjolfsson's latest book, with Andrew McAfee, seems to limit its focus to the quality of work in the "Second Machine Age" rather than its actual quantity.
 
As in the tale of the blind men and the elephant, it seems that we are individually focusing only on small parts of this beast.

For a broader and deeper view of the business and technological aspects of this topic, please take a look at my new book: Business unIntelligence: Insight and Innovation Beyond Analytics and Big Data.  

Part 3 follows.
 

Posted February 10, 2014 11:11 AM
