

Blog: Barry Devlin

Barry Devlin

As one of the founders of data warehousing back in the mid-1980s, I increasingly ask myself, more than 25 years later: are our prior architectural and design decisions still relevant in the light of today's business needs and technological advances? I'll pose this and related questions in this blog as I see industry announcements and changes in the way businesses make decisions. I'd love to hear your answers and, indeed, questions in the same vein.

About the author

Dr. Barry Devlin is among the foremost authorities in the world on business insight and data warehousing. He was responsible for the definition of IBM's data warehouse architecture in the mid '80s and authored the first paper on the topic in the IBM Systems Journal in 1988. He is a widely respected consultant and lecturer on this and related topics, and author of the comprehensive book Data Warehouse: From Architecture to Implementation.

Barry's interest today extends to the wider field of a fully integrated business, covering informational, operational and collaborative environments and, in particular, how to present the end user with a holistic experience of the business through IT. These aims, and a growing conviction that the original data warehouse architecture struggles to meet modern business needs for near real-time business intelligence (BI) and support for big data, drove Barry's latest book, Business unIntelligence: Insight and Innovation Beyond Analytics and Big Data, now available in print and eBook editions.

Barry has worked in the IT industry for more than 30 years, mainly as a Distinguished Engineer for IBM in Dublin, Ireland. He is now founder and principal of 9sight Consulting, specializing in the human, organizational and IT implications and design of deep business insight solutions.


Recently in Business unIntelligence Category

In the year since Edward Snowden spoke out on governmental spying, much has been written about privacy but little enough done to protect personal information, either from governments or from big business.

It's now a year since the material gathered by Edward Snowden at the NSA was first published by the Guardian and Washington Post newspapers. In one of a number of anniversary-related items, Vodafone revealed that secret government wires into its networks are mandated in "about six" of the 29 countries in which it operates. It also noted that Albania, Egypt, Hungary, India, Malta, Qatar, Romania, South Africa and Turkey deem it unlawful to disclose any information related to wiretapping or content interception. Vodafone's move is to be welcomed. Hopefully, it will encourage further transparency from other telecommunications providers on governmental demands for information.

However, governmental big data collection and analysis is only one aspect of this issue. Personal data is also of keen interest to a range of commercial enterprises, from telcos themselves to retailers and financial institutions, not to mention the Internet giants, such as Google and Facebook, which are the most voracious consumers of such information. Many people are rightly concerned about how governments--from allegedly democratic to manifestly totalitarian--may use our personal data. To be frank, the dangers are obvious. However, commercial uses of personal data are more insidious, and potentially more dangerous and destructive to humanity. Governments at least purport to represent the people to a greater or lesser extent; commercial enterprises don't even wear that minimal fig leaf.

Take, as one example among many, indoor proximity detection systems based on Bluetooth Low Energy devices, such as Apple's iBeacon and Google's rumored upcoming Nearby. The inexorable progress of communications technology--smaller, faster, cheaper, lower power--enables more and more ways of determining the location of your smartphone or tablet and, by extension, you. The operating system or app on your phone requires an opt-in before it may transmit your location. However, it is becoming increasingly difficult to avoid opting in, as many apps require it to work at all. More worrying are the systems that, without asking permission, record and track the MAC addresses of smartphones and tablets as they poll public Wi-Fi network routers, something all such devices do automatically. (See, for example, this article, subscription required.) The only way to avoid such tracking is to turn off the device's Wi-Fi receiver. On the desktop, the situation is little better, with Facebook last week joining Google and Yahoo! in ignoring browser "do not track" settings.
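
To make that concrete, here is a minimal sketch of the kind of passive listener involved. It is purely illustrative--my own toy code, not any vendor's actual system--and it assumes a Linux machine with root privileges, a wireless card already switched into monitor mode (the interface name wlan0mon below is hypothetical) and the open-source scapy packet library:

    # Illustrative sketch only: log the MAC addresses that nearby phones
    # broadcast in Wi-Fi probe requests. Assumes Linux, root privileges,
    # an interface already in monitor mode ("wlan0mon" is a hypothetical
    # name) and the scapy packet-capture library.
    from datetime import datetime
    from scapy.all import sniff, Dot11ProbeReq

    seen = {}  # MAC address -> (first sighting, probe count)

    def log_probe(pkt):
        if pkt.haslayer(Dot11ProbeReq):
            mac = pkt.addr2  # source MAC of the probing device
            first, count = seen.get(mac, (datetime.now(), 0))
            seen[mac] = (first, count + 1)
            print(f"{mac}  probes={count + 1}  first seen {first:%H:%M:%S}")

    # Entirely passive: no association with any network, no permission prompt.
    sniff(iface="wlan0mon", prn=log_probe, store=False)

Give each listening sensor a fixed location and correlate sightings across sensors, and you have a movement track for every device--and, by extension, every owner--that merely walked past with Wi-Fi switched on.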

It would be simple to blame the businesses involved--both the technology companies that develop the systems and the businesses that buy or use the data. They certainly must take their fair share of responsibility, together with the data scientists and other IT staff involved in building the systems. But the reality is that it is we, the general public, who hand over our personal data without a second thought about its possible uses, and it is we who must step up and insist on real change in its collection and use. This requires significant rethinking in at least two areas.

First is the oft-repeated marketing story that "people want more targeted advertising", reiterated last week by Facebook's Brian Boland. A more nuanced view is provided by Sara M. Watson, a Fellow at the Berkman Center for Internet and Society at Harvard University, in a recent Atlantic article, Data Doppelgängers and the Uncanny Valley of Personalization: "Data tracking and personalized advertising is often described as 'creepy.' Personalized ads and experiences are supposed to reflect individuals, so when these systems miss their mark, they can interfere with a person's sense of self. It's hard to tell whether the algorithm doesn't know us at all, or if it actually knows us better than we know ourselves. And it's disconcerting to think that there might be a glimmer of truth in what otherwise seems unfamiliar. This goes beyond creepy, and even beyond the sense of being watched."

I would suggest that, given the choice between less irrelevant advertising and, simply, less advertising on the Web, many people would opt for the latter, particularly given the increasing invasiveness of the data collection needed to drive allegedly more accurate targeting. Clearly, this latter choice would not be in the interest of the advertising industry, a position that crystallizes in the widespread resistance to limits on data gathering, especially in the United States. An obvious first step in addressing this issue is a people-driven, legally mandated move from opt-out data gathering to a formal opt-in approach. To be really useful, of course, this would need to be preceded by a widespread mass deletion of previously gathered data.

This leads directly to the second area in need of substantial rethinking: the funding model for Internet business. Most of us accept that "there's no such thing as a free lunch". But a free email service, cloud store or search engine? Well, apparently that's eminently reasonable. Of course, it isn't. All these services cost money to build and run, costs that are covered (with significant profits in many cases) by advertising--ever more of it, supposedly better targeted via big data and analytics.

There is little doubt that the majority of people using the Internet gain real, daily value from it. Today, that value is paid for through personal data. The loss of privacy seems barely noticed. People I ask are largely uninterested in any possible consequences. However, privacy is the foundation of many aspects of society, including democracy--as can be clearly seen in totalitarian states, where widespread surveillance and the destruction of privacy are among the first orders of business. We, the users of the Web, must do the unthinkable: we must demand the right to pay real money for mobile access, search, email and so on in exchange for an end to the tracking of our personal data.

These are but two arguably simplistic suggestions to address issues that have been made more obvious by Snowden's revelations. A more complete theoretical and legal foundation for a new approach is urgently needed. One possible starting point is The Dangers of Surveillance by Neil Richards, Professor of Law at Washington University School of Law, published in the Harvard Law Review a few short months before Snowden spilled at least some of the beans.

Image courtesy Marc Kjerland

Posted June 19, 2014 12:53 AM
Parts 1, 2, 3, 4 and 4A of this series explored the problem as I see it. Now, finally, I consider what we might do if my titular question actually makes sense.

To start, let's review my basic thesis. Mass production and competition, facilitated by ever improving technology, have been delivering better and cheaper products and improving many people's lives (at least in the developed world) for nearly two centuries. Capital, in the form of technology, and people--labor--work together in today's system to produce goods that people purchase using earnings from their labor. As technology grows exponentially better, an ever greater range of jobs is open to displacement. When technology displaces some yet-to-be-determined percentage of labor, this system swings out of balance; there are simply not enough people with sufficient money to buy the products made, no matter how cheaply. We have not yet reached this tipping point because, throughout most of this period, the new jobs created by technology have largely offset the losses. However, employment trends in the past 10-15 years in the Western world suggest that this effect is no longer operating to the extent that it was, if at all.

In brief, the problem is that although technology produces greater wealth (as all economists agree), without its transfer to the masses through wages paid for labor, the number of consumers becomes insufficient to justify further production. The owners of the capital assets accumulate more wealth--and we see this happening in the increasing inequality in society--but they cannot match the consumption of the masses. Capitalism--or, perhaps more precisely, the free market--then collapses.

Let's first look at the production side of the above equation. What can be done to prevent job losses outpacing job creation as a result of technological advances? Can we prevent, or at least put a damper on, the great hollowing out of middle-income jobs that is creating a dumbbell-shaped distribution: a few highly paid experts at one end and a growing swathe of lower-paid, less-skilled workers at the other? Can (or should) we move to address the growing imbalance of power and wealth between capital (especially technologically based) and labor? Let's be clear at the start, however: turning off automation is not an option I consider.

My suggestions, emerging from the thinking discussed earlier, are mainly economic and social in nature. An obvious approach is to use the levers of taxation--as is done in many other areas--to drive a desired social outcome. We could, for example, reform taxation and social charges on labor to reduce the cost difference between using people and automating a process. In a similar vein, shifting taxation from labor to capital could also be tried. I can already hear the Tea Party screaming to protect the free market from the damn socialist. But, if my analysis is correct, the free market is about to undergo, at best, a radical change if employment drops below some critical level. Pulling these levers soon and fairly dramatically is probably necessary, even though such measures can probably only delay the inevitable. Another approach is for industry itself to take steps to protect employment. Mark Bonchek, writing in a recent Harvard Business Review blog, describes a few "job entrepreneurs" who maximize jobs instead of profits (but still make profits as well), including one in the Detroit area aimed at creating jobs for unemployed auto workers.

Moving from the producer's side to the consumer's view, profit aside, why did we set off down the road of the Industrial Revolution? To improve people's daily lives, to lessen the load of hard labor, to alleviate drudgery. The early path was not clear. Seven-day labor on the farm was replaced by seven-day labor in the factory. But, by the middle of the last century, working hours were being reduced in the workplace and in the home, food was cheaper and more plentiful; money and time were available for leisure. In theory, the result should have been an improvement in the human condition. In practice, the improvement was subverted by the mass producers. They needed to sell ever more of the goods they could produce so cheaply that profit came mainly through volume sales. Economist Victor Lebow's 1955 proclamation of "The Real Meaning of Consumer Demand" sums it up: "Our enormously productive economy demands that we make consumption our way of life... that we seek our spiritual satisfaction and our ego satisfaction in consumption... We need things consumed, burned up, worn out, replaced and discarded at an ever-increasing rate". Of course, some part of this is human nature, but it has been driven inexorably by advertising. We've ended up in the classic race to the bottom, even to the extent of products being produced with ever shorter lifespans to drive earlier replacement. Such consumption is becoming increasingly unsustainable as the world population grows, finite resources run out and the energy consumed in both production and use drives increasing climate change. As the president of Uruguay, Jose Mujica, asked of the Rio+20 Summit in 2012, "Does this planet have enough resources so seven or eight billion can have the same level of consumption and waste that today is seen in rich societies?"

My counter-intuitive suggestion here, and one I have not seen raised by economists (surprisingly?), is to ramp down consumerism, mainly through a reinvention of the purposes and practices of advertising. Reducing over-competition and over-consumption would probably drive interesting changes in the production side of the equation, including reduced demand for further automation, lower energy consumption, product quality being favored over quantity, higher savings rates by (non-)consumers, and more. Turning down the engine of consumption could also enable changes for the better in the financial markets, reducing the focus on quarterly results in favor of strategically sounder investment. Input from economists would be much appreciated.

But, let's wrap up. The title of this series asked: will automation through big data and the Internet of Things drive the death of capitalism? Although some readers may have assumed that this was my preferred outcome, I am more of the opinion that capitalism and the free market need to evolve rather quickly if they are to survive and, preferably, thrive. But, this would mean some radical changes. For example, a French think-tank, the LH Forum, suggests the development of a positive economy that "reorients capitalism towards long-term challenges. Altruism toward future generations is a much more powerful incentive than [the] selfishness which is supposed to steer the market economy". Other fundamental rethinking comes from the Scottish historian Niall Ferguson, who takes a wider view of "The Great Degeneration" of Western civilization. In a word, this is a topic that requires broad, deep and urgent thought.

For my more IT-oriented readers, I suspect this blog series has taken you far from familiar ground. For this, I do not apologize. As described in Chapter 2 of "Business unIntelligence", I believe that the futures of business and IT are joined at the hip. The biz-tech ecosystem declares that technology is at the heart of all business development. Business must understand IT. IT must be involved in the business. I suggest that understanding the impact of automation on business and society is a task for IT strategists and architects as much as, if not more than, it is for economists and business planners.

Image from: www.moonbattery.com/archives/2010/07/prehistoric-cli.html. All elephant photos in the earlier posts are my own work!


Posted March 11, 2014 6:47 AM
As seen in Parts 1, 2 and 3, mainstream economists completely disagree with my thinking. Am I alone? It turns out that I'm not...

Surely I'm not the first to think that today's technological advances have the potential to seriously disrupt the current market economy? I eventually found that Martin Ford, founder of a Silicon Valley software development firm, wrote a book in 2009: "The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future". His analysis of the problem turns out to be exactly the same as mine... not to mention more than four years ahead of me! Ford explains it very simply and compellingly: "the free market economy, as we understand it today, simply cannot work without a viable labor market. Jobs are the primary mechanism through which income--and, therefore, purchasing power--is distributed to the people who consume everything the economy produces. If at some point, machines are likely to permanently take over a great deal of the work now performed by human beings, then that will be a threat to the very foundation of our economic system."

For the more visually minded, Ford graphs capability to perform routine jobs against historical time, for both humans and computer technology. I reproduce this simple graph here. Ford posits that there was a spurt in human capability after the Industrial Revolution as people learned to operate machines, but that this has now largely leveled off. As a general principle, this seems reasonable. Computer technology, on the other hand, is widely accepted (via Moore's Law) to be on a geometric growth path in terms of its general capability. Unless one or both trends change dramatically, the cross-over of these two lines is inevitable. When technology becomes more efficient than a human at any particular job, competitive pressure in the market will ensure that the former replaces the latter. At some stage, the percentage of human jobs, and associated income, automated away will be enough to disrupt the consumption side of the free market. Even increased uncertainty about income is often sufficient to cause consumers to increase savings and reduce discretionary spending. This behavior occurs in every recession and affects production in a predictable manner: production is cut back, often involving further job losses. A positive feedback cycle results: reduced jobs drive reduced spending, which drives further job losses. In today's cyclical economy, the trend is eventually reversed, sometimes through governmental action, or at times by war--World War II is credited by some with ending the Great Depression. However, the graph above shows no such cyclical behavior: this is a one-way, one-time transition.
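
As a back-of-the-envelope rendering of those two curves (the units and starting values below are invented by me purely for illustration; Ford's graph carries no scale), a few lines of Python show why the crossover is a matter of "when", not "if":

    # Toy version of Ford's graph: human capability has plateaued, while
    # machine capability doubles every 18 months (Moore's Law). All the
    # numbers are my own illustrative assumptions, not Ford's data.
    HUMAN_CAPABILITY = 100.0   # plateaued human capability, arbitrary units
    MACHINE_IN_2009 = 25.0     # assumed machine capability when Ford wrote
    DOUBLING_YEARS = 1.5       # one doubling every 18 months

    def machine_capability(year):
        """Machine capability under geometric (exponential) growth."""
        return MACHINE_IN_2009 * 2 ** ((year - 2009) / DOUBLING_YEARS)

    year = 2009
    while machine_capability(year) < HUMAN_CAPABILITY:
        year += 1
    print(f"Lines cross around {year}")  # 2012 with these made-up numbers

Shift the starting gap or the doubling period and the crossover date moves, but as long as one line is flat and the other exponential, the lines must cross--which is the whole of Ford's point.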

Of course, the 64 million dollar questions (assuming you agree with the reasoning above) are: where are we on this timeline, and how steep is the geometric rise in technological capability? It is likely that both aspects differ depending on the job involved. For some jobs, we are far from the inflection point in the technology curve, while others are much closer. For information-based jobs, the rate of growth in capability of computers may be very close to the Moore's Law formulation: a doubling in capacity every 18 months. Physical automation may grow more slowly. But the outcome is assured: the lines will cross. Ford felt that in some areas we were getting close to the inflection point in 2009. The presumed roughly eight-fold increase in technological ability since then (at the Moore's Law rate above) has not yet, however, tipped us over the edge of the job cliff, although few would dispute the extent of the technological advances in the interim. Of course, Ford--and I--may be wrong.

It would seem that the hypothesis put forward by Ford should be amenable to mathematical modeling, yet I have found only one attempt to do so: an academic paper, "Economic growth given machine intelligence", published in 1999 by Robin Hanson. Given the title, I hoped that this paper might provide some usable mathematical models capable of answering my questions. Unfortunately, I was disappointed. My mathematical skills are no longer up to the equations involved! More importantly, however, Hanson's framing assumptions seem strong on historical precedent (surely favoring continuation of the current situation) and fail to address the fundamental issue (in my view) that consumption is directly tied to income and its distribution among the population. Furthermore, Hanson has taken a largely dismissive attitude to Ford's thesis, as demonstrated by Hanson's advice to Ford in an online exchange: "he needs to learn some basic economics".

So, the hypothesis that I and, previously, Ford have put forward remains both intuitively reasonable and formally unproven. For now, I ask: can any data scientist or economics major take on the task of producing a useful model of our basic hypothesis?

In the final part of the series, I consider what a collapse of the current economic order might look like and ask what, if anything, might be done to avert it.

For a broader and deeper view of the business and technological aspects of this topic, please take a look at my new book: Business unIntelligence: Insight and Innovation Beyond Analytics and Big Data.  

A bonus Part 4A follows!


Posted February 25, 2014 3:34 AM
Part 1 introduced the elephant in the room: a few of the ways that current technological advances affect jobs and society.  Part 2 explored the economics. Here I dissect some further recent economic thinking.

As we saw in my previous post, the topic of the impact of technology on jobs has generated increased interest in the past few months. Tyler Cowen's September 2013 "Average Is Over" was one of the earlier books (of this current phase) to address this area. The pity is that, in the long run, he avoids the core of the problem as I see it: if technology is replacing current jobs and failing to create new ones in at least equal numbers, who will be the consumers of the products of the new technology?

In the early part of his book, Cowen builds the hypothesis that technology has been impacting employment for some decades now, creating a bifurcation in the job market. As the jobs that can be partially or fully automated expand in scope and number, people with strong skills that support, complement or extend technology will find relatively rich pickings. Those with more average, technology-oriented skills do less well. If their skills are relatively immune to technological replacement--from building laborers to masseurs--they will continue to be in some demand, although increased competition will likely push down wages. However, the key trend is that middle-income jobs susceptible to physical or informational automation have been disappearing and continue to do so. Some 60% of US job losses in the 2007-2009 recession (choose your dates and name) were in the mid-income categories, while three-quarters of the jobs added since then have been low-income. The switch to low-paying jobs dates back to 1999; it's a well-established trend.

The largely unmentioned category in Cowen's book, beyond a nod to the figures on page one, is the growing number of young, long-term unemployed or underemployed, even among college graduates. I suggest we are actually seeing a trifurcation in the jobs market, with the percentage in this third category growing in the developed world and the actual numbers--due to population growth--exploding in the emerging economies. I don't have easy access to the statistics. However, given that the income of the first two categories, whether directly or indirectly, must support the third (as well as the young, the elderly and the ill), parts of the developed world already seem pretty close to the relative percentages at which the system could break down. Current ongoing automation of information-based jobs may well tip the balance.

Cowen's prescription for the future is profoundly depressing, even in the US context, at which the book is squarely aimed. To paraphrase, tax the wealthy and capital more. Just a little. Reduce benefits to the poor. Maybe a little more. And let both the poorly paid and unemployed move to cheaper accommodation far from expensive real estate. "...in a few parts of... the warmer states... [w]e would build some 'tiny homes'... about 400 square feet and cost in the range of $20,000 to $40,000. ... very modest dwellings, as we used to build in the 1920s. We also would build some makeshift structures, similar to the better dwellings you might find in a Rio de Janeiro favela. The quality of the water and electrical infrastructure might be low by American standards, though we could supplement the neighborhood with free municipal wireless (the future version of Marie Antoinette's famous alleged phrase will be 'Let them watch internet!')."

If your response is "look what happened to Marie Antoinette", Cowen declares that, given the age demographic of the US, revolution is an unlikely outcome. Consider, however, that this is a US view. In much of the developing world, the great transition from Agriculture to Manufacturing is only now underway. Some of that is supported by off-shoring from the West, some by local economic and population growth. However, robotic automation is already on the increase even in those factories. Foxconn, long infamous as Apple's iPhone hardware and assembly "sweatshop", announced as far back as 2011 that it planned to install up to 1 million robots in its factories by this year. A year ago this week, it announced: "We have canceled hiring entry-level workers, a decision that is partly associated with our efforts in production automation."

Last week, the Wall Street Journal reported that Foxconn is working with Google with the aim of bringing Google's latest robotics acquisitions to the assembly lines to reduce labor costs. Google also benefits: it gains a test bed for its purported new robotic operating system, as well as access to Foxconn's mechanical engineering skills. According to PCWorld, Foxconn's chairman, Terry Gou, told investors in June last year: "We have over 1 million workers. In the future we will add 1 million robotic workers. Our [human] workers will then become technicians and engineers." The question of how many of the million human workers would become technicians and engineers seemingly went unaddressed. Echoing comments in Part 2 of this series, Foxconn is also reported to be looking to next-shore automated manufacturing to the US. To complete the picture, Bloomberg BusinessWeek meanwhile reported that Foxconn was also off-shoring jobs from China to Indonesia. A complex story, but the underlying driver is obvious: reduce labor costs, and then distribution costs, by any and every means possible.

Cowen's comments above about favelas, Marie Antoinette and revolutions should thus be seen in a broader context. The Industrial-era middle-class boom of the West may pass quickly through the emerging economies or perhaps bypass the later arrivals entirely. Shanty towns are already widespread in emerging economies; they house the displaced agricultural workers who cannot find, or have already lost, employment in manufacturing. The age demographic in these countries is highly compatible with revolution and migration or invasion. For reasons of geography, the US may be less susceptible to invasion, but Europe's borders are under increasing siege. The Roman Empire's latter-day bread and circuses were very quickly overrun by the Vandals and the Visigoths.

The numbers and percentages of new jobs vs. unemployed worldwide are still up for debate among the economists. But their belief, echoed by the larger multinationals, that the consumer boom in the West of the 20th century will be replicated in the emerging economies stands perhaps on shaky ground.

For a broader and deeper view of the business and technological aspects of this topic, please take a look at my new book: Business unIntelligence: Insight and Innovation Beyond Analytics and Big Data.

Part 4 follows.


Posted February 18, 2014 7:40 AM
Part 1 introduced the elephant in the room: a few of the ways that current technological advances affect jobs and society. Here, I probe deeper into the economics.

We in the world of IT and data tend to focus on the positive outcomes of technology. Diagnoses of illness will be improved: Wired reports that, in tests, IBM Watson's successful diagnosis rate for lung cancer is 90%, compared to 50% for human doctors, according to WellPoint's Samuel Nussbaum. We celebrate our ability to avoid traffic jams based on smartphone data collection. We imagine how we can drive targeted advertising. A trial (since discontinued) in London in mid-2013 of recycling bins that monitored passing smartphones' Wi-Fi addresses in order to eventually push product is but one of the more amusing examples. Big data can drive sustainable business, reducing resource usage and carbon footprint both within the enterprise and to the ends of its supply chains. Using a combination of big data, robotics and the IoT, Google already has prototype driverless cars on the road. Its recent acquisitions of Boston Dynamics (robotics), Nest (IoT sensors) and DeepMind Technologies (artificial intelligence), to name but a few, indicate the direction of its thinking. The list goes on. Mostly, we see the progress but are largely blind to the downsides.

The elephant in the room seems particularly perplexing to politicians, economists and others charged with taking a macro-view of where human society is going. Perhaps they feel some immunity to the excrement pile that is surely coming. But surely an important question should be: what will happen, in economic and societal terms, when significant numbers of workers in a wide range of industries are displaced fully or partially by technology, both physical and information-based? To a non-economist like me, the equation seems obvious: fewer workers means fewer consumers means less production means fewer workers. A positive feedback loop of highly negative factors for today's business models. Sounds like the demise of capitalism. Am I being too simplistic?
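
The loop is easy to caricature in code. Every parameter below is invented purely to show the shape of the feedback, not to forecast anything:

    # Deliberately crude toy model of the loop above: automation removes
    # some jobs each cycle, the lost wages shrink consumption, and the
    # resulting production cuts shed still more jobs. All parameters are
    # invented for illustration; this is not an economic forecast.
    workers = 100_000        # employed consumers at the start
    AUTOMATION_RATE = 0.03   # share of jobs automated away per cycle
    FEEDBACK = 0.5           # further jobs shed per job's worth of lost demand

    for cycle in range(1, 11):
        automated = workers * AUTOMATION_RATE   # direct displacement
        induced = automated * FEEDBACK          # production cut as demand falls
        workers -= automated + induced
        print(f"cycle {cycle:2d}: workers = {workers:,.0f}")

With these made-up numbers, employment shrinks about 4.5% every cycle and the decline compounds, because nothing in the loop pushes back. The real economy has stabilizers this caricature omits--above all the creation of new kinds of jobs--and the entire argument turns on whether those stabilizers keep pace with the automation.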

A recent article in the Economist, "The Future of Jobs: the Onrushing Wave", explores the widely held belief among economists that technological advances drive higher living standards for the population at large. The belief is based on historical precedent, most particularly the rise of the middle classes in Europe and the US during the 20th century. Anyone who opposes this consensus risks being labeled a Luddite, after the craft workers in the 19th-century English textile industry who attacked the machines that were destroying their livelihoods. And perhaps that is why the Economist, after exploring the historical pattern at some length and touching on many of the aspects I've mentioned earlier, concludes rather lamely, in my opinion: "[Keynes's] worry about technological unemployment was mainly a worry about a 'temporary phase of maladjustment' as society and the economy adjusted to ever greater levels of productivity. So it could well prove. However, society may find itself sorely tested if, as seems possible, growth and innovation deliver handsome gains to the skilled, while the rest cling to dwindling employment opportunities at stagnant wages."

 "Sorely tested" sounds like a serious understatement to me. The Industrial Revolution saw a mass movement of jobs from Agriculture to Manufacturing; the growth of the latter largely offset the shrinkage in Agricultural employment due to mechanization. In the Western world, the trend in employment since the 1960's has been from Manufacturing to Services. Services has compensated for the loss of Manufacturing jobs to both off-shoring and automation. But, as Larry Summers, former US Treasury Secretary, mentioned in the debate on "Rethinking Technology and Employment" at the World Economic Forum in Davos in January, the percentage of 25-54 year old males not working in the US will have risen from 5% in 1965 to an estimated near 15% in 2017. This trend suggests strongly that the shrinkage in Manufacturing is not being effectively taken up elsewhere. The Davos debate itself lumbered to a soporific draw on the motion that technological innovation is driving jobless growth. Prof. Erik Brynjolfsson, speaking with Summers in favor of the motion, offered that "off-shoring is just a way-station on the road to automation", a theme echoed by the January 2014 McKinsey Quarterly "Next-shoring: A CEO's guide". Meanwhile, Brynjolfsson's latest book, with Andrew McAfee, seems to limit its focus to the quality of work in the "Second Machine Age" rather than its actual quantity.
 
As in the tale of the blind men and the elephant, it seems that we are individually focusing only on small parts of this beast.

For a broader and deeper view of the business and technological aspects of this topic, please take a look at my new book: Business unIntelligence: Insight and Innovation Beyond Analytics and Big Data.  

Part 3 follows.
 

Posted February 10, 2014 11:11 AM

