First and foremost, Nicholas Carr has a very limited understanding of the technology he is writing about (Carr holds a B.A. from Dartmouth College and an M.A. in English literature from Harvard University). At best he can approach IT as a sociologist, not as a professional. Such a sociological approach is completely legitimate and useful, but prone to oversimplifications and overgeneralizations. We can discuss several issues connected with these oversimplifications and overgeneralizations here:
I would like to stress that IMHO Carr has an extremely weak understanding of networking issues, especially the cost of high-bandwidth wide area network connections and the technical challenges of providing them reliably on a 24x7 basis for large enterprise customers such as, say, SAP/R3 users. He also fails to understand that enterprise-class software often contains the implicit "institutional memory" of the organization, and sometimes important enhancements to its decision-making capabilities.
When it comes to such a complex technology, architecture matters, and it matters absolutely. In other words, in IT the whole is not only bigger than the sum of its parts; it is a completely different animal, with a life of its own.
Carr's understanding of IT architecture is essentially the understanding of a desktop user, so, not surprisingly, he does not understand what IT architecture is about, why it is so important, and why differences in architecture alone can provide substantial competitive advantages even if the two IT systems we compare consist of identical hardware servers and software packages. In a sense, for Carr, as for a man with a hammer, everything looks like a nail: everything in IT is suitable for outsourced deployment via "in the cloud" providers. The simple idea that there might be multiple cases where this is inefficient, or costly, or both, never comes to his mind.
|In the long run, though, the greatest IT risk facing most companies is more prosaic than a catastrophe. It is, simply, overspending.|
Carr's "spend less" is good, generic, risk-free advice applicable to any large corporate department and any situation. Essentially, it is another way of saying that a penny saved is a penny earned. The problem with it is that "stupidity is punishable" and "scrooges usually pay twice". You need to understand where you need to cut spending and where you need to spend more; the elimination of local IT that Carr advocates under the misleading label of "utility computing" prevents you from seeing this, and as such is a stupid solution to the problem, akin to throwing out the baby with the bathwater.
In a big manufacturing company with multiple local datacenters the total cost of all IT operations is often less than 1% of total costs (in financial companies it is more, reaching 3%-10%). And it has stayed around this level for quite a long time. So anybody who tries to speak about overspending on IT needs to explain why current costs are excessive and why we need to jump on the bandwagon of "in the cloud" service providers (which might, BTW, be more expensive and provide lower quality/agility than local services). As of July 2008, cloud computing is not high on the priority list (Computerworld) of most CIOs:
The CIOs indicated that server virtualization and server consolidation are their No. 1 and No. 2 priorities. Following these two are cost-cutting, application integration, and data center consolidation. At the bottom of the list of IT priorities are grid computing, open-source software, content management and cloud computing (called on-demand/utility computing in the survey) -- less than 2% of the respondents said cloud computing was a priority.
In a big corporation economies of scale are already fully realized, so from this point of view the move to the cloud is questionable. Also, if total costs are under 1%, then to justify further cutting you need to be sure that the risks are lower than the rewards. The same thinking applies to the return on investment for efforts in reducing those costs. In no way is modern enterprise IT a resource hog. It is already pretty slim.
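To make the risks-versus-rewards point concrete, here is a back-of-the-envelope sketch in Python. Only the ~1% IT share comes from the discussion above; the $1B total-cost figure and the 30% cut are hypothetical assumptions for illustration.

```python
# Back-of-the-envelope check of the savings ceiling: if IT is ~1% of a
# company's total costs, even a drastic cut in IT spending has a
# negligible effect on the bottom line. The $1B total cost and the 30%
# cut are assumed figures, not data from any particular company.

total_costs = 1_000_000_000          # assumed: $1B total annual costs
it_share = 0.01                      # IT at ~1% of total costs
it_budget = total_costs * it_share   # $10M

cut = 0.30                           # assumed: an aggressive 30% IT cut
savings = it_budget * cut
impact_on_total = savings / total_costs

print(f"IT budget:             ${it_budget:,.0f}")
print(f"Savings from the cut:  ${savings:,.0f}")
print(f"Impact on total costs: {impact_on_total:.2%}")
```

Against a savings ceiling of about 0.3% of total costs, even a modest increase in downtime or lost agility can swamp the gain.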
At the same time, a race to the bottom in IT costs is self-destructive. There is some limit in cost-cutting after which attempts to diminish IT costs and cut people backfire and actually lead to increased costs. One problem here is the corporate brass's inability to separate the wheat from the chaff. The real competitive value of a good IT department is not only the infrastructure and smooth operation of the network and corporate applications; it also provides valuable protection for the company from ruthless and energetic charlatans (aka consultants) who often try to sell the company some new fashionable snake oil in the best big-pharma traditions (IT analogs of Vioxx and other useless and/or dangerous medications, produced to generate profit, not to solve real problems :-).

Outsourcing also destroys loyalty, and that has its own (often huge) costs. It is not uncommon that disenfranchised IT staff stops being the filter against aggressive marketing, and soon the affected company successfully completes a half-million-or-more acquisition of useless or harmful software. Vioxx-style appliances and software packages are readily available on the market. For example, SAP/R3 is a mixed blessing in many environments, leading to increased costs and decreased flexibility. Actually, deploying SAP/R3 in an unsuitable environment is one of the best ways to destroy a company. SOX games can also be played pretty destructively on uninformed and clueless brass after the critical mass of IT IQ has left the company. See my IT Outsourcing/Offshoring Skeptic: Fighting Outsourcing Myths article for details.
That means that while it might be hard to use IT to gain a strategic advantage (first of all you need talented people, as IT is just a technology), the absence of IT talent puts the company at a cost disadvantage. A lot of firms that tried to outsource IT discovered this dimension of the strategic value of IT rather quickly. Because IT expenditures remain high and businesses can be held hostage by a poorly designed IT system, there is an enormous need for technically astute, hard-nosed IT professionals who can navigate their companies between the Scylla of IT vendors and the Charybdis of IT outsourcers. Dilbertalization of IT inevitably leads to high additional costs, costs that exceed the cost of keeping local IT staff.
Churchill once quipped that "capitalism is the worst economic system, except for all the others." That might be applicable to local IT and datacenters ;-). I think that Carr falls into the same trap. His almost socialist-style dirt-digging about how wasteful and inefficient local datacenters with local infrastructure are (with its simplistic notion of excessive capacity) does not mean that "in the cloud" providers are less wasteful, or that moving infrastructure "into the cloud" (using the Internet for the connection between clients and servers) will improve the situation. It serves mainly as a (false) justification of his Utopian vision. The fact that the existing situation is bad does not mean that what he proposes cannot be worse. The road to hell is paved with good intentions. Sometimes his attacks are funny given the level of misunderstanding of the issues, surprising even for a business writer without specific IT training:
Every year, businesses purchase more than 100 million PCs, most of which replace older models. Yet the vast majority of workers who use PCs rely on only a few simple applications – word processing, spreadsheets, e-mail, and Web browsing. These applications have been technologically mature for years; they require only a fraction of the computing power provided by today’s microprocessors. Nevertheless, companies continue to roll out across-the-board hardware and software upgrades.
Those laments about companies rolling out "across-the-board hardware and software upgrades" are simply naive; there are several considerations here that Carr does not understand.
He also claims that low storage capacity utilization is a problem in IT:
In addition to being passive in their purchasing, companies have been sloppy in their use of IT. That’s particularly true with data storage, which has come to account for more than half of many companies’ IT expenditures. The bulk of what’s being stored on corporate networks has little to do with making products or serving customers – it consists of employees’ saved e-mails and files, including terabytes of spam, MP3s, and video clips. Computerworld estimates that as much as 70% of the storage capacity of a typical Windows network is wasted – an enormous unnecessary expense. Restricting employees’ ability to save files indiscriminately and indefinitely may seem distasteful to many managers, but it can have a real impact on the bottom line. Now that IT has become the dominant capital expense for most businesses, there’s no excuse for waste and sloppiness.
I wonder how restricting the ability to save files can have an impact on the bottom line with current hard drive capacities, unless we are talking about storing movies. And please take into account that if the total cost of IT is less than 1%, the impact on the bottom line is negligible in any case. This is a typical example of Carr's faulty reasoning: he picks up a fact and interprets it based on his overly simplistic or outright missing understanding of the underlying technology. In no way can IT be considered the dominant capital expense for most businesses (with the probable exception of e-commerce shops and similar entities). The facts are not there to support this ridiculous statement. It was not true even during the dot-com boom, with the exception of e-commerce-related companies. His laments about wasting 70% of storage capacity are also detached from reality.
It looks like Carr does not understand what "excessive capacity" means in IT and how much cost overhead a large enterprise actually suffers from it. Excessive capacity in hardware is different from excessive capacity in, say, an electricity network, or in SUVs. This is a complex topic, so I will take an elementary example that shows that naive laments about unused capacity are without merit. First of all, the load on a server typically grows during its lifetime, so a server that started with 3-5% CPU load at the beginning of its life cycle can have a 20% load at the end (three to five years later). For example, application upgrades often increase the load on the CPU. Also, the cost of this supposedly redundant capacity is negligible, and adding capacity at the beginning is good insurance against very costly hardware upgrades at the end of the life cycle. Doubling memory for a three-year-old server using memory from the vendor is usually costly, and often it makes sense simply to replace the server.
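The load-growth dynamic above can be sanity-checked with a few lines of Python. The 5% starting and 20% ending utilization figures come from the example in the text; smooth compound growth is a simplifying assumption.

```python
# Sanity check of the load-growth argument: a server that starts its
# life at 5% CPU utilization and retires five years later at 20% has
# seen roughly 32% compound annual load growth. Figures follow the
# text's example; smooth compound growth is an assumed model.

start_util, end_util, years = 0.05, 0.20, 5
cagr = (end_util / start_util) ** (1 / years) - 1
print(f"Implied annual load growth: {cagr:.1%}")

util = start_util
for year in range(1, years + 1):
    util *= 1 + cagr
    print(f"Year {year}: {util:.1%} CPU utilization")
```

At roughly 32% annual growth, the "idle" headroom of year one is exactly what keeps the box viable in year five, which is why calling it waste misses the point.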
For example, as of May 2008, an entry-level Dell PowerEdge 1950 server with two 2.33GHz CPUs, 8GB of RAM and four 80GB 15K RPM drives costs around $4K, and a PowerEdge 2950 server with two 3.0GHz dual-core CPUs, 16GB of RAM and four 146GB 15K RPM drives is approximately $6K. If we assume that the PE1950 represents the minimal requirements, the question arises whether it makes sense to double those requirements, taking into consideration possible inadequacies in pilot-implementation load testing and growth of the load on the server over the subsequent three years. In our example the vastly superior server with a lot of "room for growth" costs an additional $2K. The answer, of course, is individual, but the implications of an error in specification from going with the "barely suitable" server can be significant and cost the company more than $2K (you need to write off or reuse the old investment in order to buy a new server; if this was a mass deployment, reuse is problematic, and selling will recover less than 30% of the initial cost). The second ("over-specified") server is much less likely to cause application problems, will better handle bottlenecks and rush-hour loads, and might not need any upgrades during its life cycle. In a way it is an insurance policy, and as such it is well worth the additional money. So it is often wise to over-specify the hardware as insurance against adverse developments, since the losses are minor but the potential benefits are great (for example, what if we underestimated the load, or the company buys another one, doubling the user population and the load on the server?). And if we turn out to be wrong, there is always the possibility of running several virtual OS instances on a single server.
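The insurance argument above can be sketched as an expected-cost comparison. The $4K/$6K prices and the sub-30% resale recovery come from the example in the text; the 50% probability of outgrowing the smaller server is a purely illustrative assumption.

```python
# Expected-cost comparison of a "barely suitable" vs an over-specified
# server. Prices and resale recovery follow the article's example; the
# probability of outgrowing the small server is an assumed illustration.

small_cost, big_cost = 4000, 6000   # PE1950 vs PE2950, May 2008 prices
resale_recovery = 0.30              # at most 30% of cost recovered on resale
p_outgrow = 0.50                    # assumption: chance small server is outgrown

# If the small server is outgrown mid-life, it is sold at a loss and
# replaced with the bigger one:
replace_path = small_cost - small_cost * resale_recovery + big_cost

expected_small = (1 - p_outgrow) * small_cost + p_outgrow * replace_path
expected_big = big_cost  # assumed sufficient for the whole life cycle

print(f"Expected cost, barely suitable server: ${expected_small:,.0f}")
print(f"Expected cost, over-specified server:  ${expected_big:,.0f}")
```

Under these assumptions the "cheap" server is already the more expensive choice in expectation, before counting downtime, migration effort, or the risk of application problems.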
Moreover, with blades and virtualization (for example using Xen) the load can, at least theoretically, be moved dynamically from one server to another within the datacenter, so a "cloud based" provider does not, even in principle, have any "capacity utilization" advantages over local deployment. The specialization advantages of a service provider still exist, but a large part of them can be replicated by outsourcing maintenance of the servers and OS to a third party.
As the lower levels of IT become readily available and more standardized, the comparative advantage simply moves to a higher level. What Carr views as disappearance is in reality just a shift. For example, TCP/IP connectivity was a huge competitive advantage of the USA in the 1980s and 1990s. Later this advantage moved to a higher level (HTTP-based protocols, search, e-commerce). Then, in a side move, TCP/IP migrated into wireless. The latter made possible dynamic tracking of trucks and other extremely sophisticated applications. It is still evolving, and it might be that now IPv6 can give some companies a competitive advantage (for example, better security, as hackers are by and large IPv4-bound). Just imagine how much money large enterprises are wasting on all those expensive and semi-useless Firewall-1, F5 and BlueCoat appliances and IDS boxes ;-).
IT technology now serves as both the price of entry and the key to success for almost any business you can think of. In a way, the strategic importance of IT is not the importance of a set of tools or protocols (which can now be standard, like Windows, Microsoft Office, TCP/IP, DNS, SMTP, etc.) but the importance of the superstructure: the architecture of integration of those tools. In InfoTech, like in Lego, it is quite possible to implement a completely non-standard, unique system from standard parts. And this flexibility of IT serves as a catalyst for implementing other innovations (GPS on trucks, RFID in packaging, e-commerce and Internet auction systems, etc.). In this sense the strategic importance of IT has even increased as technologies became more diverse and mature.
|In InfoTech, like in Lego, it is quite possible to implement a completely non-standard, unique system from standard parts.|
In the same way, the fact that companies can purchase the same hardware does not mean that strategic advantage disappears. Here, as in cooking, the ingredients might be the same, but the quality of the dish can be completely different. Mismanaged, fashion-driven and/or badly thought out hardware acquisitions can drive a company into serious problems and even doom it to failure. The same is true of enterprise software acquisitions, as was the case with several misguided SAP/R3 implementations, which brought down large, multi-billion-dollar companies. Even a moderately "mis-designed" or "mis-stress-tested" enterprise system can put a company on the ropes, as we all saw recently with the airlines: an inferior IT system can wipe out the profits for the year due to a single large storm.
Although competitive advantage via better hardware is becoming non-existent, superior code written by a good developer is much more difficult to duplicate. Carr dismissed the value of custom programming code in IT since it is a "commodity". This is a completely false, preposterous idea. It is necessary for an organization to use some custom code to get the full benefits from software. And custom code written by a talented programmer can give the organization a substantial competitive advantage. Carr's understanding of IT in this respect is limited to the level of a secretary who knows nothing but the basic features of a few Office applications.
|Although competitive advantage via better hardware is becoming non-existent, superior code written by a good developer is much more difficult to duplicate. Carr dismissed the value of custom programming code in IT since it is a "commodity". This is a completely false, preposterous idea. It is necessary for an organization to use some custom code to get the full benefits from software. And custom code written by a talented programmer can give the organization a substantial competitive advantage.|
Of course, there are a lot of problems with IT as it exists in large corporations. It is often inefficient, wasteful and not very friendly, if not outright hostile, to business needs. It looks like Carr provided a more or less objective and insightful analysis of the problems with corporate IT (I actually enjoy reading his blog) and then offered an unscientific, snake-oil-salesman-style explanation/solution to those serious problems: a simplistic fallacy (aka "vision") of an "in the cloud" paradise based on a simplistic and misplaced railways/electricity-generation analogy. I would like to stress that being unscientific does not matter when a false idea is needed by certain political forces, and the "Carr hypothesis" served perfectly well as an ideological justification of IT outsourcing.
|It looks like Carr provided a more or less objective and insightful analysis of the problems with corporate IT and then offered an unscientific, snake-oil-salesman-style explanation/solution to those serious problems...|
But while it is easy (and absolutely appropriate) to criticize corporate IT for inefficiency and excessive bureaucratization, you should not throw out the baby with the bathwater, which Carr definitely does:
"In the long run, the IT department is unlikely to survive, at least not in its familiar form," Carr writes. "It will have little left to do once the bulk of business computing shifts out of private data centers and into the cloud. Business units and even individual employees will be able to control the processing of information directly, without the need for legions of technical people."
Here are some Amazon reviews of his first book (an enlarged version of the original paper) that contain apt observations:
Twisty language beguiles the easily amused,
May 30, 2004
By A Customer
It's star time. While filling in for a 'let go' editor of the Harvard Business Review (HBR), a business writer with no personal involvement or experience in IT uses prime-time pages of HBR to conjure up a British tabloid piece that raises him to IT stardom.
Watch the movie, 'Being There' to catch what's going on with this follow-on book from the smash hit, "IT Doesn't Matter" in the May 2004 issue of HBR. Although a formula for business failure, this is a shoe-in for the Hype Award for Selling Books. Peter Sellers, ...and Chance the Gardener...take notes.
Klyde Hartler, Frankfurt, Germany
Written by an author who never managed an IT organization,
May 26, 2007
Although Nicholas Carr has some good eye opening arguments, most of it is based on theory, and not practice. As usual, although such arguments are worthy of thought and debate, it should never be taken at its face value as the new paradigm.
For example, Nicholas Carr makes an argument there are huge amount of IT "waste" because of excessive disk capacity and CPU capabilities. People who have worked in IT for any reasonable period of time knows that excess capacity rarely remains wasted, especially when many organizations experience 50%+ growth in disk storage needs every year. The same thing goes for the CPU and memory capacity. Having excessive hardware capacity for the future needs is called "capacity planning" and not a waste. Anyone who has been taken off guard (and paid a dear price with costly downtime) due to lack of disk storage, memory, or CPU horsepower knows the value of having some excessive hardware capacity. Minimizing the risk of critical downtime is hardly a "waste".
Some of Nicholas Carr's argument, admittedly, is at least somewhat, if not mostly, true. For example, he argues that IT, particularly its infrastructure, offers little competitive advantage because IT has become a commodity. This is true of SAN, OS, RDBMS, email, routers, switches, and servers (among others). If one's competitors have better and faster storage, for example, all one needs to do is buy the same storage from the same vendor at the same price to negate that competitive advantage.
Nicholas Carr does not address two important factors enough in the competitive forces of IT:
1)The power of innovation. Although competitive advantage via better hardware is becoming non-existent, a superior code written by a good developer is much more difficult to duplicate. Carr seems to dismiss the idea of innovation in IT since it is a "commodity". This is only partially true.
2)The competitive advantage of having superior IT personnel. The quality of the Knowledge Worker has become a key component, and perhaps the main key component, of competitive forces. In an era where it is increasingly difficult to gain competitive advantage, the quality of a company's Knowledge Worker is becoming more crucial. Despite the increasing commoditization of IT, the quality of IT depends largely on the quality of the IT personnel.
Excellent questions with inferior answers,
July 8, 2004
`Does IT Matter' is a difficult book to rate. As to the questions it raises, it deserves 5 stars. But its answers are two-star, at best.
By way of analogy, most bomb threats are bogus, but each one must be treated as if it were genuine. With that, in his new book Does IT Matter?, Nicholas Carr throws a bomb, and it turns out to be a dud.
Carr's book is an outgrowth of his article "IT Doesn't Matter," which appeared in the May 2003 issue of the Harvard Business Review. His hypothesis is that the strategic importance of IT has diminished. Carr views IT as a commodity, akin to electricity.
He also compares IT to the railroad infrastructure. In the early days, railroads that had their own tracks had a huge advantage, but once the rails become ubiquitous and open, that advantage went away.
Carr feels that since all companies can purchase the same hardware and software, any strategic advantage is obviated. It's true that the core functions of IT (processing, network transport, storage, etc.) are affordable and available to all, but there's still huge strategic advantage to be gained in how they're implemented.
It's much like two airlines that purchase the same model of airplane. If one airline streamlines and optimizes operations, trains its staff and follows standard operating procedures, it can expect to make a profit. If the other has operational inefficiencies, labor problems and other setbacks, it could lose money. The airplane is identical, but the outcome is not.
Carr is correct in that there have been some huge IT outlays of dubious value. But to say that IT is simply the procurement of hardware and software is to be blind to the fact that hardware and software are but two of the myriad components of IT.
To use the railroad metaphor, hardware and off-the-shelf software are the rails of IT; how they are designed and implemented is what provides their strategic value. Carr views IT as completely evolved. But the reality is that although IT has matured, it still is in a growth mode. The IT of today is vastly different from the IT of both 1999 and 2009.
Carr's view that most innovations within IT will tend to enhance the reliability and efficiency of IT rather than provide a competitive advantage is in direct opposition to what is said by every CIO I have met.
YOU BET IT DOES.,
May 29, 2004
Happened to pick up and browse through this paperweight at the airport and patted myself for not having bought it.
For one thing, in Carr's world, information technology managers are fools that drain dollars to the tune of 2 trillion every year, slavishly upgrading to whatever new thing vendors want to sell. This is a VERY narrow definition of organizational "IT". The world of technology is much wider and ever-expanding, making a direct or indirect impact into our daily lives, both personal and commercial. So the scope of this book is a bit hog-tied.
For another, these petty cavils are not new. Paul Strassman (Paul Strassman Inc.) has been saying the same things for a while. Morgan Stanley's star analyst Stephen Roach broke into the same bungled song and dance in the early 90s. Even more recently, "the bubble burst, I told you" has been recently toted as the safely wise justification. One could think of the many financial bubbles that have burst to disastrous consequences, but that's for another day.
Is IT unproductive? If the answer to that is yes (which it is not) then there are two possible explanations. That IT investments are ineffective (the author's take) or that we're not measuring productivity right. Nobel laureate MIT professor Robert Solow, with his work on "productivity paradox", has proven that the latter may be just as true as the ineffectiveness of tech spending.
Come to think of it, consider this:
- Without IT, the author would not have the website that promotes this hackneyed refrain.
- Nor would we have the Amazon.com website which allows hundreds of readers to commit the mistake of spending money on his book.
- Or on its PDF ebook version.
- IT was used in the typesetting of this very book.
- IT is used by his publisher to manage all their accounts, including the royalties the author will get.
- IT is used to power almost every record in the university and the office that pays the author his monthly salary (Harvard)
- It is IT in the car that he drives to work that parks automatically or that aids his navigation
- ...and so on...
Overall, the book didn't catch my attention for more than 15 minutes. Which is about the time I've spent on writing the above.
Carr addresses the declining efficiency of IT under two labels, "commoditization" and "loss of comparative advantage". But in reality a large part of the decline might be connected with another problem he never mentions -- dilbertalization.
This is a problem common to older corporations in general: as IT grows older, there are fewer and fewer visionaries and more and more bozos in positions of power. As bozos bring on board other bozos, this is a generic problem that might just be a little more visible in IT. Again, the problem of bureaucratization is usually not a local symptom, but a sign of rot in the highest echelons of the corporation. Historically, bureaucratization of the datacenter is a very old problem, which first surfaced on a mass scale early in the mainframe era. The classic reflection of this problem is the Dilbert cartoon, so it is often called "dilbertalization of IT" (to be more exact, dilbertalization presupposes also a lethal dose of micromanagement -- another widespread IT menace which is beyond the scope of this article). Here is an apt definition of a "dilbertalised" IT organization from Todd's Humor Archive (reproduced with minor variations):
Computing Center [n], is an organization whose functions are
- To impede wherever possible the development and usefulness of computing in the company or University.
- To gain the lion's share of funding, spend it largely on obsolete, bloated and otherwise inappropriate IT Solutions, and convince the businesses/campuses wherever possible to spend funds on the same.
- To oppose vigorously any new, useful and popular technology for three years or more until nearly everyone on the business/campuses and elsewhere in the world is using it, then to adopt that technology and immediately attempt to centralize and gain complete and sole control of it
We have zombie IT departments just as we have zombie corporations. Therefore, the fact that the "dilbertalised IT" of many enterprise datacenters is horrible as a business enabler is nothing new, and it does not mean that IT has reached the stage of commoditization. It is just a sign of the level of IT mismanagement, which is part of the larger issue of the quality of management in large corporations. Like any business mismanagement, IT mismanagement erodes business profits, shareholder value, and operational efficiency. In any relationship, it is dangerous for one side to "decide" what the other one wants. Marriage advisors usually warn about pitfalls like over-controlling by saying "don't control others or make choices for them." Yet often either the IT department tries to dictate what is necessary for the business, or the business tries to force upon the IT department something horrible bought due to slick marketing. In other words, one side wants to play control freak and is surprised at the level of resistance encountered. Think of those symmetrical cases as the major source of IT screw-ups. Still, it is important to understand that this problem has a long history and started in the early 1960s. As one Amazon reader recollects, dilbertalised IT has faced those problems since the early mainframe days:
When I started out we "MIS" professionals were the priests and priestesses who worked our magic in glass rooms. We were merely arrogant then. Life was simpler and some vendors worked closely with us. IBM, which is my main background, had a reputation for never letting their customers fail. That is not to say that their recommendations and solutions always translated into business value for their customers, but rarely did they result in disasters either. As time went on though MIS became IS, then IT. Systems grew more complex, proprietary systems gave way to interoperability, then open systems, and new vendors started arriving in droves. Innovation fanned the flames of complexity, and IT remained arrogant, but began focusing so much on the technology (and trying to keep up with it) that they lost sight of business needs. Methods devolved into chaos and the chasm between IT and the business widened to the point where IT was in some cases counter-productive to business needs.
Some IT departments are virtually useless because of bureaucratization: the good workers leave, and the vast majority of the remaining IT staff becomes demoralized and by and large incompetent, lacking the necessary skills, both technical and business, to perform their duties effectively and efficiently. Those problems cannot be solved with utility computing. On the contrary, trying to cure the mismanagement of IT with outsourcing can make the problems even worse, as utilities are a perfect place for those perversions. And such a turn of events in the case of outsourcing is more common than most people assume.
Still, Carr did a reasonably good job of exposing IT's warts, and this part of his writing has some sociological value; it might partially explain the success of the article and the books, a success that permitted him to milk this cash cow for half a decade. There is no doubt that the process of dilbertization of IT went too far, but Carr's recipes for the cure are all wrong. For political reasons, however, his claims were instantly attractive to bean counters and some people in positions of power. In a way, he provided a plausible and useful justification (with the big fig leaf of HBR respectability) for outsourcing everything and everybody in IT ;-). That side effect was the key factor that allowed him to survive and prosper by publishing two additional books. But if we look at the validity of his recommendations, then my impression is that the king was, and remains, naked. And it is not only my impression.
It seems pretty widely accepted in IT management circles that the blaming of IT by other parts of the business is somewhat connected with a misunderstanding of the role and capabilities of the IT department. Often business demands to make some system work are completely detached from reality: nothing can save a system that does not suit the business and was bought because of hype and/or the low qualification of the people who made a particular decision. Many executives don't want to hear about the potential complications, don't want to put time and effort into the evaluation process and a pilot implementation, and have the misguided expectation that IT technology is a panacea. Carr is actually playing into this superstition, as extremes meet: if IT is not a panacea, then it should completely disappear.
IT projects are very challenging because they usually involve shifting, not fixed, specifications. So delays can partially be explained by the fact that the product delivered was not the product that was ordered; often it is a completely different product, and such dynamic retuning costs money and time. A coordination problem occurs when you have a task to perform, the task has multiple and shifting components, the time for completion is limited, and your performance is affected by the order and sequence of the actions you take. The trick is to manage it so that the components do not bump into each other in ways that produce confusion, frustration, and inefficiency. This is a very difficult thing to do in large IT projects.
Carr naively believes that outsourcing is the magic solution to IT problems and that "follow, don't lead" is a viable strategy for enterprise IT. My experience completely contradicts both of those claims. In his initial HBR article he writes:
At a high level, stronger cost management requires more rigor in evaluating expected returns from systems investments, more creativity in exploring simpler and cheaper alternatives, and a greater openness to outsourcing and other partnerships. But most companies can also reap significant savings by simply cutting out waste.
So what should companies do? From a practical standpoint, the most important lesson to be learned from earlier infrastructural technologies may be this: When a resource becomes essential to competition but inconsequential to strategy, the risks it creates become more important than the advantages it provides. Think of electricity. Today, no company builds its business strategy around its electricity usage, but even a brief lapse in supply can be devastating (as some California businesses discovered during the energy crisis of 2000). The operational risks associated with IT are many – technical glitches, obsolescence, service outages, unreliable vendors or partners, security breaches, even terrorism – and some have become magnified as companies have moved from tightly controlled, proprietary systems to open, shared ones. Today, an IT disruption can paralyze a company’s ability to make its products, deliver its services, and connect with its customers, not to mention foul its reputation. Yet few companies have done a thorough job of identifying and tempering their vulnerabilities. Worrying about what might go wrong may not be as glamorous a job as speculating about the future, but it is a more essential job right now.
In the long run, though, the greatest IT risk facing most companies is more prosaic than a catastrophe. It is, simply, overspending.
Carr promotes outsourcing under the hot sauce of "utility computing" to conceal its bad taste. Actually "utility computing" is quite a misleading term, like many others that he uses ("infrastructural technologies" is another example):
Many technology vendors are already repositioning themselves and their products in response to the changes in the market. Microsoft’s push to turn its Office software suite from a packaged good into an annual subscription service is a tacit acknowledgment that companies are losing their need – and their appetite – for constant upgrades. Dell has succeeded by exploiting the commoditization of the PC market and is now extending that strategy to servers, storage, and even services. (Michael Dell’s essential genius has always been his unsentimental trust in the commoditization of information technology.) And many of the major suppliers of corporate IT, including Microsoft, IBM, Sun, and Oracle, are battling to position themselves as dominant suppliers of “Web services” – to turn themselves, in effect, into utilities. This war for scale, combined with the continuing transformation of IT into a commodity, will lead to the further consolidation of many sectors of the IT industry. The winners will do very well; the losers will be gone.
We will discuss his "in the cloud" vision later.
First of all, learning complex software is far beyond many "regular users," and outsourcing the IT function makes the situation worse. Often much worse. While it is possible to switch from more expensive packages to less expensive ones (in some cases, zero-initial-cost open source packages), a preoccupation with cutting "waste" (read: IT staff) is a very misguided strategy. There should be a strategic vision of enhancing the role of IT in the organization as its nervous system. Even with the most common software, the skills of most users are entry-level, and they cannot use even 10% of its capabilities. That statement probably applies to Mr. Carr as a user of MS Word ;-). Most users are unable to reach the level necessary for creating and using complex macros (which requires VBA skills) and as such use MS Word as a better typewriter. Often users are not even able to use style sheets. This is another facet of the complexity problem, where quantity turns into quality. No trend observable during the last five years supports Carr's hypothesis. On the contrary, even regular desktop applications have become so complex that regular users just scratch the surface and are incapable of uncovering the competitive advantages of Microsoft Office and similar application suites. Even for applications as common as MS Word and Excel, the hypothesis that users can do well without IT professionals, relying, say, on outsourced support, is far-fetched.
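To make the gap between entry-level use and macro-level use concrete, here is a minimal sketch of the kind of mail-merge automation a power user can script in minutes but a "better typewriter" user ends up doing by hand. It is written in Python rather than VBA purely for illustration, and the template, field names, and data rows are all invented for the example:

```python
# Illustrative "macro"-style mail merge using only the standard
# library. An advanced user automates the repetitive document task;
# an entry-level user retypes each letter manually.
import csv
import io
from string import Template

# Hypothetical letter template (in Word/VBA this would be a template file).
letter = Template("Dear $name,\n\nYour $product license expires on $date.\n")

# Hypothetical data; in practice this would come from a CSV export.
rows = csv.DictReader(io.StringIO(
    "name,product,date\n"
    "Alice,Office,2008-01-15\n"
    "Bob,Visio,2008-03-01\n"
))

# One personalized letter per data row.
letters = [letter.substitute(row) for row in rows]
for text in letters:
    print(text)
```

The point is not this particular script but that such automation requires a qualitatively different skill level than typing, which is exactly the gap the surrounding text describes.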
There is little objective evidence that the "IT does not matter" fallacy has typical right-wing backers. Still, it is a clear example of obscurantism in action and in essence is not that different from similar, more politically charged types of obscurantism such as Neo-Creationism. An important point is that the impact of anti-intellectualism is much greater today than in the 1800s, when science and technology had much less to offer. The ability of obscurantism to move into IT and find fertile ground there is in itself a very interesting and telling development.
While hypocritically denying this in some of his articles, all of Carr's writings are directed toward discounting the value of the local datacenter and local IT departments.
But much of the value of enterprise software is limited by the skill level of the IT personnel organized around the local datacenter(s), which serve as the focal point of the IT infrastructure. That means there is a distinct competitive advantage in having superior IT personnel.
There is a distinct competitive advantage in having superior IT personnel.
The availability of software and hardware does not automatically confer the ability to use it productively. After all, a lot of software is now available free of charge (as open source software and freeware). But there is a significant barrier to entry, and not all organizations can use Linux, Perl/PHP, MySQL, or the free Microsoft Visual Studio compilers equally well. That means they are not available to all despite zero cost; in other words, contrary to Carr's view, they are not "commodity inputs." The problem of utilizing off-the-shelf software, including Microsoft Office, in organizations is far from trivial. It is fair to say that the amount of money spent by IT organizations on unused software is considerable. There is every reason to expect that if users became the sole decision makers in this process, this amount would only increase. For better or worse, most users need a professional to get a better return on any complex software investment. IT serves as a combination of local tech adviser and testing lab for such products. This is a strategic function, as mistakes are costly and can even destroy an enterprise (several firms went under or were acquired by competitors after being weakened by a SAP/R3 implementation that was completely unsuited to the specifics of their primary business).
That means that just having talented IT personnel on the floor is in itself a competitive advantage, especially important in crisis situations. End users will never be able to learn the reasonably advanced features of complex software packages without the presence of more advanced users on the floor. Good external specialists are very expensive, and it is not easy to find a person who simultaneously knows the product and the business in which the application runs. In the current complex world, with few sources of lasting competitive advantage, "peopleware" is one of the few that cannot be replicated easily.
|Just having talented IT personnel on the floor is in itself a competitive advantage, especially important in crisis situations. In the current complex world, with few sources of lasting competitive advantage, "peopleware" is one of the few that cannot be replicated easily.|
Carr's notion that users can maintain complex IT systems is simply ridiculous. The typical user lacks the ability to master MS Word or Excel (and, to be frank, few professionals can fully master them either ;-). The level of complexity of modern IT software products, and of IT systems in general, is just staggering: these are the most complex systems ever created by man. And "experts" like Carr propose to delegate the understanding and creation of the IT architecture of such systems to users or to providers (motivated by their own revenue stream).
But without knowledgeable and loyal staff, how can I as a manager make targeted strategic investments, as opposed to across-the-board upgrades? Should I move my servers to Solaris, or should I follow the trend, convert my server farms to Linux, and virtualize all instances of Windows on VMware? Or would my company be better off using blades instead of VMware? Should I outsource development to India or some other lower-cost country, or is a critical mass of in-house developers important to the company's success? The primitive solutions that Carr proposes for the complex issues facing people in this industry can be tough to swallow.
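The blades-versus-VMware question above is not answerable by a vendor brochure; even a back-of-envelope comparison depends on assumptions only in-house staff can supply. The following sketch illustrates the shape of such a comparison; every price, count, and consolidation ratio here is invented for the example and is not a vendor figure:

```python
# Hypothetical back-of-envelope comparison: consolidating 40 legacy
# Windows servers onto VMware hosts vs. migrating them to blades.
# All numbers below are assumptions made up for illustration.

N_SERVERS = 40                    # legacy servers to replace

# Option A: virtualization on a few large hosts
VMS_PER_HOST = 10                 # assumed consolidation ratio
HOST_COST = 20_000                # assumed hardware cost per host, USD
VMWARE_LICENSE_PER_HOST = 5_000   # assumed license cost per host, USD

hosts = -(-N_SERVERS // VMS_PER_HOST)   # ceiling division
cost_virtualization = hosts * (HOST_COST + VMWARE_LICENSE_PER_HOST)

# Option B: one blade per server, no hypervisor licenses
BLADE_COST = 3_000                # assumed cost per blade, USD
CHASSIS_COST = 8_000              # assumed cost per chassis, USD
BLADES_PER_CHASSIS = 16

chassis = -(-N_SERVERS // BLADES_PER_CHASSIS)
cost_blades = N_SERVERS * BLADE_COST + chassis * CHASSIS_COST

print(f"Virtualization: ${cost_virtualization:,}")
print(f"Blades:         ${cost_blades:,}")
```

The point is not the arithmetic but that the crossover between the options shifts with workload profile, consolidation ratio, and licensing terms, which is exactly the kind of judgment that requires local knowledge rather than a vendor's pitch.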
For example, if I supported a particular complex application at a particular company and then switched sides, moving from my company to some application provider, I could essentially dictate the conditions (and believe me, those would not be favorable conditions; why should they be?), as nobody could match my level of understanding of that particular area. Such milking of companies due to a lack of local IQ is not uncommon. Actually, the best situation for me as a sales rep is to face, as the company's representative, some bizarrely over-promoted secretary who due to her "non-technical virtues" was elevated to head of IT security or, better, to vice-president of IT. As she is afraid to demonstrate her own incompetence and does not trust her own people, she can serve as a Trojan horse for pretty lucrative deals. Usually there are very few talented and dedicated people in IT who can be the architects of good systems. But IT brass usually cannot distinguish them from regular ladder climbers and sociopaths. So in a way, the cost of employing a lot of IT staff is the cost of higher management's inherent inability to separate the wheat from the chaff in modern bureaucratic organizations.
Superior architecture and software code are difficult to duplicate, and they do provide a huge strategic advantage. Another often neglected side of the strategic advantage of IT is the protection local staff provide against greedy vendors and service providers. For example, protection from SOX excesses and greedy accounting firms is definitely an advantage, strategic or not: without it, accounting firms can milk enterprises weakened by loss of talent for many millions of additional dollars, all in the name of better security :-). Protection from snake oil salesmen who promise miracle diets for the IT datacenter (including Nicholas Carr ;-) is yet another advantage...
Nearly all large enterprises run Office applications on Intel-based PCs under Windows. But that does not mean some companies cannot find an advantage in using Mac or Linux. Yes, hardware is now so cheap that just about everyone can afford it. But being able to afford it does not mean being able to use it efficiently. While your competitors have access to the same online information you do, the ability to absorb and creatively use that information differs widely. It is not clear that ubiquitous, standardized technology can help any particular firm stand out, and its benefits are not automatic. If a company does not have capable people (or worse, has a PHB as the head of its "PC standardization" project), it will definitely overpay for technology and underutilize it (the incompetence of IT personnel definitely doubles IBM's or Dell's revenue stream). As we move to the datacenter level, and especially to datacenter architecture, the level of standardization is much lower and the issues are more complex. For example, the cases in which a company can benefit from adopting blades differ from the cases in which it can benefit from VMware, and without the proper level of knowledge no vendor will spell out the difference for you: each will push its own product. So while standardization in the small is evident, standardization in the large is not.
The fact that Lego consists of the same pieces does not limit the number of things you can create from it. The same is true of IT. Carr's vision is that we are about to shift to massive, centralized utility computing plants, in which individuals will not need to carry laptops because simplified terminals will be sufficient. The current trend is just the opposite: laptops are becoming more and more powerful and versatile.
Also, the current level of complexity of IT systems presupposes the need for specialists trained in the field for years, if not decades.
And if we take the opposite extreme, the complete "dilbertization" of IT is a source of huge competitive disadvantage, as many outsourcing efforts have demonstrated to surprised brass. Bad apples in IT can do real damage to the enterprise, as several failures of SAP/R3 implementations (some of which resulted in bankruptcies or fire sales of companies) have shown.
In the worst-case scenario, the company loses its critical mass of knowledge and becomes a hostage of unscrupulous outsourcers and greedy vendors. Conversely, the ability to pioneer new higher-level integration approaches or to improve the logistics infrastructure can be a source of competitive advantage.
The most questionable part of Carr's utopia is his assertion that defensive measures are more important than strategic vision and the adoption of new technology. This is a view I would call "defensive idiotism," and like other types of "trained idiotism" it is quite widespread. Carr's mantra is "Focus on vulnerabilities, not opportunities":
Focus on vulnerabilities, not opportunities. It's unusual for a company to gain a competitive advantage through the distinctive use of a mature infrastructural technology, but even a brief disruption in the availability of the technology can be devastating. As corporations continue to cede control over their IT applications and networks to vendors and other third parties, the threats they face will proliferate. They need to prepare themselves for technical glitches, outages, and security breaches, shifting their attention from opportunities to vulnerabilities
This is really the most dangerous advice of all, as Carr completely fails to understand the value of sound architecture in security and disaster recovery. Right now too many companies are focused on vulnerabilities and compliance, but it is architectural blunders and operational mistakes that destroy most shareholder value. As one industry study [New Report Reveals Causes for Shareholder Value Destruction] states:
"Only 13% of the decrease in shareholder value in these companies resulted from compliance failures. Sixty percent of the value destruction was attributable to strategic mistakes, such as misjudging customer demand or competitive pressure, or management ineffectiveness.
An additional 27% was due to operational blunders, such as cost overruns or poorly managed integration during mergers and acquisitions."
In other words, Carr's recommendation that companies should be more concerned with IT risk mitigation than with IT strategy (especially IT architecture) is complete baloney. There is an inherent relationship between the technology used, the architecture used, and the resulting vulnerabilities. For example, on Unix boxes viruses and malware are a lesser threat than on Windows because of the different architecture of the operating system.
As for architecture, neither high availability nor security can be achieved without sound IT architecture. Sound IT architecture (the result of a proper "IT strategy" and of adopting the advanced technologies that Carr discounts) is more important than any amount of "risk mitigation" activity, which in a large enterprise environment is most commonly a thoroughly "socialist" type of activity. In the best case it is a waste of money; often it does direct harm to the organization by giving power to clueless bureaucrats wearing security hats. Security managers are often the worst managers in the organization, written off for any production task because of their inability to make a constructive contribution, their micromanagement, their utter stupidity, and the like. As the SOX enthusiasts from the big accounting firms recently demonstrated to a surprised corporate world, it is possible to destroy a lot of money in the name of security, if doing so is profitable for consultants.
Concentration on vulnerabilities has another negative aspect, a social one. The real problem is the conservative role security staff usually play in the organization: security is almost never a strategic function, so its people have no real say in (or real understanding of) the IT architecture. The level of incompetence of security personnel can be staggering, and decisions influenced by such staff can have a profoundly negative effect on the organization. I know of one case in which a female executive secretary was promoted to head of IT security due to qualities other than mere technical competence, which was limited to the ability to work with Word and Excel. Another similar example, though not directly related to IT, is the well-known appointment of a homeland security adviser by former NJ governor McGreevey.
Often the security department is just a stone around IT's neck and an exile point for less competent staffers (or worse). IT architecture is of paramount importance, and no amount of "risk mitigation" blah-blah-blah can change that. Even incompetent security personnel cost money, and with better architecture you can get by with less of this personnel, which is, strictly speaking, overhead (the author knows this all too well from personal experience).
Copyright © 1996-2018 by Dr. Nikolai Bezroukov. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) in the author free time and without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
Last modified: September 12, 2017