I think Carr's understanding of data security is "cloudy" at best, but it is less clear to me how he managed to miss the threat to privacy that cloud computing represents. Then again, he has zero background in computer security.
TCP/IP networks, along with the tremendous opportunities they provide, also represent a strategic danger. The first debate about this topic happened around 1984. PGP was created by Phil Zimmermann in 1991, and a big debate followed. See PGP Timeline, History of PGP (PGP en français) and the Wikipedia article on the topic. Here is one relevant quote:
- The US gov gets a complaint from Bidzos that PGP breaks a bunch of laws. When customs first started investigating PRZ they were under the impression that PGP was developed by PKP and that PRZ stole it and was now distributing it around the world.
- USG decides that they don't like PRZ because the NSA can't tap all those internet mail messages anymore. (the NSA part is speculation, but in my opinion likely true).
- USG begins investigating PRZ for allegedly aiding an ITAR violation. It is clear from the very beginning that PRZ did not, and is not, encouraging export of PGP, as demonstrated by this excerpt from the pgp1.0 docs:
The Government has made it illegal in many cases to export good cryptographic technology, and that may include PGP. This is determined by volatile State Department policies, not fixed laws. Many foreign governments impose serious penalties on anyone inside their country using encrypted communications. In some countries they might even shoot you for that. I will not export this software in cases when it is illegal to do so under US State Department policies, and I assume no responsibility for other people exporting it without my permission.
- Phil Zimmermann legal defense fund (the yellow ribbon campaign) set up to cover his legal expenses. This defense fund is now closed since the investigation was dropped. See: http://www.netresponse.com/zldf/
That communication over the public Internet has important security implications was quite clear around 1997: the Report to the President's Commission on Critical Infrastructure Protection (pdf) explicitly warned about loss of confidence in network services due to network attacks:
As described earlier, the Internet was designed to survive the disruption of its transport mechanism; but once data was somehow successfully delivered, users believed it to be legitimate. The “internal” attacks now possible enable an intruder to modify programs and configuration files in subtle ways so that they still appear to work. The programs may even appear to be unmodified but will fail under circumstances specified by the intruder. After a successful computer system intrusion, it can be very difficult or impossible to determine precisely what subtle damage, if any, was left by the intruder. Loss of confidence can result even if an intruder leaves no damage because the site cannot prove none was left. With some infrastructures, such as electricity, gas, and emergency services, once an overt denial-of-service attack has been resolved and the service returned, consumers immediately regain trust in the service they receive. But the Internet is highly susceptible to a loss-of-confidence crisis. Only recently have some vendors begun using a cryptographic technique (checksums) that makes it possible to determine whether files or programs have been modified, and providing features that prevent modification of system files.
In summary, intruders on the Internet continue to prey on the lack of security in many of the products and protocols in use on the Internet today. As the U.S. becomes more dependent on the Internet, the potential impact of a successful Internet-based attack against the U.S. increases. The next section describes examples of the possible effect of Internet attacks on several critical national infrastructures.
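The cryptographic checksum technique the report alludes to can be sketched in a few lines. This is a minimal illustration using SHA-256 from Python's standard library, not the specific mechanism any vendor shipped:

```python
import hashlib

def file_digest(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_unmodified(path: str, known_digest: str) -> bool:
    """Compare the file's current digest against a previously recorded one."""
    return file_digest(path) == known_digest
```

If the digests of system files are recorded on trusted, read-only media, even a subtle post-intrusion modification of a program becomes detectable, which is exactly the loss-of-confidence problem the report describes.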
Around 2003 networked-computer-related threats became much more prominent and critically important, and not to mention them is plain vanilla incompetence or worse. CERT issued its influential advisory Home Network Security in June 2001. On July 16, 2001 Code Red started propagating on Microsoft Windows systems with IIS installed and affected more than 250,000 hosts. It exploited the vulnerability described in CERT advisory CA-2001-13 Buffer Overflow In IIS Indexing Service DLL. It was essentially a proof of concept that snooping software can easily be installed on a large number of networked systems by exploiting unpatched vulnerabilities.
The current push of users into the cloud has even more sinister implications. Those cloud services suddenly start to smell like the KGB and the Stasi. As Evgeny Morozov noted in The Internet Ideology:
Now that our communication networks are in the hands of the private sector, we should avoid making the same mistake with privacy. We shouldn’t reduce this complex problem to market-based solutions. Alas, thanks to Silicon Valley’s entrepreneurial zeal, privatization is already creeping in. Privacy is becoming a commodity. How does one get privacy these days? Just ask any hacker: only by studying how the right tools work. Privacy is no longer something to be taken for granted or enjoyed for free: you have to expend some resources to master the tools. Those resources could be money, patience, attention – you might even hire a consultant to do all this for you – but the point is that privacy is becoming expensive.
... ... ...
The trouble with Silicon Valley is not just that it enables the NSA –it also encourages, even emboldens them. It inspires the NSA to keep searching for connections in a world of meaningless links, to record every click, to ensure that no interaction goes unnoticed, undocumented and unanalyzed. Like Silicon Valley, NSA assumes that everything is interconnected: if we can’t yet link two pieces of data, it’s because we haven’t looked deep enough – or we need a third piece of data, to be collected in the future, to make sense of it all.
A joint CSI/FBI study found that between 2000 and 2003 about 40% of all companies confronted an attempted information breach each year. Other known facts from the early 2000-2005 period looked at least equally alarming (How Much is Too Much Data Loss - InternetNews, May 06, 2005):
This statistic means that cloud computing companies serve as amplifiers of network security risks.
If we talk about security, then cloud computing should probably be named "Big Brother Computing," and not only due to the natural interest in such services from government three-letter agencies. In a sense a cloud service is a "Big Brother" which holds (and analyzes) all your data and can do with them whatever it thinks is necessary. You are no longer in control. Along with mobile technologies, cloud computing is really dissolving both private citizens' privacy and especially the corporate security perimeter, where most security efforts were concentrated in the past. In other words, it eliminates the last bits of privacy that used to exist in this interconnected world.
Actually, storage of data is not the only privacy problem. In his book on Google, The Search, John Battelle noted a fundamental problem with "in the cloud" services:
Every search we make, every link we click, every word we write, every moment we spend looking at a page - each is a little piece of data about ourselves that we leave behind.
Here is another warning. Garett Rogers made several interesting observations about possible pitfalls for users of Google apps:
Things you need to think about:
- You are putting your application in Google’s hands
Think about that for a minute. You are at the mercy of Google — if disaster strikes and Google one day disappears, you are done too. Or, more realistically, if the Google App Engine goes down for an hour, you are also down for an hour — and you will have no idea what happened. Even if you try and get an answer from someone at Google, you won’t. Just like Google Apps, it will be impossible to explain things to your end users.
What if you are violating some terms of service (which likely won’t, but theoretically could happen to people without their knowing)? You thought making your company’s revenue dependent on AdSense was risky — what if your whole application was banned because of something you didn’t know about? Like I said, this scenario isn’t likely to happen — but it’s true that it could.
- Once you are in, you are really in
Using Google’s infrastructure is very tempting. But any smart company should have some sort of plan for the future. What if you realized that you didn’t want to host your application on Google App Engine anymore? Good luck: almost everything you are given access to is proprietary — that means all your data is locked into BigTable in a format that isn’t like a traditional relational database. It’s also very tempting to use the APIs Google provides to interface with things like Google accounts.
On top of that, you will be using the “webapp” framework that Google built that makes writing Python applications real nice — but good luck porting that to another language or putting it on a machine of your own.
- It’s free right?
Not only are you locked in, you are completely at the mercy of Google’s future pricing strategy for the Google App Engine. It’s true that it’s likely to be cheaper than anything else comparable, but are you willing to take that risk? Right now it’s free, so everyone and their dog wants to at least give it a try — but what if your application actually really takes off? You will one day have to pay for your success, or shut down your service.
- Privacy should not be taken lightly
In other words, Carr's recommendation "Focus on vulnerabilities, not opportunities" is completely naive, as it ignores the new threats that arise from the use of cloud services. Spending on patching vulnerabilities while using cloud services can be called boondoggle investment:
Boondoggle, in the sense of a term for a project that wastes time and money, first appeared during the Great Depression in the 1930s, referring to the millions of jobs given to unemployed men and women to try to get the economy moving again, as part of the New Deal.
It came into common usage after a 1935 New York Times headline claimed that over $3 million had been spent teaching the jobless how to make boon doggles.
Focusing on vulnerabilities gives nothing if there are serious architectural problems. And storing your sensitive data in the cloud is an architectural problem and should be recognized as such. Concentrating on vulnerabilities while your data are stored at a remote datacenter is like relocating into a swamp and then trying to manually catch all the mosquitoes.
Architectural issues related to the move to a private cloud are mind-boggling. One definite side effect of large cloud service providers is that they are natural points of government surveillance. Here we are talking not about a single government with jurisdiction over the territory of the datacenter, but about several governments with their own sets of three-letter agencies interested in metadata and the content of particular folders. Also, security of data within large datacenters is always problematic due to the size and complexity of the operation.
And that means that large corporations are wary about their employees exposing trade secrets and confidential information via such providers. For example, in May 2012 IBM forbade its employees from using cloud-based services such as Siri, Dropbox and iCloud. [IBM's Ban]
And the main danger is not the security of documents inside the "cloud" service providers per se, although that is also an issue (and blatant disregard for security by "cloud center" staff is certainly possible). The main danger to security is that, to survive, the "cloud centers" need to be really big. As such they become a natural target of government agencies' attention. And not only government agencies: for example, the presence of criminal clients affects everybody else, as it can lead to an FBI raid on the "cloud datacenter" with subsequent confiscation of the equipment, a possibility that is often overlooked in advertising of the cloud computing model (When the FBI Raids a Data Center -- A Rare Danger).
Also, really big datacenters put a premium on expertise and competence and increase the chances of leaks of information due to the large number of technically sophisticated personnel involved. The larger the datacenter, the more lucrative a target it is for infiltration by representatives of foreign governments and other organizations which can benefit from the data it contains.
While Google denies having knowledge of the NSA PRISM scheme, it recently confirmed that its Transparency Report does not include data on NSA surveillance. Examining the roles and responsibilities of the private sector, the UN Special Rapporteur stated: "States must ensure that the private sector is able to carry out its functions independently in a manner that promotes individuals' human rights." This report, published in April 2013, was the first explicit statement by a UN body about the dangerous increase in state surveillance that became possible due to the growth of "in the cloud" providers.
As surveillance technologies and methods advance, communications traffic data – traditionally treated as “less private” by law and as such subjected to lower authorization thresholds – becomes a treasure trove from which the State can derive vast amounts of information. This includes who we talk to and for how long; where we go and who we meet; who we bank with, shop with and receive a variety of other services from – creating a detailed profile of our associations, movements, relationships, and activities.
For example, Google has millions of users, and while in a "normal" corporate environment a typo in most cases leads to a bounced email, in Google Gmail the story can be different, as you can see below. Also, in regular corporate email, even if the address produced by a typo exists, the person belongs to the same corporation, which minimizes the consequences. The sheer scale of Google's userbase creates a new type of problem.
The amplification of the consequences of accidental errors (like a typo in an e-mail address) creates serious security problems, like sharing sensitive documents with complete strangers. This "amplification" of innocent or not-so-innocent errors due to the size of the datacenter and the sharing of computer facilities between many (sometimes criminal) clients is a very underappreciated problem in cloud computing.
Also, Google in its infinite wisdom does not distinguish between two email addresses if the names differ only by dots (as in "joe.user" and "joeuser").
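That dot-insensitive matching can be illustrated with a short normalization function. This is a sketch of the behavior as described, not Google's actual routing code:

```python
def gmail_canonical(address: str) -> str:
    """Normalize an address the way Gmail routing treats it:
    dots in the local part are ignored, and case is ignored."""
    local, _, domain = address.partition("@")
    domain = domain.lower()
    if domain in ("gmail.com", "googlemail.com"):
        local = local.replace(".", "")
    return local.lower() + "@" + domain
```

So "joe.user@gmail.com" and "joeuser@gmail.com" canonicalize to the same mailbox, which is exactly why a dot-related typo does not bounce but lands in a stranger's inbox.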
Google Apps for Your Domain (the paid service Google offers) actually includes a setting that admins can turn on that will either prevent users from sharing Google Docs or internal web pages with email addresses outside the company domain, or prominently warn users when they attempt to do so. But as you can expect, in most cases this setting is disabled. So fallible humans get another powerful amplifier for their errors and misjudgments.
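The safeguard that setting provides amounts to a simple domain check before a share goes out. Here is a minimal sketch; the function and parameter names are hypothetical, not part of any Google API:

```python
def check_share(recipients: list[str], company_domain: str,
                block_external: bool = False) -> list[str]:
    """Return the recipients that are outside the company domain.
    If the admin chose to block external sharing, raise instead."""
    suffix = "@" + company_domain.lower()
    external = [r for r in recipients if not r.lower().endswith(suffix)]
    if external and block_external:
        raise PermissionError("external sharing blocked: %s" % external)
    return external
```

A warning dialog would show the returned list; with `block_external=True` the share simply fails, which is the stricter of the two admin options described above.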
Here is a recent (Aug 27, 2008) NYT article by David F. Gallagher that explains this danger quite nicely: I’m in Your Google Docs, Reading Your Spreadsheets. Please note that this is a different situation from a misdirected e-mail: he not only could read the spreadsheets, he could also monitor updates to them over time through handy e-mail alerts, and even edit them, which could have led to some real mischief if he were the mischievous type. He essentially became an invisible member of the team. This is something that is unique to "in the cloud" sharing systems like Google Docs.
Sharing documents with your co-workers via Google Docs sure is convenient. It can also be hazardous. Make one little typo and your sensitive data could fall into the hands of… someone like me.
Last spring a batch of invitations to collaborate on some Google Docs spreadsheets showed up in my Gmail account. The spreadsheets had something to do with Web advertising. I forgot about them until I signed up recently for Google Analytics, a free service that lets Web-site owners track visitors and monitor traffic trends.
Waiting for me there were live Web traffic reports for a whopping 130 sites, most of them belonging to heartland newspapers like The Muskogee (Oklahoma) Phoenix, The Oskaloosa (Iowa) Herald and The Shelbyville (Illinois) Daily Union. All of these are part of Community Newspaper Holdings Inc. (CNHI), a company based in Birmingham, Ala., that owns more than 90 daily papers around the country.
Closer examination of the spreadsheets, along with some online digging, indicated that a CNHI employee had most likely intended to share the reports and spreadsheets with an employee named Deirdre Gallagher. Instead, he or she typed in my Gmail address and handed me the keys to a chunk of CNHI’s Web kingdom, including the detailed financial terms for scores of Web advertising deals.
The spreadsheets were shared among several CNHI advertising and Internet executives, including Chris Muldrow, vice president for Internet operations, who said earlier this year that the company’s sites were getting tens of millions of page views a month.
Mr. Muldrow is on vacation this week, and other CNHI employees did not respond to phone and e-mail messages asking about the glitch. Someone at the company did, however, rescind most of my sharing privileges Wednesday.
There was a time when it would have taken a fair amount of criminal activity to get access to this much information about a company’s internal workings and Web site performance. Now an employee can accidentally drop it into the lap of a random outsider without even knowing that anything is amiss. That’s the power of cloud computing at work.
Most of the discussion about the security of online applications revolves around whether or not you can trust Google and its competitors to protect your data. In this case, CNHI needed to be protected from its own employee. Google could help with this by, for example, flashing a warning before you share a document with a person you have not exchanged e-mail with in Gmail. But in the end, security requires careful typing — and perhaps some careful decisions about whether some documents would be better left behind the corporate firewall.
Several reader responses provided additional facets of the problem:
Or deliberately drop it into the lap of an outsider with a fantastic alibi. Corporate espionage gets easier from both directions.
Of course, this happens with other technologies as well. I occasionally get mis-delivered real mail, and I get frequent wrong-number phone calls (especially calls intended for the Quebec Tourism Board). It just goes a lot deeper when you put everything you’ve got into the cloud instead of keeping it in your own hands.
— Posted by JD
One thing I learned from Google is that they ignore periods in e-mail addresses, so john.doe and johndoe and j.o.h.n.d.o.e are all the same account. Since I always use a ‘.’ between my first and last names, I can usually recognize misdirected e-mail by the absence of the ‘.’ and use a filter to automatically sweep it from my inbox.
— Posted by PJJ
If anything, this was a problem that started with the invention of email.
Also, if they were using the Enterprise version of Google Docs, their admin could lock it down so that files could only be shared with users within the company’s domain.
In any case, I guess Google becomes the easy target because they are so successful right now. But, they didn’t invent the Internet, they didn’t invent email, and they didn’t invent typos.
— Posted by StareClips.com
I think that the commentators who mention using the enterprise version of Google Docs have hit the nail on the head. Obviously “free” has a lot of appeal for some people, but for a business this is unacceptable.
As a side note I own the domain copperbooks.com while a well known Bay Area bookstore owns copperbook.com. Needless to say I get a ton of emails not meant for me.
— Posted by Tim
Contra Ben, when IT says no it usually means the app is useful and you should push to use it.
— Posted by Don N.,
> How is this specific to Google Docs, or even cloud computing?
It is different, because of the way Google Docs is structured. Nearly every company, except for perhaps the smallest ones, has internal email, where you select other employees from some type of company address book. This limits the opportunity for addressing typos in the scenario you outline.
However, because Google docs are on the internet and not the intranet, and they’re well integrated with other Google products like iGoogle and gmail, it is convenient for many to share the docs via a non-company gmail account, thus introducing the possibilities discussed here.
— Posted by Ted
Google should be able to contribute to the fix, with a company being able to specify a restricted list of addresses. There’s no reason not to restrict to the company’s own domain name.
And it is trivial to use gmail with your own domain. Many registrars now have Google Apps integrated with their services, such as idotz.net. Companies should not be conducting business at gmail.com addresses, sheesh.
Something strange that I just discovered is that Google has gotten into my browser somehow. I have not downloaded Google Apps nor installed a Google toolbar on my Mac running Apple Darwin Unix, but when I look at my web traffic using the fabulous ‘Privoxy’, I can see things like this being retrieved:
New HTTP Request-Line: POST /safebrowsing/downloads?[snip]
Aug 29 14:17:56.052 Privoxy Connect: to safebrowsing.clients.google.com successful
Header: scan: HTTP/1.1 200 OK
Header: scan: Content-Type: application/vnd.google.safebrowsing-update
This apparently has to do with protecting users from malware sites:
But how did it get into my system, and why is it POSTing data to Google? I’ll have to hunt it down and kill it.
— Posted by Eric Blair
Issues of control over data
First of all, most companies want control over their data. That means that using outside data storage increases anxiety, which in turn, after the train has left the station, stimulates investment in stupid and expensive solutions to the problems introduced by moving the data into the cloud. These are often adopted without a thorough review of benefits, or under direct marketing pressure, which in turn decreases reliability and further increases anxiety :-). A perfect Catch-22.
History shows quite convincingly that outside data storage can be compromised by a disgruntled employee or a random hacker. There were several high-profile cases of such compromises at Web hosting providers. Moreover, the company that uses the service has no control over the social atmosphere at the service provider (which probably feels tremendous pressure to cut costs and might well have the atmosphere of a sweatshop -- much like many outsourcers). And nobody wants to lose their intellectual property due to glitches at the provider, no matter how cheap the service is (usually it is not; see below) or how reliable it is.
Stealing electricity doesn't do the kind of damage that stealing payroll data with SSNs does. In a way Nicholas Carr lives in a dream world of unpatented, unprotected information: for him information security does not matter, as if all information consisted of HBR articles. The real world is different. As I mentioned before, any large-scale breach at an ISP affects all its customers, and the PR losses can be tremendous, up to and including going out of business. Such intrusions happened with several Web service providers in 2007 and before. There is no reason to believe this trend will not continue: the most reckless customer with the most insecure software serves as an entry point for the intruder, and if this customer has access to the root account the game is over. Often the game is over even if he does not, as availability of a local account is in many cases enough to discover a vulnerability that can be exploited to get root.
Low attention to security happens in the Web provider world more often than Carr would like to admit, and can actually be attributed to misdirected cost-cutting efforts. Putting lipstick on the pig after the fact with "Hacker Proof" certificates etc. does not solve the problem: the infrastructure is complex, the user population is diverse, the staff is overloaded with other problems, and overall security is only as good as the weakest link. As Bill Thompson aptly noted in his BBC article:
It is often useful to conceptualize online activities as cyberspace, the place behind the screen, but the internet is firmly of the real world, and that is one of the greatest problems facing cloud computing today.
In the real world national borders, commercial rivalries and political imperatives all come into play, turning the cloud into a miasma as heavy with menace as the fog over the Grimpen Mire that concealed the Hound of the Baskervilles in Arthur Conan Doyle's story.
The issue was recently highlighted by reports that the Canadian government has a policy of not allowing public sector IT projects to use US-based hosting services because of concerns over data protection.
Under the US Patriot Act the FBI and other agencies can demand to see content stored on any computer, even if it is being hosted on behalf of another sovereign state.
If your data hosting company gets a National Security Letter then not only do they have to hand over the information, they are forbidden from telling you or anyone else - apart from their lawyer - about it.
The Canadians are rather concerned about this, and rightly so. According to the US-based Electronic Frontier Foundation, a civil liberties group that helped the Internet Archive successfully challenge an NSL, more than 200,000 were issued between 2003 and 2006, and the chances are that Google, Microsoft and Amazon were on the recipient list.
An important problem with SaaS is that your data are stored at another firm, possibly in a proprietary format, and then the firm goes out of business. What happens to that data? Will you ever get it back? In what format, and will it be usable? What if their servers get hacked and your data is modified? If you are a regulated entity, will their policies about data loss affect you?
Another problem is connected with lawsuits. The situation is definitely more dangerous if company email is stored on the provider's servers:
Companies have no real choice, but to comply with the law in countries where they operate, and I don't expect a campaign of civil disobedience from the big hosting providers.
Those of us who use the cloud just need to be clear about the realities of the situation - and not send or store anything on GoogleMail or HotMail that the US government might want to use against us.
The latter is one of the most powerful arguments against using "in the cloud" mail services. France's decision to ban government ministers from using Blackberries, since the messages are stored on servers sitting in data centers in the US and the UK, belongs to the same category.
Encrypting the data stored in data centers won't always work, as it denies the user the benefits of local processing power: the data need to be decrypted before they can be processed on the remote server. Encryption might work for storage, but this is a huge overhead and a potential source of problems, as an encrypted archive is a much less reliable backup than an unencrypted one: recovery of a damaged encrypted archive is usually not possible. My experience suggests that companies usually lose more data to encryption snafus than to any other source of data corruption.
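The fragility argument is easy to demonstrate: in a compressed (and, a fortiori, encrypted) archive, a single corrupted byte can render the whole archive unrecoverable, whereas a plain copy would lose only that byte. A small illustration using zlib compression as a stand-in for an encrypted archive:

```python
import zlib

# An "archive" of sensitive records, stored compressed.
data = b"payroll records " * 100
archive = zlib.compress(data)

# Simulate storage corruption: flip one byte in the middle.
damaged = bytearray(archive)
damaged[len(damaged) // 2] ^= 0xFF

try:
    zlib.decompress(bytes(damaged))
    recovered = True
except zlib.error:
    # The decompressor rejects the stream: the whole archive
    # is lost, not just the one corrupted byte.
    recovered = False
```

The same all-or-nothing behavior applies to most encrypted formats, which is why an encrypted backup needs its own integrity checking and redundancy on top.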
Issues of trust are even more complex. IT managers should also think twice before assuming that consultants and systems integrators are always acting as objective advisers when recommending best-of-breed products and technologies, experts say. Many client companies still view consultants as neutral parties bringing the best solution, but that's often not the case. They don't view it as a problem until a large project blows up in their face.
Recently, however, the ties linking vendors and service providers have grown tighter. Consultants and vendors have been investing in one another and launching joint companies. Cisco Systems Inc., for example, invested $1 billion in KPMG Consulting LLC when it launched in January as a separate company from KPMG LLP.
As I mentioned above, it's a myth to think you can bring in a systems integrator at the beginning and delegate responsibility for architecture to the integrator. Just as these companies may be forming alliances with specific vendors, they also have varying levels of experience and skill in different technologies. And having final "résumé control" -- in other words, being able to handpick the specialists from consultancies and integrators that will work on a project -- might be a limited remedy. The last thing you want is a consultant who comes in unfamiliar with the technology you're implementing and uses your company as a testbed. Among the questions to ask:
The quality of backups is another problem that is more complex than it looks in advertising materials. Often there is some kind of vendor lock-in that stimulates users to keep everything on the remote server, because the remote application they are using does not provide a friendly way to back up data locally. In such cases, situations where a lot of data is lost due to a crash are not that uncommon. Here is one relevant blog entry (The Problems and Challenges with Software as a Service, Gear Diary):
Gear Diary is a WordPress-enabled site, so many team members use the online WYSIWYG editor to create and edit content. It saves drafts, allows you to upload (and even watermark) graphics/pictures, and is, for all intents and purposes, an online word processor, much like Google Docs. When the site went belly up, most of the content headed south of the border as well. Most team members had not saved a local copy of their work… which got me thinking…
One of the biggest and hottest trends I’ve been hearing a lot about lately is software as a service, a la Google Docs, Office Live, etc. If you take the Gear Diary site issue as a point of reference, and apply software as a service (which is basically what WordPress is acting as), you get an interesting and fairly destructive situation. WordPress doesn’t offer any kind of method of saving its documents locally, or in a format that can be read (or edited) by any other local application. Despite the fact that WordPress creates HTML documents, all data stays on the server.
If you bump into a server issue, i.e. the server goes down, your data can get lost. It happened to Gear Diary. It can happen to any user of a software-as-a-service app. What bothers me more is that unless there’s a specific viewer or offline editing tool for the document type, the data is useless. Further, if the app doesn’t allow you to save data locally, an offline viewer isn’t going to do much good anyway.
Many users here have taken to copying their content out of WordPress and saving it as a text or HTML file. That at least gets the data out and saved to your local hard drive. However, it doesn’t address disaster recovery on the client side (which was one of the big draws, aside from cost savings and the lack of deployment problems… all you need in most cases is a compatible browser…).
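The obvious mitigation is exactly the habit described above: keep timestamped local copies of anything you edit in a SaaS application. A minimal sketch; the `fetch` callable is a placeholder for whatever export mechanism the service actually offers (a WordPress XML export, a Google Docs download, etc.):

```python
import os
import time

def backup_locally(doc_id: str, fetch, backup_dir: str = "backups") -> str:
    """Save a timestamped local copy of a remote document.
    `fetch` is a placeholder callable that returns the document
    text when given its id; it stands in for a real export API."""
    os.makedirs(backup_dir, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    path = os.path.join(backup_dir, "%s-%s.html" % (doc_id, stamp))
    with open(path, "w", encoding="utf-8") as f:
        f.write(fetch(doc_id))
    return path
```

Run on a schedule, this turns "all data stays on the server" into "the server holds the working copy and I hold the history", which is the only posture that survives the provider's crash.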
As we noted above, Carr's recommendation to focus on vulnerabilities instead of opportunities is extremely dangerous. The key to minimizing "technical glitches, outages, and security breaches" is a "state of the art" IT architecture, which naturally belongs to the category of "opportunities".
This is an area where I spent several years of my professional career, and I can attest that there is no better way to waste a lot of money (millions for a large enterprise) than to concentrate on vulnerabilities instead of the opportunities which new technologies provide.
Part of the reason why many organizations have multi-million dollar security boondoggles is that security, like love, is a very complex notion, and its interpretations by end users, administrators and system integrators are all different:
Users want to be reasonably secure, but without any additional hassle
Administrators want the same, but without sacrificing the flexibility of their systems or existing administration tools and without complicating their daily workload
System integrators and software providers want to sell you as many security solutions as possible, shoving them down your throat if necessary.
The key issue here is that the security department actually serves as a Trojan horse for system integrators and software providers. They just want the problem to go away, because they have no technical or architectural knowledge necessary for understanding what exactly caused the problem in the first place. Thus they easily believe in miracle cures and "snake oil" security products. What is even worse, they sincerely believe that spending money on additional products will solve the issue, precisely because they are completely unaware of the real roots of the problems and of the ways of implementing practical solutions. They view their mission more from the point of view of "protecting their own butt by adopting additional policies and procedures": the goal is compliance with existing regulations and best practices, even if they are completely inapplicable to the particular situation.
At the same time they do not possess the understanding of the infrastructure necessary for creating feasible policies and procedures, so their creations usually suffer from a heavy dose of red tape and are hated and by-and-large ignored.
The real solution is to view a security problem as one (tiny) symptom of an architectural problem in the current infrastructure, and as the first line of defense always try to improve the architecture in such a way that the problem can be compartmentalized, minimized or eliminated. That requires the presence of talented architects who have the institutional power to bring the other parties to the table and hammer out the best solutions, even if those solutions are not understood by administrators (that can happen, but is not very common) or by the security department (which is the usual situation).
Copyright © 1996-2020 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
Last modified: March 12, 2019