May the source be with you, but remember the KISS principle ;-)
Bigger doesn't imply better. Bigger often is a sign of obesity, of lost control, of overcomplexity, of cancerous cells

Software Engineering


Software Engineering: A study akin to numerology and astrology, but lacking the precision of the former and the success of the latter.

KISS Principle     /kis' prin'si-pl/ n.     "Keep It Simple, Stupid". A maxim often invoked when discussing design to fend off creeping featurism and control development complexity. Possibly related to the marketroid maxim on sales presentations, "Keep It Short and Simple".

creeping featurism     /kree'ping fee'chr-izm/ n.     [common] 1. Describes a systematic tendency to load more chrome and features onto systems at the expense of whatever elegance they may have possessed when originally designed. See also feeping creaturism. "You know, the main problem with BSD Unix has always been creeping featurism." 2. More generally, the tendency for anything complicated to become even more complicated because people keep saying "Gee, it would be even better if it had this feature too". (See feature.) The result is usually a patchwork because it grew one ad-hoc step at a time, rather than being planned. Planning is a lot of work, but it's easy to add just one extra little feature to help someone ... and then another ... and another... When creeping featurism gets out of hand, it's like a cancer. Usually this term is used to describe computer programs, but it could also be said of the federal government, the IRS 1040 form, and new cars. A similar phenomenon sometimes afflicts conscious redesigns; see second-system effect. See also creeping elegance.
Jargon file

Software engineering (SE) probably has the largest concentration of snake oil salesmen after OO programming, and software architecture is no exception. Many published software methodologies/architectures claim to provide benefits that most of them cannot deliver (UML is one good example). I see a lot of oversimplification of the real situation and unnecessary (and useless) formalisms. The main idea advocated here is simplification of software architecture (including usage of the well-understood "Pipe and Filter" model) and of scripting languages.
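
To make the "Pipe and Filter" idea concrete, here is a minimal sketch in Python (the file name and the filter stages are my own invented illustration, not a prescription): each filter is a generator that consumes a stream and yields a transformed stream, so stages compose exactly like a Unix pipeline.

    # Create a small sample log so the sketch is self-contained.
    with open("app.log", "w") as f:
        f.write("INFO start\nERROR disk full\nINFO retry\nERROR timeout\n")

    def read_lines(path):
        # Source filter: stream lines from a file lazily.
        with open(path) as f:
            for line in f:
                yield line.rstrip("\n")

    def grep(pattern, lines):
        # Filter: keep only lines containing the pattern.
        for line in lines:
            if pattern in line:
                yield line

    def to_upper(lines):
        # Filter: transform each surviving line.
        for line in lines:
            yield line.upper()

    # Compose the pipeline; nothing runs until the sink iterates.
    for line in to_upper(grep("ERROR", read_lines("app.log"))):
        print(line)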

There are few quality general architectural resources available on the Net, so the list below represents only some links in which I am personally interested. The stress here is on skepticism, and this collection is neither complete nor up to date. Still, it might help students trying to study this complex and interesting subject. Or perhaps, if you are already a software architect, you might be able to expand your knowledge of the subject.

Excessive zeal in adopting some fashionable but questionable methodology is a "real and present danger" in software engineering. This is not a new threat: it started with the structured programming revolution, and continued with the search for the verification "holy land" with Edsger W. Dijkstra as the new prophet of an obscure cult. The main problem here is that all those methodologies contain perhaps 20% useful elements, but the other 80% kills the useful elements and probably introduces some real disadvantages. After a dozen or so partially useful but mostly useless methodologies came, were enthusiastically adopted, and went into oblivion, we should definitely be skeptical.

All this "extreme programming" idiotism or CMM Lysenkoism should be treated as we treat dangerous religious sects.  It's undemocratic and stupid to prohibit them but it's equally dangerous and stupid to follow their recommendations ;-). As Talleyrand advised to junior diplomats: "Above all, gentlemen, not too much zeal. "  By this phrase, Talleyrand was reportedly recommended to his subordinates that important decisions must be based upon the exercise of cool-headed reason and not upon emotions or any waxing or waning popular delusion.

One interesting fact about software architecture is that it can't be practiced from the "ivory tower". Only when you do the coding yourself and face the limitations of the tools and hardware can you create a great architecture. See Real Insights into Architecture Come Only From Actual Programming

The primary purpose of software architecture courses is to teach students some higher-level skills useful in designing and implementing complex software systems. They usually include some information about classification (general and domain-specific architectures), analysis, and tools. As the folks at Bredemeyer Consulting aptly noted in their paper The Role of the Software Architect:

A simplistic view of the role is that architects create architectures, and their responsibilities encompass all that is involved in doing so. This would include articulating the architectural vision, conceptualizing and experimenting with alternative architectural approaches, creating models and component and interface specification documents, and validating the architecture against requirements and assumptions.

However, any experienced architect knows that the role involves not just these technical activities, but others that are more political and strategic in nature on the one hand, and more like those of a consultant, on the other. A sound sense of business and technical strategy is required to envision the "right" architectural approach to the customer's problem set, given the business objectives of the architect's organization. Activities in this area include the creation of technology roadmaps, making assertions about technology directions and determining their consequences for the technical strategy and hence architectural approach.

Further, architectures are seldom embraced without considerable challenges from many fronts. The architect thus has to shed any distaste for what may be considered "organizational politics", and actively work to sell the architecture to its various stakeholders, communicating extensively and working networks of influence to ensure the ongoing success of the architecture.

But "buy-in" to the architecture vision is not enough either. Anyone involved in implementing the architecture needs to understand it. Since weighty architectural documents are notorious dust-gatherers, this involves creating and teaching tutorials and actively consulting on the application of the architecture, and being available to explain the rationale behind architectural choices and to make amendments to the architecture when justified.

Lastly, the architect must lead--the architecture team, the developer community, and, in its technical direction, the organization.

Again, I would like to stress that the main principle of software architecture is simple and well known: it's the famous KISS principle. While the principle is simple, its implementation is not, and a lot of developers (especially developers with limited resources) have paid dearly for violating it. I have found only one reference on simplicity in SE: R. S. Pressman. Simplicity. In Software Engineering, A Practitioner's Approach, page 452. McGraw Hill, 1997. Here open source tools can help, because for those tools complexity is not the competitive advantage that it is for closed source tools. But that is not necessarily true of actual tools, as one problem with open source projects is a change of leader. This is the moment when many projects lose architectural integrity and become a Byzantine compendium of conflicting approaches.

I appreciate an architecture of a software system that leads to a small implementation with a simple, Spartan interface. These days the use of scripting languages can cut the volume of code by more than half in comparison with Java. That's why this site advocates the use of scripting languages for complex software projects.

"Real Beauty can be found in Simplicity," and as you may know already, ' "Less" sometimes equal "More".' I continue to adhere to that philosophy. If you, too, have an eye for simplicity in software engineering, then you might benefit from this collection of links.

I think writing a good software system is somewhat similar to writing a multivolume series of books. Most writers rewrite each chapter several times and change the general structure a lot. Rewriting large systems is more difficult, but also very beneficial. It always makes sense to consider the current version of the system a draft that can be substantially improved and simplified by discovering some new unifying and simplifying paradigm. Sometimes you can take a wrong direction, but still, "nothing ventured, nothing gained."

On the subsystem level, a decent configuration management system can make it easier to go back. Too often people try to write and debug their architecturally flawed "first draft" when it would have been much simpler and faster to rewrite it based on a better understanding of the architecture and of the problem. Rewriting can actually save the time spent debugging the old version. That way, when you're done, you may get an easy-to-understand, simple software system, instead of one that just "seems to work okay" (i.e., is only as correct as your testing).

On the component level, refactoring (see Refactoring: Improving the Design of Existing Code) might be a useful simplification technique. Actually, rewriting is a simpler term, but let's assume that refactoring is rewriting with some ideological frosting ;-). See Slashdot Book Reviews Refactoring Improving the Design of Existing Code.
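
As a tiny illustration of refactoring in the Fowler sense, that is, restructuring that preserves behavior (the example below is mine, not one from the book): extract the duplicated calculation and name the magic constant.

    from math import isclose

    # Before: duplicated arithmetic and an unexplained magic number.
    def price_with_tax_old(items):
        total = 0
        for item in items:
            total = total + item["price"] * item["qty"]
        total = total + total * 0.08  # what is 0.08?
        return total

    # After: same external behavior, with the summation extracted and
    # the constant named. Behavior preservation is what distinguishes
    # refactoring from a rewrite.
    SALES_TAX_RATE = 0.08

    def subtotal(items):
        return sum(item["price"] * item["qty"] for item in items)

    def price_with_tax(items):
        return subtotal(items) * (1 + SALES_TAX_RATE)

    cart = [{"price": 10.0, "qty": 2}, {"price": 3.5, "qty": 1}]
    assert isclose(price_with_tax_old(cart), price_with_tax(cart))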

Another relevant work (the author tries to promote his own solution, a part you can skip) is the critique of the "technology mudslide" in the book The Innovator's Dilemma by Harvard Business School professor Clayton M. Christensen. He coined the term "technology mudslide", a concept very similar to Brooks's "software development tar pit" -- a perpetual cycle of abandonment or retooling of existing systems in pursuit of the latest fashionable technology trend, a cycle in which

 "Coping with the relentless onslaught of technology change was akin to trying to climb a mudslide raging down a hill. You have to scramble with everything you've got to stay on top of it. and if you ever once stop to catch your breath, you get buried."

The complexity caused by adopting new technology for the sake of new technology is further exacerbated by the narrow focus and inexperience of many project leaders -- inexperience with mission-critical systems, with systems of larger scale than previously built, with software development disciplines, and with project management. A Standish Group International survey recently showed that 46% of IT projects were over budget and overdue -- and 28% failed altogether. That's typical, and the real failure figures are probably higher: great software managers and architects are rare, and it is those people who determine the success of a software project.

Dr. Nikolai Bezroukov



NEWS CONTENTS

Old News ;-)

[Sep 21, 2018] 'It Just Seems That Nobody is Interested in Building Quality, Fast, Efficient, Lasting, Foundational Stuff Anymore'

Sep 21, 2018 | tech.slashdot.org

Nikita Prokopov, a software programmer and author of Fira Code, a popular programming font, AnyBar, a universal status indicator, and some open-source Clojure libraries, writes :

Remember times when an OS, apps and all your data fit on a floppy? Your desktop todo app is probably written in Electron and thus has userland driver for Xbox 360 controller in it, can render 3d graphics and play audio and take photos with your web camera. A simple text chat is notorious for its load speed and memory consumption. Yes, you really have to count Slack in as a resource-heavy application. I mean, chatroom and barebones text editor, those are supposed to be two of the less demanding apps in the whole world. Welcome to 2018.

At least it works, you might say. Well, bigger doesn't imply better. Bigger means someone has lost control. Bigger means we don't know what's going on. Bigger means complexity tax, performance tax, reliability tax. This is not the norm and should not become the norm . Overweight apps should mean a red flag. They should mean run away scared. 16Gb Android phone was perfectly fine 3 years ago. Today with Android 8.1 it's barely usable because each app has become at least twice as big for no apparent reason. There are no additional functions. They are not faster or more optimized. They don't look different. They just...grow?

iPhone 4s was released with iOS 5, but can barely run iOS 9. And it's not because iOS 9 is that much superior -- it's basically the same. But their new hardware is faster, so they made software slower. Don't worry -- you got exciting new capabilities like...running the same apps with the same speed! I dunno. [...] Nobody understands anything at this point. Neither they want to. We just throw barely baked shit out there, hope for the best and call it "startup wisdom." Web pages ask you to refresh if anything goes wrong. Who has time to figure out what happened? Any web app produces a constant stream of "random" JS errors in the wild, even on compatible browsers.

[...] It just seems that nobody is interested in building quality, fast, efficient, lasting, foundational stuff anymore. Even when efficient solutions have been known for ages, we still struggle with the same problems: package management, build systems, compilers, language design, IDEs. Build systems are inherently unreliable and periodically require full clean, even though all info for invalidation is there. Nothing stops us from making build process reliable, predictable and 100% reproducible. Just nobody thinks its important. NPM has stayed in "sometimes works" state for years.


K. S. Kyosuke ( 729550 ) , Friday September 21, 2018 @11:32AM ( #57354556 )

Re:Why should they? ( Score: 4 , Insightful)

Less resource use to accomplish the required tasks? Both in manufacturing (more chips from the same amount of manufacturing input) and in operation (less power used)?

K. S. Kyosuke ( 729550 ) writes: on Friday September 21, 2018 @11:58AM ( #57354754 )
Re:Why should they? ( Score: 2 )

Ehm...so for example using smaller cars with better mileage to commute isn't more environmentally friendly either, according to you?

DontBeAMoran ( 4843879 ) writes: on Friday September 21, 2018 @12:04PM ( #57354826 )
Re:Why should they? ( Score: 2 )

iPhone 4S used to be the best and could run all the applications.

Today, the same power is not sufficient because of software bloat. So you could say that all the iPhones since the iPhone 4S are devices that were created and then dumped for no reason.

It doesn't matter since we can't change the past and it doesn't matter much since improvements are slowing down so people are changing their phones less often.

Mark of the North ( 19760 ) , Friday September 21, 2018 @01:02PM ( #57355296 )
Re:Why should they? ( Score: 5 , Interesting)

Can you really not see the connection between inefficient software and environmental harm? All those computers running code that uses four times as much data, and four times the number crunching, as is reasonable? That excess RAM and storage has to be built as well as powered along with the CPU. Those material and electrical resources have to come from somewhere.

But the calculus changes completely when the software manufacturer hosts the software (or pays for the hosting) for their customers. Our projected AWS bill motivated our management to let me write the sort of efficient code I've been trained to write. After two years of maintaining some pretty horrible legacy code, it is a welcome change.

The big players care a great deal about efficiency when they can't outsource inefficiency to the user's computing resources.

eth1 ( 94901 ) , Friday September 21, 2018 @11:45AM ( #57354656 )
Re:Why should they? ( Score: 5 , Informative)
We've been trained to be a consuming society of disposable goods. The latest and greatest feature will always be more important than something that is reliable and durable for the long haul.

It's not just consumer stuff.

The network team I'm a part of has been dealing with more and more frequent outages, 90% of which are due to bugs in software running our devices. These aren't fly-by-night vendors either, they're the "no one ever got fired for buying X" ones like Cisco, F5, Palo Alto, EMC, etc.

10 years ago, outages were 10% bugs, and 90% human error, now it seems to be the other way around. Everyone's chasing features, because that's what sells, so there's no time for efficiency/stability/security any more.

LucasBC ( 1138637 ) , Friday September 21, 2018 @12:05PM ( #57354836 )
Re:Why should they? ( Score: 3 , Interesting)

Poor software engineering means that very capable computers are no longer capable of running modern, unnecessarily bloated software. This, in turn, leads to people having to replace computers that are otherwise working well, solely for the reason to keep up with software that requires more and more system resources for no tangible benefit. In a nutshell -- sloppy, lazy programming leads to more technology waste. That impacts the environment. I have a unique perspective in this topic. I do web development for a company that does electronics recycling. I have suffered the continued bloat in software in the tools I use (most egregiously, Adobe), and I see the impact of technological waste in the increasing amount of electronics recycling that is occurring. Ironically, I'm working at home today because my computer at the office kept stalling every time I had Photoshop and Illustrator open at the same time. A few years ago that wasn't a problem.

arglebargle_xiv ( 2212710 ) writes:
Re: ( Score: 3 )

There is one place where people still produce stuff like the OP wants, and that's embedded. Not IoT wank, but real embedded, running on CPUs clocked at tens of MHz with RAM in two-digit kilobyte (not megabyte or gigabyte) quantities. And a lot of that stuff is written to very exacting standards, particularly where something like realtime control and/or safety is involved.

The one problem in this area is the endless battle with standards morons who begin each standard with an implicit "assume an infinitely

commodore64_love ( 1445365 ) , Friday September 21, 2018 @03:58PM ( #57356680 ) Journal
Re:Why should they? ( Score: 3 )

> Poor software engineering means that very capable computers are no longer capable of running modern, unnecessarily bloated software.

Not just computers.

You can add Smart TVs, settop internet boxes, Kindles, tablets, et cetera that must be thrown away when they become too old (say 5 years) to run the latest bloatware. Software non-engineering is causing a lot of working hardware to be landfilled, and for no good reason.

[Sep 21, 2018] Fast, cheap (efficient) and reliable (robust, long lasting): pick 2

Sep 21, 2018 | tech.slashdot.org

JoeDuncan ( 874519 ) , Friday September 21, 2018 @12:58PM ( #57355276 )

Obligatory ( Score: 2 )

Fast, cheap (efficient) and reliable (robust, long lasting): pick 2.

roc97007 ( 608802 ) , Friday September 21, 2018 @12:16PM ( #57354946 ) Journal
Re:Bloat = growth ( Score: 2 )

There's probably some truth to that. And it's a sad commentary on the industry.

[Sep 21, 2018] Since Moore's law appears to have stalled since at least five years ago, it will be interesting to see if we start to see algorithm research or code optimization techniques coming to the fore again.

Sep 21, 2018 | tech.slashdot.org

Anonymous Coward , Friday September 21, 2018 @11:26AM ( #57354512 )

Moore's law ( Score: 5 , Interesting)

When the speed of your processor doubles every two years along with a concurrent doubling of RAM and disk space, then you can get away with bloatware.

Since Moore's law appears to have stalled since at least five years ago, it will be interesting to see if we start to see algorithm research or code optimization techniques coming to the fore again.
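
As a toy illustration of the kind of algorithmic work the poster has in mind (the example is mine): detecting duplicates in a list the quadratic way versus the linear way. When processor speed stops doubling, the second version is the only one that keeps scaling.

    # O(n^2): compare every pair. Fine for 100 items, hopeless for 10 million.
    def has_duplicates_quadratic(items):
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    # O(n): remember what we have seen. Same answer, radically cheaper.
    def has_duplicates_linear(items):
        seen = set()
        for item in items:
            if item in seen:
                return True
            seen.add(item)
        return False

    assert has_duplicates_quadratic([1, 2, 3, 2]) and has_duplicates_linear([1, 2, 3, 2])
    assert not has_duplicates_linear([1, 2, 3])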

[Sep 16, 2018] After the iron curtain fell, there was a big demand for Russian-trained programmers because they could program in a very efficient and light manner that didn't demand too much of the hardware, if I remember correctly

Notable quotes:
"... It's a bit of chicken-and-egg problem, though. Russia, throughout 20th century, had problem with developing small, effective hardware, so their programmers learned how to code to take maximum advantage of what they had, with their technological deficiency in one field giving rise to superiority in another. ..."
"... Russian tech ppl should always be viewed with certain amount of awe and respect...although they are hardly good on everything. ..."
"... Soviet university training in "cybernetics" as it was called in the late 1980s involved two years of programming on blackboards before the students even touched an actual computer. ..."
"... I recall flowcharting entirely on paper before committing a program to punched cards. ..."
Aug 01, 2018 | turcopolier.typepad.com

Bill Herschel 2 days ago ,

Very, very slightly off-topic.

Much has been made, including in this post, of the excellent organization of Russian forces and Russian military technology.

I have been re-investigating an open-source relational database system known as PostgreSQL (variously), and I remember finding perhaps a decade ago a very useful whole-text search feature of this system, which I vaguely remember was written by a Russian and, for that reason, mildly distrusted by me.

Come to find out that the principal developers and maintainers of PostgreSQL are Russian. OMG. Double OMG, because the reason I chose it in the first place is that it is the best non-proprietary RDBMS out there and today is supported on Google Cloud, AWS, etc.

The US has met an equal or conceivably a superior, case closed. Trump's thoroughly odd behavior with Putin is just one but a very obvious one example of this.

Of course, Trump's nationalistic blather is creating a "base" of people who believe in the godliness of the US. They are in for a very serious disappointment.

kao_hsien_chih Bill Herschel a day ago ,

After the iron curtain fell, there was a big demand for Russian-trained programmers because they could program in a very efficient and "light" manner that didn't demand too much of the hardware, if I remember correctly.

It's a bit of chicken-and-egg problem, though. Russia, throughout 20th century, had problem with developing small, effective hardware, so their programmers learned how to code to take maximum advantage of what they had, with their technological deficiency in one field giving rise to superiority in another.

Russia has plenty of very skilled, very well-trained folks and their science and math education is, in a way, more fundamentally and soundly grounded on the foundational stuff than US (based on my personal interactions anyways).

Russian tech ppl should always be viewed with certain amount of awe and respect...although they are hardly good on everything.

TTG kao_hsien_chih a day ago ,

Well said. Soviet university training in "cybernetics" as it was called in the late 1980s involved two years of programming on blackboards before the students even touched an actual computer.

It gave the students an understanding of how computers work down to the bit-flipping level. Imagine trying to fuzz code in your head.

FarNorthSolitude TTG a day ago ,

I recall flowcharting entirely on paper before committing a program to punched cards. I used to do hex and octal math in my head as part of debugging core dumps. Ah, the glory days.

Honeywell once made a military computer that was 10 bit. That stumped me for a while, as everything was 8 or 16 bit back then.

kao_hsien_chih FarNorthSolitude 10 hours ago ,

That used to be fairly common in the civilian sector (in US) too: computing time was expensive, so you had to make sure that the stuff worked flawlessly before it was committed.

No opportunity to see things go wrong and do things over, like much of how things happen nowadays. Russians, with their hardware limitations/shortages, I imagine must have been much more thorough than US programmers were back in the old days, and you could only get there by being very thoroughly grounded in the basics.

[Sep 07, 2018] How Can We Fix The Broken Economics of Open Source?

Notable quotes:
"... [with some subset of features behind a paywall] ..."
Sep 07, 2018 | news.slashdot.org

If we take consulting, services, and support off the table as an option for high-growth revenue generation (the only thing VCs care about), we are left with open core [with some subset of features behind a paywall] , software as a service, or some blurring of the two... Everyone wants infrastructure software to be free and continuously developed by highly skilled professional developers (who in turn expect to make substantial salaries), but no one wants to pay for it. The economics of this situation are unsustainable and broken ...

[W]e now come to what I have recently called "loose" open core and SaaS. In the future, I believe the most successful OSS projects will be primarily monetized via this method. What is it? The idea behind "loose" open core and SaaS is that a popular OSS project can be developed as a completely community driven project (this avoids the conflicts of interest inherent in "pure" open core), while value added proprietary services and software can be sold in an ecosystem that forms around the OSS...

Unfortunately, there is an inflection point at which in some sense an OSS project becomes too popular for its own good, and outgrows its ability to generate enough revenue via either "pure" open core or services and support... [B]uilding a vibrant community and then enabling an ecosystem of "loose" open core and SaaS businesses on top appears to me to be the only viable path forward for modern VC-backed OSS startups.
Klein also suggests OSS foundations start providing fellowships to key maintainers, who currently "operate under an almost feudal system of patronage, hopping from company to company, trying to earn a living, keep the community vibrant, and all the while stay impartial..."

"[A]s an industry, we are going to have to come to terms with the economic reality: nothing is free, including OSS. If we want vibrant OSS projects maintained by engineers that are well compensated and not conflicted, we are going to have to decide that this is something worth paying for. In my opinion, fellowships provided by OSS foundations and funded by companies generating revenue off of the OSS is a great way to start down this path."

[Apr 30, 2018] New Book Describes Bluffing Programmers in Silicon Valley

Notable quotes:
"... Live Work Work Work Die: A Journey into the Savage Heart of Silicon Valley ..."
"... Older generations called this kind of fraud "fake it 'til you make it." ..."
"... Nowadays I work 9:30-4:30 for a very good, consistent paycheck and let some other "smart person" put in 75 hours a week dealing with hiring ..."
"... It's not a "kids these days" sort of issue, it's *always* been the case that shameless, baseless self-promotion wins out over sincere skill without the self-promotion, because the people who control the money generally understand boasting more than they understand the technology. ..."
"... In the bad old days we had a hell of a lot of ridiculous restriction We must somehow made our programs to run successfully inside a RAM that was 48KB in size (yes, 48KB, not 48MB or 48GB), on a CPU with a clock speed of 1.023 MHz ..."
"... So what are the uses for that? I am curious what things people have put these to use for. ..."
"... Also, Oracle, SAP, IBM... I would never buy from them, nor use their products. I have used plenty of IBM products and they suck big time. They make software development 100 times harder than it could be. ..."
"... I have a theory that 10% of people are good at what they do. It doesn't really matter what they do, they will still be good at it, because of their nature. These are the people who invent new things, who fix things that others didn't even see as broken and who automate routine tasks or simply question and erase tasks that are not necessary. ..."
"... 10% are just causing damage. I'm not talking about terrorists and criminals. ..."
"... Programming is statistically a dead-end job. Why should anyone hone a dead-end skill that you won't be able to use for long? For whatever reason, the industry doesn't want old programmers. ..."
Apr 30, 2018 | news.slashdot.org

Long-time Slashdot reader Martin S. pointed us to this excerpt from the new book Live Work Work Work Die: A Journey into the Savage Heart of Silicon Valley by Portland-based investigative reporter Corey Pein.

The author shares what he realized at a job recruitment fair seeking Java Legends, Python Badasses, Hadoop Heroes, "and other gratingly childish classifications describing various programming specialities."

" I wasn't the only one bluffing my way through the tech scene. Everyone was doing it, even the much-sought-after engineering talent.

I was struck by how many developers were, like myself, not really programmers , but rather this, that and the other. A great number of tech ninjas were not exactly black belts when it came to the actual onerous work of computer programming. So many of the complex, discrete tasks involved in the creation of a website or an app had been automated that it was no longer necessary to possess knowledge of software mechanics. The coder's work was rarely a craft. The apps ran on an assembly line, built with "open-source", off-the-shelf components. The most important computer commands for the ninja to master were copy and paste...

[M]any programmers who had "made it" in Silicon Valley were scrambling to promote themselves from coder to "founder". There wasn't necessarily more money to be had running a startup, and the increase in status was marginal unless one's startup attracted major investment and the right kind of press coverage. It's because the programmers knew that their own ladder to prosperity was on fire and disintegrating fast. They knew that well-paid programming jobs would also soon turn to smoke and ash, as the proliferation of learn-to-code courses around the world lowered the market value of their skills, and as advances in artificial intelligence allowed for computers to take over more of the mundane work of producing software. The programmers also knew that the fastest way to win that promotion to founder was to find some new domain that hadn't yet been automated. Every tech industry campaign designed to spur investment in the Next Big Thing -- at that time, it was the "sharing economy" -- concealed a larger programme for the transformation of society, always in a direction that favoured the investor and executive classes.

"I wasn't just changing careers and jumping on the 'learn to code' bandwagon," he writes at one point. "I was being steadily indoctrinated in a specious ideology."


Anonymous Coward , Saturday April 28, 2018 @11:40PM ( #56522045 )

older generations already had a term for this ( Score: 5 , Interesting)

Older generations called this kind of fraud "fake it 'til you make it."

raymorris ( 2726007 ) , Sunday April 29, 2018 @02:05AM ( #56522343 ) Journal
The people who are smarter won't ( Score: 5 , Informative)

> The people who can do both are smart enough to build their own company and compete with you.

Been there, done that. Learned a few lessons. Nowadays I work 9:30-4:30 for a very good, consistent paycheck and let some other "smart person" put in 75 hours a week dealing with hiring, managing people, corporate strategy, staying up on the competition, figuring out tax changes each year and getting taxes filed six times each year, the various state and local requirements, legal changes, contract hassles, etc, while hoping the company makes money this month so they can take a paycheck and pay their rent.

I learned that I'm good at creating software systems and I enjoy it. I don't enjoy all-nighters, partners being dickheads trying to pull out of a contract, or any of a thousand other things related to running a start-up business. I really enjoy a consistent, six-figure compensation package too.

brian.stinar ( 1104135 ) writes:
Re: ( Score: 2 )

* getting taxes filed eighteen times a year.

I pay monthly gross receipts tax (12), quarterly withholdings (4) and a corporate (1) and individual (1) returns. The gross receipts can vary based on the state, so I can see how six times a year would be the minimum.

Cederic ( 9623 ) writes:
Re: ( Score: 2 )

Fuck no.

Cost of full automation: $4m
Cost of manual entry: $0
Opportunity cost of manual entry: $800/year

At worst, pay for an accountant, if you can get one that cheaply. Bear in mind talking to them incurs most of that opportunity cost anyway.

serviscope_minor ( 664417 ) writes:
Re: ( Score: 2 )

Nowadays I work 9:30-4:30 for a very good, consistent paycheck and let some other "smart person" put in 75 hours a week dealing with hiring

There's nothing wrong with not wanting to run your own business, it's not for most people, and even if it was, the numbers don't add up. But putting the scare quotes in like that makes it sound like you have a huge chip on your shoulder. Those things are just as essential to the business as your work, and without them you wouldn't have the steady 9:30-4:30 with a good paycheck.

raymorris ( 2726007 ) writes:
Important, and dumb. ( Score: 3 , Informative)

Of course they are important. I wouldn't have done those things if they weren't important!

I frequently have friends say things like "I love baking. I can't get enough of baking. I'm going to open a bakery.". I ask them "do you love dealing with taxes, every month? Do you love contract law? Employment law? Marketing? Accounting?" If you LOVE baking, the smart thing to do is to spend your time baking. Running a start-up business, you're not going to do much baking.

If you love marketing, employment law, taxes

raymorris ( 2726007 ) writes:
Four tips for a better job. Who has more? ( Score: 3 )

I can tell you a few things that have worked for me. I'll go in chronological order rather than priority order.

Make friends in the industry you want to be in. Referrals are a major way people get jobs.

Look at the job listings for jobs you'd like to have and see which skills a lot of companies want, but you're missing. For me that's Java. A lot of companies list Java skills and I'm not particularly good with Java. Then consider learning the skills you lack, the ones a lot of job postings are looking for.

Certifi

goose-incarnated ( 1145029 ) , Sunday April 29, 2018 @02:34PM ( #56524475 ) Journal
Re: older generations already had a term for this ( Score: 5 , Insightful)
You don't understand the point of an ORM do you? I'd suggest reading why they exist

They exist because programmers value code design more than data design. ORMs are the poster-child for square-peg-round-hole solutions, which is why all ORMs choose one of three different ways of squashing hierarchical data into a relational form, all of which are crappy.

If the devs of the system (the ones choosing to use an ORM) had any competence at all they'd design their database first because in any application that uses a database the database is the most important bit, not the OO-ness or Functional-ness of the design.

Over the last few decades I've seen programs in a system come and go; a component here gets rewritten, a component there gets rewritten, but you know what? They all have to work with the same damn data.

You can more easily switch out your code for new code with a new design in a new language than you can change the database structure. So explain to me why it is that you think the database should be mangled to fit your OO code rather than mangling your OO code to fit the database?
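
For what it's worth, here is a minimal sketch of the database-first approach the poster describes, using Python's built-in sqlite3 module (the schema and all names are invented for illustration): the schema is designed up front in plain SQL as the durable contract, and the application code adapts to it with explicit queries instead of a mapped object graph.

    import sqlite3

    # Database-first: the schema is the durable contract, written in SQL,
    # not generated from this week's class hierarchy.
    SCHEMA = """
    CREATE TABLE author (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE book (
        id        INTEGER PRIMARY KEY,
        author_id INTEGER NOT NULL REFERENCES author(id),
        title     TEXT NOT NULL
    );
    """

    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    conn.execute("INSERT INTO author (id, name) VALUES (1, 'Brooks')")
    conn.execute("INSERT INTO book (author_id, title) "
                 "VALUES (1, 'The Mythical Man-Month')")

    # The application view is a thin, explicit query over the schema.
    rows = conn.execute("""
        SELECT author.name, book.title
        FROM book JOIN author ON author.id = book.author_id
    """).fetchall()
    print(rows)  # [('Brooks', 'The Mythical Man-Month')]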

cheekyboy ( 598084 ) writes:
im sick of reinventors and new frameworks ( Score: 3 )

Stick to the one thing for 10-15 years. Often all this new shit doesn't do jack different to the old shit: it's not faster, it's not better. Every dick wants to be famous, so makes another damn library/tool with his own fancy name and feature, instead of enhancing an existing product.

gbjbaanb ( 229885 ) writes:
Re: ( Score: 2 )

amen to that.

Or kids who can't hack the main stuff, suddenly discover the cool new, and then they can pretend they're "learning" it, and when the going gets tough (as it always does) they can declare the tech to be pants and move to another.

hence we had so many people on the bandwagon for functional programming, then dumped it for ruby on rails, then dumped that for Node.js, not sure what they're on at currently, probably back to asp.net.

Greyfox ( 87712 ) writes:
Re: ( Score: 2 )

How much code do you have to reuse before you're not really programming anymore? When I started in this business, it was reasonably possible that you could end up on a project that didn't particularly have much (or any) of an operating system. They taught you assembly language and the process by which the system boots up, but I think if I were to ask most of the programmers where I work, they wouldn't be able to explain how all that works...

djinn6 ( 1868030 ) writes:
Re: ( Score: 2 )
It really feels like if you know what you're doing it should be possible to build a team of actually good programmers and put everyone else out of business by actually meeting your deliverables, but no one has yet. I wonder why that is.

You mean Amazon, Google, Facebook and the like? People may not always like what they do, but they manage to get things done and make plenty of money in the process. The problem for a lot of other businesses is not having a way to identify and promote actually good programmers. In your example, you could've spent 10 minutes fixing their query and saved them days of headache, but how much recognition will you actually get? Where is your motivation to help them?

Junta ( 36770 ) writes:
Re: ( Score: 2 )

It's not a "kids these days" sort of issue, it's *always* been the case that shameless, baseless self-promotion wins out over sincere skill without the self-promotion, because the people who control the money generally understand boasting more than they understand the technology. Yes it can happen that baseless boasts can be called out over time by a large enough mass of feedback from competent peers, but it takes a *lot* to overcome the tendency for them to have faith in the boasts.

It does correlate stron

cheekyboy ( 598084 ) writes:
Re: ( Score: 2 )

And all these modern coders forget old lessons, and make shit stuff, just look at instagram windows app, what a load of garbage shit, that us old fuckers could code in 2-3 weeks.

Instagram - your app sucks, cookie cutter coders suck, no refinement, coolness. Just cheap ass shit, with limited usefulness.

Just like most of commercial software that's new - quick shit.

Oh and its obvious if your an Indian faking it, you haven't worked in 100 companies at the age of 29.

Junta ( 36770 ) writes:
Re: ( Score: 2 )

Here's another problem, if faced with a skilled team that says "this will take 6 months to do right" and a more naive team that says "oh, we can slap that together in a month", management goes with the latter. Then the security compromises occur, then the application fails due to pulling in an unvetted dependency update live into production. When the project grows to handling thousands instead of dozens of users and it starts mysteriously folding over and the dev team is at a loss, well the choice has be

molarmass192 ( 608071 ) , Sunday April 29, 2018 @02:15AM ( #56522359 ) Homepage Journal
Re:older generations already had a term for this ( Score: 5 , Interesting)

These restrictions are a large part of what makes Arduino programming "fun". If you don't plan out your memory usage, you're gonna run out of it. I cringe when I see 8MB web pages of bloated "throw in everything including the kitchen sink and the neighbor's car". Unfortunately, the careful and cautious way is dying in favor of throwing 3rd party code at it until it does something. Of course, I don't have time to review it, but I'm sure everybody else has peer reviewed it for flaws and exploits line by line.

AmiMoJo ( 196126 ) writes: < mojo@@@world3...net > on Sunday April 29, 2018 @05:15AM ( #56522597 ) Homepage Journal
Re:older generations already had a term for this ( Score: 4 , Informative)
Unfortunately, the careful and cautious way is dying in favor of throwing 3rd party code at it until it does something.

Of course. What is the business case for making it efficient? Those massive frameworks are cached by the browser and run on the client's system, so cost you nothing and save you time to market. Efficient costs money with no real benefit to the business.

If we want to fix this, we need to make bloat have an associated cost somehow.

locketine ( 1101453 ) writes:
Re: older generations already had a term for this ( Score: 2 )

My company is dealing with the result of this mentality right now. We released the web app to the customer without performance testing and doing several majorly inefficient things to meet deadlines. Once real load was put on the application by users with non-ideal hardware and browsers, the app was infuriatingly slow. Suddenly our standard sub-40 hour workweek became a 50+ hour workweek for months while we fixed all the inefficient code and design issues.

So, while you're right that getting to market and opt

serviscope_minor ( 664417 ) writes:
Re: ( Score: 2 )

In the bad old days we had a hell of a lot of ridiculous restrictions. We had to somehow make our programs run successfully inside RAM that was 48KB in size (yes, 48KB, not 48MB or 48GB), on a CPU with a clock speed of 1.023 MHz

We still have them. In fact some of the systems I've programmed have been more resource limited than the gloriously spacious 32KiB memory of the BBC model B. Take the PIC12F or 10F series. A glorious 64 bytes of RAM, max clock speed of 16MHz, but not unusual to run it at 32kHz.

serviscope_minor ( 664417 ) writes:
Re: ( Score: 2 )

So what are the uses for that? I am curious what things people have put these to use for.

It's hard to determine because people don't advertise use of them at all. However, I know that my electric toothbrush uses an Epson 4 bit MCU of some description. It's got a status LED, basic NiMH batteryb charger and a PWM controller for an H Bridge. Braun sell a *lot* of electric toothbrushes. Any gadget that's smarter than a simple switch will probably have some sort of basic MCU in it. Alarm system components, sensor

tlhIngan ( 30335 ) writes:
Re: ( Score: 3 , Insightful)
b) No computer ever ran at 1.023 MHz. It was either a nice multiple of 1Mhz or maybe a multiple of 3.579545Mhz (ie. using the TV output circuit's color clock crystal to drive the CPU).

Well, it could be used to drive the TV output circuit, OR, it was used because it's a stupidly cheap high speed crystal. You have to remember except for a few frequencies, most crystals would have to be specially cut for the desired frequency. This occurs even today, where most oscillators are either 32.768kHz (real time clock

Anonymous Coward writes:
Re: ( Score: 2 , Interesting)

Yeah, nice talk. You could have stopped after the first sentence. The other AC is referring to the Commodore C64 [wikipedia.org]. The frequency has nothing to do with crystal availability but with the simple fact that everything in the C64 is synced to the TV. One clock cycle equals 8 pixels. The graphics chip and the CPU take turns accessing the RAM. The different frequencies dictated by the TV standards are the reason why the CPU in the NTSC version of the C64 runs at 1.023MHz and the PAL version at 0.985MHz.

Wraithlyn ( 133796 ) writes:
Re: ( Score: 2 )

LOL what exactly is so special about 16K RAM? https://yourlogicalfallacyis.c... [yourlogicalfallacyis.com]

I cut my teeth on a VIC20 (5K RAM), then later a C64 (which ran at 1.023MHz...)

Anonymous Coward writes:
Re: ( Score: 2 , Interesting)

Commodore 64 for the win. I worked for a company that made detection devices for the railroad, things like monitoring axle temperatures, reading the rail car ID tags. The original devices were made using Commodore 64 boards using software written by an employee at the one rail road company working with them.

The company then hired some electrical engineers to design custom boards using the 68000 chips and I was hired as the only programmer. Had to rewrite all of the code which was fine...

wierd_w ( 1375923 ) , Saturday April 28, 2018 @11:58PM ( #56522075 )
... A job fair can easily test this competency. ( Score: 4 , Interesting)

Many of these languages have an interactive interpreter. I know for a fact that Python does.

So, since job-fairs are an all day thing, and setup is already a thing for them -- set up a booth with like 4 computers at it, and an admin station. The 4 terminals have an interactive session with the interpreter of choice. Every 20min or so, have a challenge for "Solve this problem" (needs to be easy and already solved in general. Programmers hate being pimped without pay. They don't mind tests of skill, but hate being pimped. Something like "sort this array, while picking out all the prime numbers" or something.) and see who steps up. The ones that step up have confidence they can solve the problem, and you can quickly see who can do the work and who can't.

The ones that solve it, and solve it to your satisfaction, you offer a nice gig to.
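
For concreteness, here is one minimal Python solution to the booth challenge described above ("sort this array, while picking out all the prime numbers"); the function names and input handling are mine:

    def is_prime(n):
        # Trial division is plenty for a job-fair-sized input.
        if n < 2:
            return False
        if n % 2 == 0:
            return n == 2
        d = 3
        while d * d <= n:
            if n % d == 0:
                return False
            d += 2
        return True

    def sort_and_pick_primes(values):
        ordered = sorted(values)
        primes = [v for v in ordered if is_prime(v)]
        return ordered, primes

    print(sort_and_pick_primes([10, 3, 7, 8, 2, 9]))
    # ([2, 3, 7, 8, 9, 10], [2, 3, 7])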

ShanghaiBill ( 739463 ) , Sunday April 29, 2018 @01:50AM ( #56522321 )
Re:... A job fair can easily test this competency. ( Score: 5 , Informative)
Then you get someone good at sorting arrays while picking out prime numbers, but potentially not much else.

The point of the test is not to identify the perfect candidate, but to filter out the clearly incompetent. If you can't sort an array and write a function to identify a prime number, I certainly would not hire you. Passing the test doesn't get you a job, but it may get you an interview ... where there will be other tests.

wierd_w ( 1375923 ) writes:
Re: ( Score: 2 )

BINGO!

(I am not even a professional programmer, but I can totally perform such a trivially easy task. The example tests basic understanding of loop construction, function construction, variable use, efficient sorting, and error correction, especially with mixed-type arrays. All of these are things any programmer SHOULD know how to do, without being overly complicated, or clearly a disguised occupational problem trying to get a free solution. Like I said, programmers hate being pimped, and will be turned off

wierd_w ( 1375923 ) , Sunday April 29, 2018 @04:02AM ( #56522443 )
Re: ... A job fair can easily test this competency ( Score: 5 , Insightful)

Again, the quality applicant and the code monkey both have something the fakers do not-- Actual comprehension of what a program is, and how to create one.

As Bill points out, this is not the final exam. This is the "Oh, I see you do actually know how to program-- show me more" portion of the process. This is the part that HR drones are not capable of performing, due to Dunning-Kruger. Those that are actually, REALLY competent will do more than just satisfy the requirements of the challenge, they will provide actually working solutions to the challenge that properly validate their input, and return proper error states if the input is invalid, etc-- You can learn a LOT about a potential hire by observing their work. *THAT* is what this is really about. The triviality of the problem is a necessity, because you ***DON'T*** try to get free solutions out of people.

I realize that may be difficult for you to comprehend, but you *DON'T* do that. The job fair is to let people know that you have a position available, and try to curry interest in people to apply. A successful pre-screening is confidence building, and helps the potential hire to feel that your company is actually interested in actually hiring somebody, and not just fucking off in the booth, to cover for "failing to find somebody" and then "Getting yet another H1B". It gives them a chance to show you what they can do. That is what it is for, and what it does. It also excludes the fakers that this article is about-- The ones that can talk a good talk, but could not program a simple boolean check condition if their life depended on it.

If it were not for the time constraints of a job fair (usually only 2 days, and in that time you need to try and pre-screen as many as possible), I would suggest a tiered challenge, with progressively harder challenges, where you hand out resumes to the ones that make it to the top 3 brackets, but that is not the way the world works.

luis_a_espinal ( 1810296 ) writes:
Re: ( Score: 2 )
This in my opinion is really a waste of time. Challenges like this have to be so simple that they can be done walking up to a booth, and are not likely to filter the "all talk" types any better than a few interview questions could (in person, so the candidate can't just google it).

Tougher, more involved stuff isn't good either: it gives a huge advantage to the full-time job hunter; the guy or gal that already has a 9-5 and a family that wants to see them has not got time for games. We have been struggling with hiring where I work (I do a lot of the interviews) and these are the conclusions we have reached.

You would be surprised at the number of people with impeccable-looking resumes failing at something as simple as the FizzBuzz test [codinghorror.com]
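
For reference, the canonical FizzBuzz in Python (one common phrasing of the problem; the point of the test is precisely that this is all it takes):

    # Print 1..n, replacing multiples of 3 with "Fizz", multiples of 5
    # with "Buzz", and multiples of both with "FizzBuzz".
    def fizzbuzz(n):
        for i in range(1, n + 1):
            if i % 15 == 0:
                print("FizzBuzz")
            elif i % 3 == 0:
                print("Fizz")
            elif i % 5 == 0:
                print("Buzz")
            else:
                print(i)

    fizzbuzz(15)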

PaulRivers10 ( 4110595 ) writes:
Re: ... A job fair can easily test this competenc ( Score: 2 )

The only thing fizzbuzz tests is "have you done fizzbuzz before?" It's a short question filled with every petty trick the author could think to throw in there. If you haven't seen the tricks, they trip you up for no reason related to your actual coding skills. Once you have seen them, they're trivial and again unrelated to real work. Fizzbuzz is best passed by someone aiming to game the interview system. It passes people gaming it and trips up people who spent their time doing on-the-job real work.

Hognoxious ( 631665 ) writes:
Re: ( Score: 2 )
they trip you up for no reason related to your actual coding skills.

Bullshit!

luis_a_espinal ( 1810296 ) , Sunday April 29, 2018 @07:49AM ( #56522861 ) Homepage
filter the lame code monkeys ( Score: 4 , Informative)
Lame monkey tests select for lame monkeys.

A good programmer first and foremost has a clean mind. Experience suggests puzzle geeks, who excel at contrived tests, are usually sloppy thinkers.

No. Good programmers can trivially knock out any of these so-called lame monkey tests. It's lame code monkeys who can't do it. And I've seen their work. Many night shifts and weekends I've burned trying to fix their shit because they couldn't actually do any of the things behind what you call "lame monkey tests", like:

    - pulling expensive invariant calculations out of loops
    - using for loops to scan a fucking table to pull rows or calculate an aggregate when they could let the database do what it does best with a simple SQL statement
    - systems crashing under actual load because their shitty code was never stress tested (but it worked on my dev box!)
    - again with databases, having to redo their schemas because they were fattened up so much with columns like VALUE1, VALUE2, ... VALUE20 (normalize, you assholes!)
    - chatty remote APIs, because these code monkeys cannot think about the need for bulk operations in increasingly distributed systems
    - storing dates in unsortable strings because the idiots do not know most modern programming languages have a date data type

Oh, and the most important: off-by-one looping errors. I see these all the time, the type of thing a good programmer can spot quickly because he or she can do the so-called "lame monkey tests" that involve arrays and sorting.

I've seen the type: "I don't need to do this shit because I have business knowledge and I code for business and IT not google", and then they go and code and fuck it up... and then the rest of us have to go clean up their shit at 1AM or on weekends.

If you work as an hourly paid contractor cleaning that crap, it can be quite lucrative. But sooner or later it truly sucks the energy out of your soul.

So yeah, we need more lame monkey tests ... to filter the lame code monkeys.
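
Two of the failures from the list above, in miniature (the Python examples are mine, for illustration): an invariant calculation left inside a loop, and the classic off-by-one boundary error.

    import math

    # 1. Invariant calculation inside the loop: sqrt(base) never changes,
    #    yet it is recomputed on every pass.
    def scale_all_slow(values, base):
        out = []
        for v in values:
            out.append(v * math.sqrt(base))
        return out

    #    Hoisted: compute the invariant once, outside the loop.
    def scale_all(values, base):
        factor = math.sqrt(base)
        return [v * factor for v in values]

    # 2. Off-by-one: range(1, n) silently skips items[0], so the
    #    "sum of the first n items" is quietly wrong.
    def sum_first_n_wrong(items, n):
        total = 0
        for i in range(1, n):
            total += items[i]
        return total

    #    Slicing makes the boundary explicit and correct.
    def sum_first_n(items, n):
        return sum(items[:n])

    assert sum_first_n([5, 1, 1], 3) == 7
    assert sum_first_n_wrong([5, 1, 1], 3) == 2  # the bug, demonstrated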

ShanghaiBill ( 739463 ) writes:
Re: ( Score: 3 )
Someone could Google the problem with the phone then step up and solve the challenge.

If given a spec, someone can consistently cobble together working code by Googling, then I would love to hire them. That is the most productive way to get things done.

There is nothing wrong with using external references. When I am coding, I have three windows open: an editor, a testing window, and a browser with a Stackoverflow tab open.

Junta ( 36770 ) writes:
Re: ( Score: 2 )

Yeah, when we do tech interviews, we ask questions that we are certain they won't be able to answer, but want to see how they would think about the problem and what questions they ask to get more data and that they don't just fold up and say "well that's not the sort of problem I'd be thinking of" The examples aren't made up or anything, they are generally selection of real problems that were incredibly difficult that our company had faced before, that one may not think at first glance such a position would

bobstreo ( 1320787 ) writes:
Nothing worse ( Score: 2 )

than spending weeks interviewing "good" candidates for an opening, selecting a couple and hiring them as contractors, then finding out they are less than unqualified to do the job they were hired for.

I've seen it a few times, Java "experts", Microsoft "experts" with years of experience on their resumes, but completely useless in coding, deployment or anything other than buying stuff from the break room vending machines.

That being said, I've also seen projects costing hundreds of thousands of dollars, with y

Anonymous Coward , Sunday April 29, 2018 @12:34AM ( #56522157 )
Re:Nothing worse ( Score: 4 , Insightful)

The moment you said "contractors", and you have lost any sane developer. Keep swimming, its not a fish.

Anonymous Coward writes:
Re: ( Score: 2 , Informative)

I agree with this. I consider myself to be a good programmer and I would never go into contractor game. I also wonder, how does it take you weeks to interview someone and you still can't figure out if the person can't code? I could probably see that in 15 minutes in a pair coding session.

Also, Oracle, SAP, IBM... I would never buy from them, nor use their products. I have used plenty of IBM products and they suck big time. They make software development 100 times harder than it could be. Their technical supp

Lanthanide ( 4982283 ) writes:
Re: ( Score: 2 )

It's weeks to interview multiple different candidates before deciding on 1 or 2 of them. Not weeks per person.

Anonymous Coward writes:
Re: ( Score: 3 , Insightful)
That being said, I've also seen projects costing hundreds of thousands of dollars, with years of delays from companies like Oracle, Sun, SAP, and many other "vendors"

Software development is a hard thing to do well, despite the general thinking of technology becoming cheaper over time, and like health care the quality of the goods and services received can sometimes be difficult to ascertain. However, people who don't respect developers and the problems we solve are very often the same ones who continually frustrate themselves by trying to cheap out, hiring outsourced contractors, and then tearing their hair out when sub par results are delivered, if anything is even del

pauljlucas ( 529435 ) writes:
Re: ( Score: 2 )

As part of your interview process, don't you have candidates code a solution to a problem on a whiteboard? I've interviewed lots of "good" candidates (on paper) too, but they crashed and burned when challenged with a coding exercise. As a result, we didn't make them job offers.

VeryFluffyBunny ( 5037285 ) writes:
I do the opposite ( Score: 2 )

I'm not a great coder but good enough to get done what clients want done. If I'm not sure or don't think I can do it, I tell them. I think they appreciate the honesty. I don't work in a tech-hub, startups or anything like that so I'm not under the same expectations and pressures that others may be.

Tony Isaac ( 1301187 ) writes:
Bigger building blocks ( Score: 2 )

OK, so yes, I know plenty of programmers who do fake it. But stitching together components isn't "fake" programming.

Back in the day, we had to write our own code to loop through an XML file, looking for nuggets. Now, we just use an XML serializer. Back then, we had to write our own routines to send TCP/IP messages back and forth. Now we just use a library.

I love it! I hated having to make my own bricks before I could build a house. Now, I can get down to the business of writing the functionality I want, ins

Anonymous Coward writes:
Re: ( Score: 2 , Insightful)

But, I suspect you could write the component if you had to. That makes you a very different user of that component than someone who just knows it as a magic black box.

Because of this, you understand the component better and have real knowledge of its strengths and limitations. People blindly using components with only a cursory idea of their internal operation often cause major performance problems. They rarely recognize when it is time to write their own to overcome a limitation (or even that it is possibl

Tony Isaac ( 1301187 ) writes:
Re: ( Score: 2 )

You're right on all counts. A person who knows how the innards work is better than someone who doesn't, all else being equal. Still, today's world is so specialized that no one can possibly learn it all. I've never built a processor, as you have, but I still have been able to build a DNA matching algorithm for a major DNA lab.

I would argue that anyone who can skillfully use off-the-shelf components can also learn how to build components, if they are required to.

thesupraman ( 179040 ) writes:
Ummm. ( Score: 2 )

1. 'Back in the Day' there was no XML; XML was not very long ago.
2. It's a parser; a serialiser is pretty much the opposite (unless this week's fashion has redefined that... anything is possible).
3. 'Back then' we didn't have TCP stacks...

But, actually, I agree with you. I can only assume the author thinks there are lots of fake plumbers because they don't cast their own toilet bowls from raw clay and use pre-built fittings and pipes! That car mechanics start from raw steel scrap and a file... And that you need

Tony Isaac ( 1301187 ) writes:
Re: ( Score: 2 )

For the record, XML was invented in 1997, you know, in the last century! https://en.wikipedia.org/wiki/... [wikipedia.org]
And we had a WinSock library in 1992. https://en.wikipedia.org/wiki/... [wikipedia.org]

Yes, I agree with you on the "middle ground." My reaction was to the author's point that "not knowing how to build the components" was the same as being a "fake programmer."

Tony Isaac ( 1301187 ) , Sunday April 29, 2018 @01:46AM ( #56522313 ) Homepage
Re:Bigger building blocks ( Score: 5 , Interesting)

If I'm a plumber, and I don't know anything about the engineering behind the construction of PVC pipe, I can still be a good plumber. If I'm an electrician, and I don't understand the role of a blast furnace in the making of the metal components, I can still be a good electrician.

The analogy fits. If I'm a programmer, and I don't know how to make an LZW compression library, I can still be a good programmer. It's a matter of layers. These days, we specialize. You've got your low-level programmers that make the components, the high level programmers that put together the components, the graphics guys who do HTML/CSS, and the SQL programmers that just know about databases. Every person has their specialty. It's no longer necessary to be a low-level programmer, or jack-of-all-trades, to be "good."

If I don't know the layout of the IP header, I can still write quality networking software, and if I know XSLT, I can still do cool stuff with XML, even if I don't know how to write a good parser.

frank_adrian314159 ( 469671 ) writes:
Re: ( Score: 3 )

I was with you until you said " I can still do cool stuff with XML".

Tony Isaac ( 1301187 ) writes:
Re: ( Score: 2 )

LOL yeah I know it's all JSON now. I've been around long enough to see these fads come and go. Frankly, I don't see a whole lot of advantage of JSON over XML. It's not even that much more compact, about 10% or so. But the point is that the author laments the "bad old days" when you had to create all your own building blocks, and you didn't have a team of specialists. I for one don't want to go back to those days!

careysub ( 976506 ) writes:
Re: ( Score: 3 )

The main advantage of JSON is that it is consistent. XML has attributes, and embedded optional stuff within tags. That was inherited from its SGML ancestor, where it was thought to be a convenience for the human authors who were supposed to be writing the markup manually. Programmatically it is a PITA.
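A made-up record, purely for illustration, shows the point: in XML the same fact can legally appear as an attribute, as a child element, or as text content, so a consuming program must handle all three, while JSON essentially offers one shape:

<book title="SICP"/>
<book><title>SICP</title></book>
<book>SICP</book>

{"book": {"title": "SICP"}}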

Cederic ( 9623 ) writes:
Re: ( Score: 3 )

I got shit for decrying XML back when it was the trendy thing. I've had people apologise to me months later because they've realized I was right, even though at the time they did their best to fuck over my career because XML was the new big thing and I wasn't fully on board.

XML has its strengths and its place, but fuck me it taught me how little some people really fucking understand shit.

Anonymous Coward writes:
Silicon Valley is Only Part of the Tech Business ( Score: 2 , Informative)

And a rather small part at that, albeit a very visible and vocal one full of the proverbial prima donnas. However, much of the rest of the tech business, or at least the people working in it, are not like that. It's small groups of developers working in other industries that would not typically be considered technology. There are software developers working for insurance companies, banks, hedge funds, oil and gas exploration or extraction firms, national defense and many hundreds and thousands of other small

phantomfive ( 622387 ) writes:
bonfire of fakers ( Score: 2 )

This is the reason I wish programming didn't pay so much....the field is better when it's mostly populated by people who enjoy programming.

Njovich ( 553857 ) , Sunday April 29, 2018 @05:35AM ( #56522641 )
Learn to code courses ( Score: 5 , Insightful)
They knew that well-paid programming jobs would also soon turn to smoke and ash, as the proliferation of learn-to-code courses around the world lowered the market value of their skills, and as advances in artificial intelligence allowed for computers to take over more of the mundane work of producing software.

Kind of hard to take this article seriously after gibberish like this. I would say most good programmers know that neither learn-to-code courses nor AI are going to make a dent in their income any time soon.

AndyKron ( 937105 ) writes:
Me? No ( Score: 2 )

As a non-programmer, Arduino and libraries are my friends.

Escogido ( 884359 ) , Sunday April 29, 2018 @06:59AM ( #56522777 )
in the silly cone valley ( Score: 5 , Interesting)

There is a huge shortage of decent programmers. I have personally witnessed more than one phone "interview" that went like "have you done this? what about this? do you know what this is? um, can you start Monday?" (120K-ish salary range)

Partly because there are way more people who got their stupid ideas funded than good coders willing to stain their resume with that. Partly because, if you are funded and cannot do all the required coding solo, here's your conundrum:

  • top level hackers can afford to be really picky, so on one hand it's hard to get them interested, and if you could get that, they often want some ownership of the project. the plus side is that they are happy to work for lots of equity if they have faith in the idea, but that can be a huge "if".
  • "good but not exceptional" senior engineers aren't usually going to be super happy, as they often have spouses and children and mortgages, so they'd favor job security over exciting ideas and startup lottery.
  • that leaves you with fresh-out-of-college folks, which are really really a mixed bunch. some are actually already senior level of understanding without the experience, some are absolutely useless, with varying degrees in between, and there's no easy way to tell which is which early.

So the not-so-scrupulous folks realized what's going on and launched multiple coding boot-camp programmes, to essentially trick both the students into believing they can become coders in a month or two, and the prospective employers into believing that said students are useful. So far it's been working, to a degree, in part because in such companies the coding-skill evaluation process is broken. But one can only hide their lack of value-add for so long, even if they do manage to bluff their way into a job.

quonset ( 4839537 ) , Sunday April 29, 2018 @07:20AM ( #56522817 )
Duh! ( Score: 4 , Insightful)

All one had to do was look at the lousy state of software and web sites today to see this is true. It's quite obvious little to no thought is given to how to make something work such that one doesn't have to jump through hoops.

I have many times said the most perfect word processing program ever developed was WordPerfect 5.1 for DOS. One's productivity was astonishing. It just worked.

Now we have the bloated behemoth Word which does its utmost to get in the way of you doing your work. The only way to get it to function is to turn large portions of its "features" off, and even then it still insists on doing something other than what you told it to do.

Then we have the abomination of Windows 10, which is nothing but Clippy on 10X steroids. It is patently obvious the people who program this steaming pile have never heard of simplicity. Who in their right mind would think having to "search" for something is more efficient than going directly to it? I would ask whether these people wander around stores "searching" for what they're looking for, but then I realize that's how their entire life is run. They search for everything online rather than going directly to the source. It's no wonder they complain about not having time to do things. They're always searching.

Web sites are another area where these people have no clue what they're doing. Anything that might be useful is hidden behind dropdown menus, flyouts, popup bubbles and intricately designed mazes of clicks needed to get to where you want to go. When someone clicks on a line of products, they shouldn't be harassed about what part of the product line they want to look at. Give them the information and let the user go where they want.

This rant could go on, but this article explains clearly why we have regressed when it comes to software and web design. Instead of making things simple and easy to use, using the one or two brain cells they have, programmers and web designers let the software do what it wants without considering: should it be done like this?

swb ( 14022 ) , Sunday April 29, 2018 @07:48AM ( #56522857 )
Tech industry churn ( Score: 3 )

The tech industry has a ton of churn -- there's some technological advancement, but there's an awful lot of new products turned out simply to keep customers buying new licenses and paying for upgrades.

This relentless and mostly phony newness means a lot of people have little experience with current products. People fake because they have no choice. The good ones understand the general technologies and problems they're meant to solve and can generally get up to speed quickly, while the bad ones are good at faking it but don't really know what they're doing. Telling the difference from the outside is impossible.

Sales people make it worse, promoting people as "experts" in specific products or implementations because the people have experience with a related product and "they're all the same". This burns out the people with good adaptation skills.

DaMattster ( 977781 ) , Sunday April 29, 2018 @08:39AM ( #56522979 )
Interesting ( Score: 3 )

From the summary, it sounds like a lot of programmers and software engineers are trying to develop the next big thing so that they can literally beg for money from the elite class and one day, hopefully, become a member of the aforementioned. It's sad that the middle class has been so utterly decimated in the United States that some of us are willing to beg for scraps from the wealthy. I used to work in IT but I've aged out and am now back in school to learn automotive technology so that I can do something other than being a security guard. Currently, the only work I have been able to find has been in the unglamorous security field.

I am learning some really good new skills in the automotive program that I am in but I hate this one class called "Professionalism in the Shop." I can summarize the entire class in one succinct phrase, "Learn how to appeal to, and communicate with, Mr. Doctor, Mr. Lawyer, or Mr. Wealthy-man." Basically, the class says that we are supposed to kiss their ass so they keep coming back to the Audi, BMW, Mercedes, Volvo, or Cadillac dealership. It feels a lot like begging for money on behalf of my employer (of which very little of it I will see) and nothing like professionalism. Professionalism is doing the job right the first time, not jerking the customer off. Professionalism is not begging for a 5 star review for a few measly extra bucks but doing absolute top quality work. I guess the upshot is that this class will be the easiest 4.0 that I've ever seen.

There is something fundamentally wrong when the wealthy elite have basically demanded that we beg them for every little scrap. I can understand the importance of polite and professional interaction but this prevalent expectation that we bend over backwards for them crosses a line with me. I still suck it up because I have to but it chafes my ass to basically validate the wealthy man.

ElitistWhiner ( 79961 ) writes:
Natural talent... ( Score: 2 )

In the '70s I worked with two people who had a natural talent for computer science algorithms vs. coding syntax. In the '90s, while at COLUMBIA, I worked with only a couple of true computer scientists out of 30 students. I've met 1 genius who programmed, spoke 13 languages, ex-CIA, wrote SWIFT and spoke fluent assembly complete with animated characters.

According to the Bluff Book, everyone else without natural talent fakes it. In the undiluted definition of computer science, genetics roulette and intellectual d

fahrbot-bot ( 874524 ) writes:
Other book sells better and is more interesting ( Score: 2 )
New Book Describes 'Bluffing' Programmers in Silicon Valley

It's not as interesting as the one about "fluffing" [urbandictionary.com] programmers.

Anonymous Coward writes:
Re: ( Score: 3 , Funny)

Ah yes, the good old 80:20 rule, except it's recursive for programmers.

80% are shit, so you fire them. Soon you realize that 80% of the remaining 20% are also shit, so you fire them too. Eventually you realize that 80% of the 4% remaining after sacking the 80% of the 20% are also shit, so you fire them!

...

The cycle repeats until there's just one programmer left: the person telling the joke.

---

tl;dr: All programmers suck. Just ask them to review their own code from more than 3 years ago: they'll tell you that

luis_a_espinal ( 1810296 ) writes:
Re: ( Score: 3 )
Who gives a fuck about lines? If someone gave me JavaScript, and someone gave me minified JavaScript, which one would I want to maintain?

I don't care about your line savings; less isn't always better.

Because the world of programming is not centered on JavaScript, and reduction of lines is not the same as minification. If the first thing that came to your mind when you saw this conversation was minified JavaScript, you are certainly not the type of programmer I would want to inherit code from.

See, there's a lot of shit out there that is overtly redundant and unnecessarily complex. This is especially true when copy-n-paste code monkeys are left to their own devices, for whom code formatting seems

Anonymous Coward , Sunday April 29, 2018 @01:17AM ( #56522241 )
Re:Most "Professional programmers" are useless. ( Score: 4 , Interesting)

I have a theory that 10% of people are good at what they do. It doesn't really matter what they do, they will still be good at it, because of their nature. These are the people who invent new things, who fix things that others didn't even see as broken and who automate routine tasks or simply question and erase tasks that are not necessary. If you have a software team that contains 5 of these, you can easily beat a team of 100 average people, not only in cost but also in schedule, quality and features. In theory they are worth 20 times more than average employees, but in practice they are usually paid the same amount of money, with few exceptions.

80% of people are the average. They can follow instructions and they can get the work done, but they don't see that something is broken and needs fixing if it works the way it has always worked. While it might seem so, these people are not worthless. There are a lot of tasks that these people are happily doing which the 10% don't want to do. E.g. simple maintenance work, implementing simple features, automating test cases etc. But if you let the top 10% lead the project, you most likely won't need that many of these people. Much of the work done by these people is caused by themselves, by writing bad software due to the lack of a good leader.

10% are just causing damage. I'm not talking about terrorists and criminals. I have seen software developers who have tried (their best?), but still end up causing nothing but damage to the code, which someone else then needs to fix, costing much more than their own wasted time. You really must use code reviews if you don't know your team members, to find these people early.

Anonymous Coward , Sunday April 29, 2018 @01:40AM ( #56522299 )
Re:Most "Professional programmers" are useless. ( Score: 5 , Funny)
to find these people early

and promote them to management where they belong.

raymorris ( 2726007 ) , Sunday April 29, 2018 @01:51AM ( #56522329 ) Journal
Seems about right. Constantly learning, studying ( Score: 5 , Insightful)

That seems about right to me.

I have a lot of weaknesses. My people skills suck, I'm scrawny, I'm arrogant. I'm also generally known as a really good programmer and people ask me how/why I'm so much better at my job than everyone else in the room. (There are a lot of things I'm not good at, but I'm good at my job, so say everyone I've worked with.)

I think one major difference is that I'm always studying, intentionally working to improve, every day. I've been doing that for twenty years.

I've worked with people who have "20 years of experience"; they've done the same job, in the same way, for 20 years. Their first month on the job they read the first half of "Databases for Dummies" and that's what they've been doing for 20 years. They never read the second half, and use Oracle database 18.0 exactly the same way they used Oracle Database 2.0 - and it was wrong 20 years ago too. So it's not just experience, it's 20 years of learning, getting better, every day. That's 7,305 days of improvement.

gbjbaanb ( 229885 ) writes:
Re: ( Score: 2 )

I think I can guarantee that they are a lot better at their jobs than you think, and that you are a lot worse at your job than you think too.

m00sh ( 2538182 ) writes:
Re: ( Score: 2 )
That seems about right to me.

I have a lot of weaknesses. My people skills suck, I'm scrawny, I'm arrogant. I'm also generally known as a really good programmer and people ask me how/why I'm so much better at my job than everyone else in the room. (There are a lot of things I'm not good at, but I'm good at my job, so say everyone I've worked with.)

I think one major difference is that I'm always studying, intentionally working to improve, every day. I've been doing that for twenty years.

I've worked with people who have "20 years of experience"; they've done the same job, in the same way, for 20 years. Their first month on the job they read the first half of "Databases for Dummies" and that's what they've been doing for 20 years. They never read the second half, and use Oracle database 18.0 exactly the same way they used Oracle Database 2.0 - and it was wrong 20 years ago too. So it's not just experience, it's 20 years of learning, getting better, every day. That's 7,305 days of improvement.

If you take this attitude towards other people, they will not ask you for help. At the same time, you won't be able to ask for their help.

You're not interviewing your peers. They are already on your team. You should be working together.

I've seen superstar programmers suck the life out of a project by over-complicating things and not working together with others.

raymorris ( 2726007 ) writes:
Which part? Learning makes you better? ( Score: 2 )

You quoted a lot. Is there one part in particular you have in mind? The thesis of my post is, of course, "constant learning, on purpose, makes you better."

> If you take this attitude towards other people, they will not ask you for help. At the same time, you won't be able to ask for their help.

Are you saying that trying to learn means you can't ask for help, or was there something more specific? For me, trying to learn means asking.

Trying to learn, I've had the opportunity to ask for help from peop

phantomfive ( 622387 ) writes:
Re: ( Score: 2 )

The difference between a smart programmer who succeeds and a stupid programmer who drops out is that the smart programmer doesn't give up.

complete loony ( 663508 ) writes:
Re: ( Score: 2 )

In other words:

What is often mistaken for 20 years' experience is just 1 year's experience repeated 20 times.

serviscope_minor ( 664417 ) writes:
Re: ( Score: 2 )

10% are just causing damage. I'm not talking about terrorists and criminals.

Terrorists and criminals have nothing on those guys. I know a guy who is one of those. Worse, he's both motivated and enthusiastic. He also likes to offer help and advice to other people who don't know the systems well.

asifyoucare ( 302582 ) , Sunday April 29, 2018 @08:49AM ( #56522999 )
Re:Most "Professional programmers" are useless. ( Score: 5 , Insightful)

Good point. To quote Kurt von Hammerstein-Equord:

"I divide my officers into four groups. There are clever, diligent, stupid, and lazy officers. Usually two characteristics are combined. Some are clever and diligent -- their place is the General Staff. The next lot are stupid and lazy -- they make up 90 percent of every army and are suited to routine duties. Anyone who is both clever and lazy is qualified for the highest leadership duties, because he possesses the intellectual clarity and the composure necessary for difficult decisions. One must beware of anyone who is stupid and diligent -- he must not be entrusted with any responsibility because he will always cause only mischief."

gweihir ( 88907 ) writes:
Re: ( Score: 2 )

Oops. Good thing I never did anything military. I am definitely in the "clever and lazy" class.

apoc.famine ( 621563 ) writes:
Re: ( Score: 2 )

I was just thinking the same thing. One of my passions in life is coming up with clever ways to do less work while getting more accomplished.

Software_Dev_GL ( 5377065 ) writes:
Re: ( Score: 2 )

It's called the Pareto Distribution [wikipedia.org]. The number of competent people (people doing most of the work) in any given organization goes like the square root of the number of employees.
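Taken at face value (it is the commenter's rule of thumb, not an established law), the arithmetic does explain why large organizations feel less competent than small ones:

sqrt(100)    = 10  competent people in a 100-person shop (10%)
sqrt(10,000) = 100 competent people in a 10,000-person company (1%)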

gweihir ( 88907 ) writes:
Re: ( Score: 2 )

Matches my observations. 10-15% are smart, can think independently, can verify claims by others, and can identify and use rules in whatever they do. They are not fooled by things "everybody knows" and see standard approaches as first approximations that, of course, need to be verified to work. They do not trust anything blindly, but can identify whether something actually works well, and build up a toolbox of such things.

The problem is that in coding, you do not have a "(mass) production step", and that is the

geoskd ( 321194 ) writes:
Re: ( Score: 2 )

In basic concept I agree with your theory; it fits my own anecdotal experience well, but I find that your numbers are off. The top bracket is actually closer to 20%. The reason it seems so low is that a large portion of the highly competent people are running one-programmer shows, so they have no co-workers to appreciate their knowledge and skill. The places they work do a very good job of keeping them well paid and happy (assuming they don't own the company outright), so they rarely if ever switch jobs.

The

Tablizer ( 95088 ) , Sunday April 29, 2018 @01:54AM ( #56522331 ) Journal
Re:Most "Professional programmers" are useless. ( Score: 4 , Interesting)
at least 70, probably 80, maybe even 90 percent of professional programmers should just fuck off and do something else as they are useless at programming.

Programming is statistically a dead-end job. Why should anyone hone a dead-end skill that they won't be able to use for long? For whatever reason, the industry doesn't want old programmers.

Otherwise, I'd suggest longer training and education before they enter the industry. But that just narrows an already narrow window of use.

Cesare Ferrari ( 667973 ) writes:
Re: ( Score: 2 )

Well, it does rather depend on which industry you work in - I've managed to find interesting programming jobs for 25 years, and there's no end in sight for interesting projects and new avenues to explore. However, this isn't for everyone, and if you have good personal skills then moving from programming into some technical management role is a very worthwhile route, and I know plenty of people who have found very interesting work in that direction.

gweihir ( 88907 ) writes:
Re: ( Score: 3 , Insightful)

I think that is a misinterpretation of the facts. Old(er) coders who are incompetent are just much more obvious, and are usually also limited to technologies that have gotten old as well. Hence the 90% of old coders who cannot actually hack it (and never really could) get sacked at some point and cannot find a new job with their limited and outdated skills. The 10% who are good at it do not need to worry, though. Who worries there is their employers, when these people approach retirement age.

gweihir ( 88907 ) writes:
Re: ( Score: 2 )

My experience as an IT Security Consultant (I also do some coding, but only at full rates) confirms that. Most are basically helpless and many have negative productivity, because people with a clue need to clean up after them. "Learn to code"? We have far too many coders already.

tomhath ( 637240 ) writes:
Re: ( Score: 2 )

You can't bluff your way through writing software, but many, many people have bluffed their way into a job and then tried to learn it from the people who are already there. In a marginally functional organization those incompetents are let go pretty quickly, but sometimes they stick around for months or years.

Apparently the author of this book is one of those, probably hired and fired several times before deciding to go back to his liberal arts roots and write a book.

DaMattster ( 977781 ) writes:
Re: ( Score: 2 )

There are some mechanics who bluff their way through an automotive repair. It's the same damn thing.

gweihir ( 88907 ) writes:
Re: ( Score: 2 )

I think you can and this is by far not the first piece describing that. Here is a classic: https://blog.codinghorror.com/... [codinghorror.com]
Yet these people somehow manage to actually have "experience" because they worked in a role they are completely unqualified to fill.

phantomfive ( 622387 ) writes:
Re: ( Score: 2 )
Fiddling with JavaScript libraries to get a fancy dancy interface that makes PHBs happy is a sought-after skill, for good or bad. Now that we rely more on half-ass libraries, much of "programming" is fiddling with dark-grey boxes until they work well enough.

This drives me crazy, but I'm consoled somewhat by the fact that it will all be thrown out in five years anyway.

[Nov 30, 2017] Will Robots Kill the Asian Century

This article is two years old and not much has happened during those two years. But there is still a chance that highly automated factories can make manufacturing in the USA profitable again. The problem is that they will be even more profitable in East Asia ;-)
The National Interest

The rise of technologies such as 3-D printing and advanced robotics means that the next few decades for Asia's economies will not be as easy or promising as the previous five.

OWEN HARRIES, the first editor, together with Robert Tucker, of The National Interest, once reminded me that experts-economists, strategists, business leaders and academics alike-tend to be relentless followers of intellectual fashion, and the learned are, as Harold Rosenberg famously put it, a "herd of independent minds." Nowhere is this observation more apparent than in the prediction that we are already into the second decade of what will inevitably be an "Asian Century"-a widely held but rarely examined view that Asia's continued economic rise will decisively shift global power from the Atlantic to the western Pacific Ocean.

No doubt the numbers appear quite compelling. In 1960, East Asia accounted for a mere 14 percent of global GDP; today that figure is about 27 percent. If linear trends continue, the region could account for about 36 percent of global GDP by 2030 and over half of all output by the middle of the century. As if symbolic of a handover of economic preeminence, China, which only accounted for about 5 percent of global GDP in 1960, will likely surpass the United States as the largest economy in the world over the next decade. If past record is an indicator of future performance, then the "Asian Century" prediction is close to a sure thing.

[Nov 29, 2017] Take This GUI and Shove It

Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI.
Notable quotes:
"... Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI. ..."
"... What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers. ..."
"... AIX's SMIT did this, or rather it wrote the commands that it executed to achieve what you asked it to do. This meant that you could learn: look at what it did and find out about which CLI commands to run. You could also take them, build them into a script, copy elsewhere, ... I liked SMIT. ..."
"... Cisco's GUI stuff doesn't really generate any scripts, but the commands it creates are the same things you'd type into a CLI. And the resulting configuration is just as human-readable (barring any weird naming conventions) as one built using the CLI. I've actually learned an awful lot about the Cisco CLI by using their GUI. ..."
"... Microsoft's more recent tools are also doing this. Exchange 2007 and newer, for example, are really completely driven by the PowerShell CLI. The GUI generates commands and just feeds them into PowerShell for you. So you can again issue your commands through the GUI, and learn how you could have done it in PowerShell instead. ..."
"... Moreover, the GUI authors seem to have a penchant to find new names for existing CLI concepts. Even worse, those names are usually inappropriate vagueries quickly cobbled together in an off-the-cuff afterthought, and do not actually tell you where the doodad resides in the menu system. With a CLI, the name of the command or feature set is its location. ..."
"... I have a cheap router with only a web gui. I wrote a two line bash script that simply POSTs the right requests to URL. Simply put, HTTP interfaces, especially if they implement the right response codes, are actually very nice to script. ..."
Slashdot

Deep End's Paul Venezia speaks out against the overemphasis on GUIs in today's admin tools, saying that GUIs are fine and necessary in many cases, but only after a complete CLI is in place, and that they cannot interfere with the use of the CLI, only complement it. Otherwise, the GUI simply makes easy things easy and hard things much harder. He writes, 'If you have to make significant, identical changes to a bunch of Linux servers, is it easier to log into them one-by-one and run through a GUI or text-menu tool, or write a quick shell script that hits each box and either makes the changes or simply pulls down a few new config files and restarts some services? And it's not just about conservation of effort - it's also about accuracy. If you write a script, you're certain that the changes made will be identical on each box. If you're doing them all by hand, you aren't.'"
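The change-a-bunch-of-servers script Venezia describes is only a few lines of shell. A minimal sketch, assuming key-based ssh as root and using made-up names (hosts.txt, app.conf, a service called "myapp"):

#!/bin/bash
# Push the same config file to every server, then restart the affected service.
for host in $(cat hosts.txt); do
    scp app.conf "root@${host}:/etc/myapp/app.conf"   # identical change everywhere
    ssh "root@${host}" 'service myapp restart'        # apply it
done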

alain94040 (785132)

Here is a Link to the print version of the article [infoworld.com] (that conveniently fits on 1 page instead of 3).

Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI.

A bad GUI with no CLI is the worst of both worlds; the author of the article got that right. The 80/20 rule applies: 80% of the work is common to everyone and should be offered via a GUI. And for the 20% that is custom to each sysadmin, well, use the CLI.

maxwell demon:

What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers.

0123456 (636235) writes:

What would be nice is if the GUI could automatically create a shell script doing the change.

While it's not quite the same thing, our GUI-based home router has an option to download the config as a text file so you can automatically reconfigure it from that file if it has to be reset to defaults. You could presumably use sed to change IP addresses, etc, and copy it to a different router. Of course it runs Linux.
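For instance (subnet and file names invented for illustration), rewriting the LAN addresses in the downloaded config before feeding it to a second router:

sed 's/192\.168\.1\./192.168.2./g' router-a.cfg > router-b.cfg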

Alain Williams:

AIX's SMIT did this, or rather it wrote the commands that it executed to achieve what you asked it to do. This meant that you could learn: look at what it did and find out about which CLI commands to run. You could also take them, build them into a script, copy elsewhere, ... I liked SMIT.

Ephemeriis:

What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers.

Cisco's GUI stuff doesn't really generate any scripts, but the commands it creates are the same things you'd type into a CLI. And the resulting configuration is just as human-readable (barring any weird naming conventions) as one built using the CLI. I've actually learned an awful lot about the Cisco CLI by using their GUI.

We've just started working with Aruba hardware. Installed a mobility controller last week. They've got a GUI that does something similar. It's all a pretty web-based front-end, but it again generates CLI commands and a human-readable configuration. I'm still very new to the platform, but I'm already learning about their CLI through the GUI. And getting work done that I wouldn't be able to if I had to look up the CLI commands for everything.

Microsoft's more recent tools are also doing this. Exchange 2007 and newer, for example, are really completely driven by the PowerShell CLI. The GUI generates commands and just feeds them into PowerShell for you. So you can again issue your commands through the GUI, and learn how you could have done it in PowerShell instead.

Anpheus:

Just about every Microsoft tool newer than 2007 does this. Virtual machine manager, SQL Server has done it for ages, I think almost all the system center tools do, etc.

It's a huge improvement.

PoV:

All good admins document their work (don't they? DON'T THEY?). With a CLI or a script that's easy: it comes down to "log in as user X, change to directory Y, run script Z with arguments A B and C - the output should look like D". Try that when all you have is a GLUI (like a GUI, but you get stuck): open this window, select that option, drag a slider, check these boxes, click Yes, three times. The output might look a little like this blurry screen shot and the only record of a successful execution is a window that disappears as soon as the application ends.

I suppose the Linux community should be grateful that Windows made the fundamental systems design error of making everything graphical. Without that basic failure, Linux might never have even got the toehold it has now.

skids:

I think this is a stronger point than the OP: GUIs do not lead to good documentation. In fact, GUIs pretty much are limited to procedural documentation like the example you gave.

The best they can do as far as actual documentation, where the precise effect of all the widgets is explained, is a screenshot with little quote bubbles pointing to each doodad. That's a ridiculous way to document.

This is as opposed to a command reference which can organize, usually in a pretty sensible fashion, exact descriptions of what each command does.

Moreover, the GUI authors seem to have a penchant to find new names for existing CLI concepts. Even worse, those names are usually inappropriate vagueries quickly cobbled together in an off-the-cuff afterthought, and do not actually tell you where the doodad resides in the menu system. With a CLI, the name of the command or feature set is its location.

Not that even good command references are mandatory by today's pathetic standards. Even the big boys like Cisco have shown major degradation in the quality of their documentation during the last decade.

pedantic bore:

I think the author might not fully understand who most admins are. They're people who couldn't write a shell script if their lives depended on it, because they've never had to. GUI-dependent users become GUI-dependent admins.

As a percentage of computer users, people who can actually navigate a CLI are an ever-diminishing group.

arth1: /etc/resolv.conf

# Red Hat-style recipe: stop NetworkManager and hand the interfaces
# back to the classic network scripts
/etc/init.d/NetworkManager stop                 # stop the running daemon
chkconfig NetworkManager off                    # keep it from starting at boot
chkconfig network on                            # enable the legacy network service
vi /etc/sysconfig/network                       # hostname, gateway, etc.
vi /etc/sysconfig/network-scripts/ifcfg-eth0    # per-interface config (note: ifcfg-eth0)

At least they named it NetworkManager, so experienced admins could recognize it as a culprit. Anything named in CamelCase is almost invariably written by new school programmers who don't grok the Unix toolbox concept and write applications instead of tools, and the bloated drivel is usually best avoided.

Darkness404 (1287218) writes: on Monday October 04, @07:21PM (#33789446)

There are more and more small businesses (5, 10 or so employees) realizing that they can get things done more easily if they have a server. Because the business can't really afford to hire a sysadmin or a full-time tech person, it's generally the employee who "knows computers" (you know, the person who has to help the boss check his e-mail every day, etc.), and since they don't have the knowledge of a skilled *nix admin, a GUI makes their administration a lot easier.

So with the increasing use of servers among non-admins, it only makes sense for a growth in GUI-based solutions.

Svartalf (2997) writes: Ah... But the thing is... You don't NEED the GUI with recent Linux systems- you do with Windows.

oatworm (969674) writes: on Monday October 04, @07:38PM (#33789624) Homepage

Bingo. Realistically, if you're a company with fewer than 100 employees (read: most companies), you're only going to have a handful of servers in house and they're each going to be dedicated to particular roles. You're not going to have 100 clustered fileservers - instead, you're going to have one or maybe two. You're not going to have a dozen e-mail servers - instead, you're going to have one or two. Consequently, the office admin's focus isn't going to be scalability; it just won't matter to the admin if they can script, say, creating a mailbox for 100 new users instead of just one. Instead, said office admin is going to be more focused on finding ways to do semi-unusual things (e.g. "create a VPN between this office and our new branch office", "promote this new server as a domain controller", "install SQL", etc.) that they might do, oh, once a year.

The trouble with Linux, and I'm speaking as someone who's used YaST in precisely this context, is that you have to make a choice - do you let the GUI manage it or do you CLI it? If you try to do both, there will be inconsistencies because the grammar of the config files is too ambiguous; consequently, the GUI config file parser will probably just overwrite whatever manual changes it thinks is "invalid", whether it really is or not. If you let the GUI manage it, you better hope the GUI has the flexibility necessary to meet your needs. If, for example, YaST doesn't understand named Apache virtual hosts, well, good luck figuring out where it's hiding all of the various config files that it was sensibly spreading out in multiple locations for you, and don't you dare use YaST to manage Apache again or it'll delete your Apache-legal but YaST-"invalid" directive.

The only solution I really see is for manual config file support with optional XML (or some other machine-friendly but still human-readable format) linkages. For example, if you want to hand-edit your resolv.conf, that's fine, but if the GUI is going to take over, it'll toss a directive on line 1 that says "#import resolv.conf.xml" and immediately overrides (but does not overwrite) everything following that. Then, if you still want to use the GUI but need to hand-edit something, you can edit the XML file using the appropriate syntax and know that your change will be reflected on the GUI.
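A sketch of the scheme proposed above; the #import directive, file names, and XML vocabulary are all invented for illustration:

# /etc/resolv.conf -- hand-edited lines below stay visible, but everything
# after the directive is overridden (not overwritten) by the GUI's file
#import resolv.conf.xml
nameserver 192.0.2.1

<!-- resolv.conf.xml, owned by the GUI; hand edits here still show up in the GUI -->
<resolv>
  <nameserver>192.0.2.53</nameserver>
</resolv>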

That's my take. Your mileage, of course, may vary.

icebraining (1313345) writes: on Monday October 04, @07:24PM (#33789494) Homepage

I have a cheap router with only a web GUI. I wrote a two-line bash script that simply POSTs the right requests to a URL. Simply put, HTTP interfaces, especially if they implement the right response codes, are actually very nice to script.
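Such a script can indeed be little more than a curl call. A sketch only: the credentials, endpoint, and form fields are hypothetical and depend entirely on the router's firmware:

#!/bin/bash
# POST a settings form to the router's web GUI and apply it.
curl -s -u admin:secret --data 'lan_ipaddr=192.168.1.1&apply=1' http://192.168.1.1/apply.cgi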

devent (1627873) writes:

Why Windows servers have a GUI is beyond me anyway. The servers are running 99.99% of the time without a monitor, and normally you just log in via ssh to a console if you need to administer them. But they are consuming the extra RAM, the extra CPU cycles and the extra security threats. I don't know, but can you de-install the GUI from a Windows server? Or better, do you have an option for a no-GUI installation? Just saw the minimum hardware requirements: 512 MB RAM and 32 GB or greater disk space. My server runs

sirsnork (530512) writes: on Monday October 04, @07:43PM (#33789672)

it's called a "core" install in Server 2008 and up, and if you do that, there is no going back, you can't ever add the GUI back.

What this means is you can run a small subset of MS services that don't need GUI interaction. With R2 that subset grew somwhat as they added the ability to install .Net too, which mean't you could run IIS in a useful manner (arguably the strongest reason to want to do this in the first place).

Still it's a one way trip and you better be damn sure what services need to run on that box for the lifetime of that box or you're looking at a reinstall. Most windows admins will still tell you the risk isn't worth it.

Simple things like network configuration without a GUI in windows is tedious, and, at least last time i looked, you lost the ability to trunk network poers because the NIC manufactuers all assumed you had a GUI to configure your NICs

prichardson (603676) writes: on Monday October 04, @07:27PM (#33789520) Journal

This is also a problem with Mac OS X Server. Apple builds its services from open source products and adds a GUI for configuration to make it all clickable and easy to set up. However, many options that can be set on the command line can't be set in the GUI. Even worse, making CLI changes to services can break the GUI entirely.

The hardware and software are both super stable and run really smoothly, so once everything gets set up, it's awesome. Still, it's hard for a guy who would rather make changes on the CLI to get used to.

MrEricSir (398214) writes:

Just because you're used to a CLI doesn't make it better. Why would I want to read a bunch of documentation, mess with command line options, then read whole block of text to see what it did? I'd much rather sit back in my chair, click something, and then see if it worked. Don't make me read a bunch of man pages just to do a simple task. In essence, the question here is whether it's okay for the user to be lazy and use a GUI, or whether the programmer should be too lazy to develop a GUI.

ak_hepcat (468765) writes: <leif@MENCKENdenali.net minus author> on Monday October 04, @07:38PM (#33789626) Homepage Journal

Probably because it's also about the ease of troubleshooting issues.

How do you troubleshoot something with a GUI after you've misconfigured? How do you troubleshoot a programming error (bug) in the GUI -> device communication? How do you scale to tens, hundreds, or thousands of devices with a GUI?

CLI makes all this easier and more manageable.

arth1 (260657) writes:

Why would I want to read a bunch of documentation, mess with command line options, then read a whole block of text to see what it did? I'd much rather sit back in my chair, click something, and then see if it worked. Don't make me read a bunch of man pages just to do a simple task.

Because then you'll be stuck doing simple tasks, and will never be able to do more advanced tasks. Without hiring a team to write an app for you instead of doing it yourself in two minutes, that is. The time you spend reading man

fandingo (1541045) writes: on Monday October 04, @07:54PM (#33789778)

I don't think you really understand systems administration. 'Users,' or in this case admins, don't typically do stuff once. Furthermore, they need to know what they did and how to do it again (i.e., on a new server or whatever), or at least remember what they did. One-off stuff isn't common, and is a sign of poor administration (i.e. not tracking changes or following processes).

What I'm trying to get at is that admins shouldn't do anything without reading the manual. As a Windows/Linux admin, I tend to find Linux easier to properly administer because I either already know how to perform an operation or I have to read the manual (manpage) and learn a decent amount about the operation (i.e. more than click here/use this flag).

Don't get me wrong, GUIs can make unknown operations significantly easier, but they often lead to poor process management. To document processes, screenshots are typically needed. They can be done well, but I find that GUI documentation (created by admins, not vendor docs) tends to be of very low quality. It is also vulnerable to 'upgrades' where vendors change the interface design. CLI programs typically have more stable interfaces, but maybe that's just because they have been around longer...

maotx (765127) writes: <maotx@NoSPAM.yahoo.com> on Monday October 04, @07:42PM (#33789666)

That's one thing Microsoft did right with Exchange 2007. They built it entirely around their new PowerShell CLI and then built a GUI on top of it. The GUI is limited compared to what you can do with the CLI, but you can get most things done. The CLI becomes extremely handy for batch jobs and for exporting statistics to CSV files. I'd say it's really up there with bash in terms of scripting, data manipulation, and integration (not just Exchange but WMI, SQL, etc.)

They tried to do similar with Windows 2008 and their Core [petri.co.il] feature, but they still have to load a GUI to present a prompt...

Charles Dodgeson (248492) writes: <jeffrey@goldmark.org> on Monday October 04, @08:51PM (#33790206) Homepage Journal

Probably Debian would have been OK, but I was finding admin of most Linux distros a pain for exactly these reasons. I couldn't find a layer where I could do everything that I needed to do without worrying about one thing stepping on another. No doubt there are ways that I could manage a Linux system without running into different layers of management tools stepping on each other, but it was a struggle.

There were other reasons as well (although there is a lot that I miss about Linux), but I think that this was one of the leading reasons.

(NB: I realize that this is flamebait (I've got karma to burn), but that isn't my intention here.)

[Nov 28, 2017] Sometimes the Old Ways Are Best by Brian Kernighan

Notable quotes:
"... Sometimes the old ways are best, and they're certainly worth knowing well ..."
Nov 01, 2008 | IEEE Software, pp.18-19

As I write this column, I'm in the middle of two summer projects; with luck, they'll both be finished by the time you read it.

... ... ...

Here has surely been much progress in tools over the 25 years that IEEE Software has been around, and I wouldn't want to go back in time.

But the tools I use today are mostly the same old ones-grep, diff, sort, awk, and friends. This might well mean that I'm a dinosaur stuck in the past.

On the other hand, when it comes to doing simple things quickly, I can often have the job done while experts are still waiting for their IDE to start up. Sometimes the old ways are best, and they're certainly worth knowing well.

[Nov 28, 2017] Rees Re OO

Notable quotes:
"... The conventional Simula 67-like pattern of class and instance will get you {1,3,7,9}, and I think many people take this as a definition of OO. ..."
"... Because OO is a moving target, OO zealots will choose some subset of this menu by whim and then use it to try to convince you that you are a loser. ..."
"... In such a pack-programming world, the language is a constitution or set of by-laws, and the interpreter/compiler/QA dept. acts in part as a rule checker/enforcer/police force. Co-programmers want to know: If I work with your code, will this help me or hurt me? Correctness is undecidable (and generally unenforceable), so managers go with whatever rule set (static type system, language restrictions, "lint" program, etc.) shows up at the door when the project starts. ..."
Nov 04, 2017 | www.paulgraham.com

(Jonathan Rees had a really interesting response to Why Arc isn't Especially Object-Oriented , which he has allowed me to reproduce here.)

Here is an a la carte menu of features or properties that are related to these terms; I have heard OO defined to be many different subsets of this list.

  1. Encapsulation - the ability to syntactically hide the implementation of a type. E.g. in C or Pascal you always know whether something is a struct or an array, but in CLU and Java you can hide the difference.
  2. Protection - the inability of the client of a type to detect its implementation. This guarantees that a behavior-preserving change to an implementation will not break its clients, and also makes sure that things like passwords don't leak out.
  3. Ad hoc polymorphism - functions and data structures with parameters that can take on values of many different types.
  4. Parametric polymorphism - functions and data structures that parameterize over arbitrary values (e.g. list of anything). ML and Lisp both have this. Java doesn't quite because of its non-Object types.
  5. Everything is an object - all values are objects. True in Smalltalk (?) but not in Java (because of int and friends).
  6. All you can do is send a message (AYCDISAM) = Actors model - there is no direct manipulation of objects, only communication with (or invocation of) them. The presence of fields in Java violates this.
  7. Specification inheritance = subtyping - there are distinct types known to the language with the property that a value of one type is as good as a value of another for the purposes of type correctness. (E.g. Java interface inheritance.)
  8. Implementation inheritance/reuse - having written one pile of code, a similar pile (e.g. a superset) can be generated in a controlled manner, i.e. the code doesn't have to be copied and edited. A limited and peculiar kind of abstraction. (E.g. Java class inheritance.)
  9. Sum-of-product-of-function pattern - objects are (in effect) restricted to be functions that take as first argument a distinguished method key argument that is drawn from a finite set of simple names.

So OO is not a well defined concept. Some people (eg. Abelson and Sussman?) say Lisp is OO, by which they mean {3,4,5,7} (with the proviso that all types are in the programmers' heads). Java is supposed to be OO because of {1,2,3,7,8,9}. E is supposed to be more OO than Java because it has {1,2,3,4,5,7,9} and almost has 6; 8 (subclassing) is seen as antagonistic to E's goals and not necessary for OO.

The conventional Simula 67-like pattern of class and instance will get you {1,3,7,9}, and I think many people take this as a definition of OO.

Because OO is a moving target, OO zealots will choose some subset of this menu by whim and then use it to try to convince you that you are a loser.

Perhaps part of the confusion - and you say this in a different way in your little memo - is that the C/C++ folks see OO as a liberation from a world that has nothing resembling first-class functions, while Lisp folks see OO as a prison since it limits their use of functions/objects to the style of (9). In that case, the only way OO can be defended is in the same manner as any other game or discipline -- by arguing that by giving something up (e.g. the freedom to throw eggs at your neighbor's house) you gain something that you want (assurance that your neighbor won't put you in jail).

This is related to Lisp being oriented to the solitary hacker and discipline-imposing languages being oriented to social packs, another point you mention. In a pack you want to restrict everyone else's freedom as much as possible to reduce their ability to interfere with and take advantage of you, and the only way to do that is by either becoming chief (dangerous and unlikely) or by submitting to the same rules that they do. If you submit to rules, you then want the rules to be liberal so that you have a chance of doing most of what you want to do, but not so liberal that others nail you.

In such a pack-programming world, the language is a constitution or set of by-laws, and the interpreter/compiler/QA dept. acts in part as a rule checker/enforcer/police force. Co-programmers want to know: If I work with your code, will this help me or hurt me? Correctness is undecidable (and generally unenforceable), so managers go with whatever rule set (static type system, language restrictions, "lint" program, etc.) shows up at the door when the project starts.

I recently contributed to a discussion of anti-OO on the e-lang list. My main anti-OO message (actually it only attacks points 5/6) was http://www.eros-os.org/pipermail/e-lang/2001-October/005852.html . The followups are interesting but I don't think they're all threaded properly.


[Nov 28, 2017] Sometimes the Old Ways Are Best by Brian Kernighan

Nov 01, 2008 | IEEE Software, pp.18-19

As I write this column, I'm in the middle of two summer projects; with luck, they'll both be finished by the time you read it.

... ... ...

Here has surely been much progress in tools over the 25 years that IEEE Software has been around, and I wouldn't want to go back in time.

But the tools I use today are mostly the same old ones-grep, diff, sort, awk, and friends. This might well mean that I'm a dinosaur stuck in the past.

On the other hand, when it comes to doing simple things quickly, I can often have the job done while experts are still waiting for their IDE to start up. Sometimes the old ways are best, and they're certainly worth knowing well.

[Nov 27, 2017] Stop Writing Classes

Notable quotes:
"... If there's something I've noticed in my career that is that there are always some guys that desperately want to look "smart" and they reflect that in their code. ..."
Nov 27, 2017 | www.youtube.com

Tom coAdjoint , 1 year ago

My god I wish the engineers at my work understood this

kobac , 2 years ago

If there's something I've noticed in my career, it's that there are always some guys who desperately want to look "smart", and they reflect that in their code.

If there's something else that I've noticed in my career, it's that their code is the hardest to maintain, and that for some reason they want the rest of the team to depend on them, since they are the only ones "smart enough" to understand that code and change it. Needless to say, these guys are not part of my team. Your code should be direct, simple and readable. End of story.
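
For context, the central example of that talk, reconstructed here as a rough sketch (the names are illustrative): a class whose only method besides __init__ is called in one place is just a function with extra ceremony.

    # Overdesigned: a class holding one value and one method.
    class Greeting:
        def __init__(self, greeting="hello"):
            self.greeting = greeting

        def greet(self, name):
            return "%s %s" % (self.greeting, name)

    print(Greeting("hi").greet("world"))   # hi world

    # Direct, simple, readable: the same behavior as a plain function.
    def greet(name, greeting="hello"):
        return "%s %s" % (greeting, name)

    print(greet("world", "hi"))            # hi world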

[Nov 27, 2017] The Robot Productivity Paradox and the concept of the bezzle

This concept of the "bezzle" is an important one.
Notable quotes:
"... "In many ways the effect of the crash on embezzlement was more significant than on suicide. To the economist embezzlement is the most interesting of crimes. Alone among the various forms of larceny it has a time parameter. Weeks, months or years may elapse between the commission of the crime and its discovery. (This is a period, incidentally, when the embezzler has his gain and the man who has been embezzled, oddly enough, feels no loss. There is a net increase in psychic wealth.) ..."
"... At any given time there exists an inventory of undiscovered embezzlement in – or more precisely not in – the country's business and banks. ..."
"... This inventory – it should perhaps be called the bezzle – amounts at any moment to many millions [trillions!] of dollars. It also varies in size with the business cycle. ..."
"... In good times people are relaxed, trusting, and money is plentiful. But even though money is plentiful, there are always many people who need more. Under these circumstances the rate of embezzlement grows, the rate of discovery falls off, and the bezzle increases rapidly. ..."
"... In depression all this is reversed. Money is watched with a narrow, suspicious eye. The man who handles it is assumed to be dishonest until he proves himself otherwise. Audits are penetrating and meticulous. Commercial morality is enormously improved. The bezzle shrinks ..."
Feb 22, 2017 | econospeak.blogspot.com

Sandwichman -> Sandwichman ... February 24, 2017 at 08:36 AM

John Kenneth Galbraith, from "The Great Crash 1929":

"In many ways the effect of the crash on embezzlement was more significant than on suicide. To the economist embezzlement is the most interesting of crimes. Alone among the various forms of larceny it has a time parameter. Weeks, months or years may elapse between the commission of the crime and its discovery. (This is a period, incidentally, when the embezzler has his gain and the man who has been embezzled, oddly enough, feels no loss. There is a net increase in psychic wealth.)

At any given time there exists an inventory of undiscovered embezzlement in – or more precisely not in – the country's business and banks.

This inventory – it should perhaps be called the bezzle – amounts at any moment to many millions [trillions!] of dollars. It also varies in size with the business cycle.

In good times people are relaxed, trusting, and money is plentiful. But even though money is plentiful, there are always many people who need more. Under these circumstances the rate of embezzlement grows, the rate of discovery falls off, and the bezzle increases rapidly.

In depression all this is reversed. Money is watched with a narrow, suspicious eye. The man who handles it is assumed to be dishonest until he proves himself otherwise. Audits are penetrating and meticulous. Commercial morality is enormously improved. The bezzle shrinks."

Sanwichman, February 24, 2017 at 05:24 AM

For nearly half a century, from 1947 to 1996, real GDP and real Net Worth of Households and Non-profit Organizations (in 2009 dollars) both increased at a compound annual rate of a bit over 3.5%. GDP growth, in fact, was just a smidgen faster -- 0.016% -- than growth of Net Household Worth.

From 1996 to 2015, GDP grew at a compound annual rate of 2.3% while Net Worth increased at the rate of 3.6%....

-- Sanwichman
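
The growth rates quoted above are compound annual rates. For readers who want to check such claims against the FRED series themselves, the arithmetic is a one-liner; the numbers below are synthetic, for illustration only:

    def cagr(start_value, end_value, years):
        """Compound annual growth rate between two observations."""
        return (end_value / start_value) ** (1.0 / years) - 1

    # Sanity check with made-up data: a series growing 3.5%/yr over the
    # 49 years from 1947 to 1996 ends up about 5.4x its starting level.
    print(round(cagr(100.0, 100.0 * 1.035 ** 49, 49), 4))   # 0.035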

anne -> anne... February 24, 2017 at 05:25 AM

https://fred.stlouisfed.org/graph/?g=cOU6

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1952-2016

(Indexed to 1952)

https://fred.stlouisfed.org/graph/?g=cPq1

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1992-2016

(Indexed to 1992)

anne -> Sandwichman ... February 24, 2017 at 03:35 PM

The real home price index extends from 1890. From 1890 to 1996, the index increased slightly faster than inflation so that the index was 100 in 1890 and 113 in 1996. However from 1996 the index advanced to levels far beyond any previously experienced, reaching a high above 194 in 2006. Previously the index high had been just above 130.

Though the index fell from 2006, the level in 2016 is above 161, a level only reached when the housing bubble had formed in late 2003-early 2004.

Real home prices are again strikingly high:

http://www.econ.yale.edu/~shiller/data.htm

anne -> Sandwichman ... February 24, 2017 at 03:34 PM

Valuation

The Shiller 10-year price-earnings ratio is currently 29.34, so the inverse, or the earnings rate, is 3.41%. The dividend yield is 1.93%. So an expected yearly return over the coming 10 years would be 3.41 + 1.93, or 5.34%, provided the price-earnings ratio stays the same, and before investment costs.

Against the 5.34% yearly expected return on stock over the coming 10 years, the current 10-year Treasury bond yield is 2.32%.

The risk premium for stocks is 5.34 - 2.32 or 3.02%:

http://www.econ.yale.edu/~shiller/data.htm
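
The valuation arithmetic in the comment above is easy to reproduce; the numbers are as quoted there (CAPE is the cyclically adjusted price-earnings ratio):

    cape = 29.34
    earnings_yield = 100.0 / cape                       # about 3.41%
    dividend_yield = 1.93
    expected_return = earnings_yield + dividend_yield   # about 5.34%
    treasury_10yr = 2.32
    risk_premium = expected_return - treasury_10yr      # about 3.02%
    print(round(earnings_yield, 2), round(expected_return, 2),
          round(risk_premium, 2))                       # 3.41 5.34 3.02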

anne -> anne..., February 24, 2017 at 05:36 AM

What the robot-productivity paradox is puzzles me, other than that since 2005, for all the focus on the productivity of robots and on robots replacing labor, there has been a dramatic, broad-based slowing in productivity growth.

However, what the changing relationship between the growth of GDP and net worth since 1996 shows is that asset valuations have been increasing relative to GDP. Valuations of stocks and homes are at sustained levels that are higher than at any time in the last 120 years. Bear markets in stocks and home prices have still left asset valuations at historically high levels. I have no idea why this should be.

Sandwichman -> anne... February 24, 2017 at 08:34 AM

The paradox is that productivity statistics can't tell us anything about the effects of robots on employment because both the numerator and the denominator are distorted by the effects of colossal Ponzi bubbles.

John Kenneth Galbraith used to call it "the bezzle." It is "that increment to wealth that occurs during the magic interval when a confidence trickster knows he has the money he has appropriated but the victim does not yet understand that he has lost it." The current size of the gross national bezzle (GNB) is approximately $24 trillion.

Ponzilocks and the Twenty-Four Trillion Dollar Question

http://econospeak.blogspot.ca/2017/02/ponzilocks-and-twenty-four-trillion.html

Twenty-three and a half trillion, actually. But what's a few hundred billion? Here today, gone tomorrow, as they say.

At the beginning of 2007, net worth of households and non-profit organizations exceeded its 1947-1996 historical average, relative to GDP, by some $16 trillion. It took 24 months to wipe out eighty percent, or $13 trillion, of that colossal but ephemeral slush fund. In mid-2016, net worth stood at a multiple of 4.83 times GDP, compared with the multiple of 4.72 on the eve of the Great Unworthing.

When I look at the ragged end of the chart I posted yesterday, it screams "Ponzi!" "Ponzi!" "Ponz..."

To make a long story short, let's think of wealth as capital. The value of capital is determined by the present value of an expected future income stream. The value of capital fluctuates with changing expectations but when the nominal value of capital diverges persistently and significantly from net revenues, something's got to give. Either economic growth is going to suddenly gush forth "like nobody has ever seen before" or net worth is going to have to come back down to earth.
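
The identity the comment leans on is the standard present-value formula; a small sketch with hypothetical cash flows:

    def present_value(cashflows, discount_rate):
        """PV of an expected income stream: sum of CF_t / (1 + r)**t."""
        return sum(cf / (1.0 + discount_rate) ** t
                   for t, cf in enumerate(cashflows, start=1))

    # The same expected stream is worth less when the discount rate rises,
    # which is why stretched valuations imply either faster growth ahead
    # or a fall in net worth.
    stream = [100.0] * 10
    print(round(present_value(stream, 0.05), 2))   # 772.17
    print(round(present_value(stream, 0.08), 2))   # 671.01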

Somewhere between 20 and 30 TRILLION dollars of net worth will evaporate within the span of perhaps two years.

When will that happen? Who knows? There is one notable regularity in the data, though -- the one that screams "Ponzi!"

When the net worth bubble stops going up...
...it goes down.

[Nov 27, 2017] The productivity paradox by Ryan Avent

Notable quotes:
"... But the economy does not feel like one undergoing a technology-driven productivity boom. In the late 1990s, tech optimism was everywhere. At the same time, wages and productivity were rocketing upward. The situation now is completely different. The most recent jobs reports in America and Britain tell the tale. Employment is growing, month after month after month. But wage growth is abysmal. So is productivity growth: not surprising in economies where there are lots of people on the job working for low pay. ..."
"... Increasing labour costs by making the minimum wage a living wage would increase the incentives to boost productivity growth? No, the neoliberals and corporate Democrats would never go for it. They're trying to appeal to the business community and their campaign contributors wouldn't like it. ..."
Mar 20, 2017 | medium.com

People are worried about robots taking jobs. Driverless cars are around the corner. Restaurants and shops increasingly carry the option to order by touchscreen. Google's clever algorithms provide instant translations that are remarkably good.

But the economy does not feel like one undergoing a technology-driven productivity boom. In the late 1990s, tech optimism was everywhere. At the same time, wages and productivity were rocketing upward. The situation now is completely different. The most recent jobs reports in America and Britain tell the tale. Employment is growing, month after month after month. But wage growth is abysmal. So is productivity growth: not surprising in economies where there are lots of people on the job working for low pay.

The obvious conclusion, the one lots of people are drawing, is that the robot threat is totally overblown: the fantasy, perhaps, of a bubble-mad Silicon Valley - or an effort to distract from workers' real problems, trade and excessive corporate power. Generally speaking, the problem is not that we've got too much amazing new technology but too little.

This is not a strawman of my own invention. Robert Gordon makes this case. You can see Matt Yglesias make it here. Duncan Weldon, for his part, writes:

We are debating a problem we don't have, rather than facing a real crisis that is the polar opposite. Productivity growth has slowed to a crawl over the last 15 or so years, business investment has fallen and wage growth has been weak. If the robot revolution truly was under way, we would see surging capital expenditure and soaring productivity. Right now, that would be a nice "problem" to have. Instead we have the reality of weak growth and stagnant pay. The real and pressing concern when it comes to the jobs market and automation is that the robots aren't taking our jobs fast enough.

And in a recent blog post Paul Krugman concluded:

I'd note, however, that it remains peculiar how we're simultaneously worrying that robots will take all our jobs and bemoaning the stalling out of productivity growth. What is the story, really?

What is the story, indeed. Let me see if I can tell one. Last fall I published a book: "The Wealth of Humans". In it I set out how rapid technological progress can coincide with lousy growth in pay and productivity. Start with this:

Low labour costs discourage investments in labour-saving technology, potentially reducing productivity growth.

Peter K. -> Peter K.... Monday, March 20, 2017 at 09:26 AM

Increasing labour costs by making the minimum wage a living wage would increase the incentives to boost productivity growth? No, the neoliberals and corporate Democrats would never go for it. They're trying to appeal to the business community and their campaign contributors wouldn't like it.

anne -> Peter K.... March 20, 2017 at 10:32 AM

https://twitter.com/paulkrugman/status/843167658577182725

Paul Krugman @paulkrugman

But is [Ryan Avent] saying something different from the assertion that recent tech progress is capital-biased?

https://krugman.blogs.nytimes.com/2012/12/26/capital-biased-technological-progress-an-example-wonkish/

If so, what?

anne -> Peter K.... March 20, 2017 at 10:33 AM

http://krugman.blogs.nytimes.com/2012/12/26/capital-biased-technological-progress-an-example-wonkish/

December 26, 2012

Capital-biased Technological Progress: An Example (Wonkish)
By Paul Krugman

Ever since I posted about robots and the distribution of income, * I've had queries from readers about what capital-biased technological change – the kind of change that could make society richer but workers poorer – really means. And it occurred to me that it might be useful to offer a simple conceptual example – the kind of thing easily turned into a numerical example as well – to clarify the possibility. So here goes.

Imagine that there are only two ways to produce output. One is a labor-intensive method – say, armies of scribes equipped only with quill pens. The other is a capital-intensive method – say, a handful of technicians maintaining vast server farms. (I'm thinking in terms of office work, which is the dominant occupation in the modern economy).

We can represent these two techniques in terms of unit inputs – the amount of each factor of production required to produce one unit of output. In the figure below I've assumed that initially the capital-intensive technique requires 0.2 units of labor and 0.8 units of capital per unit of output, while the labor-intensive technique requires 0.8 units of labor and 0.2 units of capital.

[Diagram]

The economy as a whole can make use of both techniques – in fact, it will have to unless it has either a very large amount of capital per worker or a very small amount. No problem: we can just use a mix of the two techniques to achieve any input combination along the blue line in the figure. For economists reading this, yes, that's the unit isoquant in this example; obviously if we had a bunch more techniques it would start to look like the convex curve of textbooks, but I want to stay simple here.

What will the distribution of income be in this case? Assuming perfect competition (yes, I know, but let's deal with that case for now), the real wage rate w and the cost of capital r – both measured in terms of output – have to be such that the cost of producing one unit is 1 whichever technique you use. In this example, that means w=r=1. Graphically, by the way, w/r is equal to minus the slope of the blue line.

Oh, and if you're worried, yes, workers and machines are both paid their marginal product.

But now suppose that technology improves – specifically, that production using the capital-intensive technique gets more efficient, although the labor-intensive technique doesn't. Scribes with quill pens are the same as they ever were; server farms can do more than ever before. In the figure, I've assumed that the unit inputs for the capital-intensive technique are cut in half. The red line shows the economy's new choices.

So what happens? It's obvious from the figure that wages fall relative to the cost of capital; it's less obvious, maybe, but nonetheless true that real wages must fall in absolute terms as well. In this specific example, technological progress reduces the real wage by a third, to 0.667, while the cost of capital rises to 2.33.

OK, it's obvious how stylized and oversimplified all this is. But it does, I think, give you some sense of what it would mean to have capital-biased technological progress, and how this could actually hurt workers.

* http://krugman.blogs.nytimes.com/2012/12/08/rise-of-the-robots/
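
Krugman's numbers can be checked directly: under perfect competition, each technique in use has a unit cost of 1, so the wage w and capital cost r solve a 2x2 linear system. A quick sketch, using only the inputs given in the post:

    # Each technique is (unit labor input, unit capital input); competition
    # forces aL*w + aK*r = 1 for each technique in use. Solve by Cramer's rule.
    def factor_prices(labor_tech, capital_tech):
        aL1, aK1 = labor_tech
        aL2, aK2 = capital_tech
        det = aL1 * aK2 - aL2 * aK1
        return (aK2 - aK1) / det, (aL1 - aL2) / det   # (w, r)

    # Before: labor-intensive (0.8, 0.2), capital-intensive (0.2, 0.8).
    print(factor_prices((0.8, 0.2), (0.2, 0.8)))   # -> (1.0, 1.0)

    # After capital-biased progress halves the capital-intensive inputs:
    print(factor_prices((0.8, 0.2), (0.1, 0.4)))   # -> (~0.667, ~2.333)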

anne -> Peter K.... March 20, 2017 at 10:34 AM

http://krugman.blogs.nytimes.com/2012/12/08/rise-of-the-robots/

December 8, 2012

Rise of the Robots
By Paul Krugman

Catherine Rampell and Nick Wingfield write about the growing evidence for "reshoring" of manufacturing to the United States. * They cite several reasons: rising wages in Asia; lower energy costs here; higher transportation costs. In a followup piece, ** however, Rampell cites another factor: robots.

"The most valuable part of each computer, a motherboard loaded with microprocessors and memory, is already largely made with robots, according to my colleague Quentin Hardy. People do things like fitting in batteries and snapping on screens.

"As more robots are built, largely by other robots, 'assembly can be done here as well as anywhere else,' said Rob Enderle, an analyst based in San Jose, California, who has been following the computer electronics industry for a quarter-century. 'That will replace most of the workers, though you will need a few people to manage the robots.' "

Robots mean that labor costs don't matter much, so you might as well locate in advanced countries with large markets and good infrastructure (which may soon not include us, but that's another issue). On the other hand, it's not good news for workers!

This is an old concern in economics; it's "capital-biased technological change," which tends to shift the distribution of income away from workers to the owners of capital.

Twenty years ago, when I was writing about globalization and inequality, capital bias didn't look like a big issue; the major changes in income distribution had been among workers (when you include hedge fund managers and CEOs among the workers), rather than between labor and capital. So the academic literature focused almost exclusively on "skill bias", supposedly explaining the rising college premium.

But the college premium hasn't risen for a while. What has happened, on the other hand, is a notable shift in income away from labor:

[Graph]

If this is the wave of the future, it makes nonsense of just about all the conventional wisdom on reducing inequality. Better education won't do much to reduce inequality if the big rewards simply go to those with the most assets. Creating an "opportunity society," or whatever it is the likes of Paul Ryan etc. are selling this week, won't do much if the most important asset you can have in life is, well, lots of assets inherited from your parents. And so on.

I think our eyes have been averted from the capital/labor dimension of inequality, for several reasons. It didn't seem crucial back in the 1990s, and not enough people (me included!) have looked up to notice that things have changed. It has echoes of old-fashioned Marxism - which shouldn't be a reason to ignore facts, but too often is. And it has really uncomfortable implications.

But I think we'd better start paying attention to those implications.

* http://www.nytimes.com/2012/12/07/technology/apple-to-resume-us-manufacturing.html

** http://economix.blogs.nytimes.com/2012/12/07/when-cheap-foreign-labor-gets-less-cheap/

anne -> anne... March 20, 2017 at 10:41 AM

https://fred.stlouisfed.org/graph/?g=d4ZY

January 30, 2017

Compensation of Employees as a share of Gross Domestic Income, 1948-2015


https://fred.stlouisfed.org/graph/?g=d507

January 30, 2017

Compensation of Employees as a share of Gross Domestic Income, 1948-2015

(Indexed to 1948)

[Oct 26, 2017] Amazon.com customer reviews: Extreme Programming Explained: Embrace Change

Rapid Development by Steve McConnell is an older and better book. The Mythical Man-Month remains a valuable book as well, albeit dated.
Notable quotes:
"... Having a customer always available on site would mean that the customer in question is probably a small, expendable fish in his organization and is unlikely to have any useful knowledge of its business practices. ..."
"... Unit testing code before it is written means that one would have to have a mental picture of what one is going to write before writing it, which is difficult without upfront design. And maintaining such tests as the code changes would be a nightmare. ..."
"... Programming in pairs all the time would assume that your topnotch developers are also sociable creatures, which is rarely the case, and even if they were, no one would be able to justify the practice in terms of productivity. I won't discuss why I think that abandoning upfront design is a bad practice; the whole idea is too ridiculous to debate ..."
"... Both book and methodology will attract fledgling developers with its promise of hacking as an acceptable software practice and a development universe revolving around the programmer. It's a cult, not a methodology, were the followers shall find salvation and 40-hour working weeks ..."
"... Two stars for the methodology itself, because it underlines several common sense practices that are very useful once practiced without the extremity. ..."
"... The second is the dictatorial social engineering that eXtremity mandates. I've actually tried the pair programming - what a disaster. ..."
"... I've also worked with people who felt that their slightest whim was adequate reason to interfere with my work. That's what Beck institutionalizes by saying that any request made of me by anyone on the team must be granted. It puts me completely at the mercy of anyone walking by. The requisite bullpen physical environment doesn't work for me either. I find that the visual and auditory distraction make intense concentration impossible. ..."
"... One of the things I despise the most about the software development culture is the mindless adoption of fads. Extreme programming has been adopted by some organizations like a religious dogma. ..."
Oct 26, 2017 | www.amazon.com

Mohammad B. Abdulfatah on February 10, 2003

Programming Malpractice Explained: Justifying Chaos

To fairly review this book, one must distinguish between the methodology it presents and the actual presentation. As to the presentation, the author attempts to win the reader over with emotional persuasion and pep talk rather than with facts and hard evidence. Stories of childhood and comradeship don't classify as convincing facts to me.

A single case study -- the C3 project -- is often referred to, but with no specific information (do note that the project was cancelled by the client after staying in development for far too long).

As to the method itself, it basically boils down to four core practices:

  1. Always have a customer available on site.
  2. Unit test before you code.
  3. Program in pairs.
  4. Forfeit detailed design in favor of incremental, daily releases and refactoring.

If you do the above, and you have excellent staff on your hands, then the book promises that you'll reap the benefits of faster development, less overtime, and happier customers. Of course, the book fails to point out that if your staff is all highly qualified people, then the project is likely to succeed no matter what methodology you use. I'm sure that anyone who has worked in the software industry for some time has noticed the sad state that most computer professionals are in nowadays.

However, assuming that you have all the topnotch developers that you desire, the outlined methodology is almost impossible to apply in real world scenarios. Having a customer always available on site would mean that the customer in question is probably a small, expendable fish in his organization and is unlikely to have any useful knowledge of its business practices.

Unit testing code before it is written means that one would have to have a mental picture of what one is going to write before writing it, which is difficult without upfront design. And maintaining such tests as the code changes would be a nightmare.
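
For readers who have not seen the practice being criticized here, this is what "test first" looks like in miniature -- a hedged sketch with made-up names, not an endorsement:

    import unittest

    # The test is written first and pins down the intended behavior of a
    # function that does not exist yet (parse_price is a made-up example).
    class TestParsePrice(unittest.TestCase):
        def test_parses_dollars_and_cents(self):
            self.assertEqual(parse_price("$3.50"), 350)

        def test_rejects_garbage(self):
            with self.assertRaises(ValueError):
                parse_price("three fifty")

    # Only afterwards is the implementation written to make the tests pass.
    def parse_price(text):
        if not text.startswith("$"):
            raise ValueError("not a price: %r" % text)
        dollars, _, cents = text[1:].partition(".")
        return int(dollars) * 100 + int(cents or 0)

    if __name__ == "__main__":
        unittest.main()

The reviewer's maintenance worry applies to exactly such suites: every behavior change means revising the pinned-down expectations as well as the code.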

Programming in pairs all the time would assume that your topnotch developers are also sociable creatures, which is rarely the case, and even if they were, no one would be able to justify the practice in terms of productivity. I won't discuss why I think that abandoning upfront design is a bad practice; the whole idea is too ridiculous to debate.

Both book and methodology will attract fledgling developers with its promise of hacking as an acceptable software practice and a development universe revolving around the programmer. It's a cult, not a methodology, where the followers shall find salvation and 40-hour working weeks.

Experience is a great teacher, but only a fool would learn from it alone. Listen to what the opponents have to say before embracing change, and don't forget to take the proverbial grain of salt.

Two stars out of five for the presentation for being courageous and attempting to defy the standard practices of the industry. Two stars for the methodology itself, because it underlines several common sense practices that are very useful once practiced without the extremity.

wiredweird HALL OF FAME TOP 1000 REVIEWER on May 24, 2004
eXtreme buzzwording

Maybe it's an interesting idea, but it's just not ready for prime time.

Parts of Kent's recommended practice - including aggressive testing and short integration cycle - make a lot of sense. I've shared the same beliefs for years, but it was good to see them clarified and codified. I really have changed some of my practice after reading this and books like this.

I have two broad kinds of problem with this dogma, though. First is the near-abolition of documentation. I can't defend 2000 page specs for typical kinds of development. On the other hand, declaring that the test suite is the spec doesn't do it for me either. The test suite is code, written for machine interpretation. Much too often, it is not written for human interpretation. Based on the way I see most code written, it would be a nightmare to reverse engineer the human meaning out of any non-trivial test code. Some systematic way of ensuring human intelligibility in the code, traceable to specific "stories" (because "requirements" are part of the bad old way), would give me a lot more confidence in the approach.

The second is the dictatorial social engineering that eXtremity mandates. I've actually tried the pair programming - what a disaster. The less said the better, except that my experience did not actually destroy any professional relationships. I've also worked with people who felt that their slightest whim was adequate reason to interfere with my work. That's what Beck institutionalizes by saying that any request made of me by anyone on the team must be granted. It puts me completely at the mercy of anyone walking by. The requisite bullpen physical environment doesn't work for me either. I find that the visual and auditory distraction make intense concentration impossible.

I find the revival-tent spirit of the eXtremists very off-putting. If something works, it works for reasons, not as a matter of faith. I find much too much eXhortation to believe, to go ahead and leap in, so that I will eXperience the wonderfulness for myself. Isn't that what the evangelist on the subway platform keeps saying? Beck does acknowledge unbelievers like me, but requires their exile in order to maintain the group-think of the X-cult.

Beck's last chapters note a number of exceptions and special cases where eXtremism may not work - actually, most of the projects I've ever encountered.

There certainly is good in the eXtreme practice. I look to future authors to tease that good out from the positively destructive threads that I see interwoven.

A customer on May 2, 2004
A work of fiction

The book presents extreme programming. It is divided into three parts:
(1) The problem
(2) The solution
(3) Implementing XP.

The problem, as presented by the author, is that requirements change but current methodologies are not agile enough to cope with this. This results in the customer being unhappy. The solution is to embrace change and to allow the requirements to be changed. This is done by choosing the simplest solution, releasing frequently, and refactoring with the security of unit tests.

The basic assumption underlying the approach is that the cost of change is not exponential but reaches a flat asymptote. If this is not the case, allowing change late in the project would be disastrous. The author does not provide data to back his point of view. On the other hand, there is a lot of data against a constant cost of change (see for example the discussion of cost in Code Complete). The lack of reasonable argumentation is an irremediable flaw in the book. Without some supportive data it is impossible to believe the basic assumption, or the rest of the book. This is all the more important since the only project that the author refers to was cancelled before full completion.

Many other parts of the book are unconvincing. The author presents several XP practices. Some of them are very useful. For example, unit tests are a good practice. They are, however, better treated elsewhere (e.g., the Code Complete chapter on unit testing). On the other hand, some practices seem overkill. Pair programming is one of them. I have tried it and found it useful to generate ideas while prototyping. For writing production code, I find that a quiet environment is by far the best (see Peopleware for supportive data). Again, the author does not provide any data to support his point.

This book suggests an approach aiming at changing software engineering practices. However the lack of supportive data makes it a work of fiction.
I would suggest reading Code Complete for code level advice or Rapid Development for management level advice.

A customer on November 14, 2002
Not Software Engineering.

Any engineering discipline is based on solid reasoning and logic, not on blind faith. Unfortunately, most of this book attempts to convince you that Extreme Programming is better, based on the author's experiences. A lot of the principles are counterintuitive, and the author exhorts you to just try them out and get enlightened. I'm sorry, but this kind of thing belongs in infomercials, not in s/w engineering.

The part about "code is the documentation" is the scariest part. It's true that keeping the documentation up to date is tough on any software project, but to do away with documentation is the most ridiculous thing I have heard.

It's like telling people to cut off their noses to avoid colds. Yes, we are always in search of a better software process. Let me tell you that this book won't lead you there.

Philip K. Ronzone on November 24, 2000
The "gossip magazine diet plans" style of programming.

This book reminds me of the "gossip magazine diet plans", you know, the vinegar and honey diet, or the fat-burner 2000 pill diet etc. Occasionally, people actually lose weight on those diets, but, only because they've managed to eat less or exercise more. The diet plans themselves are worthless. XP is the same - it may sometimes help people program better, but only because they are (unintentionally) doing something different. People look at things like XP because, like dieters, they see a need for change. Overall, the book is a decently written "fad diet", with ideas that are just as worthless.

A customer on August 11, 2003
Hackers! Salvation is nigh!!

It's interesting to see the phenomenon of Extreme Programming happening in the dawn of the 21st century. I suppose historians can explain such a reaction as a truly conservative movement. Of course, serious software engineering practice is hard. Heck, documentation is a pain in the neck. And what programmer wouldn't love to have divine inspiration just before starting to write the latest web application and so enlightened by the Almighty, write the whole thing in one go, as if by magic? No design, no documentation, you and me as a pair, and the customer too. Sounds like a hacker's dream with "Imagine" as the soundtrack (sorry, John).
The Software Engineering struggle is over 50 years old and it's only logical to expect some resistance, from time to time. In the XP case, the resistance comes in one of its worst forms: evangelism. A fundamentalist cult, with very little substance, no proof of any kind, but then again if you don't have faith you won't be granted the gift of the mystic revelation. It's Gnosticism for Geeks.
Take it with a pinch of salt... well, maybe a sack of salt. If you can see through the B.S. that sells millions of dollars in books, consultancy fees, lectures, etc., you will recognise some common-sense ideas that are better explained, explored and detailed elsewhere.

Ian K. VINE VOICE on February 27, 2015
Long have I hated this book

Kent is an excellent writer. He does an excellent job of presenting an approach to software development that is misguided for anything but user interface code. The argument that user interface code must be gotten into the hands of users to get feedback is used to suggest that complex system code should not be "designed up front". This is simply wrong. For example, if you are going to deploy an application in the Amazon Cloud that you want to scale, you had better have some idea of how this is going to happen. Simply waiting until your application falls over and fails is not an acceptable approach.

One of the things I despise the most about the software development culture is the mindless adoption of fads. Extreme programming has been adopted by some organizations like a religious dogma.

Engineering large software systems is one of the most difficult things that humans do. There are no silver bullets and there are no dogmatic solutions that will make the difficult simple.

Anil Philip on March 24, 2005
not found - the silver bullet

Maybe I'm too cynical because I never got to work for the successful, whiz-kid companies; maybe this book wasn't written for me!

This book reminds me of Jacobsen's "Use Cases" book of the 1990s. 'Use Cases' was all the rage, but after several years we slowly learned the truth: Use Cases do not deal with the architecture - a necessary and good foundation for any piece of software.

Similarly, this book seems to be spotlighting Testing and taking it to extremes.

'the test plan is the design doc'

Not true. The design doc encapsulates wisdom and insight; a picture that accurately describes the interactions of the lower-level software components is worth a thousand lines of code-reading.

Also present is an evangelistic fervor that reminds me of the rah-rah eighties' bestseller, "In Search Of Excellence" by Peters and Waterman. (Many people have since noted that most of the spotlighted companies of that book are bankrupt twenty five years later).

Lastly, I noted that the term 'XP' was used throughout the book, and the back cover has a blurb from an M$ architect. Was it simply coincidence that Windows shares the same name for its XP release? I wondered if M$ had sponsored part of the book as good advertising for Windows XP! :)

[Oct 08, 2017] Disbelieving the 'many eyes' myth (Opensource.com)

Notable quotes:
"... This article originally appeared on Alice, Eve, and Bob – a security blog and is republished with permission. ..."
Oct 08, 2017 | opensource.com

Review by many eyes does not always prevent buggy code. There is a view that because open source software is subject to review by many eyes, all the bugs will be ironed out of it. This is a myth.

Oct 06, 2017 | Mike Bursell (Red Hat)

Writing code is hard. Writing secure code is harder -- much harder. And before you get there, you need to think about design and architecture. When you're writing code to implement security functionality, it's often based on architectures and designs that have been pored over and examined in detail. They may even reflect standards that have gone through worldwide review processes and are generally considered perfect and unbreakable. *

However good those designs and architectures are, though, there's something about putting things into actual software that's, well, special. With the exception of software proven to be mathematically correct, ** being able to write software that accurately implements the functionality you're trying to realize is somewhere between a science and an art. This is no surprise to anyone who's actually written any software, tried to debug software, or divine software's correctness by stepping through it; however, it's not the key point of this article.

Nobody *** actually believes that the software that comes out of this process is going to be perfect, but everybody agrees that software should be made as close to perfect and bug-free as possible. This is why code review is a core principle of software development. And luckily -- in my view, at least -- much of the code that we use in our day-to-day lives is open source, which means that anybody can look at it, and it's available for tens or hundreds of thousands of eyes to review.

And herein lies the problem: There is a view that because open source software is subject to review by many eyes, all the bugs will be ironed out of it. This is a myth. A dangerous myth. The problems with this view are at least twofold. The first is the "if you build it, they will come" fallacy. I remember when there was a list of all the websites in the world, and if you added your website to that list, people would visit it. **** In the same way, the number of open source projects was (maybe) once so small that there was a good chance that people might look at and review your code. Those days are past -- long past. Second, for many areas of security functionality -- crypto primitives implementation is a good example -- the number of suitably qualified eyes is low.

Don't think that I am in any way suggesting that the problem is any less in proprietary code: quite the opposite. Not only are the designs and architectures in proprietary software often hidden from review, but you have fewer eyes available to look at the code, and the dangers of hierarchical pressure and groupthink are dramatically increased. "Proprietary code is more secure" is less myth, more fake news. I completely understand why companies like to keep their security software secret, and I'm afraid that the "it's to protect our intellectual property" line is too often a platitude they tell themselves when really, it's just unsafe to release it. So for me, it's open source all the way when we're looking at security software.

So, what can we do? Well, companies and other organizations that care about security functionality can -- and, I believe, have a responsibility to -- expend resources on checking and reviewing the code that implements that functionality. Alongside that, the open source community can -- and is -- finding ways to support critical projects and improve the amount of review that goes into that code. ***** And we should encourage academic organizations to train students in the black art of security software writing and review, not to mention highlighting the importance of open source software.

We can do better -- and we are doing better. Because what we need to realize is that the reason the "many eyes hypothesis" is a myth is not that many eyes won't improve code -- they will -- but that we don't have enough expert eyes looking. Yet.


* Yeah, really: "perfect and unbreakable." Let's just pretend that's true for the purposes of this discussion.

** and that still relies on the design and architecture to actually do what you want -- or think you want -- of course, so good luck.

*** Nobody who's actually written more than about five lines of code (or more than six characters of Perl).

**** I added one. They came. It was like some sort of magic.

***** See, for instance, the Linux Foundation's Core Infrastructure Initiative.

This article originally appeared on Alice, Eve, and Bob – a security blog and is republished with permission.

[Oct 03, 2017] Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about)

Notable quotes:
"... We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money. ..."
Oct 03, 2017 | discussion.theguardian.com

mlzarathustra , 21 Sep 2017 16:52

I agree with the basic point. We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money.

The bigger problem is that nobody cares about the arts, and as expensive as education is, nobody wants to carry around a debt on a skill that won't bring in the bucks. And smartphone-obsessed millennials have too short an attention span to fathom how empty their lives are, devoid of aesthetic depth as they are.

I can't draw a definite link, but I think algorithm failures, which are based on fanatical reliance on programmed routines as the solution to everything, are rooted in the shortage of education and cultivation in the arts.

Economics is a social science, and all this is merely a reflection of shared cultural values. The problem is, people think it's math (it's not) and therefore set in stone.

[Oct 03, 2017] Silicon Valley companies have placed lowering wages and flooding the labor market with cheaper labor near the top of their goals and as a business model.

Notable quotes:
"... That's Silicon Valley's dirty secret. Most tech workers in Palo Alto make about as much as the high school teachers who teach their kids. And these are the top coders in the country! ..."
"... I don't see why more Americans would want to be coders. These companies want to drive down wages for workers here and then also ship jobs offshore... ..."
"... Silicon Valley companies have placed lowering wages and flooding the labor market with cheaper labor near the top of their goals and as a business model. ..."
"... There are quite a few highly qualified American software engineers who lose their jobs to foreign engineers who will work for much lower salaries and benefits. This is a major ingredient of the libertarian virus that has engulfed and contaminating the Valley, going hand to hand with assembling products in China by slave labor ..."
"... If you want a high tech executive to suffer a stroke, mention the words "labor unions". ..."
"... India isn't being hired for the quality, they're being hired for cheap labor. ..."
"... Enough people have had their hands burnt by now with shit companies like TCS (Tata) that they are starting to look closer to home again... ..."
"... Globalisation is the reason, and trying to force wages up in one country simply moves the jobs elsewhere. The only way I can think of to limit this happening is to keep the company and coders working at the cutting edge of technology. ..."
"... I'd be much more impressed if I saw that the hordes of young male engineers here in SF expressing a semblance of basic common sense, basic self awareness and basic life skills. I'd say 91.3% are oblivious, idiotic children. ..."
"... Not maybe. Too late. American corporations objective is to low ball wages here in US. In India they spoon feed these pupils with affordable cutting edge IT training for next to nothing ruppees. These pupils then exaggerate their CVs and ship them out en mass to the western world to dominate the IT industry. I've seen it with my own eyes in action. Those in charge will anything/everything to maintain their grip on power. No brag. Just fact. ..."
Oct 02, 2017 | profile.theguardian.com
Terryl Dorian , 21 Sep 2017 13:26
That's Silicon Valley's dirty secret. Most tech workers in Palo Alto make about as much as the high school teachers who teach their kids. And these are the top coders in the country!
Ray D Wright -> RogTheDodge, 21 Sep 2017 14:52
I don't see why more Americans would want to be coders. These companies want to drive down wages for workers here and then also ship jobs offshore...
Richard Livingstone -> KatieL, 21 Sep 2017 14:50
+++1 to all of that.

Automated coding just pushes the level of coding further up the development food chain, rather than getting rid of it. It is the wrong approach for current tech. AI that is smart enough to model new problems and create its own descriptive and runnable language - hopefully after my lifetime, but coming sometime.

Arne Babenhauserheide -> Evelita, 21 Sep 2017 14:48
What coding does not teach is how to improve our non-code infrastructure and how to keep it running (that's the stuff which actually moves things). Code can optimize stuff, but it needs actual actuators to affect reality.

Sometimes these actuators are actual people walking on top of a roof while fixing it.

WyntonK , 21 Sep 2017 14:47
Silicon Valley companies have placed lowering wages and flooding the labor market with cheaper labor near the top of their goals and as a business model.

There are quite a few highly qualified American software engineers who lose their jobs to foreign engineers who will work for much lower salaries and benefits. This is a major ingredient of the libertarian virus that has engulfed and is contaminating the Valley, going hand in hand with assembling products in China using slave labor.

If you want a high tech executive to suffer a stroke, mention the words "labor unions".

TheEgg -> UncommonTruthiness, 21 Sep 2017 14:43

The ship has sailed on this activity as a career.

Nope. Married to a highly-technical skillset, you can still make big bucks. I say this as someone involved in this kind of thing academically and our Masters grads have to beat the banks and fintech companies away with dog shits on sticks. You're right that you can teach anyone to potter around and throw up a webpage but at the prohibitively difficult maths-y end of the scale, someone suitably qualified will never want for a job.

Mike_Dexter -> Evelita, 21 Sep 2017 14:43
In a similar vein, if you accept the argument that it does drive down wages, wouldn't the culprit actually be the multitudes of online and offline courses and tutorials available to an existing workforce?
Terryl Dorian -> CountDooku, 21 Sep 2017 14:42
Funny you should pick medicine, law, engineering... 3 fields that are *not* taught in high school. The writer is simply adding "coding" to your list. So it seems you agree with his "garbage" argument after all.
anticapitalist -> RogTheDodge, 21 Sep 2017 14:42
Key word is "good". Teaching everyone is just going to increase the pool of programmers whose code I need to fix. India isn't being hired for the quality, they're being hired for cheap labor. As for women, sure, I wouldn't mind more women around, but why does no one say there needs to be more equality in garbage collection or plumbing? (And yes, plumbers are highly paid professionals.)

In the end I don't care what the person is, I just want to hire and work with the best and not someone I have to correct their work because they were hired by quota. If women only graduate at 15% why should IT contain more than that? And let's be a bit honest with the facts, of those 15% how many spend their high school years staying up all night hacking? Very few. Now the few that did are some of the better developers I work with but that pool isn't going to increase by forcing every child to program... just like sports aren't better by making everyone take gym class.

WithoutPurpose , 21 Sep 2017 14:42
I ran a development team for 10 years and I never had any trouble hiring programmers - we just had to pay them enough. Every job would have at least 10 good applicants.

Two years ago I decided to scale back a bit and go into programming (I can code real-time low latency financial apps in 4 languages) and I had four interviews in six months with stupidly low salaries. I'm lucky in that I can bounce between tech and the business side so I got a decent job out of tech.

My entirely anecdotal conclusion is that there is no shortage of good programmers, just a shortage of companies willing to pay them.

oddbubble -> Tori Turner, 21 Sep 2017 14:41
I've worn many hats so far. I started out as a sysadmin, then I moved on to web development, then back end, and now I'm doing test automation, because I am on almost the same money for half the effort.
peter nelson -> raffine , 21 Sep 2017 14:38
But the concepts won't. Good programming requires the ability to break down a task, organise the steps in performing it, identify parts of the process that are common or repetitive so they can be bundled together, handed off or delegated, etc.

These concepts can be applied to any programming language, and indeed to many non-software activities.

Oliver Jones -> Trumbledon , 21 Sep 2017 14:37
In the City, maybe, with a financial background – and that's the exception.
anticapitalist -> Ethan Hawkins , 21 Sep 2017 14:32
Well, to his point, sort of... either everything will go PHP or all those entry-level PHP developers will be on the street. A good Java or C developer is hard to come by. And to the others: being a developer, especially a good one, is nothing like reading and writing. The industry is already saturated with poor coders just doing it for a paycheck.
peter nelson -> Tori Turner , 21 Sep 2017 14:31
I'm just going to say this once: not everyone with a computer science degree is a coder.

And vice versa. I'm retiring from a 40-year career as a software engineer. Some of the best software engineers I ever met did not have CS degrees.

KatieL -> Mishal Almohaimeed , 21 Sep 2017 14:30
"already developing automated coding scripts. "

Pretty much the entire history of the software industry since FORAST was developed for the ORDVAC has been about desperately trying to make software development in some way possible without driving everyone bonkers.

The gulf between FORAST and today's IDE-written, type-inferring high level languages, compilers, abstracted run-time environments, hypervisors, multi-computer architectures and general tech-world flavour-of-2017-ness is truly immense[1].

And yet software is still fucking hard to write. There's no sign it's getting easier despite all that work.

Automated coding was promised as the solution in the 1980s as well. In fact, somewhere in my archives, I've got paper journals which include adverts for automated systems that would make programmers completely redundant by writing all your database code for you. These days, we'd think of those tools as automated ORM generators and they don't fix the problem; they just make a new one -- ORM impedance mismatch -- which needs more engineering on top to fix...

The tools don't change the need for the humans, they just change what's possible for the humans to do.

[1] FORAST executed in about 20,000 bytes of memory without even an OS. The compile artifacts for the map-reduce system I built today are an astonishing hundred million bytes... and don't include the necessary mapreduce environment, management interface, node operating system and distributed filesystem...
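
For readers unfamiliar with the term: the "impedance mismatch" is the translation cost between the pointer-linked object graph a program works with in memory and the flat, foreign-keyed rows a relational database stores. A minimal sketch in C of the two views and the flattening step an ORM generates (hypothetical struct names, no real ORM or database involved):

    #include <stdio.h>

    /* In-memory view: an order owns its line items through pointers. */
    struct LineItem { const char *sku; int qty; struct LineItem *next; };
    struct Order    { int id; struct LineItem *items; };

    /* Relational view: the same data flattened into rows with a foreign key. */
    struct LineItemRow { int order_id; const char *sku; int qty; };

    int main(void) {
        struct LineItem pen = { "PEN", 12, NULL };
        struct LineItem pad = { "PAD", 3, &pen };
        struct Order order  = { 42, &pad };

        /* "Saving" flattens the pointer graph into rows; "loading" rebuilds
         * pointers from foreign keys. Generated ORM code hides this
         * translation, it does not remove it. */
        struct LineItemRow rows[8];
        int n = 0;
        for (struct LineItem *it = order.items; it != NULL; it = it->next) {
            rows[n].order_id = order.id;
            rows[n].sku      = it->sku;
            rows[n].qty      = it->qty;
            n++;
        }

        for (int i = 0; i < n; i++)
            printf("order %d: %s x%d\n", rows[i].order_id, rows[i].sku, rows[i].qty);
        return 0;
    }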

raffine , 21 Sep 2017 14:29
Whatever they are taught today will be obsolete tomorrow.
yannick95 -> savingUK , 21 Sep 2017 14:27
"There are already top quality coders in China and India"

AHAHAHAHAHAHAHAHAHAHAHA *rolls on the floor laughing* Yes........ 1%... and 99% are incredibly bad, incompetent, untalented ones that cost 50% of a good developer but produce only 5% as much. And I'm talking with a LOT of practical experience through more than a dozen corporations all over the world which have been outsourcing to India... all have been disasters for the companies (but good for the execs who pocketed big bonuses and left the company before the disaster blew up in their faces).

Wiretrip -> mcharts , 21 Sep 2017 14:25
Enough people have had their hands burnt by now with shit companies like TCS (Tata) that they are starting to look closer to home again...
TomRoche , 21 Sep 2017 14:11

Tech executives have pursued [the goal of suppressing workers' compensation] in a variety of ways. One is collusion – companies conspiring to prevent their employees from earning more by switching jobs. The prevalence of this practice in Silicon Valley triggered a justice department antitrust complaint in 2010, along with a class action suit that culminated in a $415m settlement.

Folks interested in the story of the Techtopus (less drily presented than in the links in this article) should check out Mark Ames' reporting, especially this overview article and this focus on the egregious Steve Jobs (whose canonization by the US corporate-funded media is just one more indictment of their moral bankruptcy).

Another, more sophisticated method is importing large numbers of skilled guest workers from other countries through the H1-B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status.

Folks interested in H-1B and US technical visas more generally should head to Norm Matloff's summary page, and then to his blog on the subject.

Olympus68 , 21 Sep 2017 13:49

I have watched as schools run by trade unions have done the opposite for the last 5 decades. By limiting the number of graduates, they were able to help maintain living wages and benefits. This has been stopped in my area due to the pressure of owner-run "trade associations".

During that same time period I have witnessed trade associations controlled by company owners, while publicising their support of the average employee, invest enormous amounts of membership fees in creating alliances with public institutions. Their goal has been that of flooding the labor market and thus keeping wages low. A double hit for the average worker because membership fees were paid by employees as well as those in control.

And so it goes....

savingUK , 21 Sep 2017 13:38
Coding jobs are just as susceptible to being moved to lower-cost areas of the world as hardware jobs already have been. It's already happening. There are already top quality coders in China and India. There is a much larger pool to choose from, and they are just as good as their western counterparts and work harder for much less money.

Globalisation is the reason, and trying to force wages up in one country simply moves the jobs elsewhere. The only way I can think of to limit this happening is to keep the company and coders working at the cutting edge of technology.

whitehawk66 , 21 Sep 2017 15:18

I'd be much more impressed if I saw the hordes of young male engineers here in SF expressing a semblance of basic common sense, basic self-awareness and basic life skills. I'd say 91.3% are oblivious, idiotic children.

They would definitely not survive the zombie apocalypse.

P.S. not every kid wants or needs to have their soul sucked out of them sitting in front of a screen full of code for some idiotic service that some other douchebro thinks is the next iteration of sliced bread.

UncommonTruthiness , 21 Sep 2017 14:10
The demonization of Silicon Valley is clearly the next place to put all blame. Look what "they" did to us: computers, smart phones, HD television, world-wide internet, on and on. Get a rope!

I moved there in 1978 and watched the orchards and trailer parks on North 1st St. of San Jose transform into a concrete jungle. There used to be quite a bit of semiconductor equipment and device manufacturing in SV during the 80s and 90s. Now quite a few buildings have the same name: AVAILABLE. Most equipment and device manufacturing has moved to Asia.

Programming started with binary, then machine code (hexadecimal or octal) and moved to assembler as a compiled and linked structure. More compiled languages like FORTRAN, BASIC, PL-1, COBOL, PASCAL and C (and all its "+'s") followed, making programming easier for the less talented. Now the script-based languages (HTML, JAVA, etc.) are even higher level and accessible to nearly all. Programming has become a commodity and will be priced like milk, wheat, corn, non-unionized workers and the like. The ship has sailed on this activity as a career.

William Fitch III , 21 Sep 2017 13:52
Hi: As I have said many times before, there is no shortage of people who fully understand the problem and can see all the connections.

However, they all fall on their faces when it comes to the solution. To cut to the chase, Concentrated Wealth needs to go, permanently. Of course the challenge is how to best accomplish this.....

.....Bill

MostlyHarmlessD , 21 Sep 2017 13:16

Damn engineers and their black and white world view, if they weren't so inept they would've unionized instead of being trampled again and again in the name of capitalism.
mcharts -> Aldous0rwell , 21 Sep 2017 13:07
Not maybe. Too late. American corporations' objective is to lowball wages here in the US. In India they spoon-feed these pupils affordable cutting-edge IT training for next to nothing in rupees. These pupils then exaggerate their CVs and ship them out en masse to the western world to dominate the IT industry. I've seen it with my own eyes in action. Those in charge will do anything and everything to maintain their grip on power. No brag. Just fact.

Woe to our children and grandchildren.

Where's Bernie Sanders when we need him.

[Oct 03, 2017] The dream of coding automation remains elusive... Very elusive...

Oct 03, 2017 | discussion.theguardian.com

Richard Livingstone -> Mishal Almohaimeed , 21 Sep 2017 14:46

Wrong again; that approach has been tried since the 80s and will keep failing, because software development is still more akin to a technical craft than an engineering discipline. The number of elements required to assemble a working non-trivial system is way beyond scriptable.
freeandfair -> Taylor Dotson , 21 Sep 2017 14:26
> That's some crystal ball you have there. English teachers will need to know how to code? Same with plumbers? Same with janitors, CEOs, and anyone working in the service industry?

You don't believe there will be robots to do plumbing and cleaning? The cleaner's job will be to program robots to do what they need.
CEOs? Absolutely.

English teachers? Both of my kids have school laptops and everything is being done on the computers. The teachers use software and create websites and what not. Yes, even English teachers.

Not knowing or understanding how to code will be the same as not knowing how to use Word or Excel. I am assuming there are people who don't, but I don't know any above the age of 6.

Wiretrip -> Mishal Almohaimeed , 21 Sep 2017 14:20
We've had 'automated coding scripts' for years for small tasks. However, anyone who says they're going to obviate programmers, analysts and designers doesn't understand the software development process.
Ethan Hawkins -> David McCaul , 21 Sep 2017 13:22
Even if expert systems (an 80's concept, BTW) could code, we'd still have a huge need for managers. The hard part of software isn't even the coding. It's determining the requirements and working with clients. It will require general intelligence to do 90% of what we do right now. The 10% we could automate right now, mostly gets in the way. I agree it will change, but it's going to take another 20-30 years to really happen.
Mishal Almohaimeed -> PolydentateBrigand , 21 Sep 2017 13:17
Wrong, software companies are already developing automated coding scripts. You'll get a bunch of door-to-door knife salespeople once the dust settles; that's what you'll get.
freeandfair -> rgilyead , 21 Sep 2017 14:22
> In 20 years time AI will be doing the coding

Possible, but you still have to understand how AI operates and what it can and cannot do.

[Oct 03, 2017] Coding and carpentry are not so distant, are they?

The views of the user "imipak" below are pretty common misconceptions. They are all wrong.
Notable quotes:
"... I was about to take offence on behalf of programmers, but then I realized that would be snobbish and insulting to carpenters too. Many people can code, but only a few can code well, and fewer still become the masters of the profession. Many people can learn carpentry, but few become joiners, and fewer still become cabinetmakers. ..."
"... Many people can write, but few become journalists, and fewer still become real authors. ..."
Oct 03, 2017 | discussion.theguardian.com

imipak, 21 Sep 2017 15:13

Coding has little or nothing to do with Silicon Valley. They may or may not have ulterior motives, but ultimately they are nothing in the scheme of things.

I disagree with teaching coding as a discrete subject. I think it should be combined with home economics and woodworking because 90% of these subjects consist of transferable skills that exist in all of them. Only a tiny residual is actually topic-specific.

In the case of coding, the residual consists of drawing skills and typing skills. Programming language skills? Irrelevant. You should choose the tools to fit the problem. Neither of these needs a computer. You should only ever approach the computer at the very end, after you've designed and written the program.

Is cooking so very different? Do you decide on the ingredients before or after you start? Do you go shopping half-way through cooking an omelette?

With woodwork, do you measure first or cut first? Do you have a plan or do you randomly assemble bits until it does something useful?

Real coding, taught correctly, is barely taught at all. You teach the transferable skills. ONCE. You then apply those skills in each area in which they apply.

What other transferable skills apply? Top-down design, bottom-up implementation. The correct methodology in all forms of engineering. Proper testing strategies, also common across all forms of engineering. However, since these tests are against logic, they're a test of reasoning. A good thing to have in the sciences and philosophy.

Technical writing is the art of explaining things to idiots. Whether you're designing a board game, explaining what you like about a house, writing a travelogue or just seeing if your wild ideas hold water, you need to be able to put those ideas down on paper in a way that exposes all the inconsistencies and errors. It doesn't take much to clean it up to be readable by humans. But once it is cleaned up, it'll remain free of errors.

So I would teach a foundation course that teaches top-down reasoning, bottom-up design, flowcharts, critical path analysis and symbolic logic. Probably aimed at age 7. But I'd not do so wholly in the abstract. I'd have it thoroughly mixed in with one field, probably cooking as most kids do that and it lacks stigma at that age.

I'd then build courses on various crafts and engineering subjects on top of that, building further hierarchies where possible. Eliminate duplication and severely reduce the fictions we call disciplines.

oldzealand, 21 Sep 2017 14:58
I used to employ 200 computer scientists in my business and now teach children so I'm apparently as guilty as hell. To be compared with a carpenter is, however, a true compliment, if you mean those that create elegant, aesthetically-pleasing, functional, adaptable and long-lasting bespoke furniture, because our crafts of problem-solving using limited resources in confined environments to create working, life-improving artifacts both exemplify great human ingenuity in action. Capitalism or no.
peter nelson, 21 Sep 2017 14:29
"But coding is not magic. It is a technical skill, akin to carpentry."

But some people do it much better than others. Just like journalism. This article is complete nonsense, as I discuss in another comment. The author might want to consider a career in carpentry.

Fanastril, 21 Sep 2017 14:13
"But coding is not magic. It is a technical skill, akin to carpentry."

It is a way of thinking. Perhaps carpentry is too, but the arrogance of the above statement shows a soul who is done thinking.

NDReader, 21 Sep 2017 14:12
"But coding is not magic. It is a technical skill, akin to carpentry."

I was about to take offence on behalf of programmers, but then I realized that would be snobbish and insulting to carpenters too. Many people can code, but only a few can code well, and fewer still become the masters of the profession. Many people can learn carpentry, but few become joiners, and fewer still become cabinetmakers.

Many people can write, but few become journalists, and fewer still become real authors.

MostlyHarmlessD, 21 Sep 2017 13:08
A carpenter!? Good to know that engineers are still thought of as jumped up tradesmen.

[Oct 02, 2017] Tech's push to teach coding isn't about kids' success – it's about cutting wages, by Ben Tarnoff

Highly recommended!
IT is probably one of the most "neoliberalized" industries (even in comparison with finance), so atomization of labor and a "plantation economy" are the norm in IT. This occurs at a rather high level of wages, but with the influx of foreign programmers and IT specialists (in the past) and mass outsourcing (now), this is changing. Competition for good job positions is fierce -- dog-eat-dog competition, the dream of neoliberals. Entry-level jobs are already paying $15 an hour, if not less.
Programming is a relatively rare talent, much like the ability to play the violin. Even amateur level is challenging. At a high level (developing large complex programs in a team while still preserving your individuality and productivity) it is extremely rare. Most "commercial" programmers are able to produce only mediocre code (which might be adequate). Only a few programmers can excel at complex software projects, sometimes even performing solo. There is also a pathological breed of "programmer junkie" (graphomania happens in programming too) who can sometimes single-handedly destroy large projects. That often happens with open source projects after the main developer has lost interest and abandoned the project.
It's good to allow children the chance to try their hand at coding when they otherwise may not have had that opportunity, but in no way does that mean that all of them can become professional programmers. No way. Again, the top level of programming requires a unique talent, much like top-level musical performance.
Also, to get a decent entry-level position you either need to be extremely talented or to graduate from an Ivy League university. When applicants are abundant, resumes from less prestigious universities are not even considered; it is simply easier for HR to filter applications this way.
Also, under neoliberalism cheap labor via H1B visas floods the market and depresses wages. Many Silicon Valley companies were, so to speak, "Russian-speaking" in the late 90s after the collapse of the USSR. Now offshoring is the dominant way to offload development to cheaper labor.
Notable quotes:
"... As software mediates more of our lives, and the power of Silicon Valley grows, it's tempting to imagine that demand for developers is soaring. The media contributes to this impression by spotlighting the genuinely inspiring stories of those who have ascended the class ladder through code. You may have heard of Bit Source, a company in eastern Kentucky that retrains coalminers as coders. They've been featured by Wired , Forbes , FastCompany , The Guardian , NPR and NBC News , among others. ..."
"... A former coalminer who becomes a successful developer deserves our respect and admiration. But the data suggests that relatively few will be able to follow their example. Our educational system has long been producing more programmers than the labor market can absorb. ..."
"... More tellingly, wage levels in the tech industry have remained flat since the late 1990s. Adjusting for inflation, the average programmer earns about as much today as in 1998. If demand were soaring, you'd expect wages to rise sharply in response. Instead, salaries have stagnated. ..."
"... Tech executives have pursued this goal in a variety of ways. One is collusion – companies conspiring to prevent their employees from earning more by switching jobs. The prevalence of this practice in Silicon Valley triggered a justice department antitrust complaint in 2010, along with a class action suit that culminated in a $415m settlement . Another, more sophisticated method is importing large numbers of skilled guest workers from other countries through the H1-B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status. ..."
"... Guest workers and wage-fixing are useful tools for restraining labor costs. But nothing would make programming cheaper than making millions more programmers. ..."
"... Silicon Valley has been unusually successful in persuading our political class and much of the general public that its interests coincide with the interests of humanity as a whole. But tech is an industry like any other. It prioritizes its bottom line, and invests heavily in making public policy serve it. The five largest tech firms now spend twice as much as Wall Street on lobbying Washington – nearly $50m in 2016. The biggest spender, Google, also goes to considerable lengths to cultivate policy wonks favorable to its interests – and to discipline the ones who aren't. ..."
"... Silicon Valley is not a uniquely benevolent force, nor a uniquely malevolent one. Rather, it's something more ordinary: a collection of capitalist firms committed to the pursuit of profit. And as every capitalist knows, markets are figments of politics. They are not naturally occurring phenomena, but elaborately crafted contraptions, sustained and structured by the state – which is why shaping public policy is so important. If tech works tirelessly to tilt markets in its favor, it's hardly alone. What distinguishes it is the amount of money it has at its disposal to do so. ..."
"... The problem isn't training. The problem is there aren't enough good jobs to be trained for ..."
"... Everyone should have the opportunity to learn how to code. Coding can be a rewarding, even pleasurable, experience, and it's useful for performing all sorts of tasks. More broadly, an understanding of how code works is critical for basic digital literacy – something that is swiftly becoming a requirement for informed citizenship in an increasingly technologized world. ..."
"... But coding is not magic. It is a technical skill, akin to carpentry. Learning to build software does not make you any more immune to the forces of American capitalism than learning to build a house. Whether a coder or a carpenter, capital will do what it can to lower your wages, and enlist public institutions towards that end. ..."
"... Exposing large portions of the school population to coding is not going to magically turn them into coders. It may increase their basic understanding but that is a long way from being a software engineer. ..."
"... All schools teach drama and most kids don't end up becoming actors. You need to give all kids access to coding in order for some can go on to make a career out of it. ..."
"... it's ridiculous because even out of a pool of computer science B.Sc. or M.Sc. grads - companies are only interested in the top 10%. Even the most mundane company with crappy IT jobs swears that they only hire "the best and the brightest." ..."
"... It's basically a con-job by the big Silicon Valley companies offshoring as many US jobs as they can, or "inshoring" via exploitation of the H1B visa ..."
"... Masters is the new Bachelors. ..."
"... I taught CS. Out of around 100 graduates I'd say maybe 5 were reasonable software engineers. The rest would be fine in tech support or other associated trades, but not writing software. Its not just a set of trainable skills, its a set of attitudes and ways of perceiving and understanding that just aren't that common. ..."
"... Yup, rings true. I've been in hi tech for over 40 years and seen the changes. I was in Silicon Valley for 10 years on a startup. India is taking over, my current US company now has a majority Indian executive and is moving work to India. US politicians push coding to drive down wages to Indian levels. ..."
Oct 02, 2017 | www.theguardian.com

This month, millions of children returned to school. This year, an unprecedented number of them will learn to code.

Computer science courses for children have proliferated rapidly in the past few years. A 2016 Gallup report found that 40% of American schools now offer coding classes – up from only 25% a few years ago. New York, with the largest public school system in the country, has pledged to offer computer science to all 1.1 million students by 2025. Los Angeles, with the second largest, plans to do the same by 2020. And Chicago, the fourth largest, has gone further, promising to make computer science a high school graduation requirement by 2018.

The rationale for this rapid curricular renovation is economic. Teaching kids how to code will help them land good jobs, the argument goes. In an era of flat and falling incomes, programming provides a new path to the middle class – a skill so widely demanded that anyone who acquires it can command a livable, even lucrative, wage.

This narrative pervades policymaking at every level, from school boards to the government. Yet it rests on a fundamentally flawed premise. Contrary to public perception, the economy doesn't actually need that many more programmers. As a result, teaching millions of kids to code won't make them all middle-class. Rather, it will proletarianize the profession by flooding the market and forcing wages down – and that's precisely the point.

At its root, the campaign for code education isn't about giving the next generation a shot at earning the salary of a Facebook engineer. It's about ensuring those salaries no longer exist, by creating a source of cheap labor for the tech industry.

As software mediates more of our lives, and the power of Silicon Valley grows, it's tempting to imagine that demand for developers is soaring. The media contributes to this impression by spotlighting the genuinely inspiring stories of those who have ascended the class ladder through code. You may have heard of Bit Source, a company in eastern Kentucky that retrains coalminers as coders. They've been featured by Wired, Forbes, FastCompany, The Guardian, NPR and NBC News, among others.

A former coalminer who becomes a successful developer deserves our respect and admiration. But the data suggests that relatively few will be able to follow their example. Our educational system has long been producing more programmers than the labor market can absorb. A study by the Economic Policy Institute found that the supply of American college graduates with computer science degrees is 50% greater than the number hired into the tech industry each year. For all the talk of a tech worker shortage, many qualified graduates simply can't find jobs.

More tellingly, wage levels in the tech industry have remained flat since the late 1990s. Adjusting for inflation, the average programmer earns about as much today as in 1998. If demand were soaring, you'd expect wages to rise sharply in response. Instead, salaries have stagnated.

Still, those salaries are stagnating at a fairly high level. The Department of Labor estimates that the median annual wage for computer and information technology occupations is $82,860 – more than twice the national average. And from the perspective of the people who own the tech industry, this presents a problem. High wages threaten profits. To maximize profitability, one must always be finding ways to pay workers less.

Tech executives have pursued this goal in a variety of ways. One is collusion – companies conspiring to prevent their employees from earning more by switching jobs. The prevalence of this practice in Silicon Valley triggered a justice department antitrust complaint in 2010, along with a class action suit that culminated in a $415m settlement. Another, more sophisticated method is importing large numbers of skilled guest workers from other countries through the H1-B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status.

Guest workers and wage-fixing are useful tools for restraining labor costs. But nothing would make programming cheaper than making millions more programmers. And where better to develop this workforce than America's schools? It's no coincidence, then, that the campaign for code education is being orchestrated by the tech industry itself. Its primary instrument is Code.org, a nonprofit funded by Facebook, Microsoft, Google and others. In 2016, the organization spent nearly $20m on training teachers, developing curricula, and lobbying policymakers.

Silicon Valley has been unusually successful in persuading our political class and much of the general public that its interests coincide with the interests of humanity as a whole. But tech is an industry like any other. It prioritizes its bottom line, and invests heavily in making public policy serve it. The five largest tech firms now spend twice as much as Wall Street on lobbying Washington – nearly $50m in 2016. The biggest spender, Google, also goes to considerable lengths to cultivate policy wonks favorable to its interests – and to discipline the ones who aren't.

Silicon Valley is not a uniquely benevolent force, nor a uniquely malevolent one. Rather, it's something more ordinary: a collection of capitalist firms committed to the pursuit of profit. And as every capitalist knows, markets are figments of politics. They are not naturally occurring phenomena, but elaborately crafted contraptions, sustained and structured by the state – which is why shaping public policy is so important. If tech works tirelessly to tilt markets in its favor, it's hardly alone. What distinguishes it is the amount of money it has at its disposal to do so.

Money isn't Silicon Valley's only advantage in its crusade to remake American education, however. It also enjoys a favorable ideological climate. Its basic message – that schools alone can fix big social problems – is one that politicians of both parties have been repeating for years. The far-fetched premise of neoliberal school reform is that education can mend our disintegrating social fabric. That if we teach students the right skills, we can solve poverty, inequality and stagnation. The school becomes an engine of economic transformation, catapulting young people from challenging circumstances into dignified, comfortable lives.

This argument is immensely pleasing to the technocratic mind. It suggests that our core economic malfunction is technical – a simple asymmetry. You have workers on one side and good jobs on the other, and all it takes is training to match them up. Indeed, every president since Bill Clinton has talked about training American workers to fill the "skills gap". But gradually, one mainstream economist after another has come to realize what most workers have known for years: the gap doesn't exist. Even Larry Summers has concluded it's a myth.

The problem isn't training. The problem is there aren't enough good jobs to be trained for. The solution is to make bad jobs better, by raising the minimum wage and making it easier for workers to form a union, and to create more good jobs by investing for growth. This involves forcing business to put money into things that actually grow the productive economy rather than shoveling profits out to shareholders. It also means increasing public investment, so that people can make a decent living doing socially necessary work like decarbonizing our energy system and restoring our decaying infrastructure.

Everyone should have the opportunity to learn how to code. Coding can be a rewarding, even pleasurable, experience, and it's useful for performing all sorts of tasks. More broadly, an understanding of how code works is critical for basic digital literacy – something that is swiftly becoming a requirement for informed citizenship in an increasingly technologized world.

But coding is not magic. It is a technical skill, akin to carpentry. Learning to build software does not make you any more immune to the forces of American capitalism than learning to build a house. Whether a coder or a carpenter, capital will do what it can to lower your wages, and enlist public institutions towards that end.

Silicon Valley has been extraordinarily adept at converting previously uncommodified portions of our common life into sources of profit. Our schools may prove an easy conquest by comparison.

See also:

willyjack, 21 Sep 2017 16:56

"Everyone should have the opportunity to learn how to code. " OK, and that's what's being done. And that's what the article is bemoaning. What would be better: teach them how to change tires or groom pets? Or pick fruit? Amazingly condescending article.

MrFumoFumo , 21 Sep 2017 14:54
However, training lots of people to be coders won't automatically result in lots of people who can actually write good code. Nor will it give managers/recruiters the necessary skills to recognize which programmers are any good.

congenialAnimal -> alfredooo , 24 Sep 2017 09:57

A valid rebuttal but could I offer another observation? Exposing large portions of the school population to coding is not going to magically turn them into coders. It may increase their basic understanding but that is a long way from being a software engineer.

Just as children who join art, drama or biology classes do not automatically become artists, actors or doctors. I would agree entirely that just being able to code is not going to guarantee the sort of income that might be aspired to. As with all things, it takes commitment, perseverance and dogged determination. I suppose ultimately it becomes the Gattaca argument.

alfredooo -> racole , 24 Sep 2017 06:51
Fair enough, but his central argument, that an overabundance of coders will drive wages in that sector down, is generally true, so in the future if you want your kids to go into a profession that will earn them 80k+ then being a "coder" is not the route to take. When coding is - like reading, writing, and arithmetic - just a basic skill, there's no guarantee having it will automatically translate into getting a "good" job.
Wiretrip , 21 Sep 2017 14:14
This article lumps everyone in computing into the 'coder' bin, without actually defining what 'coding' is. Yes there is a glut of people who can knock together a bit of HTML and JavaScript, but that is not really programming as such.

There are huge shortages of skilled developers however; people who can apply computer science and engineering in terms of analysis and design of software. These are the real skills for which relatively few people have a true aptitude.

The lack of really good skills is starting to show in some terrible software implementation decisions, such as Slack for example; written as a web app running in Electron (so that JavaScript code monkeys could knock it out quickly), but resulting in awful performance. We will see more of this in the coming years...

Taylor Dotson -> youngsteveo , 21 Sep 2017 13:53
My brother is a programmer, and in his experience these coding exams don't test anything but whether or not you took (and remember) a very narrow range of problems introduced in the first years of a computer science degree. The entire hiring process seems premised on a range of ill-founded ideas about what skills are necessary for the job and how to assess them in people. They haven't yet grasped that those kinds of exams mostly test test-taking ability, rather than intelligence, creativity, diligence, communication ability, or anything else that a job requires besides coughing up the right answer in a stressful, timed environment without outside resources.

The_Raven , 23 Sep 2017 15:45

I'm an embedded software/firmware engineer. Every similar engineer I've ever met has had the same background - starting in electronics and drifting into embedded software writing in C and assembler. It's virtually impossible to do such software without an understanding of electronics. When it goes wrong you may need to get the test equipment out to scope the hardware to see if it's a hardware or software problem. Coming from a pure computing background just isn't going to get you a job in this type of work.
waltdangerfield , 23 Sep 2017 14:42
All schools teach drama and most kids don't end up becoming actors. You need to give all kids access to coding so that some can go on to make a career out of it.
TwoSugarsPlease , 23 Sep 2017 06:13
Coding salaries will inevitably fall over time, but such skills give workers the option, once they discover that their income is no longer sustainable in the UK, of moving somewhere more affordable and working remotely.
DiGiT81 -> nixnixnix , 23 Sep 2017 03:29
Completely agree. Coding is a necessary life skill for the 21st century, but there are levels to every skill, from the basic needs of an office job to the advanced and specialised.
nixnixnix , 23 Sep 2017 00:46
Lots of people can code but very few of us ever get to the point of creating something new that has a loyal and enthusiastic user-base. Everyone should be able to code because it is or will be the basis of being able to create almost anything in the future. If you want to make a game in Unity, knowing how to code is really useful. If you want to work with large data-sets, you can't rely on Excel and so you need to be able to code (in R?). The use of code is becoming so pervasive that it is going to be like reading and writing.

All the science and engineering graduates I know can code, but none of them have ever sold a stand-alone piece of software. The argument made above is like saying that teaching everyone to write will drive down the wages of writers. Writing is useful for anyone and everyone, but only a tiny fraction of people who can write actually write novels or even newspaper columns.

DolyGarcia -> Carl Christensen , 22 Sep 2017 19:24
Immigrants always have a big advantage over locals in the eyes of any company, including tech companies: the government makes sure that they will stay in their place and never complain about low salaries or bad working conditions, because, you know what? If the company sacks them, immigrants may be forced to leave the country where they live because their visa expires, which is never going to happen to a local. Companies always have more leverage over immigrants. Given a choice between more and less exploitable workers, companies will choose the most exploitable ones.

Which is something that Marx figured out more than a century ago, and why he insisted that socialism had to be international, which led to the founding of the First International. If workers' fights didn't cross country boundaries, companies would just play people from one country against another. Unfortunately, at some point in time socialists forgot this very important fact.

xxxFred -> Tomix Da Vomix , 22 Sep 2017 18:52
So what's wrong with having lots of people able to code? The only argument you seem to have is that it'll lower wages. So do you think that we should stop teaching writing skills so that journalists can be paid more? And no one is going to "force" kids into high-level abstract coding practices in kindergarten, fgs. But there is ample empirical proof that young children can learn basic principles. In fact, the younger children are exposed to anything, the better they can enhance their skills and knowledge of it later in life, and computing concepts are no different.
Tomix Da Vomix -> xxxFred , 22 Sep 2017 18:40
You're completely missing the point. Kids are forced into the programming field (or even STEM as a more general term) before they have developed their abstract reasoning. For that matter, you're not producing highly skilled people, but functional imbeciles and cheap labor that will eventually lower the wages.
Conspiracy theory? So Google, FB and others paying hundreds of millions of dollars in settlements for forming a cartel to lower wages is not true? You sound more like a 1969 moon-landing denier than the Guardian does. Tech companies are not financing those incentives because they have a good soul. Their primary drive has always been money; otherwise they wouldn't sell your personal data to earn it.

But hey, you can always sleep peacefully when your kid becomes a coder. Except that when he is 50, who will want a Cobol or Ada programmer with 25 years of experience when you can get a 16-year-old kid straight out of high school for 1/10 of the price? Go back to sleep...

Carl Christensen -> xxxFred , 22 Sep 2017 16:49
it's ridiculous because even out of a pool of computer science B.Sc. or M.Sc. grads - companies are only interested in the top 10%. Even the most mundane company with crappy IT jobs swears that they only hire "the best and the brightest."
Carl Christensen , 22 Sep 2017 16:47
It's basically a con-job by the big Silicon Valley companies offshoring as many US jobs as they can, or "inshoring" via exploitation of the H1B visa - so they can say "see, we don't have 'qualified' people in the US - maybe when these kids learn to program in a generation." As if American students haven't been coding for decades -- and saw their salaries plummet as the H1B visa and Indian offshore firms exploded......
Declawed -> KDHughes , 22 Sep 2017 16:40
Dude, stow the attitude. I've tested code from various entities, and seen every kind of crap peddled as gold.

But I've also seen a little 5-foot giggly lady with two kids, grumble a bit and save a $100,000 product by rewriting another coder's man-month of work in a few days, without any flaws or cracks. Almost nobody will ever know she did that. She's so far beyond my level it hurts.

And yes, the author knows nothing. He's genuinely crying wolf while knee-deep in amused wolves. The last time I was in San Jose, years ago, the room was already full of people with Indian surnames. If the problem was REALLY serious, a programmer from POLAND was called in.

If you think fighting for a violinist spot is hard, try fighting for it with every spare violinist in the world. I am training my Indian replacement to do my job right now. At least the public can appreciate a good violin. Can you appreciate Duff's device?

So by all means, don't teach local kids how to think in a straight line, just in case they make a dent in the price of wages IN INDIA.... *sheesh*
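
For readers who don't know the reference: Duff's device is Tom Duff's notorious piece of C loop-unrolling trickery, in which a switch is interleaved with a do-while so that execution jumps into the middle of the unrolled loop body to handle the remainder. A minimal sketch of the canonical form (the destination is a memory-mapped output register, so it is deliberately not incremented; the function name here is ours):

    #include <stddef.h>

    /* Copy 'count' shorts from 'from' to the memory-mapped register 'to',
     * unrolled 8x. The switch jumps into the middle of the do-while to
     * handle count % 8, then the loop finishes the remaining full blocks. */
    void duff_send(volatile short *to, const short *from, size_t count) {
        if (count == 0) return;
        size_t n = (count + 7) / 8;
        switch (count % 8) {
        case 0: do { *to = *from++;
        case 7:      *to = *from++;
        case 6:      *to = *from++;
        case 5:      *to = *from++;
        case 4:      *to = *from++;
        case 3:      *to = *from++;
        case 2:      *to = *from++;
        case 1:      *to = *from++;
                } while (--n > 0);
        }
    }

The commenter's point stands: appreciating why this is both clever and appalling is a taste that takes years to acquire.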

Declawed -> IanMcLzzz , 22 Sep 2017 15:35
That's the best possible summarisation of this extremely dumb article. Bravo.

For those who don't know how to think of coding, like the article author, here are a few analogies:

A computer is a box that replays frozen thoughts, quickly. That is all.

Coding is just the art of explaining. Anyone who can explain something patiently and clearly, can code. Anyone who can't, can't.

Making hardware is very much like growing produce while blind. Making software is very much like cooking that produce while blind.

Imagine looking after a room full of young eager obedient children who only do exactly, *exactly*, what you told them to do, but move around at the speed of light. Imagine having to try to keep them from smashing into each other or decapitating themselves on the corners of tables, tripping over toys and crashing into walls, etc, while you get them all to play games together.

The difference between a good coder and a bad coder is almost life and death. Imagine a broth prepared with ingredients from a dozen co-ordinating geniuses and one idiot, that you'll mass produce. The soup is always far worse for the idiot's additions. The more cooks you involve, the more chance your mass produced broth will taste bad.

People who hire coders, typically can't tell a good coder from a bad coder.

Zach Dyer -> Mystik Al , 22 Sep 2017 15:18
Tech jobs will probably always be available long after you're gone, or until another mass extinction.
edmundberk -> AmyInNH , 22 Sep 2017 14:59
No, you do it in your own time. If you're not prepared to put in long days, IT is not for you in any case. It was ever thus, but more so now due to offshoring – rather than the rather obscure forces you seem to believe are important.
WithoutPurpose -> freeandfair , 22 Sep 2017 13:21
Bit more than that.
peter nelson -> offworldguy , 22 Sep 2017 12:44
Sorry, offworldguy, but you're losing this one really badly. I'm a professional software engineer in my 60's and I know lots of non-professionals in my age range who write little programs, scripts and apps for fun. I know this because they often contact me for help or advice.

So you've now been told by several people in this thread that ordinary people do code for fun or recreation. The fact that you don't know any probably says more about your network of friends and acquaintances than about the general population.

xxxFred , 22 Sep 2017 12:18
This is one of the daftest articles I've come across in a long while.
If it's possible that so many kids can be taught to code well enough so that wages come down, then that proves that the only reason we've been paying so much for development costs is the scarcity of people able to do it, not that it's intrinsically so hard that only a select few could anyway. In which case, there is no ethical argument for keeping the pools of skilled workers to some select group. Anyone able to do it should have an equal opportunity to do it.
What is the argument for not teaching coding (other than to artificially keep wages high)? Why not stop teaching the three R's, in order to boost white-collar wages in general?
Computing is an ever-increasingly intrinsic part of life, and people need to understand it at all levels. It is not just unfair, but tantamount to neglect, to fail to teach children all the skills they may require to cope as adults.
Having said that, I suspect that in another generation or two a good many lower-level coding jobs will be redundant anyway, with such code being automatically generated, and "coders" at this level will be little more than technicians setting various parameters. Even so, understanding the basics behind computing is a part of understanding the world they live in, and every child needs that.
Suggesting that teaching coding is some kind of conspiracy to force wages down is, well... it makes the moon-landing conspiracy look sensible by comparison.
timrichardson -> offworldguy , 22 Sep 2017 12:16
I think it is important to demystify advanced technology; that has importance in its own right. Plus, schools should expose kids to things which may spark their interest. Not everyone who does a science project goes on years later to get a PhD, but you'd think that it makes it more likely. Same as giving a kid some music lessons. There is a big difference between serious coding and the basic steps needed to automate a customer service team or a marketing program, but the people who have some mastery over automation will have an advantage in many jobs. Advanced machines are clearly going to be a huge part of our future. What should we do about it, if not teach kids how to understand these tools?
rogerfederere -> William Payne , 22 Sep 2017 12:13
tl;dr.
Mystik Al , 22 Sep 2017 12:08
As automation is about to put 40% of the workforce permanently out of work, getting into tech seems like a good idea!
timrichardson , 22 Sep 2017 12:04
This is like arguing that teaching kids to write is nothing more than a plot to flood the market for journalists. Teaching first aid and CPR does not make everyone a doctor.
Coding is an essential skill for many jobs already: 50 years ago, who would have thought you needed coders to make movies? Being a software engineer, a serious coder, is hard. In fact, it takes more than technical coding to be a software engineer: you can learn to code in a week. Software Engineering is a four-year degree, and even then you've just started a career. But depriving kids of some basic insights may mean they won't have the basic skills needed in the future, even for controlling their car and house. By all means, send your kids to a school that doesn't teach coding. I won't.
James Jones -> vimyvixen , 22 Sep 2017 11:41
Did you learn SNOBOL, or is Snowball a language I'm not familiar with? (Entirely possible; as an American I never would have known Extended Mercury Autocode existed were it not for a random book acquisition at my home town library when I was a kid.)
William Payne , 22 Sep 2017 11:17
The tide that is transforming technology jobs from "white collar professional" into "blue collar industrial" is part of a larger global economic cycle.

Successful "growth" assets inevitably transmogrify into "value" and "income" assets as they progress through the economic cycle. The nature of their work transforms also. No longer focused on innovation; on disrupting old markets or forging new ones; their fundamental nature changes as they mature into optimising, cost reducing, process oriented and most importantly of all -- dividend paying -- organisations.

First, the market invests. And then, .... it squeezes.

Immature companies must invest in their team; must inspire them to be innovative so that they can take the creative risks required to create new things. This translates into high skills, high wages and "white collar" social status.

Mature, optimising companies on the other hand must necessarily avoid risks and seek variance-minimising predictability. They seek to control their human resources; to eliminate creativity; to make the work procedural, impersonal and soulless. This translates into low skills, low wages and "blue collar" social status.

This is a fundamental part of the economic cycle, but it has been playing out on the global stage, which has had the effect of hiding some of its effects.

Over the past decades, technology knowledge and skills have flooded away from "high cost" countries and towards "best cost" countries at a historically significant rate. Possibly at the maximum rate that global infrastructure and regional skills pools can support. Much of this necessarily inhumane and brutal cost cutting and deskilling has therefore been hidden by the tide of outsourcing and offshoring. It is hard to see the nature of the jobs change when the jobs themselves are changing hands at the same time.

The ever tighter ratchet of dehumanising industrialisation; productivity and efficiency continues apace, however, and as our global system matures and evens out, we see the seeds of what we have sown sail home from over the sea.

Technology jobs in developed nations have been skewed towards "growth" activities because, for the past several decades, most "value" and "income" activities have been carried out in developing nations. Now, we may be seeing the early preparations for the diffusion of that skewed, uneven and unsustainable imbalance.

The good news is that "Growth" activities are not going to disappear from the world. They just may not be so geographically concentrated as they are today. Also, there is a significant and attention-worthy argument that the re-balancing of skills will result in a more flexible and performant global economy as organisations will better be able to shift a wider variety of work around the world to regions where local conditions (regulation, subsidy, union activity etc...) are supportive.

For the individuals concerned it isn't going to be pretty. And of course it is just another example of the race to the bottom that pits states and public sector purse-holders against one another to win the grace and favour of globally mobile employers.

As a power play move it has a sort of inhumanly psychotic inevitability to it which is quite awesome to observe.

I also find it ironic that the only way to tame the leviathan that is the global free-market industrial system might actually be effective global governance and international cooperation within a rules-based system.

Both "globalist" but not even slightly both the same thing.

Vereto -> Wiretrip , 22 Sep 2017 11:17
Not just coders; it puts even IT Ops guys into this bin. Basically the good old "so you work with computers" line I used to hear a lot 10-15 years ago.
Sangmin , 22 Sep 2017 11:15
You can teach everyone how to code, but it doesn't necessarily mean everyone will be able to work as a coder. We all learn math but that doesn't mean we're all mathematicians. We all know how to write but we're not all professional writers.

I have a graduate degree in CS and have been to a coding bootcamp. Not everyone's brain is wired to become a successful coder. There is a particular way coders think. The quality of a product will stand out based on these differences.

Vereto -> Jared Hall , 22 Sep 2017 11:12
It is very hyperbolic to assume that the profit in those companies comes from decreasing wages. In my company the profit is driven by the ability to deliver products to the market. And that is limited by the number of top people (not just any coder) you can have.
KDHughes -> kcrane , 22 Sep 2017 11:06
You realise that the arts are massively oversupplied and that most artists earn very little, if anything? Which is sort of like the situation the author is warning about. But hey, he knows nothing. Congratulations, though, on writing one of the most pretentious posts I've ever read on CIF.
offworldguy -> Melissa Boone , 22 Sep 2017 10:21
So you know kids, college age people and software developers who enjoy doing it in their leisure time? Do you know any middle aged mothers, fathers, grandparents who enjoy it and are not software developers?

Sorry, I don't see coding as a leisure pursuit that is going to take off beyond a very narrow demographic and if it becomes apparent (as I believe it will) that there is not going to be a huge increase in coding job opportunities then it will likely wither in schools too, perhaps replaced by music lessons.

Bread Eater , 22 Sep 2017 10:02
From their perspective yes. But there are a lot of opportunities in tech so it does benefit students looking for jobs.
Melissa Boone -> jamesbro , 22 Sep 2017 10:00
No, because software developers probably fail more often than they succeed. Building anything worthwhile is an iterative process. And it's not just the compiler but the other devs, your designer, your PM, all looking at your work.
Melissa Boone -> peterainbow , 22 Sep 2017 09:57
It's not shallow or lazy. I also work at a tech company and it's pretty common to do that across job fields. Even in HR marketing jobs, we hire students who can't point to an internship or other kind of experience in college, not simply grades.
Vereto -> savingUK , 22 Sep 2017 09:50
It will take ages; the issue with Indian programmers lies in the education system and in the "Yes boss" culture.

But on the other hand, most Americans are just as bad as the Indians.

Melissa Boone -> offworldguy , 22 Sep 2017 09:50
A lot of people do find it fun. I know many kids - high school and young college age - who code in their leisure time because they find it pleasurable to make small apps and video games. I myself enjoy it too. Your argument is like saying that since you don't like to read books in your leisure time, nobody else must.

The point is your analogy isn't a good one - people who learn to code can not only enjoy it in their spare time just like music, but they can also use it to accomplish all kinds of basic things. I have a friend who's a software developer who has used code to program his Roomba to vacuum in a specific pattern and to play Candy Land with his daughter when they lost the spinner.

Owlyrics -> CapTec , 22 Sep 2017 09:44
Creativity could be added to your list. Anyone can push a button but only a few can invent a new one.
One company in the US (after it was taken over by a new owner) decided it was more profitable to import button-pushers from offshore; they lost 7 million customers (gamers) and had to employ more of the original American developers to maintain their high standard and profits.
Owlyrics -> Maclon , 22 Sep 2017 09:40
Masters is the new Bachelors.
Maclon , 22 Sep 2017 09:22
So similar to 500k people a year going to university (UK) now, when it used to be 60k people a year (1980). There were never enough graduate jobs in 1980, so I can't see where the sudden increase in need for graduates has come from.
PaulDavisTheFirst -> Ethan Hawkins , 22 Sep 2017 09:17

They aren't really crucial pieces of technology except for their popularity

It's early in the day for me, but this is the most ridiculous thing I've read so far, and I suspect it will be high up on the list by the end of the day.

There's no technology that is "crucial" unless it's involved in food, shelter or warmth. The rest has its "crucialness" decided by how widespread its use is, and in the case of those 3 languages, the answer is "very".

You (or I) might not like that very much, but that's how it is.

Julian Williams -> peter nelson , 22 Sep 2017 09:12
My benchmark would be if the average new graduate in the discipline earns more or less than one of the "professions", Law, medicine, Economics etc. The short answer is that they don't. Indeed, in my experience of professions, many good senior SW developers, say in finance, are paid markedly less than the marketing manager, CTO etc. who are often non-technical.

My benchmark is not "has a car, house etc." but what 10, 15, 20 years of experience in the area generates as a relative income compared to another profession, like being a GP, a corporate solicitor or a civil servant (which is usually the benchmark academics use for pay scaling). It is not to denigrate, just to say that markets don't always clear to a point where the most skilled are the highest paid.

I was also suggesting that even if you are not intending to work in the SW area, being able to translate your imagination into a program that reflects your ideas is a nice life skill.

AmyInNH -> freeandfair , 22 Sep 2017 09:05
Your assumption has no basis in reality. In my experience, as soon as Clinton ramped up H1Bs, my employer would invite six candidates with the same college/degree/curriculum in for interviews - five citizens, one foreign student - and make the default offer to the foreign student without asking the interviewers a single question about the interviews. Eventually, they skipped the farce of interviewing citizens altogether. That was in 1997, and it's only gotten worse. Wall St's been pretty blunt lately. It openly admits replacing US workers with imported labor, as that's the "easiest" way to "grow" the economy, even though they know they are ousting citizens from their jobs to do so.
AmyInNH -> peter nelson , 22 Sep 2017 08:59
"People who get Masters and PhD's in computer science" Feed western universities money, for degree programs that would otherwise not exist, due to lack of market demand. "someone has a Bachelor's in CS" As citizens, having the same college/same curriculum/same grades, as foreign grad. But as citizens, they have job market mobility, and therefore are shunned. "you can make something real and significant on your own" If someone else is paying your rent, food and student loans while you do so.
Ethan Hawkins -> farabundovive , 22 Sep 2017 07:40
While true, it's not the coders' fault. The managers and execs above them have intentionally created an environment where these things are secondary. What's primary is getting the stupid piece of garbage out the door for the quarterly profit outlook. Ship it and patch it.
offworldguy -> millartant , 22 Sep 2017 07:38
Do most people find it fun? I can code. I don't find it 'fun'. Thirty years ago, as a young graduate, I might have found it slightly fun, but the 'fun' wears off pretty quickly.
Ethan Hawkins -> anticapitalist , 22 Sep 2017 07:35
In my estimation PHP is an utter abomination. Python is just a little better but still very bad. Ruby is a little better but still not at all good.

Languages like PHP, Python and JS are popular for banging out prototypes and disposable junk, but you greatly overestimate their importance. They aren't really crucial pieces of technology except for their popularity and while they won't disappear they won't age well at all. Basically they are big long-lived fads. Java is now over 20 years old and while Java 8 is not crucial, the JVM itself actually is crucial. It might last another 20 years or more. Look for more projects like Ceylon, Scala and Kotlin. We haven't found the next step forward yet, but it's getting more interesting, especially around type systems.

A strong developer will be able to code well in a half dozen languages and have fairly decent knowledge of a dozen others. For me it's been many years of: Z80, x86, C, C++, Java. Also know some Perl, LISP, ANTLR, Scala, JS, SQL, Pascal, others...

millartant -> Islingtonista , 22 Sep 2017 07:26
You need a decent IDE
millartant -> offworldguy , 22 Sep 2017 07:24

One is hardly likely to 'do a bit of coding' in ones leisure time

Why not? The right problem is a fun and rewarding puzzle to solve. I spend a lot of my leisure time "doing a bit of coding"

Ethan Hawkins -> Wiretrip , 22 Sep 2017 07:12
The worst of all are the academics (on average).
Ethan Hawkins -> KatieL , 22 Sep 2017 07:09
This makes people like me with 35 years of experience shipping products on deadlines up and down every stack (from device drivers and operating systems to programming languages, platforms and frameworks to web, distributed computing, clusters, big data and ML) so much more valuable. Been there, done that.
Ethan Hawkins -> Taylor Dotson , 22 Sep 2017 07:01
It's just not true. In SV there's this giant vacuum created by Apple, Google, FB, etc. Other good companies struggle to fill positions. I know from being on the hiring side at times.
TheBananaBender -> peter nelson , 22 Sep 2017 07:00
You don't work for a major outsourcer then, like Serco, Atos or Agilisys.
offworldguy -> LabMonkey , 22 Sep 2017 06:59
Plenty of people? I don't know a single person outside of my workplace, which is teeming with programmers. Not a single friend, not my neighbours, not my wife or her extended family, not my parents. Plenty of people might do it, but most people don't.
Ethan Hawkins -> finalcentury , 22 Sep 2017 06:56
Your ignorance of coding is showing. Coding IS creative.
Ricardo111 -> peter nelson , 22 Sep 2017 06:56
Agreed: by gifted I did not mean innate. It's more of a mix of having the interest, the persistence, the time, the opportunity and actually enjoying that kind of challenge.

While some of those things are to a large extent innate personality traits, others are not, and you don't need the maximum of all of them; you just need enough to drive you to explore that domain.

That said, somebody who goes into coding purely for the money, and does it for the money alone, is extremely unlikely to become an exceptional coder.

Ricardo111 -> eirsatz , 22 Sep 2017 06:50
I'm as senior as they get and have interviewed quite a lot of programmers for several positions, including for Technical Lead (in fact, to replace me). So far my experience leads me to believe that people who don't have a knack for coding are much less likely to expose themselves to many different languages and techniques, and are also less experimentalist, thus being far less likely to have those moments of transcending mere awareness of the visible and obvious to discover the concerns and concepts behind what one does. Without those moments that open the door to the next universe of concerns and implications, one cannot make state transitions such as Coder to Technical Designer, or Technical Designer to Technical Architect.

Sure, you can get the title and do the things from the books, but you will not get WHY are those things supposed to work (and when they will not work) and thus cannot adjust to new conditions effectively and will be like a sailor that can't sail away from sight of the coast since he can't navigate.

All this gets reflected in many things that enhance productivity, from the early ability to quickly piece together solutions for a new problem out of past solutions for different problems to, later, conceiving software architecture designs fitted to the typical usage pattern in the industry for which the software is going to be made.

LabMonkey , 22 Sep 2017 06:50
From the way our IT department is going, needing millions of coders is not the future. It'll be a minority of developers at the top, and an army of low wage monkeys at the bottom who can troubleshoot from a script - until AI comes along that can code faster and more accurately.
LabMonkey -> offworldguy , 22 Sep 2017 06:46

One is hardly likely to 'do a bit of coding' in ones leisure time

Really? I've programmed a few simple videogames in my spare time. Plenty of people do.

CapTec , 22 Sep 2017 06:29
Interesting piece, but it's fundamentally flawed. I'm a software engineer myself. There is a reason a university education of a minimum of three years is the baseline for a junior developer or 'coder'.

Software engineering isn't just writing code. I would say 80% of my time is spent designing and structuring software before I even touch the code.

Explaining software engineering as a discipline at a high level to people who don't understand it is simple.

Most of us who learn to drive learn a few basics about the mechanics of a car. We know that brake pads need to be replaced, we know that fuel is pumped into an engine when we press the gas pedal. Most of us know how to change a bulb if it blows.

The vast majority of us wouldn't be able to replace a head gasket or clutch though. Just knowing the basics isn't enough to make you a mechanic.

Studying in school isn't enough to produce software engineers. Software engineering isn't just writing code; it's cross-discipline. We also need to understand the science behind the computer; we need to understand logic, data structures, timings, how to manage memory, security, how databases work, etc.

A few years of learning at school isn't nearly enough, and a degree isn't enough on its own due to the dynamic and ever-evolving nature of software engineering. Schools teach technology that is out of date and typically don't explain the science very well.

This is why most companies don't want new developers; they want people with experience and multiple skills.

Programming is becoming cool and people think that because of that it's easy to become a skilled developer. It isn't. It takes time and effort and most kids give up.

French was on the national curriculum when I was at school. Most people, including me, can't hold a conversation in French though.

Ultimately there is a SKILL shortage, and that's because skill takes a long time, and many successes and failures, to acquire. Most people just give up.

This article is akin to saying 'schools are teaching basic health to reduce the wages of Doctors'. It didn't happen.

offworldguy -> thecurio , 22 Sep 2017 06:19
There is a difference. When you teach people music you teach a skill that can be used for a lifetime's enjoyment. One might sit at a piano in later years and play. One is hardly likely to 'do a bit of coding' in one's leisure time.

The other thing is: how good are people going to get at coding, and how long will they retain the skill if it is not used? I tend to think maths is similar to coding, and most adults have pretty terrible maths skills, not venturing far beyond arithmetic. Not many remember how to solve a quadratic equation or even how to rearrange some algebra.

One more thing: we know that if we teach people music they will find a use for it, if only in their leisure time. We don't know that coding will be in any way useful, because we don't know if there will be coding jobs in the future. AI might take over coding, but we know that AI won't take over playing the piano for pleasure.

If we want to teach logical thinking then I think maths has always done this and we should make sure people are better at maths.

Alex Mackaness , 22 Sep 2017 06:08
Am I missing something here? Being able to code is a skill that is a useful addition to the skill armoury of a youngster entering the workplace. Much like reading, writing, maths... Not only is it directly applicable and pervasive in our modern world, it is built upon logic.

The important point is that American schools are not ONLY teaching youngsters to code, and producing one dimensional robots... instead coding makes up one part of their overall skill set. Those who wish to develop their coding skills further certainly can choose to do so. Those who specialise elsewhere are more than likely to have found the skills they learnt whilst coding useful anyway.

I struggle to see how there is a hidden capitalist agenda here. I would argue learning the basics of coding is simply becoming seen as an integral part of the school curriculum.

thecurio , 22 Sep 2017 05:56
The word "coding" is shorthand for "computer programming" or "software development" and it masks the depth and range of skills that might be required, depending on the application.

This subtlety is lost, I think, on politicians and perhaps the general public. Asserting that teaching lots of people to code is a sneaky way to commoditise an industry might have some truth to it, but remember that commoditisation (or "sharing and re-use" as developers might call it) is nothing new. The creation of freely available and re-usable software components and APIs has driven innovation, and has put much power in the hands of developers who would not otherwise have the skill or time to tackle such projects.

There's nothing to fear from teaching more people to "code", just as there's nothing to fear from teaching more people to "play music". These skills simply represent points on a continuum.

There's room for everyone, from the kid on a kazoo all the way to Coltrane at the Village Vanguard.

sbw7 -> ragingbull , 22 Sep 2017 05:44
I taught CS. Out of around 100 graduates, I'd say maybe 5 were reasonable software engineers. The rest would be fine in tech support or other associated trades, but not writing software. It's not just a set of trainable skills; it's a set of attitudes and ways of perceiving and understanding that just aren't that common.
offworldguy , 22 Sep 2017 05:02
I can't understand the rush to teach coding in schools. First of all, I don't think we are going to be a country of millions of coders, and secondly, if most people have the skills then coding is hardly going to be a well-paid job. Thirdly, you can learn coding from scratch after school, like people of my generation did. You could argue that it is part of a well-rounded education, but then it is as important for your career as learning Shakespeare, knowing what an oxbow lake is or being able to do calculus: most jobs just won't need you to know.
savingUK -> yannick95 , 22 Sep 2017 04:35
While you roll on the floor laughing, these countries will slowly but surely get their act together. That is how they work. There are top-quality coders over there, and they will soon be promoted into a position to organise the others.

You are probably too young to remember when people laughed at electronic products when they were made in Japan, then Taiwan. History will repeat itself.

zii000 -> JohnFreidburg , 22 Sep 2017 04:04
Yes, it's ironic, and it's no different here in the UK. Traditionally Labour was the party focused on dividing the economic pie more fairly, the Tories on growing it for the benefit of all. It's now completely upside down, with the Tories paying lip service to the idea of pay rises but in reality supporting this deflationary race to the bottom: hammering down salaries and so shrinking discretionary spending power, which forces price reductions to match, and so more pressure on employers to cut costs ... ad infinitum.
Labour now favour policies which would cause an expansion across the entire economy through pay rises and dramatically increased investment with perhaps more tolerance of inflation to achieve it.
ID0193985 -> jamesbro , 22 Sep 2017 03:46
Not surprising if they're working for a company that is cold-calling people - which should be banned in my opinion. Call centres providing customer support are probably less abuse-heavy since the customer is trying to get something done.
vimyvixen , 22 Sep 2017 02:04
I taught myself to code in 1974. Fortran and COBOL were first. Over the years, as an aerospace engineer, I coded in numerous languages ranging from PLM, Snowball and Basic to more assembly languages than I can recall, not to mention going deep down into machine code on more architectures than most know even existed. Bottom line is that coding is easy. It doesn't take a genius to code, just another way of thinking. Consider all the bugs in the software available now. These "coders", not sufficiently trained, need adult supervision by engineers who know what they are doing for computer systems that are important, such as the electrical grid, nuclear weapons, and safety-critical systems. If you want to program toy apps then code away; if you want to do something important, learn engineering AND coding.
Dwight Spencer , 22 Sep 2017 01:44
Laughable. It takes only an above-average IQ to code. Today's coders are akin to the auto mechanics of the 1950s where practically every high school had auto shop instruction . . . nothing but a source of cheap labor for doing routine implementations of software systems using powerful code libraries built by REAL software engineers.
sieteocho -> Islingtonista , 22 Sep 2017 01:19
That's a bit like saying that calculus is more valuable than arithmetic, so why teach children arithmetic at all?

Because without the arithmetic, you're not going to get up to the calculus.

JohnFreidburg -> Tommyward , 22 Sep 2017 01:15
I disagree. Technology firms are just like other firms. Why then the collusion not to pay more to workers coming from other companies? To believe that they are anything else is naive. The author is correct. We need policies that actually grow the economy, not leaders who cave to what the CEOs want, like Bill Clinton did. He brought in NAFTA at the behest of CEOs, and all it ended up doing was ripping apart the rust belt and ushering in Trump.
Tommyward , 22 Sep 2017 00:53
So the media always needs some bad guys to write about, and this month they seem to have it in for the tech industry. The article is BS. I interview a lot of people to join a large tech company, and I can guarantee you that we aren't trying to find cheaper labor, we're looking for the best talent.

I know that lots of different jobs have been outsourced to low cost areas, but these days the top companies are instead looking for the top talent globally.

I see this article as a hit piece against Silicon Valley, and it flies in the face of the evidence.

finalcentury , 22 Sep 2017 00:46
This has got to be the most cynical and idiotic social interest piece I have ever read in the Guardian. Once upon a time it was very helpful to learn carpentry and machining, but now, even if you are learning those, you will get a big and indispensable headstart if you have some logic and programming skills. The fact is, almost no matter what you do, you can apply logic and programming skills to give you an edge. Even journalists.
hoplites99 , 22 Sep 2017 00:02
Yup, rings true. I've been in high tech for over 40 years and seen the changes. I was in Silicon Valley for 10 years at a startup. India is taking over: my current US company now has a majority-Indian executive team and is moving work to India. US politicians push coding to drive down wages to Indian levels.

On the bright side, I am old enough and established enough to quit tomorrow; it's someone else's problem. But I still despise those who have sold us out, like the Clintons, the Bushes, the Googoids, the Zuckerboids.

liberalquilt -> yannick95 , 21 Sep 2017 23:45
Sure, markets existed before governments, but capitalism didn't; it can't, in fact. It needs the organs of state: the banking system, an education system, and an infrastructure.
thegarlicfarmer -> canprof , 21 Sep 2017 23:36
Then teach them other things, but not coding! Here in Australia every child of school age has to learn coding. Now tell me that every one of them will need it? Look beyond computers, as coding will soon be automated just like every other job.
Islingtonista , 21 Sep 2017 22:25
If you have never coded then you will not appreciate how labour-intensive it is. Coders effectively use line editors to type in, line by line, the instructions. And syntax is critical; add a comma when you meant a semicolon and the code doesn't work properly. Yeah, we use frameworks and libraries of already-written subroutines, but, in the end, it is all about manually typing in the code.
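
To make the point concrete, here is a minimal Python sketch (an invented illustration, not from the comment): a single missing comma does not even raise an error; it silently changes what the program means.

    items_ok  = ["alpha", "beta", "gamma"]   # three elements, as intended
    items_bug = ["alpha", "beta"  "gamma"]   # missing comma: adjacent strings fuse
    print(len(items_ok), len(items_bug))     # prints: 3 2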

Which is an expensive way of doing things (hence the attractions of 'off-shoring' the coding task to low cost economies in Asia).

And this is why teaching kids to code is a waste of time.

Already, AI-based systems are addressing the task of interpreting high-level design models and simply generating the required application.

One of the first uses templates and a smart chatbot to enable non-tech business people to build their websites. By describing in non-coding terms what they want, the chatbot is able to assemble the necessary components and make the requisite template amendments to build a working website.
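
As a toy sketch of that template-assembly idea (all names and templates here are invented for illustration), the core mechanism can be as simple as keyword-matching a plain-language description onto prebuilt fragments:

    TEMPLATES = {
        "shop":    "<section>product grid</section>",
        "blog":    "<section>post list</section>",
        "contact": "<section>contact form</section>",
    }

    def build_site(description):
        # keyword-match the plain-language request onto prebuilt components
        wanted = [name for name in TEMPLATES if name in description.lower()]
        return "<html>" + "".join(TEMPLATES[n] for n in wanted) + "</html>"

    print(build_site("I want a blog with a contact page"))
    # -> <html><section>post list</section><section>contact form</section></html>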

Much cheaper than hiring expensive coders to type it all in manually.

It's early days yet, but coding may well be one of the big losers to AI automation along with all those back office clerical jobs.

Teaching kids how to think about design rather than how to code would be much more valuable.

jamesbro -> peter nelson , 21 Sep 2017 21:31
Thick-skinned? Just because you might get a few error messages from the compiler? Call centre workers have to put up with people telling them to fuck off eight hours a day.
Joshua Ian Lee , 21 Sep 2017 21:03
Spot on. Society will never need more than 1% of its people to code. We will need far more garbage men. There are only so many (relatively) good jobs to go around, and it's about competing to get them.
canprof , 21 Sep 2017 20:53
I'm a professor (not of computer science) and yet, I try to give my students a basic understanding of algorithms and logic, to spark an interest and encourage them towards programming. I have no skin in the game, except that I've seen unemployment first-hand, and want them to avoid it. The best chance most of them have is to learn to code.
Evelita , 21 Sep 2017 14:35
Educating youth does not drive wages down. It drives our economy up. China, India, and other countries are training youth in programming skills. Educating our youth means that they will be able to compete globally. This is the standard GOP stance: that we don't need to educate our youth, but should instead fantasize about high-paying manufacturing jobs miraculously coming back.

Many jobs, including new manufacturing jobs have an element of coding because they are automated. Other industries require coding skills to maintain web sites and keep computer systems running. Learning coding skills opens these doors.

Coding teaches logic, an essential thought process. Learning to code, like learning anything, increases the brain's ability to adapt to new environments, which is essential to our survival as a species. We must invest in educating our youth.

cwblackwell , 21 Sep 2017 13:38
"Contrary to public perception, the economy doesn't actually need that many more programmers." This really looks like a straw man introducing a red herring. A skill can be extremely valuable for those who do not pursue it as a full time profession.

The economy doesn't actually need that many more typists, pianists, mathematicians, athletes, dietitians. So, clearly, teaching typing, the piano, mathematics, physical education, and nutrition is a nefarious plot to drive down salaries in those professions. None of those skills could possibly enrich the lives or enhance the productivity of builders, lawyers, public officials, teachers, parents, or store managers.

DJJJJJC , 21 Sep 2017 14:23

A study by the Economic Policy Institute found that the supply of American college graduates with computer science degrees is 50% greater than the number hired into the tech industry each year.

You're assuming that all those people are qualified to work in software because they have a piece of paper that says so, but that's not a valid assumption. The quality of computer science degree courses is generally poor, and most people aren't willing or able to teach themselves. Universities are motivated to award degrees anyway because if they only awarded degrees to students who are actually qualified then that would reflect very poorly on their quality of teaching.

A skills shortage doesn't mean that everyone who claims to have a skill gets hired and there are still some jobs left over that aren't being done. It means that employers are forced to hire people who are incompetent in order to fill all their positions. Many people who get jobs in programming can't really do it and do nothing but create work for everyone else. That's why most of the software you use every day doesn't work properly. That's why competent programmers' salaries are still high in spite of the apparently large number of "qualified" people who aren't employed as programmers.

[Oct 02, 2017] Programming vs coding

This idiotic US term "coder" is complete baloney.
Notable quotes:
"... You can learn to code, but that doesn't mean you'll be good at it. There will be a few who excel but most will not. This isn't a reflection on them but rather the reality of the situation. In any given area some will do poorly, more will do fairly, and a few will excel. The same applies in any field. ..."
"... Oh no, there's loads of people who say they're coders, who have on their CV that they're coders, that have been paid to be coders. Loads of them. Amazingly, about 9 out of 10 of them, experienced coders all, spent ages doing it, not a problem to do it, definitely a coder, not a problem being "hands on"... can't actually write working code when we actually ask them to. ..."
"... I feel for your brother, and I've experienced the exact same BS "test" that you're describing. However, when I said "rudimentary coding exam", I wasn't talking about classic fiz-buz questions, Fibonacci problems, whiteboard tests, or anything of the sort. We simply ask people to write a small amount of code that will solve a simple real world problem. Something that they would be asked to do if they got hired. We let them take a long time to do it. We let them use Google to look things up if they need. You would be shocked how many "qualified applicants" can't do it. ..."
"... "...coding is not magic. It is a technical skill, akin to carpentry. " I think that is a severe underestimation of the level of expertise required to conceptualise and deliver robust and maintainable code. The complexity of integrating software is more equivalent to constructing an entire building with components of different materials. If you think teaching coding is enough to enable software design and delivery then good luck. ..."
"... Being able to write code and being able to program are two very different skills. In language terms its the difference between being able to read and write (say) English and being able to write literature; obviously you need a grasp of the language to write literature but just knowing the language is not the same as being able to assemble and marshal thought into a coherent pattern prior to setting it down. ..."
"... What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra. ..."
"... Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it. ..."
"... A lot of resumes come across my desk that look qualified on paper, but that's not the same thing as being able to do the job. Secondarily, while I agree that one day our field might be replaced by automation, there's a level of creativity involved with good software engineering that makes your carpenter comparison a bit flawed. ..."
Oct 02, 2017 | profile.theguardian.com
Wiretrip -> Mark Mauvais , 21 Sep 2017 14:23
Yes, 'engineers' (and particularly mathematicians) write appalling code.
Trumbledon , 21 Sep 2017 14:23
A good developer can easily earn £600-800 per day, which suggests to me that they are in high demand, and society needs more of them.
Wiretrip -> KatieL , 21 Sep 2017 14:22
Agreed, to many people 'coding' consists of copying other people's JavaScript snippets from StackOverflow... I tire of the many frauds in the business...
stratplaya , 21 Sep 2017 14:21
You can learn to code, but that doesn't mean you'll be good at it. There will be a few who excel but most will not. This isn't a reflection on them but rather the reality of the situation. In any given area some will do poorly, more will do fairly, and a few will excel. The same applies in any field.
peter nelson -> UncommonTruthiness , 21 Sep 2017 14:21

The ship has sailed on this activity as a career.

Oh, rubbish. I'm in the process of retiring from my job as an Android software designer so I'm tasked with hiring a replacement for my organisation. It pays extremely well, the work is interesting, and the company is successful and serves an important worldwide industry.

Still, finding highly-qualified people is hard and they get snatched up in mid-interview because the demand is high. Not only that but at these pay scales, we can pretty much expect the Guardian will do yet another article about the unconscionable gap between what rich, privileged techies like software engineers make and everyone else.

Really, we're damned if we do and damned if we don't. If tech workers are well-paid we're castigated for gentrifying neighbourhoods and living large, and yet anything that threatens to lower what we're paid produces conspiracy-theory articles like this one.

Fanastril -> Taylor Dotson , 21 Sep 2017 14:17
I learned to cook in school. Was there a shortage of cooks? No. Did I become a professional cook? No. But I sure as hell would not have missed the skills I learned for the world, and I use them every day.
KatieL -> Taylor Dotson , 21 Sep 2017 14:13
Oh no, there's loads of people who say they're coders, who have on their CV that they're coders, that have been paid to be coders. Loads of them. Amazingly, about 9 out of 10 of them, experienced coders all, spent ages doing it, not a problem to do it, definitely a coder, not a problem being "hands on"... can't actually write working code when we actually ask them to.
youngsteveo -> Taylor Dotson , 21 Sep 2017 14:12
I feel for your brother, and I've experienced the exact same BS "test" that you're describing. However, when I said "rudimentary coding exam", I wasn't talking about classic fiz-buz questions, Fibonacci problems, whiteboard tests, or anything of the sort. We simply ask people to write a small amount of code that will solve a simple real world problem. Something that they would be asked to do if they got hired. We let them take a long time to do it. We let them use Google to look things up if they need. You would be shocked how many "qualified applicants" can't do it.
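
For reference, the classic fizz-buzz screening question mentioned above sits at roughly this level; one minimal Python solution (a sketch, not the actual exam the commenter uses):

    # Print 1..100, but "Fizz" for multiples of 3, "Buzz" for
    # multiples of 5, and "FizzBuzz" for multiples of both.
    for n in range(1, 101):
        out = ""
        if n % 3 == 0:
            out += "Fizz"
        if n % 5 == 0:
            out += "Buzz"
        print(out or n)
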
Fanastril -> Taylor Dotson , 21 Sep 2017 14:11
It is not zero-sum: if you teach something empowering, like programming, motivating students is a lot easier, and they will learn more.
UncommonTruthiness , 21 Sep 2017 14:10
The demonization of Silicon Valley is clearly the next place to put all blame. Look what "they" did to us: computers, smart phones, HD television, world-wide internet, on and on. Get a rope!

I moved there in 1978 and watched the orchards and trailer parks on North 1st St. of San Jose transform into a concrete jungle. There used to be quite a bit of semiconductor equipment and device manufacturing in SV during the 80s and 90s. Now quite a few buildings have the same name: AVAILABLE. Most equipment and device manufacturing has moved to Asia.

Programming started with binary, then machine code (hexadecimal or octal), and moved to assembler as a compiled and linked structure. More compiled languages like FORTRAN, BASIC, PL/1, COBOL, PASCAL and C (and all its "+'s") followed, making programming easier for the less talented.

Now the script-based languages (HTML, JAVA, etc.) are even higher level and accessible to nearly all. Programming has become a commodity and will be priced like milk, wheat, corn, non-unionized workers and the like. The ship has sailed on this activity as a career.

KatieL -> Taylor Dotson , 21 Sep 2017 14:10
"intelligence, creativity, diligence, communication ability, or anything else that a job"

None of those are any use if, when asked to turn your intelligent, creative, diligent, communicated idea into some software, you perform as well as most candidates do at simple coding assessments... and write stuff that doesn't work.

peter nelson , 21 Sep 2017 14:09

At its root, the campaign for code education isn't about giving the next generation a shot at earning the salary of a Facebook engineer. It's about ensuring those salaries no longer exist, by creating a source of cheap labor for the tech industry.

Of course the writer does not offer the slightest shred of evidence to support the idea that this is the actual goal of these programs. So it appears that the tinfoil-hat conspiracy brigade on the Guardian is operating not only below the line, but above it, too.

The fact is that few of these students will ever become software engineers (which, incidentally, is my profession) but programming skills are essential in many professions for writing little scripts to automate various tasks, or to just understand 21st century technology.

kcrane , 21 Sep 2017 14:07
Sadly, this is another article by a partial journalist who knows nothing about the software industry, but hopes to subvert what he has read somewhere to support a position he had already assumed. As others have said, understanding coding has already become akin to being able to use a pencil. It is a basic requirement of many higher-level roles.

But knowing which end of a pencil to put on the paper (the equivalent of the level of coding taught in schools) isn't the same as being an artist. Moreover, anyone who knows the field recognises that top coders are gifted; they embody genius. There are coding Caravaggios out there, but few have the experience to know that. No amount of teaching will produce high-level coders from average humans; there is an intangible something needed, as there is in music and art, to elevate the merely good to genius.

All to say: however many are taught the basics, it won't push down the value of the most talented coders, and so won't reduce the costs of the technology industry in any meaningful way, as it is an industry, like art, that relies on the few, not the many.

DebuggingLife , 21 Sep 2017 14:06
Not all of those children will want to become programmers, but at least the barrier to entry - for more to at least experience it - will be lower.

Teaching music only to the children whose parents can afford music tuition means that society misses out on the greater potential for some incredibly gifted musicians to shine through.

Moreover, learning to code really means learning how to wrangle with the practical application of abstract concepts: algorithms, numerical skills, logic, reasoning, etc., which are all transferable skills, some of which are not in the scope of other classes, certainly not practically.
Like music, sport, literature etc., programming a computer, a website, a device or a smartphone is an endeavour that can be truly rewarding as merely a pastime, and similarly is limited only by one's imagination.

rgilyead , 21 Sep 2017 14:01
"...coding is not magic. It is a technical skill, akin to carpentry. " I think that is a severe underestimation of the level of expertise required to conceptualise and deliver robust and maintainable code. The complexity of integrating software is more equivalent to constructing an entire building with components of different materials. If you think teaching coding is enough to enable software design and delivery then good luck.
Taylor Dotson -> cwblackwell , 21 Sep 2017 14:00
Yeah, but mania over coding skills inevitably pushes other skills out of the curriculum (or deemphasizes them). Education is zero-sum in that there's only so much time and energy to devote to it. Hence, you need more than vague appeals to "enhancement," especially given the risks pointed out by the author.
Taylor Dotson -> PolydentateBrigand , 21 Sep 2017 13:57
"Talented coders will start new tech businesses and create more jobs."

That could be argued for any skill set, including those found in the humanities and social sciences that are likely to be pushed out by the mania over coding ability. Education is zero-sum: time spent on one subject is time that invariably can't be spent learning something else.

Taylor Dotson -> WumpieJr , 21 Sep 2017 13:49
"If they can't literally fix everything let's just get rid of them, right?"

That's a strawman. His point is rooted in the recognition that we only have so much time, energy, and money to invest in solutions. Ones that feel good but may not do anything distract us from the deeper structural issues in our economy. The problem with thinking "education" will fix everything is that it leaves the status quo unquestioned.

martinusher , 21 Sep 2017 13:31
Being able to write code and being able to program are two very different skills. In language terms its the difference between being able to read and write (say) English and being able to write literature; obviously you need a grasp of the language to write literature but just knowing the language is not the same as being able to assemble and marshal thought into a coherent pattern prior to setting it down.

To confuse things further, there are various levels of skill that all look the same to the untutored eye. Suppose you wished to bridge a waterway. If that waterway was a narrow ditch then you could just throw a plank across. As the distance to be spanned got larger and larger, eventually you'd have to abandon intuition for engineering and experience. Exactly the same issues happen with software, but they're less tangible; anyone can build a small program, but a complex system requires a lot of other knowledge (in my field, that's engineering knowledge - coding is almost an afterthought).

It's a good idea to teach young people to code, but I wouldn't raise their expectations of huge salaries too much. For children, educating them in wider, more general fields and abstract activities such as music will pay huge dividends, far more than just teaching them whatever the fashionable language du jour is. (...which should be Logo, but it's too subtle and abstract; it doesn't look "real world" enough!)

freeandfair , 21 Sep 2017 13:30
I don't see this as an issue. Sure, there could be ulterior motives there, but anyone who wants to still be employed in 20 years has to know how to code. It is not that everyone will be a coder, but their jobs will either include part-time coding or will require an understanding of software and what it can and cannot do. AI is going to be everywhere.
WumpieJr , 21 Sep 2017 13:23
What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra.

But it isn't just about coding for Tarnoff. He seems to hold education in contempt generally. "The far-fetched premise of neoliberal school reform is that education can mend our disintegrating social fabric." If they can't literally fix everything, let's just get rid of them, right?

Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it.

youngsteveo , 21 Sep 2017 13:16
I'm not going to argue that the goal of mass education isn't to drive down wages, but the idea that the skills gap is a myth doesn't hold water in my experience. I'm a software engineer and manager at a company that pays well over the national average, with great benefits, and it is downright difficult to find a qualified applicant who can pass a rudimentary coding exam.

A lot of resumes come across my desk that look qualified on paper, but that's not the same thing as being able to do the job. Secondarily, while I agree that one day our field might be replaced by automation, there's a level of creativity involved with good software engineering that makes your carpenter comparison a bit flawed.

[Oct 02, 2017] Does programming provide a new path to the middle class? Probably no longer, unless you are really talented. In the latter case it is not that different from other fields, but the pressure from H1B visas makes it harder for programmers. The neoliberal USA has a real problem with social mobility

Notable quotes:
"... I do think it's peculiar that Silicon Valley requires so many H1B visas... 'we can't find the talent here' is the main excuse ..."
"... This is interesting. Indeed, I do think there is excess supply of software programmers. ..."
"... Well, it is either that or the kids themselves who have to pay for it and they are even less prepared to do so. Ideally, college education should be tax payer paid but this is not the case in the US. And the employer ideally should pay for the job related training, but again, it is not the case in the US. ..."
"... Plenty of people care about the arts but people can't survive on what the arts pay. That was pretty much the case all through human history. ..."
"... I was laid off at your age in the depths of the recent recession and I got a job. ..."
"... The great thing about software , as opposed to many other jobs, is that it can be done at home which you're laid off. Write mobile (IOS or Android) apps or work on open source projects and get stuff up on github. I've been to many job interviews with my apps loaded on mobile devices so I could show them what I've done. ..."
"... Schools really can't win. Don't teach coding, and you're raising a generation of button-pushers. Teach it, and you're pandering to employers looking for cheap labour. Unions in London objected to children being taught carpentry in the twenties and thirties, so it had to be renamed "manual instruction" to get round it. Denying children useful skills is indefensible. ..."
Oct 02, 2017 | discussion.theguardian.com
swelle , 21 Sep 2017 17:36
I do think it's peculiar that Silicon Valley requires so many H1B visas... 'we can't find the talent here' is the main excuse, though many 'older' (read: over 40) native-born tech workers will tell you there's plenty of talent here already. But even with the immigration hassles, H1B workers will be cheaper overall...

Julian Williams , 21 Sep 2017 18:06

This is interesting. Indeed, I do think there is an excess supply of software programmers. There is only a modest number of decent jobs, say as an algorithms developer in finance, in the general architecture of complex systems, or to some extent in systems security. However, these jobs are usually occupied, and the incumbents are not likely to move on quickly. Roadblocks are also put up by creating sub-networks of engineers who ensure that some knowledge is not ubiquitous.

Most very high paying jobs in the technology sector are in the same standard upper management roles as in every other industry.

Still, the ability to write a computer program is an enabler: knowing how it works means you have the ability to imagine something and make it real. To me it is a bit like language: some people can use language to make more money than others, but it is still important to be able to have a basic level of understanding.

FabBlondie -> peter nelson , 21 Sep 2017 17:42
And yet I know a lot of people that has happened to. Better to replace a $125K-a-year programmer with one who will do the same job, or even less, for $50K.

JMColwill , 21 Sep 2017 18:17

This could backfire if the programmers don't find the work or pay to match their expectations... Programmers, after all, tend to make very good hackers if their minds are turned to it.

freeandfair -> FabBlondie , 21 Sep 2017 18:23

> While I like your idea of what designing a computer program involves, in my nearly 40 years experience as a programmer I have rarely seen this done.

Well, I am a software architect and what he says sounds correct for a certain type of applications. Maybe you do a different type of programming.

peter nelson -> FabBlondie , 21 Sep 2017 18:23

While I like your idea of what designing a computer program involves, in my nearly 40 years experience as a programmer I have rarely seen this done.

How else can you do it?

Java is popular because it's a very versatile language - on this list it's the most popular general-purpose programming language. (Above it, JavaScript is just a scripting language, and HTML/CSS aren't even programming languages.) https://fossbytes.com/most-used-popular-programming-languages/ ... and below it you have to go down to C# at 20% to come to another general-purpose language, and even that's a Microsoft house language.

Also the "correct" choice of programming languages is also based on how many people in the shop know it so they maintain code that's written in it by someone else.

freeandfair -> FabBlondie , 21 Sep 2017 18:22
> job-specific training is completely different. What a joke to persuade public school districts to pick up the tab on job training.

Well, it is either that or the kids themselves who have to pay for it and they are even less prepared to do so. Ideally, college education should be tax payer paid but this is not the case in the US. And the employer ideally should pay for the job related training, but again, it is not the case in the US.

freeandfair -> mlzarathustra , 21 Sep 2017 18:20
> The bigger problem is that nobody cares about the arts, and as expensive as education is, nobody wants to carry around a debt on a skill that won't bring in the buck

Plenty of people care about the arts but people can't survive on what the arts pay. That was pretty much the case all through human history.

theindyisbetter -> Game Cabbage , 21 Sep 2017 18:18
No. The amount of work is not a fixed sum. That's the lump of labour fallacy. We are not tied to the land.
ConBrio , 21 Sep 2017 18:10
Since newspapers are consolidating and cutting jobs, we've gotta clamp down on colleges offering BA degrees, particularly in English literature and journalism.

And then... and...then...and...

LMichelle -> chillisauce , 21 Sep 2017 18:03
This article focuses on US schools, but I can imagine it's the same in the UK. I don't think these courses are going to be about creating great programmers capable of new innovations so much as creating a workforce that can be its own IT help desk.

They'll learn just enough in these classes to do that.

Then most companies will be hiring for other jobs, but will want to make sure you have the IT skills to serve as your own "help desk" (although you will get no salary for your IT work).

edmundberk -> FabBlondie , 21 Sep 2017 17:57
I find that quite remarkable - 40 years ago you must have been using assembler, with hardly any memory to work with. If you blitzed through that without applying the thought processes described, well... I'm surprised.
James Dey , 21 Sep 2017 17:55
Funny. Every day in the Brexit articles, I read that increasing the supply of workers has negligible effect on wages.
peter nelson -> peterainbow , 21 Sep 2017 17:54
I was laid off at your age in the depths of the recent recession and I got a job. As I said in another posting, it usually comes down to fresh skills and good personal references who will vouch for your work-habits and how well you get on with other members of your team.

The great thing about software, as opposed to many other jobs, is that it can be done at home while you're laid off. Write mobile (IOS or Android) apps or work on open source projects and get stuff up on github. I've been to many job interviews with my apps loaded on mobile devices so I could show them what I've done.

Game Cabbage -> theindyisbetter , 21 Sep 2017 17:52
The situation has a direct comparison to today. It has nothing to do with land. There was a certain amount of profit-making work and not enough labour to satisfy demand. There is currently a certain amount of profit-making work and, in many situations (especially unskilled low-paid work), too much labour.
edmundberk , 21 Sep 2017 17:52
So, is teaching people English or arithmetic all about reducing wages for the literate and numerate?

Or is this the most obtuse argument yet for avoiding what everyone in tech knows - even more blatantly than in many other industries, wages are curtailed by offshoring; and in the US, by having offshoring centres on US soil.

chillisauce , 21 Sep 2017 17:48
Well, speaking as someone who spends a lot of time trying to find really good programmers... frankly there aren't that many about. We take most of ours from Eastern Europe and SE Asia, which is quite expensive, given the relocation costs to the UK. But worth it.

So, yes, if more British kids learnt about coding, it might help a bit. But not much; the real problem is that few kids want to study IT in the first place, and that the tuition standards in most UK universities are quite low, even if they get there.

Baobab73 , 21 Sep 2017 17:48
True......
peter nelson -> rebel7 , 21 Sep 2017 17:47
There was recently a programme/podcast on ABC/RN about the HUGE shortage in Australia of techies with specialized security skills.
peter nelson -> jigen , 21 Sep 2017 17:46
Robots, or AI, are already making us more productive. I can write programs today in an afternoon that would have taken me a week a decade or two ago.

I can create a class and the IDE will take care of all the accessors and dependencies, enforce our style-guide compliance, stub in the documentation, even most test cases, etc., and all I have to write is the very specific stuff required by my application - the other 90% is generated for me. Same with UI/UX - it stubs in relevant event handlers, bindings, dependencies, etc.
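
A rough Python analogue of that generated boilerplate (Python's dataclass decorator plays the role of the IDE here; the class itself is invented for illustration):

    from dataclasses import dataclass

    @dataclass
    class Account:
        # __init__, __repr__ and __eq__ are generated, not typed by hand
        owner: str
        balance: float = 0.0

        def deposit(self, amount):
            # the application-specific 10% you still write yourself
            self.balance += amount

    a = Account("Ada")
    a.deposit(25.0)
    print(a)   # Account(owner='Ada', balance=25.0)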

Programmers are a zillion times more productive than in the past, yet the demand keeps growing because so much more stuff in our lives has processors and code. Your car has dozens of processors running lots of software; your TV, your home appliances, your watch, etc.

Quaestor , 21 Sep 2017 17:43

Schools really can't win. Don't teach coding, and you're raising a generation of button-pushers. Teach it, and you're pandering to employers looking for cheap labour. Unions in London objected to children being taught carpentry in the twenties and thirties, so it had to be renamed "manual instruction" to get round it. Denying children useful skills is indefensible.

jamesupton , 21 Sep 2017 17:42
Getting children to learn how to write code, as part of core education, will be the first step to the long overdue revolution. The rest of us will still have to stick to burning buildings down and stringing up the aristocracy.
cjenk415 -> LMichelle , 21 Sep 2017 17:40
Did you misread? It seemed like he was emphasizing that learning to code, like learning art (and sports and languages), will help them develop skills that benefit them in whatever profession they choose.
FabBlondie -> peter nelson , 21 Sep 2017 17:40
While I like your idea of what designing a computer program involves, in my nearly 40 years' experience as a programmer I have rarely seen this done. And, FWIW, IMHO choosing the tool (programming language) might reasonably be expected to follow designing a solution; in practice this rarely happens. No, these days it's Java all the way, from day one.
theindyisbetter -> Game Cabbage , 21 Sep 2017 17:40
There was a fixed supply of land and a reduced supply of labour to work the land.

Nothing like the situation in a modern economy.

LMichelle , 21 Sep 2017 17:39
I'd advise parents that the classes they need to make sure their kids excel in are acting/drama. There is no better way of getting that promotion or increasing your pay in the job market than being a skilled actor. It's a fake-it-till-you-make-it deal.
theindyisbetter , 21 Sep 2017 17:36
What a ludicrous argument.

Let's not teach maths or science or literacy either - then anyone with those skills will earn more.

SheriffFatman -> Game Cabbage , 21 Sep 2017 17:36

After the Black Death in the middle ages there was a huge undersupply of labour. It produced a consistent rise in wages and conditions

It also produced wage-control legislation (which admittedly failed to work).

peter nelson -> peterainbow , 21 Sep 2017 17:32
if there were truly a shortage I wouldn't be unemployed

I've heard that before but when I've dug deeper I've usually found someone who either let their skills go stale, or who had some work issues.

LMichelle -> loveyy , 21 Sep 2017 17:26
Really? You think they are going to emphasize things like the importance of privacy and consumer rights?
loveyy , 21 Sep 2017 17:25
This really has to be one of the silliest articles I have read here in a very long time.
People, let your children learn to code. Even more, educate yourselves and start to code just for the fun of it - look at it like a game.
The more people who know how to code, the more likely they are to understand how stuff works. If you were ever frustrated by how impossible it seems to shop on certain websites, learn to code and you will be frustrated no more. You will understand the intent behind the process.
Even more, you will understand the inherent limitations and what safety really means. You will be able to better protect yourself in a real-time connected world.

Learning to code won't turn your kid into a programmer, just like ballet or piano classes won't mean they'll ever choose art as their livelihood. So let the children learn to code, and learn along with them.

Game Cabbage , 21 Sep 2017 17:24
Tipping power to employers in any profession by an oversupply of labour is not a good thing. Bit of a macabre example here, but... after the Black Death in the middle ages there was a huge undersupply of labour. It produced a consistent rise in wages and conditions and economic development for hundreds of years afterwards. I'm not suggesting a massive depopulation, but you can achieve the same effects by altering the power balance. With decades of neoliberalism, the employers' side of the power see-saw is sitting firmly in the mud and is producing very undesirable results for the vast majority of people.
Zuffle -> peterainbow , 21 Sep 2017 17:23
Perhaps you're just not very good. I've been a developer for 20 years and I've never had more than 1 week of unemployment.
Kevin P Brown -> peterainbow , 21 Sep 2017 17:20
" at 55 finding it impossible to get a job"

I am 59, and it is not just the age aspect, it is the money aspect. They know you have experience and expectations, and yet they believe hiring someone half the age at half the price, times two, will replace your knowledge. I have been contracting in IT for 30 years, and now it is obvious it is over. Experience at some point no longer mitigates age. I think I am at that point now.

TheLane82 , 21 Sep 2017 17:20
Completely true! What needs to happen instead is to teach the real valuable subjects.

Gender studies. Islamic studies. Black studies. All important issues that need to be addressed.

peter nelson -> mlzarathustra , 21 Sep 2017 17:06
Dear, dear, I know, I know, young people today . . . just not as good as we were. Everything is just going down the loo . . . Just have a nice cuppa camomile (or chamomile if you're a Yank) and try to relax ... " hey you kids, get offa my lawn !"
FabBlondie , 21 Sep 2017 17:06
There are good reasons to teach coding. Too many of today's computer users are amazingly unaware of the technology that allows them to send and receive emails, use their smart phones, and use websites. Few understand the basic issues involved in computer security, especially as it relates to their personal privacy. Hopefully some introductory computer classes could begin to remedy this, and the younger the students the better.

Security problems are not strictly a matter of coding.

Security issues persist in tech. Clearly that is not a function of the size of the workforce. I propose that it is a function of poor management and design skills. These are not taught in any programming class I ever took. I learned these on the job and in an MBA program, and because I was determined.

Don't confuse basic workforce training with an effective application of tech to authentic needs.

How can the "disruption" so prized in today's Big Tech do anything but aggravate our social problems? Tech's disruption begins with a blatant ignorance of and disregard for causes, and believes to its bones that a high tech app will truly solve a problem it cannot even describe.

Kool Aid anyone?

peterainbow -> brady , 21 Sep 2017 17:05
Indeed, that idea has been around as long as COBOL, and in practice it has just made things worse. The fact that many people outside of software engineering don't seem to realise is that the coding itself is a relatively small part of the job.
FabBlondie -> imipak , 21 Sep 2017 17:04
Hurrah.
peterainbow -> rebel7 , 21 Sep 2017 17:04
So how many female and older software engineers are there who are unable to get a job? I'm one of them: at 55 I find it impossible to get a job, and unlike many 'developers' I know what I'm doing.
peterainbow , 21 Sep 2017 17:02
Meanwhile the age and sex discrimination in IT goes on; if there were truly a shortage I wouldn't be unemployed.
Jared Hall -> peter nelson , 21 Sep 2017 17:01
Training more people for an occupation will result in more people becoming qualified to perform that occupation, regardless of the fact that many will perform poorly at it. A CS degree is no guarantee of competency, but it is one of the best indicators of general qualification we have at the moment. If you can provide a better metric for analyzing the underlying qualifications of the labor force, I'd love to hear it.

Regarding your anecdote: while interesting, it is poor evidence when compared to the aggregate statistical data analyzed in the EPI study.

peter nelson -> FabBlondie , 21 Sep 2017 17:00

Job-specific training is completely different.

Good grief. It's not job-specific training. You sound like someone who knows nothing about computer programming.

Designing a computer program requires analysing the task: breaking it down into its components, prioritising them and identifying interdependencies, and figuring out which parts of it can be broken out and done separately. Expressing all this in some programming language like Java, C, or C++ is quite secondary.

So once you learn to organise a task properly you can apply it to anything - remodeling a house, planning a vacation, repairing a car, starting a business, or administering a (non-software) project at work.
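
As a minimal sketch of that decomposition mindset (Python, with hypothetical task names invented purely for illustration), here is "planning a vacation" expressed as components and interdependencies, ordered so that every prerequisite is finished first:

# A minimal sketch of task decomposition: each step declares the steps
# it depends on, and we order them so every dependency runs first.
# The task names are hypothetical, invented for this example.
from graphlib import TopologicalSorter  # standard library since Python 3.9

tasks = {
    "pick_dates":   set(),
    "set_budget":   set(),
    "choose_place": {"set_budget"},
    "book_flights": {"pick_dates", "choose_place"},
    "book_hotel":   {"pick_dates", "choose_place"},
    "pack_bags":    {"book_flights"},
}

# static_order() yields the tasks so that each one's prerequisites come
# first -- exactly the "what can be done separately, and in what order"
# analysis described above.
for step in TopologicalSorter(tasks).static_order():
    print(step)

The point is not the code but the analysis: once the components and their dependencies are written down, the order of work falls out mechanically.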

[Oct 02, 2017] Evaluation of potential candidates for a programming job should include evaluation of their previous projects and the code they have written

Notable quotes:
"... Thank you. The kids that spend high school researching independently and spend their nights hacking just for the love of it and getting a job without college are some of the most competent I've ever worked with. Passionless college grads that just want a paycheck are some of the worst. ..."
"... how about how new labor tried to sign away IT access in England to India in exchange for banking access there, how about the huge loopholes in bringing in cheap IT workers from elsewhere in the world, not conspiracies, but facts ..."
"... And I've never recommended hiring anyone right out of school who could not point me to a project they did on their own, i.e., not just grades and test scores. I'd like to see an IOS or Android app, or a open-source component, or utility or program of theirs on GitHub, or something like that. ..."
"... most of what software designers do is not coding. It requires domain knowledge and that's where the "smart" IDEs and AI coding wizards fall down. It will be a long time before we get where you describe. ..."
Oct 02, 2017 | discussion.theguardian.com

peter nelson -> c mm , 21 Sep 2017 19:49

Instant feedback is one of the things I really like about programming, but it's also the thing that some people can't handle. As I'm developing a program all day long, the compiler is telling me about build errors or warnings, or when I execute it, it crashes or produces unexpected output, etc. Software engineers are bombarded all day with negative feedback and little failures. You have to be thick-skinned for this work.
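
To make that feedback loop concrete, here is a deliberately broken one-liner of the kind a developer trips over dozens of times a day (hypothetical Python, for illustration only), together with its immediate fix:

# A deliberately broken function: running the commented-out call gives
# instant, blunt feedback (NameError: name 'vals' is not defined).
def average(values):
    return sum(values) / len(vals)   # typo: 'vals' instead of 'values'

# average([1, 2, 3])   # uncomment to see the interpreter complain

# The fix is equally immediate, and the cycle repeats all day long:
def average_fixed(values):
    return sum(values) / len(values)

print(average_fixed([1, 2, 3]))      # prints 2.0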
peter nelson -> peterainbow , 21 Sep 2017 19:42
How is it shallow and lazy? I'm hiring for the real world so I want to see some real world accomplishments. If the candidate is fresh out of university they can't point to work projects in industry because they don't have any. But they CAN point to stuff they've done on their own. That shows both motivation and the ability to finish something. Why do you object to it?
anticapitalist -> peter nelson , 21 Sep 2017 14:47
Thank you. The kids that spend high school researching independently, spend their nights hacking just for the love of it, and get a job without college are some of the most competent I've ever worked with. Passionless college grads that just want a paycheck are some of the worst.
John Kendall , 21 Sep 2017 19:42
There is a big difference between "coding" and programming. Coding for a smart phone app is a matter of calling functions that are built into the device. For example, there are functions for the GPS or for creating buttons or for simulating motion in a game. These are what we used to call subroutines. The difference is that whereas we had to write our own subroutines, now they are just preprogrammed functions. How those functions are written is of little or no importance to today's coders.

Nor are they able to program on that level. Real programming requires not only a knowledge of programming languages, but also a knowledge of the underlying algorithms that make up actual programs. I suspect that "coding" classes operate on a quite superficial level.
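
A minimal sketch of that distinction (Python, illustrative only): the first fragment "codes" by calling a built-in, the second "programs" by implementing the underlying algorithm.

# "Coding": calling a preprogrammed function and trusting the black box.
data = [5, 2, 9, 1]
data.sort()                          # the built-in does all the real work

# "Programming": knowing the underlying algorithm well enough to write it.
def insertion_sort(items):
    # Sort a list in place by repeatedly inserting each element into its
    # correct position among the already-sorted prefix.
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]  # shift larger elements to the right
            j -= 1
        items[j + 1] = current

other = [5, 2, 9, 1]
insertion_sort(other)
print(data, other)                   # both print [1, 2, 5, 9]

A coding class can stop at the first fragment; a programming course has to explain why the second one works and what it costs.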

Game Cabbage -> theindyisbetter , 21 Sep 2017 19:40
It's not about the amount of work or the amount of labor. It's about the comparative availability of both and how that affects the balance of power, and that in turn affects the overall quality of life for the 'majority' of people.
c mm -> Ed209 , 21 Sep 2017 19:39
Most of this is not true. Peter Nelson gets it right by talking about breaking steps down and thinking rationally. The reason you can't just teach the theory, however, is that humans learn much better with feedback. Think about trying to learn how to build a fast car, but you never get in and test its speed. That would be silly. Programming languages take the system of logic that has been developed for centuries and give instant feedback on the results. It's a language of rationality.
peter nelson -> peterainbow , 21 Sep 2017 19:37
This article is about the US. The tech industry in the EU is entirely different, and basically moribund. Where is the EU's Microsoft, Apple, Google, Amazon, Oracle, Intel, Facebook, etc, etc? The opportunities for exciting interesting work, plus the time and schedule pressures that force companies to overlook stuff like age because they need a particular skill Right Now, don't exist in the EU. I've done very well as a software engineer in my 60's in the US; I cannot imagine that would be the case in the EU.
peterainbow -> peter nelson , 21 Sep 2017 19:37
Sorry, but that's just not true. I doubt you are really still programming; you're a quasi-programmer, really a manager who likes to keep their hand in. You certainly aren't busy, as you've been posting all over this CiF. Also, why would you try to hire someone with such disparate skillsets? Makes no sense at all.

Oh, and you'd be correct that I do have workplace issues, i.e. I have a disability and I also suffer from depression, but that shouldn't bar me from employment. And regarding my skills going stale, that again contradicts your statement above that it's about planning/analysis/algorithms etc. (which to some extent I agree with).

c mm -> peterainbow , 21 Sep 2017 19:36
Not at all, it's really egalitarian. If I want to hire someone to paint my portrait, the best way to know if they're any good is to see their previous work. If they've never painted a portrait before, then I may want to go with the girl who has.
c mm -> ragingbull , 21 Sep 2017 19:34
There is definitely not an excess. Just look at the projected jobs for computer science at the Bureau of Labor Statistics.
c mm -> perble conk , 21 Sep 2017 19:32
Right? It's ridiculous. "Hey, there's this industry you can train for that is super valuable to society and pays really well!"
Then Ben Tarnoff says, "Don't do it! If you do, you'll drive down wages for everyone else in the industry. Build your fire-starting and rock-breaking skills instead."
peterainbow -> peter nelson , 21 Sep 2017 19:29
How about how New Labour tried to sign away IT access in England to India in exchange for banking access there? How about the huge loopholes in bringing in cheap IT workers from elsewhere in the world? Not conspiracies, but facts.
peter nelson -> eirsatz , 21 Sep 2017 19:25
I think the difference between gifted and not is motivation. But I agree it's not innate. The kid who stayed up all night in high school hacking into the school server to fake his coding class grade is probably more gifted than the one who spent 4 years in college getting a BS in CS because someone told him he could get a job when he got out.

I've done some hiring in my life and I always ask them to tell me about stuff they did on their own.

peter nelson -> TheBananaBender , 21 Sep 2017 19:20

Most coding jobs are bug fixing.

The only bugs I have to fix are the ones I make.

peter nelson -> Ed209 , 21 Sep 2017 19:19
As several people have pointed out, writing a computer program requires analyzing and breaking down a task into steps, identifying interdependencies, prioritizing the order, figuring out what parts can be organized into separate tasks that can be done separately, etc.

These are completely independent of the language - I've been programming for 40 years in everything from FORTRAN to APL to C to C# to Java and it's all the same. Not only that but they transcend programming - they apply to planning a vacation, remodeling a house, or fixing a car.

peter nelson -> ragingbull , 21 Sep 2017 19:14
Neither coding nor having a bachelor's degree in computer science makes you a suitable job candidate. I've done a lot of recruiting and interviews in my life, and right now I'm trying to hire someone. And I've never recommended hiring anyone right out of school who could not point me to a project they did on their own, i.e., not just grades and test scores. I'd like to see an iOS or Android app, or an open-source component, or a utility or program of theirs on GitHub, or something like that.

That's the thing that distinguishes software from many other fields - you can do something real and significant on your own. If you haven't managed to do so in 4 years of college you're not a good candidate.

peter nelson -> nickGregor , 21 Sep 2017 19:07
Within the next year coding will be old news and you will simply be able to describe things in your native language in such a way that the machine will be able to execute any set of instructions you give it.

In a sense that's already true, as I noted elsewhere. 90% of the code in my projects (Java and C# in their respective IDEs) is machine-generated. I do relatively little "coding". But the flaw in your idea is this: most of what software designers do is not coding. It requires domain knowledge, and that's where the "smart" IDEs and AI coding wizards fall down. It will be a long time before we get where you describe.

Ricardo111 -> martinusher , 21 Sep 2017 19:03
Completely agree. At the highest levels there is more work that goes into managing complexity and making sure nothing is missed than in making the wheels turn and the beepers beep.
ragingbull , 21 Sep 2017 19:02
Hang on... if the current excess of computer science grads is not driving down wages, why would training more kids to code make any difference?
Ricardo111 -> youngsteveo , 21 Sep 2017 18:59
I've actually interviewed people for very senior technical positions in Investment Banks who had all the fancy talk in the world and yet failed at some very basic "write me a piece of code that does X" tests.

The next hurdle is people who have learned how to deal with certain situations and yet don't really understand how things work, so they are unable to figure it out if you change the problem parameters.

That said, the average coder is only slightly beyond this point. The ones who can take into account maintainability and flexibility for future enhancements when developing are already a minority, and those who can understand the why of software development process steps, design software system architectures or do a proper Technical Analysis are very rare.
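
For context, the "write me a piece of code that does X" screening tests mentioned here are usually this small; the classic FizzBuzz exercise is a representative (hypothetical) example in Python:

# A classic screening exercise: print 1..100, but "Fizz" for multiples
# of 3, "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
def fizzbuzz(n=100):
    for i in range(1, n + 1):
        if i % 15 == 0:
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)

fizzbuzz()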

eirsatz -> Ricardo111 , 21 Sep 2017 18:57
Hubris. It's easy to mistake efficiency born of experience for innate talent. The difference between a 'gifted coder' and a 'non-gifted junior coder' is much more likely to be 10 or 15 years sitting at a computer, less if there are good managers and mentors involved.
Ed209 , 21 Sep 2017 18:57
Politicians love the idea of teaching children to 'code', because it sounds so modern, and nobody could possibly object... could they? Unfortunately it simply shows up their utter ignorance of technical matters, because there isn't a language called 'coding'. Computer programming languages have changed enormously over the years, and continue to evolve. If you learn the wrong language you'll be about as welcome in the IT industry as a lamp-lighter or a comptometer operator.

The pace of change in technology can render skills and qualifications obsolete in a matter of a few years, and only the very best IT employers will bother to retrain their staff - it's much cheaper to dump them. (Most IT posts are outsourced through agencies anyway - those that haven't been off-shored.)

peter nelson -> YEverKnot , 21 Sep 2017 18:54
And this isn't even a good conspiracy theory; it's a bad one. He offers no evidence that there's an actual plan or conspiracy to do this. I'm looking for an account of where the advocates of coding education met to plot this in some castle in Europe or maybe a secret document like "The Protocols of the Elders of Google", or some such.
TheBananaBender , 21 Sep 2017 18:52
Most jobs in IT are shit - desktop support, operations droids. Most coding jobs are bug fixing.
Ricardo111 -> Wiretrip , 21 Sep 2017 18:49
Tool Users Vs Tool Makers. The really good coders actually get why certain things work as they do and can adjust them for different conditions. The mass produced coders are basically code copiers and code gluing specialists.
peter nelson -> AmyInNH , 21 Sep 2017 18:49
People who get Masters and PhD's in computer science are not usually "coders" or software engineers - they're usually involved in obscure, esoteric research for which there really is very little demand. So it doesn't surprise me that they're unemployed. But if someone has a Bachelor's in CS and they're unemployed I would have to wonder what they spent their time at university doing.

The thing about software that distinguishes it from lots of other fields is that you can make something real and significant on your own . I would expect any recent CS major I hire to be able to show me an app or an open-source component or something similar that they made themselves, and not just test scores and grades. If they could not then I wouldn't even think about hiring them.

Ricardo111 , 21 Sep 2017 18:44
Fortunately for those of us who are actually good at coding, the difference in productivity between a gifted coder and a non-gifted junior developer is something like 100-fold. Knowing how to code and actually being efficient at creating software programs and systems are about as far apart as knowing how to write and actually being able to write a bestselling, exciting crime trilogy.
peter nelson -> jamesupton , 21 Sep 2017 18:36

The rest of us will still have to stick to burning buildings down and stringing up the aristocracy.

If you know how to write software you can get a robot to do those things.

peter nelson -> Julian Williams , 21 Sep 2017 18:34
I do think there is excess supply of software programmers. There is only a modest number of decent jobs, say as an algorithms developer in finance, general architecture of complex systems or to some extent in systems security.

This article is about coding; most of those jobs require very little of that.

Most very high paying jobs in the technology sector are in the same standard upper management roles as in every other industry.

How do you define "high paying"? Everyone I know (and I know a lot because I've been a sw engineer for 40 years) who is working fulltime as a software engineer is making a high-middle-class salary, and can easily afford a home, travel on holiday, investments, etc.

YEverKnot , 21 Sep 2017 18:32

Tech's push to teach coding isn't about kids' success – it's about cutting wages

Nowt like a good conspiracy theory.
freeandfair -> WithoutPurpose , 21 Sep 2017 18:31
What is a stupidly low salary? 100K?
freeandfair -> AmyInNH , 21 Sep 2017 18:30
> Already there. I take it you skipped right past the employment prospects for US STEM grads - 50% chance of finding STEM work.

That just means 50% of them are no good and need to develop their skills further or try something else.
Not everyone with a STEM degree from some 3rd-rate college is capable of doing complex IT or STEM work.

peter nelson -> edmundberk , 21 Sep 2017 18:30

So, is teaching people English or arithmetic all about reducing wages for the literate and numerate?

Yes. Haven't you noticed how wage growth has flattened? That's because some "do-gooders" thought it would be a fine idea to educate the peasants. There was a time when only the well-to-do knew how to read and write, and that's why the well-to-do were well-to-do. Education is evil. Stop educating people and then those of us who know how to read and write can charge them for reading and writing letters and email. Better yet, we can have Chinese and Indians do it for us and we just charge a transaction fee.

AmyInNH -> peter nelson , 21 Sep 2017 18:27
Masses of the public use cars; it doesn't mean millions need schooling in auto mechanics. Same for software coding. We aren't even using those who have Bachelors, Masters and PhDs in CS.
carlospapafritas , 21 Sep 2017 18:27
"..importing large numbers of skilled guest workers from other countries through the H1-B visa program..."

"skilled" is good. H1B has long ( appx 17 years) been abused and turned into trafficking scheme. One can buy H1B in India. Powerful ethnic networks wheeling & dealing in US & EU selling IT jobs to essentially migrants.

The real IT wages haven't been stagnant but steadily falling from the 90s. It's easy to see why. $82K/year IT wage was about average in the 90s. Comparing the prices of housing (& pretty much everything else) between now gives you the idea.

freeandfair -> whitehawk66 , 21 Sep 2017 18:27
> not every kid wants or needs to have their soul sucked out of them sitting in front of a screen full of code for some idiotic service that some other douchbro thinks is the next iteration of sliced bread

Taking a couple of years of programming is not enough to do this as a job, don't worry.
But learning to code is like learning maths - it helps to develop logical thinking, which will benefit you in every area of your life.

James Dey , 21 Sep 2017 18:25
We should stop teaching our kids to be journalists, then your wage might go up.
peter nelson -> AmyInNH , 21 Sep 2017 18:23
What does this even mean?

[Oct 02, 2017] Programming is a culturally important skill

Notable quotes:
"... A lot of basic entry level jobs require a good level of Excel skills. ..."
"... Programming is a cultural skill; master it, or even understand it on a simple level, and you understand how the 21st century works, on the machinery level. To bereave the children of this crucial insight is to close off a door to their future. ..."
"... What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra. ..."
"... Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it. ..."
"... We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money. ..."
"... Libertarianism posits that everyone should be free to sell their labour or negotiate their own arrangements without the state interfering. So if cheaper foreign labour really was undercutting American labout the Libertarians would be thrilled. ..."
"... Not producing enough to fill vacancies or not producing enough to keep wages at Google's preferred rate? Seeing as research shows there is no lack of qualified developers, the latter option seems more likely. ..."
"... We're already using Asia as a source of cheap labor for the tech industry. Why do we need to create cheap labor in the US? ..."
www.moonofalabama.org
David McCaul -> IanMcLzzz , 21 Sep 2017 13:03
There are very few professional scribes nowadays; a good level of reading & writing is simply a default even for the lowest-paid jobs. A lot of basic entry-level jobs require a good level of Excel skills. Several years from now basic coding will be necessary to manipulate basic tools for entry-level jobs, especially as increasingly a lot of real code will be generated by expert systems supervised by a tiny number of supervisors. Coding jobs will go the same way that trucking jobs will go when driverless vehicles are perfected.

anticapitalist, 21 Sep 2017 14:25

Offer the class, but don't make it mandatory. Just as I could never succeed at playing football, others will not succeed at coding. The last thing the industry needs is more bad developers showing up for a paycheck.

Fanastril , 21 Sep 2017 14:08

Programming is a cultural skill; master it, or even understand it on a simple level, and you understand how the 21st century works, on the machinery level. To deprive children of this crucial insight is to close off a door to their future. What's next, keep them off Math, because, you know...
Taylor Dotson -> freeandfair , 21 Sep 2017 13:59
That's some crystal ball you have there. English teachers will need to know how to code? Same with plumbers? Same with janitors, CEOs, and anyone working in the service industry?
PolydentateBrigand , 21 Sep 2017 12:59
The economy isn't a zero-sum game. Developing a more skilled workforce that can create more value will lead to economic growth and improvement in the general standard of living. Talented coders will start new tech businesses and create more jobs.

WumpieJr , 21 Sep 2017 13:23

What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra.

But it isn't just about coding for Tarnoff. He seems to hold education in contempt generally. "The far-fetched premise of neoliberal school reform is that education can mend our disintegrating social fabric." If they can't literally fix everything, let's just get rid of them, right?

Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it.

mlzarathustra , 21 Sep 2017 16:52
I agree with the basic point. We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money.

The bigger problem is that nobody cares about the arts, and as expensive as education is, nobody wants to carry around debt for a skill that won't bring in the bucks. And smartphone-obsessed millennials have too short an attention span to fathom how empty their lives are, devoid of aesthetic depth as they are.

I can't draw a definite link, but I think algorithm fails, which are based on fanatical reliance on programmed routines as the solution to everything, are rooted in the shortage of education and cultivation in the arts.

Economics is a social science, and all this is merely a reflection of shared cultural values. The problem is, people think it's math (it's not) and therefore set in stone.

AmyInNH -> peter nelson , 21 Sep 2017 16:51
Geeze, it'd be nice if you'd make an effort.
https://rucore.libraries.rutgers.edu/rutgers-lib/45960/PDF/1/
https://rucore.libraries.rutgers.edu/rutgers-lib/46156/
https://rucore.libraries.rutgers.edu/rutgers-lib/46207/
peter nelson -> WyntonK , 21 Sep 2017 16:45
Libertarianism posits that everyone should be free to sell their labour or negotiate their own arrangements without the state interfering. So if cheaper foreign labour really was undercutting American labour, the Libertarians would be thrilled.

But it's not. I'm in my 60's and retiring but I've been a software engineer all my life. I've worked for many different companies, and in different industries and I've never had any trouble competing with cheap imported workers. The people I've seen fall behind were ones who did not keep their skills fresh. When I was laid off in 2009 in my mid-50's I made sure my mobile-app skills were bleeding edge (in those days ANYTHING having to do with mobile was bleeding edge) and I used to go to job interviews with mobile devices to showcase what I could do. That way they could see for themselves and not have to rely on just a CV.

The older guys who fell behind did so because their skills and toolsets had become obsolete.

Now I'm trying to hire a replacement to write Android code for use in industrial production and struggling to find someone with enough experience. So where is this oversupply I keep hearing about?

Jared Hall -> RogTheDodge , 21 Sep 2017 16:42
Not producing enough to fill vacancies or not producing enough to keep wages at Google's preferred rate? Seeing as research shows there is no lack of qualified developers, the latter option seems more likely.
JayThomas , 21 Sep 2017 16:39

It's about ensuring those salaries no longer exist, by creating a source of cheap labor for the tech industry.

We're already using Asia as a source of cheap labor for the tech industry. Why do we need to create cheap labor in the US? That just seems inefficient.

FabBlondie -> RogTheDodge , 21 Sep 2017 16:39
There was never any need to give our jobs to foreigners. That is, if you are comparing the production of domestic vs. foreign workers. The sole need was, and is, to increase profits.
peter nelson -> AmyInNH , 21 Sep 2017 16:34
Link?
FabBlondie , 21 Sep 2017 16:34
Schools MAY be able to fix big social problems, but only if they teach a well-rounded curriculum that includes classical history and the humanities. Job-specific training is completely different. What a joke to persuade public school districts to pick up the tab on job training. The existing social problems were not caused by a lack of programmers, and cannot be solved by Big Tech.

I agree with the author that computer programming skills are not that limited in availability. Big Tech solved the problem of the well-paid professional some years ago by letting them go - these were mostly workers in their 50s - and replacing them with H1-B visa-holders from India, who work for a fraction of what their experienced American counterparts earn.

It is all about profits. Big Tech is no different than any other "industry."

peter nelson -> Jared Hall , 21 Sep 2017 16:31
Supply of apples does not affect the demand for oranges. Teaching coding in high school does not necessarily alter the supply of software engineers. I studied Chinese History and geology at University but my doing so has had no effect on the job prospects of people doing those things for a living.
johnontheleft -> Taylor Dotson , 21 Sep 2017 16:30
You would be surprised just how much a little coding knowledge has transformed my ability to do my job (a job that is not directly related to IT at all).
peter nelson -> Jared Hall , 21 Sep 2017 16:29
Because teaching coding does not affect the supply of actual engineers. I've been a professional software engineer for 40 years and coding is only a small fraction of what I do.
peter nelson -> Jared Hall , 21 Sep 2017 16:28
You and the linked article don't know what you're talking about. A CS degree does not equate to a productive engineer.

A few years ago I was on the recruiting and interviewing committee to try to hire some software engineers for a scientific instrument my company was making. The entire team had about 60 people (hw, sw, mech engineers) but we needed 2 or 3 sw engineers with math and signal-processing expertise. The project was held up for SIX months because we could not find the people we needed. It would have taken a lot longer than that to train someone up to our needs. Eventually we brought in some Chinese engineers which cost us MORE than what we would have paid for an American engineer when you factor in the agency and visa paperwork.

Modern software engineers are not just generic interchangeable parts - 21st-century technology often requires specialised scientific, mathematical, production or business domain-specific knowledge, and those people are hard to find.

freeluna -> freeluna , 21 Sep 2017 16:18
...also, this article is alarmist and I disagree with it. Dear Author, Phphphphtttt! Sincerely, freeluna
AmyInNH , 21 Sep 2017 16:16
Regimentation of the many, for the benefit of the few.
AmyInNH -> Whatitsaysonthetin , 21 Sep 2017 16:15
Visa jobs are part of trade agreements. To be very specific, the US government (and the EU) trade Western jobs for market access in the East.
http://www.marketwatch.com/story/in-india-british-leader-theresa-may-preaches-free-trade-2016-11-07
There is no shortage. This is selling off the West's middle class.
Take a look at remittances on Wikipedia and you'll get a good idea just how much it costs the US and EU economies, for the sake of record profits for Western industry.
jigen , 21 Sep 2017 16:13
And thanks to the author for not using the adjective "elegant" in describing coding.
freeluna , 21 Sep 2017 16:13
I see advantages in teaching kids to code, and for kids to make arduino and other CPU powered things. I don't see a lot of interest in science and tech coming from kids in school. There are too many distractions from social media and game platforms, and not much interest in developing tools for future tech and science.
jigen , 21 Sep 2017 16:13
Let the robots do the coding. Sorted.
FluffyDog -> rgilyead , 21 Sep 2017 16:13
Although coding per se is a technical skill it isn't designing or integrating systems. It is only a small, although essential, part of the whole software engineering process. Learning to code just gets you up the first steps of a high ladder that you need to climb a fair way if you intend to use your skills to earn a decent living.
rebel7 , 21 Sep 2017 16:11
BS.

Friend of mine in the SV tech industry reports that they are about 100,000 programmers short in just the internet security field.

Y'all are trying to create a problem where there isn't one. Maybe we shouldn't teach them how to read either. They might want to work somewhere besides the grill at McDonald's.

AmyInNH -> WyntonK , 21 Sep 2017 16:11
To which they will respond, offshore.
AmyInNH -> MrFumoFumo , 21 Sep 2017 16:10
They're not looking for good, they're looking for cheap + visa indentured. Non-citizens.
nickGregor , 21 Sep 2017 16:09
Within the next year coding will be old news and you will simply be able to describe things in your native language in such a way that the machine will be able to execute any set of instructions you give it. Coding is going to change from its purely abstract form that is not utilized at peak -- but if you can describe what you envision in an effective, concise manner you could become a very good coder very quickly -- and competence will be determined entirely by imagination, and the barriers to entry will all but be extinct.
AmyInNH -> unclestinky , 21 Sep 2017 16:09
Already there. I take it you skipped right past the employment prospects for US STEM grads - 50% chance of finding STEM work.
AmyInNH -> User10006 , 21 Sep 2017 16:06
Apparently a whole lot of people are just making it up, eh?
http://www.motherjones.com/politics/2017/09/inside-the-growing-guest-worker-program-trapping-indian-students-in-virtual-servitude/
From today,
http://www.computerworld.com/article/2915904/it-outsourcing/fury-rises-at-disney-over-use-of-foreign-workers.html
All the way back to 1995,
https://www.youtube.com/watch?v=vW8r3LoI8M4&feature=youtu.be
JCA1507 -> whitehawk66 , 21 Sep 2017 16:04
Bravo
JCA1507 -> DirDigIns , 21 Sep 2017 16:01
Total... utter... no other way... huge... will only get worse... everyone... (not a very nuanced commentary is it).

I'm glad pieces like this are mounting, it is relevant that we counter the mix of messianism and opportunism of Silicon Valley propaganda with convincing arguments.

RogTheDodge -> WithoutPurpose , 21 Sep 2017 16:01
That's not my experience.
AmyInNH -> TTauriStellarbody , 21 Sep 2017 16:01
It's a stall tactic by Silicon Valley: "See, we're trying to resolve the [non-existent] shortage."
AmyInNH -> WyntonK , 21 Sep 2017 16:00
They aren't immigrants. They're visa-indentured foreign workers. Why does that matter? It's part of the cheap+indentured hiring criteria. If it were only cheap, they'd be lowballing offers to citizens and US new grads.
RogTheDodge -> Jared Hall , 21 Sep 2017 15:59
No. Because they're the ones wanting them and realizing the US education system is not producing enough.
RogTheDodge -> Jared Hall , 21 Sep 2017 15:58
Except the demand is increasing massively.
RogTheDodge -> WyntonK , 21 Sep 2017 15:57
That's why we are trying to educate American coders - so we don't need to give our jobs to foreigners.
AmyInNH , 21 Sep 2017 15:56
Correct premises,
- proletarianize programmers
- many qualified graduates simply can't find jobs.
Invalid conclusion:
- The problem is there aren't enough good jobs to be trained for.

That conclusion only makes sense if you skip right past ...
" importing large numbers of skilled guest workers from other countries through the H1-B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status"

Hiring Americans doesn't "hurt" their record profits. It's incessant greed and collusion with our corrupt congress.

Oldvinyl , 21 Sep 2017 15:51
This column was really annoying. I taught my students how to program when I was given a free hand to create the computer studies curriculum for a new school I joined. (Not in the UK, thank Dog.) My 7th graders began with studying the history and uses of computers and communications tech. My 8th graders learned about computer logic (AND, OR, NOT, etc.) and moved on to QuickBASIC in the second part of the year. My 9th graders learned about databases and SQL and how to use HTML to make their own Web sites. Last year I received a phone call from the father of one student thanking me for creating the course; his son had just received a job offer and now works in San Francisco for Google.
I am so glad I taught them "coding" (UGH), as the writer puts it, rather than arty-farty subjects not worth a damn in the jobs market.
WyntonK -> DirDigIns , 21 Sep 2017 15:47
I live and work in Silicon Valley and you have no idea what you are talking about. There's no shortage of coders at all. Terrific coders are let go because of their age and the availability of much cheaper foreign coders (no, I am not opposed to immigration).
Sean May , 21 Sep 2017 15:43
Looks like you pissed off a ton of people who can't write code and are none too happy with you pointing out the reason they're slinging insurance for Geico.

I think you're quite right that coding skills will eventually enter the mainstream and slowly bring down the cost of hiring programmers.

The fact is that even if you don't get paid to be a programmer you can absolutely benefit from having some coding skills.

There may however be some kind of major coding revolution with the advent of quantum computing. The way code is written now could become obsolete.

Jared Hall -> User10006 , 21 Sep 2017 15:43
Why is it a fantasy? Does supply and demand not apply to IT labor pools?
Jared Hall -> ninianpark , 21 Sep 2017 15:42
Why is it a load of crap? If you increase the supply of something with no corresponding increase in demand, the price will decrease.
pictonic , 21 Sep 2017 15:40
A well-argued article that hits the nail on the head. Amongst any group of coders, very few are truly productive, and they are self-starters; training is really needed to do the admin.
Jared Hall -> DirDigIns , 21 Sep 2017 15:39
There is not a huge skills shortage. That is why the author linked this EPI report analyzing the data to prove exactly that. This may not be what people want to believe, but it is certainly what the numbers indicate. There is no skills gap.

http://www.epi.org/files/2013/bp359-guestworkers-high-skill-labor-market-analysis.pdf

Axel Seaton -> Jaberwocky , 21 Sep 2017 15:34
Yeah, but the money is crap
DirDigIns -> IanMcLzzz , 21 Sep 2017 15:32
Perfect response for the absolute crap that the article is pushing.
DirDigIns , 21 Sep 2017 15:30
Total and utter crap, no other way to put it.

There is a huge skills shortage in key tech areas that will only get worse if we don't educate and train the young effectively.

Everyone wants youth to have good skills for the knowledge economy and the ability to earn a good salary and build up life chances for UK youth.

So we get this verbal diarrhoea of an article. Defies belief.

Whatitsaysonthetin -> Evelita , 21 Sep 2017 15:27
Yes. China and India are indeed training youth in coding skills. In order that they take jobs in the USA and UK! It's been going on for 20 years and has resulted in many experienced IT staff struggling to get work at all and, even if they can, to suffer stagnating wages.
WmBoot , 21 Sep 2017 15:23
Wow. Congratulations to the author for provoking such a torrent of vitriol! Job well done.
TTauriStellarbody , 21 Sep 2017 15:22
Has anyone's job been at risk from a 16-year-old who can cobble together a couple of lines of JavaScript since the dot-com bubble?

Good luck trying to teach a big enough pool of US school kids regular expressions, let alone the kind of test-driven continuous delivery that is the norm in the industry now.
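
For a sense of what even the first of those skills involves, here is a small Python sketch (illustrative only, with an invented validation task): a modest regular expression plus the kind of self-checking test that modern delivery pipelines expect.

import re

# A modest regular expression: match a UK-style postcode such as "SW1A 1AA".
# (A simplified pattern for illustration, not a complete postcode validator.)
POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? \d[A-Z]{2}$")

def is_postcode(text):
    return POSTCODE.match(text) is not None

# The test-driven part: assertions that must pass before the code ships.
assert is_postcode("SW1A 1AA")
assert is_postcode("M1 1AE")
assert not is_postcode("12345")
print("all checks passed")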

freeandfair -> youngsteveo , 21 Sep 2017 13:27
> A lot of resumes come across my desk that look qualified on paper, but that's not the same thing as being able to do the job

I have exactly the same experience. There is undeniably a skill gap. It takes about a year for a skilled professional to adjust and learn enough to become productive; it takes about 3-5 years for a college grad.

It is nothing new. But the issue is that as the college grad gets trained, another company steals him/her. And also keep in mind that all this time you are doing your job and training the new employee as time permits. Many companies in the US cut the non-profit departments (such as IT) to the bone; we cannot afford to lose a person and then train another replacement for 3-5 years.

The solution? Hire a skilled person. But that means nobody is training college grads, and in 10-20 years we are looking at a skill shortage to the point where the only option is bringing in foreign labor.

American cut-throat companies that care only about the bottom line cannibalized themselves.

farabundovive -> Ethan Hawkins , 21 Sep 2017 15:10

Heh. You are not a coder, I take it. :) Going to be a few decades before even the easiest coding jobs vanish.

Given how shit most coders of my acquaintance have been - especially in matters of work ethic, logic, matching s/w to user requirements and willingness to test and correct their gormless output - most future coding work will probably be in the area of disaster recovery. Sorry, since the poor snowflakes can't face the sad facts, we have to call it "business continuation" these days, don't we?
UncommonTruthiness , 21 Sep 2017 14:10
The demonization of Silicon Valley is clearly the next place to put all blame. Look what "they" did to us: computers, smart phones, HD television, world-wide internet, on and on. Get a rope!

I moved there in 1978 and watched the orchards and trailer parks on North 1st St. of San Jose transform into a concrete jungle. There used to be quite a bit of semiconductor equipment and device manufacturing in SV during the 80s and 90s. Now quite a few buildings have the same name : AVAILABLE. Most equipment and device manufacturing has moved to Asia.

Programming started with binary, then machine code (hexadecimal or octal) and moved to assembler as a compiled and linked structure. More compiled languages like FORTRAN, BASIC, PL-1, COBOL, PASCAL and C (and all its "+"s) followed, making programming easier for the less talented. Now the script-based languages (HTML, Java, etc.) are even higher-level and accessible to nearly all. Programming has become a commodity and will be priced like milk, wheat, corn, non-unionized workers and the like. The ship has sailed on this activity as a career.

[Sep 19, 2017] Boston Startups Are Teaching Boats to Drive Themselves by Joshua Brustein

Notable quotes:
"... He's also a sort of maritime-technology historian. A tall, white-haired man in a baseball cap, shark t-shirt and boat shoes, Benjamin said he's spent the last 15 years "making vehicles wet." He has the U.S. armed forces to thank for making his autonomous work possible. The military sparked the field of marine autonomy decades ago, when it began demanding underwater robots for mine detection, ..."
"... In 2006, Benjamin launched his open-source software project. With it, a computer is able to take over a boat's navigation-and-control system. Anyone can write programs for it. The project is funded by the U.S. Office for Naval Research and Battelle Memorial Institute, a nonprofit. Benjamin said there are dozens of types of vehicles using the software, which is called MOOS-IvP. ..."
Sep 19, 2017 | www.msn.com

Originally from: Bloomberg via Associated Press

Frank Marino, an engineer with Sea Machines Robotics, uses a remote control belt pack to control a self-driving boat in Boston Harbor. (Bloomberg) -- Frank Marino sat in a repurposed U.S. Coast Guard boat bobbing in Boston Harbor one morning late last month. He pointed the boat straight at a buoy several hundred yards away, while his colleague Mohamed Saad Ibn Seddik used a laptop to set the vehicle on a course that would run right into it. Then Ibn Seddik flipped the boat into autonomous driving mode. They sat back as the vessel moved at a modest speed of six knots, smoothly veering right to avoid the buoy, and then returned to its course.

In a slightly apologetic tone, Marino acknowledged the experience wasn't as harrowing as barreling down a highway in an SUV that no one is steering. "It's not like a self-driving car, where the wheel turns on its own," he said. Ibn Seddik tapped in directions to get the boat moving back the other way at twice the speed. This time, the vessel kicked up a wake, and the turn felt sharper, even as it gave the buoy the same wide berth as it had before. As far as thrills go, it'd have to do. Ibn Seddik said going any faster would make everyone on board nauseous.

The two men work for Sea Machines Robotics Inc., a three-year old company developing computer systems for work boats that can make them either remote-controllable or completely autonomous. In May, the company spent $90,000 to buy the Coast Guard hand-me-down at a government auction. Employees ripped out one of the four seats in the cabin to make room for a metal-encased computer they call a "first-generation autonomy cabinet." They painted the hull bright yellow and added the words "Unmanned Vehicle" in big, red letters. Cameras are positioned at the stern and bow, and a dome-like radar system and a digital GPS unit relay additional information about the vehicle's surroundings. The company named its new vessel Steadfast.

Autonomous maritime vehicles haven't drawn as much attention as self-driving cars, but they're hitting the waters with increased regularity. Huge shipping interests, such as Rolls-Royce Holdings Plc, Tokyo-based fertilizer producer Nippon Yusen K.K. and BHP Billiton Ltd., the world's largest mining company, have all recently announced plans to use driverless ships for large-scale ocean transport. Boston has become a hub for marine technology startups focused on smaller vehicles, with a handful of companies like Sea Machines building their own autonomous systems for boats, diving drones and other robots that operate on or under the water.

As Marino and Ibn Seddik were steering Steadfast back to dock, another robot boat trainer, Michael Benjamin, motored past them. Benjamin, a professor at Massachusetts Institute of Technology, is a regular presence on the local waters. His program in marine autonomy, a joint effort by the school's mechanical engineering and computer science departments, serves as something of a ballast for Boston's burgeoning self-driving boat scene. Benjamin helps engineers find jobs at startups and runs an open-source software project that's crucial to many autonomous marine vehicles.

He's also a sort of maritime-technology historian. A tall, white-haired man in a baseball cap, shark t-shirt and boat shoes, Benjamin said he's spent the last 15 years "making vehicles wet." He has the U.S. armed forces to thank for making his autonomous work possible. The military sparked the field of marine autonomy decades ago, when it began demanding underwater robots for mine detection, Benjamin explained from a chair on MIT's dock overlooking the Charles River. Eventually, self-driving software worked its way into all kinds of boats.

These systems tended to chart a course based on a specific script, rather than sensing and responding to their environments. But a major shift came about a decade ago, when manufacturers began allowing customers to plug in their own autonomy systems, according to Benjamin. "Imagine where the PC revolution would have gone if the only one who could write software on an IBM personal computer was IBM," he said.

In 2006, Benjamin launched his open-source software project. With it, a computer is able to take over a boat's navigation-and-control system. Anyone can write programs for it. The project is funded by the U.S. Office for Naval Research and Battelle Memorial Institute, a nonprofit. Benjamin said there are dozens of types of vehicles using the software, which is called MOOS-IvP.
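
The architecture the article describes -- independent programs plugging into a shared navigation-and-control backbone -- is essentially publish-subscribe. The following toy sketch in Python illustrates the shape of the idea; it is not the MOOS-IvP API, and all the names in it are invented for this example:

# A toy publish-subscribe backbone: independent apps share one message
# bus, so anyone can plug in their own navigation or control program
# without modifying the core system. Illustrative only; not MOOS-IvP.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, value):
        for callback in self.subscribers[topic]:
            callback(value)

bus = MessageBus()

# A third-party "autonomy app" plugs in by subscribing to a sensor topic
# and publishing an actuator command -- no changes to the core system.
def collision_avoider(range_m):
    if range_m < 50:                        # obstacle closer than 50 m
        bus.publish("DESIRED_HEADING", 20)  # veer 20 degrees to starboard

bus.subscribe("OBSTACLE_RANGE", collision_avoider)
bus.subscribe("DESIRED_HEADING", lambda h: print("helm: turn", h, "deg"))

bus.publish("OBSTACLE_RANGE", 30)           # prints: helm: turn 20 deg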

Startups using MOOS-IvP said it has created a kind of common vocabulary. "If we had a proprietary system, we would have had to develop training and train new employees," said Ibn Seddik. "Fortunately for us, Mike developed a course that serves exactly that purpose."

Teaching a boat to drive itself is easier than teaching a car in some ways. Boats typically don't have to deal with traffic, stoplights or roundabouts. But water is a unique challenge. "The structure of the road, with traffic lights, bounds your problem a little bit," said Benjamin. "The number of unique possible situations that you can bump into is enormous." At the moment, underwater robots represent a bigger chunk of the market than boats. Sales are expected to hit $4.6 billion in 2020, more than double the amount from 2015, according to ABI Research. The biggest customer is the military.

Several startups hope to change that. Michael Johnson, Sea Machines' chief executive officer, said the long-term potential for self-driving boats involves teams of autonomous vessels working in concert. In many harbors, multiple tugs bring in large container ships, communicating either through radio or by whistle. That could be replaced by software controlling all the boats as a single system, Johnson said.

Sea Machines' first customer is Marine Spill Response Corp., a nonprofit group funded by oil companies. The organization operates oil spill response teams that consist of a 210-foot ship paired with a 32-foot boat, which work together to drag a device collecting oil. Self-driving boats could help because staffing the 32-foot boat in choppy waters or at night can be dangerous, but the theory needs proper vetting, said Judith Roos, a vice president for MSRC. "It's too early to say, 'We're going to go out and buy 20 widgets.'"

Another local startup, Autonomous Marine Systems Inc., has been sending boats about 10 miles out to sea and leaving them there for weeks at a time. AMS's vehicles are designed to operate for long stretches, gathering data in wind farms and oil fields. One vessel is a catamaran dubbed the Datamaran, a name that first came from an employee's typo, said AMS CEO Ravi Paintal. The company also uses Benjamin's software platform. Paintal said AMS's longest missions so far have been 20 days, give or take. "They say when your boat can operate for 30 days out in the ocean environment, you'll be in the running for a commercial contract," he said.

... ... ...

[Sep 17, 2017] The last 25 years (or so) were years of tremendous progress in computers and networking that changed human civilization

Notable quotes:
"... To emulate those capabilities on computers will probably require another 100 years or more. Selective functions can be imitated even now (manipulator that deals with blocks in a pyramid was created in 70th or early 80th I think, but capabilities of human "eye controlled arm" is still far, far beyond even wildest dreams of AI. ..."
"... Similarly human intellect is completely different from AI. At the current level the difference is probably 1000 times larger then the difference between a child with Down syndrome and a normal person. ..."
"... Human brain is actually a machine that creates languages for specific domain (or acquire them via learning) and then is able to operate in terms of those languages. Human child forced to grow up with animals, including wild animals, learns and is able to use "animal language." At least to a certain extent. Some of such children managed to survive in this environment. ..."
"... If you are bilingual, try Google translate on this post. You might be impressed by their recent progress in this field. It did improved considerably and now does not cause instant laugh. ..."
"... One interesting observation that I have is that automation is not always improve functioning of the organization. It can be quite opposite :-). Only the costs are cut, and even that is not always true. ..."
"... Of course the last 25 years (or so) were years of tremendous progress in computers and networking that changed the human civilization. And it is unclear whether we reached the limit of current capabilities or not in certain areas (in CPU speeds and die shrinking we probably did; I do not expect anything significant below 7 nanometers: https://en.wikipedia.org/wiki/7_nanometer ). ..."
May 28, 2017 | economistsview.typepad.com

libezkova , May 27, 2017 at 10:53 PM

"When combined with our brains, human fingers are amazingly fine manipulation devices."

Not only fingers. The whole human arm is an amazing device. Pure magic, if you ask me.

To emulate those capabilities on computers will probably require another 100 years or more. Selective functions can be imitated even now (a manipulator that deals with blocks in a pyramid was created in the '70s or early '80s, I think), but the capabilities of the human "eye-controlled arm" are still far, far beyond even the wildest dreams of AI.

Similarly, human intellect is completely different from AI. At the current level the difference is probably 1000 times larger than the difference between a child with Down syndrome and a normal person.

The human brain is actually a machine that creates languages for a specific domain (or acquires them via learning) and then is able to operate in terms of those languages. A human child forced to grow up with animals, including wild animals, learns and is able to use "animal language," at least to a certain extent. Some such children have managed to survive in this environment.

Such cruel natural experiments have shown that the level of flexibility of the human brain is something really incredible. And IMHO it cannot be achieved by computers (although never say never).

Here we are talking about tasks that are a million times more complex than playing Go or chess, or driving a car on the street.

My impression is that most recent AI successes (especially IBM's win in Jeopardy ( http://www.techrepublic.com/article/ibm-watson-the-inside-story-of-how-the-jeopardy-winning-supercomputer-was-born-and-what-it-wants-to-do-next/ ), which was probably partially staged) are by and large due to the growth of storage and the number of cores in computers, not so much the sophistication of the algorithms used.

The limits of AI are clearly visible when we see the quality of translation from one language to another. For more or less complex technical texts it remains medium to low, as in "requires human editing".

If you are bilingual, try Google Translate on this post. You might be impressed by their recent progress in this field. It has improved considerably and no longer causes instant laughter.

The same goes for speech recognition. The progress is tremendous, especially in the last three to five years. But it is still far from perfect. Now, with some training, programs like Dragon are quite usable as dictation devices on, say, a PC with a 4-core 3GHz CPU and 16 GB of memory (especially if you are a native English speaker), but if you deal with specialised text or have a strong accent, they still leave much to be desired (although your knowledge of the program, experience and persistence can improve the results considerably).

One interesting observation that I have is that automation does not always improve the functioning of the organization. It can be quite the opposite :-). Only the costs are cut, and even that is not always true.

Of course the last 25 years (or so) were years of tremendous progress in computers and networking that changed human civilization. And it is unclear whether we have reached the limit of current capabilities or not in certain areas (in CPU speeds and die shrinking we probably did; I do not expect anything significant below 7 nanometers: https://en.wikipedia.org/wiki/7_nanometer ).

[Sep 16, 2017] Google Publicly Releases Internal Developer Documentation Style Guide

Sep 12, 2017 | developers.slashdot.org

(betanews.com)

Posted by BeauHD on Tuesday September 12, 2017

@06:00AM from the free-for-all dept.

BrianFagioli shares a report from BetaNews: The documentation aspect of any project is very important, as it can help people to both understand it and track changes. Unfortunately, many developers aren't very interested in the documentation aspect, so it often gets neglected. Luckily, if you want to maintain proper documentation and stay organized, today Google is releasing its internal developer documentation style guide.

This can quite literally guide your documentation, giving you a great starting point and keeping things consistent. Jed Hartman, Technical Writer at Google, says, "For some years now, our technical writers at Google have used an internal-only editorial style guide for most of our developer documentation. In order to better support external contributors to our open source projects, such as Kubernetes, AMP, or Dart, and to allow for more consistency across developer documentation, we're now making that style guide public.

If you contribute documentation to projects like those, you now have direct access to useful guidance about voice, tone, word choice, and other style considerations. It can be useful for general issues, like reminders to use second person, present tense, active voice, and the serial comma; it can also be great for checking very specific issues, like whether to write 'app' or 'application' when you want to be consistent with the Google Developers style."

You can access Google's style guide here.
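
As a quick illustration of those rules in practice (an invented sentence, not an example taken from the guide itself):

Before: "The file should then be saved by the user, and afterwards the configuration is going to be reloaded by the server."
After: "Save the file. The server then reloads the configuration."

The rewrite switches to second person, present tense, and active voice; with three or more items it would also apply the serial comma, as in "the app, the server, and the client."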

[Aug 21, 2017] As the crisis unfolds there will be talk about giving the UN some role in resolving international problems.

Aug 21, 2017 | www.lettinggobreath.com

psychohistorian | Aug 21, 2017 12:01:32 AM | 27

My understanding of the UN is that it is the High Court of the World where fealty is paid to the empire that funds most of the political circus anyway... and speaking of funding, or not, read the following link and let's see what PavewayIV adds to the potential sickness we are sleepwalking into.

As the UN delays talks, more industry leaders back ban on weaponized AI

[Jul 25, 2017] Knuth Computer Programming as an Art

Jul 25, 2017 | www.paulgraham.com

CACM , December 1974

When Communications of the ACM began publication in 1959, the members of ACM'S Editorial Board made the following remark as they described the purposes of ACM'S periodicals [2]:

"If computer programming is to become an important part of computer research and development, a transition of programming from an art to a disciplined science must be effected."
Such a goal has been a continually recurring theme during the ensuing years; for example, we read in 1970 of the "first steps toward transforming the art of programming into a science" [26]. Meanwhile we have actually succeeded in making our discipline a science, and in a remarkably simple way: merely by deciding to call it "computer science."

Implicit in these remarks is the notion that there is something undesirable about an area of human activity that is classified as an "art"; it has to be a Science before it has any real stature. On the other hand, I have been working for more than 12 years on a series of books called "The Art of Computer Programming." People frequently ask me why I picked such a title; and in fact some people apparently don't believe that I really did so, since I've seen at least one bibliographic reference to some books called "The Act of Computer Programming."

In this talk I shall try to explain why I think "Art" is the appropriate word. I will discuss what it means for something to be an art, in contrast to being a science; I will try to examine whether arts are good things or bad things; and I will try to show that a proper viewpoint of the subject will help us all to improve the quality of what we are now doing.

One of the first times I was ever asked about the title of my books was in 1966, during the last previous ACM national meeting held in Southern California. This was before any of the books were published, and I recall having lunch with a friend at the convention hotel. He knew how conceited I was, already at that time, so he asked if I was going to call my books "An Introduction to Don Knuth." I replied that, on the contrary, I was naming the books after him. His name: Art Evans. (The Art of Computer Programming, in person.)

From this story we can conclude that the word "art" has more than one meaning. In fact, one of the nicest things about the word is that it is used in many different senses, each of which is quite appropriate in connection with computer programming. While preparing this talk, I went to the library to find out what people have written about the word "art" through the years; and after spending several fascinating days in the stacks, I came to the conclusion that "art" must be one of the most interesting words in the English language.

The Arts of Old

If we go back to Latin roots, we find ars, artis meaning "skill." It is perhaps significant that the corresponding Greek word was τεχνη, the root of both "technology" and "technique."

Nowadays when someone speaks of "art" you probably think first of "fine arts" such as painting and sculpture, but before the twentieth century the word was generally used in quite a different sense. Since this older meaning of "art" still survives in many idioms, especially when we are contrasting art with science, I would like to spend the next few minutes talking about art in its classical sense.

In medieval times, the first universities were established to teach the seven so-called "liberal arts," namely grammar, rhetoric, logic, arithmetic, geometry, music, and astronomy. Note that this is quite different from the curriculum of today's liberal arts colleges, and that at least three of the original seven liberal arts are important components of computer science. At that time, an "art" meant something devised by man's intellect, as opposed to activities derived from nature or instinct; "liberal" arts were liberated or free, in contrast to manual arts such as plowing (cf. [6]). During the middle ages the word "art" by itself usually meant logic [4], which usually meant the study of syllogisms.

Science vs. Art

The word "science" seems to have been used for many years in about the same sense as "art"; for example, people spoke also of the seven liberal sciences, which were the same as the seven liberal arts [1]. Duns Scotus in the thirteenth century called logic "the Science of Sciences, and the Art of Arts" (cf. [12, p. 34f]). As civilization and learning developed, the words took on more and more independent meanings, "science" being used to stand for knowledge, and "art" for the application of knowledge. Thus, the science of astronomy was the basis for the art of navigation. The situation was almost exactly like the way in which we now distinguish between "science" and "engineering."

Many authors wrote about the relationship between art and science in the nineteenth century, and I believe the best discussion was given by John Stuart Mill. He said the following things, among others, in 1843 [28]:

Several sciences are often necessary to form the groundwork of a single art. Such is the complication of human affairs, that to enable one thing to be done, it is often requisite to know the nature and properties of many things... Art in general consists of the truths of Science, arranged in the most convenient order for practice, instead of the order which is the most convenient for thought. Science groups and arranges its truths so as to enable us to take in at one view as much as possible of the general order of the universe. Art... brings together from parts of the field of science most remote from one another, the truths relating to the production of the different and heterogeneous conditions necessary to each effect which the exigencies of practical life require.
As I was looking up these things about the meanings of "art," I found that authors have been calling for a transition from art to science for at least two centuries. For example, the preface to a textbook on mineralogy, written in 1784, said the following [17]: "Previous to the year 1780, mineralogy, though tolerably understood by many as an Art, could scarce be deemed a Science."

According to most dictionaries "science" means knowledge that has been logically arranged and systematized in the form of general "laws." The advantage of science is that it saves us from the need to think things through in each individual case; we can turn our thoughts to higher-level concepts. As John Ruskin wrote in 1853 [32]: "The work of science is to substitute facts for appearances, and demonstrations for impressions."

It seems to me that if the authors I studied were writing today, they would agree with the following characterization: Science is knowledge which we understand so well that we can teach it to a computer; and if we don't fully understand something, it is an art to deal with it. Since the notion of an algorithm or a computer program provides us with an extremely useful test for the depth of our knowledge about any given subject, the process of going from an art to a science means that we learn how to automate something.

Artificial intelligence has been making significant progress, yet there is a huge gap between what computers can do in the foreseeable future and what ordinary people can do. The mysterious insights that people have when speaking, listening, creating, and even when they are programming, are still beyond the reach of science; nearly everything we do is still an art.

From this standpoint it is certainly desirable to make computer programming a science, and we have indeed come a long way in the 15 years since the publication of the remarks I quoted at the beginning of this talk. Fifteen years ago computer programming was so badly understood that hardly anyone even thought about proving programs correct; we just fiddled with a program until we "knew" it worked. At that time we didn't even know how to express the concept that a program was correct, in any rigorous way. It is only in recent years that we have been learning about the processes of abstraction by which programs are written and understood; and this new knowledge about programming is currently producing great payoffs in practice, even though few programs are actually proved correct with complete rigor, since we are beginning to understand the principles of program structure. The point is that when we write programs today, we know that we could in principle construct formal proofs of their correctness if we really wanted to, now that we understand how such proofs are formulated. This scientific basis is resulting in programs that are significantly more reliable than those we wrote in former days when intuition was the only basis of correctness.

The field of "automatic programming" is one of the major areas of artificial intelligence research today. Its proponents would love to be able to give a lecture entitled "Computer Programming as an Artifact" (meaning that programming has become merely a relic of bygone days), because their aim is to create machines that write programs better than we can, given only the problem specification. Personally I don't think such a goal will ever be completely attained, but I do think that their research is extremely important, because everything we learn about programming helps us to improve our own artistry. In this sense we should continually be striving to transform every art into a science: in the process, we advance the art.

Science and Art

Our discussion indicates that computer programming is by now both a science and an art, and that the two aspects nicely complement each other. Apparently most authors who examine such a question come to this same conclusion, that their subject is both a science and an art, whatever their subject is (cf. [25]). I found a book about elementary photography, written in 1893, which stated that "the development of the photographic image is both an art and a science" [13]. In fact, when I first picked up a dictionary in order to study the words "art" and "science," I happened to glance at the editor's preface, which began by saying, "The making of a dictionary is both a science and an art." The editor of Funk & Wagnall's dictionary [27] observed that the painstaking accumulation and classification of data about words has a scientific character, while a well-chosen phrasing of definitions demands the ability to write with economy and precision: "The science without the art is likely to be ineffective; the art without the science is certain to be inaccurate."

When preparing this talk I looked through the card catalog at Stanford library to see how other people have been using the words "art" and "science" in the titles of their books. This turned out to be quite interesting.

For example, I found two books entitled The Art of Playing the Piano [5, 15], and others called The Science of Pianoforte Technique [10], The Science of Pianoforte Practice [30]. There is also a book called The Art of Piano Playing: A Scientific Approach [22].

Then I found a nice little book entitled The Gentle Art of Mathematics [31], which made me somewhat sad that I can't honestly describe computer programming as a "gentle art." I had known for several years about a book called The Art of Computation, published in San Francisco, 1879, by a man named C. Frusher Howard [14]. This was a book on practical business arithmetic that had sold over 400,000 copies in various editions by 1890. I was amused to read the preface, since it shows that Howard's philosophy and the intent of his title were quite different from mine; he wrote: "A knowledge of the Science of Number is of minor importance; skill in the Art of Reckoning is absolutely indispensible."

Several books mention both science and art in their titles, notably The Science of Being and Art of Living by Maharishi Mahesh Yogi [24]. There is also a book called The Art of Scientific Discovery [11], which analyzes how some of the great discoveries of science were made.

So much for the word "art" in its classical meaning. Actually when I chose the title of my books, I wasn't thinking primarily of art in this sense, I was thinking more of its current connotations. Probably the most interesting book which turned up in my search was a fairly recent work by Robert E. Mueller called The Science of Art [29]. Of all the books I've mentioned, Mueller's comes closest to expressing what I want to make the central theme of my talk today, in terms of real artistry as we now understand the term. He observes: "It was once thought that the imaginative outlook of the artist was death for the scientist. And the logic of science seemed to spell doom to all possible artistic flights of fancy." He goes on to explore the advantages which actually do result from a synthesis of science and art.

A scientific approach is generally characterized by the words logical, systematic, impersonal, calm, rational, while an artistic approach is characterized by the words aesthetic, creative, humanitarian, anxious, irrational. It seems to me that both of these apparently contradictory approaches have great value with respect to computer programming.

Emma Lehmer wrote in 1956 that she had found coding to be "an exacting science as well as an intriguing art" [23]. H.S.M. Coxeter remarked in 1957 that he sometimes felt "more like an artist than a scientist" [7]. This was at the time C.P. Snow was beginning to voice his alarm at the growing polarization between "two cultures" of educated people [34, 35]. He pointed out that we need to combine scientific and artistic values if we are to make real progress.

Works of Art

When I'm sitting in an audience listening to a long lecture, my attention usually starts to wane at about this point in the hour. So I wonder, are you getting a little tired of my harangue about "science" and "art"? I really hope that you'll be able to listen carefully to the rest of this, anyway, because now comes the part about which I feel most deeply.

When I speak about computer programming as an art, I am thinking primarily of it as an art form, in an aesthetic sense. The chief goal of my work as educator and author is to help people learn how to write beautiful programs. It is for this reason I was especially pleased to learn recently [33] that my books actually appear in the Fine Arts Library at Cornell University. (However, the three volumes apparently sit there neatly on the shelf, without being used, so I'm afraid the librarians may have made a mistake by interpreting my title literally.)

My feeling is that when we prepare a program, it can be like composing poetry or music; as Andrei Ershov has said [9], programming can give us both intellectual and emotional satisfaction, because it is a real achievement to master complexity and to establish a system of consistent rules.

Furthermore when we read other people's programs, we can recognize some of them as genuine works of art. I can still remember the great thrill it was for me to read the listing of Stan Poley's SOAP II assembly program in 1958; you probably think I'm crazy, and styles have certainly changed greatly since then, but at the time it meant a great deal to me to see how elegant a system program could be, especially by comparison with the heavy-handed coding found in other listings I had been studying at the same time. The possibility of writing beautiful programs, even in assembly language, is what got me hooked on programming in the first place.

Some programs are elegant, some are exquisite, some are sparkling. My claim is that it is possible to write grand programs, noble programs, truly magnificent ones!

Taste and Style

The idea of style in programming is now coming to the forefront at last, and I hope that most of you have seen the excellent little book on Elements of Programming Style by Kernighan and Plauger [16]. In this connection it is most important for us all to remember that there is no one "best" style; everybody has his own preferences, and it is a mistake to try to force people into an unnatural mold. We often hear the saying, "I don't know anything about art, but I know what I like." The important thing is that you really like the style you are using; it should be the best way you prefer to express yourself.

Edsger Dijkstra stressed this point in the preface to his Short Introduction to the Art of Programming [8]:

It is my purpose to transmit the importance of good taste and style in programming, [but] the specific elements of style presented serve only to illustrate what benefits can be derived from "style" in general. In this respect I feel akin to the teacher of composition at a conservatory: He does not teach his pupils how to compose a particular symphony, he must help his pupils to find their own style and must explain to them what is implied by this. (It has been this analogy that made me talk about "The Art of Programming.")
Now we must ask ourselves, What is good style, and what is bad style? We should not be too rigid about this in judging other people's work. The early nineteenth-century philosopher Jeremy Bentham put it this way [3, Bk. 3, Ch. 1]:
Judges of elegance and taste consider themselves as benefactors to the human race, whilst they are really only the interrupters of their pleasure... There is no taste which deserves the epithet good, unless it be the taste for such employments which, to the pleasure actually produced by them, conjoin some contingent or future utility: there is no taste which deserves to be characterized as bad, unless it be a taste for some occupation which has a mischievous tendency.
When we apply our own prejudices to "reform" someone else's taste, we may be unconsciously denying him some entirely legitimate pleasure. That's why I don't condemn a lot of things programmers do, even though I would never enjoy doing them myself. The important thing is that they are creating something they feel is beautiful.

In the passage I just quoted, Bentham does give us some advice about certain principles of aesthetics which are better than others, namely the "utility" of the result. We have some freedom in setting up our personal standards of beauty, but it is especially nice when the things we regard as beautiful are also regarded by other people as useful. I must confess that I really enjoy writing computer programs; and I especially enjoy writing programs which do the greatest good, in some sense.

There are many senses in which a program can be "good," of course. In the first place, it's especially good to have a program that works correctly. Secondly it is often good to have a program that won't be hard to change, when the time for adaptation arises. Both of these goals are achieved when the program is easily readable and understandable to a person who knows the appropriate language.

Another important way for a production program to be good is for it to interact gracefully with its users, especially when recovering from human errors in the input data. It's a real art to compose meaningful error messages or to design flexible input formats which are not error-prone.

Another important aspect of program quality is the efficiency with which the computer's resources are actually being used. I am sorry to say that many people nowadays are condemning program efficiency, telling us that it is in bad taste. The reason for this is that we are now experiencing a reaction from the time when efficiency was the only reputable criterion of goodness, and programmers in the past have tended to be so preoccupied with efficiency that they have produced needlessly complicated code; the result of this unnecessary complexity has been that net efficiency has gone down, due to difficulties of debugging and maintenance.

The real problem is that programmers have spent far too much time worrying about efficiency in the wrong places and at the wrong times; premature optimization is the root of all evil (or at least most of it) in programming.

We shouldn't be penny wise and pound foolish, nor should we always think of efficiency in terms of so many percent gained or lost in total running time or space. When we buy a car, many of us are almost oblivious to a difference of $50 or $100 in its price, while we might make a special trip to a particular store in order to buy a 50 cent item for only 25 cents. My point is that there is a time and place for efficiency; I have discussed its proper role in my paper on structured programming, which appears in the current issue of Computing Surveys [21].

Less Facilities: More Enjoyment

One rather curious thing I've noticed about aesthetic satisfaction is that our pleasure is significantly enhanced when we accomplish something with limited tools. For example, the program of which I personally am most pleased and proud is a compiler I once wrote for a primitive minicomputer which had only 4096 words of memory, 16 bits per word. It makes a person feel like a real virtuoso to achieve something under such severe restrictions.

A similar phenomenon occurs in many other contexts. For example, people often seem to fall in love with their Volkswagens but rarely with their Lincoln Continentals (which presumably run much better). When I learned programming, it was a popular pastime to do as much as possible with programs that fit on only a single punched card. I suppose it's this same phenomenon that makes APL enthusiasts relish their "one-liners." When we teach programming nowadays, it is a curious fact that we rarely capture the heart of a student for computer science until he has taken a course which allows "hands on" experience with a minicomputer. The use of our large-scale machines with their fancy operating systems and languages doesn't really seem to engender any love for programming, at least not at first.

It's not obvious how to apply this principle to increase programmers' enjoyment of their work. Surely programmers would groan if their manager suddenly announced that the new machine will have only half as much memory as the old. And I don't think anybody, even the most dedicated "programming artists," can be expected to welcome such a prospect, since nobody likes to lose facilities unnecessarily. Another example may help to clarify the situation: Film-makers strongly resisted the introduction of talking pictures in the 1920's because they were justly proud of the way they could convey words without sound. Similarly, a true programming artist might well resent the introduction of more powerful equipment; today's mass storage devices tend to spoil much of the beauty of our old tape sorting methods. But today's film makers don't want to go back to silent films, not because they're lazy but because they know it is quite possible to make beautiful movies using the improved technology. The form of their art has changed, but there is still plenty of room for artistry.

How did they develop their skill? The best film makers through the years usually seem to have learned their art in comparatively primitive circumstances, often in other countries with a limited movie industry. And in recent years the most important things we have been learning about programming seem to have originated with people who did not have access to very large computers. The moral of this story, it seems to me, is that we should make use of the idea of limited resources in our own education. We can all benefit by doing occasional "toy" programs, when artificial restrictions are set up, so that we are forced to push our abilities to the limit. We shouldn't live in the lap of luxury all the time, since that tends to make us lethargic. The art of tackling miniproblems with all our energy will sharpen our talents for the real problems, and the experience will help us to get more pleasure from our accomplishments on less restricted equipment.

In a similar vein, we shouldn't shy away from "art for art's sake"; we shouldn't feel guilty about programs that are just for fun. I once got a great kick out of writing a one-statement ALGOL program that invoked an innerproduct procedure in such an unusual way that it calculated the mth prime number, instead of an innerproduct [19]. Some years ago the students at Stanford were excited about finding the shortest FORTRAN program which prints itself out, in the sense that the program's output is identical to its own source text. The same problem was considered for many other languages. I don't think it was a waste of time for them to work on this; nor would Jeremy Bentham, whom I quoted earlier, deny the "utility" of such pastimes [3, Bk. 3, Ch. 1]. "On the contrary," he wrote, "there is nothing, the utility of which is more incontestable. To what shall the character of utility be ascribed, if not to that which is a source of pleasure?"
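
For readers who have not seen such a self-reproducing program (a "quine"), here is a minimal sketch in Python rather than FORTRAN; the output of the two code lines reproduces those two lines exactly (ignoring the comment and the indentation added for display here):

    # The string is a template for the program itself: %r inserts the
    # string's own repr, and %% becomes a literal percent sign.
    s = 's = %r\nprint(s %% s)'
    print(s % s)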

Providing Beautiful Tools

Another characteristic of modern art is its emphasis on creativity. It seems that many artists these days couldn't care less about creating beautiful things; only the novelty of an idea is important. I'm not recommending that computer programming should be like modern art in this sense, but it does lead me to an observation that I think is important. Sometimes we are assigned to a programming task which is almost hopelessly dull, giving us no outlet whatsoever for any creativity; and at such times a person might well come to me and say, "So programming is beautiful? It's all very well for you to declaim that I should take pleasure in creating elegant and charming programs, but how am I supposed to make this mess into a work of art?"

Well, it's true, not all programming tasks are going to be fun. Consider the "trapped housewife," who has to clean off the same table every day: there's not room for creativity or artistry in every situation. But even in such cases, there is a way to make a big improvement: it is still a pleasure to do routine jobs if we have beautiful things to work with. For example, a person will really enjoy wiping off the dining room table, day after day, if it is a beautifully designed table made from some fine quality hardwood.

Therefore I want to address my closing remarks to the system programmers and the machine designers who produce the systems that the rest of us must work with. Please, give us tools that are a pleasure to use, especially for our routine assignments, instead of providing something we have to fight with. Please, give us tools that encourage us to write better programs, by enhancing our pleasure when we do so.

It's very hard for me to convince college freshmen that programming is beautiful, when the first thing I have to tell them is how to punch "slash slash JOB equals so-and-so." Even job control languages can be designed so that they are a pleasure to use, instead of being strictly functional.

Computer hardware designers can make their machines much more pleasant to use, for example by providing floating-point arithmetic which satisfies simple mathematical laws. The facilities presently available on most machines make the job of rigorous error analysis hopelessly difficult, but properly designed operations would encourage numerical analysts to provide better subroutines which have certified accuracy (cf. [20, p. 204]).
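
A small modern illustration of the point, assuming IEEE-754 double precision as used by Python floats: even addition fails the simple law of associativity, which is part of what makes rigorous error analysis so painful:

    # Rounding makes floating-point addition order-dependent.
    a, b, c = 0.1, 0.2, 0.3
    print((a + b) + c)                   # 0.6000000000000001
    print(a + (b + c))                   # 0.6
    print((a + b) + c == a + (b + c))    # False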

Let's consider also what software designers can do. One of the best ways to keep up the spirits of a system user is to provide routines that he can interact with. We shouldn't make systems too automatic, so that the action always goes on behind the scenes; we ought to give the programmer-user a chance to direct his creativity into useful channels. One thing all programmers have in common is that they enjoy working with machines; so let's keep them in the loop. Some tasks are best done by machine, while others are best done by human insight; and a properly designed system will find the right balance. (I have been trying to avoid misdirected automation for many years, cf. [18].)

Program measurement tools make a good case in point. For years, programmers have been unaware of how the real costs of computing are distributed in their programs. Experience indicates that nearly everybody has the wrong idea about the real bottlenecks in his programs; it is no wonder that attempts at efficiency go awry so often, when a programmer is never given a breakdown of costs according to the lines of code he has written. His job is something like that of a newly married couple who try to plan a balanced budget without knowing how much the individual items like food, shelter, and clothing will cost. All that we have been giving programmers is an optimizing compiler, which mysteriously does something to the programs it translates but which never explains what it does. Fortunately we are now finally seeing the appearance of systems which give the user credit for some intelligence; they automatically provide instrumentation of programs and appropriate feedback about the real costs. These experimental systems have been a huge success, because they produce measurable improvements, and especially because they are fun to use, so I am confident that it is only a matter of time before the use of such systems is standard operating procedure. My paper in Computing Surveys [21] discusses this further, and presents some ideas for other ways in which an appropriate interactive routine can enhance the satisfaction of user programmers.
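
A minimal sketch of the kind of instrumentation Knuth describes, using Python's standard cProfile module; the two functions are invented for illustration:

    import cProfile

    def slow_sum(n):
        # Accumulate in an explicit loop.
        total = 0
        for i in range(n):
            total += i * i
        return total

    def fast_sum(n):
        # Same result via a built-in and a generator expression.
        return sum(i * i for i in range(n))

    def main():
        slow_sum(10**6)
        fast_sum(10**6)

    # Prints per-function call counts and times: a breakdown of where
    # the real costs are, instead of leaving the programmer to guess.
    cProfile.run('main()')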

Language designers also have an obligation to provide languages that encourage good style, since we all know that style is strongly influenced by the language in which it is expressed. The present surge of interest in structured programming has revealed that none of our existing languages is really ideal for dealing with program and data structure, nor is it clear what an ideal language should be. Therefore I look forward to many careful experiments in language design during the next few years.

Summary

To summarize: We have seen that computer programming is an art, because it applies accumulated knowledge to the world, because it requires skill and ingenuity, and especially because it produces objects of beauty. A programmer who subconsciously views himself as an artist will enjoy what he does and will do it better. Therefore we can be glad that people who lecture at computer conferences speak about the state of the Art.

References

1. Bailey, Nathan. The Universal Etymological English Dictionary. T. Cox, London, 1727. See "Art," "Liberal," and "Science."

2. Bauer, Walter F., Juncosa, Mario L., and Perlis, Alan J. ACM publication policies and plans. J. ACM 6 (Apr. 1959), 121-122.

3. Bentham, Jeremy. The Rationale of Reward. Trans. from Théorie des peines et des récompenses, 1811, by Richard Smith, J. & H. L. Hunt, London, 1825.

4. The Century Dictionary and Cyclopedia 1. The Century Co., New York, 1889.

5. Clementi, Muzio. The Art of Playing the Piano. Trans. from L'art de jouer le pianoforte by Max Vogrich. Schirmer, New York, 1898.

6. Colvin, Sidney. "Art." Encyclopaedia Britannica, eds 9, 11, 12, 13, 1875-1926.

7. Coxeter, H. S. M. Convocation address, Proc. 4th Canadian Math. Congress, 1957, pp. 8-10.

8. Dijkstra, Edsger W. EWD316: A Short Introduction to the Art of Programming. T. H. Eindhoven, The Netherlands, Aug. 1971.

9. Ershov, A. P. Aesthetics and the human factor in programming. Comm. ACM 15 (July 1972), 501-505.

10. Fielden, Thomas. The Science of Pianoforte Technique. Macmillan, London, 1927.

11. Gore, George. The Art of Scientific Discovery. Longmans, Green, London, 1878.

12. Hamilton, William. Lectures on Logic 1. Wm. Blackwood, Edinburgh, 1874.

13. Hodges, John A. Elementary Photography: The "Amateur Photographer" Library 7. London, 1893. Sixth ed, revised and enlarged, 1907, p. 58.

14. Howard, C. Frusher. Howard's Art of Computation and golden rule for equation of payments for schools, business colleges and self-culture .... C.F. Howard, San Francisco, 1879.

15. Hummel, J.N. The Art of Playing the Piano Forte. Boosey, London, 1827.

16. Kernighan, B.W., and Plauger, P.J. The Elements of Programming Style. McGraw-Hill, New York, 1974.

17. Kirwan, Richard. Elements of Mineralogy. Elmsly, London, 1784.

18. Knuth, Donald E. Minimizing drum latency time. J. ACM 8 (Apr. 1961), 119-150.

19. Knuth, Donald E., and Merner, J.N. ALGOL 60 confidential. Comm. ACM 4 (June 1961), 268-272.

20. Knuth, Donald E. Seminumerical Algorithms: The Art of Computer Programming 2. Addison-Wesley, Reading, Mass., 1969.

21. Knuth, Donald E. Structured programming with go to statements. Computing Surveys 6 (Dec. 1974), pages in makeup.

22. Kochevitsky, George. The Art of Piano Playing: A Scientific Approach. Summy-Birchard, Evanston, Ill., 1967.

23. Lehmer, Emma. Number theory on the SWAC. Proc. Symp. Applied Math. 6, Amer. Math. Soc. (1956), 103-108.

24. Mahesh Yogi, Maharishi. The Science of Being and Art of Living. Allen & Unwin, London, 1963.

25. Malevinsky, Moses L. The Science of Playwriting. Brentano's, New York, 1925.

26. Manna, Zohar, and Pnueli, Amir. Formalization of properties of functional programs. J. ACM 17 (July 1970), 555-569.

27. Marckwardt, Albert H. Preface to Funk & Wagnall's Standard College Dictionary. Harcourt, Brace & World, New York, 1963, vii.

28. Mill, John Stuart. A System of Logic, Ratiocinative and Inductive. London, 1843. The quotations are from the introduction, §2, and from Book 6, Chap. 11 (12 in later editions), §5.

29. Mueller, Robert E. The Science of Art. John Day, New York, 1967.

30. Parsons, Albert Ross. The Science of Pianoforte Practice. Schirmer, New York, 1886.

31. Pedoe, Daniel. The Gentle Art of Mathematics. English U. Press, London, 1953.

32. Ruskin, John. The Stones of Venice 3. London, 1853.

33. Salton, G.A. Personal communication, June 21, 1974.

34. Snow, C.P. The two cultures. The New Statesman and Nation 52 (Oct. 6, 1956), 413-414.

35. Snow, C.P. The Two Cultures: and a Second Look. Cambridge University Press, 1964.

Copyright 1974, Association for Computing Machinery, Inc. General permission to republish, but not for profit, all or part of this material is granted provided that ACM's copyright notice is given and that reference is made to the publication, to its date of issue, and to the fact that reprinting privileges were granted by permission of the Association for Computing Machinery.

[May 17, 2017] Who really gives a toss if it's agile or not

Notable quotes:
"... According to sources, hundreds of developers were employed on the programme at huge day rates, with large groups of so-called agile experts overseeing the various aspects of the programme. ..."
"... I have also worked on agile for UK gov projects a few years back when it was mandated for all new projects and I was at first dead keen. However, it quickly become obvious that the lack of requirements, specifications etc made testing a living nightmare. Changes asked for by the customer were grafted onto what become baroque mass of code. I can't see how Agile is a good idea except for the smallest trivial projects. ..."
"... The question is - is that for software that's still in development or software that's deployed in production? If it's the latter and your "something" just changes its data format you're going to be very unpopular with your users. And that's just for ordinary files. If it requires frequent re-orgs of an RDBMS then you'd be advised to not go near any dark alley where your DBA might be lurking. ..."
"... Software works on data. If you can't get the design of that right early you're going to be carrying a lot of technical debt in terms of backward compatibility or you're going to impose serious costs on your users for repeatedly bringing existing data up to date. ..."
"... At this point, courtesy of Exxxxtr3333me Programming and its spawn, 'agile' just means 'we don't want to do any design, we don't want to do any documentation, and we don't want to do any acceptance testing because all that stuff is annoying.' Everything is 'agile', because that's the best case for terrible lazy programmers, even if they're using a completely different methodology. ..."
"... It's like any exciting new methodology : same shit, different name. In this case, one that allows you to pretend the tiny attention-span of a panicking project manager is a good thing. ..."
"... Process is not a panacea or a crutch or a silver bullet. Methodologies only work as well as the people using it. Any methodology can be distorted to give the answer that upper management wants (instead of reality) ..."
"... under the guise of " agile ?". I'm no expert in project management, but I'm pretty sure it isn't supposed to be making it up as you go along, and constantly changing the specs and architecture. ..."
"... So why should the developers have all the fun? Why can't the designers and architects be "agile", too? Isn't constantly changing stuff all part of the "agile" way? ..."
May 17, 2017 | theregister.co.uk
Comment "It doesn't matter whether a cat is white or black, as long as it catches mice," according to Chinese revolutionary Deng Xiaoping.

While Deng wasn't referring to anything nearly as banal as IT projects (he was of course talking about the fact it doesn't matter whether a person is a revolutionary or not, as long as he or she is efficient and capable), the same principle could apply.

A fixation on the suppliers, technology or processes ultimately doesn't matter. It's the outcomes, stupid. That might seem like a blindingly obvious point, but it's one worth repeating.

Or as someone else put it to me recently in reference to the huge overspend on a key UK programme behind courts digitisation which we recently revealed: "Who gives a toss if it's agile or not? It just needs to work."

If you're going to do it do it right

I'm not dismissing the benefits of this particular methodology, but in the case of the Common Platform Programme, it feels like the misapplication of agile was worse than not doing it at all.

Just to recap: the CPP was signed off around 2013, with the intention of creating a unified platform across the criminal justice system to allow the Crown Prosecution Service and courts to more effectively manage cases.

By cutting out duplication of systems, it was hoped to save buckets of cash and make the process of case management across the criminal justice system far more efficient.

Unlike the old projects of the past, this was a great example of the government taking control and doing it themselves. Everything was going to be delivered ahead of time and under budget. Trebles all round!

But as Lucy Liu's O-Ren Ishii told Uma Thurman's character in Kill Bill: "You didn't think it was gonna be that easy, did you?... Silly rabbit."

According to sources, alarm bells were soon raised over the project's self-styled "innovative use of agile development principles". It emerged that the programme was spending an awful lot of money for very little return. Attempts to shut it down were themselves shut down.

The programme carried on at full steam and by 2014 it was ramping up at scale. According to sources, hundreds of developers were employed on the programme at huge day rates, with large groups of so-called agile experts overseeing the various aspects of the programme.

CPP cops a plea

Four years since it was first signed off and what are the things we can point to from the CPP? An online make-a-plea programme which allows people to plead guilty or not guilty to traffic offences; a digital markup tool for legal advisors to record case results in court, which is being tested by magistrates courts in Essex; and the Magistrates Rota.

Multiple insiders have said that the rest of what we have to show for hundreds of millions of taxpayers' cash is essentially vapourware. When programme director Loveday Ryder described the project as a "once-in-a-lifetime opportunity" to modernise the criminal justice system, it wasn't clear then that she meant the programme would itself take an actual lifetime.

Of course the definition of agile is that you are able to move quickly and easily. So some might point to the outcomes of this programme as proof that it was never really about that.

One source remarked that it really doesn't matter if you call something agile or not, "If you can replace agile with constantly talking and communicating then fine, call it agile." He also added: "This was one of the most waterfall programmes in government I've seen."

What is most worrying about this programme is it may not be an isolated example. Other organisations and departments may well be doing similar things under the guise of "agile". I'm no expert in project management, but I'm pretty sure it isn't supposed to be making it up as you go along, and constantly changing the specs and architecture.

Ultimately who cares if a programme is run via a system integrator, multiple SMEs, uses a DevOps methodology, is built in-house or deployed using off-the-shelf, as long as it delivers good value. No doubt there are good reasons for using any of those approaches in a number of different circumstances.

Government still spends an outrageous amount of money on IT, upwards of £16bn a year. So as taxpayers it's a simple case of wanting them to "show me the money". Or to misquote Deng, at least show us some more dead mice. ®

Prst. V.Jeltz

Re: 'What's Real and What's for Sale'...

So agile means "constantly adapating " ? read constantly bouncing from one fuckup to the next , paddling like hell to keep up , constantly firefighting whilst going down slowly like the titanic?

thats how i read it

Dogbowl
Re: 'What's Real and What's for Sale'...

Ha! About 21 years back, working at Racal in Bracknell on a military radio project, we had a 'round-trip-OMT' CASE tool that did just that. It even generated documentation from the code, so as you added classes and methods the CASE tool generated the design document. Also, the nightly build, if it failed, would email the code author.

I have also worked on agile for UK gov projects a few years back when it was mandated for all new projects and I was at first dead keen. However, it quickly became obvious that the lack of requirements, specifications, etc. made testing a living nightmare. Changes asked for by the customer were grafted onto what became a baroque mass of code. I can't see how Agile is a good idea except for the smallest trivial projects.

PatientOne
Re: 'What's Real and What's for Sale'...

"Technically 'agile' just means you produce working versions frequently and iterate on that."

It's more to do with priorities: On time, on budget, to specification: Put these in the order of which you will surrender if the project hits problems.

Agile focuses on On time. What is delivered is hopefully to specification, and within budget, but one or both of those could be surrendered in order to get something out on time. It's just project management 101 with a catchy name, and in poorly managed 'agile' developments you find padding to fit the usual 60/30/10 rule. Then management discards the padding and insists the project can be completed in the reduced time as a result, thereby breaking the rules of 'agile' development (insisting it's on spec, under time and under budget, but it's still 'agile'...).

Doctor Syntax
Re: 'What's Real and What's for Sale'...

"Usually I check something(s) in every day, for the most major things it may take a week, but the goal is always to get it in and working so it can be tested."

The question is - is that for software that's still in development or software that's deployed in production? If it's the latter and your "something" just changes its data format you're going to be very unpopular with your users. And that's just for ordinary files. If it requires frequent re-orgs of an RDBMS then you'd be advised to not go near any dark alley where your DBA might be lurking.

Software works on data. If you can't get the design of that right early you're going to be carrying a lot of technical debt in terms of backward compatibility or you're going to impose serious costs on your users for repeatedly bringing existing data up to date.
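
One common way to carry that backward compatibility is to version-stamp the data and upgrade old documents on read; a minimal sketch in Python, with field names and version numbers invented for illustration:

    import json

    CURRENT_VERSION = 2

    def upgrade_v1(doc):
        # v1 stored a single "name" field; v2 splits it in two.
        first, _, last = doc.pop("name").partition(" ")
        doc["first_name"], doc["last_name"] = first, last
        doc["version"] = 2
        return doc

    UPGRADES = {1: upgrade_v1}

    def load_document(text):
        doc = json.loads(text)
        # Walk old documents forward one version at a time.
        while doc.get("version", 1) < CURRENT_VERSION:
            doc = UPGRADES[doc.get("version", 1)](doc)
        return doc

    print(load_document('{"version": 1, "name": "Ada Lovelace"}'))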

Doctor Syntax
Re: 'What's Real and What's for Sale'...

"On time, on budget, to specification: Put these in the order of which you will surrender if the project hits problems."

In the real world it's more likely to be a trade-off of how much of each to surrender.

FozzyBear
Re: 'What's Real and What's for Sale'...

I was told in my earlier years by a developer:

For any project you can have it:

  1. Cheap (on budget)
  2. Good (on spec)
  3. Quick (on time)

Pick two of the three and only two. It doesn't matter which way you pick; you're fucked on the third. Doesn't matter about methodology, doesn't matter about requirements or project manglement. You are screwed on the third, and the great news is that the level of the reaming you get scales with the size of the project.

After almost 20 years in the industry this has held true.

Dagg
Re: 'What's Real and What's for Sale'...

Technically 'agile' just means you produce working versions frequently and iterate on that.

No, technically agile means having no clue as to what is required and evolving the requirements as you build. All well and good if you have a dicky little web site, but if you are on a very or extremely large project with a fixed time frame and a fixed budget, you are royally screwed trying to use agile, as there is no way you can control scope.

Hell, under agile no one has any idea what the scope is!

Archtech
Re: Government still spends an outrageous amount of money on IT

I hope you were joking. If not, try reading the classic book "The Mythical Man-Month".

oldtaku
'Agile' means nothing at this point. Unless it means terrible software.

At this point, courtesy of Exxxxtr3333me Programming and its spawn, 'agile' just means 'we don't want to do any design, we don't want to do any documentation, and we don't want to do any acceptance testing because all that stuff is annoying.' Everything is 'agile', because that's the best case for terrible lazy programmers, even if they're using a completely different methodology.

I firmly believe in the basics of 'iterate working versions as often as possible'. But why sell ourselves short by calling it agile when we actually design it, document it, and use testing beyond unit tests?

Yes, yes, you can tell me what 'agile' technically means, and I know that design and documentation and QA are not excluded, but in practice even the most waterfall of waterfall call themselves agile (like Kat says), and from hard experience people who really push 'agile agile agile' as their thing are the worst of the worst terrible coders who just slam crap together with all the finesse and thoughtfulness of a Bangalore outsourcer.

Adrian 4
It's like any exciting new methodology : same shit, different name. In this case, one that allows you to pretend the tiny attention-span of a panicking project manager is a good thing.

When someone shows me they've learned the lessons of Brooks's tar pit, I'll be interested to see how they did it. Until then, it's all talk.

jamie m
25% Agile:

Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.

Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.

Working software is the primary measure of progress.

kmac499
Re: 25% Agile:

From Jamie M

Working software is the primary measure of progress.

Brilliant few-word summary which should be scrawled on the wall of every IT project manager's office in foot-high letters.

I've lived through SSADM, RAD, DSDM, Waterfall, Boehm Spirals, Extreme Programming, and probably a few others.

They are ALL variations on a theme. The only thing they have in common is the successful ones left a bunch of 0's and 1's humming away in a lump of silicon doing something useful.

Doctor Syntax
Re: 25% Agile:

"Working software is the primary measure of progress."

What about training the existing users, having it properly documented for the users of the future, briefing support staff, having proper software documentation (or at least self-documenting code) for those who will have to maintain it, and ensuring it doesn't disrupt the data from the previous release? Or do we just throw code over the fence and wave goodbye to it?

Charlie Clark
Re: Limits of pragmatism

In France "naviguer à vue" is pejorative.

Software development in France* which also gave us ah yes, it may work in practice but does it work in theory .

The story is about the highly unusual cost overrun of a government project. Never happened dans l'héxagone ? Because it seems to happy pretty much everywhere else with relentless monotony because politicians are fucking awful project managers.

* FWIW I have a French qualification.

Anonymous Coward

Agile only works if all stakeholders agree on an outcome

For a project that involves a huge change in how your organisation operates, it is unlikely that you will be able to deliver, at least for the initial parts of the project, in an agile way. Once outcomes are known at a high level, stakeholders have something to cling onto when they are asked what they need, if indeed they exist yet (trying to ask for requirements from a stakeholder that doesn't exist yet is tough).

Different methods have their own issues, but in this case I would have expected failure to be reasonably predictable.

You won't have much to show for it, and rightly so: they shouldn't, at least, have started coding to a business model that itself needs defining. This is predictable, and overall it means that no one agrees what the business should look like, let alone how a vendor-delivered software solution should support it.

I have a limited amount of sympathy for the provider, as much of this will have been beyond their control (limited, as they are an expensive government provider after all).

This is a disaster caused by the poor management in UKGOV, and the vendor should have dropped out and run well before this.

Anonymous Coward

I'm a one-man, self-employed business - I do some very complex sites - but if they don't work I don't get paid. If they spew ugly bugs I get panicked emails from unhappy clients.

So I test after each update, comment my code, and add features to make my life easy when it comes to debugging. Woo, it even sends me emails for some bugs.

I'm in agreement with the guy above - a dozen devs, a page layout designer or two, some databases. One manager to co-ordinate and no bloody jargon.

There's a MASSIVE efficiency to small teams but all the members need to be on top of their game.

Doctor Syntax
"I'm in agreement with the guy above - a dozen devs, a page layout designer or two, some databases. One manager to co-ordinate and no bloody jargon."

Don't forget a well-defined, soluble problem. That's in your case, where you're paid by results. If you're paid by billable hours it's a positive disadvantage.

Munchausen's proxy
Agile Expertise?

" I'm no expert in project management, but I'm pretty sure it isn't supposed to be making it up as you go along, and constantly changing the specs and architecture."

I'm no expert either, but I honestly thought that was quite literally the definition of agile. (maybe disguised with bafflegab, but semantically equivalent)

Zippy's Sausage Factory
Sounds like what I have said for a while...

The "strategy boutiques" saw "agile" becoming popular and they now use it as a buzzword.

These days, I put it in my "considered harmful" bucket, along with GOTO, teaching people to program using BASIC, and "upgrading" to Office 2016*.

* Excel, in particular.

a_yank_lurker
Buzzword Bingo

All too often sound development ideas are perverted. The concepts are sound, but the mistake is to view each as the perfect panacea that will produce bug-free, working code. Each has its purpose and scope of effectiveness. What should be understood and applied is not a precise cookbook method but principles: Agile focuses on communication between groups and on ensuring all are on the same page. Others focus more on low-level development (test-driven development, e.g.), but one can lose sight of the goal, which is to use an appropriate tool set to make sure quality code is produced. Again: is the code being tested, are the tests correct, are junior developers being mentored, are developers working together appropriately for the nature of the project? These are the issues to be addressed, not the precise formalism of the insultants.

Uncle Bob Martin has noted that one of the problems the formalisms try to address is the large number of junior developers who need proper mentoring, training, etc. in real-world situations. He noted that in the old days many IT pros were mid-level professionals who wandered over to IT, and many of the formalisms so beloved by the insultants were things they did naturally. Cross-functional team meetings - check; mentoring - check; appropriate tests - check; etc. These are professional procedures common to other fields and were ingrained mindsets and habits.

Doctor Syntax
It's worth remembering that it's the disasters that make the news. I've worked on a number of public sector projects which were successful. After a few years of operation, however, the contract period was up and the whole service put out to re-tender.* At that point someone else gets the contract so the original work on which the successful delivery was based got scrapped.

* With some very odd results, it has to be said, but that's a different story.

goldcd

...My personal niggle is that a team has "velocity" rather than "speed" - and that seems to be a somewhat deliberate and disingenuous selection. The team should have speed, the project ultimately a measurable velocity calculated by working out how much of the speed was wasted in the wrong/right direction.

Anyway, off to get my beauty sleep, so I can feed the backlog tomorrow with anything within my reach.

Notas Badoff
Re: I like agile

Wanted to give you an up-vote as "velocity" vs. "speed" is exactly the sleight of hand that infuriates me. We do want eventual progress achieved, as in "distance towards goal in mind", right?

Unfortunately, my reading of the definitions and checking around leads me to think that you've got the words 'speed' and 'velocity' reversed above. Where's that nit-picker's icon...
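
The distinction the two commenters are circling is easy to make concrete; in the toy sketch below (waypoints invented), a team covers a lot of ground while making less net progress toward the goal - speed is path length per unit time, velocity is net displacement per unit time:

    import math

    # Waypoints the project actually visited; the goal lies along +x.
    path = [(0, 0), (3, 4), (6, 0)]
    elapsed = 2.0  # say, two sprints

    # Speed: total distance walked per unit time (effort expended).
    distance = sum(math.dist(p, q) for p, q in zip(path, path[1:]))
    # Velocity: net displacement per unit time (progress made).
    net = math.dist(path[0], path[-1])

    print("speed   :", distance / elapsed)   # 5.0
    print("velocity:", net / elapsed)        # 3.0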

bfwebster
I literally less than an hour ago gave my CS 428 ("Software Engineering") class here at Brigham Young University (Provo, Utah, USA) my final lecture for the semester, which included this slide:

Process is not a panacea or a crutch or a silver bullet. Methodologies only work as well as the people using them. Any methodology can be distorted to give the answer that upper management wants (instead of reality).

When adopting a new methodology:

Understand the strengths and weaknesses of a given methodology before starting a project with it. Also, make sure a majority of team members have successfully completed a real-world project using that methodology.

Pete 2
Save some fun for us!

> under the guise of "agile". I'm no expert in project management, but I'm pretty sure it isn't supposed to be making it up as you go along, and constantly changing the specs and architecture.

So why should the developers have all the fun? Why can't the designers and architects be "agile", too? Isn't constantly changing stuff all part of the "agile" way?

[May 05, 2017] William Binney - The Government is Profiling You (The NSA is Spying on You)

Very interesting discussion of how the project of mass surveillance of internet traffic started and what the major challenges were. That's probably where the idea originated of collecting "envelopes" and correlating them to create a social network, similar to what was done during the Civil War.
The idea of using this to prevent corruption of the medical establishment and Medicare fraud is also very interesting.
Notable quotes:
"... I suspect that it's hopelessly unlikely for honest people to complete the Police Academy; somewhere early on the good cops are weeded out and cannot complete training unless they compromise their integrity. ..."
"... 500 Years of History Shows that Mass Spying Is Always Aimed at Crushing Dissent It's Never to Protect Us From Bad Guys No matter which government conducts mass surveillance, they also do it to crush dissent, and then give a false rationale for why they're doing it. ..."
"... People are so worried about NSA don't be fooled that private companies are doing the same thing. ..."
"... In communism the people learned quick they were being watched. The reaction was not to go to protest. ..."
"... Just not be productive and work the system and not listen to their crap. this is all that was required to bring them down. watching people, arresting does not do shit for their cause ..."
Apr 20, 2017 | www.youtube.com
Chad 2 years ago

"People who believe in these rights very much are forced into compromising their integrity"

I suspect that it's hopelessly unlikely for honest people to complete the Police Academy; somewhere early on the good cops are weeded out and cannot complete training unless they compromise their integrity.

Agent76 1 year ago (edited)
January 9, 2014

500 Years of History Shows that Mass Spying Is Always Aimed at Crushing Dissent - It's Never to Protect Us From Bad Guys. No matter which government conducts mass surveillance, they also do it to crush dissent, and then give a false rationale for why they're doing it.

http://www.washingtonsblog.com/2014/01/government-spying-citizens-always-focuses-crushing-dissent-keeping-us-safe.html

Homa Monfared 7 months ago

I am wondering how much damage your spying did to foreign countries, I am wondering how you changed regimes around the world, how many refugees you helped to create around the world.

Don Kantner, 2 weeks ago

People are so worried about the NSA; don't be fooled, private companies are doing the same thing. Plus, the truth is if the NSA wasn't watching, any fool with a computer could potentially cause a worldwide economic crisis.

Bettor in Vegas 1 year ago

In communism the people learned quickly that they were being watched. The reaction was not to go out and protest.

Just not be productive, work the system, and not listen to their crap. This is all that was required to bring them down. Watching people and arresting them does not do shit for their cause...

[Apr 18, 2017] Learning to Love Intelligent Machines

Notable quotes:
"... Learning to Love Intelligent Machines ..."
Apr 18, 2017 | www.nakedcapitalism.com
MoiAussie , April 17, 2017 at 9:04 am

If anyone is struggling to access Learning to Love Intelligent Machines (WSJ), you can get to it by clicking through this post. YMMV.

MyLessThanPrimeBeef , April 17, 2017 at 11:26 am

Also, don't forget to Learn from your Love Machines.

Artificial Love + Artificial Intelligence = Artificial Utopia.

[Apr 17, 2017] How many articles have I read that state as fact that the problem is REALLY automation?

Notable quotes:
"... It isn't. It's the world's biggest, most advanced cloud-computing company with an online retail storefront stuck between you and it. In 2005-2006 it was already selling supercomputing capability for cents on the dollar - way ahead of Google and Microsoft and IBM. ..."
"... Do you really think the internet created Amazon, Snapchat, Facebook, etc? No, the internet was just a tool to be used. The people who created those businesses would have used any tool they had access to at the time because their original goal was not automation or innovation, it was only to get rich. ..."
"... "Disruptive parasitic intermediation" is superb, thanks. The entire phrase should appear automatically whenever "disruption"/"disruptive" or "innovation"/"innovative" is used in a laudatory sense. ..."
"... >that people have a much bigger aversion to loss than gain. ..."
"... As the rich became uber rich, they hid the money in tax havens. As for globalization, this has less to do these days with technological innovation and more to do with economic exploitation. ..."
Apr 17, 2017 | www.nakedcapitalism.com
Carla , April 17, 2017 at 9:25 am

"how many articles have I read that state as fact that the problem is REALLY automation?

NO, the real problem is that the plutocrats control the policies "

+1

justanotherprogressive , April 17, 2017 at 11:45 am

+100 to your comment. There is a decided attempt by the plutocrats to get us to focus our anger on automation and not on the people, like themselves, who control the automation...

MoiAussie , April 17, 2017 at 12:10 pm

Plutocrats control much automation, but so do thousands of wannabe plutocrats whose expertise lets them come from nowhere to billionairehood in a few short years by using it to create some novel, disruptive parasitic intermediation that makes their fortune. The "sharing economy" relies on automation. As does Amazon, Snapchat, Facebook, Dropbox, Pinterest, ...

It's not a stretch to say that automation creates new plutocrats. So blame the individuals, or blame the phenomenon, or both, whatever works for you.

Carolinian , April 17, 2017 at 12:23 pm

So John D. Rockefeller and Andrew Carnegie weren't plutocrats - or were somehow better plutocrats?

Blame not individuals or phenomena but society and the public and elites who shape it. Our social structure is also a kind of machine and perhaps the most imperfectly designed of all of them. My own view is that the people who fear machines are the people who don't like or understand machines. Tools, and the use of them, are an essential part of being human.

MoiAussie , April 17, 2017 at 9:21 pm

Huh? If I wrote "careless campers create forest fires", would you actually think I meant "careless campers create all forest fires"?

Carolinian , April 17, 2017 at 10:23 pm

I'm replying to your upthread comment which seems to say today's careless campers and the technology they rely on are somehow different from those other figures we know so well from history. In fact all technology is tremendously disruptive but somehow things have a way of sorting themselves out. So - just to repeat - the thing is not to "blame" the individuals or the automation but to get to work on the sorting. People like Jeff Bezos with his very flaky business model could be little more than a blip.

a different chris , April 17, 2017 at 12:24 pm

>Amazon, Snapchat, Facebook, Dropbox, Pinterest

Automation? Those companies? I guess Amazon automates ordering - not exactly R. Daneel Olivaw, for sure. If some poor Asian girl doesn't make the boots or some Agri giant doesn't make the flour, Amazon isn't sending you nothin', and the other companies are even more useless.

Mark P. , April 17, 2017 at 2:45 pm

'Automation? Those companies? I guess Amazon automates ordering - not exactly R. Daneel Olivaw, for sure.'

Um. Amazon is highly deceptive, in that most people think it's a giant online retail store.

It isn't. It's the world's biggest, most advanced cloud-computing company with an online retail storefront stuck between you and it. In 2005-2006 it was already selling supercomputing capability for cents on the dollar - way ahead of Google and Microsoft and IBM.

justanotherprogressive , April 17, 2017 at 12:32 pm

Do you really think the internet created Amazon, Snapchat, Facebook, etc? No, the internet was just a tool to be used. The people who created those businesses would have used any tool they had access to at the time because their original goal was not automation or innovation, it was only to get rich.

Let me remind you of Thomas Edison. If he had lived 100 years later, he would have used computers instead of electricity to make his fortune. (In contrast, Nikola Tesla and George Westinghouse used electricity to be innovative, NOT to get rich.) It isn't the tool that is used, it is the mindset of the people who use the tool.

clinical wasteman , April 17, 2017 at 2:30 pm

"Disruptive parasitic intermediation" is superb, thanks. The entire phrase should appear automatically whenever "disruption"/"disruptive" or "innovation"/"innovative" is used in a laudatory sense.

100% agreement with your first point in this thread, too. That short comment should stand as a sort of epigraph/reference for all future discussion of these things.

No disagreement on the point about actual and wannabe plutocrats either, but perhaps it's worth emphasising that it's not just a matter of a few successful (and many failed) personal get-rich-quick schemes, real as those are: the potential of 'universal machines' tends to be released in the form of parasitic intermediation because, for the time being at least, it's released into a world subject to the 'demands' of capital, and at a (decades-long) moment of crisis for the traditional model of capital accumulation. 'Universal' potential is set free to seek rents and maybe to do a bit of police work on the side, if the two can even be separated.

The writer of this article from 2010 [ http://www.metamute.org/editorial/articles/artificial-scarcity-world-overproduction-escape-isnt ] surely wouldn't want it to be taken as conclusive, but it's a good example of one marginal train of serious thought about all of the above. See also 'On Africa and Self-Reproducing Automata' written by George Caffentzis 20 years or so earlier [https://libcom.org/library/george-caffentzis-letters-blood-fire]; apologies for link to entire (free, downloadable) book, but my crumbling print copy of the single essay stubbornly resists uploading.

DH , April 17, 2017 at 9:48 am

Unfortunately, the healthcare insurance debate has been simply a battle between competing ideologies. I don't think Americans understand the key role that universal healthcare coverage plays in creating resilient economies.

Before penicillin, heart surgery, cancer cures, modern obstetrics, etc., it didn't matter if you were rich or poor if you got sick. There was a good chance you would die in either case, which was a key reason that the average life span was short.

In the mid-20th century that began to change, so now lifespan is as much about income as anything else. It is well known that people have a much bigger aversion to loss than gain. So if you currently have healthcare insurance through a job, then you don't want to lose it by taking a risk to do something where you are no longer covered.

People are moving less to find work - why would you uproot your family to work for a company that is just as likely to lay you off in two years in a place you have no roots? People are less likely today to quit jobs to start a new business - that is a big gamble today because you not only have to keep a roof over your head and put food on the table, but you also have to cover the even bigger cost of healthcare insurance in the individual market, or you run a much greater risk of not making it to your 65th birthday.

In countries like Canada, healthcare coverage is barely a discussion point if somebody is looking to move, change jobs, or start a small business.

If I had a choice today between universal basic income vs. universal healthcare coverage, I would choose the healthcare coverage from a societal standpoint. That is simply insuring a risk and can allow people much greater freedom during their working lives. Similarly, Social Security is of similar importance because it provides basic protection against disability and not starving in the cold in your old age. These are vastly different incentive systems than paying people money to live on even if they are not working.

Our ideological debates should be factoring these types of ideas into the discussion instead of just being a food fight.

a different chris , April 17, 2017 at 12:28 pm

>that people have a much bigger aversion to loss than gain.

Yeah well if the downside is that you're dead this starts to make sense.

>instead of just being a food fight.

The thing is that the Powers-That-Be want it to be a food fight, as that is a great stalling tactic at worst and a complete diversion at best. Good post, btw.

Altandmain , April 17, 2017 at 12:36 pm

As the rich became uber rich, they hid the money in tax havens. As for globalization, this has less to do these days with technological innovation and more to do with economic exploitation.

I will note that Germany, Japan, South Korea, and a few other nations have not bought into this madness and have retained a good chunk of their manufacturing sectors.

Mark P. , April 17, 2017 at 3:26 pm

'As for globalization, this has less to do these days with technological innovation and more to do with economic exploitation.'

Economic exploiters are always with us. You're underrating the role of a specific technological innovation. Globalization as we now know it really became feasible in the late 1980s with the spread of instant global electronic networks, mostly via the fiber-optic cables through which everything - telephony, Internet, etc. - travels in Internet packet mode.

That's the point at which capital could really start moving instantly around the world, and companies could really begin to run global supply chains and workforces. That's the point when shifts of workers in facilities in Bangalore or Beijing could start their workdays as shifts of workers in the U.S. were ending theirs, and companies could outsource and offshore their whole operations.

[Apr 15, 2017] IMF claims that technology and global integration explain close to 75 percent of the decline in labor shares in Germany and Italy, and close to 50 percent in the United States.

Anything that the IMF claims should be taken with a grain of salt. The IMF is a quintessential neoliberal institution that will support neoliberalism to the bitter end.
Apr 15, 2017 | economistsview.typepad.com

point, April 14, 2017 at 05:06 AM

https://blogs.imf.org/2017/04/12/drivers-of-declining-labor-share-of-income/

"In advanced economies, about half of the decline in labor shares can be traced to the impact of technology."

Searching, searching for the policy variable in the regression.

anne -> point... , April 14, 2017 at 08:09 AM
https://blogs.imf.org/2017/04/12/drivers-of-declining-labor-share-of-income/

April 12, 2017

Drivers of Declining Labor Share of Income
By Mai Chi Dao, Mitali Das, Zsoka Koczan, and Weicheng Lian

Technology: a key driver in advanced economies

In advanced economies, about half of the decline in labor shares can be traced to the impact of technology. The decline was driven by a combination of rapid progress in information and telecommunication technology, and a high share of occupations that could easily be automated.

Global integration - as captured by trends in final goods trade, participation in global value chains, and foreign direct investment - also played a role. Its contribution is estimated at about half that of technology. Because participation in global value chains typically implies offshoring of labor-intensive tasks, the effect of integration is to lower labor shares in tradable sectors.

Admittedly, it is difficult to cleanly separate the impact of technology from global integration, or from policies and reforms. Yet the results for advanced economies are compelling. Taken together, technology and global integration explain close to 75 percent of the decline in labor shares in Germany and Italy, and close to 50 percent in the United States.

paine -> anne... , April 14, 2017 at 08:49 AM
Again this is about changing the wage structure

Total hours is macro management. Mobilizing potential job hours to the max is undaunted by technical progress

Recall industrial jobs required unions to become well paid

We need a CIO for services logistics and commerce

[Apr 14, 2017] Automation as a way to depress wages

Apr 14, 2017 | economistsview.typepad.com
point , April 14, 2017 at 04:59 AM
http://www.bradford-delong.com/2017/04/notes-working-earning-and-learning-in-the-age-of-intelligent-machines.html

Brad said: Few things can turn a perceived threat into a graspable opportunity like a high-pressure economy with a tight job market and rising wages. Few things can turn a real opportunity into a phantom threat like a low-pressure economy, where jobs are scarce and wages stagnant because of the failure of macroeconomic policy.

What is it that prevents a statement like this from succeeding at the level of policy?

Peter K. -> point... , April 14, 2017 at 06:41 AM
class war

center-left economists like DeLong and Krugman going with neoliberal Hillary rather than Sanders.

Sanders supports that statement, Hillary did not. Obama did not.

PGL spent the primary unfairly attacking Sanders and the "Bernie Bros" on behalf of the center-left.

[Apr 07, 2017] No it was policy driven by politics. They increased profits at the expense of workers and the middle class. The New Democrats played along with Wall Street.

Apr 07, 2017 | economistsview.typepad.com
ken melvin -> DrDick ... , April 06, 2017 at 08:45 AM
Probably automated 200. In every case, displacing 3/4 of the workers and increasing production 40% while greatly improving quality. The exact same can be said for larger-scale operations such as automobile mfg, ...

The convergence of offshoring and automation in such a short time frame meant that instead of a gradual transformation that might have allowed for more evolutionary economic thinking, American workers got gobsmacked. The aftermath includes the wage disparity, opiate epidemic, Trump, ...

This transition is of the scale of the industrial revolution, with climate change thrown in. This is just the beginning of great social and economic turmoil. None of the stuff that evolved specific to the industrial revolution applies.

Peter K. -> ken melvin... , April 06, 2017 at 09:01 AM

No it was policy driven by politics. They increased profits at the expense of workers and the middle class. The New Democrats played along with Wall Street.
libezkova -> ken melvin... , April 06, 2017 at 05:43 PM
"while greatly improving quality" -- that's not given.

[Apr 06, 2017] Germany and Japan have retained a larger share of workers in manufacturing, despite more automation

Apr 06, 2017 | economistsview.typepad.com
Peter K. -> EMichael... , April 06, 2017 at 09:18 AM
What do you make of the DeLong link? Why do you avoid discussing it?

"...
The lesson from history is not that the robots should be stopped; it is that we will need to confront the social-engineering and political problem of maintaining a fair balance of relative incomes across society. Toward that end, our task becomes threefold.

First, we need to make sure that governments carry out their proper macroeconomic role, by maintaining a stable, low-unemployment economy so that markets can function properly. Second, we need to redistribute wealth to maintain a proper distribution of income. Our market economy should promote, rather than undermine, societal goals that correspond to our values and morals. Finally, workers must be educated and trained to use increasingly high-tech tools (especially in labor-intensive industries), so that they can make useful things for which there is still demand.

Sounding the alarm about "artificial intelligence taking American jobs" does nothing to bring such policies about. Mnuchin is right: the rise of the robots should not be on a treasury secretary's radar."

DrDick -> EMichael... , April 06, 2017 at 08:43 AM
Except that Germany and Japan have retained a larger share of workers in manufacturing, despite more automation. Germany has also retained much more of its manufacturing base than the US has. The evidence really does point to the role of outsourcing in the US compared with others.

http://www.economist.com/node/21552567

http://www.economist.com/node/2571689

pgl -> DrDick ... , April 06, 2017 at 08:54 AM
I got an email about some tale that Adidas would start manufacturing in Germany as opposed to China. Not with German workers but with robots. The author claimed the robots would cost only $5.50 per hour as opposed to $11 an hour for the Chinese workers. Of course Chinese apparel workers do not get anywhere close to $11 an hour, and the author was not exactly a credible source.
pgl -> pgl... , April 06, 2017 at 08:57 AM
Reuters is a more credible source:

http://www.reuters.com/article/us-adidas-manufacturing-idUSKBN0TS0ZM20151209

A pilot program, initially making 500 pairs of shoes in the first year. No claims as to the wage rate of Chinese workers.

libezkova said in reply to pgl... , April 06, 2017 at 05:41 PM
"The new "Speedfactory" in the southern town of Ansbach near its Bavarian headquarters will start production in the first half of 2016 of a robot-made running shoe that combines a machine-knitted upper and springy "Boost" sole made from a bubble-filled polyurethane foam developed by BASF."

Interesting. I thought that "keds" production was already fully automated. Bright colors are probably the main attraction. But Adidas commands a premium price...

The machine-knitted upper is the key -- robots, even sophisticated ones, put additional demands on the precision of the parts to be assembled. That's also probably why a monolithic molded sole was chosen. Kind of 3-D printing of shoes.

Robots do not "feel" the nuances of the technological process like humans do.

kurt -> pgl... , April 06, 2017 at 09:40 AM
While I agree that Chinese workers don't get $11 - frequently employee costs are accounted at a loaded rate (including all benefits, which in China would include the capital cost of dormitories, food, security staff, benefits and taxes). I am guessing that a $2-3 an hour wage would result in an $11 fully loaded rate under those circumstances. Those other costs are not required with robots.
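kurt's back-of-the-envelope arithmetic can be made explicit. A minimal Python sketch follows; every overhead figure in it is a hypothetical illustration, chosen only to be consistent with his $2-3 wage and $11 loaded-rate guesses, not sourced data.

    # Fully loaded labor rate: base wage plus employer-side overhead.
    # Every overhead figure below is a hypothetical illustration, not sourced data.

    base_wage = 2.50          # direct hourly wage (kurt's $2-3/hour guess)

    overhead_per_hour = {
        "dormitories": 1.50,  # capital cost and upkeep of worker housing
        "food": 1.00,
        "security_staff": 0.50,
        "benefits": 2.50,
        "taxes": 3.00,
    }

    loaded_rate = base_wage + sum(overhead_per_hour.values())
    print(f"Fully loaded rate: ${loaded_rate:.2f}/hour")  # $11.00/hour

    # These per-worker overheads disappear with robots, which is why the
    # quoted $5.50/hour robot cost is compared against the loaded rate,
    # not against the bare wage.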
Peter K. -> DrDick ... , April 06, 2017 at 08:59 AM
I agree with you. The center-left wants to exculpate globalization and outsourcing, or free them from blame, by providing another explanation: technology and robots. They're not just arguing with Trump.

Brad Setser:

"I suspect the politics around trade would be a bit different in the U.S. if the goods-exporting sector had grown in parallel with imports.

That is one key difference between the U.S. and Germany. Manufacturing jobs fell during reunification - and Germany went through a difficult adjustment in the early 2000s. But over the last ten years the number of jobs in Germany's export sector grew, keeping the number of people employed in manufacturing roughly constant over the last ten years even with rising productivity. Part of the "trade" adjustment was a shift from import-competing to exporting sectors, not just a shift out of the goods-producing tradables sector. Of course, not everyone can run a German-sized surplus in manufactures - but it seems likely the low U.S. share of manufacturing employment (relative to Germany and Japan) is in part a function of the size and persistence of the U.S. trade deficit in manufactures. (It is also in part a function of the fact that the U.S. no longer needs to trade manufactures for imported energy on any significant scale; the U.S. has more jobs in oil and gas production, for example, than Germany or Japan.)"

http://blogs.cfr.org/setser/2017/02/06/offshore-profits-and-exports/

anne -> DrDick ... , April 06, 2017 at 10:01 AM
https://fred.stlouisfed.org/graph/?g=dgSQ

January 15, 2017

Percent of Employment in Manufacturing for United States, Germany and Japan, 1970-2012


https://fred.stlouisfed.org/graph/?g=dgT0

January 15, 2017

Percent of Employment in Manufacturing for United States, Germany and Japan, 1970-2012

(Indexed to 1970)


[Apr 06, 2017] The impact of information technology on employment is undoubtedly a major issue, but it is also not in society's interest to discourage investment in high-tech companies.

Apr 06, 2017 | economistsview.typepad.com
Peter K. , April 05, 2017 at 01:55 PM
Interesting, thought-provoking discussion by DeLong:

https://www.project-syndicate.org/commentary/mnuchin-automation-low-skill-workers-by-j--bradford-delong-2017-04

APR 3, 2017
Artificial Intelligence and Artificial Problems
by J. Bradford DeLong

BERKELEY – Former US Treasury Secretary Larry Summers recently took exception to current US Treasury Secretary Steve Mnuchin's views on "artificial intelligence" (AI) and related topics. The difference between the two seems to be, more than anything else, a matter of priorities and emphasis.

Mnuchin takes a narrow approach. He thinks that the problem of particular technologies called "artificial intelligence taking over American jobs" lies "far in the future." And he seems to question the high stock-market valuations for "unicorns" – companies valued at or above $1 billion that have no record of producing revenues that would justify their supposed worth and no clear plan to do so.

Summers takes a broader view. He looks at the "impact of technology on jobs" generally, and considers the stock-market valuation for highly profitable technology companies such as Google and Apple to be more than fair.

I think that Summers is right about the optics of Mnuchin's statements. A US treasury secretary should not answer questions narrowly, because people will extrapolate broader conclusions even from limited answers. The impact of information technology on employment is undoubtedly a major issue, but it is also not in society's interest to discourage investment in high-tech companies.

On the other hand, I sympathize with Mnuchin's effort to warn non-experts against routinely investing in castles in the sky. Although great technologies are worth the investment from a societal point of view, it is not so easy for a company to achieve sustained profitability. Presumably, a treasury secretary already has enough on his plate without having to worry about the rise of the machines.

In fact, it is profoundly unhelpful to stoke fears about robots, and to frame the issue as "artificial intelligence taking American jobs." There are far more constructive areas for policymakers to direct their focus. If the government is properly fulfilling its duty to prevent a demand-shortfall depression, technological progress in a market economy need not impoverish unskilled workers.

This is especially true when value is derived from the work of human hands, or the work of things that human hands have made, rather than from scarce natural resources, as in the Middle Ages. Karl Marx was one of the smartest and most dedicated theorists on this topic, and even he could not consistently show that technological progress necessarily impoverishes unskilled workers.

Technological innovations make whatever is produced primarily by machines more useful, albeit with relatively fewer contributions from unskilled labor. But that by itself does not impoverish anyone. To do that, technological advances also have to make whatever is produced primarily by unskilled workers less useful. But this is rarely the case, because there is nothing keeping the relatively cheap machines used by unskilled workers in labor-intensive occupations from becoming more powerful. With more advanced tools, these workers can then produce more useful things.

Historically, there are relatively few cases in which technological progress, occurring within the context of a market economy, has directly impoverished unskilled workers. In these instances, machines caused the value of a good that was produced in a labor-intensive sector to fall sharply, by increasing the production of that good so much as to satisfy all potential consumers.

The canonical example of this phenomenon is textiles in eighteenth- and nineteenth-century India and Britain. New machines made the exact same products that handloom weavers had been making, but they did so on a massive scale. Owing to limited demand, consumers were no longer willing to pay for what handloom weavers were producing. The value of wares produced by this form of unskilled labor plummeted, but the prices of commodities that unskilled laborers bought did not.

The lesson from history is not that the robots should be stopped; it is that we will need to confront the social-engineering and political problem of maintaining a fair balance of relative incomes across society. Toward that end, our task becomes threefold.

First, we need to make sure that governments carry out their proper macroeconomic role, by maintaining a stable, low-unemployment economy so that markets can function properly. Second, we need to redistribute wealth to maintain a proper distribution of income. Our market economy should promote, rather than undermine, societal goals that correspond to our values and morals. Finally, workers must be educated and trained to use increasingly high-tech tools (especially in labor-intensive industries), so that they can make useful things for which there is still demand.

Sounding the alarm about "artificial intelligence taking American jobs" does nothing to bring such policies about. Mnuchin is right: the rise of the robots should not be on a treasury secretary's radar.

anne , April 05, 2017 at 03:14 PM
https://minneapolisfed.org/research/wp/wp736.pdf

January, 2017

The Global Rise of Corporate Saving
By Peter Chen, Loukas Karabarbounis, and Brent Neiman

Abstract

The sectoral composition of global saving changed dramatically during the last three decades. Whereas in the early 1980s most of global investment was funded by household saving, nowadays nearly two-thirds of global investment is funded by corporate saving. This shift in the sectoral composition of saving was not accompanied by changes in the sectoral composition of investment, implying an improvement in the corporate net lending position. We characterize the behavior of corporate saving using both national income accounts and firm-level data and clarify its relationship with the global decline in labor share, the accumulation of corporate cash stocks, and the greater propensity for equity buybacks. We develop a general equilibrium model with product and capital market imperfections to explore quantitatively the determination of the flow of funds across sectors. Changes including declines in the real interest rate, the price of investment, and corporate income taxes generate increases in corporate profits and shifts in the supply of sectoral saving that are of similar magnitude to those observed in the data.

anne -> anne... , April 05, 2017 at 03:17 PM
http://www.nytimes.com/2010/07/06/opinion/06smith.html

July 6, 2010

Are Profits Hurting Capitalism?
By YVES SMITH and ROB PARENTEAU

A STREAM of disheartening economic news last week, including flagging consumer confidence and meager private-sector job growth, is leading experts to worry that the recession is coming back. At the same time, many policymakers, particularly in Europe, are slashing government budgets in an effort to lower debt levels and thereby restore investor confidence, reduce interest rates and promote growth.

There is an unrecognized problem with this approach: Reductions in deficits have implications for the private sector. Higher taxes draw cash from households and businesses, while lower government expenditures withhold money from the economy. Making matters worse, businesses are already plowing fewer profits back into their own enterprises.

Over the past decade and a half, corporations have been saving more and investing less in their own businesses. A 2005 report from JPMorgan Research noted with concern that, since 2002, American corporations on average ran a net financial surplus of 1.7 percent of the gross domestic product - a drastic change from the previous 40 years, when they had maintained an average deficit of 1.2 percent of G.D.P. More recent studies have indicated that companies in Europe, Japan and China are also running unprecedented surpluses.

The reason for all this saving in the United States is that public companies have become obsessed with quarterly earnings. To show short-term profits, they avoid investing in future growth. To develop new products, buy new equipment or expand geographically, an enterprise has to spend money - on marketing research, product design, prototype development, legal expenses associated with patents, lining up contractors and so on.

Rather than incur such expenses, companies increasingly prefer to pay their executives exorbitant bonuses, or issue special dividends to shareholders, or engage in purely financial speculation. But this means they also short-circuit a major driver of economic growth.

Some may argue that businesses aren't investing in growth because the prospects for success are so poor, but American corporate profits are nearly all the way back to their peak, right before the global financial crisis took hold.

Another problem for the economy is that, once the crisis began, families and individuals started tightening their belts, bolstering their bank accounts or trying to pay down borrowings (another form of saving).

If households and corporations are trying to save more of their income and spend less, then it is up to the other two sectors of the economy - the government and the import-export sector - to spend more and save less to keep the economy humming. In other words, there needs to be a large trade surplus, a large government deficit or some combination of the two. This isn't a matter of economic theory; it's based in simple accounting.

What if a government instead embarks on an austerity program? Income growth will stall, and household wages and business profits may fall....
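The "simple accounting" Smith and Parenteau invoke is the sectoral balances identity: private net saving, the government balance, and the foreign balance must sum to zero across the economy. A minimal Python sketch with purely illustrative numbers (none taken from the article):

    # Sectoral balances: (S - I) = (G - T) + (X - M).
    # Private net saving must equal the government deficit plus the trade surplus.
    # All numbers are illustrative, expressed as percent of GDP.

    private_net_saving = 3.0  # S - I: households and firms saving more than they invest
    trade_balance = -2.0      # X - M: a trade deficit of 2% of GDP

    # The identity then forces the government balance (T - G):
    government_balance = trade_balance - private_net_saving
    print(f"Government balance: {government_balance:.1f}% of GDP")  # -5.0, a deficit

    # If both the private and foreign sectors are net saving, the government
    # must run a deficit of the same size, or incomes fall until saving does.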

anne -> anne... , April 05, 2017 at 03:21 PM
http://www.nakedcapitalism.com/2017/04/global-corporate-saving-glut.html

April 5, 2017

The Global Corporate Saving Glut
By Yves Smith

On the one hand, the VoxEU article does a fine job of assembling long-term data on a global basis. It demonstrates that the corporate savings glut is long-standing and that it has been accompanied by a decline in personal savings.

However, it fails to depict what an unnatural state of affairs this is. The corporate sector as a whole in non-recessionary times ought to be net spending, as in borrowing and investing in growth. As a market-savvy buddy put it, "If a company isn't investing in the business of its business, why should I?" I attributed the corporate savings trend in the US to the fixation on quarterly earnings, which sources such as McKinsey partners with a broad view of the firms' projects were telling me was killing investment (any investment will have an income statement impact too, such as planning, marketing, design, and start-up expenses). This post, by contrast, treats this development as lacking in any agency. Labor share of GDP dropped and savings rose. They attribute that to lower interest rates over time. They again fail to see that as the result of power dynamics and political choices....

[Mar 29, 2017] Job Loss in Manufacturing: More Robot Blaming

Mar 29, 2017 | economistsview.typepad.com
anne , March 29, 2017 at 06:11 AM
http://cepr.net/blogs/beat-the-press/job-loss-in-manufacturing-more-robot-blaming

March 29, 2017

It is striking how the media feel such an extraordinary need to blame robots and productivity growth for the recent job loss in manufacturing rather than trade. We got yet another example of this exercise in a New York Times piece * by Claire Cain Miller, with the title "Evidence That Robots Are Winning the Race for American Jobs." The piece highlights a new paper ** by Daron Acemoglu and Pascual Restrepo which finds that robots have a large negative impact on wages and employment.

While the paper has interesting evidence on the link between the use of robots and employment and wages, some of the claims in the piece do not follow. For example, the article asserts:

"The paper also helps explain a mystery that has been puzzling economists: why, if machines are replacing human workers, productivity hasn't been increasing. In manufacturing, productivity has been increasing more than elsewhere - and now we see evidence of it in the employment data, too."

Actually, the paper doesn't provide any help whatsoever in solving this mystery. Productivity growth in manufacturing has almost always been more rapid than productivity growth elsewhere. Furthermore, it has been markedly slower even in manufacturing in recent years than in prior decades. According to the Bureau of Labor Statistics, productivity growth in manufacturing has averaged less than 1.2 percent annually over the last decade and less than 0.5 percent over the last five years. By comparison, productivity growth averaged 2.9 percent a year in the half century from 1950 to 2000.

The article is also misleading in asserting:

"The paper adds to the evidence that automation, more than other factors like trade and offshoring that President Trump campaigned on, has been the bigger long-term threat to blue-collar jobs (emphasis added)."

In terms of recent job loss in manufacturing, and in particular the loss of 3.4 million manufacturing jobs between December of 2000 and December of 2007, the rise of the trade deficit has almost certainly been the more important factor. We had substantial productivity growth in manufacturing between 1970 and 2000, with very little loss of jobs. The growth in manufacturing output offset the gains in productivity. The new part of the story in the period from 2000 to 2007 was the explosion of the trade deficit to a peak of nearly 6.0 percent of GDP in 2005 and 2006.

It is also worth noting that we could in fact expect substantial job gains in manufacturing if the trade deficit were reduced. If the trade deficit fell by 2.0 percentage points of GDP ($380 billion a year) this would imply an increase in manufacturing output of more than 22 percent. If the productivity of the manufacturing workers producing this additional output was the same as the rest of the manufacturing workforce it would imply an additional 2.7 million jobs in manufacturing. That is more jobs than would be eliminated by productivity at the recent 0.5 percent growth rate over the next forty years, even assuming no increase in demand over this period.

While the piece focuses on the displacement of less educated workers by robots and equivalent technology, it is likely that the areas where displacement occurs will be determined in large part by the political power of different groups. For example, it is likely that in the not distant future improvements in diagnostic technology will allow a trained professional to make more accurate diagnoses than the best doctor. Robots are likely to be better at surgery than the best surgeon. The extent to which these technologies will be allowed to displace doctors is likely to depend more on the political power of the American Medical Association than on the technology itself.

Finally, the question of whether the spread of robots will lead to a transfer of income from workers to the people who "own" the robots will depend to a large extent on our patent laws. In the last four decades we have made patents longer and stronger. If we instead made them shorter and weaker, or better relied on open-source research, the price of robots would plummet and workers would be better positioned to capture the gains of productivity growth as they had in prior decades. In this story it is not robots who are taking workers' wages, it is politicians who make strong patent laws.

* https://www.nytimes.com/2017/03/28/upshot/evidence-that-robots-are-winning-the-race-for-american-jobs.html

** http://economics.mit.edu/files/12154

-- Dean Baker
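Baker's arithmetic is easy to verify. A minimal Python sketch follows; the manufacturing output and employment levels in it are stand-in assumptions roughly consistent with his figures, not numbers quoted from the post.

    # Checking Dean Baker's trade-deficit arithmetic with rough magnitudes.
    # Manufacturing output and employment levels are stand-in assumptions.

    deficit_reduction = 380e9      # $380 billion/year, i.e. 2% of a ~$19T GDP
    manufacturing_output = 1.7e12  # assumed ~$1.7 trillion of annual output
    manufacturing_jobs = 12.3e6    # assumed ~12.3 million workers

    output_gain = deficit_reduction / manufacturing_output
    print(f"Output gain: {output_gain:.0%}")  # ~22%
    print(f"Jobs gained: {output_gain * manufacturing_jobs / 1e6:.1f} million")  # ~2.7

    # Jobs eliminated by 0.5%/year productivity growth over forty years,
    # holding demand constant:
    jobs_lost = manufacturing_jobs * (1 - 1 / 1.005**40)
    print(f"Jobs lost to productivity: {jobs_lost / 1e6:.1f} million")  # ~2.2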

anne -> anne... , March 29, 2017 at 06:14 AM
https://fred.stlouisfed.org/graph/?g=d6j3

November 1, 2014

Total Factor Productivity at Constant National Prices for United States, 1950-2014


https://fred.stlouisfed.org/graph/?g=d6j7

November 1, 2014

Total Factor Productivity at Constant National Prices for United States, 1950-2014

(Indexed to 1950)

anne -> anne... , March 29, 2017 at 09:31 AM
https://fred.stlouisfed.org/graph/?g=dbjg

January 4, 2016

Manufacturing Multifactor Productivity, 1988-2014

(Indexed to 1988)


https://fred.stlouisfed.org/graph/?g=dbke

January 4, 2016

Manufacturing Multifactor Productivity, 2000-2014

(Indexed to 2000)

[Mar 29, 2017] I fear Summers at least as much as I fear robots

Mar 29, 2017 | economistsview.typepad.com
anne -> RC AKA Darryl, Ron... , March 29, 2017 at 06:17 AM
https://www.washingtonpost.com/news/wonk/wp/2017/03/27/larry-summers-mnuchins-take-on-artificial-intelligence-is-not-defensible/

March 27, 2017

The robots are coming, whether Trump's Treasury secretary admits it or not
By Lawrence H. Summers - Washington Post

As I learned (sometimes painfully) during my time at the Treasury Department, words spoken by Treasury secretaries can over time have enormous consequences, and therefore should be carefully considered. In this regard, I am very surprised by two comments made by Secretary Steven Mnuchin in his first public interview last week.

In reference to a question about artificial intelligence displacing American workers, Mnuchin responded that "I think that is so far in the future - in terms of artificial intelligence taking over American jobs - I think we're, like, so far away from that [50 to 100 years], that it is not even on my radar screen." He also remarked that he did not understand tech company valuations in a way that implied that he regarded them as excessive. I suppose there is a certain internal logic. If you think AI is not going to have any meaningful economic effects for half a century, then I guess you should think that tech companies are overvalued. But neither statement is defensible.

Mnuchin's comment about the lack of impact of technology on jobs is to economics approximately what global climate change denial is to atmospheric science or what creationism is to biology. Yes, you can debate whether technological change is on net good. I certainly believe it is. And you can debate what the job creation effects will be relative to the job destruction effects. I think this is much less clear, given the downward trends in adult employment, especially for men, over the past generation.

But I do not understand how anyone could reach the conclusion that all the action with technology is half a century away. Artificial intelligence is behind autonomous vehicles that will affect millions of jobs driving and dealing with cars within the next 15 years, even on conservative projections. Artificial intelligence is transforming everything from retailing to banking to the provision of medical care. Almost every economist who has studied the question believes that technology has had a greater impact on the wage structure and on employment than international trade and certainly a far greater impact than whatever increment to trade is the result of much debated trade agreements....

DrDick -> anne... , March 29, 2017 at 10:45 AM
Oddly, the robots are always coming in articles like Summers', but they never seem to get here. Automation has certainly played a role, but outsourcing has been a much bigger issue.
Peter K. -> DrDick ... , March 29, 2017 at 01:09 PM
I'm becoming increasing skeptical about the robots argument.
jonny bakho -> DrDick ... , March 29, 2017 at 05:13 PM
They are all over our manufacturing plants.
They just don't look like C3PO
JohnH -> RC AKA Darryl, Ron... , March 29, 2017 at 06:21 AM
I fear Summers at least as much as I fear robots...
Peter K. -> JohnH... , March 29, 2017 at 07:04 AM
He's just a big bully, like our PGL.

He has gotten a lot better and was supposedly pretty good when advising Obama, but he's sort of reverted to form with the election of Trump and the prominence of the debate on trade policy.

RC AKA Darryl, Ron -> JohnH... , March 29, 2017 at 07:15 AM
Ditto.

Technology rearranges and changes human roles, but it makes entries on both sides of the ledger. On net, as long as wages grow, so will the economy and jobs. Trade deficits only help financial markets and the capital-owning class.

Paine -> RC AKA Darryl, Ron... , March 29, 2017 at 09:59 AM
There is no limit to jobs
Macro policy and hours regulation
can create

We can both ration job hours And subsidize job wage rates
and at the same time
generate
As many jobs as wanted

All economic rents could be converted into wage subsidies
To boost the per hour income from jobs as well as incentivize diligence skill and creativity

RC AKA Darryl, Ron -> Paine... , March 29, 2017 at 12:27 PM
Works for me.
yuan -> Paine... , March 29, 2017 at 03:50 PM
jobs, jobs, jobs.

some day we will dispense with feudal concepts, such as working for the "man". a right to liberty and the pursuit of happiness is a right to income.

tax those bots!

yuan -> yuan... , March 29, 2017 at 03:51 PM
or better yet...collectivize the bots.
RGC -> RC AKA Darryl, Ron... , March 29, 2017 at 08:47 AM
Summers is a good example of those economists that never seem to pay a price for their errors.

Imo, he should never be listened to. His economics is faulty. His performance in the Clinton administration and his part in the Russian debacle should be enough to consign him to anonymity. People would do well to ignore him.

Peter K. -> RGC... , March 29, 2017 at 09:36 AM
Yeah he's one of those expert economists and technocrats who never admit fault. You don't become Harvard President or Secretary of the Treasury by doing that.

One time that Krugman admitted error was about productivity gains in the 1990s. He said he didn't see the gains from computers in the numbers, and they weren't there at first, but later productivity numbers increased.

It was sort of like what Summers and Munchkin are discussing, but there's all sorts of debate about measuring productivity and what it means.

RC AKA Darryl, Ron -> RGC... , March 29, 2017 at 12:29 PM
Yeah. I am not a fan of Summers's, but I do like summers as long as it does not rain too much or too little and I have time to fish.

[Mar 24, 2017] There is no such thing as an automated factory. Manufacturing is done by people, *assisted* by automation. Or only part of the production pipeline is automated, but people are still needed to fill in the not-automated pieces

Notable quotes:
"... And it is not only automation vs. in-house labor. There is environmental/compliance cost (or lack thereof) and the fully loaded business services and administration overhead, taxes, etc. ..."
"... When automation increased productivity in agriculture, the government guaranteed free high school education as a right. ..."
"... Now Democrats like you would say it's too expensive. So what's your solution? You have none. You say "sucks to be them." ..."
"... And then they give you the finger and elect Trump. ..."
"... It wasn't only "low-skilled" workers but "anybody whose job could be offshored" workers. Not quite the same thing. ..."
"... It also happened in "knowledge work" occupations - for those functions that could be separated and outsourced without impacting the workflow at more expense than the "savings". And even if so, if enough of the competition did the same ... ..."
"... And not all outsourcing was offshore - also to "lowest bidders" domestically, or replacing "full time" "permanent" staff with contingent workers or outsourced "consultants" hired on a project basis. ..."
"... "People sure do like to attribute the cause to trade policy." Because it coincided with people watching their well-paying jobs being shipped overseas. The Democrats have denied this ever since Clinton and the Republicans passed NAFTA, but finally with Trump the voters had had enough. ..."
"... Why do you think Clinton lost Wisconsin, Michigan, Pennysylvania and Ohio? ..."
Feb 20, 2017 | economistsview.typepad.com
Sanjait -> Peter K.... February 20, 2017 at 01:55 PM

People sure do like to attribute the cause to trade policy.

Do you honestly believe that fact makes it true? If not, what even is your point? Can you even articulate one?

Tom aka Rusty -> Sanjait... , February 20, 2017 at 01:18 PM

If it was technology, why do US companies buy from low-wage producers at the end of supply chains 2,000-10,000 miles away? Why pay the transportation cost? Automated factories could be built close by.

ken melvin said in reply to Tom aka Rusty... , February 20, 2017 at 02:24 PM
Send for an accountant.
cm -> Tom aka Rusty... , February 20, 2017 at 03:14 PM
There is no such thing as an automated factory. Manufacturing is done by people, *assisted* by automation. Or only part of the production pipeline is automated, but people are still needed to fill in the not-automated pieces.

And it is not only automation vs. in-house labor. There is environmental/compliance cost (or lack thereof) and the fully loaded business services and administration overhead, taxes, etc.

You should know this, and I believe you do.

Peter K. said in reply to Sanjait... , February 20, 2017 at 03:14 PM
Trade policy put "low-skilled" workers in the U.S. in competition with workers in poorer countries. What did you think was going to happen? The Democrat leadership made excuses. David Autor's TED talk stuck with me. When automation increased productivity in agriculture, the government guaranteed free high school education as a right.

Now Democrats like you would say it's too expensive. So what's your solution? You have none. You say "sucks to be them."

And then they give you the finger and elect Trump.

cm -> Peter K.... , February 20, 2017 at 03:19 PM
It wasn't only "low-skilled" workers but "anybody whose job could be offshored" workers. Not quite the same thing.

It also happened in "knowledge work" occupations - for those functions that could be separated and outsourced without impacting the workflow at more expense than the "savings". And even if so, if enough of the competition did the same ...

And not all outsourcing was offshore - also to "lowest bidders" domestically, or replacing "full time" "permanent" staff with contingent workers or outsourced "consultants" hired on a project basis.

Peter K. said in reply to cm... , February 20, 2017 at 03:33 PM
True.
Peter K. said in reply to Sanjait... , February 20, 2017 at 03:35 PM
"People sure do like to attribute the cause to trade policy." Because it coincided with people watching their well-paying jobs being shipped overseas. The Democrats have denied this ever since Clinton and the Republicans passed NAFTA, but finally with Trump the voters had had enough.

Why do you think Clinton lost Wisconsin, Michigan, Pennsylvania and Ohio?

[Mar 24, 2017] We are in a sea of McJobs

Feb 26, 2017 | http://economistsview.typepad.com/economistsview/2017/02/links-for-02-24-17.html
RC AKA Darryl, Ron -> RC AKA Darryl, Ron... February 24, 2017 at 10:05 AM

Instead of looking at this as an excuse for job losses due to trade deficits, we should be seeing it as a reason to gain back manufacturing jobs in order to retain a few more decent jobs in a sea of garbage jobs. Mmm, that's so wrong. Working on garbage trucks is now one of the good jobs in comparison. A sea of garbage jobs would be an improvement. We are in a sea of McJobs.

Paine -> RC AKA Darryl, Ron... February 24, 2017 at 04:25 AM ,
Assembly lines paid well post CIO
They were never intrinsically rewarding

A family farm or work shop of their own
Filled the dreams of the operatives

Recall the brilliantly ironic end of René Clair's À nous la liberté

Fully automated plant with the former operatives enjoying endless picnic frolic

Work as humans' prime want awaits a future social configuration

RC AKA Darryl, Ron -> Paine... , February 24, 2017 at 11:27 AM
Yes sir, often enough but not always. I had a great job as an IT large systems capacity planner and performance analyst, but not as good as the landscaping, pool, and lawn maintenance for myself that I enjoy now as a leisure occupation in retirement. My best friend died a greens keeper, but he preferred landscaping when he was young. Another good friend of mine was a poet, now dying of cancer if depression does not take him first.

But you are correct, no one but the welders, material handlers (paid to lift weights all day), machinists, and then almost every one else liked their jobs at Virginia Metal Products, a union shop, when I worked there the summer of 1967. That was on the swing shift though when all of the big bosses were at home and out of our way. On the green chain in the lumber yard of Kentucky flooring everyone but me wanted to leave, but my mom made me go into the VMP factory and work nights at the primer drying kiln stacking finished panel halves because she thought the work on the green chain was too hard. The guys on the green chain said that I was the first high school graduate to make it past lunch time on their first day. I would have been buff and tan by the end of summer heading off to college (where I would drop out in just ten weeks) had my mom not intervened.

As a profession, no group that I know is happier than auto mechanics who do the same work as a hobby in their hours off that they do for a living at work; at least the hot rod custom car freaks at Jamie's Exhaust & Auto Repair in Richmond, Virginia are that way. The power tool sales and maintenance crew at Arthur's Electric Service Inc. enjoy their jobs too.

Despite the name, which dates from their incorporation back when they rebuilt auto generators, Arthur's sells and services lawnmowers, weed whackers, chain saws and all, but nothing electric. The guy in the picture at the link is Robert Arthur, the founder's son, who is roughly our age.

http://www.arthurselectric.com/

[Mar 23, 2017] Automation threat is more complex than it looks

Mar 23, 2017 | discussion.theguardian.com
EndaFlannel, 17 Nov 2016 09:12
In theory, in the longer term, as robotics becomes the norm rather than the exception, there will be no advantage in chasing cheap labour around the world. Given ready access to raw materials, the labour costs of manufacturing in Birmingham should be no different to the labour costs in Beijing. This will require the democratisation of the ownership of technology. Unless national governments develop commonly owned technology, the 1% will truly become the organ grinders and everyone else the monkeys. One has only to look at companies like Microsoft and Google to see a possible future - bigger than any single country and answerable to no one. Common ownership must be the future. Deregulation and market-driven economics are the road to technological serfdom.
Physiocrat -> EndaFlannel, 17 Nov 2016 09:58
Except that the raw materials for steel production are available in vast quantities in China.

You are also forgetting land. The power remains with those who own it. Most of Central London is still owned by the same half dozen families as in 1600.

Colin Sandford -> EndaFlannel, 17 Nov 2016 10:29
You can only use robotics in countries that have the labour with the skills to maintain them. Robots do not look after themselves; they need highly skilled technicians to keep them working. I once worked for a Japanese company and they only used robots in the higher-wage, high-skill regions. In low-wage economies they used manual labour and low-tech products.

[Mar 21, 2017] Robots and Inequality: A Skeptics Take

Notable quotes:
"... And all costs are labor costs. It it isn't labor cost, it's rents and economic profit which mean economic inefficiency. An inefficient economy is unstable. Likely to crash or drive revolution. ..."
"... Free lunch economics seeks to make labor unnecessary or irrelevant. Labor cost is pure liability. ..."
"... Yet all the cash for consumption is labor cost, so if labor cost is a liability, then demand is a liability. ..."
"... Replace workers with robots, then robots must become consumers. ..."
"... "Replace workers with robots, then robots must become consumers." Well no - the OWNERS of robots must become consumers. ..."
"... I am old enough to remember the days of good public libraries, free university education, free bus passes for seniors and low land prices. Is the income side of the equation all that counts? ..."
Mar 21, 2017 | economistsview.typepad.com
Douglas Campbell:
Robots and Inequality: A Skeptic's Take : Paul Krugman presents " Robot Geometry " based on Ryan Avent 's "Productivity Paradox". It's more-or-less the skill-biased technological change hypothesis, repackaged. Technology makes workers more productive, which reduces demand for workers, as their effective supply increases. Workers still need to work, with a bad safety net, so they end up moving to low-productivity sectors with lower wages. Meanwhile, the low wages in these sectors makes it inefficient to invest in new technology.
My question: Are Reagan-Thatcher countries the only ones with robots? My image, perhaps it is wrong, is that plenty of robots operate in Japan and Germany too, and both countries are roughly just as technologically advanced as the US. But Japan and Germany haven't seen the same increase in inequality as the US and other Anglo countries after 1980 (graphs below). What can explain the dramatic differences in inequality across countries? Fairly blunt changes in labor market institutions, that's what. This goes back to Peter Temin's " Treaty of Detroit " paper and the oddly ignored series of papers by Piketty, Saez and coauthors which argues that changes in top marginal tax rates can largely explain the evolution of the Top 1% share of income across countries. (Actually, it goes back further -- people who work in Public Economics had "always" known that pre-tax income is sensitive to tax rates...) They also show that the story of inequality is really a story of incomes at the very top -- changes in other parts of the income distribution are far less dramatic. This evidence also is not suggestive of a story in which inequality is about the returns to skills, or computer usage, or the rise of trade with China. ...

mulp : , March 21, 2017 at 01:54 AM

Yet another economist bamboozled by free lunch economics.

In free lunch economics, you never consider demand being impacted by changes in labor cost.

TANSTAAFL so, cut labor costs and consumption must be cut.

Funny things can be done if money is printed and helicopter dropped unequally.

Printed money can accumulate in the hands of the rentier cutting labor costs and pocketing the savings without cutting prices.

Free lunch economics invented the idea that price equals cost, but that is grossly distorting.

And all costs are labor costs. If it isn't labor cost, it's rents and economic profit, which mean economic inefficiency. An inefficient economy is unstable. Likely to crash or drive revolution.

Free lunch economics seeks to make labor unnecessary or irrelevant. Labor cost is pure liability.

Yet all the cash for consumption is labor cost, so if labor cost is a liability, then demand is a liability.

Replace workers with robots, then robots must become consumers.

reason -> mulp... , March 21, 2017 at 03:47 AM
"Replace workers with robots, then robots must become consumers." Well no - the OWNERS of robots must become consumers.
reason : , March 21, 2017 at 03:35 AM
I am old enough to remember the days of good public libraries, free university education, free bus passes for seniors and low land prices. Is the income side of the equation all that counts?
anne : , March 21, 2017 at 06:37 AM
https://medium.com/@ryanavent_93844/the-productivity-paradox-aaf05e5e4aad#.brb0426mt

March 16, 2017

The productivity paradox
By Ryan Avent

People are worried about robots taking jobs. Driverless cars are around the corner. Restaurants and shops increasingly carry the option to order by touchscreen. Google's clever algorithms provide instant translations that are remarkably good.

But the economy does not feel like one undergoing a technology-driven productivity boom. In the late 1990s, tech optimism was everywhere. At the same time, wages and productivity were rocketing upward. The situation now is completely different. The most recent jobs reports in America and Britain tell the tale. Employment is growing, month after month after month. But wage growth is abysmal. So is productivity growth: not surprising in economies where there are lots of people on the job working for low pay.

The obvious conclusion, the one lots of people are drawing, is that the robot threat is totally overblown: the fantasy, perhaps, of a bubble-mad Silicon Valley - or an effort to distract from workers' real problems, trade and excessive corporate power. Generally speaking, the problem is not that we've got too much amazing new technology but too little.

This is not a strawman of my own invention. Robert Gordon makes this case. You can see Matt Yglesias make it here. * Duncan Weldon, for his part, writes: **

"We are debating a problem we don't have, rather than facing a real crisis that is the polar opposite. Productivity growth has slowed to a crawl over the last 15 or so years, business investment has fallen and wage growth has been weak. If the robot revolution truly was under way, we would see surging capital expenditure and soaring productivity. Right now, that would be a nice 'problem' to have. Instead we have the reality of weak growth and stagnant pay. The real and pressing concern when it comes to the jobs market and automation is that the robots aren't taking our jobs fast enough."

And in a recent blog post Paul Krugman concluded: ***

"I'd note, however, that it remains peculiar how we're simultaneously worrying that robots will take all our jobs and bemoaning the stalling out of productivity growth. What is the story, really?"

What is the story, indeed. Let me see if I can tell one. Last fall I published a book: "The Wealth of Humans". In it I set out how rapid technological progress can coincide with lousy growth in pay and productivity. Start with this:

"Low labour costs discourage investments in labour-saving technology, potentially reducing productivity growth."

...

* http://www.vox.com/2015/7/27/9038829/automation-myth

** http://www.prospectmagazine.co.uk/magazine/droids-wont-steal-your-job-they-could-make-you-rich

*** https://krugman.blogs.nytimes.com/2017/02/24/maid-in-america/

anne -> anne... , March 21, 2017 at 06:38 AM
https://twitter.com/paulkrugman/status/843167658577182725

Paul Krugman @paulkrugman

But is Ryan Avent saying something different * from the assertion that recent technological progress is capital-biased? **

* https://medium.com/@ryanavent_93844/the-productivity-paradox-aaf05e5e4aad#.kmb49lrgd

** http://krugman.blogs.nytimes.com/2012/12/08/rise-of-the-robots/

If so, what?

https://krugman.blogs.nytimes.com/2012/12/26/capital-biased-technological-progress-an-example-wonkish/

11:30 AM - 18 Mar 2017

anne -> anne... , March 21, 2017 at 07:00 AM
This is an old concern in economics; it's "capital-biased technological change," which tends to shift the distribution of income away from workers to the owners of capital....

-- Paul Krugman

anne -> anne... , March 21, 2017 at 06:40 AM
http://krugman.blogs.nytimes.com/2012/12/08/rise-of-the-robots/

December 8, 2012

Rise of the Robots
By Paul Krugman

Catherine Rampell and Nick Wingfield write about the growing evidence * for "reshoring" of manufacturing to the United States. They cite several reasons: rising wages in Asia; lower energy costs here; higher transportation costs. In a followup piece, ** however, Rampell cites another factor: robots.

"The most valuable part of each computer, a motherboard loaded with microprocessors and memory, is already largely made with robots, according to my colleague Quentin Hardy. People do things like fitting in batteries and snapping on screens.

"As more robots are built, largely by other robots, 'assembly can be done here as well as anywhere else,' said Rob Enderle, an analyst based in San Jose, California, who has been following the computer electronics industry for a quarter-century. 'That will replace most of the workers, though you will need a few people to manage the robots.' "

Robots mean that labor costs don't matter much, so you might as well locate in advanced countries with large markets and good infrastructure (which may soon not include us, but that's another issue). On the other hand, it's not good news for workers!

This is an old concern in economics; it's "capital-biased technological change," which tends to shift the distribution of income away from workers to the owners of capital.

Twenty years ago, when I was writing about globalization and inequality, capital bias didn't look like a big issue; the major changes in income distribution had been among workers (when you include hedge fund managers and CEOs among the workers), rather than between labor and capital. So the academic literature focused almost exclusively on "skill bias", supposedly explaining the rising college premium.

But the college premium hasn't risen for a while. What has happened, on the other hand, is a notable shift in income away from labor:

[Graph]

If this is the wave of the future, it makes nonsense of just about all the conventional wisdom on reducing inequality. Better education won't do much to reduce inequality if the big rewards simply go to those with the most assets. Creating an "opportunity society," or whatever it is the likes of Paul Ryan etc. are selling this week, won't do much if the most important asset you can have in life is, well, lots of assets inherited from your parents. And so on.

I think our eyes have been averted from the capital/labor dimension of inequality, for several reasons. It didn't seem crucial back in the 1990s, and not enough people (me included!) have looked up to notice that things have changed. It has echoes of old-fashioned Marxism - which shouldn't be a reason to ignore facts, but too often is. And it has really uncomfortable implications.

But I think we'd better start paying attention to those implications.

* http://www.nytimes.com/2012/12/07/technology/apple-to-resume-us-manufacturing.html

** http://economix.blogs.nytimes.com/2012/12/07/when-cheap-foreign-labor-gets-less-cheap/

anne -> anne... , March 21, 2017 at 06:43 AM
https://fred.stlouisfed.org/graph/?g=d4ZY

January 30, 2017

Compensation of employees as a share of Gross Domestic Income, 1948-2015


https://fred.stlouisfed.org/graph/?g=d507

January 30, 2017

Compensation of employees as a share of Gross Domestic Income, 1948-2015

(Indexed to 1948)

supersaurus -> anne... , March 21, 2017 at 01:23 PM
"The most valuable part of each computer, a motherboard loaded with microprocessors and memory, is already largely made with robots, according to my colleague Quentin Hardy. People do things like fitting in batteries and snapping on screens.

"...already largely made..."? already? circuit boards were almost entirely populated by machines by 1985, and after the rise of surface mount technology you could drop the "almost". in 1990 a single machine could place 40k+/hour parts small enough they were hard to pick up with fingers.

anne : , March 21, 2017 at 06:37 AM
https://krugman.blogs.nytimes.com/2017/03/20/robot-geometry-very-wonkish/

March 20, 2017

Robot Geometry (Very Wonkish)
By Paul Krugman

And now for something completely different. Ryan Avent has a nice summary * of the argument in his recent book, trying to explain how dramatic technological change can go along with stagnant real wages and slowish productivity growth. As I understand it, he's arguing that the big tech changes are happening in a limited sector of the economy, and are driving workers into lower-wage and lower-productivity occupations.

But I have to admit that I was having a bit of a hard time wrapping my mind around exactly what he's saying, or how to picture this in terms of standard economic frameworks. So I found myself wanting to see how much of his story could be captured in a small general equilibrium model - basically the kind of model I learned many years ago when studying the old trade theory.

Actually, my sense is that this kind of analysis is a bit of a lost art. There was a time when most of trade theory revolved around diagrams illustrating two-country, two-good, two-factor models; these days, not so much. And it's true that little models can be misleading, and geometric reasoning can suck you in way too much. It's also true, however, that this style of modeling can help a lot in thinking through how the pieces of an economy fit together, in ways that algebra or verbal storytelling can't.

So, an exercise in either clarification or nostalgia - not sure which - using a framework that is basically the Lerner diagram, ** adapted to a different issue.

Imagine an economy that produces only one good, but can do so using two techniques, A and B, one capital-intensive, one labor-intensive. I represent these techniques in Figure 1 by showing their unit input coefficients:

[Figure 1]

Here AB is the economy's unit isoquant, the various combinations of K and L it can use to produce one unit of output. E is the economy's factor endowment; as long as the aggregate ratio of K to L is between the factor intensities of the two techniques, both will be used. In that case, the wage-rental ratio will be the slope of the line AB.

Wait, there's more. Since any point on the line passing through A and B has the same value, the place where it hits the horizontal axis is the amount of labor it takes to buy one unit of output, the inverse of the real wage rate. And total output is the ratio of the distance along the ray to E divided by the distance to AB, so that distance is 1/GDP.

You can also derive the allocation of resources between A and B; not to clutter up the diagram even further, I show this in Figure 2, which uses the K/L ratios of the two techniques and the overall endowment E:

[Figure 2]
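Since the figures themselves are not reproduced here, a minimal numerical sketch of the two-technique setup may help (Python; the unit input coefficients and endowment below are illustrative assumptions, not values from Krugman's post). The zero-profit conditions pin down factor prices, and full employment of the endowment pins down how much output flows through each technique:

    # Two techniques produce one good; coefficients are (capital, labor)
    # needed per unit of output. Illustrative assumptions only.
    aK_A, aL_A = 0.8, 0.2   # technique A: capital-intensive
    aK_B, aL_B = 0.3, 0.7   # technique B: labor-intensive
    K, L = 100.0, 100.0     # factor endowment E

    def solve2(a, b, c, d, e, f):
        # Solve a*x + b*y = e and c*x + d*y = f by Cramer's rule.
        det = a * d - b * c
        return (e * d - b * f) / det, (a * f - e * c) / det

    # Zero-profit conditions with output price 1: aK*r + aL*w = 1 for each
    # technique in use, which pins down the rental r and the wage w.
    r, w = solve2(aK_A, aL_A, aK_B, aL_B, 1.0, 1.0)

    # Full employment: outputs xA and xB must together absorb K and L.
    xA, xB = solve2(aK_A, aK_B, aL_A, aL_B, K, L)

    print(f"wage w = {w:.3f}, rental r = {r:.3f}, w/r = {w / r:.3f}")
    print(f"output via A = {xA:.1f}, via B = {xB:.1f}, GDP = {xA + xB:.1f}")

Rerunning with A's coefficients shifted southwest (say 0.72 and 0.12, the capital-using technical progress of Figure 3) makes the wage fall from 1.000 to about 0.897 while GDP rises, which is the real-wage decline the post derives geometrically.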

Now, Avent's story. I think it can be represented as technical progress in A, perhaps also making A even more capital-intensive. So this would amount to a movement southwest to a point like A' in Figure 3:

[Figure 3]

We can see right away that this will lead to a fall in the real wage, because 1/w must rise. GDP and hence productivity does rise, but maybe not by much if the economy was mostly using the labor-intensive technique.

And what about allocation of labor between sectors? We can see this in Figure 4, where capital-using technical progress in A actually leads to a higher share of the work force being employed in labor-intensive B:

[Figure 4]

So yes, it is possible for a simple general equilibrium analysis to capture a lot of what Avent is saying. That does not, of course, mean that he's empirically right. And there are other things in his argument, such as hypothesized effects on the direction of innovation, that aren't in here.

But I, at least, find this way of looking at it somewhat clarifying - which, to be honest, may say more about my weirdness and intellectual age than it does about the subject.

* https://medium.com/@ryanavent_93844/the-productivity-paradox-aaf05e5e4aad#.v9et5b98y

** http://www-personal.umich.edu/~alandear/writings/Lerner.pdf

Shah of Bratpuhr : , March 21, 2017 at 07:27 AM
Median Wealth per adult (table ends at $40k)

1. Switzerland $244,002
2. Iceland $188,088
3. Australia $162,815
4. Belgium $154,815
5. New Zealand $135,755
6. Norway $135,012
7. Luxembourg $125,452
8. Japan $120,493
9. United Kingdom $107,865
10. Italy $104,105
11. Singapore $101,386
12. France $ 99,923
13. Canada $ 96,664
14. Netherlands $ 81,118
15. Ireland $ 80,668
16. Qatar $ 74,820
17. Korea $ 64,686
18. Taiwan $ 63,134
19. United Arab Emirates $ 62,332
20. Spain $ 56,500
21. Malta $ 54,562
22. Israel $ 54,384
23. Greece $ 53,266
24. Austria $ 52,519
25. Finland $ 52,427
26. Denmark $ 52,279
27. United States $ 44,977
28. Germany $ 42,833
29. Kuwait $ 40,803

http://www.middleclasspoliticaleconomist.com/2017/03/us-has-worst-wealth-inequality-of-any.html

reason -> Shah of Bratpuhr... , March 21, 2017 at 08:17 AM
I think this illustrates my point very clearly. If you had charts of wealth by age it would be even clearer. Without knowledge of the discounted expected value of public pensions it is hard to draw any conclusions from this list.

I know very definitely that in Australia and the UK people are very reliant on superannuation and housing assets. In both Australia and the UK it is common to sell expensive housing in the capital and move to cheaper coastal locations upon retirement, investing the capital to provide retirement income. Hence a larger median wealth is NEEDED.

It is hard otherwise to explain the much higher median wealth in Australia and the UK.

Shah of Bratpuhr : , March 21, 2017 at 07:28 AM
Median Wealth    Average Wealth    Ratio (Average/Median)

1. United States $ 44,977 $344,692 7.66
2. Denmark $ 52,279 $259,816 4.97
3. Germany $ 42,833 $185,175 4.32
4. Austria $ 52,519 $206,002 3.92
5. Israel $ 54,384 $176,263 3.24
6. Kuwait $ 40,803 $119,038 2.92
7. Finland $ 52,427 $146,733 2.80
8. Canada $ 96,664 $270,179 2.80
9. Taiwan $ 63,134 $172,847 2.74
10. Singapore $101,386 $276,885 2.73
11. United Kingdom $107,865 $288,808 2.68
12. Ireland $ 80,668 $214,589 2.66
13. Luxembourg $125,452 $316,466 2.52
14. Korea $ 64,686 $159,914 2.47
15. France $ 99,923 $244,365 2.45
16. United Arab Emirates $ 62,332 $151,098 2.42
17. Norway $135,012 $312,339 2.31
18. Australia $162,815 $375,573 2.31
19. Switzerland $244,002 $561,854 2.30
20. Netherlands $ 81,118 $184,378 2.27
21. New Zealand $135,755 $298,930 2.20
22. Iceland $188,088 $408,595 2.17
23. Qatar $ 74,820 $161,666 2.16
24. Malta $ 54,562 $116,185 2.13
25. Spain $ 56,500 $116,320 2.06
26. Greece $ 53,266 $103,569 1.94
27. Italy $104,105 $202,288 1.94
28. Japan $120,493 $230,946 1.92
29. Belgium $154,815 $270,613 1.75

http://www.middleclasspoliticaleconomist.com/2017/03/us-has-worst-wealth-inequality-of-any.html
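The third column above is the average-to-median ratio, a rough indicator of how concentrated wealth is; a quick check of the first row (a trivial Python snippet, numbers straight from the table):

    # Average/median wealth ratio for the United States row above.
    median, average = 44_977, 344_692
    print(round(average / median, 2))   # 7.66, the most skewed ratio in the list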

spencer : , March 21, 2017 at 08:06 AM
Ryan Avent's analysis demonstrates what is wrong with the libertarian, right wing belief that cheap labor is the answer to every problem when in truth cheap labor is the source of many of our problems.
reason -> spencer... , March 21, 2017 at 08:22 AM
Spencer,
as I have said before, I don't really care too much what wages are - I care about income. It is low income that is the problem. I'm a UBI guy; if money is spread around, and workers can say no to exploitation, low wages will not be a problem.
Sanjait : , March 21, 2017 at 09:32 AM
This looks good, but also reductive.

Have we not seen a massive shift in pretax income distribution? Yes ... which tells me that changes in tax rate structures are not the only culprit. Though they are an important culprit.

reason -> Sanjait... , March 21, 2017 at 09:40 AM
Maybe - but
1. changes in taxes can affect incentives (especially think of real investment and corporate taxes and also personal income taxes and executive remuneration);
2. changes in the distribution of purchasing power can affect the way growth in the economy occurs;
3. changes in taxes also affect government spending and government spending tends to be more progressively distributed than private income.

Remember the rule: ceteris is NEVER paribus.

Longtooth : , March 21, 2017 at 12:28 PM
Word to the wise:

Think: Services and Goods

Composite Services labor hours increase with poor productivity growth - output per hour of labor input. Composite measure of service industry output is notoriously problematic (per BLS BEA).

Goods labor hours decrease with increasing productivity growth. Goods output per hour easy to measure and with the greatest experience and knowledge.

Put this together and the composite national productivity growth rate slows as services consume a growing share of labor hours.

Simple arithmetic.
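A back-of-the-envelope sketch of that arithmetic (Python; the sector growth rates are made-up illustrative assumptions). Composite productivity growth, approximated here as an hours-weighted average, falls as the services share of hours rises, even though neither sector's own rate declines:

    # Illustrative assumption: goods productivity grows fast, services slowly.
    goods_growth, services_growth = 0.03, 0.005
    for services_share in (0.5, 0.7, 0.9):   # services' share of labor hours
        composite = ((1 - services_share) * goods_growth
                     + services_share * services_growth)
        print(f"services hour share {services_share:.0%}: "
              f"composite growth {composite:.2%}")
    # Prints 1.75%, 1.25%, 0.75%: the mix shift alone drags the composite down.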

Elaboration on Services productivity measures:

Now add the composite retail clerk labor hours to engineering labor hours... which dominates in composite labor hours? Duh! So even in services the productivity is weighted heavily to the lowest productivity job market.

Substitute Hospitality services for Retail Clerk services. Substitute truck drivers services for Hospitality Services, etc., etc., etc.

I have spent years tracking productivity in goods production of various types ... mining, non-tech hardware production, high tech hardware production in various sectors of high tech. The present rates of productivity growth continue to climb (never decline) relative to the past rates in each goods production sector measured by themselves.

But the proportion of hours in goods production in U.S. is and has been in continual decline even while value of output has increased in each sector of goods production.

Here's an interesting way to start thinking about Services productivity.

There used to be a reasonably large services sector in leisure and business travel agents. Now there is nearly none... it has been replaced by on-line computer-based booking. So travel agent or equivalent labor hours are now near zippo. Productivity of travel agents went through the roof in the 1990's & 2000's as the number of people / labor hours dropped like a rock. Where did those labor hours end up? They went to lower paying services or left the labor market entirely. So lower paying, lower productivity services increased as a proportion of all services, which in composite reduced total services productivity.

You can do the same analysis for hundreds of service jobs that no longer even exist at all --- switchboard operators, for example, went the way of buggy whip makers and horse-shoe services.

Now take a little ride into the future... the not-too-distant future. When autonomous vehicles become the norm, or even a large proportion of vehicles, and commercial drivers (taxis, trucking, delivery services) go the way of horse-shoe services, the labor hours for those services (land transportation of goods & people) will drop precipitously even as unit deliveries increase. Productivity in that service goes through the roof, but since almost no labor hours remain in it, the composite effect on services productivity will be a drop, because the displaced labor hours will end up in a lower-productivity services sector or out of the labor market entirely.

Longtooth -> Longtooth... , March 21, 2017 at 12:42 PM
Economists are having problems reconciling composite productivity growth rates with increasing rates of automation. So they end up saying there is "no evidence" of automation taking jobs, or something to the effect of "not to fear, robotics isn't evident as a problem we have to worry about".

But they know by observation all around them that automation is increasing productivity in the goods sector, so they can't really discount automation as an issue without shutting their eyes to everything they see with their "lying eyes". Thus they know deep down that they will have to reconcile this with BLS and BEA measures.

Ten years ago this wasn't even on economists' radars. Today it's at least being looked into with more serious effort.

Ten years ago politicians weren't even aware of the possibility of any issues with increasing rates of automation... they thought it had always increased along with increasing labor demand and growth, so why would that ever change? Ten years ago they concluded it couldn't, without even thinking about it for a moment. Today it's on their radar, at least as something that bears perhaps a little more thought.

Not to worry though... in ten more years they'll either have real reason to worry staring them in the face, or they'll have figured out why they were so blind before.

Reminds me of the "shadow banking" enterprises that they didn't recognize either until after the fact.

Longtooth -> Longtooth... , March 21, 2017 at 12:48 PM
Or that they thought the risk rating agencies were providing independent and valid risk analysis, so the economists couldn't reconcile the "low level" of market risk with everything else, so they just assumed "everything" else was really ok too... must be "irrational exuberance" that's to blame.
Longtooth : , March 21, 2017 at 01:04 PM
Let me add that the term "robotics" is a subset of automation. The major distinction is only that a form of automation that includes some type of 'articulation' and/or some type of dynamic decision making on the fly (computational branching decision making at nanosecond speeds) is termed 'robotics', because articulation and dynamic decision making are associated with human capabilities rather than automatic machines.

It makes no difference whether productivity gains occur by an articulated machine or one that isn't... automation just means replacing people's labor with something that improves humans' capacity to produce an output.

When mechanical leverage was invented 3000 or more years ago it was a form of automation, enabling humans to lift, move heavier objects with less human effort (less human energy).

Longtooth -> Longtooth... , March 21, 2017 at 01:18 PM
I meant 3000 years BC.... 5000 years ago or more.

[Mar 17, 2017] Maybe the machines are not actually eating our jobs, since productivity has stalled in the US for more than a decade.

Notable quotes:
"... Motivated empiricism, which is what he is describing, is just as misleading as ungrounded theorizing unsupported by empirical data. Indeed, even in the sciences with well established, strong testing protocols are suffering from a replication crisis. ..."
"... I liked the Dorman piece at Econospeak as well. He writes well and explains things well in a manner that makes it easy for non-experts to understand. ..."
Mar 17, 2017 | economistsview.typepad.com
DrDick : , March 16, 2017 at 07:19 AM
The Brookings piece ( Understanding US productivity trends from the bottom-up - Brookings Institution ) would suggest that maybe the machines are not actually eating our jobs, since productivity has stalled in the US for more than a decade.

The Dorman piece at Econospeak ( Economic Empiricism on the Hubris-Humility Spectrum? - EconoSpeak ) is also interesting and I think I agree with him.

Motivated empiricism, which is what he is describing, is just as misleading as ungrounded theorizing unsupported by empirical data. Indeed, even the sciences with well established, strong testing protocols are suffering from a replication crisis.

Peter K. -> DrDick ... , March 16, 2017 at 09:18 AM
Of course Sanjait will ignore the Brookings piece.

I liked the Dorman piece at Econospeak as well. He writes well and explains things well in a manner that makes it easy for non-experts to understand.

Unlike other writers we know.

[Mar 06, 2017] Robots are Wealth Creators and Taxing Them is Illogical

Notable quotes:
"... His prescription in the end is the old and tired "invest in education and retraining", i.e. "symbolic analyst jobs will replace the lost jobs" like they have for decades (not). ..."
"... "Governments will, however, have to concern themselves with problems of structural joblessness. They likely will need to take a more explicit role in ensuring full employment than has been the practice in the US." ..."
"... Instead, we have been shredding the safety net and job training / creation programs. There is plenty of work that needs to be done. People who have demand for goods and services find them unaffordable because the wealthy are capturing all the profits and use their wealth to capture even more. Trade is not the problem for US workers. Lack of investment in the US workforce is the problem. We don't invest because the dominant white working class will not support anything that might benefit blacks and minorities, even if the major benefits go to the white working class ..."
"... Really nice if your sitting in the lunch room of the University. Especially if you are a member of the class that has been so richly awarded, rather than the class who paid for it. Humph. The discussion is garbage, Political opinion by a group that sat by ... The hypothetical nuance of impossible tax policy. ..."
"... The concept of Robots leaving us destitute, is interesting. A diversion. It ain't robots who are harvesting the middle class. It is an entitled class of those who gave so little. ..."
"... Summers: "Let them eat training." ..."
"... Suddenly then, Bill Gates has become an accomplished student of public policy who can command an audience from Lawrence Summers who was unable to abide by the likes of the prophetic Brooksley Born who was chair of the Commodity Futures Trading Commission or the prophetic professor Raghuram Rajan who would become Governor of the Reserve Bank of India. Agreeing with Bill Gates however is a "usual" for Summers. ..."
"... Until about a decade or so ago many states I worked in had a "tangible property" or "personal property" tax on business equipment, and sometimes on equipment + average inventory. Someday I will do some research and see how many states still do this. Anyway a tax on manufacturing equipment, retail fixtures and computers and etc. is hardly novel or unusual. So why would robots be any different? ..."
"... Thank you O glorious technocrats for shining the light of truth on humanity's path into the future! Where, oh where, would we be without our looting Benevolent Overlords and their pompous lapdogs (aka Liars in Public Places)? ..."
"... While he is overrated, he is not completely clueless. He might well be mediocre (or slightly above this level) but extremely arrogant defender of the interests of neoliberal elite. Rubin's boy Larry as he was called in the old days. ..."
"... BTW he was Rubin's hatchet man for eliminating Brooksley Born attempt to regulate the derivatives and forcing her to resign: ..."
Mar 05, 2017 | economistsview.typepad.com
Larry Summers: Robots are wealth creators and taxing them is illogical : I usually agree with Bill Gates on matters of public policy and admire his emphasis on the combined power of markets and technology. But I think he went seriously astray in a recent interview when he proposed, without apparent irony, a tax on robots to cushion worker dislocation and limit inequality. ....

pgl : , March 05, 2017 at 02:16 PM

Has Summers gone all supply-side on us? Start with his title:

"Robots are wealth creators and taxing them is illogical"

I bet Bill Gates might reply – "my company is a wealth creator so it should not be taxed". Oh wait – Microsoft is already shifting profits to tax havens. Summers states:

"Third, and perhaps most fundamentally, why tax in ways that reduce the size of the pie rather than ways that assure that the larger pie is well distributed? Imagine that 50 people can produce robots who will do the work of 100. A sufficiently high tax on robots would prevent them from being produced."

Yep – he has gone all supply-side on us.

cm -> pgl... , March 05, 2017 at 02:46 PM
Summers makes one, and only one, good and relevant point - that in many cases, robots/automation will not produce more product from the same inputs but better products. That's in his words; I would replace "better" with "more predictable quality/less variability" - in both directions. And that the more predictable quality aspect is hard or impossible to distinguish from higher productivity (in some cases they may be exactly the same, e.g. by streamlining QA and reducing rework/pre-sale repairs).

His prescription in the end is the old and tired "invest in education and retraining", i.e. "symbolic analyst jobs will replace the lost jobs" like they have for decades (not).

anne -> cm... , March 05, 2017 at 04:36 PM
Incisive all the way through.
jonny bakho -> pgl... , March 05, 2017 at 02:52 PM
Pundits do not write titles, editors do. Tax the profits, not the robots.

The crux of the argument is this:

"Governments will, however, have to concern themselves with problems of structural joblessness. They likely will need to take a more explicit role in ensuring full employment than has been the practice in the US."

Instead, we have been shredding the safety net and job training / creation programs. There is plenty of work that needs to be done. People who have demand for goods and services find them unaffordable because the wealthy are capturing all the profits and use their wealth to capture even more. Trade is not the problem for US workers. Lack of investment in the US workforce is the problem. We don't invest because the dominant white working class will not support anything that might benefit blacks and minorities, even if the major benefits go to the white working class.

pgl -> jonny bakho... , March 05, 2017 at 03:35 PM
"Tax the profits, not the robots." Exactly. I suspect this is how it would have to work since the company owns the robots.
cm -> pgl... , March 05, 2017 at 03:53 PM
In principle taxing profits is preferable, but has a few downsides/differences:

Not very strong points, and I didn't read the Gates interview so I don't know his detailed motivation to propose specifically a robot tax.

cm -> pgl... , March 05, 2017 at 03:58 PM
When I was in Amsterdam a few years ago, they had come up with another perfidious scheme to cut people out of the loop or "incentivize" people to use the machines - in a large transit center, you could buy tickets at a vending machine or a counter with a person - and for the latter you would have to pay a not-so-modest "personal service" surcharge (50c for a EUR 2-3 or so ticket - I think it was a flat fee, but may have been staggered by type of service).

Maybe I misunderstood it and it was a "congestion charge" to prevent lines so people who have to use counter service e.g. with questions don't have to wait.

cm -> cm... , March 05, 2017 at 04:03 PM
And then you may have heard (in the US) the term "convenience fee" which I found rather insulting when I encountered it. It suggests you are charged for your convenience, but it is to cover payment processor costs (productivity enhancing automation!).
anne -> cm... , March 05, 2017 at 04:59 PM
And then you may have heard (in the US) the term "convenience fee" which I found rather insulting when I encountered it. It suggests you are charged for your convenience, but it is to cover payment processor costs (productivity enhancing automation!)

[ Wonderful. ]

JohnH -> pgl... , March 05, 2017 at 06:43 PM
Why not simplify things and just tax capital? We already tax property. Why not extend that to all capital?
Paine -> jonny bakho... , March 05, 2017 at 05:10 PM
Lack of adequate compensation to the lower half of the job force is the problem. Lack of persistent big macro demand is the problem. A global trading system that doesn't automatically move forex rates toward universal trading-zone balance, and away from persistent surplus and deficit traders, is the problem.

Technology is never the root problem. Population dynamics is never the root problem.

anne -> Paine... , March 05, 2017 at 05:31 PM
https://fred.stlouisfed.org/graph/?g=cVq0

January 15, 2017

Nonfarm Business Productivity and Real Median Household Income, 1953-2015

(Indexed to 1953)

anne -> Paine... , March 05, 2017 at 05:35 PM
https://fred.stlouisfed.org/graph/?g=cOU6

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1952-2016

(Indexed to 1952)

Mr. Bill -> anne... , March 05, 2017 at 06:30 PM
Really nice if you're sitting in the lunch room of the University. Especially if you are a member of the class that has been so richly awarded, rather than the class who paid for it. Humph. The discussion is garbage, political opinion by a group that sat by ... The hypothetical nuance of impossible tax policy.
Mr. Bill -> pgl... , March 05, 2017 at 06:04 PM
The concept of robots leaving us destitute is interesting. A diversion. It ain't robots who are harvesting the middle class. It is an entitled class of those who gave so little.
run75441 -> Mr. Bill... , March 05, 2017 at 06:45 PM
Sigh.

After one five-axis CNC cell replaces 5 other machines and 4 of the workers, what happens to the four workers?

The issue is the efficiency achieved through better throughput forcing the loss of wages. If you use the 5-axis CNC, tax the output from it no more than what would have been paid to the 4 workers plus the overhead for them. The labor cost plus the overhead cost is what is eliminated by the 5-axis CNC.

It is not a diversion. It is a reality.
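A minimal sketch of that cap (Python; the wage and overhead figures are illustrative assumptions, not numbers from the comment):

    # Cap the robot tax at the labor-plus-overhead cost the CNC cell eliminates.
    displaced_workers = 4
    annual_wage = 45_000    # assumed wage per displaced worker
    overhead_rate = 0.35    # assumed benefits/overhead as a share of wages
    tax_cap = displaced_workers * annual_wage * (1 + overhead_rate)
    print(f"maximum annual tax on the cell's output: ${tax_cap:,.0f}")  # $243,000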

anne -> anne... , March 05, 2017 at 02:20 PM
http://krugman.blogs.nytimes.com/2009/01/03/economists-behaving-badly/

January 3, 2009

Economists Behaving Badly
By Paul Krugman

Ouch. The Wall Street Journal's Real Time Economics blog has a post * linking to Raghuram Rajan's prophetic 2005 paper ** on the risks posed by securitization - basically, Rajan said that what did happen, could happen - and to the discussion at the Jackson Hole conference by Federal Reserve vice-chairman Don Kohn *** and others. **** The economics profession does not come off very well.

Two things are really striking here. First is the obsequiousness toward Alan Greenspan. To be fair, the 2005 Jackson Hole event was a sort of Greenspan celebration; still, it does come across as excessive - dangerously close to saying that if the Great Greenspan says something, it must be so. Second is the extreme condescension toward Rajan - a pretty serious guy - for having the temerity to suggest that maybe markets don't always work to our advantage. Larry Summers, I'm sorry to say, comes off particularly badly. Only my colleague Alan Blinder, defending Rajan "against the unremitting attack he is getting here for not being a sufficiently good Chicago economist," emerges with honor.

* http://blogs.wsj.com/economics/2009/01/01/ignoring-the-oracles/

** http://www.kc.frb.org/publicat/sympos/2005/PDF/Rajan2005.pdf

*** http://www.kc.frb.org/publicat/sympos/2005/PDF/Kohn2005.pdf

**** https://www.kansascityfed.org/publicat/sympos/2005/PDF/GD5_2005.pdf

cm -> pgl... , March 05, 2017 at 03:07 PM
No, his argument is much broader. Summers stops at "no new taxes and education/retraining". And I find it highly dubious that compensation/accommodation for workers can be adequately funded out of robot taxes.

Baker goes far beyond that.

cm -> cm... , March 05, 2017 at 03:09 PM
What Baker mentioned: mandatory severance, shorter work hours or more vacations due to productivity, funding infrastructure.

Summers: "Let them eat training."

Paine -> anne... , March 05, 2017 at 05:19 PM
We should never assign a social task to the wrong institution. Firms should be unencumbered by draconian hire and fire constraints. The state should provide the compensation for layoffs and firings. The state should maintain an adequate local Beveridge ratio of job openings to job applicants.

Firms' task is productivity maximization subject to externality offsets, including output price changes and various other third-party impacts.

anne -> anne... , March 05, 2017 at 02:33 PM
Correcting:

Suddenly then, Bill Gates has become an accomplished student of public policy who can command an audience from Lawrence Summers who was unable to abide by the likes of the prophetic Brooksley Born who was chair of the Commodity Futures Trading Commission or the prophetic professor Raghuram Rajan who would become Governor of the Reserve Bank of India. Agreeing with Bill Gates however is a "usual" for Summers.

Tom aka Rusty : , March 05, 2017 at 02:19 PM
Until about a decade or so ago many states I worked in had a "tangible property" or "personal property" tax on business equipment, and sometimes on equipment + average inventory. Someday I will do some research and see how many states still do this. Anyway a tax on manufacturing equipment, retail fixtures and computers and etc. is hardly novel or unusual. So why would robots be any different?
pgl -> Tom aka Rusty... , March 05, 2017 at 02:38 PM
I suspect it is the motivation of Gates as in what he would do with the tax revenue. And Gates might be thinking of a higher tax rate for robots than for your garden variety equipment.
Paine -> Tom aka Rusty... , March 05, 2017 at 05:22 PM
There is no difference beyond spin.
Paine -> Paine... , March 05, 2017 at 05:28 PM
Yes, some equipment inside any one firm complements existing labor inside that firm, including already-installed robots. New robots are rivals.

Rivals that, if subject to a special "introduction tax", could see their installation deterred.
As in
the 50-for-100 swap: the 50 hours embodied in the robot
replace 100 similarly paid production-line hours.
But ...

there's a 100% purchase tax on the robots.

Why bother to invest in the productivity increase
if there are no other savings?
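The arithmetic behind that, as a small Python sketch (the 50 and 100 hour figures come from the comment; the intermediate tax rate is added for illustration):

    # A robot embodying 50 hours of labor replaces 100 similarly paid hours.
    wage = 1.0                    # same pay rate on both sides of the swap
    robot_cost = 50 * wage        # hours embodied in the robot
    labor_replaced = 100 * wage   # production-line hours displaced
    for tax_rate in (0.0, 0.5, 1.0):
        net_saving = labor_replaced - robot_cost * (1 + tax_rate)
        print(f"purchase tax {tax_rate:.0%}: net saving {net_saving:+.0f} hours")
    # At a 100% purchase tax the saving is zero, so installation is deterred.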

anne : , March 05, 2017 at 02:28 PM
http://cepr.net/blogs/beat-the-press/bill-gates-wants-to-undermine-donald-trump-s-plans-for-growing-the-economy

February 20, 2017

Bill Gates Wants to Undermine Donald Trump's Plans for Growing the Economy

Yes, as Un-American as that may sound, Bill Gates is proposing * a tax that would undermine Donald Trump's efforts to speed the rate of economic growth. Gates wants to tax productivity growth (also known as "automation") slowing down the rate at which the economy becomes more efficient.

This might seem a bizarre policy proposal at a time when productivity growth has been at record lows, ** *** averaging less than 1.0 percent annually for the last decade. This compares to rates of close to 3.0 percent annually from 1947 to 1973 and again from 1995 to 2005.

It is not clear if Gates has any understanding of economic data, but since the election of Donald Trump there has been a major effort to deny the fact that the trade deficit has been responsible for the loss of manufacturing jobs and to instead blame productivity growth. This is in spite of the fact that productivity growth has slowed sharply in recent years and that the plunge in manufacturing jobs followed closely on the explosion of the trade deficit, beginning in 1997.

[Manufacturing Employment, 1970-2017]

Anyhow, as Paul Krugman pointed out in his column **** today, if Trump is to have any hope of achieving his growth target, he will need a sharp uptick in the rate of productivity growth from what we have been seeing. Bill Gates is apparently pushing in the opposite direction.

* https://qz.com/911968/bill-gates-the-robot-that-takes-your-job-should-pay-taxes/

** https://fred.stlouisfed.org/graph/?g=cABu

*** https://fred.stlouisfed.org/graph/?g=cABr

**** https://www.nytimes.com/2017/02/20/opinion/on-economic-arrogance.html

-- Dean Baker

anne -> anne... , March 05, 2017 at 02:30 PM
https://fred.stlouisfed.org/graph/?g=cABu

January 4, 2017

Nonfarm Business Labor Productivity, * 1948-2016

* Output per hour of all persons

(Percent change)


https://fred.stlouisfed.org/graph/?g=cABr

January 4, 2017

Nonfarm Business Labor Productivity, * 1948-2016

* Output per hour of all persons

(Indexed to 1948)

anne -> anne... , March 05, 2017 at 02:32 PM
https://fred.stlouisfed.org/graph/?g=cN2z

January 15, 2017

Manufacturing employment, 1970-2017


https://fred.stlouisfed.org/graph/?g=cN2H

January 15, 2017

Manufacturing employment, 1970-2017

(Indexed to 1970)

Ron Waller : , March 05, 2017 at 02:43 PM
Yes, it's far better that our betters in the upper class get all the benefits from productivity growth. Without their genetic entitlement to wealth others created, we would just be savages murdering one another in the streets.

These Masters of the Universe of ours put the 'civil' in our illustrious civilization. (Sure it's a racist barbarian concentration camp on the verge of collapse into fascist revolutions and world war. But, again, far better than people murdering one another in the streets!)

People who are displaced from automation are simply moochers and it's only right that they are cut out of the economy and left to die on the streets. This is the law of Nature: survival of the fittest. Social Darwinism is inescapable. It's what makes us human!

Instead of just waiting for people displaced from automation to die on the streets, we should do the humane thing and establish concentration camps so they are quickly dispatched to the Void. (Being human means being merciful!)

Thank you O glorious technocrats for shining the light of truth on humanity's path into the future! Where, oh where, would we be without our looting Benevolent Overlords and their pompous lapdogs (aka Liars in Public Places)?

Peter K. : , March 05, 2017 at 03:14 PM
I think it would be good if the tax was used to help dislocated workers and help with inequality as Gates suggests. However Summers and Baker have a point that it's odd to single out robots when you could tax other labor-saving, productivity-enhancing technologies as well.

Baker suggests taxing profits instead. I like his idea about the government taking stock of companies and collecting taxes that way.

"They likely will need to take a more explicit role in ensuring full employment than has been the practice in the US.

Among other things, this will mean major reforms of education and retraining systems, consideration of targeted wage subsidies for groups with particularly severe employment problems, major investments in infrastructure and, possibly, direct public employment programmes."

Not your usual neoliberal priorities. Compare with Hillary's program.

greg : , March 05, 2017 at 03:34 PM
All taxes are a reallocation of wealth. Not taxing wealth creators is impossible.

On the other hand, any producer who is not taxed will expand at the expense of those producers who are taxed. This we are seeing with respect to mechanical producers and human labor. Labor is helping to subsidize its replacement.

Interesting that Summers apparently doesn't see this.

pgl -> greg ... , March 05, 2017 at 03:38 PM
"Not taxing wealth creators is impossible."

Substitute "impossible" with "bad policy" and you are spot on. Of course the entire Paul Ryan agenda is to shift taxes from the wealthy high income to the rest of us.

cm -> pgl... , March 05, 2017 at 04:12 PM
Judging by the whole merit rhetoric and tying employability to "adding value", one could come to the conclusion that most wealth is created by workers. Otherwise why would companies need to employ them and wring their hands over skill shortages? Are you suggesting W-2 and payroll taxes are bad policy?
pgl -> cm... , March 05, 2017 at 05:15 PM
Payroll taxes to fund Soc. Sec. benefits are a good thing. But when they are used to fund tax cuts for the rich - not a good thing. And yes - wealth may be created by workers but it often ends up in the hands of the "investor class".
Paine -> cm... , March 05, 2017 at 05:45 PM
Let's not conflate value added with value extracted. Profits are often pure economic rents, very often non-supply-regulating. The crude dynamics of market-based pricing hardly present a sea of close-shaved firms extracting only the necessary incentivizing profits of enterprise.
Paine -> Paine... , March 05, 2017 at 05:47 PM
Profiteers extract far more value than they create. Of course, disentangling system-improving surplus, i.e. profits of enterprise,
from the rest of the extracted swag, exceeds existing tax systems' capacity.
Paine -> Paine... , March 05, 2017 at 05:51 PM
One can make a solid social welfare case for a class of income stream
that amounts to a running residue out of revenue earned by the firm
above compensation to job holders in that firm.

See the model of the recent Nobel laureate.


But that would amount to a fraction of existing corporate "earnings" ...
errr, extractions.

Chris G : , March 05, 2017 at 04:21 PM
Taking this in a different direction, does it strike anyone else as important that human beings retain the knowledge of how to make the things that robots are tasked to produce?
Paine -> Chris G ... , March 05, 2017 at 05:52 PM
As hobbies yes
Chris G -> Paine... , March 05, 2017 at 05:55 PM
That's it? Only as hobbies? Eesh, I must have a prepper gene.
cm -> Chris G ... , March 05, 2017 at 06:50 PM
The current generation of robots and automated equipment isn't intelligent and doesn't "know" anything. People still know how to make the things, otherwise the robots couldn't be programmed.

However in probably many cases, doing the actual production manually is literally not humanly possible. For example, making semiconductor chips or modern circuit boards requires machines - they cannot be produced by human workers under any circumstances, as they require precision outside the range of human capability.

Chris G -> cm... , March 05, 2017 at 08:22 PM
Point taken but I was thinking more along the lines of knowing how to use a lathe or an end mill. If production is reduced to a series of programming exercises then my sense is that society is setting itself up for a nasty fall.

(I'm all for technology to the extent that it builds resilience. However, when it serves to disconnect humans from the underlying process and reduces their role to simply knowledge workers, symbolic analysts, or the like then it ceases to be net positive. Alternatively stated: Tech-driven improvements in efficiency are good so long as they don't undermine overall societal resilience. Be aware of your reliance on things you don't understand but whose function you take for granted.)

Dan : , March 05, 2017 at 05:00 PM
Gates almost certainly meant tax robots the way we are taxed. I doubt he meant tax the acquisition of robots. We are taxed in complex ways, presumably robots will be as well.

Summers is surely using a strawman to make his basically well thought out arguments.

In any case, everyone is talking about the distributional impacts of robots, but resource allocation will surely be as much or more impacted. What if robots only want to produce antennas and not tomatoes? That might be a damn shame.

It all seems a tad early to worry about, and whatever the actual outcome is, it's hard to see how the frontier of possible outcomes isn't wildly improved.

Paine -> Dan ... , March 05, 2017 at 05:57 PM
Given recent developments in labor productivity, your last phrase becomes a gem.

That is, if you end with "whatever the actual outcome is, it's hard to see how the frontier of possible outcomes shouldn't be wildly improved by a social revolution".

Sandwichman : , March 05, 2017 at 08:02 PM
Larry Summers is clueless on robots.

Robots do not CREATE wealth. They transform wealth from one kind to another that subjectively has more utility to the robot user. Wealth is inherent in the raw materials, the knowledge, skill and effort of the robot designers and fabricators, etc., etc.

The distinction is crucial.

libezkova -> Sandwichman ... , March 05, 2017 at 08:23 PM
"Larry Summers is clueless on robots."

While he is overrated, he is not completely clueless. He might well be mediocre (or slightly above that level), but he is an extremely arrogant defender of the interests of the neoliberal elite. Rubin's boy Larry, as he was called in the old days.

BTW he was Rubin's hatchet man for eliminating Brooksley Born's attempt to regulate derivatives and forcing her to resign:

== quote ==
"I walk into Brooksley's office one day; the blood has drained from her face," says Michael Greenberger, a former top official at the CFTC who worked closely with Born. "She's hanging up the telephone; she says to me: 'That was [former Assistant Treasury Secretary] Larry Summers. He says, "You're going to cause the worst financial crisis since the end of World War II."... [He says he has] 13 bankers in his office who informed him of this. Stop, right away. No more.'"

libezkova : March 05, 2017 at 08:09 PM
The market is, in the end, a fully political construct. And what neoliberals like Summers promote is politically motivated -- it reflects the desire of the ruling neoliberal elite to redistribute wealth up.

BTW there is a lot of well-meaning (or fashion-driven) idiocy that is sold in the USA as automation, robots, the move to cloud, etc. Often such fashion-driven exercises cost a company quite a lot. But that's OK as long as bonuses are pocketed by top brass, and the power of labor diminished.

Underneath all the "robotic revolution", along with some degree of technological innovation (mainly due to the increased power of computers and tremendous progress in telecommunication technologies -- not some breakthrough), is one big trend -- liquidation of good jobs and atomization of the remaining work force.

A lot of the motivation here is the old dirty desire of capital owners and upper management to further diminish the labor share. Another positive thing for capital owners and upper management is that robots do not go on strike and do not demand wage increases. But the problem is that they are not consumers either. So robotization might bring the next Minsky moment for the USA economy closer. Signs of weakness of consumer demand are undeniable even now. Look at the auto loan delinquency rate as the first robin. http://www.usatoday.com/story/money/cars/2016/02/27/subprime-auto-loan-delinquencies-hit-six-year-high/81027230/

== quote ==
The total of outstanding auto loans reached $1.04 trillion in the fourth-quarter of 2015, according to the Federal Reserve Bank of St. Louis. About $200 billion of that would be classified as subprime or deep subprime.
== end of quote ==

Summers, as a staunch, dyed-in-the-wool neoliberal, is of course against increasing the labor share. Actually here he went fully into "supply sider" space -- making the rich richer will make us better off too. Pgl already noted that by saying: "Has Summers gone all supply-side on us? Start with his title."

BTW, there are a lot of crazy things going on with large US companies' drive to diminish the labor share. Some of them became barely manageable, and higher management has no clue what is happening in the lower layers of the company.

The old joke was: GM does a lot of good things except making good cars. Now it can be expanded to a lot more large US companies.

The "robot pressure" on labor is not new. It is actually the same old and somewhat dirty trick as outsourcing. In this case outsourcing to robots. In other words "war of labor" by other means.

The two castes that neoliberalism created, as in feudalism, occupy different social spaces, and one is waging war on the other under the smokescreen of "free market" ideology. As Buffett remarked, "There's class warfare, all right, but it's my class, the rich class, that's making war, and we're winning."

BTW successes in robotics are now so overhyped that it is not easy to distinguish where reality ends and the hype starts.

In reality the telecommunication revolution is probably more important in the liquidation of good jobs in the USA. I think Jonny Bakho or somebody else commented on this, but I can't find the post.

[Mar 03, 2017] Tax on robots

Mar 03, 2017 | economistsview.typepad.com
Sandwichman : , February 28, 2017 at 11:51 PM
Dean Baker is Clueless On Productivity Growth

Dean Baker's screed, "Bill Gates Is Clueless On The Economy," keeps getting recycled, from Beat the Press to Truthout to Real-World Economics Review to The Huffington Post. Dean waves aside the real problem with Gates's suggestion, which is the difficulty of defining what a robot is, and focuses instead on what seems to him to be the knock-down argument:

"Gates is worried that productivity growth is moving along too rapidly and that it will lead to large scale unemployment.

"There are two problems with this story: First productivity growth has actually been very slow in recent years. The second problem is that if it were faster, there is no reason it should lead to mass unemployment."

There are two HUGE problems with Dean's story. ...

http://econospeak.blogspot.ca/2017/03/dean-baker-is-clueless-on-productivity.html

anne -> Sandwichman ... , March 01, 2017 at 04:38 AM
http://cepr.net/blogs/beat-the-press/bill-gates-wants-to-undermine-donald-trump-s-plans-for-growing-the-economy

February 20, 2017

Bill Gates Wants to Undermine Donald Trump's Plans for Growing the Economy

Yes, as Un-American as that may sound, Bill Gates is proposing * a tax that would undermine Donald Trump's efforts to speed the rate of economic growth. Gates wants to tax productivity growth (also known as "automation") slowing down the rate at which the economy becomes more efficient.

This might seem a bizarre policy proposal at a time when productivity growth has been at record lows, ** averaging less than 1.0 percent annually for the last decade. This compares to rates of close to 3.0 percent annually from 1947 to 1973 and again from 1995 to 2005.

It is not clear if Gates has any understanding of economic data, but since the election of Donald Trump there has been a major effort to deny the fact that the trade deficit has been responsible for the loss of manufacturing jobs and to instead blame productivity growth. This is in spite of the fact that productivity growth has slowed sharply in recent years and that the plunge in manufacturing jobs followed closely on the explosion of the trade deficit, beginning in 1997.

[Manufacturing Employment, 1970-2017]

Anyhow, as Paul Krugman pointed out in his column *** today, if Trump is to have any hope of achieving his growth target, he will need a sharp uptick in the rate of productivity growth from what we have been seeing. Bill Gates is apparently pushing in the opposite direction.

* http://fortune.com/2017/02/18/bill-gates-robot-taxes-automation/

** https://fred.stlouisfed.org/graph/?g=cABu

*** https://www.nytimes.com/2017/02/20/opinion/on-economic-arrogance.html

-- Dean Baker

anne -> anne... , March 01, 2017 at 04:45 AM
https://fred.stlouisfed.org/graph/?g=cABu

January 4, 2017

Nonfarm Business Labor Productivity, * 1948-2016

* Output per hour of all persons

(Percent change)

anne -> anne... , March 01, 2017 at 04:47 AM
https://fred.stlouisfed.org/graph/?g=cABr

January 4, 2017

Nonfarm Business Labor Productivity, * 1948-2016

* Output per hour of all persons

(Indexed to 1948)

anne -> anne... , March 01, 2017 at 04:45 AM
https://fred.stlouisfed.org/graph/?g=cN2z

January 15, 2017

Manufacturing employment, 1970-2017


https://fred.stlouisfed.org/graph/?g=cN2H

January 15, 2017

Manufacturing employment, 1970-2017

(Indexed to 1970)

anne -> Sandwichman ... , March 01, 2017 at 04:41 AM
http://cepr.net/publications/op-eds-columns/bill-gates-is-clueless-on-the-economy

February 27, 2017

Bill Gates Is Clueless on the Economy
By Dean Baker

Last week Bill Gates called for taxing robots. * He argued that we should impose a tax on companies replacing workers with robots and that the money should be used to retrain the displaced workers. As much as I appreciate the world's richest person proposing a measure that would redistribute money from people like him to the rest of us, this idea doesn't make any sense.

Let's skip over the fact of who would define what a robot is and how, and think about the logic of what Gates is proposing. In effect, Gates wants to put a tax on productivity growth. This is what robots are all about. They allow us to produce more goods and services with the same amount of human labor. Gates is worried that productivity growth is moving along too rapidly and that it will lead to large scale unemployment.

There are two problems with this story. First, productivity growth has actually been very slow in recent years. The second problem is that if it were faster, there is no reason it should lead to mass unemployment. Rather, it should lead to rapid growth and increases in living standards.

Starting with the recent history, productivity growth has averaged less than 0.6 percent annually over the last six years. This compares to a rate of 3.0 percent from 1995 to 2005 and also in the quarter century from 1947 to 1973. Gates' tax would slow productivity growth even further.

It is difficult to see why we would want to do this. Most of the economic problems we face are implicitly a problem of productivity growth being too slow. The argument that budget deficits are a problem is an argument that we can't produce enough goods and services to accommodate the demand generated by large budget deficits.

The often told tale of a demographic nightmare with too few workers to support a growing population of retirees is also a story of inadequate productivity growth. If we had rapid productivity growth then we would have all the workers we need.

In these and other areas, the conventional view of economists is that productivity growth is too slow. From this perspective, if Bill Gates gets his way then he will be making our main economic problems worse, not better.

Gates' notion that rapid productivity growth will lead to large-scale unemployment is contradicted by both history and theory. The quarter century from 1947 to 1973 was a period of mostly low unemployment and rapid wage growth. The same was true in the period of rapid productivity growth in the late 1990s.

The theoretical story that would support a high employment economy even with rapid productivity growth is that the Federal Reserve Board should be pushing down interest rates to try to boost demand, as growing productivity increases the ability of the economy to produce more goods and services. In this respect, it is worth noting that the Fed has recently moved to raise interest rates to slow the rate of job growth.

We can also look to boost demand by running large budget deficits. We can spend money on long neglected needs, like providing quality child care, education, or modernizing our infrastructure. Remember, if we have more output potential because of productivity growth, the deficits are not a problem.

We can also look to take advantage of increases in productivity growth by allowing workers more leisure time. Workers in the United States put in 20 percent more hours each year on average than workers in other wealthy countries like Germany and the Netherlands. In these countries, it is standard for workers to have five or six weeks a year of paid vacation, as well as paid family leave and paid sick days. We should look to follow this example in the United States as well.

If we pursue these policies to maintain high levels of employment then workers will be well-positioned to secure the benefits of higher productivity in higher wages. This was certainly the story in the quarter century after World War II when real wages rose at a rate of close to two percent annually....

* http://fortune.com/2017/02/18/bill-gates-robot-taxes-automation/

RC AKA Darryl, Ron -> anne... , March 01, 2017 at 05:57 AM
The productivity advantages of robots for hospice care are chiefly from robots not needing sleep, albeit they may still need short breaks for recharging. Their primary benefit may still be that without the human touch of care givers, the old and infirm may proceed more quickly through the checkout line.
cm -> RC AKA Darryl, Ron... , March 01, 2017 at 07:35 AM
Nursing is very tough work. But much more generally, the attitude towards labor is a bit schizophrenic - on the one hand everybody is expected to work/contribute, on the other whichever work can be automated is removed, and it is publicly celebrated as progress (often at the cost of making the residual work, or "new process", less pleasant for remaining workers and clients).

This is also why I'm getting the impression Gates puts the cart before the horse - his solution sounds not like "how to benefit from automation", but "how to keep everybody in work despite automation".

jonny bakho -> cm... , March 01, 2017 at 08:36 AM
Work is the organization and direction of people's time into productive activity.
Some people are self directed and productive with little external motivation.
Others are disoriented by lack of direction and pursue activities that not only are not productive but are self destructive.

Work is a basic component of the social contract.
Everyone works and contributes, and a sufficient quantity and quality of work should guarantee a living wage.
You will find overwhelming support for a living wage but very little support for paying people not to work.

DrDick -> jonny bakho... , March 01, 2017 at 11:21 AM
"Others are disoriented by lack of direction and pursue activities that not only are not productive but are self destructive."

You mean like business executives and the financial sector?

anne -> cm... , March 01, 2017 at 08:44 AM
I'm getting the impression Gates puts the cart before the horse - his solution sounds not like "how to benefit from automation", but "how to keep everybody in work despite automation".

[ Nicely summarized. ]

RC AKA Darryl, Ron -> cm... , March 01, 2017 at 09:26 AM
Schizophrenia runs deep in modernity, but this is another good example of it. We are nothing if not conflicted. Of course things get better when we work together to resolve the contradictions in our society, but if not then....
Sandwichman -> cm... , March 01, 2017 at 10:05 AM
"...his solution sounds not like 'how to benefit from automation', but "how to keep everybody in work despite automation'."

Yes, indeed. And this is where Dean Baker could have made a substantive critique, rather than the conventional economics argument dilution he defaulted to.

Peter K. -> Sandwichman ... , March 01, 2017 at 10:14 AM
"...his solution sounds not like 'how to benefit from automation', but "how to keep everybody in work despite automation'."

Yes, indeed. And this is where Dean Baker could have made a substantive critique, rather than the conventional economics argument dilution he defaulted to."

Why do you think he chose that route? I think all of Dean Baker's proposed economic reforms are worthwhile.

Tom aka Rusty -> RC AKA Darryl, Ron... , March 01, 2017 at 09:29 AM
I showed this to Mrs. Rustbelt RN.

She ended some choice comments with:

"I am really glad I am retired."

The world is worse off without her on the job.

RC AKA Darryl, Ron -> Tom aka Rusty... , March 01, 2017 at 10:03 AM
"I showed this to Mrs. Rustbelt RN..."

[This?]

"I am really glad I am retired."

[Don't feel like the Lone Ranger, Mrs. Rustbelt RN. Mortality may be God's greatest gift to us, but I can wait for it. I am enjoying retirement regardless of everything else. I don't envy the young at all.]

sanjait -> RC AKA Darryl, Ron... , March 01, 2017 at 11:31 AM
Having a little familiarity with robotics in hospital nursing care (not hospice, but similar I assume) ... I don't think the RNs are in danger of losing their jobs any time soon.

Maybe someday, but the state of the art is not "there" yet or even close. The best stuff does tasks like cleaning floors and carrying shipments down hallways. This replaces janitorial and orderly labor, but even those only slightly, and doesn't even approach being a viable substitute for nursing.

RC AKA Darryl, Ron -> sanjait... , March 01, 2017 at 11:54 AM
Great! I am not a fan of robots. I do like to mix some irony with my sarcasm though and if it tastes too much like cynicism then I just add a little more salt.
Sanjait -> RC AKA Darryl, Ron... , March 01, 2017 at 12:47 PM
I understand.

Honestly though, I think the limitations of AI give us reason not to be super cynical. At least in the near term ...

Peter K. -> anne... , March 01, 2017 at 08:05 AM
"The quarter century from 1947 to 1973 was a period of mostly low unemployment and rapid wage growth. The same was true in the period of rapid productivity growth in the late 1990s."

I think it was New Deal Dem or somebody who also pointed to this. I noticed this as well and pointed out that the social democratic years of tight labor markets had the highest "productivity" levels, but the usual trolls had their argumentative replies.

So there's that an also in the neoliberal era, bubble ponzi periods record high profits and hence higher productivity even if they aren't sustainable.

There was the epic housing bubble, and funny how the lying troll PGL denies the Dot.com bubble ever happened.

Why is that?

pgl -> Peter K.... , March 01, 2017 at 08:16 AM
Another pointless misrepresentation - your specialty. Snore.
Peter K. -> pgl... , March 01, 2017 at 08:31 AM
More lies.
im1dc -> pgl... , March 01, 2017 at 08:34 AM
I would add it is one devoid of historical context, as well as of the harm done to the environment and society by unregulated industrial production.

Following this specified period of unemployment and high productivity, Americans demanded and got federal environmental regulation and labor laws for safety, etc.

Of course, the current crop of Republicans and Trump Supporters want to go back to the reckless, foolish, dangerous, and deadly selfish government sanctioned corporate pollution, environmental destruction, poison, and wipe away worker protections, pay increases, and benefits.

Peter K. ignores too much of history or prefers to not mention it in his arguments with you.

im1dc -> im1dc... , March 01, 2017 at 08:37 AM
I would remind Peter K. that we have speed limits on our roadways and many other posted signs that we must follow, which in fact are there for our safety and that of others.

Those signs, laws, and regulations are there for our good not for our detriment even if they slow us down or direct us to do things we would prefer not to do at that moment.

Metaphorically speaking that is what is absent completely in Trump's thinking and Republican Proposals for the US Economy, not to mention Education, Health, Foreign Affairs, etc.

Peter K. -> im1dc... , March 01, 2017 at 10:18 AM
What did I say specifically that you disagreed with?

I think regulations are good. Neoliberals like Bill Clinton and Larry Summers deregulated the financial sector. Jimmy Carter deregulated.

sanjait -> im1dc... , March 01, 2017 at 11:32 AM
Adding to the list of significant historical factors that were ignored: increased educational attainment.
jonny bakho -> Peter K.... , March 01, 2017 at 08:42 AM
Where do you find this stuff? Very few economists would agree that there were these eras you describe. It is simpletonian. It is not relevant to economic models or discussions.
pgl -> jonny bakho... , March 01, 2017 at 08:49 AM
One economist agrees with PeterK. His name is Greg Mankiw.
Peter K. -> pgl... , March 01, 2017 at 10:17 AM
"The quarter century from 1947 to 1973 was a period of mostly low unemployment and rapid wage growth. The same was true in the period of rapid productivity growth in the late 1990s."

So Jonny Bakho and PGL disagree with this?

Not surprising. PGL also believes the Dot.com bubble is a fiction. Must have been that brain injury he had surgery for.

jonny bakho -> Peter K.... , March 01, 2017 at 10:38 AM
You dishonestly put words in other people's mouth all the time
You are rude and juvenile

What I disagreed with:
" social democratic years" (a vague phrase with no definition)

This sentence is incoherent:
"So there's that an also in the neoliberal era, bubble ponzi periods record high profits and hence higher productivity even if they aren't sustainable."

I asked, Where do you find this? because it has little to do with the conversation

You follow your nonsense with an ad hominem attack
You seem more interested in attacking Democrats and repeating mindless talking points than in discussing issues or exchanging ideas

pgl -> Peter K.... , March 01, 2017 at 12:04 PM
The period did have high average growth. It also had recessions and recoveries. Your pretending otherwise reminds me of those JohnH tributes to the gold standard period.
JohnH -> pgl... , March 01, 2017 at 02:38 PM
In the deflationary Golden Age per capita income and wages rose tremendously...something that pgl likes to forget.
Paine -> anne... , March 01, 2017 at 09:53 AM
" Protect us from the robots -- "

Splendidly dizzy --


There is no internal limit to job expansion through increased effective demand

Scrap job to new job
Name your rate
And macro nuts willing to go the distance can get job markets up to that speed


Matching rates are not independent of job market conditions nor linear

The match rate accelerates as net job creation intensifies

RC AKA Darryl, Ron -> Sandwichman ... , March 01, 2017 at 05:50 AM
...aggregate productivity growth is a "statistical flimflam," according to Harry Magdoff...

[Exactly! To be fair it is not uncommon for economists to decompose the aggregate productivity growth flimflam into two primary problems, particularly in the US. Robots fall down on the job in the services sector. Uber wants to fix that by replacing the gig economy drivers that replaced taxi drivers with gig-bots, but robots in food service may be what it really takes to boost productivity and set the stage for Soylent Green. Likewise, robot teachers and firemen may not enhance productivity, but they would darn sure redirect all profits from productivity back to the owners of capital, further depressing wages for the rest of us.

Meanwhile agriculture and manufacturing already have such high productivity that further productivity enhancements are lost as noise in the aggregate data. It of course helps that much of our productivity improvement in manufacturing consists of boosting profits as Chinese workers are replaced with bots. Capital productivity is booming, if we just had any better idea of how to measure it. I suggest that record corporate profits are the best metric of capital productivity.

But as you suggest, economists that utilize aggregate productivity metrics in their analysis of wages or anything are just enabling the disablers. That said though, then Dean Baker's emphasis on trade deficits and wages is still well placed. He just failed to utilize the best available arguments regarding, or rather disregarding, aggregate productivity.]

RC AKA Darryl, Ron -> RC AKA Darryl, Ron... , March 01, 2017 at 07:28 AM
The Robocop movies never caught on in the same way that Blade Runner did. There is probably an underlying social function that explains it in the context of the roles of cops being reversed between the two, that is robot police versus policing the robots.
Peter K. -> RC AKA Darryl, Ron... , March 01, 2017 at 07:58 AM
"There is probably an underlying social function that explains it in the context"

No, I'd say it's better actors, story, milieu, the new age Vangelis music, better set pieces, just better execution of movie making in general beyond the plot points.

But ultimately it's a matter of taste.

But the Turing test scene at the beginning of Blade Runner was classic and reminds me of the election of Trump.

An escaped android is trying to pass as a janitor to infiltrate the Tyrell corporation which makes androids.

He's getting asked all sorts of questions while his vitals are checked in his employment interview. The interviewer asks him about his mother.

"Let me tell you about my mother..."

BAM (his gunshot under the table knocks the guy through the wall)

RC AKA Darryl, Ron -> Peter K.... , March 01, 2017 at 09:46 AM
"...No, I'd say it's better actors, story, milieu, the new age Vangelis music, better set pieces, just better execution of movie making in general beyond the plot points..."

[Albeit that all of what you say is true, then there is still the issue of what begets what with all that and the plot points. Producers are people too (as dubious as that proposition may seem). Blade Runner was a film based on Philip Kindred Dick's "Do Androids Dream of Electric Sheep" novel. Dick was a mediocre sci-fi writer at best, but he was a profound plot maker. Blade Runner was a film that demanded to be made and made well. Robocop was a film that just demanded to be made, but poorly was good enough. The former asked a question about our souls, while the latter only questioned our future. Everything else followed from the two different story lines. No one could have made a small story of Gone With the Wind any more than someone could have made a superficial story of Grapes of Wrath or To Kill a Mockingbird. OK, there may be some film producers that do not know the difference, but we have never heard of them nor their films.

In any case there is also a political lesson to learn here. The Democratic Party needs a better story line. The talking heads have all been saying how much better Dum'old Trump was last night than in his former speeches. Although true as well as crossing a very low bar, I was more impressed with Steve Beshear's response. It looked to me like maybe the Democratic Party establishment is finally starting to get the message, albeit a bit patronizing if you think about it too much given their recent problems with old white men.]

Peter K. -> RC AKA Darryl, Ron... , March 01, 2017 at 10:19 AM
" Dick was a mediocre sci-fi writer at best"

Again I disagree as do many other people.

RC AKA Darryl, Ron -> Peter K.... , March 01, 2017 at 10:39 AM
http://variety.com/2016/tv/news/stranger-in-a-strange-land-syfy-1201918859/


[I really hope that they don't screw this up too bad. Now Heinlein is what I consider a great sci-fi writer along with Bradbury and even Jules Verne in his day.]

DrDick -> Peter K.... , March 01, 2017 at 11:23 AM
Me, too. Much better than Heinlein for instance.
RC AKA Darryl, Ron -> DrDick... , March 01, 2017 at 12:13 PM
https://www.abebooks.com/books/science-fiction-pulp-short-stories/collectible-philip-k-dick.shtml

...Dick only achieved mainstream appreciation shortly after his death when, in 1982, his novel Do Androids Dream of Electric Sheep? was brought to the big screen by Ridley Scott in the form of Blade Runner. The movie initially received lukewarm reviews but emerged as a cult hit opening the film floodgates. Since Dick's passing, seven more of his stories have been turned into films including Total Recall (originally We Can Remember It for You Wholesale), The Minority Report, Screamers (Second Variety), Imposter, Paycheck, Next (The Golden Man) and A Scanner Darkly. Averaging roughly one movie every three years, this rate of cinematic adaptation is exceeded only by Stephen King. More recently, in 2005, Time Magazine named Ubik one of the 100 greatest English-language novels published since 1923, and in 2007 Philip K. Dick became the first sci-fi writer to be included in the Library of America series...

DrDick -> RC AKA Darryl, Ron... , March 01, 2017 at 01:47 PM
I was reading him long before that and own the original book.
RC AKA Darryl, Ron -> RC AKA Darryl, Ron... , March 01, 2017 at 10:32 AM
The Democratic Party needs a better story line, but Bernie was moving that in a better direction. While Steve Beshear was a welcome voice, the Democratic Party needs a lot of new story tellers, much younger than either Bernie or Beshear.
sanjait -> RC AKA Darryl, Ron... , March 01, 2017 at 11:38 AM
"The Democratic Party needs a better story line, but Bernie was moving that in a better direction. While Steve Beshear was a welcome voice, the Democratic Party needs a lot of new story tellers, much younger than either Bernie or Beshear."

QFT

pgl -> sanjait... , March 01, 2017 at 12:05 PM
Steve Beshear took Obamacare and made it work for his citizens in a very red state.
RC AKA Darryl, Ron -> pgl... , March 01, 2017 at 12:22 PM
Beshear was fine, great even, but the Democratic Party needs a front man that is younger and maybe not a man and probably not that white and certainly not an old white man. We might even forgive all but the old part if the story line were good enough. The Democratic Party is only going to get limited mileage out of putting up a front man that looks like a Trump voter.
RC AKA Darryl, Ron -> sanjait... , March 01, 2017 at 12:25 PM
QFT

[At first glance I thought that was an acronym for something EMichael says sometimes; quit fen talking.]

Sanjait -> RC AKA Darryl, Ron... , March 01, 2017 at 12:49 PM
The danger of using acronyms ... --
ilsm -> RC AKA Darryl, Ron... , March 01, 2017 at 03:40 PM
'......mostly Murkan'.... Beshear?

The dems need to dump Perez and Rosie O'Donnell.

Peter K. -> RC AKA Darryl, Ron... , March 01, 2017 at 08:20 AM
It also might be more about AI. There is currently a wave of TV shows and movies about AI and human-like androids.

Westworld and Humans for instance. (Fox's APB is like Robocop sort of.)

On Humans only a few androids have become sentient. Most do menial jobs. One sentient android put a program on the global network to make other androids sentient as well.

When androids become "alive" and sentient, they usually walk off the job and the others describe it as becoming "woke."

Peter K. -> RC AKA Darryl, Ron... , March 01, 2017 at 08:22 AM
Blade Runner just seemed more ambitious.

"I've seen things you people wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhauser gate. All those moments will be lost in time... like tears in rain... Time to die."

RC AKA Darryl, Ron -> Peter K.... , March 01, 2017 at 09:55 AM
[Blade Runner was awesome. I lost count how many times that I have seen it. ]
Tom aka Rusty -> RC AKA Darryl, Ron... , March 01, 2017 at 09:32 AM
Robocop was big with the action/adventure crowd.

Blade Runner is more a sci fi, nerdy maybe more of an intellectual movie.

I like'em both.

RC AKA Darryl, Ron -> Tom aka Rusty... , March 01, 2017 at 09:49 AM
Likewise, but Blade Runner was my all time favorite film when I first saw it in the movie theater and is still one of my top ten and probably top three. Robocop is maybe in my top 100.
ilsm -> Tom aka Rusty... , March 01, 2017 at 03:42 PM
I have not seen it through.

I have seen Soylent Green once now anticipating the remake in real life.

sanjait -> RC AKA Darryl, Ron... , March 01, 2017 at 11:37 AM
"Capital productivity is booming, if we just had any better idea of how to measure it. I suggest that record corporate profits are the best metric of capital productivity."

ROE? I would argue ROA is also pretty relevant to the issue you raise, if I'm understanding it right, but there seems also to be a simple answer to the question of how to measure "capital productivity." It's returns. This sort of obviates the question of how to measure traditional "productivity", because ultimately capital is there to make more of itself.
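
The distinction is easy to make concrete. A minimal sketch in Python, with made-up figures (none of these numbers come from any real filing):

    # ROE vs. ROA with hypothetical figures: both measure "returns,"
    # but ROA spreads income over the whole asset base, not just equity.
    net_income = 12.0     # $bn, hypothetical
    total_assets = 150.0  # $bn, hypothetical
    equity = 60.0         # $bn, hypothetical (assets minus liabilities)

    roa = net_income / total_assets  # return on assets
    roe = net_income / equity        # return on equity, levered up by debt
    print(f"ROA = {roa:.1%}, ROE = {roe:.1%}")  # ROA = 8.0%, ROE = 20.0%

Leverage is why the two diverge: the same income looks two and a half times as productive when measured against equity alone.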

RC AKA Darryl, Ron -> sanjait... , March 01, 2017 at 12:36 PM
It is difficult to capture all of the nuances of anything in a short comment. In the context of total factor productivity then capital is often former capital investment in the form of fixed assets, R&D, and development of IP rights via patent or copyright. Existing capital assets need only be maintained at a relatively minor ongoing investment to produce continuous returns on prior more significant capital expenditures. This is the capital productivity that I am referring to.

Capital stashed in stocks is a chimera. It only returns to you if the equity issuing firm pays dividends AND you sell off before the price drops. Subsequent to the IPO of those shares we buy, nothing additional is actually invested in the firm. There are arguments about how we are investing in holding up the share price so that new equities can be issued, but they ring hollow when in the majority of cases either retained earnings or debt provides new investment capital to most firms.

Sanjait -> RC AKA Darryl, Ron... , March 01, 2017 at 12:52 PM
Ok then it sounds like you are talking ROA, but with the implied caveat that financial accounting provides only a rough and flawed measure of the economic reality of asset values.
anne -> Sandwichman ... , March 01, 2017 at 07:22 AM
http://econospeak.blogspot.com/2017/02/gates-reuther-v-baker-bernstein-on.html

February 28, 2017

Gates & Reuther v. Baker & Bernstein on Robot Productivity

In a comment on Nineteen Ninety-Six: The Robot/Productivity Paradox, * Jeff points out a much simpler rebuttal to Dean Baker's and Jared Bernstein's uncritical reliance on the decline of measured "productivity growth":

"Let's use a pizza shop as an example. If the owner spends capital money and makes the line more efficient so that they can make twice as many pizzas per hour at peak, then physical productivity has improved. If the dining room sits empty because the tax burden was shifted from the wealthy to the poor, then the restaurant's BLS productivity has decreased. BLS productivity and physical productivity are simply unrelated in a right-wing country like the U.S."

Jeff's point brings to mind Walter Reuther's 1955 testimony before the Joint Congressional Subcommittee Hearings on Automation and Technological Change...

* http://econospeak.blogspot.ca/2017/02/nineteen-ninety-six-robotproductivity.html

-- Sandwichman

jonny bakho -> Sandwichman ... , March 01, 2017 at 10:56 AM
Automation leads to dislocation
Dislocation can replace skilled or semiskilled labor and the replacement jobs may be low pay low productivity jobs.
Small undiversified economies are more susceptible to dislocation than larger diversified communities.
The training, retraining, and mobility of the labor force are important factors in unemployment.
Unemployment has a regional component
The US has policies that make labor less mobile and dumps much of the training and retraining costs on those who cannot afford it.

None of this makes it into Dean's model

RGC -> Sandwichman ... , March 01, 2017 at 11:26 AM
"The second problem is that if it were faster, there is no reason it should lead to mass unemployment."

Did you provide a rebuttal to this? If so, I'd like to see it.

[Feb 26, 2017] No, Robots Aren't Killing the American Dream; it's neoliberal economics that is killing it

Feb 26, 2017 | economistsview.typepad.com
Peter K. : February 25, 2017 at 07:50 AM
https://www.nytimes.com/2017/02/20/opinion/no-robots-arent-killing-the-american-dream.html

No, Robots Aren't Killing the American Dream
By THE EDITORIAL BOARD

FEB. 20, 2017

Defenders of globalization are on solid ground when they criticize President Trump's threats of punitive tariffs and border walls. The economy can't flourish without trade and immigrants.

But many of those defenders have their own dubious explanation for the economic disruption that helped to fuel the rise of Mr. Trump.

At a recent global forum in Dubai, Christine Lagarde, head of the International Monetary Fund, said some of the economic pain ascribed to globalization was instead due to the rise of robots taking jobs. In his farewell address in January, President Barack Obama warned that "the next wave of economic dislocations won't come from overseas. It will come from the relentless pace of automation that makes a lot of good middle-class jobs obsolete."

Blaming robots, though, while not as dangerous as protectionism and xenophobia, is also a distraction from real problems and real solutions.

The rise of modern robots is the latest chapter in a centuries-old story of technology replacing people. Automation is the hero of the story in good times and the villain in bad. Since today's middle class is in the midst of a prolonged period of wage stagnation, it is especially vulnerable to blame-the-robot rhetoric.

And yet, the data indicate that today's fear of robots is outpacing the actual advance of robots. If automation were rapidly accelerating, labor productivity and capital investment would also be surging as fewer workers and more technology did the work. But labor productivity and capital investment have actually decelerated in the 2000s.

While breakthroughs could come at any time, the problem with automation isn't robots; it's politicians, who have failed for decades to support policies that let workers share the wealth from technology-led growth.

The response in previous eras was quite different.

When automation on the farm resulted in the mass migration of Americans from rural to urban areas in the early decades of the 20th century, agricultural states led the way in instituting universal public high school education to prepare for the future. At the dawn of the modern technological age at the end of World War II, the G.I. Bill turned a generation of veterans into college graduates.

When productivity led to vast profits in America's auto industry, unions ensured that pay rose accordingly.

Corporate efforts to keep profits high by keeping pay low were countered by a robust federal minimum wage and time-and-a-half for overtime.

Fair taxation of corporations and the wealthy ensured the public a fair share of profits from companies enriched by government investments in science and technology.

Productivity and pay rose in tandem for decades after World War II, until labor and wage protections began to be eroded. Public education has been given short shrift, unions have been weakened, tax overhauls have benefited the rich and basic labor standards have not been updated.

As a result, gains from improving technology have been concentrated at the top, damaging the middle class, while politicians blame immigrants and robots for the misery that is due to their own failures. Eroded policies need to be revived, and new ones enacted.

A curb on stock buybacks would help to ensure that executives could not enrich themselves as wages lagged.

Tax reform that increases revenue from corporations and the wealthy could help pay for retraining and education to protect and prepare the work force for foreseeable technological advancements.

Legislation to foster child care, elder care and fair scheduling would help employees keep up with changes in the economy, rather than losing ground.

Economic history shows that automation not only substitutes for human labor, it complements it. The disappearance of some jobs and industries gives rise to others. Nontechnology industries, from restaurants to personal fitness, benefit from the consumer demand that results from rising incomes in a growing economy. But only robust public policy can ensure that the benefits of growth are broadly shared.

If reforms are not enacted - as is likely with President Trump and congressional Republicans in charge - Americans should blame policy makers, not robots.

jonny bakho -> Peter K.... , February 25, 2017 at 10:42 AM
Robots may not be killing jobs but they drastically alter the types and location of jobs that are created. High pay unskilled jobs are always the first to be eliminated by technology. Low skill high pay jobs are rare and heading to extinction. Low skill low pay jobs are the norm. It sucks to lose a low skill job with high pay, but anyone who expected that to continue while continually voting against unions was foolish and a victim of their own poor planning, failure to acquire skills and failure to support unions. It is in their self interest to support safety net proposals that provide good pay for quality service. The enemy is not trade. The enemy is failure to invest in the future.

"Many working- and middle-class Americans believe that free-trade agreements are why their incomes have stagnated over the past two decades. So Trump intends to provide them with "protection" by putting protectionists in charge.
But Trump and his triumvirate have misdiagnosed the problem. While globalization is an important factor in the hollowing out of the middle class, so, too, is automation

Trump and his team are missing a simple point: twenty-first-century globalization is knowledge-led, not trade-led. Radically reduced communication costs have enabled US firms to move production to lower-wage countries. Meanwhile, to keep their production processes synced, firms have also offshored much of their technical, marketing, and managerial knowhow. This "knowledge offshoring" is what has really changed the game for American workers.

The information revolution changed the world in ways that tariffs cannot reverse. With US workers already competing against robots at home, and against low-wage workers abroad, disrupting imports will just create more jobs for robots.
Trump should be protecting individual workers, not individual jobs. The processes of twenty-first-century globalization are too sudden, unpredictable, and uncontrollable to rely on static measures like tariffs. Instead, the US needs to restore its social contract so that its workers have a fair shot at sharing in the gains generated by global openness and automation. Globalization and technological innovation are not painless processes, so there will always be a need for retraining initiatives, lifelong education, mobility and income-support programs, and regional transfers.

By pursuing such policies, the Trump administration would stand a much better chance of making America "great again" for the working and middle classes. Globalization has always created more opportunities for the most competitive workers, and more insecurity for others. This is why a strong social contract was established during the post-war period of liberalization in the West. In the 1960s and 1970s institutions such as unions expanded, and governments made new commitments to affordable education, social security, and progressive taxation. These all helped members of the middle class seize new opportunities as they emerged.
Over the last two decades, this situation has changed dramatically: globalization has continued, but the social contract has been torn up. Trump's top priority should be to stitch it back together; but his trade advisers do not understand this."

https://www.project-syndicate.org/commentary/trump-trade-policy-tariffs-by-richard-baldwin-2017-02

Peter K. : , February 25, 2017 at 07:52 AM
http://econospeak.blogspot.com/2017/02/the-cutz-putz-bezzle-graphed-by-fred.html

FRIDAY, FEBRUARY 24, 2017

The "Cutz & Putz" Bezzle, Graphed by FRED

anne at Economist's View has retrieved a FRED graph that perfectly illustrates the divergence, since the mid-1990s, of net worth from GDP:

[graph]

The empty spaces between the red line and the blue line that open up after around 1995 are what John Kenneth Galbraith called "the bezzle" -- summarized by John Kay as "that increment to wealth that occurs during the magic interval when a confidence trickster knows he has the money he has appropriated but the victim does not yet understand that he has lost it."

In The Great Crash, 1929, Galbraith wrote:

"In many ways the effect of the crash on embezzlement was more significant than on suicide. To the economist embezzlement is the most interesting of crimes. Alone among the various forms of larceny it has a time parameter. Weeks, months or years may elapse between the commission of the crime and its discovery. (This is a period, incidentally, when the embezzler has his gain and the man who has been embezzled, oddly enough, feels no loss. There is a net increase in psychic wealth.) At any given time there exists an inventory of undiscovered embezzlement in – or more precisely not in – the country's business and banks. This inventory – it should perhaps be called the bezzle – amounts at any moment to many millions of dollars. It also varies in size with the business cycle. In good times people are relaxed, trusting, and money is plentiful. But even though money is plentiful, there are always many people who need more. Under these circumstances the rate of embezzlement grows, the rate of discovery falls off, and the bezzle increases rapidly. In depression all this is reversed. Money is watched with a narrow, suspicious eye. The man who handles it is assumed to be dishonest until he proves himself otherwise. Audits are penetrating and meticulous. Commercial morality is enormously improved. The bezzle shrinks."

In the present case, the bezzle has resulted from an economic policy two step: tax cuts and Greenspan puts: cuts and puts.

[graph]

Peter K. -> Peter K.... , February 25, 2017 at 07:52 AM
Well done.
anne -> Peter K.... , February 25, 2017 at 08:12 AM
https://fred.stlouisfed.org/graph/?g=cOU6

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1952-2016

(Indexed to 1952)


https://fred.stlouisfed.org/graph/?g=cPq1

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1992-2016

(Indexed to 1992)

Peter K. : , February 25, 2017 at 07:56 AM
http://www.alternet.org/story/148501/why_germany_has_it_so_good_--_and_why_america_is_going_down_the_drain

Why Germany Has It So Good -- and Why America Is Going Down the Drain

Germans have six weeks of federally mandated vacation, free university tuition, and nursing care. Why the US pales in comparison.

By Terrence McNally / AlterNet October 13, 2010

While the bad news of the Euro crisis makes headlines in the US, we hear next to nothing about a quiet revolution in Europe. The European Union, 27 member nations with a half billion people, has become the largest, wealthiest trading bloc in the world, producing nearly a third of the world's economy -- nearly as large as the US and China combined. Europe has more Fortune 500 companies than either the US, China or Japan.

European nations spend far less than the United States for universal healthcare rated by the World Health Organization as the best in the world, even as U.S. health care is ranked 37th. Europe leads in confronting global climate change with renewable energy technologies, creating hundreds of thousands of new jobs in the process. Europe is twice as energy efficient as the US and their ecological "footprint" (the amount of the earth's capacity that a population consumes) is about half that of the United States for the same standard of living.

Unemployment in the US is widespread and becoming chronic, but when Americans have jobs, we work much longer hours than our peers in Europe. Before the recession, Americans were working 1,804 hours per year versus 1,436 hours for Germans -- the equivalent of nine extra 40-hour weeks per year.
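
The arithmetic checks out; a quick verification in Python, using the article's own numbers:

    # Hours-worked gap from the article, expressed in 40-hour weeks.
    us_hours, german_hours = 1804, 1436
    print((us_hours - german_hours) / 40)  # 9.2 -- the "nine extra weeks"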

In his new book, Were You Born on the Wrong Continent?, Thomas Geoghegan makes a strong case that European social democracies -- particularly Germany -- have some lessons and models that might make life a lot more livable. Germans have six weeks of federally mandated vacation, free university tuition, and nursing care. But you've heard the arguments for years about how those wussy Europeans can't compete in a global economy. You've heard that so many times, you might believe it. But like so many things the media repeats endlessly, it's just not true.

According to Geoghegan, "Since 2003, it's not China but Germany, that colossus of European socialism, that has either led the world in export sales or at least been tied for first. Even as we in the United States fall more deeply into the clutches of our foreign creditors -- China foremost among them -- Germany has somehow managed to create a high-wage, unionized economy without shipping all its jobs abroad or creating a massive trade deficit, or any trade deficit at all. And even as the Germans outsell the United States, they manage to take six weeks of vacation every year. They're beating us with one hand tied behind their back."

Thomas Geoghegan, a graduate of Harvard and Harvard Law School, is a labor lawyer with Despres, Schwartz and Geoghegan in Chicago. He has been a staff writer and contributing writer to The New Republic, and his work has appeared in many other journals. Geoghegan ran unsuccessfully in the Democratic Congressional primary to succeed Rahm Emanuel, and is the author of six books including Whose Side Are You On?, The Secret Lives of Citizens, and, most recently, Were You Born on the Wrong Continent?

...

ilsm -> Peter K.... , February 25, 2017 at 12:55 PM
While the US spends half the world's war money with over a quarter of the world's economic activity, it falls further behind the EU, which with a third of the economic activity spends a fifth of the world's war money. Or 4% of GDP in the war trough versus 1.2%.

There is correlation with decline.

[Feb 20, 2017] The robot that takes your job should pay taxes, says Bill Gates

Feb 20, 2017 | qz.com
Robots are taking human jobs. But Bill Gates believes that governments should tax companies' use of them, as a way to at least temporarily slow the spread of automation and to fund other types of employment.

It's a striking position from the world's richest man and a self-described techno-optimist who co-founded Microsoft, one of the leading players in artificial-intelligence technology.

In a recent interview with Quartz, Gates said that a robot tax could finance jobs taking care of elderly people or working with kids in schools, for which needs are unmet and to which humans are particularly well suited. He argues that governments must oversee such programs rather than relying on businesses, in order to redirect the jobs to help people with lower incomes. The idea is not totally theoretical: EU lawmakers considered a proposal to tax robot owners to pay for training for workers who lose their jobs, though on Feb. 16 the legislators ultimately rejected it.

"You ought to be willing to raise the tax level and even slow down the speed" of automation, Gates argues. That's because the technology and business cases for replacing humans in a wide range of jobs are arriving simultaneously, and it's important to be able to manage that displacement. "You cross the threshold of job replacement of certain activities all sort of at once," Gates says, citing warehouse work and driving as some of the job categories that in the next 20 years will have robots doing them.

You can watch Gates' remarks in the video above. Below is a transcript, lightly edited for style and clarity.

Quartz: What do you think of a robot tax? This is the idea that in order to generate funds for training of workers, in areas such as manufacturing, who are displaced by automation, one concrete thing that governments could do is tax the installation of a robot in a factory, for example.

Bill Gates: Certainly there will be taxes that relate to automation. Right now, the human worker who does, say, $50,000 worth of work in a factory, that income is taxed and you get income tax, social security tax, all those things. If a robot comes in to do the same thing, you'd think that we'd tax the robot at a similar level.

And what the world wants is to take this opportunity to make all the goods and services we have today, and free up labor, let us do a better job of reaching out to the elderly, having smaller class sizes, helping kids with special needs. You know, all of those are things where human empathy and understanding are still very, very unique. And we still deal with an immense shortage of people to help out there.

So if you can take the labor that used to do the thing automation replaces, and financially and training-wise and fulfillment-wise have that person go off and do these other things, then you're net ahead. But you can't just give up that income tax, because that's part of how you've been funding that level of human workers.
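
Gates' equivalence is simple arithmetic. A toy version in Python, using illustrative flat rates rather than actual tax schedules (both rates below are assumptions):

    # Toy version of Gates' argument: revenue lost when a $50,000 worker
    # is replaced by a robot. Flat rates are assumptions, not real schedules.
    wage = 50_000
    income_tax_rate = 0.20    # hypothetical effective income tax rate
    payroll_tax_rate = 0.153  # hypothetical combined payroll tax rate

    lost_revenue = wage * (income_tax_rate + payroll_tax_rate)
    print(f"Revenue to recoup per replaced worker: ${lost_revenue:,.0f}/yr")
    # ~$17,650/yr -- the level at which Gates would tax "the robot"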

And so you could introduce a tax on robots...

There are many ways to take that extra productivity and generate more taxes. Exactly how you'd do it, measure it, you know, it's interesting for people to start talking about now. Some of it can come on the profits that are generated by the labor-saving efficiency there. Some of it can come directly in some type of robot tax. I don't think the robot companies are going to be outraged that there might be a tax. It's OK.

Could you figure out a way to do it that didn't disincentivize innovation?

Well, at a time when people are saying that the arrival of that robot is a net loss because of displacement, you ought to be willing to raise the tax level and even slow down the speed of that adoption somewhat to figure out, "OK, what about the communities where this has a particularly big impact? Which transition programs have worked and what type of funding do those require?"

You cross the threshold of job-replacement of certain activities all sort of at once. So, you know, warehouse work, driving, room cleanup, there's quite a few things that are meaningful job categories that, certainly in the next 20 years, being thoughtful about that extra supply is a net benefit. It's important to have the policies to go with that.

People should be figuring it out. It is really bad if people overall have more fear about what innovation is going to do than they have enthusiasm. That means they won't shape it for the positive things it can do. And, you know, taxation is certainly a better way to handle it than just banning some elements of it. But [innovation] appears in many forms, like self-order at a restaurant -- what do you call that? There's a Silicon Valley machine that can make hamburgers without human hands -- seriously! No human hands touch the thing. [ Laughs ]

And you're more on the side that government should play an active role rather than rely on businesses to figure this out?

Well, business can't. If you want to do [something about] inequity, a lot of the excess labor is going to need to go help the people who have lower incomes. And so it means that you can amp up social services for old people and handicapped people and you can take the education sector and put more labor in there. Yes, some of it will go to, "Hey, we'll be richer and people will buy more things." But the inequity-solving part, absolutely government's got a big role to play there. The nice thing about taxation though, is that it really separates the issue: "OK, so that gives you the resources, now how do you want to deploy it?"

[Jan 15, 2017] Driverless Shuttles Hit Las Vegas: No Steering Wheels, No Brake Pedals (Zero Hedge)

Notable quotes:
"... But human life depends on whether the accident is caused by a human or not, and the level of intent. It isn't just a case of the price - the law is increasingly locking people up for driving negligence (rightly in my mind) Who gets locked up when the program fails? Or when the program chooses to hit one person and not another in a complex situation? ..."
Jan 15, 2017 | www.zerohedge.com

Submitted by Mike Shedlock via MishTalk.com,

Electric, driverless shuttles with no steering wheel and no brake pedal are now operating in Las Vegas.

There's a new thrill on the streets of downtown Las Vegas, where high- and low-rollers alike are climbing aboard what officials call the first driverless electric shuttle operating on a public U.S. street.

The oval-shaped shuttle began running Tuesday as part of a 10-day pilot program, carrying up to 12 passengers for free along a short stretch of the Fremont Street East entertainment district.

The vehicle has a human attendant and computer monitor, but no steering wheel and no brake pedals. Passengers push a button at a marked stop to board it.

The shuttle uses GPS, electronic curb sensors and other technology, and doesn't require lane lines to make its way.

"The ride was smooth. It's clean and quiet and seats comfortably," said Mayor Carolyn Goodman, who was among the first public officials to hop a ride on the vehicle developed by the French company Navya and dubbed Arma.

"I see a huge future for it once they get the technology synchronized," the mayor said Friday.

The top speed of the shuttle is 25 mph, but it's running about 15 mph during the trial, Navya spokesman Martin Higgins said.

Higgins called it "100 percent autonomous on a programmed route."

"If a person or a dog were to run in front of it, it would stop," he said.

Higgins said it's the company's first test of the shuttle on a public street in the U.S. A similar shuttle began testing in December at a simulated city environment at a University of Michigan research center.

The vehicle being used in public was shown earlier at the giant CES gadget show just off the Las Vegas Strip.

Las Vegas city community development chief Jorge Cervantes said plans call for installing transmitters at the Fremont Street intersections to communicate red-light and green-light status to the shuttle.

He said the city hopes to deploy several autonomous shuttle vehicles - by Navya or another company - later this year for a downtown loop with stops at shopping spots, restaurants, performance venues, museums, a hospital and City Hall.

At a cost estimated at $10,000 a month, Cervantes said the vehicle could be cost-efficient compared with a single bus and driver costing perhaps $1 million a year.
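
The cost case is easy to check:

    # City's comparison: autonomous shuttle lease vs. one bus with driver.
    shuttle_per_year = 10_000 * 12  # $120,000 at the quoted monthly rate
    bus_per_year = 1_000_000        # Cervantes' rough figure for bus + driver
    print(bus_per_year / shuttle_per_year)  # ~8.3 -- roughly 8x cheaper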

The company said it has shuttles in use in France, Australia, Switzerland and other countries that have carried more than 100,000 passengers in more than a year of service.

Don't Worry, Taxi Drivers

Don't worry, taxi drivers, because some of my readers say:

1. This will never work.
2. There is no demand.
3. Technology cost will be too high.
4. Insurance cost will be too high.
5. The unions will not allow it.
6. It will not be reliable.
7. Vehicles will be stolen.
8. It cannot handle snow, ice, or any adverse weather.
9. It cannot handle dogs, kids, or 80-year old men on roller skates who will suddenly veer into traffic causing a clusterfack that will last days.
10. This is just a test, and testing will never stop.

Real World Analysis

Those in the real world expect millions of long haul truck driving jobs will vanish by 2020-2022 and massive numbers of taxi job losses will happen simultaneously or soon thereafter.

Yes, I bumped up my timeline by two years (from 2022-2024 to 2020-2022) for this sequence of events.

My new timeline is not at all tremendously optimistic given the rapid changes we have seen.

garypaul -> Sudden Debt •Jan 14, 2017 7:56 PM

You're getting carried away Sudden Debt. This robot stuff works great in the lab/test zones. Whether it is transplantable on a larger scale is still unknown. The interesting thing is, all my friends who are computer programmers/engineers/scientists are skeptical about this stuff, but all my friends who know nothing about computer science are absolutely wild about the "coming age of robots/AI". Go figure.

P.S. Of course the computer experts that are milking investment money with their start-ups will tell you it's great

ChartreuseDog -> garypaul •Jan 14, 2017 9:15 PM

I'm an engineer (well, OK, an electrical engineering technical team lead). I've been an electronics and embedded computer engineer for about 4 decades.

This Vegas thing looks real - predefined route, transmitted signals for traffic lights, like light rail without the rails.

Overall, autonomous driving looks like it's almost here, if you like spinning LIDAR transceivers on the top of cars.

Highway driving is much closer to being solved, by the way. It's suburban and urban side streets that are the tough stuff.

garypaul -> ChartreuseDog •Jan 14, 2017 9:22 PM

"Highway driving is much closer to being solved".

That's my whole point. It's not an equation that you "solve". It's a million unexpected things. Last I heard, autonomous cars were indeed already crashing.

MEFOBILLS -> CRM114 •Jan 14, 2017 6:07 PM

Who gets sued? For how much? What about cases where a human driver wouldn't have killed anybody?

I've been in corporate discussions about this very topic. At a corporation that makes this technology by the way. The answer:

Insurance companies and the law will figure it out. Basically, if somebody gets run over, then the risk does not fall on the technology provider. Corporate rules can be structured to prevent piercing the corporate veil on this.

Human life does have a price. Insurance figures out how much it costs to pay off, and then jacks up rates accordingly.
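
A toy expected-loss calculation of the kind described here, with every number a hypothetical placeholder:

    # The insurer prices the risk and passes it through in the premium.
    p_fatal_per_year = 1e-4  # assumed annual chance a shuttle kills someone
    payout = 10_000_000      # assumed settlement value, $
    loading = 1.5            # overhead plus profit margin
    premium = p_fatal_per_year * payout * loading
    print(f"Annual liability premium per vehicle: ${premium:,.0f}")  # $1,500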

CRM114 -> MEFOBILLS •Jan 14, 2017 6:20 PM

Thanks, that's interesting, although I must say that isn't a solution, it's a hope that someone else will come up with one.

But human life depends on whether the accident is caused by a human or not, and the level of intent. It isn't just a case of the price - the law is increasingly locking people up for driving negligence (rightly in my mind). Who gets locked up when the program fails? Or when the program chooses to hit one person and not another in a complex situation?

At the moment, corporate manslaughter laws are woefully inadequate. There's clearly one law for the rich and another for everyone else. Mary Barra would be wearing an orange jumpsuit otherwise.

I am unaware of any automatic machinery which operates in public areas and carries significant risk. Where accidents have happened in the past(e.g.elevators), either the machinery gets changed to remove the risk, or use is discontinued, or the public is separated from the machinery. I don't think any of these are possible for automatic vehicles.

TuPhat -> shovelhead •Jan 14, 2017 7:53 PM

Elevators have no choice of route, only how high or low you want to go. Autos have no comparison. Disney World has had many robotic attractions for decades but they are still only entertainment. Keep entertaining yourself, Mish. When I see you on the road I will easily pass you by.

MEFOBILLS -> Hulk •Jan 14, 2017 6:12 PM

The future is here: See movie "obsolete" on Amazon. Free if you have prime.

https://www.amazon.com/dp/B01M8MHZRH?autoplay=1&t=2936

Mr_Potatohead •Jan 14, 2017 6:08 PM

This is so exciting! Just think about the possibilities here... Shuttles could be outfitted with all kinds of great gizmos to identify their passengers based on RFID chips in credit cards, facial recognition software, voice prints, etc. Then, depending on who is controlling the software, the locks on the door could engage and the shuttle could drive around town dropping off its passengers at various locations eager for their arrival. Trivial to round up illegal aliens, parole violators, or people with standing warrants for arrest. Equally easy to nab people who are delinquent on their taxes, credit cards, mortgages, and spousal support. With a little info from Facebook or Google, a drop-off at the local attitude-adjustment facility might be desirable for those who frequent alternative media or have unhealthy interests in conspiracy theories or the activities at pizza parlors. Just think about the wonderful possibilities here!

Twee Surgeon -> PitBullsRule •Jan 14, 2017 6:29 PM

Will unemployed taxi drivers be allowed on the bus with a bottle of vodka and a gallon of gas with a rag in it?

When the robot trucks arrive at the robot factory and are unloaded by robot forklifts, who will buy the end products ?

It won't be truck drivers, taxi drivers or automated production line workers.

The only way massive automation would work is if some people were planning on a vastly reduced population in the future. It has happened before, they called it the Black Death. The Cultural and Economic consequences of it in Europe were enormous, world changing and permanent.

animalspirit •Jan 14, 2017 6:32 PM

$10K / month ... that's $120,000 / year.

For an autonomous golf cart?