Software Engineering


Software Engineering: A study akin to numerology and astrology, but lacking the precision of the former and the success of the latter.

KISS Principle     /kis' prin'si-pl/ n.     "Keep It Simple, Stupid". A maxim often invoked when discussing design to fend off creeping featurism and control development complexity. Possibly related to the marketroid maxim on sales presentations, "Keep It Short and Simple".

creeping featurism     /kree'ping fee'chr-izm/ n.     [common] 1. Describes a systematic tendency to load more chrome and features onto systems at the expense of whatever elegance they may have possessed when originally designed. See also feeping creaturism. "You know, the main problem with BSD Unix has always been creeping featurism." 2. More generally, the tendency for anything complicated to become even more complicated because people keep saying "Gee, it would be even better if it had this feature too". (See feature.) The result is usually a patchwork because it grew one ad-hoc step at a time, rather than being planned. Planning is a lot of work, but it's easy to add just one extra little feature to help someone ... and then another ... and another... When creeping featurism gets out of hand, it's like a cancer. Usually this term is used to describe computer programs, but it could also be said of the federal government, the IRS 1040 form, and new cars. A similar phenomenon sometimes afflicts conscious redesigns; see second-system effect. See also creeping elegance.
Jargon file

Software engineering (SE) probably has the largest concentration of snake oil salesmen after OO programming, and software architecture is far from being an exception. Many published software methodologies/architectures claim to provide benefits that most of them cannot deliver (UML is one good example). I see a lot of oversimplification of the real situation and unnecessary (and useless) formalisms. The main idea advocated here is simplification of software architecture (including usage of the well-understood "pipe and filter" model) and scripting languages.
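
To make the pipe-and-filter idea concrete, here is a minimal sketch in TypeScript (the stage names and the toy word-count task are mine, purely for illustration): each filter is a small function, and the pipeline is just function composition.

    // Pipe and filter: each stage consumes the previous stage's output.
    type Filter<A, B> = (input: A) => B;

    const toLines: Filter<string, string[]> = (text) => text.split("\n");
    const nonEmpty: Filter<string[], string[]> = (lines) =>
      lines.filter((l) => l.trim().length > 0);
    const wordCount: Filter<string[], number> = (lines) =>
      lines.reduce((n, l) => n + l.split(/\s+/).length, 0);

    // A generic two-stage pipe; chain it to build longer pipelines.
    const pipe = <A, B, C>(f: Filter<A, B>, g: Filter<B, C>): Filter<A, C> =>
      (x) => g(f(x));

    const countWords = pipe(pipe(toLines, nonEmpty), wordCount);
    console.log(countWords("to be\nor not\n\nto be")); // 6

The appeal is exactly the KISS point made on this page: each stage can be written, tested, and replaced in isolation, just as in a Unix shell pipeline.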

There are few quality general architectural resources available on the Net, so the list below represents only some links that interest me personally. The stress here is on skepticism, and this collection is neither complete nor up to date. Still, it might help students who are trying to study this complex and interesting subject. Or, if you are already a software architect, you might be able to expand your knowledge of the subject.

Excessive zeal in adopting some fashionable but questionable methodology is a "real and present danger" in software engineering. This is not a new threat: it started with the structured programming revolution and then the search for the verification "holy land", with Edsger W. Dijkstra as the new prophet of an obscure cult. The main problem here is that all those methodologies contain perhaps 20% useful elements; the other 80% kill the useful elements and probably introduce some real disadvantages. After a dozen or so partially useful but mostly useless methodologies came, were enthusiastically adopted, and went into oblivion, we should definitely be skeptical.

All this "extreme programming" idiotism or CMM Lysenkoism should be treated as we treat dangerous religious sects.  It's undemocratic and stupid to prohibit them but it's equally dangerous and stupid to follow their recommendations ;-). As Talleyrand advised to junior diplomats: "Above all, gentlemen, not too much zeal. "  By this phrase, Talleyrand was reportedly recommended to his subordinates that important decisions must be based upon the exercise of cool-headed reason and not upon emotions or any waxing or waning popular delusion.

One interesting fact about software architecture is that it can't be practiced from the "ivory tower". Only when you do the coding yourself and face the limitations of the tools and hardware can you create a great architecture. See Real Insights into Architecture Come Only From Actual Programming.

The primary purpose of software architecture courses is to teach students higher-level skills useful in designing and implementing complex software systems. Such a course usually includes some information about classification (general and domain-specific architectures), analysis, and tools. As the folks at Bredemeyer Consulting aptly noted in their paper The Role of the Software Architect:

A simplistic view of the role is that architects create architectures, and their responsibilities encompass all that is involved in doing so. This would include articulating the architectural vision, conceptualizing and experimenting with alternative architectural approaches, creating models and component and interface specification documents, and validating the architecture against requirements and assumptions.

However, any experienced architect knows that the role involves not just these technical activities, but others that are more political and strategic in nature on the one hand, and more like those of a consultant, on the other. A sound sense of business and technical strategy is required to envision the "right" architectural approach to the customer's problem set, given the business objectives of the architect's organization. Activities in this area include the creation of technology roadmaps, making assertions about technology directions and determining their consequences for the technical strategy and hence architectural approach.

Further, architectures are seldom embraced without considerable challenges from many fronts. The architect thus has to shed any distaste for what may be considered "organizational politics", and actively work to sell the architecture to its various stakeholders, communicating extensively and working networks of influence to ensure the ongoing success of the architecture.

But "buy-in" to the architecture vision is not enough either. Anyone involved in implementing the architecture needs to understand it. Since weighty architectural documents are notorious dust-gatherers, this involves creating and teaching tutorials and actively consulting on the application of the architecture, and being available to explain the rationale behind architectural choices and to make amendments to the architecture when justified.

Lastly, the architect must lead--the architecture team, the developer community, and, in its technical direction, the organization.

Again, I would like to stress that the main principle of software architecture is simple and well known -- the famous KISS principle. While the principle is simple, its implementation is not, and many developers (especially developers with limited resources) have paid dearly for violating it. I have found only one reference on simplicity in SE: R. S. Pressman. Simplicity. In Software Engineering, A Practitioner's Approach, page 452. McGraw Hill, 1997. Here open source tools can help, because for those tools complexity is not the competitive advantage it is for closed source tools. But that is not necessarily true of actual tools, as one problem with open source projects is a change of leader. This is the moment when many projects lose architectural integrity and become a Byzantine compendium of conflicting approaches.

I appreciate an architecture of a software system that leads to a small implementation with a simple, Spartan interface. These days the usage of scripting languages can cut the volume of code by more than half in comparison with Java. That's why this site advocates the usage of scripting languages for complex software projects.

"Real Beauty can be found in Simplicity," and as you may know already, ' "Less" sometimes equal "More".' I continue to adhere to that philosophy. If you, too, have an eye for simplicity in software engineering, then you might benefit from this collection of links.

I think writing a good software system is somewhat similar to writing a multivolume series of books. Most writers will rewrite each chapter several times and change the general structure a lot. Rewriting large systems is more difficult, but also very beneficial. It always makes sense to consider the current version of the system a draft that can be substantially improved and simplified by discovering some new unifying and simplifying paradigm. Sometimes you can take a wrong direction, but still: nothing ventured, nothing gained.

On the subsystem level, a decent configuration management system can make going back easier. Too often people try to write and debug an architecturally flawed "first draft" when it would have been much simpler and faster to rewrite it based on a better understanding of the architecture and the problem. Rewriting can actually save the time spent debugging the old version. That way, when you're done, you may get an easy-to-understand, simple software system, instead of a system that merely "seems to work okay" (which is only as correct as your testing).

On the component level, refactoring (see Refactoring: Improving the Design of Existing Code) might be a useful simplification technique. Actually "rewriting" is the simpler term, but let's assume that refactoring is rewriting with some ideological frosting ;-). See Slashdot Book Reviews: Refactoring: Improving the Design of Existing Code.

Another relevant work (the author tries to promote his own solution -- you can skip this part) is the critique of the "technology mudslide" in the book The Innovator's Dilemma by Harvard Business School professor Clayton M. Christensen. He defined the term "technology mudslide", a concept very similar to Brooks' "software development tar pit": a perpetual cycle of abandoning or retooling existing systems in pursuit of the latest fashionable technology trend -- a cycle in which

 "Coping with the relentless onslaught of technology change was akin to trying to climb a mudslide raging down a hill. You have to scramble with everything you've got to stay on top of it. and if you ever once stop to catch your breath, you get buried."

The complexity caused by adopting new technology for the sake of new technology is further exacerbated by the narrow focus and inexperience of many project leaders -- inexperience with mission-critical systems, with systems of larger scale than previously built, with software development disciplines, and with project management. A Standish Group International survey recently showed that 46% of IT projects were over budget and overdue -- and 28% failed altogether. That's normal, and the real failure figures are probably higher: great software managers and architects are rare, and it is those people who determine the success of a software project.

Dr. Nikolai Bezroukov



Old News ;-)

[Jul 23, 2019] Object-Oriented Programming -- The Trillion Dollar Disaster

While the OO critique is good (although most points are far from new) and to the point, the proposed solution is not. There is no universal key for creating elegant, reliable programs.
Notable quotes:
"... Object-Oriented Programming has been created with one goal in mind -- to manage the complexity of procedural codebases. In other words, it was supposed to improve code organization. There's no objective and open evidence that OOP is better than plain procedural programming. ..."
"... The bitter truth is that OOP fails at the only task it was intended to address. It looks good on paper -- we have clean hierarchies of animals, dogs, humans, etc. However, it falls flat once the complexity of the application starts increasing. Instead of reducing complexity, it encourages promiscuous sharing of mutable state and introduces additional complexity with its numerous design patterns . OOP makes common development practices, like refactoring and testing, needlessly hard. ..."
"... C++ is a horrible [object-oriented] language And limiting your project to C means that people don't screw things up with any idiotic "object model" c&@p. -- Linus Torvalds, the creator of Linux ..."
"... Many dislike speed limits on the roads, but they're essential to help prevent people from crashing to death. Similarly, a good programming framework should provide mechanisms that prevent us from doing stupid things. ..."
"... Unfortunately, OOP provides developers too many tools and choices, without imposing the right kinds of limitations. Even though OOP promises to address modularity and improve reusability, it fails to deliver on its promises (more on this later). OOP code encourages the use of shared mutable state, which has been proven to be unsafe time and time again. OOP typically requires a lot of boilerplate code (low signal-to-noise ratio). ..."
Jul 23, 2019 | medium.com

The ultimate goal of every software developer should be to write reliable code. Nothing else matters if the code is buggy and unreliable. And what is the best way to write code that is reliable? Simplicity. Simplicity is the opposite of complexity. Therefore our first and foremost responsibility as software developers should be to reduce code complexity.

Disclaimer

I'll be honest, I'm not a raving fan of object-orientation. Of course, this article is going to be biased. However, I have good reasons to dislike OOP.

I also understand that criticism of OOP is a very sensitive topic -- I will probably offend many readers. However, I'm doing what I think is right. My goal is not to offend, but to raise awareness of the issues that OOP introduces.

I'm not criticizing Alan Kay's OOP -- he is a genius. I wish OOP was implemented the way he designed it. I'm criticizing the modern Java/C# approach to OOP.

I will also admit that I'm angry. Very angry. I think that it is plain wrong that OOP is considered the de-facto standard for code organization by many people, including those in very senior technical positions. It is also wrong that many mainstream languages don't offer any other alternatives to code organization other than OOP.

Hell, I used to struggle a lot myself while working on OOP projects. And I had no clue why I was struggling so much. Maybe I wasn't good enough? I had to learn a couple more design patterns (I thought)! Eventually, I got completely burned out.

This post sums up my first-hand decade-long journey from Object-Oriented to Functional programming. I've seen it all. Unfortunately, no matter how hard I try, I can no longer find use cases for OOP. I have personally seen OOP projects fail because they become too complex to maintain.


TLDR

Object oriented programs are offered as alternatives to correct ones -- Edsger W. Dijkstra , pioneer of computer science


Object-Oriented Programming has been created with one goal in mind -- to manage the complexity of procedural codebases. In other words, it was supposed to improve code organization. There's no objective and open evidence that OOP is better than plain procedural programming.

The bitter truth is that OOP fails at the only task it was intended to address. It looks good on paper -- we have clean hierarchies of animals, dogs, humans, etc. However, it falls flat once the complexity of the application starts increasing. Instead of reducing complexity, it encourages promiscuous sharing of mutable state and introduces additional complexity with its numerous design patterns. OOP makes common development practices, like refactoring and testing, needlessly hard.

Some might disagree with me, but the truth is that modern OOP has never been properly designed. It never came out of a proper research institution (in contrast with Haskell/FP). I do not consider Xerox or another enterprise to be a "proper research institution". OOP doesn't have decades of rigorous scientific research to back it up. Lambda calculus offers a complete theoretical foundation for Functional Programming. OOP has nothing to match that. OOP mainly "just happened".

Using OOP is seemingly innocent in the short-term, especially on greenfield projects. But what are the long-term consequences of using OOP? OOP is a time bomb, set to explode sometime in the future when the codebase gets big enough.

Projects get delayed, deadlines get missed, developers get burned-out, adding in new features becomes next to impossible. The organization labels the codebase as the "legacy codebase", and the development team plans a rewrite.

OOP is not natural for the human brain; our thought process is centered around "doing" things -- go for a walk, talk to a friend, eat pizza. Our brains have evolved to do things, not to organize the world into complex hierarchies of abstract objects.

OOP code is non-deterministic -- unlike with functional programming, we're not guaranteed to get the same output given the same inputs. This makes reasoning about the program very hard. As an oversimplified example, the output of 2+2 or calculator.Add(2, 2) is mostly equal to four, but sometimes it might turn out to be three, five, or maybe even 1004. The dependencies of the Calculator object might change the result of the computation in subtle, but profound ways.
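
A small sketch of that claim (the Calculator class and its adjustment field are invented here for illustration): the pure function is referentially transparent, while the object's answer depends on hidden mutable state.

    // Pure function: same inputs, same output, every time.
    const add = (a: number, b: number): number => a + b;

    // Stateful object: the answer depends on hidden mutable state.
    class Calculator {
      private adjustment = 0; // invisible to the caller of add()
      setAdjustment(n: number): void { this.adjustment = n; }
      add(a: number, b: number): number { return a + b + this.adjustment; }
    }

    const calc = new Calculator();
    console.log(add(2, 2));      // always 4
    console.log(calc.add(2, 2)); // 4 ... today
    calc.setAdjustment(1000);    // some distant code flips the state
    console.log(calc.add(2, 2)); // 1004 -- same call, different result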


The Need for a Resilient Framework

I know, this may sound weird, but as programmers, we shouldn't trust ourselves to write reliable code. Personally, I am unable to write good code without a strong framework to base my work on. Yes, there are frameworks that concern themselves with some very particular problems (e.g. Angular or ASP.Net).

I'm not talking about the software frameworks. I'm talking about the more abstract dictionary definition of a framework: "an essential supporting structure" -- frameworks that concern themselves with the more abstract things like code organization and tackling code complexity. Even though Object-Oriented and Functional Programming are both programming paradigms, they're also both very high-level frameworks.

Limiting our choices

C++ is a horrible [object-oriented] language. And limiting your project to C means that people don't screw things up with any idiotic "object model" c&@p. -- Linus Torvalds, the creator of Linux

Linus Torvalds is widely known for his open criticism of C++ and OOP. One thing he was 100% right about is limiting programmers in the choices they can make. In fact, the fewer choices programmers have, the more resilient their code becomes. In the quote above, Linus Torvalds highly recommends having a good framework to base our code upon.


Many dislike speed limits on the roads, but they're essential to help prevent people from crashing to death. Similarly, a good programming framework should provide mechanisms that prevent us from doing stupid things.

A good programming framework helps us to write reliable code. First and foremost, it should help reduce complexity by providing the following things:

  1. Modularity and reusability
  2. Proper state isolation
  3. High signal-to-noise ratio

Unfortunately, OOP provides developers too many tools and choices, without imposing the right kinds of limitations. Even though OOP promises to address modularity and improve reusability, it fails to deliver on its promises (more on this later). OOP code encourages the use of shared mutable state, which has been proven to be unsafe time and time again. OOP typically requires a lot of boilerplate code (low signal-to-noise ratio).

... ... ...

Messaging

Alan Kay coined the term "Object Oriented Programming" in the 1960s. He had a background in biology and was attempting to make computer programs communicate the same way living cells do.


Alan Kay's big idea was to have independent programs (cells) communicate by sending messages to each other. The state of the independent programs would never be shared with the outside world (encapsulation).

That's it. OOP was never intended to have things like inheritance, polymorphism, the "new" keyword, and the myriad of design patterns.
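
A hedged sketch of what Kay-style messaging looks like, here approximated in TypeScript with a closure standing in for a "cell" (the account example is mine): state stays private, and the only way in is an immutable message.

    // A "cell": private state, reachable only through messages.
    type Message =
      | { readonly kind: "deposit"; readonly amount: number }
      | { readonly kind: "balance?" };

    function makeAccount() {
      let balance = 0; // never shared with the outside world
      return (msg: Message): number | undefined => {
        switch (msg.kind) {
          case "deposit":
            balance += msg.amount;
            return undefined;
          case "balance?":
            return balance;
        }
      };
    }

    const account = makeAccount();
    account({ kind: "deposit", amount: 50 });
    console.log(account({ kind: "balance?" })); // 50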

OOP in its purest form

Erlang is OOP in its purest form. Unlike more mainstream languages, it focuses on the core idea of OOP -- messaging. In Erlang, objects communicate by passing immutable messages to each other.

Is there proof that immutable messages are a superior approach compared to method calls?

Hell yes! Erlang is probably the most reliable language in the world. It powers most of the world's telecom (and hence the internet) infrastructure. Some of the systems written in Erlang have reliability of 99.9999999% (you read that right -- nine nines).

Code Complexity

With OOP-inflected programming languages, computer software becomes more verbose, less readable, less descriptive, and harder to modify and maintain.

-- Richard Mansfield

The most important aspect of software development is keeping the code complexity down. Period. None of the fancy features matter if the codebase becomes impossible to maintain. Even 100% test coverage is worth nothing if the codebase becomes too complex and unmaintainable.

What makes the codebase complex? There are many things to consider, but in my opinion, the top offenders are: shared mutable state, erroneous abstractions, and low signal-to-noise ratio (often caused by boilerplate code). All of them are prevalent in OOP.


The Problems of State


What is state? Simply put, state is any temporary data stored in memory. Think variables or fields/properties in OOP. Imperative programming (including OOP) describes computation in terms of the program state and changes to that state. Declarative (functional) programming describes the desired results instead and doesn't specify changes to the state explicitly.
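
A minimal illustration of the two styles (the numbers are arbitrary): the imperative version narrates state changes, the declarative version states the result.

    const prices = [5, 12, 8, 30];

    // Imperative: step-by-step changes to program state.
    let total = 0;
    for (const p of prices) {
      if (p > 10) total += p; // mutate the accumulator
    }

    // Declarative: describe the desired result; no visible mutation.
    const total2 = prices.filter((p) => p > 10).reduce((a, p) => a + p, 0);

    console.log(total, total2); // 42 42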

... ... ...

To make the code more efficient, objects are passed not by their value, but by their reference. This is where "dependency injection" falls flat.

Let me explain. Whenever we create an object in OOP, we pass references to its dependencies to the constructor. Those dependencies also have their own internal state. The newly created object happily stores references to those dependencies in its internal state and is then happy to modify them in any way it pleases. And it also passes those references down to anything else it might end up using.

This creates a complex graph of promiscuously shared objects that all end up changing each other's state. This, in turn, causes huge problems since it becomes almost impossible to see what caused the program state to change. Days might be wasted trying to debug such state changes. And you're lucky if you don't have to deal with concurrency (more on this later).
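
A sketch of that failure mode, with invented class names: two objects receive the same dependency by reference, and one quietly changes the other's behavior.

    class Settings {
      retries = 3; // mutable state, shared by reference below
    }

    class Fetcher {
      constructor(private settings: Settings) {}
      goFast(): void { this.settings.retries = 0; } // "tunes" itself...
    }

    class Saver {
      constructor(private settings: Settings) {}
      save(): void { console.log(`saving with ${this.settings.retries} retries`); }
    }

    const shared = new Settings();
    const fetcher = new Fetcher(shared);
    const saver = new Saver(shared);

    saver.save();     // saving with 3 retries
    fetcher.goFast(); // ...and silently changes Saver's behavior too
    saver.save();     // saving with 0 retries -- who did that, and where?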

Methods/Properties

The methods or properties that provide access to particular fields are no better than changing the value of a field directly. It doesn't matter whether you mutate an object's state by using a fancy property or method -- the result is the same: mutated state.

Some people say that OOP tries to model the real world. This is simply not true -- OOP has nothing to relate to in the real world. Trying to model programs as objects probably is one of the biggest OOP mistakes.

The real world is not hierarchical

OOP attempts to model everything as a hierarchy of objects. Unfortunately, that is not how things work in the real world. Objects in the real world interact with each other using messages, but they mostly are independent of each other.

Inheritance in the real world

OOP inheritance is not modeled after the real world. The parent object in the real world is unable to change the behavior of child objects at run-time. Even though you inherit your DNA from your parents, they're unable to make changes to your DNA as they please. You do not inherit "behaviors" from your parents, you develop your own behaviors. And you're unable to "override" your parents' behaviors.

The real world has no methods

Does the piece of paper you're writing on have a "write" method? No! You take an empty piece of paper, pick up a pen, and write some text. You, as a person, don't have a "write" method either -- you make the decision to write some text based on outside events or your internal thoughts.


The Kingdom of Nouns

Objects bind functions and data structures together in indivisible units. I think this is a fundamental error since functions and data structures belong in totally different worlds.

-- Joe Armstrong , creator of Erlang

Objects (or nouns) are at the very core of OOP. A fundamental limitation of OOP is that it forces everything into nouns. And not everything should be modeled as nouns. Operations (functions) should not be modeled as objects. Why are we forced to create a Multiplier class when all we need is a function that multiplies two numbers? Simply have a Multiply function, let data be data and let functions be functions!
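
The contrast in code (Multiplier is the article's hypothetical noun; the interface is my addition, since that is how such classes are usually written):

    // The Kingdom of Nouns: an operation dressed up as an object.
    interface BinaryOperation {
      execute(a: number, b: number): number;
    }
    class Multiplier implements BinaryOperation {
      execute(a: number, b: number): number { return a * b; }
    }
    console.log(new Multiplier().execute(6, 7)); // 42

    // Data is data, functions are functions.
    const multiply = (a: number, b: number): number => a * b;
    console.log(multiply(6, 7)); // 42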

In non-OOP languages, doing trivial things like saving data to a file is straightforward -- very similar to how you would describe an action in plain English.

Real-world example, please!

Sure, going back to the painter example, the painter owns a PaintingFactory. He has hired a dedicated BrushManager, a ColorManager, a CanvasManager and a MonaLisaProvider. His good friend zombie makes use of a BrainConsumingStrategy. Those objects, in turn, define the following methods: CreatePainting, FindBrush, PickColor, CallMonaLisa, and ConsumeBrainz.

Of course, this is plain stupidity, and could never have happened in the real world. How much unnecessary complexity has been created for the simple act of drawing a painting?

There's no need to invent strange concepts to hold your functions when they're allowed to exist separately from the objects.


Unit Testing

Automated testing is an important part of the development process and helps tremendously in preventing regressions (i.e. bugs being introduced into existing code). Unit Testing plays a huge role in the process of automated testing.

Some might disagree, but OOP code is notoriously difficult to unit test. Unit Testing assumes testing things in isolation, and to make a method unit-testable:

  1. Its dependencies have to be extracted into a separate class.
  2. Create an interface for the newly created class.
  3. Declare fields to hold the instance of the newly created class.
  4. Make use of a mocking framework to mock the dependencies.
  5. Make use of a dependency-injection framework to inject the dependencies.

How much more complexity has to be created just to make a piece of code testable? How much time was wasted just to make some code testable?

PS: we'd also have to instantiate the entire class in order to test a single method. This would also bring in the code from all of its parent classes.
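
A sketch of steps 1-5 above applied to a trivial example (all names invented; a hand-rolled stub stands in for the mocking framework), followed by the pure-function version that needs none of the scaffolding:

    // Production code under test: a greeter whose only dependency is a clock.
    interface Clock {                      // step 2: an interface
      now(): Date;
    }

    class SystemClock implements Clock {   // step 1: dependency as a class
      now() { return new Date(); }
    }

    class Greeter {
      constructor(private clock: Clock) {} // steps 3 & 5: field + injection
      greet(name: string): string {
        return this.clock.now().getHours() < 12
          ? `Good morning, ${name}`
          : `Hello, ${name}`;
      }
    }

    // step 4: a hand-rolled stub standing in for a mocking framework.
    const fixedClock: Clock = { now: () => new Date(2019, 6, 22, 9, 0) };
    console.log(new Greeter(fixedClock).greet("Ada")); // Good morning, Ada

    // Contrast: a pure function needs none of the scaffolding above.
    const greet = (hour: number, name: string) =>
      hour < 12 ? `Good morning, ${name}` : `Hello, ${name}`;
    console.log(greet(9, "Ada")); // Good morning, Ada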

With OOP, writing tests for legacy code is even harder -- almost impossible. Entire companies have been created (TypeMock) around the issue of testing legacy OOP code.

Boilerplate code

Boilerplate code is probably the biggest offender when it comes to the signal-to-noise ratio. Boilerplate code is "noise" that is required to get the program to compile. Boilerplate code takes time to write and makes the codebase less readable because of the added noise.

While "program to an interface, not to an implementation" is the recommended approach in OOP, not everything should become an interface. We'd have to resort to using interfaces in the entire codebase, for the sole purpose of testability. We'd also probably have to make use of dependency injection, which further introduced unnecessary complexity.

Testing private methods

Some people say that private methods shouldn't be tested. I tend to disagree: unit testing is called "unit" for a reason -- test small units of code in isolation. Yet testing of private methods in OOP is nearly impossible. We shouldn't be making private methods internal just for the sake of testability.

In order to achieve testability of private methods, they usually have to be extracted into a separate object. This, in turn, introduces unnecessary complexity and boilerplate code.


Refactoring

Refactoring is an important part of a developer's day-to-day job. Ironically, OOP code is notoriously hard to refactor. Refactoring is supposed to make the code less complex, and more maintainable. On the contrary, refactored OOP code becomes significantly more complex -- to make the code testable, we'd have to make use of dependency injection, and create an interface for the refactored class. Even then, refactoring OOP code is really hard without dedicated tools like Resharper.

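The embedded code sample is no longer available here; below is a reconstructed sketch of the kind of OOP extract-method refactor the text goes on to describe, assuming the original extracted a small isValidInput check into an injectable class (the names are guesses, based on the JavaScript contrast below):

    // Before: one short method with an inline check.
    class Form {
      submit(input: string): void {
        if (input.length > 0 && input.length < 100) { /* ... */ }
      }
    }

    // After: the check extracted "testably", OOP style.
    interface InputValidator {
      isValidInput(input: string): boolean;
    }
    class LengthValidator implements InputValidator {
      isValidInput(input: string): boolean {
        return input.length > 0 && input.length < 100;
      }
    }
    class FormRefactored {
      constructor(private validator: InputValidator) {}
      submit(input: string): void {
        if (this.validator.isValidInput(input)) { /* ... */ }
      }
    }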

In the simple example above, the line count has more than doubled just to extract a single method. Why does refactoring create even more complexity, when the code is being refactored in order to decrease complexity in the first place?

Contrast this to a similar refactor of non-OOP code in JavaScript:

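That embed is also gone; here is a reconstruction based on the description that follows (the function moves to its own module, one import is added, and _isValidInput becomes a default parameter for testability):

    // validation.js -- the function simply moved to its own file.
    export function isValidInput(input) {
      return input.length > 0 && input.length < 100;
    }

    // form.js -- one added import; tests can pass a stub via _isValidInput.
    import { isValidInput } from "./validation.js";

    export function submit(input, _isValidInput = isValidInput) {
      if (_isValidInput(input)) { /* ... */ }
    }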

The code has literally stayed the same -- we simply moved the isValidInput function to a different file and added a single line to import that function. We've also added _isValidInput to the function signature for the sake of testability.

This is a simple example, but in practice the complexity grows exponentially as the codebase gets bigger.

And that's not all. Refactoring OOP code is extremely risky. Complex dependency graphs and state scattered all over the OOP codebase make it impossible for the human brain to consider all of the potential issues.


The Band-aids

What do we do when something is not working? It's simple: we only have two options -- throw it away or try to fix it. OOP is something that can't be thrown away easily; millions of developers are trained in OOP, and millions of organizations worldwide are using it.

You probably see now that OOP doesn't really work: it makes our code complex and unreliable. And you're not alone! People have been thinking hard for decades trying to address the issues prevalent in OOP code. They've come up with a myriad of design patterns.

Design patterns

OOP provides a set of guidelines that should theoretically allow developers to incrementally build larger and larger systems: SOLID principle, dependency injection, design patterns, and others.

Unfortunately, the design patterns are nothing other than band-aids. They exist solely to address the shortcomings of OOP. A myriad of books has even been written on the topic. They wouldn't have been so bad, had they not been responsible for the introduction of enormous complexity to our codebases.

The problem factory

In fact, it is impossible to write good and maintainable Object-Oriented code.

On one side of the spectrum we have an OOP codebase that is inconsistent and doesn't seem to adhere to any standards. On the other side of the spectrum, we have a tower of over-engineered code, a bunch of erroneous abstractions built one on top of another. Design patterns are very helpful in building such towers of abstractions.

Soon, adding in new functionality, and even making sense of all the complexity, gets harder and harder. The codebase will be full of things like SimpleBeanFactoryAwareAspectInstanceFactory, AbstractInterceptorDrivenBeanDefinitionDecorator, TransactionAwarePersistenceManagerFactoryProxy or RequestProcessorFactoryFactory.

Precious brainpower has to be wasted trying to understand the tower of abstractions that the developers themselves have created. The absence of structure is in many cases better than having bad structure (if you ask me).

Image: "Theory vs. reality" (source: https://www.reddit.com/r/ProgrammerHumor/comments/418x95/theory_vs_reality/)

Further reading: FizzBuzzEnterpriseEdition

[Jul 22, 2019] Is Object-Oriented Programming a Trillion Dollar Disaster - Slashdot

Jul 22, 2019 | developers.slashdot.org

Is Object-Oriented Programming a Trillion Dollar Disaster? (medium.com) Posted by EditorDavid on Monday July 22, 2019 @01:04AM from the OOPs dept. Senior full-stack engineer Ilya Suzdalnitski recently published a lively 6,000-word essay calling object-oriented programming "a trillion dollar disaster." Precious time and brainpower are being spent thinking about "abstractions" and "design patterns" instead of solving real-world problems... Object-Oriented Programming (OOP) has been created with one goal in mind -- to manage the complexity of procedural codebases. In other words, it was supposed to improve code organization. There's no objective and open evidence that OOP is better than plain procedural programming... Instead of reducing complexity, it encourages promiscuous sharing of mutable state and introduces additional complexity with its numerous design patterns. OOP makes common development practices, like refactoring and testing, needlessly hard...

Using OOP is seemingly innocent in the short-term, especially on greenfield projects. But what are the long-term consequences of using OOP? OOP is a time bomb, set to explode sometime in the future when the codebase gets big enough. Projects get delayed, deadlines get missed, developers get burned-out, adding in new features becomes next to impossible. The organization labels the codebase as the "legacy codebase", and the development team plans a rewrite.... OOP provides developers too many tools and choices, without imposing the right kinds of limitations. Even though OOP promises to address modularity and improve reusability, it fails to deliver on its promises...

I'm not criticizing Alan Kay's OOP -- he is a genius. I wish OOP was implemented the way he designed it. I'm criticizing the modern Java/C# approach to OOP... I think that it is plain wrong that OOP is considered the de-facto standard for code organization by many people, including those in very senior technical positions. It is also wrong that many mainstream languages don't offer any other alternatives to code organization other than OOP.

The essay ultimately blames Java for the popularity of OOP, citing Alan Kay's comment that Java "is the most distressing thing to happen to computing since MS-DOS." It also quotes Linus Torvalds's observation that "limiting your project to C means that people don't screw things up with any idiotic 'object model'."

And it ultimately suggests Functional Programming as a superior alternative, making the following assertions about OOP:

"OOP code encourages the use of shared mutable state, which has been proven to be unsafe time and time again... [E]ncapsulation, in fact, is glorified global state." "OOP typically requires a lot of boilerplate code (low signal-to-noise ratio)." "Some might disagree, but OOP code is notoriously difficult to unit test... [R]efactoring OOP code is really hard without dedicated tools like Resharper." "It is impossible to write good and maintainable Object-Oriented code."

segedunum ( 883035 ) , Monday July 22, 2019 @05:36AM ( #58964224 )

Re:Not Tiresome, Hilariously Hypocritical ( Score: 4 , Informative)
There's no objective and open evidence that OOP is better than plain procedural programming...

...which is followed by the author's subjective opinions about why procedural programming is better than OOP. There's no objective comparison of the pros and cons of OOP vs procedural, just a rant about some of OOP's problems.

We start from the point-of-view that OOP has to prove itself. Has it? Has any project or programming exercise ever taken less time because it is object-oriented?

Precious time and brainpower are being spent thinking about "abstractions" and "design patterns" instead of solving real-world problems...

...says the person who took the time to write a 6,000 word rant on "why I hate OOP".

Sadly, that was something you hallucinated. He doesn't say that anywhere.

mfnickster ( 182520 ) , Monday July 22, 2019 @10:54AM ( #58965660 )
Re:Tiresome ( Score: 5 , Interesting)

Inheritance, while not "inherently" bad, is often the wrong solution. See: Why extends is evil [javaworld.com]

Composition is frequently a more appropriate choice. Aaron Hillegass wrote this funny little anecdote in Cocoa Programming for Mac OS X [google.com]:

"Once upon a time, there was a company called Taligent. Taligent was created by IBM and Apple to develop a set of tools and libraries like Cocoa. About the time Taligent reached the peak of its mindshare, I met one of its engineers at a trade show. I asked him to create a simple application for me: A window would appear with a button, and when the button was clicked, the words 'Hello, World!' would appear in a text field. The engineer created a project and started subclassing madly: subclassing the window and the button and the event handler. Then he started generating code: dozens of lines to get the button and the text field onto the window. After 45 minutes, I had to leave. The app still did not work. That day, I knew that the company was doomed. A couple of years later, Taligent quietly closed its doors forever."

Darinbob ( 1142669 ) , Monday July 22, 2019 @03:00AM ( #58963760 )
Re:The issue ( Score: 5 , Insightful)

Almost every programming methodology can be abused by people who really don't know how to program well, or who don't want to. They'll happily create frameworks, implement new development processes, and chart tons of metrics, all while avoiding the work of getting the job done. In some cases the person who writes the most code is the same one who gets the least amount of useful work done.

So, OOP can be misused the same way. Never mind that OOP essentially began very early and has been reimplemented over and over, even before Alan Kay. Ie, files in Unix are essentially an object oriented system. It's just data encapsulation and separating work into manageable modules. That's how it was before anyone ever came up with the dumb name "full-stack developer".

cardpuncher ( 713057 ) , Monday July 22, 2019 @04:06AM ( #58963948 )
Re:The issue ( Score: 5 , Insightful)

As a developer who started in the days of FORTRAN (when it was all-caps), I've watched the rise of OOP with some curiosity. I think there's a general consensus that abstraction and re-usability are good things - they're the reason subroutines exist - the issue is whether they are ends in themselves.

I struggle with the whole concept of "design patterns". There are clearly common themes in software, but there seems to be a great deal of pressure these days to make your implementation fit some pre-defined template rather than thinking about the application's specific needs for state and concurrency. I have seen some rather eccentric consequences of "patternism".

Correctly written, OOP code allows you to encapsulate just the logic you need for a specific task and to make that specific task available in a wide variety of contexts by judicious use of templating and virtual functions that obviate the need for "refactoring". Badly written, OOP code can have as many dangerous side effects and as much opacity as any other kind of code. However, I think the key factor is not the choice of programming paradigm, but the design process. You need to think first about what your code is intended to do and in what circumstances it might be reused. In the context of a larger project, it means identifying commonalities and deciding how best to implement them once. You need to document that design and review it with other interested parties. You need to document the code with clear information about its valid and invalid use. If you've done that, testing should not be a problem.

Some people seem to believe that OOP removes the need for some of that design and documentation. It doesn't and indeed code that you intend to be reused needs *more* design and documentation than the glue that binds it together in any one specific use case. I'm still a firm believer that coding begins with a pencil, not with a keyboard. That's particularly true if you intend to design abstract interfaces that will serve many purposes. In other words, it's more work to do OOP properly, so only do it if the benefits outweigh the costs - and that usually means you not only know your code will be genuinely reusable but will also genuinely be reused.

ImdatS ( 958642 ) , Monday July 22, 2019 @04:43AM ( #58964070 ) Homepage
Re:The issue ( Score: 5 , Insightful)
[...] I'm still a firm believer that coding begins with a pencil, not with a keyboard. [...]

This!
In fact, even more: I'm a firm believer that coding begins with a pencil designing the data model that you want to implement.

Everything else is just code that operates on that data model. Though I agree with most of what you say, I believe the classical "MVC" design-pattern is still valid. And, you know what, there is a reason why it is called "M-V-C": Start with the Model, continue with the View and finalize with the Controller. MVC not only stood for Model-View-Controller but also for the order of the implementation of each.

And preferably, as you stated correctly, "... start with pencil & paper ..."

Rockoon ( 1252108 ) , Monday July 22, 2019 @05:23AM ( #58964192 )
Re:The issue ( Score: 5 , Insightful)
I struggle with the whole concept of "design patterns".

Because design patterns are stupid.

A reasonable programmer can understand reasonable code so long as the data is documented, even when the code isn't documented, but will struggle immensely if it were the other way around. Bad programmers create objects for objects' sake, and because of that they have to follow so-called "design patterns", because no amount of code commenting makes the code easily understandable when it's a spaghetti web of interacting "objects". The "design patterns" don't make the code easier to read, just easier to write.

Those OOP fanatics, if they do "document" their code, add comments like "// increment the index" which is useless shit.

The big win of OOP is only in the encapsulation of the data with the code, and great code treats objects like data structures with attached subroutines, not as "objects", documents the fuck out of the contained data while more or less letting the code document itself, and keeps OO elements to a minimum. As it turns out, OOP is just much more effort than procedural, and it rarely pays off to invest that effort, at least for me.

Z00L00K ( 682162 ) , Monday July 22, 2019 @05:14AM ( #58964162 ) Homepage
Re:The issue ( Score: 4 , Insightful)

The problem isn't the object orientation paradigm itself, it's how it's applied.

The big problem in any project is that you have to understand how to break down the final solution into modules that can be developed largely independently of each other, and to identify the items that are shared. But even items that appear identical may not stay that way in the long run, so shared code can even be dangerous, because future developers don't know that by fixing problem A they create problems B, C, D and E.

Futurepower(R) ( 558542 ) writes: < MJennings.USA@NOT_any_of_THISgmail.com > on Monday July 22, 2019 @06:03AM ( #58964326 ) Homepage
Eternal September? ( Score: 4 , Informative)

Eternal September [wikipedia.org]

gweihir ( 88907 ) , Monday July 22, 2019 @07:48AM ( #58964672 )
Re:The issue ( Score: 3 )
Any time you make something easier, you lower the bar as well and now have a pack of idiots that never could have been hired if it weren't for a programming language that stripped out a lot of complexity for them.

Exactly. There are quite a few aspects of writing code that are difficult regardless of language and there the difference in skill and insight really matters.

Joce640k ( 829181 ) , Monday July 22, 2019 @04:14AM ( #58963972 ) Homepage
Re:The issue ( Score: 2 )

OO programming doesn't have any real advantages for small projects.

ImdatS ( 958642 ) , Monday July 22, 2019 @04:36AM ( #58964040 ) Homepage
Re:The issue ( Score: 5 , Insightful)

I have about 35+ years of software development experience, including with procedural, OOP and functional programming languages.

My experience is: The question "is procedural better than OOP or functional?" (or vice-versa) has a single answer: "it depends".

Like in your cases above, I would exactly do the same: use some procedural language that solves my problem quickly and easily.

In large-scale applications, I mostly used OOP (having learned OOP with Smalltalk & Objective-C). I don't like C++ or Java - but that's a matter of personal preference.

I use Python for large-scale scripts or machine learning/AI tasks.

I use Perl for short scripts that need to do a quick task.

Procedural is in fact easier to grasp for beginners as OOP and functional require a different way of thinking. If you start developing software, after a while (when the project gets complex enough) you will probably switch to OOP or functional.

Again, in my opinion neither is better than the other (procedural, OOP or functional). It just depends on the task at hand (and of course on the experience of the software developer).

spazmonkey ( 920425 ) , Monday July 22, 2019 @01:22AM ( #58963430 )
its the way OOP is taught ( Score: 5 , Interesting)

There is nothing inherently wrong with some of the functionality it offers; it's the way OOP is abused as a substitute for basic good programming practices. I was helping interns - students from a local CC - deal with idiotic assignments like making a random number generator USING CLASSES, or displaying text to a screen USING CLASSES. Seriously, WTF? A room full of career programmers could not even figure out how you were supposed to do that, much less why. What was worse was a lack of understanding of basic programming skill or even the use of variables, as the kids were being taught that EVERY program was to be assembled solely by sticking together bits of libraries. There was no coding, just hunting for snippets of preexisting code to glue together. Zero idea they could add their own, much less how to do it. OOP isn't the problem; it's the idea that it replaces basic programming skills and best practice.

sjames ( 1099 ) , Monday July 22, 2019 @02:30AM ( #58963680 ) Homepage Journal
Re:its the way OOP is taught ( Score: 5 , Interesting)

That and the obsession with absofrackinglutely EVERYTHING just having to be a formally declared object, including the whole program being an object with a run() method.

Some things actually cry out to be objects, some not so much. Generally, I find that my most readable and maintainable code turns out to be a procedural program that manipulates objects.

Even there, some things just naturally want to be a struct or just an array of values.

The same is true of most ingenious ideas in programming. It's one thing if code is demonstrating a particular idea, but production code is supposed to be there to do work, not grind an academic ax.

For example, slavish adherence to "patterns". They're quite useful for thinking about code and talking about code, but they shouldn't be the end of the discussion. They work better as a starting point. Some programs seem to want patterns to be mixed and matched.

In reality those problems are just cargo cult programming one level higher.

I suspect a lot of that is because too many developers barely grasp programming and never learned to go beyond the patterns they were explicitly taught.

When all you have is a hammer, the whole world looks like a nail.

bradley13 ( 1118935 ) , Monday July 22, 2019 @02:15AM ( #58963622 ) Homepage
It depends... ( Score: 5 , Insightful)

There are a lot of mediocre programmers who follow the principle "if you have a hammer, everything looks like a nail". They know OOP, so they think that every problem must be solved in an OOP way. In fact, OOP works well when your program needs to deal with relatively simple, real-world objects: the modelling follows naturally. If you are dealing with abstract concepts, or with highly complex real-world objects, then OOP may not be the best paradigm.

In Java, for example, you can program imperatively, by using static methods. The problem is knowing when to break the rules. For example, I am working on a natural language system that is supposed to generate textual answers to user inquiries. What "object" am I supposed to create to do this task? An "Answer" object that generates itself? Yes, that would work, but an imperative, static "generate answer" method makes at least as much sense.

There are different ways of thinking, different ways of modelling a problem. I get tired of the purists who think that OO is the only possible answer. The world is not a nail.

Beechmere ( 538241 ) , Monday July 22, 2019 @02:31AM ( #58963684 )
Class? Object? ( Score: 5 , Interesting)

I'm approaching 60, and I've been coding in COBOL, VB, FORTRAN, REXX, SQL for almost 40 years. I remember seeing Object Oriented Programming being introduced in the 80s, and I went on a course once (paid by work). I remember not understanding the concept of "Classes", and my impression was that the software we were buying was just trying to invent stupid new words for old familiar constructs (eg: Files, Records, Rows, Tables, etc). So I never transitioned away from my reliable mainframe programming platform. I thought the phrase OOP had died out long ago, along with "Client Server" (whatever that meant). I'm retiring in a few years, and the mainframe will outlive me. Everything else is buggy.

cb88 ( 1410145 ) , Monday July 22, 2019 @03:11AM ( #58963794 )
Going back to Torvald's quote.... ( Score: 5 , Funny)

"limiting your project to C means that people don't screw things up with any idiotic 'object model'."

GTK .... hold my beer... it is not a good argument against OOP languages. But first, let's see how OOP came into place. OOP was designed to provide encapsulation, like components, support reuse and code sharing. It was the next step coming from modules and units, which were better than libraries, as functions and procedures had namespaces, which helped structure code. OOP is a great idea when writing UI toolkits or similar stuff, as you can as

DrXym ( 126579 ) , Monday July 22, 2019 @04:57AM ( #58964116 )
No ( Score: 3 )

Like all things OO is fine in moderation but it's easy to go completely overboard, decomposing, normalizing, producing enormous inheritance trees. Yes your enormous UML diagram looks impressive, and yes it will be incomprehensible, fragile and horrible to maintain.

That said, it's completely fine in moderation. The same goes for functional programming. Most programmers can wrap their heads around things like functions, closures / lambdas, streams and so on. But if you mean true functional programming then forget it.

As for the kernel's choice to use C, that really boils down to the fact that a kernel needs to be lower level than a typical user land application. It has to do its own memory allocation and other things that were beyond C++ at the time. STL would have been usable, so would new / delete, and exceptions & unwinding. And at that point why even bother? That doesn't mean C is wonderful or doesn't inflict its own pain and bugs on development. But at the time, it was the only sane choice.

[Jul 22, 2019] Almost right

Jul 22, 2019 | developers.slashdot.org
Tough Love ( 215404 ), Monday July 22, 2019 @01:27AM ( #58963442 )

The entire software world is a multi-trillion dollar disaster.

Agile, Waterfall, Oop, fucking Javascript or worse its wannabe spawn of the devil Node. C, C++, Java wankers, did I say wankers? Yes wankers.

IT architects, pundit of the week, carpetbaggers, Aspies, total incompetents moving from job to job, you name it.

Disaster, complete and utter. Anybody who doesn't know this hasn't been paying attention.

About the only bright spot is a few open source projects like Linux Kernel, Postgres, Samba, Squid etc, totally outnumbered by wankers and posers.

[Jul 01, 2019] I worked twenty years in commercial software development including in aviation for UA and while Indian software developers are capable, their corporate culture is completely different as is based on feudal workplace relations of subordinates and management that results with extreme cronyism

Notable quotes:
"... Being powerless within calcifies totalitarian corporate culture ..."
"... ultimately promoted wide spread culture of obscurantism and opportunism what amounts to extreme office politics of covering their own butts often knowing that entire development strategy is flawed, as long as they are not personally blamed or if they in fact benefit by collapse of the project. ..."
"... As I worked side by side and later as project manager with Indian developers I can attest to that culture which while widely spread also among American developers reaches extremes among Indian corporations which infect often are engaged in fraud to be blamed on developers. ..."
Jul 01, 2019 | www.moonofalabama.org
dh-mtl , Jun 30, 2019 3:51:11 PM | 29

@Kalen , Jun 30, 2019 12:58:14 PM | 13

The programmers in India are well capable of writing good software. The difficulty lies in communicating the design requirements for the software. If they do not know in detail how air planes are engineered, they will implement the design to the letter but not to its intent.

I worked twenty years in commercial software development, including in aviation for UA, and while Indian software developers are capable, their corporate culture is completely different, as it is based on feudal workplace relations between subordinates and management. That results in extreme cronyism, far exceeding that in the US, as such relations are based not only on extreme exploitation (few jobs, hundreds of qualified candidates) but on personal, almost paternal relations that preclude the required independence of judgment and practically eliminate any major critical discussion about the efficacy of technological solutions and their risks.

Being powerless within a calcified, totalitarian corporate culture, and facing the alternative of hurting family-like relations with bosses and their feelings -- bosses who had committed themselves, emotionally and financially, to certain often wrong solutions dictated more by margins than by technological imperatives -- ultimately promoted a widespread culture of obscurantism and opportunism, amounting to extreme office politics of covering one's own butt, often in the knowledge that the entire development strategy was flawed, as long as one was not personally blamed, or if one in fact benefited from the collapse of the project.

As I worked side by side, and later as project manager, with Indian developers, I can attest to that culture, which, while widespread among American developers too, reaches extremes in Indian corporations, which in fact often are engaged in fraud to be blamed on developers.

It is in fact a shocking contrast with German culture, which practically prevents anyone from engaging in a project until it is, almost always in its entirety, discussed, analyzed, understood and fully supported by every member of the team; otherwise developers often simply refused to work on the project, citing professional ethics. A high-quality social welfare state and handsome unemployment benefits definitely supported such an ethical stand back then.

While what I describe happened over twenty years ago, I believe it is still applicable.

[Jun 30, 2019] Design Genius Jony Ive Leaves Apple, Leaving Behind Crapified Products That Cannot Be Repaired naked capitalism

Notable quotes:
"... Honestly, since 2015 feels like Apple wants to abandon it's PC business but just doesn't know how so ..."
"... The new line seems like a valid refresh, but the prices are higher than ever, and remember young people are earning less than ever, so I still think they are looking for a way out of the PC trade, maybe this refresh is to just buy time for an other five years before they close up. ..."
"... I wonder how much those tooling engineers in the US make compared to their Chinese competitors? It seems like a neoliberal virtuous circle: loot/guts education, then find skilled labor from places that still support education, by moving abroad or importing workers, reducing wages and further undermining the local skill base. ..."
"... I sympathize with y'all. It's not uncommon for good products to become less useful and more trouble as the original designers, etc., get arrogant from their success and start to believe that every idea they have is a useful improvement. Not even close. Too much of fixing things that aren't broken and gilding lilies. ..."
Jun 30, 2019 | www.nakedcapitalism.com

As iFixit notes:

The iPod, the iPhone, the MacBook Air, the physical Apple Store, even the iconic packaging of Apple products -- these products changed how we view and use their categories, or created new categories, and will be with us a long time.

But the title of that iFixit post, Jony Ive's Fragmented Legacy: Unreliable, Unrepairable, Beautiful Gadgets, makes clear that those beautiful products carried with them considerable costs, above and beyond their high prices. They're unreliable and difficult to repair.

Ironically, both Jobs and Ive were inspired by Dieter Rams – whom iFixit calls "the legendary industrial designer renowned for functional and simple consumer products." And unlike Apple, Rams believed that good design didn't have to come at the expense of either durability or the environment:

Rams loves durable products that are environmentally friendly. That's one of his 10 principles for good design: "Design makes an important contribution to the preservation of the environment." But Ive has never publicly discussed the dissonance between his inspiration and Apple's disposable, glued-together products. For years, Apple has openly combated green standards that would make products easier to repair and recycle, stating that they need "complete design flexibility" no matter the impact on the environment.

Complete Design Flexibility Spells Environmental Disaster

In fact, that complete design flexibility – at least as practiced by Ive – has resulted in crapified products that are an environmental disaster. Their lack of durability means they must be repaired to remain functional, and the lack of repairability means many of these products end up being tossed prematurely – no doubt not a bug, but a feature. As Vice recounts:

But history will not be kind to Ive, to Apple, or to their design choices. While the company popularized the smartphone and minimalistic, sleek, gadget design, it also did things like create brand new screws designed to keep consumers from repairing their iPhones.

Under Ive, Apple began gluing down batteries inside laptops and smartphones (rather than screwing them down) to shave off a fraction of a millimeter at the expense of repairability and sustainability.

It redesigned MacBook Pro keyboards with mechanisms that are, again, a fraction of a millimeter thinner, but that are easily defeated by dust and crumbs (the computer I am typing on right now -- which is six months old -- has a busted spacebar and 'r' key). These keyboards are not easily repairable, even by Apple, and many MacBook Pros have to be completely replaced due to a single key breaking. The iPhone 6 Plus had a design flaw that led to its touch screen spontaneously breaking -- Apple then told consumers there was no problem for months before ultimately creating a repair program. Meanwhile, Apple's own internal tests showed those flaws. He designed AirPods, which feature an unreplaceable battery that must be physically destroyed in order to open.

Vice also notes that in addition to Apple's products becoming "less modular, less consumer friendly, less upgradable, less repairable, and, at times, less functional than earlier models", the influence of Apple's design decisions has not been confined to Apple. Instead, "Ive's influence is obvious in products released by Samsung, HTC, Huawei, and others, which have similarly traded modularity for sleekness."

Right to Repair

As I've written before, Apple is a leading opponent of giving consumers a right to repair. Nonetheless, there's been some global progress on this issue (see Global Gains on Right to Repair). And we've also seen a widening of support in the US for such a right. The issue has arisen in the current presidential campaign, with Elizabeth Warren throwing down the gauntlet by endorsing a right to repair for farm tractors. The New York Times has also taken up the cause more generally (see Right to Repair Initiatives Gain Support in US). More than twenty states are considering enacting right to repair statutes.


samhill , June 30, 2019 at 5:41 pm

I've been using Apple since 1990. I concur with the article about the hardware, and add that from Snow Leopard to Sierra, OS X was as buggy as anything from the Windows world, if not more so. It got better with High Sierra, but still not up to the hype. I haven't lived with Mojave. I use Apple out of habit; I haven't felt the love from them since Snow Leopard, exactly when they became a cell phone company. People think Apple is Mercedes and PCs are Fords, but for a long time now, in practical use, leaving aside the snazzy aesthetics, under the hood it's GM vs Ford. I'm not rich enough to buy a $1500 non-upgradable, non-repairable product, so the new T2-protected computers can't be for me.

The new Dell XPSs are tempting; they've got the right idea: if you go to their service page you can download complete service instructions, diagrams, and blow-ups. They don't seem at all worried about my hurting myself.

In the last few years, PCs have come to offer what before I could only get from Apple: a good screen, backlit keyboard, long battery life, trim size.

Honestly, since 2015 it feels like Apple wants to abandon its PC business but just doesn't know how, so it's trying to drive off all the old legacy power users – the creative people that actually work hard for their money – exchanging them for rich dilettantes, hedge fund managers, and status seekers: an easier crowd to finally close up shop on.

The new line seems like a valid refresh, but the prices are higher than ever, and remember, young people are earning less than ever, so I still think they are looking for a way out of the PC trade; maybe this refresh is just to buy time for another five years before they close up.

When you start thinking like this about a company you've been loyal to for 30 years something is definitely wrong.

TG , June 30, 2019 at 6:09 pm

The reason that Apple moved the last of its production to China is, quite simply, that China now has basically the entire industrial infrastructure that we used to have. We have been hollowed out, and are now essentially third-world when it comes to industry. The entire integrated supply chain that defines an industrial power, is now gone.

The part about China no longer being a low-wage country is correct. China's wages have been higher than Mexico's for some time. But the part about the skilled workers is a slap in the face.

How can US workers be skilled at manufacturing, when there are no longer any jobs here where they can learn or use those skills?

fdr-fan , June 30, 2019 at 6:10 pm

A thin rectangle isn't more beautiful than a thick rectangle. They're both just rectangles.

Skip Intro , June 30, 2019 at 2:14 pm

I wonder how much those tooling engineers in the US make compared to their Chinese competitors? It seems like a neoliberal virtuous circle: loot/guts education, then find skilled labor from places that still support education, by moving abroad or importing workers, reducing wages and further undermining the local skill base.

EMtz , June 30, 2019 at 4:08 pm

They lost me when they made the iMac so thin it couldn't play a CD – and had the nerve to charge $85 for an Apple player. Bought another brand for $25. I don't care that it's not as pretty. I do care that I had to buy it at all.

I need a new cellphone. You can bet it won't be an iPhone.

John Zelnicker , June 30, 2019 at 4:24 pm

Jerri-Lynn – Indeed, a great article.

Although I have never used an Apple product, I sympathize with y'all. It's not uncommon for good products to become less useful and more trouble as the original designers, etc., get arrogant from their success and start to believe that every idea they have is a useful improvement. Not even close. Too much of fixing things that aren't broken and gilding lilies.

Charles Leseau , June 30, 2019 at 5:13 pm

Worst computer I've ever owned: Apple MacBook Pro, c. 2011 or so.

It died within 2 years, and it was also more expensive than the desktops I've built since, which absolutely crush it in every possible performance metric (and last longer).

Meanwhile, I also still use a $300 Best Buy Toshiba craptop that has now lasted for 8 straight years.

Never again.

Alfred , June 30, 2019 at 5:23 pm

"Beautiful objects" – aye, there's the rub. In point of fact, the goal of industrial design is not to create beautiful objects. It is the goal of the fine arts to create beautiful objects. The goal of design is to create useful things that are easy to use and are effective at their tasks. Some -- including me -- would add to those most basic goals, the additional goals of being safe to use, durable, and easy to repair; perhaps even easy to adapt or suitable for recycling, or conservative of precious materials. The principles of good product design are laid out admirably in the classic book by Donald A. Norman, The Design of Everyday Things (1988). So this book was available to Jony Ive (born 1967) during his entire career (which overlapped almost exactly the wonder years of Postmodernism – and therein lies a clue). It would indeed be astonishing to learn that Ive took no notice of it. Yet Norman's book can be used to show that Ive's Apple violated so many of the principles of good design, so habitually, as to raise the suspicion that the company was not engaged in "product design" at all. The output Apple in the Ive era, I'd say, belongs instead to the realm of so-called "commodity aesthetics," which aims to give manufactured items a sufficiently seductive appearance to induce their purchase – nothing more. Aethetics appears as Dieter Rams's principle 3, as just one (and the only purely commercial) function in his 10; so in a theoretical context that remains ensconced within a genuine, Modernist functionalism. But in the Apple dispensation that single (aesthetic) principle seems to have subsumed the entire design enterprise – precisely as one would expect from "the cultural logic of late capitalism" (hat tip to Mr Jameson). Ive and his staff of formalists were not designing industrial products, or what Norman calls "everyday things," let alone devices; they were aestheticizing products in ways that first, foremost, and almost only enhanced their performance as expressions of a brand. Their eyes turned away from the prosaic prize of functionality to focus instead on the more profitable prize of sales -- to repeat customers, aka the devotees of 'iconic' fetishism. Thus did they serve not the masses but Mammon, and they did so as minions of minimalism. Nor was theirs the minimalism of the Frankfurt kitchen, with its deep roots in ethics and ergonomics. It was only superficially Miesian. Bauhaus-inspired? Oh, please. Only the more careless readers of Tom Wolfe and Wikipedia could believe anything so preposterous. Surely Steve Jobs, he of the featureless black turtleneck by Issey Miyake, knew better. Anyone who has so much as walked by an Apple Store, ever, should know better. And I guess I should know how to write shorter

[Jun 29, 2019] Boeing Outsourced Its 737 MAX Software To $9-Per-Hour Engineers

Jun 29, 2019 | www.zerohedge.com

The software at the heart of the Boeing 737 MAX crisis was developed at a time when the company was laying off experienced engineers and replacing them with temporary workers making as little as $9 per hour, according to Bloomberg.

In an effort to cut costs, Boeing was relying on subcontractors making paltry wages to develop and test its software. Oftentimes, these subcontractors were from countries lacking a deep background in aerospace, like India.

Boeing had recent college graduates working for Indian software developer HCL Technologies Ltd. in a building across from Seattle's Boeing Field, in flight test groups supporting the MAX. The coders from HCL designed to specifications set by Boeing but, according to Mark Rabin, a former Boeing software engineer, "it was controversial because it was far less efficient than Boeing engineers just writing the code."

Rabin said: "...it took many rounds going back and forth because the code was not done correctly."

In addition to cutting costs, the hiring of Indian companies may have landed Boeing orders for the Indian military and commercial aircraft, like a $22 billion order received in January 2017. That order included 100 737 MAX 8 jets and was Boeing's largest order ever from an Indian airline. India traditionally orders from Airbus.

HCL engineers helped develop and test the 737 MAX's flight display software while employees from another Indian company, Cyient Ltd, handled the software for flight test equipment. In 2011, Boeing named Cyient, then known as Infotech, to a list of its "suppliers of the year".

One HCL employee posted online: "Provided quick workaround to resolve production issue which resulted in not delaying flight test of 737-Max (delay in each flight test will cost very big amount for Boeing)."

But Boeing says the company didn't rely on engineers from HCL for the Maneuvering Characteristics Augmentation System, which was linked to both last October's crash and March's crash. The company also says it didn't rely on Indian companies for the cockpit warning light issue that was disclosed after the crashes.

A Boeing spokesperson said: "Boeing has many decades of experience working with supplier/partners around the world. Our primary focus is on always ensuring that our products and services are safe, of the highest quality and comply with all applicable regulations."

HCL, on the other hand, said: "HCL has a strong and long-standing business relationship with The Boeing Company, and we take pride in the work we do for all our customers. However, HCL does not comment on specific work we do for our customers. HCL is not associated with any ongoing issues with 737 Max."

Recent simulator tests run by the FAA indicate that software issues on the 737 MAX run deeper than first thought. Engineers who worked on the plane, which Boeing started developing eight years ago, complained of pressure from managers to limit changes that might introduce extra time or cost.

Rick Ludtke, a former Boeing flight controls engineer laid off in 2017, said: "Boeing was doing all kinds of things, everything you can imagine, to reduce cost, including moving work from Puget Sound, because we'd become very expensive here. All that's very understandable if you think of it from a business perspective. Slowly over time it appears that's eroded the ability for Puget Sound designers to design."

Rabin even recalled an incident where senior software engineers were told they weren't needed because Boeing's products were mature. Rabin said: "I was shocked that in a room full of a couple hundred mostly senior engineers we were being told that we weren't needed."

Any given jetliner is made up of millions of parts and millions of lines of code. Boeing has often turned over large portions of the work to suppliers and subcontractors that follow its blueprints. But beginning in 2004 with the 787 Dreamliner, Boeing sought to increase profits by providing high-level specs and then asking suppliers to design more parts themselves.

Boeing also promised to invest $1.7 billion in Indian companies as a result of an $11 billion order in 2005 from Air India. This investment helped HCL and other software developers.

For the 787, HCL offered Boeing a price it couldn't refuse: free. HCL "took no up-front payments on the 787 and only started collecting payments based on sales years later".

Rockwell Collins won the MAX contract for cockpit displays and relied in part on HCL engineers and contract engineers from Cyient to test flight test equipment.

Charles LoveJoy, a former flight-test instrumentation design engineer at the company, said: "We did have our challenges with the India team. They met the requirements, per se, but you could do it better."


Anonymous IX , 2 minutes ago link

I love it. A company which fell so much in love with its extraordinary profits that it sabotaged its design and will now suffer enormous financial consequences. They're lucky to have all their defense/military contracts.

scraping_by , 4 minutes ago link

Oftentimes, it's the cut-and-paste code that's the problem. If you don't have a good appreciation for what every line does, you're never going to know what the sub or entire program does.

vienna_proxy , 7 minutes ago link

hahahaha non-technical managers making design decisions are complete **** ups wherever they go and here it blew up in their faces rofl

Ignorance is bliss , 2 minutes ago link

I see this all the time, and a lot of the time these non-technical decision makers are women.

hispanicLoser , 13 minutes ago link

By 2002 I could not sit down with any developers without hearing at least one story about how they had been in a code review meeting and seen absolute garbage turned out by H-1B workers.

Lots of people have known about this problem for many years now.

brazilian , 11 minutes ago link

May the gods damn all financial managers! One of the two professions, along with bankers, which have absolutely no social value whatsoever. There should be open hunting season on both!

scraping_by , 15 minutes ago link

Shifting to high-level specs puts more power in the hands of management/accounting types, since it doesn't require engineering knowledge to track a deadline. Indeed, this whole story is the wet dream of business school: the idea of being able to accomplish technical tasks purely by demand. A lot of public schools teach kids that science is magic, so when they grow up, they think they can just give directions and technology appears.

pops , 20 minutes ago link

In this country, one must have a license from the FAA to work on commercial aircraft. That means training and certification that usually results in higher pay for those qualified to perform the repairs to the aircraft your family will fly on.

In case you're not aware, much of the heavy stuff like D checks (overhaul) have been outsourced by the airlines to foreign countries where the FAA has nothing to say about it. Those contractors can hire whoever they wish for whatever they'll accept. I have worked with some of those "mechanics" who cannot even read.

Keep that in mind next time the TSA perv is fondling your junk. That might be your last sexual encounter.

Klassenfeind , 22 minutes ago link

Boeing Outsourced Its 737 MAX Software To $9-Per-Hour Engineers

Long live the free market, right Tylers?

You ZH guys always rail against the minimum wage here; well, there you go: $9/hr aircraft 'engineers!' Happy now?

asteroids , 25 minutes ago link

You gotta be kidding. You let kids straight out of school write mission-critical code? How ******* stupid are you, BA?

reader2010 , 20 minutes ago link

Go to India. There are many outsourcing companies that only hire new college graduates for work and they are paid less than $2 an hour for the job.

For the DoD contractors, they have to bring them to the US to work. There are tons of H1B guys from India working for defense contractors.

[Jun 29, 2019] Hiring aircraft computer engineers at $9/hr by Boeing is a great idea. Who could argue with smart cost saving?

Jun 29, 2019 | www.zerohedge.com


[Jun 29, 2019] If you have to be told that H-1B code in critical aircraft software might not be reliable, you are too stupid to live

Jun 29, 2019 | www.zerohedge.com

hispanicLoser , 25 minutes ago link

If you have to be told that H-1B code in aircraft software is not reliable, you are too stupid to live.

zob2020 , 16 minutes ago link

Or take this online shop designed back in 1997. It was supposed to take over all internet shopping, which didn't really exist yet back then. And they used Indian doctors to code. Well, sure, they ended up with a site... but one so heavy with pictures it took 30 minutes to open one page, another 20 minutes to even click on a product to read its text, etc. This with a good university internet connection.

Unsurprisingly, I don't think they ever managed to sell anything. But they gave out free movie tickets to every registered customer... so a friend and I each registered some 80 accounts and went to free movies for a good bit over a year.

The mailman must have had fun delivering 160 letters to random names at the same student apartment :D

[Jun 26, 2019] The Individual Costs of Occupational Decline

Jun 26, 2019 | www.nakedcapitalism.com

Yves here. You have to read a bit into this article on occupational decline, aka "What happens to me after the robots take my job?", to realize that the authors studied Swedish workers. One has to think that the findings would be more pronounced in the US, due both to pronounced regional and urban/rural variations and to the weakness of social institutions in the US. While there may be small cities in Sweden that have been hit hard by the decline of a key employer, I don't have the impression that Sweden has areas that have suffered the way our Rust Belt has. Similarly, in the US a significant amount of hiring starts with resume reviews in which the job requirements are overspecified because the employer intends to hire someone who has done the same job somewhere else and hence needs no training (which in practice is an illusion; how companies do things is always idiosyncratic, and new hires face a learning curve). On top of that, many positions are filled via personal networks, not formal recruiting. Some studies have concluded that having a large network of weak ties is more helpful in landing a new post than fewer close connections. It's easier to know a lot of people casually in a society with strong community institutions.

The article does not provide much in the way of remedies; it hints at "let them eat training" even though such programs have proven to be ineffective. One approach would be aggressive enforcement of laws against age discrimination. And even though some readers dislike a Job Guarantee, not only would it enable people who want to work to keep working, but, since private sector employers are particularly loath to employ someone who has been out of work for more than six months, a Job Guarantee post would also help keep someone who'd lost a job from looking like damaged goods.

By Per-Anders Edin, Professor of Industrial Relations, Uppsala University; Tiernan Evans, Economics MRes/PhD Candidate, LSE; Georg Graetz, Assistant Professor in the Department of Economics, Uppsala University; Sofia Hernnäs, PhD student, Department of Economics, Uppsala University; Guy Michaels, Associate Professor in the Department of Economics, LSE. Originally published at VoxEU

As new technologies replace human labour in a growing number of tasks, employment in some occupations invariably falls. This column compares outcomes for similar workers in similar occupations over 28 years to explore the consequences of large declines in occupational employment for workers' careers. While mean losses in earnings and employment for those initially working in occupations that later declined are relatively moderate, low-earners lose significantly more.

How costly is it for workers when demand for their occupation declines? As new technologies replace human labour in a growing number of tasks, employment in some occupations invariably falls. Until recently, technological change mostly automated routine production and clerical work (Autor et al. 2003). But machines' capabilities are expanding, as recent developments include self-driving vehicles and software that outperforms professionals in some tasks. Debates on the labour market implications of these new technologies are ongoing (e.g. Brynjolfsson and McAfee 2014, Acemoglu and Restrepo 2018). But in these debates, it is important to ask not only "Will robots take my job?", but also "What would happen to my career if robots took my job?"

Much is at stake. Occupational decline may hurt workers and their families, and may also have broader consequences for economic inequality, education, taxation, and redistribution. If it exacerbates differences in outcomes between economic winners and losers, populist forces may gain further momentum (Dal Bo et al. 2019).

In a new paper (Edin et al. 2019) we explore the consequences of large declines in occupational employment for workers' careers. We assemble a dataset with forecasts of occupational employment changes that allow us to identify unanticipated declines, population-level administrative data spanning several decades, and a highly detailed occupational classification. These data allow us to compare outcomes for similar workers who perform similar tasks and have similar expectations of future occupational employment trajectories, but experience different actual occupational changes.

Our approach is distinct from previous work that contrasts career outcomes of routine and non-routine workers (e.g. Cortes 2016), since we compare workers who perform similar tasks and whose careers would likely have followed similar paths were it not for occupational decline. Our work is also distinct from studies of mass layoffs (e.g. Jacobson et al. 1993), since workers who experience occupational decline may take action before losing their jobs.

In our analysis, we follow individual workers' careers for almost 30 years, and we find that workers in declining occupations lose on average 2-5% of cumulative earnings, compared to other similar workers. Workers with low initial earnings (relative to others in their occupations) lose more – about 8-11% of mean cumulative earnings. These earnings losses reflect both lost years of employment and lower earnings conditional on employment; some of the employment losses are due to increased time spent in unemployment and retraining, and low earners spend more time in both unemployment and retraining.

Estimating the Consequences of Occupational Decline

We begin by assembling data from the Occupational Outlook Handbooks (OOH), published by the US Bureau of Labor Statistics, which cover more than 400 occupations. In our main analysis we define occupations as declining if their employment fell by at least 25% from 1984-2016, although we show that our results are robust to using other cutoffs. The OOH also provides information on technological change affecting each occupation, and forecasts of employment over time. Using these data, we can separate technologically driven declines, and also unanticipated declines. Occupations that declined include typesetters, drafters, proofreaders, and various machine operators.
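As a minimal sketch of that 25% classification rule (the data and column names here are hypothetical illustrations, not the authors' code or dataset):

```python
import pandas as pd

# Hypothetical occupation-level employment figures; not the OOH data itself.
df = pd.DataFrame({
    "occupation": ["typesetter", "drafter", "proofreader", "nurse"],
    "emp_1984":   [100_000, 80_000, 30_000, 900_000],
    "emp_2016":   [10_000, 45_000, 20_000, 1_500_000],
})

# An occupation "declined" if employment fell by at least 25% over 1984-2016.
df["declined"] = df["emp_2016"] <= 0.75 * df["emp_1984"]
print(df[["occupation", "declined"]])
```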

We then match the OOH data to detailed Swedish occupations. This allows us to study the consequences of occupational decline for workers who, in 1985, worked in occupations that declined over the subsequent decades. We verify that occupations that declined in the US also declined in Sweden, and that the employment forecasts that the BLS made for the US have predictive power for employment changes in Sweden.

Detailed administrative micro-data, which cover all Swedish workers, allow us to address two potential concerns for identifying the consequences of occupational decline: that workers in declining occupations may have differed from other workers, and that declining occupations may have differed even in absence of occupational decline. To address the first concern, about individual sorting, we control for gender, age, education, and location, as well as 1985 earnings. Once we control for these characteristics, we find that workers in declining occupations were no different from others in terms of their cognitive and non-cognitive test scores and their parents' schooling and earnings. To address the second concern, about occupational differences, we control for occupational earnings profiles (calculated using the 1985 data), the BLS forecasts, and other occupational and industry characteristics.

Assessing the losses and how their incidence varied

We find that prime age workers (those aged 25-36 in 1985) who were exposed to occupational decline lost about 2-6 months of employment over 28 years, compared to similar workers whose occupations did not decline. The higher end of the range refers to our comparison between similar workers, while the lower end of the range compares similar workers in similar occupations. The employment loss corresponds to around 1-2% of mean cumulative employment. The corresponding earnings losses were larger, and amounted to around 2-5% of mean cumulative earnings. These mean losses may seem moderate given the large occupational declines, but the average outcomes do not tell the full story. The bottom third of earners in each occupation fared worse, losing around 8-11% of mean earnings when their occupations declined.

The earnings and employment losses that we document reflect increased time spent in unemployment and government-sponsored retraining – more so for workers with low initial earnings. We also find that older workers who faced occupational decline retired a little earlier.

We also find that workers in occupations that declined after 1985 were less likely to remain in their starting occupation. It is quite likely that this reduced supply to declining occupations contributed to mitigating the losses of the workers that remained there.

We show that our main findings are essentially unchanged when we restrict our analysis to technology-related occupational declines.

Further, our finding that mean earnings and employment losses from occupational decline are small is not unique to Sweden. We find similar results using a smaller panel dataset on US workers, using the National Longitudinal Survey of Youth 1979.

Theoretical implications

Our paper also considers the implications of our findings for Roy's (1951) model, which is a workhorse model for labour economists. We show that the frictionless Roy model predicts that losses are increasing in initial occupational earnings rank, under a wide variety of assumptions about the skill distribution. This prediction is inconsistent with our finding that the largest earnings losses from occupational decline are incurred by those who earned the least. To reconcile our findings, we add frictions to the model: we assume that workers who earn little in one occupation incur larger time costs searching for jobs or retraining if they try to move occupations. This extension of the model, especially when coupled with the addition of involuntary job displacement, allows us to reconcile several of our empirical findings.

Conclusions

There is a vivid academic and public debate on whether we should fear the takeover of human jobs by machines. New technologies may replace not only factory and office workers but also drivers and some professional occupations. Our paper compares similar workers in similar occupations over 28 years. We show that although mean losses in earnings and employment for those initially working in occupations that later declined are relatively moderate (2-5% of earnings and 1-2% of employment), low-earners lose significantly more.

The losses that we find from occupational decline are smaller than those suffered by workers who experience mass layoffs, as reported in the existing literature. Because the occupational decline we study took years or even decades, its costs for individual workers were likely mitigated through retirements, reduced entry into declining occupations, and increased job-to-job exits to other occupations. Compared to large, sudden shocks, such as plant closures, the decline we study may also have a less pronounced impact on local economies.

While the losses we find are on average moderate, there are several reasons why future occupational decline may have adverse impacts. First, while we study unanticipated declines, the declines were nevertheless fairly gradual. Costs may be larger for sudden shocks following, for example, a quick evolution of machine learning. Second, the occupational decline that we study mainly affected low- and middle-skilled occupations, which require less human capital investment than those that may be impacted in the future. Finally, and perhaps most importantly, our findings show that low-earning individuals are already suffering considerable (pre-tax) earnings losses, even in Sweden, where institutions are geared towards mitigating those losses and facilitating occupational transitions. Helping these workers stay productive when they face occupational decline remains an important challenge for governments.

Please see original post for references

[May 17, 2019] Shareholder Capitalism, the Military, and the Beginning of the End for Boeing

Highly recommended!
Notable quotes:
"... Like many of its Wall Street counterparts, Boeing also used complexity as a mechanism to obfuscate and conceal activity that is incompetent, nefarious and/or harmful to not only the corporation itself but to society as a whole (instead of complexity being a benign byproduct of a move up the technology curve). ..."
"... The economists who built on Friedman's work, along with increasingly aggressive institutional investors, devised solutions to ensure the primacy of enhancing shareholder value, via the advocacy of hostile takeovers, the promotion of massive stock buybacks or repurchases (which increased the stock value), higher dividend payouts and, most importantly, the introduction of stock-based pay for top executives in order to align their interests to those of the shareholders. These ideas were influenced by the idea that corporate efficiency and profitability were impinged upon by archaic regulation and unionization, which, according to the theory, precluded the ability to compete globally. ..."
"... "Return on Net Assets" (RONA) forms a key part of the shareholder capitalism doctrine. ..."
"... If the choice is between putting a million bucks into new factory machinery or returning it to shareholders, say, via dividend payments, the latter is the optimal way to go because in theory it means higher net returns accruing to the shareholders (as the "owners" of the company), implicitly assuming that they can make better use of that money than the company itself can. ..."
"... It is an absurd conceit to believe that a dilettante portfolio manager is in a better position than an aviation engineer to gauge whether corporate investment in fixed assets will generate productivity gains well north of the expected return for the cash distributed to the shareholders. But such is the perverse fantasy embedded in the myth of shareholder capitalism ..."
"... When real engineering clashes with financial engineering, the damage takes the form of a geographically disparate and demoralized workforce: The factory-floor denominator goes down. Workers' wages are depressed, testing and quality assurance are curtailed. ..."
May 17, 2019 | www.nakedcapitalism.com

The fall of the Berlin Wall and the corresponding end of the Soviet Empire gave the fullest impetus imaginable to the forces of globalized capitalism, and correspondingly unfettered access to the world's cheapest labor. What was not to like about that? It afforded multinational corporations vastly expanded opportunities to fatten their profit margins and increase the bottom line with seemingly no risk posed to their business model.

Or so it appeared. In 2000, aerospace engineer L.J. Hart-Smith's remarkable paper, sardonically titled "Out-Sourced Profits – The Cornerstone of Successful Subcontracting," laid out the case against several business practices of Hart-Smith's previous employer, McDonnell Douglas, which had incautiously ridden the wave of outsourcing when it merged with the author's new employer, Boeing. Hart-Smith's intention in telling his story was a cautionary one for the newly combined Boeing, lest it follow its then recent acquisition down the same disastrous path.

Of the manifold points and issues identified by Hart-Smith, there is one that stands out as the most compelling in terms of understanding the current crisis enveloping Boeing: The embrace of the metric "Return on Net Assets" (RONA). When combined with the relentless pursuit of cost reduction (via offshoring), RONA taken to the extreme can undermine overall safety standards.

Related to this problem is the intentional and unnecessary use of complexity as an instrument of propaganda. Like many of its Wall Street counterparts, Boeing also used complexity as a mechanism to obfuscate and conceal activity that is incompetent, nefarious and/or harmful to not only the corporation itself but to society as a whole (instead of complexity being a benign byproduct of a move up the technology curve).

All of these pernicious concepts are branches of the same poisoned tree: "shareholder capitalism":

[A] notion best epitomized by Milton Friedman that the only social responsibility of a corporation is to increase its profits, laying the groundwork for the idea that shareholders, being the owners and the main risk-bearing participants, ought therefore to receive the biggest rewards. Profits therefore should be generated first and foremost with a view toward maximizing the interests of shareholders, not the executives or managers who (according to the theory) were spending too much of their time, and the shareholders' money, worrying about employees, customers, and the community at large. The economists who built on Friedman's work, along with increasingly aggressive institutional investors, devised solutions to ensure the primacy of enhancing shareholder value, via the advocacy of hostile takeovers, the promotion of massive stock buybacks or repurchases (which increased the stock value), higher dividend payouts and, most importantly, the introduction of stock-based pay for top executives in order to align their interests to those of the shareholders. These ideas were influenced by the idea that corporate efficiency and profitability were impinged upon by archaic regulation and unionization, which, according to the theory, precluded the ability to compete globally.

"Return on Net Assets" (RONA) forms a key part of the shareholder capitalism doctrine. In essence, it means maximizing the returns of those dollars deployed in the operation of the business. Applied to a corporation, it comes down to this: If the choice is between putting a million bucks into new factory machinery or returning it to shareholders, say, via dividend payments, the latter is the optimal way to go because in theory it means higher net returns accruing to the shareholders (as the "owners" of the company), implicitly assuming that they can make better use of that money than the company itself can.

It is an absurd conceit to believe that a dilettante portfolio manager is in a better position than an aviation engineer to gauge whether corporate investment in fixed assets will generate productivity gains well north of the expected return for the cash distributed to the shareholders. But such is the perverse fantasy embedded in the myth of shareholder capitalism.
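To make the arithmetic behind that incentive concrete, here is a minimal sketch (all figures hypothetical): RONA divides net income by net assets, so shrinking the asset base "improves" the metric even when the business itself does not improve.

```python
# Minimal illustration of the RONA incentive; all figures are hypothetical.

def rona(net_income: float, net_assets: float) -> float:
    """Return on Net Assets = net income / net assets deployed."""
    return net_income / net_assets

# Before: $10B of factory assets earning $1.2B a year.
print(f"{rona(1.2e9, 10e9):.1%}")   # 12.0%

# After: sell $3B of precision machinery, outsource the work, and return
# the cash to shareholders. Even if income slips, the metric that
# stock-based pay rewards goes up.
print(f"{rona(1.1e9, 7e9):.1%}")    # 15.7%
```

The sketch is the whole pathology in four lines: the denominator is easier to cut than the numerator is to grow, so a manager paid on RONA is paid to disinvest.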

Engineering reality, however, is far more complicated than what is outlined in university MBA textbooks. For corporations like McDonnell Douglas, for example, RONA was used not as a way to prioritize new investment in the corporation but rather to justify disinvestment in the corporation. This disinvestment ultimately degraded the company's underlying profitability and the quality of its planes (which is one of the reasons the Pentagon helped to broker the merger with Boeing; in another perverse echo of the 2008 financial disaster, it was a politically engineered bailout).

RONA in Practice

When real engineering clashes with financial engineering, the damage takes the form of a geographically disparate and demoralized workforce: The factory-floor denominator goes down. Workers' wages are depressed, testing and quality assurance are curtailed. Productivity is diminished, even as labor-saving technologies are introduced. Precision machinery is sold off and replaced by inferior, but cheaper, machines. Engineering quality deteriorates. And the upshot is that a reliable plane like Boeing's 737, which had been a tried and true money-spinner with an impressive safety record since 1967, becomes a high-tech death trap.

The drive toward efficiency is translated into a drive to do more with less. Get more out of workers while paying them less. Make more parts with fewer machines. Outsourcing is viewed as a way to release capital by transferring investment from skilled domestic human capital to offshore entities not imbued with the same talents, corporate culture and dedication to quality. The benefits to the bottom line are temporary; the long-term pathologies become embedded as the company's market share begins to shrink, as the airlines search for less shoddy alternatives.

You must do one more thing if you are a Boeing director: you must erect barriers to bad news, because there is nothing that bursts a magic bubble faster than reality, particularly if it's bad reality.

The illusion that Boeing sought to perpetuate was that it continued to produce the same thing it had produced for decades: namely, a safe, reliable, quality airplane. But it was doing so with a production apparatus that was stripped, for cost reasons, of many of the means necessary to make good aircraft. So while the wine still came in a bottle signifying Premier Cru quality, and still carried the same price, someone had poured out the contents and replaced them with cheap plonk.

And that has become remarkably easy to do in aviation. Because Boeing is no longer subject to proper independent regulatory scrutiny. This is what happens when you're allowed to "self-certify" your own airplane, as the Washington Post described: "One Boeing engineer would conduct a test of a particular system on the Max 8, while another Boeing engineer would act as the FAA's representative, signing on behalf of the U.S. government that the technology complied with federal safety regulations."

This is a recipe for disaster. Boeing relentlessly cut costs; it outsourced across the globe to workforces that knew nothing about aviation or aviation's safety culture. It sent things everywhere on one criterion and one criterion only: lower the denominator. Make it the same, but cheaper. And then self-certify the plane, so that nobody, including the FAA, was ever the wiser.

Boeing also greased the wheels in Washington to ensure the continuation of this convenient state of regulatory affairs for the company. According to OpenSecrets.org, Boeing and its affiliates spent $15,120,000 in lobbying expenses in 2018, after spending $16,740,000 in 2017 (along with a further $4,551,078 in 2018 political contributions, which placed the company 82nd out of a total of 19,087 contributors). Looking back at these figures over the past four elections (congressional and presidential) since 2012, these numbers represent fairly typical spending sums for the company.

But clever financial engineering, extensive political lobbying and self-certification can't perpetually hold back the effects of shoddy engineering. One of the sad byproducts of the FAA's acquiescence to "self-certification" is how many things fall through the cracks so easily.

[May 05, 2019] Does America Have an Economy or Any Sense of Reality by Paul Craig Roberts

Notable quotes:
"... We are having a propaganda barrage about the great Trump economy. We have been hearing about the great economy for a decade while the labor force participation rate declined, real family incomes stagnated, and debt burdens rose. The economy has been great only for large equity owners whose stock ownership benefited from the trillions of dollars the Fed poured into financial markets and from buy-backs by corporations of their own stocks. ..."
"... Federal Reserve data reports that a large percentage of the younger work force live at home with parents, because the jobs available to them are insufficient to pay for an independent existence. How then can the real estate, home furnishings, and appliance markets be strong? ..."
"... In contrast, Robotics, instead of displacing labor, eliminates it. Unlike jobs offshoring which shifted jobs from the US to China, robotics will cause jobs losses in both countries. If consumer incomes fall, then demand for output also falls, and output will fall. Robotics, then, is a way to shrink gross domestic product. ..."
"... The tech nerds and corporations who cannot wait for robotics to reduce labor cost in their profits calculation are incapable of understanding that when masses of people are without jobs, there is no consumer income with which to purchase the products of robots. The robots themselves do not need housing, food, clothing, entertainment, transportation, and medical care. The mega-rich owners of the robots cannot possibly consume the robotic output. An economy without consumers is a profitless economy. ..."
"... A country incapable of dealing with real problems has no future. ..."
May 02, 2019 | www.unz.com

We are having a propaganda barrage about the great Trump economy. We have been hearing about the great economy for a decade while the labor force participation rate declined, real family incomes stagnated, and debt burdens rose. The economy has been great only for large equity owners whose stock ownership benefited from the trillions of dollars the Fed poured into financial markets and from buy-backs by corporations of their own stocks.

I have pointed out for years that the jobs reports are fabrications and that the jobs that do exist are lowly paid domestic service jobs, such as waitressing, bartending, health care, and social assistance. What has kept the American economy going is the expansion of consumer debt, not higher pay from higher productivity. The reported low unemployment rate is obtained by not counting discouraged workers who have given up on finding a job.

Do you remember all the corporate money that the Trump tax cut was supposed to bring back to America for investment? It was all BS. Yesterday I read reports that Apple is losing its trillion dollar market valuation because Apple is using its profits to buy back its own stock. In other words, the demand for Apple's products does not justify more investment. Therefore, the best use of the profit is to repurchase the equity shares, thus shrinking Apple's capitalization. The great economy does not include expanding demand for Apple's products.

I read also of endless store and mall closings, losses falsely attributed to online purchasing, which only accounts for a small percentage of sales.

Federal Reserve data reports that a large percentage of the younger work force live at home with parents, because the jobs available to them are insufficient to pay for an independent existence. How then can the real estate, home furnishings, and appliance markets be strong?

When a couple of decades ago I first wrote of the danger of jobs offshoring to the American middle class, state and local government budgets, and pension funds, idiot critics raised the charge of Luddite.

The Luddites were wrong. Mechanization raised the productivity of labor and real wages, but jobs offshoring shifts jobs from the domestic economy to abroad. Domestic labor is displaced, but overseas labor gets the jobs, thus boosting jobs there. In other words, labor income declines in the country that loses jobs and rises in the country to which the jobs are offshored. This is the way American corporations spurred the economic development of China. It was due to jobs offshoring that China developed far more rapidly than the CIA expected.

In contrast, Robotics, instead of displacing labor, eliminates it. Unlike jobs offshoring which shifted jobs from the US to China, robotics will cause jobs losses in both countries. If consumer incomes fall, then demand for output also falls, and output will fall. Robotics, then, is a way to shrink gross domestic product.

The tech nerds and corporations who cannot wait for robotics to reduce labor cost in their profits calculation are incapable of understanding that when masses of people are without jobs, there is no consumer income with which to purchase the products of robots. The robots themselves do not need housing, food, clothing, entertainment, transportation, and medical care. The mega-rich owners of the robots cannot possibly consume the robotic output. An economy without consumers is a profitless economy.

One would think that there would be a great deal of discussion about the economic effects of robotics before the problems are upon us, just as one would think there would be enormous concern about the high tensions Washington has caused between the US and Russia and China, just as one would think there would be preparations for the adverse economic consequences of global warming, whatever the cause. Instead, the US, a country facing many crises, is focused on whether President Trump obstructed investigation of a crime that the special prosecutor said did not take place.

A country incapable of dealing with real problems has no future.

[Apr 28, 2019] AI is software. Software has bugs. Software doesn't autocorrect bugs. Men correct bugs. A buggy self-driving car leads its passengers to death. A man driving a car can steer away from death

Apr 28, 2019 | www.unz.com

Vojkan , April 27, 2019 at 7:42 am GMT

The infatuation with AI makes people overlook three of AI's built-in glitches. AI is software. Software has bugs. Software doesn't autocorrect bugs; men correct bugs. A buggy self-driving car leads its passengers to death; a man driving a car can steer away from death. Humans love to behave in erratic ways, and it is just impossible to program AI to respond to all possible erratic human behaviour. Therefore, instead of adapting AI to humans, humans will be forced to adapt to AI, and relinquish a lot of their liberty as humans. Humans have moral qualms (not everybody is Hillary Clinton); AI, being strictly utilitarian, will necessarily be "psychopathic".

In short AI is the promise of communism raised by several orders of magnitude. Welcome to the "Brave New World".

Digital Samizdat , says: April 27, 2019 at 11:42 am GMT

@Vojkan You've raised some interesting objections, Vojkan. But here are a few quibbles:

1) AI is software. Software has bugs. Software doesn't autocorrect bugs; men correct bugs. A buggy self-driving car leads its passengers to death; a man driving a car can steer away from death.

Learn to code! Seriously, until and unless the AI devices acquire actual power over their human masters (as in The Matrix), this is not as big a problem as you think. You simply test the device over and over and over until the bugs are discovered and worked out -- in other words, we just keep on doing what we've always done with software: alpha, beta, etc.

2) Humans love to behave in erratic ways, it is just impossible to program AI to respond to all possible erratic human behaviour. Therefore, instead of adapting AI to humans, humans will be forced to adapt to AI, and relinquish a lot of their liberty as humans.

There's probably some truth to that. This reminds me of the old Marshall McLuhan saying that "the medium is the message," and that we were all going to adapt our mode of cognition (somewhat) to the TV or the internet, or whatever. Yeah, to some extent that has happened. But to some extent, that probably happened way back when people first began domesticating horses and riding them. Human beings are 'programmed', as it were, to adapt to their environments to some extent, and to condition their reactions on the actions of other things/creatures in their environment.

However, I think you may be underestimating the potential to create interfaces that allow AI to interact with a human in much more complex ways, such as how another human would interact with a human: subtle visual cues, pheromones, etc. That, in fact, was the essence of the old Turing Test, which is still the Holy Grail of AI:

https://en.wikipedia.org/wiki/Turing_test

3) Humans have moral qualms (not everybody is Hillary Clinton); AI, being strictly utilitarian, will necessarily be "psychopathic".

I don't see why AI devices can't have some moral principles -- or at least moral biases -- programmed into them. Isaac Asimov didn't think this was impossible either:

https://en.wikipedia.org/wiki/Three_Laws_of_Robotics
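The "test the device over and over" point in 1) above can be made concrete with randomized (fuzz) testing: hammer the code with generated inputs and check invariants that must hold for every input. A minimal sketch, with a hypothetical stand-in for the control function under test:

```python
import random

def brake_command(speed_mps: float, obstacle_dist_m: float) -> float:
    """Hypothetical stand-in for a self-driving control function:
    returns brake pressure as a fraction in [0, 1]."""
    if obstacle_dist_m <= 0.0:
        return 1.0
    return max(0.0, min(1.0, speed_mps / obstacle_dist_m))

# Fuzz loop: random inputs, fixed invariants. Bugs often hide in input
# regions no hand-written test ever visits.
random.seed(0)
for _ in range(100_000):
    speed = random.uniform(0.0, 60.0)
    dist = random.uniform(-5.0, 200.0)
    cmd = brake_command(speed, dist)
    # Invariant 1: the output is always a valid brake fraction.
    assert 0.0 <= cmd <= 1.0, (speed, dist, cmd)
    # Invariant 2: getting closer to an obstacle never reduces braking.
    if dist > 0.2:
        assert brake_command(speed, dist / 2) >= cmd
```

This is what "alpha, beta, etc." amounts to in practice, though as the next comment notes, no finite amount of it proves the absence of bugs.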

reiner Tor , says: April 27, 2019 at 11:47 am GMT
@Digital Samizdat

You simply test the device over and over and over until the bugs are discovered and worked out -- in other words, we just keep on doing what we've always done with software: alpha, beta, etc.

Some bugs stay dormant for decades. I've seen one up close.

Digital Samizdat , says: April 27, 2019 at 11:57 am GMT
@reiner Tor

Well, you fix it whenever you find it!

That's a problem as old as programming; in fact, it's a problem as old as engineering itself. It's nothing new.

reiner Tor , says: April 27, 2019 at 12:11 pm GMT
@Digital Samizdat

What's new with AI is the amount of damage faulty software, multiplied many times over, can do. My experience was pretty horrifying (I was one of the two humans overseeing the system), and had the system been fully autonomous, it'd have driven my employer bankrupt.

Now I'm not against using AI in any form whatsoever; I also think that it's inevitable anyway. I'd support AI driving cars or flying planes, because it is likely safer than humans, though it is of course trading a manageable risk for a very-small-probability tail risk. But I'm pretty worried about AI in general.

[Mar 13, 2019] Pilots Complained About Boeing 737 Max 8 For Months Before Second Deadly Crash

Mar 13, 2019 | www.zerohedge.com

Several pilots repeatedly warned federal authorities of safety concerns over the now-grounded Boeing 737 Max 8 for months leading up to the second deadly disaster involving the plane, according to an investigation by the Dallas Morning News. One captain even called the Max 8's flight manual "inadequate and almost criminally insufficient," according to the report.

" The fact that this airplane requires such jury-rigging to fly is a red flag. Now we know the systems employed are error-prone -- even if the pilots aren't sure what those systems are, what redundancies are in place and failure modes. I am left to wonder: what else don't I know?" wrote the captain.

At least five complaints about the Boeing jet were found in a federal database which pilots routinely use to report aviation incidents without fear of repercussions.

The complaints are about the safety mechanism cited in preliminary reports for an October plane crash in Indonesia that killed 189.

The disclosures found by The News reference problems with an autopilot system on Boeing 737 Max 8s during takeoff and in nose-down situations while trying to gain altitude. While records show these flights occurred during October and November, information regarding which airlines the pilots were flying for at the time is redacted from the database. - Dallas Morning News

One captain who flies the Max 8 said in November that it was "unconscionable" that Boeing and federal authorities have allowed pilots to fly the plane without adequate training - including a failure to fully disclose how its systems were distinctly different from other planes.

An FAA spokesman said reports in the system are filed directly with NASA, which serves as a neutral third party in the reporting of grievances.

"The FAA analyzes these reports along with other safety data gathered through programs the FAA administers directly, including the Aviation Safety Action Program, which includes all of the major airlines including Southwest and American," said FAA southwest regional spokesman Lynn Lunsford.

Meanwhile, despite several airlines and foreign countries grounding the Max 8, US regulators have so far declined to follow suit. They have, however, mandated that Boeing upgrade the plane's software by April.

Sen. Ted Cruz (R-TX), who chairs a Senate subcommittee overseeing aviation, called for the grounding of the Max 8 in a Thursday statement.

"Further investigation may reveal that mechanical issues were not the cause, but until that time, our first priority must be the safety of the flying public," said Cruz.

At least 18 carriers -- including American Airlines and Southwest Airlines, the two largest U.S. carriers flying the 737 Max 8 -- have also declined to ground planes , saying they are confident in the safety and "airworthiness" of their fleets. American and Southwest have 24 and 34 of the aircraft in their fleets, respectively. - Dallas Morning News

"The United States should be leading the world in aviation safety," said Transport Workers Union president John Samuelsen. "And yet, because of the lust for profit in American aviation, we're still flying planes that dozens of other countries and airlines have now said need to be grounded."

[Mar 13, 2019] Boeing's automatic trim for the 737 MAX was not disclosed to the Pilots by Bjorn Fehrm

The background to Boeing's 737 MAX automatic trim
Mar 13, 2019 | leehamnews.com

The automatic trim we described last week has a name, MCAS, or Maneuvering Characteristics Augmentation System.

It's unique to the MAX because the 737 MAX no longer has the docile pitch characteristics of the 737NG at high Angles Of Attack (AOA). This is caused by the larger engine nacelles covering the higher bypass LEAP-1B engines.

The nacelles for the MAX are larger and placed higher and further forward of the wing, Figure 1.

Figure 1. Boeing 737NG (left) and MAX (right) nacelles compared. Source: Boeing 737 MAX brochure.

By placing the nacelle further forward of the wing, it could be placed higher. Combined with a higher nose landing gear, which raises the nacelle further, the same ground clearance could be achieved for the nacelle as for the 737NG.

The drawback of a larger nacelle, placed further forward, is that it destabilizes the aircraft in pitch. All objects on an aircraft placed ahead of the Center of Gravity (the line in Figure 2, around which the aircraft moves in pitch) contribute to destabilizing the aircraft in pitch.

... ... ...

The 737 is a classical flight control aircraft. It relies on a naturally stable base aircraft for its flight control design, augmented in selected areas. One such area is artificial yaw damping, present on virtually all larger aircraft (to stop passengers getting sick from the aircraft's natural tendency to Dutch Roll, i.e. wagging its tail).

Until the MAX, there was no need for artificial aids in pitch. Once the aircraft entered a stall, there were several actions, described last week, which assisted the pilot in exiting the stall. But not in normal flight.

The larger nacelles, called for by the higher-bypass LEAP-1B engines, changed this. When flying at normal angles of attack (3° at cruise and say 5° in a turn), the destabilizing effect of the larger engines is not felt.

The nacelles are designed not to generate lift in normal flight. Nacelle lift would create unnecessary drag, as the aspect ratio of an engine nacelle is lousy; the aircraft designer concentrates the lift in the high-aspect-ratio wings.

But if the pilot for whatever reason manoeuvres the aircraft hard, generating an angle of attack close to the stall angle of around 14°, the previously neutral engine nacelle generates lift. That lift is felt by the aircraft as a pitch-up moment (as it's ahead of the CG line), now stronger than on the 737NG. This destabilizes the MAX in pitch at higher Angles Of Attack (AOA). The most difficult situation is when the maneuver has a high pitch rate: the aircraft's inertia can then provoke an over-swing into stall AOA.

To counter the MAX's lower stability margins at high AOA, Boeing introduced MCAS. Dependent on AOA value and rate, altitude (air density) and Mach (changed flow conditions), MCAS, which is a software loop in the Flight Control computer, initiates nose-down trim above a threshold AOA.

It can be stopped by the Pilot counter-trimming on the Yoke or by hitting the CUTOUT switches on the center pedestal. It is not stopped by the Pilot pulling on the Yoke, which for normal autopilot trim or runaway manual trim triggers trim-hold sensors. Stopping on yoke pull would negate the reason MCAS was implemented: the Pilot pulling so hard on the Yoke that the aircraft is flying close to stall.

It's probably this counterintuitive characteristic, which goes against what has been trained many times in the simulator for unwanted autopilot trim or manual trim runaway, that confused the pilots of JT610. They had learned that holding against the trim stopped the nose-down motion, after which they could take action, like counter-trimming or outright cutting out the trim servo. But it didn't stop. After a 10-second trim to a 2.5° nose-down stabilizer position, the trimming started again despite the Pilots pulling against it. The faulty high-AOA signal was still present.

How should they know that pulling on the Yoke didn't stop the trim? It was described nowhere: neither in the aircraft's manual, the AFM, nor in the Pilot's manual, the FCOM. This has created strong reactions from airlines with the 737 MAX on the flight line and from their Pilots. They have learned that the NG and the MAX fly the same, and they fly them interchangeably during the week.

They do fly the same as long as no fault appears. Then there are differences, and the Pilots should have been informed about the differences.
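The trim logic described above is simple enough to sketch in a few lines. The fragment below is a toy illustration of Fehrm's description only; the names, thresholds, and timings are assumptions drawn from the article (a roughly 14° stall-region AOA, a 10-second trim run to 2.5° nose-down, a 5-second re-trigger), not Boeing's actual flight-control code, which is not public.

    # Toy sketch of the MCAS trigger logic as described in the article.
    # All names, thresholds, and timings are illustrative assumptions.

    AOA_THRESHOLD_DEG = 14.0    # approximate stall-region AOA from the article
    TRIM_PER_CYCLE_DEG = 2.5    # nose-down stabilizer motion per 10-second run
    RETRIGGER_DELAY_S = 5       # "every 5 seconds, it does it again"

    def mcas_command(aoa_deg, cutout_engaged, pilot_counter_trimmed):
        """Return nose-down stabilizer trim (degrees) for one cycle, or 0.

        Note what is absent: yoke force. Per the article, pulling on the
        yoke does NOT inhibit MCAS, unlike normal autopilot trim.
        """
        if cutout_engaged:           # center-pedestal CUTOUT switches stop it
            return 0.0
        if pilot_counter_trimmed:    # manual counter-trim ends this cycle...
            return 0.0               # ...but it re-arms if AOA stays high
        if aoa_deg > AOA_THRESHOLD_DEG:
            return TRIM_PER_CYCLE_DEG
        return 0.0

    # A vane stuck at a false high reading brings the command back every
    # RETRIGGER_DELAY_S seconds: the runaway behavior described for JT610.
    for cycle in range(3):
        print(mcas_command(aoa_deg=22.5, cutout_engaged=False,
                           pilot_counter_trimmed=False))   # 2.5 each time

Even in this toy form the failure mode is visible: with a single stuck-high AOA input and the cutout switches untouched, the nose-down command repeats indefinitely.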

  1. Bruce Levitt
    November 14, 2018
    In figure 2 it shows the same center of gravity for the NG as for the MAX. I find this a bit surprising, as I would have expected that mounting heavy engines further forward would have caused a forward shift in the center of gravity that would not have been offset by the longer tailcone, which I'm assuming is relatively light even with the APU installed.

    Based on what is coming out about the automatic trim, Boeing must be counting its lucky stars that this incident happened to Lion Air and not to an American aircraft. If this had happened in the US, I'm pretty sure the fleet would have been grounded by the FAA and the class action lawyers would be lined up outside the door to get their many pounds of flesh.

    This is quite the wake-up call for Boeing.

    • OV-099
      November 14, 2018
      If the FAA is not going to comprehensively review the certification for the 737 MAX, I would not be surprised if EASA started taking a closer look at the aircraft, and at why the FAA seemingly missed the inadequate testing of the automatic trim when it decided to certify the 737 MAX 8.
      • Doubting Thomas
        November 16, 2018
        One wonders if there are any OTHER goodies in the new/improved/yet-identical-handling latest iteration of this old bird that Boeing did not disclose so that pilots need not be retrained.
        EASA & FAA likely already are asking some pointed questions and will want to verify any statements made by the manufacturer.
        Depending on the answers, pilot training requirements are likely to change materially.
    • jbeeko
      November 14, 2018
      CG will vary based on loading. I'd guess the line is the rear-most allowed CG.
    • ahmed
      November 18, 2018
      Hi dears,
      I think that even if the pilots didn't know about MCAS, this case could have been corrected just by applying the Boeing checklist (QRH) for stabilizer runaway.
      When the pilots noticed that the stabilizer was trimming without a known input (from pilot or autopilot), they should have put the cutout switches in the OFF position, according to the QRH.
      • TransWorld
        November 19, 2018
        Please note that the first action was pulling back on the yoke to stop it.

        Also keep in mind the aircraft is screaming stall and the stick shaker is activated.

        Pulling back on the yoke in that case is the WRONG thing to do if you are stalled.

        The Pilot has to then determine which system is lying.

        At the same time it's changing its behavior from previous training: every 5 seconds, it does it again.

        There also was another issue taking place at the same time.

        So now you have two systems lying to you, one that is actively trying to kill you.

        If the Pitot static system is broken, you also have several key instruments feeding you bad data (VSI, altitude and speed)

    • TransWorld
      November 14, 2018
      Grubbie: I can partly answer that.

      Pilots are trained to immediately deal with emergency issues (engine loss etc)

      Then there are detailed follow-up instructions for follow-on actions (if any).

      Simulators are wonderful things because you can train lethal scenarios without lethal results.

      In this case, with NO pilot training, let alone coverage in the manuals, pilots have to be really quick in the situation or you get the result you do. Some are better at it than others (Sullenberger, among other things, elected to turn on his APU even though it was not part of the engine-out checklist).

      The other was the decision to ditch; too many pilots try to turn back even though we are trained not to.

      What I can tell you from personal experience is that having got myself into a spin without any training, I was locked up logic-wise (panic), as suddenly nothing was working the way it should.

      I was lucky I was high enough, and my brain kicked back into cold-logic mode; I knew the counter to a spin from reading.

      Another 500 feet and I would not be here to post.

      While I did parts of the spin recovery wrong, fortunately in that aircraft it did not care, right rudder was enough to stop it.

  1. OV-099
    November 14, 2018
    It's starting to look as if Boeing will not be able to just pay victims' relatives in the form of "condolence money" without admitting liability.
    • Dukeofurl
      November 14, 2018
      I'm pretty sure that, even though it's an Indonesian airline, any whiff of fault with the plane itself will have lawyers taking Boeing on in US courts.
  1. Tech-guru
    November 14, 2018
    Astonishing, to say the least. It is quite unlike Boeing; they are normally very good with documentation and training. It makes everyone wonder how such a vital change on the MAX aircraft was omitted from the books as well as from crew training.
    Your explanation is very good as to why you need this damn MCAS. But can you also tell us how just one faulty sensor can trigger this MCAS? In all other Boeing models, like the B777, the two AOA sensor signals are compared with a calculated AOA and the mid value is chosen within the ADIRU. That eliminates the drastic mistake of following a wrong sensor input.
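    The mid-value select described here is simple to picture in code. Below is a minimal sketch of the scheme as the comment characterizes it (two vanes plus one calculated AOA, median chosen); the function name and figures are illustrative assumptions, not actual ADIRU code.

        # Sketch of the B777-style mid-value select described above:
        # two physical AOA vanes plus a calculated (synthetic) AOA,
        # with the middle value passed downstream. Illustrative only.

        def select_aoa(left_vane_deg, right_vane_deg, calculated_deg):
            """Return the median of the three AOA estimates.

            A single failed vane (stuck high or low) becomes an outlier
            and is outvoted, so no one sensor can drive the output.
            """
            return sorted([left_vane_deg, right_vane_deg, calculated_deg])[1]

        # Example: left vane fails high while the true AOA is about 5 degrees.
        print(select_aoa(22.0, 5.1, 4.8))   # -> 5.1; the faulty vane is ignored

    A single-sensor MCAS, by contrast, has nothing to outvote a stuck vane, which is the crux of the question above.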
    • Bjorn Fehrm
      November 14, 2018
      Hi Tech-guru,

      it's not certain it's a one-sensor fault. One sensor was changed amid information that there was a 20-degree difference between the two sides. But then it happened again. I think we may yet be informed that something else is at the root of this, which could also trip such a plausibility check as you mention. We just don't know. What we do know is that the MCAS function was triggered without the aircraft being close to stall.

      • Matthew
        November 14, 2018
        If it's certain that the MCAS was doing unhelpful things, that, coupled with the fact that no one was telling pilots anything about it, suggests to me that this is already effectively an open-and-shut case as far as liability and regulatory remedies are concerned.

        The technical root cause is also important, but probably irrelevant as far as establishing the ultimate reason behind the crash.


[Mar 13, 2019] Boeing Crapification Second 737 Max Plane Within Five Months Crashes Just After Takeoff

Notable quotes:
"... The key point I want to pick up on from that earlier post is this: the Boeing 737 Max includes a new "safety" feature about which the company failed to inform the Federal Aviation Administration (FAA). ..."
"... Boeing Co. withheld information about potential hazards associated with a new flight-control feature suspected of playing a role in last month's fatal Lion Air jet crash, according to safety experts involved in the investigation, as well as midlevel FAA officials and airline pilots. ..."
"... Notice that phrase: "under unusual conditions". Seems now that the pilots of two of these jets may have encountered such unusual conditions since October. ..."
"... Why did Boeing neglect to tell the FAA – or, for that matter, other airlines or regulatory authorities – about the changes to the 737 Max? Well, the airline marketed the new jet as not needing pilots to undergo any additional training in order to fly it. ..."
"... In addition to considerable potential huge legal liability, from both the Lion Air and Ethiopian Airlines crashes, Boeing also faces the commercial consequences of grounding some if not all 737 Max 8 'planes currently in service – temporarily? indefinitely? -and loss or at minimum delay of all future sales of this aircraft model. ..."
"... If this tragedy had happened on an aircraft of another manufacturer other than big Boeing, the fleet would already have been grounded by the FAA. The arrogance of engineers both at Airbus and Boeing, who refuse to give the pilots easy means to regain immediate and full authority over the plane (pitch and power) is just appalling. ..."
"... Boeing has made significant inroads in China with its 737 MAX family. A dozen Chinese airlines have ordered 180 of the planes, and 76 of them have been delivered, according Boeing. About 85% of Boeing's unfilled Chinese airline orders are for 737 MAX planes. ..."
"... "It's pretty asinine for them to put a system on an airplane and not tell the pilots who are operating the airplane, especially when it deals with flight controls," Captain Mike Michaelis, chairman of the safety committee for the Allied Pilots Association, told the Wall Street Journal. ..."
"... The aircraft company concealed the new system and minimized the differences between the MAX and other versions of the 737 to boost sales. On the Boeing website, the company claims that airlines can save "millions of dollars" by purchasing the new plane "because of its commonality" with previous versions of the plane. ..."
"... "Years of experience representing hundreds of victims has revealed a common thread through most air disaster cases," said Charles Herrmann the principle of Herrmann Law. "Generating profit in a fiercely competitive market too often involves cutting safety measures. In this case, Boeing cut training and completely eliminated instructions and warnings on a new system. Pilots didn't even know it existed. I can't blame so many pilots for being mad as hell." ..."
"... The Air France Airbus disaster was jumped on – Boeing's traditional hydraulic links between the sticks for the two pilots ensuring they move in tandem; the supposed comments by Captain Sully that the Airbus software didn't allow him to hit the water at the optimal angle he wanted, causing the rear rupture in the fuselage both showed the inferiority of fly-by-wire until Boeing started using it too. (Sully has taken issue with the book making the above point and concludes fly-by-wire is a "mixed blessing".) ..."
"... Money over people. ..."
Mar 13, 2019 | www.nakedcapitalism.com

Posted on March 11, 2019 by Jerri-Lynn Scofield, who has worked as a securities lawyer and a derivatives trader. She is currently writing a book about textile artisans.

Yesterday, an Ethiopian Airlines flight crashed minutes after takeoff, killing all 157 passengers on board.

The crash occurred less than five months after a Lion Air jet crashed near Jakarta, Indonesia, also shortly after takeoff, and killed all 189 passengers.

Both jets were Boeing's latest 737 Max 8 model.

The Wall Street Journal reports in Ethiopian Crash Carries High Stakes for Boeing, Growing African Airline :

The state-owned airline is among the early operators of Boeing's new 737 MAX single-aisle workhorse aircraft, which has been delivered to carriers around the world since 2017. The 737 MAX represents about two-thirds of Boeing's future deliveries and an estimated 40% of its profits, according to analysts.

Having delivered 350 of the 737 MAX planes as of January, Boeing has booked orders for about 5,000 more, many to airlines in fast-growing emerging markets around the world.

The voice and data recorders for the doomed flight have already been recovered, the New York Times reported in Ethiopian Airline Crash Updates: Data and Voice Recorders Recovered . Investigators will soon be able to determine whether the same factors that caused the Lion Air crash also caused the latest Ethiopian Airlines tragedy.

Boeing, Crapification, Two 737 Max Crashes Within Five Months

Yves wrote a post in November, Boeing, Crapification, and the Lion Air Crash , analyzing a devastating Wall Street Journal report on that earlier crash. I will not repeat the details of her post here, but instead encourage interested readers to read it in full.

The key point I want to pick up on from that earlier post is this: the Boeing 737 Max includes a new "safety" feature about which the company failed to inform the Federal Aviation Administration (FAA). As Yves wrote:

The short version of the story is that Boeing had implemented a new "safety" feature that operated even when its plane was being flown manually: if the plane went into a stall, it would lower the nose suddenly to pick up airspeed and fly normally again. However, Boeing didn't tell its buyers or even the FAA about this new goodie. It wasn't in pilot training or even the manuals. But even worse, this new control could force the nose down so far that it would be impossible not to crash the plane. And no, I am not making this up. From the Wall Street Journal:

Boeing Co. withheld information about potential hazards associated with a new flight-control feature suspected of playing a role in last month's fatal Lion Air jet crash, according to safety experts involved in the investigation, as well as midlevel FAA officials and airline pilots.

The automated stall-prevention system on Boeing 737 MAX 8 and MAX 9 models -- intended to help cockpit crews avoid mistakenly raising a plane's nose dangerously high -- under unusual conditions can push it down unexpectedly and so strongly that flight crews can't pull it back up. Such a scenario, Boeing told airlines in a world-wide safety bulletin roughly a week after the accident, can result in a steep dive or crash -- even if pilots are manually flying the jetliner and don't expect flight-control computers to kick in.

Notice that phrase: "under unusual conditions". Seems now that the pilots of two of these jets may have encountered such unusual conditions since October.

Why did Boeing neglect to tell the FAA – or, for that matter, other airlines or regulatory authorities – about the changes to the 737 Max? Well, the airline marketed the new jet as not needing pilots to undergo any additional training in order to fly it.

I see.

Why Were 737 Max Jets Still in Service?

Today, Boeing executives no doubt rue not pulling all 737 Max 8 jets out of service after the October Lion Air crash, to allow their engineers and engineering safety regulators to make necessary changes in the 'plane's design or to develop new training protocols.

In addition to considerable potential legal liability from both the Lion Air and Ethiopian Airlines crashes, Boeing also faces the commercial consequences of grounding some if not all 737 Max 8 'planes currently in service – temporarily? indefinitely? – and the loss, or at minimum delay, of all future sales of this aircraft model.

Over to Yves again, who in her November post cut to the crux:

And why haven't the planes been taken out of service? As one Wall Street Journal reader put it:

If this tragedy had happened on an aircraft of another manufacturer other than big Boeing, the fleet would already have been grounded by the FAA. The arrogance of engineers both at Airbus and Boeing, who refuse to give the pilots easy means to regain immediate and full authority over the plane (pitch and power) is just appalling.

Accident and incident records abound where the automation has been a major contributing factor or precursor. Knowing our friends at Boeing, it is highly probable that they will steer the investigation towards maintenance deficiencies as primary cause of the accident

In the wake of the Ethiopian Airlines crash, other countries have not waited for the FAA to act. China and Indonesia, as well as Ethiopian Airlines and Cayman Airways, have grounded flights of all Boeing 737 Max 8 aircraft, the Guardian reported in Ethiopian Airlines crash: Boeing faces safety questions over 737 Max 8 jets . The FT has called the Chinese and Indonesian actions an "unparalleled flight ban" (see China and Indonesia ground Boeing 737 Max 8 jets after latest crash ). India's air regulator has also issued new rules covering flights of the 737 Max aircraft, requiring pilots to have a minimum of 1,000 hours experience to fly these 'planes, according to a report in the Economic Times, DGCA issues additional safety instructions for flying B737 MAX planes.

Future of Boeing?

The commercial consequences of grounding the 737 Max in China alone are significant, according to this CNN account, Why grounding 737 MAX jets is a big deal for Boeing . The 737 Max is Boeing's most important plane; China is also the company's major market:

"A suspension in China is very significant, as this is a major market for Boeing," said Greg Waldron, Asia managing editor at aviation research firm FlightGlobal.

Boeing has predicted that China will soon become the world's first trillion-dollar market for jets. By 2037, Boeing estimates China will need 7,690 commercial jets to meet its travel demands.

Airbus (EADSF) and Commercial Aircraft Corporation of China, or Comac, are vying with Boeing for the vast and rapidly growing Chinese market.

Comac's first plane, designed to compete with the single-aisle Boeing 737 MAX and Airbus A320, made its first test flight in 2017. It is not yet ready for commercial service, but Boeing can't afford any missteps.

Boeing has made significant inroads in China with its 737 MAX family. A dozen Chinese airlines have ordered 180 of the planes, and 76 of them have been delivered, according Boeing. About 85% of Boeing's unfilled Chinese airline orders are for 737 MAX planes.

The 737 has been Boeing's bestselling product for decades. The company's future depends on the success of the 737 MAX, the newest version of the jet. Boeing has 4,700 unfilled orders for 737s, representing 80% of Boeing's order backlog. Virtually all 737 orders are for MAX versions.

As of the time of posting, US airlines have yet to ground their 737 Max 8 fleets. American Airlines, Alaska Air, Southwest Airlines, and United Airlines have ordered a combined 548 of the new 737 jets, of which 65 have been delivered, according to CNN.

Legal Liability?

Prior to Sunday's Ethiopian Airlines crash, Boeing already faced considerable potential legal liability for the October Lion Air crash. Just last Thursday, the Hermann Law Group of personal injury lawyers filed suit against Boeing on behalf of the families of 17 Indonesian passengers who died in that crash.

The Families of Lion Air Crash File Lawsuit Against Boeing – News Release did not mince words:

"It's pretty asinine for them to put a system on an airplane and not tell the pilots who are operating the airplane, especially when it deals with flight controls," Captain Mike Michaelis, chairman of the safety committee for the Allied Pilots Association, told the Wall Street Journal.

The president of the pilots union at Southwest Airlines, Jon Weaks, said, "We're pissed that Boeing didn't tell the companies, and the pilots didn't get notice."

The aircraft company concealed the new system and minimized the differences between the MAX and other versions of the 737 to boost sales. On the Boeing website, the company claims that airlines can save "millions of dollars" by purchasing the new plane "because of its commonality" with previous versions of the plane.

"Years of experience representing hundreds of victims has revealed a common thread through most air disaster cases," said Charles Herrmann the principle of Herrmann Law. "Generating profit in a fiercely competitive market too often involves cutting safety measures. In this case, Boeing cut training and completely eliminated instructions and warnings on a new system. Pilots didn't even know it existed. I can't blame so many pilots for being mad as hell."

Additionally, the complaint alleges the United States Federal Aviation Administration is partially culpable for negligently certifying Boeing's Air Flight Manual without requiring adequate instruction and training on the new system. Canadian and Brazilian authorities did require additional training.

What's Next?

The consequences for Boeing could be serious and will depend on what the flight and voice data recorders reveal. I also am curious as to what additional flight training or instructions, if any, the Ethiopian Airlines pilots received, either before or after the Lion Air crash, whether from Boeing, an air safety regulator, or any other source.


el_tel , March 11, 2019 at 5:04 pm

Of course we shouldn't engage in speculation, but we will anyway 'cause we're human. If fly-by-wire and the ability of software to override pilots are indeed implicated in the 737 Max 8, then you can bet the Airbus cheerleaders on YouTube videos will engage in huge Schadenfreude.

I really shouldn't even look at comments to YouTube videos – it's bad for my blood pressure. But I occasionally dip into the swamp on ones in areas like airlines. Of course – as you'd expect – you get a large amount of "flag waving" between Europeans and Americans. But the level of hatred and suspiciously similar comments by the "if it ain't Boeing I ain't going" brigade struck me as in a whole new league long before the "SJW" troll wars regarding things like Captain Marvel etc of today.

The Air France Airbus disaster was jumped on: Boeing's traditional hydraulic links between the sticks for the two pilots (ensuring they move in tandem), and the supposed comments by Captain Sully that the Airbus software didn't allow him to hit the water at the optimal angle he wanted (causing the rear rupture in the fuselage), were both held up as showing the inferiority of fly-by-wire, until Boeing started using it too. (Sully has taken issue with the book making the above point and concludes fly-by-wire is a "mixed blessing".)

I'm going to try to steer clear of my YouTube channels on airlines. Hopefully NC will continue to provide the real evidence as it emerges as to what's been going on here.

Monty , March 11, 2019 at 7:14 pm

Re SJW troll wars.

It is really disheartening how an idea as reasonable as "a just society" has been so thoroughly discredited among a large swath of the population.

No wonder there is such a wide interest in primitive construction and technology on YouTube. This society is very sick and it is nice to pretend there is a way to opt out.

none , March 11, 2019 at 8:17 pm

The version I heard (today, on Reddit) was "if it's Boeing, I'm not going". Hadn't seen the opposite version until just now.

Octopii , March 12, 2019 at 5:19 pm

Nobody is going to provide real evidence but the NTSB.

albert , March 12, 2019 at 6:44 pm

Indeed. The NTSB usually works with local investigation teams (as well as a manufacturers rep) if the manufacturer is located in the US, or if specifically requested by the local authorities. I'd like to see their report. I don't care what the FAA or Boeing says about it.
. .. . .. -- .

d , March 12, 2019 at 5:58 pm

Fly-by-wire has been around since the '90s; it's not new.

notabanker , March 11, 2019 at 6:37 pm

Contains a link to a Seattle Times report as a "comprehensive wrap":
Speaking before China's announcement, Cox, who previously served as the top safety official for the Air Line Pilots Association, said it's premature to think of grounding the 737 MAX fleet.

"We don't know anything yet. We don't have close to sufficient information to consider grounding the planes," he said. "That would create economic pressure on a number of the airlines that's unjustified at this point.

China has grounded them. US? Must not create undue economic pressure on the airlines. Right there in black and white. Money over people.

Joey , March 11, 2019 at 11:13 pm

I just emailed Southwest about an upcoming flight, asking about my choices for refusing to board MAX 8/9 planes based on this "feature". I expect pro forma policy recitation, but customer pressure could trump too-big-to-fail sweeping the dirt under the carpet. I hope.

Thuto , March 12, 2019 at 3:35 am

We got the "safety of our customers is our top priority and we are remaining vigilant and are in touch with Boeing and the Civil Aviation Authority on this matter, but will not be grounding the aircraft model until further information on the crash becomes available" speech from a local airline here in South Africa. It didn't take half a day for customer pressure to effect a swift reversal of that blatant disregard for their "top priority". The model is grounded, so yeah, customer muscle-flexing will do it.

Jessica , March 12, 2019 at 5:26 am

On PPRUNE.ORG (where a lot of pilots hang out), they reported that after the Lion Air crash, Southwest added an extra display (to indicate when the two angle of attack sensors were disagreeing with each other) that the folks on PPRUNE thought was an extremely good idea and effective.
Of course, if the Ethiopian crash was due to something different from the Lion Air crash, that extra display on the Southwest planes may not make any difference.
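The extra display Jessica mentions is conceptually tiny. A sketch of such an AOA-disagree annunciation might look like the following; the threshold figure is a made-up assumption for illustration, not the value any airline actually uses.

    # Sketch of an AOA-disagree alert of the kind described above: warn
    # the crew when the two vanes differ by more than some threshold.
    # The 5.5-degree figure is an assumption for illustration only.

    AOA_DISAGREE_THRESHOLD_DEG = 5.5

    def aoa_disagree(left_vane_deg, right_vane_deg):
        """True when the two AOA vanes differ enough to distrust them."""
        return abs(left_vane_deg - right_vane_deg) > AOA_DISAGREE_THRESHOLD_DEG

    print(aoa_disagree(22.0, 5.1))   # True: the crew is warned about bad AOA data

As Jessica notes, an alert like this helps only if the new failure is the same one seen at Lion Air.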

JerryDenim , March 12, 2019 at 2:09 pm

"On PPRUNE.ORG (where a lot of pilots hang out)"

Take those comments with a large dose of salt. Not to say everyone commenting on PPRUNE and sites like PPRUNE are posers, but PPRUNE.org is where a lot of wanna-be pilots and guys that spend a lot of time in basements playing flight simulator games hang out. The "real pilots" on PPRUNE are more frequently of the aspiring airline pilot type that fly smaller, piston-powered planes.

Altandmain , March 11, 2019 at 5:31 pm

We will have to wait and see what the final investigation reveals. However this does not look good for Boeing at all.

The Maneuvering Characteristics Augmentation System (MCAS) system was implicated in the Lion Air crash. There have been a lot of complaints about the system on many of the pilot forums, suggesting at least anecdotally that there are issues. It is highly suspected that the MCAS system is responsible for this crash too.

Keep in mind that Ethiopian Airlines is a pretty well-known and regarded airline. This is not a cut rate airline we are talking about.

At this point, all we can do is to wait for the investigation results.

d , March 12, 2019 at 6:01 pm

One other minor thing: you remember that shutdown? It seems that would have delayed any updates from Boeing. It seems that's one of the things the pilots pointed out while the shutdown was in progress.

WestcoastDeplorable , March 11, 2019 at 5:33 pm

What really is the icing on this cake is the fact that the new, larger engines on the "Max" changed the center of gravity of the plane and made it unstable. From what I've read on aviation blogs, this is highly unusual for a commercial passenger jet. Boeing then created the new "safety" feature which makes the plane fly nose-down to avoid a stall. But of course: garbage in, garbage out on sensors (remember AF447, which stalled right into the South Atlantic?).
It's all politics anyway. If Boeing had been forthcoming about the "Max", it would have required additional pilot training to certify pilots to fly the airliner. They didn't, and now another 189 passengers are D.O.A.
I wouldn't fly on one, and wouldn't let family do so either.

Carey , March 11, 2019 at 5:40 pm

If I have read correctly, the MCAS system (not known of by pilots until after the Lion Air crash) is reliant on a single Angle of Attack sensor, without redundancy (!). It's too early to say if MCAS was an issue in the crashes, I guess, but this does not look good.

Jessica , March 12, 2019 at 5:42 am

If it was some other issue with the plane, that will be almost worse for Boeing. Two crash-causing flaws would require grounding all of the planes, suspending production, then doing some kind of severe testing or other to make sure that there isn't a third flaw waiting to show up.

vomkammer , March 12, 2019 at 3:19 pm

If MCAS relies on only one Angle of Attack (AoA) sensor, then it might have been an error in the system design and the safety assessment, for which Boeing may be liable.

It appears that a failure of the AoA can produce an unannunciated erroneous pitch trim:
a) If the pilots had proper training and awareness, this event would "only" increase their workload;
b) but for an unaware or untrained pilot, the event would impair their ability to fly and introduce excessive workload.

The difference is important, because according to standard civil aviation safety assessment (see for instance EASA AMC 25.1309 Ch. 7), the case a) should be classified as "Major" failure, whereas b) should be classified as "Hazardous". "Hazardous" failures are required to have much lower probability, which means MCAS needs two AoA sensors.

In summary: a safe MCAS would need either a second AoA or pilot training. It seems that it had neither.
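vomkammer's classification argument can be made roughly quantitative. Under CS-25 / AMC 25.1309, the usual budgets are on the order of 1e-5 per flight hour for a Major failure condition and 1e-7 for a Hazardous one. The sketch below plugs in a deliberately made-up vane failure rate, and assumes the two vanes fail independently, just to show the size of the gap at stake:

    # Back-of-envelope version of the argument above. The vane failure
    # rate is a made-up placeholder; the targets are the standard
    # CS-25/AMC 25.1309 budgets, per flight hour.

    HAZARDOUS_TARGET = 1e-7        # budget if the condition is "Hazardous"
    MAJOR_TARGET = 1e-5            # budget if training downgrades it to "Major"

    vane_failure_rate = 1e-5       # hypothetical failures per flight hour

    single = vane_failure_rate     # one bad vane is enough to mislead MCAS
    dual = vane_failure_rate ** 2  # both vanes must fail together
                                   # (assuming independent failures)

    print(f"single vane: {single:.0e}/fh, meets Hazardous budget: {single <= HAZARDOUS_TARGET}")
    print(f"dual vanes:  {dual:.0e}/fh, meets Hazardous budget: {dual <= HAZARDOUS_TARGET}")

On these illustrative numbers, a single-vane MCAS misses a Hazardous budget by two orders of magnitude, while either a second vane or the pilot training that downgrades the condition to Major closes the gap, which is exactly the comment's conclusion.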

drumlin woodchuckles , March 12, 2019 at 1:01 am

What are the ways an ignorant lay air traveler can find out about whether a particular airline has these new-type Boeing 737 MAXes in its fleet? What are the ways an ignorant air traveler can find out which airlines do not have ANY of these airplanes in their fleet?

What are the ways an ignorant air traveler can find out ahead of time, when still planning herm's trip, which flights use a 737 MAX as against some other kind of plane?

The only way the flying public could possibly torture the airlines into grounding these planes until it is safe to de-ground them is a total all-encompassing "fearcott" against this airplane all around the world. Only if the airlines in the "go ahead and fly it" countries sell zero seats, without exception, on every single 737 MAX plane that flies, will the airlines themselves take them out of service till the issues are resolved.

Hence my asking how people who wish to save their own lives from future accidents can tell when and where they might be exposed to the risk of boarding a Boeing 737 MAX plane.

Carey , March 12, 2019 at 2:13 am

Should be in your flight info, if not, contact the airline. I'm not getting on a 737 MAX.

pau llauter , March 12, 2019 at 10:57 am

Look up the flight on Seatguru. Generally tells type of aircraft. Of course, airlines do change them, too.

Old Jake , March 12, 2019 at 2:57 pm

Stop flying. Your employer requires it? Tell 'em where to get off. There are alternatives. The alternatives are less polluting and have lower climate impact too. Yes, this is a hard pill to swallow. No, I don't travel for employment any more; I telecommute. I used to enjoy flying, but now I avoid it like the plague. Crapification.

Darius , March 12, 2019 at 5:09 pm

Additional training won't do. If they wanted larger engines, they needed a different plane. Changing to an unstable center of gravity and compensating for it with new software sounds like a joke except for the hundreds of victims. I'm not getting on that plane.

Joe Well , March 11, 2019 at 5:35 pm

Has there been any study of crapification as a broad social phenomenon? When I Google the word I only get links to NC and sites that reference NC. And yet, this seems like one of the guiding concepts to understand our present world (the crapification of UK media and civil service go a long way towards understanding Brexit, for instance).

I mean, my first thought is, why would Boeing commit corporate self-harm for the sake of a single bullet in sales materials (requires no pilot retraining!). And the answer, of course, is crapification: the people calling the shots don't know what they're doing.

none , March 11, 2019 at 11:56 pm

"Market for lemons" maybe? Anyway the phenomenon is well known.

Alfred , March 12, 2019 at 1:01 am

Google Books finds the word "crapification" quoted (from a 2004 source) in a work of literary criticism published in 2008 (Literature, Science and a New Humanities, by J. Gottschall). From 2013 it finds the following in a book by Edward Keenan, Some Great Idea: "Policy-wise, it represented a shift in momentum, a slowing down of the childish, intentional crapification of the city." So there the word appears clearly in the sense understood by regular readers here (along with an admission that crapification can be intentional and not just inadvertent). To illustrate that sense, Google Books finds the word used in Misfit Toymakers, by Keith T. Jenkins (2014): "We had been to the restaurant and we had water to drink, because after the takeovers, all of the soda makers were brought to ruination by the total crapification of their product, by government management." But almost twenty years earlier the word "crapification" had occurred in a comic strip published in New York Magazine (29 January 1996, p. 100): "Instant crapification! It's the perfect metaphor for the mirror on the soul of America!"

The word has been used on television. On 5 January 2010 a sketch subtitled "Night of Terror – The Crapification of the American Pant-scape" ran on The Colbert Report, per: https://en.wikipedia.org/wiki/List_of_The_Colbert_Report_episodes_(2010) .

Searching the internet, Google results do indeed show many instances of the word "crapification" on NC, or quoted elsewhere from NC posts. But the same results show it used on many blogs since ca. 2010. Here, at http://nyceducator.com/2018/09/the-crapification-factor.html , is a recent example that comments on the word's popularization: "I stole that word, 'crapification,' from my friend Michael Fiorillo, but I'm fairly certain he stole it from someone else. In any case, I think it applies to our new online attendance system." A comment here, https://angrybearblog.com/2017/09/open-thread-sept-26-2017.html , recognizes NC to have been a vector of the word's increasing usage.

Googling shows that there have been numerous instances of the verb "crapify" used in computer-programming contexts, from at least as early as 2006. Google Books finds the word "crapified" used in a novel, Sonic Butler, by James Greve (2004). The derivation "de-crapify" is also attested. "Crapify" was suggested to Merriam-Webster in 2007, per: http://nws.merriam-webster.com/opendictionary/newword_display_alpha.php?letter=Cr&last=40 . At that time the suggested definition was, "To make situations/things bad." The verb was posted to Urban Dictionary in 2003: https://www.urbandictionary.com/define.php?term=crapify .

The earliest serious discussion I could quickly find of crapification as a phenomenon was from 2009, at https://www.cryptogon.com/?p=10611 . I have found only two attempts to elucidate the causes of crapification: http://malepatternboldness.blogspot.com/2017/03/my-jockey-journey-or-crapification-of.html (an essay on undershirts) and https://twilightstarsong.blogspot.com/2017/04/complaints.html (a comment on refrigerators). This essay deals with the mechanics of job crapification: http://asserttrue.blogspot.com/2015/10/how-job-crapification-works.html (relating it to de-skilling).
An apparent Americanism, "crapification" has recently been 'translated' into French: "Mon bled est en pleine urbanisation, comprends : en pleine emmerdisation" [somewhat literally -- My hole in the road is in the midst of development, meaning: in the midst of crapification]: https://twitter.com/entre2passions/status/1085567796703096832 Interestingly, perhaps, a comprehensive search of amazon.com yields "No results for crapification."

Joe Well , March 12, 2019 at 12:27 pm

You deserve a medal! That's amazing research!

drumlin woodchuckles , March 12, 2019 at 1:08 am

This seems more like a specific business conspiracy than like general crapification. This isn't "they just don't make them like they used to." This is like Ford deliberately selling the Crash and Burn Pinto with its special explode-on-impact gas-tank feature.

Maybe some Trump-style insults should be crafted for this plane so they can get memed-up and travel faster than Boeing's ability to manage the story. Epithets like "the new Boeing crash-a-matic dive-liner, with nose-to-the-ground pilot-override autocrash built into every plane." It seems unfair, but life and safety should come before fairness, and that will only happen if a worldwide wave of fear MAKES it happen.

pretzelattack , March 12, 2019 at 2:17 am

Yeah, the first thing I thought of was the Ford Pinto.

The Rev Kev , March 12, 2019 at 4:19 am

Now there is a car tailor-made for modern suicidal jihadists. You wouldn't even have to load it up with explosives, just a full fuel tank:

https://www.youtube.com/watch?v=lgOxWPGsJNY

drumlin woodchuckles , March 12, 2019 at 3:27 pm

" Instant car bomb. Just add gas."

EoH , March 12, 2019 at 8:47 am

Good time to reread Yves' recent Is a Harvard MBA Bad For You?:

The underlying problem is increasingly mercenary values in society.

JerryDenim , March 12, 2019 at 2:49 pm

I think crapification is the end result of a self-serving belief in the unfailing goodness and superiority of Ivy faux-meritocracy and the promotion and exaltation of do-nothing, know-nothing, corporate, revolving-door MBAs and psych-major HR types over people with many years of both company and industry experience and excellent professional track records. The latter group ran major corporations and made the big decisions in the 'good old days'; now it's the former. These morally bankrupt people and their vapid, self-righteous culture of PR first, management science second, and what-the-hell-else-matters-anyway are the prime drivers of crapification. Read the bio of an old-school celebrated CEO like Gordon Bethune (Continental CEO with corporate experience at Boeing), who skipped college altogether and joined the Navy at 17, and ask yourself how many people like that are in corporate boardrooms today. I'm not saying going back to a 'Good Ole Boys Club' is the best model of corporate governance either, but at least people like Bethune didn't think they were too good to mix with their fellow employees, understood leadership and the consequences of bullshit, and knew what the 'the buck stops here' thing was really about. Corporate types today sadly believe their own propaganda, and when their fraudulent schemes, can-kicking, and head-in-the-sand strategies inevitably blow up in their faces, they accept no blame and fail upward to another posh corporate job or a nice golden parachute. The wrong people are in charge almost everywhere these days; hence crapification. Bad incentives, zero white-collar crime enforcement, self-replicating boardrooms, and group-think beget toxic corporate culture, which equals crapification.

Jeff Zink , March 12, 2019 at 5:46 pm

Also try "built in obsolescence"

VietnamVet , March 11, 2019 at 5:40 pm

As the son of a deceased former Boeing aeronautics engineer, I find this tragic. It highlights the problem of financialization, neoliberalism, and lack of corporate responsibility pointed out daily here on NC. The crapification was signaled by the move of the headquarters from Seattle to Chicago and by spending billions to build a second 787 line in South Carolina to bust the unions. Boeing is now an unregulated multinational corporation superior to sovereign nations. However, if the 737 Max crashes have the same cause, this will be hard to whitewash. The design failure of the windows on the de Havilland Comet killed the British passenger aircraft business. The EU will keep a discreet silence, since manufacturing major airline passenger planes is a duopoly with Airbus. However, China hasn't (due to the trade war with the USA), even though Boeing is building a new assembly line there. Boeing escaped any blame for the loss of two Malaysia Airlines 777s. This may be an existential crisis for American aviation. Like a President who denies calling Tim Cook "Tim Apple," or the soft coup ongoing in DC against him, what is really happening globally is not factually reported by corporate media.

Jerry B , March 11, 2019 at 6:28 pm

===Boeing is now an unregulated multinational corporation superior to sovereign nations===

Susan Strange 101.

Or more recently Quinn Slobodian's Globalists: The End of Empire and the Birth of Neoliberalism.

And the beat goes on.

Synoia , March 11, 2019 at 6:49 pm

The design failure of windows on the de Havilland Comet killed the British passenger aircraft business.

Yes, a misunderstanding of the effect of square windows and three-dimensional stress cracking.

Gary Gray , March 11, 2019 at 7:54 pm

Sorry, but 'sovereign' nations were always a scam. Nothing more than an excuse to build capital markets, which are the underpinning of capitalism. Capital markets are what control countries and have since the 1700s. Maybe you should blame the monarchies for selling out to the bankers in the late Middle Ages. Sovereign nations are just economic units for the bankers and the businesses they finance, nothing more. I guess they figured out after the Great Depression that they would throw a bunch of goodies in "Indo-Europeans'" faces in Western Europe, make them decadent and jaded via debt expansion. This goes back to my point about the yellow vests: me me me me me. You reek of it. This stuff with Boeing is all profit-based. It could have happened in 2000, 1960 or 1920. It could happen even under state control. Did you love Hitler's Voltswagon?

As for the soft coup... lol, you mean Trump's soft coup for his allies in Russia and the Middle East? Viva la Saudi King! Posts like these represent the problem with this board: the materialist over the spiritualist. It's like people who still don't get that some of the biggest supporters of a "GND" are racialists, and being somebody who has long run the environmentalist rally game, they are hugely in the game. Yet progressives seem completely blind to it. The media ignores them for con men like David Duke (whose ancestry is not clean, no it's not) and "Unite the Right" (or, as one friend on the environmental circuit told me, Unite the Yahweh apologists) as what's "white". There is a reason they do this.

You need to wake up and stop the self-gratification crap. The planet is dying due to mismanagement: over-urbanization, overpopulation, the constant need for me over ecosystem. It can only last so long. That is why I like zombie movies; it's Gaia theory in a nutshell. Good for you, Earth. Or Midgard, whichever you prefer.

Carey , March 11, 2019 at 8:05 pm

Your job seems to be to muddy the waters, and I'm sure we'll be seeing much more of the same; much more.

Thanks!

pebird , March 11, 2019 at 10:24 pm

Hitler had an electric car?

JerryDenim , March 12, 2019 at 3:05 pm

Hee-hee. I noticed that one too.

TimR , March 12, 2019 at 9:41 am

Interesting, but I'm unclear on some of it... GND supporters are racialist?

JerryDenim , March 12, 2019 at 3:02 pm

Spot-on comment, VietnamVet; a lot of chickens can be seen coming home to roost in this latest Boeing disaster. Remarkable how, not many years ago, the government could regulate the aviation industry without fear of killing it, since there was more than one aerospace company. Not anymore! The scourge of monopsony/monopoly power rears its head and bites in unexpected places.

Ptb , March 11, 2019 at 5:56 pm

More detail on the "MCAS" system responsible for the previous Lion Air crash here (theaircurrent.com)

It says the bigger, repositioned engines, which give the new model its fuel efficiency, and the wing-angle tweaks needed to fit the engines given landing-gear and ground-clearance constraints, change the amount of pitch trim the plane needs in turns to remain level.

The auto system was added to neutralize the pitch trim during turns, to make it handle like the old model.

There is another pitch-trim control besides the main "stick". To deactivate the auto system, this other trim control has to be used; the main controls do not deactivate it (perhaps to prevent it from being unintentionally deactivated, which would be equally bad). If the sensor driving the correction system gives a false reading and the pilot were unaware, there would be seesawing and panic.

Actually, if this all happened again I would be very surprised; nobody flying a 737 could fail to know about it after the previous crash. Curious what they find.

Ptb , March 11, 2019 at 6:38 pm

OK, my typo fixes didn't register; apologies for the gobbledygook.

EoH , March 12, 2019 at 8:38 am

While logical, if your last comment were correct, it should have prevented this most recent crash. It appears that the "seesawing and panic" continue.

I assume it has now gone beyond the cockpit, beyond the design and sales teams, and reached the Boeing board room. From there, it is likely to travel to the board rooms of every airline flying this aircraft or thinking of buying one, to their banks and creditors, and to those who buy or recommend their stock. But it may not reach the FAA for some time.

marku52 , March 12, 2019 at 2:47 pm

Full technical discussion of why this was needed at:

https://leehamnews.com/2018/11/14/boeings-automatic-trim-for-the-737-max-was-not-disclosed-to-the-pilots/

Ptb , March 12, 2019 at 5:32 pm

Excellent link, thanks!

Kimac , March 11, 2019 at 6:20 pm

As to what's next?

Think, Too Big To Fail.

Any number of ways will be found to put lipstick on this pig once we recognize the context.

allan , March 11, 2019 at 6:38 pm

"Canadian and Brazilian authorities did require additional training" from the quote at the bottom is not
something I've seen before. What did they know and when did they know it?

rd , March 11, 2019 at 8:31 pm

They probably just assumed that the changes in the plane from previous 737s were big enough to warrant treating it like a major change requiring training.

Both countries fly into remote areas with highly variable weather conditions and some rugged terrain.

dcrane , March 11, 2019 at 7:25 pm

Re: withholding information from the FAA

For what it's worth, the quoted section says that Boeing withheld info about the MCAS from "midlevel FAA officials", while Jerri-Lynn refers to the FAA as a whole.

This makes me wonder if top-level FAA people certified the system.

Carey , March 11, 2019 at 7:37 pm

See under "regulatory capture"

Corps run the show, regulators are window-dressing.

IMO, of course.

allan , March 11, 2019 at 8:04 pm

It wasn't always this way. From 1979:

DC-10 Type Certificate Lifted [Aviation Week]

FAA action follows finding of new cracks in pylon aft bulkhead forward flange; crash investigation continues

Suspension of the McDonnell Douglas DC-10's type certificate last week followed a separate grounding order from a federal court as government investigators were narrowing the scope of their investigation of the American Airlines DC-10 crash May 25 in Chicago.

The American DC-10-10, registration No. N110AA, crashed shortly after takeoff from Chicago's O'Hare International Airport, killing 259 passengers, 13 crewmembers and three persons on the ground. The 275 fatalities make the crash the worst in U.S. history.

The controversies surrounding the grounding of the entire U.S. DC-10 fleet and, by extension, many of the DC-10s operated by foreign carriers, by Federal Aviation Administrator Langhorne Bond on the morning of June 6 revolve around several issues.

Carey , March 11, 2019 at 8:39 pm

Yes, I remember back when the FAA would revoke a type certificate if a plane was a danger to public safety. It wasn't even that long ago. Now their concern is any threat to Boeing™. There's a name for that ...

Joey , March 11, 2019 at 11:22 pm

A 'worst' disaster in Chicago would still ground planes. Lucky for Boeing it's brown and browner.

Max Peck , March 11, 2019 at 7:30 pm

It's not correct to claim the MCAS was concealed. It's right in the January 2017 rev of the NG/MAX differences manual.

Carey , March 11, 2019 at 7:48 pm

Mmm. Why do the dudes and dudettes *who fly the things* say they knew nothing about MCAS? Their training is quite rigorous.

Max Peck , March 11, 2019 at 10:00 pm

See a post below for link. I'd have provided it in my original post but was on a phone in an inconvenient place for editing.

Carey , March 12, 2019 at 1:51 am

'Boeing's automatic trim for the 737 MAX was not disclosed to the Pilots':

https://leehamnews.com/2018/11/14/boeings-automatic-trim-for-the-737-max-was-not-disclosed-to-the-pilots/

marku52 , March 12, 2019 at 2:39 pm

Leeham News is the best site for info on this. For those of you interested in the tech details, go to Bjorn's Corner, where he writes about aeronautic design issues.

I was somewhat horrified to find that modern aircraft flying at near-Mach speeds have a lot of somewhat pasted-on pilot assists. All of them. None of them flies with nothing but good old stick-and-rudder. Not Airbus (which is fully fly-by-wire: all pilot inputs go through a computer), and not Boeing, which is somewhat less so.

This latest "solution came about becuse the larger engines (and nacelles) fitted on the Max increased lift ahead of the center of gravity in a pitchup situation, which was destabilizing. The MCAS uses inputs from air speed and angle of attack sensors to put a pitch down input to the horizonatal stablisizer.

A faulty AoA sensor led to Lion Air's MAX pushing the nose down against the pilots' efforts, all the way into the sea.

This is the best backgrounder

https://leehamnews.com/2018/11/14/boeings-automatic-trim-for-the-737-max-was-not-disclosed-to-the-pilots/

The Rev Kev , March 11, 2019 at 7:48 pm

One guy said last night on TV that Boeing has eight years of back orders for this aircraft, so you had better believe this crash will be studied furiously. I saw a picture of the crash site, and it looks like it augered in almost straight down. There seems to be a large hole, and the wreckage is not strewn over much of an area. I understand they were digging out the cockpit, as it was underground. Strange, that.

Carey , March 11, 2019 at 7:55 pm

It's said that the Flight Data Recorders have been found, FWIW.

EoH , March 12, 2019 at 9:28 am

Suggestive of a high-speed, nose-first impact. Not the angle of attack a pilot would ordinarily choose.

Max Peck , March 11, 2019 at 9:57 pm

It's not true that Boeing hid the existence of the MCAS. They documented it in the January 2017 rev of the NG/MAX differences manual and probably earlier than that. One can argue whether the description was adequate, but the system was in no way hidden.

Carey , March 11, 2019 at 10:50 pm

Looks like, for now, we're stuck between your "in no way hidden", and numerous 737 pilots' claims on various online aviation boards that they knew nothing about MCAS. Lots of money involved, so very cloudy weather expected. For now I'll stick with the pilots.

Alex V , March 12, 2019 at 2:27 am

To the best of my understanding and reading on the subject, the system was well documented in the Boeing technical manuals, but not in the pilots' manuals, where it was only briefly mentioned, at best, and not by all airlines. I'm not an airline pilot, but from what I've read, airlines often write their own additional operators manuals for aircraft models they fly, so it was up to them to decide the depth of documentation. These are in theory sufficient to safely operate the plane, but do not detail every aircraft system exhaustively, as a modern aircraft is too complex to fully understand. Other technical manuals detail how the systems work, and how to maintain them, but a pilot is unlikely to read them as they are used by maintenance personnel or instructors. The problem with these cases (if investigations come to the same conclusions) is that insufficient information was included in the pilots manual explaining the MCAS, even though the information was communicated via other technical manuals.

vlade , March 12, 2019 at 11:50 am

This is correct.

A friend of mine is a commercial pilot who's just doing a 'training' exercise having moved airlines.

He's been flying the planes in question most of his life, but the airline is asking him to re-do it all according to their manuals and their rules. If the airline manual does not bring something up, then the pilots will not read it – few of them have time to go after the actual technical manuals and read those in addition to what the airline wants. [Oh, and it does not matter that he has tens of thousands of hours on the airplane in question: if he does not do something in accordance with his new airline's manual, he'd get kicked out, even if he was right and the airline manual wrong.]

I believe (but would have to check with him) that some countries' regulators do their own testing over and above the airlines', but again, it depends on what they put in.

Alex V , March 12, 2019 at 11:58 am

Good to hear my understanding was correct. My take on the whole situation is that Boeing was negligent in communicating the significance of the change, given human psychology and current pilot training. The reason was to enable easier aircraft sales. The purpose of the MCAS system is, however, quite legitimate – it enables a more fuel-efficient plane while compensating for a corner case of the flight envelope.

Max Peck , March 12, 2019 at 8:01 am

The link is to the actual manual. If that doesn't make you reconsider, nothing will. Maybe some pilots aren't expected to read the manuals, I don't know.

Furthermore, the post stated that Boeing failed to inform the FAA about the MCAS. Surely the FAA has time to read all of the manuals.

Darius , March 12, 2019 at 6:18 pm

Nobody reads instruction manuals. They're for reference. Boeing needed to yell at the pilots to be careful to read new pages 1,576 through 1,629 closely. They're a lulu.

Also, what's with screwing with the geometry of a stable plane so that it will fall out of the sky without constant adjustments by computer software? It's like having a car designed to explode but don't worry. We've loaded software to prevent that. Except when there's an error. But don't worry. We've included reboot instructions. It takes 15 minutes but it'll be OK. And you can do it with one hand and drive with the other. No thanks. I want the car not designed to explode.

The Rev Kev , March 11, 2019 at 10:06 pm

The FAA is already leaping to the defense of the Boeing 737 Max 8 even before they have a chance to open up the black boxes. Hope that nothing "happens" to those recordings.

https://www.bbc.com/news/world-africa-47533052

Milton , March 11, 2019 at 11:04 pm

I don't know; crapification, at least for me, refers to products, services, or infrastructure that have declined to the point of becoming a nuisance rather than the benefit they once were. This case with Boeing borders on criminal negligence.

pretzelattack , March 12, 2019 at 8:20 am

I came across a word that was new to me, "crapitalism"; it goes well with crapification.

TG , March 12, 2019 at 12:50 am

1. It's really kind of amazing that we can fly to the other side of the world in a few hours – a journey that in my grandfather's time would have taken months and been pretty unpleasant and risky – and we expect perfect safety.

2. Of course the best-selling jet will see these issues. It's the law of large numbers.

3. I am not a fan of Boeing's corporate management, but still, compared to Wall Street and defense contractors and big education etc., they still produce an actual, technically useful artifact that mostly works, and at levels of performance that in other fields would be considered superhuman.

4. Even for Boeing, one wonders when the rot will set in. Building commercial airliners is hard! So many technical details, nowhere to hide if you make even one mistake; so easy to just abandon the business entirely. Do what the (ex) US auto industry did: contract out to foreign manufacturers, just slap a "USA" label on it, and double down on marketing. Milk the cost-plus cash cow of the defense market. Or just financialize the entire thing, become too big to fail, and walk away with all the profits before the whole edifice crumbles. Greed is good, right?

marku52 , March 12, 2019 at 2:45 pm

"Of course the best-selling jet will see these issues. It's the law of large numbers."

Two crashes of a new model in very similar circumstances are very unusual. And the FAA admits they are requiring a firmware upgrade sometime in April. Pilots need to be hyper-aware of what this MCAS system is doing. And they currently aren't.
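For what it's worth, the "law of large numbers" claim above is easy to sanity-check with a back-of-envelope Poisson calculation. A minimal sketch in Python; the fleet flight count and the base hull-loss rate below are rough assumptions picked for illustration, not official figures:

    # Sanity check of the "law of large numbers" claim.
    # Both inputs are rough assumptions, not official figures.
    from math import exp

    flights = 500_000    # assumed total 737 MAX flights flown by March 2019
    base_rate = 0.2e-6   # assumed hull-loss rate per flight for a modern jet

    lam = flights * base_rate                  # expected number of crashes
    p_two_or_more = 1 - exp(-lam) * (1 + lam)  # Poisson tail P(N >= 2)

    print(f"expected crashes: {lam:.2f}")            # 0.10
    print(f"P(two or more) ~ {p_two_or_more:.4f}")   # ~0.0047

Under these assumptions two crashes is roughly a half-percent event, which is why "the law of large numbers" alone is not a satisfying explanation.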

Prairie Bear , March 12, 2019 at 2:42 am

if it went into a stall, it would lower the nose suddenly to pick up airspeed and fly normally again.

A while before I read this post, I listened to a news clip that reported that the plane was observed "porpoising" after takeoff. I know only enough about planes and aviation to be a more or less competent passenger, but it does seem like that is something that might happen if the plane had such a feature and the pilot was not familiar with it and was trying to fight it? The link below is not to the story I saw, I don't think, but to another one I just found.


https://www.yahoo.com/gma/know-boeing-737-max-8-crashed-ethiopia-221411537.html

none , March 12, 2019 at 5:33 am

https://www.reuters.com/article/us-ethiopia-airplane-witnesses/ethiopian-plane-smoked-and-shuddered-before-deadly-plunge-idUSKBN1QS1LJ

Reuters reports people saw smoke and debris coming out of the plane before the crash.

Jessica , March 12, 2019 at 6:06 am

At PPRUNE.ORG, many of the commentators are skeptical of what witnesses of airplane crashes say they see, but more trusting of what they say they hear.
The folks at PPRUNE.ORG who looked at the record of the flight from FlightRadar24, which only covers part of the flight because FlightRadar24's coverage in that area is not so good and the terrain is hilly, see a plane flying fast in a straight line very unusually low.

EoH , March 12, 2019 at 8:16 am

The dodge about making important changes that affect aircraft handling but not disclosing them – so as to avoid mandatory pilot training, which would discourage airlines from buying the modified aircraft – is an obvious business-over-safety choice by an ethics and safety challenged corporation.

But why does even a company of that description, many of whose top managers, designers, and engineers live and breathe flight, allow its s/w engineers to prevent the pilots from overriding a supposed "safety" feature while actually flying the aircraft? Was it because it would have taken a little longer to write and test the additional s/w or because completing the circle through creating a pilot override would have mandated disclosure and additional pilot training?

Capt. "Sully" Sullenberger and his passengers and crew would have ended up in pieces at the bottom of the Hudson if the s/w on his aircraft had prohibited out of the ordinary flight maneuvers that contradicted its programming.

Alan Carr , March 12, 2019 at 9:13 am

If you carefully review the overall airframe of the 737, it has hardly changed over the past 20 years or so; for the most part, see the Boeing 737 specifications. What I believe the real issue to be here is that the avionics upgrades over the years have changed things dramatically. More and more precision avionics are installed, with less and less pilot input and ultimately no pilot control of the aircraft. Though Boeing will get the brunt of the lawsuits, the avionics company will be the real culprit. I believe the avionics on the Boeing 737 are made by Rockwell Collins, which, you guessed it, is owned by Boeing.

Max Peck , March 12, 2019 at 9:38 am

Rockwell Collins has never been owned by Boeing.

Also, to correct some upthread assertions, MCAS has an off switch.

WobblyTelomeres , March 12, 2019 at 10:02 am

United Technologies, UTX, I believe. If I knew how to short, I'd probably short this 'cause if they aren't partly liable, they'll still be hurt if Boeing has to slow (or, horror, halt) production.

Alan Carr , March 12, 2019 at 11:47 am

You are right, Max, I misspoke. Rockwell Collins is owned by United Technologies Corporation.

Darius , March 12, 2019 at 6:24 pm

Which astronaut are you? Heh.

EoH , March 12, 2019 at 9:40 am

Using routine risk-management protocols, the American FAA should need continuing "data" on an aircraft for it to maintain its airworthiness certificate. Its current press materials on the Boeing 737 Max 8 suggest it needs data to yank the certificate or to ground the aircraft pending review. Has any other commercial aircraft suffered two apparently similar catastrophic losses this close together, within two years of the aircraft's launch?

Synoia , March 12, 2019 at 11:37 am

I am raising an issue with "crapification" as a meme. Crapification is a symptom of a specific behaviour.

GREED.

Please could you reconsider your writing to include this very old, tremendously venal, and "worst" sin?

US inventiveness in coining the new word "crapification" implies that some error could be corrected. If it is a deliberate sin, it requires atonement and forgiveness, and a sacrifice of worldly assets, for any chance of forgiveness and redemption.

Alan Carr , March 12, 2019 at 11:51 am

Something else that will be interesting to this thread is that Boeing doesn't seem to mind letting the Boeing 737 MAX aircraft remain for sale on the open market.

vlade , March 12, 2019 at 11:55 am

the EU suspends MAX 8s too

Craig H. , March 12, 2019 at 2:29 pm

The moderators in reddit.com/r/aviation are fantastic.

They have corralled everything into one mega-thread which is worth review:

https://www.reddit.com/r/aviation/comments/azzp0r/ethiopian_airlines_et302_and_boeing_737_max_8/

allan , March 12, 2019 at 3:00 pm

Thanks. That's a great link with what seem to be some very knowledgeable comments.

John Beech , March 12, 2019 at 2:30 pm

Experienced private pilot here. Lots of commercial pilot friends. First, the EU suspending the MAX 8 is politics. Second, the FAA-mandated changes were already in the pipeline. Third, this won't stop the ignorant from staking out a position on this, and speculating about it on the internet, of course. Fourth, I'd hop a flight in a MAX 8 without concern – especially with a US pilot on board. Why? In part because the Lion Air event a few months back led to pointed discussion about the thrust line of the MAX 8 vs. the rest of the 737 fleet and the way the plane has software to help during strong pitch-up events (the MAX 8 and 9 have really powerful engines).

Basically, pilots have been made keenly aware of the issue and trained in what to do. Another reason I'd hop a flight in one right now is that there have been more than 31,000 trouble-free flights in the USA in this new aircraft to date. My point is, if there were a systemic issue we'd already know about it. Note, the PIC in the recent crash had 8,000+ hours but the FO had about 200 hours, and there is speculation he was flying. Speculation.

Anyway, US commercial fleet pilots are very well trained to deal with runaway trim or uncommanded flight excursions. How? Simple, by switching the breaker off. It's right near your fingers. Note, my airplane has an autopilot also. In the event the autopilot does something unexpected, just like the commercial pilot flying the MAX 8, I'm trained in what to do (the very same thing, switch the thing off).

Moreover, I speak from experience because I've had it happen twice in 15 years – once an issue with a servo causing the plane to slowly drift right wing low, and once a connection came loose, leaving the plane trimmed right wing low (coincidence). My reaction was about the same as that of an experienced typist automatically hitting backspace upon realizing they mistyped a word, i.e. not reflex but nearly so. In my case, it was to throw the breaker to power off the autopilot as I leveled the plane. No big deal.

Finally, as of yet there has been no analysis from the black boxes. I advise holding off on the speculation until there is. They've been found and we'll learn something soon. The yammering and near hysteria by non-pilots – especially in this thread – reminds me of the old saw about not knowing how smart or ignorant someone is until they open their mouth.

notabanker , March 12, 2019 at 5:29 pm

So let me get this straight.

While Boeing is designing a new 787, Airbus redesigns the A320. Boeing cannot compete with it, so instead of redesigning the 737 properly, they put larger engines on it, further forward, which was never intended in the original design. To compensate, they use software fed by two sensors, not three – making it mathematically impossible, when you have a faulty sensor, to know which one it is – to automatically adjust the pitch to prevent a stall; and this is the only true way to prevent a stall. But since you can kill the breaker and disable the feature if you have a bad sensor you can't possibly identify, everything is OK. And now that the pilots can disable a feature required for certification, we should all feel good about these brand-new planes that, for the first time in history, crashed twice within 5 months.

And the FAA, which hasn't had a Director in 14 months, knows better than the UK, Europe, China, Australia, Singapore, India, Indonesia, Africa and basically every other country in the world except Canada. And the reason every country in the world except Canada has grounded the fleet is political? Singapore put Silk Air out of business because of politics?

How many people need to be rammed into the ground at 500 mph from 8000 feet before yammering and hysteria are justified here? 400 obviously isn't enough.
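The two-sensors-versus-three point above is classic redundancy logic: with two sensors you can detect a disagreement but cannot tell which unit failed, while with three the majority can out-vote the bad one. A minimal sketch of the idea in Python; the readings and the 5-degree disagreement threshold are invented for illustration, and real flight-control voting logic is far more involved:

    DISAGREE = 5.0  # degrees of allowed disagreement between AoA vanes

    def two_sensor_vote(a, b):
        """Dual-channel compare: a fault is detectable but not isolable."""
        if abs(a - b) > DISAGREE:
            return None  # disagreement flagged, but which vane is lying?
        return (a + b) / 2

    def three_sensor_vote(a, b, c):
        """Triple-channel majority: the median out-votes one bad unit."""
        return sorted([a, b, c])[1]

    # One vane stuck at 25 degrees while the true angle of attack is ~4:
    print(two_sensor_vote(25.0, 4.1))         # None: detected, not isolated
    print(three_sensor_vote(25.0, 4.1, 3.9))  # 4.1: bad reading out-voted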

VietnamVet , March 12, 2019 at 5:26 pm

Overnight since my first post above, the 737 Max 8 crash has become political. The black boxes haven't been officially read yet. Still, airlines and aviation authorities have grounded the airplane in Europe, India, China, Mexico, Brazil, Australia, and S.E. Asia, in opposition to the FAA's "Continued Airworthiness Notification to the International Community" issued yesterday.

I was wrong. There will be no whitewash. I thought they would remain silent. My guess is this is a result of an abundance of caution plus greed (the Europeans couldn't help gutting Airbus's competitor, Boeing). This will not be discussed, but it is also a manifestation of Trump Derangement Syndrome (TDS). Since the President has started dissing Atlantic Alliance partners, extorting defense money, fighting trade wars, and calling 3rd-world countries s***-holes, there is no sympathy for the collapsing hegemon. Boeing stock is paying the price. If the cause is the faulty design of the flight-position sensors and the fly-by-wire software control system, it will take a long while to design and get approval of a new, safe, redundant control system and refit the airplanes before they fly again overseas. A real disaster for America's last manufacturing industry.

[Mar 13, 2019] Boeing might not survive a third crash

Too much automation and too complex a flight control computer endanger the lives of pilots and passengers...
Notable quotes:
"... When systems (like those used to fly giant aircraft) become too automatic while remaining essentially stupid or limited by the feedback systems, they endanger the airplane and passengers. These two "accidents" are painful warnings for air passengers and voters. ..."
"... This sort of problem is not new. Search the web for pitot/static port blockage, erroneous stall / overspeed indications. Pilots used to be trained to handle such emergencies before the desk-jockey suits decided computers always know best. ..."
"... @Sky Pilot, under normal circumstances, yes. but there are numerous reports that Boeing did not sufficiently test the MCAS with unreliable or incomplete signals from the sensors to even comply to its own quality regulations. ..."
"... Boeing did cut corners when designing the B737 MAX by just replacing the engines but not by designing a new wing which would have been required for the new engine. ..."
"... I accept that it should be easier for pilots to assume manual control of the aircraft in such situations but I wouldn't rush to condemn the programmers before we get all the facts. ..."
Mar 13, 2019 | www.nytimes.com

Shirley OK March 11

I want to know if Boeing 767s, as well as the new 737s, now have the MAX 8 flight control computer installed, with pilots maybe not being trained to use it, or it being uncontrollable.

A third Boeing – not a passenger plane but a big 767 cargo plane flying a bunch of stuff for Amazon – crashed near Houston (where it was to land) on 2-23-19. The two pilots were killed. Apparently there was no call for help (at least none was mentioned in the AP article about it that I read).

'If' the new MAX 8 system had been installed, had either Boeing or the owner of the cargo-plane business been informed of problems with MAX 8 equipment that had caused a crash and many deaths in a passenger plane (this would have been after the Indonesian crash)? Was that info given to the two pilots who died, if the MAX 8 system is also being used in some 767s? Did Boeing get the black box from that plane and, if so, what did they find out?

Those two pilots' lives matter also – particularly since the Indonesian 737 crash with MAX 8 equipment had already happened. Boeing hasn't said anything (yet, that I've seen) about whether or not the new MAX 8 flight control computer, and the extra steps needed to get manual control, are on others of their planes.

I want to know the cause of that third Boeing plane crashing, and whether there have been crashes/deaths in others of Boeing's big cargo planes. What's the total of all Boeing crashes/fatalities in the last few months, and how many of those planes had the MAX 8 system?

Rufus SF March 11

Gentle readers: In the aftermath of the Lion Air crash, do you think it possible that all 737Max pilots have not received mandatory training review in how to quickly disconnect the MCAS system and fly the plane manually?

Do you think it possible that every 737Max pilot does not have a "disconnect review" as part of his personal checklist? Do you think it possible that at the first hint of pitch instability, the pilot does not first think of the MCAS system and whether to disable it?

Harold Orlando March 11

Compare the altitude fluctuations with those from Lion Air in NYTimes excellent coverage( https://www.nytimes.com/interactive/2018/11/16/world/asia/lion-air-crash-cockpit.html ), and they don't really suggest to me a pilot struggling to maintain proper pitch. Maybe the graph isn't detailed enough, but it looks more like a major, single event rather than a number of smaller corrections. I could be wrong.

Reports of smoke and fire are interesting; there is nothing in the modification that (we assume) caused Lion Air's crash that would explain smoke and fire. So I would hesitate to zero in on the modification at this point. Smoke and fire coming from the luggage bay suggest a runaway Li battery someone put in their suitcase. This is a larger issue because that can happen on any aircraft, Boeing, Airbus, or other.

mrpisces Loui March 11

It is a shame that Boeing will not ground this aircraft, knowing they introduced the MCAS component to automate the stall recovery of the 737 MAX; in my opinion it is behind these accidents. Stall recovery has always been a step all pilots handled when the stick shaker and other audible warnings were activated to alert the pilots.

Now, Boeing invented MCAS as a "selling and marketing point" for a problem that didn't exist. MCAS kicks in when the aircraft is about to enter the stall phase and places the aircraft in a nose dive to regain speed. This only works when the air-speed sensors are working properly. Now imagine when the air-speed sensors malfunction and the plane is wrongly put into a nose dive.

The pilots are going to pull back on the stick to level the plane. The MCAS, which is still getting incorrect air-speed data, is going to put the airplane back into a nose dive. The pilots pull back on the stick again to level the aircraft. This repeats itself till the airplane impacts the ground, which is exactly what happened.

Add the fact that Boeing did not disclose the existence of the MCAS and its role to pilots. At this point only money is keeping the 737 MAX in the air. When Boeing talks about safety, they are not referring to passenger safety but profit safety.
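The feedback loop described above is easy to caricature in a few lines of code. A toy sketch, not Boeing's actual control law: the stall threshold, trim increment, and pilot input below are all invented numbers chosen only to show the ratchet effect of a stuck sensor:

    def mcas_step(sensed_aoa, stall_threshold=12.0, trim_increment=-2.5):
        """Command nose-down trim whenever the sensed angle of attack
        looks like an approaching stall; otherwise do nothing."""
        return trim_increment if sensed_aoa > stall_threshold else 0.0

    pitch = 5.0         # degrees; nose-up is positive
    vane_offset = 20.0  # a stuck AoA vane reading 20 degrees too high

    for t in range(5):
        sensed = pitch + vane_offset   # garbage in ...
        pitch += mcas_step(sensed)     # ... nose-down trim out
        pitch += 2.0                   # pilot pulls back, partial recovery
        print(f"t={t}s  sensed AoA={sensed:.1f}  pitch={pitch:.1f}")

Each cycle the nose-down command (-2.5) outweighs the pilot's correction (+2.0), so the pitch ratchets downward: exactly the repeating contest the comment describes.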

Tony San Diego March 11

1. The procedure to allow a pilot to take complete control of the aircraft from auto-pilot mode should have been standard, e.g. pull back on the control column. It is not reasonable to expect a pilot to follow some checklist to determine and then turn off a misbehaving module, especially in emergency situations, even if that procedure is written in fine print in a manual. (The number of modules to disable may keep increasing if this is allowed.)

2. How are US airlines confident of the safety of the 737 MAX right now, when little is known about the cause of the second crash? What is known is that both crashed aircraft were brand new, and we should be seeing news articles on how the plane's brand-new advanced technology saved the day from the pilot, not the other way round.

3. In the first crash, the plane's advanced technology could not even recognize that the flight path was abnormal and/or the airspeed readings were too erroneous, and mandate that the pilot take complete control immediately!

John✔️✔️Brews Tucson, AZ March 11

It's straightforward to design for standard operation under normal circumstances. But when bizarre operation occurs resulting in extreme circumstances a lot more comes into play. Not just more variables interacting more rapidly, testing system response times, but much happening quickly, testing pilot response times and experience. It is doubtful that the FAA can assess exactly what happened in these crashes. It is a result of a complex and rapid succession of man-machine-software-instrumentation interactions, and the number of permutations is huge. Boeing didn't imagine all of them, and didn't test all those it did think of.

The FAA is even less likely to do so. Boeing eventually will fix some of the identified problems, and make pilot intervention more effective. Maybe all that effort to make the new cockpit look as familiar as the old one will be scrapped? Pilot retraining will be done? Redundant sensors will be added? Additional instrumentation? Software re-written?

That'll increase costs, of course. Future deliveries will cost more. Looks likely there will be some downtime. Whether the fixes will cover sufficient eventualities, time will tell. Whether Boeing will be more scrupulous in future designs, less willing to cut corners without evaluating them? Will heads roll? Well, we'll see...

Ron SC March 11

Boeing has been in trouble technologically since its merger with McDonnell Douglas, which some industry analysts called a takeover, though it isn't clear who took over whom, since MD got Boeing's name while Boeing took the MD logo and moved its headquarters from Seattle to Chicago.

In addition to its problems with the 737 MAX, Boeing is charging NASA considerably more than the small startup SpaceX for a capsule designed to ferry astronauts to the space station. Boeing's Starliner looks like an Apollo-era craft and is launched via a 1960s-style Atlas booster.

Despite using what appears to be old technology, the Starliner is well behind schedule and over budget while the SpaceX capsule has already docked with the space station using state-of-art reusable rocket boosters at a much lower cost. It seems Boeing is in trouble, technologically.

BSmith San Francisco March 11

When you read that this model of the Boeing 737 MAX was more fuel efficient, and view the horrifying graphs (the passengers spent their last minutes in sheer terror) of the vertical jerking up and down of both aircraft, and learn that both crashes occurred minutes after takeoff, you are 90% sure that the problem is with the design, or a design not compatible with pilot training. Pilots in both planes had received permission to return to the airports. The likely culprit, to a trained designer, is the control system for injecting the huge amounts of fuel necessary to lift the plane to cruising altitude. Pilots knew it was happening and did not know how to override the fuel-injection system.

These two crashes foretell what will happen if airlines, purely in the name of saving money, eliminate human control of aircraft. There will be many more crashes.

These ultra-complicated machines, which defy gravity and lift thousands of pounds of dead weight into the stratosphere to reduce friction with the air, are immensely complex and common. Thousands of flight paths cover the globe each day. Human pilots must ultimately be in charge – for our own peace of mind, and for their ability to deal with unimaginable, unforeseen hazards.

When systems (like those used to fly giant aircraft) become too automatic while remaining essentially stupid or limited by the feedback systems, they endanger the airplane and passengers. These two "accidents" are painful warnings for air passengers and voters.

Brez Spring Hill, TN March 11

1. Ground the Max 737.

2. Deactivate the ability of the automated system to override pilot inputs, which it apparently can do even with the autopilot disengaged.

3. Make sure that the autopilot disengage button on the yoke (pickle switch) disconnects ALL non-manual control inputs.

4. I do not know if this version of the 737 has direct-input ("rope start") gyroscope, airspeed, and vertical-speed indicators for emergencies such as failure of the electronic wonder-stuff. If not, install them. Train pilots to use them.

5. This will cost money, a lot of money, so we can expect more self-serving excuses until the FAA forces Boeing to do the right thing.

6. This sort of problem is not new. Search the web for pitot/static port blockage, erroneous stall / overspeed indications. Pilots used to be trained to handle such emergencies before the desk-jockey suits decided computers always know best.

Harper Arkansas March 11

I flew big jets for 34 years, mostly Boeings. Boeing added new logic to the trim system and was allowed not to make it known to pilots. However, it was in the maintenance manuals. Not great, but these airplanes are now so complex that there are many systems whose intimate details pilots don't know.

NOT IDEAL, BUT NOT OVERLY SIGNIFICANT. Boeing changed one of the ways to stop a runaway trim system by eliminating the control-column trim brake, i.e., airplane nose goes up, push down (which is instinct) and it stops the trim from running out of control.

BIG DEAL, BOEING AND FAA, NOT TELLING PILOTS. Boeing produces checklists for almost any conceivable malfunction. We pilots are trained to accomplish the obvious, then go immediately to the checklist. Some items on the checklist are so important they are called "Memory Items" or "Red Box Items".

These would include things like in an explosive depressurization to put on your o2 mask, check to see that the passenger masks have dropped automatically and start a descent.

Another has always been STAB TRIM SWITCHES ...... CUTOUT which is surrounded by a RED BOX.

For very good reasons these two guarded switches are very conveniently located on the pedestal right between the pilots.

So if the nose is pitching incorrectly, STAB TRIM SWITCHES ..... CUTOUT!!! Ask questions later, go to the checklist. THAT IS THE PILOTS' AND TRAINING DEPARTMENTS' RESPONSIBILITY. At this point the cause is not what matters.

David Rubien New York March 11

If these crashes turn out to result from a Boeing flaw, how can that company continue to stay in business? It should be put into receivership and its executives prosecuted. How many deaths are permissible?

Osama Portland OR March 11

The emphasis on software is misplaced. The software intervention is triggered by readings from something called an Angle of Attack sensor. This sensor is relatively new on airplanes. A delicate blade protrudes from the fuselage and is deflected by airflow. The direction of the airflow determines the reading. A false reading from this instrument is the "garbage in" input to the software that takes over the trim function and directs the nose of the airplane down. The software seems to be working fine. The AOA sensor? Not so much.

experience Michiigan March 11

The basic problem seems to be that the 737 MAX 8 was not designed for the larger engines, and so there are flight characteristics that could be dangerous. To compensate for the flaw, computer software was used to control the aircraft when the situation was encountered. The software failed to prevent the situation from becoming a fatal crash.

The workaround may be the big mistake: not redesigning the aircraft properly for the larger engines in the first place. The aircraft may need to be modified at a cost that would not be realistic, and therefore be abandoned and an entirely new aircraft design implemented. That sounds very drastic, but the only other solution would be to go back to the original engines. The Boeing Company is at a crossroads that could be its demise if the wrong decision is made.

Sky Pilot NY March 11

It may be a training issue, in that the 737 MAX has several systems changes from previous 737 models that may not be covered adequately in differences training, checklists, etc. In the Lion Air crash, a sticky angle-of-attack vane caused the auto-trim to force the nose down in order to prevent a stall. This is a worthwhile safety feature of the MAX, but the crew was slow (or unable) to troubleshoot and isolate the problem. It need not have caused a crash. I suspect the same thing happened with Ethiopian Airlines. The circumstances are temptingly similar.

Thomas Singapore March 11

@Sky Pilot, under normal circumstances, yes. but there are numerous reports that Boeing did not sufficiently test the MCAS with unreliable or incomplete signals from the sensors to even comply to its own quality regulations. And that is just one of the many quality issues with the B737 MAX that have been in the news for a long time and have been of concern to some of the operators while at the same time being covered up by the FAA.

Just look at the difference in training requirements between the FAA and the Brazilian aviation authority.

Brazilian pilots need to fully understand the MCAS and how to handle it in emergency situations while FAA does not even require pilots to know about it.

Thomas Singapore March 11

This is yet another beautiful example of the difference in approach between Europeans and US Americans. While Europeans usually test their products thoroughly before delivery in order to avoid any potential failures in their customers' hands, the US approach is different: it is "make it work somehow and fix the problems when the client has them".

Which is what happened here as well. Boeing did cut corners when designing the B737 MAX by just replacing the engines but not by designing a new wing which would have been required for the new engine.

So the aircraft became unstable to fly at low speeds and in tight turns, which required a fix: implementing the MCAS, which was then kept out of recertification procedures for clients for reasons of competitive sales arguments. And of course, the FAA played along and provided cover for this cutting of corners, as this was a product of a US company.

Then the proverbial brown stuff hit the fan, not once but twice. So Boeing sent its "thoughts and prayers" and started to hope for the storm to blow over and for a fix that would not be very expensive and would not eat the shareholder value away.

Sorry, but that is not the way to design and maintain aircraft. If you do it, do it right the first time, not after more than 300 people have died in accidents. There is a reason why China copied the Airbus A320 and not the Boeing B737 when building its COMAC C919. The Airbus is not a cheap fix still being tested by its customers.

Rafael USA March 11

@Thomas And how do you know that Boeing does not test the aircraft before delivery? It is a requirement by the FAA that the complete product, systems, parts, and sub-parts be tested before delivery. However, it seems Boeing has not approached the problem (or maybe they do not know the real issue).

As for the design, are you an engineer who can say whether the design and the use of new engines without a complete redesign is wrong? Have you seen the design drawings of the airplane? I work in an industry in which our products are used for testing different parts of aircraft, and Boeing is one of our customers.

Our products are used during the manufacturing and maintenance of airplanes. My guess is that Boeing has no idea what is going on. Your biased opinion against any US product is evident. There are regulations in the USA (and not in other Asian countries) that companies have to follow. This is not a case of an untested product; it is a case of an unknown problem, and Boeing is really in the dark about what is going on...

Sam Europe March 11

Boeing and the regulators continue to exhibit criminal behaviour in this case. Ethical responsibility demands that when the first brand-new MAX 8 fell, for potential issues with its design, the fleet should have been grounded. Instead, money was the priority; and unfortunately it still is. They are even now flying. Disgraceful and criminal behaviour.

Imperato NYC March 11

@Sam no...too soon to come anywhere near that conclusion.

YW New York, NY March 11

A terrible tragedy for Ethiopia and all of the families affected by this disaster. The fact that two 737 Max jets have crashed in one year is indeed suspicious, especially as it has long been safer to travel in a Boeing plane than a car or school bus. That said, it is way too early to speculate on the causes of the two crashes being identical. Eyewitness accounts of debris coming off the plane in mid-air, as has been widely reported, would not seem to square with the idea that software is again at fault. Let's hope this puzzle can be solved quickly.

Wayne Brooklyn, New York March 11

@Singh the difference is that consumer electronic products usually have a smaller number of components and less wiring compared to commercial aircraft, with miles of wiring, a multitude of sensors, and thousands of components. From what I know, a preliminary report usually comes out in a short time, but the detailed report that takes the full analysis into account will take over a year to be written.

John A San Diego March 11

The engineers and management at Boeing need a crash course in ethics. After the crash in Indonesia, Boeing was trying to pass the blame rather than admit responsibility. The planes should all have been grounded then. Now the chickens have come home to roost. Boeing is in serious trouble, and it will take a long time to recover its reputation. Large multinationals never learn.

Imperato NYC March 11

@John A the previous pilot flying the Lion Air jet faced the same problem but dealt with it successfully. The pilot on the ill-fated flight was less experienced and unfortunately failed.

BSmith San Francisco March 11

@Imperato Solving a repeat problem on an airplane type must not depend solely upon a pilot undertaking an emergency response! That is nonsense even to a non-pilot! This implies that Boeing allows a plane to keep flying which it knows has a fatal flaw! Shouldn't it be grounding all these planes until it identifies and solves the problem?

Jimi DC March 11

NYT recently did an excellent job explaining how pilots were kept in the dark, by Boeing, during software update for 737 Max: https://www.nytimes.com/2019/02/03/world/asia/lion-air-plane-crash-pilots.html#click=https://t.co/MRgpKKhsly

Steve Charlotte, NC March 11

Something is wrong with those two graphs of altitude and vertical speed. For example, both are flat at the end, even though the vertical speed graph indicates that the plane was climbing rapidly. So what is the source of those numbers? Is it ground-based radar, or telemetry from onboard instruments? If the latter, it might be a clue to the problem.

Imperato NYC March 11

@Steve Addis Ababa is almost at 8000ft.

George North Carolina March 11

I wonder if, somewhere, there is a report from some engineers saying that the system, pushed by administrative types to get the plane on the market quickly, will result in serious problems down the line.

Rebecca Michigan March 11

If we don't know why the first two 737 Max Jets crashed, then we don't know how long it will be before another one has a catastrophic failure. All the planes need to be grounded until the problem can be duplicated and eliminated.

Shirley OK March 11

@Rebecca And if it is something about the plane itself - and maybe an interaction with the new software - then someone has to be ready to volunteer to die to replicate what's happened.....

Rebecca Michigan March 12

@Shirley Heavens no. When investigating failures, duplicating the problem helps develop the solution. If you can't recreate the problem, then there is nothing to solve. Duplicating the problem generally is done through analysis and simulations, not with actual planes and passengers.

Sisifo Carrboro, NC March 11

Computer geeks can be deadly. This is clearly a software problem. The more software goes into a plane, the more likely it is for a software failure to bring down a plane. And computer geeks are always happy to try "new things" not caring what the effects are in the real world. My PC has a feature that controls what gets typed depending on the speed and repetitiveness of what I type. The darn thing is a constant source of annoyance as I sit at my desk, and there is absolutely no way to neutralize it because a computer geek so decided. Up in an airliner cockpit, this same software idiocy is killing people like flies.

Pooja MA March 11

@Sisifo Software that goes into critical systems like aircraft has a lot more constraints. Comparing it to the user interface on your PC doesn't make any sense. It's insulting to assume programmers are happy to "try new things" at the expense of lives. If you'd read about the Lion Air crash carefully, you'd remember that there were faulty sensors involved. The software was doing what it was designed to do, but the input it was getting was incorrect. I accept that it should be easier for pilots to assume manual control of the aircraft in such situations, but I wouldn't rush to condemn the programmers before we get all the facts.

BSmith San Francisco March 11

@Pooja Mistakes happen. If humans on board can't respond to terrible situations then there is something wrong with the aircraft and its computer systems. By definition.

Patriot NJ March 11

Airbus had its own experiences with pilot "mode confusion" in the 1990's with at least 3 fatal crashes in the A320, but was able to control the media narrative until they resolved the automation issues. Look up Air Inter 148 in Wikipedia to learn the similarities.

Opinioned! NYC -- currently wintering in the Pacific March 11

"Commands issued by the plane's flight control computer that bypasses the pilots." What could possibly go wrong? Now let's see whether Boeing's spin doctors can sell this as a feature, not a bug.

Chris Hartnett Minneapolis March 11

It is telling that the Chinese government grounded their fleet of 737 Max 8 aircraft before the US government. The world truly has turned upside down when it potentially is safer to fly in China than the US. Oh, the times we live in. Chris Hartnett Datchet, UK (formerly Minneapolis)

Hollis Barcelona March 11

As a passenger who likes his captains with a head full of white hair: even if the plane is nosediving due to instrument failure, does not every pilot who buckles a seat belt worldwide know how to switch off automatic flight controls and fly the airplane manually?

Even if this were 1000% Boeing's fault, pilots should be able to override the electronics and fly the plane safely back to the airport. I'm sure it's not that black and white in the air, and I know it's speculation at this point, but can any pilots add perspective regarding human responsibility?

Karl Rollings Sydney, Australia March 11

@Hollis I'm not a pilot nor an expert, but my understanding is that planes these days are "fly by wire", meaning the control surfaces are operated electronically, with no mechanical connection between the pilot's stick and the wings. So if the computer goes down, the ability to control the plane goes with it.

William Philadelphia March 11

@Hollis The NYT's excellent reporting on the Lion Air crash indicated that in nearly all other commercial aircraft, manual control of the pilot's yoke would be sufficient to override the malfunctioning system (which was controlling the tail wings in response to erroneous sensor data). Your white haired captain's years of training would have ingrained that impulse.

Unfortunately, on the Max 8 that would not sufficiently override the tail wings until the pilots flicked a switch near the bottom of the yoke. It's unclear whether individual airlines made pilots aware of this. That procedure existed in older planes but may not have been standard practice because the yoke WOULD sufficiently override the tail wings. Boeing's position has been that had pilots followed the procedure, a crash would not have occurred.

Nat Netherlands March 11

@Hollis No, that is the entire crux of this problem; switching from auto-pilot to manual does NOT solve it. Hence the danger of this whole system.

This new Boeing 737 MAX series has the engines placed a bit further forward than before, and I don't know why they did this, but the result is that there can be some imbalance in the air, which they then tried to correct with this strange auto-pilot technical adjustment.

The problem is that it reacts as if the plane were stalling (pushing its nose down, and even deploying small surfaces sometimes) even when it shouldn't, and even when the pilots switch to manual this system OVERRULES them and switches back to auto-pilot, continuing to try to 'stabilize' (nose-dive) the plane. That's what makes it so dangerous.

It was designed to keep the plane stable but basically turned out to function more or less like a glitch once you are taking off and need to ascend. I don't know why it only happens now and then, as this plane had made many other take-offs before, but when it hits, it can be deadly. So far Boeing's 'solution' has been sparsely sending out a HUGE manual for pilots about how to fight this computer problem.

That is complicated to follow in a situation of stress, with the plane's computer constantly pushing the nose down. The MAX's mechanism is wrong, and instead of it being corrected properly, pilots need special training. Or a new technical update may help... which has been delayed and still hasn't been provided.

Mark Lebow Milwaukee, WI March 11

Is it the inability of the two airlines to maintain one of the plane's fly-by-wire systems that is at fault, not the plane itself? Or are both crashes due to pilot error, not knowing how to operate the system and then overreacting when it engages? Is the aircraft merely too advanced for its own good? None of these questions seems to have been answered yet.

Shane Marin County, CA March 11 Times Pick

This is such a devastating thing for Ethiopian Airlines, which has been doing critical work in connecting Africa internally and to the world at large. This is devastating for the nation of Ethiopia and for all the family members of those killed. May the memory of every passenger be a blessing. We should all hope a thorough investigation provides answers to why this make and model of airplane keep crashing so no other people have to go through this horror again.

Mal T KS March 11

A possible small piece of a big puzzle: Bishoftu is a city of 170,000 that is home to the main Ethiopian air force base, which has a long runway. Perhaps the pilot of Flight 302 was seeking to land there rather than returning to Bole Airport in Addis Ababa, a much larger and more densely populated city than Bishoftu. The pilot apparently requested return to Bole, but may have sought the Bishoftu runway when he experienced further control problems. Detailed analysis of radar data, conversations between pilot and control tower, flight path, and other flight-related information will be needed to establish the cause(s) of this tragedy.

Nan Socolow West Palm Beach, FL March 11

The business of building and selling airplanes is brutally competitive. Malfunctions in systems of any kind on jet airplanes ("workhorses" for moving vast quantities of people around the earth) lead to disaster and loss of life. Boeing's much-ballyhooed and vaunted 737 MAX 8 jets must be grounded until whatever computer glitches brought down the Ethiopian Air and Lion Air planes – with hundreds of passenger deaths – are explained and fixed.

In 1946, Arthur Miller's play, "All My Sons", brought to life guilt by the airplane industry leading to deaths of WWII pilots in planes with defective parts. Arthur Miller was brought before the House UnAmerican Activities Committee because of his criticism of the American Dream. His other seminal American play, "Death of a Salesman", was about an everyman to whom attention must be paid. Attention must be paid to our aircraft industry. The American dream must be repaired.

Rachel Brooklyn, NY March 11

This story makes me very afraid of driverless cars.

Chuck W. Seattle, WA March 11

Meanwhile, human drivers killed 40,000 and injured 4.5 million people in 2018... For comparison, 58,200 American troops died in the entire Vietnam war. Computers do not fall asleep, get drunk, drive angry, or get distracted. As far as I am concerned, we cannot get unreliable humans out from behind the wheel fast enough.

jcgrim Knoxville, TN March 11

@Chuck W. Humans write the algorithms of driverless cars. Algorithms are not 100% fail-safe. Particularly when humans can't seem to write snap judgements or quick inferences into an algorithm. An algorithm can make driverless cars safe in predictable situations but that doesn't mean driveless cars will work in unpredictable events. Also, I don't trust the hype from Uber or the tech industry. https://www.nytimes.com/2017/02/24/technology/anthony-levandowski-waymo-uber-google-lawsuit.html?mtrref=t.co&gwh=D6880521C2C06930788921147F4506C8&gwt=pay

John NYC March 11

The irony here seems to be that in attempting to make the aircraft as safe as possible (with systems updates and such) Boeing may very well have made their product less safe. Since the crashes, to date, have been limited to the one product that product should be grounded until a viable determination has been made. John~ American Net'Zen

cosmos Washington March 11

Knowing quite a few Boeing employees and retirees, people who have shared numerous stories of concerns about Boeing operations – I personally avoid flying. As for the assertion "The business of building and selling jets is brutally competitive": it is monopolistic competition, as there are only two players. That means consumers (in this case airlines) do not end up with the best and widest array of airplanes. The more monopolistic a market, the more it needs to be regulated in the public interest – yet I seriously doubt the FAA or any governmental agency has peeked into all the cost-cutting measures Boeing has implemented in recent years.

drdeanster tinseltown March 11

@cosmos Patently ridiculous. Your odds are greater of dying from a lightning strike, or in a car accident. Or even from food poisoning. Do you avoid driving? Eating? Something about these major disasters makes people itching to abandon all sense of probability and statistics.

Bob Milan March 11

When the past year was the deadliest one in decades, and when there are two disasters involving the same plane within that year, how can anyone not draw the inference that there is something wrong with the plane? In statistical studies of a pattern, this is a very, very strong basis for reasoning that something is wrong with the plane. When the numbers involve human lives, we must take the possibility of design flaws very seriously. The MAX planes should all be grounded for now. Period.

mak pakistan March 11

@Bob Couldn't agree more – however, the basic design and engineering of the 737 has proven dependable over the past ~6 decades. Not saying that there haven't been accidents, but these probably lie well within the industry/type averages. The problems seem to have arisen with the introduction of systems which have purportedly been introduced to take part of the workload off the pilots and pass it to a central computerised system.

Maybe the 'automated anti-stalling' programme installed in the 737 MAX, due to some erroneous inputs from the sensors, provided inaccurate data to the flight-management controls, leading to the stalling of the aircraft. It seems that the manufacturer did not provide sufficient technical data about the upgraded software, and, in case of malfunction, the corrective procedures to be followed to mitigate such disasters, before delivery of the planes to customers.

The procedure for the pilot to take full control of the aircraft by disengaging the central computer should be simple and fast to execute. Please, we don't want Tesla driverless vehicles high up in the sky!

James Conner Northwestern Montana March 11

All we know at the moment is that a 737 Max crashed in Africa a few minutes after taking off from a high elevation airport. Some see similarities with the crash of Lion Air's 737 Max last fall -- but drawing a line between the only two dots that exist does not begin to present a useful picture of the situation.

Human nature seeks an explanation for an event, and may lead some to make assumptions that are without merit in order to provide closure. That tendency is why following a dramatic event, when facts are few, and the few that exist may be misleading, there is so much cocksure speculation masquerading as solid, reasoned, analysis. At this point, it's best to keep an open mind and resist connecting dots.

Peter Sweden March 11

@James Conner Two deadly crashes shortly after the introduction of a new airplane has no precedent in recent aviation history. And the last time it happened (with the Comet), it was due to a faulty aircraft design. There is, of course, some chance that there is no connection between the two accidents, but if there is, the consequences are huge. Especially because the two events happened in a very similar fashion (right after takeoff, with wild altitude changes), so there are more similarities than just the type of plane. So there is literally no reason to keep this model in the air until the investigation is concluded. Oh well, there is: money. Over human lives.

svenbi NY March 11

It might be a wrong analogy, but if Toyota/Lexus recall over 1.5 million vehicles due to at least 20 fatalities related to potentially faulty airbags, Boeing should – after over 300 deaths in just about 6 months – pull their product off the market voluntarily until it is sorted out once and for all.

This tragic situation recalls the early days of the de Havilland Comet, operated by BOAC, which kept plunging from the skies within its first years of operation until the fault was found to be in the rectangular windows, which did not withstand the pressure at its jet speeds; the subsequent cracks in the body ripped the planes apart in midflight.

Thore Eilertsen Oslo March 11

A third crash may have the potential to take the aircraft manufacturer out of business; it is therefore unbelievable that the reasons for the Lion Air crash haven't been properly established yet. With more than 100 Boeing 737 MAX aircraft already grounded, I would expect crash investigations now to be severely fast-tracked.

And the entire fleet should be grounded on the principle of "better safe than sorry". But then again, that would cost Boeing money, suggesting that the company's assessment of the risks involved favours continued operations above the absolute safety of passengers.

Londoner London March 11

@Thore Eilertsen This is also not a case for a secretive and extended crash investigation process. As soon as the cockpit voice recording is extracted - which might be later today - it should be made public. We also need to hear the communications between the controllers and the aircraft and to know about the position regarding the special training the pilots received after the Lion Air crash.

Trevor Canada March 11

@Thore Eilertsen I would imagine that Boeing will be the first to propose grounding these planes if they believe with a high degree of probability that it's their issue. They have the most to lose. Let logic and patience prevail.

Marvin McConoughey oregon March 11

It is very clear, even in these early moments, that aircraft makers need far more comprehensive information on everything pertinent going on in cockpits when pilots encounter problems. That information should be continually transmitted to ground facilities in real time to permit possible ground technical support.

[Feb 11, 2019] 6 most prevalent problems in the software development world

Dec 01, 2018 | www.catswhocode.com

November 20, 2018

[Dec 27, 2018] The Yoda of Silicon Valley by Siobhan Roberts

Highly recommended!
Although he is certainly a giant, Knuth will never be able to complete this monograph: the technology developed too quickly. Three volumes came out between 1968 and 1973, and then there was a lull. On January 10 he will be 81. At this age it is difficult to work in the field of mathematics and systems programming. So we will probably never see the complete fourth volume.
This inability to finish the work to which he devoted a large part of his life is definitely a tragedy. The key problem here is that it is now simply impossible for one person to cover the whole area of systems programming and related algorithms. But the first three volumes certainly played a tremendous positive role.
Also, he was distracted for several years by the creation of TeX. He needed to create a non-profit and complete that work by attracting the best minds from outside. But he is by nature a loner, as many great scientists are, and prefers to work this way.
His other mistake was that MIX, his hypothetical machine, was too far from the IBM S/360, which became the de facto standard in the mid-'60s. He later realized that this was a blunder and replaced MIX with the more modern MMIX, but it was "too little, too late", and it took time and effort. So the first three volumes and fragments of the fourth are all that we have now, and probably forever.
Not all volumes fared equally well with time. The third volume suffered most, IMHO, and as of 2019 is partially obsolete. It was also written in some haste, and some parts of it are far from clearly written (it was based on earlier lectures of Floyd), so it was oriented toward single-CPU computers only. Now that multiprocessor machines, huge amounts of RAM, and SSD hard drives are the norm, the situation is very different from the late '60s. It requires different sorting algorithms (the importance of mergesort increased, the importance of quicksort decreased). He also got too carried away with sorting random numbers and establishing upper bounds and average run times. Real data is almost never random and typically contains sorted fragments. For example, he overestimated the importance of quicksort and thus pushed the discipline in the wrong direction.
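The observation about presorted fragments is the idea behind natural mergesort, which Timsort (the built-in sort in Python and Java) later refined. A minimal illustrative sketch, not a tuned implementation:

    # Find the presorted runs real data usually contains, then merge them
    # pairwise; an already-sorted input costs only a single scan.
    from heapq import merge

    def find_runs(data):
        """Split data into maximal non-decreasing runs."""
        runs, start = [], 0
        for i in range(1, len(data)):
            if data[i] < data[i - 1]:
                runs.append(data[start:i])
                start = i
        runs.append(data[start:])
        return runs

    def natural_mergesort(data):
        runs = find_runs(list(data))
        while len(runs) > 1:
            paired = []
            for i in range(0, len(runs), 2):
                right = runs[i + 1] if i + 1 < len(runs) else []
                paired.append(list(merge(runs[i], right)))
            runs = paired
        return runs[0]

    print(natural_mergesort([3, 4, 9, 1, 2, 8, 5]))  # [1, 2, 3, 4, 5, 8, 9]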
Notable quotes:
"... These days, it is 'coding', which is more like 'code-spraying'. Throw code at a problem until it kind of works, then fix the bugs in the post-release, or the next update. ..."
"... AI is a joke. None of the current 'AI' actually is. It is just another new buzz-word to throw around to people that do not understand it at all. ..."
"... One good teacher makes all the difference in life. More than one is a rare blessing. ..."
Dec 17, 2018 | www.nytimes.com

With more than one million copies in print, "The Art of Computer Programming " is the Bible of its field. "Like an actual bible, it is long and comprehensive; no other book is as comprehensive," said Peter Norvig, a director of research at Google. After 652 pages, volume one closes with a blurb on the back cover from Bill Gates: "You should definitely send me a résumé if you can read the whole thing."

The volume opens with an excerpt from " McCall's Cookbook ":

Here is your book, the one your thousands of letters have asked us to publish. It has taken us years to do, checking and rechecking countless recipes to bring you only the best, only the interesting, only the perfect.

Inside are algorithms, the recipes that feed the digital age -- although, as Dr. Knuth likes to point out, algorithms can also be found on Babylonian tablets from 3,800 years ago. He is an esteemed algorithmist; his name is attached to some of the field's most important specimens, such as the Knuth-Morris-Pratt string-searching algorithm. Devised in 1970, it finds all occurrences of a given word or pattern of letters in a text -- for instance, when you hit Command+F to search for a keyword in a document.
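The heart of Knuth-Morris-Pratt is a precomputed "failure" table that tells the search how far it can safely shift the pattern after a mismatch, so no text character is ever re-examined. A compact Python sketch of the idea (illustrative, not Knuth's own presentation):

def kmp_search(text, pattern):
    """Return the start indices of all occurrences of pattern in text."""
    if not pattern:
        return []
    # fail[i] = length of the longest proper prefix of pattern[:i+1]
    # that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text once, reusing the table after each mismatch.
    hits, k = [], 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = fail[k - 1]
    return hits

print(kmp_search("abracadabra", "abra"))  # [0, 7]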

... ... ...

During summer vacations, Dr. Knuth made more money than professors earned in a year by writing compilers. A compiler is like a translator, converting a high-level programming language (resembling algebra) to a lower-level one (sometimes arcane binary) and, ideally, improving it in the process. In computer science, "optimization" is truly an art, and this is articulated in another Knuthian proverb: "Premature optimization is the root of all evil."

Eventually Dr. Knuth became a compiler himself, inadvertently founding a new field that he came to call the "analysis of algorithms." A publisher hired him to write a book about compilers, but it evolved into a book collecting everything he knew about how to write for computers -- a book about algorithms.

... ... ...

When Dr. Knuth started out, he intended to write a single work. Soon after, computer science underwent its Big Bang, so he reimagined and recast the project in seven volumes. Now he metes out sub-volumes, called fascicles. The next installment, "Volume 4, Fascicle 5," covering, among other things, "backtracking" and "dancing links," was meant to be published in time for Christmas. It is delayed until next April because he keeps finding more and more irresistible problems that he wants to present.

In order to optimize his chances of getting to the end, Dr. Knuth has long guarded his time. He retired at 55, restricted his public engagements and quit email (officially, at least). Andrei Broder recalled that time management was his professor's defining characteristic even in the early 1980s.

Dr. Knuth typically held student appointments on Friday mornings, until he started spending his nights in the lab of John McCarthy, a founder of artificial intelligence, to get access to the computers when they were free. Horrified by what his beloved book looked like on the page with the advent of digital publishing, Dr. Knuth had gone on a mission to create the TeX computer typesetting system, which remains the gold standard for all forms of scientific communication and publication. Some consider it Dr. Knuth's greatest contribution to the world, and the greatest contribution to typography since Gutenberg.

This decade-long detour took place back in the age when computers were shared among users and ran faster at night while most humans slept. So Dr. Knuth switched day into night, shifted his schedule by 12 hours and mapped his student appointments to Fridays from 8 p.m. to midnight. Dr. Broder recalled, "When I told my girlfriend that we can't do anything Friday night because Friday night at 10 I have to meet with my adviser, she thought, 'This is something that is so stupid it must be true.'"

... ... ...

Lucky, then, that Dr. Knuth keeps at it. He figures it will take another 25 years to finish "The Art of Computer Programming," although that time frame has been a constant since about 1980. Might the algorithm-writing algorithms get their own chapter, or maybe a page in the epilogue? "Definitely not," said Dr. Knuth.

"I am worried that algorithms are getting too prominent in the world," he added. "It started out that computer scientists were worried nobody was listening to us. Now I'm worried that too many people are listening."


Scott Kim Burlingame, CA Dec. 18

Thanks Siobhan for your vivid portrait of my friend and mentor. When I came to Stanford as an undergrad in 1973 I asked who in the math dept was interested in puzzles. They pointed me to the computer science dept, where I met Knuth and we hit it off immediately. Not only a great thinker and writer, but as you so well described, always present and warm in person.

He was also one of the best teachers I've ever had -- clear, funny, and interested in every student (his elegant policy was that each student could only speak twice in class during a period, to give everyone a chance to participate, and he made a point of remembering everyone's names).

Some thoughts from Knuth I carry with me: finding the right name for a project is half the work (not literally true, but he labored hard on finding the right names for TeX, Metafont, etc.); always do your best work; half of why the field of computer science exists is that it is a way for mathematically minded people who like to build things to meet each other; and the observation that when the computer science dept began at Stanford, one of the standard interview questions was "what instrument do you play" -- there was a deep connection between music and computer science, and indeed the dept had multiple string quartets. But in recent decades that has changed entirely. If you do a book on Knuth (he deserves it), please be in touch.

IMiss America US Dec. 18

I remember when programming was art. I remember when programming was programming. These days, it is 'coding', which is more like 'code-spraying'. Throw code at a problem until it kind of works, then fix the bugs in the post-release, or the next update.

AI is a joke. None of the current 'AI' actually is. It is just another new buzz-word to throw around to people that do not understand it at all. We should be in a golden age of computing. Instead, we are cutting all corners to get something out as fast as possible. The technology exists to do far more. It is the human element that fails us.

Ronald Aaronson Armonk, NY Dec. 18

My particular field of interest has always been compiler writing, and I have long been awaiting Knuth's volume on that subject. I would just like to point out that among Knuth's many accomplishments is the invention of LR parsers, which are widely used for writing programming language compilers.

Edward Snowden Russia Dec. 18

Yes, \TeX, and its derivative, \LaTeX{}, contributed greatly to being able to create elegant documents. It is also available for the web in the form of MathJax, and it's about time the New York Times supported MathJax. Many times I want one of my New York Times comments to include math, but there's no way to do so! It comes up equivalent to: $e^{i\pi}+1$.

henry pick new york Dec. 18

I read it at the time, because what I really wanted to read was volume 7, Compilers. As I understood it at the time, Professor Knuth wrote it in order to make enough money to build an organ. That apparently happened by Volume 3, Sorting and Searching. The most impressive part is the mathematics in Volume 2, Seminumerical Algorithms. A lot of those problems are research projects over the literature of the last 400 years of mathematics.

Steve Singer Chicago Dec. 18

I own the three volume "Art of Computer Programming", the hardbound boxed set. Luxurious. I don't look at it very often thanks to time constraints, given my workload. But your article motivated me to at least pick it up and carry it from my reserve library to a spot closer to my main desk so I can at least grab Volume 1 and try to read some of it when the mood strikes. I had forgotten just how heavy it is, intellectual content aside. It must weigh more than 25 pounds.

Terry Hayes Los Altos, CA Dec. 18

I too used my copies of The Art of Computer Programming to guide me in several projects in my career, across a variety of topic areas. Now that I'm living in Silicon Valley, I enjoy seeing Knuth at events at the Computer History Museum (where he was a 1998 Fellow Award winner), and at Stanford. Another facet of his teaching is the annual Christmas Lecture, in which he presents something of recent (or not-so-recent) interest. The 2018 lecture is available online - https://www.youtube.com/watch?v=_cR9zDlvP88

Chris Tong Kelseyville, California Dec. 17

One of the most special treats for first year Ph.D. students in the Stanford University Computer Science Department was to take the Computer Problem-Solving class with Don Knuth. It was small and intimate, and we sat around a table for our meetings. Knuth started the semester by giving us an extremely challenging, previously unsolved problem. We then formed teams of 2 or 3. Each week, each team would report progress (or lack thereof), and Knuth, in the most supportive way, would assess our problem-solving approach and make suggestions for how to improve it. To have a master thinker giving one feedback on how to think better was a rare and extraordinary experience, from which I am still benefiting! Knuth ended the semester (after we had all solved the problem) by having us over to his house for food, drink, and tales from his life. . . And for those like me with a musical interest, he let us play the magnificent pipe organ that was at the center of his music room. Thank you Professor Knuth, for giving me one of the most profound educational experiences I've ever had, with such encouragement and humor!

Been there Boulder, Colorado Dec. 17

I learned about Dr. Knuth as a graduate student in the early 70s from one of my professors and made the financial sacrifice (graduate student assistantships were not lucrative) to buy the first and then the second volume of the Art of Computer Programming. Later, at Bell Labs, when I was a bit richer, I bought the third volume. I have those books still and have used them for reference for years. Thank you Dr, Knuth. Art, indeed!

Gianni New York Dec. 18

@Trerra In the good old days, before Computer Science, anyone could take the Programming Aptitude Test. Pass it and companies would train you. Although there were many mathematicians and scientists, some of the best programmers turned out to be music majors. English, Social Sciences, and History majors were represented as well as scientists and mathematicians. It was a wonderful atmosphere to work in . When I started to look for a job as a programmer, I took Prudential Life Insurance's version of the Aptitude Test. After the test, the interviewer was all bent out of shape because my verbal score was higher than my math score; I was a physics major. Luckily they didn't hire me and I got a job with IBM.

M Martínez Miami Dec. 17

In summary, "May the force be with you" means: Did you read Donald Knuth's "The Art of Computer Programming"? Excellent, we loved this article. We will share it with many young developers we know.

mds USA Dec. 17

Dr. Knuth is a great computer scientist. Around 25 years ago, I met Dr. Knuth at a small gathering a day before he was awarded an honorary doctorate at a university. This is my approximate recollection of a conversation. I said, "Dr. Knuth, you have dedicated your book to a computer (one with which he had spent a lot of time, perhaps a predecessor to the PDP-11). Isn't it unusual?" He said, "Well, I love my wife as much as anyone." He then turned to his wife and said, "Don't you think so?" It would be nice if scientists with the gift of such great minds tried to address some problems of ordinary people, e.g. a model of economy where everyone can get a job and health insurance, say, like Dr. Paul Krugman.

Nadine NYC Dec. 17

I was in a training program for women in computer systems at CUNY graduate center, and they used his obtuse book. It was one of the reasons I dropped out. He used a fantasy language to describe his algorithms in his book that one could not test on computers. I already had work experience as a programmer with algorithms and I know how valuable real languages are. I might as well have read Animal Farm. It might have been different if he was the instructor.

Doug McKenna Boulder Colorado Dec. 17

Don Knuth's work has been a curious thread weaving in and out of my life. I was first introduced to Knuth and his The Art of Computer Programming back in 1973, when I was tasked with understanding a section of the then-only-two-volume Book well enough to give a lecture explaining it to my college algorithms class. But when I first met him in 1981 at Stanford, he was all-in on thinking about typography and this new-fangled system of his called TeX. Skip a quarter century. One day in 2009, I foolishly decided kind of on a whim to rewrite TeX from scratch (in my copious spare time), as a simple C library, so that its typesetting algorithms could be put to use in other software such as eBooks with high-quality math typesetting and interactive pictures. I asked Knuth for advice. He warned me, prepare yourself, it's going to consume five years of your life. I didn't believe him, so I set off and tried anyway. As usual, he was right.

Baddy Khan San Francisco Dec. 17

I have a signed copy of "Fundamental Algorithms" in my library, which I treasure. Knuth was a fine teacher, and is truly a brilliant and inspiring individual. He taught during the same period as Vint Cerf, another wonderful teacher with a great sense of humor who is truly a "father of the internet". One good teacher makes all the difference in life. More than one is a rare blessing.

Indisk Fringe Dec. 17

I am a biologist, specifically a geneticist. I became interested in LaTeX typesetting early in my career and have been either called pompous or vilified by people at all levels for wanting to use it. One of my PhD advisors famously told me to forget LaTeX because it was a thing of the past. I have now forgotten him completely. I still use LaTeX almost every day in my work even though I don't generally typeset equations or algorithms. My students always get trained in using proper typesetting. Unfortunately, the publishing industry has largely given up on TeX. Very few journals in my field accept TeX manuscripts, and most of them convert to Word before feeding text to their publishing software. Whatever people might argue against TeX, the beauty and elegance of a properly typeset document is unparalleled. Long live LaTeX.

PaulSFO San Francisco Dec. 17

A few years ago Severo Ornstein (who, incidentally, did the hardware design for the first router, in 1969), and his wife Laura, hosted a concert in their home in the hills above Palo Alto. During a break a friend and I were chatting when a man came over and *asked* if he could chat with us (a high honor, indeed). His name was Don. After a few minutes I grew suspicious and asked "What's your last name?" Friendly, modest, brilliant; a nice addition to our little chat.

Tim Black Wilmington, NC Dec. 17

When I was a physics undergraduate (at Trinity in Hartford), I was hired to re-write professor's papers into TeX. Seeing the beauty of TeX, I wrote a program that re-wrote my lab reports (including graphs!) into TeX. My lab instructors were amazed! How did I do it? I never told them. But I just recognized that Knuth was a genius and rode his coat-tails, as I have continued to do for the last 30 years!

Jack512 Alexandria VA Dec. 17

A famous quote from Knuth: "Beware of bugs in the above code; I have only proved it correct, not tried it." Anyone who has ever programmed a computer will feel the truth of this in their bones.

[Dec 11, 2018] Software "upgrades" require workers to constantly relearn the same task because some young "genius" observed that a carefully thought out interface "looked tired" and glitzed it up.

Dec 11, 2018 | www.ianwelsh.net

S Brennan permalink April 24, 2016

My grandfather, in the early '60s, could board a 707 in New York and arrive in LA in far less time than I can today. And no, I am not counting 4-hour layovers with the long waits to be "screened"; the jets were 50-70 knots faster. Back then your time was worth more; today, less.

Not counting longer hours AT WORK, we spend far more time commuting, making for much longer work days. Back then your time was worth more; today, less!

Software "upgrades" require workers to constantly relearn the same task because some young "genius" observed that a carefully thought out interface "looked tired" and glitzed it up. Think about the almost perfect Google Maps driver interface being redesigned by people who take private buses to work. Way back in the '90's your time was worth more than today!

Life is all the "time" YOU will ever have and if we let the elite do so, they will suck every bit of it out of you.

[Nov 05, 2018] Revisiting the Unix philosophy in 2018 Opensource.com by Michael Hausenblas

Nov 05, 2018 | opensource.com

Revisiting the Unix philosophy in 2018

The old strategy of building small, focused applications is new again in the modern microservices environment.

In 1984, Rob Pike and Brian Kernighan published "Program Design in the Unix Environment" in the AT&T Bell Laboratories Technical Journal, in which they argued for the Unix philosophy, using the example of BSD's cat -v implementation. In a nutshell that philosophy is: Build small, focused programs -- in whatever language -- that do only one thing but do this thing well, communicate via stdin / stdout , and are connected through pipes.

Sound familiar?

Yeah, I thought so. That's pretty much the definition of microservices offered by James Lewis and Martin Fowler:

In short, the microservice architectural style is an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API.

While one *nix program or one microservice may be very limited or not even very interesting on its own, it's the combination of such independently working units that reveals their true benefit and, therefore, their power.

*nix vs. microservices

The following table compares programs (such as cat or lsof ) in a *nix environment against programs in a microservices environment.

                                   *nix                                   Microservices
Unit of execution                  program using stdin/stdout             service with HTTP or gRPC API
Data flow                          pipes                                  ?
Configuration & parameterization   command-line arguments, environment    JSON/YAML docs
                                   variables, config files
Discovery                          package manager, man, make             DNS, environment variables, OpenAPI

Let's explore each line in slightly greater detail.

Unit of execution


The unit of execution in *nix (such as Linux) is an executable file (binary or interpreted script) that, ideally, reads input from stdin and writes output to stdout . A microservices setup deals with a service that exposes one or more communication interfaces, such as HTTP or gRPC APIs. In both cases, you'll find stateless examples (essentially a purely functional behavior) and stateful examples, where, in addition to the input, some internal (persisted) state decides what happens.

Data flow

Traditionally, *nix programs could communicate via pipes. In other words, thanks to Doug McIlroy , you don't need to create temporary files to pass around and each can process virtually endless streams of data between processes. To my knowledge, there is nothing comparable to a pipe standardized in microservices, besides my little Apache Kafka-based experiment from 2017 .
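To get the flavor of the pipe model, here is a sketch of driving it from Python: three small programs, each doing one thing, chained via pipes with no temporary files (the commands are illustrative and assume a Unix-like system):

import subprocess

# Equivalent to the shell pipeline: ps aux | grep python | wc -l
ps = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE)
grep = subprocess.Popen(["grep", "python"], stdin=ps.stdout, stdout=subprocess.PIPE)
ps.stdout.close()    # let ps receive SIGPIPE if grep exits first
wc = subprocess.Popen(["wc", "-l"], stdin=grep.stdout, stdout=subprocess.PIPE)
grep.stdout.close()
out, _ = wc.communicate()
print(out.decode().strip())  # number of matching processes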

Configuration and parameterization

How do you configure a program or service -- either on a permanent or a by-call basis? Well, with *nix programs you essentially have three options: command-line arguments, environment variables, or full-blown config files. In microservices, you typically deal with YAML (or even worse, JSON) documents, defining the layout and configuration of a single microservice as well as dependencies and communication, storage, and runtime settings. Examples include Kubernetes resource definitions , Nomad job specifications , or Docker Compose files. These may or may not be parameterized; that is, either you have some templating language, such as Helm in Kubernetes, or you find yourself doing an awful lot of sed -i commands.
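To make the contrast concrete, a small Python sketch (PyYAML is assumed to be installed, and all names here are made up for illustration): the same port setting, expressed *nix-style via an environment variable and a command-line flag, and microservices-style via a YAML document:

import argparse
import os

import yaml  # PyYAML, assumed installed: pip install pyyaml

# *nix style: an environment variable supplies the default,
# a command-line flag can override it.
parser = argparse.ArgumentParser()
parser.add_argument("--port", type=int,
                    default=int(os.environ.get("PORT", "8080")))
args = parser.parse_args()

# Microservices style: the same setting inside a YAML document
# (loosely modeled on a Kubernetes-like resource, purely illustrative).
doc = yaml.safe_load("""
service:
  name: example-service
  port: 8080
""")

print("from flags/env:", args.port)
print("from YAML doc:", doc["service"]["port"])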

Discovery

How do you know what programs or services are available and how they are supposed to be used? Well, in *nix, you typically have a package manager as well as good old man; between them, they should be able to answer all the questions you might have. In a microservices setup, there's a bit more automation in finding a service. In addition to bespoke approaches like Airbnb's SmartStack or Netflix's Eureka , there usually are environment variable-based or DNS-based approaches that allow you to discover services dynamically. Equally important, OpenAPI provides a de-facto standard for HTTP API documentation and design, and gRPC does the same for more tightly coupled high-performance cases. Last but not least, take developer experience (DX) into account, starting with writing good Makefiles and ending with writing your docs with (or in?) style .
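As a small illustration of the DNS-based flavor of discovery, a Python sketch (the cluster-style name in the comment is hypothetical; localhost keeps the snippet runnable anywhere):

import socket

def discover(host, port):
    """Resolve a service name to the set of addresses currently behind it."""
    infos = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

# In a cluster you might resolve something like
# "my-service.default.svc.cluster.local" (hypothetical name).
print(discover("localhost", 80))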

Pros and cons

Both *nix and microservices offer a number of challenges and opportunities.

Composability

It's hard to design something that has a clear, sharp focus and can also play well with others. It's even harder to get it right across different versions and to introduce respective error case handling capabilities. In microservices, this could mean retry logic and timeouts -- maybe it's a better option to outsource these features into a service mesh? It's hard, but if you get it right, its reusability can be enormous.

Observability

In a monolith (in 2018) or a big program that tries to do it all (in 1984), it's rather straightforward to find the culprit when things go south. But, in a

yes | tr \\n x | head -c 450m | grep n

or a request path in a microservices setup that involves, say, 20 services, how do you even start to figure out which one is behaving badly? Luckily we have standards, notably OpenCensus and OpenTracing . Observability still might be the biggest single blocker if you are looking to move to microservices.

Global state

While it may not be such a big issue for *nix programs, in microservices, global state remains something of a discussion. Namely, how to make sure the local (persistent) state is managed effectively and how to make the global state consistent with as little effort as possible.

Wrapping up

In the end, the question remains: Are you using the right tool for a given task? That is, in the same way a specialized *nix program implementing a range of functions might be the better choice for certain use cases or phases, it might be that a monolith is the best option for your organization or workload. Regardless, I hope this article helps you see the many, strong parallels between the Unix philosophy and microservices -- maybe we can learn something from the former to benefit the latter.

Michael Hausenblas is a Developer Advocate for Kubernetes and OpenShift at Red Hat where he helps appops to build and operate apps. His background is in large-scale data processing and container orchestration and he's experienced in advocacy and standardization at W3C and IETF. Before Red Hat, Michael worked at Mesosphere, MapR and in two research institutions in Ireland and Austria. He contributes to open source software incl. Kubernetes, speaks at conferences and user groups, and shares good practices...

[Nov 05, 2018] The Linux Philosophy for SysAdmins And Everyone Who Wants To Be One eBook by David Both

Nov 05, 2018 | www.amazon.com

Elegance is one of those things that can be difficult to define. I know it when I see it, but putting what I see into a terse definition is a challenge. Using the Linux dict command, WordNet provides one definition of elegance as, "a quality of neatness and ingenious simplicity in the solution of a problem (especially in science or mathematics); 'the simplicity and elegance of his invention.'"

In the context of this book, I think that elegance is a state of beauty and simplicity in the design and working of both hardware and software. When a design is elegant, software and hardware work better and are more efficient. The user is aided by simple, efficient, and understandable tools.

Creating elegance in a technological environment is hard. It is also necessary. Elegant solutions produce elegant results and are easy to maintain and fix. Elegance does not happen by accident; you must work for it.

The quality of simplicity is a large part of technical elegance. So large, in fact that it deserves a chapter of its own, Chapter 18, "Find the Simplicity," but we do not ignore it here. This chapter discusses what it means for hardware and software to be elegant.

Hardware Elegance

Yes, hardware can be elegant -- even beautiful, pleasing to the eye. Hardware that is well designed is more reliable as well. Elegant hardware solutions improve reliability.

[Oct 27, 2018] One issue with Microsoft (not just Microsoft) is that their business model (not the benefit of the users) requires frequent changes in the systems, so bugs are introduced at the steady clip.

Oct 27, 2018 | www.moonofalabama.org

Piotr Berman, Oct 26, 2018 2:55:29 PM | 5

"Even Microsoft, the biggest software company in the world, recently screwed up..."

Isn't it rather logical that the larger a company is, the more screw-ups it can make? After all, Microsoft has armies of programmers to make those bugs.

Once I made up a joke that the best way to disable missile defense would be to have a rocket that can stop in mid-air, provoking the software to divide by zero and crash. One day I told that joke to a military officer, who told me that something like that actually happened, but it was in the Navy and it involved a test with a torpedo. Not only did the program for "torpedo defense" go down, but the system crashed too, and the engine of the ship stopped working as well. I also recall explanations that a new complex software system typically has all major bugs removed only after about a year of use; the occasion was the Internal Revenue Service changing hardware and software, leading to widely reported problems.

One issue with Microsoft (not just Microsoft) is that their business model (not the benefit of the users) requires frequent changes in the systems, so bugs are introduced at the steady clip. Of course, they do not make money on bugs per se, but on new features that in time make it impossible to use older versions of the software and hardware.

[Sep 21, 2018] 'It Just Seems That Nobody is Interested in Building Quality, Fast, Efficient, Lasting, Foundational Stuff Anymore'

Sep 21, 2018 | tech.slashdot.org

Nikita Prokopov, a software programmer and author of Fira Code, a popular programming font, AnyBar, a universal status indicator, and some open-source Clojure libraries, writes :

Remember times when an OS, apps and all your data fit on a floppy? Your desktop todo app is probably written in Electron and thus has userland driver for Xbox 360 controller in it, can render 3d graphics and play audio and take photos with your web camera. A simple text chat is notorious for its load speed and memory consumption. Yes, you really have to count Slack in as a resource-heavy application. I mean, chatroom and barebones text editor, those are supposed to be two of the less demanding apps in the whole world. Welcome to 2018.

At least it works, you might say. Well, bigger doesn't imply better. Bigger means someone has lost control. Bigger means we don't know what's going on. Bigger means complexity tax, performance tax, reliability tax. This is not the norm and should not become the norm . Overweight apps should mean a red flag. They should mean run away scared. 16Gb Android phone was perfectly fine 3 years ago. Today with Android 8.1 it's barely usable because each app has become at least twice as big for no apparent reason. There are no additional functions. They are not faster or more optimized. They don't look different. They just...grow?

iPhone 4s was released with iOS 5, but can barely run iOS 9. And it's not because iOS 9 is that much superior -- it's basically the same. But their new hardware is faster, so they made software slower. Don't worry -- you got exciting new capabilities like...running the same apps with the same speed! I dunno. [...] Nobody understands anything at this point. Neither they want to. We just throw barely baked shit out there, hope for the best and call it "startup wisdom." Web pages ask you to refresh if anything goes wrong. Who has time to figure out what happened? Any web app produces a constant stream of "random" JS errors in the wild, even on compatible browsers.

[...] It just seems that nobody is interested in building quality, fast, efficient, lasting, foundational stuff anymore. Even when efficient solutions have been known for ages, we still struggle with the same problems: package management, build systems, compilers, language design, IDEs. Build systems are inherently unreliable and periodically require a full clean, even though all the info for invalidation is there. Nothing stops us from making the build process reliable, predictable and 100% reproducible. Just nobody thinks it's important. NPM has stayed in a "sometimes works" state for years.


K. S. Kyosuke ( 729550 ) , Friday September 21, 2018 @11:32AM ( #57354556 )

Re:Why should they? ( Score: 4 , Insightful)

Less resource use to accomplish the required tasks? Both in manufacturing (more chips from the same amount of manufacturing input) and in operation (less power used)?

K. S. Kyosuke ( 729550 ) writes: on Friday September 21, 2018 @11:58AM ( #57354754 )
Re:Why should they? ( Score: 2 )

Ehm... so, for example, using smaller cars with better mileage to commute isn't more environmentally friendly either, according to you?

DontBeAMoran ( 4843879 ) writes: on Friday September 21, 2018 @12:04PM ( #57354826 )
Re:Why should they? ( Score: 2 )

iPhone 4S used to be the best and could run all the applications.

Today, the same power is not sufficient because of software bloat. So you could say that all the iPhones since the iPhone 4S are devices that were created and then dumped for no reason.

It doesn't matter since we can't change the past and it doesn't matter much since improvements are slowing down so people are changing their phones less often.

Mark of the North ( 19760 ) , Friday September 21, 2018 @01:02PM ( #57355296 )
Re:Why should they? ( Score: 5 , Interesting)

Can you really not see the connection between inefficient software and environmental harm? All those computers running code that uses four times as much data, and four times the number crunching, as is reasonable? That excess RAM and storage has to be built as well as powered along with the CPU. Those material and electrical resources have to come from somewhere.

But the calculus changes completely when the software manufacturer hosts the software (or pays for the hosting) for their customers. Our projected AWS bill motivated our management to let me write the sort of efficient code I've been trained to write. After two years of maintaining some pretty horrible legacy code, it is a welcome change.

The big players care a great deal about efficiency when they can't outsource inefficiency to the user's computing resources.

eth1 ( 94901 ) , Friday September 21, 2018 @11:45AM ( #57354656 )
Re:Why should they? ( Score: 5 , Informative)
We've been trained to be a consuming society of disposable goods. The latest and greatest feature will always be more important than something that is reliable and durable for the long haul.

It's not just consumer stuff.

The network team I'm a part of has been dealing with more and more frequent outages, 90% of which are due to bugs in software running our devices. These aren't fly-by-night vendors either, they're the "no one ever got fired for buying X" ones like Cisco, F5, Palo Alto, EMC, etc.

10 years ago, outages were 10% bugs, and 90% human error, now it seems to be the other way around. Everyone's chasing features, because that's what sells, so there's no time for efficiency/stability/security any more.

LucasBC ( 1138637 ) , Friday September 21, 2018 @12:05PM ( #57354836 )
Re:Why should they? ( Score: 3 , Interesting)

Poor software engineering means that very capable computers are no longer capable of running modern, unnecessarily bloated software. This, in turn, leads to people having to replace computers that are otherwise working well, solely to keep up with software that requires more and more system resources for no tangible benefit. In a nutshell -- sloppy, lazy programming leads to more technology waste. That impacts the environment.

I have a unique perspective on this topic. I do web development for a company that does electronics recycling. I have suffered the continued bloat in the software tools I use (most egregiously, Adobe), and I see the impact of technological waste in the increasing amount of electronics recycling that is occurring. Ironically, I'm working at home today because my computer at the office kept stalling every time I had Photoshop and Illustrator open at the same time. A few years ago that wasn't a problem.

arglebargle_xiv ( 2212710 ) writes:
Re: ( Score: 3 )

There is one place where people still produce stuff like the OP wants, and that's embedded. Not IoT wank, but real embedded, running on CPUs clocked at tens of MHz with RAM in two-digit kilobyte (not megabyte or gigabyte) quantities. And a lot of that stuff is written to very exacting standards, particularly where something like realtime control and/or safety is involved.

The one problem in this area is the endless battle with standards morons who begin each standard with an implicit "assume an infinitely

commodore64_love ( 1445365 ) , Friday September 21, 2018 @03:58PM ( #57356680 ) Journal
Re:Why should they? ( Score: 3 )

> Poor software engineering means that very capable computers are no longer capable of running modern, unnecessarily bloated software.

Not just computers.

You can add Smart TVs, set-top internet boxes, Kindles, tablets, et cetera, which must be thrown away when they become too old (say 5 years) to run the latest bloatware. Software non-engineering is causing a lot of working hardware to be landfilled, and for no good reason.

[Sep 21, 2018] Fast, cheap (efficient) and reliable (robust, long lasting): pick 2

Sep 21, 2018 | tech.slashdot.org

JoeDuncan ( 874519 ) , Friday September 21, 2018 @12:58PM ( #57355276 )

Obligatory ( Score: 2 )

Fast, cheap (efficient) and reliable (robust, long lasting): pick 2.

roc97007 ( 608802 ) , Friday September 21, 2018 @12:16PM ( #57354946 ) Journal
Re:Bloat = growth ( Score: 2 )

There's probably some truth to that. And it's a sad commentary on the industry.

[Sep 21, 2018] Since Moore's law appears to have stalled at least five years ago, it will be interesting to see if we start to see algorithm research or code optimization techniques coming to the fore again.

Sep 21, 2018 | tech.slashdot.org

Anonymous Coward , Friday September 21, 2018 @11:26AM ( #57354512 )

Moore's law ( Score: 5 , Interesting)

When the speed of your processor doubles every two years, along with a concurrent doubling of RAM and disk space, you can get away with bloatware.

Since Moore's law appears to have stalled at least five years ago, it will be interesting to see if we start to see algorithm research or code optimization techniques coming to the fore again.

[Sep 16, 2018] After the iron curtain fell, there was a big demand for Russian-trained programmers because they could program in a very efficient and light manner that didn't demand too much of the hardware, if I remember correctly

Notable quotes:
"... It's a bit of chicken-and-egg problem, though. Russia, throughout 20th century, had problem with developing small, effective hardware, so their programmers learned how to code to take maximum advantage of what they had, with their technological deficiency in one field giving rise to superiority in another. ..."
"... Russian tech ppl should always be viewed with certain amount of awe and respect...although they are hardly good on everything. ..."
"... Soviet university training in "cybernetics" as it was called in the late 1980s involved two years of programming on blackboards before the students even touched an actual computer. ..."
"... I recall flowcharting entirely on paper before committing a program to punched cards. ..."
Aug 01, 2018 | turcopolier.typepad.com

Bill Herschel 2 days ago ,

Very, very slightly off-topic.

Much has been made, including in this post, of the excellent organization of Russian forces and Russian military technology.

I have been re-investigating an open-source relational database system known as PostgreSQL (or, variously, Postgres), and I remember finding perhaps a decade ago a very useful full-text search feature of this system, which I vaguely remember was written by a Russian and, for that reason, was mildly distrusted by me.

Come to find out that the principal developers and maintainers of PostgreSQL are Russian. OMG. Double OMG, because the reason I chose it in the first place is that it is the best non-proprietary RDBMS out there, and today it is supported on Google Cloud, AWS, etc.

The US has met an equal, or conceivably a superior; case closed. Trump's thoroughly odd behavior with Putin is just one, but a very obvious, example of this.

Of course, Trump's nationalistic blather is creating a "base" of people who believe in the godliness of the US. They are in for a very serious disappointment.

kao_hsien_chih Bill Herschel a day ago ,

After the iron curtain fell, there was a big demand for Russian-trained programmers because they could program in a very efficient and "light" manner that didn't demand too much of the hardware, if I remember correctly.

It's a bit of a chicken-and-egg problem, though. Russia, throughout the 20th century, had problems developing small, effective hardware, so their programmers learned how to code to take maximum advantage of what they had, their technological deficiency in one field giving rise to superiority in another.

Russia has plenty of very skilled, very well-trained folks, and their science and math education is, in a way, more fundamentally and soundly grounded in the foundational stuff than in the US (based on my personal interactions, anyway).

Russian tech ppl should always be viewed with a certain amount of awe and respect... although they are hardly good at everything.

TTG kao_hsien_chih a day ago ,

Well said. Soviet university training in "cybernetics," as it was called in the late 1980s, involved two years of programming on blackboards before the students even touched an actual computer.

It gave the students an understanding of how computers work down to the bit-flipping level. Imagine trying to fuzz code in your head.

FarNorthSolitude TTG a day ago ,

I recall flowcharting entirely on paper before committing a program to punched cards. I used to do hex and octal math in my head as part of debugging core dumps. Ah, the glory days.

Honeywell once made a military computer that was 10 bit. That stumped me for a while, as everything was 8 or 16 bit back then.

kao_hsien_chih FarNorthSolitude 10 hours ago ,

That used to be fairly common in the civilian sector (in the US) too: computing time was expensive, so you had to make sure that the stuff worked flawlessly before it was committed.

There was no opportunity to see things go wrong and do things over, as happens so much nowadays. The Russians, with their hardware limitations and shortages, I imagine must have been even more thorough than US programmers were back in the old days, and you could only get there by being very thoroughly grounded in the basics.

[Sep 07, 2018] How Can We Fix The Broken Economics of Open Source?

Notable quotes:
"... [with some subset of features behind a paywall] ..."
Sep 07, 2018 | news.slashdot.org

If we take consulting, services, and support off the table as an option for high-growth revenue generation (the only thing VCs care about), we are left with open core [with some subset of features behind a paywall] , software as a service, or some blurring of the two... Everyone wants infrastructure software to be free and continuously developed by highly skilled professional developers (who in turn expect to make substantial salaries), but no one wants to pay for it. The economics of this situation are unsustainable and broken ...

[W]e now come to what I have recently called "loose" open core and SaaS. In the future, I believe the most successful OSS projects will be primarily monetized via this method. What is it? The idea behind "loose" open core and SaaS is that a popular OSS project can be developed as a completely community driven project (this avoids the conflicts of interest inherent in "pure" open core), while value added proprietary services and software can be sold in an ecosystem that forms around the OSS...

Unfortunately, there is an inflection point at which in some sense an OSS project becomes too popular for its own good, and outgrows its ability to generate enough revenue via either "pure" open core or services and support... [B]uilding a vibrant community and then enabling an ecosystem of "loose" open core and SaaS businesses on top appears to me to be the only viable path forward for modern VC-backed OSS startups.
Klein also suggests OSS foundations start providing fellowships to key maintainers, who currently "operate under an almost feudal system of patronage, hopping from company to company, trying to earn a living, keep the community vibrant, and all the while stay impartial..."

"[A]s an industry, we are going to have to come to terms with the economic reality: nothing is free, including OSS. If we want vibrant OSS projects maintained by engineers that are well compensated and not conflicted, we are going to have to decide that this is something worth paying for. In my opinion, fellowships provided by OSS foundations and funded by companies generating revenue off of the OSS is a great way to start down this path."

[Apr 30, 2018] New Book Describes Bluffing Programmers in Silicon Valley

Notable quotes:
"... Live Work Work Work Die: A Journey into the Savage Heart of Silicon Valley ..."
"... Older generations called this kind of fraud "fake it 'til you make it." ..."
"... Nowadays I work 9:30-4:30 for a very good, consistent paycheck and let some other "smart person" put in 75 hours a week dealing with hiring ..."
"... It's not a "kids these days" sort of issue, it's *always* been the case that shameless, baseless self-promotion wins out over sincere skill without the self-promotion, because the people who control the money generally understand boasting more than they understand the technology. ..."
"... In the bad old days we had a hell of a lot of ridiculous restriction We must somehow made our programs to run successfully inside a RAM that was 48KB in size (yes, 48KB, not 48MB or 48GB), on a CPU with a clock speed of 1.023 MHz ..."
"... So what are the uses for that? I am curious what things people have put these to use for. ..."
"... Also, Oracle, SAP, IBM... I would never buy from them, nor use their products. I have used plenty of IBM products and they suck big time. They make software development 100 times harder than it could be. ..."
"... I have a theory that 10% of people are good at what they do. It doesn't really matter what they do, they will still be good at it, because of their nature. These are the people who invent new things, who fix things that others didn't even see as broken and who automate routine tasks or simply question and erase tasks that are not necessary. ..."
"... 10% are just causing damage. I'm not talking about terrorists and criminals. ..."
"... Programming is statistically a dead-end job. Why should anyone hone a dead-end skill that you won't be able to use for long? For whatever reason, the industry doesn't want old programmers. ..."
Apr 30, 2018 | news.slashdot.org

Long-time Slashdot reader Martin S. pointed us to this excerpt from the new book Live Work Work Work Die: A Journey into the Savage Heart of Silicon Valley by Portland-based investigative reporter Corey Pein.

The author shares what he realized at a job recruitment fair seeking "Java Legends, Python Badasses, Hadoop Heroes," and other gratingly childish classifications describing various programming specialities.

" I wasn't the only one bluffing my way through the tech scene. Everyone was doing it, even the much-sought-after engineering talent.

I was struck by how many developers were, like myself, not really programmers , but rather this, that and the other. A great number of tech ninjas were not exactly black belts when it came to the actual onerous work of computer programming. So many of the complex, discrete tasks involved in the creation of a website or an app had been automated that it was no longer necessary to possess knowledge of software mechanics. The coder's work was rarely a craft. The apps ran on an assembly line, built with "open-source", off-the-shelf components. The most important computer commands for the ninja to master were copy and paste...

[M]any programmers who had "made it" in Silicon Valley were scrambling to promote themselves from coder to "founder". There wasn't necessarily more money to be had running a startup, and the increase in status was marginal unless one's startup attracted major investment and the right kind of press coverage. It's because the programmers knew that their own ladder to prosperity was on fire and disintegrating fast. They knew that well-paid programming jobs would also soon turn to smoke and ash, as the proliferation of learn-to-code courses around the world lowered the market value of their skills, and as advances in artificial intelligence allowed for computers to take over more of the mundane work of producing software. The programmers also knew that the fastest way to win that promotion to founder was to find some new domain that hadn't yet been automated. Every tech industry campaign designed to spur investment in the Next Big Thing -- at that time, it was the "sharing economy" -- concealed a larger programme for the transformation of society, always in a direction that favoured the investor and executive classes.

"I wasn't just changing careers and jumping on the 'learn to code' bandwagon," he writes at one point. "I was being steadily indoctrinated in a specious ideology."


Anonymous Coward , Saturday April 28, 2018 @11:40PM ( #56522045 )

older generations already had a term for this ( Score: 5 , Interesting)

Older generations called this kind of fraud "fake it 'til you make it."

raymorris ( 2726007 ) , Sunday April 29, 2018 @02:05AM ( #56522343 ) Journal
The people who are smarter won't ( Score: 5 , Informative)

> The people who can do both are smart enough to build their own company and compete with you.

Been there, done that. Learned a few lessons. Nowadays I work 9:30-4:30 for a very good, consistent paycheck and let some other "smart person" put in 75 hours a week dealing with hiring, managing people, corporate strategy, staying up on the competition, figuring out tax changes each year and getting taxes filed six times each year, the various state and local requirements, legal changes, contract hassles, etc., while hoping the company makes money this month so they can take a paycheck and pay their rent.

I learned that I'm good at creating software systems and I enjoy it. I don't enjoy all-nighters, partners being dickheads trying to pull out of a contract, or any of a thousand other things related to running a start-up business. I really enjoy a consistent, six-figure compensation package too.

brian.stinar ( 1104135 ) writes:
Re: ( Score: 2 )

* getting taxes filed eighteen times a year.

I pay monthly gross receipts tax (12), quarterly withholdings (4) and a corporate (1) and individual (1) returns. The gross receipts can vary based on the state, so I can see how six times a year would be the minimum.

Cederic ( 9623 ) writes:
Re: ( Score: 2 )

Fuck no. Cost of full automation: $4m. Cost of manual entry: $0. Opportunity cost of manual entry: $800/year.

At worst, pay for an accountant, if you can get one that cheaply. Bear in mind talking to them incurs most of that opportunity cost anyway.

serviscope_minor ( 664417 ) writes:
Re: ( Score: 2 )

Nowadays I work 9:30-4:30 for a very good, consistent paycheck and let some other "smart person" put in 75 hours a week dealing with hiring

There's nothing wrong with not wanting to run your own business; it's not for most people, and even if it were, the numbers don't add up. But putting the scare quotes in like that makes it sound like you have a huge chip on your shoulder. Those things are just as essential to the business as your work, and without them you wouldn't have the steady 9:30-4:30 with a good paycheck.

raymorris ( 2726007 ) writes:
Important, and dumb. ( Score: 3 , Informative)

Of course they are important. I wouldn't have done those things if they weren't important!

I frequently have friends say things like "I love baking. I can't get enough of baking. I'm going to open a bakery.". I ask them "do you love dealing with taxes, every month? Do you love contract law? Employment law? Marketing? Accounting?" If you LOVE baking, the smart thing to do is to spend your time baking. Running a start-up business, you're not going to do much baking.

If you love marketing, employment law, taxes

raymorris ( 2726007 ) writes:
Four tips for a better job. Who has more? ( Score: 3 )

I can tell you a few things that have worked for me. I'll go in chronological order rather than priority order.

Make friends in the industry you want to be in. Referrals are a major way people get jobs.

Look at the job listings for jobs you'd like to have and see which skills a lot of companies want but you're missing. For me that's Java. A lot of companies list Java skills and I'm not particularly good with Java. Then consider learning the skills you lack, the ones a lot of job postings are looking for.

Certifi

goose-incarnated ( 1145029 ) , Sunday April 29, 2018 @02:34PM ( #56524475 ) Journal
Re: older generations already had a term for this ( Score: 5 , Insightful)
You don't understand the point of an ORM, do you? I'd suggest reading up on why they exist.

They exist because programmers value code design more than data design. ORMs are the poster-child for square-peg-round-hole solutions, which is why all ORMs choose one of three different ways of squashing hierarchical data into a relational form, all of which are crappy.

If the devs of the system (the ones choosing to use an ORM) had any competence at all, they'd design their database first, because in any application that uses a database, the database is the most important bit, not the OO-ness or Functional-ness of the design.

Over the last few decades I've seen programs in a system come and go; a component here gets rewritten, a component there gets rewritten, but you know what? They all have to work with the same damn data.

You can more easily switch out your code for new code with a new design in a new language than you can change the database structure. So explain to me why you think the database should be mangled to fit your OO code rather than mangling your OO code to fit the database?

cheekyboy ( 598084 ) writes:
im sick of reinventors and new frameworks ( Score: 3 )

Stick to one thing for 10-15 years. Often all this new shit doesn't do jack different from the old shit; it's not faster, it's not better. Every dick wants to be famous, so they make another damn library/tool with their own fancy name and feature, instead of enhancing an existing product.

gbjbaanb ( 229885 ) writes:
Re: ( Score: 2 )

amen to that.

Or kids who can't hack the main stuff suddenly discover the cool new thing, and then they can pretend they're "learning" it, and when the going gets tough (as it always does) they can declare the tech to be pants and move on to another.

Hence we had so many people on the bandwagon for functional programming, who then dumped it for Ruby on Rails, then dumped that for Node.js; not sure what they're on currently, probably back to asp.net.

Greyfox ( 87712 ) writes:
Re: ( Score: 2 )

How much code do you have to reuse before you're not really programming anymore? When I started in this business, it was reasonably possible that you could end up on a project that didn't particularly have much (or any) of an operating system. They taught you assembly language and the process by which the system boots up, but I think if I were to ask most of the programmers where I work, they wouldn't be able to explain how all that works...

djinn6 ( 1868030 ) writes:
Re: ( Score: 2 )
It really feels like if you know what you're doing it should be possible to build a team of actually good programmers and put everyone else out of business by actually meeting your deliverables, but no one has yet. I wonder why that is.

You mean Amazon, Google, Facebook and the like? People may not always like what they do, but they manage to get things done and make plenty of money in the process. The problem for a lot of other businesses is not having a way to identify and promote actually good programmers. In your example, you could've spent 10 minutes fixing their query and saved them days of headache, but how much recognition will you actually get? Where is your motivation to help them?

Junta ( 36770 ) writes:
Re: ( Score: 2 )

It's not a "kids these days" sort of issue, it's *always* been the case that shameless, baseless self-promotion wins out over sincere skill without the self-promotion, because the people who control the money generally understand boasting more than they understand the technology. Yes it can happen that baseless boasts can be called out over time by a large enough mass of feedback from competent peers, but it takes a *lot* to overcome the tendency for them to have faith in the boasts.

It does correlate stron

cheekyboy ( 598084 ) writes:
Re: ( Score: 2 )

And all these modern coders forget old lessons, and make shit stuff, just look at instagram windows app, what a load of garbage shit, that us old fuckers could code in 2-3 weeks.

Instagram - your app sucks, cookie cutter coders suck, no refinement, coolness. Just cheap ass shit, with limited usefulness.

Just like most of commercial software that's new - quick shit.

Oh and its obvious if your an Indian faking it, you haven't worked in 100 companies at the age of 29.

Junta ( 36770 ) writes:
Re: ( Score: 2 )

Here's another problem, if faced with a skilled team that says "this will take 6 months to do right" and a more naive team that says "oh, we can slap that together in a month", management goes with the latter. Then the security compromises occur, then the application fails due to pulling in an unvetted dependency update live into production. When the project grows to handling thousands instead of dozens of users and it starts mysteriously folding over and the dev team is at a loss, well the choice has be

molarmass192 ( 608071 ) , Sunday April 29, 2018 @02:15AM ( #56522359 ) Homepage Journal
Re:older generations already had a term for this ( Score: 5 , Interesting)

These restrictions are a large part of what makes Arduino programming "fun". If you don't plan out your memory usage, you're gonna run out of it. I cringe when I see 8MB web pages of bloated "throw in everything including the kitchen sink and the neighbor's car". Unfortunately, the careful and cautious way is dying in favor of throwing 3rd-party code at it until it does something. Of course, I don't have time to review it, but I'm sure everybody else has peer reviewed it for flaws and exploits line by line.

AmiMoJo ( 196126 ) writes: < mojo@@@world3...net > on Sunday April 29, 2018 @05:15AM ( #56522597 ) Homepage Journal
Re:older generations already had a term for this ( Score: 4 , Informative)
Unfortunately, the careful and cautious way is dying in favor of throwing 3rd-party code at it until it does something.

Of course. What is the business case for making it efficient? Those massive frameworks are cached by the browser and run on the client's system, so they cost you nothing and save you time to market. Efficiency costs money with no real benefit to the business.

If we want to fix this, we need to make bloat have an associated cost somehow.

locketine ( 1101453 ) writes:
Re: older generations already had a term for this ( Score: 2 )

My company is dealing with the result of this mentality right now. We released the web app to the customer without performance testing and doing several majorly inefficient things to meet deadlines. Once real load was put on the application by users with non-ideal hardware and browsers, the app was infuriatingly slow. Suddenly our standard sub-40 hour workweek became a 50+ hour workweek for months while we fixed all the inefficient code and design issues.

So, while you're right that getting to market and opt

serviscope_minor ( 664417 ) writes:
Re: ( Score: 2 )

In the bad old days we had a hell of a lot of ridiculous restrictions. We somehow had to make our programs run successfully inside RAM that was 48KB in size (yes, 48KB, not 48MB or 48GB), on a CPU with a clock speed of 1.023 MHz

We still have them. In fact some of the systems I've programmed have been more resource limited than the gloriously spacious 32KiB memory of the BBC Model B. Take the PIC12F or 10F series: a glorious 64 bytes of RAM, a max clock speed of 16MHz, but it's not unusual to run it at 32kHz.

serviscope_minor ( 664417 ) writes:
Re: ( Score: 2 )

So what are the uses for that? I am curious what things people have put these to use for.

It's hard to determine because people don't advertise use of them at all. However, I know that my electric toothbrush uses an Epson 4-bit MCU of some description. It's got a status LED, a basic NiMH battery charger and a PWM controller for an H-bridge. Braun sell a *lot* of electric toothbrushes. Any gadget that's smarter than a simple switch will probably have some sort of basic MCU in it. Alarm system components, sensor

tlhIngan ( 30335 ) writes:
Re: ( Score: 3 , Insightful)
b) No computer ever ran at 1.023 MHz. It was either a nice multiple of 1 MHz or maybe a multiple of 3.579545 MHz (i.e. using the TV output circuit's color clock crystal to drive the CPU).

Well, it could be used to drive the TV output circuit, OR, it was used because it's a stupidly cheap high speed crystal. You have to remember except for a few frequencies, most crystals would have to be specially cut for the desired frequency. This occurs even today, where most oscillators are either 32.768kHz (real time clock

Anonymous Coward writes:
Re: ( Score: 2 , Interesting)

Yeah, nice talk. You could have stopped after the first sentence. The other AC is referring to the Commodore C64 [wikipedia.org]. The frequency has nothing to do with crystal availability but with the simple fact that everything in the C64 is synced to the TV. One clock cycle equals 8 pixels. The graphics chip and the CPU take turns accessing the RAM. The different frequencies dictated by the TV standards are the reason why the CPU in the NTSC version of the C64 runs at 1.023MHz and the PAL version at 0.985MHz.

Wraithlyn ( 133796 ) writes:
Re: ( Score: 2 )

LOL what exactly is so special about 16K RAM? https://yourlogicalfallacyis.c... [yourlogicalfallacyis.com]

I cut my teeth on a VIC20 (5K RAM), then later a C64 (which ran at 1.023MHz...)

Anonymous Coward writes:
Re: ( Score: 2 , Interesting)

Commodore 64 for the win. I worked for a company that made detection devices for the railroad, things like monitoring axle temperatures and reading the rail car ID tags. The original devices were made using Commodore 64 boards, using software written by an employee at the one railroad company working with them.

The company then hired some electrical engineers to design custom boards using the 68000 chips and I was hired as the only programmer. Had to rewrite all of the code which was fine...

wierd_w ( 1375923 ) , Saturday April 28, 2018 @11:58PM ( #56522075 )
... A job fair can easily test this competency. ( Score: 4 , Interesting)

Many of these languages have an interactive interpreter. I know for a fact that Python does.

So, since job fairs are an all-day thing, and setup is already a thing for them -- set up a booth with like 4 computers at it, and an admin station. The 4 terminals have an interactive session with the interpreter of choice. Every 20 minutes or so, post a "solve this problem" challenge (it needs to be easy and already solved in general -- programmers hate being pimped without pay; they don't mind tests of skill, but hate being pimped -- something like "sort this array, while picking out all the prime numbers") and see who steps up. The ones that step up have confidence they can solve the problem, and you can quickly see who can do the work and who can't.

The ones that solve it, and solve it to your satisfaction, you offer a nice gig to.
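[Editor's note: such a booth challenge is small enough to sketch. A minimal Python version of "sort this array, while picking out all the prime numbers" might look like the following; the function names and the sample data are illustrative, not from the comment.]

def is_prime(n):
    # Trial division is plenty for booth-sized inputs.
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def sort_and_find_primes(values):
    # Ignore non-integers, per the mixed-type-array caveat raised later in the thread.
    ints = sorted(v for v in values if isinstance(v, int))
    return ints, [v for v in ints if is_prime(v)]

data = [9, "x", 2, 15, 7, 3.5, 11]
print(sort_and_find_primes(data))  # ([2, 7, 9, 11, 15], [2, 7, 11])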

ShanghaiBill ( 739463 ) , Sunday April 29, 2018 @01:50AM ( #56522321 )
Re:... A job fair can easily test this competency. ( Score: 5 , Informative)
Then you get someone good at sorting arrays while picking out prime numbers, but potentially not much else.

The point of the test is not to identify the perfect candidate, but to filter out the clearly incompetent. If you can't sort an array and write a function to identify a prime number, I certainly would not hire you. Passing the test doesn't get you a job, but it may get you an interview ... where there will be other tests.

wierd_w ( 1375923 ) writes:
Re: ( Score: 2 )

BINGO!

(I am not even a professional programmer, but I can totally perform such a trivially easy task. The example tests basic understanding of loop construction, function construction, variable use, efficient sorting, and error correction-- especially with mixed-type arrays. All of these are things any programmer SHOULD know how to do, without being overly complicated, or clearly a disguised occupational problem trying to get a free solution. Like I said, programmers hate being pimped, and will be turned off

wierd_w ( 1375923 ) , Sunday April 29, 2018 @04:02AM ( #56522443 )
Re: ... A job fair can easily test this competency ( Score: 5 , Insightful)

Again, the quality applicant and the code monkey both have something the fakers do not-- Actual comprehension of what a program is, and how to create one.

As Bill points out, this is not the final exam. This is the "Oh, I see you do actually know how to program-- show me more" portion of the process. This is the part that HR drones are not capable of performing, due to Dunning-Kruger. Those that are actually, REALLY competent will do more than just satisfy the requirements of the challenge: they will provide actually working solutions that properly validate their input, and return proper error states if the input is invalid, etc-- You can learn a LOT about a potential hire by observing their work. *THAT* is what this is really about. The triviality of the problem is a necessity, because you ***DON'T*** try to get free solutions out of people.

I realize that may be difficult for you to comprehend, but you *DON'T* do that. The job fair is to let people know that you have a position available, and try to curry interest in people to apply. A successful pre-screening is confidence building, and helps the potential hire to feel that your company is actually interested in actually hiring somebody, and not just fucking off in the booth, to cover for "failing to find somebody" and then "Getting yet another H1B". It gives them a chance to show you what they can do. That is what it is for, and what it does. It also excludes the fakers that this article is about-- The ones that can talk a good talk, but could not program a simple boolean check condition if their life depended on it.

If it were not for the time constraints of a job fair (usually only 2 days, and in that time you need to try and pre-screen as many as possible), I would suggest a tiered challenge, with progressively harder challenges, where you hand out resumes to the ones that make it to the top 3 brackets, but that is not the way the world works.

luis_a_espinal ( 1810296 ) writes:
Re: ( Score: 2 )
This in my opinion is really a waste of time. Challenges like this have to be so simple that they can be done walking up to a booth, so they are not likely to filter the "all talk" types any better than a few interview questions could (in person, so the candidate can't just google it).

Tougher, more involved stuff isn't good either: it gives a huge advantage to the full-time job hunter; the guy or gal that already has a 9-to-5 and a family that wants to see them has not got time for games. We have been struggling with hiring where I work (I do a lot of the interviews) and these are the conclusions we have reached.

You would be surprised at the number of people with impeccable-looking resumes failing at something as simple as the FizzBuzz test [codinghorror.com]
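[Editor's note: for readers who have not seen it, the FizzBuzz test linked above fits in a few lines of Python. This is the standard formulation -- print 1..100, with "Fizz" for multiples of 3, "Buzz" for multiples of 5, and "FizzBuzz" for both:]

for i in range(1, 101):
    out = ("Fizz" if i % 3 == 0 else "") + ("Buzz" if i % 5 == 0 else "")
    print(out or i)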

PaulRivers10 ( 4110595 ) writes:
Re: ... A job fair can easily test this competenc ( Score: 2 )

The only thing fizzbuzz tests is "have you done fizzbuzz before?" It's a short question filled with every petty trick the author could think to throw in there. If you haven't seen the tricks, they trip you up for no reason related to your actual coding skills. Once you have seen them, they're trivial and again unrelated to real work. Fizzbuzz is best passed by someone aiming to game the interview system. It passes people gaming it and trips up people who spent their time doing real work on the job.

Hognoxious ( 631665 ) writes:
Re: ( Score: 2 )
they trip you up for no reason related to your actual coding skills.

Bullshit!

luis_a_espinal ( 1810296 ) , Sunday April 29, 2018 @07:49AM ( #56522861 ) Homepage
filter the lame code monkeys ( Score: 4 , Informative)
Lame monkey tests select for lame monkeys.

A good programmer first and foremost has a clean mind. Experience suggests puzzle geeks, who excel at contrived tests, are usually sloppy thinkers.

No. Good programmers can trivially knock out any of these so-called lame monkey tests. It's lame code monkeys who can't do it. And I've seen their work. Many night shifts and weekends I've burned trying to fix their shit because they couldn't actually do any of the things behind what you call "lame monkey tests", like:

  • pulling expensive invariant calculations out of loops
  • using for loops to scan a fucking table to pull rows or calculate an aggregate, when they could let the database do what it does best with a simple SQL statement
  • systems crashing under actual load because their shitty code was never stress tested ("but it worked on my dev box!")
  • again with databases, having to redo their schemas because they were fattened up so much with columns like VALUE1, VALUE2, ... VALUE20 (normalize, you assholes!)
  • chatty remote APIs - because these code monkeys cannot think about the need for bulk operations in increasingly distributed systems
  • storing dates in unsortable strings because the idiots do not know most modern programming languages have a date data type

Oh, and the most important: off-by-one looping errors. I see this all the time -- the type of thing a good programmer can spot quickly, because he or she can do the so-called "lame monkey tests" that involve arrays and sorting.
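[Editor's note: two of the failures above are easy to make concrete. A minimal Python sketch -- the table, column and values are invented for illustration -- contrasting the row-scanning loop with a single SQL aggregate, plus the sortable-date point:]

import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?)", [(10.0,), (20.0,), (12.5,)])

# Code-monkey version: drag every row into the application and loop over it.
total = 0.0
for (amount,) in conn.execute("SELECT amount FROM orders"):
    total += amount

# Let the database do what it does best: one aggregate statement.
(total_sql,) = conn.execute("SELECT SUM(amount) FROM orders").fetchone()
assert total == total_sql  # 42.5 both ways

# And dates: a real date type sorts chronologically; "4/29/2018"-style strings do not.
print(sorted([date(2018, 4, 29), date(2017, 11, 30)]))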

I've seen the type: "I don't need to do this shit because I have business knowledge and I code for business and IT not google", and then they go and code and fuck it up... and then the rest of us have to go clean up their shit at 1AM or on weekends.

If you work as an hourly paid contractor cleaning that crap, it can be quite lucrative. But sooner or later it truly sucks the energy out of your soul.

So yeah, we need more lame monkey tests ... to filter the lame code monkeys.

ShanghaiBill ( 739463 ) writes:
Re: ( Score: 3 )
Someone could Google the problem with the phone then step up and solve the challenge.

If, given a spec, someone can consistently cobble together working code by Googling, then I would love to hire them. That is the most productive way to get things done.

There is nothing wrong with using external references. When I am coding, I have three windows open: an editor, a testing window, and a browser with a Stackoverflow tab open.

Junta ( 36770 ) writes:
Re: ( Score: 2 )

Yeah, when we do tech interviews, we ask questions that we are certain they won't be able to answer, because we want to see how they think about the problem, what questions they ask to get more data, and that they don't just fold up and say "well, that's not the sort of problem I'd be thinking of." The examples aren't made up or anything; they are generally a selection of real problems that were incredibly difficult, that our company had faced before, that one may not think at first glance such a position would

bobstreo ( 1320787 ) writes:
Nothing worse ( Score: 2 )

than spending weeks interviewing "good" candidates for an opening, selecting a couple and hiring them as contractors, then finding out they are less than unqualified to do the job they were hired for.

I've seen it a few times, Java "experts", Microsoft "experts" with years of experience on their resumes, but completely useless in coding, deployment or anything other than buying stuff from the break room vending machines.

That being said, I've also seen projects costing hundreds of thousands of dollars, with y

Anonymous Coward , Sunday April 29, 2018 @12:34AM ( #56522157 )
Re:Nothing worse ( Score: 4 , Insightful)

The moment you said "contractors", you lost any sane developer. Keep swimming, it's not a fish.

Anonymous Coward writes:
Re: ( Score: 2 , Informative)

I agree with this. I consider myself to be a good programmer and I would never go into the contractor game. I also wonder: how does it take you weeks of interviewing to still not figure out that a person can't code? I could probably see that in 15 minutes in a pair-coding session.

Also, Oracle, SAP, IBM... I would never buy from them, nor use their products. I have used plenty of IBM products and they suck big time. They make software development 100 times harder than it could be. Their technical supp

Lanthanide ( 4982283 ) writes:
Re: ( Score: 2 )

It's weeks to interview multiple different candidates before deciding on 1 or 2 of them. Not weeks per person.

Anonymous Coward writes:
Re: ( Score: 3 , Insightful)
That being said, I've also seen projects costing hundreds of thousands of dollars, with years of delays from companies like Oracle, Sun, SAP, and many other "vendors"

Software development is a hard thing to do well, despite the general thinking of technology becoming cheaper over time, and like health care the quality of the goods and services received can sometimes be difficult to ascertain. However, people who don't respect developers and the problems we solve are very often the same ones who continually frustrate themselves by trying to cheap out, hiring outsourced contractors, and then tearing their hair out when sub par results are delivered, if anything is even del

pauljlucas ( 529435 ) writes:
Re: ( Score: 2 )

As part of your interview process, don't you have candidates code a solution to a problem on a whiteboard? I've interviewed lots of "good" candidates (on paper) too, but they crashed and burned when challenged with a coding exercise. As a result, we didn't make them job offers.

VeryFluffyBunny ( 5037285 ) writes:
I do the opposite ( Score: 2 )

I'm not a great coder but good enough to get done what clients want done. If I'm not sure or don't think I can do it, I tell them. I think they appreciate the honesty. I don't work in a tech-hub, startups or anything like that so I'm not under the same expectations and pressures that others may be.

Tony Isaac ( 1301187 ) writes:
Bigger building blocks ( Score: 2 )

OK, so yes, I know plenty of programmers who do fake it. But stitching together components isn't "fake" programming.

Back in the day, we had to write our own code to loop through an XML file, looking for nuggets. Now, we just use an XML serializer. Back then, we had to write our own routines to send TCP/IP messages back and forth. Now we just use a library.

I love it! I hated having to make my own bricks before I could build a house. Now, I can get down to the business of writing the functionality I want, ins

Anonymous Coward writes:
Re: ( Score: 2 , Insightful)

But, I suspect you could write the component if you had to. That makes you a very different user of that component than someone who just knows it as a magic black box.

Because of this, you understand the component better and have real knowledge of its strengths and limitations. People blindly using components with only a cursory idea of their internal operation often cause major performance problems. They rarely recognize when it is time to write their own to overcome a limitation (or even that it is possibl

Tony Isaac ( 1301187 ) writes:
Re: ( Score: 2 )

You're right on all counts. A person who knows how the innards work, is better than someone who doesn't, all else being equal. Still, today's world is so specialized that no one can possibly learn it all. I've never built a processor, as you have, but I still have been able to build a DNA matching algorithm for a major DNA lab.

I would argue that anyone who can skillfully use off-the-shelf components can also learn how to build components, if they are required to.

thesupraman ( 179040 ) writes:
Ummm. ( Score: 2 )

1, 'Back in the day' there was no XML; XML was not very long ago.
2, it's a parser; a serialiser is pretty much the opposite (unless this week's fashion has redefined that.. anything is possible).
3, 'Back then' we didn't have TCP stacks...

But, actually, I agree with you. I can only assume the author thinks there are lots of fake plumbers because they don't cast their own toilet bowls from raw clay, and use pre-built fittings and pipes! That car mechanics start from raw steel scrap and a file.. And that you need

Tony Isaac ( 1301187 ) writes:
Re: ( Score: 2 )

For the record, XML was invented in 1997, you know, in the last century! https://en.wikipedia.org/wiki/... [wikipedia.org]
And we had a WinSock library in 1992. https://en.wikipedia.org/wiki/... [wikipedia.org]

Yes, I agree with you on the "middle ground." My reaction was to the author's point that "not knowing how to build the components" was the same as being a "fake programmer."

Tony Isaac ( 1301187 ) , Sunday April 29, 2018 @01:46AM ( #56522313 ) Homepage
Re:Bigger building blocks ( Score: 5 , Interesting)

If I'm a plumber, and I don't know anything about the engineering behind the construction of PVC pipe, I can still be a good plumber. If I'm an electrician, and I don't understand the role of a blast furnace in the making of the metal components, I can still be a good electrician.

The analogy fits. If I'm a programmer, and I don't know how to make an LZW compression library, I can still be a good programmer. It's a matter of layers. These days, we specialize. You've got your low-level programmers that make the components, the high level programmers that put together the components, the graphics guys who do HTML/CSS, and the SQL programmers that just know about databases. Every person has their specialty. It's no longer necessary to be a low-level programmer, or jack-of-all-trades, to be "good."

If I don't know the layout of the IP header, I can still write quality networking software, and if I know XSLT, I can still do cool stuff with XML, even if I don't know how to write a good parser.

frank_adrian314159 ( 469671 ) writes:
Re: ( Score: 3 )

I was with you until you said " I can still do cool stuff with XML".

Tony Isaac ( 1301187 ) writes:
Re: ( Score: 2 )

LOL yeah I know it's all JSON now. I've been around long enough to see these fads come and go. Frankly, I don't see a whole lot of advantage of JSON over XML. It's not even that much more compact, about 10% or so. But the point is that the author laments the "bad old days" when you had to create all your own building blocks, and you didn't have a team of specialists. I for one don't want to go back to those days!

careysub ( 976506 ) writes:
Re: ( Score: 3 )

The main advantage of JSON is that it is consistent. XML has attributes and embedded optional stuff within tags. That was derived from the original SGML ancestor, where it was thought to be a convenience for the human authors who were supposed to be making the mark-up manually. Programmatically it is a PITA.
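[Editor's note: the consistency point is easy to see side by side. In XML the same datum can legally live in an attribute or in element text, so the consuming code must know which layout it will get; JSON has exactly one spelling. A small Python sketch, with invented data:]

import json
import xml.etree.ElementTree as ET

xml_a = ET.fromstring('<user id="42"><name>Ada</name></user>')
xml_b = ET.fromstring('<user><id>42</id><name>Ada</name></user>')
print(xml_a.get("id"), xml_b.findtext("id"))  # two valid layouts, two access paths

user = json.loads('{"id": 42, "name": "Ada"}')
print(user["id"])  # only one way to spell it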

Cederic ( 9623 ) writes:
Re: ( Score: 3 )

I got shit for decrying XML back when it was the trendy thing. I've had people apologise to me months later because they've realized I was right, even though at the time they did their best to fuck over my career because XML was the new big thing and I wasn't fully on board.

XML has its strengths and its place, but fuck me it taught me how little some people really fucking understand shit.

Anonymous Coward writes:
Silicon Valley is Only Part of the Tech Business ( Score: 2 , Informative)

And a rather small part at that, albeit a very visible and vocal one, full of the proverbial prima donnas. However, much of the rest of the tech business, or at least the people working in it, are not like that. It's small groups of developers working in other industries that would not typically be considered technology. There are software developers working for insurance companies, banks, hedge funds, oil and gas exploration or extraction firms, national defense and many hundreds and thousands of other small

phantomfive ( 622387 ) writes:
bonfire of fakers ( Score: 2 )

This is the reason I wish programming didn't pay so much....the field is better when it's mostly populated by people who enjoy programming.

Njovich ( 553857 ) , Sunday April 29, 2018 @05:35AM ( #56522641 )
Learn to code courses ( Score: 5 , Insightful)
They knew that well-paid programming jobs would also soon turn to smoke and ash, as the proliferation of learn-to-code courses around the world lowered the market value of their skills, and as advances in artificial intelligence allowed for computers to take over more of the mundane work of producing software.

Kind of hard to take this article seriously after gibberish like this. I would say most good programmers know that neither learn-to-code courses nor AI are going to make a dent in their income any time soon.

AndyKron ( 937105 ) writes:
Me? No ( Score: 2 )

As a non-programmer, Arduino and libraries are my friends.

Escogido ( 884359 ) , Sunday April 29, 2018 @06:59AM ( #56522777 )
in the silly cone valley ( Score: 5 , Interesting)

There is a huge shortage of decent programmers. I have personally witnessed more than one phone "interview" that went like "have you done this? what about this? do you know what this is? um, can you start Monday?" (120K-ish salary range)

Partly because there are way more people who got their stupid ideas funded than good coders willing to stain their resume with that. Partly because, if you are funded and cannot do all the required coding solo, here's your conundrum:

  • top level hackers can afford to be really picky, so on one hand it's hard to get them interested, and if you could get that, they often want some ownership of the project. the plus side is that they are happy to work for lots of equity if they have faith in the idea, but that can be a huge "if".
  • "good but not exceptional" senior engineers aren't usually going to be super happy, as they often have spouses and children and mortgages, so they'd favor job security over exciting ideas and startup lottery.
  • that leaves you with fresh-out-of-college folks, who are really a mixed bunch: some are already at a senior level of understanding without the experience, some are absolutely useless, with varying degrees in between, and there's no easy way to tell which is which early.

So the not-so-scrupulous folks realized what's going on, and launched multiple coding boot camp programmes, to essentially trick both the students into believing they can become a coder in a month or two, and the prospective employers into believing that said students are useful. So far it's been working, to a degree, in part because in such companies the coding skill evaluation process is broken. But one can only hide their lack of value-add for so long, even if they do manage to bluff their way into a job.

quonset ( 4839537 ) , Sunday April 29, 2018 @07:20AM ( #56522817 )
Duh! ( Score: 4 , Insightful)

All one had to do was look at the lousy state of software and web sites today to see this is true. It's quite obvious little to no thought is given on how to make something work such that one doesn't have to jump through hoops.

I have said many times that the most perfect word processing program ever developed was WordPerfect 5.1 for DOS. One's productivity was astonishing. It just worked.

Now we have the bloated behemoth Word which does its utmost to get in the way of you doing your work. The only way to get it to function is to turn large portions of its "features" off, and even then it still insists on doing something other than what you told it to do.

Then we have the abomination of Windows 10, which is nothing but Clippy on 10X steroids. It is patently obvious the people who program this steaming pile have never heard of simplicity. Who in their right mind would think having to "search" for something is more efficient than going directly to it? I would ask whether these people wander around stores "searching" for what they're looking for, but then I realize that's how their entire life is run. They search for everything online rather than going directly to the source. It's no wonder they complain about not having time to do things. They're always searching.

Web sites are another area where these people have no clue what they're doing. Anything that might be useful is hidden behind dropdown menus, flyouts, popup bubbles and intricately designed mazes of clicks needed to get to where you want to go. When someone clicks on a line of products, they shouldn't be harassed about what part of the product line they want to look at. Give them the information and let the user go where they want.

This rant could go on, but this article explains clearly why we have regressed when it comes to software and web design. Instead of making things simple and easy to use, using the one or two brain cells they have, programmers and web designers let the software do what it wants without considering, should it be done like this?

swb ( 14022 ) , Sunday April 29, 2018 @07:48AM ( #56522857 )
Tech industry churn ( Score: 3 )

The tech industry has a ton of churn -- there's some technological advancement, but there's an awful lot of new products turned out simply to keep customers buying new licenses and paying for upgrades.

This relentless and mostly phony newness means a lot of people have little experience with current products. People fake because they have no choice. The good ones understand the general technologies and problems they're meant to solve and can generally get up to speed quickly, while the bad ones are good at faking it but don't really know what they're doing. Telling the difference from the outside is impossible.

Sales people make it worse, promoting people as "experts" in specific products or implementations because the people have experience with a related product and "they're all the same". This burns out the people with good adaption skills.

DaMattster ( 977781 ) , Sunday April 29, 2018 @08:39AM ( #56522979 )
Interesting ( Score: 3 )

From the summary, it sounds like a lot of programmers and software engineers are trying to develop the next big thing so that they can literally beg for money from the elite class and one day, hopefully, become a member of the aforementioned. It's sad how the middle class has been utterly decimated in the United States that some of us are willing to beg for scraps from the wealthy. I used to work in IT but I've aged out and am now back in school to learn automotive technology so that I can do something other than being a security guard. Currently, the only work I have been able to find has been in the unglamorous security field.

I am learning some really good new skills in the automotive program that I am in but I hate this one class called "Professionalism in the Shop." I can summarize the entire class in one succinct phrase, "Learn how to appeal to, and communicate with, Mr. Doctor, Mr. Lawyer, or Mr. Wealthy-man." Basically, the class says that we are supposed to kiss their ass so they keep coming back to the Audi, BMW, Mercedes, Volvo, or Cadillac dealership. It feels a lot like begging for money on behalf of my employer (of which very little of it I will see) and nothing like professionalism. Professionalism is doing the job right the first time, not jerking the customer off. Professionalism is not begging for a 5 star review for a few measly extra bucks but doing absolute top quality work. I guess the upshot is that this class will be the easiest 4.0 that I've ever seen.

There is something fundamentally wrong when the wealthy elite have basically demanded that we beg them for every little scrap. I can understand the importance of polite and professional interaction but this prevalent expectation that we bend over backwards for them crosses a line with me. I still suck it up because I have to but it chafes my ass to basically validate the wealthy man.

ElitistWhiner ( 79961 ) writes:
Natural talent... ( Score: 2 )

In the 70's I worked with two people who had a natural talent for computer science algorithms vs. coding syntax. In the 90's, while at COLUMBIA, I worked with only a couple of true computer scientists out of 30 students. I've met 1 genius who programmed, spoke 13 languages, was ex-CIA, wrote SWIFT and spoke fluent assembly, complete with animated characters.

According to the Bluff Book, everyone else without natural talent fakes it. In the undiluted definition of computer science, genetics roulette and intellectual d

fahrbot-bot ( 874524 ) writes:
Other book sells better and is more interesting ( Score: 2 )
New Book Describes 'Bluffing' Programmers in Silicon Valley

It's not as interesting as the one about "fluffing" [urbandictionary.com] programmers.

Anonymous Coward writes:
Re: ( Score: 3 , Funny)

Ah yes, the good old 80:20 rule, except it's recursive for programmers.

80% are shit, so you fire them. Soon you realize that 80% of the remaining 20% are also shit, so you fire them too. Eventually you realize that 80% of the 4% remaining after sacking the 80% of the 20% are also shit, so you fire them!

...

The cycle repeats until there's just one programmer left: the person telling the joke.

---

tl;dr: All programmers suck. Just ask them to review their own code from more than 3 years ago: they'll tell you that

luis_a_espinal ( 1810296 ) writes:
Re: ( Score: 3 )
Who gives a fuck about lines? If someone gave me JavaScript, and someone gave me minified JavaScript, which one would I want to maintain?

I don't care about your line savings; less isn't always better.

Because the world of programming is not centered on JavaScript, and reduction of lines is not the same as minification. If the first thing that came to your mind when you saw this conversation was minified JavaScript, you are certainly not the type of programmer I would want to inherit code from.

See, there's a lot of shit out there that is overtly redundant and unnecessarily complex. This is especially true when copy-n-paste code monkeys are left to their own devices, for whom code formatting seems

Anonymous Coward , Sunday April 29, 2018 @01:17AM ( #56522241 )
Re:Most "Professional programmers" are useless. ( Score: 4 , Interesting)

I have a theory that 10% of people are good at what they do. It doesn't really matter what they do, they will still be good at it, because of their nature. These are the people who invent new things, who fix things that others didn't even see as broken, and who automate routine tasks or simply question and erase tasks that are not necessary. If you have a software team that contains 5 of these, you can easily beat a team of 100 average people, not only in cost but also in schedule, quality and features. In theory they are worth 20 times more than average employees, but in practice they are usually paid the same amount of money, with few exceptions.

80% of people are the average. They can follow instructions and they can get the work done, but they don't see that something is broken and needs fixing if it works the way it has always worked. While it might seem so, these people are not worthless. There are a lot of tasks that these people are happily doing which the 10% don't want to do, e.g. simple maintenance work, implementing simple features, automating test cases etc. But if you let the top 10% lead the project, you most likely won't need that many of these people. Most of the work done by these people is caused by themselves, by writing bad software due to the lack of a good leader.

10% are just causing damage. I'm not talking about terrorists and criminals. I have seen software developers who have tried (their best?), but still end up causing just damage to the code that someone else needs to fix, costing much more than their own wasted time. You really must use code reviews if you don't know your team members, to find these people early.

Anonymous Coward , Sunday April 29, 2018 @01:40AM ( #56522299 )
Re:Most "Professional programmers" are useless. ( Score: 5 , Funny)
to find these people early

and promote them to management where they belong.

raymorris ( 2726007 ) , Sunday April 29, 2018 @01:51AM ( #56522329 ) Journal
Seems about right. Constantly learning, studying ( Score: 5 , Insightful)

That seems about right to me.

I have a lot of weaknesses. My people skills suck, I'm scrawny, I'm arrogant. I'm also generally known as a really good programmer and people ask me how/why I'm so much better at my job than everyone else in the room. (There are a lot of things I'm not good at, but I'm good at my job, so say everyone I've worked with.)

I think one major difference is that I'm always studying, intentionally working to improve, every day. I've been doing that for twenty years.

I've worked with people who have "20 years of experience"; they've done the same job, in the same way, for 20 years. Their first month on the job they read the first half of "Databases for Dummies" and that's what they've been doing for 20 years. They never read the second half, and use Oracle database 18.0 exactly the same way they used Oracle Database 2.0 - and it was wrong 20 years ago too. So it's not just experience, it's 20 years of learning, getting better, every day. That's 7,305 days of improvement.

gbjbaanb ( 229885 ) writes:
Re: ( Score: 2 )

I think I can guarantee that they are a lot better at their jobs than you think, and that you are a lot worse at your job than you think too.

m00sh ( 2538182 ) writes:
Re: ( Score: 2 )
That seems about right to me.

I have a lot of weaknesses. My people skills suck, I'm scrawny, I'm arrogant. I'm also generally known as a really good programmer and people ask me how/why I'm so much better at my job than everyone else in the room. (There are a lot of things I'm not good at, but I'm good at my job, so say everyone I've worked with.)

I think one major difference is that I'm always studying, intentionally working to improve, every day. I've been doing that for twenty years.

I've worked with people who have "20 years of experience"; they've done the same job, in the same way, for 20 years. Their first month on the job they read the first half of "Databases for Dummies" and that's what they've been doing for 20 years. They never read the second half, and use Oracle database 18.0 exactly the same way they used Oracle Database 2.0 - and it was wrong 20 years ago too. So it's not just experience, it's 20 years of learning, getting better, every day. That's 7,305 days of improvement.

If you take this attitude towards other people, they will not ask you for help. At the same time, you'll also be unable to ask for their help.

You're not interviewing your peers. They are already in your team. You should be working together.

I've seen superstar programmers suck the life out of a project by over-complicating things and not working together with others.

raymorris ( 2726007 ) writes:
Which part? Learning makes you better? ( Score: 2 )

You quoted a lot. Is there one part in particular you have in mind? The thesis of my post is of course "constant learning, on purpose, makes you better".

> If you take this attitude towards other people, they will not ask you for help. At the same time, you'll also be unable to ask for their help.

Are you saying that trying to learn means you can't ask for help, or was there something more specific? For me, trying to learn means asking.

Trying to learn, I've had the opportunity to ask for help from peop

phantomfive ( 622387 ) writes:
Re: ( Score: 2 )

The difference between a smart programmer who succeeds and a stupid programmer who drops out is that the smart programmer doesn't give up.

complete loony ( 663508 ) writes:
Re: ( Score: 2 )

In other words;

What is often mistaken for 20 years' experience, is just 1 year's experience repeated 20 times.
serviscope_minor ( 664417 ) writes:
Re: ( Score: 2 )

10% are just causing damage. I'm not talking about terrorists and criminals.

Terrorists and criminals have nothing on those guys. I know a guy who is one of those. Worse, he's both motivated and enthusiastic. He also likes to offer help and advice to other people who don't know the systems well.

asifyoucare ( 302582 ) , Sunday April 29, 2018 @08:49AM ( #56522999 )
Re:Most "Professional programmers" are useless. ( Score: 5 , Insightful)

Good point. To quote Kurt von Hammerstein-Equord:

"I divide my officers into four groups. There are clever, diligent, stupid, and lazy officers. Usually two characteristics are combined. Some are clever and diligent -- their place is the General Staff. The next lot are stupid and lazy -- they make up 90 percent of every army and are suited to routine duties. Anyone who is both clever and lazy is qualified for the highest leadership duties, because he possesses the intellectual clarity and the composure necessary for difficult decisions. One must beware of anyone who is stupid and diligent -- he must not be entrusted with any responsibility because he will always cause only mischief."

gweihir ( 88907 ) writes:
Re: ( Score: 2 )

Oops. Good thing I never did anything military. I am definitely in the "clever and lazy" class.

apoc.famine ( 621563 ) writes:
Re: ( Score: 2 )

I was just thinking the same thing. One of my passions in life is coming up with clever ways to do less work while getting more accomplished.

Software_Dev_GL ( 5377065 ) writes:
Re: ( Score: 2 )

It's called the Pareto Distribution [wikipedia.org]. The number of competent people (the people doing most of the work) in any given organization goes like the square root of the number of employees.
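[Editor's note: as arithmetic, the square-root rule (a claim more usually associated with Price's law than with the Pareto distribution proper) works out like this:]

from math import sqrt
for n in (10, 100, 1000, 10000):
    print(n, "employees ->", round(sqrt(n)), "doing most of the work")

So a 100-person shop would have about 10 real producers, and a 10,000-person one only about 100 -- which is the commenter's point about large organizations.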

gweihir ( 88907 ) writes:
Re: ( Score: 2 )

Matches my observations. 10-15% are smart, can think independently, can verify claims by others, and can identify and use rules in whatever they do. They are not fooled by things "everybody knows" and see standard approaches as first approximations that, of course, need to be verified to work. They do not trust anything blindly, but can identify whether something actually works well, and build up a toolbox of such things.

The problem is that in coding, you do not have a "(mass) production step", and that is the

geoskd ( 321194 ) writes:
Re: ( Score: 2 )

In basic concept I agree with your theory; it fits my own anecdotal experience well, but I find that your numbers are off. The top bracket is actually closer to 20%. The reason it seems so low is that a large portion of the highly competent people are running one-programmer shows, so they have no co-workers to appreciate their knowledge and skill. The places they work do a very good job of keeping them well paid and happy (assuming they don't own the company outright), so they rarely if ever switch jobs.

The

Tablizer ( 95088 ) , Sunday April 29, 2018 @01:54AM ( #56522331 ) Journal
Re:Most "Professional programmers" are useless. ( Score: 4 , Interesting)
at least 70, probably 80, maybe even 90 percent of professional programmers should just fuck off and do something else as they are useless at programming.

Programming is statistically a dead-end job. Why should anyone hone a dead-end skill that you won't be able to use for long? For whatever reason, the industry doesn't want old programmers.

Otherwise, I'd suggest longer training and education before they enter the industry. But that just narrows an already narrow window of use.

Cesare Ferrari ( 667973 ) writes:
Re: ( Score: 2 )

Well, it does rather depend on which industry you work in - I've managed to find interesting programming jobs for 25 years, and there's no end in sight of interesting projects and new avenues to explore. However, this isn't for everyone, and if you have good personal skills then moving from programming into some technical management role is a very worthwhile route, and I know plenty of people who have found very interesting work in that direction.

gweihir ( 88907 ) writes:
Re: ( Score: 3 , Insightful)

I think that is a misinterpretation of the facts. Old(er) coders that are incompetent are just much more obvious, and usually are also limited to technologies that have gotten old as well. Hence the 90% of old coders that cannot actually hack it (and never really could) get sacked at some point and cannot find a new job with their limited and outdated skills. The 10% that are good at it do not need to worry, though. The ones who should worry are their employers, when these people approach retirement age.

gweihir ( 88907 ) writes:
Re: ( Score: 2 )

My experience as an IT Security Consultant (I also do some coding, but only at full rates) confirms that. Most are basically helpless and many have negative productivity, because people with a clue need to clean up after them. "Learn to code"? We have far too many coders already.

tomhath ( 637240 ) writes:
Re: ( Score: 2 )

You can't bluff your way through writing software, but many, many people have bluffed their way into a job and then tried to learn it from the people who are already there. In a marginally functional organization those incompetents are let go pretty quickly, but sometimes they stick around for months or years.

Apparently the author of this book is one of those, probably hired and fired several times before deciding to go back to his liberal arts roots and write a book.

DaMattster ( 977781 ) writes:
Re: ( Score: 2 )

There are some mechanics that bluff their way through an automotive repair. It's the same damn thing

gweihir ( 88907 ) writes:
Re: ( Score: 2 )

I think you can and this is by far not the first piece describing that. Here is a classic: https://blog.codinghorror.com/... [codinghorror.com]
Yet these people somehow manage to actually have "experience" because they worked in a role they are completely unqualified to fill.

phantomfive ( 622387 ) writes:
Re: ( Score: 2 )
Fiddling with JavaScript libraries to get a fancy dancy interface that makes PHB's happy is a sought-after skill, for good or bad. Now that we rely more on half-ass libraries, much of "programming" is fiddling with dark-grey boxes until they work good enough.

This drives me crazy, but I'm consoled somewhat by the fact that it will all be thrown out in five years anyway.

[Nov 30, 2017] Will Robots Kill the Asian Century

This article is two years old and not much has happened during those two years. But there is still a chance that highly automated factories can make manufacturing in the USA profitable again. The problem is that they will be even more profitable in East Asia ;-)
The National Interest

The rise of technologies such as 3-D printing and advanced robotics means that the next few decades for Asia's economies will not be as easy or promising as the previous five.

OWEN HARRIES, the first editor, together with Robert Tucker, of The National Interest, once reminded me that experts-economists, strategists, business leaders and academics alike-tend to be relentless followers of intellectual fashion, and the learned, as Harold Rosenberg famously put it, a "herd of independent minds." Nowhere is this observation more apparent than in the prediction that we are already into the second decade of what will inevitably be an "Asian Century"-a widely held but rarely examined view that Asia's continued economic rise will decisively shift global power from the Atlantic to the western Pacific Ocean.

No doubt the numbers appear quite compelling. In 1960, East Asia accounted for a mere 14 percent of global GDP; today that figure is about 27 percent. If linear trends continue, the region could account for about 36 percent of global GDP by 2030 and over half of all output by the middle of the century. As if symbolic of a handover of economic preeminence, China, which only accounted for about 5 percent of global GDP in 1960, will likely surpass the United States as the largest economy in the world over the next decade. If past record is an indicator of future performance, then the "Asian Century" prediction is close to a sure thing.

[Nov 29, 2017] Take This GUI and Shove It

Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI.
Notable quotes:
"... Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI. ..."
"... What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers. ..."
"... AIX's SMIT did this, or rather it wrote the commands that it executed to achieve what you asked it to do. This meant that you could learn: look at what it did and find out about which CLI commands to run. You could also take them, build them into a script, copy elsewhere, ... I liked SMIT. ..."
"... Cisco's GUI stuff doesn't really generate any scripts, but the commands it creates are the same things you'd type into a CLI. And the resulting configuration is just as human-readable (barring any weird naming conventions) as one built using the CLI. I've actually learned an awful lot about the Cisco CLI by using their GUI. ..."
"... Microsoft's more recent tools are also doing this. Exchange 2007 and newer, for example, are really completely driven by the PowerShell CLI. The GUI generates commands and just feeds them into PowerShell for you. So you can again issue your commands through the GUI, and learn how you could have done it in PowerShell instead. ..."
"... Moreover, the GUI authors seem to have a penchant to find new names for existing CLI concepts. Even worse, those names are usually inappropriate vagueries quickly cobbled together in an off-the-cuff afterthought, and do not actually tell you where the doodad resides in the menu system. With a CLI, the name of the command or feature set is its location. ..."
"... I have a cheap router with only a web gui. I wrote a two line bash script that simply POSTs the right requests to URL. Simply put, HTTP interfaces, especially if they implement the right response codes, are actually very nice to script. ..."
Slashdot

Deep End's Paul Venezia speaks out against the overemphasis on GUIs in today's admin tools, saying that GUIs are fine and necessary in many cases, but only after a complete CLI is in place, and that they cannot interfere with the use of the CLI, only complement it. Otherwise, the GUI simply makes easy things easy and hard things much harder. He writes, 'If you have to make significant, identical changes to a bunch of Linux servers, is it easier to log into them one-by-one and run through a GUI or text-menu tool, or write a quick shell script that hits each box and either makes the changes or simply pulls down a few new config files and restarts some services? And it's not just about conservation of effort - it's also about accuracy. If you write a script, you're certain that the changes made will be identical on each box. If you're doing them all by hand, you aren't.'"
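[Editor's note: the "quick shell script that hits each box" approach Venezia describes can be sketched in a few lines. A hedged Python version follows; the host names, file path and service are invented, and it assumes ssh keys are already in place.]

import subprocess

HOSTS = ["web1", "web2", "web3"]

for host in HOSTS:
    # Push the same config to every box, then restart the service --
    # identical changes on each machine, which is the accuracy argument from the quote.
    subprocess.run(["scp", "httpd.conf", host + ":/etc/httpd/conf/httpd.conf"], check=True)
    subprocess.run(["ssh", host, "service httpd restart"], check=True)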

alain94040 (785132)

Here is a Link to the print version of the article [infoworld.com] (that conveniently fits on 1 page instead of 3).

Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI.

A bad GUI with no CLI is the worst of both worlds; the author of the article got that right. The 80/20 rule applies: 80% of the work is common to everyone, and should be offered with a GUI. And for the 20% that is custom to each sysadmin -- well, use the CLI.

maxwell demon:

What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers.

0123456 (636235) writes:

What would be nice is if the GUI could automatically create a shell script doing the change.

While it's not quite the same thing, our GUI-based home router has an option to download the config as a text file so you can automatically reconfigure it from that file if it has to be reset to defaults. You could presumably use sed to change IP addresses, etc, and copy it to a different router. Of course it runs Linux.
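[Editor's note: the "download the config, tweak it, load it elsewhere" trick is scriptable in any language. A small Python sketch of the sed-style edit; the filenames and addresses are invented.]

import re

with open("router-a.cfg") as f:
    cfg = f.read()

# Rewrite the LAN address before feeding the file to the second router.
cfg = re.sub(r"\b192\.168\.1\.1\b", "192.168.2.1", cfg)

with open("router-b.cfg", "w") as f:
    f.write(cfg)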

Alain Williams:

AIX's SMIT did this, or rather it wrote the commands that it executed to achieve what you asked it to do. This meant that you could learn: look at what it did and find out about which CLI commands to run. You could also take them, build them into a script, copy elsewhere, ... I liked SMIT.
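[Editor's note: the SMIT behaviour is worth sketching, since several comments below describe the same pattern. A hypothetical GUI handler that both executes a change and appends the exact command to a replayable log might look like this; the command and log name are invented.]

import subprocess

def run_and_log(cmd, log_path="admin-actions.sh"):
    # Record the exact CLI command before running it, SMIT-style,
    # so the user can learn from it or replay it on another machine.
    with open(log_path, "a") as log:
        log.write(" ".join(cmd) + "\n")
    subprocess.run(cmd, check=True)

# e.g. a "set hostname" button could call:
# run_and_log(["hostname", "web1"])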

Ephemeriis:

What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers.

Cisco's GUI stuff doesn't really generate any scripts, but the commands it creates are the same things you'd type into a CLI. And the resulting configuration is just as human-readable (barring any weird naming conventions) as one built using the CLI. I've actually learned an awful lot about the Cisco CLI by using their GUI.

We've just started working with Aruba hardware. Installed a mobility controller last week. They've got a GUI that does something similar. It's all a pretty web-based front-end, but it again generates CLI commands and a human-readable configuration. I'm still very new to the platform, but I'm already learning about their CLI through the GUI. And getting work done that I wouldn't be able to if I had to look up the CLI commands for everything.

Microsoft's more recent tools are also doing this. Exchange 2007 and newer, for example, are really completely driven by the PowerShell CLI. The GUI generates commands and just feeds them into PowerShell for you. So you can again issue your commands through the GUI, and learn how you could have done it in PowerShell instead.

Anpheus:

Just about every Microsoft tool newer than 2007 does this. Virtual machine manager, SQL Server has done it for ages, I think almost all the system center tools do, etc.

It's a huge improvement.

PoV:

All good admins document their work (don't they? DON'T THEY?). With a CLI or a script that's easy: it comes down to "log in as user X, change to directory Y, run script Z with arguments A, B and C - the output should look like D". Try that when all you have is a GLUI (like a GUI, but you get stuck): open this window, select that option, drag a slider, check these boxes, click Yes three times. The output might look a little like this blurry screen shot, and the only record of a successful execution is a window that disappears as soon as the application ends.

I suppose the Linux community should be grateful that Windows made the fundamental systems-design error of making everything graphical. Without that basic failure, Linux might never have gotten even the toe-hold it has now.

skids:

I think this is a stronger point than the OP: GUIs do not lead to good documentation. In fact, GUIs pretty much are limited to procedural documentation like the example you gave.

The best they can do as far as actual documentation, where the precise effect of all the widgets is explained, is a screenshot with little quote bubbles pointing to each doodad. That's a ridiculous way to document.

This is as opposed to a command reference which can organize, usually in a pretty sensible fashion, exact descriptions of what each command does.

Moreover, the GUI authors seem to have a penchant to find new names for existing CLI concepts. Even worse, those names are usually inappropriate vagueries quickly cobbled together in an off-the-cuff afterthought, and do not actually tell you where the doodad resides in the menu system. With a CLI, the name of the command or feature set is its location.

Not that even good command references are mandatory by today's pathetic standards. Even the big boys like Cisco have shown major degradation in the quality of their documentation during the last decade.

pedantic bore:

I think the author might not fully understand who most admins are. They're people who couldn't write a shell script if their lives depended on it, because they've never had to. GUI-dependent users become GUI-dependent admins.

As a percentage of computer users, people who can actually navigate a CLI are an ever-diminishing group.

arth1 (260657) writes:

# NetworkManager keeps rewriting /etc/resolv.conf; disable it and manage the network the old way:
/etc/init.d/NetworkManager stop
chkconfig NetworkManager off
chkconfig network on
vi /etc/sysconfig/network
vi /etc/sysconfig/network-scripts/ifcfg-eth0

At least they named it NetworkManager, so experienced admins could recognize it as a culprit. Anything named in CamelCase is almost invariably written by new school programmers who don't grok the Unix toolbox concept and write applications instead of tools, and the bloated drivel is usually best avoided.

Darkness404 (1287218) writes: on Monday October 04, @07:21PM (#33789446)

There are more and more small businesses (5, 10 or so employees) realizing that they can get things done more easily if they had a server. Because the business can't really afford to hire a sysadmin or a full-time tech person, it's generally the employee who "knows computers" (you know, the person who has to help the boss check his e-mail every day, etc.), and since they don't have the knowledge of a skilled *nix admin, a GUI makes their administration a lot easier.

So with the increasing use of servers among non-admins, it only makes sense for a growth in GUI-based solutions.

Svartalf (2997) writes: Ah... But the thing is... You don't NEED the GUI with recent Linux systems- you do with Windows.

oatworm (969674) writes: on Monday October 04, @07:38PM (#33789624) Homepage

Bingo. Realistically, if you're a company with fewer than 100 employees (read: most companies), you're only going to have a handful of servers in house, and they're each going to be dedicated to particular roles. You're not going to have 100 clustered fileservers - instead, you're going to have one or maybe two. You're not going to have a dozen e-mail servers - instead, you're going to have one or two. Consequently, the office admin's focus isn't going to be scalability; it just won't matter to the admin if they can script, say, creating a mailbox for 100 new users instead of just one. Instead, said office admin is going to be more focused on finding ways to do semi-unusual things (e.g. "create a VPN between this office and our new branch office", "promote this new server as a domain controller", "install SQL", etc.) that they might do, oh, once a year.

The trouble with Linux, and I'm speaking as someone who's used YaST in precisely this context, is that you have to make a choice - do you let the GUI manage it or do you CLI it? If you try to do both, there will be inconsistencies because the grammar of the config files is too ambiguous; consequently, the GUI config file parser will probably just overwrite whatever manual changes it thinks are "invalid", whether they really are or not. If you let the GUI manage it, you had better hope the GUI has the flexibility necessary to meet your needs. If, for example, YaST doesn't understand named Apache virtual hosts, well, good luck figuring out where it's hiding all of the various config files that it was sensibly spreading out in multiple locations for you, and don't you dare use YaST to manage Apache again or it'll delete your Apache-legal but YaST-"invalid" directive.

The only solution I really see is for manual config file support with optional XML (or some other machine-friendly but still human-readable format) linkages. For example, if you want to hand-edit your resolv.conf, that's fine, but if the GUI is going to take over, it'll toss a directive on line 1 that says "#import resolv.conf.xml" and immediately overrides (but does not overwrite) everything following that. Then, if you still want to use the GUI but need to hand-edit something, you can edit the XML file using the appropriate syntax and know that your change will be reflected on the GUI.

That's my take. Your mileage, of course, may vary.

icebraining (1313345) writes: on Monday October 04, @07:24PM (#33789494) Homepage

I have a cheap router with only a web GUI. I wrote a two-line bash script that simply POSTs the right requests to a URL. Simply put, HTTP interfaces, especially if they implement the right response codes, are actually very nice to script.
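
Something in the spirit of that script, sketched with curl (the endpoint, credentials, and form fields here are invented; a real router's web GUI has its own, which you can lift from the browser's network inspector):

    #!/bin/bash
    # reboot a hypothetical router through its web GUI, printing the HTTP status
    curl -s -u admin:"$ROUTER_PW" \
         -d 'action=reboot' -d 'apply=1' \
         -o /dev/null -w '%{http_code}\n' \
         http://192.168.1.1/apply.cgi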

devent (1627873) writes:

Why Windows servers have a GUI is beyond me anyway. The servers are running 99.99% of the time without a monitor, and normally you just log in over ssh to a console if you need to administer them. But they are consuming the extra RAM, the extra CPU cycles, and carrying the extra security threats. I don't know, but can you uninstall the GUI from a Windows server? Or better, do you have an option for a no-GUI installation? I just saw the minimum hardware requirements: 512 MB RAM and 32 GB or greater disk space. My server runs ...

sirsnork (530512) writes: on Monday October 04, @07:43PM (#33789672)

it's called a "core" install in Server 2008 and up, and if you do that, there is no going back, you can't ever add the GUI back.

What this means is you can run a small subset of MS services that don't need GUI interaction. With R2 that subset grew somewhat as they added the ability to install .NET too, which meant you could run IIS in a useful manner (arguably the strongest reason to want to do this in the first place).

Still, it's a one-way trip, and you had better be damn sure what services need to run on that box for the lifetime of that box or you're looking at a reinstall. Most Windows admins will still tell you the risk isn't worth it.

Simple things like network configuration are tedious without a GUI in Windows, and, at least the last time I looked, you lost the ability to trunk network ports because the NIC manufacturers all assumed you had a GUI to configure your NICs.

prichardson (603676) writes: on Monday October 04, @07:27PM (#33789520) Journal

This is also a problem with Mac OS X Server. Apple builds its services from open source products and adds a GUI for configuration to make it all clickable and easy to set up. However, many options that can be set on the command line can't be set in the GUI. Even worse, making CLI changes to services can break the GUI entirely.

The hardware and software are both super stable and run really smoothly, so once everything gets set up, it's awesome. Still, it's hard for a guy who would rather make changes on the CLI to get used to.

MrEricSir (398214) writes:

Just because you're used to a CLI doesn't make it better. Why would I want to read a bunch of documentation, mess with command line options, then read a whole block of text to see what it did? I'd much rather sit back in my chair, click something, and then see if it worked. Don't make me read a bunch of man pages just to do a simple task. In essence, the question here is whether it's okay for the user to be lazy and use a GUI, or whether the programmer should be too lazy to develop a GUI.

ak_hepcat (468765) writes: <leif@MENCKENdenali.net minus author> on Monday October 04, @07:38PM (#33789626) Homepage Journal

Probably because it's also about the ease of troubleshooting issues.

How do you troubleshoot something with a GUI after you've misconfigured it? How do you troubleshoot a programming error (bug) in the GUI -> device communication? How do you scale to tens, hundreds, or thousands of devices with a GUI?

CLI makes all this easier and more manageable.
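
The scaling point is where the CLI argument bites hardest: once a task is a single command, fanning it out to a fleet is one loop (a minimal sketch; hosts.txt is a hypothetical one-host-per-line inventory, and key-based ssh is assumed):

    #!/bin/bash
    # run the same check everywhere, prefixing each answer with its origin
    while read -r host; do
        ssh -n -o BatchMode=yes "$host" uptime | sed "s/^/$host: /"
    done < hosts.txt

The -n keeps ssh from swallowing the loop's stdin -- a classic trap in exactly this pattern.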

arth1 (260657) writes:

Why would I want to read a bunch of documentation, mess with command line options, then read a whole block of text to see what it did? I'd much rather sit back in my chair, click something, and then see if it worked. Don't make me read a bunch of man pages just to do a simple task.

Because then you'll be stuck doing simple tasks, and will never be able to do more advanced tasks. Without hiring a team to write an app for you instead of doing it yourself in two minutes, that is. The time you spend reading man ...

fandingo (1541045) writes: on Monday October 04, @07:54PM (#33789778)

I don't think you really understand systems administration. "Users," or in this case admins, don't typically do stuff once. Furthermore, they need to know what they did and how to do it again (i.e., on a new server or whatever), or at least remember what they did. One-off stuff isn't common, and it is a sign of poor administration (i.e., of not tracking changes and following processes).

What I'm trying to get at is that admins shouldn't do anything without reading the manual. As a Windows/Linux admin, I tend to find Linux easier to properly administer because I either already know how to perform an operation or I have to read the manual (manpage) and learn a decent amount about the operation (i.e. more than click here/use this flag).

Don't get me wrong, GUIs can make unknown operations significantly easier, but they often lead to poor process management. To document processes, screenshots are typically needed. They can be done well, but I find that GUI documentation (created by admins, not vendor docs) tends to be of very low quality. It is also vulnerable to "upgrades" where vendors change the interface design. CLI programs typically have more stable interfaces, but maybe that's just because they have been around longer...

maotx (765127) writes: <maotx@NoSPAM.yahoo.com> on Monday October 04, @07:42PM (#33789666)

That's one thing Microsoft did right with Exchange 2007. They built it entirely around their new PowerShell CLI and then built a GUI for it. The GUI is limited compared to what you can do with the CLI, but you can get most things done. The CLI becomes extremely handy for batch jobs and exporting statistics to CSV files. I'd say it's really up there with bash in terms of scripting, data manipulation, and integration (not just Exchange but WMI, SQL, etc.)

They tried to do something similar with Windows 2008 and their Core [petri.co.il] feature, but they still have to load a GUI to present a prompt...

Charles Dodgeson (248492) writes: <jeffrey@goldmark.org> on Monday October 04, @08:51PM (#33790206) Homepage Journal

Probably Debian would have been OK, but I was finding admin of most Linux distros a pain for exactly these reasons. I couldn't find a layer where I could do everything that I needed to do without worrying about one thing stepping on another. No doubt there are ways that I could manage a Linux system without running into different layers of management tools stepping on each other, but it was a struggle.

There were other reasons as well (although there is a lot that I miss about Linux), but I think that this was one of the leading reasons.

(NB: I realize that this is flamebait (I've got karma to burn), but that isn't my intention here.)

[Nov 28, 2017] Sometimes the Old Ways Are Best by Brian Kernighan

Notable quotes:
"... Sometimes the old ways are best, and they're certainly worth knowing well ..."
Nov 01, 2008 | IEEE Software, pp.18-19

As I write this column, I'm in the middle of two summer projects; with luck, they'll both be finished by the time you read it.

... ... ...

There has surely been much progress in tools over the 25 years that IEEE Software has been around, and I wouldn't want to go back in time.

But the tools I use today are mostly the same old ones: grep, diff, sort, awk, and friends. This might well mean that I'm a dinosaur stuck in the past.

On the other hand, when it comes to doing simple things quickly, I can often have the job done while experts are still waiting for their IDE to start up. Sometimes the old ways are best, and they're certainly worth knowing well.

[Nov 28, 2017] Rees Re OO

Notable quotes:
"... The conventional Simula 67-like pattern of class and instance will get you {1,3,7,9}, and I think many people take this as a definition of OO. ..."
"... Because OO is a moving target, OO zealots will choose some subset of this menu by whim and then use it to try to convince you that you are a loser. ..."
"... In such a pack-programming world, the language is a constitution or set of by-laws, and the interpreter/compiler/QA dept. acts in part as a rule checker/enforcer/police force. Co-programmers want to know: If I work with your code, will this help me or hurt me? Correctness is undecidable (and generally unenforceable), so managers go with whatever rule set (static type system, language restrictions, "lint" program, etc.) shows up at the door when the project starts. ..."
Nov 04, 2017 | www.paulgraham.com

(Jonathan Rees had a really interesting response to Why Arc isn't Especially Object-Oriented, which he has allowed me to reproduce here.)

Here is an a la carte menu of features or properties that are related to these terms; I have heard OO defined to be many different subsets of this list.

  1. Encapsulation - the ability to syntactically hide the implementation of a type. E.g. in C or Pascal you always know whether something is a struct or an array, but in CLU and Java you can hide the difference.
  2. Protection - the inability of the client of a type to detect its implementation. This guarantees that a behavior-preserving change to an implementation will not break its clients, and also makes sure that things like passwords don't leak out.
  3. Ad hoc polymorphism - functions and data structures with parameters that can take on values of many different types.
  4. Parametric polymorphism - functions and data structures that parameterize over arbitrary values (e.g. list of anything). ML and Lisp both have this. Java doesn't quite because of its non-Object types.
  5. Everything is an object - all values are objects. True in Smalltalk (?) but not in Java (because of int and friends).
  6. All you can do is send a message (AYCDISAM) = Actors model - there is no direct manipulation of objects, only communication with (or invocation of) them. The presence of fields in Java violates this.
  7. Specification inheritance = subtyping - there are distinct types known to the language with the property that a value of one type is as good as a value of another for the purposes of type correctness. (E.g. Java interface inheritance.)
  8. Implementation inheritance/reuse - having written one pile of code, a similar pile (e.g. a superset) can be generated in a controlled manner, i.e. the code doesn't have to be copied and edited. A limited and peculiar kind of abstraction. (E.g. Java class inheritance.)
  9. Sum-of-product-of-function pattern - objects are (in effect) restricted to be functions that take as first argument a distinguished method key argument that is drawn from a finite set of simple names.

So OO is not a well-defined concept. Some people (e.g., Abelson and Sussman?) say Lisp is OO, by which they mean {3,4,5,7} (with the proviso that all types are in the programmers' heads). Java is supposed to be OO because of {1,2,3,7,8,9}. E is supposed to be more OO than Java because it has {1,2,3,4,5,7,9} and almost has 6; 8 (subclassing) is seen as antagonistic to E's goals and not necessary for OO.

The conventional Simula 67-like pattern of class and instance will get you {1,3,7,9}, and I think many people take this as a definition of OO.

Because OO is a moving target, OO zealots will choose some subset of this menu by whim and then use it to try to convince you that you are a loser.

Perhaps part of the confusion - and you say this in a different way in your little memo - is that the C/C++ folks see OO as a liberation from a world that has nothing resembling first-class functions, while Lisp folks see OO as a prison since it limits their use of functions/objects to the style of (9.). In that case, the only way OO can be defended is in the same manner as any other game or discipline -- by arguing that by giving something up (e.g. the freedom to throw eggs at your neighbor's house) you gain something that you want (assurance that your neighbor won't put you in jail).
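
Pattern (9) is easy to make concrete even in shell: the "object" is a dispatcher whose first argument is a method name drawn from a fixed menu (a toy sketch, with the counter's state kept in a global for brevity):

    # a "counter object" in the sum-of-product-of-function style
    _count=0
    counter() {
        case "$1" in
            inc) _count=$((_count + 1)) ;;
            get) echo "$_count" ;;
            *)   echo "counter: no such method: $1" >&2; return 1 ;;
        esac
    }
    counter inc; counter inc
    counter get    # prints 2

The Lisp complaint above is precisely that OO fences you into this shape: every interaction with the object must go through the finite menu of named methods.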

This is related to Lisp being oriented to the solitary hacker and discipline-imposing languages being oriented to social packs, another point you mention. In a pack you want to restrict everyone else's freedom as much as possible to reduce their ability to interfere with and take advantage of you, and the only way to do that is by either becoming chief (dangerous and unlikely) or by submitting to the same rules that they do. If you submit to rules, you then want the rules to be liberal so that you have a chance of doing most of what you want to do, but not so liberal that others nail you.

In such a pack-programming world, the language is a constitution or set of by-laws, and the interpreter/compiler/QA dept. acts in part as a rule checker/enforcer/police force. Co-programmers want to know: If I work with your code, will this help me or hurt me? Correctness is undecidable (and generally unenforceable), so managers go with whatever rule set (static type system, language restrictions, "lint" program, etc.) shows up at the door when the project starts.

I recently contributed to a discussion of anti-OO on the e-lang list. My main anti-OO message (actually it only attacks points 5/6) was http://www.eros-os.org/pipermail/e-lang/2001-October/005852.html . The followups are interesting but I don't think they're all threaded properly.

(The original message went on to give pet definitions of the terms used above; they are not reproduced here.)


[Nov 27, 2017] Stop Writing Classes

Notable quotes:
"... If there's something I've noticed in my career that is that there are always some guys that desperately want to look "smart" and they reflect that in their code. ..."
Nov 27, 2017 | www.youtube.com

Tom coAdjoint , 1 year ago

My god I wish the engineers at my work understood this

kobac , 2 years ago

If there's something I've noticed in my career, it is that there are always some guys who desperately want to look "smart" and they reflect that in their code.

If there's something else that I've noticed in my career, it's that their code is the hardest to maintain, and for some reason they want the rest of the team to depend on them, since they are the only ones "smart enough" to understand that code and change it. Needless to say, these guys are not part of my team. Your code should be direct, simple and readable. End of story.


[Nov 27, 2017] The Robot Productivity Paradox and the concept of the "bezzle"

This concept of the "bezzle" is an important one.
Notable quotes:
"... "In many ways the effect of the crash on embezzlement was more significant than on suicide. To the economist embezzlement is the most interesting of crimes. Alone among the various forms of larceny it has a time parameter. Weeks, months or years may elapse between the commission of the crime and its discovery. (This is a period, incidentally, when the embezzler has his gain and the man who has been embezzled, oddly enough, feels no loss. There is a net increase in psychic wealth.) ..."
"... At any given time there exists an inventory of undiscovered embezzlement in – or more precisely not in – the country's business and banks. ..."
"... This inventory – it should perhaps be called the bezzle – amounts at any moment to many millions [trillions!] of dollars. It also varies in size with the business cycle. ..."
"... In good times people are relaxed, trusting, and money is plentiful. But even though money is plentiful, there are always many people who need more. Under these circumstances the rate of embezzlement grows, the rate of discovery falls off, and the bezzle increases rapidly. ..."
"... In depression all this is reversed. Money is watched with a narrow, suspicious eye. The man who handles it is assumed to be dishonest until he proves himself otherwise. Audits are penetrating and meticulous. Commercial morality is enormously improved. The bezzle shrinks ..."
Feb 22, 2017 | econospeak.blogspot.com

Sandwichman -> Sandwichman ... February 24, 2017 at 08:36 AM

John Kenneth Galbraith, from "The Great Crash 1929":

"In many ways the effect of the crash on embezzlement was more significant than on suicide. To the economist embezzlement is the most interesting of crimes. Alone among the various forms of larceny it has a time parameter. Weeks, months or years may elapse between the commission of the crime and its discovery. (This is a period, incidentally, when the embezzler has his gain and the man who has been embezzled, oddly enough, feels no loss. There is a net increase in psychic wealth.)

At any given time there exists an inventory of undiscovered embezzlement in – or more precisely not in – the country's business and banks.

This inventory – it should perhaps be called the bezzle – amounts at any moment to many millions [trillions!] of dollars. It also varies in size with the business cycle.

In good times people are relaxed, trusting, and money is plentiful. But even though money is plentiful, there are always many people who need more. Under these circumstances the rate of embezzlement grows, the rate of discovery falls off, and the bezzle increases rapidly.

In depression all this is reversed. Money is watched with a narrow, suspicious eye. The man who handles it is assumed to be dishonest until he proves himself otherwise. Audits are penetrating and meticulous. Commercial morality is enormously improved. The bezzle shrinks."

Sandwichman, February 24, 2017 at 05:24 AM

For nearly half a century, from 1947 to 1996, real GDP and real Net Worth of Households and Non-profit Organizations (in 2009 dollars) both increased at a compound annual rate of a bit over 3.5%. GDP growth, in fact, was just a smidgen faster -- 0.016% -- than growth of Net Household Worth.

From 1996 to 2015, GDP grew at a compound annual rate of 2.3% while Net Worth increased at the rate of 3.6%....
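
The compound annual growth rate used here is (end/start)^(1/years) - 1; an awk one-liner makes figures like these easy to check against the FRED series (the endpoint levels below are placeholders, not the actual series values):

    # CAGR sketch -- substitute real start/end levels from the FRED series
    awk 'BEGIN { first = 2000; last = 3900; years = 19
                 printf "CAGR = %.2f%%\n", ((last/first)^(1/years) - 1) * 100 }'
    # -> CAGR = 3.58% with these placeholder numbers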

-- Sandwichman

anne -> anne... February 24, 2017 at 05:25 AM

https://fred.stlouisfed.org/graph/?g=cOU6

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1952-2016

(Indexed to 1952)

https://fred.stlouisfed.org/graph/?g=cPq1

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1992-2016

(Indexed to 1992)

anne -> Sandwichman ... February 24, 2017 at 03:35 PM

The real home price index extends from 1890. From 1890 to 1996, the index increased slightly faster than inflation so that the index was 100 in 1890 and 113 in 1996. However from 1996 the index advanced to levels far beyond any previously experienced, reaching a high above 194 in 2006. Previously the index high had been just above 130.

Though the index fell from 2006, the level in 2016 is above 161, a level only reached when the housing bubble had formed in late 2003-early 2004.

Real home prices are again strikingly high:

http://www.econ.yale.edu/~shiller/data.htm

anne -> Sandwichman ... February 24, 2017

Valuation

The Shiller 10-year price-earnings ratio is currently 29.34, so the inverse, or the earnings rate, is 3.41%. The dividend yield is 1.93%. So an expected yearly return over the coming 10 years would be 3.41 + 1.93, or 5.34%, provided the price-earnings ratio stays the same and before investment costs.

Against the 5.34% yearly expected return on stock over the coming 10 years, the current 10-year Treasury bond yield is 2.32%.

The risk premium for stocks is 5.34 - 2.32 or 3.02%:

http://www.econ.yale.edu/~shiller/data.htm
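
The arithmetic above is easy to re-run (the inputs are the figures quoted in the comment):

    awk 'BEGIN {
        cape = 29.34; div_yield = 1.93; bond = 2.32
        earn = 100 / cape                  # earnings rate: 3.41%
        expected = earn + div_yield        # expected return: 5.34%
        printf "expected %.2f%%  risk premium %.2f%%\n", expected, expected - bond
    }'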

anne -> anne..., February 24, 2017 at 05:36 AM

What the robot-productivity paradox actually is puzzles me, other than that since 2005, for all the focus on the productivity of robots and on robots replacing labor, there has been a dramatic, broad-based slowing in productivity growth.

However, what the changing relationship between the growth of GDP and net worth since 1996 shows is that asset valuations have been increasing relative to GDP. Valuations of stocks and homes are at sustained levels that are higher than at any time in the last 120 years. Bear markets in stocks and home prices have still left asset valuations at historically high levels. I have no idea why this should be.

Sandwichman -> anne... February 24, 2017 at 08:34 AM

The paradox is that productivity statistics can't tell us anything about the effects of robots on employment because both the numerator and the denominator are distorted by the effects of colossal Ponzi bubbles.

John Kenneth Galbraith used to call it "the bezzle." It is "that increment to wealth that occurs during the magic interval when a confidence trickster knows he has the money he has appropriated but the victim does not yet understand that he has lost it." The current size of the gross national bezzle (GNB) is approximately $24 trillion.

Ponzilocks and the Twenty-Four Trillion Dollar Question

http://econospeak.blogspot.ca/2017/02/ponzilocks-and-twenty-four-trillion.html

Twenty-three and a half trillion, actually. But what's a few hundred billion? Here today, gone tomorrow, as they say.

At the beginning of 2007, net worth of households and non-profit organizations exceeded its 1947-1996 historical average, relative to GDP, by some $16 trillion. It took 24 months to wipe out eighty percent, or $13 trillion, of that colossal but ephemeral slush fund. In mid-2016, net worth stood at a multiple of 4.83 times GDP, compared with the multiple of 4.72 on the eve of the Great Unworthing.

When I look at the ragged end of the chart I posted yesterday, it screams "Ponzi!" "Ponzi!" "Ponz..."

To make a long story short, let's think of wealth as capital. The value of capital is determined by the present value of an expected future income stream. The value of capital fluctuates with changing expectations but when the nominal value of capital diverges persistently and significantly from net revenues, something's got to give. Either economic growth is going to suddenly gush forth "like nobody has ever seen before" or net worth is going to have to come back down to earth.

Somewhere between 20 and 30 TRILLION dollars of net worth will evaporate within the span of perhaps two years.

When will that happen? Who knows? There is one notable regularity in the data, though -- the one that screams "Ponzi!"

When the net worth bubble stops going up...
...it goes down.

[Nov 27, 2017] The productivity paradox by Ryan Avent

Notable quotes:
"... But the economy does not feel like one undergoing a technology-driven productivity boom. In the late 1990s, tech optimism was everywhere. At the same time, wages and productivity were rocketing upward. The situation now is completely different. The most recent jobs reports in America and Britain tell the tale. Employment is growing, month after month after month. But wage growth is abysmal. So is productivity growth: not surprising in economies where there are lots of people on the job working for low pay. ..."
"... Increasing labour costs by making the minimum wage a living wage would increase the incentives to boost productivity growth? No, the neoliberals and corporate Democrats would never go for it. They're trying to appeal to the business community and their campaign contributors wouldn't like it. ..."
Mar 20, 2017 | medium.com

People are worried about robots taking jobs. Driverless cars are around the corner. Restaurants and shops increasingly carry the option to order by touchscreen. Google's clever algorithms provide instant translations that are remarkably good.

But the economy does not feel like one undergoing a technology-driven productivity boom. In the late 1990s, tech optimism was everywhere. At the same time, wages and productivity were rocketing upward. The situation now is completely different. The most recent jobs reports in America and Britain tell the tale. Employment is growing, month after month after month. But wage growth is abysmal. So is productivity growth: not surprising in economies where there are lots of people on the job working for low pay.

The obvious conclusion, the one lots of people are drawing, is that the robot threat is totally overblown: the fantasy, perhaps, of a bubble-mad Silicon Valley - or an effort to distract from workers' real problems, trade and excessive corporate power. Generally speaking, the problem is not that we've got too much amazing new technology but too little.

This is not a strawman of my own invention. Robert Gordon makes this case. You can see Matt Yglesias make it here. Duncan Weldon, for his part, writes:

We are debating a problem we don't have, rather than facing a real crisis that is the polar opposite. Productivity growth has slowed to a crawl over the last 15 or so years, business investment has fallen and wage growth has been weak. If the robot revolution truly was under way, we would see surging capital expenditure and soaring productivity. Right now, that would be a nice "problem" to have. Instead we have the reality of weak growth and stagnant pay. The real and pressing concern when it comes to the jobs market and automation is that the robots aren't taking our jobs fast enough.

And in a recent blog post Paul Krugman concluded:

I'd note, however, that it remains peculiar how we're simultaneously worrying that robots will take all our jobs and bemoaning the stalling out of productivity growth. What is the story, really?

What is the story, indeed. Let me see if I can tell one. Last fall I published a book: "The Wealth of Humans". In it I set out how rapid technological progress can coincide with lousy growth in pay and productivity. Start with this:

Low labour costs discourage investments in labour-saving technology, potentially reducing productivity growth.

Peter K. -> Peter K.... Monday, March 20, 2017 at 09:26 AM

Increasing labour costs by making the minimum wage a living wage would increase the incentives to boost productivity growth? No, the neoliberals and corporate Democrats would never go for it. They're trying to appeal to the business community and their campaign contributors wouldn't like it.

anne -> Peter K.... March 20, 2017 at 10:32 AM

https://twitter.com/paulkrugman/status/843167658577182725

Paul Krugman @paulkrugman

But is [Ryan Avent] saying something different from the assertion that recent tech progress is capital-biased?

https://krugman.blogs.nytimes.com/2012/12/26/capital-biased-technological-progress-an-example-wonkish/

If so, what?

anne -> Peter K.... March 20, 2017 at 10:33 AM

http://krugman.blogs.nytimes.com/2012/12/26/capital-biased-technological-progress-an-example-wonkish/

December 26, 2012

Capital-biased Technological Progress: An Example (Wonkish)
By Paul Krugman

Ever since I posted about robots and the distribution of income, * I've had queries from readers about what capital-biased technological change – the kind of change that could make society richer but workers poorer – really means. And it occurred to me that it might be useful to offer a simple conceptual example – the kind of thing easily turned into a numerical example as well – to clarify the possibility. So here goes.

Imagine that there are only two ways to produce output. One is a labor-intensive method – say, armies of scribes equipped only with quill pens. The other is a capital-intensive method – say, a handful of technicians maintaining vast server farms. (I'm thinking in terms of office work, which is the dominant occupation in the modern economy).

We can represent these two techniques in terms of unit inputs – the amount of each factor of production required to produce one unit of output. In the figure below I've assumed that initially the capital-intensive technique requires 0.2 units of labor and 0.8 units of capital per unit of output, while the labor-intensive technique requires 0.8 units of labor and 0.2 units of capital.

[Diagram]

The economy as a whole can make use of both techniques – in fact, it will have to unless it has either a very large amount of capital per worker or a very small amount. No problem: we can just use a mix of the two techniques to achieve any input combination along the blue line in the figure. For economists reading this, yes, that's the unit isoquant in this example; obviously if we had a bunch more techniques it would start to look like the convex curve of textbooks, but I want to stay simple here.

What will the distribution of income be in this case? Assuming perfect competition (yes, I know, but let's deal with that case for now), the real wage rate w and the cost of capital r – both measured in terms of output – have to be such that the cost of producing one unit is 1 whichever technique you use. In this example, that means w=r=1. Graphically, by the way, w/r is equal to minus the slope of the blue line.

Oh, and if you're worried, yes, workers and machines are both paid their marginal product.

But now suppose that technology improves – specifically, that production using the capital-intensive technique gets more efficient, although the labor-intensive technique doesn't. Scribes with quill pens are the same as they ever were; server farms can do more than ever before. In the figure, I've assumed that the unit inputs for the capital-intensive technique are cut in half. The red line shows the economy's new choices.

So what happens? It's obvious from the figure that wages fall relative to the cost of capital; it's less obvious, maybe, but nonetheless true that real wages must fall in absolute terms as well. In this specific example, technological progress reduces the real wage by a third, to 0.667, while the cost of capital rises to 2.33.
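
These numbers drop straight out of the zero-profit conditions, one per technique in use: a_L*w + a_K*r = 1. Solving the resulting 2x2 system by Cramer's rule reproduces Krugman's figures (a quick check in awk, with each technique's labor and capital unit inputs passed as arguments):

    # each technique in use must break even: a1*w + b1*r = 1, a2*w + b2*r = 1
    solve() {
        awk -v a1="$1" -v b1="$2" -v a2="$3" -v b2="$4" 'BEGIN {
            det = a1*b2 - a2*b1
            printf "w = %.3f   r = %.3f\n", (b2 - b1)/det, (a1 - a2)/det
        }'
    }
    solve 0.2 0.8 0.8 0.2    # before: w = 1.000   r = 1.000
    solve 0.1 0.4 0.8 0.2    # after:  w = 0.667   r = 2.333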

OK, it's obvious how stylized and oversimplified all this is. But it does, I think, give you some sense of what it would mean to have capital-biased technological progress, and how this could actually hurt workers.

* http://krugman.blogs.nytimes.com/2012/12/08/rise-of-the-robots/

anne -> Peter K.... March 20, 2017 at 10:34 AM

http://krugman.blogs.nytimes.com/2012/12/08/rise-of-the-robots/

December 8, 2012

Rise of the Robots
By Paul Krugman

Catherine Rampell and Nick Wingfield write about the growing evidence * for "reshoring" of manufacturing to the United States. They cite several reasons: rising wages in Asia; lower energy costs here; higher transportation costs. In a followup piece, ** however, Rampell cites another factor: robots.

"The most valuable part of each computer, a motherboard loaded with microprocessors and memory, is already largely made with robots, according to my colleague Quentin Hardy. People do things like fitting in batteries and snapping on screens.

"As more robots are built, largely by other robots, 'assembly can be done here as well as anywhere else,' said Rob Enderle, an analyst based in San Jose, California, who has been following the computer electronics industry for a quarter-century. 'That will replace most of the workers, though you will need a few people to manage the robots.' "

Robots mean that labor costs don't matter much, so you might as well locate in advanced countries with large markets and good infrastructure (which may soon not include us, but that's another issue). On the other hand, it's not good news for workers!

This is an old concern in economics; it's "capital-biased technological change," which tends to shift the distribution of income away from workers to the owners of capital.

Twenty years ago, when I was writing about globalization and inequality, capital bias didn't look like a big issue; the major changes in income distribution had been among workers (when you include hedge fund managers and CEOs among the workers), rather than between labor and capital. So the academic literature focused almost exclusively on "skill bias", supposedly explaining the rising college premium.

But the college premium hasn't risen for a while. What has happened, on the other hand, is a notable shift in income away from labor:

[Graph]

If this is the wave of the future, it makes nonsense of just about all the conventional wisdom on reducing inequality. Better education won't do much to reduce inequality if the big rewards simply go to those with the most assets. Creating an "opportunity society," or whatever it is the likes of Paul Ryan etc. are selling this week, won't do much if the most important asset you can have in life is, well, lots of assets inherited from your parents. And so on.

I think our eyes have been averted from the capital/labor dimension of inequality, for several reasons. It didn't seem crucial back in the 1990s, and not enough people (me included!) have looked up to notice that things have changed. It has echoes of old-fashioned Marxism - which shouldn't be a reason to ignore facts, but too often is. And it has really uncomfortable implications.

But I think we'd better start paying attention to those implications.

* http://www.nytimes.com/2012/12/07/technology/apple-to-resume-us-manufacturing.html

** http://economix.blogs.nytimes.com/2012/12/07/when-cheap-foreign-labor-gets-less-cheap/

anne -> anne... March 20, 2017 at 10:41 AM

https://fred.stlouisfed.org/graph/?g=d4ZY

January 30, 2017

Compensation of Employees as a share of Gross Domestic Income, 1948-2015


https://fred.stlouisfed.org/graph/?g=d507

January 30, 2017

Compensation of Employees as a share of Gross Domestic Income, 1948-2015

(Indexed to 1948)


[Oct 26, 2017] Amazon.com Customer reviews Extreme Programming Explained Embrace Change

Rapid Development by Steve McConnell is an older and better book. The Mythical Man-Month remains a valuable book as well, albeit dated.
Notable quotes:
"... Having a customer always available on site would mean that the customer in question is probably a small, expendable fish in his organization and is unlikely to have any useful knowledge of its business practices. ..."
"... Unit testing code before it is written means that one would have to have a mental picture of what one is going to write before writing it, which is difficult without upfront design. And maintaining such tests as the code changes would be a nightmare. ..."
"... Programming in pairs all the time would assume that your topnotch developers are also sociable creatures, which is rarely the case, and even if they were, no one would be able to justify the practice in terms of productivity. I won't discuss why I think that abandoning upfront design is a bad practice; the whole idea is too ridiculous to debate ..."
"... Both book and methodology will attract fledgling developers with its promise of hacking as an acceptable software practice and a development universe revolving around the programmer. It's a cult, not a methodology, were the followers shall find salvation and 40-hour working weeks ..."
"... Two stars for the methodology itself, because it underlines several common sense practices that are very useful once practiced without the extremity. ..."
"... The second is the dictatorial social engineering that eXtremity mandates. I've actually tried the pair programming - what a disaster. ..."
"... I've also worked with people who felt that their slightest whim was adequate reason to interfere with my work. That's what Beck institutionalizes by saying that any request made of me by anyone on the team must be granted. It puts me completely at the mercy of anyone walking by. The requisite bullpen physical environment doesn't work for me either. I find that the visual and auditory distraction make intense concentration impossible. ..."
"... One of the things I despise the most about the software development culture is the mindless adoption of fads. Extreme programming has been adopted by some organizations like a religious dogma. ..."
Oct 26, 2017 | www.amazon.com

Mohammad B. Abdulfatah on February 10, 2003

Programming Malpractice Explained: Justifying Chaos

To fairly review this book, one must distinguish between the methodology it presents and the actual presentation. As to the presentation, the author attempts to win the reader over with emotional persuasion and pep talk rather than with facts and hard evidence. Stories of childhood and comradeship don't classify as convincing facts to me.

A single case study – the C3 project – is often referred to, but with no specific information (do note that the project was cancelled by the client after staying in development for far too long).

As to the method itself, it basically boils down to four core practices:

  1. Always have a customer available on site.
  2. Unit test before you code.
  3. Program in pairs.
  4. Forfeit detailed design in favor of incremental, daily releases and refactoring.

If you do the above, and you have excellent staff on your hands, then the book promises that you'll reap the benefits of faster development, less overtime, and happier customers. Of course, the book fails to point out that if your staff is all highly qualified people, then the project is likely to succeed no matter what methodology you use. I'm sure that anyone who has worked in the software industry for sometime has noticed the sad state that most computer professionals are in nowadays.

However, assuming that you have all the topnotch developers that you desire, the outlined methodology is almost impossible to apply in real world scenarios. Having a customer always available on site would mean that the customer in question is probably a small, expendable fish in his organization and is unlikely to have any useful knowledge of its business practices.

Unit testing code before it is written means that one would have to have a mental picture of what one is going to write before writing it, which is difficult without upfront design. And maintaining such tests as the code changes would be a nightmare.

Programming in pairs all the time would assume that your topnotch developers are also sociable creatures, which is rarely the case, and even if they were, no one would be able to justify the practice in terms of productivity. I won't discuss why I think that abandoning upfront design is a bad practice; the whole idea is too ridiculous to debate.

Both book and methodology will attract fledgling developers with their promise of hacking as an acceptable software practice and a development universe revolving around the programmer. It's a cult, not a methodology, where the followers shall find salvation and 40-hour working weeks.

Experience is a great teacher, but only a fool would learn from it alone. Listen to what the opponents have to say before embracing change, and don't forget to take the proverbial grain of salt.

Two stars out of five for the presentation for being courageous and attempting to defy the standard practices of the industry. Two stars for the methodology itself, because it underlines several common sense practices that are very useful once practiced without the extremity.

wiredweird HALL OF FAME TOP 1000 REVIEWER on May 24, 2004
eXtreme buzzwording

Maybe it's an interesting idea, but it's just not ready for prime time.

Parts of Kent's recommended practice - including aggressive testing and short integration cycle - make a lot of sense. I've shared the same beliefs for years, but it was good to see them clarified and codified. I really have changed some of my practice after reading this and books like this.

I have two broad kinds of problem with this dogma, though. First is the near-abolition of documentation. I can't defend 2000 page specs for typical kinds of development. On the other hand, declaring that the test suite is the spec doesn't do it for me either. The test suite is code, written for machine interpretation. Much too often, it is not written for human interpretation. Based on the way I see most code written, it would be a nightmare to reverse engineer the human meaning out of any non-trivial test code. Some systematic way of ensuring human intelligibility in the code, traceable to specific "stories" (because "requirements" are part of the bad old way), would give me a lot more confidence in the approach.

The second is the dictatorial social engineering that eXtremity mandates. I've actually tried the pair programming - what a disaster. The less said the better, except that my experience did not actually destroy any professional relationships. I've also worked with people who felt that their slightest whim was adequate reason to interfere with my work. That's what Beck institutionalizes by saying that any request made of me by anyone on the team must be granted. It puts me completely at the mercy of anyone walking by. The requisite bullpen physical environment doesn't work for me either. I find that the visual and auditory distraction make intense concentration impossible.

I find the revival-tent spirit of the eXtremists very off-putting. If something works, it works for reasons, not as a matter of faith. I find much too much eXhortation to believe, to go ahead and leap in, so that I will eXperience the wonderfulness for myself. Isn't that what the evangelist on the subway platform keeps saying? Beck does acknowledge unbelievers like me, but requires their exile in order to maintain the group-think of the X-cult.

Beck's last chapters note a number of exceptions and special cases where eXtremism may not work - actually, most of the projects I've ever encountered.

There certainly is good in the eXtreme practice. I look to future authors to tease that good out from the positively destructive threads that I see interwoven.

A customer on May 2, 2004
A work of fiction

The book presents extreme programming. It is divided into three parts:
(1) The problem
(2) The solution
(3) Implementing XP.

The problem, as presented by the author, is that requirements change but current methodologies are not agile enough to cope with this. This results in customer being unhappy. The solution is to embrace change and to allow the requirements to be changed. This is done by choosing the simplest solution, releasing frequently, refactoring with the security of unit tests.

The basic assumption which underscores the approach is that the cost of change is not exponential but reaches a flat asymptote. If this is not the case, allowing change late in the project would be disastrous. The author does not provide data to back his point of view. On the other hand there is a lot of data against a constant cost of change (see for example discussion of cost in Code Complete). The lack of reasonable argumentation is an irremediable flaw in the book. Without some supportive data it is impossible to believe the basic assumption, nor the rest of the book. This is all the more important since the only project that the author refers to was cancelled before full completion.

Many other parts of the book are unconvincing. The author presents several XP practices. Some of them are very useful. For example unit tests are a good practice. They are however better treated elsewhere (e.g., Code Complete chapter on unit test). On the other hand some practices seem overkill. Pair programming is one of them. I have tried it and found it useful to generate ideas while prototyping. For writing production code, I find that a quiet environment is by far the best (see Peopleware for supportive data). Again the author does not provide any data to support his point.

This book suggests an approach aiming at changing software engineering practices. However, the lack of supportive data makes it a work of fiction.
I would suggest reading Code Complete for code-level advice or Rapid Development for management-level advice.

A customer on November 14, 2002
Not Software Engineering.

Any engineering discipline is based on solid reasoning and logic, not on blind faith. Unfortunately, most of this book attempts to convince you that Extreme Programming is better based on the author's experiences. A lot of the principles are counterintuitive, and the author exhorts you to just try them out and get enlightened. I'm sorry, but these kinds of things belong in infomercials, not in s/w engineering.

The part about "code is the documentation" is the scariest part. It's true that keeping the documentation up to date is tough on any software project, but to do away with documentation is the most ridiculous thing I have heard.

It's like telling people to cut off their noses to avoid colds. Yes, we are always in search of a better software process. Let me tell you that this book won't lead you there.

Philip K. Ronzone on November 24, 2000
The "gossip magazine diet plans" style of programming.

This book reminds me of the "gossip magazine diet plans", you know, the vinegar and honey diet, or the fat-burner 2000 pill diet etc. Occasionally, people actually lose weight on those diets, but, only because they've managed to eat less or exercise more. The diet plans themselves are worthless. XP is the same - it may sometimes help people program better, but only because they are (unintentionally) doing something different. People look at things like XP because, like dieters, they see a need for change. Overall, the book is a decently written "fad diet", with ideas that are just as worthless.

A customer on August 11, 2003
Hackers! Salvation is nigh!!

It's interesting to see the phenomenon of Extreme Programming happening in the dawn of the 21st century. I suppose historians can explain such a reaction as a truly conservative movement. Of course, serious software engineering practice is hard. Heck, documentation is a pain in the neck. And what programmer wouldn't love to have divine inspiration just before starting to write the latest web application and so enlightened by the Almighty, write the whole thing in one go, as if by magic? No design, no documentation, you and me as a pair, and the customer too. Sounds like a hacker's dream with "Imagine" as the soundtrack (sorry, John).
The Software Engineering struggle is over 50 years old and it's only logical to expect some resistance, from time to time. In the XP case, the resistance comes in one of its worst forms: evangelism. A fundamentalist cult, with very little substance, no proof of any kind, but then again if you don't have faith you won't be granted the gift of the mystic revelation. It's Gnosticism for Geeks.
Take it with a pinch of salt... well, maybe a sack of salt. If you can see through the B.S. that sells millions of dollars in books, consultancy fees, lectures, etc., you will recognise some common-sense ideas that are better explained, explored and detailed elsewhere.

Ian K. VINE VOICE on February 27, 2015
Long have I hated this book

Kent is an excellent writer. He does an excellent job of presenting an approach to software development that is misguided for anything but user interface code. The argument that user interface code must be gotten into the hands of users to get feedback is used to suggest that complex system code should not be "designed up front". This is simply wrong. For example, if you are going to deploy an application in the Amazon Cloud that you want to scale, you better have some idea of how this is going to happen. Simply waiting until your application falls over and fails is not an acceptable approach.

One of the things I despise the most about the software development culture is the mindless adoption of fads. Extreme programming has been adopted by some organizations like a religious dogma.

Engineering large software systems is one of the most difficult things that humans do. There are no silver bullets and there are no dogmatic solutions that will make the difficult simple.

Anil Philip on March 24, 2005
not found - the silver bullet

Maybe I'm too cynical because I never got to work for the successful, whiz-kid companies; maybe this book wasn't written for me!

This book reminds me of Jacobsen's "Use Cases" book of the 1990s. 'Use Cases' was all the rage, but after several years we slowly learned the truth: Use Cases do not deal with the architecture - a necessary and good foundation for any piece of software.

Similarly, this book seems to be spotlighting Testing and taking it to extremes.

'the test plan is the design doc'

Not true. The design doc encapsulates wisdom and insight; a picture that accurately describes the interactions of the lower-level software components is worth a thousand lines of code-reading.

Also present is an evangelistic fervor that reminds me of the rah-rah eighties' bestseller, "In Search Of Excellence" by Peters and Waterman. (Many people have since noted that most of the spotlighted companies of that book are bankrupt twenty five years later).

Lastly, I noted that the term 'XP' was used throughout the book, and the back cover has a blurb from an M$ architect. Was it simply coincidence that Windows shares the same name for its XP release? I wondered if M$ had sponsored part of the book as good advertising for Windows XP! :)

[Oct 08, 2017] Disbelieving the 'many eyes' myth (Opensource.com)

Notable quotes:
"... This article originally appeared on Alice, Eve, and Bob – a security blog and is republished with permission. ..."
Oct 08, 2017 | opensource.com

Review by many eyes does not always prevent buggy code. There is a view that because open source software is subject to review by many eyes, all the bugs will be ironed out of it. This is a myth. 06 Oct 2017 | Mike Bursell (Red Hat)

Writing code is hard. Writing secure code is harder -- much harder. And before you get there, you need to think about design and architecture. When you're writing code to implement security functionality, it's often based on architectures and designs that have been pored over and examined in detail. They may even reflect standards that have gone through worldwide review processes and are generally considered perfect and unbreakable. *

However good those designs and architectures are, though, there's something about putting things into actual software that's, well, special. With the exception of software proven to be mathematically correct, ** being able to write software that accurately implements the functionality you're trying to realize is somewhere between a science and an art. This is no surprise to anyone who's actually written any software, tried to debug software, or divine software's correctness by stepping through it; however, it's not the key point of this article.

Nobody *** actually believes that the software that comes out of this process is going to be perfect, but everybody agrees that software should be made as close to perfect and bug-free as possible. This is why code review is a core principle of software development. And luckily -- in my view, at least -- much of the code that we use in our day-to-day lives is open source, which means that anybody can look at it, and it's available for tens or hundreds of thousands of eyes to review.

And herein lies the problem: There is a view that because open source software is subject to review by many eyes, all the bugs will be ironed out of it. This is a myth. A dangerous myth. The problems with this view are at least twofold. The first is the "if you build it, they will come" fallacy. I remember when there was a list of all the websites in the world, and if you added your website to that list, people would visit it. **** In the same way, the number of open source projects was (maybe) once so small that there was a good chance that people might look at and review your code. Those days are past -- long past. Second, for many areas of security functionality -- crypto primitives implementation is a good example -- the number of suitably qualified eyes is low.

Don't think that I am in any way suggesting that the problem is any less in proprietary code: quite the opposite. Not only are the designs and architectures in proprietary software often hidden from review, but you have fewer eyes available to look at the code, and the dangers of hierarchical pressure and groupthink are dramatically increased. "Proprietary code is more secure" is less myth, more fake news. I completely understand why companies like to keep their security software secret, and I'm afraid that the "it's to protect our intellectual property" line is too often a platitude they tell themselves when really, it's just unsafe to release it. So for me, it's open source all the way when we're looking at security software.

So, what can we do? Well, companies and other organizations that care about security functionality can -- and, I believe, have a responsibility to -- expend resources on checking and reviewing the code that implements that functionality. Alongside that, the open source community can -- and is -- finding ways to support critical projects and improve the amount of review that goes into that code. ***** And we should encourage academic organizations to train students in the black art of security software writing and review, not to mention highlighting the importance of open source software.

We can do better -- and we are doing better. Because what we need to realize is that the reason the "many eyes hypothesis" is a myth is not that many eyes won't improve code -- they will -- but that we don't have enough expert eyes looking. Yet.


* Yeah, really: "perfect and unbreakable." Let's just pretend that's true for the purposes of this discussion.

** and that still relies on the design and architecture to actually do what you want -- or think you want -- of course, so good luck.

*** Nobody who's actually written more than about five lines of code (or more than six characters of Perl).

**** I added one. They came. It was like some sort of magic.

***** See, for instance, the Linux Foundation's Core Infrastructure Initiative.

This article originally appeared on Alice, Eve, and Bob – a security blog – and is republished with permission.

[Oct 03, 2017] Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about)

Notable quotes:
"... We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money. ..."
Oct 03, 2017 | discussion.theguardian.com

mlzarathustra , 21 Sep 2017 16:52

I agree with the basic point. We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money.

The bigger problem is that nobody cares about the arts, and as expensive as education is, nobody wants to carry around debt for a skill that won't bring in the bucks. And smartphone-obsessed millennials have too short an attention span to fathom how empty their lives are, devoid of aesthetic depth as they are.

I can't draw a definite link, but I think algorithm failures, which are based on fanatical reliance on programmed routines as the solution to everything, are rooted in the shortage of education and cultivation in the arts.

Economics is a social science, and all this is merely a reflection of shared cultural values. The problem is, people think it's math (it's not) and therefore set in stone.

[Oct 03, 2017] Silicon Valley companies have placed lowering wages and flooding the labor market with cheaper labor near the top of their goals and as a business model.

Notable quotes:
"... That's Silicon Valley's dirty secret. Most tech workers in Palo Alto make about as much as the high school teachers who teach their kids. And these are the top coders in the country! ..."
"... I don't see why more Americans would want to be coders. These companies want to drive down wages for workers here and then also ship jobs offshore... ..."
"... Silicon Valley companies have placed lowering wages and flooding the labor market with cheaper labor near the top of their goals and as a business model. ..."
"... There are quite a few highly qualified American software engineers who lose their jobs to foreign engineers who will work for much lower salaries and benefits. This is a major ingredient of the libertarian virus that has engulfed and contaminating the Valley, going hand to hand with assembling products in China by slave labor ..."
"... If you want a high tech executive to suffer a stroke, mention the words "labor unions". ..."
"... India isn't being hired for the quality, they're being hired for cheap labor. ..."
"... Enough people have had their hands burnt by now with shit companies like TCS (Tata) that they are starting to look closer to home again... ..."
"... Globalisation is the reason, and trying to force wages up in one country simply moves the jobs elsewhere. The only way I can think of to limit this happening is to keep the company and coders working at the cutting edge of technology. ..."
"... I'd be much more impressed if I saw that the hordes of young male engineers here in SF expressing a semblance of basic common sense, basic self awareness and basic life skills. I'd say 91.3% are oblivious, idiotic children. ..."
"... Not maybe. Too late. American corporations objective is to low ball wages here in US. In India they spoon feed these pupils with affordable cutting edge IT training for next to nothing ruppees. These pupils then exaggerate their CVs and ship them out en mass to the western world to dominate the IT industry. I've seen it with my own eyes in action. Those in charge will anything/everything to maintain their grip on power. No brag. Just fact. ..."
Oct 02, 2017 | profile.theguardian.com
Terryl Dorian , 21 Sep 2017 13:26
That's Silicon Valley's dirty secret. Most tech workers in Palo Alto make about as much as the high school teachers who teach their kids. And these are the top coders in the country!
Ray D Wright -> RogTheDodge , 21 Sep 2017 14:52
I don't see why more Americans would want to be coders. These companies want to drive down wages for workers here and then also ship jobs offshore...
Richard Livingstone -> KatieL , 21 Sep 2017 14:50
+++1 to all of that.

Automated coding just pushes the level of coding further up the development food chain, rather than getting rid of it. It is the wrong approach for current tech. AI that is smart enough to model new problems and create its own descriptive and runnable languages will come sometime -- hopefully after my lifetime.

Arne Babenhauserheide -> Evelita , 21 Sep 2017 14:48
What coding does not teach is how to improve our non-code infrastructure and how to keep it running (that's the stuff which actually moves things). Code can optimize stuff, but it needs actual actuators to affect reality.

Sometimes these actuators are actual people walking on top of a roof while fixing it.

WyntonK , 21 Sep 2017 14:47
Silicon Valley companies have placed lowering wages and flooding the labor market with cheaper labor near the top of their goals and as a business model.

There are quite a few highly qualified American software engineers who lose their jobs to foreign engineers who will work for much lower salaries and benefits. This is a major ingredient of the libertarian virus that has engulfed and is contaminating the Valley, going hand in hand with assembling products in China by slave labor.

If you want a high tech executive to suffer a stroke, mention the words "labor unions".

TheEgg -> UncommonTruthiness , 21 Sep 2017 14:43

The ship has sailed on this activity as a career.

Nope. Married to a highly-technical skillset, you can still make big bucks. I say this as someone involved in this kind of thing academically and our Masters grads have to beat the banks and fintech companies away with dog shits on sticks. You're right that you can teach anyone to potter around and throw up a webpage but at the prohibitively difficult maths-y end of the scale, someone suitably qualified will never want for a job.

Mike_Dexter -> Evelita , 21 Sep 2017 14:43
In a similar vein, if you accept the argument that it does drive down wages, wouldn't the culprit actually be the multitudes of online and offline courses and tutorials available to an existing workforce?
Terryl Dorian -> CountDooku , 21 Sep 2017 14:42
Funny you should pick medicine, law, engineering... 3 fields that are *not* taught in high school. The writer is simply adding "coding" to your list. So it seems you agree with his "garbage" argument after all.
anticapitalist -> RogTheDodge , 21 Sep 2017 14:42
Key word is "good". Teaching everyone is just going to increase the pool of programmers whose code I need to fix. India isn't being hired for the quality, they're being hired for cheap labor. As for women, sure, I wouldn't mind more women around, but why does no one say there needs to be more equality in garbage collection or plumbing? (And yes, plumbing is a highly paid profession.)

In the end I don't care what the person is, I just want to hire and work with the best, not someone whose work I have to correct because they were hired by quota. If women only graduate at 15%, why should IT contain more than that? And let's be a bit honest with the facts: of those 15%, how many spent their high school years staying up all night hacking? Very few. Now, the few that did are some of the better developers I work with, but that pool isn't going to increase by forcing every child to program... just like sports aren't made better by making everyone take gym class.

WithoutPurpose , 21 Sep 2017 14:42
I ran a development team for 10 years and I never had any trouble hiring programmers - we just had to pay them enough. Every job would have at least 10 good applicants.

Two years ago I decided to scale back a bit and go into programming (I can code real-time low latency financial apps in 4 languages) and I had four interviews in six months with stupidly low salaries. I'm lucky in that I can bounce between tech and the business side so I got a decent job out of tech.

My entirely anecdotal conclusion is that there is no shortage of good programmers just a shortage of companies willing to pay them.

oddbubble -> Tori Turner , 21 Sep 2017 14:41
I've worn many hats so far: I started out as a sysadmin, then I moved on to web development, then back end, and now I'm doing test automation because I am on almost the same money for half the effort.
peter nelson -> raffine , 21 Sep 2017 14:38
But the concepts won't. Good programming requires the ability to break down a task, organise the steps in performing it, identify parts of the process that are common or repetitive so they can be bundled together, handed-off or delegated, etc.

These concepts can be applied to any programming language, and indeed to many non-software activities.
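
peter nelson's decomposition point is easy to make concrete. A minimal C sketch (function and variable names are mine, purely illustrative): the common, repetitive step is identified once, bundled into a function, and every caller then delegates to it.

    #include <stdio.h>

    /* The repetitive step, identified and bundled into one place. */
    static double average(const double *values, int count)
    {
        double sum = 0.0;
        for (int i = 0; i < count; i++)
            sum += values[i];
        return count > 0 ? sum / count : 0.0;
    }

    int main(void)
    {
        double quiz_scores[] = { 7.5, 9.0, 8.0 };
        double exam_scores[] = { 55.0, 72.0 };

        /* Each caller becomes a one-line delegation instead of its
           own copy of the summing loop. */
        printf("quiz average: %.2f\n", average(quiz_scores, 3));
        printf("exam average: %.2f\n", average(exam_scores, 2));
        return 0;
    }

The same instinct -- spot the repetition, name it, delegate to it -- carries over to shell scripts, spreadsheets and plenty of non-software work, which is exactly his point.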

Oliver Jones -> Trumbledon , 21 Sep 2017 14:37
In the City, maybe, with a financial background -- but that's the exception.
anticapitalist -> Ethan Hawkins , 21 Sep 2017 14:32
Well, to his point, sort of... either everything will go PHP or all those entry-level PHP developers will be on the street. A good Java or C developer is hard to come by. And to the others: being a developer, especially a good one, is nothing like reading and writing. The industry is already saturated with poor coders just doing it for a paycheck.
peter nelson -> Tori Turner , 21 Sep 2017 14:31
I'm just going to say this once: not everyone with a computer science degree is a coder.

And vice versa. I'm retiring from a 40-year career as a software engineer. Some of the best software engineers I ever met did not have CS degrees.

KatieL -> Mishal Almohaimeed , 21 Sep 2017 14:30
"already developing automated coding scripts. "

Pretty much the entire history of the software industry since FORAST was developed for the ORDVAC has been about desperately trying to make software development in some way possible without driving everyone bonkers.

The gulf between FORAST and today's IDE-written, type-inferring high level languages, compilers, abstracted run-time environments, hypervisors, multi-computer architectures and general tech-world flavour-of-2017-ness is truly immense[1].

And yet software is still fucking hard to write. There's no sign it's getting easier despite all that work.

Automated coding was promised as the solution in the 1980s as well. In fact, somewhere in my archives, I've got paper journals which include adverts for automated systems that would make programmers completely redundant by writing all your database code for you. These days, we'd think of those tools as automated ORM generators, and they don't fix the problem; they just make a new one -- ORM impedance mismatch -- which needs more engineering on top to fix (see the sketch below)...

The tools don't change the need for the humans, they just change what's possible for the humans to do.

[1] FORAST executed in about 20,000 bytes of memory without even an OS. The compile artifacts for the map-reduce system I built today are an astonishing hundred million bytes... and don't include the necessary mapreduce environment, management interface, node operating system and distributed filesystem...
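
To make the ORM point above concrete, here is a minimal, purely hypothetical C sketch (struct and field names are mine, not from any real generator). The generated code mirrors flat table rows; the application wants a linked object graph; and the mapping between the two -- the join, the allocation, the ownership -- is precisely the engineering the tool promised to eliminate.

    #include <stddef.h>

    /* What a code generator happily emits: flat structs mirroring tables. */
    struct order_row { int id; int customer_id; };
    struct item_row  { int id; int order_id; char sku[16]; };

    /* What the application actually wants: an object graph. */
    struct order {
        int id;
        struct item_row *items;  /* who joins, allocates and frees these? */
        size_t n_items;          /* the mapping layer is still hand-written */
    };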

raffine , 21 Sep 2017 14:29
Whatever they are taught today will be obsolete tomorrow.
yannick95 -> savingUK , 21 Sep 2017 14:27
"There are already top quality coders in China and India"

AHAHAHAHAHAHAHAHAHAHAHA *rolls on the floor laughing* Yes........ 1%... and 99% incredibly bad, incompetent, untalented ones that cost 50% of a good developer but produce only 5% in comparison. And I'm speaking from a LOT of practical experience across more than a dozen corporations all over the world which have been outsourcing to India... all have been disasters for the companies (but good for the execs who pocketed big bonuses and left the company before the disaster blew up in their faces).

Wiretrip -> mcharts , 21 Sep 2017 14:25
Enough people have had their hands burnt by now with shit companies like TCS (Tata) that they are starting to look closer to home again...
TomRoche , 21 Sep 2017 14:11

Tech executives have pursued [the goal of suppressing workers' compensation] in a variety of ways. One is collusion – companies conspiring to prevent their employees from earning more by switching jobs. The prevalence of this practice in Silicon Valley triggered a justice department antitrust complaint in 2010, along with a class action suit that culminated in a $415m settlement.

Folks interested in the story of the Techtopus (less drily presented than in the links in this article) should check out Mark Ames' reporting, especially this overview article and this focus on the egregious Steve Jobs (whose canonization by the US corporate-funded media is just one more impeachment of their moral bankruptcy).

Another, more sophisticated method is importing large numbers of skilled guest workers from other countries through the H1-B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status.

Folks interested in H-1B and US technical visas more generally should head to Norm Matloff's summary page, and then to his blog on the subject.

Olympus68 , 21 Sep 2017 13:49

I have watched as schools run by trade unions have done the opposite for five decades. By limiting the number of graduates, they were able to help maintain living wages and benefits. This has been stopped in my area due to the pressure of owner-run "trade associations".

During that same time period I have witnessed trade associations controlled by company owners, while publicising their support of the average employee, invest enormous amounts of membership fees in creating alliances with public institutions. Their goal has been that of flooding the labor market and thus keeping wages low. A double hit for the average worker because membership fees were paid by employees as well as those in control.

And so it goes....

savingUK , 21 Sep 2017 13:38
Coding jobs are just as susceptible to being moved to lower-cost areas of the world as hardware jobs already have been. It's already happening. There are already top-quality coders in China and India. There is a much larger pool to choose from and they are just as good as their western counterparts and work harder for much less money.

Globalisation is the reason, and trying to force wages up in one country simply moves the jobs elsewhere. The only way I can think of to limit this happening is to keep the company and coders working at the cutting edge of technology.

whitehawk66 , 21 Sep 2017 15:18

I'd be much more impressed if I saw the hordes of young male engineers here in SF expressing a semblance of basic common sense, basic self-awareness and basic life skills. I'd say 91.3% are oblivious, idiotic children.

They would definitely not survive the zombie apocalypse.

P.S. not every kid wants or needs to have their soul sucked out of them sitting in front of a screen full of code for some idiotic service that some other douchbro thinks is the next iteration of sliced bread.

UncommonTruthiness , 21 Sep 2017 14:10
The demonization of Silicon Valley is clearly the next place to put all blame. Look what "they" did to us: computers, smart phones, HD television, world-wide internet, on and on. Get a rope!

I moved there in 1978 and watched the orchards and trailer parks on North 1st St. of San Jose transform into a concrete jungle. There used to be quite a bit of semiconductor equipment and device manufacturing in SV during the 80s and 90s. Now quite a few buildings have the same name: AVAILABLE. Most equipment and device manufacturing has moved to Asia.

Programming started with binary, then machine code (hexadecimal or octal) and moved to assembler as a compiled and linked structure. More compiled languages like FORTRAN, BASIC, PL/I, COBOL, PASCAL and C (and all its "+'s") followed, making programming easier for the less talented. Now the script-based languages (HTML, Java, etc.) are even higher level and accessible to nearly all. Programming has become a commodity and will be priced like milk, wheat, corn, non-unionized workers and the like. The ship has sailed on this activity as a career.

William Fitch III , 21 Sep 2017 13:52
Hi: As I have said many times before, there is no shortage of people who fully understand the problem and can see all the connections.

However, they all fall on their faces when it comes to the solution. To cut to the chase, Concentrated Wealth needs to go, permanently. Of course the challenge is how to best accomplish this.....

.....Bill

MostlyHarmlessD , 21 Sep 2017 13:16

Damn engineers and their black and white world view, if they weren't so inept they would've unionized instead of being trampled again and again in the name of capitalism.
mcharts -> Aldous0rwell , 21 Sep 2017 13:07
Not maybe. Too late. American corporations' objective is to lowball wages here in the US. In India they spoon-feed these pupils with affordable cutting-edge IT training for next-to-nothing rupees. These pupils then exaggerate their CVs and ship them out en masse to the western world to dominate the IT industry. I've seen it with my own eyes in action. Those in charge will do anything/everything to maintain their grip on power. No brag. Just fact.

Woe to our children and grandchildren.

Where's Bernie Sanders when we need him.

[Oct 03, 2017] The dream of coding automation remains elusive... Very elusive...

Oct 03, 2017 | discussion.theguardian.com

Richard Livingstone -> Mishal Almohaimeed , 21 Sep 2017 14:46

Wrong again: that approach has been tried since the 80s and will keep failing, simply because software development is still more akin to a technical craft than an engineering discipline. The number of elements required to assemble a working non-trivial system is way beyond scriptable.
freeandfair -> Taylor Dotson , 21 Sep 2017 14:26
> That's some crystal ball you have there. English teachers will need to know how to code? Same with plumbers? Same with janitors, CEOs, and anyone working in the service industry?

You don't believe there will be robots to do plumbing and cleaning? The cleaner's job will be to program robots to do what they need.
CEOs? Absolutely.

English teachers? Both of my kids have school laptops and everything is being done on the computers. The teachers use software and create websites and what not. Yes, even English teachers.

Not knowing / understanding how to code will be the same as not knowing how to use Word/ Excel. I am assuming there are people who don't, but I don't know any above the age of 6.

Wiretrip -> Mishal Almohaimeed , 21 Sep 2017 14:20
We've had 'automated coding scripts' for years for small tasks. However, anyone who says they're going to obviate programmers, analysts and designers doesn't understand the software development process.
Ethan Hawkins -> David McCaul , 21 Sep 2017 13:22
Even if expert systems (an 80's concept, BTW) could code, we'd still have a huge need for managers. The hard part of software isn't even the coding. It's determining the requirements and working with clients. It will require general intelligence to do 90% of what we do right now. The 10% we could automate right now, mostly gets in the way. I agree it will change, but it's going to take another 20-30 years to really happen.
Mishal Almohaimeed -> PolydentateBrigand , 21 Sep 2017 13:17
Wrong -- software companies are already developing automated coding scripts. You'll get a bunch of door-to-door knife salespeople once the dust settles, that's what you'll get.
freeandfair -> rgilyead , 21 Sep 2017 14:22
> In 20 years time AI will be doing the coding

Possible, but you still have to understand how AI operates and what it can and cannot do.

[Oct 03, 2017] Coding and carpentry are not so distant, are they?

Thw user "imipak" views are pretty common misconceptions. They are all wrong.
Notable quotes:
"... I was about to take offence on behalf of programmers, but then I realized that would be snobbish and insulting to carpenters too. Many people can code, but only a few can code well, and fewer still become the masters of the profession. Many people can learn carpentry, but few become joiners, and fewer still become cabinetmakers. ..."
"... Many people can write, but few become journalists, and fewer still become real authors. ..."
Oct 03, 2017 | discussion.theguardian.com

imipak, 21 Sep 2017 15:13

Coding has little or nothing to do with Silicon Valley. They may or may not have ulterior motives, but ultimately they are nothing in the scheme of things.

I disagree with teaching coding as a discrete subject. I think it should be combined with home economics and woodworking because 90% of these subjects consist of transferable skills that exist in all of them. Only a tiny residual is actually topic-specific.

In the case of coding, the residual consists of drawing skills and typing skills. Programming language skills? Irrelevant. You should choose the tools to fit the problem. Neither of these needs a computer. You should only ever approach the computer at the very end, after you've designed and written the program.

Is cooking so very different? Do you decide on the ingredients before or after you start? Do you go shopping half-way through cooking an omelette?

With woodwork, do you measure first or cut first? Do you have a plan or do you randomly assemble bits until it does something useful?

Real coding, taught correctly, is barely taught at all. You teach the transferable skills. ONCE. You then apply those skills in each area in which they apply.

What other transferable skills apply? Top-down design, bottom-up implementation. The correct methodology in all forms of engineering. Proper testing strategies, also common across all forms of engineering. However, since these tests are against logic, they're a test of reasoning. A good thing to have in the sciences and philosophy.

Technical writing is the art of explaining things to idiots. Whether you're designing a board game, explaining what you like about a house, writing a travelogue or just seeing if your wild ideas hold water, you need to be able to put those ideas down on paper in a way that exposes all the inconsistencies and errors. It doesn't take much to clean it up to be readable by humans. But once it is cleaned up, it'll remain free of errors.

So I would teach a foundation course that teaches top-down reasoning, bottom-up design, flowcharts, critical path analysis and symbolic logic. Probably aimed at age 7. But I'd not do so wholly in the abstract. I'd have it thoroughly mixed in with one field, probably cooking as most kids do that and it lacks stigma at that age.

I'd then build courses on various crafts and engineering subjects on top of that, building further hierarchies where possible. Eliminate duplication and severely reduce the fictions we call disciplines.

oldzealand, 21 Sep 2017 14:58
I used to employ 200 computer scientists in my business and now teach children so I'm apparently as guilty as hell. To be compared with a carpenter is, however, a true compliment, if you mean those that create elegant, aesthetically-pleasing, functional, adaptable and long-lasting bespoke furniture, because our crafts of problem-solving using limited resources in confined environments to create working, life-improving artifacts both exemplify great human ingenuity in action. Capitalism or no.
peter nelson, 21 Sep 2017 14:29
"But coding is not magic. It is a technical skill, akin to carpentry."

But some people do it much better than others. Just like journalism. This article is complete nonsense, as I discuss in another comment. The author might want to consider a career in carpentry.

Fanastril, 21 Sep 2017 14:13
"But coding is not magic. It is a technical skill, akin to carpentry."

It is a way of thinking. Perhaps carpentry is too, but the arrogance of the above statement shows a soul who is done thinking.

NDReader, 21 Sep 2017 14:12
"But coding is not magic. It is a technical skill, akin to carpentry."

I was about to take offence on behalf of programmers, but then I realized that would be snobbish and insulting to carpenters too. Many people can code, but only a few can code well, and fewer still become the masters of the profession. Many people can learn carpentry, but few become joiners, and fewer still become cabinetmakers.

Many people can write, but few become journalists, and fewer still become real authors.

MostlyHarmlessD, 21 Sep 2017 13:08
A carpenter!? Good to know that engineers are still thought of as jumped-up tradesmen.

[Oct 02, 2017] Tech's push to teach coding isn't about kids' success – it's about cutting wages, by Ben Tarnoff

Highly recommended!
IT is probably one of the most "neoliberalized" industries (even in comparison with finance). So atomization of labor and the "plantation economy" are the norm in IT. This occurs at a rather high level of wages, but with the influx of foreign programmers and IT specialists (in the past) and mass outsourcing (now), this is changing. Competition for good job positions is fierce. Dog-eat-dog competition, the dream of neoliberals. Entry-level jobs are already paying $15 an hour, if not less.
Programming is a relatively rare talent, much like the ability to play the violin. Even the amateur level is challenging. At a high level (developing large complex programs in a team while still preserving your individuality and productivity) it is extremely rare. Most "commercial" programmers are able to produce only mediocre code (which might be adequate). Only a few programmers can excel in complex software projects, sometimes even performing solo. There is also a pathological breed of "programmer junkie" (graphomania happens in programming too) who is sometimes able to single-handedly destroy large projects. That often happens with open source projects after the main developer loses interest and abandons the project.
It's good to allow children the chance to try their hand at coding when they otherwise may not have had that opportunity, but in no way does that mean that all of them can become professional programmers. No way. Again, the top level of programming requires a unique talent, much like that of a top musical performer.
Also, to get a decent entry position you either need to be extremely talented or to graduate from an Ivy League university. When applicants are abundant, resumes from less prestigious universities are not even considered; it is just easier for HR to filter applications this way.
Also, under neoliberalism cheap labor via H1B visas floods the market and depresses wages. Many Silicon Valley companies were, so to say, "Russian-speaking" in the late 90s after the collapse of the USSR. Now offshoring is the dominant way to offload development to cheaper labor.
Notable quotes:
"... As software mediates more of our lives, and the power of Silicon Valley grows, it's tempting to imagine that demand for developers is soaring. The media contributes to this impression by spotlighting the genuinely inspiring stories of those who have ascended the class ladder through code. You may have heard of Bit Source, a company in eastern Kentucky that retrains coalminers as coders. They've been featured by Wired , Forbes , FastCompany , The Guardian , NPR and NBC News , among others. ..."
"... A former coalminer who becomes a successful developer deserves our respect and admiration. But the data suggests that relatively few will be able to follow their example. Our educational system has long been producing more programmers than the labor market can absorb. ..."
"... More tellingly, wage levels in the tech industry have remained flat since the late 1990s. Adjusting for inflation, the average programmer earns about as much today as in 1998. If demand were soaring, you'd expect wages to rise sharply in response. Instead, salaries have stagnated. ..."
"... Tech executives have pursued this goal in a variety of ways. One is collusion – companies conspiring to prevent their employees from earning more by switching jobs. The prevalence of this practice in Silicon Valley triggered a justice department antitrust complaint in 2010, along with a class action suit that culminated in a $415m settlement . Another, more sophisticated method is importing large numbers of skilled guest workers from other countries through the H1-B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status. ..."
"... Guest workers and wage-fixing are useful tools for restraining labor costs. But nothing would make programming cheaper than making millions more programmers. ..."
"... Silicon Valley has been unusually successful in persuading our political class and much of the general public that its interests coincide with the interests of humanity as a whole. But tech is an industry like any other. It prioritizes its bottom line, and invests heavily in making public policy serve it. The five largest tech firms now spend twice as much as Wall Street on lobbying Washington – nearly $50m in 2016. The biggest spender, Google, also goes to considerable lengths to cultivate policy wonks favorable to its interests – and to discipline the ones who aren't. ..."
"... Silicon Valley is not a uniquely benevolent force, nor a uniquely malevolent one. Rather, it's something more ordinary: a collection of capitalist firms committed to the pursuit of profit. And as every capitalist knows, markets are figments of politics. They are not naturally occurring phenomena, but elaborately crafted contraptions, sustained and structured by the state – which is why shaping public policy is so important. If tech works tirelessly to tilt markets in its favor, it's hardly alone. What distinguishes it is the amount of money it has at its disposal to do so. ..."
"... The problem isn't training. The problem is there aren't enough good jobs to be trained for ..."
"... Everyone should have the opportunity to learn how to code. Coding can be a rewarding, even pleasurable, experience, and it's useful for performing all sorts of tasks. More broadly, an understanding of how code works is critical for basic digital literacy – something that is swiftly becoming a requirement for informed citizenship in an increasingly technologized world. ..."
"... But coding is not magic. It is a technical skill, akin to carpentry. Learning to build software does not make you any more immune to the forces of American capitalism than learning to build a house. Whether a coder or a carpenter, capital will do what it can to lower your wages, and enlist public institutions towards that end. ..."
"... Exposing large portions of the school population to coding is not going to magically turn them into coders. It may increase their basic understanding but that is a long way from being a software engineer. ..."
"... All schools teach drama and most kids don't end up becoming actors. You need to give all kids access to coding in order for some can go on to make a career out of it. ..."
"... it's ridiculous because even out of a pool of computer science B.Sc. or M.Sc. grads - companies are only interested in the top 10%. Even the most mundane company with crappy IT jobs swears that they only hire "the best and the brightest." ..."
"... It's basically a con-job by the big Silicon Valley companies offshoring as many US jobs as they can, or "inshoring" via exploitation of the H1B visa ..."
"... Masters is the new Bachelors. ..."
"... I taught CS. Out of around 100 graduates I'd say maybe 5 were reasonable software engineers. The rest would be fine in tech support or other associated trades, but not writing software. Its not just a set of trainable skills, its a set of attitudes and ways of perceiving and understanding that just aren't that common. ..."
"... Yup, rings true. I've been in hi tech for over 40 years and seen the changes. I was in Silicon Valley for 10 years on a startup. India is taking over, my current US company now has a majority Indian executive and is moving work to India. US politicians push coding to drive down wages to Indian levels. ..."
Oct 02, 2017 | www.theguardian.com

This month, millions of children returned to school. This year, an unprecedented number of them will learn to code.

Computer science courses for children have proliferated rapidly in the past few years. A 2016 Gallup report found that 40% of American schools now offer coding classes – up from only 25% a few years ago. New York, with the largest public school system in the country, has pledged to offer computer science to all 1.1 million students by 2025. Los Angeles, with the second largest, plans to do the same by 2020. And Chicago, the fourth largest, has gone further, promising to make computer science a high school graduation requirement by 2018.

The rationale for this rapid curricular renovation is economic. Teaching kids how to code will help them land good jobs, the argument goes. In an era of flat and falling incomes, programming provides a new path to the middle class – a skill so widely demanded that anyone who acquires it can command a livable, even lucrative, wage.

This narrative pervades policymaking at every level, from school boards to the government. Yet it rests on a fundamentally flawed premise. Contrary to public perception, the economy doesn't actually need that many more programmers. As a result, teaching millions of kids to code won't make them all middle-class. Rather, it will proletarianize the profession by flooding the market and forcing wages down – and that's precisely the point.

At its root, the campaign for code education isn't about giving the next generation a shot at earning the salary of a Facebook engineer. It's about ensuring those salaries no longer exist, by creating a source of cheap labor for the tech industry.

As software mediates more of our lives, and the power of Silicon Valley grows, it's tempting to imagine that demand for developers is soaring. The media contributes to this impression by spotlighting the genuinely inspiring stories of those who have ascended the class ladder through code. You may have heard of Bit Source, a company in eastern Kentucky that retrains coalminers as coders. They've been featured by Wired , Forbes , FastCompany , The Guardian , NPR and NBC News , among others.

A former coalminer who becomes a successful developer deserves our respect and admiration. But the data suggests that relatively few will be able to follow their example. Our educational system has long been producing more programmers than the labor market can absorb. A study by the Economic Policy Institute found that the supply of American college graduates with computer science degrees is 50% greater than the number hired into the tech industry each year. For all the talk of a tech worker shortage, many qualified graduates simply can't find jobs.

More tellingly, wage levels in the tech industry have remained flat since the late 1990s. Adjusting for inflation, the average programmer earns about as much today as in 1998. If demand were soaring, you'd expect wages to rise sharply in response. Instead, salaries have stagnated.

Still, those salaries are stagnating at a fairly high level. The Department of Labor estimates that the median annual wage for computer and information technology occupations is $82,860 – more than twice the national average. And from the perspective of the people who own the tech industry, this presents a problem. High wages threaten profits. To maximize profitability, one must always be finding ways to pay workers less.

Tech executives have pursued this goal in a variety of ways. One is collusion – companies conspiring to prevent their employees from earning more by switching jobs. The prevalence of this practice in Silicon Valley triggered a justice department antitrust complaint in 2010, along with a class action suit that culminated in a $415m settlement . Another, more sophisticated method is importing large numbers of skilled guest workers from other countries through the H1-B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status.

Guest workers and wage-fixing are useful tools for restraining labor costs. But nothing would make programming cheaper than making millions more programmers. And where better to develop this workforce than America's schools? It's no coincidence, then, that the campaign for code education is being orchestrated by the tech industry itself. Its primary instrument is Code.org, a nonprofit funded by Facebook, Microsoft, Google and others . In 2016, the organization spent nearly $20m on training teachers, developing curricula, and lobbying policymakers.

Silicon Valley has been unusually successful in persuading our political class and much of the general public that its interests coincide with the interests of humanity as a whole. But tech is an industry like any other. It prioritizes its bottom line, and invests heavily in making public policy serve it. The five largest tech firms now spend twice as much as Wall Street on lobbying Washington – nearly $50m in 2016. The biggest spender, Google, also goes to considerable lengths to cultivate policy wonks favorable to its interests – and to discipline the ones who aren't.

Silicon Valley is not a uniquely benevolent force, nor a uniquely malevolent one. Rather, it's something more ordinary: a collection of capitalist firms committed to the pursuit of profit. And as every capitalist knows, markets are figments of politics. They are not naturally occurring phenomena, but elaborately crafted contraptions, sustained and structured by the state – which is why shaping public policy is so important. If tech works tirelessly to tilt markets in its favor, it's hardly alone. What distinguishes it is the amount of money it has at its disposal to do so.

Money isn't Silicon Valley's only advantage in its crusade to remake American education, however. It also enjoys a favorable ideological climate. Its basic message – that schools alone can fix big social problems – is one that politicians of both parties have been repeating for years. The far-fetched premise of neoliberal school reform is that education can mend our disintegrating social fabric. That if we teach students the right skills, we can solve poverty, inequality and stagnation. The school becomes an engine of economic transformation, catapulting young people from challenging circumstances into dignified, comfortable lives.

This argument is immensely pleasing to the technocratic mind. It suggests that our core economic malfunction is technical – a simple asymmetry. You have workers on one side and good jobs on the other, and all it takes is training to match them up. Indeed, every president since Bill Clinton has talked about training American workers to fill the "skills gap". But gradually, one mainstream economist after another has come to realize what most workers have known for years: the gap doesn't exist. Even Larry Summers has concluded it's a myth.

The problem isn't training. The problem is there aren't enough good jobs to be trained for . The solution is to make bad jobs better, by raising the minimum wage and making it easier for workers to form a union, and to create more good jobs by investing for growth. This involves forcing business to put money into things that actually grow the productive economy rather than shoveling profits out to shareholders. It also means increasing public investment, so that people can make a decent living doing socially necessary work like decarbonizing our energy system and restoring our decaying infrastructure.

Everyone should have the opportunity to learn how to code. Coding can be a rewarding, even pleasurable, experience, and it's useful for performing all sorts of tasks. More broadly, an understanding of how code works is critical for basic digital literacy – something that is swiftly becoming a requirement for informed citizenship in an increasingly technologized world.

But coding is not magic. It is a technical skill, akin to carpentry. Learning to build software does not make you any more immune to the forces of American capitalism than learning to build a house. Whether a coder or a carpenter, capital will do what it can to lower your wages, and enlist public institutions towards that end.

Silicon Valley has been extraordinarily adept at converting previously uncommodified portions of our common life into sources of profit. Our schools may prove an easy conquest by comparison.

See also:

willyjack, 21 Sep 2017 16:56

"Everyone should have the opportunity to learn how to code. " OK, and that's what's being done. And that's what the article is bemoaning. What would be better: teach them how to change tires or groom pets? Or pick fruit? Amazingly condescending article.

MrFumoFumo , 21 Sep 2017 14:54
However, training lots of people to be coders won't automatically result in lots of people who can actually write good code. Nor will it give managers/recruiters the necessary skills to recognize which programmers are any good.

congenialAnimal -> alfredooo , 24 Sep 2017 09:57

A valid rebuttal but could I offer another observation? Exposing large portions of the school population to coding is not going to magically turn them into coders. It may increase their basic understanding but that is a long way from being a software engineer.

Just as children join art, drama or biology classes so they do not automatically become artists, actors or doctors. I would agree entirely that just being able to code is not going to guarantee the sort of income that might be aspired to. As with all things, it takes commitment, perseverance and dogged determination. I suppose ultimately it becomes the Gattaca argument.

alfredooo -> racole , 24 Sep 2017 06:51
Fair enough, but his central argument -- that an overabundance of coders will drive wages in that sector down -- is generally true, so in the future if you want your kids to go into a profession that will earn them 80k+, then being a "coder" is not the route to take. When coding is -- like reading, writing, and arithmetic -- just a basic skill, there's no guarantee having it will automatically translate into getting a "good" job.
Wiretrip , 21 Sep 2017 14:14
This article lumps everyone in computing into the 'coder' bin, without actually defining what 'coding' is. Yes there is a glut of people who can knock together a bit of HTML and JavaScript, but that is not really programming as such.

There are huge shortages of skilled developers, however: people who can apply computer science and engineering in terms of analysis and design of software. These are the real skills for which relatively few people have a true aptitude.

The lack of really good skills is starting to show in some terrible software implementation decisions -- Slack, for example: written as a web app running in Electron (so that JavaScript code monkeys could knock it out quickly), but resulting in awful performance. We will see more of this in the coming years...

Taylor Dotson -> youngsteveo , 21 Sep 2017 13:53
My brother is a programmer, and in his experience these coding exams don't test anything but whether or not you took (and remember) a very narrow range of problems introduced in the first years of a computer science degree. The entire hiring process seems premised on a range of ill-founded ideas about what skills are necessary for the job and how to assess them in people. They haven't yet grasped that those kinds of exams mostly test test-taking ability, rather than intelligence, creativity, diligence, communication ability, or anything else that a job requires besides coughing up the right answer in a stressful, timed environment without outside resources.
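
For those who have never sat such a screen, the staple question looks something like the following -- a hypothetical but representative example in C (names are mine), straight out of first-year data structures, rewarding recall under time pressure rather than design or communication skill:

    #include <stddef.h>

    struct node { int value; struct node *next; };

    /* Classic whiteboard answer: reverse a singly linked list in place. */
    struct node *reverse(struct node *head)
    {
        struct node *prev = NULL;
        while (head != NULL) {
            struct node *next = head->next;  /* save the rest of the list */
            head->next = prev;               /* point current node backwards */
            prev = head;                     /* grow the reversed prefix */
            head = next;                     /* advance into the old list */
        }
        return prev;                         /* new head: the old tail */
    }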

The_Raven , 23 Sep 2017 15:45

I'm an embedded software/firmware engineer. Every similar engineer I've ever met has had the same background - starting in electronics and drifting into embedded software writing in C and assembler. It's virtually impossible to do such software without an understanding of electronics. When it goes wrong you may need to get the test equipment out to scope the hardware to see if it's a hardware or software problem. Coming from a pure computing background just isn't going to get you a job in this type of work.
waltdangerfield , 23 Sep 2017 14:42
All schools teach drama and most kids don't end up becoming actors. You need to give all kids access to coding so that some can go on to make a career out of it.
TwoSugarsPlease , 23 Sep 2017 06:13
Coding salaries will inevitably fall over time, but such skills give workers the option, once they discover that their income is no longer sustainable in the UK, of moving somewhere more affordable and working remotely.
DiGiT81 -> nixnixnix , 23 Sep 2017 03:29
Completely agree. Coding is a necessary life skill for the 21st century, but there are levels to every skill -- from the basic needs of an office job to the advanced and specialised.
nixnixnix , 23 Sep 2017 00:46
Lots of people can code but very few of us ever get to the point of creating something new that has a loyal and enthusiastic user-base. Everyone should be able to code because it is or will be the basis of being able to create almost anything in the future. If you want to make a game in Unity, knowing how to code is really useful. If you want to work with large data-sets, you can't rely on Excel and so you need to be able to code (in R?). The use of code is becoming so pervasive that it is going to be like reading and writing.

All the science and engineering graduates I know can code, but none of them have ever sold stand-alone software. The argument made above is like saying that teaching everyone to write will drive down the wages of writers. Writing is useful for anyone and everyone, but only a tiny fraction of people who can write actually write novels or even newspaper columns.

DolyGarcia -> Carl Christensen , 22 Sep 2017 19:24
Immigrants always have a big advantage over locals for any company, including tech companies: the government makes sure that they will stay in their place and never complain about low salaries or bad working conditions because, you know what? If the company sacks you, an immigrant may be forced to leave the country where they live because their visa expires, which is never going to happen with a local. Companies always have more leverage over immigrants. Given a choice between more and less exploitable workers, companies will choose the most exploitable ones.

Which is something that Marx figured out more than a century ago, and why he insisted that socialism had to be international, which led to the founding of the First International. If workers' fights didn't go across country boundaries, companies would just play people from one country against the other. Unfortunately, at some point in time socialists forgot this very important fact.

xxxFred -> Tomix Da Vomix , 22 Sep 2017 18:52
So what's wrong with having lots of people able to code? The only argument you seem to have is that it'll lower wages. So do you think that we should stop teaching writing skills so that journalists can be paid more? And no one is going to "force" kids into high-level abstract coding practices in kindergarten, fgs. But there is ample empirical proof that young children can learn basic principles. In fact, the younger that children are exposed to anything, the better they can enhance their skills and knowledge of it later in life, and computing concepts are no different.
Tomix Da Vomix -> xxxFred , 22 Sep 2017 18:40
You're completely missing the point. Kids are forced into the programming field (or even STEM as a more general term) before they develop their abstract reasoning. For that matter, you're not producing highly skilled people, but functional imbeciles and a pool of cheap labor that will eventually lower the wages.
Conspiracy theory? So Google, FB and others paying hundreds of millions of dollars to form a cartel to lower wages is not true? To me you sound more like a 1969 moon-landing denier than the Guardian does. Tech companies are not financing those incentives because they have a good soul. Their primary drive has always been money; otherwise they wouldn't sell your personal data to earn it.

But hey, you can always sleep peacefully when your kid becomes a coder. When he is 50, who will want a Cobol or Ada programmer with 25 years of experience when you can get a 16-year-old kid out of high school for 1/10 of the price? Go back to sleep...

Carl Christensen -> xxxFred , 22 Sep 2017 16:49
it's ridiculous because even out of a pool of computer science B.Sc. or M.Sc. grads - companies are only interested in the top 10%. Even the most mundane company with crappy IT jobs swears that they only hire "the best and the brightest."
Carl Christensen , 22 Sep 2017 16:47
It's basically a con-job by the big Silicon Valley companies offshoring as many US jobs as they can, or "inshoring" via exploitation of the H1B visa - so they can say "see, we don't have 'qualified' people in the US - maybe when these kids learn to program in a generation." As if American students haven't been coding for decades -- and saw their salaries plummet as the H1B visa and Indian offshore firms exploded......
Declawed -> KDHughes , 22 Sep 2017 16:40
Dude, stow the attitude. I've tested code from various entities, and seen every kind of crap peddled as gold.

But I've also seen a little 5-foot giggly lady with two kids grumble a bit and save a $100,000 product by rewriting another coder's man-month of work in a few days, without any flaws or cracks. Almost nobody will ever know she did that. She's so far beyond my level it hurts.

And yes, the author knows nothing. He's genuinely crying wolf while knee-deep in amused wolves. The last time I was in San Jose, years ago, the room was already full of people with Indian surnames. If the problem was REALLY serious, a programmer from POLAND was called in.

If you think fighting for a violinist spot is hard, try fighting for it with every spare violinist in the world. I am training my Indian replacement to do my job right now. At least the public can appreciate a good violin. Can you appreciate Duff's device? (See the sketch below.)

So by all means, don't teach local kids how to think in a straight line, just in case they make a dent in the price of wages IN INDIA.... *sheesh*
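
For readers who missed the reference: Duff's device is a real, if notorious, C idiom from 1983 that interleaves a switch with a do-while loop to unroll a copy loop. Below is a sketch of the common memory-to-memory adaptation (Tom Duff's original copied into a single memory-mapped output register; both variants assume count > 0):

    /* Duff's device: the switch jumps into the middle of the unrolled
       do-while to handle the count % 8 leftover copies, then the loop
       runs the remaining full groups of eight. Assumes count > 0. */
    void duff_copy(char *to, const char *from, int count)
    {
        int n = (count + 7) / 8;
        switch (count % 8) {
        case 0: do { *to++ = *from++;
        case 7:      *to++ = *from++;
        case 6:      *to++ = *from++;
        case 5:      *to++ = *from++;
        case 4:      *to++ = *from++;
        case 3:      *to++ = *from++;
        case 2:      *to++ = *from++;
        case 1:      *to++ = *from++;
                } while (--n > 0);
        }
    }

Whether anyone should write this in production is a separate question; the commenter's point stands that appreciating why this is even legal C separates people who think in the language from people who merely type it.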

Declawed -> IanMcLzzz , 22 Sep 2017 15:35
That's the best possible summarisation of this extremely dumb article. Bravo.

For those who don't know how to think of coding, like the article author, here's a few analogies :

A computer is a box that replays frozen thoughts, quickly. That is all.

Coding is just the art of explaining. Anyone who can explain something patiently and clearly can code. Anyone who can't, can't.

Making hardware is very much like growing produce while blind. Making software is very much like cooking that produce while blind.

Imagine looking after a room full of young, eager, obedient children who only do exactly, *exactly*, what you tell them to do, but move around at the speed of light. Imagine having to try to keep them from smashing into each other or decapitating themselves on the corners of tables, tripping over toys and crashing into walls, etc., while you get them all to play games together.

The difference between a good coder and a bad coder is almost life and death. Imagine a broth prepared with ingredients from a dozen co-ordinating geniuses and one idiot, that you'll mass produce. The soup is always far worse for the idiot's additions. The more cooks you involve, the more chance your mass produced broth will taste bad.

People who hire coders typically can't tell a good coder from a bad coder.

Zach Dyer -> Mystik Al , 22 Sep 2017 15:18
Tech jobs will probably always be available long after you're gone, or until another mass extinction.
edmundberk -> AmyInNH , 22 Sep 2017 14:59
No, you do it in your own time. If you're not prepared to put in long days, IT is not for you in any case. It was ever thus, but more so now due to offshoring - rather than the rather obscure forces you seem to believe are important.
WithoutPurpose -> freeandfair , 22 Sep 2017 13:21
Bit more than that.
peter nelson -> offworldguy , 22 Sep 2017 12:44
Sorry, offworldguy, but you're losing this one really badly. I'm a professional software engineer in my 60's and I know lots of non-professionals in my age range who write little programs, scripts and apps for fun. I know this because they often contact me for help or advice.

So you've now been told by several people in this thread that ordinary people do code for fun or recreation. The fact that you don't know any probably says more about your network of friends and acquaintances than about the general population.

xxxFred , 22 Sep 2017 12:18
This is one of the daftest articles I've come across in a long while.
If it's possible that so many kids can be taught to code well enough that wages come down, then that proves that the only reason we've been paying so much for development is the scarcity of people able to do it, not that it's intrinsically so hard that only a select few could do it anyway. In which case, there is no ethical argument for restricting the pool of skilled workers to some select group. Anyone able to do it should have an equal opportunity to do it.
What is the argument for not teaching coding (other than to artificially keep wages high)? Why not stop teaching the three R's, in order to boost white-collar wages in general?
Computing is an ever-increasingly intrinsic part of life, and people need to understand it at all levels. It is not just unfair, but tantamount to neglect, to fail to teach children all the skills they may require to cope as adults.
Having said that, I suspect that in another generation or two a good many lower-level coding jobs will be redundant anyway, with such code being automatically generated, and "coders" at this level will be little more than technicians setting various parameters. Even so, understanding the basics behind computing is a part of understanding the world they live in, and every child needs that.
Suggesting that teaching coding is some kind of conspiracy to force wages down is, well, it makes the moon-landing conspiracy look sensible by comparison.
timrichardson -> offworldguy , 22 Sep 2017 12:16
I think it is important to demystify advanced technology; I think that has importance in its own right. Plus, schools should expose kids to things which may spark their interest. Not everyone who does a science project goes on years later to get a PhD, but you'd think that it makes it more likely. Same as giving a kid some music lessons. There is a big difference between serious coding and the basic steps needed to automate a customer service team or a marketing program, but the people who have some mastery over automation will have an advantage in many jobs. Advanced machines are clearly going to be a huge part of our future. What should we do about it, if not teach kids how to understand these tools?
rogerfederere -> William Payne , 22 Sep 2017 12:13
tl;dr.
Mystik Al , 22 Sep 2017 12:08
As automation is about to put 40% of the workforce permanently out of work, getting into tech seems like a good idea!
timrichardson , 22 Sep 2017 12:04
This is like arguing that teaching kids to write is nothing more than a plot to flood the market for journalists. Teaching first aid and CPR does not make everyone a doctor.
Coding is an essential skill for many jobs already: 50 years ago, who would have thought you needed coders to make movies? Being a software engineer, a serious coder, is hard. In fact, it takes more than technical coding to be a software engineer: you can learn to code in a week. Software engineering is a four-year degree, and even then you've just started a career. But depriving kids of some basic insights may mean they won't have the basic skills needed in the future, even for controlling their car and house. By all means, send your kids to a school that doesn't teach coding. I won't.
James Jones -> vimyvixen , 22 Sep 2017 11:41
Did you learn SNOBOL, or is Snowball a language I'm not familiar with? (Entirely possible; as an American I never would have known Extended Mercury Autocode existed were it not for a random book acquisition at my home town library when I was a kid.)
William Payne , 22 Sep 2017 11:17
The tide that is transforming technology jobs from "white collar professional" into "blue collar industrial" is part of a larger global economic cycle.

Successful "growth" assets inevitably transmogrify into "value" and "income" assets as they progress through the economic cycle. The nature of their work transforms also. No longer focused on innovation; on disrupting old markets or forging new ones; their fundamental nature changes as they mature into optimising, cost reducing, process oriented and most importantly of all -- dividend paying -- organisations.

First, the market invests. And then, .... it squeezes.

Immature companies must invest in their team; must inspire them to be innovative so that they can take the creative risks required to create new things. This translates into high skills, high wages and "white collar" social status.

Mature, optimising companies on the other hand must necessarily avoid risks and seek variance-minimising predictability. They seek to control their human resources; to eliminate creativity; to make the work procedural, impersonal and soulless. This translates into low skills, low wages and "blue collar" social status.

This is a fundamental part of the economic cycle; but it has been playing out on the global stage, which has had the effect of hiding some of its effects.

Over the past decades, technology knowledge and skills have flooded away from "high cost" countries and towards "best cost" countries at a historically significant rate. Possibly at the maximum rate that global infrastructure and regional skills pools can support. Much of this necessarily inhumane and brutal cost cutting and deskilling has therefore been hidden by the tide of outsourcing and offshoring. It is hard to see the nature of the jobs change when the jobs themselves are changing hands at the same time.

The ever-tighter ratchet of dehumanising industrialisation, productivity and efficiency continues apace, however, and as our global system matures and evens out, we see the seeds of what we have sown sail home from over the sea.

Technology jobs in developed nations have been skewed towards "growth" activities since for the past several decades most "value" and "income" activities have been carried out in developing nations. Now, we may be seeing the early preparations for the diffusion of that skewed, uneven and unsustainable imbalance.

The good news is that "Growth" activities are not going to disappear from the world. They just may not be so geographically concentrated as they are today. Also, there is a significant and attention-worthy argument that the re-balancing of skills will result in a more flexible and performant global economy as organisations will better be able to shift a wider variety of work around the world to regions where local conditions (regulation, subsidy, union activity etc...) are supportive.

For the individuals concerned it isn't going to be pretty. And of course it is just another example of the race to the bottom that pits states and public sector purse-holders against one another to win the grace and favour of globally mobile employers.

As a power play move it has a sort of inhumanly psychotic inevitability to it which is quite awesome to observe.

I also find it ironic that the only way to tame the leviathan that is the global free-market industrial system might actually be effective global governance and international cooperation within a rules-based system.

Both "globalist" but not even slightly both the same thing.

Vereto -> Wiretrip , 22 Sep 2017 11:17
Not just coders; it puts even IT Ops guys into this bin. Basically the good old "so you work with computers" line I used to hear a lot 10-15 years ago.
Sangmin , 22 Sep 2017 11:15
You can teach everyone how to code, but it doesn't necessarily mean everyone will be able to work as a coder. We all learn math but that doesn't mean we're all mathematicians. We all know how to write but we're not all professional writers.

I have a graduate degree in CS and have been to a coding bootcamp. Not everyone's brain is wired to become a successful coder. There is a particular way coders think. The quality of a product will stand out based on these differences.

Vereto -> Jared Hall , 22 Sep 2017 11:12
It is very hyperbolic to assume that the profit in those companies comes from decreasing wages. In my company the profit is driven by the ability to deliver products to the market. And that is limited by the number of top people (not just any coder) you can have.
KDHughes -> kcrane , 22 Sep 2017 11:06
You realise that the arts are massively oversupplied and that most artists earn very little, if anything? Which is sort of like the situation the author is warning about. But hey, he knows nothing. Congratulations, though, on writing one of the most pretentious posts I've ever read on CIF.
offworldguy -> Melissa Boone , 22 Sep 2017 10:21
So you know kids, college-age people and software developers who enjoy doing it in their leisure time? Do you know any middle-aged mothers, fathers or grandparents who enjoy it and are not software developers?

Sorry, I don't see coding as a leisure pursuit that is going to take off beyond a very narrow demographic, and if it becomes apparent (as I believe it will) that there is not going to be a huge increase in coding job opportunities, then it will likely wither in schools too, perhaps replaced by music lessons.

Bread Eater , 22 Sep 2017 10:02
From their perspective, yes. But there are a lot of opportunities in tech, so it does benefit students looking for jobs.
Melissa Boone -> jamesbro , 22 Sep 2017 10:00
No, because software developers probably fail more often than they succeed. Building anything worthwhile is an iterative process. And it's not just the compiler but the other devs, your designer, your PM, all looking at your work.
Melissa Boone -> peterainbow , 22 Sep 2017 09:57
It's not shallow or lazy. I also work at a tech company and it's pretty common to do that across job fields. Even in HR marketing jobs, we hire students who can't point to an internship or other kind of experience in college, not simply grades.
Vereto -> savingUK , 22 Sep 2017 09:50
It will take ages; the issue with Indian programmers lies in the education system and in the "Yes boss" culture.

But on the other hand, most Americans are just as bad as the Indians.

Melissa Boone -> offworldguy , 22 Sep 2017 09:50
A lot of people do find it fun. I know many kids - high school and young college age - who code in their leisure time because they find it pleasurable to make small apps and video games. I myself enjoy it too. Your argument is like saying that since you don't like to read books in your leisure time, nobody else does.

The point is your analogy isn't a good one - people who learn to code can not only enjoy it in their spare time just like music, but they can also use it to accomplish all kinds of basic things. I have a friend who's a software developer who has used code to program his Roomba to vacuum in a specific pattern and to play Candy Land with his daughter when they lost the spinner.

Owlyrics -> CapTec , 22 Sep 2017 09:44
Creativity could be added to your list. Anyone can push a button but only a few can invent a new one.
One company in the US (after it was taken over by a new owner) decided it was more profitable to import button pushers from offshore; they lost 7 million customers (gamers) and had to re-employ more of the original American developers to maintain their high standards and profits.
Owlyrics -> Maclon , 22 Sep 2017 09:40
Masters is the new Bachelors.
Maclon , 22 Sep 2017 09:22
Similar to the 500k people a year now going to university (UK) when it used to be 60k people a year (1980). There were never enough graduate jobs in 1980, so I can't see where the sudden increase in the need for graduates has come from.
PaulDavisTheFirst -> Ethan Hawkins , 22 Sep 2017 09:17

They aren't really crucial pieces of technology except for their popularity

It's early in the day for me, but this is the most ridiculous thing I've read so far, and I suspect it will be high up on the list by the end of the day.

There's no technology that is "crucial" unless it's involved in food, shelter or warmth. The rest has its "crucialness" decided by how widespread its use is, and in the case of those 3 languages, the answer is "very".

You (or I) might not like that very much, but that's how it is.

Julian Williams -> peter nelson , 22 Sep 2017 09:12
My benchmark would be whether the average new graduate in the discipline earns more or less than in one of the "professions": law, medicine, economics etc. The short answer is that they don't. Indeed, in my experience of professions, many good senior SW developers, say in finance, are paid markedly less than the marketing manager, CTO etc., who are often non-technical.

My benchmark is not "has a car, house etc." but what income 10, 15, 20 years of experience in the area generates relative to another profession, like being a GP or a corporate solicitor or a civil servant (which is usually the benchmark academics use for pay scaling). It is not to denigrate, just to say that markets don't always clear to a point where the most skilled are the highest paid.

I was also suggesting that even if you are not intending to work in the SW area, being able to translate your imagination into a program that reflects your ideas is a nice life skill.

AmyInNH -> freeandfair , 22 Sep 2017 09:05
Your assumption has no basis in reality. In my experience, as soon as Clinton ramped up H1Bs, my employer would invite six candidates with the same college/degree/curriculum in for interviews (five citizens, one foreign student) and default the offer to the foreign student without asking the interviewers a single question about the interviews. Eventually they skipped the farce of interviewing citizens altogether. That was in 1997, and it's only gotten worse. Wall St has been pretty blunt lately. It openly admits to replacing US workers with imported labor, as it's the "easiest" way to "grow" the economy, even though they know they are ousting citizens from their jobs to do so.
AmyInNH -> peter nelson , 22 Sep 2017 08:59
"People who get Masters and PhD's in computer science" Feed western universities money, for degree programs that would otherwise not exist, due to lack of market demand. "someone has a Bachelor's in CS" As citizens, having the same college/same curriculum/same grades, as foreign grad. But as citizens, they have job market mobility, and therefore are shunned. "you can make something real and significant on your own" If someone else is paying your rent, food and student loans while you do so.
Ethan Hawkins -> farabundovive , 22 Sep 2017 07:40
While true, it's not the coders' fault. The managers and execs above them have intentionally created an environment where these things are secondary. What's primary is getting the stupid piece of garbage out the door for the quarterly profit outlook. Ship it and patch it.
offworldguy -> millartant , 22 Sep 2017 07:38
Do most people find it fun? I can code. I don't find it 'fun'. Thirty years ago, as a young graduate, I might have found it slightly fun, but the 'fun' wears off pretty quickly.
Ethan Hawkins -> anticapitalist , 22 Sep 2017 07:35
In my estimation PHP is an utter abomination. Python is just a little better but still very bad. Ruby is a little better but still not at all good.

Languages like PHP, Python and JS are popular for banging out prototypes and disposable junk, but you greatly overestimate their importance. They aren't really crucial pieces of technology except for their popularity and while they won't disappear they won't age well at all. Basically they are big long-lived fads. Java is now over 20 years old and while Java 8 is not crucial, the JVM itself actually is crucial. It might last another 20 years or more. Look for more projects like Ceylon, Scala and Kotlin. We haven't found the next step forward yet, but it's getting more interesting, especially around type systems.

A strong developer will be able to code well in a half dozen languages and have fairly decent knowledge of a dozen others. For me it's been many years of: Z80, x86, C, C++, Java. Also know some Perl, LISP, ANTLR, Scala, JS, SQL, Pascal, others...

millartant -> Islingtonista , 22 Sep 2017 07:26
You need a decent IDE
millartant -> offworldguy , 22 Sep 2017 07:24

One is hardly likely to 'do a bit of coding' in one's leisure time

Why not? The right problem is a fun and rewarding puzzle to solve. I spend a lot of my leisure time "doing a bit of coding".

Ethan Hawkins -> Wiretrip , 22 Sep 2017 07:12
The worst of all are the academics (on average).
Ethan Hawkins -> KatieL , 22 Sep 2017 07:09
This makes people like me with 35 years of experience shipping products on deadlines up and down every stack (from device drivers and operating systems to programming languages, platforms and frameworks to web, distributed computing, clusters, big data and ML) so much more valuable. Been there, done that.
Ethan Hawkins -> Taylor Dotson , 22 Sep 2017 07:01
It's just not true. In SV there's this giant vacuum created by Apple, Google, FB, etc. Other good companies struggle to fill positions. I know from being on the hiring side at times.
TheBananaBender -> peter nelson , 22 Sep 2017 07:00
You don't work for a major outsourcer then, like Serco, Atos or Agilisys.
offworldguy -> LabMonkey , 22 Sep 2017 06:59
Plenty of people? I don't know of a single person outside of my work, which is teeming with programmers. Not a single friend, not my neighbours, not my wife or her extended family, not my parents. Plenty of people might do it, but most people don't.
Ethan Hawkins -> finalcentury , 22 Sep 2017 06:56
Your ignorance of coding is showing. Coding IS creative.
Ricardo111 -> peter nelson , 22 Sep 2017 06:56
Agreed: by gifted I did not mean innate. It's more a mix of having the interest, the persistence, the time, the opportunity and actually enjoying that kind of challenge.

While some of those things are to a large extent innate personality traits, others are not, and you don't need the maximum of all of them; you just need enough to drive you to explore that domain.

That said, somebody who goes into coding purely for the money, and does it for the money alone, is extremely unlikely to become an exceptional coder.

Ricardo111 -> eirsatz , 22 Sep 2017 06:50
I'm as senior as they get and have interviewed quite a lot of programmers for several positions, including Technical Lead (in fact, to replace me). So far my experience leads me to believe that people who don't have a knack for coding are much less likely to expose themselves to many different languages and techniques, and are less experimentalist, thus being far less likely to have those moments of transcending mere awareness of the visible and obvious to discover the concerns and concepts behind what one does. Without those moments, which open the door to the next universe of concerns and implications, one cannot make state transitions such as Coder to Technical Designer, or Technical Designer to Technical Architect.

Sure, you can get the title and do the things from the books, but you will not get WHY those things are supposed to work (and when they will not work), and thus you cannot adjust to new conditions effectively; you will be like a sailor who can't sail away from sight of the coast because he can't navigate.

All this gets reflected in many things that enhance productivity, from the early ability to quickly piece together solutions for a new problem out of past solutions for different problems to, later, conceiving software architecture designs fitted to the typical usage pattern in the industry for which the software is going to be made.

LabMonkey , 22 Sep 2017 06:50
From the way our IT department is going, needing millions of coders is not the future. It'll be a minority of developers at the top, and an army of low wage monkeys at the bottom who can troubleshoot from a script - until AI comes along that can code faster and more accurately.
LabMonkey -> offworldguy , 22 Sep 2017 06:46

One is hardly likely to 'do a bit of coding' in one's leisure time

Really? I've programmed a few simple videogames in my spare time. Plenty of people do.

CapTec , 22 Sep 2017 06:29
Interesting piece that's fundamentally flawed. I'm a software engineer myself. There is a reason a university education of a minimum of three years is the baseline for a junior developer or 'coder'.

Software engineering isn't just writing code. I would say 80% of my time is spent designing and structuring software before I even touch the code.

Explaining software engineering as a discipline at a high level to people who don't understand it is simple.

Most of us who learn to drive learn a few basics about the mechanics of a car. We know that brake pads need to be replaced, we know that fuel is pumped into an engine when we press the gas pedal. Most of us know how to change a bulb if it blows.

The vast majority of us wouldn't be able to replace a head gasket or clutch though. Just knowing the basics isn't enough to make you a mechanic.

Studying in school isn't enough to produce software engineers. Software engineering isn't just writing code; it's cross-discipline. We also need to understand the science behind the computer; we need to understand logic, data structures, timings, how to manage memory, security, how databases work, etc.
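
To make the gap CapTec describes concrete, here is a minimal C sketch (function names are illustrative only): both functions below compile, but only someone who understands storage duration will see why the first one is broken.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    char *bad_greeting(const char *name)
    {
        char buf[64];                            /* automatic (stack) storage */
        snprintf(buf, sizeof buf, "Hello, %s", name);
        return buf;                              /* dangling pointer: buf dies here */
    }

    char *good_greeting(const char *name)
    {
        size_t len = strlen(name) + sizeof "Hello, ";
        char *buf = malloc(len);                 /* heap storage survives the return */
        if (buf != NULL)
            snprintf(buf, len, "Hello, %s", name);
        return buf;                              /* caller must free() it */
    }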

A few years of learning at school isn't nearly enough, and a degree isn't enough on its own, due to the dynamic and ever-evolving nature of software engineering. Schools teach technology that is out of date and typically don't explain the science very well.

This is why most companies don't want new developers; they want people with experience and multiple skills.

Programming is becoming cool and people think that because of that it's easy to become a skilled developer. It isn't. It takes time and effort and most kids give up.

French was on the national curriculum when I was at school. Most people including me can't hold a conversation in French though.

Ultimately there is a SKILL shortage. And that's because skill takes a long time, and many successes and failures, to acquire. Most people just give up.

This article is akin to saying 'schools are teaching basic health to reduce the wages of Doctors'. It didn't happen.

offworldguy -> thecurio , 22 Sep 2017 06:19
There is a difference. When you teach people music you teach a skill that can be used for a lifetime's enjoyment. One might sit at a piano in later years and play. One is hardly likely to 'do a bit of coding' in one's leisure time.

The other thing is: how good are people going to get at coding, and how long will they retain the skill if it is not used? I tend to think maths is similar to coding, and most adults have pretty terrible maths skills, not venturing far beyond arithmetic. Not many remember how to solve a quadratic equation or even how to rearrange some algebra.

One more thing is we know that if we teach people music they will find a use for it, if only in their leisure time. We don't know that coding will be in any way useful because we don't know if there will be coding jobs in the future. AI might take over coding but we know that AI won't take over playing piano for pleasure.

If we want to teach logical thinking then I think maths has always done this and we should make sure people are better at maths.

Alex Mackaness , 22 Sep 2017 06:08
Am I missing something here? Being able to code is a skill that is a useful addition to the skill armoury of a youngster entering the workplace. Much like reading, writing, maths... Not only is it directly applicable and pervasive in our modern world, it is built upon logic.

The important point is that American schools are not ONLY teaching youngsters to code, and producing one-dimensional robots... instead coding makes up one part of their overall skill set. Those who wish to develop their coding skills further certainly can choose to do so. Those who specialise elsewhere are more than likely to have found the skills they learnt while coding useful anyway.

I struggle to see how there is a hidden capitalist agenda here. I would argue learning the basics of coding is simply becoming seen as an integral part of the school curriculum.

thecurio , 22 Sep 2017 05:56
The word "coding" is shorthand for "computer programming" or "software development" and it masks the depth and range of skills that might be required, depending on the application.

This subtlety is lost, I think, on politicians and perhaps the general public. Asserting that teaching lots of people to code is a sneaky way to commoditise an industry might have some truth to it, but remember that commoditisation (or "sharing and re-use" as developers might call it) is nothing new. The creation of freely available and re-usable software components and APIs has driven innovation, and has put much power in the hands of developers who would not otherwise have the skill or time to tackle such projects.

There's nothing to fear from teaching more people to "code", just as there's nothing to fear from teaching more people to "play music". These skills simply represent points on a continuum.

There's room for everyone, from the kid on a kazoo all the way to Coltrane at the Village Vanguard.

sbw7 -> ragingbull , 22 Sep 2017 05:44
I taught CS. Out of around 100 graduates I'd say maybe 5 were reasonable software engineers. The rest would be fine in tech support or other associated trades, but not writing software. It's not just a set of trainable skills; it's a set of attitudes and ways of perceiving and understanding that just aren't that common.
offworldguy , 22 Sep 2017 05:02
I can't understand the rush to teach coding in schools. First of all, I don't think we are going to be a country of millions of coders, and secondly, if most people have the skills then coding is hardly going to be a well-paid job. Thirdly, you can learn coding from scratch after school, like people of my generation did. You could argue that it is part of a well-rounded education, but then it is as important for your career as learning Shakespeare, knowing what an oxbow lake is or being able to do calculus: most jobs just won't need you to know.
savingUK -> yannick95 , 22 Sep 2017 04:35
While you roll on the floor laughing, these countries will slowly but surely get their act together. That is how they work. There are top-quality coders over there and they will soon be promoted into a position to organise the others.

You are probably too young to remember when people laughed at electronic products made in Japan, then Taiwan. History will repeat itself.

zii000 -> JohnFreidburg , 22 Sep 2017 04:04
Yes, it's ironic, and no different here in the UK. Traditionally Labour was the party focused on dividing the economic pie more fairly, the Tories on growing it for the benefit of all. It's now completely upside down, with the Tories paying lip service to the idea of pay rises but in reality supporting this deflationary race to the bottom, hammering down salaries and so shrinking discretionary spending power, which forces price reductions to match, and so more pressure on employers to cut costs... ad infinitum.
Labour now favour policies which would cause an expansion across the entire economy through pay rises and dramatically increased investment with perhaps more tolerance of inflation to achieve it.
ID0193985 -> jamesbro , 22 Sep 2017 03:46
Not surprising if they're working for a company that is cold-calling people - which should be banned in my opinion. Call centres providing customer support are probably less abuse-heavy since the customer is trying to get something done.
vimyvixen , 22 Sep 2017 02:04
I taught myself to code in 1974. Fortran and COBOL were first. Over the years as an aerospace engineer I coded in numerous languages ranging from PLM, Snowball and Basic to more assembly languages than I can recall, not to mention deep down in machine code on more architectures than most know even existed. The bottom line is that coding is easy. It doesn't take a genius to code, just another way of thinking. Consider all the bugs in the software available now. These "coders", not sufficiently trained, need adult supervision by engineers who know what they are doing for computer systems that are important, such as the electrical grid, nuclear weapons, and safety-critical systems. If you want to program toy apps then code away; if you want to do something important, learn engineering AND coding.
Dwight Spencer , 22 Sep 2017 01:44
Laughable. It takes only an above-average IQ to code. Today's coders are akin to the auto mechanics of the 1950s, when practically every high school had auto shop instruction... nothing but a source of cheap labor for doing routine implementations of software systems using powerful code libraries built by REAL software engineers.
sieteocho -> Islingtonista , 22 Sep 2017 01:19
That's a bit like saying that calculus is more valuable than arithmetic, so why teach children arithmetic at all?

Because without the arithmetic, you're not going to get up to the calculus.

JohnFreidburg -> Tommyward , 22 Sep 2017 01:15
I disagree. Technology firms are just like other firms. Why then the collusion not to pay more to workers coming from other companies? To believe that they are anything else is naive. The author is correct. We need policies that actually grow the economy, not leaders who cave to what the CEOs want, like Bill Clinton did. He brought in NAFTA at the behest of CEOs, and all it ended up doing was ripping apart the rust belt and ushering in Trump.
Tommyward , 22 Sep 2017 00:53
So the media always needs some bad guys to write about, and this month they seem to have it in for the tech industry. The article is BS. I interview a lot of people to join a large tech company, and I can guarantee you that we aren't trying to find cheaper labor, we're looking for the best talent.

I know that lots of different jobs have been outsourced to low cost areas, but these days the top companies are instead looking for the top talent globally.

I see this article as a hit piece against Silicon Valley, and it flies in the face of the evidence.

finalcentury , 22 Sep 2017 00:46
This has got to be the most cynical and idiotic social-interest piece I have ever read in the Guardian. Once upon a time it was very helpful to learn carpentry and machining, but now, even if you are learning those, you will get a big and indispensable head start if you have some logic and programming skills. The fact is, almost no matter what you do, you can apply logic and programming skills to give you an edge. Even journalists.
hoplites99 , 22 Sep 2017 00:02
Yup, rings true. I've been in hi-tech for over 40 years and seen the changes. I was in Silicon Valley for 10 years at a startup. India is taking over: my current US company now has a majority-Indian executive team and is moving work to India. US politicians push coding to drive down wages to Indian levels.

On the bright side I am old enough and established enough to quit tomorrow; it's someone else's problem. But I still despise those who have sold us out, like the Clintons, the Bushes, the Googoids, the Zuckerboids.

liberalquilt -> yannick95 , 21 Sep 2017 23:45
Sure, markets existed before governments, but capitalism didn't, and in fact can't. It needs the organs of state, the banking system, an education system, and an infrastructure.
thegarlicfarmer -> canprof , 21 Sep 2017 23:36
Then teach them other things, but not coding! Here in Australia every child of school age has to learn coding. Now tell me that every one of them will need it? Look beyond computers, as coding will soon be automated just like every other job.
Islingtonista , 21 Sep 2017 22:25
If you have never coded then you will not appreciate how labour intensive it is. Coders effectively use line editors to type in, line by line, the instructions. And syntax is critical; add a comma when you meant a semicolon and the code doesn't work properly. Yeah, we use frameworks and libraries of already written subroutines, but, in the end, it is all about manually typing in the code.

Which is an expensive way of doing things (hence the attractions of 'off-shoring' the coding task to low cost economies in Asia).

And this is why teaching kids to code is a waste of time.

Already, AI-based systems are addressing the task of interpreting high-level design models and simply generating the required application.

One of the first uses templates and a smart chatbot to enable non-tech business people to build their websites. When they describe in non-coding terms what they want, the chatbot is able to assemble the necessary components and make the requisite template amendments to build a working website.

Much cheaper than hiring expensive coders to type it all in manually.

It's early days yet, but coding may well be one of the big losers to AI automation along with all those back office clerical jobs.

Teaching kids how to think about design rather than how to code would be much more valuable.
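
Islingtonista's point above about syntax sensitivity is easy to demonstrate. A minimal C sketch in which one stray semicolon silently changes what the program does:

    #include <stdio.h>

    int main(void)
    {
        int i;

        for (i = 0; i < 5; i++)        /* intended: prints 0 through 4 */
            printf("%d\n", i);

        for (i = 0; i < 5; i++);       /* stray ';' makes the loop body empty */
            printf("%d\n", i);         /* runs once, printing 5 */

        return 0;
    }

Both loops compile without complaint; only the first does what was meant.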

jamesbro -> peter nelson , 21 Sep 2017 21:31
Thick-skinned? Just because you might get a few error messages from the compiler? Call centre workers have to put up with people telling them to fuck off eight hours a day.
Joshua Ian Lee , 21 Sep 2017 21:03
Spot on. Society will never need more than 1% of its people to code. We will need far more garbage men. There are only so many (relatively) good jobs to go around, and it's about competing to get them.
canprof , 21 Sep 2017 20:53
I'm a professor (not of computer science) and yet, I try to give my students a basic understanding of algorithms and logic, to spark an interest and encourage them towards programming. I have no skin in the game, except that I've seen unemployment first-hand, and want them to avoid it. The best chance most of them have is to learn to code.
Evelita , 21 Sep 2017 14:35
Educating youth does not drive wages down. It drives our economy up. China, India, and other countries are training youth in programming skills. Educating our youth means that they will be able to compete globally. This is the standard GOP stance: that we don't need to educate our youth, but can instead fantasize about high-paying manufacturing jobs miraculously coming back.

Many jobs, including new manufacturing jobs have an element of coding because they are automated. Other industries require coding skills to maintain web sites and keep computer systems running. Learning coding skills opens these doors.

Coding teaches logic, an essential thought process. Learning to code, like learning anything, increases the brain's ability to adapt to new environments, which is essential to our survival as a species. We must invest in educating our youth.

cwblackwell , 21 Sep 2017 13:38
"Contrary to public perception, the economy doesn't actually need that many more programmers." This really looks like a straw man introducing a red herring. A skill can be extremely valuable for those who do not pursue it as a full time profession.

The economy doesn't actually need that many more typists, pianists, mathematicians, athletes, dietitians. So, clearly, teaching typing, the piano, mathematics, physical education, and nutrition is a nefarious plot to drive down salaries in those professions. None of those skills could possibly enrich the lives or enhance the productivity of builders, lawyers, public officials, teachers, parents, or store managers.

DJJJJJC , 21 Sep 2017 14:23

A study by the Economic Policy Institute found that the supply of American college graduates with computer science degrees is 50% greater than the number hired into the tech industry each year.

You're assuming that all those people are qualified to work in software because they have a piece of paper that says so, but that's not a valid assumption. The quality of computer science degree courses is generally poor, and most people aren't willing or able to teach themselves. Universities are motivated to award degrees anyway because if they only awarded degrees to students who are actually qualified then that would reflect very poorly on their quality of teaching.

A skills shortage doesn't mean that everyone who claims to have a skill gets hired and there are still some jobs left over that aren't being done. It means that employers are forced to hire people who are incompetent in order to fill all their positions. Many people who get jobs in programming can't really do it and do nothing but create work for everyone else. That's why most of the software you use every day doesn't work properly. That's why competent programmers' salaries are still high in spite of the apparently large number of "qualified" people who aren't employed as programmers.

[Oct 02, 2017] Programming vs coding

This idiotic US term "coder" is complete baloney.
Notable quotes:
"... You can learn to code, but that doesn't mean you'll be good at it. There will be a few who excel but most will not. This isn't a reflection on them but rather the reality of the situation. In any given area some will do poorly, more will do fairly, and a few will excel. The same applies in any field. ..."
"... Oh no, there's loads of people who say they're coders, who have on their CV that they're coders, that have been paid to be coders. Loads of them. Amazingly, about 9 out of 10 of them, experienced coders all, spent ages doing it, not a problem to do it, definitely a coder, not a problem being "hands on"... can't actually write working code when we actually ask them to. ..."
"... I feel for your brother, and I've experienced the exact same BS "test" that you're describing. However, when I said "rudimentary coding exam", I wasn't talking about classic fiz-buz questions, Fibonacci problems, whiteboard tests, or anything of the sort. We simply ask people to write a small amount of code that will solve a simple real world problem. Something that they would be asked to do if they got hired. We let them take a long time to do it. We let them use Google to look things up if they need. You would be shocked how many "qualified applicants" can't do it. ..."
"... "...coding is not magic. It is a technical skill, akin to carpentry. " I think that is a severe underestimation of the level of expertise required to conceptualise and deliver robust and maintainable code. The complexity of integrating software is more equivalent to constructing an entire building with components of different materials. If you think teaching coding is enough to enable software design and delivery then good luck. ..."
"... Being able to write code and being able to program are two very different skills. In language terms its the difference between being able to read and write (say) English and being able to write literature; obviously you need a grasp of the language to write literature but just knowing the language is not the same as being able to assemble and marshal thought into a coherent pattern prior to setting it down. ..."
"... What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra. ..."
"... Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it. ..."
"... A lot of resumes come across my desk that look qualified on paper, but that's not the same thing as being able to do the job. Secondarily, while I agree that one day our field might be replaced by automation, there's a level of creativity involved with good software engineering that makes your carpenter comparison a bit flawed. ..."
Oct 02, 2017 | profile.theguardian.com
Wiretrip -> Mark Mauvais , 21 Sep 2017 14:23
Yes, 'engineers' (and particularly mathematicians) write appalling code.
Trumbledon , 21 Sep 2017 14:23
A good developer can easily earn £600-800 per day, which suggests to me that they are in high demand, and society needs more of them.
Wiretrip -> KatieL , 21 Sep 2017 14:22
Agreed, to many people 'coding' consists of copying other people's JavaScript snippets from StackOverflow... I tire of the many frauds in the business...
stratplaya , 21 Sep 2017 14:21
You can learn to code, but that doesn't mean you'll be good at it. There will be a few who excel but most will not. This isn't a reflection on them but rather the reality of the situation. In any given area some will do poorly, more will do fairly, and a few will excel. The same applies in any field.
peter nelson -> UncommonTruthiness , 21 Sep 2017 14:21

The ship has sailed on this activity as a career.

Oh, rubbish. I'm in the process of retiring from my job as an Android software designer so I'm tasked with hiring a replacement for my organisation. It pays extremely well, the work is interesting, and the company is successful and serves an important worldwide industry.

Still, finding highly-qualified people is hard and they get snatched up in mid-interview because the demand is high. Not only that but at these pay scales, we can pretty much expect the Guardian will do yet another article about the unconscionable gap between what rich, privileged techies like software engineers make and everyone else.

Really, we're damned if we do and damned if we don't. If tech workers are well-paid we're castigated for gentrifying neighbourhoods and living large, and yet anything that threatens to lower what we're paid produces conspiracy-theory articles like this one.

Fanastril -> Taylor Dotson , 21 Sep 2017 14:17
I learned to cook in school. Was there a shortage of cooks? No. Did I become a professional cook? No. But I sure as hell would not have missed the skills I learned for the world, and I use them every day.
KatieL -> Taylor Dotson , 21 Sep 2017 14:13
Oh no, there's loads of people who say they're coders, who have on their CV that they're coders, that have been paid to be coders. Loads of them. Amazingly, about 9 out of 10 of them, experienced coders all, spent ages doing it, not a problem to do it, definitely a coder, not a problem being "hands on"... can't actually write working code when we actually ask them to.
youngsteveo -> Taylor Dotson , 21 Sep 2017 14:12
I feel for your brother, and I've experienced the exact same BS "test" that you're describing. However, when I said "rudimentary coding exam", I wasn't talking about classic fiz-buz questions, Fibonacci problems, whiteboard tests, or anything of the sort. We simply ask people to write a small amount of code that will solve a simple real world problem. Something that they would be asked to do if they got hired. We let them take a long time to do it. We let them use Google to look things up if they need. You would be shocked how many "qualified applicants" can't do it.
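No actual exam appears in the thread, but as a hypothetical sketch of the kind of small real-world exercise youngsteveo describes (read some whitespace-separated integers, report simple statistics), something like the following C program is representative: modest, practical, and yet a surprisingly effective filter.

    #include <stdio.h>

    int main(void)
    {
        long v, min = 0, max = 0, sum = 0, n = 0;

        /* read integers from stdin until EOF or a non-number */
        while (scanf("%ld", &v) == 1) {
            if (n == 0 || v < min) min = v;
            if (n == 0 || v > max) max = v;
            sum += v;
            n++;
        }
        if (n == 0) {
            fprintf(stderr, "no input\n");
            return 1;
        }
        printf("min=%ld max=%ld mean=%.2f\n", min, max, (double)sum / n);
        return 0;
    }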
Fanastril -> Taylor Dotson , 21 Sep 2017 14:11
It is not zero-sum: if you teach something empowering, like programming, motivating students is a lot easier, and they will learn more.
UncommonTruthiness , 21 Sep 2017 14:10
The demonization of Silicon Valley is clearly the next place to put all blame. Look what "they" did to us: computers, smart phones, HD television, world-wide internet, on and on. Get a rope!

I moved there in 1978 and watched the orchards and trailer parks on North 1st St. of San Jose transform into a concrete jungle. There used to be quite a bit of semiconductor equipment and device manufacturing in SV during the 80s and 90s. Now quite a few buildings have the same name : AVAILABLE. Most equipment and device manufacturing has moved to Asia.

Programming started with binary, then machine code (hexadecimal or octal) and moved to assembler as a compiled and linked structure. More compiled languages like FORTRAN, BASIC, PL-1, COBOL, PASCAL and C (and all its "+'s") followed, making programming easier for the less talented.

Now the script-based languages (HTML, JavaScript, etc.) are even higher-level and accessible to nearly all. Programming has become a commodity and will be priced like milk, wheat, corn, non-unionized workers and the like. The ship has sailed on this activity as a career.

KatieL -> Taylor Dotson , 21 Sep 2017 14:10
"intelligence, creativity, diligence, communication ability, or anything else that a job"

None of those are any use if, when asked to turn your intelligent, creative, diligent, communicated idea into some software, you perform as well as most candidates do at simple coding assessments... and write stuff that doesn't work.

peter nelson , 21 Sep 2017 14:09

At its root, the campaign for code education isn't about giving the next generation a shot at earning the salary of a Facebook engineer. It's about ensuring those salaries no longer exist, by creating a source of cheap labor for the tech industry.

Of course the writer does not offer the slightest shred of evidence to support the idea that this is the actual goal of these programs. So it appears that the tinfoil-hat conspiracy brigade on the Guardian is operating not only below the line, but above it, too.

The fact is that few of these students will ever become software engineers (which, incidentally, is my profession) but programming skills are essential in many professions for writing little scripts to automate various tasks, or to just understand 21st century technology.

kcrane , 21 Sep 2017 14:07
Sadly this is another article by a partial journalist who knows nothing about the software industry, but hopes to subvert what he has read somewhere to support a position he had already assumed. As others have said, understanding coding has already become akin to being able to use a pencil. It is a basic requirement of many higher-level roles.

But knowing which end of a pencil to put on the paper (the equivalent of the level of coding taught in schools) isn't the same as being an artist. Moreover, anyone who knows the field recognises that top coders are gifted; they embody genius. There are coding Caravaggios out there, but few have the experience to know that. No amount of teaching will produce high-level coders from average humans; there is an intangible something needed, as there is in music and art, to elevate the merely good to genius.

All to say: however many are taught the basics, it won't push down the value of the most talented coders, and so won't reduce the costs of the technology industry in any meaningful way, as it is an industry, like art, that relies on the few, not the many.

DebuggingLife , 21 Sep 2017 14:06
Not all of those children will want to become programmers, but at least the barrier to entry (for more to at least experience it) will be lower.

Teaching music only to the children whose parents can afford music tuition means that society misses out on the potential for some incredibly gifted musicians to shine through.

Moreover, learning to code really means learning how to wrangle with the practical application of abstract concepts: algorithms, numerical skills, logic, reasoning, etc., which are all transferable skills, some of which are not in the scope of other classes, certainly not practically.
Like music, sport, literature etc., programming a computer, a website, a device or a smartphone is an endeavour that can be truly rewarding as merely a pastime, and similarly is limited only by one's imagination.

rgilyead , 21 Sep 2017 14:01
"...coding is not magic. It is a technical skill, akin to carpentry. " I think that is a severe underestimation of the level of expertise required to conceptualise and deliver robust and maintainable code. The complexity of integrating software is more equivalent to constructing an entire building with components of different materials. If you think teaching coding is enough to enable software design and delivery then good luck.
Taylor Dotson -> cwblackwell , 21 Sep 2017 14:00
Yeah, but mania over coding skills inevitably pushes other skills out of the curriculum (or deemphasizes them). Education is zero-sum in that there's only so much time and energy to devote to it. Hence, you need more than vague appeals to "enhancement," especially given the risks pointed out by the author.
Taylor Dotson -> PolydentateBrigand , 21 Sep 2017 13:57
"Talented coders will start new tech businesses and create more jobs."

That could be argued for any skill set, including those found in the humanities and social sciences likely to be pushed out by the mania over coding ability. Education is zero-sum: time spent on one subject is time that invariably can't be spent learning something else.

Taylor Dotson -> WumpieJr , 21 Sep 2017 13:49
"If they can't literally fix everything let's just get rid of them, right?"

That's a strawman. His point is rooted in the recognition that we only have so much time, energy, and money to invest in solutions. Ones that feel good but may not do anything distract us from the deeper structural issues in our economy. The problem with thinking "education" will fix everything is that it leaves the status quo unquestioned.

martinusher , 21 Sep 2017 13:31
Being able to write code and being able to program are two very different skills. In language terms it's the difference between being able to read and write (say) English and being able to write literature; obviously you need a grasp of the language to write literature, but just knowing the language is not the same as being able to assemble and marshal thought into a coherent pattern prior to setting it down.

To confuse things further, there are various levels of skill that all look the same to the untutored eye. Suppose you wished to bridge a waterway. If that waterway were a narrow ditch, then you could just throw a plank across. As the distance to be spanned got larger and larger, eventually you'd have to abandon intuition for engineering and experience. Exactly the same issues arise with software, but they're less tangible; anyone can build a small program, but a complex system requires a lot of other knowledge (in my field, that's engineering knowledge -- coding is almost an afterthought).

It's a good idea to teach young people to code, but I wouldn't raise their expectations of huge salaries too much. For children, educating them in wider, more general fields and in abstract activities such as music will pay off in huge dividends, far more than just teaching them whatever the fashionable language du jour is (...which should be Logo, but it's too subtle and abstract; it doesn't look "real world" enough!).

freeandfair , 21 Sep 2017 13:30
I don't see this as an issue. Sure, there could be ulterior motives there, but anyone who wants to still be employed in 20 years has to know how to code. It is not that everyone will be a coder, but their jobs will either include part-time coding or will require an understanding of software and what it can and cannot do. AI is going to be everywhere.
WumpieJr , 21 Sep 2017 13:23
What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra.

But it isn't just about coding for Tarnoff. He seems to hold education in contempt generally. "The far-fetched premise of neoliberal school reform is that education can mend our disintegrating social fabric." If they can't literally fix everything let's just get rid of them, right?

Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it.

youngsteveo , 21 Sep 2017 13:16
I'm not going to argue that the goal of mass education isn't to drive down wages, but the idea that the skills gap is a myth doesn't hold water in my experience. I'm a software engineer and manager at a company that pays well over the national average, with great benefits, and it is downright difficult to find a qualified applicant who can pass a rudimentary coding exam.

A lot of resumes come across my desk that look qualified on paper, but that's not the same thing as being able to do the job. Secondarily, while I agree that one day our field might be replaced by automation, there's a level of creativity involved with good software engineering that makes your carpenter comparison a bit flawed.

[Oct 02, 2017] Does programming provide a new path to the middle class? Probably not any longer, unless you are really talented. In the latter case it is not that different from other fields, but the pressure from H1B visas makes it harder for programmers. The neoliberal USA has a real problem with social mobility

Notable quotes:
"... I do think it's peculiar that Silicon Valley requires so many H1B visas... 'we can't find the talent here' is the main excuse ..."
"... This is interesting. Indeed, I do think there is excess supply of software programmers. ..."
"... Well, it is either that or the kids themselves who have to pay for it and they are even less prepared to do so. Ideally, college education should be tax payer paid but this is not the case in the US. And the employer ideally should pay for the job related training, but again, it is not the case in the US. ..."
"... Plenty of people care about the arts but people can't survive on what the arts pay. That was pretty much the case all through human history. ..."
"... I was laid off at your age in the depths of the recent recession and I got a job. ..."
"... The great thing about software , as opposed to many other jobs, is that it can be done at home which you're laid off. Write mobile (IOS or Android) apps or work on open source projects and get stuff up on github. I've been to many job interviews with my apps loaded on mobile devices so I could show them what I've done. ..."
"... Schools really can't win. Don't teach coding, and you're raising a generation of button-pushers. Teach it, and you're pandering to employers looking for cheap labour. Unions in London objected to children being taught carpentry in the twenties and thirties, so it had to be renamed "manual instruction" to get round it. Denying children useful skills is indefensible. ..."
Oct 02, 2017 | discussion.theguardian.com
swelle , 21 Sep 2017 17:36
I do think it's peculiar that Silicon Valley requires so many H1B visas... 'we can't find the talent here' is the main excuse, though many 'older' (read: over 40) native-born tech workers will tell you there's plenty of talent here already; but even with the immigration hassles, H1B workers will be cheaper overall...

Julian Williams , 21 Sep 2017 18:06

This is interesting. Indeed, I do think there is an excess supply of software programmers. There is only a modest number of decent jobs, say as an algorithms developer in finance, in the general architecture of complex systems, or to some extent in systems security. However, these jobs are usually occupied and the incumbents are not likely to move on quickly. Roadblocks are also put up by creating sub-networks of engineers who ensure that some knowledge is not ubiquitous.

Most very high paying jobs in the technology sector are in the same standard upper management roles as in every other industry.

Still, the ability to write a computer program is an enabler: knowing how it works means you have the ability to imagine something and make it real. To me it is a bit like language: some people can use language to make more money than others, but it is still important to be able to have a basic level of understanding.

FabBlondie -> peter nelson , 21 Sep 2017 17:42
And yet I know a lot of people that has happened to. Better to replace a $125K-a-year programmer with one who will do the same job, or even less of it, for $50K.

JMColwill , 21 Sep 2017 18:17

This could backfire if the programmers don't find the work or pay to match their expectations... Programmers, after all, tend to make very good hackers if their minds are turned to it.

freeandfair -> FabBlondie , 21 Sep 2017 18:23

> While I like your idea of what designing a computer program involves, in my nearly 40 years experience as a programmer I have rarely seen this done.

Well, I am a software architect and what he says sounds correct for a certain type of applications. Maybe you do a different type of programming.

peter nelson -> FabBlondie , 21 Sep 2017 18:23

While I like your idea of what designing a computer program involves, in my nearly 40 years experience as a programmer I have rarely seen this done.

How else can you do it?

Java is popular because it's a very versatile language. On this list it's the most popular general-purpose programming language (above it, JavaScript is just a scripting language, and HTML/CSS aren't even programming languages): https://fossbytes.com/most-used-popular-programming-languages/ ... and below it you have to go down to C# at 20% to come to another general-purpose language, and even that's a Microsoft house language.

Also the "correct" choice of programming languages is also based on how many people in the shop know it so they maintain code that's written in it by someone else.

freeandfair -> FabBlondie , 21 Sep 2017 18:22
> job-specific training is completely different. What a joke to persuade public school districts to pick up the tab on job training.

Well, it is either that or the kids themselves have to pay for it, and they are even less prepared to do so. Ideally, college education should be taxpayer-paid, but this is not the case in the US. And the employer ideally should pay for job-related training, but again, it is not the case in the US.

freeandfair -> mlzarathustra , 21 Sep 2017 18:20
> The bigger problem is that nobody cares about the arts, and as expensive as education is, nobody wants to carry around a debt on a skill that won't bring in the bucks

Plenty of people care about the arts but people can't survive on what the arts pay. That was pretty much the case all through human history.

theindyisbetter -> Game Cabbage , 21 Sep 2017 18:18
No. The amount of work is not a fixed sum. That's the lump of labour fallacy. We are not tied to the land.
ConBrio , 21 Sep 2017 18:10
Since newspapers are consolidating and cutting jobs, we gotta clamp down on colleges offering BA degrees, particularly in English literature and journalism.

And then... and...then...and...

LMichelle -> chillisauce , 21 Sep 2017 18:03
This article focuses on the US schools, but I can imagine it's the same in the UK. I don't think these courses are going to be about creating great programmers capable of new innovations as much as having a work force that can be their own IT Help Desk.

They'll learn just enough in these classes to do that.

Then most companies will be hiring for other jobs, but will want to make sure you have the IT skills to serve as your own "help desk" (although you will get no extra salary for that IT work).

edmundberk -> FabBlondie , 21 Sep 2017 17:57
I find that quite remarkable - 40 years ago you must have been using assembler, with hardly any memory to work with. If you blitzed through that without applying the thought processes described, well... I'm surprised.
James Dey , 21 Sep 2017 17:55
Funny. Every day in the Brexit articles, I read that increasing the supply of workers has negligible effect on wages.
peter nelson -> peterainbow , 21 Sep 2017 17:54
I was laid off at your age in the depths of the recent recession and I got a job. As I said in another posting, it usually comes down to fresh skills and good personal references who will vouch for your work-habits and how well you get on with other members of your team.

The great thing about software, as opposed to many other jobs, is that it can be done at home while you're laid off. Write mobile (iOS or Android) apps or work on open source projects and get stuff up on GitHub. I've been to many job interviews with my apps loaded on mobile devices so I could show them what I've done.

Game Cabbage -> theindyisbetter , 21 Sep 2017 17:52
The situation has a direct comparison to today. It has nothing to do with land. There was a certain amount of profit making work and not enough labour to satisfy demand. There is currently a certain amount of profit making work and in many situations (especially unskilled low paid work) too much labour.
edmundberk , 21 Sep 2017 17:52
So, is teaching people English or arithmetic all about reducing wages for the literate and numerate?

Or is this the most obtuse argument yet for avoiding what everyone in tech knows - even more blatantly than in many other industries, wages are curtailed by offshoring; and in the US, by having offshoring centres on US soil.

chillisauce , 21 Sep 2017 17:48
Well, speaking as someone who spends a lot of time trying to find really good programmers... frankly there aren't that many about. We take most of ours from Eastern Europe and SE Asia, which is quite expensive, given the relocation costs to the UK. But worth it.

So, yes, if more British kids learnt about coding, it might help a bit. But not much; the real problem is that few kids want to study IT in the first place, and that the tuition standards in most UK universities are quite low, even if they get there.

Baobab73 , 21 Sep 2017 17:48
True......
peter nelson -> rebel7 , 21 Sep 2017 17:47
There was recently a programme/podcast on ABC/RN about the HUGE shortage in Australia of techies with specialized security skills.
peter nelson -> jigen , 21 Sep 2017 17:46
Robots, or AI, are already making us more productive. I can write programs today in an afternoon that would have taken me a week a decade or two ago.

I can create a class and the IDE will take care of all the accessors and dependencies, enforce our style-guide compliance, stub in the documentation, even most test cases, etc., and all I have to write is the very specific stuff required by my application - the other 90% is generated for me. Same with UI/UX - it stubs in relevant event handlers, bindings, dependencies, etc.

Programmers are a zillion times more productive than in the past, yet the demand keeps growing because so much more stuff in our lives has processors and code. Your car has dozens of processors running lots of software; your TV, your home appliances, your watch, etc.
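To illustrate the kind of boilerplate generation described in the comment above, here is a minimal Java sketch (the class and field names are invented for illustration): the programmer types only the field declarations, and a modern IDE can generate the rest.

    // A hand-written data class: the programmer types the two fields;
    // the IDE can generate the constructor, accessors and toString.
    public class Sensor {
        private final String id;     // hand-written: the part that matters
        private double reading;      // hand-written

        public Sensor(String id, double reading) {   // typically IDE-generated
            this.id = id;
            this.reading = reading;
        }

        public String getId() { return id; }             // IDE-generated accessor
        public double getReading() { return reading; }   // IDE-generated accessor
        public void setReading(double r) { reading = r; }

        @Override
        public String toString() {                       // IDE-generated
            return "Sensor{id='" + id + "', reading=" + reading + "}";
        }
    }

Counting lines makes the "90% generated" figure plausible: two declarations of substance, a dozen lines of machinery.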

Quaestor , 21 Sep 2017 17:43

Schools really can't win. Don't teach coding, and you're raising a generation of button-pushers. Teach it, and you're pandering to employers looking for cheap labour. Unions in London objected to children being taught carpentry in the twenties and thirties, so it had to be renamed "manual instruction" to get round it. Denying children useful skills is indefensible.

jamesupton , 21 Sep 2017 17:42
Getting children to learn how to write code, as part of core education, will be the first step to the long overdue revolution. The rest of us will still have to stick to burning buildings down and stringing up the aristocracy.
cjenk415 -> LMichelle , 21 Sep 2017 17:40
Did you misread? It seemed like he was emphasizing that learning to code, like learning art (and sports and languages), will help them develop skills that benefit them in whatever profession they choose.
FabBlondie -> peter nelson , 21 Sep 2017 17:40
While I like your idea of what designing a computer program involves, in my nearly 40 years experience as a programmer I have rarely seen this done. And, FWIW, IMHO choosing the tool (programming language) might reasonably be expected to follow designing a solution; in practice this rarely happens. No, these days it's Java all the way, from day one.
theindyisbetter -> Game Cabbage , 21 Sep 2017 17:40
There was a fixed supply of land and a reduced supply of labour to work the land.

Nothing like the situation in a modern economy.

LMichelle , 21 Sep 2017 17:39
I'd advise parents that the classes they need to make sure their kids excel in are acting/drama. There is no better way to get that promotion or increase your pay than being a skilled actor in the job market. It's a fake-it-till-you-make-it deal.
theindyisbetter , 21 Sep 2017 17:36
What a ludicrous argument.

Let's not teach maths or science or literacy either - then anyone with those skills will earn more.

SheriffFatman -> Game Cabbage , 21 Sep 2017 17:36

After the Black Death in the Middle Ages there was a huge undersupply of labour. It produced a consistent rise in wages and conditions

It also produced wage-control legislation (which admittedly failed to work).

peter nelson -> peterainbow , 21 Sep 2017 17:32
if there were truly a shortage I wouldn't be unemployed

I've heard that before but when I've dug deeper I've usually found someone who either let their skills go stale, or who had some work issues.

LMichelle -> loveyy , 21 Sep 2017 17:26
Really? You think they are going to emphasize things like the importance of privacy and consumer rights?
loveyy , 21 Sep 2017 17:25
This really has to be one of the silliest articles I have read here in a very long time.
People, let your children learn to code. Even more, educate yourselves and start to code just for the fun of it - look at it like a game.
The more people know how to code, the more likely they are to understand how stuff works. If you were ever frustrated by how impossible it seems to shop on certain websites, learn to code and you will be frustrated no more. You will understand the intent behind the process.
Even more, you will understand the inherent limitations and what is the meaning of safety. You will be able to better protect yourself in a real-time connected world.

Learning to code won't turn your kid into a programmer, just like ballet or piano classes won't mean they'll ever choose art as their livelihood. So let the children learn to code and learn along with them

Game Cabbage , 21 Sep 2017 17:24
Tipping power to employers in any profession by oversupply of labour is not a good thing. Bit of a macabre example here, but... after the Black Death in the Middle Ages there was a huge undersupply of labour. It produced a consistent rise in wages and conditions and economic development for hundreds of years after. Not suggesting a massive depopulation. But you can achieve the same effects by altering the power balance. With decades of neoliberalism, the employers' side of the power see-saw is sitting firmly in the mud and is producing very undesired results for the vast majority of people.
Zuffle -> peterainbow , 21 Sep 2017 17:23
Perhaps you're just not very good. I've been a developer for 20 years and I've never had more than 1 week of unemployment.
Kevin P Brown -> peterainbow , 21 Sep 2017 17:20
" at 55 finding it impossible to get a job"

I am 59, and it is not just the age aspect, it is the money aspect. They know you have experience and expectations, and yet they believe hiring someone half the age and half the price, times two, will replace your knowledge. I have been contracting in IT for 30 years, and now it is obvious it is over. Experience at some point no longer mitigates age. I think I am at that point now.

TheLane82 , 21 Sep 2017 17:20
Completely true! What needs to happen instead is to teach the real valuable subjects.

Gender studies. Islamic studies. Black studies. All important issues that need to be addressed.

peter nelson -> mlzarathustra , 21 Sep 2017 17:06
Dear, dear, I know, I know, young people today . . . just not as good as we were. Everything is just going down the loo . . . Just have a nice cuppa camomile (or chamomile if you're a Yank) and try to relax ... " hey you kids, get offa my lawn !"
FabBlondie , 21 Sep 2017 17:06
There are good reasons to teach coding. Too many of today's computer users are amazingly unaware of the technology that allows them to send and receive emails, use their smart phones, and use websites. Few understand the basic issues involved in computer security, especially as it relates to their personal privacy. Hopefully some introductory computer classes could begin to remedy this, and the younger the students the better.

Security problems are not strictly a matter of coding.

Security issues persist in tech. Clearly that is not a function of the size of the workforce. I propose that it is a function of poor management and design skills. These are not taught in any programming class I ever took. I learned these on the job and in an MBA program, and because I was determined.

Don't confuse basic workforce training with an effective application of tech to authentic needs.

How can the "disruption" so prized in today's Big Tech do anything but aggravate our social problems? Tech's disruption begins with a blatant ignorance of and disregard for causes, and believes to its bones that a high tech app will truly solve a problem it cannot even describe.

Kool Aid anyone?

peterainbow -> brady , 21 Sep 2017 17:05
Indeed, that idea has been around as long as COBOL and in practice has just made things worse. The fact that many people outside of software engineering don't seem to realise is that the coding itself is a relatively small part of the job.
FabBlondie -> imipak , 21 Sep 2017 17:04
Hurrah.
peterainbow -> rebel7 , 21 Sep 2017 17:04
So how many female and older software engineers are there who are unable to get a job? I'm one of them, at 55 finding it impossible to get a job, and unlike many 'developers' I know what I'm doing.
peterainbow , 21 Sep 2017 17:02
Meanwhile the age and sex discrimination in IT goes on; if there were truly a shortage I wouldn't be unemployed.
Jared Hall -> peter nelson , 21 Sep 2017 17:01
Training more people for an occupation will result in more people becoming qualified to perform that occupation, regardless of the fact that many will perform poorly at it. A CS degree is no guarantee of competency, but it is one of the best indicators of general qualification we have at the moment. If you can provide a better metric for analyzing the underlying qualifications of the labor force, I'd love to hear it.

Regarding your anecdote: while interesting, it is poor evidence when compared to the aggregate statistical data analyzed in the EPI study.

peter nelson -> FabBlondie , 21 Sep 2017 17:00

Job-specific training is completely different.

Good grief. It's not job-specific training. You sound like someone who knows nothing about computer programming.

Designing a computer program requires analysing the task; breaking it down into its components, prioritising them and identifying interdependencies, and figuring out which parts of it can be broken out and done separately. Expressing all this in some programming language like Java, C, or C++ is quite secondary.

So once you learn to organise a task properly you can apply it to anything - remodeling a house, planning a vacation, repairing a car, starting a business, or administering a (non-software) project at work.
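A toy Java sketch of that decomposition idea (the task and every name in it are hypothetical): the top-level method reads as the plan, and each step is a separate unit whose interdependencies are explicit, so the parts can be worked on, tested or handed off separately.

    import java.util.ArrayList;
    import java.util.List;

    public class ReportJob {

        public void run() {
            List<String> records = fetchRecords();     // first: everything depends on it
            List<String> cleaned = validate(records);  // depends only on fetchRecords
            String summary = summarise(cleaned);       // depends only on validate
            publish(summary);                          // final step, independent of the rest
        }

        private List<String> fetchRecords() {
            List<String> rs = new ArrayList<>();
            rs.add("a,1");
            rs.add("b,2");                     // stand-in for a real data source
            return rs;
        }

        private List<String> validate(List<String> rs) {
            List<String> ok = new ArrayList<>();
            for (String r : rs) {
                if (r.contains(",")) ok.add(r);  // keep only well-formed rows
            }
            return ok;
        }

        private String summarise(List<String> rs) {
            return rs.size() + " valid records";
        }

        private void publish(String summary) {
            System.out.println(summary);
        }

        public static void main(String[] args) {
            new ReportJob().run();             // prints: 2 valid records
        }
    }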

[Oct 02, 2017] Evaluation of potential candidates for a programming job should include evaluation of their previous projects and the code they have written

Notable quotes:
"... Thank you. The kids that spend high school researching independently and spend their nights hacking just for the love of it and getting a job without college are some of the most competent I've ever worked with. Passionless college grads that just want a paycheck are some of the worst. ..."
"... how about how new labor tried to sign away IT access in England to India in exchange for banking access there, how about the huge loopholes in bringing in cheap IT workers from elsewhere in the world, not conspiracies, but facts ..."
"... And I've never recommended hiring anyone right out of school who could not point me to a project they did on their own, i.e., not just grades and test scores. I'd like to see an IOS or Android app, or a open-source component, or utility or program of theirs on GitHub, or something like that. ..."
"... most of what software designers do is not coding. It requires domain knowledge and that's where the "smart" IDEs and AI coding wizards fall down. It will be a long time before we get where you describe. ..."
Oct 02, 2017 | discussion.theguardian.com

peter nelson -> c mm , 21 Sep 2017 19:49

Instant feedback is one of the things I really like about programming, but it's also the thing that some people can't handle. As I'm developing a program, all day long the compiler is telling me about build errors or warnings, or when I go to execute it, it crashes or produces unexpected output, etc. Software engineers are bombarded all day with negative feedback and little failures. You have to be thick-skinned for this work.
peter nelson -> peterainbow , 21 Sep 2017 19:42
How is it shallow and lazy? I'm hiring for the real world so I want to see some real world accomplishments. If the candidate is fresh out of university they can't point to work projects in industry because they don't have any. But they CAN point to stuff they've done on their own. That shows both motivation and the ability to finish something. Why do you object to it?
anticapitalist -> peter nelson , 21 Sep 2017 14:47
Thank you. The kids that spend high school researching independently and spend their nights hacking just for the love of it and getting a job without college are some of the most competent I've ever worked with. Passionless college grads that just want a paycheck are some of the worst.
John Kendall , 21 Sep 2017 19:42
There is a big difference between "coding" and programming. Coding for a smart phone app is a matter of calling functions that are built into the device. For example, there are functions for the GPS or for creating buttons or for simulating motion in a game. These are what we used to call subroutines. The difference is that whereas we had to write our own subroutines, now they are just preprogrammed functions. How those functions are written is of little or no importance to today's coders.

Nor are they able to program on that level. Real programming requires not only a knowledge of programming languages, but also a knowledge of the underlying algorithms that make up actual programs. I suspect that "coding" classes operate on a quite superficial level.
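Kendall's distinction between calling preprogrammed functions and knowing the underlying algorithm can be shown in a few lines of Java (a minimal sketch; sorting stands in for his GPS and button examples):

    import java.util.Arrays;

    public class SortDemo {
        public static void main(String[] args) {
            int[] a = {5, 2, 9, 1};

            // "Coding": call the built-in, preprogrammed function.
            int[] b = a.clone();
            Arrays.sort(b);

            // "Programming": know the algorithm itself (insertion sort here).
            int[] c = a.clone();
            for (int i = 1; i < c.length; i++) {
                int key = c[i];
                int j = i - 1;
                while (j >= 0 && c[j] > key) {   // shift larger elements right
                    c[j + 1] = c[j];
                    j--;
                }
                c[j + 1] = key;
            }

            System.out.println(Arrays.toString(b));  // [1, 2, 5, 9]
            System.out.println(Arrays.toString(c));  // [1, 2, 5, 9]
        }
    }

The two results are identical; the difference is whether the author could have produced the second version when the first was unavailable or unsuitable.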

Game Cabbage -> theindyisbetter , 21 Sep 2017 19:40
It's not about the amount of work or the amount of labor. It's about the comparative availability of both and how that affects the balance of power, and that in turn affects the overall quality of life for the 'majority' of people.
c mm -> Ed209 , 21 Sep 2017 19:39
Most of this is not true. Peter Nelson gets it right by talking about breaking steps down and thinking rationally. The reason you can't just teach the theory, however, is that humans learn much better with feedback. Think about trying to learn how to build a fast car when you never get in and test its speed. That would be silly. Programming languages take the system of logic that has been developed for centuries and give instant feedback on the results. It's a language of rationality.
peter nelson -> peterainbow , 21 Sep 2017 19:37
This article is about the US. The tech industry in the EU is entirely different, and basically moribund. Where is the EU's Microsoft, Apple, Google, Amazon, Oracle, Intel, Facebook, etc, etc? The opportunities for exciting interesting work, plus the time and schedule pressures that force companies to overlook stuff like age because they need a particular skill Right Now, don't exist in the EU. I've done very well as a software engineer in my 60's in the US; I cannot imagine that would be the case in the EU.
peterainbow -> peter nelson , 21 Sep 2017 19:37
Sorry, but that's just not true. I doubt you are really still programming; more likely you're a quasi-programmer, really a manager who likes to keep their hand in. You certainly aren't busy, as you've been posting all over this CiF. Also, why would you try and hire someone with such disparate skillsets? Makes no sense at all.

Oh, and you'd be correct that I do have workplace issues, i.e. I have a disability and I also suffer from depression, but that shouldn't bar me from employment. And regarding my skills going stale: that contradicts your statement above that it's about planning/analysis/algorithms etc. (which to some extent I agree with).

c mm -> peterainbow , 21 Sep 2017 19:36
Not at all, it's really egalitarian. If I want to hire someone to paint my portrait, the best way to know if they're any good is to see their previous work. If they've never painted a portrait before, then I may want to go with the girl who has.
c mm -> ragingbull , 21 Sep 2017 19:34
There is definitely not an excess. Just look at the projected jobs for computer science at the Bureau of Labor Statistics.
c mm -> perble conk , 21 Sep 2017 19:32
Right? It's ridiculous. "Hey, there's this industry you can train for that is super valuable to society and pays really well!"
Then Ben Tarnoff, "Don't do it! If you do you'll drive down wages for everyone else in the industry. Build your fire starting and rock breaking skills instead."
peterainbow -> peter nelson , 21 Sep 2017 19:29
How about how New Labour tried to sign away IT access in England to India in exchange for banking access there? How about the huge loopholes in bringing in cheap IT workers from elsewhere in the world? Not conspiracies, but facts.
peter nelson -> eirsatz , 21 Sep 2017 19:25
I think the difference between gifted and not is motivation. But I agree it's not innate. The kid who stayed up all night in high school hacking into the school server to fake his coding class grade is probably more gifted than the one who spent 4 years in college getting a BS in CS because someone told him he could get a job when he got out.

I've done some hiring in my life and I always ask them to tell me about stuff they did on their own.

peter nelson -> TheBananaBender , 21 Sep 2017 19:20

Most coding jobs are bug fixing.

The only bugs I have to fix are the ones I make.

peter nelson -> Ed209 , 21 Sep 2017 19:19
As several people have pointed out, writing a computer program requires analyzing and breaking down a task into steps, identifying interdependencies, prioritizing the order, figuring out what parts can be organized into separate tasks that can be done separately, etc.

These are completely independent of the language - I've been programming for 40 years in everything from FORTRAN to APL to C to C# to Java and it's all the same. Not only that but they transcend programming - they apply to planning a vacation, remodeling a house, or fixing a car.

peter nelson -> ragingbull , 21 Sep 2017 19:14
Neither coding nor having a bachelor's degree in computer science makes you a suitable job candidate. I've done a lot of recruiting and interviews in my life, and right now I'm trying to hire someone. And I've never recommended hiring anyone right out of school who could not point me to a project they did on their own, i.e., not just grades and test scores. I'd like to see an iOS or Android app, or an open-source component, or a utility or program of theirs on GitHub, or something like that.

That's the thing that distinguishes software from many other fields - you can do something real and significant on your own. If you haven't managed to do so in 4 years of college you're not a good candidate.

peter nelson -> nickGregor , 21 Sep 2017 19:07
Within the next year coding will be old news and you will simply be able to describe things in your native language in such a way that the machine will be able to execute any set of instructions you give it.

In a sense that's already true, as I noted elsewhere. 90% of the code in my projects (Java and C# in their respective IDEs) is machine-generated. I do relatively little "coding". But the flaw in your idea is this: most of what software designers do is not coding. It requires domain knowledge, and that's where the "smart" IDEs and AI coding wizards fall down. It will be a long time before we get where you describe.

Ricardo111 -> martinusher , 21 Sep 2017 19:03
Completely agree. At the highest levels there is more work that goes into managing complexity and making sure nothing is missed than in making the wheels turn and the beepers beep.
ragingbull , 21 Sep 2017 19:02
Hang on... if the current excess of computer science grads is not driving down wages, why would training more kids to code make any difference?
Ricardo111 -> youngsteveo , 21 Sep 2017 18:59
I've actually interviewed people for very senior technical positions in Investment Banks who had all the fancy talk in the world and yet failed at some very basic "write me a piece of code that does X" tests.

The next hurdle is people who have learned how to deal with certain situations and yet don't really understand how it works, so they are unable to figure it out if you change the problem parameters.

That said, the average coder is only slightly beyond this point. The ones who can take into account maintainability and flexibility for future enhancements when developing are already a minority, and those who can understand the why of software development process steps, design software system architectures or do a proper technical analysis are very rare.

eirsatz -> Ricardo111 , 21 Sep 2017 18:57
Hubris. It's easy to mistake efficiency born of experience for innate talent. The difference between a 'gifted coder' and a 'non-gifted junior coder' is much more likely to be 10 or 15 years sitting at a computer, less if there are good managers and mentors involved.
Ed209 , 21 Sep 2017 18:57
Politicians love the idea of teaching children to 'code', because it sounds so modern, and nobody could possibly object... could they? Unfortunately it simply shows up their utter ignorance of technical matters, because there isn't a language called 'coding'. Computer programming languages have changed enormously over the years, and continue to evolve. If you learn the wrong language you'll be about as welcome in the IT industry as a lamp-lighter or a comptometer operator.

The pace of change in technology can render skills and qualifications obsolete in a matter of a few years, and only the very best IT employers will bother to retrain their staff - it's much cheaper to dump them. (Most IT posts are outsourced through agencies anyway - those that haven't been off-shored.)

peter nelson -> YEverKnot , 21 Sep 2017 18:54
And this isn't even a good conspiracy theory; it's a bad one. He offers no evidence that there's an actual plan or conspiracy to do this. I'm looking for an account of where the advocates of coding education met to plot this in some castle in Europe or maybe a secret document like "The Protocols of the Elders of Google", or some such.
TheBananaBender , 21 Sep 2017 18:52
Most jobs in IT are shit - desktop support, operations droids. Most coding jobs are bug fixing.
Ricardo111 -> Wiretrip , 21 Sep 2017 18:49
Tool Users Vs Tool Makers. The really good coders actually get why certain things work as they do and can adjust them for different conditions. The mass produced coders are basically code copiers and code gluing specialists.
peter nelson -> AmyInNH , 21 Sep 2017 18:49
People who get Masters and PhD's in computer science are not usually "coders" or software engineers - they're usually involved in obscure, esoteric research for which there really is very little demand. So it doesn't surprise me that they're unemployed. But if someone has a Bachelor's in CS and they're unemployed I would have to wonder what they spent their time at university doing.

The thing about software that distinguishes it from lots of other fields is that you can make something real and significant on your own. I would expect any recent CS major I hire to be able to show me an app or an open-source component or something similar that they made themselves, and not just test scores and grades. If they could not then I wouldn't even think about hiring them.

Ricardo111 , 21 Sep 2017 18:44
Fortunately for those of us who are actually good at coding, the difference in productivity between a gifted coder and a non-gifted junior developer is something like 100-fold. Knowing how to code and actually being efficient at creating software programs and systems are about as far apart as knowing how to write and actually being able to write a bestselling exciting Crime trilogy.
peter nelson -> jamesupton , 21 Sep 2017 18:36

The rest of us will still have to stick to burning buildings down and stringing up the aristocracy.

If you know how to write software you can get a robot to do those things.

peter nelson -> Julian Williams , 21 Sep 2017 18:34
I do think there is excess supply of software programmers. There is only a modest number of decent jobs, say as an algorithms developer in finance, general architecture of complex systems or to some extent in systems security.

This article is about coding; most of those jobs require very little of that.

Most very high paying jobs in the technology sector are in the same standard upper management roles as in every other industry.

How do you define "high paying". Everyone I know (and I know a lot because I've been a sw engineer for 40 years) who is working fulltime as a software engineer is making a high-middle-class salary, and can easily afford a home, travel on holiday, investments, etc.

YEverKnot , 21 Sep 2017 18:32

Tech's push to teach coding isn't about kids' success – it's about cutting wages

Nowt like a good conspiracy theory.
freeandfair -> WithoutPurpose , 21 Sep 2017 18:31
What is a stupidly low salary? 100K?
freeandfair -> AmyInNH , 21 Sep 2017 18:30
> Already there. I take it you skipped right past the employment prospects for US STEM grads - 50% chance of finding STEM work.

That just means 50% of them are no good and need to develop their skills further or try something else.
Not everyone with a STEM degree from some 3rd-rate college is capable of doing complex IT or STEM work.

peter nelson -> edmundberk , 21 Sep 2017 18:30

So, is teaching people English or arithmetic all about reducing wages for the literate and numerate?

Yes. Haven't you noticed how wage growth has flattened? That's because some do-gooders thought it would be a fine idea to educate the peasants. There was a time when only the well-to-do knew how to read and write, and that's why the well-to-do were well-to-do. Education is evil. Stop educating people, and then those of us who know how to read and write can charge them for reading and writing letters and email. Better yet, we can have Chinese and Indians do it for us and we just charge a transaction fee.

AmyInNH -> peter nelson , 21 Sep 2017 18:27
Masses of the public use cars; it doesn't mean millions need schooling in auto mechanics. Same for software coding. We aren't even using those who have Bachelors, Masters and PhDs in CS.
carlospapafritas , 21 Sep 2017 18:27
"..importing large numbers of skilled guest workers from other countries through the H1-B visa program..."

"skilled" is good. H1B has long ( appx 17 years) been abused and turned into trafficking scheme. One can buy H1B in India. Powerful ethnic networks wheeling & dealing in US & EU selling IT jobs to essentially migrants.

Real IT wages haven't been stagnant but steadily falling since the 90s. It's easy to see why: an $82K/year IT wage was about average in the 90s. Comparing the prices of housing (and pretty much everything else) then and now gives you the idea.

freeandfair -> whitehawk66 , 21 Sep 2017 18:27
> not every kid wants or needs to have their soul sucked out of them sitting in front of a screen full of code for some idiotic service that some other douchbro thinks is the next iteration of sliced bread

Taking a couple of years of programming is not enough to do this as a job, don't worry.
But learning to code is like learning maths - it helps to develop logical thinking, which will benefit you in every area of your life.

James Dey , 21 Sep 2017 18:25
We should stop teaching our kids to be journalists, then your wage might go up.
peter nelson -> AmyInNH , 21 Sep 2017 18:23
What does this even mean?

[Oct 02, 2017] Programming is a culturally important skill

Notable quotes:
"... A lot of basic entry level jobs require a good level of Excel skills. ..."
"... Programming is a cultural skill; master it, or even understand it on a simple level, and you understand how the 21st century works, on the machinery level. To bereave the children of this crucial insight is to close off a door to their future. ..."
"... What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra. ..."
"... Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it. ..."
"... We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money. ..."
"... Libertarianism posits that everyone should be free to sell their labour or negotiate their own arrangements without the state interfering. So if cheaper foreign labour really was undercutting American labout the Libertarians would be thrilled. ..."
"... Not producing enough to fill vacancies or not producing enough to keep wages at Google's preferred rate? Seeing as research shows there is no lack of qualified developers, the latter option seems more likely. ..."
"... We're already using Asia as a source of cheap labor for the tech industry. Why do we need to create cheap labor in the US? ..."
Oct 02, 2017 | discussion.theguardian.com
David McCaul -> IanMcLzzz , 21 Sep 2017 13:03
There are very few professional scribes nowadays; a good level of reading and writing is simply a default even for the lowest-paid jobs. A lot of basic entry-level jobs require a good level of Excel skills. Several years from now basic coding will be necessary to manipulate basic tools for entry-level jobs, especially as increasingly a lot of real code will be generated by expert systems supervised by a tiny number of supervisors. Coding jobs will go the same way that trucking jobs will go when driverless vehicles are perfected.

anticapitalist, 21 Sep 2017 14:25

Offer the class, but don't make it mandatory. Just as I could never succeed playing football, others will not succeed at coding. The last thing the industry needs is more bad developers showing up for a paycheck.

Fanastril , 21 Sep 2017 14:08

Programming is a cultural skill; master it, or even understand it on a simple level, and you understand how the 21st century works, on the machinery level. To deprive children of this crucial insight is to close off a door to their future. What's next, keeping them off math, because, you know...
Taylor Dotson -> freeandfair , 21 Sep 2017 13:59
That's some crystal ball you have there. English teachers will need to know how to code? Same with plumbers? Same with janitors, CEOs, and anyone working in the service industry?
PolydentateBrigand , 21 Sep 2017 12:59
The economy isn't a zero-sum game. Developing a more skilled workforce that can create more value will lead to economic growth and improvement in the general standard of living. Talented coders will start new tech businesses and create more jobs.

WumpieJr , 21 Sep 2017 13:23

What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra.

But it isn't just about coding for Tarnoff. He seems to hold education in contempt generally. "The far-fetched premise of neoliberal school reform is that education can mend our disintegrating social fabric." If they can't literally fix everything, let's just get rid of them, right?

Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it.
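As a small example of the kind of transferable algorithmic idea the commenter means, binary search rests on a single invariant (halve the search space while keeping the target inside it), and that style of reasoning applies well beyond software. A minimal Java sketch:

    public class BinarySearch {
        // Returns the index of target in a sorted array, or -1 if absent.
        // Invariant: if target is present at all, it lies within [lo, hi].
        static int find(int[] sorted, int target) {
            int lo = 0, hi = sorted.length - 1;
            while (lo <= hi) {
                int mid = lo + (hi - lo) / 2;            // avoids int overflow
                if (sorted[mid] == target) return mid;
                if (sorted[mid] < target) lo = mid + 1;  // discard left half
                else hi = mid - 1;                       // discard right half
            }
            return -1;
        }

        public static void main(String[] args) {
            System.out.println(find(new int[]{1, 3, 5, 7, 9}, 7));  // prints 3
        }
    }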

mlzarathustra , 21 Sep 2017 16:52
I agree with the basic point. We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money.

The bigger problem is that nobody cares about the arts, and as expensive as education is, nobody wants to carry around debt for a skill that won't bring in the bucks. And smartphone-obsessed millennials have too short an attention span to fathom how empty their lives are, devoid of aesthetic depth as they are.

I can't draw a definite link, but I think algorithm fails, which are based on fanatical reliance on programmed routines as the solution to everything, are rooted in the shortage of education and cultivation in the arts.

Economics is a social science, and all this is merely a reflection of shared cultural values. The problem is, people think it's math (it's not) and therefore set in stone.

AmyInNH -> peter nelson , 21 Sep 2017 16:51
Geeze it'd be nice if you'd make an effort.
rucore.libraries.rutgers.edu/rutgers-lib/45960/PDF/1/
https://rucore.libraries.rutgers.edu/rutgers-lib/46156/
https://rucore.libraries.rutgers.edu/rutgers-lib/46207/
peter nelson -> WyntonK , 21 Sep 2017 16:45
Libertarianism posits that everyone should be free to sell their labour or negotiate their own arrangements without the state interfering. So if cheaper foreign labour really was undercutting American labour, the Libertarians would be thrilled.

But it's not. I'm in my 60's and retiring but I've been a software engineer all my life. I've worked for many different companies, and in different industries and I've never had any trouble competing with cheap imported workers. The people I've seen fall behind were ones who did not keep their skills fresh. When I was laid off in 2009 in my mid-50's I made sure my mobile-app skills were bleeding edge (in those days ANYTHING having to do with mobile was bleeding edge) and I used to go to job interviews with mobile devices to showcase what I could do. That way they could see for themselves and not have to rely on just a CV.

The older guys who fell behind did so because their skills and toolsets had become obsolete.

Now I'm trying to hire a replacement to write Android code for use in industrial production and struggling to find someone with enough experience. So where is this oversupply I keep hearing about?

Jared Hall -> RogTheDodge , 21 Sep 2017 16:42
Not producing enough to fill vacancies or not producing enough to keep wages at Google's preferred rate? Seeing as research shows there is no lack of qualified developers, the latter option seems more likely.
JayThomas , 21 Sep 2017 16:39

It's about ensuring those salaries no longer exist, by creating a source of cheap labor for the tech industry.

We're already using Asia as a source of cheap labor for the tech industry. Why do we need to create cheap labor in the US? That just seems inefficient.

FabBlondie -> RogTheDodge , 21 Sep 2017 16:39
There was never any need to give our jobs to foreigners. That is, if you are comparing the production of domestic vs. foreign workers. The sole need was, and is, to increase profits.
peter nelson -> AmyInNH , 21 Sep 2017 16:34
Link?
FabBlondie , 21 Sep 2017 16:34
Schools MAY be able to fix big social problems, but only if they teach a well-rounded curriculum that includes classical history and the humanities. Job-specific training is completely different. What a joke to persuade public school districts to pick up the tab on job training. The existing social problems were not caused by a lack of programmers, and cannot be solved by Big Tech.

I agree with the author that computer programming skills are not that limited in availability. Big Tech solved the problem of the well-paid professional some years ago by letting them go - these were mostly workers in their 50s - and replacing them with H1-B visa-holders from India, who work for a fraction of what their experienced American counterparts earned.

It is all about profits. Big Tech is no different than any other "industry."

peter nelson -> Jared Hall , 21 Sep 2017 16:31
Supply of apples does not affect the demand for oranges. Teaching coding in high school does not necessarily alter the supply of software engineers. I studied Chinese History and geology at University but my doing so has had no effect on the job prospects of people doing those things for a living.
johnontheleft -> Taylor Dotson , 21 Sep 2017 16:30
You would be surprised just how much a little coding knowledge has transformed my ability to do my job (a job that is not directly related to IT at all).
peter nelson -> Jared Hall , 21 Sep 2017 16:29
Because teaching coding does not affect the supply of actual engineers. I've been a professional software engineer for 40 years and coding is only a small fraction of what I do.
peter nelson -> Jared Hall , 21 Sep 2017 16:28
You and the linked article don't know what you're talking about. A CS degree does not equate to a productive engineer.

A few years ago I was on the recruiting and interviewing committee to try to hire some software engineers for a scientific instrument my company was making. The entire team had about 60 people (hw, sw, mech engineers) but we needed 2 or 3 sw engineers with math and signal-processing expertise. The project was held up for SIX months because we could not find the people we needed. It would have taken a lot longer than that to train someone up to our needs. Eventually we brought in some Chinese engineers which cost us MORE than what we would have paid for an American engineer when you factor in the agency and visa paperwork.

Modern software engineers are not just generic interchangeable parts - 21st century technology often requires specialised scientific, mathematical, production or business domain-specific knowledge, and those people are hard to find.

freeluna -> freeluna , 21 Sep 2017 16:18
...also, this article is alarmist and I disagree with it. Dear Author, Phphphphtttt! Sincerely, freeluna
AmyInNH , 21 Sep 2017 16:16
Regimentation of the many, for benefit of the few.
AmyInNH -> Whatitsaysonthetin , 21 Sep 2017 16:15
Visa jobs are part of trade agreements. To be very specific, US gov (and EU) trade Western jobs for market access in the East.
http://www.marketwatch.com/story/in-india-british-leader-theresa-may-preaches-free-trade-2016-11-07
There is no shortage. This is selling off the West's middle class.
Take a look at remittances on Wikipedia and you'll get a good idea just how much it costs the US and EU economies, for the sake of record profits to Western industry.
jigen , 21 Sep 2017 16:13
And thanks to the author for not using the adjective "elegant" in describing coding.
freeluna , 21 Sep 2017 16:13
I see advantages in teaching kids to code, and for kids to make arduino and other CPU powered things. I don't see a lot of interest in science and tech coming from kids in school. There are too many distractions from social media and game platforms, and not much interest in developing tools for future tech and science.
jigen , 21 Sep 2017 16:13
Let the robots do the coding. Sorted.
FluffyDog -> rgilyead , 21 Sep 2017 16:13
Although coding per se is a technical skill it isn't designing or integrating systems. It is only a small, although essential, part of the whole software engineering process. Learning to code just gets you up the first steps of a high ladder that you need to climb a fair way if you intend to use your skills to earn a decent living.
rebel7 , 21 Sep 2017 16:11
BS.

Friend of mine in the SV tech industry reports that they are about 100,000 programmers short in just the internet security field.

Y'all are trying to create a problem where there isn't one. Maybe we shouldn't teach them how to read either. They might want to work somewhere besides the grill at McDonalds.

AmyInNH -> WyntonK , 21 Sep 2017 16:11
To which they will respond, offshore.
AmyInNH -> MrFumoFumo , 21 Sep 2017 16:10
They're not looking for good, they're looking for cheap + visa indentured. Non-citizens.
nickGregor , 21 Sep 2017 16:09
Within the next year coding will be old news and you will simply be able to describe things in your native language in such a way that the machine will be able to execute any set of instructions you give it. Coding is going to change from its purely abstract form that is not utilized at peak - but if you can describe what you envision in an effective, concise manner you could become a very good coder very quickly - and competence will be determined entirely by imagination, and the barriers to entry will all but be extinct.
AmyInNH -> unclestinky , 21 Sep 2017 16:09
Already there. I take it you skipped right past the employment prospects for US STEM grads - 50% chance of finding STEM work.
AmyInNH -> User10006 , 21 Sep 2017 16:06
Apparently a whole lot of people are just making it up, eh?
http://www.motherjones.com/politics/2017/09/inside-the-growing-guest-worker-program-trapping-indian-students-in-virtual-servitude/
From today,
http://www.computerworld.com/article/2915904/it-outsourcing/fury-rises-at-disney-over-use-of-foreign-workers.html
All the way back to 1995,
https://www.youtube.com/watch?v=vW8r3LoI8M4&feature=youtu.be
JCA1507 -> whitehawk66 , 21 Sep 2017 16:04
Bravo
JCA1507 -> DirDigIns , 21 Sep 2017 16:01
Total... utter... no other way... huge... will only get worse... everyone... (not a very nuanced commentary is it).

I'm glad pieces like this are mounting, it is relevant that we counter the mix of messianism and opportunism of Silicon Valley propaganda with convincing arguments.

RogTheDodge -> WithoutPurpose , 21 Sep 2017 16:01
That's not my experience.
AmyInNH -> TTauriStellarbody , 21 Sep 2017 16:01
It's a stall tactic by Silicon Valley: "See, we're trying to resolve the [non-existent] shortage."
AmyInNH -> WyntonK , 21 Sep 2017 16:00
They aren't immigrants. They're visa-indentured foreign workers. Why does that matter? It's part of the cheap+indentured hiring criteria. If it were only cheap, they'd be lowballing offers to citizens and US new grads.
RogTheDodge -> Jared Hall , 21 Sep 2017 15:59
No. Because they're the ones wanting them and realizing the US education system is not producing enough.
RogTheDodge -> Jared Hall , 21 Sep 2017 15:58
Except the demand is increasing massively.
RogTheDodge -> WyntonK , 21 Sep 2017 15:57
That's why we are trying to educate American coders - so we don't need to give our jobs to foreigners.
AmyInNH , 21 Sep 2017 15:56
Correct premises,
- proletarianize programmers
- many qualified graduates simply can't find jobs.
Invalid conclusion:
- The problem is there aren't enough good jobs to be trained for.

That conclusion only makes sense if you skip right past ...
" importing large numbers of skilled guest workers from other countries through the H1-B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status"

Hiring Americans doesn't "hurt" their record profits. It's incessant greed and collusion with our corrupt congress.

Oldvinyl , 21 Sep 2017 15:51
This column was really annoying. I taught my students how to program when I was given a free hand to create the computer studies curriculum for a new school I joined. (Not in the UK, thank Dog.) 7th graders began with studying the history and uses of computers and communications tech. My 8th graders learned about computer logic (AND, OR, NOT, etc.) and moved on with QuickBASIC in the second part of the year. My 9th graders learned about databases and SQL, and how to use HTML to make their own Web sites. Last year I received a phone call from the father of one student thanking me for creating the course; his son had just received a job offer and now works in San Francisco for Google.
I am so glad I taught them "coding" (UGH) as the writer puts it, rather than arty-farty subjects not worth a damn in the jobs market.
WyntonK -> DirDigIns , 21 Sep 2017 15:47
I live and work in Silicon Valley and you have no idea what you are talking about. There's no shortage of coders at all. Terrific coders are let go because of their age and the availability of much cheaper foreign coders (no, I am not opposed to immigration).
Sean May , 21 Sep 2017 15:43
Looks like you pissed off a ton of people who can't write code and are none too happy with you pointing out the reason they're slinging insurance for Geico.

I think you're quite right that coding skills will eventually enter the mainstream and slowly bring down the cost of hiring programmers.

The fact is that even if you don't get paid to be a programmer you can absolutely benefit from having some coding skills.

There may however be some kind of major coding revolution with the advent of quantum computing. The way code is written now could become obsolete.

Jared Hall -> User10006 , 21 Sep 2017 15:43
Why is it a fantasy? Does supply and demand not apply to IT labor pools?
Jared Hall -> ninianpark , 21 Sep 2017 15:42
Why is it a load of crap? If you increase the supply of something with no corresponding increase in demand, the price will decrease.
pictonic , 21 Sep 2017 15:40
A well-argued article that hits the nail on the head. Amongst any group of coders, very few are truly productive, and they are self-starters; training is really needed to do the admin.
Jared Hall -> DirDigIns , 21 Sep 2017 15:39
There is not a huge skills shortage. That is why the author linked this EPI report analyzing the data to prove exactly that. This may not be what people want to believe, but it is certainly what the numbers indicate. There is no skills gap.

http://www.epi.org/files/2013/bp359-guestworkers-high-skill-labor-market-analysis.pdf

Axel Seaton -> Jaberwocky , 21 Sep 2017 15:34
Yeah, but the money is crap
DirDigIns -> IanMcLzzz , 21 Sep 2017 15:32
Perfect response for the absolute crap that the article is pushing.
DirDigIns , 21 Sep 2017 15:30
Total and utter crap, no other way to put it.

There is a huge skills shortage in key tech areas that will only get worse if we don't educate and train the young effectively.

Everyone wants youth to have good skills for the knowledge economy and the ability to earn a good salary and build up life chances for UK youth.

So we get this verbal diarrhoea of an article. Defies belief.

Whatitsaysonthetin -> Evelita , 21 Sep 2017 15:27
Yes. China and India are indeed training youth in coding skills. In order that they take jobs in the USA and UK! It's been going on for 20 years and has resulted in many experienced IT staff struggling to get work at all and, even if they can, to suffer stagnating wages.
WmBoot , 21 Sep 2017 15:23
Wow. Congratulations to the author for provoking such a torrent of vitriol! Job well done.
TTauriStellarbody , 21 Sep 2017 15:22
Has anyone's job been at risk from a 16-year-old who can cobble together a couple of lines of JavaScript since the dot-com bubble?

Good luck trying to teach a big enough pool of US school kids regular expressions, let alone the kind of test-driven continuous delivery that is the norm in the industry now.
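Regular expressions are a fair test case for that claim: trivial to demo, slow to master. A minimal Java sketch (the pattern and input are illustrative only):

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class RegexDemo {
        public static void main(String[] args) {
            // Match ISO-style dates such as 2017-09-21 and extract the parts.
            Pattern date = Pattern.compile("(\\d{4})-(\\d{2})-(\\d{2})");
            Matcher m = date.matcher("Posted on 2017-09-21 at 17:36");
            if (m.find()) {
                System.out.println("year=" + m.group(1) + ", month=" + m.group(2));
            }
        }
    }

Writing the pattern is the easy part; knowing when a regex is the wrong tool, and what it silently fails to match, is what takes practice.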

freeandfair -> youngsteveo , 21 Sep 2017 13:27
> A lot of resumes come across my desk that look qualified on paper, but that's not the same thing as being able to do the job

I have exactly the same experience. There is undeniably a skill gap. It takes about a year for a skilled professional to adjust and learn enough to become productive; it takes about 3-5 years for a college grad.

It is nothing new. But the issue is, as the college grad gets trained, another company steals him/her. And also keep in mind that all this time you are doing your job and training the new employee as time permits. Many companies in the US cut non-profit departments (such as IT) to the bone; we cannot afford to lose a person and then train a replacement for 3-5 years.

The solution? Hire a skilled person. But that means nobody is training college grads, and in 10-20 years we are looking at a skill shortage to the point where the only option is bringing in foreign labor.

American cut-throat companies that care only about the bottom line cannibalized themselves.

farabundovive -> Ethan Hawkins , 21 Sep 2017 15:10

Heh. You are not a coder, I take it. :) Going to be a few decades before even the easiest coding jobs vanish.

Given how shit most coders of my acquaintance have been - especially in matters of work ethic, logic, matching s/w to user requirements and willingness to test and correct their gormless output - most future coding work will probably be in the area of disaster recovery. Sorry, since the poor snowflakes can't face the sad facts, we have to call it "business continuation" these days, don't we?
UncommonTruthiness , 21 Sep 2017 14:10
The demonization of Silicon Valley is clearly the next place to put all blame. Look what "they" did to us: computers, smart phones, HD television, world-wide internet, on and on. Get a rope!

I moved there in 1978 and watched the orchards and trailer parks on North 1st St. of San Jose transform into a concrete jungle. There used to be quite a bit of semiconductor equipment and device manufacturing in SV during the 80s and 90s. Now quite a few buildings have the same name : AVAILABLE. Most equipment and device manufacturing has moved to Asia.

Programming started with binary, then machine code (hexadecimal or octal) and moved to assembler as a compiled and linked structure. More compiled languages like FORTRAN, BASIC, PL/1, COBOL, Pascal, and C (and all its "+'s") followed, making programming easier for the less talented. Now the script-based languages (HTML, JavaScript, etc.) are even higher level and accessible to nearly all. Programming has become a commodity and will be priced like milk, wheat, corn, non-unionized workers and the like. The ship has sailed on this activity as a career.

[Sep 19, 2017] Boston Startups Are Teaching Boats to Drive Themselves by Joshua Brustein

Notable quotes:
"... He's also a sort of maritime-technology historian. A tall, white-haired man in a baseball cap, shark t-shirt and boat shoes, Benjamin said he's spent the last 15 years "making vehicles wet." He has the U.S. armed forces to thank for making his autonomous work possible. The military sparked the field of marine autonomy decades ago, when it began demanding underwater robots for mine detection, ..."
"... In 2006, Benjamin launched his open-source software project. With it, a computer is able to take over a boat's navigation-and-control system. Anyone can write programs for it. The project is funded by the U.S. Office for Naval Research and Battelle Memorial Institute, a nonprofit. Benjamin said there are dozens of types of vehicles using the software, which is called MOOS-IvP. ..."
Sep 19, 2017 | www.msn.com

Originally from: Bloomberg via Associated Press

Frank Marino, an engineer with Sea Machines Robotics, uses a remote control belt pack to control a self-driving boat in Boston Harbor.

(Bloomberg) -- Frank Marino sat in a repurposed U.S. Coast Guard boat bobbing in Boston Harbor one morning late last month. He pointed the boat straight at a buoy several hundred yards away, while his colleague Mohamed Saad Ibn Seddik used a laptop to set the vehicle on a course that would run right into it. Then Ibn Seddik flipped the boat into autonomous driving mode. They sat back as the vessel moved at a modest speed of six knots, smoothly veering right to avoid the buoy, and then returned to its course.

In a slightly apologetic tone, Marino acknowledged the experience wasn't as harrowing as barreling down a highway in an SUV that no one is steering. "It's not like a self-driving car, where the wheel turns on its own," he said. Ibn Seddik tapped in directions to get the boat moving back the other way at twice the speed. This time, the vessel kicked up a wake, and the turn felt sharper, even as it gave the buoy the same wide berth as it had before. As far as thrills go, it'd have to do. Ibn Seddik said going any faster would make everyone on board nauseous.

The two men work for Sea Machines Robotics Inc., a three-year-old company developing computer systems for work boats that can make them either remote-controllable or completely autonomous. In May, the company spent $90,000 to buy the Coast Guard hand-me-down at a government auction. Employees ripped out one of the four seats in the cabin to make room for a metal-encased computer they call a "first-generation autonomy cabinet." They painted the hull bright yellow and added the words "Unmanned Vehicle" in big, red letters. Cameras are positioned at the stern and bow, and a dome-like radar system and a digital GPS unit relay additional information about the vehicle's surroundings. The company named its new vessel Steadfast.

Autonomous maritime vehicles haven't drawn as much attention as self-driving cars, but they're hitting the waters with increased regularity. Huge shipping interests, such as Rolls-Royce Holdings Plc, Tokyo-based shipping line Nippon Yusen K.K. and BHP Billiton Ltd., the world's largest mining company, have all recently announced plans to use driverless ships for large-scale ocean transport. Boston has become a hub for marine technology startups focused on smaller vehicles, with a handful of companies like Sea Machines building their own autonomous systems for boats, diving drones and other robots that operate on or under the water.

As Marino and Ibn Seddik were steering Steadfast back to dock, another robot boat trainer, Michael Benjamin, motored past them. Benjamin, a professor at Massachusetts Institute of Technology, is a regular presence on the local waters. His program in marine autonomy, a joint effort by the school's mechanical engineering and computer science departments, serves as something of a ballast for Boston's burgeoning self-driving boat scene. Benjamin helps engineers find jobs at startups and runs an open-source software project that's crucial to many autonomous marine vehicles.

He's also a sort of maritime-technology historian. A tall, white-haired man in a baseball cap, shark t-shirt and boat shoes, Benjamin said he's spent the last 15 years "making vehicles wet." He has the U.S. armed forces to thank for making his autonomous work possible. The military sparked the field of marine autonomy decades ago, when it began demanding underwater robots for mine detection, Benjamin explained from a chair on MIT's dock overlooking the Charles River. Eventually, self-driving software worked its way into all kinds of boats.

These systems tended to chart a course based on a specific script, rather than sensing and responding to their environments. But a major shift came about a decade ago, when manufacturers began allowing customers to plug in their own autonomy systems, according to Benjamin. "Imagine where the PC revolution would have gone if the only one who could write software on an IBM personal computer was IBM," he said.

In 2006, Benjamin launched his open-source software project. With it, a computer is able to take over a boat's navigation-and-control system. Anyone can write programs for it. The project is funded by the U.S. Office for Naval Research and Battelle Memorial Institute, a nonprofit. Benjamin said there are dozens of types of vehicles using the software, which is called MOOS-IvP.
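
To make the "anyone can write programs for it" idea concrete, here is a minimal sketch of a behavior-based, sense-and-react control loop in the spirit of MOOS-IvP. It is written in Python purely for illustration; the real MOOS-IvP codebase is C++, and every class, name, and number below is a hypothetical stand-in, not the project's actual API.

    import math
    from dataclasses import dataclass

    @dataclass
    class State:
        x: float        # meters east
        y: float        # meters north
        heading: float  # degrees, 0 = due north

    class Behavior:
        """Each behavior votes for a heading; an arbiter blends the votes."""
        def desired_heading(self, state: State) -> float:
            raise NotImplementedError
        def weight(self, state: State) -> float:
            return 1.0

    class GoToWaypoint(Behavior):
        def __init__(self, wx: float, wy: float):
            self.wx, self.wy = wx, wy
        def desired_heading(self, state: State) -> float:
            # compass bearing from the boat to the waypoint
            return math.degrees(math.atan2(self.wx - state.x,
                                           self.wy - state.y)) % 360

    class AvoidPoint(Behavior):
        """Pushes away from a known hazard (e.g., a charted buoy)."""
        def __init__(self, ox: float, oy: float, safe_dist: float = 50.0):
            self.ox, self.oy, self.safe = ox, oy, safe_dist
        def weight(self, state: State) -> float:
            d = math.hypot(self.ox - state.x, self.oy - state.y)
            return 0.0 if d >= self.safe else 10.0 * (self.safe - d) / self.safe
        def desired_heading(self, state: State) -> float:
            # steer directly away from the hazard
            return math.degrees(math.atan2(state.x - self.ox,
                                           state.y - self.oy)) % 360

    def arbitrate(behaviors, state: State) -> float:
        """Weighted circular mean of all proposed headings."""
        east = north = 0.0
        for b in behaviors:
            w = b.weight(state)
            h = math.radians(b.desired_heading(state))
            east += w * math.sin(h)
            north += w * math.cos(h)
        return math.degrees(math.atan2(east, north)) % 360

    # Each control tick: read sensors, blend the votes, command the helm.
    boat = State(x=0.0, y=0.0, heading=0.0)
    plan = [GoToWaypoint(200.0, 800.0), AvoidPoint(90.0, 350.0)]
    print(round(arbitrate(plan, boat)))  # heading trading off both goals

The payoff of this structure is exactly the one Benjamin points at: the vendor's navigation core (the arbiter and the vehicle interface) stays fixed, while third parties add capability simply by dropping in new Behavior subclasses.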

Startups using MOOS-IvP said it has created a kind of common vocabulary. "If we had a proprietary system, we would have had to develop training and train new employees," said Ibn Seddik. "Fortunately for us, Mike developed a course that serves exactly that purpose."

Teaching a boat to drive itself is, in some ways, easier than teaching a car. Boats typically don't have to deal with traffic, stoplights or roundabouts. But water is a unique challenge. "The structure of the road, with traffic lights, bounds your problem a little bit," said Benjamin. "The number of unique possible situations that you can bump into is enormous." At the moment, underwater robots represent a bigger chunk of the market than boats. Sales are expected to hit $4.6 billion in 2020, more than double the amount from 2015, according to ABI Research. The biggest customer is the military.

Several startups hope to change that. Michael Johnson, Sea Machines' chief executive officer, said the long-term potential for self-driving boats involves teams of autonomous vessels working in concert. In many harbors, multiple tugs bring in large container ships, communicating either through radio or by whistle. That could be replaced by software controlling all the boats as a single system, Johnson said.

Sea Machines' first customer is Marine Spill Response Corp., a nonprofit group funded by oil companies. The organization operates oil spill response teams that consist of a 210-foot ship paired with a 32-foot boat, which work together to drag a device collecting oil. Self-driving boats could help because staffing the 32-foot boat in choppy waters or at night can be dangerous, but the theory needs proper vetting, said Judith Roos, a vice president for MSRC. "It's too early to say, 'We're going to go out and buy 20 widgets.'"

Another local startup, Autonomous Marine Systems Inc., has been sending boats about 10 miles out to sea and leaving them there for weeks at a time. AMS's vehicles are designed to operate for long stretches, gathering data in wind farms and oil fields. One vessel is a catamaran dubbed the Datamaran, a name that first came from an employee's typo, said AMS CEO Ravi Paintal. The company also uses Benjamin's software platform. Paintal said AMS's longest missions so far have been 20 days, give or take. "They say when your boat can operate for 30 days out in the ocean environment, you'll be in the running for a commercial contract," he said.

... ... ...

[Sep 17, 2017] The last 25 years (or so) were years of tremendous progress in computers and networking that changed human civilization

Notable quotes:
"... To emulate those capabilities on computers will probably require another 100 years or more. Selective functions can be imitated even now (manipulator that deals with blocks in a pyramid was created in 70th or early 80th I think, but capabilities of human "eye controlled arm" is still far, far beyond even wildest dreams of AI. ..."
"... Similarly human intellect is completely different from AI. At the current level the difference is probably 1000 times larger then the difference between a child with Down syndrome and a normal person. ..."
"... Human brain is actually a machine that creates languages for specific domain (or acquire them via learning) and then is able to operate in terms of those languages. Human child forced to grow up with animals, including wild animals, learns and is able to use "animal language." At least to a certain extent. Some of such children managed to survive in this environment. ..."
"... If you are bilingual, try Google translate on this post. You might be impressed by their recent progress in this field. It did improved considerably and now does not cause instant laugh. ..."
"... One interesting observation that I have is that automation is not always improve functioning of the organization. It can be quite opposite :-). Only the costs are cut, and even that is not always true. ..."
"... Of course the last 25 years (or so) were years of tremendous progress in computers and networking that changed the human civilization. And it is unclear whether we reached the limit of current capabilities or not in certain areas (in CPU speeds and die shrinking we probably did; I do not expect anything significant below 7 nanometers: https://en.wikipedia.org/wiki/7_nanometer ). ..."
May 28, 2017 | economistsview.typepad.com

libezkova, May 27, 2017 at 10:53 PM

"When combined with our brains, human fingers are amazingly fine manipulation devices."

Not only fingers. The whole human arm is an amazing device. Pure magic, if you ask me.

To emulate those capabilities on computers will probably require another 100 years or more. Selected functions can be imitated even now (a manipulator that stacked blocks into a pyramid was created in the 1970s or early 1980s, I think), but the capabilities of the human "eye-controlled arm" are still far, far beyond even the wildest dreams of AI.

Similarly, human intellect is completely different from AI. At the current level the difference is probably 1000 times larger than the difference between a child with Down syndrome and a normal person.

The human brain is actually a machine that creates languages for specific domains (or acquires them via learning) and is then able to operate in terms of those languages. A human child forced to grow up among animals, including wild animals, learns and is able to use an "animal language," at least to a certain extent. Some such children have managed to survive in this environment.

Such cruel natural experiments have shown that the flexibility of the human brain is something really incredible, and IMHO cannot be achieved by computers (although never say never).

Here we are talking about tasks that are a million times more complex than playing Go or chess, or driving a car on the street.

My impression is that most recent AI successes (especially IBM's win in Jeopardy ( http://www.techrepublic.com/article/ibm-watson-the-inside-story-of-how-the-jeopardy-winning-supercomputer-was-born-and-what-it-wants-to-do-next/ ), which probably was partially staged) are by and large due to the growth of storage and the number of cores in computers, not so much to the sophistication of the algorithms used.

The limits of AI are clearly visible in the quality of machine translation from one language to another. For more or less complex technical text it remains medium to low, as in "requires human editing."

If you are bilingual, try Google Translate on this post. You might be impressed by their recent progress in this field. It has improved considerably and no longer causes instant laughter.

The same is true of speech recognition. The progress is tremendous, especially over the last three to five years, but it is still far from perfect. Now, with some training, programs like Dragon are quite usable as a dictation device on, say, a PC with a 4-core 3GHz CPU and 16 GB of memory (especially if you are a native English speaker); but if you deal with specialized text or have a strong accent, they still leave much to be desired (although your knowledge of the program, experience, and persistence can improve the results considerably).

One interesting observation of mine is that automation does not always improve the functioning of an organization. It can be quite the opposite :-). Only the costs are cut, and even that is not always true.

Of course, the last 25 years (or so) were years of tremendous progress in computers and networking that changed human civilization. And it is unclear whether we have reached the limit of current capabilities in certain areas (in CPU speeds and die shrinking we probably have; I do not expect anything significant below 7 nanometers: https://en.wikipedia.org/wiki/7_nanometer ).

[Sep 16, 2017] Google Publicly Releases Internal Developer Documentation Style Guide

Sep 12, 2017 | developers.slashdot.org

(betanews.com) Posted by BeauHD on Tuesday September 12, 2017 @06:00AM from the free-for-all dept.

BrianFagioli shares a report from BetaNews: The documentation aspect of any project is very important, as it can help people both understand it and track changes. Unfortunately, many developers aren't very interested in the documentation aspect, so it often gets neglected. Luckily, if you want to maintain proper documentation and stay organized, today Google is releasing its internal developer documentation style guide.

This can quite literally guide your documentation, giving you a great starting point and keeping things consistent. Jed Hartman, a technical writer at Google, says: "For some years now, our technical writers at Google have used an internal-only editorial style guide for most of our developer documentation. In order to better support external contributors to our open source projects, such as Kubernetes, AMP, or Dart, and to allow for more consistency across developer documentation, we're now making that style guide public.

If you contribute documentation to projects like those, you now have direct access to useful guidance about voice, tone, word choice, and other style considerations. It can be useful for general issues, like reminders to use second person, present tense, active voice, and the serial comma; it can also be great for checking very specific issues, like whether to write 'app' or 'application' when you want to be consistent with the Google Developers style."

You can access Google's style guide here.
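
As a toy illustration of how a few of those editorial rules can be checked mechanically, here is a sketch of a naive style linter in Python. The heuristics are mine, not Google's: real prose checking needs more than regular expressions, and the rules below only loosely paraphrase the guidance quoted above.

    import re

    # Naive heuristics keyed to the style points mentioned above:
    # second person, present tense, and consistent word choice.
    # Each is an over-approximation and will produce false positives.
    CHECKS = [
        (re.compile(r"\b(I|we|our)\b", re.IGNORECASE),
         "prefer second person ('you') over first person"),
        (re.compile(r"\bwill\b", re.IGNORECASE),
         "prefer present tense over future tense"),
        (re.compile(r"\b(app|application)\b", re.IGNORECASE),
         "check 'app' vs. 'application' against the guide's word list"),
    ]

    def lint(text: str):
        """Yield (line_number, advice) for every heuristic hit."""
        for n, line in enumerate(text.splitlines(), start=1):
            for pattern, advice in CHECKS:
                if pattern.search(line):
                    yield n, advice

    sample = "We will then show how the application is configured."
    for line_no, advice in lint(sample):
        print(f"line {line_no}: {advice}")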

[Aug 21, 2017] As the crisis unfolds there will be talk about giving the UN some role in resolving international problems.

Aug 21, 2017 | www.lettinggobreath.com

psychohistorian | Aug 21, 2017 12:01:32 AM | 27

My understanding of the UN is that it is the High Court of the World, where fealty is paid to the empire that funds most of the political circus anyway... And speaking of funding (or not), read the following link and let's see what PavewayIV adds to the potential sickness we are sleepwalking into.

As the UN delays talks, more industry leaders back ban on weaponized AI

[Jul 25, 2017] Knuth Computer Programming as an Art

Jul 25, 2017 | www.paulgraham.com

CACM, December 1974

When Communications of the ACM began publication in 1959, the members of ACM's Editorial Board made the following remark as they described the purposes of ACM's periodicals [2]:

"If computer programming is to become an important part of computer research and development, a transition of programming from an art to a disciplined science must be effected."
Such a goal has been a continually recurring theme during the ensuing years; for example, we read in 1970 of the "first steps toward transforming the art of programming into a science" [26]. Meanwhile we have actually succeeded in making our discipline a science, and in a remarkably simple way: merely by deciding to call it "computer science."

Implicit in these remarks is the notion that there is something undesirable about an area of human activity that is classified as an "art"; it has to be a Science before it has any real stature. On the other hand, I have been working for more than 12 years on a series of books called "The Art of Computer Programming." People frequently ask me why I picked such a title; and in fact some people apparently don't believe that I really did so, since I've seen at least one bibliographic reference to some books called "The Act of Computer Programming."

In this talk I shall try to explain why I think "Art" is the appropriate word. I will discuss what it means for something to be an art, in contrast to being a science; I will try to examine whether arts are good things or bad things; and I will try to show that a proper viewpoint of the subject will help us all to improve the quality of what we are now doing.

One of the first times I was ever asked about the title of my books was in 1966, during the last previous ACM national meeting held in Southern California. This was before any of the books were published, and I recall having lunch with a friend at the convention hotel. He knew how conceited I was, already at that time, so he asked if I was going to call my books "An Introduction to Don Knuth." I replied that, on the contrary, I was naming the books after him. His name: Art Evans. (The Art of Computer Programming, in person.)

From this story we can conclude that the word "art" has more than one meaning. In fact, one of the nicest things about the word is that it is used in many different senses, each of which is quite appropriate in connection with computer programming. While preparing this talk, I went to the library to find out what people have written about the word "art" through the years; and after spending several fascinating days in the stacks, I came to the conclusion that "art" must be one of the most interesting words in the English language.

The Arts of Old

If we go back to Latin roots, we find ars, artis meaning "skill." It is perhaps significant that the corresponding Greek word was τεχνη, the root of both "technology" and "technique."

Nowadays when someone speaks of "art" you probably think first of "fine arts" such as painting and sculpture, but before the twentieth century the word was generally used in quite a different sense. Since this older meaning of "art" still survives in many idioms, especially when we are contrasting art with science, I would like to spend the next few minutes talking about art in its classical sense.

In medieval times, the first universities were established to teach the seven so-called "liberal arts," namely grammar, rhetoric, logic, arithmetic, geometry, music, and astronomy. Note that this is quite different from the curriculum of today's liberal arts colleges, and that at least three of the original seven liberal arts are important components of computer science. At that time, an "art" meant something devised by man's intellect, as opposed to activities derived from nature or instinct; "liberal" arts were liberated or free, in contrast to manual arts such as plowing (cf. [6]). During the middle ages the word "art" by itself usually meant logic [4], which usually meant the study of syllogisms.

Science vs. Art

The word "science" seems to have been used for many years in about the same sense as "art"; for example, people spoke also of the seven liberal sciences, which were the same as the seven liberal arts [1]. Duns Scotus in the thirteenth century called logic "the Science of Sciences, and the Art of Arts" (cf. [12, p. 34f]). As civilization and learning developed, the words took on more and more independent meanings, "science" being used to stand for knowledge, and "art" for the application of knowledge. Thus, the science of astronomy was the basis for the art of navigation. The situation was almost exactly like the way in which we now distinguish between "science" and "engineering."

Many authors wrote about the relationship between art and science in the nineteenth century, and I believe the best discussion was given by John Stuart Mill. He said the following things, among others, in 1843 [28]:

Several sciences are often necessary to form the groundwork of a single art. Such is the complication of human affairs, that to enable one thing to be done, it is often requisite to know the nature and properties of many things... Art in general consists of the truths of Science, arranged in the most convenient order for practice, instead of the order which is the most convenient for thought. Science groups and arranges its truths so as to enable us to take in at one view as much as possible of the general order of the universe. Art... brings together from parts of the field of science most remote from one another, the truths relating to the production of the different and heterogeneous conditions necessary to each effect which the exigencies of practical life require.
As I was looking up these things about the meanings of "art," I found that authors have been calling for a transition from art to science for at least two centuries. For example, the preface to a textbook on mineralogy, written in 1784, said the following [17]: "Previous to the year 1780, mineralogy, though tolerably understood by many as an Art, could scarce be deemed a Science."

According to most dictionaries "science" means knowledge that has been logically arranged and systematized in the form of general "laws." The advantage of science is that it saves us from the need to think things through in each individual case; we can turn our thoughts to higher-level concepts. As John Ruskin wrote in 1853 [32]: "The work of science is to substitute facts for appearances, and demonstrations for impressions."

It seems to me that if the authors I studied were writing today, they would agree with the following characterization: Science is knowledge which we understand so well that we can teach it to a computer; and if we don't fully understand something, it is an art to deal with it. Since the notion of an algorithm or a computer program provides us with an extremely useful test for the depth of our knowledge about any given subject, the process of going from an art to a science means that we learn how to automate something.

Artificial intelligence has been making significant progress, yet there is a huge gap between what computers can do in the foreseeable future and what ordinary people can do. The mysterious insights that people have when speaking, listening, creating, and even when they are programming, are still beyond the reach of science; nearly everything we do is still an art.

From this standpoint it is certainly desirable to make computer programming a science, and we have indeed come a long way in the 15 years since the publication of the remarks I quoted at the beginning of this talk. Fifteen years ago computer programming was so badly understood that hardly anyone even thought about proving programs correct; we just fiddled with a program until we "knew" it worked. At that time we didn't even know how to express the concept that a program was correct, in any rigorous way. It is only in recent years that we have been learning about the processes of abstraction by which programs are written and understood; and this new knowledge about programming is currently producing great payoffs in practice, even though few programs are actually proved correct with complete rigor, since we are beginning to understand the principles of program structure. The point is that when we write programs today, we know that we could in principle construct formal proofs of their correctness if we really wanted to, now that we understand how such proofs are formulated. This scientific basis is resulting in programs that are significantly more reliable than those we wrote in former days when intuition was the only basis of correctness.
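
As a concrete illustration (mine, not Knuth's) of what "we could in principle construct formal proofs of their correctness" means, consider a small routine annotated with the loop invariant from which such a proof would be built. The sketch uses Python only for readability; any language would do.

    def int_sqrt(n: int) -> int:
        """Return the largest r with r*r <= n, for n >= 0.

        The correctness argument hangs on one loop invariant:
            lo*lo <= n < hi*hi
        It holds on entry, each branch preserves it, and hi - lo
        strictly decreases, so the loop terminates with lo equal to
        the integer square root. A formal (Hoare-style) proof would
        be assembled from exactly these three facts.
        """
        assert n >= 0
        lo, hi = 0, n + 1            # invariant holds: 0 <= n < (n+1)**2
        while hi - lo > 1:
            mid = (lo + hi) // 2     # lo < mid < hi
            if mid * mid <= n:
                lo = mid             # invariant preserved on the left
            else:
                hi = mid             # invariant preserved on the right
        return lo

    assert [int_sqrt(k) for k in (0, 1, 8, 9, 10)] == [0, 1, 2, 3, 3]

The point is not that anyone would formally verify this particular function, but that the invariant makes the program's correctness an articulable claim rather than something we "fiddled with until it worked."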

The field of "automatic programming" is one of the major areas of artificial intelligence research today. Its proponents would love to be able to give a lecture entitled "Computer Programming as an Artifact" (meaning that programming has become merely a relic of bygone days), because their aim is to create machines that write programs better than we can, given only the problem specification. Personally I don't think such a goal will ever be completely attained, but I do think that their research is extremely important, because everything we learn about programming helps us to improve our own artistry. In this sense we should continually be striving to transform every art into a science: in the process, we advance the art.

Science and Art

Our discussion indicates that computer programming is by now both a science and an art, and that the two aspects nicely complement each other. Apparently most authors who examine such a question come to this same conclusion, that their subject is both a science and an art, whatever their subject is (cf. [25]). I found a book about elementary photography, written in 1893, which stated that "the development of the photographic image is both an art and a science" [13]. In fact, when I first picked up a dictionary in order to study the words "art" and "science," I happened to glance at the editor's preface, which began by saying, "The making of a dictionary is both a science and an art." The editor of Funk & Wagnall's dictionary [27] observed that the painstaking accumulation and classification of data about words has a scientific character, while a well-chosen phrasing of definitions demands the ability to write with economy and precision: "The science without the art is likely to be ineffective; the art without the science is certain to be inaccurate."

When preparing this talk I looked through the card catalog at Stanford library to see how other people have been using the words "art" and "science" in the titles of their books. This turned out to be quite interesting.

For example, I found two books entitled The Art of Playing the Piano [5, 15], and others called The Science of Pianoforte Technique [10], The Science of Pianoforte Practice [30]. There is also a book called The Art of Piano Playing: A Scientific Approach [22].

Then I found a nice little book entitled The Gentle Art of Mathematics [31], which made me somewhat sad that I can't honestly describe computer programming as a "gentle art." I had known for several years about a book called The Art of Computation, published in San Francisco, 1879, by a man named C. Frusher Howard [14]. This was a book on practical business arithmetic that had sold over 400,000 copies in various editions by 1890. I was amused to read the preface, since it shows that Howard's philosophy and the intent of his title were quite different from mine; he wrote: "A knowledge of the Science of Number is of minor importance; skill in the Art of Reckoning is absolutely indispensible."

Several books mention both science and art in their titles, notably The Science of Being and Art of Living by Maharishi Mahesh Yogi [24]. There is also a book called The Art of Scientific Discovery [11], which analyzes how some of the great discoveries of science were made.

So much for the word "art" in its classical meaning. Actually when I chose the title of my books, I wasn't thinking primarily of art in this sense, I was thinking more of its current connotations. Probably the most interesting book which turned up in my search was a fairly recent work by Robert E. Mueller called The Science of Art [29]. Of all the books I've mentioned, Mueller's comes closest to expressing what I want to make the central theme of my talk today, in terms of real artistry as we now understand the term. He observes: "It was once thought that the imaginative outlook of the artist was death for the scientist. And the logic of science seemed to spell doom to all possible artistic flights of fancy." He goes on to explore the advantages which actually do result from a synthesis of science and art.

A scientific approach is generally characterized by the words logical, systematic, impersonal, calm, rational, while an artistic approach is characterized by the words aesthetic, creative, humanitarian, anxious, irrational. It seems to me that both of these apparently contradictory approaches have great value with respect to computer programming.

Emma Lehmer wrote in 1956 that she had found coding to be "an exacting science as well as an intriguing art" [23]. H.S.M. Coxeter remarked in 1957 that he sometimes felt "more like an artist than a scientist" [7]. This was at the time C.P. Snow was beginning to voice his alarm at the growing polarization between "two cultures" of educated people [34, 35]. He pointed out that we need to combine scientific and artistic values if we are to make real progress.

Works of Art

When I'm sitting in an audience listening to a long lecture, my attention usually starts to wane at about this point in the hour. So I wonder, are you getting a little tired of my harangue about "science" and "art"? I really hope that you'll be able to listen carefully to the rest of this, anyway, because now comes the part about which I feel most deeply.

When I speak about computer programming as an art, I am thinking primarily of it as an art form , in an aesthetic sense. The chief goal of my work as educator and author is to help people learn how to write beautiful programs . It is for this reason I was especially pleased to learn recently [32] that my books actually appear in the Fine Arts Library at Cornell University. (However, the three volumes apparently sit there neatly on the shelf, without being used, so I'm afraid the librarians may have made a mistake by interpreting my title literally.)

My feeling is that when we prepare a program, it can be like composing poetry or music; as Andrei Ershov has said [9], programming can give us both intellectual and emotional satisfaction, because it is a real achievement to master complexity and to establish a system of consistent rules.

Furthermore when we read other people's programs, we can recognize some of them as genuine works of art. I can still remember the great thrill it was for me to read the listing of Stan Poley's SOAP II assembly program in 1958; you probably think I'm crazy, and styles have certainly changed greatly since then, but at the time it meant a great deal to me to see how elegant a system program could be, especially by comparison with the heavy-handed coding found in other listings I had been studying at the same time. The possibility of writing beautiful programs, even in assembly language, is what got me hooked on programming in the first place.

Some programs are elegant, some are exquisite, some are sparkling. My claim is that it is possible to write grand programs, noble programs, truly magnificent ones!

Taste and Style

The idea of style in programming is now coming to the forefront at last, and I hope that most of you have seen