Softpanorama

May the source be with you, but remember the KISS principle ;-)
(slightly skeptical) Educational society promoting "Back to basics" movement against IT overcomplexity and  bastardization of classic Unix

Unix system administration bulletin, 2017





Old News ;-)

[Dec 25, 2017] American Carnage by Brad Griffin

Notable quotes:
"... It tells me that the bottom line is that Christmas has become a harder season for White families. We are worse off because of BOTH social and economic liberalism which has only benefited an elite few. The bottom half of the White population is now in total disarray – drug addiction, demoralization, divorce, suicide, abortion, atomization, stagnant wages, declining household income and investments – and this dysfunction is creeping up the social ladder. The worst thing we can do is step on the accelerator. ..."
Dec 24, 2017 | www.unz.com

As we move into 2018, I am swinging away from the Republicans. I don't support the Paul Ryan "Better Way" agenda. I don't support neoliberal economics. I think we have been going in the wrong direction since the 1970s and don't want to continue going down this road.

  1. Opioid Deaths: As we all know, the opioid epidemic has become a national crisis and the White working class has been hit the hardest by it. It is a "sea of despair" out there.
  2. White Mortality: As the family crumbles, religion recedes in his life, and his job prospects dwindle, the middle aged White working class man is turning to drugs, alcohol and suicide: The White suicide rate has soared since 2000:
  3. Median Household Income: The average household in the United States is poorer in 2017 than it was in 1997:
  4. Real GDP: Since the late 1990s, real GDP and real median household income have parted ways:
  5. Productivity and Real Wages: Since the 1970s, the minimum wage has parted ways with productivity gains in the US economy:
  6. Stock Market: Since 2000, the stock market has soared, but 10% of Americans own 80% of stocks. The top 1% owns 38% of stocks. In 2007, three-quarters of middle-class households were invested in the stock market, but now only 50% are investors. Overall, 52% of Americans now own stocks, down from 65%. The average American has less than $1,000 in their combined checking and savings accounts.

Do you know what this tells me?

It tells me that the bottom line is that Christmas has become a harder season for White families. We are worse off because of BOTH social and economic liberalism which has only benefited an elite few. The bottom half of the White population is now in total disarray – drug addiction, demoralization, divorce, suicide, abortion, atomization, stagnant wages, declining household income and investments – and this dysfunction is creeping up the social ladder. The worst thing we can do is step on the accelerator.

Paul Ryan and his fellow conservatives look at this and conclude we need MORE freedom. We need lower taxes, more free trade, more deregulation, weaker unions, more immigration and less social safety net spending. He wants to follow up tax reform with entitlement reform in 2018. I can't see how this is going to do anything but make an already bad situation for the White working class even worse.

I'm not rightwing in the sense that these people are. I think their policies are harmful to the nation. I don't think they feel any sense of duty and obligation to the working class like we do. They believe in liberal abstractions and make an Ayn Rand fetish out of freedom whereas we feel a sense of solidarity with them grounded in race, ethnicity and culture which tempers class division. We recoil at the evisceration of the social fabric whereas conservatives celebrate this blind march toward plutocracy.

Do the wealthy need to own a greater share of the stock market? Do they need to own a greater share of our national wealth? Do we need to loosen up morals and the labor market? Do we need more White children growing up in financially stressed, broken homes on Christmas? Is the greatest problem facing the nation spending on anti-poverty programs? Paul Ryan and the True Cons think so.

Yeah, I don't think so. I also think it is a good thing right now that we aren't associated with the mainstream Right. In the long run, I bet this will pay off for us. I predict this platform they have been standing on for decades now, which they call the conservative base, is going to implode on them. Donald Trump was only the first sign that Atlas is about to shrug.

(Republished from Occidental Dissent by permission of author or representative)

[Dec 13, 2017] Stress of long-term unemployment takes a toll on thousands of Jerseyans who are out of work by Leslie Kwoh

Notable quotes:
"... Leslie Kwoh may be reached at lkwoh@starledger.com or (973) 392-4147. ..."
Jun 13, 2010 | www.nj.com

At 5:30 every morning, Tony Gwiazdowski rolls out of bed, brews a pot of coffee and carefully arranges his laptop, cell phone and notepad like silverware across the kitchen table.

And then he waits.

Gwiazdowski, 57, has been waiting for 16 months. Since losing his job as a transportation sales manager in February 2009, he wakes each morning to the sobering reminder that, yes, he is still unemployed. So he pushes aside the fatigue, throws on some clothes and sends out another flurry of resumes and cheery cover letters.

But most days go by without a single phone call. And around sundown, when he hears his neighbors returning home from work, Gwiazdowski -- the former mayor of Hillsborough -- can't help but allow himself one tiny sigh of resignation.

"You sit there and you wonder, 'What am I doing wrong?'" said Gwiazdowski, who finds companionship in his 2-year-old golden retriever, Charlie, until his wife returns from work.

"The worst moment is at the end of the day when it's 4:30 and you did everything you could, and the phone hasn't rung, the e-mails haven't come through."

Gwiazdowski is one of a growing number of chronically unemployed workers in New Jersey and across the country who are struggling to get through what is becoming one long, jobless nightmare -- even as the rest of the economy has begun to show signs of recovery.

Nationwide, 46 percent of the unemployed -- 6.7 million Americans -- have been without work for at least half a year, by far the highest percentage recorded since the U.S. Labor Department began tracking the data in 1948.

In New Jersey, nearly 40 percent of the 416,000 unemployed workers last year fit that profile, up from about 20 percent in previous years, according to the department, which provides only annual breakdowns for individual states. Most of them were unemployed for more than a year.

But the repercussions of chronic unemployment go beyond the loss of a paycheck or the realization that one might never find the same kind of job again. For many, the sinking feeling of joblessness -- with no end in sight -- can take a psychological toll, experts say.

Across the state, mental health crisis units saw a 20 percent increase in demand last year as more residents reported suffering from unemployment-related stress, according to the New Jersey Association of Mental Health Agencies.

"The longer the unemployment continues, the more impact it will have on their personal lives and mental health," said Shauna Moses, the association's associate executive director. "There's stress in the marriage, with the kids, other family members, with friends."

And while a few continue to cling to optimism, even the toughest admit there are moments of despair: Fear of never finding work, envy of employed friends and embarrassment at having to tell acquaintances that, nope, still no luck.

"When they say, 'Hi Mayor,' I don't tell a lot of people I'm out of work -- I say I'm semi-retired," said Gwiazdowski, who maxed out on unemployment benefits several months ago.

"They might think, 'Gee, what's wrong with him? Why can't he get a job?' It's a long story and maybe people really don't care and now they want to get away from you."


SECOND TIME AROUND

Lynn Kafalas has been there before, too. After losing her computer training job in 2000, the East Hanover resident took four agonizing years to find new work -- by then, she had refashioned herself into a web designer.

That not-too-distant experience is why Kafalas, 52, who was laid off again eight months ago, grows uneasier with each passing day. Already, some of her old demons have returned, like loneliness, self-doubt and, worst of all, insomnia. At night, her mind races to dissect the latest interview: What went wrong? What else should she be doing? And why won't even Barnes & Noble hire her?

"It's like putting a stopper on my life -- I can't move on," said Kafalas, who has given up karate lessons, vacations and regular outings with friends. "Everything is about the interviews."

And while most of her friends have been supportive, a few have hinted to her that she is doing something wrong, or not doing enough. The remarks always hit Kafalas with a pang.

In a recent study, researchers at Rutgers University found that the chronically unemployed are prone to high levels of stress, anxiety, depression, loneliness and even substance abuse, which take a toll on their self-esteem and personal relationships.

"They're the forgotten group," said Carl Van Horn, director of the John J. Heldrich Center for Workforce Development at Rutgers, and a co-author of the report. "And the longer you are unemployed, the less likely you are to get a job."

Of the 900 unemployed workers first interviewed last August for the study, only one in 10 landed full-time work by March of this year, and only half of those lucky few expressed satisfaction with their new jobs. Another one in 10 simply gave up searching.

Among those who were still unemployed, many struggled to make ends meet by borrowing from friends or family, turning to government food stamps and forgoing health care, according to the study.

More than half said they avoided all social contact, while slightly less than half said they had lost touch with close friends. Six in 10 said they had problems sleeping.

Kafalas says she deals with her chronic insomnia by hitting the gym for two hours almost every evening, lifting weights and pounding the treadmill until she feels tired enough to fall asleep.

"Sometimes I forget what day it is. Is it Tuesday? And then I'll think of what TV show ran the night before," she said. "Waiting is the toughest part."


AGE A FACTOR

Generally, the likelihood of long-term unemployment increases with age, experts say. A report by the National Employment Law Project this month found that nearly half of those who were unemployed for six months or longer were at least 45 years old. Those between 16 and 24 made up just 14 percent.

Tell that to Adam Blank, 24, who has been living with his girlfriend and her parents at their Martinsville home since losing his sales job at Best Buy a year and a half ago.

Blank, who graduated from Rutgers with a major in communications, says he feels like a burden sometimes, especially since his girlfriend, Tracy Rosen, 24, works full-time at a local nonprofit. He shows her family gratitude with small chores, like taking out the garbage, washing dishes, sweeping floors and doing laundry.

Still, he often feels inadequate.

"All I'm doing on an almost daily basis is sitting around the house trying to keep myself from going stir-crazy," said Blank, who dreams of starting a social media company.

When he is feeling particularly low, Blank said he turns to a tactic employed by prisoners of war in Vietnam: "They used to build dream houses in their head to help keep their sanity. It's really just imagining a place I can call my own."


LESSONS LEARNED

Meanwhile, Gwiazdowski, ever the optimist, says unemployment has taught him a few things.

He has learned, for example, how to quickly assess an interviewer's age and play up or down his work experience accordingly -- he doesn't want to appear "threatening" to a potential employer who is younger. He has learned that by occasionally deleting and reuploading his resume to job sites, his entry appears fresh.

"It's almost like a game," he said, laughing. "You are desperate, but you can't show it."

But there are days when he just can't find any humor in his predicament -- like when he finishes a great interview but receives no offer, or when he hears a fellow job seeker finally found work and feels a slight twinge of jealousy.

"That's what I'm missing -- putting on that shirt and tie in the morning and going to work," he said.

The memory of getting dressed for work is still so vivid, Gwiazdowski says, that he has to believe another job is just around the corner.

"You always have to hope that that morning when you get up, it's going to be the day," he said.

"Today is going to be the day that something is going to happen."

Leslie Kwoh may be reached at lkwoh@starledger.com or (973) 392-4147.

DrBuzzard Jun 13, 2010

I collect from the state of iowa, was on tier I and when the gov't recessed without passing extension, iowa stopped paying tier I claims that were already open, i was scheduled to be on tier I until july 15th, and its gone now, as a surprise, when i tried to claim my week this week i was notified. SURPRISE, talk about stress.

berganliz Jun 13, 2010

This is terrible....just wait until RIF'd teachers hit the unemployment offices....but then, this is what NJ wanted...fired teachers who are to blame for the worst recession our country has seen in 150 years...thanks GWB.....thanks Donald Rumsfeld......thanks Dick Cheney....thanks Karl "Miss Piggy" Rove...and thank you Mr. Big Boy himself...Gov Krispy Kreame!

rp121 Jun 13, 2010

For readers who care about this nation's unemployed- Call your Senators to pass HR 4213, the "Extenders" bill. Unfortunately, it does not add UI benefits weeks, however it DOES continue the emergency federal tiers of UI. If it does not pass this week many of us are cut off at 26 wks. No tier 1, 2 -nothing.

[Dec 13, 2017] Unemployment health hazard and stress

The longer you are unemployed, the more you are affected by these factors.
Notable quotes:
"... The good news is that only a relatively small number of people are seriously affected by the stress of unemployment to the extent they need medical assistance. Most people don't get to the serious levels of stress, and much as they loathe being unemployed, they suffer few, and minor, ill effects. ..."
"... Worries about income, domestic problems, whatever, the list is as long as humanity. The result of stress is a strain on the nervous system, and these create the physical effects of the situation over time. The chemistry of stress is complex, but it can be rough on the hormonal system. ..."
"... Not at all surprisingly, people under stress experience strong emotions. It's a perfectly natural response to what can be quite intolerable emotional strains. It's fair to say that even normal situations are felt much more severely by people already under stress. Things that wouldn't normally even be issues become problems, and problems become serious problems. Relationships can suffer badly in these circumstances, and that, inevitably, produces further crises. Unfortunately for those affected, these are by now, at this stage, real crises. ..."
"... Some people are stubborn enough and tough enough mentally to control their emotions ruthlessly, and they do better under these conditions. Even that comes at a cost, and although under control, the stress remains a problem. ..."
"... One of the reasons anger management is now a growth industry is because of the growing need for assistance with severe stress over the last decade. This is a common situation, and help is available. ..."
"... Depression is universally hated by anyone who's ever had it. ..."
"... Very important: Do not, under any circumstances, try to use drugs or alcohol as a quick fix. They make it worse, over time, because they actually add stress. Some drugs can make things a lot worse, instantly, too, particularly the modern made-in-a-bathtub variety. They'll also destroy your liver, which doesn't help much, either. ..."
"... You don't have to live in a gym to get enough exercise for basic fitness. A few laps of the pool, a good walk, some basic aerobic exercises, you're talking about 30-45 minutes a day. It's not hard. ..."
Dec 13, 2017 | www.cvtips.com

It's almost impossible to describe the various psychological impacts, because there are so many. There are sometimes serious consequences, including suicide, and, some would say worse, chronic depression.

There's not really a single cause and effect. It's a compound effect, and unemployment, by adding stress, affects people, often badly.

The world doesn't need any more untrained psychologists, and we're not pretending to give medical advice. That's for professionals. Everybody is different, and their problems are different. What we can do is give you an outline of the common problems, and what you can do about them.

The good news is that only a relatively small number of people are seriously affected by the stress of unemployment to the extent they need medical assistance. Most people don't get to the serious levels of stress, and much as they loathe being unemployed, they suffer few, and minor, ill effects.

For others, there are a series of issues, and the big three are:

Stress

Stress is Stage One. It's a natural result of the situation. Worries about income, domestic problems, whatever, the list is as long as humanity. The result of stress is a strain on the nervous system, and these create the physical effects of the situation over time. The chemistry of stress is complex, but it can be rough on the hormonal system.

Over an extended period, the body's natural hormonal balances are affected, and this can lead to problems. These are actually physical issues, but the effects are mental, and the first obvious effects are, naturally, emotional.

Anger, and other negative emotions

Not at all surprisingly, people under stress experience strong emotions. It's a perfectly natural response to what can be quite intolerable emotional strains. It's fair to say that even normal situations are felt much more severely by people already under stress. Things that wouldn't normally even be issues become problems, and problems become serious problems. Relationships can suffer badly in these circumstances, and that, inevitably, produces further crises. Unfortunately for those affected, these are by now, at this stage, real crises.

If the actual situation was already bad, this mental state makes it a lot worse. Constant aggravation doesn't help people to keep a sense of perspective. Clear thinking isn't easy when under constant stress.

Some people are stubborn enough and tough enough mentally to control their emotions ruthlessly, and they do better under these conditions. Even that comes at a cost, and although under control, the stress remains a problem.

One of the reasons anger management is now a growth industry is because of the growing need for assistance with severe stress over the last decade. This is a common situation, and help is available.

If you have reservations about seeking help, bear in mind it can't possibly be any worse than the problem.

Depression

Depression is universally hated by anyone who's ever had it. This is the next stage, and it's caused by hormonal imbalances which affect serotonin. It's actually a physical problem, but it has mental effects which are sometimes devastating, and potentially life threatening.

The common symptoms are:

It's a disgusting experience. No level of obscenity could possibly describe it. Depression is misery on a level people wouldn't conceive in a nightmare. At this stage the patient needs help, and getting it is actually relatively easy. It's convincing the person they need to do something about it that's difficult. Again, the mental state is working against the person. Even admitting there's a problem is hard for many people in this condition.

Generally speaking, a person who is trusted is the best person to tell anyone experiencing the onset of depression to seek help. Important: If you're experiencing any of those symptoms:

Very important: Do not, under any circumstances, try to use drugs or alcohol as a quick fix. They make it worse, over time, because they actually add stress. Some drugs can make things a lot worse, instantly, too, particularly the modern made-in-a-bathtub variety. They'll also destroy your liver, which doesn't help much, either.

Alcohol, in particular, makes depression much worse. Alcohol is a depressant, itself, and it's also a nasty chemical mix with all those stress hormones.

If you've ever had alcohol problems, or seen someone with alcohol wrecking their lives, depression makes things about a million times worse.

Just don't do it. Steer clear of any so-called stimulants, because they don't mix with antidepressants, either.

Unemployment and staying healthy

The above is what you need to know about the risks of unemployment to your health and mental well being.

These situations are avoidable.

Your best defense against the mental stresses and strains of unemployment, and their related problems is staying healthy.

We can promise you that is nothing less than the truth. The healthier you are, the better your defenses against stress, and the more strength you have to cope with situations.

Basic health is actually pretty easy to achieve:

Diet

Eat real food, not junk, and make sure you're getting enough food. Your body can't work with resources it doesn't have. Good food is a real asset, and you'll find you don't get tired as easily. You need the energy reserves.

Give yourself a good selection of food that you like, that's also worth eating.

The good news is that plain food is also reasonably cheap, and you can eat as much as you need. Basic meals are easy enough to prepare, and as long as you're getting all the protein, vegetables and minerals you need, you're pretty much covered.

You can also use a multivitamin cap, or broad spectrum supplements, to make sure you're getting all your trace elements. Also make sure you're getting the benefits of your food by taking acidophilus or eating yogurt regularly.

Exercise

You don't have to live in a gym to get enough exercise for basic fitness. A few laps of the pool, a good walk, some basic aerobic exercises, you're talking about 30-45 minutes a day. It's not hard.

Don't just sit and suffer

If anything's wrong, check it out when it starts, not six months later. Most medical conditions become serious when they're allowed to get worse.

For unemployed people the added risk is also that they may prevent you getting that job, or going for interviews. If something's causing you problems, get rid of it.

Nobody who's been through the blender of unemployment thinks it's fun.

Anyone who's really done it tough will tell you one thing:

Don't be a victim. Beat the problem, and you'll really appreciate the feeling.

[Dec 13, 2017] Being homeless is better than working for Amazon by Nichole Gracely

Notable quotes:
"... According to Amazon's metrics, I was one of their most productive order pickers -- I was a machine, and my pace would accelerate throughout the course of a shift. What they didn't know was that I stayed fast because if I slowed down for even a minute, I'd collapse from boredom and exhaustion ..."
"... toiling in some remote corner of the warehouse, alone for 10 hours, with my every move being monitored by management on a computer screen. ..."
"... ISS could simply deactivate a worker's badge and they would suddenly be out of work. They treated us like beggars because we needed their jobs. Even worse, more than two years later, all I see is: Jeff Bezos is hiring. ..."
"... I have never felt more alone than when I was working there. I worked in isolation and lived under constant surveillance ..."
"... That was 2012 and Amazon's labor and business practices were only beginning to fall under scrutiny. ..."
"... I received $200 a week for the following six months and I haven't had any source of regular income since those benefits lapsed. I sold everything in my apartment and left Pennsylvania as fast as I could. I didn't know how to ask for help. I didn't even know that I qualified for food stamps. ..."
Nov 28, 2014 | theguardian.com

wa8dzp:

Nichole Gracely has a master's degree and was one of Amazon's best order pickers. Now, after protesting the company, she's homeless.

I am homeless. My worst days now are better than my best days working at Amazon.

According to Amazon's metrics, I was one of their most productive order pickers -- I was a machine, and my pace would accelerate throughout the course of a shift. What they didn't know was that I stayed fast because if I slowed down for even a minute, I'd collapse from boredom and exhaustion.

During peak season, I trained incoming temps regularly. When that was over, I'd be an ordinary order picker once again, toiling in some remote corner of the warehouse, alone for 10 hours, with my every move being monitored by management on a computer screen.

Superb performance did not guarantee job security. ISS is the temp agency that provides warehouse labor for Amazon and they are at the center of the SCOTUS case Integrity Staffing Solutions vs. Busk. ISS could simply deactivate a worker's badge and they would suddenly be out of work. They treated us like beggars because we needed their jobs. Even worse, more than two years later, all I see is: Jeff Bezos is hiring.

I have never felt more alone than when I was working there. I worked in isolation and lived under constant surveillance. Amazon could mandate overtime and I would have to comply with any schedule change they deemed necessary, and if there was not any work, they would send us home early without pay. I started to fall behind on my bills.

At some point, I lost all fear. I had already been through hell. I protested Amazon. The gag order was lifted and I was free to speak. I spent my last days in a lovely apartment constructing arguments on discussion boards, writing articles and talking to reporters. That was 2012 and Amazon's labor and business practices were only beginning to fall under scrutiny. I walked away from Amazon's warehouse and didn't have any other source of income lined up.

I cashed in on my excellent credit, took out cards, and used them to pay rent and buy food because it would be six months before I could receive my first unemployment compensation check.

I received $200 a week for the following six months and I haven't had any source of regular income since those benefits lapsed. I sold everything in my apartment and left Pennsylvania as fast as I could. I didn't know how to ask for help. I didn't even know that I qualified for food stamps.

I furthered my Amazon protest while homeless in Seattle. When the Hachette dispute flared up I "flew a sign," street parlance for panhandling with a piece of cardboard: "I was an order picker at amazon.com. Earned degrees. Been published. Now, I'm homeless, writing and doing this. Anything helps."

I have made more money per word with my signs than I will probably ever earn writing, and I make more money per hour than I will probably ever be paid for my work. People give me money and offer well wishes and I walk away with a restored faith in humanity.

I flew my protest sign outside Whole Foods while Amazon corporate employees were on lunch break, and they gawked. I went to my usual flying spots around Seattle and made more money per hour protesting Amazon with my sign than I did while I worked with them. And that was in Seattle. One woman asked, "What are you writing?" I told her about the descent from working poor to homeless, income inequality, my personal experience. She mentioned Thomas Piketty's book, we chatted a little, she handed me $10 and wished me luck. Another guy said, "Damn, that's a great story! I'd read it," and handed me a few bucks.

[snip]

[Dec 13, 2017] Staff brand colleagues as 'lazy'

While lazy people do exist, this compulsive quest for "high performance" is one of the most disgusting features of neoliberalism, cemented by annual "performance reviews", which are a scam.
Aug 19, 2005 | BBC NEWS
An overwhelming majority of bosses and employees think that some of their colleagues consistently underperform.

An Investors in People survey found 75% of bosses and 80% of staff thought some colleagues were "dead wood" - and the main reason was thought to be laziness. Nearly half of employees added they worked closely with someone who they thought was lazy and not up to the job. However, four out of ten workers said that their managers did nothing about colleagues not pulling their weight.

According to Investors in People, the problem of employees not doing their jobs properly seemed to be more prevalent in larger organizations. The survey found that 84% of workers in organizations with more than 1,000 employees thought they had an underperforming colleague, compared with 50% in firms with fewer than 50 staff.

Tell tale signs

The survey identified the tell-tale signs of people not pulling their weight, according to both employers and employees, including:

Both employers and employees agreed that the major reason for someone failing in their job was sheer laziness. "Dead wood" employees can have a stark effect on their colleagues' physical and mental well-being, the survey found. Employees reported that they had to work longer hours to cover for shirking colleagues and felt undervalued as a result. Ultimately, working alongside a lazy colleague could prompt workers to look for a new job, the survey found.

But according to Nick Parfitt, spokesman for human resources firm Cubiks, an unproductive worker isn't necessarily lazy.

"It can be too easy to brand a colleague lazy," he said. "They may have genuine personal problems or are being asked to do a job that they have not been given the training to do. "The employer must look out for the warning signs of a worker becoming de-motivated - hold regular conversations and appraisals with staff."

However, Mr Parfitt added that ultimately lazy employees may have to be shown the door. "The cost of sacking someone can be colossal and damaging to team morale, but sometimes it may be the only choice."

[Dec 12, 2017] Can Uber Ever Deliver? Part Eleven: Annual Uber Losses Now Approaching $5 Billion

Notable quotes:
"... Total 2015 gross passenger payments were 200% higher than 2014, but Uber corporate revenue improved 300% because Uber cut the driver share of passenger revenue from 83% to 77%. This was an effective $500 million wealth transfer from drivers to Uber's investors. ..."
"... Uber's P&L gains were wiped out by higher non-EBIDTAR expense. Thus the 300% Uber revenue growth did not result in any improvement in Uber profit margins. ..."
"... In 2016, Uber unilaterally imposed much larger cuts in driver compensation, costing drivers an additional $3 billion. [6] Prior to Uber's market entry, the take home pay of big-city cab drivers in the US was in the $12-17/hour range, and these earnings were possible only if drivers worked 65-75 hours a week. ..."
"... An independent study of the net earnings of Uber drivers (after accounting for the costs of the vehicles they had to provide) in Denver, Houston and Detroit in late 2015 (prior to Uber's big 2016 cuts) found that driver earnings had fallen to the $10-13/hour range. [7] Multiple recent news reports have documented how Uber drivers are increasing unable to support themselves from their reduced share of passenger payments. [8] ..."
"... Since mass driver defections would cause passenger volume growth to collapse completely, Uber was forced to reverse these cuts in 2017 and increased the driver share from 68% to 80%. This meant that Uber's corporate revenue, which had grown over 300% in 2015 and over 200% in 2016 will probably only grow by about 15% in 2017. ..."
"... Socialize the losses, privatize the gains, VC-ize the subsidies. ..."
"... The cold hard truth is that Uber is backed into a corner with severely limited abilities to tweak the numbers on either the supply or the demand side: cut driver compensation and they trigger driver churn (as has already been demonstrated), increase fare prices for riders and riders defect to cheaper alternatives. ..."
"... "Growth and Efficiency" are the sine qua non of Neoliberalism. Kalanick's "hype brilliance" was to con the market with "revenue growth" and signs ..."
Dec 12, 2017 | www.nakedcapitalism.com

Uber lost $2.5 billion in 2015, probably lost $4 billion in 2016, and is on track to lose $5 billion in 2017.

The top line on the table below shows total passenger payments, which must be split between Uber corporate and its drivers. Driver gross earnings are substantially higher than actual take-home pay, as gross earnings must cover all the expenses drivers bear, including fuel, vehicle ownership, insurance and maintenance.

Most of the "profit" data released by Uber over time and discussed in the press is not true GAAP (generally accepted accounting principles) profit comparable to the net income numbers public companies publish but is EBIDTAR contribution. Companies have significant leeway as to how they calculate EBIDTAR (although it would exclude interest, taxes, depreciation, amortization) and the percentage of total costs excluded from EBIDTAR can vary significantly from quarter to quarter, given the impact of one-time expenses such as legal settlements and stock compensation. We only have true GAAP net profit results for 2014, 2015 and the 2nd/3rd quarters of 2017, but have EBIDTAR contribution numbers for all other periods. [5]

Uber had GAAP net income of negative $2.6 billion in 2015, and a negative profit margin of 132%. This is consistent with the negative $2.0 billion loss and (143%) margin for the year ending September 2015 presented in part one of the NC Uber series over a year ago.

No GAAP profit results for 2016 have been disclosed, but actual losses likely exceed $4 billion given the EBIDTAR contribution of negative $3.2 billion. Uber's GAAP losses for the 2nd and 3rd quarters of 2017 were over $2.5 billion, suggesting annual losses of roughly $5 billion.

While many Silicon Valley funded startups suffered large initial losses, none of them lost anything remotely close to $2.6 billion in their sixth year of operation and then doubled their losses to $5 billion in year eight. Reversing losses of this magnitude would require the greatest corporate financial turnaround in history.

No evidence of significant efficiency/scale gains; 2015 and 2016 margin improvements entirely explained by unilateral cuts in driver compensation, but losses soared when Uber had to reverse these cuts in 2017.

Total 2015 gross passenger payments were 200% higher than 2014, but Uber corporate revenue improved 300% because Uber cut the driver share of passenger revenue from 83% to 77%. This was an effective $500 million wealth transfer from drivers to Uber's investors. These driver compensation cuts improved Uber's EBIDTAR margin, but Uber's P&L gains were wiped out by higher non-EBIDTAR expense. Thus the 300% Uber revenue growth did not result in any improvement in Uber profit margins.
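That driver-share arithmetic can be checked directly. A rough sketch (payments are normalized to 1.0 for 2014; the percentages come from the article, the dollar scale does not):

```shell
# How cutting the driver share turns 200% payment growth into ~300%
# corporate revenue growth. Gross passenger payments: 2014 = 1.0.
take_2014=$(awk 'BEGIN { print 1.0 * (1 - 0.83) }')   # Uber kept 17% of payments
take_2015=$(awk 'BEGIN { print 3.0 * (1 - 0.77) }')   # payments 3x, Uber keeps 23%
growth=$(awk -v a="$take_2014" -v b="$take_2015" 'BEGIN { printf "%.0f", (b/a - 1) * 100 }')
echo "Uber corporate revenue growth: ${growth}%"      # 306%
```

The same mechanics run in reverse for 2017: restoring the driver share shrinks Uber's slice of each fare, so corporate revenue growth collapses even while passenger payments keep growing.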

In 2016, Uber unilaterally imposed much larger cuts in driver compensation, costing drivers an additional $3 billion. [6] Prior to Uber's market entry, the take home pay of big-city cab drivers in the US was in the $12-17/hour range, and these earnings were possible only if drivers worked 65-75 hours a week.

An independent study of the net earnings of Uber drivers (after accounting for the costs of the vehicles they had to provide) in Denver, Houston and Detroit in late 2015 (prior to Uber's big 2016 cuts) found that driver earnings had fallen to the $10-13/hour range. [7] Multiple recent news reports have documented how Uber drivers are increasingly unable to support themselves from their reduced share of passenger payments. [8]

A business model where profit improvement is hugely dependent on wage cuts is unsustainable, especially when take-home wages fall to (or below) minimum wage levels. Uber's primary focus has always been the rate of growth in gross passenger revenue, as this has been a major justification for its $68 billion valuation. This growth rate came under enormous pressure in 2017 given Uber's efforts to raise fares, major increases in driver turnover as wages fell, [9] and the avalanche of adverse publicity it was facing.

Since mass driver defections would cause passenger volume growth to collapse completely, Uber was forced to reverse these cuts in 2017 and increased the driver share from 68% to 80%. This meant that Uber's corporate revenue, which had grown over 300% in 2015 and over 200% in 2016, will probably only grow by about 15% in 2017.

MKS , December 12, 2017 at 6:19 am

"Uber's business model can never produce sustainable profits"

Two words not in my vocabulary are "Never" and "Always"; that is a pretty absolute statement in a non-absolute environment. The same environment has produced the "Silicon Valley Growth Model", with 15x earnings companies like NVIDIA, FB and Tesla (the average earnings/stock price ratio in the dot-com bubble was 10x). Will people pay ridiculous amounts of money for a company with no underlying fundamentals? You damn right they will! Please stop with the "I know all, nobody knows anything", especially about the psychology and irrationality of markets, which are made up of irrational people/investors/traders.

JohnnySacks , December 12, 2017 at 7:34 am

My thoughts exactly. Seems the only possible recovery for the investors is a perfectly engineered legendary pump and dump IPO scheme. Risky, but there's a lot of fools out there and many who would also like to get on board early in the ride in fear of missing out on all the money to be hoovered up from the greater fools. Count me out.

SoCal Rhino , December 12, 2017 at 8:30 am

The author clearly distinguishes between GAAP profitability and valuations, which is after all rather the point of the series. And he makes a more nuanced point than the half sentence you have quoted without context or with an indication that you omitted a portion. Did you miss the part about how Uber would have a strong incentive to share the evidence of a network effect or other financial story that pointed the way to eventual profit? Otherwise (my words) it is the classic sell at a loss, make it up with volume path to liquidation.

tegnost , December 12, 2017 at 9:52 am

apples and oranges comparison. nvidia has lots and lots of patented tech that produces revenue; facebook has a kajillion admittedly irrational users, but those users drive massive ad sales (as just one example of how that company capitalizes itself); and tesla makes an actual car, using technology that inspires its buyers (the put your money where your mouth is crowd). It can't be denied that whatever tesla's faults are, battery tech is not one of them, that intellectual property is worth a lot, and tesla's investors are in on that real business, profitable or otherwise.

Uber is an iphone app. They lose money and have no path to profitability (unless it's the theory you espouse that people are unintelligent, so even unintelligent ideas work to fleece them). This article touches on one of the great things about the time we now inhabit: uber drivers could bail en masse. There are two sides to low-attachment employees whom you can get rid of easily. The drivers can delete the uber app as soon as another iphone app comes along that gets them a better return.

allan , December 12, 2017 at 6:52 am

Yet another source (unintended) of subsidies for Uber, Lyft, etc., which might or might not have been mentioned earlier in the series:

Airports Are Losing Money as Ride-Hailing Services Grow [NYT]

For many air travelers, getting to and from the airport has long been part of the whole miserable experience. Do they drive and park in some distant lot? Take mass transit or a taxi? Deal with a rental car?

Ride-hailing services like Uber and Lyft are quickly changing those calculations. That has meant a bit less angst for travelers.

But that's not the case for airports. Travelers' changing habits, in fact, have begun to shake the airports' financial underpinnings. The money they currently collect from ride-hailing services does not compensate for the lower revenues from the other sources.

At the same time, some airports have had to add staff to oversee the operations of the ride-hailing companies, the report said. And with more ride-hailing vehicles on the roads outside terminals, there's more congestion.

Socialize the losses, privatize the gains, VC-ize the subsidies.

Thuto , December 12, 2017 at 6:55 am

The cold hard truth is that Uber is backed into a corner with severely limited abilities to tweak the numbers on either the supply or the demand side: cut driver compensation and they trigger driver churn (as has already been demonstrated), increase fare prices for riders and riders defect to cheaper alternatives. The only question is how long can they keep the show going before the lights go out, slick marketing and propaganda can only take you so far, and one assumes the dumb money has a finite supply of patience and will at some point begin asking the tough questions.

Louis Fyne , December 12, 2017 at 8:35 am

The irony is that Uber would have been a perfectly fine, very profitable mid-sized company if Uber stuck with its initial model -- sticking to dense cities with limited parking, limiting driver supply, and charging a premium price for door-to-door delivery, whether by livery or a regular sedan. And then perhaps branching into robo-cars.

But somehow Uber/board/Travis got suckered into the siren call of self-driving cars, triple-digit user growth, and being in the top 100 US cities and on every continent.

Thuto , December 12, 2017 at 11:30 am

I've shared a similar sentiment in one of the previous posts about Uber. But operating profitably in a decent-sized niche doesn't fit well with ambitions of global domination. For Uber to be "right-sized", an admission of folly would have to be made, its managers and investors would have to transcend the sunk cost fallacy in their strategic decision making, and said investors would have to accept massive hits on their invested capital. The cold, hard reality of being blindsided and kicked to the curb in the smartphone business forced RIM/Blackberry to right-size, and they may yet have a profitable future as an enterprise-facing software and services company. Uber would benefit from that form of sober-mindedness, but I wouldn't hold my breath.

David Carl Grimes , December 12, 2017 at 6:57 am

The question is: Why did Softbank invest in Uber?

Michael Fiorillo , December 12, 2017 at 9:33 am

I know nothing about Softbank or its management, but I do know that the Japanese were the dumb money rubes in the late '80s, overpaying for trophy real estate they lost billions on.

Until informed otherwise, that's my default assumption.

JimTan , December 12, 2017 at 10:50 am

Softbank possibly looking to buy more Uber shares at a 30% discount is very odd. Uber had a Series G funding round in June 2016 where a $3.5 billion investment from Saudi Arabia's Public Investment Fund resulted in its current $68 billion valuation. Now apparently Softbank wants to lead a new $6 billion funding round to buy the shares of Uber employees and early investors at a 30% discount from this last "valuation". It's odd because Saudi Arabia's Public Investment Fund has pledged $45 billion to SoftBank's Vision Fund, an amount which was supposed to come from the proceeds of its pending Aramco IPO. If the Uber bid is linked to SoftBank's Vision Fund, or KSA money, then it's not clear why this investor might be looking to literally 'double down' from $3.5 billion to $6 billion on a declining investment.

Yves Smith Post author , December 12, 2017 at 11:38 am

SoftBank has not yet invested. Its tender is still open. If it does not get enough shares at a price it likes, it won't invest.

As to why, I have no idea.

Robert McGregor , December 12, 2017 at 7:04 am

"Growth and Efficiency" are the sine qua non of Neoliberalism. Kalanick's "hype brilliance" was to con the market with "revenue growth" and signs of efficiency, and hopes of greater efficiency, and make most people just overlook the essential fact that Uber is the most unprofitable company of all time!

divadab , December 12, 2017 at 7:19 am

What comprises "Uber Expenses"? 2014 – $1.06 billion; 2015 $3.33 billion; 2016 $9.65 billion; forecast 2017 $11.418 billion!!!!!! To me this is the big question – what are they spending $10 billion per year on?

Also – why did driver share go from 68% in 2016 to 80% in 2017? If you use 68% as in 2016, 2017 Uber revenue is $11.808 billion, which means a bit better than break-even EBITDA, assuming Uber expenses are as stated at $11.428 billion.

Perhaps not so bleak as the article presents, although I would not invest in this thing.

Phil in Kansas City , December 12, 2017 at 7:55 am

I have the same question: What comprises over 11 billion dollars in expenses in 2017? Could it be they are paying out dividends to the early investors? Which would mean they are cannibalizing their own company for the sake of the VC! How long can this go on before they'll need a new infusion of cash?

lyman alpha blob , December 12, 2017 at 2:37 pm

The Saudis have thrown a few billion Uber's way and they aren't necessarily known as the smart money.

Maybe the pole dancers have started chipping in too, as they are for bitcoin.

Vedant Desai , December 12, 2017 at 10:37 am

The article does answer your 2nd question. Read this paragraph:-

Since mass driver defections would cause passenger volume growth to collapse completely, Uber was forced to reverse these cuts in 2017 and increased the driver share from 68% to 80%. This meant that Uber's corporate revenue, which had grown over 300% in 2015 and over 200% in 2016, will probably only grow by about 15% in 2017.

As for the 1st, read this line in the article:-

There are undoubtedly a number of things Uber could do to reduce losses at the margin, but it is difficult to imagine it could suddenly find the $4-5 billion in profit improvement needed merely to reach breakeven.

Louis Fyne , December 12, 2017 at 8:44 am

in addition to all the points listed in the article/comments, the absolute biggest flaw with Uber is that Uber HQ conditioned its customers on (a) cheap fares and (b) a car available within minutes (1-5 in a big city).

Those two are not mutually compatible in the long-term.

Alfred , December 12, 2017 at 9:49 am

Thus (a) "We cost less" and (b) "We're more convenient" -- aren't those also the advantages that Walmart claims and feeds as a steady diet to its ever hungry consumers? Often if not always, disruption may repose upon delusion.

Martin Finnucane , December 12, 2017 at 11:06 am

Uber's business model could never produce sustainable profits unless it was able to exploit significant anti-competitive market power.

Upon that dependent clause hangs the future of capitalism, and – dare I say it? – its inevitable demise.

Altandmain , December 12, 2017 at 11:09 am

When this Uber madness blows up, I wonder if people will finally begin to discuss the brutal reality of Silicon Valley's so called "disruption".

It is heavily built around the idea of economic exploitation. Uber drivers often make very little per hour driven once the true costs of operating an Uber, including vehicle depreciation, are factored in, especially if they don't get the surge money.

Instacart is another example. They are paying the delivery operators very little.

Jim A. , December 12, 2017 at 12:21 pm

At a fundamental level, I think that the Silicon Valley "disruption" model only works for markets (like software) where the marginal cost of production is de minimis and the products can be protected by IP laws. Volume and market power really work in those cases. But out here in meat-space, where actual material and labor are big inputs to each item sold, you can never just sit back on your laurels and rake in the money. Somebody else will always be able to come along and make an equivalent product. If they can do it more cheaply, you are in trouble.

Altandmain , December 12, 2017 at 5:40 pm

There aren't that many areas in goods and services where the marginal costs are very low.

Software is actually quite unique in that regard, costing merely the bandwidth and permanent storage space to store.

Let's see:

1. From the article, they cannot go public and have limited ways to raise more money. An IPO with its more stringent disclosure requirements would expose them.

2. They tried lowering driver compensation and found that model unsustainable.

3. There are no benefits to expanding in terms of economies of scale.

From where I am standing, it looks like a lot of industries gave similar barriers. Silicon Valley is not going to be able to disrupt those.

Tesla, another Silicon Valley company, seems to be struggling to mass-produce its Model 3 and deliver an electric car that breaks even and is reliable, while disrupting the industry in the ways that Elon Musk attempted to hype up.

So that basically leaves services and manufacturing out for Silicon Valley disruption.

Joe Bentzel , December 12, 2017 at 2:19 pm

UBER has become a "too big to fail" startup because of all the different tentacles of capital from various Tier 1 VCs and investment bankers.

VCs have admitted openly that UBER is a subsidized business, meaning it's product is sold below market value, and the losses reflect that subsidization. The whole "2 sided platform" argument is just marketecture to hustle more investors. It's a form of service "dumping" that puts legacy businesses into bankruptcy. Back during the dotcom bubble one popular investment banker (Paul Deninger) characterized this model as "Terrorist Competition", i.e. coffers full of invested cash to commoditize the market and drive out competition.

UBER is an absolute disaster that has forked the startup model in Silicon Valley in order to drive total dependence on venture capital by founders. And its current diversification into "autonomous vehicles", food delivery, et al are simply more evidence that the company will never be profitable due to its whacky "blitzscaling" approach of layering on new "businesses" prior to achieving "fit" in its current one.

It's economic model has also metastasized into a form of startup cancer that is killing Silicon Valley as a "technology" innovator. Now it's all cargo cult marketing BS tied to "strategic capital".

UBER is the victory of venture capital and user subsidized startups over creativity by real entrepreneurs.

It's shadow is long and that's why this company should be ..wait for it UNBUNDLED (the new silicon valley word attached to that other BS religion called "disruption"). Call it a great unbundling and you can break up this monster corp any way you want.

Naked Capitalism is a great website.

Phil in KC , December 12, 2017 at 3:20 pm

1. I Agree with your last point.

2. The elevator pitch for Uber: subsidize rides to attract customers, put the competition out of business, and then enjoy an unregulated monopoly, all while exploiting economically ignorant drivers–ahem–"partners."

3. But more than one can play that game, and

4. Cab and livery companies are finding ways to survive!

Phil in KC , December 12, 2017 at 3:10 pm

If subsidizing rides is counted as an expense, (not being an accountant, I would guess it so), then whether the subsidy goes to the driver or the passenger, that would account for the ballooning expenses, to answer my own question. Otherwise, the overhead for operating what Uber describes as a tech company should be minimal: A billion should fund a decent headquarters with staff, plus field offices in, say, 100 U.S. cities. However, their global pretensions are probably burning cash like crazy. On top of that, I wonder what the exec compensation is like?

After reading HH's initial series, I made a crude, back-of-the-envelope calculation that Uber would run out of money sometime in the third fiscal quarter of 2018, but that was based on assuming losses were stabilizing in the range of 3 billion a year. Not so, according to the article. I think crunch time is rapidly approaching. If so, then SoftBank's tender offer may look quite appetizing to VC firms and to any Uber employee able to cash in their options. I think there is a way to make a re-envisioned Uber profitable, and with a more independent board, they may be able to restructure the company to show a pathway to profitability before the IPO. But time is running out.

A not insignificant question is the recruitment and retention of the front-line "partners." It would seem to me that at some point, Uber will run out of economically ignorant drivers with good manners and nice cars. I would be very interested to know how many drivers give up Uber and other ride-sharing gigs once the 1099's start flying at the beginning of the year. One of the harsh realities of owning a business or being a contractor is the humble fact that you get paid LAST!

Jan Stickle , December 12, 2017 at 5:00 pm

We became instant Uber riders while spending holidays with relatives in San Diego. While their model is indeed unique from a rider perspective, it is the driver pool that fascinates me. These are not professional livery drivers, but rather freebooters of all stripes driving for various reasons. The remuneration they receive cannot possibly generate much income after expenses, never mind the problems associated with IRS filing as independent contractors.

One guy was just cruising listening to music; cooler to get paid for it than just sitting home! A young lady was babbling and gesticulating non stop about nothing coherent and appeared to be on some sort of stimulant. A foreign gentleman, very professional, drove for extra money when not at his regular job. He was the only one who had actually bought a new Prius for this gig, hoping to pay it off in two years.

This is indeed a brave new world. There was a period in Nicaragua just after the Contra war ended when citizens emerged from their homes and hit the streets in large numbers, desperately looking for income. Every car was a taxi and there was a bipedal mini Walmart at every city intersection as individuals sold everything and anything in a sort of euphoric optimism towards the future. Reality just hadn't caught up with them yet .

[Dec 09, 2017] How to rsync only a specific list of files - Stack Overflow

Notable quotes:
"... The filenames that are read from the FILE are all relative to the source dir ..."
Dec 09, 2017 | stackoverflow.com

ash, May 11, 2015 at 20:05

There is a flag --files-from that does exactly what you want. From man rsync :
--files-from=FILE

Using this option allows you to specify the exact list of files to transfer (as read from the specified FILE or - for standard input). It also tweaks the default behavior of rsync to make transferring just the specified files and directories easier:

The filenames that are read from the FILE are all relative to the source dir -- any leading slashes are removed and no ".." references are allowed to go higher than the source dir. For example, take this command:

rsync -a --files-from=/tmp/foo /usr remote:/backup

If /tmp/foo contains the string "bin" (or even "/bin"), the /usr/bin directory will be created as /backup/bin on the remote host. If it contains "bin/" (note the trailing slash), the immediate contents of the directory would also be sent (without needing to be explicitly mentioned in the file -- this began in version 2.6.4). In both cases, if the -r option was enabled, that dir's entire hierarchy would also be transferred (keep in mind that -r needs to be specified explicitly with --files-from, since it is not implied by -a). Also note that the effect of the (enabled by default) --relative option is to duplicate only the path info that is read from the file -- it does not force the duplication of the source-spec path (/usr in this case).

In addition, the --files-from file can be read from the remote host instead of the local host if you specify a "host:" in front of the file (the host must match one end of the transfer). As a short-cut, you can specify just a prefix of ":" to mean "use the remote end of the transfer". For example:

rsync -a --files-from=:/path/file-list src:/ /tmp/copy

This would copy all the files specified in the /path/file-list file that was located on the remote "src" host.

If the --iconv and --protect-args options are specified and the --files-from filenames are being sent from one host to another, the filenames will be translated from the sending host's charset to the receiving host's charset.

NOTE: sorting the list of files in the --files-from input helps rsync to be more efficient, as it will avoid re-visiting the path elements that are shared between adjacent entries. If the input is not sorted, some path elements (implied directories) may end up being scanned multiple times, and rsync will eventually unduplicate them after they get turned into file-list elements.

Nicolas Mattia, Feb 11, 2016 at 11:06

Note that you still have to specify the directory where the files listed are located, for instance: rsync -av --files-from=file-list . target/ for copying files from the current dir. – Nicolas Mattia Feb 11 '16 at 11:06

ash, Feb 12, 2016 at 2:25

Yes, and to reiterate: The filenames that are read from the FILE are all relative to the source dir . – ash Feb 12 '16 at 2:25

Michael, Nov 2, 2016 at 0:09

if the files-from file has anything starting with .. rsync appears to ignore the .. giving me an error like rsync: link_stat "/home/michael/test/subdir/test.txt" failed: No such file or directory (in this case running from the "test" dir and trying to specify "../subdir/test.txt" which does exist. – Michael Nov 2 '16 at 0:09

xxx,

When the list contains absolute paths, specify the source dir as / (note the trailing slash) to keep those paths intact. So your command would become something like below:
rsync -av --files-from=/path/to/file / /tmp/

This is useful when, for example, there are a large number of files and you want to copy them all to path x. You would find the files and send the output to a file, like below:

find /var/ -name '*.log' > file
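Putting the two steps together, here is a minimal end-to-end sketch (the temporary directories and .log names are illustrative, and it assumes rsync is installed):

```shell
# Build a relative file list with find, then let rsync copy exactly those
# files, recreating their directory structure under the destination.
src=$(mktemp -d); dst=$(mktemp -d); list=$(mktemp)
mkdir -p "$src/bin" "$src/lib"
echo a > "$src/bin/a.log"
echo b > "$src/lib/b.log"
echo c > "$src/skip.txt"                      # not in the list, so not copied

# Paths in the list are relative to the source dir
(cd "$src" && find . -name '*.log' | sed 's|^\./||') > "$list"

rsync -a --files-from="$list" "$src/" "$dst/"
ls "$dst"                                     # bin  lib
```

Sorting the list (e.g. piping find through sort) makes rsync slightly more efficient, as the man page excerpt above notes.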

[Dec 09, 2017] linux - What does the line '#!/bin/sh -e' do

Dec 09, 2017 | stackoverflow.com

That line defines what program will execute the given script. For sh, that line should start with the characters #!, like so:
#!/bin/sh -e

The -e flag's long name is errexit; it causes the script to exit immediately on the first command that fails (returns a non-zero status).
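The effect is easy to demonstrate with a small sketch (the /tmp paths are illustrative; invoking sh -e explicitly has the same effect as executing the script through its #! line):

```shell
# With -e, execution aborts at the first failing command, so "step 2"
# is never printed and the script exits with a non-zero status.
cat > /tmp/errexit_demo.sh <<'EOF'
echo "step 1"
false            # returns non-zero; errexit stops the script here
echo "step 2"    # never reached
EOF

sh -e /tmp/errexit_demo.sh > /tmp/errexit_out.txt || echo "exit status: $?"
cat /tmp/errexit_out.txt     # step 1
```

Without -e, the same script would print both lines and exit 0, silently ignoring the failure in the middle.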

[Dec 07, 2017] First Rule of Usability Don't Listen to Users

Notable quotes:
"... So, do users know what they want? No, no, and no. Three times no. ..."
Dec 07, 2017 | www.nngroup.com

But ultimately, the way to get user data boils down to the basic rules of usability

... ... ...

So, do users know what they want? No, no, and no. Three times no.

Finally, you must consider how and when to solicit feedback. Although it might be tempting to simply post a survey online, you're unlikely to get reliable input (if you get any at all). Users who see the survey and fill it out before they've used the site will offer irrelevant answers. Users who see the survey after they've used the site will most likely leave without answering the questions. One question that does work well in a website survey is "Why are you visiting our site today?" This question goes to users' motivation and they can answer it as soon as they arrive.

[Dec 07, 2017] The rogue DHCP server

Notable quotes:
"... from Don Watkins ..."
Dec 07, 2017 | opensource.com

from Don Watkins

I am a liberal arts person who wound up being a technology director. With the exception of 15 credit hours earned on my way to a Cisco Certified Network Associate credential, all of the rest of my learning came on the job. I believe that learning what not to do from real experiences is often the best teacher. However, those experiences can frequently come at the expense of emotional pain. Prior to my Cisco experience, I had very little experience with TCP/IP networking and the kinds of havoc I could create, albeit innocently, due to my lack of understanding of the nuances of routing and DHCP.

At the time our school network was an Active Directory domain with DHCP and DNS provided by a Windows 2000 server. All of our staff's access to email, the Internet, and network shares was served this way. I had been researching the use of the K12 Linux Terminal Server ( K12LTSP ) project and had built a Fedora Core box with a single network card in it. I wanted to see how well my new project worked, so without talking to my network support specialists I connected it to our main LAN segment. In a very short period of time our help desk phones were ringing with principals, teachers, and other staff who could no longer access their email, printers, shared directories, and more. I had no idea that the Windows clients would see another DHCP server on our network (my test computer) and pick up an IP address and DNS information from it.

I had unwittingly created a "rogue" DHCP server and was oblivious to the havoc that it would create. I shared with the support specialist what had happened and I can still see him making a bee-line for that rogue computer, disconnecting it from the network. All of our client computers had to be rebooted along with many of our switches which resulted in a lot of confusion and lost time due to my ignorance. That's when I learned that it is best to test new products on their own subnet.

[Dec 03, 2017] Business Has Killed IT With Overspecialization by Charlie Schluting

Highly recommended!
Notable quotes:
"... What happened to the old "sysadmin" of just a few years ago? We've split what used to be the sysadmin into application teams, server teams, storage teams, and network teams. There were often at least a few people, the holders of knowledge, who knew how everything worked, and I mean everything. ..."
"... Now look at what we've done. Knowledge is so decentralized we must invent new roles to act as liaisons between all the IT groups. Architects now hold much of the high-level "how it works" knowledge, but without knowing how any one piece actually does work. In organizations with more than a few hundred IT staff and developers, it becomes nearly impossible for one person to do and know everything. This movement toward specializing in individual areas seems almost natural. That, however, does not provide a free ticket for people to turn a blind eye. ..."
"... Does your IT department function as a unit? Even 20-person IT shops have turf wars, so the answer is very likely, "no." As teams are split into more and more distinct operating units, grouping occurs. One IT budget gets split between all these groups. Often each group will have a manager who pitches his needs to upper management in hopes they will realize how important the team is. ..."
"... The "us vs. them" mentality manifests itself at all levels, and it's reinforced by management having to define each team's worth in the form of a budget. One strategy is to illustrate a doomsday scenario. If you paint a bleak enough picture, you may get more funding. Only if you are careful enough to illustrate the failings are due to lack of capital resources, not management or people. A manager of another group may explain that they are not receiving the correct level of service, so they need to duplicate the efforts of another group and just implement something themselves. On and on, the arguments continue. ..."
Apr 07, 2010 | Enterprise Networking Planet

What happened to the old "sysadmin" of just a few years ago? We've split what used to be the sysadmin into application teams, server teams, storage teams, and network teams. There were often at least a few people, the holders of knowledge, who knew how everything worked, and I mean everything. Every application, every piece of network gear, and how every server was configured -- these people could save a business in times of disaster.

Now look at what we've done. Knowledge is so decentralized we must invent new roles to act as liaisons between all the IT groups. Architects now hold much of the high-level "how it works" knowledge, but without knowing how any one piece actually does work. In organizations with more than a few hundred IT staff and developers, it becomes nearly impossible for one person to do and know everything. This movement toward specializing in individual areas seems almost natural. That, however, does not provide a free ticket for people to turn a blind eye.

Specialization

You know the story: Company installs new application, nobody understands it yet, so an expert is hired. Often, the person with a certification in using the new application only really knows how to run that application. Perhaps they aren't interested in learning anything else, because their skill is in high demand right now. And besides, everything else in the infrastructure is run by people who specialize in those elements. Everything is taken care of.

Except, how do these teams communicate when changes need to take place? Are the storage administrators teaching the Windows administrators about storage multipathing, or, worse, logging in and setting it up because it's faster for the storage gurus to do it themselves? A fundamental level of knowledge is often lacking, which makes it very difficult for teams to brainstorm about new ways to evolve IT services. The business environment has made it OK for IT staffers to specialize and only learn one thing.

If you hire someone certified in the application, operating system, or network vendor you use, that is precisely what you get. Certifications may be a nice filter to quickly identify who has direct knowledge in the area you're hiring for, but often they indicate specialization or compensation for lack of experience.

Resource Competition

Does your IT department function as a unit? Even 20-person IT shops have turf wars, so the answer is very likely, "no." As teams are split into more and more distinct operating units, grouping occurs. One IT budget gets split between all these groups. Often each group will have a manager who pitches his needs to upper management in hopes they will realize how important the team is.

The "us vs. them" mentality manifests itself at all levels, and it's reinforced by management having to define each team's worth in the form of a budget. One strategy is to illustrate a doomsday scenario. If you paint a bleak enough picture, you may get more funding. Only if you are careful enough to illustrate the failings are due to lack of capital resources, not management or people. A manager of another group may explain that they are not receiving the correct level of service, so they need to duplicate the efforts of another group and just implement something themselves. On and on, the arguments continue.

Most often, I've seen competition between server groups result in horribly inefficient uses of hardware. For example, what happens in your organization when one team needs more server hardware? Assume that another team has five unused servers sitting in a blade chassis. Does the answer change? No, it does not. Even in test environments, sharing doesn't often happen between IT groups.

With virtualization, some aspects of resource competition get better and some remain the same. When first implemented, most groups will be running their own type of virtualization for their platform. The next step, I've most often seen, is for test servers to get virtualized. If a new group is formed to manage the virtualization infrastructure, virtual machines can be allocated to various application and server teams from a central pool and everyone is now sharing. Or, they begin sharing and then demand their own physical hardware to be isolated from others' resource hungry utilization. This is nonetheless a step in the right direction. Auto migration and guaranteed resource policies can go a long way toward making shared infrastructure, even between competing groups, a viable option.

Blamestorming

The most damaging side effect of splitting into too many distinct IT groups is the reinforcement of an "us versus them" mentality. Aside from the notion that specialization creates a lack of knowledge, blamestorming is what this article is really about. When a project is delayed, it is all too easy to blame another group. The SAN people didn't allocate storage on time, so another team was delayed. That is the timeline of the project, so all work halted until that hiccup was resolved. Having someone else to blame when things get delayed makes it all too easy to simply stop working for a while.

More related to the initial points at the beginning of this article, perhaps, is the blamestorm that happens after a system outage.

Say an ERP system becomes unresponsive a few times throughout the day. The application team says it's just slowing down, and they don't know why. The network team says everything is fine. The server team says the application is "blocking on IO," which means it's a SAN issue. The SAN team says there is nothing wrong, and other applications on the same devices are fine. You've run through nearly every team, but still have no answer. The SAN people don't have access to the application servers to help diagnose the problem. The server team doesn't even know how the application runs.

See the problem? Specialized teams are distinct and by nature adversarial. Specialized staffers often relegate themselves into a niche knowing that as long as they continue working at large enough companies, "someone else" will take care of all the other pieces.

I unfortunately don't have an answer to this problem. Maybe rotating employees between departments will help. They gain knowledge and also get to know other people, which should lessen the propensity to view them as outsiders.

[Dec 03, 2017] IT workers voices heard in the Senate, confidentially

The resentment against outsourcing had been brewing for a long time.
Notable quotes:
"... Much of the frustration focused on the IT layoffs at Southern California Edison , which is cutting 500 IT workers after hiring two offshore outsourcing firms. This has become the latest example for critics of the visa program's capacity for abuse. ..."
"... Infosys whistleblower Jay Palmer, who testified, and is familiar with the displacement process, told Sessions that these workers will get sued if they speak out. "That's the fear and intimidation that these people go through - they're blindsided," said Palmer. ..."
"... Moreover, if IT workers refuse to train their foreign replacement, "they are going to be terminated with cause, which means they won't even get their unemployment insurance," said Ron Hira, an associate professor at Howard University, who also testified. Affected tech workers who speak out publicly and use their names, "will be blackballed from the industry," he said. ..."
"... Hatch, who is leading the effort to increase the H-1B cap, suggested a willingness to raise wage levels for H-1B dependent employers. They are exempt from U.S. worker protection rules if the H-1B worker is paid at least $60,000 or has a master's degree, a figure that was set in law in 1998. Hatch suggested a wage level of $95,000. ..."
"... Sen. Dick Durbin, (Dem-Ill.), who has joined with Grassley on legislation to impose some restrictions on H-1B visa use -- particularly in offshoring -- has argued for a rule that would keep large firms from having more than 50% of their workers on the visa. This so-called 50/50 rule, as Durbin has noted, has drawn much criticism from India, where most of the affected companies are located. ..."
"... "I want to put the H-1B factories out of business," said Durbin. ..."
"... Hal Salzman, a Rutgers University professor who studies STEM (Science, Technology, Engineering and Math) workforce issues, told the committee that the IT industry now fills about two-thirds of its entry-level positions with guest workers. "At the same time, IT wages have stagnated for over a decade," he said. ..."
"... H-1B supporters use demand for the visa - which will exceed the 85,000 cap -- as proof of economic demand. But Salzman argues that U.S. colleges already graduate more scientists and engineers than find employment in those fields, about 200,000 more. ..."
Mar 18, 2015 | Network World

A Senate Judiciary Committee hearing today on the H-1B visa offered up a stew of policy arguments, positioning and frustration.

Much of the frustration focused on the IT layoffs at Southern California Edison, which is cutting 500 IT workers after hiring two offshore outsourcing firms. This has become the latest example for critics of the visa program's capacity for abuse.

Sen. Charles Grassley (R-Iowa), the committee chair who has long sought H-1B reforms, said he invited Southern California Edison officials "to join us today" and testify. "I thought they would want to defend their actions and explain why U.S. workers have been left high and dry," said Grassley. "Unfortunately, they declined my invitation."

The hearing, by the people picked to testify, was weighted toward critics of the program, prompting a response by industry groups.

Compete America, the Consumer Electronics Association, FWD.us, the U.S. Chamber of Commerce and many others submitted a letter to the committee to rebut the "flawed studies" and "non-representative anecdotes used to create myths that suggest immigration harms American and American workers."

The claim that H-1B critics are using "anecdotes" to make their points (which include layoff reports at firms such as Edison) is a naked example of the pot calling the kettle black. The industry musters anecdotal stories in support of its positions readily and often. It makes available to the press and congressional committees people who came to the U.S. on an H-1B visa who started a business or took on a critical role in a start-up. These people are free to share their often compelling and admirable stories.

The voices of the displaced, who may be in fear of losing their homes, are thwarted by severance agreements.

The committee did hear from displaced workers, including some at Southern California Edison. But the communications with these workers are being kept confidential.

"I got the letters here from people, without the names," said Sen. Jeff Sessions (R-Ala.). "If they say what they know and think about this, they will lose the buy-outs."

Infosys whistleblower Jay Palmer, who testified and is familiar with the displacement process, told Sessions that these workers will get sued if they speak out. "That's the fear and intimidation that these people go through - they're blindsided," said Palmer.

Moreover, if IT workers refuse to train their foreign replacement, "they are going to be terminated with cause, which means they won't even get their unemployment insurance," said Ron Hira, an associate professor at Howard University, who also testified. Affected tech workers who speak out publicly and use their names, "will be blackballed from the industry," he said.

While lawmakers voiced either strong support or criticism of the program, there was interest in crafting legislation that impose some restrictions on H-1B use.

"America and American companies need more high-skilled workers - this is an undeniable fact," said Sen. Orrin Hatch (R-Utah). "America's high-skilled worker shortage has become a crisis."

Hatch, who is leading the effort to increase the H-1B cap, suggested a willingness to raise wage levels for H-1B dependent employers. They are exempt from U.S. worker protection rules if the H-1B worker is paid at least $60,000 or has a master's degree, a figure that was set in law in 1998. Hatch suggested a wage level of $95,000.

Sen. Dick Durbin, (Dem-Ill.), who has joined with Grassley on legislation to impose some restrictions on H-1B visa use -- particularly in offshoring -- has argued for a rule that would keep large firms from having more than 50% of their workers on the visa. This so-called 50/50 rule, as Durbin has noted, has drawn much criticism from India, where most of the affected companies are located.

"I want to put the H-1B factories out of business," said Durbin.

Durbin got some support for the 50/50 rule from one person testifying in support of expanding the cap, Bjorn Billhardt, the founder and president of Enspire Learning, an Austin-based company. Enspire creates learning development tools; Billhardt came to the U.S. as an exchange student and went from an H-1B visa to a green card to, eventually, citizenship.

"I actually think that's a reasonable provision," said Billhardt of the 50% visa limit. He said it could help, "quite a bit." At the same time, he urged lawmakers to raise the cap to end the lottery system now used to distribute visas once that cap is reached.

Today's hearing went well beyond the impact of H-1B use by outsourcing firms to the displacement of workers overall.

Hal Salzman, a Rutgers University professor who studies STEM (Science, Technology, Engineering and Math) workforce issues, told the committee that the IT industry now fills about two-thirds of its entry-level positions with guest workers. "At the same time, IT wages have stagnated for over a decade," he said.

H-1B supporters use demand for the visa - which will exceed the 85,000 cap -- as proof of economic demand. But Salzman argues that U.S. colleges already graduate more scientists and engineers than find employment in those fields, about 200,000 more.

"Asking domestic graduates, both native-born and immigrant, to compete with guest workers on wages is not a winning strategy for strengthening U.S. science, technology and innovation," said Salzman.

See also

[Dec 02, 2017] BASH Shell How To Redirect stderr To stdout ( redirect stderr to a File )

Dec 02, 2017 | www.cyberciti.biz

BASH Shell: How To Redirect stderr To stdout ( redirect stderr to a File ). Posted on March 12, 2008 in Categories BASH Shell, Linux, UNIX; last updated March 12, 2008.

Q. How do I redirect stderr to stdout? How do I redirect stderr to a file?

A. Bash and other modern shells provide an I/O redirection facility. There are 3 default standard files (standard streams) open:

[a] stdin – Used to get input (keyboard), i.e. data going into a program.

[b] stdout – Used to write information (screen).

[c] stderr – Used to write error messages (screen).

Understanding I/O streams numbers

The Unix / Linux standard I/O streams with numbers:

Handle  Name    Description
0       stdin   Standard input
1       stdout  Standard output
2       stderr  Standard error

Redirecting the standard error stream to a file

The following will redirect program error message to a file called error.log:
$ program-name 2> error.log
$ command1 2> error.log

Redirecting the standard error (stderr) and stdout to file

Use the following syntax:
$ command-name &>file
OR
$ command > file-name 2>&1
Another useful example:
# find /usr/home -name .profile 2>&1 | more

Redirect stderr to stdout

Use the command as follows:
$ command-name 2>&1
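The forms above can be combined. A quick way to see where each stream ends up is a throwaway function that writes one line to each (the function name and file paths here are arbitrary, chosen for illustration):

```shell
# A tiny command that writes one line to stdout and one to stderr.
both() { echo "to stdout"; echo "to stderr" >&2; }

both > /tmp/out.log 2> /tmp/err.log   # split: stdout and stderr land in separate files
both > /tmp/all.log 2>&1              # merge: stderr follows stdout into one file
both 2>&1 > /tmp/only-stdout.log      # order matters: stderr still prints to the terminal

cat /tmp/err.log    # prints: to stderr
```

Note that `2>&1` must come after the stdout redirection if you want both streams in the same file; the shell processes redirections left to right.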

[Dec 01, 2017] NSA hacks system administrators, new leak reveals

Highly recommended!
The "I hunt sys admins" policy is the most reasonable one if you want to get into some corporate network. So republication of this three-year-old post is just a reminder. Any sysadmin who accesses a corporate network not from a dedicated computer using VPN (a corporate laptop) is endangering the corporation. As simple as that. The level of non-professionalism demonstrated by Hillary Clinton's IT staff suggests that this can be a problem in government too. After all, the Snowden documents are now studied by all major intelligence agencies of the world.
This also outlines the main danger of "shadow IT".
Notable quotes:
"... Journalist Ryan Gallagher reported that Edward Snowden , a former sys admin for NSA contractor Booz Allen Hamilton, provided The Intercept with the internal documents, including one from 2012 that's bluntly titled "I hunt sys admins." ..."
"... "Who better to target than the person that already has the 'keys to the kingdom'?" ..."
"... "They were written by an NSA official involved in the agency's effort to break into foreign network routers, the devices that connect computer networks and transport data across the Internet," ..."
"... "By infiltrating the computers of system administrators who work for foreign phone and Internet companies, the NSA can gain access to the calls and emails that flow over their networks." ..."
"... The latest leak suggests that some NSA analysts took a much different approach when tasked with trying to collect signals intelligence that otherwise might not be easily available. According to the posts, the author advocated for a technique that involves identifying the IP address used by the network's sys admin, then scouring other NSA tools to see what online accounts used those addresses to log-in. Then by using a ..."
"... that tricks targets into installing malware by being misdirected to fake Facebook servers, the intelligence analyst can hope that the sys admin's computer is sufficiently compromised and exploited. ..."
"... Once the NSA has access to the same machine a sys admin does, American spies can mine for a trove of possibly invaluable information, including maps of entire networks, log-in credentials, lists of customers and other details about how systems are wired. In turn, the NSA has found yet another way to, in theory, watch over all traffic on a targeted network. ..."
"... "Up front, sys admins generally are not my end target. My end target is the extremist/terrorist or government official that happens to be using the network some admin takes care of," the NSA employee says in the documents. ..."
"... "A key part of the protections that apply to both US persons and citizens of other countries is the mandate that information be in support of a valid foreign intelligence requirement, and comply with US Attorney General-approved procedures to protect privacy rights." ..."
"... Coincidentally, outgoing-NSA Director Keith Alexander said last year that he was working on drastically cutting the number of sys admins at that agency by upwards of 90 percent - but didn't say it was because they could be exploited by similar tactics waged by adversarial intelligence groups. ..."
Mar 21, 2014 | news.slashdot.org

In its quest to take down suspected terrorists and criminals abroad, the United States National Security Agency has adopted the practice of hacking the system administrators that oversee private computer networks, new documents reveal.

The Intercept has published a handful of leaked screenshots taken from an internal NSA message board where one spy agency specialist spoke extensively about compromising not the computers of specific targets, but rather the machines of the system administrators who control entire networks.

Journalist Ryan Gallagher reported that Edward Snowden, a former sys admin for NSA contractor Booz Allen Hamilton, provided The Intercept with the internal documents, including one from 2012 that's bluntly titled "I hunt sys admins."

According to the posts - some labeled "top secret" - NSA staffers should not shy away from hacking sys admins: a successful offensive mission waged against an IT professional with extensive access to a privileged network could provide the NSA with unfettered capabilities, the analyst acknowledged.

"Who better to target than the person that already has the 'keys to the kingdom'?" one of the posts reads.

"They were written by an NSA official involved in the agency's effort to break into foreign network routers, the devices that connect computer networks and transport data across the Internet," Gallagher wrote for the article published late Thursday. "By infiltrating the computers of system administrators who work for foreign phone and Internet companies, the NSA can gain access to the calls and emails that flow over their networks."

Since last June, classified NSA materials taken by Snowden and provided to certain journalists have exposed an increasing number of previously-secret surveillance operations that range from purposely degrading international encryption standards and implanting malware in targeted machines, to tapping into fiber-optic cables that transfer internet traffic and even vacuuming up data as its moved into servers in a decrypted state.

The latest leak suggests that some NSA analysts took a much different approach when tasked with trying to collect signals intelligence that otherwise might not be easily available. According to the posts, the author advocated for a technique that involves identifying the IP address used by the network's sys admin, then scouring other NSA tools to see what online accounts used those addresses to log-in. Then by using a previously-disclosed NSA tool that tricks targets into installing malware by being misdirected to fake Facebook servers, the intelligence analyst can hope that the sys admin's computer is sufficiently compromised and exploited.

Once the NSA has access to the same machine a sys admin does, American spies can mine for a trove of possibly invaluable information, including maps of entire networks, log-in credentials, lists of customers and other details about how systems are wired. In turn, the NSA has found yet another way to, in theory, watch over all traffic on a targeted network.

"Up front, sys admins generally are not my end target. My end target is the extremist/terrorist or government official that happens to be using the network some admin takes care of," the NSA employee says in the documents.

When reached for comment by The Intercept, NSA spokesperson Vanee Vines said that, "A key part of the protections that apply to both US persons and citizens of other countries is the mandate that information be in support of a valid foreign intelligence requirement, and comply with US Attorney General-approved procedures to protect privacy rights."

Coincidentally, outgoing-NSA Director Keith Alexander said last year that he was working on drastically cutting the number of sys admins at that agency by upwards of 90 percent - but didn't say it was because they could be exploited by similar tactics waged by adversarial intelligence groups. Gen. Alexander's decision came just weeks after Snowden - previously one of around 1,000 sys admins working on the NSA's networks, according to Reuters - walked away from his role managing those networks with a trove of classified information.

[Nov 30, 2017] Will Robots Kill the Asian Century

This article is two years old and not much happened during those two years. But there is still a chance that highly automated factories can make manufacturing in the USA profitable again. The problem is that they will be even more profitable in East Asia ;-)
Notable quotes:
"... The National Interest ..."
The National Interest

The rise of technologies such as 3-D printing and advanced robotics means that the next few decades for Asia's economies will not be as easy or promising as the previous five.

OWEN HARRIES, the first editor, together with Robert Tucker, of The National Interest, once reminded me that experts-economists, strategists, business leaders and academics alike-tend to be relentless followers of intellectual fashion, and the learned, as Harold Rosenberg famously put it, a "herd of independent minds." Nowhere is this observation more apparent than in the prediction that we are already into the second decade of what will inevitably be an "Asian Century"-a widely held but rarely examined view that Asia's continued economic rise will decisively shift global power from the Atlantic to the western Pacific Ocean.

No doubt the numbers appear quite compelling. In 1960, East Asia accounted for a mere 14 percent of global GDP; today that figure is about 27 percent. If linear trends continue, the region could account for about 36 percent of global GDP by 2030 and over half of all output by the middle of the century. As if symbolic of a handover of economic preeminence, China, which only accounted for about 5 percent of global GDP in 1960, will likely surpass the United States as the largest economy in the world over the next decade. If past record is an indicator of future performance, then the "Asian Century" prediction is close to a sure thing.

[Nov 29, 2017] Take This GUI and Shove It

Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI.
Notable quotes:
"... Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI. ..."
"... What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers. ..."
"... AIX's SMIT did this, or rather it wrote the commands that it executed to achieve what you asked it to do. This meant that you could learn: look at what it did and find out about which CLI commands to run. You could also take them, build them into a script, copy elsewhere, ... I liked SMIT. ..."
"... Cisco's GUI stuff doesn't really generate any scripts, but the commands it creates are the same things you'd type into a CLI. And the resulting configuration is just as human-readable (barring any weird naming conventions) as one built using the CLI. I've actually learned an awful lot about the Cisco CLI by using their GUI. ..."
"... Microsoft's more recent tools are also doing this. Exchange 2007 and newer, for example, are really completely driven by the PowerShell CLI. The GUI generates commands and just feeds them into PowerShell for you. So you can again issue your commands through the GUI, and learn how you could have done it in PowerShell instead. ..."
"... Moreover, the GUI authors seem to have a penchant to find new names for existing CLI concepts. Even worse, those names are usually inappropriate vagueries quickly cobbled together in an off-the-cuff afterthought, and do not actually tell you where the doodad resides in the menu system. With a CLI, the name of the command or feature set is its location. ..."
"... I have a cheap router with only a web gui. I wrote a two line bash script that simply POSTs the right requests to URL. Simply put, HTTP interfaces, especially if they implement the right response codes, are actually very nice to script. ..."
Slashdot

Deep End's Paul Venezia speaks out against the overemphasis on GUIs in today's admin tools, saying that GUIs are fine and necessary in many cases, but only after a complete CLI is in place, and that they cannot interfere with the use of the CLI, only complement it. Otherwise, the GUI simply makes easy things easy and hard things much harder. He writes, 'If you have to make significant, identical changes to a bunch of Linux servers, is it easier to log into them one-by-one and run through a GUI or text-menu tool, or write a quick shell script that hits each box and either makes the changes or simply pulls down a few new config files and restarts some services? And it's not just about conservation of effort - it's also about accuracy. If you write a script, you're certain that the changes made will be identical on each box. If you're doing them all by hand, you aren't.'"
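The "quick shell script" Venezia describes might look like this minimal sketch. The host list, config path, and service name are invented for illustration, not taken from the article:

```shell
#!/bin/bash
# Push one config file to every box and restart the affected service.
# HOSTS, the config path, and the service name are placeholders.
HOSTS="web1 web2 web3"
for h in $HOSTS; do
    scp ./ntp.conf "root@$h:/etc/ntp.conf" \
        && ssh "root@$h" 'systemctl restart ntpd' \
        || echo "FAILED: $h" >&2
done
```

Every host gets the identical file, and a failure is reported instead of silently skipped, which is exactly the accuracy argument being made here.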

alain94040 (785132)

Here is a Link to the print version of the article [infoworld.com] (that conveniently fits on 1 page instead of 3).

Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI.

A bad GUI with no CLI is the worst of both worlds; the author of the article got that right. The 80/20 rule applies: 80% of the work is common to everyone, and should be offered with a GUI. And for the 20% that is custom to each sysadmin, well, use the CLI.

maxwell demon:

What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers.

0123456 (636235) writes:

What would be nice is if the GUI could automatically create a shell script doing the change.

While it's not quite the same thing, our GUI-based home router has an option to download the config as a text file so you can automatically reconfigure it from that file if it has to be reset to defaults. You could presumably use sed to change IP addresses, etc, and copy it to a different router. Of course it runs Linux.
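Scripting that kind of web-only interface usually comes down to replaying the same HTTP requests the GUI sends. A sketch of the download/edit/restore cycle described above, with an entirely hypothetical router address, credentials, and CGI endpoint names (real routers differ):

```shell
# Pull the current config, rewrite the subnet with sed, and push it back.
# The host, login, and endpoint names below are invented; real firmware varies.
curl -s -u admin:secret "http://192.168.1.1/backup.cgi" -o router.cfg
sed 's/192\.168\.1\./192.168.2./g' router.cfg > router-new.cfg
curl -s -u admin:secret -F "config=@router-new.cfg" "http://192.168.1.1/restore.cgi"
```

The sed step is the interesting part: because the config is plain text, the same edited file can be pushed to a different router of the same model.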

Alain Williams:

AIX's SMIT did this, or rather it wrote the commands that it executed to achieve what you asked it to do. This meant that you could learn: look at what it did and find out about which CLI commands to run. You could also take them, build them into a script, copy elsewhere, ... I liked SMIT.

Ephemeriis:

What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers.

Cisco's GUI stuff doesn't really generate any scripts, but the commands it creates are the same things you'd type into a CLI. And the resulting configuration is just as human-readable (barring any weird naming conventions) as one built using the CLI. I've actually learned an awful lot about the Cisco CLI by using their GUI.

We've just started working with Aruba hardware. Installed a mobility controller last week. They've got a GUI that does something similar. It's all a pretty web-based front-end, but it again generates CLI commands and a human-readable configuration. I'm still very new to the platform, but I'm already learning about their CLI through the GUI. And getting work done that I wouldn't be able to if I had to look up the CLI commands for everything.

Microsoft's more recent tools are also doing this. Exchange 2007 and newer, for example, are really completely driven by the PowerShell CLI. The GUI generates commands and just feeds them into PowerShell for you. So you can again issue your commands through the GUI, and learn how you could have done it in PowerShell instead.

Anpheus:

Just about every Microsoft tool newer than 2007 does this. Virtual machine manager, SQL Server has done it for ages, I think almost all the system center tools do, etc.

It's a huge improvement.

PoV:

All good admins document their work (don't they? DON'T THEY?). With a CLI or a script that's easy: it comes down to "log in as user X, change to directory Y, run script Z with arguments A B and C - the output should look like D". Try that when all you have is a GLUI (like a GUI, but you get stuck): open this window, select that option, drag a slider, check these boxes, click Yes, three times. The output might look a little like this blurry screen shot and the only record of a successful execution is a window that disappears as soon as the application ends.

I suppose the Linux community should be grateful that Windows made the fundamental systems-design error of making everything graphical. Without that basic failure, Linux might never have gotten even the toe-hold it has now.

skids:

I think this is a stronger point than the OP: GUIs do not lead to good documentation. In fact, GUIs pretty much are limited to procedural documentation like the example you gave.

The best they can do as far as actual documentation, where the precise effect of all the widgets is explained, is a screenshot with little quote bubbles pointing to each doodad. That's a ridiculous way to document.

This is as opposed to a command reference which can organize, usually in a pretty sensible fashion, exact descriptions of what each command does.

Moreover, the GUI authors seem to have a penchant for finding new names for existing CLI concepts. Even worse, those names are usually inappropriate vagaries cobbled together as an off-the-cuff afterthought, and they do not actually tell you where the doodad resides in the menu system. With a CLI, the name of the command or feature set is its location.

Not that good command references are even the norm by today's pathetic standards. Even the big boys like Cisco have shown major degradation in the quality of their documentation over the last decade.

pedantic bore:

I think the author might not fully understand who most admins are. They're people who couldn't write a shell script if their lives depended on it, because they've never had to. GUI-dependent users become GUI-dependent admins.

As a percentage of computer users, people who can actually navigate a CLI are an ever-diminishing group.

arth1:

NetworkManager's habit of rewriting /etc/resolv.conf is a case in point. Disabling it on a Red Hat-style system:

/etc/init.d/NetworkManager stop
chkconfig NetworkManager off
chkconfig network on
vi /etc/sysconfig/network
vi /etc/sysconfig/network-scripts/ifcfg-eth0

At least they named it NetworkManager, so experienced admins could recognize it as a culprit. Anything named in CamelCase is almost invariably written by new school programmers who don't grok the Unix toolbox concept and write applications instead of tools, and the bloated drivel is usually best avoided.

Darkness404 (1287218) writes: on Monday October 04, @07:21PM (#33789446)

There are more and more small businesses (5, 10 or so employees) realizing that they can get things done more easily if they have a server. Because the business can't really afford to hire a sysadmin or a full-time tech person, it's generally the employee who "knows computers" (you know, the person who has to help the boss check his e-mail every day, etc.) who ends up running it, and since they don't have the knowledge of a skilled *nix admin, a GUI makes their administration a lot easier.

So with the increasing use of servers among non-admins, it only makes sense for a growth in GUI-based solutions.

Svartalf (2997) writes: Ah... But the thing is... You don't NEED the GUI with recent Linux systems - you do with Windows.

oatworm (969674) writes: on Monday October 04, @07:38PM (#33789624) Homepage

Bingo. Realistically, if you're a company with fewer than 100 employees (read: most companies), you're only going to have a handful of servers in house, and they're each going to be dedicated to particular roles. You're not going to have 100 clustered file servers - instead, you're going to have one or maybe two. You're not going to have a dozen e-mail servers - instead, you're going to have one or two. Consequently, the office admin's focus isn't going to be scalability; it just won't matter to the admin whether they can script creating mailboxes for 100 new users instead of just one. Instead, said office admin is going to be more focused on finding ways to do semi-unusual things (e.g. "create a VPN between this office and our new branch office", "promote this new server to a domain controller", "install SQL", etc.) that they might do, oh, once a year.

The trouble with Linux, and I'm speaking as someone who's used YaST in precisely this context, is that you have to make a choice - do you let the GUI manage it or do you CLI it? If you try to do both, there will be inconsistencies because the grammar of the config files is too ambiguous; consequently, the GUI config file parser will probably just overwrite whatever manual changes it thinks is "invalid", whether it really is or not. If you let the GUI manage it, you better hope the GUI has the flexibility necessary to meet your needs. If, for example, YaST doesn't understand named Apache virtual hosts, well, good luck figuring out where it's hiding all of the various config files that it was sensibly spreading out in multiple locations for you, and don't you dare use YaST to manage Apache again or it'll delete your Apache-legal but YaST-"invalid" directive.

The only solution I really see is for manual config file support with optional XML (or some other machine-friendly but still human-readable format) linkages. For example, if you want to hand-edit your resolv.conf, that's fine, but if the GUI is going to take over, it'll toss a directive on line 1 that says "#import resolv.conf.xml" and immediately overrides (but does not overwrite) everything following that. Then, if you still want to use the GUI but need to hand-edit something, you can edit the XML file using the appropriate syntax and know that your change will be reflected on the GUI.
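A rough sketch of what that hybrid file might look like - the directive syntax and the XML file name are the hypothetical ones proposed above, not an existing feature:

```
# /etc/resolv.conf
#import resolv.conf.xml    <- GUI-managed; overrides (but does not overwrite) the lines below
nameserver 10.0.0.1        <- hand-edited entries stay visible to a human reader
search example.com
```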

That's my take. Your mileage, of course, may vary.

icebraining (1313345) writes: on Monday October 04, @07:24PM (#33789494) Homepage

I have a cheap router with only a web GUI. I wrote a two-line bash script that simply POSTs the right requests to its URL. Simply put, HTTP interfaces, especially if they implement the right response codes, are actually very nice to script.
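A hedged sketch of the same trick with curl. The endpoint and form fields are invented; capture the real ones with the browser's network inspector. A throwaway local Python server stands in for the router so the sketch runs anywhere (it answers POST with 501, where a real router would send 200 or 302 - the response code is the scriptable part either way):

```shell
#!/bin/sh
# Replay the form POST a router's web GUI performs, and check the
# HTTP response code. Endpoint, fields, and address are hypothetical.
python3 -m http.server 8912 --bind 127.0.0.1 >/dev/null 2>&1 &
srv=$!
sleep 1

code=$(curl -s -o /dev/null -w '%{http_code}' \
    -d 'dhcp=off&lan_ip=192.168.1.1' \
    http://127.0.0.1:8912/apply.cgi)

kill $srv
echo "response code: $code"
```

Against a real device you would point the URL at the router and assert on the code it actually returns for success.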

devent (1627873) writes:

Why Windows servers have a GUI is beyond me anyway. The servers are running 99.99% of the time without a monitor, and normally you just log in via SSH to a console if you need to administer them. But they are consuming the extra RAM, the extra CPU cycles, and carrying the extra security exposure. I don't know, but can you uninstall the GUI from a Windows server? Or better, is there an option for a no-GUI installation? I just saw the minimum hardware requirements: 512 MB RAM and 32 GB or greater disk space. My server runs

sirsnork (530512) writes: on Monday October 04, @07:43PM (#33789672)

It's called a "core" install in Server 2008 and up, and if you do that, there is no going back; you can't ever add the GUI back.

What this means is you can run only a small subset of MS services that don't need GUI interaction. With R2 that subset grew somewhat as they added the ability to install .NET too, which meant you could run IIS in a useful manner (arguably the strongest reason to want to do this in the first place).

Still, it's a one-way trip, and you'd better be damn sure which services need to run on that box for its lifetime, or you're looking at a reinstall. Most Windows admins will still tell you the risk isn't worth it.

Simple things like network configuration without a GUI in Windows are tedious, and, at least the last time I looked, you lost the ability to trunk network ports because the NIC manufacturers all assumed you had a GUI to configure your NICs.

prichardson (603676) writes: on Monday October 04, @07:27PM (#33789520) Journal

This is also a problem with Mac OS X Server. Apple builds their services from open source products and adds a GUI for configuration to make it all clickable and easy to set up. However, many options that can be set on the command line can't be set in the GUI. Even worse, making CLI changes to services can break the GUI entirely.

The hardware and software are both super stable and run really smoothly, so once everything gets set up, it's awesome. Still, it's hard for a guy who would rather make changes on the CLI to get used to.

MrEricSir (398214) writes:

Just because you're used to a CLI doesn't make it better. Why would I want to read a bunch of documentation, mess with command-line options, then read a whole block of text to see what it did? I'd much rather sit back in my chair, click something, and then see if it worked. Don't make me read a bunch of man pages just to do a simple task. In essence, the question here is whether it's okay for the user to be lazy and use a GUI, or okay for the programmer to be too lazy to develop one.

ak_hepcat (468765) writes: <leif@MENCKENdenali.net minus author> on Monday October 04, @07:38PM (#33789626) Homepage Journal

Probably because it's also about the ease of troubleshooting issues.

How do you troubleshoot something with a GUI after you've misconfigured? How do you troubleshoot a programming error (bug) in the GUI -> device communication? How do you scale to tens, hundreds, or thousands of devices with a GUI?

CLI makes all this easier and more manageable.
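The scaling point fits in one loop. A sketch with invented hostnames, the ssh line commented out so it runs without any devices attached:

```shell
#!/bin/sh
# Apply one change to a whole fleet from the CLI; a GUI makes you
# click through every device by hand. Hostnames are hypothetical.
set -eu
HOSTS="sw01 sw02 sw03"

done_count=0
for h in $HOSTS; do
    # ssh "admin@$h" 'copy running-config tftp://backups/' </dev/null
    echo "backed up: $h"
    done_count=$((done_count + 1))
done
echo "$done_count devices done"
```

Going from three devices to three hundred changes one variable; with a GUI it changes your whole afternoon.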

arth1 (260657) writes:

"Why would I want to read a bunch of documentation, mess with command line options, then read a whole block of text to see what it did? I'd much rather sit back in my chair, click something, and then see if it worked. Don't make me read a bunch of man pages just to do a simple task." Because then you'll be stuck doing simple tasks, and will never be able to do more advanced tasks. Without hiring a team to write an app for you instead of doing it yourself in two minutes, that is. The time you spend reading man

fandingo (1541045) writes: on Monday October 04, @07:54PM (#33789778)

I don't think you really understand systems administration. 'Users,' or in this case admins, don't typically do stuff once. Furthermore, they need to know what they did and how to do it again (i.e. on a new server or whatever), or at least remember what they did. One-off changes aren't common, and they're a sign of poor administration (i.e. no change tracking or defined processes).

What I'm trying to get at is that admins shouldn't do anything without reading the manual. As a Windows/Linux admin, I tend to find Linux easier to properly administer because I either already know how to perform an operation or I have to read the manual (manpage) and learn a decent amount about the operation (i.e. more than click here/use this flag).

Don't get me wrong, GUIs can make unknown operations significantly easier, but they often lead to poor process management. To document processes, screenshots are typically needed. They can be done well, but I find that GUI documentation (created by admins, not vendor docs) tend to be of very low quality. They are also vulnerable to 'upgrades' where vendors change the interface design. CLI programs typically have more stable interfaces, but maybe that's just because they have been around longer...

maotx (765127) writes: <maotx@NoSPAM.yahoo.com> on Monday October 04, @07:42PM (#33789666)

That's one thing Microsoft did right with Exchange 2007. They built it entirely around their new PowerShell CLI and then built a GUI for it. The GUI is limited compared to what you can do with the CLI, but you can get most things done. The CLI becomes extremely handy for batch jobs and for exporting statistics to CSV files. I'd say it's really up there with bash in terms of scripting, data manipulation, and integration (not just Exchange but WMI, SQL, etc.)

They tried to do similar with Windows 2008 and their Core [petri.co.il] feature, but they still have to load a GUI to present a prompt...

Charles Dodgeson (248492) writes: <jeffrey@goldmark.org> on Monday October 04, @08:51PM (#33790206) Homepage Journal

Probably Debian would have been OK, but I was finding admin of most Linux distros a pain for exactly these reasons. I couldn't find a layer where I could do everything that I needed to do without worrying about one thing stepping on another. No doubt there are ways that I could manage a Linux system without running into different layers of management tools stepping on each other, but it was a struggle.

There were other reasons as well (although there is a lot that I miss about Linux), but I think that this was one of the leading reasons.

(NB: I realize that this is flamebait (I've got karma to burn), but that isn't my intention here.)

[Nov 27, 2017] This Is Why Hewlett-Packard Just Fired Another 30K

Highly recommended!
Notable quotes:
"... Imagine working at HP and having to listen to Carly Fiorina bulldoze you...she is like a blow-torch...here are 4 minutes of Carly and Ralph Nader (if you can take it): https://www.youtube.com/watch?v=vC4JDwoRHtk ..."
"... My husband has been a software architect for 30 years at the same company. Never before has he seen the sheer unadulterated panic in the executives. All indices are down and they are planning for the worst. Quality is being sacrificed for " just get some relatively functional piece of shit out the door we can sell". He is fighting because he has always produced a stellar product and refuses to have shit tied to his name ( 90% of competitor benchmarks fail against his projects). They can't afford to lay him off, but the first time in my life I see my husband want to quit... ..."
"... HP basically makes computer equipment (PCs, servers, Printers) and software. Part of the problem is that computer hardware has been commodized. Since PCs are cheap and frequent replacements are need, People just by the cheapest models, expecting to toss it in a couple of years and by a newer model (aka the Flat screen TV model). So there is no justification to use quality components. Same is become true with the Server market. Businesses have switched to virtualization and/or cloud systems. So instead of taking a boat load of time to rebuild a crashed server, the VM is just moved to another host. ..."
"... I hung an older sign next to the one saying Information Technology. Somehow MIS-Information Technology seemed appropriate.) ..."
"... Then I got to my first duty assignment. It was about five months after the first moon landing, and the aerospace industry was facing cuts in government aerospace spending. I picked up a copy of an engineering journal in the base library and found an article about job cuts. There was a cartoon with two janitors, buckets at their feet and mops in their hands, standing before a blackboard filled with equations. Once was saying to the other, pointing to one section, "you can see where he made his mistake right here...". It represented two engineers who had been reduced to menial labor after losing their jobs. ..."
"... So while I resent all the H1Bs coming into the US - I worked with several for the last four years of my IT career, and was not at all impressed - and despise the politicians who allow it, I know that it is not the first time American STEM grads have been put out of jobs en masse. In some ways that old saying applies: the more things change, the more they stay the same ..."
"... Just like Amazon, HP will supposedly make billions in profit analyzing things in the cloud that nobody looks at and has no use to the real economy, but it makes good fodder for Power Point presentations. I am amazed how much daily productivity goes into creating fancy charts for meetings that are meaningless to the actual business of the company. ..."
"... 'Computers' cost as much - if not more time than they save, at least in corporate settings. Used to be you'd work up 3 budget projections - expected, worst case and best case, you'd have a meeting, hash it out and decide in a week. Now you have endless alternatives, endless 'tweaking' and changes and decisions take forever, with outrageous amounts of time spent on endless 'analysis' and presentations. ..."
"... A recent lay off here turned out to be quite embarrassing for Parmalat there was nobody left that knew how to properly run the place they had to rehire many ex employees as consultants-at a costly premium ..."
"... HP is laying off 80,000 workers or almost a third of its workforce, converting its long-term human capital into short-term gains for rich shareholders at an alarming rate. The reason that product quality has declined is due to the planned obsolescence that spurs needless consumerism, which is necessary to prop up our debt-backed monetary system and the capitalist-owned economy that sits on top of it. ..."
"... The world is heading for massive deflation. Computers have hit the 14 nano-meter lithography zone, the cost to go from 14nm to say 5nm is very high, and the net benefit to computing power is very low, but lets say we go from 14nm to 5nm over the next 4 years. Going from 5nm to 1nm is not going to net a large boost in computing power and the cost to shrink things down and re-tool will be very high for such an insignificant gain in performance. ..."
"... Another classic "Let's rape all we can and bail with my golden parachute" corporate leaders setting themselves up. Pile on the string of non-IT CEOs that have been leading the company to ruin. To them it is nothing more than a contest of being even worse than their predecessor. Just look at the billions each has lost before their exit. Compaq, a cluster. Palm Pilot, a dead product they paid millions for and then buried. And many others. ..."
"... Let's not beat around the bush, they're outsourcing, firing Americans and hiring cheap labor elsewhere: http://www.bloomberg.com/news/articles/2015-09-15/hewlett-packard-to-cut-up-to-30-000-more-jobs-in-restructuring It's also shifting employees to low-cost areas, and hopes to have 60 percent of its workers located in cheaper countries by 2018, Nefkens said. ..."
"... Carly Fiorina: (LOL, leading a tech company with a degree in medieval history and philosophy) While at ATT she was groomed from the Affirmative Action plan. ..."
"... It is very straightforward. Replace 45,000 US workers with 100,000 offshore workers and you still save millions of USD ! Use the "savings" to buy back stock, then borrow more $$ at ZIRP to buy more stock back. ..."
"... If you look on a site like LinkedIN, it will always say 'We're hiring!'. YES, HP is hiring.....but not YOU, they want Ganesh Balasubramaniamawapbapalooboopawapbamboomtuttifrutti, so that they can work him as modern day slave labor for ultra cheap. We can thank idiot 'leaders' like Meg Pasty Faced Whitman and Bill 'Forced Vaccinations' Gates for lobbying Congress for decades, against the rights of American workers. ..."
"... An era of leadership in computer technology has died, and there is no grave marker, not even a funeral ceremony or eulogy ... Hewlett-Packard, COMPAQ, Digital Equipment Corp, UNIVAC, Sperry-Rand, Data General, Tektronix, ZILOG, Advanced Micro Devices, Sun Microsystems, etc, etc, etc. So much change in so short a time, leaves your mind dizzy. ..."
Sep 15, 2015 | Zero Hedge

SixIsNinE

yeah thanks Carly ... HP made bullet-proof products that would last forever..... I still buy HP workstation notebooks, especially now when I can get them for $100 on ebay .... I sold HP products in the 1990s .... we had HP laserjet IIs that companies would run day & night .... virtually no maintenance ... when PCL5 came around then we had LJ IIIs .... and still companies would call for LJ I's, .... 100 pounds of invincible Printing ! .

This kind of product has no place in the World of Planned-Obsolesence .... I'm currently running an 8510w, 8530w, 2530p, Dell 6420 quad i7, hp printers hp scanners, hp pavilion desktops, .... all for less than what a Laserjet II would have cost in 1994, Total.

Not My Real Name

I still have my HP 15C scientific calculator I bought in 1983 to get me through college for my engineering degree. There is nothing better than a hand held calculator that uses Reverse Polish Notation!

BigJim

HP used to make fantastic products. I remember getting their RPN calculators back in the '80s; built like tanks. Then they decided to "add value" by removing more and more material from their consumer/"prosumer" products until they became unspeakably flimsy. They stopped holding things together with proper fastenings and started hot-melting/gluing them together, so if one died you had to cut it open to have any chance of fixing it.

I still have one of their Laserjet 4100 printers. I expect it to outlast anything they currently produce, and it must be going on 16+ years old now.

Fuck you, HP. You started selling shit and now you're eating through your seed corn. I just wish the "leaders" who did this to you had to pay some kind of penalty greater than getting $25M in a severance package.

Automatic Choke

+100. The path of HP is everything that is wrong about modern business models. I still have a 5MP laserjet (one of the first), still works great. Also have a number of 42S calculators.....my day-to-day workhorse and several spares. I don't think the present HP could even dream of making these products today.

nope-1004

How well will I profit, as a salesman, if I sell you something that works? How valuable are you, as a customer in my database, if you never come back? Confucius say: "Buy another one, and if you can't afford it, f'n finance it!" It's the growing trend. Look at appliances. Nothing works anymore.

Normalcy Bias

https://en.wikipedia.org/wiki/Planned_obsolescence

Son of Loki

GE to cut Houston jobs as work moves overseas http://www.bizjournals.com/houston/news/2015/09/15/ge-to-cut-houston-job... " Yes we can! "

Automatic Choke

hey big brother.... if you are curious, there is a damn good Android emulator of the HP42S available (Free42). Really, it is so good that it made me relax about accumulating more spares. Still not quite the same as a real calculator. (The 42S, by the way, is the modernization/simplification of the classic HP41, the real hardcore very-programmable, reconfigurable, hackable unit with all the plug-in modules that came out in the early 80s.)

Miss Expectations

Imagine working at HP and having to listen to Carly Fiorina bulldoze you...she is like a blow-torch...here are 4 minutes of Carly and Ralph Nader (if you can take it): https://www.youtube.com/watch?v=vC4JDwoRHtk

Miffed Microbiologist

My husband has been a software architect for 30 years at the same company. Never before has he seen the sheer unadulterated panic in the executives. All indices are down and they are planning for the worst. Quality is being sacrificed for " just get some relatively functional piece of shit out the door we can sell". He is fighting because he has always produced a stellar product and refuses to have shit tied to his name ( 90% of competitor benchmarks fail against his projects). They can't afford to lay him off, but the first time in my life I see my husband want to quit...

unplugged

I've been an engineer for 31 years - our managements's unspoken motto at the place I'm at (large company) is: "release it now, we'll put in the quality later". I try to put in as much as possible before the product is shoved out the door without killing myself doing it.

AGuy

Do they even make test equipment anymore?

HP test and measurement was spun off many years ago as Agilent. The electronics part of Agilent was spun off as Keysight late last year.

HP basically makes computer equipment (PCs, servers, printers) and software. Part of the problem is that computer hardware has been commoditized. Since PCs are cheap and frequent replacements are needed, people just buy the cheapest models, expecting to toss them in a couple of years and buy a newer model (aka the flat-screen-TV model). So there is no justification for using quality components. The same is becoming true of the server market. Businesses have switched to virtualization and/or cloud systems, so instead of taking a boatload of time to rebuild a crashed server, the VM is just moved to another host.

HP has also adopted the Computer Associates business model (aka Borg). HP buys up new tech companies and sits on the tech and never improves it. It decays and gets replaced with a system from a competitor. It also has a habit of buying outdated tech companies that never generate the revenues HP thinks it will.

BullyBearish

When Carly was CEO of HP, she instituted a draconian "pay for performance" plan. She ended up leaving with over $146 Million because she was smart enough not to specify "what type" of performance.

GeezerGeek

Regarding your statement "All those engineers choosing to pursue other opportunities", we need to realize that tech in general has been very susceptible to the vagaries of government actions. Now the employment problems are due to things like globalization and H1B programs. Some 50 years ago tech - meaning science and engineering - was hit hard as the US space program wound down. Permit me this retrospective:

I graduated from a quite good school with a BS in Physics in 1968. My timing was not all that great, since that was when they stopped granting draft deferments for graduate school. I joined the Air Force, but as an enlisted airman, not an officer. Following basic training, I was sent to learn to operate PCAM operations. That's Punched Card Accounting Machines. Collators. Sorters. Interpreters. Key punches. I was in a class with nine other enlistees. One had just gotten a Masters degree in something. Eight of us had a BS in one thing or another, but all what would now be called STEM fields. The least educated only had an Associate degree. We all enlisted simply to avoid being drafted into the Marines. (Not that there's anything wrong with the Marines, but all of us proclaimed an allergy to energetic lead projectiles and acted accordingly. Going to Canada, as many did, pretty much ensured never getting a job in STEM fields later in life.) So thanks to government action (fighting in VietNam, in this case) a significant portion of educated Americans found themselves diverted from chosen career paths. (In my case, it worked out fine. I learned to program, etc., and spent a total of over 40 years in what is now called IT. I think it was called EDP when I started the trek. Somewhere along the line it became (where I worked) Management Information Systems. MIS. And finally the department became simply Information Technology. I hung an older sign next to the one saying Information Technology. Somehow MIS-Information Technology seemed appropriate.)

Then I got to my first duty assignment. It was about five months after the first moon landing, and the aerospace industry was facing cuts in government aerospace spending. I picked up a copy of an engineering journal in the base library and found an article about job cuts. There was a cartoon with two janitors, buckets at their feet and mops in their hands, standing before a blackboard filled with equations. One was saying to the other, pointing to one section, "you can see where he made his mistake right here...". It represented two engineers who had been reduced to menial labor after losing their jobs.

So while I resent all the H1Bs coming into the US - I worked with several for the last four years of my IT career, and was not at all impressed - and despise the politicians who allow it, I know that it is not the first time American STEM grads have been put out of jobs en masse. In some ways that old saying applies: the more things change, the more they stay the same.

If you made it this far, thanks for your patience.

adr

Just like Amazon, HP will supposedly make billions in profit analyzing things in the cloud that nobody looks at and has no use to the real economy, but it makes good fodder for Power Point presentations. I am amazed how much daily productivity goes into creating fancy charts for meetings that are meaningless to the actual business of the company.

IT'S ALL BULLSHIT!!!!!

I designed more products in one year for the small company I work for than a $15 billion corporation did throughout their entire design department employing hundreds of people. That is because 90% of their workday is spent preparing crap for meetings and they never really get anything meaningful done.

It took me one week to design a product and send it out for production branded for the company I work for, but it took six months to get the same type of product passed through the multi billion dollar corporation we license for. Because it had to pass through layer after layer of bullshit and through every level of management before it could be signed off. Then a month later somebody would change their mind in middle management and the product would need to be changed and go through the cycle all over again.

Their own bag department made six bags last year; I designed 16. Funny how I out-produce a department of six people whose only job is to make bags, yet I only get paid the salary of one.

Maybe I'm just an imbecile for working hard.

Bear

You also have to add all the wasted time of employees having to sit through those presentations and the even more wasted time on Ashley Madison

cynicalskeptic

'Computers' cost as much time - if not more - than they save, at least in corporate settings. It used to be you'd work up three budget projections - expected, worst case and best case - have a meeting, hash it out and decide in a week. Now you have endless alternatives, endless 'tweaking' and changes, and decisions take forever, with outrageous amounts of time spent on endless 'analysis' and presentations.

EVERY VP now has an 'Administrative Assistant' whose primary job is to develop PowerPoint presentations for the endless meetings that take up time - without any decisions ever being made.

Computers stop people from thinking. In ages past when you used a slide rule you had to know the order of magnitude of the end result. Now people make a mistake and come up with a ridiculous number and take it at face value because 'the computer' produced it.

Any exec worth anything knew what a given line in their department's budget - or the total - should be, plus or minus a small amount. I can't count the number of times budgets and analyses were WRONG because someone left a few lines off a spreadsheet total.

Yes, computer modeling for advanced tech and engineering is a help, CAD/CAM is great, and many other applications in the tech/scientific world are a great help, but letting computers loose in corporate and finance has produced endless waste AND - worse - things like HFT (e.g. 'better', more effective ways to manipulate and cheat markets).

khnum

A recent layoff here turned out to be quite embarrassing for Parmalat: there was nobody left who knew how to properly run the place, and they had to rehire many ex-employees as consultants - at a costly premium.

Anopheles

Consultants don't come at that much of a premium because the company doesn't have to pay benefits, vacation, sick days, payroll taxes, etc. Plus it's really easy and cheap to get rid of consultants.

arrowrod

Obviously, you haven't worked as a consultant. You get paid by the hour. To clean up a mess. 100-hour weeks are not uncommon. (What? Is it possible to work 100 hours a week? Yes, it is, but only for about 3 months.)

RaceToTheBottom

HP Executives are trying hard to bring the company back to its roots: The ability to fit into one garage...

PrimalScream

ALL THAT Meg Whitman needs to do ... is to FIRE EVERYBODY !! Then have all the products made in China, process all the sales orders in Hong Kong, and sub-contract the accounting and tax paperwork to India. Then HP can use all the profits for stock buybacks, except of course for Meg's salary ... which will keep rising astronomically!

Herdee

That's where education gets you in America. The government sold out America's manufacturing base to Communist China, which holds the debt of the USA. Who would ever guess that right-wing neo-cons (neo-nazis) running the government would sell out to communists just to get the money for war? Very weird.

Really20

"Communist"? The Chinese government, like that of the US, never believed in worker ownership of businesses and never believed that the commercial banking system (whether owned by the state, or by private corporations which act like a state) should not control money. Both countries believe in centralizing power among a few shareholders, who take the fruits of working people's labor while contributing nothing of value themselves (money being but a token that represents a claim on real capital, not capital itself).

Management and investors ought to be separate from each other; management should be chosen by workers by universal equal vote, while a complementary investor board should be chosen by investors much as corporate boards are now. Both of these boards should be legally independent but bound organizations; the management board should run the business while the investor board should negotiate with the management board on the terms of equity issuance. No more buybacks, no more layoffs or early retirements, unless workers as a whole see a need for it to maintain the company.

The purpose of investors is to serve the real economy, not the other way round; and in turn, the purpose of the real economy is to serve humanity, not the other way around. Humans should stop being slaves to perpetual growth.

Really20

HP is laying off 80,000 workers or almost a third of its workforce, converting its long-term human capital into short-term gains for rich shareholders at an alarming rate. The reason that product quality has declined is due to the planned obsolescence that spurs needless consumerism, which is necessary to prop up our debt-backed monetary system and the capitalist-owned economy that sits on top of it.

NoWayJose

HP - that company that sells computers and printers made in China and ink cartridges made in Thailand?

Dominus Ludificatio

Another company going down the drain because their focus is short-term returns with crappy products. They will also bring down any company they buy as well.

Barnaby

HP is microcosm of what Carly will do to the US: carve it like a pumpkin and leave the shell out to bake in the sun for a few weeks. But she'll make sure and poison the seeds too! Don't want anything growing out of that pesky Palm division...

Dre4dwolf

The world is heading for massive deflation. Computers have hit the 14-nanometer lithography zone; the cost to go from 14nm to, say, 5nm is very high, and the net benefit to computing power is very low. But let's say we go from 14nm to 5nm over the next 4 years. Going from 5nm to 1nm is not going to net a large boost in computing power, and the cost to shrink things down and re-tool will be very high for such an insignificant gain in performance.

What does that mean?

  1. Computers (at least non-quantum ones) have hit the point where about 80-90% of the potential of the current science has been tapped.
  2. This means that the consumer is not going to be put in the position where they will have to upgrade to faster systems for at least another 7-8 years (because the new computer won't be that much faster than their existing one).
  3. If no one is upgrading the only IT sectors of the economy that stand to make any money are software companies (Microsoft, Apple, and other small software developers), most software has not caught up with hardware yet.
  4. We are obviously heading for massive deflation. Consumer spending levels as a % are probably around where they were in the late 70s - mid 80s. This is a very deflationary environment that is being compounded by a high debt burden (most of everyone's income is going to service their debts), and that signals monetary tightening is going on... people simply don't have enough discretionary income to spend on new toys.

All that to me screams SELL consumer electronics stocks, because profits are GOING TO DECLINE, SALES ARE GOING TO DECLINE. There is no way, no amount of buybacks, that will float the stocks of corporations like HP/Dell/IBM etc... it is inevitable that these stocks will be worth 30% less over the next 5-8 years.

But what do I know? Maybe I am missing something.

In any case, a lot of pressure is being put on HP to do all it can at any cost to boost the stock valuations, because so much of its stock is institution-owned. They will strip the wallpaper off the walls and sell it to a recycling plant if it would give them more money to boost stock valuations. That to me signals that most of the people pressuring the board of HP to boost the stock want them to gut the company as much as they can to boost it some trivial % points so that the majority of shares can be dumped onto muppets.

To me it pretty much also signals something is terribly wrong at HP and no one is talking about it.

PoasterToaster

Other than die shrinks there really hasn't been a lot going on in the CPU world since Intel abandoned its Netburst architecture and went back to its (Israeli created) Pentium 3 style pipeline. After that they gave up on increasing speed and resorted to selling more cores. Now that wall has been hit, they have been selling "green" and "efficient" nonsense in place of increasing power.

x86 just needs to go, but a lot is invested in it, not the least of which is that 1-2 punch of forced, contrived obsolescence carried out in a joint operation with Microsoft. 15 years ago you could watch videos with no problem on your old machine using Windows XP. Fast forward to now and their chief bragging point is still "multitasking" and the ability to process datastreams like video. It's a joke.

The future is not in the current CPU paradigm of instructions per second; it will be in terms of variables per second. It will be more along the lines of what GPU manufacturers are creating with their thousands of "engines" or "processing units" per chip, rather than the 4, 6 or 12 core monsters that Intel is pushing. They have nearly given up on their roadmap to push out to 128 cores as it is. x86 just doesn't work with all that.

Dojidog

Another classic case of "let's rape all we can and bail with my golden parachute" corporate leaders setting themselves up. Pile on the string of non-IT CEOs that have been leading the company to ruin. To them it is nothing more than a contest of being even worse than their predecessor. Just look at the billions each has lost before their exit. Compaq, a cluster. Palm Pilot, a dead product they paid millions for and then buried. And many others.

Think the split is going to help? Think again. Rather than taking the opportunity to fix their problems, they have just duplicated and perpetuated them into two separate entities.

HP is a company that is mired in a morass of unmanageable business processes and patchwork of antiquated applications all interconnected to the point they are petrified to try and uncouple them.

Just look at their stock price since January. The insiders know. Want to fix HP? All it would take is a savvy IT based leader with a boatload of common sense. What makes money at HP? Their printers and ink. Not thinking they can provide enterprise solutions to others when they can't even get their own house in order.

I Write Code

Let's not beat around the bush: they're outsourcing, firing Americans and hiring cheap labor elsewhere: http://www.bloomberg.com/news/articles/2015-09-15/hewlett-packard-to-cut-up-to-30-000-more-jobs-in-restructuring "It's also shifting employees to low-cost areas, and hopes to have 60 percent of its workers located in cheaper countries by 2018," Nefkens said.

yogibear

Carly Fiorina: (LOL, leading a tech company with a degree in medieval history and philosophy.) While at AT&T she was groomed through the Affirmative Action plan.

Alma Mater: Stanford University (B.A. in medieval history and philosophy); University of Maryland (MBA); Massachusetts Institute of Technology

==================================================================

Patricia Russo: (Lucent) (Degree in Political Science). Another lady elevated through the AA plan. Russo got her bachelor's degree from Georgetown University in political science and history in 1973. She finished the advanced management program at Harvard Business School in 1989.

Both ladies steered their corporations to failure.

Clowns on Acid

It is very straightforward. Replace 45,000 US workers with 100,000 offshore workers and you still save millions of USD ! Use the "savings" to buy back stock, then borrow more $$ at ZIRP to buy more stock back.

You guys don't know nuthin'.

homiegot

HP: one of the worst places you could work. Soulless.

Pancho de Villa

Ladies and Gentlemen! Integrity has left the Building!

space junk

I worked there for a while and it was total garbage. There are still some great folks around, but they are getting paid less and less, and having to work longer hours for less pay while reporting to God knows who, often a foreigner with crappy engrish skills, yes likely another 'diversity hire'. People with DEEP knowledge, decades and decades, have either gotten unfairly fired or demoted, made to quit, or if they are lucky, taken some early retirement and GTFO (along with their expertise - whoopsie! who knew? unintended consequences are a bitch aren't they? )....

If you look on a site like LinkedIN, it will always say 'We're hiring!'. YES, HP is hiring.....but not YOU, they want Ganesh Balasubramaniamawapbapalooboopawapbamboomtuttifrutti, so that they can work him as modern day slave labor for ultra cheap. We can thank idiot 'leaders' like Meg Pasty Faced Whitman and Bill 'Forced Vaccinations' Gates for lobbying Congress for decades, against the rights of American workers.

Remember that Meg 'Pasty Faced' Whitman is the person who came up with the idea of a 'lights out' datacenter....that's right, it's the concept of putting all of your computers in a building, in racks, in the dark, and maybe hiring an intern to come in once a month and keep them going. This is what she actually believed. Along with her other statement to the HP workforce which says basically that the future of HP is one of total automation.....TRANSLATION: If you are a smart admin, engineer, project manager, architect, sw tester, etc.....we (HP management) think you are an IDIOT and can be replaced by a robot, a foreigner, or any other cheap worker.

Race to the bottom is like they say a space ship approaching a black hole......after a while the laws of physics and common sense, just don't apply anymore.

InnVestuhrr

An era of leadership in computer technology has died, and there is no grave marker, not even a funeral ceremony or eulogy ... Hewlett-Packard, COMPAQ, Digital Equipment Corp, UNIVAC, Sperry-Rand, Data General, Tektronix, ZILOG, Advanced Micro Devices, Sun Microsystems, etc, etc, etc. So much change in so short a time, leaves your mind dizzy.

[Nov 27, 2017] College Is Wildly Exploitative Why Arent Students Raising Hell

Highly recommended!
Notable quotes:
"... By David Masciotra, the author of Mellencamp: American Troubadour (University Press of Kentucky). He has also written for Salon, the Atlantic and the Los Angeles Review of Books. For more information visit www.davidmasciotra.com. Originally published at Alternet ..."
"... Robert Reich, in his book Supercapitalism, explains that in the past 30 years the two industries with the most excessive increases in prices are health care and higher education. ..."
"... Using student loan loot and tax subsidies backed by its $3.5 billion endowment, New York University has created a new administrative class of aristocratic compensation. The school not only continues to hire more administrators – many of whom the professors indict as having no visible value in improving the education for students bankrupting themselves to register for classes – but shamelessly increases the salaries of the academic administrative class. The top 21 administrators earn a combined total of $23,590,794 per year. The NYU portfolio includes many multi-million-dollar mansions and luxury condos, where deans and vice presidents live rent-free. ..."
"... As the managerial class grows, in size and salary, so does the full time faculty registry shrink. Use of part time instructors has soared to stratospheric heights at NYU. Adjunct instructors, despite having a minimum of a master's degree and often having a Ph.D., receive only miserly pay-per-course compensation for their work, and do not receive benefits. Many part-time college instructors must transform their lives into daily marathons, running from one school to the next, barely able to breathe between commutes and courses. Adjunct pay varies from school to school, but the average rate is $2,900 per course. ..."
"... New York Times ..."
"... to the people making decisions ..."
"... it's the executives and management generally. Just like Wall Street, many of these top administrators have perfected the art of failing upwards. ..."
"... What is the benefit? What are the risks? ..."
"... Sophomore Noell Conley lives there, too. She shows off the hotel-like room she shares with a roommate . ..."
"... "As you walk in, to the right you see our granite countertops with two sinks, one for each of the residents," she says. A partial wall separates the beds. Rather than trek down the hall to shower, they share a bathroom with the room next door. "That's really nice compared to community bathrooms that I lived in last year," Conley says. To be fair, granite countertops last longer. Tempur-Pedic is a local company - and gave a big discount. The amenities include classrooms and study space that are part of the dorm. Many of the residents are in the university's Honors program. But do student really need Apple TV in the lounges, or a smartphone app that lets them check their laundry status from afar? "Demand has been very high," says the university's Penny Cox, who is overseeing the construction of several new residence halls on campus. Before Central Hall's debut in August, the average dorm was almost half a century old, she says. That made it harder to recruit. " If you visit places like Ohio State, Michigan, Alabama," Cox says, "and you compare what we had with what they have available to offer, we were very far behind." Today colleges are competing for a more discerning consumer. Students grew up with fewer siblings, in larger homes, Cox says. They expect more privacy than previous generations - and more comforts. "These days we seem to be bringing kids up to expect a lot of material plenty," says Jean Twenge, a psychology professor at San Diego State University and author of the book "Generation Me." Those students could be in for some disappointment when they graduate , she says. "When some of these students have all these luxuries and then they get an entry-level job and they can't afford the enormous flat screen and the granite countertops," Twenge says, "then that's going to be a rude awakening." Some on campus also worry about the divide between students who can afford such luxuries and those who can't. 
The so-called premium dorms cost about $1,000 more per semester. Freshman Josh Johnson, who grew up in a low-income family and lives in one of the university's 1960s-era buildings, says the traditional dorm is good enough for him. ..."
"... "I wouldn't pay more just to live in a luxury dorm," he says. "It seems like I could just pay the flat rate and get the dorm I'm in. It's perfectly fine." In the near future students who want to live on campus won't have a choice. Eventually the university plans to upgrade all of its residence halls. ..."
"... Competition for students who have more sophisticated tastes than in past years is creating the perfect environment for schools to try to outdo each other with ever-more posh on-campus housing. Keeping up in the luxury dorm race is increasingly critical to a school's bottom line: A 2006 study published by the Association of Higher Education Facilities Officers found that "poorly maintained or inadequate residential facilities" was the number-one reason students rejected enrolling at institutions. PHOTO GALLERY: Click Here to See the 10 Schools with Luxury Dorms ..."
"... Private universities get most of the mentions on lists of schools with great dorms, as recent ratings by the Princeton Review, College Prowler, and Campus Splash make clear. But a few state schools that have invested in brand-new facilities are starting to show up on those reviews, too. ..."
"... While many schools offer first dibs on the nicest digs to upperclassmen on campus, as the war for student dollars ratchets up even first-year students at public colleges are living in style. Here are 10 on-campus dormitories at state schools that offer students resort-like amenities. ..."
"... Perhaps some students are afraid to protest for fear of being photographed or videographed and having their face and identity given to every prospective employer throughout America. Perhaps those students are afraid of being blackballed throughout the Great American Workplace if they are caught protesting anything on camera. ..."
"... Mao was perfectly content to promote technical education in the new China. What he deprecated (and fought to suppress) was the typical liberal arts notion of critical thinking. We're witnessing something comparable in the U.S. We're witnessing something comparable in the U.S. ..."
"... Many of the best students feel enormous pressure to succeed and have some inkling that their job prospects are growing narrower, but they almost universally accept this as the natural order of things. Their outlook: if there are 10 or 100 applicants for every available job, well, by golly, I just have to work that much harder and be the exceptional one who gets the job. ..."
"... I read things like this and think about Louis Althusser and his ideas about "Ideological State Apparatuses." While in liberal ideology the education is usually considered to be the space where opportunity to improve one's situation is founded, Althusser reached the complete opposite conclusion. For him, universities are the definitive bourgeois institution, the ideological state apparatus of the modern capitalist state par excellance . The real purpose of the university was not to level the playing field of opportunity but to preserve the advantages of the bourgeoisie and their children, allowing the class system to perpetuate/reproduce itself. ..."
"... My nephew asked me to help him with his college introductory courses in macroeconomics and accounting. I was disappointed to find out what was going on: no lectures by professors, no discussion sessions with teaching assistants; no team projects–just two automated correspondence courses, with automated computer graded problem sets objective tests – either multiple choice, fill in the blank with a number, or fill in the blank with a form answer. This from a public university that is charging tuition for attendance just as though it were really teaching something. All they're really certifying is that the student can perform exercises is correctly reporting what a couple of textbooks said about subjects of marginal relevance to his degree. My nephew understands exactly that this is going on, but still . ..."
"... The reason students accept this has to be the absolutely demobilized political culture of the United States combined with what college represents structurally to students from the middle classes: the only possibility – however remote – of achieving any kind of middle class income. ..."
"... Straight bullshit, but remember our school was just following the national (Neoliberal) model. ..."
Jun 26, 2015 | naked capitalism

Yves here. In May, we wrote up and embedded the report on how NYU exploits students and adjuncts in "The Art of the Gouge": NYU as a Model for Predatory Higher Education. The article below uses that study as a point of departure for its discussion of how higher education has become extractive.

By David Masciotra, the author of Mellencamp: American Troubadour (University Press of Kentucky). He has also written for Salon, the Atlantic and the Los Angeles Review of Books. For more information visit www.davidmasciotra.com. Originally published at Alternet

Higher education wears the cloak of liberalism, but in policy and practice, it can be a corrupt and cutthroat system of power and exploitation. It benefits immensely from right-wing McCarthy wannabes, who in an effort to restrict academic freedom and silence political dissent, depict universities as left-wing indoctrination centers.

But the reality is that while college administrators might affix "down with the man" stickers on their office doors, many prop up a system that is severely unfair to American students and professors, a shocking number of whom struggle to make ends meet. Even the most elementary level of political science instructs that politics is about power. Power, in America, is about money: who has it? Who does not have it? Who is accumulating it? Who is losing it? Where is it going?

Four hundred faculty members at New York University, one of the nation's most expensive schools, recently released a report on how their own place of employment, legally a nonprofit institution, has become a predatory business, hardly any different in ethical practice or economic procedure than a sleazy storefront payday loan operator. Its title succinctly summarizes the new intellectual discipline deans and regents have learned to master: "The Art of The Gouge."

The result of their investigation reads as if Charles Dickens and Franz Kafka collaborated on notes for a novel. Administrators not only continue to raise tuition at staggering rates, but they burden their students with inexplicable fees, high cost burdens and expensive requirements like mandatory study abroad programs. When students question the basis of their charges, many of them hidden during the enrollment and registration phases, they find themselves lost in a tornadic swirl of forms, automated answering services and other bureaucratic debris.

Often the additional fees add up to thousands of dollars, and that comes on top of the already hefty tuition, currently $46,000 per academic year, which is more than double its rate of 2001. Tuition at NYU is higher than most colleges, but a bachelor's degree, nearly anywhere else, still comes with a punitive price tag. According to the College Board, the average cost of tuition and fees for the 2014–2015 school year was $31,231 at private colleges, $9,139 for state residents at public colleges, and $22,958 for out-of-state residents attending public universities.

Robert Reich, in his book Supercapitalism, explains that in the past 30 years the two industries with the most excessive increases in prices are health care and higher education. Lack of affordable health care is a crime, Reich argues, but at least new medicines, medical technologies, surgeries, surgery techs, and specialists can partially account for inflation. Higher education can claim no costly infrastructural or operational developments to defend its sophisticated swindle of American families. It is a high-tech, multifaceted, but old fashioned transfer of wealth from the poor, working- and middle-classes to the rich.

Using student loan loot and tax subsidies backed by its $3.5 billion endowment, New York University has created a new administrative class of aristocratic compensation. The school not only continues to hire more administrators – many of whom the professors indict as having no visible value in improving the education for students bankrupting themselves to register for classes – but shamelessly increases the salaries of the academic administrative class. The top 21 administrators earn a combined total of $23,590,794 per year. The NYU portfolio includes many multi-million-dollar mansions and luxury condos, where deans and vice presidents live rent-free.

Meanwhile, NYU has spent billions, over the past 20 years, on largely unnecessary real estate projects, buying property and renovating buildings throughout New York. The professors' analysis, NYU's US News and World Report Ranking, and student reviews demonstrate that few of these extravagant projects, aimed mostly at pleasing wealthy donors, attracting media attention, and giving administrators opulent quarters, had any impact on overall educational quality.

As the managerial class grows in size and salary, the full-time faculty shrinks. Use of part-time instructors has soared to stratospheric heights at NYU. Adjunct instructors, despite having a minimum of a master's degree and often having a Ph.D., receive only miserly pay-per-course compensation for their work, and do not receive benefits. Many part-time college instructors must transform their lives into daily marathons, running from one school to the next, barely able to breathe between commutes and courses. Adjunct pay varies from school to school, but the average rate is $2,900 per course.

Many schools offer rates far below the average, most especially community colleges paying only $1,000 to $1,500. Even at the best paying schools, adjuncts, as part time employees, are rarely eligible for health insurance and other benefits. Many universities place strict limits on how many courses an instructor can teach. According to a recent study, 25 percent of adjuncts receive government assistance.

The actual scandal of "The Art of the Gouge" is that even if NYU is a particularly egregious offender of basic decency and honesty, most of the report's indictments could apply equally to nearly any American university. From 2003-2013, college tuition increased by a crushing 80 percent. That far outpaces all other inflation. The closest competitor was the cost of medical care, which in the same time period, increased by a rate of 49 percent. On average, tuition in America rises eight percent on an annual basis, placing it far outside the moral universe. Most European universities charge only marginal fees for attendance, and many of them are free. Senator Bernie Sanders recently introduced a bill proposing all public universities offer free education. It received little political support, and almost no media coverage.

In order to obtain an education, students accept the paralytic weight of student debt, the only form of debt not dischargeable in bankruptcy. Before a young person can even think about buying a car, house or starting a family, she leaves college with thousands of dollars in debt: an average of $29,400 in 2012. As colleges continue to suck their students dry of every dime, the US government profits at $41.3 billion per year by collecting interest on that debt. Congress recently cut funding for Pell Grants, yet increased the budget for hiring debt collectors to target delinquent student borrowers.

The university, once an incubator of ideas and entrance into opportunity, has mutated into a tabletop model of America's economic architecture, where the top one percent of income earners now owns 40 percent of the wealth.

"The One Percent at State U," an Institute for Policy Studies report, found that at the 25 public universities with the highest paid presidents, student debt and adjunct faculty increased at dramatically higher rates than at the average state university. Marjorie Wood, the study's co-author, told the New York Times that extravagant executive pay is the "tip of a very large iceberg, with universities that have top-heavy executive spending also having more adjuncts, more tuition increases and more administrative spending."

Unfortunately, students seem like passive participants in their own liquidation. An American student protest timeline for 2014-'15, compiled by historian Angus Johnston, reveals that most demonstrations and rallies focused on police violence, and sexism. Those issues should inspire vigilance and activism, but only 10 out of 160 protests targeted tuition hikes for attack, and only two of those 10 events took place outside the state of California.

Class consciousness and solidarity actually exist in Chile, where in 2011 a student movement began to organize, making demands for free college. More than mere theater, high school and college students, along with many of their parental allies, engaged the political system and made specific demands for inexpensive education. The Chilean government announced that in March 2016, it will eliminate all tuition from public universities. Chile's victory for participatory democracy, equality of opportunity and social justice should instruct and inspire Americans. Triumph over extortion and embezzlement is possible.

This seems unlikely to happen in a culture, however, where even most poor Americans view themselves, in the words of John Steinbeck, as "temporarily embarrassed millionaires." The political, educational and economic ruling class of America is comfortable selling out its progeny. In the words of one student quoted in "The Art of the Gouge," "they see me as nothing more than $200,000."

washunate June 26, 2015 at 10:07 am

Awesome question in the headline.

At a basic level, I think the answer is yes, because on balance, college still provides a lot of privatized value to the individual. Being an exploited student with the College Credential Seal of Approval remains relatively much better than being an exploited non student lacking that all important seal. A college degree, for example, is practically a guarantee of avoiding the more unseemly parts of the US "justice" system.

But I think this is changing. The pressure is building from the bottom as academia loses credibility as an institution capable of, never mind interested in, serving the public good rather than simply being another profit center for connected workers. It's actually a pretty exciting time. The kiddos are getting pretty fed up, and the authoritarians at the top of the hierarchy are running out of money with which to buy off younger technocratic enablers and thought leaders and other Serious People.

washunate June 26, 2015 at 10:17 am

P.S., the author in this post demonstrates the very answer to the question. He assumes as true, without any need for support, that the very act of possessing a college degree makes one worthy of a better place in society. That mindset is why colleges can prey upon students. They hold a monopoly on access to resources in American society. My bold:

Adjunct instructors, despite having a minimum of a master's degree and often having a Ph.D., receive only miserly pay-per-course compensation for their work, and do not receive benefits.

What does having a masters degree or PhD have to do with the moral claim of all human beings to a life of dignity and purpose?

flora June 26, 2015 at 11:37 am

There are so many more job seekers per job opening now than, say, 20 or thirty years ago that a degree is used to sort out applications. Now a job that formerly listed a high school degree as a requirement may now list a college degree as a requirement, just to cut down on the number of applications.

So, no, a B.A. or B.S. doesn't confer moral worth, but it does open more job doors than a high school diploma, even if the actual work only requires high school level math, reading, science or technology.

Ben June 26, 2015 at 1:11 pm

I agree a PhD often makes someone no more useful in society. However, the behaviour of the kids is rational *because* employers demand a masters / PhD.

Students are then caught in a trap. Employers demand the paper, often from an expensive institution. The credit is abundant thanks to govt backed loans. They are caught in a situation where as a collective it makes no sense to join in, but as an individual if they opt out they get hurt also.

Same deal for housing. It's a mad world my masters.

What can we do about this? The weak link in the chain seems to me to be employers. Why are they hurting themselves by selecting people who want higher pay but may offer little to no extra value? I work as a programmer and I often think " if we could just 'see' the non-graduate diamonds in the rough".

If employers had perfect knowledge of prospective employees *and* if they saw that a degree would make no difference to their performance universities would crumble overnight.

The state will never stop printing money via student loans. If we can fix recruitment then universities are dead.

washunate June 26, 2015 at 2:22 pm

Why are they hurting themselves by selecting people who want higher pay but may offer little to no extra value?

Yeah, I have thought a lot about that particular question of organizational behavior. It does make sense, conceptually, that somebody would disrupt the system and take people based on ability rather than credentials. Yet we are moving in the opposite direction, toward more rigidity in educational requirements for employment.

For my two cents, I think the bulk of the answer lies in how hiring specifically, and management philosophy more generally, works in practice. The people who make decisions are themselves also subject to someone else's decisions. This is true all up and down the hierarchical ladder, from board members and senior executives to the most junior managers and professionals.

It's true that someone without a degree may offer the same (or better) performance to the company. But they do not offer the same performance to the people making decisions, because those individual people also depend upon their own college degrees to sell their own labor services. To hire significant numbers of employees without degrees into important roles is to sabotage their own personal value.

Very few people are willing to be that kind of martyr. And generally speaking, they tend to self-select away from occupations where they can meaningfully influence decision-making processes in large organizations.

Absolutely, individual business owners can call BS on the whole scam. It is a way that individual people can take action against systemic oppression. Hire workers based upon their fit for the job, not their educational credentials or criminal background or skin color or sexual orientation or all of the other tests we have used. But that's not a systemic solution because the incentives created by public policy are overwhelming at large organizations to restrict who is 'qualified' to fill the good jobs (and increasingly, even the crappy jobs).

Laaughingsong June 26, 2015 at 3:03 pm

I am not so sure that this is so. So many jobs are now crapified. When I was made redundant in 2009, I could not find many jobs that fit my level of experience (just experience! I have no college degree), so I applied for anything that fit my skill set, pretty much regardless of level. I was called overqualified. I have heard that in the past as well, but never more so than during that stretch of job hunting. Remember, that's with no degree. Maybe younger people don't hear it as much. But I also think life experience has something to do with it; you need to have something to compare it to. How many times did our parents tell us how different things were when they were kids, how much easier? I didn't take that on board, did y'all?

sam s smith June 26, 2015 at 4:03 pm

I blame HR.

tsk June 27, 2015 at 4:42 pm

For various reasons, people seeking work these days, especially younger job applicants, might not possess the habits of mind and behavior that would make them good employees – i.e., punctuality, the willingness to come to work every day (even when something more fun or interesting comes up, or when one has partied hard the night before), the ability to meet deadlines rather than make excuses for not meeting them, the ability to write competently at a basic level, the ability to read instructions, diagrams, charts, or any other sort of necessary background material, the ability to handle basic computation, the ability to FOLLOW instructions rather than deciding that one will pick and choose which rules and instructions to follow and which to ignore, trainability, etc.

Even if a job applicant's degree is in a totally unrelated field, the fact that he or she has managed to complete an undergraduate degree–or, if relevant, a master's or a doctorate – is often accepted by employers as a sign that the applicant has a sense of personal responsibility, a certain amount of diligence and educability, and a certain level of basic competence in reading, writing, and math.

By the same token, employers often assume that an applicant who didn't bother going to college or who couldn't complete a college degree program is probably not someone to be counted on to be a responsible, trainable, competent employee.

Obviously those who don't go to college, or who go but drop out or flunk out, end up disadvantaged when competing for jobs, which might not be fair at all in individual cases, especially now that college has been priced so far out of the range of so many bright, diligent students from among the poor and working classes, and now even those from the middle class.

Nevertheless, in general an individual's ability to complete a college degree is not an unreasonable stand-in as evidence of that person's suitability for employment.

Roland June 27, 2015 at 5:14 pm

Nicely put, Ben.

Students are first caught in a trap of "credentials inflation" needed to obtain jobs, then caught by inflation in education costs, then stuck with undischargeable debt. And the more of them who get the credentials, the worse the credentials inflation–a spiral.

It's all fuelled by loose credit. The only beneficiaries are a managerial elite who enjoy palatial facilities.

As for the employers, they're not so bad off. Wages are coming down for credentialled employees due to all the competition. There is such a huge stock of degreed applicants that they can afford to ignore anyone who isn't. The credentials don't cost the employer–they're not spending the money, nor are they lending the money.

Modern money makes it possible for the central authorities to keep this racket going all the way up to the point of general systemic collapse. Why should they stop? Who's going to make them stop?

Bobbo June 26, 2015 at 10:19 am

The only reason the universities can get away with it is easy money. When the time comes that students actually need to pay tuition with real money, money they or their parents have actually saved, then college tuition rates will crash back down to earth. Don't blame the universities. This is the natural and inevitable outcome of easy money.

Jim June 26, 2015 at 10:54 am

Yes, college education in the US is a classic example of the effects of subsidies. Eliminate the subsidies and the whole education bubble would rapidly implode.

washunate June 26, 2015 at 11:03 am

I'm very curious if anyone will disagree with that assessment.

An obvious commonality across higher education, healthcare, housing, criminal justice, and national security is that we spend huge quantities of public money yet hold the workers receiving that money to extremely low standards of accountability for what they do with it.

tegnost June 26, 2015 at 11:38 am

Correct, it's not the universities, it's the culture that contains the universities; but the universities are training grounds for that culture, so it is the universities, just not only the universities. I've been remembering the song from my college days, "My future's so bright I gotta wear shades." Getting rich was the end in itself, and people who didn't make it didn't deserve anything but a whole lot of student debt, creating perverse incentives. And now we all know what the A in type A stands for, at least among those who self-identify as such, so yes, it is the universities.

Chris in Paris June 26, 2015 at 12:07 pm

I don't understand why the ability to accept guaranteed loan money doesn't come with an obligation for the school to cap tuition at a certain percentage over the maximum loan amount. Would that be so hard to institute?

Ben June 26, 2015 at 1:53 pm

Student loans are debt issuance. Western states are desperate to issue debt as it's fungible with money and marked down as growth.

Borrow 120K over 3 years and it all gets paid into university coffers and reappears as "profit" now. Let some other president deal with low disposable income due to loan repayments. It's in a different electoral cycle – perfect.
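To put a rough number on "low disposable income due to loan repayments": a minimal sketch using the standard fixed-rate amortization formula. The 6% rate and 20-year term are assumptions for illustration, not figures from the comment.

```python
def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate amortization: P * r / (1 - (1 + r)^-n),
    where r is the monthly rate and n the number of payments."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

# A hypothetical 120K loan at an assumed 6% over 20 years:
# roughly $860 a month, every month, for two decades.
print(round(monthly_payment(120_000, 0.06, 20), 2))
```

Stretching the term lowers the monthly bite but raises the total interest paid, which is the trade the borrower is stuck managing long after the university has booked the revenue.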

jrd2 June 26, 2015 at 11:50 am

You can try to argue, but it will be hard to refute. If you give mortgages at teaser rates to anybody who can fog a mirror, you get a housing bubble. If you give student loans to any student without regard to the prospects of that student paying back the loan, you get a higher education bubble. Which will include private equity trying to catch as much of this money as they possibly can by investing in for profit educational institutions just barely adequate to benefit from federal student loan funds.

jrs June 26, 2015 at 6:16 pm

A lot of background conditions help. It helps to pump a housing bubble if there's nothing else worth investing in (including saving money at zero interest rates). It helps pump an education bubble if most of the jobs have been outsourced so people are competing more and more for fewer and fewer.

Beans June 26, 2015 at 11:51 am

I don't disagree with the statement that easy money has played the biggest role in jacking up tuition. I do strongly disagree that we shouldn't "blame" the universities. The universities are exactly where we should place the blame. The universities have become job training grounds, and yet continue to drone on and on about the importance of noble things like liberal education, the pursuit of knowledge, the importance of ideas, etc. They cannot have it both ways. Years ago, when tuition rates started escalating faster than inflation, the universities should have been the loudest critics – pointing out the cultural problems that would accompany sending the next generation into the future deeply indebted – namely, that all the noble ideas learned at the university would get thrown out the window when financial reality forced recent graduates to choose between noble ideas and survival. If universities truly believed that a liberal education was important, that the pursuit of knowledge benefitted humanity, they should have led the charge to hold down tuition.

washunate June 26, 2015 at 12:47 pm

I took it to mean blame as in what allows the system to function. I heartily agree that highly paid workers at universities bear blame for what they do (and don't do) at a granular level.

It's just that they couldn't do those things without the system handing them gobs of resources, from tax deductibility of charitable contributions to ignoring anti-competitive behavior in local real estate ownership to research grants and other direct funding to student loans and other indirect funding.

Jim June 26, 2015 at 3:09 pm

Regarding blaming "highly paid workers at universities" – If a society creates incentives for dysfunctional behavior such a society will have a lot of dysfunction. Eliminate the subsidies and see how quicly the educational bubble pops.

James Levy June 26, 2015 at 2:45 pm

You are ignoring the way that the rich bid up the cost of everything. 2% of the population will pay whatever the top dozen or so schools will charge so that little Billy or Sue can go to Harvard or Stanford. This leads to cost creep as the next tier ratchet up their prices in lock step with those above them, etc. The same dynamic happens with housing, at least around wealthy metropolitan areas.

daniel June 26, 2015 at 12:07 pm

Hi to you two,

A European perspective on this: yep, that's true from an international perspective. I belong to the ugly list of those readers of this blog who do not fully share the liberal values of most of you here. However, may I say that I can agree on a lot of stuff.

US education and health care are outrageously costly. Every European citizen moving to the States has one question: what if he or she gets sick while there? Every European parent with kids in higher education is aware that having their kids for one closing year in the US is the most they can afford (except if they are a banquier d'affaires, i.e. an investment banker). Is the value of a US education good? No doubt! Is it good value for money? Of course not. Is the return on the money OK? It will prove disastrous, except if the USD crashes. The main reason? Easy money. As for any kind of investment. Remember that this is indeed an investment plan.

Check the level of revenues of "public sector" teaching staff on both sides of the pond. The figures for US professionals in these areas are available on the Web. They are indeed much more costly than, say, their North-of-Europe counterparts, "public sector" professionals in those areas. Is higher education in the Netherlands sub-par when compared to the US? Of course not.

Yep, financing education via the Fed (directly or not) is not only insanely costly. Just insane. The only decent solution: set up public institutions staffed with service-minded professionals who did not themselves have to pay an insane sum to build up their own education.

Are "public services" less efficient than private ones here in those area, health-care and higher education. Yep, most certainly. But, sure, having the fed indirectly finance the educational system just destroy any competitive savings made in building a competitive market-orientated educational system and is one of the worst way to handle your educational system.

Yep, you can make worse use of the money (subprime, or empty buildings in China), but that's about it.

The US should forget about exceptionalism and pay attention to what Northern Europe is doing in this area. Mind you, I am a Southerner (of Europe). But of course I understand that trying to run these services on a federal basis is indeed "mission impossible".

Way too big! Hence the indirect, Washington-decided, Wall-Street-intermediated, Fed-and-deficit-driven financing of higher education. Mind you: we have more and more of this bankers' meddling in education in Europe, and I do not like what I see.

John Zelnicker June 27, 2015 at 1:36 pm

@washunate – 6/26/15, 11:03 am. I know I'm late to the party, but I disagree. It's not the workers, it's the executives and management generally. Just like Wall Street, many of these top administrators have perfected the art of failing upwards.

IMNSHO everyone needs to stop blaming labor and/or the labor unions. It's not the front line workers, teachers, retail clerks, adjunct instructors, all those people who do the actual work rather than managing other people. Those workers have no bargaining power, and the unions have lost most of theirs, in part due to the horrible labor market, as well as other important reasons.

We have demonized virtually all of the government workers who actually do the work that enables us to even have a government (all levels) and to provide the services we demand, such as public safety, education, and infrastructure. These people are our neighbors, relatives and friends; we owe them better than this.

/end of rant

Roland June 27, 2015 at 5:20 pm

Unionized support staff at Canadian universities have had sub-inflation wage increases for nearly 20 years, while tuition has been rising at triple the rate of inflation.

So obviously one can't blame the unions for rising education costs.

Spring Texan June 28, 2015 at 8:03 am

Thanks for your rant! You said a mouthful. And could not be more correct.

Adam Eran June 26, 2015 at 12:18 pm

Omitted from this account: Federal funding for education has declined 55% since 1972. Part of the Powell memo's agenda.

It's understandable too; one can hardly blame legislators for punishing the educational establishment given the protests of the '60s and early '70s. After all, they were one reason Nixon and Reagan rose to power. How dare they propose real democracy! Harumph!

To add to students' burden, there's the recent revision of bankruptcy law: student loans can no longer be retired by bankruptcy (Thanks Hillary!) It'll be interesting to see whether Hillary's vote on that bankruptcy revision becomes a campaign issue.

I also wonder whether employers will start to look for people without degrees as an indication they were intelligent enough to sidestep this extractive scam.

washunate June 26, 2015 at 1:54 pm

I'd be curious what you count as federal funding. Pell grants, for example, have expanded both in terms of the number of recipients and the amount of spending over the past 3 – 4 decades.

More generally, federal support for higher ed comes in a variety of forms. The bankruptcy law you mention is itself a form of federal funding. Tax exemption is another. Tax deductibility of contributions is another. So are research grants and exemptions from anti-competitive laws and so forth. There are a range of individual tax credits and deductions. The federal government also does not intervene in a lot of state supports, such as licensing practices in law and medicine that make higher ed gatekeepers to various fiefdoms and allowing universities to take fees for administering (sponsoring) charter schools. The Federal Work-Study program is probably one of the clearest specific examples of a program that offers both largely meaningless busy work and terrible wages.

As far as large employers seeking intelligence, I'm not sure that's an issue in the US? Generally speaking, the point of putting a college credential in a job requirement is precisely to find people participating in the 'scam'. If an employer is genuinely looking for intelligence, they don't have minimum educational requirements.

Laughingsong June 26, 2015 at 3:12 pm

I heard that Congress is cutting those:
http://www.washingtonpost.com/blogs/wonkblog/wp/2014/12/10/congress-cuts-federal-financial-aid-for-needy-students/

different clue June 28, 2015 at 3:06 am

Why would tuition rates come down when students need to pay with "real money, money they or their parents have actually saved. . . " ? Didn't tuition at state universities begin climbing when state governments began boycotting state universities in terms of embargoing former rates of taxpayer support to them? Leaving the state universities to try making up the difference by raising tuition? If people want to limit or reduce the tuition charged to in-state students of state universities, people will have to resume paying former rates of taxes and elect people to state government to re-target those taxes back to state universities the way they used to do before the reductions in state support to state universities.

Jesper June 26, 2015 at 10:29 am

Protest against exploitation and risk being blacklisted by exploitative employers -> the only employers left are the ones who actually do want (not pretend to want) ethical people willing to stand up for what they believe in. Not many of those kinds of employers around. What is the benefit? What are the risks?

Tammy June 27, 2015 at 4:35 pm

What is the benefit? What are the risks?
I am not a progressive, yet there is always risk in solidary progress.

Andrew June 26, 2015 at 10:53 am

The author misrepresents the nature and demands of Chile's student movement.

Over the past few decades, university enrollment rates for Chileans expanded dramatically, in part due to the creation of many private universities. In Chile, public universities lead the pack in terms of academic reputation, and entrance is determined via competitive exams. As a result, students from poorer households who attended low-quality secondary schools generally need to look to private universities to get a degree. And these are the students to whom the newly created colleges catered.

According to Chilean legislation, universities can only function as non-profit entities. However, many of these new institutions were only nominally non-profit (for example, the owners of a university would also set up a real estate company that would rent the facilities to the college at above-market prices) and they were very much lacking in quality. After a series of high-profile cases of universities that were opened and shut within a few years, leaving their students in limbo and debt, anger mounted over for-profit education.

The widespread support for the student movement was due to generalized anger about an education system that is sorely lacking in quality and about violations of the spirit of the law regulating education. Once the student movement's demands became more specific and morphed from opposing for-profit institutions to demanding free tuition for everyone, the widespread support waned quickly.

And while the government announced free tuition in public universities, there is a widespread consensus that this is a pretty terrible idea, as it is regressive and involves large fiscal costs, in particular because most of the students who attend public universities come from relatively wealthy households that can afford tuition. The students who need tuition assistance will not benefit under the new rules.

I personally benefited from the fantastically generous financial aid systems that some private American universities have set up, which award grants and scholarships based on financial need only. And I believe that it is desirable for the State to guarantee that any qualified student has access to college regardless of his or her wealth. Still, I think that by romanticizing the Chilean student movement the author reveals himself to be either dishonest or, at best, ignorant.

RanDomino June 27, 2015 at 12:23 pm

The protests also involved extremely large riots.

The Insider June 26, 2015 at 10:57 am

Students aren't protesting because they don't feel the consequences until they graduate.

One thing that struck me when I applied for a student loan a few years back to help me get through my last year of graduate school – the living expense allocation was surprisingly high. Not "student sharing an apartment with five random dudes while eating ramen and riding the bus", but more "living alone in a nice one-bedroom apartment while eating takeout and driving a car". Apocryphal stories of students using their student loans to buy new cars or take extravagant vacations were not impossible to believe.

The living expense portion of student loans is often so generous that students can live relatively well while going to school, which makes it that much easier for them to push to the backs of their minds the consequences that will come from so much debt when they graduate. Consequently, it isn't the students who are complaining – it's the former students. But by the time they are out of school and the university has their money in its pocket, it's too late for them to try and change the system.

lord koos June 26, 2015 at 11:42 am

I'm sure many students are simply happy to be in college; the ugly truth hits later.

optimader June 26, 2015 at 12:39 pm

http://www.marketplace.org/topics/life/education/compete-students-colleges-roll-out-amenities

Sophomore Noell Conley lives there, too. She shows off the hotel-like room she shares with a roommate.

"As you walk in, to the right you see our granite countertops with two sinks, one for each of the residents," she says.

A partial wall separates the beds. Rather than trek down the hall to shower, they share a bathroom with the room next door.

"That's really nice compared to community bathrooms that I lived in last year," Conley says.

To be fair, granite countertops last longer. Tempur-Pedic is a local company - and gave a big discount. The amenities include classrooms and study space that are part of the dorm. Many of the residents are in the university's Honors program. But do students really need Apple TV in the lounges, or a smartphone app that lets them check their laundry status from afar?

"Demand has been very high," says the university's Penny Cox, who is overseeing the construction of several new residence halls on campus. Before Central Hall's debut in August, the average dorm was almost half a century old, she says. That made it harder to recruit.

"If you visit places like Ohio State, Michigan, Alabama," Cox says, "and you compare what we had with what they have available to offer, we were very far behind."

Today colleges are competing for a more discerning consumer. Students grew up with fewer siblings, in larger homes, Cox says. They expect more privacy than previous generations - and more comforts.

"These days we seem to be bringing kids up to expect a lot of material plenty," says Jean Twenge, a psychology professor at San Diego State University and author of the book "Generation Me."

Those students could be in for some disappointment when they graduate, she says.

"When some of these students have all these luxuries and then they get an entry-level job and they can't afford the enormous flat screen and the granite countertops," Twenge says, "then that's going to be a rude awakening."

Some on campus also worry about the divide between students who can afford such luxuries and those who can't. The so-called premium dorms cost about $1,000 more per semester. Freshman Josh Johnson, who grew up in a low-income family and lives in one of the university's 1960s-era buildings, says the traditional dorm is good enough for him.

"I wouldn't pay more just to live in a luxury dorm," he says. "It seems like I could just pay the flat rate and get the dorm I'm in. It's perfectly fine."

In the near future students who want to live on campus won't have a choice. Eventually the university plans to upgrade all of its residence halls.

So I wonder who, on average, will fare better navigating the post-college lifestyle/job-market reality check, Noell or Josh? Personally, I would bet on the Joshes living in the '60s-vintage enamel-painted cinderblock dorm rooms.

optimader June 26, 2015 at 12:47 pm

Universities responding to the market

http://www.thefiscaltimes.com/Articles/2012/08/29/10-Public-Colleges-with-Insanely-Luxurious-Dorms

Competition for students who have more sophisticated tastes than in past years is creating the perfect environment for schools to try to outdo each other with ever-more posh on-campus housing. Keeping up in the luxury dorm race is increasingly critical to a school's bottom line: A 2006 study published by the Association of Higher Education Facilities Officers found that "poorly maintained or inadequate residential facilities" was the number-one reason students rejected enrolling at institutions.


Private universities get most of the mentions on lists of schools with great dorms, as recent ratings by the Princeton Review, College Prowler, and Campus Splash make clear. But a few state schools that have invested in brand-new facilities are starting to show up on those reviews, too.

While many schools offer first dibs on the nicest digs to upperclassmen on campus, as the war for student dollars ratchets up even first-year students at public colleges are living in style. Here are 10 on-campus dormitories at state schools that offer students resort-like amenities.

Jerry Denim June 26, 2015 at 4:37 pm

Bingo! They don't get really mad until they're in their early thirties and they are still stuck doing some menial job with no vacation time, no health insurance, and a monstrous mountain of debt. Up until that point they're still working hard, waiting for their ship to come in and blaming themselves for any lack of success, like Steinbeck's 'embarrassed millionaires.' Then one day, maybe a decade after they graduate, they realize they've been conned; but they've got bills to pay and other problems to worry about, so they soldier on. 18-year-olds are told by their high school guidance counselors, their parents, and all of the adults they trust that college, while expensive, is a good investment and the only way to succeed. Why should they argue? They don't know any better yet.

different clue June 28, 2015 at 3:09 am

Perhaps some students are afraid to protest for fear of being photographed or videographed and having their face and identity given to every prospective employer throughout America. Perhaps those students are afraid of being blackballed throughout the Great American Workplace if they are caught protesting anything on camera.

Today isn't like the sixties when you could drop out in the confidence that you could always drop back in again. Nowadays there are ten limpets for every scar on the rock.

seabos84 June 26, 2015 at 11:16 am

The average is such a worthless number. The data we need, and which all these parasitic professional-managerial types won't provide:
x axis would be family income, by $5,000 increments.
y axis would be the median debt level.
We could get fancy, and also throw in how many kids are in school in each of those income increments.
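The binned-median tabulation proposed above can be sketched in a few lines. The sample records are hypothetical, purely to show the shape of the computation, not real figures.

```python
from collections import defaultdict
from statistics import median

def median_debt_by_income(records, bin_size=5000):
    """Group (family_income, debt) pairs into income bins of
    bin_size dollars and return the median debt for each bin."""
    bins = defaultdict(list)
    for income, debt in records:
        # floor each income to the bottom of its $5,000 bin
        bins[(income // bin_size) * bin_size].append(debt)
    return {lo: median(debts) for lo, debts in sorted(bins.items())}

# Hypothetical (income, student debt) records for illustration only.
sample = [(32_000, 28_000), (34_000, 31_000), (61_000, 22_000),
          (63_000, 18_000), (118_000, 5_000)]
print(median_debt_by_income(sample))
```

Adding the "how many kids per increment" refinement is just `len(debts)` per bin alongside the median.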

BTW – this 55-yr.-old troglodyte believes that 1 of the roles (note – I did NOT say "The Role") of education is preparing people to be useful to society. 300++ million Americans, 7 billion humans – we ALL need shelter, reliable and safe food, reliable and safe water, sewage disposal, clothing, transportation, education, sick care, power, leisure; we should ALL have access to family-wage jobs and time for BBQs with our various communities several times a year. I know plenty of techno-dweebs here in Seattle who need to learn some of the lessons of 1984, The Prince, and Shakespeare. I know plenty of fuzzies who could be a bit more useful with some rudimentary skills in engineering, or accounting, or finance, or stats, or bio, or chem.
I don't know what the current education system is providing, other than some accidental good things for society at large, and mainly mechanisms for the para$ite cla$$e$ to stay parasites.

rmm.

Adam Eran June 26, 2015 at 12:22 pm

Mao was perfectly content to promote technical education in the new China. What he deprecated (and fought to suppress) was the typical liberal arts notion of critical thinking. We're witnessing something comparable in the U.S.

This suppression in China led to an increase in Mao's authority (obviously), but kept him delusional. For example, because China relied on Mao's agricultural advice, an estimated 70 million Chinese died during peacetime. But who else was to be relied upon as an authority?

Back to the U.S.S.A. (the United StateS of America): one Australian says of the American system: "You Yanks don't consult the wisdom of democracy; you enable mobs."

Tammy June 27, 2015 at 4:41 pm

Mao was perfectly content to promote technical education in the new China. What he deprecated (and fought to suppress) was the typical liberal arts notion of critical thinking. We're witnessing something comparable in the U.S.

Mao liked chaos because he believed in continuous revolution. I would argue what we're experiencing is nothing comparable to what China experienced. (I hope I've understood you correctly.)

Ted June 26, 2015 at 11:20 am

I am pretty sure a tradition of protest to effect political change in the US is a rather rare bird. Most people "protest" by changing their behavior. As an example, by questioning the value of the $46,000 local private college tuition as opposed to the $15k and $9k tiered state college options. My daughter is entering the freshman class next year; we opted for the cheaper state option because, in the end, a private school degree adds nothing, unless it is from a high-name-recognition institution.

I think, like housing, a downstream consequence of "the gouge" is not to question - much less understand - class relations, but to assess the value of the lifestyle choice once you are stuck with the price of paying for that lifestyle in the form of inflated debt repayments. Eventually "the folk" figure it out and encourage cheaper alternatives toward the same goal.

Jim June 26, 2015 at 3:18 pm

There's probably little point in engaging in political protest. Most people maximise their chances of success by focusing on variables over which they have some degree of control. The ability of most people to have much effect on the overall political-economic system is slight and any returns from political activity are highly uncertain.

jrs June 26, 2015 at 9:53 pm

How does anyone even expect to maintain cheap available state options without political activity? By wishful thinking I suppose?

The value of a private school might be graduating sooner, state schools are pretty overcrowded, but that may not at all be worth the debt (I doubt it almost ever is on a purely economic basis).

RabidGandhi June 27, 2015 at 7:57 pm

Maybe if we just elect the right people with cool posters and a hopey changey slogan, they'll take care of everything for us and we won't have to be politically active.

jrs June 26, 2015 at 10:04 pm

Of course refusal to engage politically because the returns to oneself by doing so are small really IS the tragedy of the commons. Thus one might say it's ethical to engage politically in order to avoid it. Some ethical action focuses on overcoming tragedy of the commons dilemmas. Of course the U.S. system being what it is I have a hard time blaming anyone for giving up.

chairman June 26, 2015 at 11:37 am

The middle class, working class and poor have no voice in politics or policy at all, and they don't know what's going on until it's too late. They've all been pushed by their high school staff to believe that college is the only acceptable option - and often it is. What else are they going to do out of high school, work a 30-hour-a-week minimum wage retail job? The upper middle class and rich, who entirely monopolize the media, don't have any reason to care about skyrocketing college tuition - their parents are paying for it anyway. They'd rather write about the hip and trendy issues of the day, like trigger warnings.

Fool June 26, 2015 at 1:17 pm

To the contrary, they're hardly advised by "their high school staff"; nonetheless, subway ads for Phoenix, Monroe, etc. have a significant influence.

Uncle Bruno June 26, 2015 at 11:58 am

They're too busy working

Fool June 26, 2015 at 1:20 pm

Also Tinder.

collegestudent June 26, 2015 at 12:39 pm

Speaking as one of these college students, I think that a large part of the reason that the vast majority of students are just accepting the tuition rates is because it has become the societal norm. Growing up I can remember people saying "You need to go to college to find a good job." Because a higher education is seen as a necessity for most people, students think of tuition as just another form of taxes, acceptable and inevitable, which we will expect to get a refund on later in life.

Pitchfork June 26, 2015 at 1:03 pm

I teach at a "good" private university. Most of my students don't have a clue as to how they're being exploited. Many of the best students feel enormous pressure to succeed and have some inkling that their job prospects are growing narrower, but they almost universally accept this as the natural order of things. Their outlook: if there are 10 or 100 applicants for every available job, well, by golly, I just have to work that much harder and be the exceptional one who gets the job.

Incoming freshmen were born in the late 90s - they've never known anything but widespread corruption, financial and corporate oligarchy, iPads and the Long Recession.

But as other posters note, the moment of realization usually comes after four years of prolonged adolescence, luxury dorm living and excessive debt accumulation.

Tammy June 27, 2015 at 4:49 pm

Most Ph.D.s don't either. I'd argue there have been times when they have attempted, through linguistic games, to argue that exploitation is a good - for their employer and for themselves. Mind-numbing. To be fair, they have a job.

Gottschee June 26, 2015 at 1:34 pm

I have watched the tuition double–double!–at my alma mater in the last eleven years. During this period, administrators have set a goal of increasing enrollment by a third, and from what I hear, they've done so. My question is always this: where is the additional tuition money going? Because as I walk through the campus, I don't really see that many improvements–yes, a new building, but that was supposedly paid for by donations and endowments. I don't see new offices for these high-priced admin people that colleges are hiring, and in fact, what I do see is an increase in the number of part-time faculty and adjuncts. The tenured faculty is not prospering from all this increased revenue, either.

I suspect the tuition is increasing so rapidly simply because the college can get away with it. And that means they are exploiting the students.

While still a student, I once calculated that it cost me $27.00/hour to be in class (15 weeks x 20 "contact hours" per week = 300 hours/semester; $8,000/semester divided by 300 hours ≈ $27.00/hour). A crude calculation, certainly, but a starting point. I did this because I had an instructor who was consistently late to class, and often cancelled class, so much that he wiped out at least $300.00 worth of instruction. I had the gall to ask for a refund of that amount. I'm full of gall. Of course, I was laughed at, not just by the administrators, but also by some students.
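That back-of-the-envelope number is easy to check mechanically. A minimal sketch using the figures from the comment above (the tuition and contact-hour counts are that commenter's, not universal):

```python
# Cost of one classroom "contact hour", using the commenter's figures.
weeks_per_semester = 15
contact_hours_per_week = 20
tuition_per_semester = 8000.00  # dollars

hours_per_semester = weeks_per_semester * contact_hours_per_week  # 300
cost_per_hour = tuition_per_semester / hours_per_semester

print(f"{hours_per_semester} contact hours per semester")
print(f"${cost_per_hour:.2f} per contact hour")  # $26.67, roughly the $27 quoted
```

By the same arithmetic, an instructor who cancels a dozen or so class hours has cost each student several hundred dollars of paid-for instruction.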

Just like medical care, education pricing is "soft," that is, the price is what you are willing to pay. Desirable students get scholarships and stipends, which other students subsidize; similarly, some pre-ACA patients in hospitals were often treated gratis.

Students AND hospital patients alike seem powerless to affect the contract with the provider. Reform will not likely be forthcoming, as students, like patients, are "just passing through."

Martin Finnucane June 26, 2015 at 2:10 pm

Higher education wears the cloak of liberalism, but in policy and practice, it can be a corrupt and cutthroat system of power and exploitation.

I find the "but" in that sentence to be dissonant.

Mark Anderson June 26, 2015 at 3:12 pm

The tuition at most public universities has quadrupled or more over the last 15 to 20 years precisely BECAUSE state government subsidies have been slashed in the meantime. I was told around 2005 that quadrupled tuition at the University of Minnesota made up for about half of the state money that the legislature had slashed from the university budget over the previous 15 years.

It is on top of that situation that university administrators are building themselves little aristocratic empires, very much modeled on the kingdoms of corporate CEOs, where reducing expenses (cutting faculty) and services to customers (fewer classes, more adjuncts) is seen as the height of responsibility and accountability, perhaps even the definition of propriety.

Jim June 26, 2015 at 3:23 pm

Everyone should read the introductory chapter of David Graeber's "The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy."

In Chapter One of this book entitled "The Iron law of Liberalism and the Era of Total Bureaucratization" Graeber notes that the US has become the most rigidly credentialised society in the world where

" in field after field from nurses to art teachers, physical therapists, to foreign policy consultants, careers which used to be considered an art (best learned through doing) now require formal professional training and a certificate of completion."

Graeber, in that same chapter, makes another extremely important point when he notes that career advancement in many large bureaucratic organizations demands a willingness to play along with the fiction that advancement is based on merit, even though most everyone knows that this isn't true.

The structure of modern power in the U.S., in both the merging public and private sectors, is built around the false ideology of a giant credentialized meritocracy rather than the reality of arbitrary extraction by predatory bureaucratic networks.

armchair June 26, 2015 at 3:27 pm

Anecdote: I was speaking to someone who recently started working as a law school administrator at my alma mater. Enrollment is actually down at law schools (I believe), because word has spread about the lame legal job market. So the school administration is watching its pennies, and the new administrator says the administrators aren't getting to go on as many of the all-expenses-paid conferences and junkets as they used to back in the heyday. As I hear this, I am thinking about how many of those awesome conferences in San Diego, New Orleans and New York I'm still paying back. Whatever happened to the metaphorical phrase "when a pig becomes a hog, it goes to slaughter"?

Another anecdote: I see my undergrad alma mater has demolished the Cold War era dorms on one part of campus and replaced it with tons of slick new student housing.

MaroonBulldog June 26, 2015 at 7:15 pm

No doubt those Cold War era dorms had outlived their planned life. Time for replacement. Hell, they had probably become uninhabitable and unsafe.

Meanwhile, has your undergraduate school replaced any of its lecture courses with courses presented on the same model as online traffic school? I have a pending comment below about how my nephew's public university "taught" him introductory courses in accounting and macroeconomics that way. Please be assured that the content of those courses was on a par with best practices in the online traffic school industry. It would be hilarious if it weren't so desperately sad.

Roquentin June 26, 2015 at 5:04 pm

I read things like this and think about Louis Althusser and his ideas about "Ideological State Apparatuses." While in liberal ideology education is usually considered to be the space where the opportunity to improve one's situation is founded, Althusser reached the complete opposite conclusion. For him, universities are the definitive bourgeois institution, the ideological state apparatus of the modern capitalist state par excellence. The real purpose of the university was not to level the playing field of opportunity but to preserve the advantages of the bourgeoisie and their children, allowing the class system to perpetuate/reproduce itself.

It certainly would explain a lot. It would explain why trying to send everyone to college won't solve this, because not everyone can have a bourgeois job. Some people actually have to do the work. The whole point of the university as an institution was to act as a sorting/distribution hub for human beings, placing them at certain points within the division of labor. A college degree used to mean more because getting it was like a golden ticket, guaranteeing someone who got it at least a petit-bourgeois lifestyle. The thing is, there are only so many slots in corporate America for this kind of employment. That number is getting smaller too. You could hand every man, woman, and child in America a BS and it wouldn't change this in the slightest.

What has happened instead, for college to preserve its role as the sorting mechanism/preservation of class advantage, is what I like to call degree inflation and/or an elite formed within degrees themselves. Now a BS or BA isn't enough; one needs a Master's or PhD to really be distinguished. Now a degree from just any institution won't do; it has to be an Ivy or a Tier 1 school. Until we learn to think realistically about what higher education is as an institution, little or nothing will change.

Jim June 26, 2015 at 8:14 pm

Any credential is worthless if everybody has it. All information depends on contrast. It's impossible for everybody to "stand out" from the masses. The more people have college degrees the less value a college degree has.

sid_finster June 26, 2015 at 5:49 pm

When I was half-grown, I heard it said that religion is no longer the opiate of the masses, in that no one believes in God anymore, at least not enough for it to change actual behavior.

Instead, buying on credit is the opiate of the masses.

MaroonBulldog June 26, 2015 at 6:58 pm

My nephew asked me to help him with his college introductory courses in macroeconomics and accounting. I was disappointed to find out what was going on: no lectures by professors, no discussion sessions with teaching assistants, no team projects - just two automated correspondence courses, with computer-graded problem sets and objective tests, either multiple choice, fill in the blank with a number, or fill in the blank with a form answer. This from a public university that is charging tuition for attendance just as though it were really teaching something. All they're really certifying is that the student can perform exercises in correctly reporting what a couple of textbooks said about subjects of marginal relevance to his degree. My nephew understands exactly what is going on, but still ...

This is how 21st century America treats its young people: it takes people who are poor, in the sense that they have no assets, and makes them poorer, loading them up with student debt, which they incur in order to finance a falsely-so-called course of university study that can't be a good deal, even for the best students among them.

I am not suggesting the correspondence courses have no worth at all. But they do not have the worth that is being charged for them in this bait-and-switch exercise by Ed Business.

MaroonBulldog June 27, 2015 at 1:39 am

After further thought, I'd compare my nephew's two courses to on-line traffic school: Mechanized "learning" – forget it all as soon as the test is over – Critical thinking not required. Except for the kind of "test preparation" critical thinking that teaches one to spot and eliminate the obviously wrong choices in objective answers–that kind of thinking saves time and so is very helpful.

Not only is he paying full tuition to receive this treatment, but his family and mine are paying taxes to support it, too.

Very useful preparation for later life, where we can all expect to attend traffic school a few times. But no preparation for any activity of conceivable use or benefit to any other person.

Spring Texan June 28, 2015 at 8:07 am

Good story. What a horrible rip-off!

P. Fitzsimon June 27, 2015 at 12:26 pm

I read recently that the business establishment viewed the most important contribution of colleges was that they warehoused young people for four years to allow maturing.

Fred Grosso June 27, 2015 at 4:55 pm

Where are the young people in all this? Is anyone going to start organizing to change things? Any ideas? Any interest? Are we going to have some frustrated, emotional person attempt to kill a university president once every ten years? Then education can appeal for support from the government to beef up security. Meanwhile the same old practices will prevail and the rich get richer and the rest of us get screwed.

Come on people step up.

Unorthodoxmarxist June 27, 2015 at 6:22 pm

The reason students accept this has to be the absolutely demobilized political culture of the United States combined with what college represents structurally to students from the middle classes: the only possibility – however remote – of achieving any kind of middle class income.

Really your choices in the United States are, in terms of jobs, to go into the military (and this is really for working class kids, Southern families with a military history and college-educated officer-class material) or to go to college.

The rest, who have no interest in the military, attend college, much like those who wanted to achieve despite their class background went into the priesthood in the medieval period. There hasn't been a revolt due to the lack of any idea that it could function differently, and American families are still somehow willing to pay the exorbitant rates to give their children a piece of paper that still enables them to claim middle class status, though fewer and fewer find jobs. $100k in debt seems preferable to no job prospects at all.

Colleges have become a way for the ruling class to launder money into supposed non-profits and use endowments to purchase stocks, bonds, and real estate. College administrators and their lackeys (the extended school bureaucracy) are propping up another part of the financial sector – just take a look at Harvard's $30+ billion endowment, or Yale's $17 billion – these are just the top of a very large heap. They're all deep into the financial sector. Professors and students are simply there as an excuse for the alumni money machine and real estate scams to keep running, but there's less and less of a reason for them to employ professors, and I say this as a PhD with ten years of teaching experience who has seen the market dry up even more than it was when I entered grad school in the early 2000s.

A Real Black Person purple monkey dishwasher June 28, 2015 at 9:13 pm

"Colleges have become a way for the ruling class to launder money into supposed non-profits and use endowments to purchase stocks, bonds, and real estate. "

Unorthodoxmarxist, I thought I was the only person who was coming to that conclusion. I think there's data out there that could support our thesis that college tuition inflation may be affecting real estate prices. After all, the justification one college grad gave to someone questioning the value of a college degree was that by obtaining "a degree" and a professional job, an adult could afford to buy a home in a major metropolitan hub. I'm not sure if he was that ignorant (business majors, despite the math requirement, are highly ideological people; they're nowhere near as objective as they like to portray themselves) or if he hasn't been in contact with anyone with a degree trying to buy a home in a metropolitan area.

Anyways, if our thesis is true, then if home prices declined in 2009, then college tuition should have declined as well, but it didn't at most trustworthy schools. Prospective students kept lining up to pay more for education that many insiders believe is "getting worse" because of widespread propaganda and a lack of alternatives, especially for "middle class" women.

Pelham June 27, 2015 at 7:04 pm

It's hard to say, but there ought to be a powder keg of students here primed to blow. And Bernie Sanders' proposal for free college could be the fuse.

But first he'd have to light the fuse, and maybe he can. He's getting huge audiences and a lot of interest these days. And here's a timely issue. What would happen if Sanders toured colleges and called for an angry, mass and extended student strike across the country - launching on a certain date this fall or next spring - to protest these obscene tuitions, and maybe called for something else concrete, like a maximum ratio of administrators to faculty for colleges to receive accreditation?

It could ignite not only a long-overdue movement on campuses but also give a big boost to his campaign. He'd have millions of motivated and even furious students on his side as well as a lot of motivated and furious parents of students (my wife and I would be among them) - and these are just the types of people likely to get out and vote in the primaries and general election.

Sanders' consistent message about the middle class is a strong one. But here's a solid, specific but very wide-ranging issue that could bring that message into very sharp relief and really get a broad class of politically engaged people fired up.

I'm not one of those who think Sanders can't win but applaud his candidacy because it will nudge Hillary Clinton. I don't give a fig about Clinton. I think there's a real chance Sanders can win not just the nomination but also the presidency. This country is primed for a sharp political turn. Sanders could well be the right man in the right place and time. And this glaring and ongoing tuition ripoff that EVERYONE agrees on could be the single issue that puts him front-and-center rather than on the sidelines.

Rosario June 28, 2015 at 1:18 am

I finished graduate school about three years ago. During the pre-graduate terms that I paid out of pocket (2005-2009) I saw a near 70 percent increase in tuition (look up KY college tuition 1987-2009 for proof).

Straight bullshit, but remember our school was just following the national (Neoliberal) model.

Though, realize that I was 19-23 years old. Very immature (still immature) and feeling forces beyond my control. I did not protest out of a) fear [?] (I don't know, maybe, just threw that in there) or b) belief that the sheepskin is the path to salvation (including social/cultural pressures from parents, etc.).

I was more affected by b). This is the incredible power of our current Capitalist culture. It trains us well. We are always speaking its language, as if a Classic. Appraising its world through its values.

I wished to protest (i.e. Occupy, etc.) but to which master? All of its targets are post modern, all of it, to me, nonsense, and, because of this undead (unable to be destroyed). This coming from a young man, as I said, still immature, though I fear this misdirection, and alienation is affecting us all.

John June 28, 2015 at 10:42 am

NYU can gouge away. It's filled with Chinese students (spies) who pay full tuition.

[Nov 27, 2017] The Robot Productivity Paradox and the concept of the bezzle

This concept of the "bezzle" is an important one.
Feb 22, 2017 | econospeak.blogspot.com

Sandwichman -> Sandwichman ... February 24, 2017 at 08:36 AM

John Kenneth Galbraith, from "The Great Crash 1929":

"In many ways the effect of the crash on embezzlement was more significant than on suicide. To the economist embezzlement is the most interesting of crimes. Alone among the various forms of larceny it has a time parameter. Weeks, months or years may elapse between the commission of the crime and its discovery. (This is a period, incidentally, when the embezzler has his gain and the man who has been embezzled, oddly enough, feels no loss. There is a net increase in psychic wealth.)

At any given time there exists an inventory of undiscovered embezzlement in – or more precisely not in – the country's business and banks.

This inventory – it should perhaps be called the bezzle – amounts at any moment to many millions [trillions!] of dollars. It also varies in size with the business cycle.

In good times people are relaxed, trusting, and money is plentiful. But even though money is plentiful, there are always many people who need more. Under these circumstances the rate of embezzlement grows, the rate of discovery falls off, and the bezzle increases rapidly.

In depression all this is reversed. Money is watched with a narrow, suspicious eye. The man who handles it is assumed to be dishonest until he proves himself otherwise. Audits are penetrating and meticulous. Commercial morality is enormously improved. The bezzle shrinks."

Sandwichman, February 24, 2017 at 05:24 AM

For nearly a half a century, from 1947 to 1996, real GDP and real Net Worth of Households and Non-profit Organizations (in 2009 dollars) both increased at a compound annual rate of a bit over 3.5%. GDP growth, in fact, was just a smidgen faster -- 0.016% -- than growth of Net Household Worth.

From 1996 to 2015, GDP grew at a compound annual rate of 2.3% while Net Worth increased at the rate of 3.6%....

-- Sandwichman
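The compound annual growth rates quoted above come from the standard CAGR formula; a minimal sketch (the sample numbers below are placeholders for illustration, not the actual FRED series values):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate between two observations."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Placeholder example: an index rising from 100 to 540 over the
# 49 years 1947-1996 implies roughly 3.5% compound annual growth.
growth = cagr(100.0, 540.0, 49)
print(f"{growth:.2%}")  # 3.50%
```

Run the same function over GDP and household net worth for 1996-2015 and the two series diverge, which is exactly the gap the comment is pointing at.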

anne -> anne... February 24, 2017 at 05:25 AM

https://fred.stlouisfed.org/graph/?g=cOU6

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1952-2016

(Indexed to 1952)

https://fred.stlouisfed.org/graph/?g=cPq1

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1992-2016

(Indexed to 1992)

anne -> Sandwichman ... February 24, 2017 at 03:35 PM

The real home price index extends from 1890. From 1890 to 1996, the index increased slightly faster than inflation so that the index was 100 in 1890 and 113 in 1996. However from 1996 the index advanced to levels far beyond any previously experienced, reaching a high above 194 in 2006. Previously the index high had been just above 130.

Though the index fell from 2006, the level in 2016 is above 161, a level only reached when the housing bubble had formed in late 2003-early 2004.

Real home prices are again strikingly high:

http://www.econ.yale.edu/~shiller/data.htm

anne -> Sandwichman ... February 24, 2017 at 03:34 PM

Valuation

The Shiller 10-year price-earnings ratio is currently 29.34, so the inverse or the earnings rate is 3.41%. The dividend yield is 1.93. So an expected yearly return over the coming 10 years would be 3.41 + 1.93 or 5.34% provided the price-earnings ratio stays the same and before investment costs.

Against the 5.34% yearly expected return on stock over the coming 10 years, the current 10-year Treasury bond yield is 2.32%.

The risk premium for stocks is 5.34 - 2.32 or 3.02%:

http://www.econ.yale.edu/~shiller/data.htm
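The valuation arithmetic in that comment is straightforward to reproduce; a quick sketch with the figures as quoted (CAPE, dividend yield, and Treasury yields all move daily, so these are snapshot values):

```python
cape = 29.34           # Shiller 10-year price-earnings ratio
dividend_yield = 1.93  # percent
treasury_10y = 2.32    # percent, 10-year Treasury yield

earnings_yield = 100.0 / cape                      # inverse of CAPE, in percent
expected_return = earnings_yield + dividend_yield  # assumes constant CAPE, before costs
risk_premium = expected_return - treasury_10y

print(f"earnings yield:  {earnings_yield:.2f}%")   # 3.41%
print(f"expected return: {expected_return:.2f}%")  # 5.34%
print(f"risk premium:    {risk_premium:.2f}%")     # 3.02%
```

The "constant price-earnings ratio" assumption is doing a lot of work here, which is the commenter's own caveat.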

anne -> anne..., February 24, 2017 at 05:36 AM

What the robot-productivity paradox is puzzles me, other than since 2005 for all the focus on the productivity of robots and on robots replacing labor there has been a dramatic, broad-spread slowing in productivity growth.

However, what the changing relationship between the growth of GDP and net worth since 1996 shows is that asset valuations have been increasing relative to GDP. Valuations of stocks and homes are at sustained levels that are higher than at any time in the last 120 years. Bear markets in stocks and home prices have still left asset valuations at historically high levels. I have no idea why this should be.

Sandwichman -> anne... February 24, 2017 at 08:34 AM

The paradox is that productivity statistics can't tell us anything about the effects of robots on employment because both the numerator and the denominator are distorted by the effects of colossal Ponzi bubbles.

John Kenneth Galbraith used to call it "the bezzle." It is "that increment to wealth that occurs during the magic interval when a confidence trickster knows he has the money he has appropriated but the victim does not yet understand that he has lost it." The current size of the gross national bezzle (GNB) is approximately $24 trillion.

Ponzilocks and the Twenty-Four Trillion Dollar Question

http://econospeak.blogspot.ca/2017/02/ponzilocks-and-twenty-four-trillion.html

Twenty-three and a half trillion, actually. But what's a few hundred billion? Here today, gone tomorrow, as they say.

At the beginning of 2007, net worth of households and non-profit organizations exceeded its 1947-1996 historical average, relative to GDP, by some $16 trillion. It took 24 months to wipe out eighty percent, or $13 trillion, of that colossal but ephemeral slush fund. In mid-2016, net worth stood at a multiple of 4.83 times GDP, compared with the multiple of 4.72 on the eve of the Great Unworthing.

When I look at the ragged end of the chart I posted yesterday, it screams "Ponzi!" "Ponzi!" "Ponz..."

To make a long story short, let's think of wealth as capital. The value of capital is determined by the present value of an expected future income stream. The value of capital fluctuates with changing expectations but when the nominal value of capital diverges persistently and significantly from net revenues, something's got to give. Either economic growth is going to suddenly gush forth "like nobody has ever seen before" or net worth is going to have to come back down to earth.

Somewhere between 20 and 30 TRILLION dollars of net worth will evaporate within the span of perhaps two years.

When will that happen? Who knows? There is one notable regularity in the data, though -- the one that screams "Ponzi!"

When the net worth bubble stops going up...
...it goes down.

[Nov 27, 2017] The productivity paradox by Ryan Avent

Mar 20, 2017 | medium.com

People are worried about robots taking jobs. Driverless cars are around the corner. Restaurants and shops increasingly carry the option to order by touchscreen. Google's clever algorithms provide instant translations that are remarkably good.

But the economy does not feel like one undergoing a technology-driven productivity boom. In the late 1990s, tech optimism was everywhere. At the same time, wages and productivity were rocketing upward. The situation now is completely different. The most recent jobs reports in America and Britain tell the tale. Employment is growing, month after month after month. But wage growth is abysmal. So is productivity growth: not surprising in economies where there are lots of people on the job working for low pay.

The obvious conclusion, the one lots of people are drawing, is that the robot threat is totally overblown: the fantasy, perhaps, of a bubble-mad Silicon Valley - or an effort to distract from workers' real problems, trade and excessive corporate power. Generally speaking, the problem is not that we've got too much amazing new technology but too little.

This is not a strawman of my own invention. Robert Gordon makes this case. You can see Matt Yglesias make it here. Duncan Weldon, for his part, writes:

We are debating a problem we don't have, rather than facing a real crisis that is the polar opposite. Productivity growth has slowed to a crawl over the last 15 or so years, business investment has fallen and wage growth has been weak. If the robot revolution truly was under way, we would see surging capital expenditure and soaring productivity. Right now, that would be a nice "problem" to have. Instead we have the reality of weak growth and stagnant pay. The real and pressing concern when it comes to the jobs market and automation is that the robots aren't taking our jobs fast enough.

And in a recent blog post Paul Krugman concluded:

I'd note, however, that it remains peculiar how we're simultaneously worrying that robots will take all our jobs and bemoaning the stalling out of productivity growth. What is the story, really?

What is the story, indeed. Let me see if I can tell one. Last fall I published a book: "The Wealth of Humans". In it I set out how rapid technological progress can coincide with lousy growth in pay and productivity. Start with this:

Low labour costs discourage investments in labour-saving technology, potentially reducing productivity growth.

Peter K. -> Peter K.... Monday, March 20, 2017 at 09:26 AM

Increasing labour costs by making the minimum wage a living wage would increase the incentives to boost productivity growth? No, the neoliberals and corporate Democrats would never go for it. They're trying to appeal to the business community and their campaign contributors wouldn't like it.

anne -> Peter K.... March 20, 2017 at 10:32 AM

https://twitter.com/paulkrugman/status/843167658577182725

Paul Krugman @paulkrugman

But is [Ryan Avent] saying something different from the assertion that recent tech progress is capital-biased?

https://krugman.blogs.nytimes.com/2012/12/26/capital-biased-technological-progress-an-example-wonkish/

If so, what?

anne -> Peter K.... March 20, 2017 at 10:33 AM

http://krugman.blogs.nytimes.com/2012/12/26/capital-biased-technological-progress-an-example-wonkish/

December 26, 2012

Capital-biased Technological Progress: An Example (Wonkish)
By Paul Krugman

Ever since I posted about robots and the distribution of income, * I've had queries from readers about what capital-biased technological change – the kind of change that could make society richer but workers poorer – really means. And it occurred to me that it might be useful to offer a simple conceptual example – the kind of thing easily turned into a numerical example as well – to clarify the possibility. So here goes.

Imagine that there are only two ways to produce output. One is a labor-intensive method – say, armies of scribes equipped only with quill pens. The other is a capital-intensive method – say, a handful of technicians maintaining vast server farms. (I'm thinking in terms of office work, which is the dominant occupation in the modern economy).

We can represent these two techniques in terms of unit inputs – the amount of each factor of production required to produce one unit of output. In the figure below I've assumed that initially the capital-intensive technique requires 0.2 units of labor and 0.8 units of capital per unit of output, while the labor-intensive technique requires 0.8 units of labor and 0.2 units of capital.

[Diagram]

The economy as a whole can make use of both techniques – in fact, it will have to unless it has either a very large amount of capital per worker or a very small amount. No problem: we can just use a mix of the two techniques to achieve any input combination along the blue line in the figure. For economists reading this, yes, that's the unit isoquant in this example; obviously if we had a bunch more techniques it would start to look like the convex curve of textbooks, but I want to stay simple here.

What will the distribution of income be in this case? Assuming perfect competition (yes, I know, but let's deal with that case for now), the real wage rate w and the cost of capital r – both measured in terms of output – have to be such that the cost of producing one unit is 1 whichever technique you use. In this example, that means w=r=1. Graphically, by the way, w/r is equal to minus the slope of the blue line.

Oh, and if you're worried, yes, workers and machines are both paid their marginal product.

But now suppose that technology improves – specifically, that production using the capital-intensive technique gets more efficient, although the labor-intensive technique doesn't. Scribes with quill pens are the same as they ever were; server farms can do more than ever before. In the figure, I've assumed that the unit inputs for the capital-intensive technique are cut in half. The red line shows the economy's new choices.

So what happens? It's obvious from the figure that wages fall relative to the cost of capital; it's less obvious, maybe, but nonetheless true that real wages must fall in absolute terms as well. In this specific example, technological progress reduces the real wage by a third, to 0.667, while the cost of capital rises to 2.33.

OK, it's obvious how stylized and oversimplified all this is. But it does, I think, give you some sense of what it would mean to have capital-biased technological progress, and how this could actually hurt workers.

* http://krugman.blogs.nytimes.com/2012/12/08/rise-of-the-robots/
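A quick check of the numbers in the example above: under perfect competition the unit cost of output is 1 for both techniques, so the wage w and capital cost r solve the zero-profit conditions

```latex
\text{Before: } 0.2w + 0.8r = 1, \quad 0.8w + 0.2r = 1 \;\Rightarrow\; w = r = 1
\text{After: }  0.1w + 0.4r = 1, \quad 0.8w + 0.2r = 1 \;\Rightarrow\; w = \tfrac{2}{3} \approx 0.667, \quad r = \tfrac{7}{3} \approx 2.33
```

which matches the one-third fall in the real wage stated in the post.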

anne -> Peter K.... March 20, 2017 at 10:34 AM

http://krugman.blogs.nytimes.com/2012/12/08/rise-of-the-robots/

December 8, 2012

Rise of the Robots
By Paul Krugman

Catherine Rampell and Nick Wingfield write about the growing evidence * for "reshoring" of manufacturing to the United States. They cite several reasons: rising wages in Asia; lower energy costs here; higher transportation costs. In a followup piece, ** however, Rampell cites another factor: robots.

"The most valuable part of each computer, a motherboard loaded with microprocessors and memory, is already largely made with robots, according to my colleague Quentin Hardy. People do things like fitting in batteries and snapping on screens.

"As more robots are built, largely by other robots, 'assembly can be done here as well as anywhere else,' said Rob Enderle, an analyst based in San Jose, California, who has been following the computer electronics industry for a quarter-century. 'That will replace most of the workers, though you will need a few people to manage the robots.' "

Robots mean that labor costs don't matter much, so you might as well locate in advanced countries with large markets and good infrastructure (which may soon not include us, but that's another issue). On the other hand, it's not good news for workers!

This is an old concern in economics; it's "capital-biased technological change," which tends to shift the distribution of income away from workers to the owners of capital.

Twenty years ago, when I was writing about globalization and inequality, capital bias didn't look like a big issue; the major changes in income distribution had been among workers (when you include hedge fund managers and CEOs among the workers), rather than between labor and capital. So the academic literature focused almost exclusively on "skill bias", supposedly explaining the rising college premium.

But the college premium hasn't risen for a while. What has happened, on the other hand, is a notable shift in income away from labor:

[Graph]

If this is the wave of the future, it makes nonsense of just about all the conventional wisdom on reducing inequality. Better education won't do much to reduce inequality if the big rewards simply go to those with the most assets. Creating an "opportunity society," or whatever it is the likes of Paul Ryan etc. are selling this week, won't do much if the most important asset you can have in life is, well, lots of assets inherited from your parents. And so on.

I think our eyes have been averted from the capital/labor dimension of inequality, for several reasons. It didn't seem crucial back in the 1990s, and not enough people (me included!) have looked up to notice that things have changed. It has echoes of old-fashioned Marxism - which shouldn't be a reason to ignore facts, but too often is. And it has really uncomfortable implications.

But I think we'd better start paying attention to those implications.

* http://www.nytimes.com/2012/12/07/technology/apple-to-resume-us-manufacturing.html

** http://economix.blogs.nytimes.com/2012/12/07/when-cheap-foreign-labor-gets-less-cheap/

anne -> anne... March 20, 2017 at 10:41 AM

https://fred.stlouisfed.org/graph/?g=d4ZY

January 30, 2017

Compensation of Employees as a share of Gross Domestic Income, 1948-2015


https://fred.stlouisfed.org/graph/?g=d507

January 30, 2017

Compensation of Employees as a share of Gross Domestic Income, 1948-2015

(Indexed to 1948)

[Nov 27, 2017] Nineteen Ninety-Six: The Robot/Productivity Paradox and the concept of the bezzle

This concept of the "bezzle" is an important one
Feb 22, 2017 | econospeak.blogspot.com

Sandwichman -> Sandwichman ... February 24, 2017 at 08:36 AM

John Kenneth Galbraith, from "The Great Crash 1929":

"In many ways the effect of the crash on embezzlement was more significant than on suicide. To the economist embezzlement is the most interesting of crimes. Alone among the various forms of larceny it has a time parameter. Weeks, months or years may elapse between the commission of the crime and its discovery. (This is a period, incidentally, when the embezzler has his gain and the man who has been embezzled, oddly enough, feels no loss. There is a net increase in psychic wealth.)

At any given time there exists an inventory of undiscovered embezzlement in – or more precisely not in – the country's business and banks.

This inventory – it should perhaps be called the bezzle – amounts at any moment to many millions [trillions!] of dollars. It also varies in size with the business cycle.

In good times people are relaxed, trusting, and money is plentiful. But even though money is plentiful, there are always many people who need more. Under these circumstances the rate of embezzlement grows, the rate of discovery falls off, and the bezzle increases rapidly.

In depression all this is reversed. Money is watched with a narrow, suspicious eye.

The man who handles it is assumed to be dishonest until he proves himself otherwise. Audits are penetrating and meticulous. Commercial morality is enormously improved. The bezzle shrinks."

Sandwichman, February 24, 2017 at 05:24 AM

For nearly a half a century, from 1947 to 1996, real GDP and real Net Worth of Households and Non-profit Organizations (in 2009 dollars) both increased at a compound annual rate of a bit over 3.5%. GDP growth, in fact, was just a smidgen faster -- 0.016% -- than growth of Net Household Worth.

From 1996 to 2015, GDP grew at a compound annual rate of 2.3% while Net Worth increased at the rate of 3.6%....

-- Sandwichman
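A compound annual growth rate like those quoted above is just (end/start)^(1/years) - 1; a quick sketch with awk (the endpoint values here are hypothetical, not the actual GDP or net-worth figures):

```shell
# CAGR = (end/start)^(1/years) - 1, printed as a percentage
awk -v s=100 -v e=200 -v y=10 'BEGIN { printf "%.2f%%\n", (exp(log(e/s)/y) - 1) * 100 }'
```

Doubling over ten years works out to about 7.18% per year.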

anne -> anne... February 24, 2017 at 05:25 AM

https://fred.stlouisfed.org/graph/?g=cOU6

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1952-2016

(Indexed to 1952)

https://fred.stlouisfed.org/graph/?g=cPq1

January 15, 2017

Gross Domestic Product and Net Worth for Households & Nonprofit Organizations, 1992-2016

(Indexed to 1992)

anne -> Sandwichman ... February 24, 2017 at 03:35 PM

The real home price index extends from 1890. From 1890 to 1996, the index increased slightly faster than inflation so that the index was 100 in 1890 and 113 in 1996. However from 1996 the index advanced to levels far beyond any previously experienced, reaching a high above 194 in 2006. Previously the index high had been just above 130.

Though the index fell from 2006, the level in 2016 is above 161, a level only reached when the housing bubble had formed in late 2003-early 2004.

Real home prices are again strikingly high:

http://www.econ.yale.edu/~shiller/data.htm

anne -> Sandwichman ... February 24, 2017 at 03:34 PM

Valuation

The Shiller 10-year price-earnings ratio is currently 29.34, so the inverse or the earnings rate is 3.41%. The dividend yield is 1.93. So an expected yearly return over the coming 10 years would be 3.41 + 1.93 or 5.34% provided the price-earnings ratio stays the same and before investment costs.

Against the 5.34% yearly expected return on stock over the coming 10 years, the current 10-year Treasury bond yield is 2.32%.

The risk premium for stocks is 5.34 - 2.32 or 3.02%:

http://www.econ.yale.edu/~shiller/data.htm
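The valuation arithmetic in that comment can be reproduced with a one-liner (the figures are the ones quoted above):

```shell
# Earnings yield = 100/CAPE; expected return adds the dividend yield;
# the risk premium subtracts the 10-year Treasury yield
awk 'BEGIN {
  ey = 100 / 29.34            # earnings yield, ~3.41%
  er = ey + 1.93              # expected yearly return, ~5.34%
  printf "expected %.2f%%, premium %.2f%%\n", er, er - 2.32
}'
```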

anne -> anne..., February 24, 2017 at 05:36 AM

What the robot-productivity paradox is puzzles me, other than since 2005 for all the focus on the productivity of robots and on robots replacing labor there has been a dramatic, broad-spread slowing in productivity growth.

However, what the changing relationship between the growth of GDP and net worth since 1996 shows is that asset valuations have been increasing relative to GDP. Valuations of stocks and homes are at sustained levels higher than at any time in the last 120 years. Bear markets in stocks and home prices have still left asset valuations at historically high levels. I have no idea why this should be.

Sandwichman -> anne... February 24, 2017 at 08:34 AM

The paradox is that productivity statistics can't tell us anything about the effects of robots on employment because both the numerator and the denominator are distorted by the effects of colossal Ponzi bubbles.

John Kenneth Galbraith used to call it "the bezzle." It is "that increment to wealth that occurs during the magic interval when a confidence trickster knows he has the money he has appropriated but the victim does not yet understand that he has lost it." The current size of the gross national bezzle (GNB) is approximately $24 trillion.

Ponzilocks and the Twenty-Four Trillion Dollar Question

http://econospeak.blogspot.ca/2017/02/ponzilocks-and-twenty-four-trillion.html

Twenty-three and a half trillion, actually. But what's a few hundred billion? Here today, gone tomorrow, as they say.

At the beginning of 2007, net worth of households and non-profit organizations exceeded its 1947-1996 historical average, relative to GDP, by some $16 trillion. It took 24 months to wipe out eighty percent, or $13 trillion, of that colossal but ephemeral slush fund. In mid-2016, net worth stood at a multiple of 4.83 times GDP, compared with the multiple of 4.72 on the eve of the Great Unworthing.

When I look at the ragged end of the chart I posted yesterday, it screams "Ponzi!" "Ponzi!" "Ponz..."

To make a long story short, let's think of wealth as capital. The value of capital is determined by the present value of an expected future income stream. The value of capital fluctuates with changing expectations but when the nominal value of capital diverges persistently and significantly from net revenues, something's got to give. Either economic growth is going to suddenly gush forth "like nobody has ever seen before" or net worth is going to have to come back down to earth.

Somewhere between 20 and 30 TRILLION dollars of net worth will evaporate within the span of perhaps two years.

When will that happen? Who knows? There is one notable regularity in the data, though -- the one that screams "Ponzi!"

When the net worth bubble stops going up...
...it goes down.

[Nov 20, 2017] Sudoers - Community Help Wiki

Notable quotes:
"... The special command '"sudoedit"' allows users to run sudo with the -e flag or as the command sudoedit . If you include command line arguments in a command in an alias these must exactly match what the user enters on the command line. If you include any of the following they will need to be escaped with a backslash (\): ",", "\", ":", "=". ..."
Nov 09, 2017 | help.ubuntu.com

... ... ...

Aliases

There are four kinds of aliases: User_Alias, Runas_Alias, Host_Alias and Cmnd_Alias. Each alias definition is of the form:

 Alias_Type NAME = item1, item2, ...

Where Alias_Type is one of User_Alias, Runas_Alias, Host_Alias or Cmnd_Alias. A name is a string of uppercase letters, numbers and underscores starting with an uppercase letter. You can put several aliases of the same type on one line by separating them with colons (:) like so:

 Alias_Type NAME_1 = item1, item2 : NAME_2 = item3

You can include other aliases in an alias specification provided they would normally fit there. For example you can use a user alias wherever you would normally expect to see a list of users (for example in a user or runas alias).

There are also built in aliases called ALL which match everything where they are used. If you used ALL in place of a user list it matches all users for example. If you try and set an alias of ALL it will be overridden by this built in alias so don't even try.

User Aliases

User aliases are used to specify groups of users. You can specify usernames, system groups (prefixed by a %) and netgroups (prefixed by a +) as follows:

 # Everybody in the system group "admin" is covered by the alias ADMINS
 User_Alias ADMINS = %admin
 # The users "tom", "dick", and "harry" are covered by the USERS alias
 User_Alias USERS = tom, dick, harry
 # The users "tom" and "mary" are in the WEBMASTERS alias
 User_Alias WEBMASTERS = tom, mary
 # You can also use ! to exclude users from an alias
 # This matches anybody in the USERS alias who isn't in WEBMASTERS or ADMINS aliases
 User_Alias LIMITED_USERS = USERS, !WEBMASTERS, !ADMINS
Runas Aliases

Runas Aliases are almost the same as user aliases, but you are allowed to specify users by uid. This is helpful because usernames and groups are matched as strings, so two users with the same uid but different usernames will not both be matched by entering a single username, but can be matched with the uid. For example:

 # UID 0 is normally used for root
 # Note the hash (#) on the following line indicates a uid, not a comment.
 Runas_Alias ROOT = #0
 # This is for all the admin users similar to the User_Alias of ADMINS set earlier 
 # with the addition of "root"
 Runas_Alias ADMINS = %admin, root
Host Aliases

A host alias is a list of hostnames, IP addresses, networks and netgroups (prefixed with a +). If you do not specify a netmask with a network, the netmask of the host's ethernet interface(s) will be used when matching.

 # This is all the servers
 Host_Alias SERVERS = 192.168.0.1, 192.168.0.2, server1
 # This is the whole network
 Host_Alias NETWORK = 192.168.0.0/255.255.255.0
 # And this is every machine in the network that is not a server
 Host_Alias WORKSTATIONS = NETWORK, !SERVERS
 # This could have been done in one step with 
 # Host_Alias WORKSTATIONS = 192.168.0.0/255.255.255.0, !SERVERS
 # but I think this method is clearer.
Command Aliases

Command aliases are lists of commands and directories. You can use this to specify a group of commands. If you specify a directory it will include any file within that directory but not in any subdirectories.

The special command '"sudoedit"' allows users to run sudo with the -e flag or as the command sudoedit . If you include command line arguments in a command in an alias these must exactly match what the user enters on the command line. If you include any of the following they will need to be escaped with a backslash (\): ",", "\", ":", "=".

Examples:

 # All the shutdown commands
 Cmnd_Alias SHUTDOWN_CMDS = /sbin/poweroff, /sbin/reboot, /sbin/halt
 # Printing commands
 Cmnd_Alias PRINTING_CMDS = /usr/sbin/lpc, /usr/sbin/lprm
 # Admin commands
 Cmnd_Alias ADMIN_CMDS = /usr/sbin/passwd, /usr/sbin/useradd, /usr/sbin/userdel, /usr/sbin/usermod, /usr/sbin/visudo
 # Web commands
 Cmnd_Alias WEB_CMDS = /etc/init.d/apache2
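As a sketch of the escaping rule above, here is a hypothetical alias whose command arguments contain a comma:

 # The comma inside the mount options must be escaped with a backslash
 Cmnd_Alias USB_MOUNT = /bin/mount -o nosuid\,nodev /dev/sdb1 /mnt/usb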
User Specifications

User Specifications are where the sudoers file sets who can run what as whom. This is the key part of the file, and all the aliases have just been set up for this very point. If this were a film, this part is where all the key threads of the story come together in the glorious unveiling before the final climactic ending. Basically it is important, and without this you ain't going anywhere.

A user specification is in the format

<user list> <host list> = <operator list> <tag list> <command list>

The user list is a list of users or a user alias that has already been set, the host list is a list of hosts or a host alias, the operator list is a list of users they must be running as or a runas alias and the command list is a list of commands or a cmnd alias.

The tag list has not been covered yet and allows you to set special things for each command. You can use PASSWD and NOPASSWD to specify whether the user has to enter a password or not, and you can also use NOEXEC to prevent any programs launching shells themselves (as once a program is running with sudo it has full root privileges, so it could launch a root shell to circumvent any restrictions in the sudoers file).

For example (using the aliases and users from earlier)

 # This lets the webmasters run all the web commands on the machine 
 # "webserver" provided they give a password
 WEBMASTERS webserver= WEB_CMDS
 # This lets the admins run all the admin commands on the servers
 ADMINS SERVERS= ADMIN_CMDS
 # This lets all the USERS run admin commands on the workstations provided 
 # they give the root password or an admin password (using "sudo -u <username>")
 USERS WORKSTATIONS=(ADMINS) ADMIN_CMDS
 # This lets "harry" shutdown his own machine without a password
 harry harrys-machine= NOPASSWD: SHUTDOWN_CMDS
 # And this lets everybody print without requiring a password
 ALL ALL=(ALL) NOPASSWD: PRINTING_CMDS
The Default Ubuntu Sudoers File

The sudoers file that ships with Ubuntu 8.04 by default is included here so if you break everything you can restore it if needed and also to highlight some key things.

# /etc/sudoers
#
# This file MUST be edited with the 'visudo' command as root.
#
# See the man page for details on how to write a sudoers file.
#

Defaults    env_reset

# Uncomment to allow members of group sudo to not need a password
# %sudo ALL=NOPASSWD: ALL

# Host alias specification

# User alias specification

# Cmnd alias specification

# User privilege specification
root    ALL=(ALL) ALL

# Members of the admin group may gain root privileges
%admin ALL=(ALL) ALL

This is pretty much empty and only has three rules in it. The first ( Defaults env_reset ) resets the terminal environment after switching to root, i.e. all user-set variables are removed. The second ( root ALL=(ALL) ALL ) just lets root do everything on any machine as any user. And the third ( %admin ALL=(ALL) ALL ) lets anybody in the admin group run anything as any user. Note that they will still require a password (thus giving you the normal behaviour you are so used to).

If you want to add your own specifications and you are a member of the admin group then you will need to add them after this line. Otherwise all your changes will be overridden by this line saying you (as part of the admin group) can do anything on any machine as any user provided you give a password.
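For example, a specification added after that line could look like this ("alice" is a hypothetical user):

 # Let alice refresh the package lists without a password
 alice ALL=(ALL) NOPASSWD: /usr/bin/apt-get update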

Common Tasks

This section includes some common tasks and how to accomplish them using the sudoers file.

Shutting Down From The Console Without A Password

Often people want to be able to shut their computers down without requiring a password to do so. This is particularly useful in media PCs where you want to be able to use the shutdown command in the media centre to shutdown the whole computer.

To do this you need to add some cmnd aliases as follows:

Cmnd_Alias SHUTDOWN_CMDS = /sbin/poweroff, /sbin/halt, /sbin/reboot

You also need to add a user specification (at the end of the file after the " %admin ALL = (ALL) ALL " line so it takes effect - see above for details):

<your username> ALL=(ALL) NOPASSWD: SHUTDOWN_CMDS

Obviously you need to replace "<your username>" with the username of the user who needs to be able to shutdown the pc without a password. You can use a user alias here as normal.

Multiple tags on a line

There are times where you need to have both NOPASSWD and NOEXEC or other tags on the same configuration line. The man page for sudoers is less than clear, so here is an example of how this is done:

myuser ALL = (root) NOPASSWD:NOEXEC: /usr/bin/vim

This example lets the user "myuser" run as root the "vim" binary without a password, and without letting vim shell out (the :shell command).

Enabling Visual Feedback when Typing Passwords

As of Ubuntu 10.04 (Lucid), you can enable visual feedback when you are typing a password at a sudo prompt.

Simply edit /etc/sudoers and change the Defaults line to read:

Defaults        env_reset,pwfeedback
Troubleshooting

If your changes don't seem to have had any effect, check that they are not trying to use aliases that are not defined yet and that no other user specifications later in the file are overriding what you are trying to accomplish. You can also check the file for syntax errors at any time with "visudo -c".

[Nov 19, 2017] Understanding sudoers syntax

Notable quotes:
"... A command may also be the full path to a directory (including a trailing /). This permits execution of all the files in that directory, but not in any subdirectories. ..."
"... The keyword sudoedit is also recognised as a command name, and arguments can be specified as with other commands. Use this instead of allowing a particular editor to be run with sudo, because it runs the editor as the user and only installs the editor's output file into place as root (or other target user). ..."
Nov 09, 2017 | toroid.org

User specifications

The /etc/sudoers file contains "user specifications" that define the commands that users may execute. When sudo is invoked, these specifications are checked in order, and the last match is used. A user specification looks like this at its most basic:

User Host = (Runas) Command

Read this as "User may run Command as the Runas user on Host".

Any or all of the above may be the special keyword ALL, which always matches.

User and Runas may be usernames, group names prefixed with %, numeric UIDs prefixed with #, or numeric GIDs prefixed with %#. Host may be a hostname, IP address, or a whole network (e.g., 192.0.2.0/24), but not 127.0.0.1.
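For instance, a minimal specification following this pattern (the user and host names are hypothetical):

alice web01 = (root) /usr/bin/systemctl

Read this as "alice may run systemctl as root on the host web01".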

Runas

This optional clause controls the target user (and group) sudo will run the Command as, or in other words, which combinations of the -u and -g arguments it will accept.

If the clause is omitted, the user will be permitted to run commands only as root. If you specify a username, e.g., (postgres), sudo will accept "-u postgres" and run commands as that user. In both cases, sudo will not accept -g.

If you also specify a target group, e.g., (postgres:postgres), sudo will accept any combination of the listed users and groups (see the section on aliases below). If you specify only a target group, e.g., (:postgres), sudo will accept and act on "-g postgres" but run commands only as the invoking user.

This is why you so often see (ALL:ALL) in examples.

Commands

In the simplest case, a command is the full path to an executable, which permits it to be executed with any arguments. You may specify a list of arguments after the path to permit the command only with those exact arguments, or write "" to permit execution only without any arguments.

A command may also be the full path to a directory (including a trailing /). This permits execution of all the files in that directory, but not in any subdirectories.

ams ALL=/bin/ls, /bin/df -h /, /bin/date "", \
        /usr/bin/, sudoedit /etc/hosts, \
        OTHER_COMMANDS

The keyword sudoedit is also recognised as a command name, and arguments can be specified as with other commands. Use this instead of allowing a particular editor to be run with sudo, because it runs the editor as the user and only installs the editor's output file into place as root (or other target user).

As shown above, comma-separated lists of commands and aliases may be specified. Commands may also use shell wildcards either in the path or in the argument list (but see the warning below about the latter).

Sudo is very flexible, and it's tempting to set up very fine-grained access, but it can be difficult to understand the consequences of a complex setup, and you can end up with unexpected problems. Try to keep things simple.

Options

Before the command, you can specify zero or more options to control how it will be executed. The most important options are NOPASSWD (to not require a password) and SETENV (to allow the user to set environment variables for the command).

ams ALL=(ALL) NOPASSWD: SETENV: /bin/ls

Other available options include NOEXEC, LOG_INPUT and LOG_OUTPUT, and SELinux role and type specifications. These are all documented in the manpage.

Digests

The path to a binary (i.e., not a directory or alias) may also be prefixed with a digest:

ams ALL=(ALL) sha224:IkotndXGTmZtH5ZNFtRfIwkG0WuiuOs7GoZ+6g== /bin/ls

The specified binary will then be executed only if it matches the digest. SHA-2 digests of 224, 256, 384, and 512-bits are accepted in hex or Base64 format. The values can be generated using, e.g., sha512sum or openssl.
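For instance, a hex digest for such a rule can be produced with coreutils (openssl dgst works too; /bin/ls is just an example binary):

```shell
# Print only the SHA-224 hex digest of the binary
sha224sum /bin/ls | awk '{print $1}'
```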

Aliases

In addition to the things listed above, a User, Host, Runas, or Command may be an alias, which is a named list of comma-separated values of the corresponding type. An alias may be used wherever a User, Host, Runas, or Command may occur. They are always named in uppercase, and can be defined as shown in these examples:

# Type_Alias NAME = a, b : NAME_2 = c, d,  

User_Alias TRUSTED = %admin, !ams
Runas_Alias LEGACYUSERS = oldapp1, oldapp2
Runas_Alias APPUSERS = app1, app2, LEGACYUSERS
Host_Alias PRODUCTION = www1, www2, \
    192.0.2.1/24, !192.0.2.222
Cmnd_Alias DBA = /usr/pgsql-9.4/bin, \
    /usr/local/bin/pgadmin

An alias definition can also include another alias of the same type (e.g., LEGACYUSERS above). You cannot include options like NOPASSWD: in command aliases.

Any term in a list may be prefixed with ! to negate it. This can be used to include a group but exclude a certain user, or to exclude certain addresses in a network, and so on. Negation can also be used in command lists, but note the manpage's warning that trying to "subtract" commands from ALL using ! is generally not effective.

Use aliases whenever you need rules involving multiple users, hosts, or commands.

Default options

Sudo has a number of options whose values may be set in the configuration file, overriding the defaults either unconditionally, or only for a given user, host, or command. The defaults are sensible, so you do not need to care about options unless you're doing something special.

Option values are specified in one or more "Defaults" lines. The example below switches on env_reset, turns off insults (read !insults as "not insults"), sets password_tries to 4, and so on. All the values are set unconditionally, i.e. they apply to every user specification.

Defaults env_reset, !insults, password_tries=4, \
    lecture=always
Defaults passprompt="Password for %p:"

Options may also be set only for specific hosts, users, or commands, as shown below. Defaults@host sets options for a host, Defaults:user for a (requesting) user, Defaults!command for a command, and Defaults>user for a target user. You can also use aliases in these definitions.

Defaults@localhost insults
Defaults:ams insults, !lecture
Defaults>root mail_always, mailto="foo@example.org"

Cmnd_Alias FOO = /usr/bin/foo, /usr/bin/bar, \
    /usr/local/bin/baz
Defaults!FOO always_set_home

Unconditional defaults are parsed first, followed by host and user defaults, then runas defaults, then command defaults.

The many available options are explained well in the manpage.

Complications

In addition to the alias mechanism, a User, Host, Runas, or Command may each be a comma-separated list of things of the corresponding type. Also, a user specification may contain multiple host and command sets for a single User. Please be sparing in your use of this syntax, in case you ever have to make sense of it again.

Users and hosts can also be a +netgroup or other more esoteric things, depending on plugins. Host names may also use shell wildcards (see the fqdn option).

If Runas is omitted but the () are not, sudo will reject -u and -g and run commands only as the invoking user.

You can use wildcards in command paths and in arguments, but their meaning is different. In a path, a * will not match a /, so /usr/bin/* will match /usr/bin/who but not /usr/bin/X11/xterm. In arguments, a * does match /; also, arguments are matched as a single string (not a list of separate words), so * can match across words. The manpage includes the following problematic example, which permits additional arguments to be passed to /bin/cat without restriction:

%operator ALL = /bin/cat /var/log/messages*
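A plain shell pattern check shows why: sudo matches the whole argument string against the pattern, so the * can swallow a space and a second file name (a sketch of the matching logic, not a real sudo run):

```shell
# The argument string a user might pass to "sudo cat"
args="/var/log/messages /etc/shadow"
# sudo-style glob match: in arguments, * matches across words and slashes
case "$args" in
  /var/log/messages*) echo "permitted" ;;
  *) echo "denied" ;;
esac
```

Here the pattern accepts "/etc/shadow" as a bonus argument, which is exactly the problem being warned about.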

Warning : Sudo will not work if /etc/sudoers contains syntax errors, so you should only ever edit it using visudo, which performs basic sanity checks, and installs the new file only if it parses correctly.

Another warning: if you take the EBNF in the manpage seriously enough, you will discover that the implementation doesn't follow it. You can avoid this sad fate by linking to this article instead of trying to write your own. Happy sudoing!

[Nov 13, 2017] 20 Sed (Stream Editor) Command Examples for Linux Users

Nov 13, 2017 | www.linuxtechi.com

20 Sed (Stream Editor) Command Examples for Linux Users

by Pradeep Kumar · Published November 9, 2017 · Updated November 9, 2017

Sed (Stream Editor) is a very powerful utility offered by Linux/Unix systems. It is mainly used for text substitution (find & replace), but it can also perform other text manipulations like insertion, deletion and search. With sed, we can edit complete files without actually having to open them. Sed also supports the use of regular expressions, which makes it an even more powerful text manipulation tool.

In this article, we will learn to use the sed command with the help of some examples. The basic syntax for using sed is,

sed [OPTIONS] 'SCRIPT' [INPUTFILE...]

Now let's see some examples.

Example :1) Displaying partial text of a file

With sed, we can view only a part of a file rather than the whole file. To see some lines of the file, use the following command,

[linuxtechi@localhost ~]$ sed -n '22,29p' testfile.txt

here, the option '-n' suppresses the automatic printing of every line & the command 'p' prints only lines 22 to 29.
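As a quick self-contained sketch (using a throwaway six-line input rather than the article's testfile.txt):

```shell
# -n suppresses sed's default printing; the p command then prints
# only the addressed lines, here lines 2 through 4.
printf '%s\n' one two three four five six | sed -n '2,4p'
# prints:
# two
# three
# four
```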

Example :2) Display all except some lines

To display all content of a file except for some portion, use the following command,

[linuxtechi@localhost ~]$ sed 22,29d testfile.txt

Option 'd' will remove the mentioned lines from output.

Example :3) Display every 3rd line starting with Nth line

To display the content of every 3rd line starting with line number 2 (or any other line), use the following command. The step address is written 'first~step' and is a GNU sed extension,

[linuxtechi@localhost ~]$ sed -n '2~3p' file.txt
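A runnable sketch of the 'first~step' address (a GNU sed extension) on a numbered input:

```shell
# Print every 3rd line starting at line 2: lines 2, 5 and 8.
seq 1 10 | sed -n '2~3p'
# prints:
# 2
# 5
# 8
```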
Example :4 ) Deleting a line using sed command

To delete a line with sed from a file, use the following command,

[linuxtechi@localhost ~]$ sed 'Nd' testfile.txt

where 'N' is the line number & the command 'd' deletes that line. Note the quotes around the script: without them, '$d' in particular would be expanded by the shell as a variable. To delete the last line of the file, use

[linuxtechi@localhost ~]$ sed '$d' testfile.txt
Example :5) Deleting a range of lines

To delete a range of lines from the file, run

[linuxtechi@localhost ~]$ sed '29,34d' testfile.txt

This will delete lines 29 to 34 from the testfile.txt file (note that line ranges use a comma, not a dash).

Example :6) Deleting lines other than the mentioned

To delete lines other than the mentioned lines from a file, we will use '!'

[linuxtechi@localhost ~]$ sed '29,34!d' testfile.txt

here '!' negates the address, i.e. the mentioned lines will not be deleted. All lines other than 29 to 34 will be deleted from testfile.txt.

Example :7) Adding Blank lines/spaces

To add a blank line after every non-blank line, we will use option 'G',

[linuxtechi@localhost ~]$ sed G testfile.txt
Example :8) Search and Replacing a string using sed

To search & replace a string from the file, we will use the following example,

[linuxtechi@localhost ~]$ sed 's/danger/safety/' testfile.txt

here the 's' command searches for the word 'danger' & replaces its first occurrence on every line with 'safety'.

Example :9) Search and replace a string from whole file using sed

To replace every occurrence of the word throughout the file, we will use the flag 'g' with 's',

[linuxtechi@localhost ~]$ sed 's/danger/safety/g' testfile.txt
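A self-contained comparison of the two forms on one sample line:

```shell
# Without g: only the first match per line changes.
echo 'danger here, danger there' | sed 's/danger/safety/'
# → safety here, danger there

# With g: every match on the line changes.
echo 'danger here, danger there' | sed 's/danger/safety/g'
# → safety here, safety there
```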
Example :10) Replace the nth occurrence of string pattern

We can also substitute only the nth occurrence of a string on each line. For example, to replace 'danger' with 'safety' only on its second occurrence,

[linuxtechi@localhost ~]$ sed 's/danger/safety/2' testfile.txt

To replace 'danger' from its 2nd occurrence onwards on every line of the whole file (a GNU sed extension), use

[linuxtechi@localhost ~]$ sed 's/danger/safety/2g' testfile.txt
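A minimal sketch of the numeric flag, with and without the GNU-specific 'Ng' combination:

```shell
# /2 replaces only the 2nd occurrence on the line.
echo 'a a a a' | sed 's/a/X/2'
# → a X a a

# /2g (GNU sed) replaces the 2nd occurrence and everything after it.
echo 'a a a a' | sed 's/a/X/2g'
# → a X X X
```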
Example :11) Replace a string on a particular line

To replace a string only from a particular line, use

[linuxtechi@localhost ~]$ sed '4 s/danger/safety/' testfile.txt

This will only substitute the string on the 4th line of the file. We can also mention a range of lines instead of a single line,

[linuxtechi@localhost ~]$  sed '4,9 s/danger/safety/' testfile.txt
Example :12) Add a line after/before the matched search

To add a new line with some content after every pattern match, use option 'a' ,

[linuxtechi@localhost ~]$ sed '/danger/a "This is new line with text after match"' testfile.txt

To add a new line with some content before every pattern match, use the command 'i',

[linuxtechi@localhost ~]$ sed '/danger/i "This is new line with text before match" ' testfile.txt
Example :13) Change a whole line with matched pattern

To change a whole line to a new line when a search pattern matches we need to use option 'c' with sed,

[linuxtechi@localhost ~]$ sed '/danger/c "This will be the new line" ' testfile.txt

So when the pattern matches 'danger', whole line will be changed to the mentioned line.

Advanced options with sed

Up until now we have only been using simple expressions with sed; now we will discuss some advanced uses of sed with regex,

Example :14) Running multiple sed commands

If we need to perform multiple sed expressions, we can use option 'e' to chain the sed commands,

[linuxtechi@localhost ~]$  sed -e 's/danger/safety/g' -e 's/hate/love/' testfile.txt
Example :15) Making a backup copy before editing a file

To create a backup copy of a file before we edit it, use the '-i' option with a suffix, e.g. '-i.bak',

[linuxtechi@localhost ~]$ sed -i.bak -e 's/danger/safety/g'  testfile.txt

This will create a backup copy of the file with the extension .bak. You can also use another extension if you like.

Example :16) Delete text starting with & ending with a pattern

To remove text that starts with a particular string & ends with another string, use

[linuxtechi@localhost ~]$ sed -e 's/danger.*stops//g' testfile.txt

This will remove the text starting at 'danger' & ending at 'stops', with any number of words in between ('.*' matches that part). Note that this empties the matched portion rather than deleting the line itself; to delete such lines entirely, use the 'd' command with an address, e.g. sed '/danger.*stops/d'.

Example :17) Prefixing lines

To add some content at the start of every line with sed & regex, use

[linuxtechi@localhost ~]$ sed -e 's/.*/testing sed &/' testfile.txt

So now every line will have 'testing sed' before it ('&' in the replacement stands for the matched text, here the whole line).
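A quick self-contained demonstration of the '&' back-reference:

```shell
# '&' re-inserts whatever the pattern matched; since '.*' matches
# the entire line, this effectively prefixes the line.
echo 'hello world' | sed 's/.*/testing sed &/'
# → testing sed hello world
```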

Example :18) Removing all commented lines & empty lines

To remove all commented lines i.e. lines with # & all the empty lines, use

[linuxtechi@localhost ~]$ sed -e 's/#.*//;/^$/d' testfile.txt

To remove only the comment text (this leaves the now-empty lines in place), use

[linuxtechi@localhost ~]$ sed -e 's/#.*//' testfile.txt
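A self-contained sketch of the combined comment-and-blank-line cleanup on a tiny sample:

```shell
# Strip '#' comments, then delete any lines left empty.
printf '%s\n' '# c1' '' 'alpha' '# c2' 'beta' | sed 's/#.*//;/^$/d'
# prints:
# alpha
# beta
```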
Example :19) Get list of all usernames from /etc/passwd

To get the list of all usernames from /etc/passwd file, use

[linuxtechi@localhost ~]$  sed 's/\([^:]*\).*/\1/' /etc/passwd

a complete list of all usernames will be printed on screen as output.
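How the capture group works, shown on a single sample passwd line (the entry is made up for illustration), with a simpler cut equivalent:

```shell
# \([^:]*\) captures everything up to the first colon; \1 keeps only that.
echo 'alice:x:1000:1000::/home/alice:/bin/bash' | sed 's/\([^:]*\).*/\1/'
# → alice

# cut does the same job without a regex:
echo 'alice:x:1000:1000::/home/alice:/bin/bash' | cut -d: -f1
# → alice
```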

Example :20) Prevent overwriting of system links with sed command

The 'sed -i' command has been known to remove symbolic links & create regular files in their place. To avoid such a situation & prevent 'sed -i' from destroying the links, use the '--follow-symlinks' option with the command being executed.

Let's assume I want to disable SELinux on CentOS or RHEL servers:

[linuxtechi@localhost ~]# sed -i --follow-symlinks 's/SELINUX=enforcing/SELINUX=disabled/g' /etc/sysconfig/selinux

These were some examples showing sed in action; we can use them as a reference as & when needed. If you have any queries related to this or any article, do share with us.

[Nov 12, 2017] Installing Nagios 3.4.4 On CentOS 6.3

Nov 12, 2017 | www.howtoforge.com

Installing Nagios 3.4.4 On CentOS 6.3 Introduction

Nagios is a monitoring tool released under the GPL licence. This tool lets you monitor servers, network hardware (switches, routers, ...) and applications. A lot of plugins are available, and its big community makes Nagios the biggest open source monitoring tool. This tutorial shows how to install Nagios 3.4.4 on CentOS 6.3.

Prerequisites

After installing your CentOS server, you have to disable selinux & install some packages to make nagios work.

To disable selinux, open the file: /etc/selinux/config

# vi /etc/selinux/config

# This file controls the state of SELinux on the system.
# SELINUX= can take one of these three values:
#     enforcing - SELinux security policy is enforced.
#     permissive - SELinux prints warnings instead of enforcing.
#     disabled - No SELinux policy is loaded.
SELINUX=permissive // change this value to disabled
# SELINUXTYPE= can take one of these two values:
#     targeted - Targeted processes are protected,
#     mls - Multi Level Security protection.
SELINUXTYPE=targeted

Now, download all packages you need:

# yum install gd gd-devel httpd php gcc glibc glibc-common

Nagios Installation

Create a directory:

# mkdir /root/nagios

Navigate to this directory:

# cd /root/nagios

Download nagios-core & plugin:

# wget http://prdownloads.sourceforge.net/sourceforge/nagios/nagios-3.4.4.tar.gz
# wget http://prdownloads.sourceforge.net/sourceforge/nagiosplug/nagios-plugins-1.4.16.tar.gz

Untar nagios core:

# tar xvzf nagios-3.4.4.tar.gz

Go to the nagios dir:

# cd nagios

Configure before make:

# ./configure

Make all necessary files for Nagios:

# make all

Installation:

# make install

# make install-init

# make install-commandmode

# make install-config

# make install-webconf

Create a password to log into the web interface:

# htpasswd -c /usr/local/nagios/etc/htpasswd.users nagiosadmin

Start the service and start it on boot:

# chkconfig nagios on
# service nagios start

Now, you have to install the plugins:

# cd ..
# tar xvzf nagios-plugins-1.4.16.tar.gz
# cd nagios-plugins-1.4.16
# ./configure
# make
# make install

Start the apache service and enable it on boot:

# service httpd start
# chkconfig httpd on

Now, connect to your nagios system:

http://Your-Nagios-IP/nagios and enter the login nagiosadmin & the password you have chosen above.

And after the installation ?

After the installation, you have to configure all your hosts & services in the Nagios configuration files. This step is performed on the command line and is complicated, so I recommend installing a tool like Centreon, which is a beautiful front-end for adding your hosts & services.
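For reference, a host & service definition looks like the sketch below. The host name and address are placeholders, and 'linux-server' / 'generic-service' are templates from the sample object configs shipped with Nagios (typically under /usr/local/nagios/etc/objects/ with this install prefix):

```
define host {
    use         linux-server       ; inherit defaults from the sample template
    host_name   webserver01        ; placeholder name
    alias       Web Server 01
    address     192.168.1.10       ; placeholder IP
}

define service {
    use                 generic-service
    host_name           webserver01
    service_description PING
    check_command       check_ping!100.0,20%!500.0,60%
}
```

After adding object files, run nagios -v on the main config to verify it before restarting the service.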

To go further, I recommend you to read my article on Nagios & Centreon monitoring .

[Nov 12, 2017] How to Install Nagios 4 in Ubuntu and Debian

Nov 12, 2017 | www.tecmint.com

Requirements

  1. Debian 9 Minimal Installation
  2. Ubuntu 16.04 Minimal Installation
Step 1: Install Pre-requirements for Nagios

1. Before installing Nagios Core from sources in Ubuntu or Debian , first install the following LAMP stack components in your system, without MySQL RDBMS database component, by issuing the below command.

# apt install apache2 libapache2-mod-php7.0 php7.0

2. In the next step, install the following system dependencies and utilities required to compile and install Nagios Core from sources, by issuing the following command.

# apt install wget unzip zip  autoconf gcc libc6 make apache2-utils libgd-dev
Step 2: Install Nagios 4 Core in Ubuntu and Debian

3. On the first step, create nagios system user and group and add nagios account to the Apache www-data user, by issuing the below commands.

# useradd nagios
# usermod -a -G nagios www-data

4. After all dependencies, packages and system requirements for compiling Nagios from sources are present in your system, go to Nagios webpage and grab the latest version of Nagios Core stable source archive by issuing the following command.

# wget https://assets.nagios.com/downloads/nagioscore/releases/nagios-4.3.4.tar.gz

5. Next, extract Nagios tarball and enter the extracted nagios directory, with the following commands. Issue ls command to list nagios directory content.

# tar xzf nagios-4.3.4.tar.gz 
# cd nagios-4.3.4/
# ls
List Nagios Content

List Nagios Content

6. Now, start to compile Nagios from sources by issuing the below commands. Make sure you configure Nagios with Apache sites-enabled directory configuration by issuing the below command.

# ./configure --with-httpd-conf=/etc/apache2/sites-enabled

7. In the next step, build Nagios files by issuing the following command.

# make all

8. Now, install Nagios binary files, CGI scripts and HTML files by issuing the following command.

# make install

9. Next, install Nagios daemon init and external command mode configuration files and make sure you enable nagios daemon system-wide by issuing the following commands.

# make install-init
# make install-commandmode
# systemctl enable nagios.service

10. Next, run the following command in order to install some Nagios sample configuration files needed by Nagios to run properly by issuing the below command.

# make install-config

11. Also, install the Nagios configuration file for the Apache web server, which can be found in the /etc/apache2/sites-enabled/ directory, by executing the below command.

# make install-webconf

12. Next, create the nagiosadmin account and a password for this account, required by the Apache server to log in to the Nagios web panel, by issuing the following command.

# htpasswd -c /usr/local/nagios/etc/htpasswd.users nagiosadmin

13. To allow Apache HTTP server to execute Nagios cgi scripts and to access Nagios admin panel via HTTP, first enable cgi module in Apache and then restart Apache service and start and enable Nagios daemon system-wide by issuing the following commands.

# a2enmod cgi
# systemctl restart apache2
# systemctl start nagios
# systemctl enable nagios

14. Finally, log in to the Nagios Web Interface by pointing a browser to your server's IP address or domain name at the following URL address via HTTP protocol. Log in to Nagios with the nagiosadmin user and the password set up with the htpasswd command.

http://IP-Address/nagios
OR
http://DOMAIN/nagios

[Nov 11, 2017] Example of the sudoers file

Nov 09, 2017 | support.symantec.com

Example of the sudoers file

This is an example of the contents of the sudoers file is located in the /etc directory of the UNIX target computer. This example contains sample configurations required to use the sudo functionality as mentioned in the section Using sudo functionality for querying Oracle UNIX targets .

# User alias specification
##
User_Alias UNIX_USERS = unix1, unix2, unix3
User_Alias BV_CONTROL_USERS = bvunix1, bvunix2, bvunix3
##
# Runas alias specification
Defaults:UNIX_USERS !authenticate
Defaults:BV_CONTROL_USERS !authenticate
##
Runas_Alias SUPER_USERS = root
Defaults logfile=/var/log/sudolog
##
# Cmnd alias specification
##
Cmnd_Alias APPLICATIONS = /usr/sbin/named
Cmnd_Alias AIX_ADMINCMDS = /usr/sbin/lsps, /usr/sbin/lsattr
Cmnd_Alias ADMINCMDS = /usr/sbin/prtconf, /sbin/runlevel, ulimit, AIX_ADMINCMDS
Cmnd_Alias NETWORKCMDS = /sbin/ifconfig, /usr/local/bin/nslookup, inetadm -p
Cmnd_Alias FILECMDS = /bin/cat, /bin/date '+%Z', /usr/bin/strings -n, \
   /usr/bin/diff, /usr/bin/cmp, /usr/bin/find, \
   /bin/echo, /usr/bin/file, /bin/df -P, \
   /usr/bin/cksum, /bin/ls -la, /bin/ls -lad, \
   /bin/ls -lac, /bin/ls -lau
#Cmnd_Alias COMMONCMDS = /usr/bin, /bin, /usr/local/bin
Cmnd_Alias SU = /usr/bin/su
Cmnd_Alias SYSADMCMD = /usr/lib/sendmail
Cmnd_Alias ACTIVEADMCMDS = /usr/sbin/adduser
UNIX_USERS ALL = (SUPER_USERS) APPLICATIONS, NETWORKCMDS, ADMINCMDS, FILECMDS, !SU, !ACTIVEADMCMDS, !SYSADMCMD, NOPASSWD: ALL
BV_CONTROL_USERS ALL = NOPASSWD: ALL

See Using sudo functionality for querying Oracle UNIX targets .

See Disabling password prompt in the sudoers file .

See Minimum required privileges to query an Oracle database .

[Nov 10, 2017] Make sudo work harder

www.linux.com
Also at www.ibm.com/developerworks

Managing sudoers

Over time, your sudoers file will grow with more and more entries, which is to be expected. This could be because more application environments are being placed on the server, or because the delegation of current tasks is being split down further to segregate responsibility. With many entries, typos can occur, which is common. Making the sudoers file more manageable by the root user makes good administrative sense. Let's look at two ways this can be achieved, or at least a good standard to build on. If you have many static entries (meaning the same command is run on every machine where sudo is), put these into a separate sudoers file, which can be achieved using the include directive.

Having many entries for individual users can also be time consuming when adding or amending entries. With many user entries, it is good practice to put these into groups. Using groups, you can literally group users together, and the groups are valid AIX groups.

Now look at these two methods more closely.

Include file

Within large-enterprise environments, keeping the sudoers file maintained is an important and regularly required task. A solution to make this chore easier is to reorganize the sudoers file. One way to do this is to extract entries that are static or reusable, where the same commands are run on every box. Like audit/security or storix backups or general performance reports, with sudo you can now use the include directive. The main sudoers file can then contain the local entries, and the include file would barely need editing as those entries are static. When visudo is invoked, it will scan sudoers when it sees the include entry. It will scan that file, then come back to the main sudoers and carry on scanning. In reality, it works like this. When you exit out of visudo from the main sudoers file, it will take you to the include file for editing. Once you quit the include, you are back to the AIX prompt. You can have more than one include file, but I cannot think of a reason why you would want more than one.

Let's call our secondary sudoers file sudo_static.<hostname>. In the examples in this demonstration the hostname I am using is rs6000. In the main sudoers file, make the entry as follows:

#include /etc/sudo_static.rs6000

Next, add some entries to the /etc/sudo_static.rs6000 file. You do not have to put in all the sudoers directives or stanzas. If this file contains entries where they are not required, don't include them. For example, my include file contains only the following text, and nothing more.

You can use the %h, instead of typing the actual hostname:

I personally do not use this method because I have seen it return extra characters in the hostname. This issue is fixed in sudo 1.7.2p1.

bravo rs6000 = (root) NOPASSWD: /usr/opt/db2_08_01/adm/db2licd -end
bravo rs6000 = (root) NOPASSWD: /usr/opt/db2_08_01/adm/db2licd
bravo rs6000 = (db2inst) NOPASSWD: /home/db2inst/sqllib/adm/db2start
bravo rs6000 = (db2inst) NOPASSWD: /home/db2inst/sqllib/adm/db2stop force

When you run visudo, and you save and quit the file, visudo will inform you to click Enter to edit the include sudoers file. Once you have edited the file, sudo will pick up on syntax errors if any, as with the main file. Alternatively, to edit the include file directly, use:

visudo -f /etc/sudo_static.rs6000
Using groups

Users belonging to a valid AIX group can be included in sudoers, making the sudoers file more manageable with fewer entries per user. When reorganizing the sudoers entries to include groups, you may have to create new groups under AIX to include users that are only allowed to use sudo for certain commands. To use groups, simply prefix the entries with a '%'. Assume you have groups called devops and devuat , and within those groups you have the following users:

# lsgroup -f -a users devops
devops:
        users=joex,delta,charlie,tstgn

# lsgroup -f -a users devuat
devuat:
        users=zebra,spsys,charlie

Suppose the group devops is to be allowed to run the /usr/local/bin/data_ext.sh command as dbdftst.

And the group devuat is to be allowed to run the commands /usr/local/bin/data_mvup.sh and /usr/local/bin/data_rep.sh as dbukuat.

We could have the following sudoers entries:

%devops rs6000 = (dbdftst) NOPASSWD: /usr/local/bin/data_ext.sh
%devuat rs6000 = (dbukuat) /usr/local/bin/data_mvup.sh
%devuat rs6000 = (dbukuat) /usr/local/bin/data_rep.sh

Notice in the previous entries, the group devops users will not be prompted for their password when executing /usr/local/bin/data_ext.sh; however, the group devuat users will be prompted for their password. User "charlie" is a member of both groups ( devops and devuat ), so he can execute all the above commands.

Timeout with sudo

Sudo has a feature that uses time tickets to determine how long since the last sudo command was run. During this time period, the user can re-run the command without being prompted for the password (that's the user's own password). Once this time allotment has ended, the user is prompted for the password again to re-run the command. If the user gives the correct password, the command is executed, the ticket is then re-set, and the time clock starts all over again. The ticket feature will not work if you have NOPASSWD in the user's entry in sudoers. The default timeout is five minutes. If you wish to change the default value, simply put an entry in sudoers. For example, to set the timeout value for user "bravo" on any commands he runs to 20 minutes, you could use:

Defaults:bravo timestamp_timeout=20

To destroy the ticket, as the user, use:

$ sudo -k

When the ticket is destroyed, the user will be prompted for his password again, when running a sudo command.

Please do not set the timeout value for all users, as this will cause problems, especially when running jobs in batch and the batch takes longer to run than normal. To disable this feature, use the value -1 in the timestamp_timeout variable. The time tickets are directory entries with the name of the user located in /var/run/sudo.

Those variables

As discussed earlier, sudo will strip out potentially dangerous system variables. To check out what variables are kept and which ones are stripped, use sudo -V . The output will give you a listing of preserved and stripped variables. Stripping out the LIBPATH is clearly an inconvenience. There are a couple of ways around this--either write a wrapper script or specify the environments on the command line. Looking at the wrapper script solution first, suppose you have an application that stops or starts a DB2® instance. You could create a bare-bones script that would keep the variables intact. In Listing 1. rc.db2 , notice that you source the instance profile, which in turn exports various LIBPATH and DB2 environment variables, keeping the environment variable intact, by using:

. /home/$inst/sqllib/db2profile

For completeness, the entries in sudoers to execute this is and not strip out any system environment variables are:

bravo rs6000 = (dbinst4) NOPASSWD: /home/dbinst4/sqllib/adm/db2start
bravo rs6000 = (dbinst4) NOPASSWD: /home/dbinst4/sqllib/adm/db2stop force
bravo rs6000 = (dbinst4) NOPASSWD: /usr/local/bin/rc.db2 stop db2inst4
bravo rs6000 = (dbinst4) NOPASSWD: /usr/local/bin/rc.db2 start db2inst4

Note in this example, user "bravo" can execute the above commands as user "dbinst4." Typically, the user would run:

sudo -u dbinst4 /usr/local/bin/rc.db2 stop db2inst4
sudo -u dbinst4 /usr/local/bin/rc.db2 start db2inst4
Listing 1. rc.db2
#!/bin/sh
# rc.db2
# stop/start db2 instances

# check to see if db2 inst is running
db2_running()
{
    state=`ps -ef | grep db2sysc | grep -v grep | awk '$1=="'${inst}'" { print $1 }'`
    if [ "$state" = "" ]
    then
        return 1
    else
        return 0
    fi
}

usage()
{
    echo "`basename $0` start | stop <instance>"
}

# stop db2
stop_db2()
{
    echo "stopping db2 instance as user $inst"
    if [ -f /home/$inst/sqllib/db2profile ]; then
        . /home/$inst/sqllib/db2profile
    else
        echo "Cannot source DB2..exiting"
        exit 1
    fi
    /home/$inst/sqllib/adm/db2stop force
}

# start db2
start_db2()
{
    echo "starting db2 instance as user $inst"
    if [ -f /home/$inst/sqllib/db2profile ]; then
        . /home/$inst/sqllib/db2profile
    else
        echo "Cannot source DB2..exiting"
        exit 1
    fi
    /home/$inst/sqllib/adm/db2start
}

# check we get 2 params
if [ $# != 2 ]
then
    usage
    exit 1
fi

inst=$2

case "$1" in
Start|start)
    if db2_running
    then
        echo "db2 instance $inst appears to be already running"
        exit 0
    else
        echo "instance not running as user $inst..attempting to start it"
        start_db2 $inst
    fi
    ;;
Stop|stop)
    if db2_running
    then
        echo "instance running as $inst..attempting to stop it"
        stop_db2 $inst
    else
        echo "db2 instance $inst appears to be not running anyway"
        exit 0
    fi
    ;;
*)
    usage
    ;;
esac

The other way to preserve system environment variables is to use the Defaults !env_reset directive, like in sudoers:

Defaults !env_reset

Then from the command line, specify the environment variable name with its value:

$ sudo LIBPATH="/usr/lib:/opt/db2_09_05/lib64" -u delta /usr/local/bin/datapmp

If you do not put the !env_reset entry in, you will get the following error from sudo when you try to run the command:

sudo: sorry, you are not allowed to set the following environment variables: LIBPATH

If you find that sudo is also stripping out other environment variables, you can specify the variable name in sudoers so that sudo keeps those variables intact (with the Defaults env_keep += directive). For instance, suppose sudo was stripping out the application variables DSTAGE_SUP and DSTAGE_META from one of my sudo-ised scripts. To preserve these variables, I could put the following entries in sudoers:

Defaults env_keep += "DSTAGE_SUP"
Defaults env_keep += "DSTAGE_META"

Notice that I give the variable name and not the variable value. The values are already contained in my script like this:

export DSTAGE_SUP=/opt/dstage/dsengine; export DSTAGE_META=/opt/dstage/db2

Now when the sudo script is executed, the above environment variables are preserved.

Securing the sudo path

A default PATH within sudoers can be imposed using the secure_path directive. This directive specifies where to look for binaries and commands when a user executes a sudo command. This option clearly tries to lock down specific areas where a user runs a sudo command, which is good practice. Use the following directive in sudoers, specifying the secure PATH with its search directories:

Defaults secure_path="/usr/local/sbin:/usr/local/bin:/opt/freeware/bin:/usr/sbin"
Getting restrictive

Restrictions can be put in place to restrict certain commands to users. Assume you have a group called dataex , whose members are "alpha," "bravo," and "charlie." Now, that group has been allowed to run the sudo command /usr/local/bin/mis_ext * , where the asterisk represents the many parameters passed to the script. However, user "charlie" is not allowed to execute that script if the parameter is import . This type of condition can be met by using the logical NOT '!' operator. Here is how that is achieved in sudoers:

%dataex rs6000 = (dbmis) NOPASSWD: /usr/local/bin/mis_ext *
charlie rs6000 = (dbmis) NOPASSWD: !/usr/local/bin/mis_ext import

Note that the logical NOT operator entries go after the non-restrictive entry. Many conditional NOT entries can be applied on the same line; just make sure that they are comma separated, like so:

charlie rs6000 = (dbmis) NOPASSWD: /usr/local/bin/aut_pmp *
charlie rs6000 = (dbmis) NOPASSWD: !/usr/local/bin/aut_pmp create, \
        !/usr/local/bin/aut_pmp delete, !/usr/local/bin/aut_pmp amend

When in visudo, do not think just saving the sudo entry and staying in visudo will make the changes effective; it won't. You must exit visudo for the changes to take effect.

Rolling out sudo commands

Rolling out sudo commands to remote hosts in an enterprise environment is best done using an ssh script run as root, with keys already exchanged between the hosts for password-less logins. Let's look at one example of how to do this. With geographically remote machines, if you get a hardware issue of some sort (disk or memory), the IBM® engineer will be on-site to replace the failing hardware. There will be occasions when they require the root password to carry out their task. One procedure you might want to put in place is that, for the engineer to gain access to root, they must use sudo. Informing the engineer of the password prior to the visit would be advantageous. Listing 2 demonstrates one way you could roll out this configuration. Looking more closely at Listing 2, a for loop iterates over a list of hosts you are pushing out to. (Generally, though, you would have these hosts in a text file and read them in using a while loop.) Using the 'here' document method, a backup copy of sudoers is made, and an entry is then appended to sudoers, like so:

# -- ibmeng sudo root
ibmeng host1 = (root) NOPASSWD:ALL

Next, the user "ibmeng" is created, and the password is set for the user using chpasswd . In this demonstration, it is ibmpw . A message is then appended to their profile, informing the user how to sudo to root. So when the engineer logs in, he is presented with the message:

IBM Engineer, to access root account type: sudo -u root su -

Of course the account for ibmeng would be locked after the visit.

Listing 2. dis_ibm
#!/bin/sh
# dis_ibm
dest_hosts='host1 host2 host3 host4'
for host in $dest_hosts
do
echo "doing [$host]"
$ssh -T -t -l root $host<<'mayday'
host=`hostname`
cp /etc/sudoers /etc/sudoers.bak
if [ $? != 0 ]
then
echo "error: unable to cp sudoers file"
exit 1
fi
echo "# -- ibmeng sudo root\nibmeng $host = (root) NOPASSWD:ALL">>/etc/sudoers
mkuser su=false ibmeng
if [ $? = 0 ]
then
echo "ibmeng:ibmpw" | chpasswd -c
else
echo "error: unable to create user ibmeng and or passwd"
exit 1
fi
chuser gecos='IBM engineer acc' ibmeng
if [ -f /home/ibmeng/.profile ]
then
echo "echo \"IBM Engineer, to access root account type: sudo -u root su -\"" >>/home/ibmeng/.profile
fi
mayday
done
Conclusion

Sudo allows you to control who can run what commands as whom. But you must be able to understand the features of sudoers fully to gain maximum understanding of its implications and responsibility.



[Nov 09, 2017] Add an netgroup in sudoers instead a group

Nov 09, 2017 | hd.free.fr

5 thoughts on "sudo command & sudoers file : Concepts and Practical examples"

  2. Andres Ferreo July 16, 2014 at 21:18

    I'd like to add a netgroup in sudoers instead of a group. Is that possible? How should I do this setup?

    Thanks.

    1. Pier Post author July 17, 2014 at 22:50

      In order to use a netgroup in the sudoers file, you just need to explicitly define it as a netgroup by using the " + " sign (instead of a " % " sign that would be used for a system group).

      You will need to include this netgroup inside a User_Alias (you may want to create a new User_Alias for this purpose)

      Please check the " 3.1.2 User_Alias " section for more infos, and feel free to ask for more detailed explanation.

      Hope this helps.

      Pier.

  3. Matthew February 14, 2014 at 15:43

    Great info, just diving into the world of this, and was trying to figure out how to limit a login to run a cache clearing command

    user ALL=NOPASSWD: rm -rf /usr/nginx/cache/*

    but i got a syntax error

    1. Pier Post author February 17, 2014 at 07:22

      Hi,

      Looks like you forgot the following part of the command specs :
      3. (ALL) : This is the part that specify which user(s) you may act as.

      Check the 2.1 Section of the current page, you may want to have something like :
      user ALL=(ALL) NOPASSWD: /sbin/rm -rf /usr/nginx/cache/*

      Always use the full path for any given command : This will prevent you from using a bad aliased command.

[Nov 09, 2017] TERM strings by Tom Ryder

Jan 26, 2013 | sanctum.geek.nz

A certain piece of very misleading advice is often given online to users having problems with the way certain command-line applications are displaying in their terminals. This is to suggest that the user change the value of their TERM environment variable from within the shell, doing something like this:

$ TERM=xterm-256color

This misinformation sometimes extends to suggesting that users put the forced TERM change into their shell startup scripts. The reason this is such a bad idea is that it forces your shell to assume what your terminal is, and thereby disregards the initial terminal identity string sent by the emulator. This leads to a lot of confusion when one day you need to connect with a very different terminal emulator.

Accounting for differences

All terminal emulators are not created equal. Certainly, not all of them are xterm(1) , although many other terminal emulators do a decent but not comprehensive job of copying it. The value of the TERM environment variable is used by the system running the shell to determine what the terminal connecting to it can and cannot do, what control codes to send to the program to use those features, and how the shell should understand the input of certain key codes, such as the Home and End keys. These things in particular are common causes of frustration for new users who turn out to be using a forced TERM string.

Instead, focus on these two guidelines for setting TERM :

  1. Avoid setting TERM from within the shell, especially in your startup scripts like .bashrc or .bash_profile . If that ever seems like the answer, then you are probably asking the wrong question! The terminal identification string should always be sent by the terminal emulator you are using; if you do need to change it, then change it in the settings for the emulator.
  2. Always use an appropriate TERM string that accurately describes what your choice of terminal emulator can and cannot display. Don't make an rxvt(1) terminal identify itself as xterm ; don't make a linux console identify itself as vt100 ; and don't make an xterm(1) compiled without 256 color support refer to itself as xterm-256color .

In particular, note that sometimes for compatibility reasons, the default terminal identification used by an emulator is given as something generic like xterm , when in fact a more accurate or comprehensive terminal identity file is more than likely available for your particular choice of terminal emulator with a little searching.

An example that surprises a lot of people is the availability of the putty terminal identity file, when the application defaults to presenting itself as an imperfect xterm(1) emulator.
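Whether a given identity file is actually installed varies from host to host, so before advertising something specific like putty-256color, it can help to probe for it first. The helper below is a hypothetical sketch, not part of the article; it assumes the infocmp(1) utility from ncurses, and falls back to plain xterm only when the requested entry is genuinely absent:

```shell
# Print the requested TERM if a terminfo entry for it exists on this
# system; otherwise fall back to the generic "xterm".
pick_term() {
    if infocmp "$1" >/dev/null 2>&1; then
        printf '%s\n' "$1"
    else
        printf 'xterm\n'
    fi
}

pick_term putty-256color
```

This keeps the decision tied to what the system actually knows about, rather than forcing a string and hoping.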

Configuring your emulator's string

Before you change your terminal string in its settings, check whether the default it uses is already the correct one, with one of these:

$ echo $TERM
$ tset -q

Most builds of rxvt(1) , for example, should already use the correct TERM string by default, such as rxvt-unicode-256color for builds with 256 colors and Unicode support.

Where to configure which TERM string your terminal uses will vary depending on the application. For xterm(1) , your .Xresources file should contain a definition like the below:

XTerm*termName: xterm-256color

For rxvt(1) , the syntax is similar:

URxvt*termName: rxvt-unicode-256color

Other GTK and Qt emulators sometimes include the setting somewhere in their preferences. Look for mentions of xterm , a common fallback default.

For Windows PuTTY, it's configurable under the "Connection > Data" section:

Setting the terminal string in PuTTY

More detail about configuring PuTTY for connecting to modern systems can be found in my article on configuring PuTTY .

Testing your TERM string

On GNU/Linux systems, an easy way to test the terminal capabilities (particularly effects like colors and reverse video) is using the msgcat(1) utility:

$ msgcat --color=test

This will output a large number of tests of various features to the terminal, so that you can check their appearance is what you expect.

Finding appropriate terminfo(5) definitions

On GNU/Linux systems, the capabilities and behavior of various terminal types is described using terminfo(5) files, usually installed as part of the ncurses package. These files are often installed in /lib/terminfo or /usr/share/terminfo , in subdirectories by first letter.

In order to use a particular TERM string, an appropriate file must exist in one of these directories. On Debian-derived systems, a large collection of terminal types can be installed to the system with the ncurses-term package.

For example, the following variants of the rxvt terminal emulator are all available:

$ cd /usr/share/terminfo/r
$ ls rxvt*
rxvt-16color  rxvt-256color  rxvt-88color  rxvt-color  rxvt-cygwin
rxvt-cygwin-native  rxvt+pcfkeys  rxvt-unicode-256color  rxvt-xpm
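A quick way to see which of these definitions a given system actually has is to probe each candidate with infocmp(1). This is a throwaway sketch; adjust the list of names to the terminals you actually use:

```shell
# Report which terminfo entries are present on this system.
for t in rxvt-unicode-256color xterm-256color screen-256color putty-256color; do
    if infocmp "$t" >/dev/null 2>&1; then
        echo "$t: installed"
    else
        echo "$t: missing"
    fi
done
```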
Private and custom terminfo(5) files

If you connect to a system that doesn't have a terminfo(5) definition to match the TERM definition for your particular terminal, you might get a message similar to this on login:

setterm: rxvt-unicode-256color: unknown terminal type
tput: unknown terminal "rxvt-unicode-256color"
$

If you're not able to install the appropriate terminal definition system-wide, one technique is to use a private .terminfo directory in your home directory containing the definitions you need:

$ cd ~/.terminfo
$ find
.
./x
./x/xterm-256color
./x/xterm
./r
./r/rxvt-256color
./r/rxvt-unicode-256color
./r/rxvt
./s
./s/screen
./s/screen-256color
./p
./p/putty-256color
./p/putty

You can copy this to your home directory on the servers you manage with a tool like scp :

$ scp -r .terminfo server:
TERM and multiplexers

Terminal multiplexers like screen(1) and tmux(1) are special cases, and they cause perhaps the most confusion to people when inaccurate TERM strings are used. The tmux FAQ even opens by saying that most of the display problems reported by people are due to incorrect TERM settings, and a good portion of the codebase in both multiplexers is dedicated to negotiating the differences between terminal capabilities.

This is because they are "terminals within terminals", and provide their own functionality only within the bounds of what the outer terminal can do. In addition to this, they have their own type for terminals within them; both of them use screen and its variants, such as screen-256color .

It's therefore very important to check that both the outer and inner definitions for TERM are correct. In .screenrc it usually suffices to use a line like the following:

term screen

Or in .tmux.conf :

set-option -g default-terminal screen

If the outer terminals you use consistently have 256 color capabilities, you may choose to use the screen-256color variant instead.

If you follow all of these guidelines, your terminal experience will be much smoother, as your terminal and your system will understand each other that much better. You may find that this fixes a lot of struggles with interactive tools like vim(1) , for one thing, because if the application is able to divine things like the available color space directly from terminal information files, it saves you from having to include nasty hacks on the t_Co variable in your .vimrc .

[Nov 09, 2017] PuTTY configuration by Tom Ryder

Dec 22, 2012 | sanctum.geek.nz

PuTTY is a terminal emulator with a free software license, including an SSH client. While it has cross-platform ports, it's used most frequently on Windows systems, because they otherwise lack a built-in terminal emulator that interoperates well with Unix-style TTY systems.

While it's very popular and useful, PuTTY's defaults are quite old, and are chosen for compatibility reasons rather than to take advantage of all the features of a more complete terminal emulator. For new users, this is likely an advantage as it can avoid confusion, but more advanced users who need to use a Windows client to connect to a modern GNU/Linux system may find the defaults frustrating, particularly when connecting to a more capable and custom-configured server.

Here are a few of the problems with the default configuration, each of which is covered in a section below:

  1. It identifies itself as a generic xterm rather than using the more accurate putty terminal type.
  2. 256-color mode is not enabled.
  3. Output is not interpreted as UTF-8, so non-ASCII characters display incorrectly.
  4. It uses the dated Courier New font rather than Consolas.
  5. The terminal bell plays the audible system alert.
  6. The default color palette is the harsh xterm-style one.

All of these things are fixable.

Terminal type

Usually the most important thing in getting a terminal working smoothly is to make sure it identifies itself correctly to the machine to which it's connecting, using an appropriate $TERM string. By default, PuTTY identifies itself as an xterm(1) terminal emulator, which most systems will support.

However, there's a terminfo(5) definition for putty and putty-256color available as part of ncurses , and if you have it available on your system then you should use it, as it slightly more precisely describes the features available to PuTTY as a terminal emulator.

You can check that you have the appropriate terminfo(5) definition installed by looking in /usr/share/terminfo/p :

$ ls -1 /usr/share/terminfo/p/putty*
/usr/share/terminfo/p/putty  
/usr/share/terminfo/p/putty-256color  
/usr/share/terminfo/p/putty-sco  
/usr/share/terminfo/p/putty-vt100

On Debian and Ubuntu systems, these files can be installed with:

# apt-get install ncurses-term

If you can't install the files via your system's package manager, you can also keep a private repository of terminfo(5) files in your home directory, in a directory called .terminfo :

$ ls -1 $HOME/.terminfo/p
putty
putty-256color

Once you have this definition installed, you can instruct PuTTY to identify with that $TERM string in the Connection > Data section:

Correct terminal definition in PuTTY

Here, I've used putty-256color ; if you don't need or want a 256 color terminal you could just use putty .

Once connected, make sure that your $TERM string matches what you specified, and hasn't been mangled by any of your shell or terminal configurations:

$ echo $TERM
putty-256color
Color space

Certain command line applications like Vim and Tmux can take advantage of a full 256 colors in the terminal. If you'd like to use this, set PuTTY's $TERM string to putty-256color as outlined above, and select Allow terminal to use xterm 256-colour mode in Window > Colours

256 colours in PuTTY

You can test this is working by using a 256 color application, or by trying out the terminal colours directly in your shell using tput :

$ for ((color = 0; color <= 255; color++)); do
> tput setaf "$color"
> printf "test"
> done

If you see the word test in many different colors, then things are probably working. Type reset to fix your terminal after this:

$ reset
Using UTF-8

If you're connecting to a modern GNU/Linux system, it's likely that you're using a UTF-8 locale. You can check which one by typing locale . In my case, I'm using the en_NZ locale with UTF-8 character encoding:

$ locale
LANG=en_NZ.UTF-8
LANGUAGE=en_NZ:en
LC_CTYPE="en_NZ.UTF-8"
LC_NUMERIC="en_NZ.UTF-8"
LC_TIME="en_NZ.UTF-8"
LC_COLLATE="en_NZ.UTF-8"
LC_MONETARY="en_NZ.UTF-8"
LC_MESSAGES="en_NZ.UTF-8"
LC_PAPER="en_NZ.UTF-8"
LC_NAME="en_NZ.UTF-8"
LC_ADDRESS="en_NZ.UTF-8"
LC_TELEPHONE="en_NZ.UTF-8"
LC_MEASUREMENT="en_NZ.UTF-8"
LC_IDENTIFICATION="en_NZ.UTF-8"
LC_ALL=

If the output of locale does show you're using a UTF-8 character encoding, then you should configure PuTTY to interpret terminal output using that character set; it can't detect it automatically (which isn't PuTTY's fault; it's a known hard problem). You do this in the Window > Translation section:

Using UTF-8 encoding in PuTTY

While you're in this section, it's best to choose the Use Unicode line drawing code points option as well. Line-drawing characters are most likely to work properly with this setting for UTF-8 locales and modern fonts:

Using Unicode line-drawing points in PuTTY

If Unicode and its various encodings are new to you, I highly recommend Joel Spolsky's classic article about what programmers should know about both.

Fonts

Courier New is a workable monospace font, but modern Windows systems include Consolas , a much nicer terminal font. You can change this in the Window > Appearance section:

Using Consolas font in PuTTY

There's no reason you can't use another favourite Bitmap or TrueType font instead once it's installed on your system; DejaVu Sans Mono , Inconsolata , and Terminus are popular alternatives. I personally favor Ubuntu Mono .

Bells

Terminal bells by default in PuTTY emit the system alert sound. Most people find this annoying; some sort of visual bell tends to be much better if you want to use the bell at all. Configure this in Terminal > Bell

Given the purpose of the alert is to draw attention to the window, I find that using a flashing taskbar icon works well; I use this to draw my attention to my prompt being displayed after a long task completes, or if someone mentions my name or directly messages me in irssi(1) .

Another option is using the Visual bell (flash window) option, but I personally find this even worse than the audible bell.

Default palette

The default colours for PuTTY are rather like those used in xterm(1) , and hence rather harsh, particularly if you're used to the slightly more subdued colorscheme of terminal emulators like gnome-terminal(1) , or have customized your palette to something like Solarized .

If you have decimal RGB values for the colours you'd prefer to use, you can enter those in the Window > Colours section, making sure that Use system colours and Attempt to use logical palettes are unchecked:

There are a few other default annoyances in PuTTY, but the above are the ones that seem to annoy advanced users most frequently. Dag Wieers has a similar post with a few more defaults to fix.

[Nov 04, 2017] Time to move away from HPE Software by Lindsay Hill

Nov 04, 2017 | lkhill.com

Time to move away from HPE Software 15 September 2016 · Filed in Opinion

If you are still using HPE Software, you should actively plan to migrate away. The recent divestiture does not look good to me - I think existing customers are going to get soaked. Plan your migration now.

I've said it before, that I retain a soft spot for Hewlett-Packard. They gave me my first professional job out of university. I served my sentence doing HP OpenView consulting, and HP-UX Administration, but still: it got me started. Once you have some professional experience, it's much easier to move to the next role.

It saddens me to watch HP's ongoing struggles. It's sad to watch a big ship get broken up for parts. But things had to change. They need to do something to adapt to the realities of modern IT demands.

There was one line in the recent announcement about divesting HPE's software assets that stood out to me:

Micro Focus expects to improve the margin on HPE's software assets by approximately 20 percentage points by the end of the third full financial year following the closing of the transaction

Press Release

(Emphasis added).

It has been clear for a while that HP Software was no longer a core asset for HPE. It was clear that it was not adapting, and was being starved of investment. Revenues have been declining. Smart customers have seen this coming, and have been actively migrating away from HPE Software.

But if you're still using it, you should pay attention to that press release. How do you think Micro Focus plans to improve margins by 20 percentage points? That's a lot of margin. You've got three options:

  1. Increase sales. Software development has high fixed costs, so margin improves with additional sales.
  2. Increase prices, collecting more money from existing customers.
  3. Reduce investment, spending less to improve margins, and hope customers don't notice.

This is a mature business. They will have a low percentage of new customers. Most revenue will be coming from existing customers. It is not a growth market. So what's left? Raise prices, and reduce investment.

If you're an existing customer, expect to see more license audits, and higher renewal quotes. Expect to see feature stagnation.

It won't happen straight away, but it will happen. If you're still delaying that migration, time to get a move on.

[Nov 04, 2017] Has HP Abandoned Operations Manager Lindsay Hill

Nov 04, 2017 | lkhill.com

HP OM has not adapted well to modern demands. It does not deal well with VMs being deployed at a high rate. It does not offer service monitoring capabilities. It does not offer any way to connect to cloud provider APIs. The agents have continued to be unstable. The administrative interface for OML/OMU looks like something I wrote over a weekend, based on a dodgy PHP shopping cart. It does not look like a piece of software that costs tens of thousands of dollars. Or actually maybe it does - Enterprise software in general tends to be ugly. HP didn't even develop it themselves - they licensed the admin interface from Blue Elephant Systems . The Java GUI for OML/OMU was a disgrace in 2002 - and it hasn't changed since.

William , April 13, 2016 10:30 AM

Again, at another site where they are attempting to move to OMi ( BSM ) Just a note here. BSM is the top tier interface through which other products flow. A crude analogy would be Microsoft Office is the suite in which many other products like Access, Outlook, One Note...etc...etc are pieces or parts. OMi is a piece of the comprehensive suite of tools offered by HP Software. Just like "OpenView" was the umbrella word used for all HP tools like OM-W, OM-U, NNM, OV...PI, TA, SI, PM and a host of other products. The jury is still out on whether the products are viable as a management suite. One major consideration is ROI. Problems still exist in ALL the tools, SiS does not provide capabilities or granularity agents have. I could write or borrow scripts ( Perl, Shell, VB, Powershell ) to effectively do everything it does. OMi loses CI's, does not get critical messages forwarded, loses communication with the agents it is supposedly managing for starters, NNMi has issues not finding nodes that it should discover when discovery filters are configured. And I could add a dozen other "dirty diapers" in the suite. Yet, one can see where HP is trying to go here. IF a few of those 400 million in development dollars are thrown at the suite it could prove a valuable suite in any IT department's arsenal.

Hank Williams , October 9, 2013 10:08 AM

Theoretically BSM/OMi looks like an HPOM alternative, but looking at the scalability, the TCO, the complexity (and, and, and ...) it isn't. If you are wary about migrating to BSM, be provocative and ask HP for a reference implementation and analyse the length and cost of the implementation.

William Linn Hank Williams , July 28, 2017 3:45 PM

OMi is a little cleaner, on my last customer site it at least functioned in the 10.12 version. HP couldn't sell BSM with all the integrations like they thought. I personally know of several large enterprises that unceremoniously dumped ALL HP products, like Data Protector, HP-UX when the monitoring tool became an albatross around their necks as far as implementation and complexity of BSM/OMi. So, HP has done what HP always does when they have a major malfunction in marketing. They "REBRANDED"!!!! Seems that is what you see in companies that go out of business in one location, then move to a secondary spot OR, better yet they have a huge "going out of business" sale and the products never get lowered in price, they actually mark them up, if they sell great, if not then they close for a couple of weeks and company A opens as company B with all the same inventory at a marked up price. Maybe not really the scenario HP is using, but close. OMi by itself without the uCMDB ( which causes other issues when reconciliation occurs between agent based CI' and CIT and what is found via the scripts uCMDB uses to collect data, mismatches as one sees it differently and then the CI or CIT is removed, IF a critical system...boom...no monitoring as the policies are gone and there is only a reference to the CI in OMi. ) but...as noted, OMi by itself seems stable, though they are now in version 10.61...and by the way...the patch from 10.60 to 61 is flawed. But aside from the complications of TQL's, RTSM, etc...etc...it looks a whole lot more stable.

Lindsay Hill Hank Williams , October 9, 2013 11:57 PM

You're right - I've seen those implementation plans, and it gets very expensive, very quickly. You have to put in a lot of effort just into getting software installed and integrated - none of which is of any direct value to the customer. Maybe justifiable in huge environments, but for the rest of us? Not really.
Customers shouldn't have to pay for fixing broken integrations, they should be able to just start using the software to solve their business problems. We're years away from reaching that point though.

[Nov 01, 2017] Cron best practices by Tom Ryder

May 08, 2016 | sanctum.geek.nz

The time-based job scheduler cron(8) has been around since Version 7 Unix, and its crontab(5) syntax is familiar even for people who don't do much Unix system administration. It's standardised , reasonably flexible, simple to configure, and works reliably, and so it's trusted by both system packages and users to manage many important tasks.

However, like many older Unix tools, cron(8) 's simplicity has a drawback: it relies upon the user to know some detail of how it works, and to correctly implement any other safety checking behaviour around it. Specifically, all it does is try and run the job at an appropriate time, and email the output. For simple and unimportant per-user jobs, that may be just fine, but for more crucial system tasks it's worthwhile to wrap a little extra infrastructure around it and the tasks it calls.

There are a few ways to make the way you use cron(8) more robust if you're in a situation where keeping track of the running job is desirable.

Apply the principle of least privilege

The sixth column of a system crontab(5) file is the username of the user as which the task should run:

0 * * * *  root  cron-task

To the extent that is practical, you should run the task as a user with only the privileges it needs to run, and nothing else. This can sometimes make it worthwhile to create a dedicated system user purely for running scheduled tasks relevant to your application.

0 * * * *  myappcron  cron-task

This is not just for security reasons, although those are good ones; it helps protect you against nasties like scripting errors attempting to remove entire system directories .

Similarly, for tasks with database systems such as MySQL, don't use the administrative root user if you can avoid it; instead, use or even create a dedicated user with a unique random password stored in a locked-down ~/.my.cnf file, with only the needed permissions. For a MySQL backup task, for example, only a few permissions should be required, including SELECT , SHOW VIEW , and LOCK TABLES .
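As a concrete sketch of that setup (the user name, database name, and password here are invented placeholders), the locked-down ~/.my.cnf needs nothing more than:

```
[client]
user     = backupcron
password = some-long-random-password
```

Tighten it with chmod 600 ~/.my.cnf, and on the server grant only what the dump requires, e.g. GRANT SELECT, SHOW VIEW, LOCK TABLES ON appdb.* TO 'backupcron'@'localhost';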

In some cases, of course, you really will need to be root . In particularly sensitive contexts you might even consider using sudo(8) with appropriate NOPASSWD options, to allow the dedicated user to run only the appropriate tasks as root , and nothing else.

Test the tasks

Before placing a task in a crontab(5) file, you should test it on the command line, as the user configured to run the task and with the appropriate environment set. If you're going to run the task as root , use something like su or sudo -i to get a root shell with the user's expected environment first:

$ sudo -i -u cronuser
$ cron-task

Once the task works on the command line, place it in the crontab(5) file with the timing settings modified to run the task a few minutes later, and then watch /var/log/syslog with tail -f to check that the task actually runs without errors, and that the task itself completes properly:

May  7 13:30:01 yourhost CRON[20249]: (you) CMD (cron-task)

This may seem pedantic at first, but it becomes routine very quickly, and it saves a lot of hassles down the line as it's very easy to make an assumption about something in your environment that doesn't actually hold in the one that cron(8) will use. It's also a necessary acid test to make sure that your crontab(5) file is well-formed, as some implementations of cron(8) will refuse to load the entire file if one of the lines is malformed.

If necessary, you can set arbitrary environment variables for the tasks at the top of the file:

MYVAR=myvalue

0 * * * *  you  cron-task
Don't throw away errors or useful output

You've probably seen tutorials on the web where in order to keep the crontab(5) job from sending standard output and/or standard error emails every five minutes, shell redirection operators are included at the end of the job specification to discard both the standard output and standard error. This kluge is particularly common for running web development tasks by automating a request to a URL with curl(1) or wget(1) :

*/5 * * *  root  curl https://example.com/cron.php >/dev/null 2>&1

Ignoring the output completely is generally not a good idea, because unless you have other tasks or monitoring ensuring the job does its work, you won't notice problems (or know what they are), when the job emits output or errors that you actually care about.

In the case of curl(1) , there are just way too many things that could go wrong, that you might notice far too late:

  1. The site could be down, or returning an HTTP error status that a bare curl(1) call still treats as success in its exit status.
  2. The URL could be moved permanently, leaving the job silently requesting a redirect page that does nothing.
  3. The progress meter's output could land in your mailbox every five minutes even when nothing is wrong.

The author has seen all of the above happen, in some cases very frequently.

As a general policy, it's worth taking the time to read the manual page of the task you're calling, and to look for ways to correctly control its output so that it emits only the output you actually want. In the case of curl(1) , for example, I've found the following formula works well:

curl -fLsS -o /dev/null http://example.com/

This way, the curl(1) request should stay silent if everything is well, per the old Unix philosophy Rule of Silence .

You may not agree with some of the choices above; you might think it important to e.g. log the complete output of the returned page, or to fail rather than silently accept a 301 redirect, or you might prefer to use wget(1) . The point is that you take the time to understand in more depth what the called program will actually emit under what circumstances, and make it match your requirements as closely as possible, rather than blindly discarding all the output and (worse) the errors. Work with Murphy's law ; assume that anything that can go wrong eventually will.
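If the program offers no output-control options at all, you can still get "silent unless something went wrong" behaviour with a small wrapper function, in the spirit of the chronic(1) tool from moreutils. This is a sketch of my own, not part of the article; put it in the task's own script rather than on the crontab(5) line, since cron(8) won't know about shell functions:

```shell
# Run a command, discarding its output unless it exits non-zero,
# in which case replay everything it printed on stderr.
quiet_run() {
    _out=$("$@" 2>&1)
    _status=$?
    if [ "$_status" -ne 0 ]; then
        printf '%s\n' "$_out" >&2
    fi
    return "$_status"
}
```

With this wrapper, a healthy run generates no mail at all, while a failing one mails you both the error and anything the command printed before it died.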

Send the output somewhere useful

Another common mistake is failing to set a useful MAILTO at the top of the crontab(5) file, as the specified destination for any output and errors from the tasks. cron(8) uses the system mail implementation to send its messages, and typically, default configurations for mail agents will simply send the message to an mbox file in /var/mail/$USER , which the user may never read. This defeats much of the point of mailing output and errors.

This is easily dealt with, though; ensure that you can send a message to an address you actually do check from the server, perhaps using mail(1) :

$ printf '%s\n' 'Test message' | mail -s 'Test subject' you@example.com

Once you've verified that your mail agent is correctly configured and that the mail arrives in your inbox, set the address in a MAILTO variable at the top of your file:

MAILTO=you@example.com

0 * * * *    you  cron-task-1
*/5 * * * *  you  cron-task-2

If you don't want to use email for routine output, another method that works is sending the output to syslog with a tool like logger(1) :

0 * * * *   you  cron-task | logger -it cron-task

Alternatively, you can configure aliases on your system to forward system mail destined for you on to an address you check. For Postfix, you'd use an aliases(5) file.

I sometimes use this setup in cases where the task is expected to emit a few lines of output which might be useful for later review, but send stderr output via MAILTO as normal. If you'd rather not use syslog , perhaps because the output is high in volume and/or frequency, you can always set up a log file /var/log/cron-task.log but don't forget to add a logrotate(8) rule for it!
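If you do go the log file route, the matching logrotate(8) rule is only a few lines. This is a sketch with placeholder names and retention, dropped into a file such as /etc/logrotate.d/cron-task:

```
/var/log/cron-task.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}
```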

Put the tasks in their own shell script file

Ideally, the commands in your crontab(5) definitions should only be a few words, in one or two commands. If the command is running off the screen, it's likely too long to be in the crontab(5) file, and you should instead put it into its own script. This is a particularly good idea if you want to reliably use features of bash or some other shell besides POSIX/Bourne /bin/sh for your commands, or even a scripting language like Awk or Perl; by default, cron(8) uses the system's /bin/sh implementation for parsing the commands.

Because crontab(5) files don't allow multi-line commands, and have other gotchas like the need to escape percent signs % with backslashes, keeping as much configuration out of the actual crontab(5) file as you can is generally a good idea.

If you're running cron(8) tasks as a non-system user, and can't add scripts into a system bindir like /usr/local/bin , a tidy method is to start your own, and include a reference to it as part of your PATH . I favour ~/.local/bin , and have seen references to ~/bin as well. Save the script in ~/.local/bin/cron-task , make it executable with chmod +x , and include the directory in the PATH environment definition at the top of the file:

PATH=/home/you/.local/bin:/usr/local/bin:/usr/bin:/bin
MAILTO=you@example.com

0 * * * *  you  cron-task

Having your own directory with custom scripts for your own purposes has a host of other benefits, but that's another article

Avoid /etc/crontab

If your implementation of cron(8) supports it, rather than having an /etc/crontab file a mile long, you can put tasks into separate files in /etc/cron.d :

$ ls /etc/cron.d
system-a
system-b
raid-maint

This approach allows you to group the configuration files meaningfully, so that you and other administrators can find the appropriate tasks more easily; it also allows you to make some files editable by some users and not others, and reduces the chance of edit conflicts. Using sudoedit(8) helps here too. Another advantage is that it works better with version control; if I start collecting more than a few of these task files or to update them more often than every few months, I start a Git repository to track them:

$ cd /etc/cron.d
$ sudo git init
$ sudo git add --all
$ sudo git commit -m "First commit"

If you're editing a crontab(5) file for tasks related only to the individual user, use the crontab(1) tool; you can edit your own crontab(5) by typing crontab -e , which will open your $EDITOR to edit a temporary file that will be installed on exit. This will save the files into a dedicated directory, which on my system is /var/spool/cron/crontabs .

On the systems maintained by the author, it's quite normal for /etc/crontab never to change from its packaged template.

Include a timeout

cron(8) will normally allow a task to run indefinitely, so if this is not desirable, you should consider either using options of the program you're calling to implement a timeout, or including one in the script. If there's no option for the command itself, the timeout(1) command wrapper in coreutils is one possible way of implementing this:

0 * * * *  you  timeout 10s cron-task

Greg's wiki has some further suggestions on ways to implement timeouts .
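It's worth confirming how the coreutils wrapper reports an expired limit: GNU timeout(1) kills the command and exits with status 124, which lets you distinguish a timeout from the task's own failure codes:

```shell
# The sleep would run for 10 seconds, but is killed after 1.
timeout 1 sleep 10
echo "exit status: $?"   # → exit status: 124
```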

Include file locking to prevent overruns

cron(8) will start a new instance of a task regardless of whether its previous runs have completed, so to avoid overlapping runs of a long-running task, on GNU/Linux you could use the flock(1) wrapper for the flock(2) system call to set an exclusive lockfile, in order to prevent the task from running more than one instance in parallel.

0 * * * *  you  flock -nx /var/lock/cron-task cron-task

Greg's wiki has some more in-depth discussion of the file locking problem for scripts in a general sense, including important information about the caveats of "rolling your own" when flock(1) is not available.
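The non-blocking behaviour of -n is easy to demonstrate from an interactive shell. In this sketch (the lockfile path is arbitrary), a background job holds the lock while a second attempt fails immediately instead of waiting:

```shell
lockfile=/tmp/cron-task-demo.lock

# Hold the lock for 2 seconds in a background job.
flock -nx "$lockfile" sleep 2 &
sleep 1   # give the background job time to acquire it

# A second non-blocking (-n) attempt now fails straight away,
# which is exactly what stops an overlapping cron run.
if flock -nx "$lockfile" true; then
    echo "lock was free"
else
    echo "lock already held"
fi
wait
```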

If it's important that your tasks run in a certain order, consider whether it's necessary to have them in separate tasks at all; it may be easier to guarantee they're run sequentially by collecting them in a single shell script.

Do something useful with exit statuses

If your cron(8) task or commands within its script exit non-zero, it can be useful to run commands that handle the failure appropriately, including cleanup of appropriate resources, and sending information to monitoring tools about the current status of the job. If you're using Nagios Core or one of its derivatives, you could consider using send_nsca to send passive checks reporting the status of jobs to your monitoring server. I've written a simple script called nscaw to do this for me:

0 * * * *  you  nscaw CRON_TASK -- cron-task
Consider alternatives to cron(8)

If your machine isn't always on and your task doesn't need to run at a specific time, but rather needs to run once daily or weekly, you can install anacron and drop scripts into the cron.hourly , cron.daily , cron.monthly , and cron.weekly directories in /etc , as appropriate. Note that on Debian and Ubuntu GNU/Linux systems, the default /etc/crontab contains hooks that run these, but they run only if anacron(8) is not installed.

If you're using cron(8) to poll a directory for changes and run a script if there are such changes, on GNU/Linux you could consider using a daemon based on inotifywait(1) instead.

Finally, if you require more advanced control over when and how your task runs than cron(8) can provide, you could perhaps consider writing a daemon to run on the server consistently and fork processes for its task. This would allow running a task more often than once a minute, as an example. Don't get too bogged down into thinking that cron(8) is your only option for any kind of asynchronous task management!
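As a minimal sketch of that idea, a script started once (for example from an init system) can simply loop with a sub-minute interval; the echo is a placeholder for real work:

```shell
# Sketch of a daemon-style loop for sub-minute scheduling; arguments
# are the number of iterations and the interval in seconds
poll_loop() {
    i=0
    while [ "$i" -lt "$1" ]; do
        echo "tick $i"        # placeholder for the real task
        i=$((i + 1))
        sleep "${2:-10}"
    done
}

# e.g. poll_loop 6 10 would run the task every ten seconds for a minute
```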

[Nov 01, 2017] Listing files

www.tecmint.com

Using ls is probably one of the first commands an administrator will learn for getting a simple list of the contents of the directory. Most administrators will also know about the -a and -l switches, to show all files including dot files and to show more detailed data about files in columns, respectively.

There are other switches to GNU ls which are less frequently used, some of which turn out to be very useful for programming:
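For illustration, a few of GNU ls's less-common sort and traversal switches:

```shell
ls -X    # sort by file extension
ls -S    # sort by file size, largest first
ls -t    # sort by modification time, newest first
ls -R    # recurse into subdirectories
```

These combine freely with -l and with each other, which is what makes them handy for building quick inventories of a tree.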

Since the listing is text like anything else, you could, for example, pipe the output of this command into a vim process, so you could add explanations of what each file is for and save it as an inventory file or add it to a README:

$ ls -XR | vim -

This kind of stuff can even be automated by make with a little work, which I'll cover in another article later in the series.

[Nov 01, 2017] Functions by Tom Ryder

Nov 01, 2017 | sanctum.geek.nz

A more flexible method for defining custom commands for an interactive shell (or within a script) is to use a shell function. We could declare our ll function in a Bash startup file as a function instead of an alias like so:

# Shortcut to call ls(1) with the -l flag
ll() {
    command ls -l "$@"
}

Note the use of the command builtin here to specify that the ll function should invoke the program named ls , and not any function named ls . This is particularly important when writing a function wrapper around a command, to stop an infinite loop where the function calls itself indefinitely:

# Always add -q to invocations of gdb(1)
gdb() {
    command gdb -q "$@"
}

In both examples, note also the use of the "$@" expansion, to add to the final command line any arguments given to the function. We wrap it in double quotes to stop spaces and other shell metacharacters in the arguments causing problems. This means that the ll command will work correctly if you were to pass it further options and/or one or more directories as arguments:

$ ll -a
$ ll ~/.config

Shell functions declared in this way are specified by POSIX for Bourne-style shells, so they should work in your shell of choice, including Bash, dash , Korn shell, and Zsh. They can also be used within scripts, allowing you to abstract away multiple instances of similar commands to improve the clarity of your script, in much the same way the basics of functions work in general-purpose programming languages.

Functions are a good and portable way to approach adding features to your interactive shell; written carefully, they even allow you to port features you might like from other shells into your shell of choice. I'm fond of taking commands I like from Korn shell or Zsh and implementing them in Bash or POSIX shell functions, such as Zsh's vared or its two-argument cd features.

If you end up writing a lot of shell functions, you should consider putting them into separate configuration subfiles to keep your shell's primary startup file from becoming unmanageably large.
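One common arrangement, as a sketch, is a directory of function files sourced in a loop from your primary startup file; the ~/.bash_functions.d path is an arbitrary convention here, not anything Bash itself looks for:

```shell
# In ~/.bashrc: load every function file from a directory of subfiles
# (~/.bash_functions.d is an arbitrary, hypothetical location)
for file in "$HOME"/.bash_functions.d/*.bash; do
    [ -e "$file" ] || continue   # skip if the glob matched nothing
    . "$file"
done
unset file
```

This keeps each function in its own small, named file, which also makes them easier to version-control individually.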

Examples from the author

You can take a look at some of the shell functions I have defined here that are useful to me in general shell usage; a lot of these amount to implementing convenience features that I wish my shell had, especially for quick directory navigation, or adding options to commands.

Other examples

Variables in shell functions

You can manipulate variables within shell functions, too:

# Print the filename of a path, stripping off its leading path and
# extension
fn() {
    name=$1
    name=${name##*/}
    name=${name%.*}
    printf '%s\n' "$name"
}

This works fine, but the catch is that after the function is done, the value for name will still be defined in the shell, and will overwrite whatever was in there previously:

$ printf '%s\n' "$name"
foobar
$ fn /home/you/Task_List.doc
Task_List
$ printf '%s\n' "$name"
Task_List

This may be desirable if you actually want the function to change some aspect of your current shell session, such as managing variables or changing the working directory. If you don't want that, you will probably want to find some means of avoiding name collisions in your variables.

If your function is only for use with a shell that provides the local (Bash) or typeset (Ksh) features, you can declare the variable as local to the function to remove its global scope, to prevent this happening:

# Bash-like
fn() {
    local name
    name=$1
    name=${name##*/}
    name=${name%.*}
    printf '%s\n' "$name"
}

# Ksh-like
# Note different syntax for first line
function fn {
    typeset name
    name=$1
    name=${name##*/}
    name=${name%.*}
    printf '%s\n' "$name"
}

If you're using a shell that lacks these features, or you want to aim for POSIX compatibility, things are a little trickier, since local function variables aren't specified by the standard. One option is to use a subshell , so that the variables are only defined for the duration of the function:

# POSIX; note we're using plain parentheses rather than curly brackets, for
# a subshell
fn() (
    name=$1
    name=${name##*/}
    name=${name%.*}
    printf '%s\n' "$name"
)

# POSIX; alternative approach using command substitution:
fn() {
    printf '%s\n' "$(
        name=$1
        name=${name##*/}
        name=${name%.*}
        printf %s "$name"
    )"
}

This subshell method also allows you to change directory with cd within a function without changing the working directory of the user's interactive shell, or to change shell options with set or Bash options with shopt only temporarily for the purposes of the function.
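For example, this POSIX-style function reports the newest entry in a directory; because the body is a subshell, its cd never affects the interactive shell that calls it:

```shell
# Subshell body: the cd here is confined to the function call
newest_in() (
    cd -- "$1" || exit
    ls -t | head -n 1
)
```

The same trick applies to set options you only want in force for the duration of one function.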

Another method to deal with variables is to manipulate the positional parameters directly ( $1 , $2 ) with set , since they are local to the function call too:

# POSIX; using positional parameters
fn() {
    set -- "${1##*/}"
    set -- "${1%.*}"
    printf '%s\n' "$1"
}

These methods work well, and can sometimes even be combined, but they're awkward to write, and harder to read than the modern shell versions. If you only need your functions to work with your modern shell, I recommend just using local or typeset . The Bash Guide on Greg's Wiki has a very thorough breakdown of functions in Bash, if you want to read about this and other aspects of functions in more detail.

Keeping functions for later

As you get comfortable with defining and using functions during an interactive session, you might define them in ad-hoc ways on the command line for calling in a loop or some other similar circumstance, just to solve a task in that moment.

As an example, I recently made an ad-hoc function called monit to run a set of commands for its hostname argument that together established different types of monitoring system checks, using an existing script called nmfs :

$ monit() { nmfs "$1" Ping Y ; nmfs "$1" HTTP Y ; nmfs "$1" SNMP Y ; }
$ for host in webhost{1..10} ; do
> monit "$host"
> done

After that task was done, I realized I was likely to use the monit command interactively again, so I decided to keep it. Shell functions only last as long as the current shell, so if you want to make them permanent, you need to store their definitions somewhere in your startup files. If you're using Bash, and you're content to just add things to the end of your ~/.bashrc file, you could just do something like this:

$ declare -f monit >> ~/.bashrc

That would append the existing definition of monit in parseable form to your ~/.bashrc file, and the monit function would then be loaded and available to you for future interactive sessions. Later on, I ended up converting monit into a shell script, as its use wasn't limited to just an interactive shell.

If you want a more robust approach to keeping functions like this for Bash permanently, I wrote a tool called Bashkeep , which allows you to quickly store functions and variables defined in your current shell into separate and appropriately-named files, including viewing and managing the list of names conveniently:

$ keep monit
$ keep
monit
$ ls ~/.bashkeep.d
monit.bash
$ keep -d monit

[Nov 01, 2017] 4 Ways to Watch or Monitor Log Files in Real Time by Matei Cezar

Oct 31, 2017 | www.tecmint.com
How can I see the content of a log file in real time on Linux? Well, there are a lot of utilities out there that can help a user output the contents of a file while the file is changing or continuously updating. One of the best-known and most heavily used utilities to display file content in real time on Linux is the tail command.

Read Also : 4 Good Open Source Log Monitoring and Management Tools for Linux

1. tail Command – Monitor Logs in Real Time

As mentioned, the tail command is the most common solution for displaying a log file in real time. However, the command comes in two variants, as illustrated in the examples below.

In the first example, the tail command needs the -f argument to follow the content of a file.

$ sudo tail -f /var/log/apache2/access.log
Monitor Apache Logs in Real Time

The second variant is actually a command of its own: tailf . With it you don't need the -f switch, because it behaves as though -f were always supplied. (Note that tailf is deprecated and has been removed from recent versions of util-linux, so prefer tail -f on newer systems.)

$ sudo tailf /var/log/apache2/access.log
Real Time Apache Logs Monitoring

Usually, log files are rotated frequently on a Linux server by the logrotate utility. To watch log files that get rotated on a daily basis, you can use the -F flag to the tail command.

Read Also : How to Manage System Logs (Configure, Rotate and Import Into Database) in Linux

tail -F will keep track of whether a new log file has been created, and will start following the new file instead of the old one.

$ sudo tail -F /var/log/apache2/access.log

However, by default, the tail command displays the last 10 lines of a file. For instance, if you want to watch only the last two lines of the log file in real time, use the -n flag combined with the -f flag, as shown in the example below.

$ sudo tail -n2 -f /var/log/apache2/access.log
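These tail invocations also compose with other filters; as a sketch, this function shows only error lines from the most recent entries of a log (swapping in tail -F would follow the same filter live):

```shell
# Show error lines among the last 100 entries of a log file;
# replace 'tail -n 100' with 'tail -F' for a live follow
recent_errors() {
    tail -n 100 -- "$1" | grep -i 'error'
}
```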
Watch Last Two Lines of Logs

2. Multitail Command – Monitor Multiple Log Files in Real Time

Another interesting command for displaying log files in real time is the multitail command . As the name implies, the multitail utility can monitor and keep track of multiple files in real time, and it also lets you navigate back and forth in the monitored files.

To install the multitail utility on Debian and RedHat based systems, issue the appropriate command below.

$ sudo apt install multitail   [On Debian & Ubuntu]
$ sudo yum install multitail   [On RedHat & CentOS]
$ sudo dnf install multitail   [On Fedora 22+ version]

To display the output of two log files simultaneously, execute the command as shown in the example below.

$ sudo multitail /var/log/apache2/access.log /var/log/apache2/error.log
Multitail Monitor Logs

3. lnav Command – Monitor Multiple Log Files in Real Time

Another interesting command, similar to the multitail command, is the lnav command . The lnav utility can also watch and follow multiple files and display their content in real time.

To install the lnav utility on Debian and RedHat based Linux distributions, issue the appropriate command below.

$ sudo apt install lnav   [On Debian & Ubuntu]
$ sudo yum install lnav   [On RedHat & CentOS]
$ sudo dnf install lnav   [On Fedora 22+ version]

Watch the content of two log files simultaneously by issuing the command as shown in the below example.

$ sudo lnav /var/log/apache2/access.log /var/log/apache2/error.log
lnav – Real Time Logs Monitoring

4. less Command – Display Real Time Output of Log Files

Finally, you can display the live output of a file with the less command if you press Shift+F within it.

As with the tail utility, pressing Shift+F in a file opened in less will start following the end of the file. Alternatively, you can start less with the +F flag to enter live watching of the file directly.

$ sudo less +F  /var/log/apache2/access.log
Watch Logs Using Less Command

That's it! You may also want to read the following articles on log monitoring and management.

[Nov 01, 2017] File metadata

sanctum.geek.nz

The file tool gives you a one-line summary of what kind of file you're looking at, based on its extension, headers and other cues. This is very handy used with find when examining a set of unfamiliar files:

$ find . -exec file {} +
.:            directory
./hanoi:      Perl script, ASCII text executable
./.hanoi.swp: Vim swap file, version 7.3
./factorial:  Perl script, ASCII text executable
./bits.c:     C source, ASCII text
./bits:       ELF 32-bit LSB executable, Intel 80386, version ...

[Oct 31, 2017] Bash job control by Tom Ryder

Jan 31, 2012 | sanctum.geek.nz

Oftentimes you may wish to start a process on the Bash shell without having to wait for it to actually complete, but still be notified when it does. Similarly, it may be helpful to temporarily stop a task while it's running without actually quitting it, so that you can do other things with the terminal. For these kinds of tasks, Bash's built-in job control is very useful.

Backgrounding processes

If you have a process that you expect to take a long time, such as a long cp or scp operation, you can start it in the background of your current shell by adding an ampersand to it as a suffix:

$ cp -r /mnt/bigdir /home &
[1] 2305

This will start the copy operation as a child process of your bash instance, but will return you to the prompt to enter any other commands you might want to run while that's going.

The output from this command shown above gives both the job number of 1, and the process ID of the new task, 2305. You can view the list of jobs for the current shell with the builtin jobs :

$ jobs
[1]+  Running  cp -r /mnt/bigdir /home &

If the job finishes or otherwise terminates while it's backgrounded, you should see a message in the terminal the next time you update it with a newline:

[1]+  Done  cp -r /mnt/bigdir /home &
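The same backgrounding works in scripts, where it lets you run several tasks in parallel and use the wait builtin to block until they have all finished; slow_task here is a hypothetical placeholder for real work:

```shell
# Run two placeholder tasks in parallel, then wait for both
slow_task() { sleep "$1"; echo "done after ${1}s"; }

slow_task 1 &
slow_task 1 &
wait    # returns once all background jobs have completed
```

Run sequentially this would take two seconds; backgrounded, it takes about one.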
Foregrounding processes

If you want to return a job in the background to the foreground, you can type fg :

$ fg
cp -r /mnt/bigdir /home &

If you have more than one job backgrounded, you should specify the particular job to bring to the foreground with a parameter to fg :

$ fg %1

In this case, for shorthand, you can optionally omit fg and it will work just the same:

$ %1
Suspending processes

To temporarily suspend a process, you can press Ctrl+Z:

$ cp -r /mnt/bigdir /home
^Z
[1]+  Stopped  cp -r /mnt/bigdir /home

You can then continue it in the foreground or background with fg %1 or bg %1 respectively, as above.

This is particularly useful while in a text editor; instead of quitting the editor to get back to a shell, or dropping into a subshell from it, you can suspend it temporarily and return to it with fg once you're ready.

Dealing with output

While a job is running in the background, it may still print its standard output and standard error streams to your terminal. You can head this off by redirecting both streams to /dev/null for verbose commands:

$ cp -rv /mnt/bigdir /home &>/dev/null &

However, if the output of the task is actually of interest to you, this may be a case where you should fire up another terminal emulator, perhaps in GNU Screen or tmux , rather than using simple job control.

Suspending SSH sessions

As a special case, you can suspend an SSH session using an SSH escape sequence . Type a newline followed by a ~ character, and finally press Ctrl+Z to background your SSH session and return to the terminal from which you invoked it.

tom@conan:~$ ssh crom
tom@crom:~$ ~^Z [suspend ssh]
[1]+  Stopped  ssh crom
tom@conan:~$

You can then resume it as you would any job by typing fg :

tom@conan:~$ fg %1
ssh crom
tom@crom:~$

[Oct 31, 2017] Elegant Awk usage by Tom Ryder

It's better to use Perl for this purpose...
Feb 06, 2012 | sanctum.geek.nz

For many system administrators, Awk is used only as a way to print specific columns of data from programs that generate columnar output, such as netstat or ps .

For example, to get a list of all the IP addresses and ports with open TCP connections on a machine, one might run the following:

# netstat -ant | awk '{print $5}'

This works pretty well, but among the data you actually wanted it also includes the fifth word of the opening explanatory note, and the heading of the fifth column:

and
Address
0.0.0.0:*
205.188.17.70:443
172.20.0.236:5222
72.14.203.125:5222

There are varying ways to deal with this.

Matching patterns

One common way is to pipe the output further through a call to grep , perhaps to only include results with at least one number:

# netstat -ant | awk '{print $5}' | grep '[0-9]'

In this case, it's instructive to use the awk call a bit more intelligently by setting a regular expression which the applicable line must match in order for that field to be printed, with the standard / characters as delimiters. This eliminates the need for the call to grep :

# netstat -ant | awk '/[0-9]/ {print $5}'

We can further refine this by ensuring that the regular expression should only match data in the fifth column of the output, using the ~ operator:

# netstat -ant | awk '$5 ~ /[0-9]/ {print $5}'
Skipping lines

Another approach you could take to strip the headers out might be to use sed to skip the first two lines of the output:

# netstat -ant | awk '{print $5}' | sed 1,2d

However, this can also be incorporated into the awk call, using the NR variable and making it part of a conditional checking the line number is greater than two:

# netstat -ant | awk 'NR>2 {print $5}'
Combining and excluding patterns

Another common idiom on systems that don't have the special pgrep command is to filter ps output for a string, but exclude the grep process itself from the output with grep -v grep :

# ps -ef | grep apache | grep -v grep | awk '{print $2}'

If you're using Awk to get columnar data from the output, in this case the second column containing the process ID, both calls to grep can instead be incorporated into the awk call:

# ps -ef | awk '/apache/ && !/awk/ {print $2}'

Again, this can be further refined if necessary to ensure you're only matching the expressions against the command name by specifying the field number for each comparison:

# ps -ef | awk '$8 ~ /apache/ && $8 !~ /awk/ {print $2}'

If you're used to using Awk purely as a column filter, the above might help to increase its utility for you and allow you to write shorter and more efficient command lines. The Awk Primer on Wikibooks is a really good reference for using Awk to its fullest for the sorts of tasks for which it's especially well-suited.
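Beyond filtering, Awk can also aggregate the columns it selects; as a small sketch, this helper sums the first field of whatever it reads, for example to total the RSS column from ps output:

```shell
# Sum the first whitespace-delimited field of standard input
sum_col1() {
    awk '{ total += $1 } END { print total + 0 }'
}

# e.g. approximate total resident memory (in KiB) of all processes:
# ps -e -o rss= | sum_col1
```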

[Oct 31, 2017] Nagios on Debian primer by Tom Ryder

Jan 26, 2012 | sanctum.geek.nz

Nagios is useful for monitoring pretty much any kind of network service, with a wide variety of community-made plugins to test pretty much anything you might need. However, its configuration and interface can be a little bit cryptic to initiates. Fortunately, Nagios is well-packaged in Debian and Ubuntu and provides a basic default configuration that is instructive to read and extend.

There's a reason that a lot of system administrators turn into monitoring fanatics when tools like Nagios are available. The rapid feedback of things going wrong and being fixed and the pleasant sea of green when all your services are up can get addictive for any halfway dedicated administrator.

In this article I'll walk you through installing a very simple monitoring setup on a Debian or Ubuntu server. We'll assume you have two computers in your home network, a workstation on 192.168.1.1 and a server on 192.168.1.2 , and that you maintain a web service of some sort on a remote server, for which I'll use www.example.com .

We'll install a Nagios instance on the server that monitors both local services and the remote webserver, and emails you if it detects any problems.

For those not running a Debian-based GNU/Linux distribution or perhaps BSD, much of the configuration here will still apply, but the initial setup will probably be peculiar to your ports or packaging system unless you're compiling from source.

Installing the packages

We'll work on a freshly installed Debian Stable box as the server, which at the time of writing is version 6.0.3 "Squeeze". If you don't have it working already, you should start by installing Apache HTTPD:

# apt-get install apache2

Visit the server at http://192.168.1.2/ and check that you get the "It works!" page, and that should be all you need. Note that by default this installation of Apache is not terribly secure, so you shouldn't allow access to it from outside your private network until you've locked it down a bit, which is outside the scope of this article.

Next we'll install the nagios3 package, which will include a default set of useful plugins, and a simple configuration. The list of packages it needs to support these is quite long so you may need to install a lot of dependencies, which apt-get will manage for you.

# apt-get install nagios3

The installation procedure will include requesting a password for the administration area; provide it with a suitable one. You may also get prompted to configure a workgroup for the samba-common package; don't worry, you aren't installing a samba service by doing this, it's just information for the smbclient program in case you want to monitor any SMB/CIFS services.

That should provide you with a basic self-monitoring Nagios setup. Visit http://192.168.1.2/nagios3/ in your browser to verify this; use the username nagiosadmin and the password you gave during the install process. If you see something like the below, you're in business; this is the Nagios web reporting and administration panel.

The Nagios administration area's front page

Default setup

To start with, click the Services link in the left menu. You should see something like the below, which is the monitoring for localhost and the service monitoring that the packager set up for you by default:

Default Nagios monitoring hosts and services

Note that on my system, monitoring for the already-existing HTTP and SSH daemons was automatically set up for me, along with the default checks for load average, user count, and process count. If any of these pass a threshold, they'll turn yellow for WARNING, and red for CRITICAL states.

This is already somewhat useful, though a server monitoring itself is a bit problematic because of course it won't be able to tell you if it goes completely down. So for the next step, we're going to set up monitoring for the remote host www.example.com , which means firing up your favourite text editor to edit a few configuration files.

Default configuration

Nagios configuration is at first blush a bit complex, because monitoring setups need to be quite finely-tuned in order to be useful long term, particularly if you're managing a large number of hosts. Take a look at the files in /etc/nagios3/conf.d .

# ls /etc/nagios3/conf.d
contacts_nagios2.cfg
extinfo_nagios2.cfg
generic-host_nagios2.cfg
generic-service_nagios2.cfg
hostgroups_nagios2.cfg
localhost_nagios2.cfg
services_nagios2.cfg
timeperiods_nagios2.cfg

You can actually arrange a Nagios configuration any way you like, including one big well-ordered file, but it makes some sense to break it up into sections if you can. In this case, the default setup breaks the configuration into the files listed above.

This isn't my favourite method of organising Nagios configuration, but it'll work fine for us. We'll start by defining a remote host, and add services to it.

Testing services

First of all, let's check we actually have connectivity from this server to the host we're monitoring, for both of the services we intend to check: ICMP ECHO (PING) and HTTP.

$ ping -n -c 1 www.example.com
PING www.example.com (192.0.43.10) 56(84) bytes of data.
64 bytes from 192.0.43.10: icmp_req=1 ttl=243 time=168 ms
--- www.example.com ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 168.700/168.700/168.700/0.000 ms

$ wget www.example.com -O -
--2012-01-26 21:12:00--  http://www.example.com/
Resolving www.example.com... 192.0.43.10, 2001:500:88:200::10
Connecting to www.example.com|192.0.43.10|:80... connected.
HTTP request sent, awaiting response... 302 Found
...

All looks well, so we'll go ahead and add the host and its services.

Defining the remote host

Write a new file in the /etc/nagios3/conf.d directory called www.example.com_nagios2.cfg , with the following contents:

define host {
    use        generic-host
    host_name  www.example.com
    address    www.example.com
}

The first stanza of localhost_nagios2.cfg looks very similar to this; indeed, it uses the same host template, generic-host . All we need to do is define what to call the host, and where to find it.

However, in order to get it monitoring appropriate services, we might need to add it to one of the already existing groups. Open up hostgroups_nagios2.cfg , and look for the stanza that includes hostgroup_name http-servers . Add www.example.com to the group's members, so that that stanza looks like this:

# A list of your web servers
define hostgroup {
    hostgroup_name  http-servers
    alias           HTTP servers
    members         localhost,www.example.com
}

With this done, you need to restart the Nagios process:

# service nagios3 restart

If that succeeds, you should notice a new host called "www.example.com" under your Hosts and Services sections, and it's being monitored for HTTP. At first, it'll be PENDING, but when the scheduled check runs, it should come back (hopefully!) as OK.
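If you also want the PING check we tested above, rather than relying only on hostgroup membership, a service stanza along these lines could go in the same file. This is a sketch: the check_ping arguments are illustrative warning and critical thresholds (round-trip time in milliseconds and packet loss), and generic-service is the template shipped with the Debian package:

```
define service {
    use                 generic-service
    host_name           www.example.com
    service_description PING
    check_command       check_ping!100.0,20%!500.0,60%
}
```

Restart the nagios3 service again after adding it, and the new PING service will appear as PENDING until its first scheduled check runs.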

[Oct 31, 2017] Bash process substitution by Tom Ryder

Notable quotes:
"... Thanks to Reddit user Rhomboid for pointing out an incorrect assertion about this syntax necessarily abstracting ..."
"... calls, which I've since removed. ..."
February 27, 2012 sanctum.geek.nz

For tools like diff that work with multiple files as parameters, it can be useful to work with not just files on the filesystem, but also potentially with the output of arbitrary commands. Say, for example, you wanted to compare the output of ps and ps -e with diff -u . An obvious way to do this is to write files to compare the output:

$ ps > ps.out
$ ps -e > pse.out
$ diff -u ps.out pse.out

This works just fine, but Bash provides a shortcut in the form of process substitution , allowing you to treat the standard output of commands as files. This is done with the <() and >() operators. In our case, we want to direct the standard output of two commands into place as files:

$ diff -u <(ps) <(ps -e)

This is functionally equivalent, except it's a little tidier because it doesn't leave files lying around. This is also very handy for elegantly comparing files across servers, using ssh :

$ diff -u .bashrc <(ssh remote cat .bashrc)

Conversely, you can also use the >() operator to direct from a filename context to the standard input of a command. This is handy for setting up in-place filters for things like logs. In the following example, I'm making a call to rsync , specifying that it should make a log of its actions in log.txt , but filter it through grep -vF .tmp first to remove anything matching the fixed string .tmp :

$ rsync -arv --log-file=>(grep -vF .tmp >log.txt) src/ host::dst/

Combined with tee this syntax is a way of simulating multiple filters for a stdout stream, transforming output from a command in as many ways as you see fit:

$ ps -ef | tee >(awk '$1=="tom"' >toms-procs.txt) \
               >(awk '$1=="root"' >roots-procs.txt) \
               >(awk '$1!="httpd"' >not-apache-procs.txt) \
               >(awk 'NR>1{print $1}' >pids-only.txt)

In general, the idea is that wherever on the command line you could specify a file to be read from or written to, you can instead use this syntax to make an implicit named pipe for the text stream.

Thanks to Reddit user Rhomboid for pointing out an incorrect assertion about this syntax necessarily abstracting mkfifo calls, which I've since removed.

[Oct 31, 2017] Temporary files by Tom Ryder

Mar 05, 2012 | sanctum.geek.nz

With judicious use of tricks like pipes, redirects, and process substitution in modern shells, it's very often possible to avoid using temporary files, doing everything inline and keeping them quite neat. However when manipulating a lot of data into various formats you do find yourself occasionally needing a temporary file, just to hold data temporarily.

A common way to deal with this is to create a temporary file in your home directory, with some arbitrary name, something like test or working :

$ ps -ef >~/test

If you want to save the information indefinitely for later use, this makes sense, although it would be better to give it a slightly more instructive name than just test .

If you really only needed the data temporarily, however, you're much better to use the temporary files directory. This is usually /tmp , but for good practice's sake it's better to check the value of TMPDIR first, and only use /tmp as a default:

$ ps -ef >"${TMPDIR:-/tmp}"/test

This is getting better, but there is still a significant problem: there's no built-in check that the test file doesn't already exist, perhaps being used by some other user or program, particularly another running instance of the same script.

To that end, we have the mktemp program, which creates an empty temporary file in the appropriate directory for you without overwriting anything, and prints the filename it created. This allows you to use the file inline in both shell scripts and one-liners, and is much safer than specifying hardcoded paths:

$ mktemp
/tmp/tmp.yezXn0evDf
$ procsfile=$(mktemp)
$ printf '%s\n' "$procsfile"
/tmp/tmp.9rBjzWYaSU
$ ps -ef >"$procsfile"

If you're going to create several such files for related purposes, you could also create a directory in which to put them using the -d option:

$ procsdir=$(mktemp -d)
$ printf '%s\n' "$procsdir"
/tmp/tmp.HMAhM2RBSO

On GNU/Linux systems, files of a sufficient age in TMPDIR are cleared on boot (controlled in /etc/default/rcS on Debian-derived systems, /etc/cron.daily/tmpwatch on Red Hat ones), making /tmp useful as a general scratchpad as well as for a kind of relatively reliable inter-process communication without cluttering up users' home directories.

In some cases, there may be additional advantages in using /tmp for its designed purpose as some administrators choose to mount it as a tmpfs filesystem, so it operates in RAM and works very quickly. It's also common practice to set the noexec flag on the mount to prevent malicious users from executing any code they manage to find or save in the directory.

[Oct 31, 2017] High-speed Bash by Tom Ryder

Notable quotes:
"... One of my favourite technical presentations I've read online has been Hal Pomeranz's Unix Command-Line Kung Fu , a catalogue of shortcuts and efficient methods of doing very clever things with the Bash shell. None of these are grand arcane secrets, but they're things that are often forgotten in the course of daily admin work, when you find yourself typing something you needn't, or pressing up repeatedly to find something you wrote for which you could simply search your command history. ..."
Jan 24, 2012 | sanctum.geek.nz

One of my favourite technical presentations I've read online has been Hal Pomeranz's Unix Command-Line Kung Fu , a catalogue of shortcuts and efficient methods of doing very clever things with the Bash shell. None of these are grand arcane secrets, but they're things that are often forgotten in the course of daily admin work, when you find yourself typing something you needn't, or pressing up repeatedly to find something you wrote for which you could simply search your command history.

I highly recommend reading the whole thing, as I think even the most experienced shell users will find there are useful tidbits in there that would make their lives easier and their time with the shell more productive, beyond simpler things like tab completion.

Here, I'll recap two of the things I thought were the most simple and useful items in the presentation for general shell usage, and see if I can add a little value to them with reference to the Bash manual.

History with Ctrl+R

For many shell users, finding a command in history means either pressing the up arrow key repeatedly, or perhaps piping a history call through grep . It turns out there's a much nicer way to do this, using Bash's built-in history searching functionality; if you press Ctrl+R and start typing a search pattern, the most recent command matching that pattern will automatically be inserted on your current line, at which point you can adapt it as you need, or simply press Enter to run it again. You can keep pressing Ctrl+R to move further back in your history to the next-most recent match. On my shell, if I search through my history for git , I can pull up what I typed for a previous commit:

(reverse-i-search)`git': git commit -am "Pulled up-to-date colors."

This functionality isn't actually exclusive to Bash; you can establish a history search function in quite a few tools that use GNU Readline, including the MySQL client command line.

You can search forward through history in the same way with Ctrl+S, but it's likely you'll have to fix up a couple of terminal annoyances first.
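
The usual annoyance is software flow control: many terminals reserve Ctrl+S for XOFF, freezing output until Ctrl+Q is pressed, so the keystroke never reaches Readline. Assuming you don't rely on flow control, a common fix, sketched here rather than taken from the presentation, is to disable it in your .bashrc:

```shell
# Disable XON/XOFF flow control when running on a real terminal, so that
# Ctrl+S reaches Readline's forward history search instead of suspending
# terminal output. Assumes you don't need software flow control.
[ -t 0 ] && stty -ixon
```

The `[ -t 0 ]` guard keeps stty from complaining when the snippet runs in a non-interactive context.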

Additionally, if like me you're a Vim user and you don't really like having to reach for the arrow keys, or if you're on a terminal where those keys are broken for whatever reason, you can browse back and forth within your command history with Ctrl+P (previous) and Ctrl+N (next). These are just a few of the Emacs-style shortcuts that GNU Readline provides; check here for a more complete list .

Repeating commands with !!

The last command you ran in Bash can be abbreviated on the next line with two exclamation marks:

$ echo "Testing."
Testing.
$ !!
Testing.

You can use this to simply repeat a command over and over again, although for that you really should be using watch . More interestingly, it turns out this is very handy for building complex pipes in stages. Suppose you were building a pipeline to digest some data generated from a program like netstat , perhaps to determine the top 10 IP addresses that are holding open the most connections to a server. You might be able to build a pipeline like this:

# netstat -ant
# !! | awk '{print $5}'
# !! | sort
# !! | uniq -c
# !! | sort -rn
# !! | sed 10q

Similarly, you can repeat the last argument from the previous command line using !$ , which is useful if you're doing a set of operations on one file, such as checking it out via RCS, editing it, and checking it back in:

$ co -l file.txt
$ vim !$
$ ci -u !$

Or if you happen to want to work on a set of arguments, you can repeat all of the arguments from the previous command using !* :

$ touch a.txt b.txt c.txt
$ rm !*

When you remember to use these three together, they can save you a lot of typing, and will really improve your accuracy, because you won't be at risk of mistyping any of the commands or arguments. Naturally, however, it pays to be careful what you're running through rm !

[Oct 31, 2017] Learning the content of /bin and /usr/bin by Tom Ryder

Mar 16, 2012 | sanctum.geek.nz

When you have some spare time, an instructive exercise that can help fill gaps in your Unix knowledge, and give you a better idea of the programs installed on your system and what they can do, is a simple whatis call run over all the executable files in your /bin and /usr/bin directories.

This will give you a one-line summary of the file's function if available from man pages.

tom@conan:/bin$ whatis *
bash (1) - GNU Bourne-Again SHell
bunzip2 (1) - a block-sorting file compressor, v1.0.4
busybox (1) - The Swiss Army Knife of Embedded Linux
bzcat (1) - decompresses files to stdout
...

tom@conan:/usr/bin$ whatis *
[ (1)                - check file types and compare values
2to3 (1)             - Python2 to Python3 converter
2to3-2.7 (1)         - Python2 to Python3 converter
411toppm (1)         - convert Sony Mavica .411 image to ppm
...

It also works on many of the files in other directories, such as /etc :

tom@conan:/etc$ whatis *
acpi (1)             - Shows battery status and other ACPI information
adduser.conf (5)     - configuration file for adduser(8) and addgroup(8)
adjtime (3)          - correct the time to synchronize the system clock
aliases (5)          - Postfix local alias database format
...

Because packages often install more than one binary and you're only in the habit of using one or two of them, this process can tell you about programs on your system that you may have missed, particularly standard tools that solve common problems. As an example, I first learned about watch this way, having many times before used a clunky solution of for loops with sleep calls to do the same thing.

[Oct 31, 2017] Testing exit values in Bash by Tom Ryder

Oct 28, 2013 | sanctum.geek.nz

In Bash scripting (and shell scripting in general), we often want to check the exit value of a command to decide an action to take after it completes, likely for the purpose of error handling. For example, to determine whether a particular regular expression regex was present somewhere in a file options , we might apply grep(1) with its POSIX -q option to suppress output and just use the exit value:

grep -q regex options

An approach sometimes taken is then to test the exit value with the $? parameter, using if to check if it's non-zero, which is not very elegant and a bit hard to read:

# Bad practice
grep -q regex options
if (($? > 0)); then
    printf '%s\n' 'myscript: Pattern not found!' >&2
    exit 1
fi

Because the if construct by design tests the exit value of commands , it's better to test the command directly , making the expansion of $? unnecessary:

# Better
if grep -q regex options; then
    # Do nothing
    :
else
    printf '%s\n' 'myscript: Pattern not found!' >&2
    exit 1
fi

We can also precede the command to be tested with ! to negate the test, preventing us from needing an else clause at all:

# Best
if ! grep -q regex options; then
    printf '%s\n' 'myscript: Pattern not found!' >&2
    exit 1
fi

An alternative syntax is to use && and || to perform if and else tests with grouped commands between braces, but these tend to be harder to read:

# Alternative
grep -q regex options || {
    printf '%s\n' 'myscript: Pattern not found!' >&2
    exit 1
}

With this syntax, the two commands in the block are only executed if the grep(1) call exits with a non-zero status. We can apply && instead to execute commands if it does exit with zero.
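
The && form can be sketched the same way; the file and pattern below are created just for illustration and aren't part of the original examples:

```shell
# Counterpart of the || example above: the grouped commands run only if
# grep(1) finds the pattern, i.e. exits with a zero status. The file and
# pattern here are throwaway examples.
optfile=$(mktemp)
printf '%s\n' 'alpha' 'beta' >"$optfile"

grep -q beta "$optfile" && {
    found=yes
    printf '%s\n' 'myscript: Pattern found.'
}

rm -f "$optfile"
```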

That syntax can be convenient for quickly short-circuiting failures in scripts, for example due to nonexistent commands, particularly if the command being tested already outputs its own error message. This therefore cuts the script off if the given command fails, likely due to ffmpeg(1) being unavailable on the system:

hash ffmpeg || exit 1

Note that the braces for a grouped command are not needed here, as there's only one command to be run in case of failure, the exit call.

Calls to cd are another good use case here, as running a script in the wrong directory if a call to cd fails could have really nasty effects:

cd wherever || exit 1

In general, you'll probably only want to test $? when you have specific non-zero error conditions to catch. For example, if we were using the --max-delete option for rsync(1) , we could check a call's return value to see whether rsync(1) hit the threshold for deleted file count and write a message to a logfile appropriately:

rsync --archive --delete --max-delete=5 source destination
if (($? == 25)); then
    printf '%s\n' 'Deletion limit was reached' >"$logfile"
fi

It may be tempting to use the errexit feature in the hopes of stopping a script as soon as it encounters any error, but there are some problems with its usage that make it a bit error-prone. It's generally more straightforward to simply write your own error handling using the methods above.

For a really thorough breakdown of dealing with conditionals in Bash, take a look at the relevant chapter of the Bash Guide .

[Oct 31, 2017] Shell config subfiles by Tom Ryder

Notable quotes:
"... Note that we unset the config variable after we're done, otherwise it'll be in the namespace of our shell where we don't need it. You may also wish to check for the existence of the ~/.bashrc.d directory, check there's at least one matching file inside it, or check that the file is readable before attempting to source it, depending on your preference. ..."
"... Thanks to commenter oylenshpeegul for correcting the syntax of the loops. ..."
Jan 30, 2015 | sanctum.geek.nz

Large shell startup scripts ( .bashrc , .profile ) over about fifty lines or so, with a lot of options, aliases, custom functions, and similar tweaks, can get cumbersome to manage over time. If you keep your dotfiles under version control, it's also not terribly helpful to see large sets of commits just editing the one file, when it could be more instructive if broken up into files by section.

Given that shell configuration is just shell code, we can apply the source builtin (or the . builtin for POSIX sh ) to load several files at the end of a .bashrc , for example:

source ~/.bashrc.options
source ~/.bashrc.aliases
source ~/.bashrc.functions

This is a better approach, but it still binds us into using those filenames; we still have to edit the ~/.bashrc file if we want to rename them, or remove them, or add new ones.

Fortunately, UNIX-like systems have a common convention for this, the .d directory suffix, in which sections of configuration can be stored to be read by a main configuration file dynamically. In our case, we can create a new directory ~/.bashrc.d :

$ ls ~/.bashrc.d
options.bash
aliases.bash
functions.bash

With a slightly more advanced snippet at the end of ~/.bashrc , we can then load every file with the suffix .bash in this directory:

# Load any supplementary scripts
for config in "$HOME"/.bashrc.d/*.bash ; do
    source "$config"
done
unset -v config

Note that we unset the config variable after we're done, otherwise it'll be in the namespace of our shell where we don't need it. You may also wish to check for the existence of the ~/.bashrc.d directory, check there's at least one matching file inside it, or check that the file is readable before attempting to source it, depending on your preference.
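
Those extra checks might be sketched as follows; the function wrapper and the directory argument are just for illustration, not part of the original snippet:

```shell
# A more defensive variant of the loop: skip entirely if the directory
# doesn't exist, and only source files that are actually readable. The
# function wrapper is just so any directory can be passed in.
load_bashrc_d() {
    local dir=$1 config
    [ -d "$dir" ] || return 0
    for config in "$dir"/*.bash; do
        [ -r "$config" ] && source "$config"
    done
    return 0
}

load_bashrc_d "$HOME"/.bashrc.d
```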

The same method can be applied with .profile to load all scripts with the suffix .sh in ~/.profile.d , if we want to write in POSIX sh , with some slightly different syntax:

# Load any supplementary scripts
for config in "$HOME"/.profile.d/*.sh ; do
    . "$config"
done
unset -v config

Another advantage of this method is that if you have your dotfiles under version control, you can arrange to add extra snippets on a per-machine basis unversioned, without having to update your .bashrc file.

Here's my implementation of the above method, for both .bashrc and .profile .

Thanks to commenter oylenshpeegul for correcting the syntax of the loops.

[Oct 31, 2017] Searching compressed files by Tom Ryder

Mar 14, 2012 | sanctum.geek.nz

If you need to search a set of log files in /var/log , some of which have been compressed with gzip as part of the logrotate procedure, it can be a pain to deflate them to check them for a specific string, particularly where you want to include the current log which isn't compressed:

$ gzip -d log.1.gz log.2.gz log.3.gz
$ grep pattern log log.1 log.2 log.3

It turns out to be a little more elegant to use the -c switch for gzip to deflate the files in-place and write the content of the files to standard output, concatenating any uncompressed files you may also want to search in with cat :

$ gzip -dc log.*.gz | cat - log | grep pattern

This and similar operations with compressed files are common enough problems that short scripts in /bin on GNU/Linux systems exist, providing analogues to existing tools that can work with files in both a compressed and uncompressed state. In this case, the zgrep tool is of the most use to us:

$ zgrep pattern log*

Note that this search will also include the uncompressed log file and search it normally. These tools are designed for possibly compressed files, which makes them particularly well-suited to searching and manipulating logs in mixed compression states. It's worth noting that most of these are actually reasonably simple shell scripts.

The complete list of tools, most of which do the same thing as their z-less equivalents, can be gleaned with a quick whatis call:

$ pwd
/bin
$ whatis z*
zcat (1)   - compress or expand files
zcmp (1)   - compare compressed files
zdiff (1)  - compare compressed files
zegrep (1) - search possibly compressed files for a regular expression
zfgrep (1) - search possibly compressed files for a regular expression
zforce (1) - force a '.gz' extension on all gzip files
zgrep (1)  - search possibly compressed files for a regular expression
zless (1)  - file perusal filter for crt viewing of compressed text
zmore (1)  - file perusal filter for crt viewing of compressed text
znew (1)   - recompress .Z files to .gz files

[Oct 31, 2017] In many companies it's questionable whether the process can ever be understood well unless you have significant in-company knowledge, which makes outsourcing a key risk

Notable quotes:
"... Personnel turnover in Indian firms is sky high. As soon as software engineers finish taking part in a project, they jot the reference on their CV, and rush to find another project, in a different area, to extend their skill set, beef up their CV and improve their chances of a higher salary in the IT market. ..."
"... The consequence is that Indian IT firms in charge of the outsourced projects/products just cannot rely upon the implicit knowledge within the heads of their employees. In a sense, they cannot afford to have "key personnel", experienced people who know important, undocumented aspects of a piece of software and can be queried to clear up things -- all employees must be interchangeable. Hence the strict reliance on well-documented processes. ..."
"... Outsourcing your core competencies or your competitive advantage -- that's the real beauty of outsourcing! What could go wrong? ..."

Oct 30, 2017 | www.nakedcapitalism.com ,

Jesper , , October 30, 2017 at 7:40 am

I've seen a couple of BPOs, Business Process Outsourcing deals.

The key for success of BPO in the short term is to define the process -- document every step of the process of having something done and then introduce control-functions to ensure that the process is being followed. Possibly also develop some tools in supporting the process.

If the process is understood and documented well -- so well that rare/expensive skill is no longer needed to follow the process -- then it is possible to look for the lowest possible cost employee to follow the process.

As far as I can tell the most common mistake in BPO deals is that the process being outsourced isn't understood well. The documentation tends to be extensive, but if the understanding is lacking then the process might be providing different results than wished for. Key Performance Indicators (KPIs) are introduced, and then the gaming of the KPIs begins.
Even if the initial process was understood and documented well, the next problem is that due to distance (provider to client) there might be difficulties in adapting the process to changing circumstances.

And yes, there are similarities in BPOs and automation. Understanding of the process is key, without understanding of the process then the end result is usually bad. The key to learning and understanding is often humility and humility is often (in my experience) lacking in executives, senior management and project managers involved in BPO deals and/or efficiency projects.

Automation in the Too Big bank Nordea:
https://www.bloomberg.com/news/articles/2017-10-26/nordea-to-cut-at-least-6-000-jobs-in-fight-to-stay-competitive
Time will tell if it is a success for Nordea and if other big banks will follow suit and cut 13% of their workforce.

vlade , , October 30, 2017 at 8:35 am

See, you put it right on "the process is not understood well". My point is, in many companies it's questionable whether the process can ever be understood well, unless you have significant in-company knowledge, which makes outsourcing a key risk, even in absence of anything else.

ejf , , October 30, 2017 at 9:54 am

Yup, you got it -- Business Process Outsourcing. I've seen the ill-understood processes ruined when, e.g., software development was transferred to India. I saw this starting in 2000 up til the present day. Yankee management LOVED the idea of cheap labor, but never got back the software it originally intended and designed.

It was the culture: Yankees are software cowboys -- able to change project as needed; Indians loved the process of development. The Indians sounded good but never got the job done.

visitor , , October 30, 2017 at 11:02 am

In the 1990s, I was quite impressed that the first company to reach a CMM level 5 was from India (a subsidiary from IBM, if I remember correctly) -- and thereafter seeing Indian software firms achieving ISO 9000/CMM compliance before large Western corporations.

Later, I worked in several projects that were partly outsourced/externalized to India (the usual suspects like HCL or Wipro), and I understood. Personnel turnover in Indian firms is sky high. As soon as software engineers finish taking part in a project, they jot the reference on their CV, and rush to find another project, in a different area, to extend their skill set, beef up their CV and improve their chances of a higher salary in the IT market.

Remaining in one domain area, with one set of technologies, is not considered a good thing for advancement in the Indian IT market, or when trying to get directly hired by a Western firm. They often have to support an extended family that paid for their computer science studies, so fast career moves are really important for them.

The consequence is that Indian IT firms in charge of the outsourced projects/products just cannot rely upon the implicit knowledge within the heads of their employees. In a sense, they cannot afford to have "key personnel", experienced people who know important, undocumented aspects of a piece of software and can be queried to clear up things -- all employees must be interchangeable. Hence the strict reliance on well-documented processes.

Jesper , , October 30, 2017 at 1:46 pm

all employees must be interchangeable

To expand on that I'd say that interchangeable employees have limited or no bargaining power leading to it being easier to keep salaries low. What is left for the interchangeable employee to do to increase earnings? Yep, change jobs leading to more focus on making employees interchangeable .
The game (war) between the company and its employees escalates. Power is everything and all CEOs know that you don't get paid what you're worth -- you're paid what you negotiate. Maintaining power is worth the cost of churn.

d , , October 30, 2017 at 3:12 pm

dont we have some of the all employees must be interchangeable in the US too?

Thuto , , October 30, 2017 at 8:18 am

Pity the "build or buy" decision calculus has been perverted beyond what the firm needs as inputs into its final market ready products, but is increasingly being used as a defensive move by big companies to kill off competition from smaller firms via knock off products or "acqui-hiring" of talent.

Thuto , , October 30, 2017 at 8:25 am

Acqui-hiring aka acquiring the smaller firm, pretending to integrate its product into the big company's product line, starving the product of resources to slowly kill it off, then pulling the plug citing "disappointing sales and take up in the market" to protect big company's market share

Thuto , , October 30, 2017 at 8:41 am

Then redeploying the acqui-hired "talent" (I.e. founders of the acquired firm) to work on the next generation of big company's products (except now they do so in a bureaucratic, red tape laden maze of "corporate innovation management" processes).

Dan , , October 30, 2017 at 10:13 am

Outsourcing your core competencies or your competitive advantage -- that's the real beauty of outsourcing! What could go wrong?

WobblyTelomeres , , October 30, 2017 at 11:25 am

I thought one would outsource the core competitive disadvantages. That is, a smaller firm would outsource (buy) when they could not competitively create a subassembly/subcomponent because the sourcing firm had successfully achieved superior economies of scale (EoS) . This is why multiple automobile manufacturers purchase their subcomponents (say, coils or sparkplugs or bearings) from a supplier instead of manufacturing them in-house as the supplier achieves superior EoS by supplying the entire industry.

Even commenter Larry's above example ("offload liability risk with our larger insurance policy") is an EoS advantage/disadvantage, no?

Problems occur when one side of the dance is dominated by one or two very large players (think WalMart or Takata) or political will (defined here as $) is involved.

OTOH, I'm prolly just extremely naive.

sgt_doom , , October 30, 2017 at 2:59 pm

Naïve? No, you sound unholy ignorant, chum!

When Corporate America started offshoring R&D, scientist jobs, engineering jobs, programming jobs, medical jobs, legal jobs, etc., etc., etc. beginning in the late 1970s, but exploding under Jack Welch at GE in 1984-1985 [and I was offered a position helping in the process -- so nobody dare contradict me] it simply exacerbates those offshored manufacturing jobs, for without them in the past, too many American inventors would never have come to fruition -- this of course requires some knowledge of the history of technology.

The one absolute in human nature and human commerce: the greater the inequality, the lower the innovation -- IN EVERYTHING, IN EVERY AREA!

In other words, the greatest innovation in America (and everywhere else throughout history) took place when this nation was at its lowest in inequality indices and closest to socialism: the 1950s to 1960s and early 1970s -- and almost everything has simply been incremental since then.

As Leonardo da Vinci once remarked:

" Realize that everything connects to everything else. "

WobblyTelomeres , , October 30, 2017 at 7:44 pm

In other words, the greatest innovation in America (and everywhere else throughout history) took place when this nation was at its lowest in inequality indices and closest to socialism: the 1950s to 1960s and early 1970s

I disagree with this statement and would ask you to provide specific references for such a sweeping claim.

and almost everything has simply been incremental since then.

And would argue, with diagrams on a chalkboard if necessary, that all human knowledge is incremental. At least, that which requires more than simple immediate sensory perception.

sgt_doom , , October 30, 2017 at 2:54 pm

Thanks, Dan! Most of the comments here today are simply beneath commenting on, therefore your most sarcastic and cogent comments sums it up!

I should be flabbergasted by them, but I have frankly given up!

Recommended Reading (to the clueless, not Dan!):

Sold Out by Michelle Malkin
Outsourcing America by Ron Hira
Take This Job and Ship It by Byron Dorgan

[Oct 31, 2017] 256 colour terminals by Tom Ryder

Notable quotes:
"... An earlier version of this post suggested changing the TERM definition in .bashrc , which is generally not a good idea, even if bounded with conditionals as my example was. You should always set the terminal string in the emulator itself if possible, if you do it at all. ..."
"... Similarly, to use 256 colours in GNU Screen, add the following to your .screenrc : ..."
February 23, 2012 | sanctum.geek.nz

Using 256 colours in terminals is well-supported in GNU/Linux distributions these days, and also in Windows terminal emulators like PuTTY. Using 256 colours is great for Vim colorschemes in particular, but also very useful for Tmux colouring or any other terminal application where a slightly wider colour space might be valuable. Be warned that once you get this going reliably, there's no going back if you spend a lot of time in the terminal.

Xterm

To set this up for xterm or emulators that use xterm as the default value for $TERM , such as xfce4-terminal or gnome-terminal , it generally suffices to check the options for your terminal emulator to ensure that it will allow 256 colors, and then use the TERM string xterm-256color for it.

An earlier version of this post suggested changing the TERM definition in .bashrc , which is generally not a good idea, even if bounded with conditionals as my example was. You should always set the terminal string in the emulator itself if possible, if you do it at all.

Be aware that older systems may not have terminfo definitions for this terminal, but you can always copy them in using a private .terminfo directory if need be.

Tmux

To use 256 colours in Tmux, you should set the default terminal in .tmux.conf to be screen-256color :

set -g default-terminal "screen-256color"

This will allow you to use color definitions like colour231 in your status lines and other configurations. Again, this particular terminfo definition may not be present on older systems, so you should copy it into ~/.terminfo/s/screen-256color on those systems if you want to use it everywhere.

GNU Screen

Similarly, to use 256 colours in GNU Screen, add the following to your .screenrc :

term screen-256color

Vim

With the applicable options from the above set, you should not need to change anything in Vim to be able to use 256-color colorschemes. If you're wanting to write or update your own 256-colour compatible scheme, it should either begin with set t_Co=256 , or more elegantly, check that the value of the corresponding option &t_Co is 256 before trying to use any of the extra colour set.

The Vim Tips Wiki contains a detailed reference of the colour codes for schemes in 256-color terminals.
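
To check what your current $TERM definition actually claims to support, you can query terminfo with tput. This is a quick sketch, not from the original post:

```shell
# Ask terminfo how many colours the current $TERM claims to support.
# tput prints -1 for terminals without colour support, and fails outright
# for an unknown or unset TERM, so default to 0 in that case.
colors=$(tput colors 2>/dev/null) || colors=0
if [ "$colors" -ge 256 ]; then
    printf '%s\n' "256-colour support available ($TERM)"
else
    printf '%s\n' "Only $colors colours; check your terminal's TERM string"
fi
```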

[Oct 31, 2017] Better Bash history by Tom Ryder

Feb 21, 2012 | sanctum.geek.nz

By default, the Bash shell keeps the history of your most recent session in the .bash_history file, and the commands you've issued in your current session are also available with a history call. These defaults are useful for keeping track of what you've been up to in the shell on any given machine, but with disks much larger and faster than they were when Bash was designed, a little tweaking in your .bashrc file can record history more permanently, consistently, and usefully.

Append history instead of rewriting it

You should start by setting the histappend option, which will mean that when you close a session, your history will be appended to the .bash_history file rather than overwriting what's in there.

shopt -s histappend

Allow a larger history file

The default maximum number of commands saved into the .bash_history file is a rather meager 500. If you want to keep history further back than a few weeks or so, you may as well bump this up by explicitly setting $HISTSIZE to a much larger number in your .bashrc . We can do the same thing with the $HISTFILESIZE variable.

HISTFILESIZE=1000000
HISTSIZE=1000000

The man page for Bash says that HISTFILESIZE can be unset to stop truncation entirely, but unfortunately this doesn't work in .bashrc files due to the order in which variables are set; it's therefore more straightforward to simply set it to a very large number.

If you're on a machine with resource constraints, it might be a good idea to occasionally archive old .bash_history files to speed up login and reduce memory footprint.

Don't store specific lines

You can prevent commands that start with a space from going into history by setting $HISTCONTROL to ignorespace . You can also ignore duplicate commands, for example repeated du calls to watch a file grow, by adding ignoredups . There's a shorthand to set both in ignoreboth .

HISTCONTROL=ignoreboth

You might also want to remove the use of certain commands from your history, whether for privacy or readability reasons. This can be done with the $HISTIGNORE variable. It's common to use this to exclude ls calls, job control builtins like bg and fg , and calls to history itself:

HISTIGNORE='ls:bg:fg:history'

Record timestamps

If you set $HISTTIMEFORMAT to something useful, Bash will record the timestamp of each command in its history. In this variable you can specify the format in which you want this timestamp displayed when viewed with history . I find the full date and time to be useful, because it can be sorted easily and works well with tools like cut and awk .

HISTTIMEFORMAT='%F %T '

Use one command per line

To make your .bash_history file a little easier to parse, you can force commands that you entered on more than one line to be adjusted to fit on only one with the cmdhist option:

shopt -s cmdhist

Store history immediately

By default, Bash only records a session to the .bash_history file on disk when the session terminates. This means that if you crash or your session terminates improperly, you lose the history up to that point. You can fix this by recording each line of history as you issue it, through the $PROMPT_COMMAND variable:

PROMPT_COMMAND='history -a'
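
Putting it all together, the history section of a .bashrc drawing on the above might look like this; the sizes and ignore list are just the example values used in this post, so adjust to taste:

```shell
# Consolidated history settings combining the options discussed above.
shopt -s histappend cmdhist         # append on exit; one line per command
HISTFILESIZE=1000000                # keep a large history file...
HISTSIZE=1000000                    # ...and a large in-memory history
HISTCONTROL=ignoreboth              # skip space-prefixed and duplicate commands
HISTIGNORE='ls:bg:fg:history'       # don't record these commands at all
HISTTIMEFORMAT='%F %T '             # timestamp each entry
PROMPT_COMMAND='history -a'         # write history as you go, not on exit
```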

[Oct 31, 2017] Bash history expansion by Tom Ryder

Notable quotes:
"... Thanks to commenter Mihai Maruseac for pointing out a bug in the examples. ..."
Aug 16, 2012 | sanctum.geek.nz

Setting the Bash option histexpand allows some convenient typing shortcuts using Bash history expansion . The option can be set with either of these:

$ set -H
$ set -o histexpand

It's likely that this option is already set for all interactive shells, as it's on by default. The manual, man bash , describes these features as follows:

-H  Enable ! style history substitution. This option is on
    by default when the shell is interactive.

You may have come across this before, perhaps to your annoyance, in the following error message that comes up whenever ! is used in a double-quoted string, or without being escaped with a backslash:

$ echo "Hi, this is Tom!"
bash: !": event not found

If you don't want the feature and thereby make ! into a normal character, it can be disabled with either of these:

$ set +H
$ set +o histexpand

History expansion is actually a very old feature of shells, having been available in csh before Bash usage became common.

This article is a good followup to Better Bash history , which among other things explains how to include dates and times in history output, as these examples do.

Basic history expansion

Perhaps the best known and most useful of these expansions is using !! to refer to the previous command. This allows repeating commands quickly, perhaps to monitor the progress of a long process, such as disk space being freed while deleting a large file:

$ rm big_file &
[1] 23608
$ du -sh .
3.9G    .
$ !!
du -sh .
3.3G    .

It can also be useful to specify the full filesystem path to programs that aren't in your $PATH :

$ hdparm
-bash: hdparm: command not found
$ /sbin/!!
/sbin/hdparm

In each case, note that the command itself is printed as expanded, and then run to print the output on the following line.

History by absolute index

However, !! is actually a specific example of a more general form of history expansion. For example, you can supply the history item number of a specific command to repeat it, after looking it up with history :

$ history | grep expand
 3951  2012-08-16 15:58:53  set -o histexpand
$ !3951
set -o histexpand

You needn't enter the !3951 on a line by itself; it can be included as any part of the command, for example to add a prefix like sudo :

$ sudo !3850

If you include the escape string \! as part of your Bash prompt , you can include the current command number in the prompt before the command, making repeating commands by index a lot easier as long as they're still visible on the screen.
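A minimal sketch of such a prompt (the layout is purely illustrative; only the \! escape matters here):

```shell
# Prompt showing the history number before user@host:dir.
# Everything except \! is an arbitrary example layout.
PS1='\! \u@\h:\w\$ '
```

A prompt might then read something like 3952 tom@sanctum:~$ , so a command still visible on screen can be repeated with !3952 (or sudo !3952 ).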

History by relative index

It's also possible to refer to commands relative to the current command. To substitute the second-to-last command, we can type !-2 . For example, to check whether truncating a file with ed worked correctly:

$ wc -l bigfile.txt
267 bigfile.txt
$ printf '%s\n' '11,$d' w | ed -s bigfile.txt
$ !-2
wc -l bigfile.txt
10 bigfile.txt

This works further back into history, with !-3 , !-4 , and so on.

Expanding for historical arguments

In each of the above cases, we're substituting for the whole command line. There are also ways to get specific tokens, or words , from the command if we want that. To get the first argument of a particular command in the history, use the !^ token:

$ touch a.txt b.txt c.txt
$ ls !^
ls a.txt
a.txt

To get the last argument, use !$ :

$ touch a.txt b.txt c.txt
$ ls !$
ls c.txt
c.txt

To get all arguments (but not the command itself), use !* :

$ touch a.txt b.txt c.txt
$ ls !*
ls a.txt b.txt c.txt
a.txt  b.txt  c.txt

This last one is particularly handy when performing several operations on a group of files; we could run du and wc over them to get their size and character count, and then perhaps decide to delete them based on the output:

$ du a.txt b.txt c.txt
4164    a.txt
5184    b.txt
8356    c.txt
$ wc !*
wc a.txt b.txt c.txt
16689    94038  4250112 a.txt
20749   117100  5294592 b.txt
33190   188557  8539136 c.txt
70628   399695 18083840 total
$ rm !*
rm a.txt b.txt c.txt

These work not just for the preceding command in history, but also absolute and relative command numbers:

$ history 3
 3989  2012-08-16 16:30:59  wc -l b.txt
 3990  2012-08-16 16:31:05  du -sh c.txt
 3991  2012-08-16 16:31:12  history 3
$ echo !3989^
echo -l
-l
$ echo !3990$
echo c.txt
c.txt
$ echo !-1*
echo c.txt
c.txt

More generally, you can use the syntax !n:w to refer to any specific argument in a history item by number. In this case, the first word, usually a command or builtin, is word 0 :

$ history | grep bash
 4073  2012-08-16 20:24:53  man bash
$ !4073:0
man
What manual page do you want?
$ !4073:1
bash

You can even select ranges of words by separating their indices with a hyphen:

$ history | grep apt-get
 3663  2012-08-15 17:01:30  sudo apt-get install gnome
$ !3663:0-1 purge !3663:3
sudo apt-get purge gnome

You can include ^ and $ as start and endpoints for these ranges, too. 3* is a shorthand for 3-$ , meaning "all arguments from the third to the last."
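These designators can even be exercised non-interactively by enabling history recording and expansion inside a script (in an interactive shell both options are normally already on); the words here are illustrative:

```shell
#!/usr/bin/env bash
# Enable history + expansion explicitly, since this is not interactive.
set -o history -o histexpand

echo alpha beta gamma delta
echo !!:1-2     # words 1-2 of the previous command: alpha beta
echo !-2:2*     # words 2 to last of the command two back: beta gamma delta
```

Note that history records the expanded form of each line, which is why the last line reaches back two entries with !-2 rather than using !! .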

Expanding history by string

You can also refer to a previous command in the history that starts with a specific string with the syntax !string :

$ !echo
echo c.txt
c.txt
$ !history
history 3
 4011  2012-08-16 16:38:28  rm a.txt b.txt c.txt
 4012  2012-08-16 16:42:48  echo c.txt
 4013  2012-08-16 16:42:51  history 3

If you want to match any part of the command line, not just the start, you can use !?string? :

$ !?bash?
man bash

Be careful when using these, if you use them at all. By default it will run the most recent command matching the string immediately , with no prompting, so it might be a problem if it doesn't match the command you expect.

Checking history expansions before running

If you're paranoid about this, Bash allows you to audit the command as expanded before you enter it, with the histverify option:

$ shopt -s histverify
$ !rm
$ rm a.txt b.txt c.txt

This option works for any history expansion, and may be a good choice for more cautious administrators. It's a good thing to add to one's .bashrc if so.

If you don't need this set all the time, but you do have reservations at some point about running a history command, you can arrange to print the command without running it by adding a :p suffix:

$ !rm:p
rm important-file

In this instance, the command was expanded, but thankfully not actually run.

Substituting strings in history expansions

To get really in-depth, you can also perform substitutions on arbitrary commands from the history with !!:gs/pattern/replacement/ . This is getting pretty baroque even for Bash, but it's possible you may find it useful at some point:

$ !!:gs/txt/mp3/
rm a.mp3 b.mp3 c.mp3

If you only want to replace the first occurrence, you can omit the g :

$ !!:s/txt/mp3/
rm a.mp3 b.txt c.txt
Stripping leading directories or trailing files

If you want to chop a filename off a long argument to work with the directory, you can do this by adding an :h suffix, kind of like a dirname call in Perl:

$ du -sh /home/tom/work/doc.txt
$ cd !$:h
cd /home/tom/work

To do the opposite, like a basename call in Perl, use :t :

$ ls /home/tom/work/doc.txt
$ document=!$:t
document=doc.txt
Stripping extensions or base names

A bit more esoteric, but still possibly useful; to strip a file's extension, use :r :

$ vi /home/tom/work/doc.txt
$ stripext=!$:r
stripext=/home/tom/work/doc

To do the opposite, to get only the extension, use :e :

$ vi /home/tom/work/doc.txt
$ extonly=!$:e
extonly=.txt
Quoting history

If you're performing substitution not to execute a command or fragment but to use it as a string, it's likely you'll want to quote it. For example, if you've just found through experiment and trial and error an ideal ffmpeg command line to accomplish some task, you might want to save it for later use by writing it to a script:

$ ffmpeg -f alsa -ac 2 -i hw:0,0 -f x11grab -r 30 -s 1600x900 \
> -i :0.0+1600,0 -acodec pcm_s16le -vcodec libx264 -preset ultrafast \
> -crf 0 -threads 0 "$(date +%Y%m%d%H%M%S)".mkv

To make sure all the escaping is done correctly, you can write the command into the file with the :q modifier:

$ echo '#!/usr/bin/env bash' >ffmpeg.sh
$ echo !ffmpeg:q >>ffmpeg.sh

In this case, this will prevent Bash from executing the command expansion "$(date ... )" , instead writing it literally to the file as desired. If you build a lot of complex commands interactively that you later write to scripts once completed, this feature is really helpful and saves a lot of cutting and pasting.

Thanks to commenter Mihai Maruseac for pointing out a bug in the examples.

[Oct 31, 2017] Prompt directory shortening by Tom Ryder

Notable quotes:
"... If you're using Bash version 4.0 or above ( bash --version ), you can save a bit of terminal space by setting the PROMPT_DIRTRIM variable for the shell. This limits the length of the tail end of the \w and \W expansions to that number of path elements: ..."
Nov 07, 2014 | sanctum.geek.nz

The common default of some variant of \h:\w\$ for a Bash prompt PS1 string includes the \w escape character, so that the user's current working directory appears in the prompt, but with $HOME shortened to a tilde:

tom@sanctum:~$
tom@sanctum:~/Documents$
tom@sanctum:/usr/local/nagios$

This is normally very helpful, particularly if you leave your shell for a time and forget where you are, though of course you can always call the pwd shell builtin. However it can get annoying for very deep directory hierarchies, particularly if you're using a smaller terminal window:

tom@sanctum:/chroot/apache/usr/local/perl/app-library/lib/App/Library/Class:~$

If you're using Bash version 4.0 or above ( bash --version ), you can save a bit of terminal space by setting the PROMPT_DIRTRIM variable for the shell. This limits the length of the tail end of the \w and \W expansions to that number of path elements:

tom@sanctum:/chroot/apache/usr/local/app-library/lib/App/Library/Class$ PROMPT_DIRTRIM=3
tom@sanctum:.../App/Library/Class$

This is a good thing to include in your ~/.bashrc file if you often find yourself deep in directory trees where the upper end of the hierarchy isn't of immediate interest to you. You can remove the effect again by unsetting the variable:

tom@sanctum:.../App/Library/Class$ unset PROMPT_DIRTRIM
tom@sanctum:/chroot/apache/usr/local/app-library/lib/App/Library/Class$

[Oct 27, 2017] Neat trick of using su command for killing all processes for a particular user

Oct 27, 2017 | unix.stackexchange.com

If you pass -1 as the process ID argument to either the kill shell command or the kill C function , then the signal is sent to all the processes it can reach, which in practice means all the processes of the user running the kill command or syscall.

su -c 'kill -TERM -1' bob

In C (error checking omitted):

if (fork() == 0) {
    setuid(uid);
    signal(SIGTERM, SIG_DFL);
    kill(-1, SIGTERM);
}

[Oct 27, 2017] c - How do I kill all a user's processes using their UID - Unix Linux Stack Exchange

Oct 27, 2017 | unix.stackexchange.com

osgx ,Aug 4, 2011 at 10:07

Use pkill -U UID or pkill -u UID or username instead of UID. Sometimes skill -u USERNAME may work, another tool is killall -u USERNAME .

Skill was a linux-specific and is now outdated, and pkill is more portable (Linux, Solaris, BSD).

pkill allow both numberic and symbolic UIDs, effective and real http://man7.org/linux/man-pages/man1/pkill.1.html

pkill - ... signal processes based on name and other attributes

    -u, --euid euid,...
         Only match processes whose effective user ID is listed.
         Either the numerical or symbolical value may be used.
    -U, --uid uid,...
         Only match processes whose real user ID is listed.  Either the
         numerical or symbolical value may be used.

Man page of skill says is it allowed only to use username, not user id: http://man7.org/linux/man-pages/man1/skill.1.html

skill, snice ... These tools are obsolete and unportable. The command syntax is poorly defined. Consider using the killall, pkill

  -u, --user user
         The next expression is a username.

killall is not marked as outdated in Linux, but it also will not work with numberic UID; only username: http://man7.org/linux/man-pages/man1/killall.1.html

killall - kill processes by name

   -u, --user
         Kill only processes the specified user owns.  Command names
         are optional.

I think, any utility used to find process in Linux/Solaris style /proc (procfs) will use full list of processes (doing some readdir of /proc ). I think, they will iterate over /proc digital subfolders and check every found process for match.

To get list of users, use getpwent (it will get one user per call).

skill (procps & procps-ng) and killall (psmisc) tools both uses getpwnam library call to parse argument of -u option, and only username will be parsed. pkill (procps & procps-ng) uses both atol and getpwnam to parse -u / -U argument and allow both numeric and textual user specifier.
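As a sketch of a safer workflow (the sleep process and the current user are purely illustrative): preview the matches with pgrep , which accepts the same selection options, before firing pkill :

```shell
# Start a throwaway process owned by the current user:
sleep 300 &

# Preview: list this user's processes named exactly "sleep"...
pgrep -u "$(id -un)" -x sleep

# ...then send them SIGTERM:
pkill -u "$(id -un)" -x sleep
```

The -x flag restricts matching to the exact process name, which avoids accidentally signalling processes whose names merely contain the string.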

; ,Aug 4, 2011 at 10:11

pkill is not obsolete. It may be unportable outside Linux, but the question was about Linux specifically. – Lars Wirzenius Aug 4 '11 at 10:11

Petesh ,Aug 4, 2011 at 10:58

to get the list of users use the one liner: getent passwd | awk -F: '{print $1}' – Petesh Aug 4 '11 at 10:58

; ,Aug 4, 2011 at 12:07

what about I give a command like: "kill -ju UID" from C system() call? – user489152 Aug 4 '11 at 12:07

osgx ,Aug 4, 2011 at 15:01

is it an embedded linux? you have no skill, pkill and killall? Even busybox embedded shell has pkill and killall. – osgx Aug 4 '11 at 15:01

michalzuber ,Apr 23, 2015 at 7:47

killall -u USERNAME worked like charm – michalzuber Apr 23 '15 at 7:47

[Oct 25, 2017] How to modify scripts behavior on signals using bash traps - LinuxConfig.org

Oct 25, 2017 | linuxconfig.org

Trap syntax is very simple and easy to understand: first we must call the trap builtin, followed by the action(s) to be executed, then we must specify the signal(s) we want to react to:

trap [-lp] [[arg] sigspec]
Let's see what the possible trap options are for.

When used with the -l flag, the trap command will just display a list of signals associated with their numbers. It's the same output you can obtain running the kill -l command:

$ trap -l
1) SIGHUP        2) SIGINT       3) SIGQUIT      4) SIGILL       5) SIGTRAP
6) SIGABRT       7) SIGBUS       8) SIGFPE       9) SIGKILL     10) SIGUSR1
11) SIGSEGV     12) SIGUSR2     13) SIGPIPE     14) SIGALRM     15) SIGTERM
16) SIGSTKFLT   17) SIGCHLD     18) SIGCONT     19) SIGSTOP     20) SIGTSTP
21) SIGTTIN     22) SIGTTOU     23) SIGURG      24) SIGXCPU     25) SIGXFSZ
26) SIGVTALRM   27) SIGPROF     28) SIGWINCH    29) SIGIO       30) SIGPWR
31) SIGSYS      34) SIGRTMIN    35) SIGRTMIN+1  36) SIGRTMIN+2  37) SIGRTMIN+3
38) SIGRTMIN+4  39) SIGRTMIN+5  40) SIGRTMIN+6  41) SIGRTMIN+7  42) SIGRTMIN+8
43) SIGRTMIN+9  44) SIGRTMIN+10 45) SIGRTMIN+11 46) SIGRTMIN+12 47) SIGRTMIN+13
48) SIGRTMIN+14 49) SIGRTMIN+15 50) SIGRTMAX-14 51) SIGRTMAX-13 52) SIGRTMAX-12
53) SIGRTMAX-11 54) SIGRTMAX-10 55) SIGRTMAX-9  56) SIGRTMAX-8  57) SIGRTMAX-7
58) SIGRTMAX-6  59) SIGRTMAX-5  60) SIGRTMAX-4  61) SIGRTMAX-3  62) SIGRTMAX-2
63) SIGRTMAX-1  64) SIGRTMAX
It's really important to note that it's possible to react only to signals which allow the script to respond: the SIGKILL and SIGSTOP signals cannot be caught, blocked or ignored.

Apart from signals, traps can also react to some pseudo-signal such as EXIT, ERR or DEBUG, but we will see them in detail later. For now just remember that a signal can be specified either by its number or by its name, even without the SIG prefix.

About the -p option now. This option makes sense only when a command is not provided (otherwise it will produce an error). When trap is used with it, a list of the previously set traps will be displayed. If the signal name or number is specified, only the trap set for that specific signal will be displayed; otherwise no distinction will be made, and all the traps will be displayed:

$ trap 'echo "SIGINT caught!"' SIGINT
We set a trap to catch the SIGINT signal: it will just display the "SIGINT caught!" message onscreen when the given signal is received by the shell. If we now use trap with the -p option, it will display the trap we just defined:
$ trap -p
trap -- 'echo "SIGINT caught!"' SIGINT
By the way, the trap is now "active", so if we send a SIGINT signal, either using the kill command, or with the CTRL-c shortcut, the associated command in the trap will be executed (^C is just printed because of the key combination):
^CSIGINT caught!
Trap in action

We will now write a simple script to show trap in action; here it is:
#!/usr/bin/env bash
#
# A simple script to demonstrate how trap works
#
set -e
set -u
set -o pipefail

trap 'echo "signal caught, cleaning..."; rm -i linux_tarball.tar.xz' SIGINT SIGTERM

echo "Downloading tarball..."
wget -O linux_tarball.tar.xz https://cdn.kernel.org/pub/linux/kernel/v4.x/linux-4.13.5.tar.xz &> /dev/null
The above script just tries to download the latest Linux kernel tarball, using wget , into the directory from which it is launched. During the task, if the SIGINT or SIGTERM signals are received (notice how you can specify more than one signal on the same line), the partially downloaded file will be deleted.

In this case the commands are actually two: the first is echo , which prints the message onscreen, and the second is the actual rm command (we provided the -i option to it, so it will ask for user confirmation before removing); they are separated by a semicolon. Instead of specifying commands this way, you can also call functions: this gives you more re-usability. Notice that if you don't provide any command, the signal(s) will just be ignored!

This is the output of the script above when it receives a SIGINT signal:

$ ./fetchlinux.sh
Downloading tarball...
^Csignal caught, cleaning...
rm: remove regular file 'linux_tarball.tar.xz'?
A very important thing to remember is that when a script is terminated by a signal, like above, its exit status will be 128 plus the signal number . As you can see, the script above, being terminated by a SIGINT (signal 2), has an exit status of 130 :
$ echo $?
130
Lastly, you can disable a trap just by calling trap followed by the - sign, followed by the signal(s) name or number:
trap - SIGINT SIGTERM
The signals will take back the value they had upon entrance to the shell.

Pseudo-signals

As already mentioned above, trap can be set not only for signals which allow the script to respond, but also for what we can call "pseudo-signals". They are not technically signals, but correspond to certain situations that can be specified:

  • EXIT -- the command of the trap will be executed on exit from the shell.
  • ERR -- the argument of the trap will be executed when a command returns a non-zero exit status, with some exceptions (the same as the shell errexit option): the command must not be part of a while or until loop; it must not be part of an if construct, nor part of a && or || list, and its value must not be inverted by using the ! operator.
  • DEBUG -- the argument of the trap will be executed before every simple command, for , case or select command, and before the first command in shell functions.
  • RETURN -- the argument of the trap is executed after a function, or a script sourced by using source or the . command, finishes executing.
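For example, a common use of the EXIT pseudo-signal is guaranteed cleanup of temporary files. This is a minimal sketch (the file name comes from mktemp ; the "work" is illustrative): the cleanup runs whether the script finishes normally or is terminated by a catchable signal:

```shell
#!/usr/bin/env bash
tmpfile=$(mktemp)
trap 'rm -f "$tmpfile"' EXIT

echo "working data" > "$tmpfile"
# ... real work with "$tmpfile" here ...
# No explicit rm needed: the EXIT trap removes the file on the way out.
```
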

[Oct 20, 2017] Simple logical operators in Bash - Stack Overflow

Notable quotes:
"... Backquotes ( ` ` ) are old-style form of command substitution, with some differences: in this form, backslash retains its literal meaning except when followed by $ , ` , or \ , and the first backquote not preceded by a backslash terminates the command substitution; whereas in the $( ) form, all characters between the parentheses make up the command, none are treated specially. ..."
"... Double square brackets delimit a Conditional Expression. And, I find the following to be a good reading on the subject: "(IBM) Demystify test, [, [[, ((, and if-then-else" ..."
Oct 20, 2017 | stackoverflow.com

Amit , Jun 7, 2011 at 19:18

I have a couple of variables and I want to check the following condition (written out in words, then my failed attempt at bash scripting):
if varA EQUALS 1 AND ( varB EQUALS "t1" OR varB EQUALS "t2" ) then 

do something

done.

And in my failed attempt, I came up with:

if (($varA == 1)) && ( (($varB == "t1")) || (($varC == "t2")) ); 
  then
    scale=0.05
  fi

Best answer Gilles

What you've written actually almost works (it would work if all the variables were numbers), but it's not an idiomatic way at all.

This is the idiomatic way to write your test in bash:

if [[ $varA = 1 && ($varB = "t1" || $varC = "t2") ]]; then

If you need portability to other shells, this would be the way (note the additional quoting and the separate sets of brackets around each individual test):

if [ "$varA" = 1 ] && { [ "$varB" = "t1" ] || [ "$varC" = "t2" ]; }; then

Will Sheppard , Jun 19, 2014 at 11:07

It's better to use == to differentiate the comparison from assigning a variable (which is also = ) – Will Sheppard Jun 19 '14 at 11:07

Cbhihe , Apr 3, 2016 at 8:05

+1 @WillSheppard for yr reminder of proper style. Gilles, don't you need a semicolon after yr closing curly bracket and before "then" ? I always thought if , then , else and fi could not be on the same line... As in:

if [ "$varA" = 1 ] && { [ "$varB" = "t1" ] || [ "$varC" = "t2" ]; }; then

Cbhihe Apr 3 '16 at 8:05

Rockallite , Jan 19 at 2:41

Backquotes ( ` ` ) are old-style form of command substitution, with some differences: in this form, backslash retains its literal meaning except when followed by $ , ` , or \ , and the first backquote not preceded by a backslash terminates the command substitution; whereas in the $( ) form, all characters between the parentheses make up the command, none are treated specially.

Rockallite Jan 19 at 2:41

Peter A. Schneider , Aug 28 at 13:16

You could emphasize that single brackets have completely different semantics inside and outside of double brackets. (Because you start with explicitly pointing out the subshell semantics but then only as an aside mention the grouping semantics as part of conditional expressions. Was confusing to me for a second when I looked at your idiomatic example.) – Peter A. Schneider Aug 28 at 13:16

matchew , Jun 7, 2011 at 19:29

very close
if (( $varA == 1 )) && [[ $varB == 't1' || $varC == 't2' ]]; 
  then 
    scale=0.05
  fi

should work.

breaking it down

(( $varA == 1 ))

is an integer comparison where as

$varB == 't1'

is a string comparison. otherwise, I am just grouping the comparisons correctly.

Double square brackets delimit a Conditional Expression. And, I find the following to be a good reading on the subject: "(IBM) Demystify test, [, [[, ((, and if-then-else"

Peter A. Schneider , Aug 28 at 13:21

Just to be sure: The quoting in 't1' is unnecessary, right? Because as opposed to arithmetic instructions in double parentheses, where t1 would be a variable, t1 in a conditional expression in double brackets is just a literal string.

I.e., [[ $varB == 't1' ]] is exactly the same as [[ $varB == t1 ]] , right? – Peter A. Schneider Aug 28 at 13:21
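Gilles' idiomatic form, wrapped in a runnable sketch with invented values:

```shell
#!/usr/bin/env bash
# Invented values, just to exercise the condition:
varA=1
varB="t1"
varC="x"

if [[ $varA = 1 && ($varB = "t1" || $varC = "t2") ]]; then
    scale=0.05
fi
echo "scale=${scale:-unset}"   # prints scale=0.05
```

Changing varB to anything other than t1 (with varC still not t2 ) leaves scale unset.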

[Oct 20, 2017] shell script - OR in `expr match`

Notable quotes:
"... ...and if you weren't targeting a known/fixed operating system, using case rather than a regex match is very much the better practice, since the accepted answer depends on behavior POSIX doesn't define. ..."
"... Regular expression syntax, including the use of backquoting, is different for different tools. Always look it up. ..."
Oct 20, 2017 | unix.stackexchange.com

OR in `expr match` up vote down vote favorite

stracktracer , Dec 14, 2015 at 13:54

I'm confused as to why this does not match:

expr match Unauthenticated123 '^(Unauthenticated|Authenticated).*'

it outputs 0.

Charles Duffy , Dec 14, 2015 at 18:22

As an aside, if you were using bash for this, the preferred alternative would be the =~ operator in [[ ]] , ie. [[ Unauthenticated123 =~ ^(Unauthenticated|Authenticated) ]]Charles Duffy Dec 14 '15 at 18:22

Charles Duffy , Dec 14, 2015 at 18:25

...and if you weren't targeting a known/fixed operating system, using case rather than a regex match is very much the better practice, since the accepted answer depends on behavior POSIX doesn't define. Charles Duffy Dec 14 '15 at 18:25

Gilles , Dec 14, 2015 at 23:43

See Why does my regular expression work in X but not in Y?Gilles Dec 14 '15 at 23:43

Lambert , Dec 14, 2015 at 14:04

Your command should be:
expr match Unauthenticated123 'Unauthenticated\|Authenticated'

If you want the number of characters matched.

To have the part of the string (Unauthenticated) returned use:

expr match Unauthenticated123 '\(Unauthenticated\|Authenticated\)'

From info coreutils 'expr invocation' :

`STRING : REGEX'
     Perform pattern matching.  The arguments are converted to strings
     and the second is considered to be a (basic, a la GNU `grep')
     regular expression, with a `^' implicitly prepended.  The first
     argument is then matched against this regular expression.

 If the match succeeds and REGEX uses `\(' and `\)', the `:'
 expression returns the part of STRING that matched the
 subexpression; otherwise, it returns the number of characters
 matched.

 If the match fails, the `:' operator returns the null string if
 `\(' and `\)' are used in REGEX, otherwise 0.

 Only the first `\( ... \)' pair is relevant to the return value;
 additional pairs are meaningful only for grouping the regular
 expression operators.

 In the regular expression, `\+', `\?', and `\|' are operators
 which respectively match one or more, zero or one, or separate
 alternatives.  SunOS and other `expr''s treat these as regular
 characters.  (POSIX allows either behavior.)  *Note Regular
 Expression Library: (regex)Top, for details of regular expression
 syntax.  Some examples are in *note Examples of expr::.

stracktracer , Dec 14, 2015 at 14:18

Thanks escaping the | worked. Weird, normally I'd expect it if I wanted to match the literal |... – stracktracer Dec 14 '15 at 14:18

reinierpost , Dec 14, 2015 at 15:34

Regular expression syntax, including the use of backquoting, is different for different tools. Always look it up.reinierpost Dec 14 '15 at 15:34

Stéphane Chazelas , Dec 14, 2015 at 14:49

Note that both match and \| are GNU extensions (and the behaviour for : (the match standard equivalent) when the pattern starts with ^ varies with implementations). Standardly, you'd do:
expr " $string" : " Authenticated" '|' " $string" : " Unauthenticated"

The leading space is to avoid problems with values of $string that start with - or are expr operators, but that means it adds one to the number of characters being matched.

With GNU expr , you'd write it:

expr + "$string" : 'Authenticated\|Unauthenticated'

The + forces $string to be taken as a string even if it happens to be a expr operator. expr regular expressions are basic regular expressions which don't have an alternation operator (and where | is not special). The GNU implementation has it as \| though as an extension.

If all you want is to check whether $string starts with Authenticated or Unauthenticated , you'd better use:

case $string in
  (Authenticated* | Unauthenticated*) do-something
esac

netmonk , Dec 14, 2015 at 14:06

$ expr match "Unauthenticated123" '^\(Unauthenticated\|Authenticated\).*' you have to escape with \ the parenthesis and the pipe.

mikeserv , Dec 14, 2015 at 14:18

and the ^ may not mean what some would think depending on the expr . it is implied anyway. – mikeserv Dec 14 '15 at 14:18

Stéphane Chazelas , Dec 14, 2015 at 14:34

@mikeserv, match and \| are GNU extensions anyway. This Q&A seems to be about GNU expr anyway (where ^ is guaranteed to mean match at the beginning of the string ). – Stéphane Chazelas Dec 14 '15 at 14:34

mikeserv , Dec 14, 2015 at 14:49

@StéphaneChazelas - i didn't know they were strictly GNU. i think i remember them being explicitly officially unspecified - but i don't use expr too often anyway and didn't know that. thank you. – mikeserv Dec 14 '15 at 14:49

Random832 , Dec 14, 2015 at 16:13

It's not "strictly GNU" - it's present in a number of historical implementations (even System V had it, undocumented, though it didn't have the others like substr/length/index), which is why it's explicitly unspecified. I can't find anything about \| being an extension. – Random832 Dec 14 '15 at 16:13

[Oct 19, 2017] Bash One-Liners bashoneliners.com

Oct 19, 2017 | www.bashoneliners.com
Kill a process running on port 8080
 $ lsof -i :8080 | awk 'NR > 1 {print $2}' | xargs --no-run-if-empty kill

-- by Janos on Sept. 1, 2017, 8:31 p.m.

Make a new folder and cd into it.
 $ mkcd(){ NAME=$1; mkdir -p "$NAME"; cd "$NAME"; }

-- by PrasannaNatarajan on Aug. 3, 2017, 6:49 a.m.

Go up to a particular folder
 $ alias ph='cd ${PWD%/public_html*}/public_html'

-- by Jab2870 on July 18, 2017, 6:07 p.m.

Explanation

I work on a lot of websites and often need to go up to the public_html folder.

This command creates an alias so that however many folders deep I am, I will be taken up to the correct folder.

alias ph='....' : This creates a shortcut so that when command ph is typed, the part between the quotes is executed

cd ... : This changes directory to the directory specified

PWD : This is a global bash variable that contains the current directory

${...%/public_html*} : This removes /public_html and anything after it from the specified string

Finally, /public_html at the end is appended onto the string.

So, to sum up, when ph is run, we ask bash to change the directory to the current working directory with anything after public_html removed.
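The parameter expansion at the heart of the alias can be sketched in isolation with a hypothetical path:

```shell
# Hypothetical deep path under a public_html tree:
demo="/home/tom/sites/example.com/public_html/blog/wp-content"

# Strip "/public_html" and everything after it, then re-append it:
echo "${demo%/public_html*}/public_html"
# prints /home/tom/sites/example.com/public_html
```
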

Open another terminal at current location
 $ $TERMINAL & disown

-- by Jab2870 on July 18, 2017, 3:04 p.m.

Explanation

Opens another terminal window at the current location.

Use Case

I often cd into a directory and decide it would be useful to open another terminal in the same folder, maybe for an editor or something. Previously, I would open the terminal and repeat the CD command.

I have aliased this command to open so I just type open and I get a new terminal already in my desired folder.

The & disown part of the command stops the new terminal from being dependent on the first, meaning that you can still use the first, and if you close the first, the second will remain open.

Limitations

It relies on you having the $TERMINAL environment variable set. If you don't have this set, you could easily change it to something like the following:

gnome-terminal & disown or konsole & disown

Preserve your fingers from cd ..; cd ..; cd..; cd..;
 $ up(){ DEEP=$1; for i in $(seq 1 ${DEEP:-"1"}); do cd ../; done; }

-- by alireza6677 on June 28, 2017, 5:40 p.m.
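A quick sketch of the function in use (the starting directory is arbitrary):

```shell
up(){ DEEP=$1; for i in $(seq 1 ${DEEP:-"1"}); do cd ../; done; }

cd /usr/share    # start somewhere deep
up 2             # same as: cd ../; cd ../
pwd              # prints /
```

With no argument, up defaults to going up a single directory.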

Generate a sequence of numbers
 $ echo {01..10}

-- by Elkku on March 1, 2015, 12:04 a.m.

Explanation

This example will print:

01 02 03 04 05 06 07 08 09 10

While the original one-liner is indeed IMHO the canonical way to loop over numbers, the brace expansion syntax of Bash 4.x has some kick-ass features such as correct padding of the number with leading zeros.

Limitations

The zero-padding feature works only in Bash >=4.


Related one-liners
Generate a sequence of numbers
 $ for ((i=1; i<=10; ++i)); do echo $i; done

-- by Janos on Nov. 4, 2014, 12:29 p.m.

Explanation

This is similar to seq , but portable. seq does not exist on all systems and is no longer recommended. Other variations to emulate various uses of seq :

# seq 1 2 10
for ((i=1; i<=10; i+=2)); do echo $i; done

# seq -w 5 10
for ((i=5; i<=10; ++i)); do printf '%02d\n' $i; done
Find recent logs that contain the string "Exception"
 $ find . -name '*.log' -mtime -2 -exec grep -Hc Exception {} \; | grep -v :0$

-- by Janos on July 19, 2014, 7:53 a.m.

Explanation

The find :

  • -name '*.log' -- match files ending with .log
  • -mtime -2 -- match files modified within the last 2 days
  • -exec CMD ARGS \; -- for each file found, execute command, where {} in ARGS will be replaced with the file's path

The grep :

  • -c is to print the count of the matches instead of the matches themselves
  • -H is to print the name of the file, as grep normally won't print it when there is only one filename argument
  • The output lines will be in the format path:count . Files that didn't match "Exception" will still be printed, with 0 as count
  • The second grep filters the output of the first, excluding lines that end with :0 (= the files that didn't contain matches)

Extra tips:

  • Change "Exception" to the typical relevant failure indicator of your application
  • Add -i for grep to make the search case insensitive
  • To make the find match strictly only files, add -type f
  • Schedule this as a periodic job, and pipe the output to a mailer, for example | mailx -s 'error counts' yourmail@example.com
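Putting the extra tips together, a fuller sketch might look like this (the path and mail address are illustrative, and the mail step is commented out since it requires a configured mailer):

```shell
# Case-insensitive (-i), strictly files (-type f), modified in the
# last 2 days; the final grep drops files with zero matches:
find /var/log/myapp -type f -name '*.log' -mtime -2 \
    -exec grep -Hic exception {} \; | grep -v ':0$'
# ... | mailx -s 'error counts' yourmail@example.com
```
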
Remove offending key from known_hosts file with one swift move
 $ sed -i 18d .ssh/known_hosts

-- by EvaggelosBalaskas on Jan. 16, 2013, 2:29 p.m.

Explanation

Using sed to remove a specific line.

The -i parameter is to edit the file in-place.

Limitations

This works as posted in GNU sed . In BSD sed , the -i flag requires a parameter to use as the suffix of a backup file. You can set it to empty to not use a backup file:

[Oct 18, 2017] The frenzy is deliberately and I would say almost scientifically engineered by very bright marketing people in software vendors. Savvy IT organizations maintain their focus

Chasing recent fads and risking the organization's assets (systems, processes, people, reputations) for the sake of advancing your goals is a clear-cut characteristic of a broken ecosystem.
Feb 01, 2013 | www.itskeptic.org
The change madness is getting worse with every passing year.

The demands for change being placed on corporate IT are plain ridiculous. As a consequence we are breaking IT. In pursuit of absurd project commitments we are eating ourselves.

And the hysteria reaches fever pitch as people extrapolate trends into the future linearly, or worse still exponentially. This is such bad scientific thinking that it shouldn't be worthy of debate, but the power of critical thought is a scarce resource.

Rob England (The IT Skeptic) -> Dierdre Popov , March 7, 2013 3:43 AM

A broken management and governance system, a broken value system, and a broken culture.

But even in the best and healthiest organisations, there are plenty of rogues; psychopaths (and milder sociopaths) who are never going to care about anyone but themselves. They soar in management (and they're drawn to the power); they look good to all measures and controls except a robust risk management system - it is the last line of defense.

Rob England (The IT Skeptic) -> Simon Kent , February 28, 2013 5:06 AM

...I'm saying there is a real limit to how fast humans can change: how fast we can change our behaviours, our attitudes, our processes, our systems. We need to accept that the technology is changing faster than society, our IT sector, our organisations, our teams, ourselves can change.

I'm saying there is a social and business backlash already to the pace of change. We're standing in the ruins of an economy that embraced fast change.

I'm saying there are real risks to the pace of change, and we currently live in a culture that thinks writing risks down means you can then ignore them, or that if you can't ignore them you can always hedge them somehow.

We have to slow down a bit. Perhaps "Slow IT" is the wrong name but it was catchy. I'm not saying go slooooow. We've somehow sustained a pretty impressive pace for decades. But clearly it can't go much faster, if at all, and all these demands that it must go faster are plain silly. It just can't. There's bits falling off, people burning out, smoking shells of projects everywhere.

I'm not saying stop, but I am saying ease off a little, calm down, stop panicking, stop this desperate headlong rush. You are right Simon that mindfulness is a key element: we all need time to think. Let the world keep up.

Fustbariclation , February 27, 2013 10:03 PM

Yes, Rob, short-termism is certainly bad news, and rushing to achieve short-term goals without thinking about them in the larger context is a good indication of disaster ahead.

It's easy to mistake activity for progress.

Wdpowel , March 14, 2013 10:06 AM

Much of the zeitgeist that drives the frenzy you describe is generated by vendors, especially those with software in their portfolio. Software has more margin than hardware or services. As a result they have more marketing budget. With that budget they invest and spend a lot of time and effort to figure out exactly how to generate the frenzy with a new thing that you must have. They have to do this to keep market interest in the products. That is actually what their job is.

The frenzy is deliberately and I would say almost scientifically engineered by very very bright marketing people in software vendors. Savvy IT organizations are aware of that distinction and maintain their focus on enabling their business to be successful. IT as Utility, On Demand, SOA, Cloud, ..... Software vendors will not and should not stop doing that - that is what keeps them in business and generates profits that enable new innovation. The onus is on the buyer to understand that whatever the latest technology is, does not provide the answer for how they will improve business performance. Improving business performance is the burden that only the organization can bear.

[Oct 17, 2017] Converting string to lower case in Bash - Stack Overflow

Feb 15, 2010 | stackoverflow.com

assassin , Feb 15, 2010 at 7:02

Is there a way in bash to convert a string into a lower case string?

For example, if I have:

a="Hi all"

I want to convert it to:

"hi all"

ghostdog74 , Feb 15, 2010 at 7:43

There are various ways: tr
$ echo "$a" | tr '[:upper:]' '[:lower:]'
hi all
AWK
$ echo "$a" | awk '{print tolower($0)}'
hi all
Bash 4.0
$ echo "${a,,}"
hi all
Perl
$ echo "$a" | perl -ne 'print lc'
hi all
Bash
lc(){
    case "$1" in
        [A-Z])
        n=$(printf "%d" "'$1")
        n=$((n+32))
        printf \\$(printf "%o" "$n")
        ;;
        *)
        printf "%s" "$1"
        ;;
    esac
}
word="I Love Bash"
for((i=0;i<${#word};i++))
do
    ch="${word:$i:1}"
    lc "$ch"
done

jangosteve , Jan 14, 2012 at 21:58

Am I missing something, or does your last example (in Bash) actually do something completely different? It works for "ABX", but if you instead make word="Hi All" like the other examples, it returns ha , not hi all . It only works for the capitalized letters and skips the already-lowercased letters. – jangosteve Jan 14 '12 at 21:58

Richard Hansen , Feb 3, 2012 at 18:55

Note that only the tr and awk examples are specified in the POSIX standard. – Richard Hansen Feb 3 '12 at 18:55

Richard Hansen , Feb 3, 2012 at 18:58

tr '[:upper:]' '[:lower:]' will use the current locale to determine uppercase/lowercase equivalents, so it'll work with locales that use letters with diacritical marks. – Richard Hansen Feb 3 '12 at 18:58

Adam Parkin , Sep 25, 2012 at 18:01

How does one get the output into a new variable? Ie say I want the lowercased string into a new variable? – Adam Parkin Sep 25 '12 at 18:01

Tino , Nov 14, 2012 at 15:39

@Adam: b="$(echo $a | tr '[A-Z]' '[a-z]')"Tino Nov 14 '12 at 15:39
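A minimal sketch of that capture, with the bracket expressions quoted so the shell cannot expand them and with locale-aware classes instead of a raw A-Z range:

```shell
a="Hi all"
# quoted [:upper:]/[:lower:] classes survive globbing and odd locales
b="$(printf '%s' "$a" | tr '[:upper:]' '[:lower:]')"
echo "$b"    # hi all
```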

Dennis Williamson , Feb 15, 2010 at 10:31

In Bash 4:

To lowercase

$ string="A FEW WORDS"
$ echo "${string,}"
a FEW WORDS
$ echo "${string,,}"
a few words
$ echo "${string,,[AEIUO]}"
a FeW WoRDS

$ string="A Few Words"
$ declare -l string
$ string=$string; echo "$string"
a few words

To uppercase

$ string="a few words"
$ echo "${string^}"
A few words
$ echo "${string^^}"
A FEW WORDS
$ echo "${string^^[aeiou]}"
A fEw wOrds

$ string="A Few Words"
$ declare -u string
$ string=$string; echo "$string"
A FEW WORDS

Toggle (undocumented, but optionally configurable at compile time)

$ string="A Few Words"
$ echo "${string~~}"
a fEW wORDS
$ string="A FEW WORDS"
$ echo "${string~}"
a FEW WORDS
$ string="a few words"
$ echo "${string~}"
A few words

Capitalize (undocumented, but optionally configurable at compile time)

$ string="a few words"
$ declare -c string
$ string=$string
$ echo "$string"
A few words

Title case:

$ string="a few words"
$ string=($string)
$ string="${string[@]^}"
$ echo "$string"
A Few Words

$ declare -c string
$ string=(a few words)
$ echo "${string[@]}"
A Few Words

$ string="a FeW WOrdS"
$ string=${string,,}
$ string=${string~}
$ echo "$string"

To turn off a declare attribute, use + . For example, declare +c string . This affects subsequent assignments and not the current value.

The declare options change the attribute of the variable, but not the contents. The reassignments in my examples update the contents to show the changes.

Edit:

Added "toggle first character by word" ( ${var~} ) as suggested by ghostdog74

Edit: Corrected tilde behavior to match Bash 4.3.

ghostdog74 , Feb 15, 2010 at 10:52

there's also ${string~}ghostdog74 Feb 15 '10 at 10:52

Hubert Kario , Jul 12, 2012 at 16:48

Quite bizarre, "^^" and ",," operators don't work on non-ASCII characters but "~~" does... So string="łódź"; echo ${string~~} will return "ŁÓDŹ", but echo ${string^^} returns "łóDź". Even in LC_ALL=pl_PL.utf-8. That's using bash 4.2.24. – Hubert Kario Jul 12 '12 at 16:48

Dennis Williamson , Jul 12, 2012 at 18:20

@HubertKario: That's weird. It's the same for me in Bash 4.0.33 with the same string in en_US.UTF-8 . It's a bug and I've reported it. – Dennis Williamson Jul 12 '12 at 18:20

Dennis Williamson , Jul 13, 2012 at 0:44

@HubertKario: Try echo "$string" | tr '[:lower:]' '[:upper:]' . It will probably exhibit the same failure. So the problem is at least partly not Bash's. – Dennis Williamson Jul 13 '12 at 0:44

Dennis Williamson , Jul 14, 2012 at 14:27

@HubertKario: The Bash maintainer has acknowledged the bug and stated that it will be fixed in the next release. – Dennis Williamson Jul 14 '12 at 14:27

shuvalov , Feb 15, 2010 at 7:13

echo "Hi All" | tr "[:upper:]" "[:lower:]"

Richard Hansen , Feb 3, 2012 at 19:00

+1 for not assuming english – Richard Hansen Feb 3 '12 at 19:00

Hubert Kario , Jul 12, 2012 at 16:56

@RichardHansen: tr doesn't work for me for non-ASCII characters. I do have correct locale set and locale files generated. Have any idea what could I be doing wrong? – Hubert Kario Jul 12 '12 at 16:56

wasatchwizard , Oct 23, 2014 at 16:42

FYI: This worked on Windows/Msys. Some of the other suggestions did not. – wasatchwizard Oct 23 '14 at 16:42

Ignacio Vazquez-Abrams , Feb 15, 2010 at 7:03

tr :
a="$(tr [A-Z] [a-z] <<< "$a")"
AWK :
{ print tolower($0) }
sed :
y/ABCDEFGHIJKLMNOPQRSTUVWXYZ/abcdefghijklmnopqrstuvwxyz/

Sandeepan Nath , Feb 2, 2011 at 11:12

+1 a="$(tr [A-Z] [a-z] <<< "$a")" looks easiest to me. I am still a beginner... – Sandeepan Nath Feb 2 '11 at 11:12

Haravikk , Oct 19, 2013 at 12:54

I strongly recommend the sed solution; I've been working in an environment that for some reason doesn't have tr but I've yet to find a system without sed , plus a lot of the time I want to do this I've just done something else in sed anyway so can chain the commands together into a single (long) statement. – Haravikk Oct 19 '13 at 12:54

Dennis , Nov 6, 2013 at 19:49

The bracket expressions should be quoted. In tr [A-Z] [a-z] A , the shell may perform filename expansion if there are filenames consisting of a single letter or nullglob is set. tr "[A-Z]" "[a-z]" A will behave properly. – Dennis Nov 6 '13 at 19:49

Haravikk , Jun 15, 2014 at 10:51

@CamiloMartin it's a BusyBox system where I'm having that problem, specifically Synology NASes, but I've encountered it on a few other systems too. I've been doing a lot of cross-platform shell scripting lately, and with the requirement that nothing extra be installed it makes things very tricky! However I've yet to encounter a system without sedHaravikk Jun 15 '14 at 10:51

fuz , Jan 31, 2016 at 14:54

Note that tr [A-Z] [a-z] is incorrect in almost all locales. for example, in the en-US locale, A-Z is actually the interval AaBbCcDdEeFfGgHh...XxYyZ . – fuz Jan 31 '16 at 14:54

nettux443 , May 14, 2014 at 9:36

I know this is an oldish post but I made this answer for another site so I thought I'd post it up here:

UPPER -> lower : use python:

b=`echo "print '$a'.lower()" | python`

Or Ruby:

b=`echo "print '$a'.downcase" | ruby`

Or Perl (probably my favorite):

b=`perl -e "print lc('$a');"`

Or PHP:

b=`php -r "print strtolower('$a');"`

Or Awk:

b=`echo "$a" | awk '{ print tolower($1) }'`

Or Sed:

b=`echo "$a" | sed 's/./\L&/g'`

Or Bash 4:

b=${a,,}

Or NodeJS if you have it (and are a bit nuts...):

b=`echo "console.log('$a'.toLowerCase());" | node`

You could also use dd (but I wouldn't!):

b=`echo "$a" | dd  conv=lcase 2> /dev/null`

lower -> UPPER

use python:

b=`echo "print '$a'.upper()" | python`

Or Ruby:

b=`echo "print '$a'.upcase" | ruby`

Or Perl (probably my favorite):

b=`perl -e "print uc('$a');"`

Or PHP:

b=`php -r "print strtoupper('$a');"`

Or Awk:

b=`echo "$a" | awk '{ print toupper($1) }'`

Or Sed:

b=`echo "$a" | sed 's/./\U&/g'`

Or Bash 4:

b=${a^^}

Or NodeJS if you have it (and are a bit nuts...):

b=`echo "console.log('$a'.toUpperCase());" | node`

You could also use dd (but I wouldn't!):

b=`echo "$a" | dd  conv=ucase 2> /dev/null`

Also when you say 'shell' I'm assuming you mean bash but if you can use zsh it's as easy as

b=$a:l

for lower case and

b=$a:u

for upper case.

JESii , May 28, 2015 at 21:42

Neither the sed command nor the bash command worked for me. – JESii May 28 '15 at 21:42

nettux443 , Nov 20, 2015 at 14:33

@JESii both work for me upper -> lower and lower-> upper. I'm using sed 4.2.2 and Bash 4.3.42(1) on 64bit Debian Stretch. – nettux443 Nov 20 '15 at 14:33

JESii , Nov 21, 2015 at 17:34

Hi, @nettux443... I just tried the bash operation again and it still fails for me with the error message "bad substitution". I'm on OSX using homebrew's bash: GNU bash, version 4.3.42(1)-release (x86_64-apple-darwin14.5.0) – JESii Nov 21 '15 at 17:34

tripleee , Jan 16, 2016 at 11:45

Do not use! All of the examples which generate a script are extremely brittle; if the value of a contains a single quote, you have not only broken behavior, but a serious security problem. – tripleee Jan 16 '16 at 11:45

Scott Smedley , Jan 27, 2011 at 5:37

In zsh:
echo $a:u

Gotta love zsh!

Scott Smedley , Jan 27, 2011 at 5:39

or $a:l for lower case conversion – Scott Smedley Jan 27 '11 at 5:39

biocyberman , Jul 24, 2015 at 23:26

Add one more case: echo ${(C)a} #Upcase the first char onlybiocyberman Jul 24 '15 at 23:26

devnull , Sep 26, 2013 at 15:45

Using GNU sed :
sed 's/.*/\L&/'

Example:

$ foo="Some STRIng";
$ foo=$(echo "$foo" | sed 's/.*/\L&/')
$ echo "$foo"
some string

technosaurus , Jan 21, 2012 at 10:27

For a standard shell (without bashisms) using only builtins:
uppers=ABCDEFGHIJKLMNOPQRSTUVWXYZ
lowers=abcdefghijklmnopqrstuvwxyz

lc(){ #usage: lc "SOME STRING" -> "some string"
    i=0
    while ([ $i -lt ${#1} ]) do
        CUR=${1:$i:1}
        case $uppers in
            *$CUR*)CUR=${uppers%$CUR*};OUTPUT="${OUTPUT}${lowers:${#CUR}:1}";;
            *)OUTPUT="${OUTPUT}$CUR";;
        esac
        i=$((i+1))
    done
    echo "${OUTPUT}"
}

And for upper case:

uc(){ #usage: uc "some string" -> "SOME STRING"
    i=0
    while ([ $i -lt ${#1} ]) do
        CUR=${1:$i:1}
        case $lowers in
            *$CUR*)CUR=${lowers%$CUR*};OUTPUT="${OUTPUT}${uppers:${#CUR}:1}";;
            *)OUTPUT="${OUTPUT}$CUR";;
        esac
        i=$((i+1))
    done
    echo "${OUTPUT}"
}

Dereckson , Nov 23, 2014 at 19:52

I wonder if you didn't let some bashism in this script, as it's not portable on FreeBSD sh: ${1:$...}: Bad substitution – Dereckson Nov 23 '14 at 19:52

tripleee , Apr 14, 2015 at 7:09

Indeed; substrings with ${var:1:1} are a Bashism. – tripleee Apr 14 '15 at 7:09

Derek Shaw , Jan 24, 2011 at 13:53

Regular expression

I would like to take credit for the command I wish to share, but the truth is I obtained it for my own use from http://commandlinefu.com . It has the advantage that if you cd to any directory within your own home folder, it will change all files and folders to lower case recursively; please use with caution. It is a brilliant command line fix and especially useful for those multitudes of albums you have stored on your drive.

find . -depth -exec rename 's/(.*)\/([^\/]*)/$1\/\L$2/' {} \;

You can specify a directory in place of the dot (.) after the find, which denotes the current directory, or give a full path.

I hope this solution proves useful. The one thing this command does not do is replace spaces with underscores - oh well, another time perhaps.

Wadih M. , Nov 29, 2011 at 1:31

thanks for commandlinefu.comWadih M. Nov 29 '11 at 1:31

John Rix , Jun 26, 2013 at 15:58

This didn't work for me for whatever reason, though it looks fine. I did get this to work as an alternative though: find . -exec /bin/bash -c 'mv {} `tr [A-Z] [a-z] <<< {}`' \; – John Rix Jun 26 '13 at 15:58

Tino , Dec 11, 2015 at 16:27

This needs prename from perl : dpkg -S "$(readlink -e /usr/bin/rename)" gives perl: /usr/bin/prenameTino Dec 11 '15 at 16:27
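If the perl-based rename is unavailable, a rough equivalent can be sketched with plain find, tr and mv. The function name lowercase_files is ours, not from the answer; it renames files only (directories are left alone) and, like any read loop over find output, will misbehave on file names containing newlines:

```shell
# Depth-first walk; lowercase each file's basename in place.
lowercase_files() {
    find "$1" -depth -type f | while IFS= read -r f; do
        d=$(dirname "$f")
        b=$(basename "$f")
        lb=$(printf '%s' "$b" | tr '[:upper:]' '[:lower:]')
        [ "$b" = "$lb" ] || mv -- "$f" "$d/$lb"
    done
}
```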

c4f4t0r , Aug 21, 2013 at 10:21

In bash 4 you can use typeset

Example:

A="HELLO WORLD"
typeset -l A=$A

community wiki, Jan 16, 2016 at 12:26

Pre Bash 4.0

Bash Lower the Case of a string and assign to variable

VARIABLE=$(echo "$VARIABLE" | tr '[:upper:]' '[:lower:]') 

echo "$VARIABLE"

Tino , Dec 11, 2015 at 16:23

No need for echo and pipes: use $(tr '[:upper:]' '[:lower:]' <<<"$VARIABLE")Tino Dec 11 '15 at 16:23

tripleee , Jan 16, 2016 at 12:28

@Tino The here string is also not portable back to really old versions of Bash; I believe it was introduced in v3. – tripleee Jan 16 '16 at 12:28

Tino , Jan 17, 2016 at 14:28

@tripleee You are right, it was introduced in bash-2.05b - however that's the oldest bash I was able to find on my systems – Tino Jan 17 '16 at 14:28

Bikesh M Annur , Mar 23 at 6:48

You can try this
s="Hello World!" 

echo $s  # Hello World!

a=${s,,}
echo $a  # hello world!

b=${s^^}
echo $b  # HELLO WORLD!

ref : http://wiki.workassis.com/shell-script-convert-text-to-lowercase-and-uppercase/

Orwellophile , Mar 24, 2013 at 13:43

For Bash versions earlier than 4.0, this version should be fastest (as it doesn't fork/exec any commands):
function string.monolithic.tolower
{
   local __word=$1
   local __len=${#__word}
   local __char
   local __octal
   local __decimal
   local __result

   for (( i=0; i<__len; i++ ))
   do
      __char=${__word:$i:1}
      case "$__char" in
         [A-Z] )
            printf -v __decimal '%d' "'$__char"
            printf -v __octal '%03o' $(( $__decimal ^ 0x20 ))
            printf -v __char \\$__octal
            ;;
      esac
      __result+="$__char"
   done
   REPLY="$__result"
}

technosaurus's answer had potential too, although it didn't run properly for me.

Stephen M. Harris , Mar 22, 2013 at 22:42

If using v4, this is baked-in . If not, here is a simple, widely applicable solution. Other answers (and comments) on this thread were quite helpful in creating the code below.
# Like echo, but converts to lowercase
echolcase () {
    tr [:upper:] [:lower:] <<< "${*}"
}

# Takes one arg by reference (var name) and makes it lowercase
lcase () { 
    eval "${1}"=\'$(echo ${!1//\'/"'\''"} | tr [:upper:] [:lower:] )\'
}

Notes:

JaredTS486 , Dec 23, 2015 at 17:37

In spite of how old this question is, and similar to this answer by technosaurus, I had a hard time finding a solution that was portable across most platforms (that I use) as well as older versions of bash. I have also been frustrated with arrays, functions and use of prints, echos and temporary files to retrieve trivial variables. This works very well for me so far, so I thought I would share. My main testing environments are:
  1. GNU bash, version 4.1.2(1)-release (x86_64-redhat-linux-gnu)
  2. GNU bash, version 3.2.57(1)-release (sparc-sun-solaris2.10)
lcs="abcdefghijklmnopqrstuvwxyz"
ucs="ABCDEFGHIJKLMNOPQRSTUVWXYZ"
input="Change Me To All Capitals"
for (( i=0; i<"${#input}"; i++ )) ; do :
    for (( j=0; j<"${#lcs}"; j++ )) ; do :
        if [[ "${input:$i:1}" == "${lcs:$j:1}" ]] ; then
            input="${input/${input:$i:1}/${ucs:$j:1}}" 
        fi
    done
done

Simple C-style for loops iterate through the strings. For the line below, if you have not seen anything like this before, this is where I learned it. In this case the line checks if the char ${input:$i:1} (lower case) exists in input and if so replaces it with the given char ${ucs:$j:1} (upper case) and stores it back into input.

input="${input/${input:$i:1}/${ucs:$j:1}}"

Gus Neves , May 16 at 10:04

Many answers use external programs, which is not really using Bash .

If you know you will have Bash 4 available you should really just use the ${VAR,,} notation (it is easy and cool). For Bash before 4 (my Mac still uses Bash 3.2, for example), I used the corrected version of @ghostdog74 's answer to create a more portable version.

One you can call as lowercase 'my STRING' and get a lowercase version. I read comments about setting the result to a var, but that is not really portable in Bash , since we can't return strings. Printing it is the best solution. Easy to capture with something like var="$(lowercase $str)" .

How this works

The way this works is by getting the ASCII integer representation of each char with printf and then adding 32 if converting upper to lower, or subtracting 32 if converting lower to upper. Then use printf again to convert the number back to a char. From 'A' to 'a' we have a difference of 32 chars.

Using printf to explain:

$ printf "%d\n" "'a"
97
$ printf "%d\n" "'A"
65

97 - 65 = 32

And this is the working version with examples.
Please note the comments in the code, as they explain a lot of stuff:

#!/bin/bash

# lowerupper.sh

# Prints the lowercase version of a char
lowercaseChar(){
    case "$1" in
        [A-Z])
            n=$(printf "%d" "'$1")
            n=$((n+32))
            printf \\$(printf "%o" "$n")
            ;;
        *)
            printf "%s" "$1"
            ;;
    esac
}

# Prints the lowercase version of a sequence of strings
lowercase() {
    word="$@"
    for((i=0;i<${#word};i++)); do
        ch="${word:$i:1}"
        lowercaseChar "$ch"
    done
}

# Prints the uppercase version of a char
uppercaseChar(){
    case "$1" in
        [a-z])
            n=$(printf "%d" "'$1")
            n=$((n-32))
            printf \\$(printf "%o" "$n")
            ;;
        *)
            printf "%s" "$1"
            ;;
    esac
}

# Prints the uppercase version of a sequence of strings
uppercase() {
    word="$@"
    for((i=0;i<${#word};i++)); do
        ch="${word:$i:1}"
        uppercaseChar "$ch"
    done
}

# The functions will not add a new line, so use echo or
# append it if you want a new line after printing

# Printing stuff directly
lowercase "I AM the Walrus!"$'\n'
uppercase "I AM the Walrus!"$'\n'

echo "----------"

# Printing a var
str="A StRing WITH mixed sTUFF!"
lowercase "$str"$'\n'
uppercase "$str"$'\n'

echo "----------"

# Not quoting the var should also work, 
# since we use "$@" inside the functions
lowercase $str$'\n'
uppercase $str$'\n'

echo "----------"

# Assigning to a var
myLowerVar="$(lowercase $str)"
myUpperVar="$(uppercase $str)"
echo "myLowerVar: $myLowerVar"
echo "myUpperVar: $myUpperVar"

echo "----------"

# You can even do stuff like
if [[ 'option 2' = "$(lowercase 'OPTION 2')" ]]; then
    echo "Fine! All the same!"
else
    echo "Ops! Not the same!"
fi

exit 0

And the results after running this:

$ ./lowerupper.sh 
i am the walrus!
I AM THE WALRUS!
----------
a string with mixed stuff!
A STRING WITH MIXED STUFF!
----------
a string with mixed stuff!
A STRING WITH MIXED STUFF!
----------
myLowerVar: a string with mixed stuff!
myUpperVar: A STRING WITH MIXED STUFF!
----------
Fine! All the same!

This should only work for ASCII characters though .

For me it is fine, since I know I will only pass ASCII chars to it.
I am using this for some case-insensitive CLI options, for example.

nitinr708 , Jul 8, 2016 at 9:20

To store the transformed string into a variable, the following worked for me, converting $SOURCE_NAME to $TARGET_NAME:
TARGET_NAME="`echo $SOURCE_NAME | tr '[:upper:]' '[:lower:]'`"
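Summing up the thread, a small helper (the name to_lower is ours, not from any answer above) that uses the Bash 4 builtin when available and falls back to tr elsewhere:

```shell
to_lower() {
    # ${1,,} needs Bash >= 4; otherwise defer to tr with locale classes
    if [ -n "${BASH_VERSINFO:-}" ] && [ "${BASH_VERSINFO[0]}" -ge 4 ]; then
        printf '%s\n' "${1,,}"
    else
        printf '%s\n' "$1" | tr '[:upper:]' '[:lower:]'
    fi
}

to_lower "A FEW Words"    # a few words
```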

[Oct 17, 2017] Use Spare Older Workers to Overcome 'Labour Shortages' naked capitalism

Notable quotes:
"... By Leith van Onselen. Originally published at MacroBusiness ..."
"... "remains low by historical standards" ..."
"... There's a myth that innovation comes from the 20 something in their basement, but that's just not the case. ..."
Oct 17, 2017 | www.nakedcapitalism.com

Yves here. On the one hand, as someone who is getting to be pretty long in the tooth, I'm not sure about calling un- and under-employed older workers "spare". But when the alternative is being thrown on the trash heap, maybe that isn't so unflattering.

Even though this analysis is from Australia, most of if not all of its finding would almost certainly prove out in the US. However, there is a whole 'nother set of issues here. Australia is 85% urban, with most of the population living in or near four large cities. So its labor mobility issues are less pronounced than here. Moreover, a lot of the whinging in the US about worker shortages, as even readers of the Wall Street Journal regularly point out in its comment section is:

1. Not being willing to pay enough to skilled workers, which includes not being willing to pay them to relocate

2. Not being willing to train less skilled workers, as companies once did as a matter of course

By Leith van Onselen. Originally published at MacroBusiness

A few weeks back, the Benevolent Society released a report which found that age-related discrimination is particularly rife in the workplace, with over a quarter (29%) of survey respondents stating they had been turned down for a job because of their old age, whereas 14% claimed they had been denied a promotion because of their old age.

Today, the Regional Australia Institute (RAI) has warned that Australia is facing a pension crisis unless employers stop their "discrimination" against older workers. From The ABC :

[RAI] has warned the Federal Government's pension bill would rise from $45 billion to $51 billion within three years, unless efforts were made to help more mature workers gain employment, particularly in regional communities.

Chief executive Jack Archer said continued unemployment of people older than 55 would cut economic growth and put a greater strain on public resources.

"We hear that there is a lot of people who would like to work, who would love to stay in the workforce either part-time or full-time even though they're in their late 50s, 60s and even into their 70s," he said.

"But we're not doing a very good job of giving them the training, giving them the incentives around the pension, and working with employers to stop the discrimination around employing older workers"

"It basically means you've got a lot of talent on the bench, a lot of people who could be involved and contributing who are sitting around homes and wishing they were doing something else," he said

Mr Archer said as the population aged the workforce shrank, and that risked future economic growth.

But he said that could be reversed provided employers embraced an older workforce

"[When] those people are earning [an income], their pension bills will either disappear or be much lower and the government will get a benefit from that."

For years the growth lobby and the government has told us that Australia needs to run high levels of immigration in order to alleviate so-called 'skills shortages' and to mitigate an ageing population. This has come despite the Department of Employment showing that Australia's skills shortage "remains low by historical standards" and Australia's labour underutilisation rate tracking at high levels:

Economic models are often cited as proof that a strong immigration program is 'good' for the economy because they show that real GDP per capita is moderately increased via immigration, based on several dubious assumptions.

The most dubious of these assumptions is that population ageing will necessarily result in fewer people working, which will subtract from per capita GDP (due to the ratio of workers to dependents falling).

Leaving aside the fact that the assumed benefit to GDP per capita from immigration is only transitory, since migrants also age (thereby requiring an ever-bigger immigration intake to keep the population age profile from rising), it is just as likely that age-specific workforce participation will respond to labour demand, resulting in fewer people being unemployed. This is exactly what has transpired in Japan where an ageing population has driven the unemployment rate down to only 2.8% – the lowest level since the early-1990s:

The ABS last month revealed that more Australians are working past traditional retirement age, thereby mitigating concerns that population ageing will necessarily reduce the employment-to-population ratio:

Clearly, however, there is much further scope to boost workforce participation among older workers.

Rather than relying on mass immigration to fill phantom 'labour shortages' – in turn displacing both young and older workers alike – the more sensible policy option is to moderate immigration and instead better utilise the existing workforce as well as use automation to overcome any loss of workers as the population ages – as has been utilised in Japan.

It's worth once again highlighting that economists at MIT recently found that there is absolutely no relationship between population ageing and economic decline. To the contrary, population ageing seems to have been associated with improvements in GDP per capita, thanks to increased automation:

If anything, countries experiencing more rapid aging have grown more in recent decades we show that since the early 1990s or 2000s, the periods commonly viewed as the beginning of the adverse effects of aging in much of the advanced world, there is no negative association between aging and lower GDP per capita on the contrary, the relationship is significantly positive in many specifications.

The last thing that Australia should be doing is running a mass immigration program which, as noted many times by the Productivity Commission cannot provide a long-term solution to ageing, and places increasing strains on infrastructure, housing and the natural environment.

The sustainable 'solution' to population ageing is to better utilise the existing workforce, where significant spare capacity exists.

Enquiring Mind , October 17, 2017 at 10:26 am

At what point might an impatient constituency demand greater accountability by its elected representatives? In the business world, the post-2000 accounting scandals like Enron resulted in legislation to make company execs sign off on financial statements under threat of harsh personal penalties for misrepresentation. If legislators were forced by constituents to enact similar legislation about their own actions, the transparency could be very enlightening and a type of risk reduction due to acknowledgement of material factors. Imagine seeing in print the real reasons for votes, the funding sources behind those votes and prospect of jail time for misrepresentation about what is just their damn job. Call it Truth-In-Legislating, similar to the prior Truth-In-Lending act.

Vatch , October 17, 2017 at 11:29 am

It's a nice idea, but I don't think that very many executives have been penalized under the Sarbanes Oxley Act. Jamie Dimon certainly wasn't penalized for the actions of the London Whale. I guess we'll see what happens in the near future to the executives of Wells Fargo. I suspect that a Truth-In-Legislating law would be filled with loopholes or would be hampered by enforcement failures, like current Congressional ethics rules and the Sarbanes Oxley Act.

sgt_doom , October 17, 2017 at 2:11 pm

At what point might an impatient constituency demand greater accountability by its elected representatives?

At that point when they start shooting them (as they did in Russian in the very early 1900s, or lop their heads off, as they once did in France).

Personally, I'll never work for any Ameritard corporation ever again, as real innovation is not allowed, and the vast majority are all about financialization in some form or other!

My work life the past thirty years became worse and worse and worse, in direct relation to the majority of others, and my last jobs were beyond commenting up.

My very last position, which was in no manner related to my experience, education, skill set and talents -- like too many other American workers -- ended with a most tortuous layoff: the private equity firm which was the owner in a failed "pump and dump" brought in a "toxic work environment specialist" whose job was to advise the sleazoid senior executives (and by that time I was probably one of only four actual employee workers there; they had hired a whole bunch of executives, though) on how to create a negative work environment to convince us to leave instead of merely laying us off (it worked for two, but not for the last lady there or myself).

The American workplace sucks big time as evidenced by their refusal to raise wages while forever complaining about their inability to find skilled employees -- they are all criminals today!

RUKidding , October 17, 2017 at 10:55 am

Interesting article and thanks.

I lived and worked in Australia in the late '70s and early '80s. Times were different. Back then, the government jobs came with mandatory retirement. I believe (but could be wrong) that it was at 63, but you could request staying until 65 (required approval). After that, one could continue working in the private sector, if you could find a job.

The population was much less than it is now. I believe the idea was to make room for the younger generation coming up. Back then, government workers, as well as many private sector workers, had defined benefit pension plans. So retiring younger typically worked out ok.

I had one friend who continued working until about 70 because she wanted to; liked her job; and wasn't interested in retiring. However, I knew far more people who were eager to stop at 63. But back then, it appeared to me that they had the financial means to do so without much worry.

Things have changed since then. More of my friends are putting off retirement because they need the money now. Plus defined-benefit pension plans have mostly been dispensed with and replaced by, I believe (I'm not totally clear on this), the Aussie version of a 401(k) (someone can correct me if I'm wrong).

What the article proposes makes sense. Of course here in the USA, older workers/job seekers face a host of discriminatory practices, especially for the better paying jobs. Nowadays, though, US citizens in their golden years can sell their house, buy an RV, and become itinerant workers – sometimes at back breaking labor, such as harvesting crops or working at an Amazon gulag – for $10 an hour. Yippee kay-o kay-aaay!

So let us also talk about cutting Medicare for all of those lazy slacker Seniors out there. Woo hoo!

jrs , October 17, 2017 at 1:34 pm

There are really two issues:
1) for those whom age discrimination in employment is hitting in their 50s or even younger, before anyone much is retiring, it needs to be combatted
2) eventually (sometimes in their 60s, and really it should be at least by 65) people ought to be allowed to retire, and with enough money to not be in poverty. This work-full-time-until-you-drop garbage is just that (it's not as if 70-year-olds can even, say, work 20 hours instead; no, it's the same 50+ hours or whatever as everyone else is doing). And most people won't live that much longer, really they won't; U.S. average lifespans aren't that long and are falling fast. So it really is work-until-you-die that is being pushed if people aren't allowed to retire sometime in their 60s. Some people have good genes and good luck and so on (they may also have a healthy lifestyle, but sheer luck plays a large role) and will live far beyond that, but averages

RUKIdding , October 17, 2017 at 4:44 pm

Agree with you about the 2 issues.

Working past 65 is one of those things where it just depends. I know people who are happily (and don't "really" need the money) working past 65 bc they love their jobs and they're not taking a toll on their health. They enjoy the socialization at work; are intellectually stimulated; and are quite happy. That's one issue.

But when people HAVE TO work past 65 – and I know quite a few in this category – when it starts taking a toll on their health, that is truly bad. And I can reel off several cases that I know of personally. It's just wrong.

Whether you live much longer or not is sort of up to fate, no matter what. But yes, if work is taking a toll on your health, then you most likely won't live as long.

cocomaan , October 17, 2017 at 11:06 am

In January, economists from MIT published a paper, entitled Secular Stagnation? The Effect of Aging on Economic Growth in the Age of Automation, which showed that there is absolutely no relationship between population aging and economic decline. To the contrary, population aging seems to have been associated with improvements in GDP per capita, thanks to increased automation:

From the cited article.

I don't know why it never occurred to me before, but there's no reason to ditch your most knowledgeable, most skilled workers toward the eve of their careers except if you don't want to pay labor costs. Which we know that most firms do not, in their mission for profit for shareholders or the flashy new building or trying to Innuhvate .

There's a myth that innovation comes from the 20 something in their basement, but that's just not the case. Someone who has, for instance, overseen 100 construction projects building bridges needs to be retained, not let go. Maybe they can't lift the sledge anymore, but I'd keep them on as long as possible.

Good food for thought! I enjoyed this piece.

HotFlash , October 17, 2017 at 4:19 pm

There's a myth that innovation comes from the 20 something in their basement, but that's just not the case.

Widely held by 20 somethings. Maybe it's just one of those Oedipus things.

fresno dan , October 17, 2017 at 11:08 am

1. Not being willing to pay enough to skilled workers, which includes not being willing to pay them to relocate

2. Not being willing to train less skilled workers, as companies once did as a matter of course

3. older workers have seen all the crap and evil management has done, and are usually in a much better position than young, less established employees to take effective action against it

Disturbed Voter , October 17, 2017 at 1:26 pm

This. Don't expect rational actors, in management or labor. If everyone was paid the same, regardless of age or training or education or experience etc., then the financial incentives for variant outcomes would decrease. Except for higher health costs for older workers. For them, we could simply ban employer-provided health insurance; then that takes that variable out of the equation too. So yes, the ideal is a rational Marxism or the uniformity of the hive-mind-feminism. While we would have "from each according to their ability, to each according to their need," we will have added it as an axiom that all have the same need. And a whip can encourage the hoi polloi to do their very best.

Jeremy Grimm , October 17, 2017 at 2:25 pm

Fully agree! To your list I would add a corollary to your item #3 -- older workers, having seen all the crap and evil management has done, are more likely to inspire other employees to feel and act with them. This corollary is obvious but I think it bears stating for emphasis of the point.

I believe your whole list might be viewed as symptoms resulting from the concept of workers as commodity -- fungible as cogs on a wheel. Young and old alike are dehumanized.

The boss of the branch office of the firm I last worked for before I retired constantly emphasized how each of us must remain "fungible" [he's who introduced me to this word] if we wanted to remain employed. The firm would win contracts using one set of workers in its bids and slowly replace them with new workers providing the firm a higher return per hour billed to the client. I feel very lucky I managed to remain employed -- to within a couple of years of the age when I could apply for Medicare. [Maybe it's because I was too cowed to make waves and avoided raises as best I could.]

[I started my comment considering the idea of "human capital" but ran into trouble with that concept. Shouldn't capital be assessed in terms of its replacement costs and its capacity for generating product or other gain? I had trouble working that calculus into the way firms treat their employees and decided "commodity" rather than "capital" better fit how workers were regarded and treated.]

BoycottAmazon , October 17, 2017 at 11:16 am

"skills vs. demand imbalance" not labor shortage. Capital wants to tip the scale the other way, but isn't willing to invest the money to train the people, per a comment I made last week. Plenty of unemployed or under-employed even in Japan, much less Oz.

Keeping the elderly, who already have the skills, in the work place longer is a way to put off making the investments. Getting government to tax the poor for their own training is another method. Exploiting poor nations education systems by importing skills yet another.

Some businesses hope to develop skills that only cost motive power (electric) and minimal maintenance, and are far less capital intensive and quicker to the market than the current primary source's 18 years. Capitalism on an infinite resource will eat itself, but even capitalism with finite resources will self-destruct in the end.

Jim Haygood , October 17, 2017 at 1:35 pm

Importantly, the chart labeled as Figure 2 uses GDP per capita on the y-axis.

Bearing in mind that GDP growth is composed of labor force growth plus productivity growth, emerging economies that are growing faster than the rich world in both population and GDP look more anemic on a per capita basis, allowing us rich country denizens to feel better about our good selves. :-)

But in terms of absolute GDP growth, things ain't so bright here in the Homeland. Both population and productivity growth are slowing. Over the past two-thirds century, the trend in GDP groaf is relentlessly down, even as debt rises in an apparent attempt to maintain unsustainable living standards. Chart (viewer discretion advised):

https://gailtheactuary.files.wordpress.com/2016/02/us-annual-gdp-growth-rate-2015.png

Van Onselen doesn't address the rich world's busted pension systems. To the extent that they contain a Ponzi element premised on endless growth, immigration would modestly benefit them by adding new victims workers to support the greying masses of doddering Boomers.

Will you still need me
Will you still feed me
When I'm sixty-four?

-- The Beatles

Arthur Wilke , October 17, 2017 at 1:44 pm

There's been an increase in the employment of older people in the U.S. population. To provide a snapshot, below are three tables referring to the U.S. by age cohorts of 1) the total population, 2) employment, and 3) employment-population ratios (percents), based on Bureau of Labor Statistics weightings for population estimates and compiled in the Merged Outgoing Rotation Groups (MORG) dataset by the National Bureau of Economic Research (NBER) from the monthly Current Population Survey (CPS).

The portion of the population aged 16 to 54 has declined while the portion over 54 has increased.
1. Percent Population in Age Cohorts: 1986 & 2016

1986 2016 AGE
18.9 15.2 16-24
53.7 49.6 25-54
12.2 16.3 55-64
9.4 11.2 65-74
5.8 7.7 75 & OVER
100.0 100.0 ALL

The portion of the employed aged 16 to 54 has declined while the portion over 54 has increased.

2 Percent Employed in Age Cohorts: 1986 & 2016

1986 2016 AGE
18.5 12.5 16-24
68.4 64.7 25-54
10.4 16.9 55-64
2.3 4.8 65-74
0.4 1.0 75 & OVER
100.0 100.0 ALL

The employment-population ratios (percents) show significant declines for those under 25 while increases for those 55 and above.

3. Age-Specific Employment Population Ratios (Percents)

1986 2016 AGE
59.5 49.4 16-24
77.3 77.9 25-54
51.8 61.8 55-64
14.8 25.9 65-74
3.8 7.9 75 & OVER
60.7 59.7 ALL

None of the above data refute claims about age and experience inequities. Rather these provide a base from which to explore such concerns. Because MORG data are representative samples with population weightings, systematic contingency analyses are challenging.

In the 30 year interval of these data there have been changes in population and employment by education status, gender, race, citizenship status along with industry and occupation, all items of which are found in the publicly available MORG dataset.

AW

Yves Smith Post author , October 17, 2017 at 4:54 pm

I think you are missing the point. Life expectancy at birth has increased by nearly five years since 1986. That renders simple comparisons of labor force participation less meaningful. The implication is that many people are not just living longer but are in better shape in their later middle age. Look at the dramatic drop in labor force participation from the 25-54 age cohort v. 55 to 64. How can so few people in that age group be working given that even retiring at 65 is something most people cannot afford? And the increase over time in the current 55-64 age cohort is significantly due to the entry of women into the workplace. Mine was the first generation where that became widespread.

The increase in the over 65 cohort reflects desperation. Anyone who can work stays working.

Arthur Wilke , October 17, 2017 at 6:27 pm

Even if life expectancy is increasing due to improved health, the percentage of those in older cohorts who are working is increasing at an even faster rate. If a ratio is 6/8 for a category and goes up to 10/12, the category has increased (8 to 12, or 50%), the subcategory has increased (6 to 10, or 67%), and the ratio goes from 6/8 (75/100) to 10/12 (83.3/100).

I assume you are referencing the employment-population (E/P) ratio when noting "the dramatic drop in labor force participation from the 25-54 age cohort v. 55 to 64." However the change in the E/P ratio for 25-54 year olds was virtually unchanged (77.3/100 in 1986 to 77.9/100 in 2016) and for the 55-64 year olds the E/P ratio INCREASED significantly, from 51.8/100 in 1986 to 61.8/100 in 2016.

You query: "How can so few people in that age group be working given that even retiring at 65 is something most people cannot afford?" That's a set of concerns the data I've compiled cannot address. It would take more time to see if an empirical answer could be constructed, something that doesn't lend itself to making a timely, empirically based comment. The data I compiled were assembled after reading the original post.

You note: "[T]he increase over time in the current 55-64 age cohort is significantly due to the entry of women into the workplace." Again, I didn't compute the age- and gender-specific E/P ratios. I can do that if there's interest. The OVERALL female E/P ratio (from FRED) did not significantly increase from December 1986 (51.7/100) to December 2016 (53.8/100).

You write: "The increase in the over 65 cohort reflects desperation. Anyone who can work stays working." Again, the data I was using provided me no basis for this interpretation. I suspect that the MORG data can provide some support for it. However, based on your comments about longer life expectancy, it's likely that a higher proportion of those in the professional middle class or in the upper-middle-class category Richard Reeves writes about (Dream Hoarders) were able and willing to continue working. For a time in higher education some institutions offered incentives for older faculty to continue working, whereby they could continue to receive a salary and, upon becoming eligible for Social Security, draw on that benefit. No doubt many, many vulnerable older people, including workers laid off in the wake of the Great Recession and otherwise burdened, lengthened their working lives or sought employment.

Again the MORG data can get somewhat closer to your concerns and interests, but whether this is the forum is a challenge given the reporting-comment cycle which guides this excellent site.

paul , October 17, 2017 at 1:54 pm

Institutional memory (perhaps, wisdom) is a positive threat to institutional change (for the pillage).

In my experience, those in possession of it are encouraged/discouraged/finally made to go.

The break-up of British Rail is a salient, suppurating example.

The break-up of the National Health Service is another.

It would be easy to go on, I just see it as the long year zero the more clinical sociopaths desire.

Livius Drusus , October 17, 2017 at 3:20 pm

I don't understand how the media promotes the "society is aging, we need more immigrants to avoid a labor shortage" argument and the "there will be no jobs in the near future due to automation, there will be a jobs shortage" argument at the same time. Dean Baker has discussed this issue:

http://cepr.net/publications/op-eds-columns/badly-confused-economics-the-debate-on-automation

In any event, helping to keep older workers in the workforce can be a good thing. Some people become physically inactive after retirement and their social networks decline which can cause depression and loneliness. Work might benefit some people who would otherwise sink into inactivity and loneliness.

Of course, results might vary based on individual differences and those who engaged in hard physical labor will likely have to retire earlier due to wear and tear on their bodies.

flora , October 17, 2017 at 7:42 pm

Increase in life expectancy is greatly influenced by a decrease in childhood mortality. People are living longer because they aren't dying in large numbers in childhood anymore in the US. So many arguments that start out "we're living longer, so something" confuse a reduction in childhood mortality with how long one can expect to live to in old age, based on the actuarial charts. Pols who want to cut SS or increase the retirement age find this confusion very useful.

" Life expectancy at birth is very sensitive to reductions in the death rates of children, because each child that survives adds many years to the amount of life in the population. Thus, the dramatic declines in infant and child mortality in the twentieth century were accompanied by equally stunning increases in life expectancy. "

http://www.pbs.org/fmc/timeline/dmortality.htm

TarheelDem , October 17, 2017 at 5:24 pm

I've noticed ever since the 1990s that "labor shortage" is a signal for cost-cutting measures that trigger a recession. Which then becomes the excuse for shedding workers and really getting the recession on.

It is not just older workers who are spare. There are other forms of discrimination that could fall by the wayside if solving the "labor shortage" was the sincere objective.

JBird , October 17, 2017 at 5:57 pm

Often productivity, sales, and profits decrease with those cost cuttings, which justifies further cuts, which decrease productivity, sales, and profits, which justifies

It's a pattern I first noticed in the 1990s and looking back in the 80s too. It's like some malevolent MBAs went out and convinced the whole of American middle and senior business management that this was the Way to do it. It's like something out of the most hidebound, nonsensical ideas of Maoism and Stalinism as something that could not fail but only be failed. It is right out of the Chicago Boys' economics playbook. Thirty-five years later and the Way still hasn't succeeded, but they're still trying not to fail it.

SpringTexan , October 17, 2017 at 6:59 pm

Love your reflections. Yeah, it's like a religion that they can't pay more, can't train, must cut people till they are working to their max at ordinary times (so have no slack for crises), etc. etc., and that it doesn't work doesn't change the faith in it AT ALL.

JBird , October 17, 2017 at 5:42 pm

This is ranting, but most jobs can be done at most ages. If you want someone to be a SEAL or to do 12 hours of farm labor, no, of course not, but for just about everything else, what's the problem?

All this "we have a skilled labor shortage" or "we have a labor surplus" or "the workers are all lazy/stupid" narratives" and "it's the unions' fault" and "the market solves everything" and the implicit "we are a true meritocracy and the losers are waste who deserve their pain" and my favorite of the "Job creators do make jobs" being said, and/or believed all at the same time is insanity made mainstream.

Sometimes I think whoever is running things are told they have to drink the Draught of UnWisdom before becoming the elites.

Dan , October 17, 2017 at 7:12 pm

So I'm a middle aged fella – early thirties – and have to admit that in my industry I find that most older workers are a disaster. I'm in tech and frankly find that most older workers are a detriment simply from being out of date. While I sympathize, in some cases experience can be a minus rather than a plus. The willingness to try new things and stay current with modern technologies/techniques just isn't there for the majority of tech workers that are over the hill.

flora , October 17, 2017 at 7:46 pm

Well, if you're lucky, your company won't replace you with a cheaper H-1B visa holder or outsource your job to the subcontinent before you're 40.

[Oct 16, 2017] Indenting Here-Documents - bash Cookbook

Oct 16, 2017 | www.safaribooksonline.com

Indenting Here-Documents

Problem

The here-document is great, but it's messing up your shell script's formatting. You want to be able to indent for readability.

Solution

Use <<- and then you can use tab characters (only!) at the beginning of lines to indent this portion of your shell script.

   $ cat myscript.sh
        ...
             grep $1 <<-'EOF'
                lots of data
                can go here
                it's indented with tabs
                to match the script's indenting
                but the leading tabs are
                discarded when read
                EOF
            ls
        ...
        $
Discussion

The hyphen just after the << is enough to tell bash to ignore the leading tab characters. This is for tab characters only and not arbitrary white space. This is especially important with the EOF or any other marker designation. If you have spaces there, it will not recognize the EOF as your ending marker, and the "here" data will continue through to the end of the file (swallowing the rest of your script). Therefore, you may want to always left-justify the EOF (or other marker) just to be safe, and let the formatting go on this one line.
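The stripping behavior is easy to check directly. Below is a minimal sketch (the script path and name are purely illustrative) that generates a small script whose here-document body and closing EOF begin with real tab characters, then runs it:

```shell
# Generate a script whose here-document lines begin with a literal tab
# (\t in the printf format), then execute it. With <<- the leading tabs
# are stripped from both the body lines and the closing EOF marker.
printf 'cat <<-EOF\n\tindented with a tab\n\tEOF\n' > /tmp/heredoc_demo.sh
sh /tmp/heredoc_demo.sh    # prints: indented with a tab
```

Building the script with printf keeps the tab characters unambiguous; editors configured to convert tabs to spaces will silently break a `<<-` here-document in exactly the way the discussion above warns about.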

[Oct 16, 2017] Indenting bourne shell here documents

Oct 16, 2017 | prefetch.net

The Bourne shell provides here-documents to allow a block of data to be passed to a process through STDIN. The typical format for a here-document is something similar to this:

command <<ARBITRARY_TAG
data to pass 1
data to pass 2
ARBITRARY_TAG

This will send the data between the ARBITRARY_TAG statements to the standard input of the process. In order for this to work, you need to make sure that the data is not indented. If you indent it for readability, you will get a syntax error similar to the following:

./test: line 12: syntax error: unexpected end of file

To allow your here documents to be indented, you can append a "-" to the end of the redirection strings like so:

if [ "${STRING}" = "SOMETHING" ]
then
        somecommand <<-EOF
        this is a string1
        this is a string2
        this is a string3
        EOF
fi

You will need to use tabs to indent the data, but that is a small price to pay for added readability. Nice!
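The tabs-only rule can be verified with a quick contrast. This sketch (file names are illustrative) builds one script indented with tabs and one indented with spaces, and shows that `<<-` strips only the tabs:

```shell
# <<- strips leading tab characters only. The space-indented body keeps
# its indentation; note that a space-indented closing EOF would not even
# be recognized as the terminator, so it is left unindented here.
printf 'cat <<-EOF\n\ttabbed line\n\tEOF\n' > /tmp/tabs.sh
printf 'cat <<-EOF\n    spaced line\nEOF\n' > /tmp/spaces.sh
sh /tmp/tabs.sh      # prints: tabbed line
sh /tmp/spaces.sh    # prints:     spaced line
```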

[Oct 15, 2017] Two Cheers For Trump's Immigration Proposal Especially "Interior Enforcement" - The Unz Review

Notable quotes:
"... In the 1970s a programming shop was legacy American, with only a thin scattering of foreigners like myself. Twenty years later programming had been considerably foreignized , thanks to the H-1B visa program. Now, twenty years further on, I believe legacy-American programmers are an endangered species. ..."
"... So a well-paid and mentally rewarding corner of the middle-class job market has been handed over to foreigners -- for the sole reason, of course, that they are cheaper than Americans. The desire for cheap labor explains 95 percent of U.S. immigration policy. The other five percent is sentimentality. ..."
"... Now they are brazen in their crime: you have heard, I'm sure, those stories about American workers being laid off, with severance packages conditional on their helping train their cheaper foreign replacements. That's our legal ..."
"... A "merit-based" points system won't fix that. It will quickly and easily be gamed by employers to lay waste yet more middle-class occupational zones for Americans. If it was restricted to the higher levels of "merit," we would just be importing a professional overclass of foreigners, most East and South Asians, to direct the labors of less-meritorious legacy Americans. How would that ..."
"... Measured by the number of workers per year, the largest guestworker program in the entire immigration system is now student visas through the Optional Practical Training program (OPT). Last year over 154,000 aliens were approved to work on student visas. By comparison, 114,000 aliens entered the workforce on H-1B guestworker visas. ..."
"... A History of the 'Optional Practical Training' Guestworker Program , ..."
"... incredible amount ..."
"... on all sorts of subjects ..."
"... for all kinds of outlets. (This ..."
"... no longer includes ..."
"... National Review, whose editors had some kind of tantrum and ..."
"... and several other ..."
"... . He has had two books published by VDARE.com com: ..."
"... ( also available in Kindle ) and ..."
"... Has it ever occurred to anyone other than me that the cost associated with foreign workers using our schools and hospitals and pubic services for free, is more than off-set by the cheap price being paid for grocery store items like boneless chicken breast, grapes, apples, peaches, lettuce etc, which would otherwise be prohibitively expensive even for the wealthy? ..."
Oct 15, 2017 | www.unz.com

Headliner of the week for immigration patriots was President Trump's immigration reform proposal, which he sent to Congress for their perusal last Sunday. The proposal is a very detailed 70-point list under three main headings:

Border Security (27 items)
Interior Enforcement (39 items)
Merit-Based Immigration System (four items)

Item-wise, the biggest heading there is the second one, "Interior Enforcement." That's very welcome.

Of course we need improved border security so that people don't enter our country without permission. That comes under the first heading. An equally pressing problem, though, is the millions of foreigners who are living and working here, and using our schools and hospitals and public services, who should not be here.

The President's proposals on interior enforcement cover all bases: sanctuary cities, visa overstays, law-enforcement resources, compulsory E-Verify, more deportations, improved visa security.

This is a major, wonderful improvement in national policy, when you consider that less than a year ago the White House and Justice Department were run by committed open-borders fanatics. I thank the President and his staff for having put so much work into such a detailed proposal for restoring American sovereignty and the rights of American workers and taxpayers.

That said, here come the quibbles.

That third heading, "Merit-Based Immigration System," with just four items, needs work. Setting aside improvements on visa controls under the other headings, this is really the only part of the proposal that covers legal immigration. In my opinion, it does so imperfectly.

There's some good meat in there, mind. Three of the four items -- numbers one, three, and four -- got a fist-pump from me:

cutting down chain migration by limiting it to spouse and dependent children;
eliminating the Diversity Visa Lottery; and
limiting the number of refugees admitted, assuming this means severely cutting back on the numbers, preferably all the way to zero.

Good stuff. Item two, however, is a problem. Quote:

Establish a new, points-based system for the awarding of Green Cards (lawful permanent residents) based on factors that allow individuals to successfully assimilate and support themselves financially.

That sounds OK: bringing in talented, well-educated, well-socialized people, rather than what the late Lee Kuan Yew referred to as "fruit-pickers." Forgive me if I have a rather jaundiced view of this merit-based approach.

For most of my adult life I made a living as a computer programmer. I spent four years doing this in the U.S.A. through the mid-1970s. Then I came back in the late 1980s and worked at the same trade here through the 1990s. (Pictured right: my actual H-1B visa.) That gave me two clear snapshots, twenty years apart, of this particular corner of skilled middle-class employment in America.

In the 1970s a programming shop was legacy American, with only a thin scattering of foreigners like myself. Twenty years later programming had been considerably foreignized , thanks to the H-1B visa program. Now, twenty years further on, I believe legacy-American programmers are an endangered species.

So a well-paid and mentally rewarding corner of the middle-class job market has been handed over to foreigners -- for the sole reason, of course, that they are cheaper than Americans. The desire for cheap labor explains 95 percent of U.S. immigration policy. The other five percent is sentimentality.

On so-called "merit-based immigration," therefore, you can count me a cynic. I have no doubt that American firms could recruit all the computer programmers they need from among our legacy population. They used to do so, forty years ago. Then they discovered how to game the immigration system for cheaper labor.

Now they are brazen in their crime: you have heard, I'm sure, those stories about American workers being laid off, with severance packages conditional on their helping train their cheaper foreign replacements. That's our legal immigration system in a nutshell. It's a cheap-labor racket.

A "merit-based" points system won't fix that. It will quickly and easily be gamed by employers to lay waste yet more middle-class occupational zones for Americans. If it was restricted to the higher levels of "merit," we would just be importing a professional overclass of foreigners, most East and South Asians, to direct the labors of less-meritorious legacy Americans. How would that contribute to social harmony?

With coming up to a third of a billion people, the U.S.A. has all the talent, all the merit, it needs. You might make a case for a handful of certified geniuses like Einstein or worthy dissidents like Solzhenitsyn, but those cases aside, there is no reason at all to have guest-worker programs. They should all be shut down.

Some of these cheap-labor rackets don't even need congressional action to shut them down; it can be done by regulatory change via executive order. The scandalous OPT-visa scam, for example, which brings in cheap workers under the guise of student visas.

Here is John Miano writing about the OPT program last month, quote:

Measured by the number of workers per year, the largest guestworker program in the entire immigration system is now student visas through the Optional Practical Training program (OPT). Last year over 154,000 aliens were approved to work on student visas. By comparison, 114,000 aliens entered the workforce on H-1B guestworker visas.

Because there is no reporting on how long guestworkers stay in the country, we do not know the total number of workers in each category. Nonetheless, the number of approvals for work on student visas has grown by 62 percent over the past four years so their numbers will soon dwarf those on H-1B visas.

The troubling fact is that the OPT program was created entirely through regulation with no authorization from Congress whatsoever. [ A History of the 'Optional Practical Training' Guestworker Program , CIS, September 18, 2017]

End quote. (And a cheery wave of acknowledgement to John Miano here from one of the other seventeen people in the U.S.A. that knows the correct placement of the hyphen in "H-1B.")

Our legal immigration system is addled with these scams. Don't even get me started on the EB-5 investor's visa. It all needs sweeping away.

So for preference I would rewrite that third heading to include, yes, items one, three, and four -- cutting down chain migration, ending the Diversity Visa Lottery, and ending refugee settlement for anyone of less stature than Solzhenitsyn; but then, I'd replace item two with the following:

End all guest-worker programs, with exceptions only for the highest levels of talent and accomplishment; limit one hundred visas per annum.

So much for my amendments to the President's October 8th proposals. There is, though, one glaring omission from that 70-item list. The proposal has no mention at all of birthright citizenship.

Most other advanced countries have abandoned it. It leads to obstetric tourism: women well-advanced in pregnancy come to the U.S.A. to give birth, knowing that the child will be a U.S. citizen. It is deeply unpopular with Americans, once it's explained to them.

Yes, yes, I know: some constitutional authorities argue that birthright citizenship is implied in the Fourteenth Amendment , although it is certain that the framers of that Amendment did not have foreign tourists or illegal entrants in mind. Other scholars think Congress could legislate against it.

The only way to find out is to have Congress legislate. If the courts strike down the legislation as unconstitutional, let's then frame a constitutional amendment and put it to the people.

Getting rid of birthright citizenship might end up a long and difficult process. We might ultimately fail. The only way to find out is to get the process started . Failure to mention this in the President's proposal is a very glaring omission.

Setting aside that, and the aforementioned reservations about working visas, I give two cheers to the proposal.

John Derbyshire writes an incredible amount on all sorts of subjects for all kinds of outlets. (This no longer includes National Review, whose editors had some kind of tantrum and fired him.) He is the author of We Are Doomed: Reclaiming Conservative Pessimism and several other books. He has had two books published by VDARE.com: FROM THE DISSIDENT RIGHT (also available in Kindle) and FROM THE DISSIDENT RIGHT II: ESSAYS 2013. (Republished from VDare.com by permission of author or representative)

SimpleHandle > > , October 14, 2017 at 2:56 am GMT

I agree with ending birthright citizenship. But Trump should wait until he can put at least one more strict constitutionalist on the Supreme Court. There will be a court challenge, and we need judges who understand that if the 14th Amendment didn't give automatic citizenship to American Indians, it doesn't give automatic citizenship to children of Mexican citizens who jumped our border.

Diversity Heretic > > , October 14, 2017 at 5:04 am GMT

@Carroll Price

Insofar as your personal situation is concerned, perhaps you would find yourself less "relatively poor" if you had a job with higher wages.

Diversity Heretic > > , October 14, 2017 at 5:16 am GMT

John's article, it seems to me, ignores the elephant in the room: the DACA colonists. Trump is offering this proposal, more or less, in return for some sort of semi-permanent regularization of their status. Bad trade, in my opinion. Ending DACA and sending those illegals back where they belong will have more real effect on illegal and legal immigration/colonization than all sorts of proposals to be implemented in the future, which can and will be changed by subsequent Administrations and Congresses.

Trump would also be able to drive a much harder bargain with Congress (like maybe a moratorium on any immigration) if he had kept his campaign promise, ended DACA the afternoon of January 20, 2017, and busloads of DACA colonists were being sent south of the Rio Grande.

The best hope for immigration patriots is that the Democrats are so wedded to Open Borders that the entire proposal dies and Trump, in disgust, reenacts Ike's Operation Wetback.

bartok > > , October 14, 2017 at 6:32 am GMT

@Carroll Price

Once all the undocumented workers who are doing all the dirty, nasty jobs Americans refuse to do are run out the country, then what?

White people couldn't possibly thrive without non-Whites! Why, without all of that ballast we'd ascend too near the sun.

Negrolphin Pool > > , October 14, 2017 at 7:53 am GMT

Well, in the real world, things just don't work that way. It's pay me now or pay me later. Once all the undocumented workers who are doing all the dirty, nasty jobs Americans refuse to do are run out the country, then what?

Right, prior to 1965, Americans didn't exist. They had all starved to death because, as everyone knows, no Americans will work to produce food and, even if they did, once Tyson chicken plants stop making 50 percent on capital they just shut down.

If there were no Somalis in Minnesota, even Warren Buffett couldn't afford grapes.

Joe Franklin > > , October 14, 2017 at 12:24 pm GMT

Illegal immigrants picking American produce is a false economy.

Illegal immigrants are subsidized by the taxpayer in terms of public health, education, housing, and welfare.

If businesses didn't have access to cheap and subsidized illegal alien labor, they would be compelled to resort to more farm automation to reduce cost.

Cheap illegal alien labor delays the inevitable use of newer farm automation technologies.

Many Americans would likely prefer a machine touch their food rather than an illegal alien with strange hygiene practices.

In addition, anti-American Democrats and neocons prefer certain kinds of illegal aliens because they bolster their diversity scheme.

Carroll Price > > , October 14, 2017 at 12:27 pm GMT

@Realist "Once all the undocumented workers who are doing all the dirty, nasty jobs Americans refuse to do are run out the country, then what?"

Eliminate welfare...then you'll have plenty of workers. Unfortunately, that train left the station long ago. With or without welfare, there's simply no way soft, spoiled, lazy, over-indulged Americans who have never hit a lick at anything in their life, will ever perform manual labor for anyone, including themselves.

Jonathan Mason > > , October 14, 2017 at 2:57 pm GMT

@Randal

The "jobs Americans/Brits/etc won't do" myth is a deliberate distortion of reality that ignores the laws of supply and demand. There are no jobs Americans etc won't do, only jobs for which the employers are not prepared to pay wages high enough to make them worthwhile for Americans etc to do.

Now of course it is more complicated than that. There are jobs that would not be economically viable if the required wages were to be paid, and there are marginal contributions to job creation by immigrant populations, but those aspects are in reality far less significant than the bosses seeking cheap labour want people to think they are.

As a broad summary, a situation in which labour is tight, jobs are easy to come by and staff hard to hold on to is infinitely better for the ordinary working people of any nation than one in which there is a huge pool of excess labour, and therefore wages are low and employees disposable.

You'd think anyone purporting to be on the "left", in the sense of supporting working class people would understand that basic reality, but far too many on the left have been indoctrinated in radical leftist anti-racist and internationalist dogmas that make them functional stooges for big business and its mass immigration program.

Probably people other than you have worked out that once their wages are not being continually undercut by cheap and easy immigrant competition, the American working classes will actually be able to earn enough to pay the increased prices for grocery store items, especially as the Americans who, along with machines, will replace those immigrants doing the "jobs Americans won't do" will also be earning more and actually paying taxes on it.

There might be some truth in this. When I was a student in England in the '60s I spent every summer working on farms, picking hops, apples, pears and potatoes; I made some money, had a lot of fun too, and became an expert farm tractor operator.

No reason why US students and high school seniors should not pick up a lot of the slack. Young people like camping in the countryside and sleeping rough, plus lots of opportunity to meet others, have sex, smoke weed, drink beer, or whatever. If you get a free vacation plus a nice check at the end, that makes the relatively low wages worthwhile. It is not always a question of how much you are paid, but how much you can save.

George Weinbaum > > , October 14, 2017 at 3:35 pm GMT

We can fix the EB-5 visa scam. My suggestion: charge would-be "investors" $1 million to enter the US. This $1 million is not refundable under any circumstance. It is paid when the "investor's" visa is approved. If the "investor" is convicted of a felony, he is deported. He may bring no one with him. No wife, no child, no aunt, no uncle. Unless he pays $1 million for that person.

We will get a few thousand Russian oligarchs and Saudi princes a year under this program.

As to fixing the H-1B visa program, we charge employer users of the program say $25,000 per year per employee. We require the employers to inform all employees that if any is asked to train a replacement, he should inform the DOJ immediately. The DOJ investigates and if true, charges managerial employees who asked that a replacement be trained with fraud.

As to birthright citizenship: I say make it a five-year felony to have a child while in the US illegally. Make it a condition of getting a tourist visa that one not be pregnant. If the tourist visa lasts say 60 days and the woman has a child while in the US, she gets charged with fraud.

None of these suggestions requires a constitutional amendment.

Auntie Analogue > > , October 14, 2017 at 7:10 pm GMT

In the United States middle class prosperity reached its apogee in 1965 – before the disastrous (and eminently foreseeable) wage-lowering consequence of the Hart-Celler Open Immigration Act's massive admission of foreigners increased the supply of labor which began to lower middle class prosperity and to shrink and eradicate the middle class.

It was in 1965 that ordinary Americans, enjoying maximum employment because employers were forced to compete for Americans' talents and labor, wielded their peak purchasing power . Since 1970 wages have remained stagnant, and since 1965 the purchasing power of ordinary Americans has gone into steep decline.

It is long past time to halt Perpetual Mass Immigration into the United States, to end birthright citizenship, and to deport all illegal aliens – if, that is, our leaders genuinely care about and represent us ordinary Americans instead of continuing their legislative, policy, and judicial enrichment of the 1-percenter campaign donor/rentier class of transnational Globali$t Open Border$ E$tabli$hment $ellout$.

Jim Sweeney > > , October 14, 2017 at 8:26 pm GMT

Re the birthright citizenship argument, that is not settled law in that SCOTUS has never ruled on the question of whether a child born in the US is thereby a citizen if the parents are illegally present. Way back in 1898, SCOTUS did resolve the issue of whether a child born to alien parents who were legally present was thereby a citizen. That case is U.S. v. Wong Kim Ark, 169 US 649. SCOTUS ruled in favor of citizenship. If that was a justiciable issue, how much more so is it when the parents are illegally present?

My thinking is that the result would be the same but, at least, the question would be settled. I cannot see justices returning a toddler to Beijing or worse. They would never have invitations to cocktail parties again for the shame heaped upon them for such uncaring conduct. Today, the title of citizen is conferred simply by bureaucratic rule, not by judicial order.

JP Straley > > , October 14, 2017 at 9:42 pm GMT

Arguments Against Fourteenth Amendment Anchor Baby Interpretation
J. Paige Straley

Part One. Anchor Baby Argument, Mexican Case.
The ruling part of the US Constitution is Amendment Fourteen: "All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside."

Here is the ruling part of the Mexican Constitution, Section II, Article Thirty:
Article 30
Mexican nationality is acquired by birth or by naturalization:
A. Mexicans by birth are:
I. Those born in the territory of the Republic, regardless of the nationality of
their parents:
II. Those born in a foreign country of Mexican parents; of a Mexican father and
a foreign mother; or of a Mexican mother and an unknown father;
III. Those born on Mexican vessels or airships, either war or merchant vessels. "

A baby born to Mexican nationals within the United States is automatically a Mexican citizen. Under the anchor baby reasoning, this baby acquires US citizenship at the same time and so is a dual citizen. Mexican citizenship is primary because it stems from a primary source: the parents' citizenship and the law of Mexico. The Mexican Constitution states the child of Mexican parents is automatically a Mexican citizen at birth no matter where the birth occurs. Since the child would be a Mexican citizen in any country, and becomes an American citizen only if born in America, it is clear that Mexico has the primary claim of citizenry on the child. This alone should be enough to satisfy the Fourteenth Amendment's "subject to the jurisdiction thereof" argument. Since Mexican citizenship is primary, it has primary jurisdiction; thus by the plain words of the Fourteenth such a child is not an American citizen at birth.

[MORE]
There is a second argument for primary Mexican citizenship in the case of anchor babies. Citizenship, whether Mexican or American, establishes rights and duties. Citizenship is a reciprocal relationship, thus establishing jurisdiction. This case for primary Mexican citizenship is supported by the fact that Mexico allows and encourages Mexicans resident in the US, either illegal aliens or legal residents, to vote in Mexican elections. They are counted as Mexican citizens abroad, even if dual citizens, and their government provides widespread consular services as well as voting access to Mexicans residing in the US. As far as Mexico is concerned, these persons are not Mexican in name only, but have a civil relationship strong enough to allow a political voice; in essence, full citizenship. Clearly, all this is the expression of typical reciprocal civic relationships expressed in legal citizenship, further supporting the establishment of jurisdiction.

Part Two: Wong Kim Ark (1898) case. (Birthright Citizenship)

The Wong Kim Ark (WKA) case is often cited as the essential legal reasoning and precedent for application of the Fourteenth Amendment as applied to aliens. There has been plenty of commentary on WKA, but the truly narrow application of the case is emphasized by reviewing a concise statement of the question the case was meant to decide, written by Hon. Horace Gray, Justice for the majority in this decision.

"[W]hether a child born in the United States, of parents of Chinese descent, who, at the time of his birth, are subjects of the Emperor of China, but have a permanent domicile and residence in the United States, and are there carrying on business, and are not employed in any diplomatic or official capacity under the Emperor of China, becomes at the time of his birth a citizen of the United States by virtue of the first clause of the Fourteenth Amendment of the Constitution." (Italics added.)

For WKA to justify birthright citizenship, the parents must have "permanent domicile and residence." But how can an illegal alien have permanent residence when the threat of deportation is constantly present? There is no statute of limitation for illegal presence in the US and the passage of time does not eliminate the legal remedy of deportation. This alone would seem to invalidate WKA as a support and precedent for illegal alien birthright citizenship.

If illegal (or legal) alien parents are unemployed, unemployable, illegally employed, or if they get their living by illegal means, then they are not "carrying on business," and so the children of indigent or criminal aliens may not be eligible for birthright citizenship.

If legal aliens meet the two tests provided in WKA, birthright citizenship applies. Clearly the WKA case addresses the specific situation of the children of legal aliens, and so is not an applicable precedent to justify birthright citizenship for the children of illegal aliens.

Part three. Birth Tourism

Occasionally foreign couples take a trip to the US during the last phase of the wife's pregnancy so she can give birth in the US, thus conferring birthright citizenship on the child. This practice is called "birth tourism." WKA provides two tests for birthright citizenship: permanent domicile and residence and doing business, and a temporary visit answers neither condition. WKA is therefore disqualified as justification for a "birth tourism" child to be granted birthright citizenship.

Realist > > , October 14, 2017 at 10:05 pm GMT

@Carroll Price Unfortunately, that train left the station long ago. With or without welfare, there's simply no way soft, spoiled, lazy, over-indulged Americans who have never hit a lick at anything in their life, will ever perform manual labor for anyone, including themselves.

Then let them starve to death. The Pilgrims nipped that dumb-ass idea (welfare) in the bud.

Alfa158 > > , October 15, 2017 at 2:10 am GMT

@Carroll Price

An equally pressing problem, though, is the millions of foreigners who are living and working here, and using our schools and hospitals and public services, who should not be here.
Has it ever occurred to anyone other than me that the cost associated with foreign workers using our schools and hospitals and public services for free, is more than offset by the cheap price being paid for grocery store items like boneless chicken breast, grapes, apples, peaches, lettuce etc, which would otherwise be prohibitively expensive even for the wealthy?

Let alone relatively poor people (like myself) and those on fixed incomes? What un-thinking Americans want, is having their cake and eating it too. Well, in the real world, things just don't work that way. It's pay me now or pay me later. Once all the undocumented workers who are doing all the dirty, nasty jobs Americans refuse to do are run out the country, then what?

Please look up: History; United States; pre-mid-twentieth century. I'm pretty sure Americans were eating chicken, grapes, apples, peaches, lettuce, etc. prior to that period. I don't think their diet consisted of venison and tree bark.
But since I wasn't there, maybe I'm wrong and that is actually what they were eating.
I know some people born in the 1920s; I'll check with them and let you know what they say.

[Oct 09, 2017] TMOUT - Auto Logout Linux Shell When There Isn't Any Activity by Aaron Kili

Oct 07, 2017 | www.tecmint.com
... ... ...

To enable automatic user logout, we will be using the TMOUT shell variable, which terminates a user's login shell when there has been no activity for a specified number of seconds.

To enable this globally (system-wide for all users), set the above variable in the /etc/profile shell initialization file.

[Oct 03, 2017] Timeshift: A System Restore Utility Tool Review - LinuxAndUbuntu

It looks like this is technologically a questionable approach, although the technical details are unclear. Rsync snapshots are better handled by other tools, and BTRFS is a niche filesystem.

TimeShift is a system restore tool for Linux. It provides functionality that is quite similar to the System Restore feature in Windows or the Time Machine tool in MacOS. TimeShift protects your system by making incremental snapshots of the file system manually or at regular automated intervals.

These snapshots can then be restored at a later point to undo all changes to the system and restore it to the previous state. Snapshots are made using rsync and hard links, and the tool shares common files amongst snapshots in order to save disk space. Now that we have an idea about what Timeshift is, let us take a detailed look at setting up and using this tool.

... ... ...

Timeshift supports 2 snapshot formats. The first is by using Rsync and the second is by using the in-built features of BTRFS file system that allows snapshots to be created. So you can select the BTRFS format if you are using that particular filesystem. Other than that, you have to choose the Rsync format.

[Oct 03, 2017] Silicon Valley companies have placed lowering wages and flooding the labor market with cheaper labor near the top of their goals and as a business model.

Notable quotes:
"... That's Silicon Valley's dirty secret. Most tech workers in Palo Alto make about as much as the high school teachers who teach their kids. And these are the top coders in the country! ..."
"... I don't see why more Americans would want to be coders. These companies want to drive down wages for workers here and then also ship jobs offshore... ..."
"... Silicon Valley companies have placed lowering wages and flooding the labor market with cheaper labor near the top of their goals and as a business model. ..."
"... There are quite a few highly qualified American software engineers who lose their jobs to foreign engineers who will work for much lower salaries and benefits. This is a major ingredient of the libertarian virus that has engulfed and contaminating the Valley, going hand to hand with assembling products in China by slave labor ..."
"... If you want a high tech executive to suffer a stroke, mention the words "labor unions". ..."
"... India isn't being hired for the quality, they're being hired for cheap labor. ..."
"... Enough people have had their hands burnt by now with shit companies like TCS (Tata) that they are starting to look closer to home again... ..."
"... Globalisation is the reason, and trying to force wages up in one country simply moves the jobs elsewhere. The only way I can think of to limit this happening is to keep the company and coders working at the cutting edge of technology. ..."
"... I'd be much more impressed if I saw that the hordes of young male engineers here in SF expressing a semblance of basic common sense, basic self awareness and basic life skills. I'd say 91.3% are oblivious, idiotic children. ..."
"... Not maybe. Too late. American corporations objective is to low ball wages here in US. In India they spoon feed these pupils with affordable cutting edge IT training for next to nothing ruppees. These pupils then exaggerate their CVs and ship them out en mass to the western world to dominate the IT industry. I've seen it with my own eyes in action. Those in charge will anything/everything to maintain their grip on power. No brag. Just fact. ..."
Oct 02, 2017 | profile.theguardian.com
Terryl Dorian , 21 Sep 2017 13:26
That's Silicon Valley's dirty secret. Most tech workers in Palo Alto make about as much as the high school teachers who teach their kids. And these are the top coders in the country!
Ray D Wright -> RogTheDodge , , 21 Sep 2017 14:52
I don't see why more Americans would want to be coders. These companies want to drive down wages for workers here and then also ship jobs offshore...
Richard Livingstone -> KatieL , , 21 Sep 2017 14:50
+++1 to all of that.

Automated coding just pushes the level of coding further up the development food chain, rather than gets rid of it. It is the wrong approach for current tech. AI that is smart enough to model new problems and create their own descriptive and runnable language - hopefully after my lifetime but coming sometime.

Arne Babenhauserheide -> Evelita , , 21 Sep 2017 14:48
What coding does not teach is how to improve our non-code infrastructure and how to keep it running (that's the stuff which actually moves things). Code can optimize stuff, but it needs actual actuators to affect reality.

Sometimes these actuators are actual people walking on top of a roof while fixing it.

WyntonK , 21 Sep 2017 14:47
Silicon Valley companies have placed lowering wages and flooding the labor market with cheaper labor near the top of their goals and as a business model.

There are quite a few highly qualified American software engineers who lose their jobs to foreign engineers who will work for much lower salaries and benefits. This is a major ingredient of the libertarian virus that has engulfed and contaminating the Valley, going hand to hand with assembling products in China by slave labor .

If you want a high tech executive to suffer a stroke, mention the words "labor unions".

TheEgg -> UncommonTruthiness , , 21 Sep 2017 14:43

The ship has sailed on this activity as a career.

Nope. Married to a highly-technical skillset, you can still make big bucks. I say this as someone involved in this kind of thing academically and our Masters grads have to beat the banks and fintech companies away with dog shits on sticks. You're right that you can teach anyone to potter around and throw up a webpage but at the prohibitively difficult maths-y end of the scale, someone suitably qualified will never want for a job.

Mike_Dexter -> Evelita , , 21 Sep 2017 14:43
In a similar vein, if you accept the argument that it does drive down wages, wouldn't the culprit actually be the multitudes of online and offline courses and tutorials available to an existing workforce?
Terryl Dorian -> CountDooku , , 21 Sep 2017 14:42
Funny you should pick medicine, law, engineering... 3 fields that are *not* taught in high school. The writer is simply adding "coding" to your list. So it seems you agree with his "garbage" argument after all.
anticapitalist -> RogTheDodge , , 21 Sep 2017 14:42
Key word is "good". Teaching everyone is just going to increase the pool of programmers whose code I need to fix. India isn't being hired for the quality, they're being hired for cheap labor. As for women, sure, I wouldn't mind more women around, but why does no one say there needs to be more equality in garbage collection or plumbing? (And yes, plumbing is a highly paid profession).

In the end I don't care what the person is, I just want to hire and work with the best and not someone I have to correct their work because they were hired by quota. If women only graduate at 15% why should IT contain more than that? And let's be a bit honest with the facts, of those 15% how many spend their high school years staying up all night hacking? Very few. Now the few that did are some of the better developers I work with but that pool isn't going to increase by forcing every child to program... just like sports aren't better by making everyone take gym class.

WithoutPurpose , 21 Sep 2017 14:42
I ran a development team for 10 years and I never had any trouble hiring programmers - we just had to pay them enough. Every job would have at least 10 good applicants.

Two years ago I decided to scale back a bit and go into programming (I can code real-time low latency financial apps in 4 languages) and I had four interviews in six months with stupidly low salaries. I'm lucky in that I can bounce between tech and the business side so I got a decent job out of tech.

My entirely anecdotal conclusion is that there is no shortage of good programmers just a shortage of companies willing to pay them.

oddbubble -> Tori Turner , , 21 Sep 2017 14:41
I've worn many hats so far. I started out as a sysadmin, then I moved on to web development, then back end, and now I'm doing test automation because I am on almost the same money for half the effort.
peter nelson -> raffine , , 21 Sep 2017 14:38
But the concepts won't. Good programming requires the ability to break down a task, organise the steps in performing it, identify parts of the process that are common or repetitive so they can be bundled together, handed-off or delegated, etc.

These concepts can be applied to any programming language, and indeed to many non-software activities.

Oliver Jones -> Trumbledon , , 21 Sep 2017 14:37
In the city maybe with a financial background, the exception.
anticapitalist -> Ethan Hawkins , 21 Sep 2017 14:32
Well, to his point, sort of... either everything will go PHP or all those entry-level PHP developers will be on the street. A good Java or C developer is hard to come by. And to the others: being a developer, especially a good one, is nothing like reading and writing. The industry is already saturated with poor coders just doing it for a paycheck.
peter nelson -> Tori Turner , 21 Sep 2017 14:31
I'm just going to say this once: not everyone with a computer science degree is a coder.

And vice versa. I'm retiring from a 40-year career as a software engineer. Some of the best software engineers I ever met did not have CS degrees.

KatieL -> Mishal Almohaimeed , 21 Sep 2017 14:30
"already developing automated coding scripts. "

Pretty much the entire history of the software industry since FORAST was developed for the ORDVAC has been about desperately trying to make software development in some way possible without driving everyone bonkers.

The gulf between FORAST and today's IDE-written, type-inferring high level languages, compilers, abstracted run-time environments, hypervisors, multi-computer architectures and general tech-world flavour-of-2017-ness is truly immense[1].

And yet software is still fucking hard to write. There's no sign it's getting easier despite all that work.

Automated coding was promised as the solution in the 1980s as well. In fact, somewhere in my archives, I've got paper journals which include adverts for automated systems that would make programmers completely redundant by writing all your database code for you. These days, we'd think of those tools as automated ORM generators and they don't fix the problem; they just make a new one -- ORM impedance mismatch -- which needs more engineering on top to fix...

The tools don't change the need for the humans, they just change what's possible for the humans to do.

[1] FORAST executed in about 20,000 bytes of memory without even an OS. The compile artifacts for the map-reduce system I built today are an astonishing hundred million bytes... and don't include the necessary mapreduce environment, management interface, node operating system and distributed filesystem...

raffine , 21 Sep 2017 14:29
Whatever they are taught today will be obsolete tomorrow.
yannick95 -> savingUK , , 21 Sep 2017 14:27
"There are already top quality coders in China and India"

AHAHAHAHAHAHAHAHAHAHAHA *rolls on the floor laughing* Yes........ 1%... and 99% of incredibly bad, incompetent, untalented ones that cost 50% of a good developer but produce only 5% in comparison. And I'm talking with a LOT of practical experience through more than a dozen corporations all over the world which have been outsourcing to India... all have been disasters for the companies (but good for the execs who pocketed big bonuses and left the company before the disaster blew up in their faces)

Wiretrip -> mcharts , , 21 Sep 2017 14:25
Enough people have had their hands burnt by now with shit companies like TCS (Tata) that they are starting to look closer to home again...
TomRoche , 21 Sep 2017 14:11

Tech executives have pursued [the goal of suppressing workers' compensation] in a variety of ways. One is collusion – companies conspiring to prevent their employees from earning more by switching jobs. The prevalence of this practice in Silicon Valley triggered a justice department antitrust complaint in 2010, along with a class action suit that culminated in a $415m settlement.

Folks interested in the story of the Techtopus (less drily presented than in the links in this article) should check out Mark Ames' reporting, esp this overview article and this focus on the egregious Steve Jobs (whose canonization by the US corporate-funded media is just one more impeachment of their moral bankruptcy).

Another, more sophisticated method is importing large numbers of skilled guest workers from other countries through the H-1B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status.

Folks interested in H-1B and US technical visas more generally should head to Norm Matloff's summary page, and then to his blog on the subject.

Olympus68 , 21 Sep 2017 13:49

I have watched as schools run by trade unions have done the opposite for 5 decades. By limiting the number of graduates, they were able to help maintain living wages and benefits. This has been stopped in my area due to the pressure of owner-run "trade associations".

During that same time period I have witnessed trade associations controlled by company owners, while publicising their support of the average employee, invest enormous amounts of membership fees in creating alliances with public institutions. Their goal has been that of flooding the labor market and thus keeping wages low. A double hit for the average worker because membership fees were paid by employees as well as those in control.

And so it goes....

savingUK , 21 Sep 2017 13:38
Coding jobs are just as susceptible to being moved to lower-cost areas of the world as hardware jobs already have been. It's already happening. There are already top-quality coders in China and India. There is a much larger pool to choose from, and they are just as good as their western counterparts and work harder for much less money.

Globalisation is the reason, and trying to force wages up in one country simply moves the jobs elsewhere. The only way I can think of to limit this happening is to keep the company and coders working at the cutting edge of technology.

whitehawk66 , 21 Sep 2017 15:18

I'd be much more impressed if the hordes of young male engineers here in SF expressed a semblance of basic common sense, basic self-awareness and basic life skills. I'd say 91.3% are oblivious, idiotic children.

They would definitely not survive the zombie apocalypse.

P.S. not every kid wants or needs to have their soul sucked out of them sitting in front of a screen full of code for some idiotic service that some other douchebro thinks is the next iteration of sliced bread.

UncommonTruthiness , 21 Sep 2017 14:10
The demonization of Silicon Valley is clearly the next place to put all blame. Look what "they" did to us: computers, smart phones, HD television, world-wide internet, on and on. Get a rope!

I moved there in 1978 and watched the orchards and trailer parks on North 1st St. of San Jose transform into a concrete jungle. There used to be quite a bit of semiconductor equipment and device manufacturing in SV during the 80s and 90s. Now quite a few buildings have the same name : AVAILABLE. Most equipment and device manufacturing has moved to Asia.

Programming started with binary, then machine code (hexadecimal or octal) and moved to assembler as a compiled and linked structure. More compiled languages like FORTRAN, BASIC, PL-1, COBOL, PASCAL, C (and all its "+'s") followed, making programming easier for the less talented. Now the script-based languages (HTML, Java, etc.) are even higher level and accessible to nearly all. Programming has become a commodity and will be priced like milk, wheat, corn, non-unionized workers and the like. The ship has sailed on this activity as a career.

William Fitch III , 21 Sep 2017 13:52
Hi: As I have said many times before, there is no shortage of people who fully understand the problem and can see all the connections.

However, they all fall on their faces when it comes to the solution. To cut to the chase, Concentrated Wealth needs to go, permanently. Of course the challenge is how to best accomplish this.....

.....Bill

MostlyHarmlessD , , 21 Sep 2017 13:16

Damn engineers and their black and white world view, if they weren't so inept they would've unionized instead of being trampled again and again in the name of capitalism.
mcharts -> Aldous0rwell , , 21 Sep 2017 13:07
Not maybe. Too late. American corporations' objective is to lowball wages here in the US. In India they spoon-feed these pupils with affordable cutting-edge IT training for next to nothing in rupees. These pupils then exaggerate their CVs and ship out en masse to the western world to dominate the IT industry. I've seen it with my own eyes in action. Those in charge will do anything and everything to maintain their grip on power. No brag. Just fact.

Woe to our children and grandchildren.

Where's Bernie Sanders when we need him.

[Oct 03, 2017] The dream of coding automation remains elusive... Very elusive...

Oct 03, 2017 | discussion.theguardian.com

Richard Livingstone -> Mishal Almohaimeed , 21 Sep 2017 14:46

Wrong again; that approach has been tried since the 80s and will keep failing, only because software development is still more akin to a technical craft than an engineering discipline. The number of elements required to assemble a working non-trivial system is way beyond scriptable.
freeandfair -> Taylor Dotson , 21 Sep 2017 14:26
> That's some crystal ball you have there. English teachers will need to know how to code? Same with plumbers? Same with janitors, CEOs, and anyone working in the service industry?

You don't believe there will be robots to do plumbing and cleaning? The cleaner's job will be to program robots to do what they need.
CEOs? Absolutely.

English teachers? Both of my kids have school laptops and everything is being done on the computers. The teachers use software and create websites and what not. Yes, even English teachers.

Not knowing / understanding how to code will be the same as not knowing how to use Word/ Excel. I am assuming there are people who don't, but I don't know any above the age of 6.

Wiretrip -> Mishal Almohaimeed , 21 Sep 2017 14:20
We've had 'automated coding scripts' for years for small tasks. However, anyone who says they're going to obviate programmers, analysts and designers doesn't understand the software development process.
Ethan Hawkins -> David McCaul , 21 Sep 2017 13:22
Even if expert systems (an 80's concept, BTW) could code, we'd still have a huge need for managers. The hard part of software isn't even the coding. It's determining the requirements and working with clients. It will require general intelligence to do 90% of what we do right now. The 10% we could automate right now, mostly gets in the way. I agree it will change, but it's going to take another 20-30 years to really happen.
Mishal Almohaimeed -> PolydentateBrigand , , 21 Sep 2017 13:17
Wrong, software companies are already developing automated coding scripts. You'll get a bunch of door-to-door knife salespeople once the dust settles; that's what you'll get.
freeandfair -> rgilyead , , 21 Sep 2017 14:22
> In 20 years time AI will be doing the coding

Possible, but you still have to understand how AI operates and what it can and cannot do.

[Oct 03, 2017] Coding and carpentry are not so distant, are they ?

The views of the user "imipak" below are pretty common misconceptions. They are all wrong.
Notable quotes:
"... I was about to take offence on behalf of programmers, but then I realized that would be snobbish and insulting to carpenters too. Many people can code, but only a few can code well, and fewer still become the masters of the profession. Many people can learn carpentry, but few become joiners, and fewer still become cabinetmakers. ..."
"... Many people can write, but few become journalists, and fewer still become real authors. ..."
Oct 03, 2017 | discussion.theguardian.com

imipak, 21 Sep 2017 15:13

Coding has little or nothing to do with Silicon Valley. They may or may not have ulterior motives, but ultimately they are nothing in the scheme of things.

I disagree with teaching coding as a discrete subject. I think it should be combined with home economics and woodworking because 90% of these subjects consist of transferable skills that exist in all of them. Only a tiny residual is actually topic-specific.

In the case of coding, the residual consists of drawing skills and typing skills. Programming language skills? Irrelevant. You should choose the tools to fit the problem. Neither of these needs a computer. You should only ever approach the computer at the very end, after you've designed and written the program.

Is cooking so very different? Do you decide on the ingredients before or after you start? Do you go shopping half-way through cooking an omelette?

With woodwork, do you measure first or cut first? Do you have a plan or do you randomly assemble bits until it does something useful?

Real coding, taught correctly, is barely taught at all. You teach the transferable skills. ONCE. You then apply those skills in each area in which they apply.

What other transferable skills apply? Top-down design, bottom-up implementation. The correct methodology in all forms of engineering. Proper testing strategies, also common across all forms of engineering. However, since these tests are against logic, they're a test of reasoning. A good thing to have in the sciences and philosophy.

Technical writing is the art of explaining things to idiots. Whether you're designing a board game, explaining what you like about a house, writing a travelogue or just seeing if your wild ideas hold water, you need to be able to put those ideas down on paper in a way that exposes all the inconsistencies and errors. It doesn't take much to clean it up to be readable by humans. But once it is cleaned up, it'll remain free of errors.

So I would teach a foundation course that teaches top-down reasoning, bottom-up design, flowcharts, critical path analysis and symbolic logic. Probably aimed at age 7. But I'd not do so wholly in the abstract. I'd have it thoroughly mixed in with one field, probably cooking as most kids do that and it lacks stigma at that age.

I'd then build courses on various crafts and engineering subjects on top of that, building further hierarchies where possible. Eliminate duplication and severely reduce the fictions we call disciplines.
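The "top-down design, bottom-up implementation" approach the commenter advocates can be sketched in a few lines of Python. The trip-planning example here is my own illustration, not one from the thread: the top-level function names the high-level steps first, and each step is then implemented (and testable) as a small unit on its own.

```python
# Top-down design: the high-level plan names its steps before any
# of them exist. Bottom-up implementation: each step is then built
# and tested as an independent small function.

def plan_trip(distance_km, speed_kmh, fuel_per_100km):
    """Top-level design: travel time plus fuel needed for a trip."""
    return (travel_hours(distance_km, speed_kmh),
            fuel_litres(distance_km, fuel_per_100km))

def travel_hours(distance_km, speed_kmh):
    # Simple constant-speed model.
    return distance_km / speed_kmh

def fuel_litres(distance_km, per_100km):
    # Consumption quoted per 100 km, scaled to the distance.
    return distance_km * per_100km / 100

print(plan_trip(300, 100, 8))  # → (3.0, 24.0)
```

The transferable part is the decomposition itself, which is exactly the point being made: the same plan-then-build habit applies whether the "functions" are code, recipe steps or woodworking operations.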

oldzealand, 21 Sep 2017 14:58
I used to employ 200 computer scientists in my business and now teach children so I'm apparently as guilty as hell. To be compared with a carpenter is, however, a true compliment, if you mean those that create elegant, aesthetically-pleasing, functional, adaptable and long-lasting bespoke furniture, because our crafts of problem-solving using limited resources in confined environments to create working, life-improving artifacts both exemplify great human ingenuity in action. Capitalism or no.
peter nelson, 21 Sep 2017 14:29
"But coding is not magic. It is a technical skill, akin to carpentry."

But some people do it much better than others. Just like journalism. This article is complete nonsense, as I discuss in another comment. The author might want to consider a career in carpentry.

Fanastril, 21 Sep 2017 14:13
"But coding is not magic. It is a technical skill, akin to carpentry."

It is a way of thinking. Perhaps carpentry is too, but the arrogance of the above statement shows a soul who is done thinking.

NDReader, 21 Sep 2017 14:12
"But coding is not magic. It is a technical skill, akin to carpentry."

I was about to take offence on behalf of programmers, but then I realized that would be snobbish and insulting to carpenters too. Many people can code, but only a few can code well, and fewer still become the masters of the profession. Many people can learn carpentry, but few become joiners, and fewer still become cabinetmakers.

Many people can write, but few become journalists, and fewer still become real authors.

MostlyHarmlessD, 21 Sep 2017 13:08
A carpenter!? Good to know that engineers are still thought of as jumped up tradesmen.

[Oct 02, 2017] Programming vs coding

This idiotic US term "coder" is complete baloney.
Notable quotes:
"... You can learn to code, but that doesn't mean you'll be good at it. There will be a few who excel but most will not. This isn't a reflection on them but rather the reality of the situation. In any given area some will do poorly, more will do fairly, and a few will excel. The same applies in any field. ..."
"... Oh no, there's loads of people who say they're coders, who have on their CV that they're coders, that have been paid to be coders. Loads of them. Amazingly, about 9 out of 10 of them, experienced coders all, spent ages doing it, not a problem to do it, definitely a coder, not a problem being "hands on"... can't actually write working code when we actually ask them to. ..."
"... I feel for your brother, and I've experienced the exact same BS "test" that you're describing. However, when I said "rudimentary coding exam", I wasn't talking about classic fiz-buz questions, Fibonacci problems, whiteboard tests, or anything of the sort. We simply ask people to write a small amount of code that will solve a simple real world problem. Something that they would be asked to do if they got hired. We let them take a long time to do it. We let them use Google to look things up if they need. You would be shocked how many "qualified applicants" can't do it. ..."
"... "...coding is not magic. It is a technical skill, akin to carpentry. " I think that is a severe underestimation of the level of expertise required to conceptualise and deliver robust and maintainable code. The complexity of integrating software is more equivalent to constructing an entire building with components of different materials. If you think teaching coding is enough to enable software design and delivery then good luck. ..."
"... Being able to write code and being able to program are two very different skills. In language terms its the difference between being able to read and write (say) English and being able to write literature; obviously you need a grasp of the language to write literature but just knowing the language is not the same as being able to assemble and marshal thought into a coherent pattern prior to setting it down. ..."
"... What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra. ..."
"... Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it. ..."
"... A lot of resumes come across my desk that look qualified on paper, but that's not the same thing as being able to do the job. Secondarily, while I agree that one day our field might be replaced by automation, there's a level of creativity involved with good software engineering that makes your carpenter comparison a bit flawed. ..."
Oct 02, 2017 | profile.theguardian.com
Wiretrip -> Mark Mauvais , 21 Sep 2017 14:23
Yes, 'engineers' (and particularly mathematicians) write appalling code.
Trumbledon , 21 Sep 2017 14:23
A good developer can easily earn £600-800 per day, which suggests to me that they are in high demand, and society needs more of them.
Wiretrip -> KatieL , 21 Sep 2017 14:22
Agreed, to many people 'coding' consists of copying other people's JavaScript snippets from StackOverflow... I tire of the many frauds in the business...
stratplaya , 21 Sep 2017 14:21
You can learn to code, but that doesn't mean you'll be good at it. There will be a few who excel but most will not. This isn't a reflection on them but rather the reality of the situation. In any given area some will do poorly, more will do fairly, and a few will excel. The same applies in any field.
peter nelson -> UncommonTruthiness , 21 Sep 2017 14:21

The ship has sailed on this activity as a career.

Oh, rubbish. I'm in the process of retiring from my job as an Android software designer so I'm tasked with hiring a replacement for my organisation. It pays extremely well, the work is interesting, and the company is successful and serves an important worldwide industry.

Still, finding highly-qualified people is hard and they get snatched up in mid-interview because the demand is high. Not only that but at these pay scales, we can pretty much expect the Guardian will do yet another article about the unconscionable gap between what rich, privileged techies like software engineers make and everyone else.

Really, we're damned if we do and damned if we don't. If tech workers are well-paid we're castigated for gentrifying neighbourhoods and living large, and yet anything that threatens to lower what we're paid produces conspiracy-theory articles like this one.

Fanastril -> Taylor Dotson , 21 Sep 2017 14:17
I learned to cook in school. Was there a shortage of cooks? No. Did I become a professional cook? No. but I sure as hell would not have missed the skills I learned for the world, and I use them every day.
KatieL -> Taylor Dotson , 21 Sep 2017 14:13
Oh no, there's loads of people who say they're coders, who have on their CV that they're coders, that have been paid to be coders. Loads of them. Amazingly, about 9 out of 10 of them, experienced coders all, spent ages doing it, not a problem to do it, definitely a coder, not a problem being "hands on"... can't actually write working code when we actually ask them to.
youngsteveo -> Taylor Dotson , 21 Sep 2017 14:12
I feel for your brother, and I've experienced the exact same BS "test" that you're describing. However, when I said "rudimentary coding exam", I wasn't talking about classic fiz-buz questions, Fibonacci problems, whiteboard tests, or anything of the sort. We simply ask people to write a small amount of code that will solve a simple real world problem. Something that they would be asked to do if they got hired. We let them take a long time to do it. We let them use Google to look things up if they need. You would be shocked how many "qualified applicants" can't do it.
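For readers unfamiliar with what such a screening exercise looks like: the following is a hypothetical task of the kind described above (the problem and all names in it are my illustration, not one actually used by the commenter's company) — a small, Google-allowed, real-world-flavoured problem rather than a whiteboard puzzle.

```python
# Hypothetical screening task: "Given a list of (user, amount)
# purchase records, return the total spent per user, highest
# spender first."

from collections import defaultdict

def totals_by_user(records):
    """Sum purchase amounts per user, sorted by total descending."""
    totals = defaultdict(float)
    for user, amount in records:
        totals[user] += amount
    # Sort by total (descending), then by name for a stable order.
    return sorted(totals.items(), key=lambda kv: (-kv[1], kv[0]))

records = [("ann", 12.5), ("bob", 3.0), ("ann", 7.5), ("cat", 9.0)]
print(totals_by_user(records))
# → [('ann', 20.0), ('cat', 9.0), ('bob', 3.0)]
```

A dozen lines, no tricks — which is precisely why, as the commenters report, it is so revealing when an "experienced" applicant cannot produce working code for it.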
Fanastril -> Taylor Dotson , 21 Sep 2017 14:11
It is not zero-sum: If you teach something empowering, like programming, motivating is a lot easier, and they will learn more.

KatieL -> Taylor Dotson , 21 Sep 2017 14:10
"intelligence, creativity, diligence, communication ability, or anything else that a job"

None of those are any use if, when asked to turn your intelligent, creative, diligent, communicated idea into some software, you perform as well as most candidates do at simple coding assessments... and write stuff that doesn't work.

peter nelson , 21 Sep 2017 14:09

At its root, the campaign for code education isn't about giving the next generation a shot at earning the salary of a Facebook engineer. It's about ensuring those salaries no longer exist, by creating a source of cheap labor for the tech industry.

Of course the writer does not offer the slightest shred of evidence to support the idea that this is the actual goal of these programs. So it appears that the tinfoil-hat conspiracy brigade on the Guardian is operating not only below the line, but above it, too.

The fact is that few of these students will ever become software engineers (which, incidentally, is my profession) but programming skills are essential in many professions for writing little scripts to automate various tasks, or to just understand 21st century technology.

kcrane , 21 Sep 2017 14:07
Sadly this is another article by a partial journalist who knows nothing about the software industry but hopes to subvert what he has read somewhere to support a position he had already assumed. As others have said, understanding coding has already become akin to being able to use a pencil. It is a basic requirement of many higher-level roles.

But knowing which end of a pencil to put on the paper (the equivalent of the level of coding taught in schools) isn't the same as being an artist. Moreover, anyone who knows the field recognises that top coders are gifted; they embody genius. There are coding Caravaggios out there, but few have the experience to know that. No amount of teaching will produce high-level coders from average humans; there is an intangible something needed, as there is in music and art, to elevate the merely good to genius.

All to say, however many are taught the basics, it won't push down the value of the most talented coders, and so won't reduce the costs of the technology industry in any meaningful way as it is an industry, like art, that relies on the few not the many.

DebuggingLife , 21 Sep 2017 14:06
Not all of those children will want to become programmers but at least the barrier to entry, - for more to at least experience it - will be lower.

Teaching music to only the children whose parents can afford music tuition means than society misses out on a greater potential for some incredible gifted musicians to shine through.

Moreover, learning to code really means learning how to wrangle with the practical application of abstract concepts: algorithms, numerical skills, logic, reasoning, etc., which are all transferable skills, some of which are not in the scope of other classes, certainly not practically.
Like music, sport, literature etc., programming a computer, a website, a device, a smartphone is an endeavour that can be truly rewarding as merely a pastime, and similarly is limited only by one's imagination.

rgilyead , 21 Sep 2017 14:01
"...coding is not magic. It is a technical skill, akin to carpentry. " I think that is a severe underestimation of the level of expertise required to conceptualise and deliver robust and maintainable code. The complexity of integrating software is more equivalent to constructing an entire building with components of different materials. If you think teaching coding is enough to enable software design and delivery then good luck.
Taylor Dotson -> cwblackwell , 21 Sep 2017 14:00
Yeah, but mania over coding skills inevitably pushes other skills out of the curriculum (or deemphasizes them). Education is zero-sum in that there's only so much time and energy to devote to it. Hence, you need more than vague appeals to "enhancement," especially given the risks pointed out by the author.
Taylor Dotson -> PolydentateBrigand , 21 Sep 2017 13:57
"Talented coders will start new tech businesses and create more jobs."

That could be argued for any skill set, including those found in the humanities and social sciences likely to pushed out by the mania over coding ability. Education is zero-sum: Time spent on one subject is time that invariably can't be spent learning something else.

Taylor Dotson -> WumpieJr , 21 Sep 2017 13:49
"If they can't literally fix everything let's just get rid of them, right?"

That's a strawman. His point is rooted in the recognition that we only have so much time, energy, and money to invest in solutions. Ones that feel good but may not do anything distract us from the deeper structural issues in our economy. The problem with thinking "education" will fix everything is that it leaves the status quo unquestioned.

martinusher , 21 Sep 2017 13:31
Being able to write code and being able to program are two very different skills. In language terms it's the difference between being able to read and write (say) English and being able to write literature; obviously you need a grasp of the language to write literature, but just knowing the language is not the same as being able to assemble and marshal thought into a coherent pattern prior to setting it down.

To confuse things further there's various levels of skill that all look the same to the untutored eye. Suppose you wished to bridge a waterway. If that waterway was a narrow ditch then you could just throw a plank across. As the distance to be spanned got larger and larger eventually you'd have to abandon intuition for engineering and experience. Exactly the same issues happen with software but they're less tangible; anyone can build a small program but a complex system requires a lot of other knowledge (in my field, that's engineering knowledge -- coding is almost an afterthought).

It's a good idea to teach young people to code, but I wouldn't raise their expectations of huge salaries too much. For children, educating them in wider, more general fields and abstract activities such as music will pay off huge dividends, far more than just teaching them whatever the fashionable language du jour is. (...which should be Logo, but it's too subtle and abstract; it doesn't look "real world" enough!)

freeandfair , 21 Sep 2017 13:30
I don't see this as an issue. Sure, there could be ulterior motives there, but anyone who wants to still be employed in 20 years has to know how to code. It is not that everyone will be a coder, but their jobs will either include part-time coding or will require understanding of software and what it can and cannot do. AI is going to be everywhere.
WumpieJr , 21 Sep 2017 13:23
What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra.

But it isn't just about coding for Tarnoff. He seems to hold education in contempt generally. "The far-fetched premise of neoliberal school reform is that education can mend our disintegrating social fabric." If they can't literally fix everything let's just get rid of them, right?

Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it.

youngsteveo , 21 Sep 2017 13:16
I'm not going to argue that the goal of mass education isn't to drive down wages, but the idea that the skills gap is a myth doesn't hold water in my experience. I'm a software engineer and manager at a company that pays well over the national average, with great benefits, and it is downright difficult to find a qualified applicant who can pass a rudimentary coding exam.

A lot of resumes come across my desk that look qualified on paper, but that's not the same thing as being able to do the job. Secondarily, while I agree that one day our field might be replaced by automation, there's a level of creativity involved with good software engineering that makes your carpenter comparison a bit flawed.

[Oct 02, 2017] Does programming provide a new path to the middle class? Probably not any longer, unless you are really talented. In the latter case it is not that different from any other field, but the pressure from H-1B makes it harder for programmers. The neoliberal USA has a real problem with social mobility

Notable quotes:
"... I do think it's peculiar that Silicon Valley requires so many H1B visas... 'we can't find the talent here' is the main excuse ..."
"... This is interesting. Indeed, I do think there is excess supply of software programmers. ..."
"... Well, it is either that or the kids themselves who have to pay for it and they are even less prepared to do so. Ideally, college education should be tax payer paid but this is not the case in the US. And the employer ideally should pay for the job related training, but again, it is not the case in the US. ..."
"... Plenty of people care about the arts but people can't survive on what the arts pay. That was pretty much the case all through human history. ..."
"... I was laid off at your age in the depths of the recent recession and I got a job. ..."
"... The great thing about software , as opposed to many other jobs, is that it can be done at home which you're laid off. Write mobile (IOS or Android) apps or work on open source projects and get stuff up on github. I've been to many job interviews with my apps loaded on mobile devices so I could show them what I've done. ..."
"... Schools really can't win. Don't teach coding, and you're raising a generation of button-pushers. Teach it, and you're pandering to employers looking for cheap labour. Unions in London objected to children being taught carpentry in the twenties and thirties, so it had to be renamed "manual instruction" to get round it. Denying children useful skills is indefensible. ..."
Oct 02, 2017 | discussion.theguardian.com
swelle , 21 Sep 2017 17:36
I do think it's peculiar that Silicon Valley requires so many H1B visas... 'we can't find the talent here' is the main excuse, though many 'older' (read: over 40) native-born tech workers will tell you there's plenty of talent here already, but even with the immigration hassles, H1B workers will be cheaper overall...

Julian Williams , 21 Sep 2017 18:06

This is interesting. Indeed, I do think there is excess supply of software programmers. There is only a modest number of decent jobs, say as an algorithms developer in finance, general architecture of complex systems or to some extent in systems security. However, these jobs are usually occupied and the incumbents are not likely to move on quickly. Road blocks are also put up by creating sub networks of engineers who ensure that some knowledge is not ubiquitous.

Most very high paying jobs in the technology sector are in the same standard upper management roles as in every other industry.

Still, the ability to write a computer program is an enabler; knowing how it works means you have an ability to imagine something and make it real. To me it is a bit like language: some people can use language to make more money than others, but it is still important to have a basic level of understanding.

FabBlondie -> peter nelson , 21 Sep 2017 17:42
And yet I know a lot of people that has happened to. Better to replace a $125K-a-year programmer with one who will do the same job, or even less, for $50K.

JMColwill , 21 Sep 2017 18:17

This could backfire if the programmers don't find the work or pay to match their expectations... Programmers, after all, tend to make very good hackers if their minds are turned to it.

freeandfair -> FabBlondie , 21 Sep 2017 18:23

> While I like your idea of what designing a computer program involves, in my nearly 40 years experience as a programmer I have rarely seen this done.

Well, I am a software architect and what he says sounds correct for a certain type of applications. Maybe you do a different type of programming.

peter nelson -> FabBlondie , 21 Sep 2017 18:23

While I like your idea of what designing a computer program involves, in my nearly 40 years experience as a programmer I have rarely seen this done.

How else can you do it?

Java is popular because it's a very versatile language - On this list it's the most popular general-purpose programming language. (Above it javascript is just a scripting language and HTML/CSS aren't even programming languages) https://fossbytes.com/most-used-popular-programming-languages/ ... and below it you have to go down to C# at 20% to come to another general-purpose language, and even that's a Microsoft house language.

Also the "correct" choice of programming languages is also based on how many people in the shop know it so they maintain code that's written in it by someone else.

freeandfair -> FabBlondie , 21 Sep 2017 18:22
> job-specific training is completely different. What a joke to persuade public school districts to pick up the tab on job training.

Well, it is either that or the kids themselves who have to pay for it and they are even less prepared to do so. Ideally, college education should be tax payer paid but this is not the case in the US. And the employer ideally should pay for the job related training, but again, it is not the case in the US.

freeandfair -> mlzarathustra , 21 Sep 2017 18:20
> The bigger problem is that nobody cares about the arts, and as expensive as education is, nobody wants to carry around a debt on a skill that won't bring in the buck

Plenty of people care about the arts but people can't survive on what the arts pay. That was pretty much the case all through human history.

theindyisbetter -> Game Cabbage , 21 Sep 2017 18:18
No. The amount of work is not a fixed sum. That's the lump of labour fallacy. We are not tied to the land.
ConBrio , 21 Sep 2017 18:10
Since newspapers are consolidating and cutting jobs, we gotta clamp down on colleges offering BA degrees, particularly in English literature and journalism.

And then... and...then...and...

LMichelle -> chillisauce , 21 Sep 2017 18:03
This article focuses on the US schools, but I can imagine it's the same in the UK. I don't think these courses are going to be about creating great programmers capable of new innovations as much as having a work force that can be their own IT Help Desk.

They'll learn just enough in these classes to do that.

Then most companies will be hiring for other jobs, but want to make sure you have the IT skills to serve as your own "help desk" (although they will get no salary for their IT work).

edmundberk -> FabBlondie , 21 Sep 2017 17:57
I find that quite remarkable - 40 years ago you must have been using assembler and with hardly any memory to work with. If you blitzed through that without applying the thought processes described, well...I'm surprised.
James Dey , 21 Sep 2017 17:55
Funny. Every day in the Brexit articles, I read that increasing the supply of workers has negligible effect on wages.
peter nelson -> peterainbow , 21 Sep 2017 17:54
I was laid off at your age in the depths of the recent recession and I got a job. As I said in another posting, it usually comes down to fresh skills and good personal references who will vouch for your work-habits and how well you get on with other members of your team.

The great thing about software, as opposed to many other jobs, is that it can be done at home while you're laid off. Write mobile (iOS or Android) apps or work on open-source projects and get stuff up on GitHub. I've been to many job interviews with my apps loaded on mobile devices so I could show them what I've done.

Game Cabbage -> theindyisbetter , 21 Sep 2017 17:52
The situation has a direct comparison to today. It has nothing to do with land. There was a certain amount of profit making work and not enough labour to satisfy demand. There is currently a certain amount of profit making work and in many situations (especially unskilled low paid work) too much labour.
edmundberk , 21 Sep 2017 17:52
So, is teaching people English or arithmetic all about reducing wages for the literate and numerate?

Or is this the most obtuse argument yet for avoiding what everyone in tech knows - even more blatantly than in many other industries, wages are curtailed by offshoring; and in the US, by having offshoring centres on US soil.

chillisauce , 21 Sep 2017 17:48
Well, speaking as someone who spends a lot of time trying to find really good programmers... frankly there aren't that many about. We take most of ours from Eastern Europe and SE Asia, which is quite expensive, given the relocation costs to the UK. But worth it.

So, yes, if more British kids learnt about coding, it might help a bit. But not much; the real problem is that few kids want to study IT in the first place, and that the tuition standards in most UK universities are quite low, even if they get there.

Baobab73 , 21 Sep 2017 17:48
True......
peter nelson -> rebel7 , 21 Sep 2017 17:47
There was recently a programme/podcast on ABC/RN about the HUGE shortage in Australia of techies with specialized security skills.
peter nelson -> jigen , 21 Sep 2017 17:46
Robots, or AI, are already making us more productive. I can write programs today in an afternoon that would have taken me a week a decade or two ago.

I can create a class and the IDE will take care of all the accessors, dependencies, enforce our style-guide compliance, stub-in the documentation ,even most test cases, etc, and all I have to write is very-specific stuff required by my application - the other 90% is generated for me. Same with UI/UX - stubs in relevant event handlers, bindings, dependencies, etc.

Programmers are a zillion times more productive than in the past, yet the demand keeps growing because so much more stuff in our lives has processors and code. Your car has dozens of processors running lots of software; your TV, your home appliances, your watch, etc.
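A minimal sketch of that generated-boilerplate point (the `Invoice` class and its fields are invented for the example; Python's `dataclass` decorator stands in for the IDE code generation described above):

```python
# Sketch of "the other 90% is generated for me": the dataclass
# decorator auto-generates __init__, __repr__ and __eq__, much as an
# IDE stubs in accessors and boilerplate. Only with_tax() is written
# by hand. The Invoice class itself is invented for this example.
from dataclasses import dataclass

@dataclass
class Invoice:
    customer: str
    amount: float

    def with_tax(self, rate: float) -> float:
        # the only hand-written, domain-specific logic
        return self.amount * (1 + rate)

inv = Invoice("ACME", 100.0)
print(inv)                            # generated __repr__
print(inv == Invoice("ACME", 100.0))  # generated __eq__ -> True
print(round(inv.with_tax(0.2), 2))
```

The same division of labour holds in the Java and C# tooling mentioned above: the programmer supplies the domain logic, the toolchain supplies the scaffolding.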

Quaestor , 21 Sep 2017 17:43

Schools really can't win. Don't teach coding, and you're raising a generation of button-pushers. Teach it, and you're pandering to employers looking for cheap labour. Unions in London objected to children being taught carpentry in the twenties and thirties, so it had to be renamed "manual instruction" to get round it. Denying children useful skills is indefensible.

jamesupton , 21 Sep 2017 17:42
Getting children to learn how to write code, as part of core education, will be the first step to the long overdue revolution. The rest of us will still have to stick to burning buildings down and stringing up the aristocracy.
cjenk415 -> LMichelle , 21 Sep 2017 17:40
Did you misread? It seemed like he was emphasizing that learning to code, like learning art (and sports and languages), will help them develop skills that benefit them in whatever profession they choose.
FabBlondie -> peter nelson , 21 Sep 2017 17:40
While I like your idea of what designing a computer program involves, in my nearly 40 years' experience as a programmer I have rarely seen this done. And, FWIW, IMHO, while choosing the tool (programming language) might reasonably be expected to follow designing a solution, in practice this rarely happens. No, these days it's Java all the way, from day one.
theindyisbetter -> Game Cabbage , 21 Sep 2017 17:40
There was a fixed supply of land and a reduced supply of labour to work the land.

Nothing like the situation in a modern economy.

LMichelle , 21 Sep 2017 17:39
I'd advise parents that the classes they need to make sure their kids excel in are acting/drama. There is no better way to get that promotion or increase your pay than being a skilled actor in the job market. It's a fake-it-till-you-make-it deal.
theindyisbetter , 21 Sep 2017 17:36
What a ludicrous argument.

Let's not teach maths or science or literacy either - then anyone with those skills will earn more.

SheriffFatman -> Game Cabbage , 21 Sep 2017 17:36

After the Black Death in the middle ages there was a huge under supply of labour. It produced a consistent rise in wages and conditions

It also produced wage-control legislation (which admittedly failed to work).

peter nelson -> peterainbow , 21 Sep 2017 17:32
if there were truly a shortage i wouldn't be unemployed

I've heard that before but when I've dug deeper I've usually found someone who either let their skills go stale, or who had some work issues.

LMichelle -> loveyy , 21 Sep 2017 17:26
Really? You think they are going to emphasize things like the importance of privacy and consumer rights?
loveyy , 21 Sep 2017 17:25
This really has to be one of the silliest articles I read here in a very long time.
People, let your children learn to code. Even more, educate yourselves and start to code just for the fun of it - look at it like a game.
The more people who know how to code, the more likely they are to understand how stuff works. If you were ever frustrated by how impossible it seems to shop on certain websites, learn to code and you will be frustrated no more. You will understand the intent behind the process.
Even more, you will understand the inherent limitations and what is the meaning of safety. You will be able to better protect yourself in a real time connected world.

Learning to code won't turn your kid into a programmer, just like ballet or piano classes won't mean they'll ever choose art as their livelihood. So let the children learn to code and learn along with them

Game Cabbage , 21 Sep 2017 17:24
Tipping power to employers in any profession by oversupply of labour is not a good thing. Bit of a macabre example here, but... after the Black Death in the Middle Ages there was a huge undersupply of labour. It produced a consistent rise in wages and conditions and economic development for hundreds of years afterwards. Not suggesting a massive depopulation. But you can achieve the same effects by altering the power balance. With decades of neoliberalism, the employers' side of the power see-saw is sitting firmly in the mud and is producing very undesired results for the vast majority of people.
Zuffle -> peterainbow , 21 Sep 2017 17:23
Perhaps you're just not very good. I've been a developer for 20 years and I've never had more than 1 week of unemployment.
Kevin P Brown -> peterainbow , 21 Sep 2017 17:20
" at 55 finding it impossible to get a job"

I am 59, and it is not just the age aspect, it is the money aspect. They know you have experience and expectations, and yet they believe hiring two people at half the age and half the price will replace your knowledge. I have been contracting in IT for 30 years, and now it is obvious it is over. Experience at some point no longer mitigates age. I think I am at that point now.

TheLane82 , 21 Sep 2017 17:20
Completely true! What needs to happen instead is to teach the real valuable subjects.

Gender studies. Islamic studies. Black studies. All important issues that need to be addressed.

peter nelson -> mlzarathustra , 21 Sep 2017 17:06
Dear, dear, I know, I know, young people today . . . just not as good as we were. Everything is just going down the loo . . . Just have a nice cuppa camomile (or chamomile if you're a Yank) and try to relax ... " hey you kids, get offa my lawn !"
FabBlondie , 21 Sep 2017 17:06
There are good reasons to teach coding. Too many of today's computer users are amazingly unaware of the technology that allows them to send and receive emails, use their smart phones, and use websites. Few understand the basic issues involved in computer security, especially as it relates to their personal privacy. Hopefully some introductory computer classes could begin to remedy this, and the younger the students the better.

Security problems are not strictly a matter of coding.

Security issues persist in tech. Clearly that is not a function of the size of the workforce. I propose that it is a function of poor management and design skills. These are not taught in any programming class I ever took. I learned these on the job and in an MBA program, and because I was determined.

Don't confuse basic workforce training with an effective application of tech to authentic needs.

How can the "disruption" so prized in today's Big Tech do anything but aggravate our social problems? Tech's disruption begins with a blatant ignorance of and disregard for causes, and believes to its bones that a high tech app will truly solve a problem it cannot even describe.

Kool Aid anyone?

peterainbow -> brady , 21 Sep 2017 17:05
Indeed, that idea has been around as long as COBOL and in practice has just made things worse. What many people outside of software engineering don't seem to realise is that the coding itself is a relatively small part of the job.
FabBlondie -> imipak , 21 Sep 2017 17:04
Hurrah.
peterainbow -> rebel7 , 21 Sep 2017 17:04
So how many female and older software engineers are there who are unable to get a job? I'm one of them, at 55 finding it impossible to get a job, and unlike many 'developers' I know what I'm doing.
peterainbow , 21 Sep 2017 17:02
Meanwhile the age and sex discrimination in IT goes on; if there were truly a shortage I wouldn't be unemployed.
Jared Hall -> peter nelson , 21 Sep 2017 17:01
Training more people for an occupation will result in more people becoming qualified to perform that occupation, regardless of the fact that many will perform poorly at it. A CS degree is no guarantee of competency, but it is one of the best indicators of general qualification we have at the moment. If you can provide a better metric for analyzing the underlying qualifications of the labor force, I'd love to hear it.

Regarding your anecdote: while interesting, it is poor evidence when compared to the aggregate statistical data analyzed in the EPI study.

peter nelson -> FabBlondie , 21 Sep 2017 17:00

Job-specific training is completely different.

Good grief. It's not job-specific training. You sound like someone who knows nothing about computer programming.

Designing a computer program requires analysing the task; breaking it down into its components, prioritising them and identifying interdependencies, and figuring out which parts of it can be broken out and done separately. Expressing all this in some programming language like Java, C, or C++ is quite secondary.

So once you learn to organise a task properly you can apply it to anything - remodeling a house, planning a vacation, repairing a car, starting a business, or administering a (non-software) project at work.
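A minimal sketch of the decomposition nelson describes, with invented sub-tasks (Python's standard `graphlib` is used only to make the point concrete): record which steps depend on which, and a workable order of work falls out.

```python
# Toy illustration of the analysis described above: break a job into
# components, identify interdependencies, and derive an order of work.
# The renovation tasks are invented for the example.
from graphlib import TopologicalSorter  # Python 3.9+

depends_on = {
    "strip wallpaper": set(),
    "repair plaster":  {"strip wallpaper"},
    "paint walls":     {"repair plaster"},
    "lay carpet":      {"paint walls"},
}

order = list(TopologicalSorter(depends_on).static_order())
print(order)  # dependencies always come before the tasks that need them
```

Nothing here is specific to software; the same dependency analysis applies to the house remodel or vacation plan in the comment above.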

[Oct 02, 2017] Evaluation of potential job candidates for a programming job should include evaluation of their previous projects and code written

Notable quotes:
"... Thank you. The kids that spend high school researching independently and spend their nights hacking just for the love of it and getting a job without college are some of the most competent I've ever worked with. Passionless college grads that just want a paycheck are some of the worst. ..."
"... how about how new labor tried to sign away IT access in England to India in exchange for banking access there, how about the huge loopholes in bringing in cheap IT workers from elsewhere in the world, not conspiracies, but facts ..."
"... And I've never recommended hiring anyone right out of school who could not point me to a project they did on their own, i.e., not just grades and test scores. I'd like to see an iOS or Android app, or an open-source component, or utility or program of theirs on GitHub, or something like that. ..."
"... most of what software designers do is not coding. It requires domain knowledge and that's where the "smart" IDEs and AI coding wizards fall down. It will be a long time before we get where you describe. ..."
Oct 02, 2017 | discussion.theguardian.com

peter nelson -> c mm , 21 Sep 2017 19:49

Instant feedback is one of the things I really like about programming, but it's also the thing that some people can't handle. As I'm developing a program, all day long the compiler is telling me about build errors or warnings, or when I go to execute it, it crashes or produces unexpected output, etc. Software engineers are bombarded all day with negative feedback and little failures. You have to be thick-skinned for this work.
peter nelson -> peterainbow , 21 Sep 2017 19:42
How is it shallow and lazy? I'm hiring for the real world so I want to see some real world accomplishments. If the candidate is fresh out of university they can't point to work projects in industry because they don't have any. But they CAN point to stuff they've done on their own. That shows both motivation and the ability to finish something. Why do you object to it?
anticapitalist -> peter nelson , 21 Sep 2017 14:47
Thank you. The kids that spend high school researching independently and spend their nights hacking just for the love of it and getting a job without college are some of the most competent I've ever worked with. Passionless college grads that just want a paycheck are some of the worst.
John Kendall , 21 Sep 2017 19:42
There is a big difference between "coding" and programming. Coding for a smart phone app is a matter of calling functions that are built into the device. For example, there are functions for the GPS or for creating buttons or for simulating motion in a game. These are what we used to call subroutines. The difference is that whereas we had to write our own subroutines, now they are just preprogrammed functions. How those functions are written is of little or no importance to today's coders.

Nor are they able to program on that level. Real programming requires not only a knowledge of programming languages, but also a knowledge of the underlying algorithms that make up actual programs. I suspect that "coding" classes operate on a quite superficial level.
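Kendall's distinction can be made concrete in a few lines (the example is illustrative, not from his comment): calling a ready-made function versus writing the underlying algorithm yourself.

```python
# Calling a preprogrammed function is one line; knowing the algorithm
# underneath it is a different skill. Insertion sort chosen for brevity.
data = [5, 2, 9, 1]
print(sorted(data))        # the "coder" route: call the built-in

def insertion_sort(items):
    out = list(items)
    for i in range(1, len(out)):
        key, j = out[i], i - 1
        while j >= 0 and out[j] > key:  # shift larger elements right
            out[j + 1] = out[j]
            j -= 1
        out[j + 1] = key
    return out

print(insertion_sort(data))  # the "programmer" route: same result
```

Both lines print the same sorted list; only the second demonstrates knowledge of "the underlying algorithms that make up actual programs".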

Game Cabbage -> theindyisbetter , 21 Sep 2017 19:40
Its not about the amount of work or the amount of labor. Its about the comparative availability of both and how that affects the balance of power, and that in turn affects the overall quality of life for the 'majority' of people.
c mm -> Ed209 , 21 Sep 2017 19:39
Most of this is not true. Peter Nelson gets it right by talking about breaking steps down and thinking rationally. The reason you can't just teach the theory, however, is that humans learn much better with feedback. Think about trying to learn how to build a fast car, but you never get in and test its speed. That would be silly. Programming languages take the system of logic that has been developed for centuries and gives instant feedback on the results. It's a language of rationality.
peter nelson -> peterainbow , 21 Sep 2017 19:37
This article is about the US. The tech industry in the EU is entirely different, and basically moribund. Where is the EU's Microsoft, Apple, Google, Amazon, Oracle, Intel, Facebook, etc, etc? The opportunities for exciting interesting work, plus the time and schedule pressures that force companies to overlook stuff like age because they need a particular skill Right Now, don't exist in the EU. I've done very well as a software engineer in my 60's in the US; I cannot imagine that would be the case in the EU.
peterainbow -> peter nelson , 21 Sep 2017 19:37
Sorry, but that's just not true. I doubt you are really still programming; you're a quasi-programmer, really a manager who likes to keep their hand in. You certainly aren't busy, as you've been posting all over this CiF. Also, why would you try and hire someone with such disparate skillsets? It makes no sense at all.

Oh, and you'd be correct that I do have workplace issues, i.e. I have a disability and I also suffer from depression, but that shouldn't bar me from employment. And regarding my skills going stale, that again contradicts your statement above that it's about planning/analysis/algorithms etc. (which to some extent I agree with).

c mm -> peterainbow , 21 Sep 2017 19:36
Not at all, it's really egalitarian. If I want to hire someone to paint my portrait, the best way to know if they're any good is to see their previous work. If they've never painted a portrait before, then I may want to go with the girl who has.
c mm -> ragingbull , 21 Sep 2017 19:34
There is definitely not an excess. Just look at projected jobs for computer science at the Bureau of Labor Statistics.
c mm -> perble conk , 21 Sep 2017 19:32
Right? It's ridiculous. "Hey, there's this industry you can train for that is super valuable to society and pays really well!"
Then Ben Tarnoff, "Don't do it! If you do you'll drive down wages for everyone else in the industry. Build your fire starting and rock breaking skills instead."
peterainbow -> peter nelson , 21 Sep 2017 19:29
How about how New Labour tried to sign away IT access in England to India in exchange for banking access there? How about the huge loopholes in bringing in cheap IT workers from elsewhere in the world? Not conspiracies, but facts.
peter nelson -> eirsatz , 21 Sep 2017 19:25
I think the difference between gifted and not is motivation. But I agree it's not innate. The kid who stayed up all night in high school hacking into the school server to fake his coding class grade is probably more gifted than the one who spent 4 years in college getting a BS in CS because someone told him he could get a job when he got out.

I've done some hiring in my life and I always ask them to tell me about stuff they did on their own.

peter nelson -> TheBananaBender , 21 Sep 2017 19:20

Most coding jobs are bug fixing.

The only bugs I have to fix are the ones I make.

peter nelson -> Ed209 , 21 Sep 2017 19:19
As several people have pointed out, writing a computer program requires analyzing and breaking down a task into steps, identifying interdependencies, prioritizing the order, figuring out what parts can be organized into separate tasks that be done separately, etc.

These are completely independent of the language - I've been programming for 40 years in everything from FORTRAN to APL to C to C# to Java, and it's all the same. Not only that, but they transcend programming - they apply to planning a vacation, remodeling a house, or fixing a car.

peter nelson -> ragingbull , 21 Sep 2017 19:14
Neither coding nor having a bachelor's degree in computer science makes you a suitable job candidate. I've done a lot of recruiting and interviews in my life, and right now I'm trying to hire someone. And I've never recommended hiring anyone right out of school who could not point me to a project they did on their own, i.e., not just grades and test scores. I'd like to see an iOS or Android app, or an open-source component, or a utility or program of theirs on GitHub, or something like that.

That's the thing that distinguishes software from many other fields - you can do something real and significant on your own. If you haven't managed to do so in 4 years of college you're not a good candidate.

peter nelson -> nickGregor , 21 Sep 2017 19:07
Within the next year coding will be old news and you will simply be able to describe things in your native language in such a way that the machine will be able to execute any set of instructions you give it.

In a sense that's already true, as I noted elsewhere. 90% of the code in my projects (Java and C# in their respective IDEs) is machine-generated. I do relatively little "coding". But the flaw in your idea is this: most of what software designers do is not coding. It requires domain knowledge, and that's where the "smart" IDEs and AI coding wizards fall down. It will be a long time before we get where you describe.

Ricardo111 -> martinusher , 21 Sep 2017 19:03
Completely agree. At the highest levels there is more work that goes into managing complexity and making sure nothing is missed than in making the wheels turn and the beepers beep.
ragingbull , 21 Sep 2017 19:02
Hang on... if the current excess of computer science grads is not driving down wages, why would training more kids to code make any difference?
Ricardo111 -> youngsteveo , 21 Sep 2017 18:59
I've actually interviewed people for very senior technical positions in Investment Banks who had all the fancy talk in the world and yet failed at some very basic "write me a piece of code that does X" tests.

Next hurdle on is people who have learned how to deal with certain situations and yet don't really understand how it works so are unable to figure it out if you change the problem parameters.

That said, the average coder is only slightly beyond this point. The ones who can take into account maintainability and flexibility for future enhancements when developing are already a minority, and those who can understand the why of software development process steps, design software system architectures or do a proper Technical Analysis are very rare.

eirsatz -> Ricardo111 , 21 Sep 2017 18:57
Hubris. It's easy to mistake efficiency born of experience as innate talent. The difference between a 'gifted coder' and a 'non gifted junior coder' is much more likely to be 10 or 15 years sitting at a computer, less if there are good managers and mentors involved.
Ed209 , 21 Sep 2017 18:57
Politicians love the idea of teaching children to 'code', because it sounds so modern, and nobody could possibly object... could they? Unfortunately it simply shows up their utter ignorance of technical matters, because there isn't a language called 'coding'. Computer programming languages have changed enormously over the years, and continue to evolve. If you learn the wrong language you'll be about as welcome in the IT industry as a lamp-lighter or a comptometer operator.

The pace of change in technology can render skills and qualifications obsolete in a matter of a few years, and only the very best IT employers will bother to retrain their staff - it's much cheaper to dump them. (Most IT posts are outsourced through agencies anyway - those that haven't been off-shored.)

peter nelson -> YEverKnot , 21 Sep 2017 18:54
And this isn't even a good conspiracy theory; it's a bad one. He offers no evidence that there's an actual plan or conspiracy to do this. I'm looking for an account of where the advocates of coding education met to plot this in some castle in Europe or maybe a secret document like "The Protocols of the Elders of Google", or some such.
TheBananaBender , 21 Sep 2017 18:52
Most jobs in IT are shit - desktop support, operations droids. Most coding jobs are bug fixing.
Ricardo111 -> Wiretrip , 21 Sep 2017 18:49
Tool users vs. tool makers. The really good coders actually get why certain things work as they do and can adjust them for different conditions. The mass-produced coders are basically code copiers and code-gluing specialists.
peter nelson -> AmyInNH , 21 Sep 2017 18:49
People who get Masters and PhD's in computer science are not usually "coders" or software engineers - they're usually involved in obscure, esoteric research for which there really is very little demand. So it doesn't surprise me that they're unemployed. But if someone has a Bachelor's in CS and they're unemployed I would have to wonder what they spent their time at university doing.

The thing about software that distinguishes it from lots of other fields is that you can make something real and significant on your own. I would expect any recent CS major I hire to be able to show me an app or an open-source component or something similar that they made themselves, and not just test scores and grades. If they could not, then I wouldn't even think about hiring them.

Ricardo111 , 21 Sep 2017 18:44
Fortunately for those of us who are actually good at coding, the difference in productivity between a gifted coder and a non-gifted junior developer is something like 100-fold. Knowing how to code and actually being efficient at creating software programs and systems are about as far apart as knowing how to write and actually being able to write a bestselling exciting Crime trilogy.
peter nelson -> jamesupton , 21 Sep 2017 18:36

The rest of us will still have to stick to burning buildings down and stringing up the aristocracy.

If you know how to write software you can get a robot to do those things.

peter nelson -> Julian Williams , 21 Sep 2017 18:34
I do think there is an excess supply of software programmers. There is only a modest number of decent jobs, say as an algorithms developer in finance, general architecture of complex systems or, to some extent, in systems security.

This article is about coding; most of those jobs require very little of that.

Most very high paying jobs in the technology sector are in the same standard upper management roles as in every other industry.

How do you define "high paying". Everyone I know (and I know a lot because I've been a sw engineer for 40 years) who is working fulltime as a software engineer is making a high-middle-class salary, and can easily afford a home, travel on holiday, investments, etc.

YEverKnot , 21 Sep 2017 18:32

Tech's push to teach coding isn't about kids' success – it's about cutting wages

Nowt like a good conspiracy theory.
freeandfair -> WithoutPurpose , 21 Sep 2017 18:31
What is a stupidly low salary? 100K?
freeandfair -> AmyInNH , 21 Sep 2017 18:30
> Already there. I take it you skipped right past the employment prospects for US STEM grads - 50% chance of finding STEM work.

That just means 50% of them are no good and need to develop their skills further or try something else.
Not everyone with a STEM degree from some third-rate college is capable of doing complex IT or STEM work.

peter nelson -> edmundberk , 21 Sep 2017 18:30

So, is teaching people English or arithmetic all about reducing wages for the literate and numerate?

Yes. Haven't you noticed how wage growth has flattened? That's because some "do-gooders" thought it would be a fine idea to educate the peasants. There was a time when only the well-to-do knew how to read and write, and that's why the well-to-do were well-to-do. Education is evil. Stop educating people, and then those of us who know how to read and write can charge them for reading and writing letters and email. Better yet, we can have Chinese and Indians do it for us and we just charge a transaction fee.

AmyInNH -> peter nelson , 21 Sep 2017 18:27
Massive numbers of the public use cars; it doesn't mean millions need schooling in auto mechanics. Same for software coding. We aren't even using those who have Bachelors, Masters and PhDs in CS.
carlospapafritas , 21 Sep 2017 18:27
"..importing large numbers of skilled guest workers from other countries through the H1-B visa program..."

"Skilled" is good. H1B has long (approx. 17 years) been abused and turned into a trafficking scheme. One can buy an H1B in India. Powerful ethnic networks wheeling and dealing in the US and EU are selling IT jobs to, essentially, migrants.

The real IT wages haven't been stagnant but steadily falling since the 90s. It's easy to see why. An $82K/year IT wage was about average in the 90s. Comparing the prices of housing (and pretty much everything else) between then and now gives you the idea.

freeandfair -> whitehawk66 , 21 Sep 2017 18:27
> not every kid wants or needs to have their soul sucked out of them sitting in front of a screen full of code for some idiotic service that some other douchbro thinks is the next iteration of sliced bread

Taking a couple of years of programming is not enough to do this as a job, don't worry.
But learning to code is like learning maths: it helps to develop logical thinking, which will benefit you in every area of your life.

James Dey , 21 Sep 2017 18:25
We should stop teaching our kids to be journalists, then your wage might go up.
peter nelson -> AmyInNH , 21 Sep 2017 18:23
What does this even mean?

[Oct 02, 2017] Programming is a culturally important skill

Notable quotes:
"... A lot of basic entry level jobs require a good level of Excel skills. ..."
"... Programming is a cultural skill; master it, or even understand it on a simple level, and you understand how the 21st century works, on the machinery level. To bereave the children of this crucial insight is to close off a door to their future. ..."
"... What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra. ..."
"... Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it. ..."
"... We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money. ..."
"... Libertarianism posits that everyone should be free to sell their labour or negotiate their own arrangements without the state interfering. So if cheaper foreign labour really was undercutting American labout the Libertarians would be thrilled. ..."
"... Not producing enough to fill vacancies or not producing enough to keep wages at Google's preferred rate? Seeing as research shows there is no lack of qualified developers, the latter option seems more likely. ..."
"... We're already using Asia as a source of cheap labor for the tech industry. Why do we need to create cheap labor in the US? ..."
David McCaul -> IanMcLzzz , 21 Sep 2017 13:03
There are very few professional scribes nowadays; a good level of reading and writing is simply a default even for the lowest-paid jobs. A lot of basic entry level jobs require a good level of Excel skills. Several years from now basic coding will be necessary to manipulate basic tools for entry level jobs, especially as increasingly a lot of real code will be generated by expert systems supervised by a tiny number of supervisors. Coding jobs will go the same way that trucking jobs will go when driverless vehicles are perfected.

anticapitalist, 21 Sep 2017 14:25

Offer the class but not mandatory. Just like I could never succeed playing football others will not succeed at coding. The last thing the industry needs is more bad developers showing up for a paycheck.

Fanastril , 21 Sep 2017 14:08

Programming is a cultural skill; master it, or even understand it on a simple level, and you understand how the 21st century works, on the machinery level. To bereave the children of this crucial insight is to close off a door to their future. What's next, keep them off Math, because, you know . .
Taylor Dotson -> freeandfair , 21 Sep 2017 13:59
That's some crystal ball you have there. English teachers will need to know how to code? Same with plumbers? Same with janitors, CEOs, and anyone working in the service industry?
PolydentateBrigand , 21 Sep 2017 12:59
The economy isn't a zero-sum game. Developing a more skilled workforce that can create more value will lead to economic growth and improvement in the general standard of living. Talented coders will start new tech businesses and create more jobs.

WumpieJr , 21 Sep 2017 13:23

What a dumpster argument. I am not a programmer or even close, but a basic understanding of coding has been important to my professional life. Coding isn't just about writing software. Understanding how algorithms work, even simple ones, is a general skill on par with algebra.

But it isn't just about coding for Tarnoff. He seems to hold education in contempt generally. "The far-fetched premise of neoliberal school reform is that education can mend our disintegrating social fabric." If they can't literally fix everything, let's just get rid of them, right?

Never mind that a good education is clearly one of the most important things you can do for a person to improve their quality of life wherever they live in the world. It's "neoliberal," so we better hate it.

mlzarathustra , 21 Sep 2017 16:52
I agree with the basic point. We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money.

The bigger problem is that nobody cares about the arts, and as expensive as education is, nobody wants to carry around a debt on a skill that won't bring in the bucks. And smartphone-obsessed millennials have too short an attention span to fathom how empty their lives are, devoid of the aesthetic depth as they are.

I can't draw a definite link, but I think algorithm fails, which are based on fanatical reliance on programmed routines as the solution to everything, are rooted in the shortage of education and cultivation in the arts.

Economics is a social science, and all this is merely a reflection of shared cultural values. The problem is, people think it's math (it's not) and therefore set in stone.

AmyInNH -> peter nelson , 21 Sep 2017 16:51
Geeze it'd be nice if you'd make an effort.
rucore.libraries.rutgers.edu/rutgers-lib/45960/PDF/1/
https://rucore.libraries.rutgers.edu/rutgers-lib/46156 /
https://rucore.libraries.rutgers.edu/rutgers-lib/46207 /
peter nelson -> WyntonK , 21 Sep 2017 16:45
Libertarianism posits that everyone should be free to sell their labour or negotiate their own arrangements without the state interfering. So if cheaper foreign labour really was undercutting American labour, the Libertarians would be thrilled.

But it's not. I'm in my 60's and retiring but I've been a software engineer all my life. I've worked for many different companies, and in different industries and I've never had any trouble competing with cheap imported workers. The people I've seen fall behind were ones who did not keep their skills fresh. When I was laid off in 2009 in my mid-50's I made sure my mobile-app skills were bleeding edge (in those days ANYTHING having to do with mobile was bleeding edge) and I used to go to job interviews with mobile devices to showcase what I could do. That way they could see for themselves and not have to rely on just a CV.

The older guys who fell behind did so because their skills and toolsets had become obsolete.

Now I'm trying to hire a replacement to write Android code for use in industrial production and struggling to find someone with enough experience. So where is this oversupply I keep hearing about?

Jared Hall -> RogTheDodge , 21 Sep 2017 16:42
Not producing enough to fill vacancies or not producing enough to keep wages at Google's preferred rate? Seeing as research shows there is no lack of qualified developers, the latter option seems more likely.
JayThomas , 21 Sep 2017 16:39

It's about ensuring those salaries no longer exist, by creating a source of cheap labor for the tech industry.

We're already using Asia as a source of cheap labor for the tech industry. Why do we need to create cheap labor in the US? That just seems inefficient.

FabBlondie -> RogTheDodge , 21 Sep 2017 16:39
There was never any need to give our jobs to foreigners. That is, if you are comparing the production of domestic vs. foreign workers. The sole need was, and is, to increase profits.
peter nelson -> AmyInNH , 21 Sep 2017 16:34
Link?
FabBlondie , 21 Sep 2017 16:34
Schools MAY be able to fix big social problems, but only if they teach a well-rounded curriculum that includes classical history and the humanities. Job-specific training is completely different. What a joke to persuade public school districts to pick up the tab on job training. The existing social problems were not caused by a lack of programmers, and cannot be solved by Big Tech.

I agree with the author that computer programming skills are not that limited in availability. Big Tech solved the problem of the well-paid professional some years ago by letting them go -- these were mostly workers in their 50s -- and replacing them with H1-B visa-holders from India, who work for a fraction of what their experienced American counterparts earn.

It is all about profits. Big Tech is no different than any other "industry."

peter nelson -> Jared Hall , 21 Sep 2017 16:31
Supply of apples does not affect the demand for oranges. Teaching coding in high school does not necessarily alter the supply of software engineers. I studied Chinese History and geology at University but my doing so has had no effect on the job prospects of people doing those things for a living.
johnontheleft -> Taylor Dotson , 21 Sep 2017 16:30
You would be surprised just how much a little coding knowledge has transformed my ability to do my job (a job that is not directly related to IT at all).
peter nelson -> Jared Hall , 21 Sep 2017 16:29
Because teaching coding does not affect the supply of actual engineers. I've been a professional software engineer for 40 years and coding is only a small fraction of what I do.
peter nelson -> Jared Hall , 21 Sep 2017 16:28
You and the linked article don't know what you're talking about. A CS degree does not equate to a productive engineer.

A few years ago I was on the recruiting and interviewing committee to try to hire some software engineers for a scientific instrument my company was making. The entire team had about 60 people (hw, sw, mech engineers) but we needed 2 or 3 sw engineers with math and signal-processing expertise. The project was held up for SIX months because we could not find the people we needed. It would have taken a lot longer than that to train someone up to our needs. Eventually we brought in some Chinese engineers which cost us MORE than what we would have paid for an American engineer when you factor in the agency and visa paperwork.

Modern software engineers are not just generic interchangable parts - 21st century technology often requires specialised scientific, mathematical, production or business domain-specific knowledge and those people are hard to find.

freeluna -> freeluna , 21 Sep 2017 16:18
...also, this article is alarmist and I disagree with it. Dear Author, Phphphphtttt! Sincerely, freeluna
AmyInNH , 21 Sep 2017 16:16
Regimentation of the many, for benefit of the few.
AmyInNH -> Whatitsaysonthetin , 21 Sep 2017 16:15
Visa jobs are part of trade agreements. To be very specific, US gov (and EU) trade Western jobs for market access in the East.
http://www.marketwatch.com/story/in-india-british-leader-theresa-may-preaches-free-trade-2016-11-07
There is no shortage. This is selling off the West's middle class.
Take a look at remittances in wikipedia and you'll get a good idea just how much it costs the US and EU economies, for sake of record profits to Western industry.
jigen , 21 Sep 2017 16:13
And thanks to the author for not using the adjective "elegant" in describing coding.
freeluna , 21 Sep 2017 16:13
I see advantages in teaching kids to code, and for kids to make arduino and other CPU powered things. I don't see a lot of interest in science and tech coming from kids in school. There are too many distractions from social media and game platforms, and not much interest in developing tools for future tech and science.
jigen , 21 Sep 2017 16:13
Let the robots do the coding. Sorted.
FluffyDog -> rgilyead , 21 Sep 2017 16:13
Although coding per se is a technical skill it isn't designing or integrating systems. It is only a small, although essential, part of the whole software engineering process. Learning to code just gets you up the first steps of a high ladder that you need to climb a fair way if you intend to use your skills to earn a decent living.
rebel7 , 21 Sep 2017 16:11
BS.

Friend of mine in the SV tech industry reports that they are about 100,000 programmers short in just the internet security field.

Y'all are trying to create a problem where there isn't one. Maybe we shouldn't teach them how to read either. They might want to work somewhere besides the grill at McDonalds.

AmyInNH -> WyntonK , 21 Sep 2017 16:11
To which they will respond, offshore.
AmyInNH -> MrFumoFumo , 21 Sep 2017 16:10
They're not looking for good, they're looking for cheap + visa indentured. Non-citizens.
nickGregor , 21 Sep 2017 16:09
Within the next year coding will be old news and you will simply be able to describe things in your native language in such a way that the machine will be able to execute any set of instructions you give it. Coding is going to change from its purely abstract form that is not utilized at peak -- but if you can describe what you envision in an effective, concise manner you could become a very good coder very quickly -- and competence will be determined entirely by imagination, and the barriers to entry will all but be extinct.
AmyInNH -> unclestinky , 21 Sep 2017 16:09
Already there. I take it you skipped right past the employment prospects for US STEM grads - 50% chance of finding STEM work.
AmyInNH -> User10006 , 21 Sep 2017 16:06
Apparently a whole lot of people are just making it up, eh?
http://www.motherjones.com/politics/2017/09/inside-the-growing-guest-worker-program-trapping-indian-students-in-virtual-servitude /
From today,
http://www.computerworld.com/article/2915904/it-outsourcing/fury-rises-at-disney-over-use-of-foreign-workers.html
All the way back to 1995,
https://www.youtube.com/watch?v=vW8r3LoI8M4&feature=youtu.be
JCA1507 -> whitehawk66 , 21 Sep 2017 16:04
Bravo
JCA1507 -> DirDigIns , 21 Sep 2017 16:01
Total... utter... no other way... huge... will only get worse... everyone... (not a very nuanced commentary is it).

I'm glad pieces like this are mounting, it is relevant that we counter the mix of messianism and opportunism of Silicon Valley propaganda with convincing arguments.

RogTheDodge -> WithoutPurpose , 21 Sep 2017 16:01
That's not my experience.
AmyInNH -> TTauriStellarbody , 21 Sep 2017 16:01
It's a stall tactic by Silicon Valley, "See, we're trying to resolve the [non-existant] shortage."
AmyInNH -> WyntonK , 21 Sep 2017 16:00
They aren't immigrants. They're visa indentured foreign workers. Why does that matter? It's part of the cheap+indentured hiring criteria. If it were only cheap, they'd be lowballing offers to citizen and US new grads.
RogTheDodge -> Jared Hall , 21 Sep 2017 15:59
No. Because they're the ones wanting them and realizing the US education system is not producing enough
RogTheDodge -> Jared Hall , 21 Sep 2017 15:58
Except the demand is increasing massively.
RogTheDodge -> WyntonK , 21 Sep 2017 15:57
That's why we are trying to educate American coders - so we don't need to give our jobs to foreigners.
AmyInNH , 21 Sep 2017 15:56
Correct premises,
- proletarianize programmers
- many qualified graduates simply can't find jobs.
Invalid conclusion:
- The problem is there aren't enough good jobs to be trained for.

That conclusion only makes sense if you skip right past ...
" importing large numbers of skilled guest workers from other countries through the H1-B visa program. These workers earn less than their American counterparts, and possess little bargaining power because they must remain employed to keep their status"

Hiring Americans doesn't "hurt" their record profits. It's incessant greed and collusion with our corrupt congress.

Oldvinyl , 21 Sep 2017 15:51
This column was really annoying. I taught my students how to program when I was given a free hand to create the computer studies curriculum for a new school I joined. (Not in the UK thank Dog). 7th graders began with studying the history and uses of computers and communications tech. My 8th grade learned about computer logic (AND, OR, NOT, etc) and moved on with QuickBASIC in the second part of the year. My 9th graders learned about databases and SQL and how to use HTML to make their own Web sites. Last year I received a phone call from the father of one student thanking me for creating the course, his son had just received a job offer and now works in San Francisco for Google.
I am so glad I taught them "coding" (UGH) as the writer puts it, rather than arty-farty subjects not worth a damn in the jobs market.
WyntonK -> DirDigIns , 21 Sep 2017 15:47
I live and work in Silicon Valley and you have no idea what you are talking about. There's no shortage of coders at all. Terrific coders are let go because of their age and the availability of much cheaper foreign coders(no, I am not opposed to immigration).
Sean May , 21 Sep 2017 15:43
Looks like you pissed off a ton of people who can't write code and are none too happy with you pointing out the reason they're slinging insurance for Geico.

I think you're quite right that coding skills will eventually enter the mainstream and slowly bring down the cost of hiring programmers.

The fact is that even if you don't get paid to be a programmer you can absolutely benefit from having some coding skills.

There may however be some kind of major coding revolution with the advent of quantum computing. The way code is written now could become obsolete.

Jared Hall -> User10006 , 21 Sep 2017 15:43
Why is it a fantasy? Does supply and demand not apply to IT labor pools?
Jared Hall -> ninianpark , 21 Sep 2017 15:42
Why is it a load of crap? If you increase the supply of something with no corresponding increase in demand, the price will decrease.
pictonic , 21 Sep 2017 15:40
A well-argued article that hits the nail on the head. Amongst any group of coders, very few are truly productive, and they are self starters; training is really needed to do the admin.
Jared Hall -> DirDigIns , 21 Sep 2017 15:39
There is not a huge skills shortage. That is why the author linked this EPI report analyzing the data to prove exactly that. This may not be what people want to believe, but it is certainly what the numbers indicate. There is no skills gap.

http://www.epi.org/files/2013/bp359-guestworkers-high-skill-labor-market-analysis.pdf

Axel Seaton -> Jaberwocky , 21 Sep 2017 15:34
Yeah, but the money is crap
DirDigIns -> IanMcLzzz , 21 Sep 2017 15:32
Perfect response for the absolute crap that the article is pushing.
DirDigIns , 21 Sep 2017 15:30
Total and utter crap, no other way to put it.

There is a huge skills shortage in key tech areas that will only get worse if we don't educate and train the young effectively.

Everyone wants youth to have good skills for the knowledge economy and the ability to earn a good salary and build up life chances for UK youth.

So we get this verbal diarrhoea of an article. Defies belief.

Whatitsaysonthetin -> Evelita , 21 Sep 2017 15:27
Yes. China and India are indeed training youth in coding skills. In order that they take jobs in the USA and UK! It's been going on for 20 years and has resulted in many experienced IT staff struggling to get work at all and, even if they can, to suffer stagnating wages.
WmBoot , 21 Sep 2017 15:23
Wow. Congratulations to the author for provoking such a torrent of vitriol! Job well done.
TTauriStellarbody , 21 Sep 2017 15:22
Has anyone's job been at risk from a 16-year-old who can cobble together a couple of lines of JavaScript since the dot-com bubble?

Good luck trying to teach a big enough pool of US school kids regular expressions let alone the kind of test driven continuous delivery that is the norm in the industry now.

freeandfair -> youngsteveo , 21 Sep 2017 13:27
> A lot of resumes come across my desk that look qualified on paper, but that's not the same thing as being able to do the job

I have exactly the same experience. There is undeniably a skill gap. It takes about a year for a skilled professional to adjust and learn enough to become productive; it takes about 3-5 years for a college grad.

It is nothing new. But the issue is, as the college grad gets trained, another company steals him/her. And also keep in mind, all this time you are doing your job and training the new employee as time permits. Many companies in the US cut the non-profit departments (such as IT) to the bone; we cannot afford to lose a person and then train another replacement for 3-5 years.

The solution? Hire a skilled person. But that means nobody is training college grads, and in 10-20 years we are looking at a skill shortage to the point where the only option is bringing in foreign labor.

American cut-throat companies that care only about the bottom line cannibalized themselves.

farabundovive -> Ethan Hawkins , 21 Sep 2017 15:10

Heh. You are not a coder, I take it. :) Going to be a few decades before even the easiest coding jobs vanish.

Given how shit most coders of my acquaintance have been - especially in matters of work ethic, logic, matching s/w to user requirements and willingness to test and correct their gormless output - most future coding work will probably be in the area of disaster recovery. Sorry, since the poor snowflakes can't face the sad facts, we have to call it "business continuation" these days, don't we?
UncommonTruthiness , 21 Sep 2017 14:10
The demonization of Silicon Valley is clearly the next place to put all blame. Look what "they" did to us: computers, smart phones, HD television, world-wide internet, on and on. Get a rope!

I moved there in 1978 and watched the orchards and trailer parks on North 1st St. of San Jose transform into a concrete jungle. There used to be quite a bit of semiconductor equipment and device manufacturing in SV during the 80s and 90s. Now quite a few buildings have the same name : AVAILABLE. Most equipment and device manufacturing has moved to Asia.

Programming started with binary, then machine code (hexadecimal or octal) and moved to assembler as a compiled and linked structure. More compiled languages like FORTRAN, BASIC, PL-1, COBOL, PASCAL, C (and all its "+'s") followed making programming easier for the less talented. Now the script based languages (HTML, JAVA, etc.) are even higher level and accessible to nearly all. Programming has become a commodity and will be priced like milk, wheat, corn, non-unionized workers and the like. The ship has sailed on this activity as a career.

[Oct 01, 2017] How to Use Script Command To Record Linux Terminal Session

Oct 01, 2017 | linoxide.com

How to Use "Script" Command To Record Linux Terminal Session

May 30, 2014 By Pungki Arianto. Updated June 14, 2017.

The script command is very helpful for system admins. If any problem occurs on the system, it is very difficult to find out what command was executed previously; hence, system admins know the importance of this script command. Sometimes you are on the server and you think to yourself that your team or somebody you know is missing documentation on how to do a specific configuration. You can do the configuration, record all the actions of your shell session, and show the record to that person, who will see exactly what you had (the same output) on your shell at the moment of the configuration.

How does the script command work?

script command records a shell session for you so that you can look at the output that you saw at the time and you can even record with timing so that you can have a real-time playback. It is really useful and comes in handy in the strangest kind of times and places.

The script command keeps an action log for various tasks. It records everything in a session: the things you type and the things you see. To do this, you just type the script command on the terminal and type exit when finished. Everything between the script and the exit commands is logged to the file. This includes the confirmation messages from script itself.

1. Record your terminal session

script makes a typescript of everything printed on your terminal. If the argument file is given, script saves all dialogue in the indicated file in the current directory. If no file name is given, the typescript is saved in the default file typescript. To record what you are doing in the current shell, just use the command below

# script shell_record1
Script started, file is shell_record1

It indicates that a file shell_record1 is created. Let's check the file

# ls -l shell_*
-rw-r--r-- 1 root root 0 Jun 9 17:50 shell_record1

After completion of your task, you can enter exit or Ctrl-d to close down the script session and save the file.

# exit
exit
Script done, file is shell_record1

You can see that script indicates the filename.
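Recent util-linux versions of script also accept a -c option, which records the output of a single command instead of an interactive session. This is not covered in the article, and the filename demo_record below is just an illustration, but the option is handy for quick, non-interactive recordings:

```shell
# Record a single command non-interactively (util-linux script);
# the session ends as soon as the command exits
script -c "echo recorded-by-script" demo_record

# The typescript file now contains the command's output
grep "recorded-by-script" demo_record
```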

2. Check the content of a recorded terminal session

When you use the script command, it records everything in a session: the things you type and all of your output. As the output is saved into a file, it is possible to check its content after exiting a recorded session. You can simply use a text editor or a text file viewer command.

# cat shell_record1 
Script started on Fri 09 Jun 2017 06:23:41 PM UTC
[root@centos-01 ~]# date
Fri Jun 9 18:23:46 UTC 2017
[root@centos-01 ~]# uname -a
Linux centos-01 3.10.0-514.16.1.el7.x86_64 #1 SMP Wed Apr 12 15:04:24 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
[root@centos-01 ~]# whoami
root
[root@centos-01 ~]# pwd
/root
[root@centos-01 ~]# exit
exit

Script done on Fri 09 Jun 2017 06:25:11 PM UTC

While you view the file you realize that the script also stores line feeds and backspaces. It also indicates the time of the recording at the top and the end of the file.
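Because those control characters are stored verbatim, the raw typescript can be awkward to grep or diff. A common cleanup step, not shown in the article but using only standard tools, is to strip the carriage returns with tr; the sample file below simply simulates a typescript fragment:

```shell
# Simulate a typescript fragment containing carriage returns
printf 'line1\r\nline2\r\n' > shell_record1

# Strip the carriage returns to get a plain-text copy
tr -d '\r' < shell_record1 > shell_record1.txt

# cat -v makes any remaining non-printing characters visible
cat -v shell_record1.txt
```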

3. Record several terminal sessions

You can record as many terminal sessions as you want. When you finish one recording, just begin another. This can be helpful if you want to record several configurations that you are doing, to show them to your team or students, for example. You just need to give each recording file its own name.

For example, let us assume that you have to do OpenLDAP, DNS and Machma configurations. You will need to record each configuration. To do this, just create a recording file corresponding to each configuration.

# script openldap_record
   ...............
    configuration step
   ..............
# exit

When you have finished with the first configuration, begin to record the next configuration

# script machma_record
    ............
     configuration steps
    .............
# exit

And so on for the others. Note that if you run the script command with an existing filename, the file will be replaced, so you will lose everything previously recorded.

Now, let us imagine that you have begun the Machma configuration but have to abort it in order to finish the DNS configuration because of some emergency. Later you want to continue the Machma configuration where you left off. That means you want to record the next steps into the existing file machma_record without deleting its previous content; to do this you use the script -a command to append the new output to the file.

This is the content of our recorded file

Now if we want to continue our recording in this file without deleting the content already present, we will do

# script -a machma_record
Script started, file is machma_record

Now continue the configuration, then exit when finished and let's check the content of the recorded file.

Note the new time of the new record which appears. You can see that the file has the previous and actual records.

4. Replay a linux terminal session

We have seen that it is possible to view the content of the recorded file with commands that display a text file's content. The script command also gives you the possibility to see the recorded session as a video: you will review exactly what you did, step by step, at the moment you were entering the commands, as if you were watching a video. So you will replay the recorded terminal session.

To do this, you have to use the --timing option of the script command when you start the recording.

# script --timing=file_time shell_record1
Script started, file is shell_record1

See that the file into which to record is shell_record1. When the record is finished, exit normally

# exit
exit
Script done, file is shell_record1

Let's check the content of file_time

# cat file_time 
0.807440 49
0.030061 1
116.131648 1
0.226914 1
0.033997 1
0.116936 1
0.104201 1
0.392766 1
0.301079 1
0.112105 2
0.363375 152

The --timing option writes timing data to the indicated file. This data contains two fields, separated by a space: how much time elapsed since the previous output, and how many characters were output this time. This information can be used to replay typescripts with realistic typing and output delays.

Now, to replay the terminal session, we use the scriptreplay command instead of the script command, with the same syntax as when recording the session. Look below

# scriptreplay --timing=file_time shell_record1

You will see that the recorded session is played back as if you were watching a video of everything you were doing. You can also pass the timing file on its own, without spelling out --timing=file_time. Look below

# scriptreplay file_time shell_record1

So you understand that the first parameter is the timing file and the second is the recorded file.
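Putting the pieces together, a complete record-and-replay cycle can be sketched as below. The -c option (recording a single command non-interactively) and the optional trailing divisor (speeding up playback) are util-linux features assumed here rather than taken from the article:

```shell
# Record a command together with its timing data
script --timing=file_time -c "date" shell_record1

# Replay it later at the original speed
scriptreplay --timing=file_time shell_record1

# Replay it twice as fast: the optional third argument divides the delays
scriptreplay file_time shell_record1 2
```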

Conclusion

The script command can be your go-to tool for documenting your work and showing others what you did in a session. It can be used as a way to log what you are doing in a shell session. When you run script, a new shell is forked. It reads standard input and output for your terminal tty and stores the data in a file.

[Sep 27, 2017] Chkservice - An Easy Way to Manage Systemd Units in Terminal

Sep 27, 2017 | linoxide.com

Systemd is a system and service manager for Linux operating systems. It introduces the concept of systemd units and provides a number of features such as parallel startup of system services at boot time, on-demand activation of daemons, etc. It helps you manage services on your Linux OS, such as starting, stopping and reloading them. But to operate on services with systemd, you need to know the different services available and the name that exactly matches each service. There is a tool that can help Linux users navigate through the different services on their system, much as the top command does for the processes running on it.

What is chkservice?

chkservice is a new and handy tool for systemd unit management in a terminal. It is a GitHub project developed by Svetlana Linuxenko. Its particularity is that it lists the different services present on your system: you get a view of each available service and are able to manage it as you want.

Ubuntu (via PPA):

sudo add-apt-repository ppa:linuxenko/chkservice
sudo apt-get update
sudo apt-get install chkservice

Arch (from the AUR):

git clone https://aur.archlinux.org/chkservice.git
cd chkservice
makepkg -si

Fedora (via COPR):

dnf copr enable srakitnican/default
dnf install chkservice

chkservice requires super-user privileges to make changes to unit states or sysv scripts. For a regular user it works read-only.

Package dependencies:

Build dependencies:

Build and install the Debian package:

git clone https://github.com/linuxenko/chkservice.git
mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=/usr ../
cpack

dpkg -i chkservice-x.x.x.deb

Build a release version:

git clone https://github.com/linuxenko/chkservice.git
mkdir build
cd build
cmake ../
make

[Sep 27, 2017] Arithmetic Evaluation

Sep 27, 2017 | mywiki.wooledge.org

Bash has several different ways to say we want to do arithmetic instead of string operations. Let's look at them one by one.

The first way is the let command:

$ unset a; a=4+5
$ echo $a
4+5
$ let a=4+5
$ echo $a
9

You may use spaces, parentheses and so forth, if you quote the expression:

$ let a='(5+2)*3'

For a full list of operators available, see help let or the manual.

Next, the actual arithmetic evaluation compound command syntax:

$ ((a=(5+2)*3))

This is equivalent to let, but we can also use it as a command, for example in an if statement:

$ if (($a == 21)); then echo 'Blackjack!'; fi

Operators such as ==, <, > and so on cause a comparison to be performed, inside an arithmetic evaluation. If the comparison is "true" (for example, 10 > 2 is true in arithmetic -- but not in strings!) then the compound command exits with status 0. If the comparison is false, it exits with status 1. This makes it suitable for testing things in a script.
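This behaviour is easy to check directly in a bash session; $? holds the exit status of the last command:

```shell
#!/usr/bin/env bash
# Comparisons inside ((...)) set the exit status:
# 0 when the comparison is true, 1 when it is false
(( 10 > 2 )); echo $?    # prints 0
(( 10 < 2 )); echo $?    # prints 1
```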

Although not a compound command, an arithmetic substitution (or arithmetic expression ) syntax is also available:

$ echo "There are $(($rows * $columns)) cells"

Inside $((...)) is an arithmetic context, just like with ((...)), meaning we do arithmetic (multiplying things) instead of string manipulations (concatenating $rows, space, asterisk, space, $columns). $((...)) is also portable to the POSIX shell, while ((...)) is not.

Readers who are familiar with the C programming language might wish to know that ((...)) has many C-like features. Among them are the ternary operator:

$ ((abs = (a >= 0) ? a : -a))

and the use of an integer value as a truth value:

$ if ((flag)); then echo "uh oh, our flag is up"; fi

Note that we used variables inside ((...)) without prefixing them with $ -signs. This is a special syntactic shortcut that Bash allows inside arithmetic evaluations and arithmetic expressions.

There is one final thing we must mention about ((flag)). Because the inside of ((...)) is C-like, a variable (or expression) that evaluates to zero will be considered false for the purposes of the arithmetic evaluation. Then, because the evaluation is false, it will exit with a status of 1. Likewise, if the expression inside ((...)) is non-zero, it will be considered true; and since the evaluation is true, it will exit with status 0. This is potentially very confusing, even to experts, so you should take some time to think about this. Nevertheless, when things are used the way they're intended, it makes sense in the end:

$ flag=0      # no error
$ while read line; do
>   if [[ $line = *err* ]]; then flag=1; fi
> done < inputfile
$ if ((flag)); then echo "oh no"; fi

[Sep 27, 2017]