People are losing their jobs as machines enable fewer workers to produce what formerly required many. But there’s nothing new in that. It’s happened over and over.
In the past, as the need for people dropped in one industry, new industries appeared and needed people to do new things. After some uncomfortable dislocations, there have always been new jobs. Employment optimists say that this pattern will repeat.
But nearly invisibly, every time this has happened, some people have been left behind. They’ve dropped below the “employment waterline,” a term that I found in this essay by Scott Alexander:
In prehistoric days, everyone could
…just hang out and live in a cave and gather roots and berries and maybe hunt buffalo and participate in the appropriate tribal bonding rituals like everyone else.
But society came and paved over the place where all the roots and berry plants grew and killed the buffalo and dynamited the caves and declared the tribal bonding rituals Problematic. This increased productivity by about a zillion times, so most people ended up better off. The only ones who didn’t were the ones who for some reason couldn’t participate in it.
The number who could not participate was not large, but–due to rising machine intelligence–it is starting to grow. And it will grow exponentially.
Imagine an employment waterline, gradually rising through higher and higher levels of competence. In the distant past, maybe you could be pretty dumb, have no emotional continence at all, and still live a pretty happy life. As the waterline rises, the skills necessary to support yourself comfortably become higher and higher.
The employment waterline is rising, rising exponentially, and we are at the knee of the curve.
Some skills can be acquired, of course. People who had sufficient skills to do old-style farm work (planting and hoeing) could be retrained to do old-style factory work (lifting, fitting, and bolting) and make an adequate living. There are still some farm jobs (picking in the fields) that don’t require literacy, but there are few factory jobs in the developed economies that an illiterate can hold. And there are fewer and fewer jobs of all kinds for those who can’t acquire the skills of reading and writing.
They are below the waterline.
To get a good job increasingly requires the ability to learn and a store of things already learned. So where’s the waterline today? To simplify things, let’s consider a number that measures a person’s general ability to learn. Let’s call it the LQ. And let’s ignore the amount already learned and concentrate on the limits imposed by your LQ.
So: if your LQ is below 10, you can’t learn to read and write, and you aren’t qualified for any job that requires even basic literacy.
If your LQ is below 30, you can learn rote procedures, although slowly, but you can’t learn problem-solving techniques or the higher-order skill of choosing the right technique to apply to a particular problem. If the job requires that, you’re below the waterline.
If your LQ is above 30, you can learn how to teach an intelligent humanoid machine (one with the ability to manipulate tools, even if it does not look like a person) to carry out a rote procedure. You’re above the waterline, but everyone with an LQ below 30 is now below it.
IBM’s Watson has become better at diagnosing cancer than doctors, but Watson has not been programmed to diagnose cancer. Instead, it has been programmed to read, analyze, and organize a corpus of data about cancer; it then tests its diagnostic skills using whatever algorithm it started with and refines that algorithm based on its results.
Watson itself has a high LQ, maybe 50 or 80. Whatever it is, it’s enough to learn to do the job of a doctor better than most doctors. And that means it can put everyone with a lower LQ out of work.
If your LQ is above some number, let’s say 100, you might have a job: increasing the ability of Watson-class machines to learn faster and better, raising their effective LQ and making more and more people unemployable.
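Treating the LQ story as nothing more than a toy model, here is a minimal sketch of how the rising waterline behaves. The thresholds (10, 30, 100), the category labels, and the function names are illustrative assumptions that mirror the examples above, not measurements of anything real:

```python
# Toy sketch of the LQ "employment waterline" idea described above.
# Thresholds, labels, and a fixed machine LQ are illustrative assumptions only.

def jobs_open_at(lq: float) -> list[str]:
    """Return the hypothetical job categories still open at a given LQ."""
    categories = [
        (10, "jobs requiring basic literacy"),
        (30, "jobs requiring problem solving, or teaching rote work to machines"),
        (100, "jobs that improve how Watson-class machines learn"),
    ]
    return [label for threshold, label in categories if lq >= threshold]

def below_waterline(lq: float, machine_lq: float) -> bool:
    """The waterline sits wherever the machines' effective LQ has risen to."""
    return lq < machine_lq

if __name__ == "__main__":
    WATSON_LQ = 50  # assumed, per the guess of "maybe 50 or 80" above
    for lq in (5, 20, 50, 120):
        print(f"LQ {lq}: open categories = {jobs_open_at(lq)}, "
              f"below waterline = {below_waterline(lq, WATSON_LQ)}")
```

The point of the sketch is only that the model has one moving part: as the machines’ effective LQ climbs, the same comparison leaves more and more people below the line.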
The flaw in this argument is that LQ is unidimensional and learning is not. For example, some people can learn math and logic, but they can’t learn social skills. So no matter how good computers get at mathematical/logical/structured learning, they won’t master social skills.
But computers are learning social skills as well. The job of being a personal assistant to a busy professional involves some logical, structured activity (checking a calendar for conflicts) and some social skills: carrying on an email conversation to negotiate a time and place for a meeting.
And here’s an intelligent machine that does that job, reviewed here. It’s a fairly skilled job, as the reviewer points out:
The average personal assistant is paid about $31,000 a year, while the average wage for the U.S. job market is roughly $5,000 less than that.
So he “hired” Clara, an AI from claralabs.com (the link is my referral link, so if you ask to be put on the waiting list, I get credit and get moved closer to the front of the waiting list for Clara. A pretty brilliant viral marketing scheme, probably created by a person, not an AI.)
The reviewer continued:
Over the course of the month, Clara did the unthinkable for a robot: she always passed for human. I’d CC Clara to set up an appointment, only to ask my contact later if they’d realized she was an AI. Not one person did.
Warehouse employees are being replaced by intelligent machines that do the job better and cheaper than humans. Truck and cab drivers will soon be replaced by intelligent devices that do the job better and cheaper than humans. Some doctors will be replaced by intelligent devices that do the job better and cheaper than humans. Journalists are being replaced by intelligent devices that write run-of-the-mill news stories better and cheaper than humans. Administrators are being replaced by intelligent devices that are better and cheaper than humans.
This is just the beginning.
This time it’s going to be different.