mmm....shiney! said:
Big A.D. said:
Well, that's where we disagree: there are plenty of things you can do to make a profit and not all of them are socially acceptable.
We define those things by expressing our values because what might seem okay now might well have very serious repercussions down the track, particularly since "currently profitable" and "not a nice situation to find ourselves in" are two very different things.
What is socially acceptable is largely determined by the objective nature of morality, i.e. a set of core common principles or truths that are held universally. These principles are protected and upheld by law (or at least should be). What you are alluding to is that in the drive to gain reward by meeting the needs of individuals (profit), entrepreneurs and business people may engage in activities that some would consider unacceptable. Well, so what? As long as those entrepreneurs or business people don't cross the line and transgress the universal truths, then since value is subjective, it's an entirely personal decision whether someone engages in what others would consider socially unacceptable practices. Therefore it's a non-issue.
It's not a non-issue because, like I said above, profit
now isn't the ultimate predictor of whether we get the best outcome.
For example, is it moral to develop a machine that will put a hundred million people out of work?
Who cares. It'll happen anyway. And value is subjective. Fine.
What do you do with a hundred million people who suddenly find themselves out of a job though? Some can re-train for something else (but some can't), and then someone else invents another machine that makes the job they're re-training for redundant and they're back at square one again. What are they supposed to live on in the meantime? How do they obtain food and shelter?
The change is inevitable, but what happens if it occurs too fast for society to keep up with? If every entrepreneur out there is working as hard as they can to automate one little particular thing that will make them themselves very rich there's going to be a compounding effect of great new technology being used to make even greater new technology and so on. We know how this goes: everything ticks along slowly for a while and then there's a point of critical mass and everything just rockets along.
So how do we, as a society,
manage the change so there aren't as many unwanted side effects (like millions of hungry surplus workers living on the streets)? A robot tax? Progressively fewer "standard" working hours? Universal incomes?
Sipping the margaritas while robots do the work sounds fine to me. Not having riots and wars first would be nice.
Big A.D. said:
If the whole concept of profit is based on a capitalist system being the best way of managing scarce resources (and let's assume it is), what happens if resources are no longer scarce?
More specifically, what if everybody could sit around drinking those margaritas because money has become an anachronism but instead we have a handful of uber-rich entrepreneur elites and millions of people living in poverty because work is still the way people are expected to earn money but all the work is being done by robots and AIs? I'm not saying there would need to be a neo-Luddite revolution to smash all the robots, just that we'll need to think about what our values are because a lot of what we do every day is going to become redundant.
Under such a system, resources that are no longer scarce would command little value. If they were a key ingredient in whatever process is involved in meeting the needs of individuals, their cost would reflect their abundance. In your scenario, if labour were truly abundant and no longer scarce, then the cost of labour would be negligible and people would be working for a pittance.
Exactly. Or, alternatively, money stops being a good reflection of perceived value since not enough people have enough of it for it to mean much.
Maybe.
Before accepting your scenario as plausible though, you'd firstly have to ask how we arrived at having a handful of elite mega-rich in the first place with the will and capacity to control the lives of others
Because we thought - at the time - giving entrepreneurs free rein to make profits was the only way technology would ever advance, that technological advances are always a good thing, and that thinking about the future was a waste of time because, hey, who knows what will happen?
secondly, why would these elite mega-rich bother to continue using robots if the cost of labour was so cheap? Or forced? :/
Because once the robots are there, they're there. They do the job they were designed to do and dealing with human workers is a pain in the arse (they complain, they want things, they need lunch and toilet breaks, they have
ideas that mess with the smooth running of the business, etc.)
Seriously, anyone can invent any possible dystopian future; after all, it requires very little imagination to imagine bogey men and bunyips lurking around every corner or waterhole. Entrepreneurial imagination, on the other hand, is not shared so broadly in the community. It needs to be dreamed and practised unhindered, and it should be profited from.
Entrepreneurial imagination may not be such a common trait now, but what happens when an entrepreneur develops an entrepreneurial AI? Then it will be common. Or rather, there will be one instance of it, but it will have access to so much knowledge that it will be able to replace every human entrepreneur. Seriously, it's not a huge stretch. First you develop an "intelligence" that can analyse weather data and make accurate predictions, then you plug it into the stock market and get it to trade wheat futures, and then you just keep going to the next logical step in development again and again and again.
Dystopian futures are what science fiction authors write about to remind us that we need to think about our choices
before things go horribly wrong.