The Ethics of Automation - Page 4
Results 61 to 80 of 80
  1. #61
    Join Date
    Jan 2002
    Location
    West Coast, USA
    Posts
    9,127
    Post Thanks / Like
    Likes (Given)
    601
    Likes (Received)
    6638

    Default

    Quote Originally Posted by CutEdge View Post
    Are you saying, then, that one can justify bad actions in the name of survival?
    We tend to celebrate "moral" actions as ones that perpetuate the survival of our kids, kin, nation, civilization. The husband who shields his family from a bullet. First responders who put themselves at risk. The soldier who defends his homeland. The parent who works two jobs so his kids can have a better life. Religions, too. Imagine celebrating someone for dying on a cross for others. Or Gandhi's willingness to starve until death, in service of the Indian people. Nelson Mandela putting himself at similar risk. Even in the it's-just-a-game world of sports, there's admiration for playing through pain, in service of the team.

    Automation can contribute (positively or negatively) to all of those survive-and-thrive challenges. Oldwrench's A-bomb example (now there's some serious automation of warfare) shows how difficult this can be to sort out. Still, we either sort it out on our own or have someone or something (say Mother Nature and collapse) sort it out for us.

    This reflexive action to attempt to continue the species seems pretty much programmed in, one way or another, in the evolution of all living things. Hard to say any living thing, ourselves included, spends much time trying to justify it. From a biological perspective, nature pretty much only cares if you or I survive long enough to successfully raise another generation. From a cultural evolution perspective, we older folks might still be useful if we help our community or civilization survive and thrive a bit longer.

  2. Likes Scottl liked this post
  3. #62
    Join Date
    Apr 2018
    Country
    UNITED STATES MINOR OUTLYING ISLANDS
    Posts
    7,535
    Post Thanks / Like
    Likes (Given)
    0
    Likes (Received)
    3672

    Default

    Quote Originally Posted by PeteM View Post
    From a biological perspective, nature pretty much only cares if you or I survive long enough to successfully raise another generation.
    Not even that. You, as a person, are only the tool your chromosomes use to continue their life.

  4. Likes JST liked this post
  5. #63
    Join Date
    Mar 2006
    Location
    Vershire, Vermont
    Posts
    2,666
    Post Thanks / Like
    Likes (Given)
    1877
    Likes (Received)
    892

    Default

    Quote Originally Posted by CutEdge View Post
    Are you saying, then, that one can justify bad actions in the name of survival?
    Safe to say "bad" is an overused term. Useful to paint a quick picture or assign moral responsibility, but in the end, mostly distracting and unproductive.

    Talk about it in terms of actions and consequences and problems/solutions become evident. Makes it more complicated, so a lot of people who want life to be simple and easy get bored.

  6. #64
    Join Date
    May 2015
    Country
    UNITED STATES
    State/Province
    North Carolina
    Posts
    68
    Post Thanks / Like
    Likes (Given)
    6
    Likes (Received)
    11

    Default

    What we do, and what we should do, are two very different things. If "bad" and "moral" and "good" are meaningless abstractions, as others have implied, then so are "useful" and "productive" and "survival". Extending the terms past the individual and on to the children, the society, the species, etc... is just moving the goal posts. Mortality is absolute, regardless of scale, so I maintain that survival does not justify harmful actions (even as a people... even as a species).

    This thread is predicated on the assumption that ethics actually exist, and are not meaningless abstractions. Going forward, I would like to hear from anyone who accepts this premise. I genuinely want to know how to do automation in a way that is good, and not bad.

  7. #65
    Join Date
    Feb 2019
    Country
    UNITED STATES
    State/Province
    New York
    Posts
    443
    Post Thanks / Like
    Likes (Given)
    0
    Likes (Received)
    410

    Default

    I feel like your assumption that automation is putting people out of work isn't exactly the case. I'm sure it is happening some, however automation for the most part is filling job vacancies. Companies can't fill the positions they need so they get robots to do the simplest tasks. There are still plenty of skilled jobs out there unfilled, and plenty of opportunities to get training. When I decided to go to community college at 25 I was already making 40-50k a year with a wife/kid/mortgage. The federal financial aid was enough to cover my tuition, a great help to me at the time. Had my income been less I could have gotten books/food/housing paid for. The problem is people don't want to put in 80-100 hours a week to be successful. They'd rather point the finger at someone else and whine that their gravy train assembly line job is gone, and for that I have no sympathy.

  8. #66
    Join Date
    Mar 2006
    Location
    Vershire, Vermont
    Posts
    2,666
    Post Thanks / Like
    Likes (Given)
    1877
    Likes (Received)
    892

    Default

    Quote Originally Posted by CutEdge View Post
    What we do, and what we should do, are two very different things. If "bad" and "moral" and "good" are meaningless abstractions, as others have implied, then so are "useful" and "productive" and "survival".....
    Uh, not what I meant, but maybe you're referring to someone else. "Good", "moral" are useful shorthand, but don't suggest solutions. They're initial direction givers at best, mostly distractions, red herrings from the tasks and goals at hand.

    ...Extending the terms past the individual and on to the children, the society, the species, etc... is just moving the goal posts. Mortality is absolute, regardless of scale, so I maintain that survival does not justify harmful actions (even as a people... even as a species).....
    How far do you take this? Serious question, directed at you personally, just as example. Are you a Jain? Don't wear shoes because they might crush an insect? (That's the literal interpretation of your statement.) Or merely pacifist? Do you consider stepping on a bacterium a "harmful action"? Do you realize that existing as a human in the 21st century is inevitably harmful to other species, even other humans? Maybe the only way to prevent harm to other species is to commit suicide, except that would be harmful.

    Fussy, I know, but this is the main contradiction with morals and ethics. Is it ethics and morals for me/us, or for someone/something else?

    Define/analyze a process and define the goals (the hard part), work towards the goal and all that moral/ethics stuff goes away. Actions and consequences, actions and consequences.

  9. #67
    Join Date
    Aug 2005
    Location
    Toronto
    Posts
    5,691
    Post Thanks / Like
    Likes (Given)
    3783
    Likes (Received)
    4468

    Default

    Quote Originally Posted by CutEdge View Post
    What we do, and what we should do, are two very different things. If "bad" and "moral" and "good" are meaningless abstractions, .
    You are bundling too many potentially conflicting subject areas and ending up with a confused mess of an argument.

    Accept for a second that sound and knowledgeable economics doesn't judge. Economists don't say things like so-and-so makes too much money. The tenet is that without undue coercion, in an open market and within the law, it's silly (in economics) to judge. As soon as you impose what you think is right and wrong, you're not then unbiasedly studying the subject, you're studying how the world conforms to your views and you miss the truth. Good and bad only work if I get to be the judge, sort of thing.

    Most would agree the goal of economic activity is greater output for a given input, i.e. more wealth, prosperity, standard of living increases and so on. It's a laudable goal that has made our lifestyles (market-based capitalism) the best on the planet.

    As for "right and wrong", "good and bad", that's the responsibility of social structures - i.e. government. There is a key reason for this. There's an endless list of conflicts between what is best for a business's economic success and what is consider good (by society at large). Say dumping chemicals in the river; makes for greater profit, but society would consider it bad. Here's the key.....success in business and the wealth it has brought depends on decentralized decision making. Moral rights depend on centralized decision making - by definition its something that comes from the collective. See? we don't each get to decide if murder is ok, its imposed on us as a centralized distillation of society's collective moral values. Same with dumping chemicals in the river.

    When you expect a business to do the "right" thing (according to whom?) over the best economic thing (when the two are in conflict), you've moved to fairy-tale land if you are expecting anything but a disaster. Why? Because you've then created a system that rewards bad behaviour. The companies that choose to do the bad thing, say chemicals in the river, will gain economic ground by doing it. And since the model depends on decentralized economic decision making, it's guaranteed that some will do the bad thing and gain share for it. Expecting otherwise is dumb, rose-coloured-glasses wishful thinking.
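    To put some toy numbers on the "rewards bad behaviour" point (everything below is made up, just to show the mechanism, not real data): suppose one firm pays to treat its waste, one dumps it in the river, and buyers simply chase the lower price.

    ```python
    # Toy sketch, hypothetical numbers throughout: the firm that skips the
    # compliance cost can undercut on price, and price-chasing buyers hand it
    # market share year after year.

    unit_cost_clean = 10.00   # firm that pays to treat its waste
    unit_cost_dirty = 9.00    # firm that dumps in the river
    margin = 1.00             # both firms take the same profit per unit

    price_clean = unit_cost_clean + margin
    price_dirty = unit_cost_dirty + margin

    share_clean = 0.5         # start with the market split evenly
    shift_per_dollar = 0.05   # share lost per year, per dollar of price gap

    for year in range(1, 11):
        share_clean = max(0.0, share_clean - shift_per_dollar * (price_clean - price_dirty))
        print(f"Year {year:2d}: clean firm's market share = {share_clean:.2f}")
    ```

    With numbers like these the "clean" firm is squeezed out within a decade, which is exactly why the rule has to be imposed centrally on everyone rather than left to each firm's conscience.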

    Fortunately, as law trumps business (well, except maybe for Canadian banking), decentralized business decisions, despite being motivated by profit, have to adhere to society's central decisions on good and bad. The results you get are always systemic; since law trumps business, look to law, government and our crappy leaders if you're not getting the results you want. The dull bulbs blame business, but mostly business does exactly what you'd expect it to do, given that it exists to pursue profits.

    As for automation, stop the hand-wringing and look at facts. We have full employment despite 200 years of automation, and living standards are higher than ever. Even the A-bomb isn't an argument against automation; government defined a goal and government used the bomb (despite arguments it was unnecessary at that juncture). (Remember, government defines right and wrong.) However horrible the goal, the automation cost a lot less than if they had commanded soldiers to go in and kill every man, woman and child in Hiroshima. It's business that automates for greater productivity; it's government that defines good and bad.

    Unless you can seriously say you'd rather give the men shovels, give your head a shake on thinking automation is bad or wrong. Or if someone is really opposed to it, go spend some time where they don't have much automation, i.e. some of the sub-Saharan countries, and see if you still feel the same way.

  10. Likes neilho liked this post
  11. #68
    Join Date
    Jul 2019
    Country
    UNITED STATES
    State/Province
    Florida
    Posts
    3,540
    Post Thanks / Like
    Likes (Given)
    3667
    Likes (Received)
    717

    Default

    Quote Originally Posted by CutEdge View Post
    Not to put a damper on Progress, but I think it's good to take a step back sometimes and make sure we're going in the right direction.

    Is automation good or bad?

    Some points to consider are as follows:

    Pros:
    • Automation can make things cleaner, safer, more efficient, and improve quality.
    • Automation can eliminate boring, tedious, or dangerous tasks.
    • Automation can be an alternative to sending jobs overseas where labor standards are lower.


    Cons:
    • Automation can eliminate low-skill jobs, meaning poor, under-educated, or otherwise disadvantaged people now have no way to contribute.
    • Automation can put more and more control in the hands of fewer and fewer people.
    • Automation work puts us increasingly in front of computers, reducing the physical engagement of our whole bodies and all our senses in our work.
    Thanks for posting this question and issue. I think automation is very expensive and this is a natural cap on its growth. It is what it is. We benefit in many ways from automation. We also want people to prosper and make a living, not be thrown out of work because automation has replaced their jobs.

    Someone should put such labor to doing something useful, as those people are like an untapped resource looking for useful employment.

    The effect on human workers, good and bad, is of persistent concern to me. Making a living is nice because I can buy food and shelter. It is not too much for a person to ask that they be able to survive with their families, with choices, if automation takes the jobs they did.

  12. #69
    Join Date
    Mar 2006
    Location
    Vershire, Vermont
    Posts
    2,666
    Post Thanks / Like
    Likes (Given)
    1877
    Likes (Received)
    892

    Default

    Quote Originally Posted by Mcgyver View Post
    ...As for "right and wrong", "good and bad", that's the responsibility of social structures - i.e. government.....
    And religion?

  13. #70
    Join Date
    Aug 2005
    Location
    Toronto
    Posts
    5,691
    Post Thanks / Like
    Likes (Given)
    3783
    Likes (Received)
    4468

    Default

    Quote Originally Posted by neilho View Post
    And religion?
    I used social structures as a catch-all phrase for presenting the main dichotomy, centralized vs decentralized decision making: one best suits economics, the other best suits judging right/wrong. What the centralized social structure is, where values/morals get defined, kind of doesn't matter to the point: it's stupid to look to commerce to lead moral/ethical behaviour; that needs to come from a centralized decision/source.

    Since you asked, I'm OK with people thinking they get moral guidance from religion, but my personal belief is that it's BS. A lot of religion teaches some moral values, however I believe religion is entirely a man-made set of tales and beliefs (vs dictated/constructed by a deity). If it's all man-made (which I believe is the only position critical thinking can lead to, as there is zero evidence to support any other theory), religion is just a projection of human/societal values - as per the German philosopher Feuerbach. I.e. religion isn't creating the values, it's reflecting them, with perhaps some agenda-serving editorializing. Furthermore it's not really central; it has many factions which manipulate things for all kinds of agendas other than sound morals. Lastly, so many of the values are antiquated or just plain wrong: stoning a woman to death, killing those not of the true faith, religious wars justified because god is on our side, or simple crap like refusing to sit next to a woman on a plane. Thanks but no thanks, religion is not a beacon of values I want to subscribe to.

  14. Likes neilho, tdmidget liked this post
  15. #71
    Join Date
    May 2015
    Country
    UNITED STATES
    State/Province
    North Carolina
    Posts
    68
    Post Thanks / Like
    Likes (Given)
    6
    Likes (Received)
    11

    Default

    Quote Originally Posted by AJ H View Post
    I feel like your assumption that automation is putting people out of work isn't exactly the case. I'm sure it is happening some, however automation for the most part is filling job vacancies. Companies can't fill the positions they need so they get robots to do the simplest tasks. There are still plenty of skilled jobs out there unfilled, and plenty of opportunities to get training.
    It's more a question than an assumption. Anyway, I'm starting to come to the conclusion that you are correct. Generally speaking, automation isn't putting people out of jobs. I have seen companies' reactions to eliminating tasks: they often move people around to other positions, to their credit. Usually when the jobs go away it has more to do with off-shoring and consolidation. Automation may be a way to counteract this; in that sense, it's a positive.

    There was one time I streamlined a process and saw someone laid off because of it, where the process improvement itself was the reason given for his termination. However, in this case, I think the reason given was a bit deceitful. There was an underlying story involved, and I think "that operation went away" was a convenient excuse to remove someone who was not particularly welcome for other reasons. Of course, I don't know exactly why it happened, so I can't say for sure. But it is likely that, even in this case, automation was not truly to blame.


    Quote Originally Posted by neilho View Post
    How far do you take this? Serious question, directed at you personally, just as example. Are you a Jain? Don't wear shoes because they might crush an insect? (That's the literal interpretation of your statement.) Or merely pacifist? Do you consider stepping on a bacterium a "harmful action"?
    To answer neilho, you caught my mistake. I was using "harm" as a synonym for "bad"... but the two terms are not always the same. No, not all harmful things are bad things. The nature of good and bad is another topic; but whatever is bad, survival does not make it good. If I do something bad, and survive because of it, I have still done something bad... the end does not justify the means. Sometimes I have no choice but to do something bad; then what I need is forgiveness, not self-justification.

  16. #72
    Join Date
    Mar 2006
    Location
    Vershire, Vermont
    Posts
    2,666
    Post Thanks / Like
    Likes (Given)
    1877
    Likes (Received)
    892

    Default

    Quote Originally Posted by CutEdge View Post
    ...To answer neilho, you caught my mistake. I was using "harm" as a synonym for "bad"... but the two terms are not always the same. No, not all harmful things are bad things. The nature of good and bad is another topic; but whatever is bad, survival does not make it good. If I do something bad, and survive because of it, I have still done something bad... the end does not justify the means. Sometimes I have no choice but to do something bad; then what I need is forgiveness, not self-justification.


    I appreciate the response. Not a mistake, to my mind, it just exists. The nature of good and bad and their extensions, I think, is what we're discussing. If one bypasses the moral trappings those terms imply, what happened is what happened. One can then ponder the consequences of the actions.

    Take, for example,

    "If I do something bad, and survive because of it, I have still done something bad... the end does not justify the means
    You did what you did. Under the circumstances, probably understandable to anyone inquiring. I would argue that the end does justify the means. If...."the end" includes considering all the consequences of actions, including hurt feelings and death.

    Morals, good, bad, are pretty tricky. Like situational ethics, constantly changing depending on circumstance. Life becomes much more direct, and easier and simpler, if they're just skipped.

  17. Likes Mcgyver liked this post
  18. #73
    Join Date
    May 2015
    Country
    UNITED STATES
    State/Province
    North Carolina
    Posts
    68
    Post Thanks / Like
    Likes (Given)
    6
    Likes (Received)
    11

    Default

    Quote Originally Posted by Trueturning View Post
    The effect on human workers, good and bad, is of persistent concern to me. Making a living is nice because I can buy food and shelter. It is not too much for a person to ask that they be able to survive with their families, with choices, if automation takes the jobs they did.
    What do you think companies can do to make sure that this happens in practice?

    I think it should be baked into the Return On Investment calculations. If we factor in the cost of, say, training someone to do a different task after her job is replaced by a robot, then the arrangements are already clearly stated up front. But I don't know what's best, so I'd like to hear your ideas.
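    For what it's worth, here's a rough sketch of the arithmetic I have in mind, in Python with made-up numbers (the robot cost, savings, and retraining figures are purely hypothetical):

    ```python
    # Toy payback calculation for an automation project, treating the cost of
    # retraining the displaced operator as part of the project cost.
    # All figures are hypothetical placeholders.

    robot_cost = 120_000      # purchase + integration
    annual_savings = 45_000   # labor, scrap, and throughput gains per year
    retraining_cost = 8_000   # courses plus ramp-up time for the displaced operator

    payback_ignoring_retraining = robot_cost / annual_savings
    payback_including_retraining = (robot_cost + retraining_cost) / annual_savings

    print(f"Payback ignoring retraining:  {payback_ignoring_retraining:.2f} years")
    print(f"Payback including retraining: {payback_including_retraining:.2f} years")
    ```

    With numbers in that range the retraining line item barely moves the payback period, so stating it up front costs the project very little.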

  19. #74
    Join Date
    Feb 2012
    Location
    California
    Posts
    1,582
    Post Thanks / Like
    Likes (Given)
    972
    Likes (Received)
    1785

    Default

    Tangential, but very much relevant and worth a listen:

    Latest episode from the Freakonomics Podcast - Minimum Wage

    Some interesting tidbits in there. Like how the poorest people in the country (US) don't work at all, so raising the federal minimum wage to $15/hr doesn't help them.

  20. Likes Oldwrench liked this post
  21. #75
    Join Date
    Jan 2002
    Location
    West Coast, USA
    Posts
    9,127
    Post Thanks / Like
    Likes (Given)
    601
    Likes (Received)
    6638

    Default

    Quote Originally Posted by Orange Vise View Post
    Tangential, but very much relevant . . ..
    Thanks for that link.

    Another interesting tidbit -- a $15 minimum wage doesn't help or hurt as much as supporters or opponents might think. It neither ends the world nor ushers in a new dawn.

    What data there are show that it doesn't much hurt businesses, because they adapt by cutting back hours and overloading what staff they have. It helps the competent working poor earn a tiny bit more. It doesn't help the really poor -- because no one is hiring them at $7 or $15 an hour. Meanwhile, competent labor around here is making $25 an hour -- busy -- and hard to find.

  22. Likes Oldwrench liked this post
  23. #76
    Join Date
    Aug 2005
    Location
    Toronto
    Posts
    5,691
    Post Thanks / Like
    Likes (Given)
    3783
    Likes (Received)
    4468

    Default

    Quote Originally Posted by Orange Vise View Post
    Latest episode from the Freakonomics Podcast - Minimum Wage
    I've always liked Levitt and Dubner; they've made economics entertaining. The economist's approach just has so much virtue: let's try to distill meaning from data using critical thinking and leave all the emotion, prejudices and personal agendas on the coat rack. Imagine if most governance worked that way.....vs what we have? Promise/spend/promise/spend/promise/spend, with the odd break for pocket lining.

  24. #77
    Join Date
    Jun 2001
    Location
    St Louis
    Posts
    19,136
    Post Thanks / Like
    Likes (Given)
    2332
    Likes (Received)
    3555

    Default

    Quote Originally Posted by CutEdge View Post
    Are you saying, then, that one can justify bad actions in the name of survival?

    Legally, sure. Self-defense. An absolute get-out-of-jail card (especially if you have a blue uniform on).

    If you threaten me in such a way that only one of us can survive, well, you are toast. I'm gonna come out on top, or neither of us will.

    That has the sanction of society, and is built into the legal codes.

  25. #78
    Join Date
    Nov 2013
    Location
    Eastern Massachusetts, USA
    Posts
    6,909
    Post Thanks / Like
    Likes (Given)
    5911
    Likes (Received)
    6311

    Default

    Quote Originally Posted by AJ H View Post
    I feel like your assumption that automation is putting people out of work isn't exactly the case. I'm sure it is happening some, however automation for the most part is filling job vacancies. Companies can't fill the positions they need so they get robots to do the simplest tasks. There are still plenty of skilled jobs out there unfilled, and plenty of opportunities to get training. When I decided to go to community college at 25 I was already making 40-50k a year with a wife/kid/mortgage. The federal financial aid was enough to cover my tuition, a great help to me at the time. Had my income been less I could have gotten books/food/housing paid for. The problem is people don't want to put in 80-100 hours a week to be successful. They'd rather point the finger at someone else and whine that their gravy train assembly line job is gone, and for that I have no sympathy.
    While I think you are for the most part correct, ultimately automation could take all but the most creative jobs when it becomes advanced enough. Currently house construction provides a decent living for many. Imagine a future where a portable machine based on AI takes lumber and makes all the cuts and piercings for plumbing and wiring needed from the CAD file, imprints the pieces with identifying marks, orientation marks, and even marks to guide an automated nailer. Then a second machine places the pieces in exact position while a robotic arm nails them in place. Sheathing cut by another "smart" machine would then also be nailed on by the robotic arm after switching nailers.

    At some point humans would be needed to fasten these sub-assemblies to other sections but their labor input would be greatly reduced.

    What I have just described is a possible future of on-site "factory built" homes where the mobile factory units are on trailers and brought to the job site. Given that there would be a strong financial incentive to replace higher-wage workers, IMO it is only a matter of time. The question we need to ask ourselves as a society is: do we create a dystopian future where most people are out of work after being displaced by automation, or do we create a future where automation augments still-valued and useful employees?

  26. Likes TeachMePlease, CutEdge liked this post
  27. #79
    Join Date
    Aug 2008
    Location
    near Cleveland
    Posts
    906
    Post Thanks / Like
    Likes (Given)
    107
    Likes (Received)
    142

    Default

    Quote Originally Posted by Scottl View Post
    Imagine a future where a portable machine based on AI takes lumber and makes all the cuts and piercings for plumbing and wiring needed from the CAD file, imprints the pieces with identifying marks, orientation marks, and even marks to guide an automated nailer. Then a second machine places the pieces in exact position while a robotic arm nails them in place. Sheathing cut by another "smart" machine would then also be nailed on by the robotic arm after switching nailers.
    Today, construction with SIP panels is about that fast. On a prepared foundation, house in a day, or nearly so.

  28. Likes Ox liked this post
  29. #80
    Join Date
    Mar 2019
    Country
    UNITED STATES
    State/Province
    Wisconsin
    Posts
    485
    Post Thanks / Like
    Likes (Given)
    39
    Likes (Received)
    210

    Default

    Quote Originally Posted by britanyweel View Post
    Ethics can be a sensitive topic when we are discussing technology and anything related to it. I remember that we were having a conversation at university about robots and how are they compared to people. There are many types of robots, like PLC and it's really interesting to discuss this matter. I am really curious about the future. What it will bring? Are robots going to have the ability to think like humans when it comes to morality or free will? It's somehow scary to create a piece of machinery like this, but we are evolving as a nation and we are going to see what's next.
    "Ethics, morality and free will" will probably be subject to controls established by Government.

    Artur Pawlowski, a Canadian, provides some interesting observations.

