• N3Cr0@lemmy.world · +86/−4 · 1 year ago

    I predict a huge demand of workforce in five years, when they finally realized AI doesn’t drive innovation, but recycles old ideas over and over.

    • chemical_cutthroat@lemmy.world · +26/−13 · 1 year ago

      “Workforce” doesn’t produce innovation, either. It does the labor. AI is great at doing the labor; it excels at mindless, repetitive tasks. AI won’t be replacing the innovators, it will be replacing the desk jockeys who do nothing but update spreadsheets or write code.

      What I predict we’ll see is the floor dropping out of technical schools that teach the things AI will be replacing. We are looking at the last generation of code monkeys. People joke about how bad AI is at writing code, but give it the same length of time as a graduate program and see where it is. Hell, GPT-3 has only been around since June of 2020, and that was the beta (just 13 years after the first iPhone, and look how far smartphones have come).

      There won’t be a huge demand for workforce in 5 years; there will be a huge portion of the population that suddenly won’t have a job. It won’t be like the agricultural or industrial revolution, where it takes time to make its way around the world, or where there is some demand for artisanal goods. No one wants artisanal spreadsheets, and we are too global now not to outsource our work to the lowest bidder with the highest thread count. It will happen nearly overnight, and if the world’s governments aren’t prepared, we’ll see an unemployment crisis like never before. We’re still in “fuck around.” “Find out” is just around the corner, though.

      • ozmot@lemmy.world · +19 · 1 year ago

        Even mindless and repetitive tasks require instances of problem solving far beyond what AI is capable of. In order to replace 41% of the workforce you’d need AGI, and we don’t know if that’s even possible.

        • assassin_aragorn@lemmy.world · +7 · 1 year ago

          Let’s also not forget that execs are horrible at estimating work.

          “Oh, this’ll just be a copy-paste job, right?” No, you idiot, this is a completely different system, and because of xyz we can’t just copy everything we did on a different project.

        • msage@programming.dev · +2 · 1 year ago

          It was 41% of execs saying the workforce will be replaced, not that 41% of the workforce will be replaced.

        • It’s not replacing people outright; it means each person is capable of doing more work, so we only need 41% of the people to achieve the same output. It will crash the job market. Global productivity and production will improve, then the AI will be updated; repeat. It’s just a matter of whether we can scale industry to match the total production capacity of people with AI assistance fast enough to keep up. Both of these things are currently exponential, but the lag may cause a huge unemployment crisis in the meantime.
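
          (Editor’s note: the arithmetic implied here, as a rough sketch. If AI assistance multiplies each person’s output by some factor, the fraction of the workforce needed for constant output is the reciprocal. The 2.44× multiplier below is a hypothetical value chosen only so the result lands near the 41% figure in the article.)

```python
# Back-of-envelope headcount math at constant total output.
# The 2.44x productivity multiplier is an assumption, picked so that
# 1 / 2.44 comes out to roughly 41% of the original workforce.
def headcount_fraction(productivity_multiplier: float) -> float:
    """Fraction of the original workforce needed for the same total output."""
    return 1.0 / productivity_multiplier

print(round(headcount_fraction(2.44), 2))  # roughly 0.41
print(round(headcount_fraction(2.0), 2))   # a 2x multiplier would mean 0.5
```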

          • localme@lemm.ee · +1 · 1 year ago

            In this potential scenario, instead of axing 41% of people from the workforce, we should all get 41% of our lives back. Productivity and pay stay the same while the benefits go to the people instead of the corporations for a change. I know that’s not how it ever works, but we can keep pushing the discussion in that direction.

        • richmondez@lemmy.world · +1/−5 · 1 year ago

          We are walking, talking general intelligences, so we know it’s possible for them to exist; the question is more whether we can implement one using existing computational technology.

      • jaybone@lemmy.world · +13 · 1 year ago

        I’ve worked with humans who have computer science degrees and 20 years of experience, and some of them have trouble writing good code, debugging issues, communicating properly, and integrating with other teams/components.

        I don’t see “AI” doing this. At least not these LLM models everyone is calling AI today.

        Once we get to Data from Star Trek levels, then I can see it. But this is not that. This is not even close to that.

    • afraid_of_zombies@lemmy.world · +2 · 1 year ago

      but recycles old ideas over and over.

      I am so glad us humans don’t do that. It’s so nice going to a movie theater and seeing a truly original plot.

  • pjwestin@lemmy.world · +60/−1 · 1 year ago

    In my experience, 100% of executives don’t actually know what their workforce does day-to-day, so it doesn’t really surprise me that they think they can lay people off because they started using ChatGPT to write their emails.

  • xantoxis@lemmy.world · +43/−4 · 1 year ago

    Well it’s good to know 59% of execs are aware that AI isn’t gonna change shit

    • kromem@lemmy.world · +6 · 1 year ago

      Some of that 59% might, but I guarantee at least some very strongly think it will change things; they just expect the change to require as many people as before (if not more), doing exponentially more work with the staff they have.

    • kromem@lemmy.world · +5 · edited · 1 year ago

      Yes.

      The biggest factor in terms of job satisfaction is your boss.

      There’s a lot of bad bosses.

      AI will be an above average boss before the decade is out.

      You do the math.

  • febra@lemmy.world · +22/−1 · edited · 1 year ago

    Can’t wait for AI to replace all those useless execs and CEOs. It’s not like they do much anyway, except fondling their stocks. They could probably be automated by a Markov chain.

    • afraid_of_zombies@lemmy.world · +2 · 1 year ago

      Don’t get a job in government contracting. Pretty much, I do the work and around 5 people have suggestions, none of whom I can tell to fuck off directly.

      Submit the drawing. Get asked to make a change to align with a spec. Point out that we took exception to the spec during bid. Get asked to make the change anyway. Make the change. Get asked to make another change by someone higher up the chain of five. Point out change will add delays and cost. Told to do it anyway. Make the next change…

      Meanwhile, every social scientist: “we don’t know what is causing cost disease.”

  • randon31415@lemmy.world · +19/−1 · 1 year ago

    AI will (be a great excuse to) reduce workforce, say 41% of people who get bonuses if they do.

  • EndHD@lemm.ee · +18/−1 · 1 year ago

    If Gartner comes out with a decent AI model, you could replace over half of your CIOs, CISOs, CTOs, etc. Most of them lack any real leadership qualities and simply parrot what they’re told/what they’ve read. They’re there through nepotism.

    Also, most of them use AI as a crutch, so that’s all they know. Meanwhile, the rest of us use it as a tool (what it’s meant to be).

  • TheKrunkedJuan@lemmy.world · +9 · 1 year ago

    As someone who scripts a lot for my department in the tech industry: yeah, AI and scripts have a lot of potential to reduce labor. However, given how chaotic this industry is, there will still need to be humans to account for the variables that scripts and AI haven’t been trained on (or that are otherwise hard to predict). I know the managers don’t want to spend their time on these issues, as there’s plenty more for them to deal with. When there’s true AGI, that may be a different scenario, but time will tell.

    Currently, we need some people in each department overseeing the automations of their area. This stuff mostly kills the super-redundant data entry tasks that make me feel cross-eyed by the end of my shift. I don’t want to be the embodiment of VLOOKUP between PDFs, typing the same number 4+ times.
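
    (Editor’s note: a minimal sketch of the VLOOKUP-style chore described above, joining rows against a lookup table keyed on an ID column. The `vlookup_join` helper, the column names, and the sample data are all hypothetical, standing in for values that would really be extracted from spreadsheets or PDFs.)

```python
# Sketch of a VLOOKUP-style join: attach matching lookup fields to each row.
# Rows with no match in the lookup table are passed through unchanged.
def vlookup_join(rows, lookup, key="id"):
    """Merge lookup-table fields into each row, matching on `key`."""
    index = {r[key]: r for r in lookup}  # build the lookup once, O(1) per row
    return [{**row, **index.get(row[key], {})} for row in rows]

# Hypothetical sample data (in practice, loaded from spreadsheets/PDF extracts).
rows = [{"id": "A1", "amount": "40"}, {"id": "B2", "amount": "15"}]
lookup = [{"id": "A1", "vendor": "Acme"}]

merged = vlookup_join(rows, lookup)
print(merged[0]["vendor"])  # Acme
```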

    • misspacific@lemmy.blahaj.zone · +0 · 1 year ago

      exactly, this will eliminate some jobs, but anyone who’s asked an LLM to fix code longer than 400 lines knows it often hurts more than it helps.

      which is why it is best used as a tool to debug code, or write boilerplate functions.

      • hansl@lemmy.world · +0 · edited · 1 year ago

        You’ll get blindsided real quick. AIs are just getting better. OpenAI is already saying they’ve moved past GPT for their next models. It’s not 5 years before it can fix code longer than 400 lines, and not 20 before it can digest a specification and spit out working software. Said software might not be optimized or pretty, but those are things people can work on separately. Where you needed 20 software engineers, you’ll need 10, then 5, then 1-2.

        You have more in common with the guy getting replaced today than you care to admit in your comment.

        Edit: not sure why I’m getting downvoted instead of having a discussion, but good luck to you all in your careers.

        • misspacific@lemmy.blahaj.zone · +0 · 1 year ago

          i didn’t downvote you, regardless internet points don’t matter.

          you’re not wrong, and i largely agree with what you’ve said, because i didn’t actually say a lot of the things your comment assumes.

          the most efficient way i can describe what i mean is this:

          LLMs (this is NOT AI) can, and will, replace more and more of us. however, there will never, ever be a time when no human is overseeing it, because we design software for humans (generally), not for machines. this requires inherently human knowledge, assumptions, intuition, etc.

          • hansl@lemmy.world · +0 · edited · 1 year ago

            LLMs (this is NOT AI)

            I disagree. When I was studying AI at college 20+ years ago we were also talking about expert systems which are glorified if/else chains. Most experts in the field agree that those systems can also be considered AI (not ML though).

            You may be thinking of AGI or universal AI, which is different. I am a believer in the singularity (that a machine will be as creative and conscious as a human), but that’s a matter of opinion.

            I didn’t downvote you

            I was using “you” more towards the people downvoting me, not you directly. You can see the accounts who downvoted/upvoted, btw.

            Edit: and I assumed the implication of your comment was that “people who code are safe”, which is a stretch I was answering to. Your comment was ambiguous either way.

  • bean@lemmy.world · +9/−1 · 1 year ago

    And that means lower prices for consumers. Right? Guys… r… right?

  • boatsnhos931@lemmy.world · +8/−1 · 1 year ago

    Execs? The same people who make short-sighted decisions and don’t understand basic psychology? Let me go get a pen so I won’t… give two fucks what this bogus survey says. Let AI run your business so I can have some excitement in my life.

  • SomeGuy69@lemmy.world · +5 · edited · 1 year ago

    I never had the impression that there were enough people for the amount of work anyway. I don’t see jobs going away so much as shifting. Most developers will be fine, because the work never ends; AI is just a tool speeding things up, and not even by that much, since someone who is good with Google and git is only a bit slower to find the same answers. And AI output needs verification too, even when it links you directly to the issue at hand via source URL.

    AI will also create new issues. Some of the low-requirement jobs will go, like first-level support, but only if you train the AI on your own data; otherwise it’s too generic. We’re not at the point where companies train their own LLMs yet; some outliers are trying.

    We have to understand that there’s still a human layer, and a lot of people might prefer calling a human even if the result is worse, simply because we’re social beings. This can cost a lot of customers if companies believe they can just shove an AI in front.

    No one really knows how good AI will get. As the technology advances, we find more and more hard-to-solve issues, for instance that AI will make things up, or give wrong answers despite knowing the real answer, if you pressure it hard enough.

    Also for security reasons you can’t add AI everywhere, unless you want to send all secrets directly to Microsoft, Google or Facebook.

    My 5 cents.

    • melpomenesclevage@lemm.ee · +9/−3 · edited · 1 year ago

      Missing the point.

      AI won’t so much replace labor as make it more fungible, and thus more exploitable/abusable.

      Except where it’s used as an excuse to just… not. “Yes, we have customer service; it’s just all ChatGPT with no permissions,” so nobody can ever return shit that was delivered broken.