• tias@discuss.tchncs.de · 25 days ago

      AI as a general concept probably will at some point. But LLMs have all but reached the end of the line and they’re not nearly smart enough.

      • li10@feddit.uk · 25 days ago

        LLMs have already reached the end of the line 🤔

        I don’t believe that. At least from an implementation perspective we’re extremely early on, and I don’t see why the tech itself can’t be improved either.

        Maybe its current iteration has hit a wall, but I don’t think anyone can really say what the future holds for it.

        • jacksilver@lemmy.world · edited · 24 days ago

          LLMs have been around since roughly 2017 (a comment below corrected me: the Attention paper was 2017). While scaling them up has improved their performance and capabilities, there are fundamental limitations to the approach itself. Behind the scenes, LLMs (even multimodal ones like GPT-4) are trying to predict what is most expected; while that can be powerful, it means they can never innovate or serve as systems of truth.

          For years we used things like tf-idf to vectorize words, then embeddings, and now transformers (souped-up embeddings). Each approach has its limits, and LLMs are no different. The results we see now are surprisingly good, but they don’t overcome the baseline limitations of the underlying model.
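          For anyone who hasn’t met tf-idf: it’s roughly “frequent in this document, rare everywhere else.” A minimal from-scratch sketch (the corpus and numbers are made up purely for illustration; real pipelines used libraries like scikit-learn):

```python
import math

# Toy three-document corpus (made up for illustration).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are pets",
]
docs = [doc.split() for doc in corpus]

def tf_idf(term, doc):
    """Score one term in one document: frequent here + rare elsewhere = high."""
    tf = doc.count(term) / len(doc)            # term frequency in this document
    df = sum(1 for d in docs if term in d)     # how many documents contain the term
    if df == 0:
        return 0.0                             # unseen term scores zero
    idf = math.log(len(docs) / df)             # rarer across the corpus -> larger
    return tf * idf

# "cat" is unique to the first document, so it outscores the ubiquitous "the".
print(tf_idf("cat", docs[0]))
print(tf_idf("the", docs[0]))
```

          Each document becomes a vector of these scores over the vocabulary, which is exactly the kind of shallow, counting-based representation that embeddings and then transformers improved on.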

          • Todd Bonzalez@lemm.ee · 24 days ago

            The “Attention Is All You Need” paper that birthed modern AI came out in 2017. Before Transformers, “LLMs” were pretty much just Markov chains and statistical language models.
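            A “statistical language model” in that pre-transformer sense can be sketched as a toy bigram Markov chain: count which word follows which, then sample (the training text here is made up, and `generate` is a hypothetical helper name):

```python
import random
from collections import defaultdict

# Toy training text (made up); real models used far larger corpora.
words = "the cat sat on the mat and the cat ran".split()

# Bigram table: each word maps to the list of words observed after it.
transitions = defaultdict(list)
for cur, nxt in zip(words, words[1:]):
    transitions[cur].append(nxt)

def generate(start, length, seed=0):
    """Walk the chain, repeatedly sampling an observed next word."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < length:
        choices = transitions.get(out[-1])
        if not choices:          # no observed continuation: stop early
            break
        out.append(rng.choice(choices))
    return " ".join(out)

print(generate("the", 6))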

        • Wooki@lemmy.world · edited · 25 days ago

          we’re extremely early on

          Oh really! The analysis has been established since the ’80s. It’s so far from early on that the statement is comical.

          • Todd Bonzalez@lemm.ee · 24 days ago

            Transformers, the foundation of modern “AI”, were proposed in 2017. Whatever we called “AI” and “machine learning” before that was mostly convolutional networks inspired by the ’80s Neocognitron, which is nowhere near as impressive.

            The most advanced thing a Convolutional network ever accomplished was DeepDream, and visual Generative AI has skyrocketed in the 10 years since then. Anyone looking at this situation who believes that we have hit bedrock is delusional.

            From DeepDream to Midjourney in 10 years is incredible. The next 10 years are going to be very weird.

      • APassenger@lemmy.world · 25 days ago

        It’s this. When boards and non-tech savvy managers start making decisions based on a slick slide deck and a few visuals, enough will bite that people will be laid off. It’s already happening.

        There may be a reckoning after, but Wall Street likes it when you cut too deep and then bounce back to the “right” (lower) headcount, even if you’ve broken the company and they just don’t see the glide path.

        It’s gonna happen. I hope it’s rare. I’d argue it’s already happening, but I doubt enough people see it underpinning recent layoffs (yet).

    • Blue_Morpho@lemmy.world · edited · 25 days ago

      I can see the statement in the same way word processing displaced secretaries.

      There used to be two tiers in business. Those who wrote ideas/solutions and those who typed out those ideas into documents to be photocopied and faxed. Now the people who work on problems type their own words and email/slack/teams the information.

      In the same way, there are programmers who design and solve the problems, and then the coders who take those outlines and make them actually compile.

      LLMs will disrupt the coders, leaving the problem solvers.

      There are still secretaries today. But there aren’t vast secretary pools in every business like 50 years ago.

      • IsThisAnAI@lemmy.world · 25 days ago

        It’ll have to improve by an order of magnitude for that effect. Right now it’s basically an improved Stack Overflow.

    • ikidd@lemmy.world · edited · 25 days ago

      I’m pretty sure I could write a bot right now that just regurgitates pop science bullshit and how it relates to Line Go Up business philosophy.

      Edit: did it, thanks ChatJippity

      import sys

      def main():
          # Check that exactly one argument was provided
          if len(sys.argv) != 2:
              print("Usage: python script.py <PopScienceBS>")
              sys.exit(1)
          # Get the input from the command line
          PopScienceBS = sys.argv[1]
          # Assign the input straight to the output
          LineGoUp = PopScienceBS
          # Print the output
          print(f"Line Go Up if we do: {LineGoUp}")

      if __name__ == "__main__":
          main()
      
  • casmael@lemm.ee · 25 days ago

    I know just enough about this to confirm that this statement is absolute horseshit

  • JackbyDev@programming.dev · 25 days ago

    Let’s assume this is true, just for discussion’s sake. Who’s going to be writing the prompts to get the code then? Surely someone who can understand the requirements, make sure the code functions, and then test it afterwards. That’s a developer.

    • William@lemmy.world · 25 days ago

      I think that’s the point? They’re saying that those coders will turn into prompt engineers. They didn’t say they wouldn’t have a job, just that they wouldn’t be “coding”.

      Which I don’t believe for a minute. I could see it eventually, but it’s not “2 years” away by any stretch of the imagination.

      • Cringe2793@lemmy.world · 24 days ago

        We’ll definitely be coding less, I think. Coding or programming is basically the “grunt work”. The real skill is understanding requirements and translating them into a product.

  • spacecadet@lemm.ee · edited · 24 days ago

    Everybody talks about AI killing programming jobs, but any developer who has had to use it knows it can’t do anything complex in programming. What it’s really going to replace is program managers, customer reps, most of HR, finance analysts, legal teams, and middle management. These people have very structured, rule-based day-to-days. Getting an AI to write a very customized queuing system in Rust to suit your very specific business needs is nearly impossible. Getting AI to summarize Jira boards, analyze candidates’ experience, highlight key points of meetings (and obsolete most of them altogether), and gather data on outstanding patents is much more in its wheelhouse.

    I am starting to see a major uptick in recruiters reaching out to me, because companies are starting to realize it was a mistake to stop hiring software engineers in the hopes that AI would replace them. Now my skills are going to come at a premium, just like everyone else in software engineering with skills beyond “put a React app together”.

    • underthesign@lemmy.world · 24 days ago

      Trouble is, you’re basing all that on now, not a year from now or six months from now. It’s too easy to look at its weaknesses today and extrapolate. I think people need to get real about coding and AI. Coding is language and rules; machines can learn that enormously faster and more accurately than humans. The ones who survive will be those who can wield it as a tool for creativity. But if you think it won’t become capable of all the things it’s currently weak at, you’re just kidding yourself, unfortunately. It’ll be like anything else: a tool for an operator. Middlemen will be wiped out of the process, of course, but those with money remain those without time or expertise, and there will always be a place for people willing to step in at that point. But they won’t be coding. They’ll be designing and solving problems.

      • spacecadet@lemm.ee · 24 days ago

        We are 18 months into “AI will replace me in 6 months.” I mean, the CEO of OpenAI as well as many researchers have already said LLMs have mostly reached their limit. They are “generalizers”, and if you ask them to do anything new they hallucinate quite frequently. Trying to get AI to replace developers when it hasn’t even replaced other menial office jobs is like saying “we taught AI to drive; it will replace all F1 drivers in 6 months”.

  • slimarev92@lemmy.world · edited · 24 days ago

    If that’s true, how come there isn’t a single serious project written exclusively or mostly by an LLM? There isn’t a single library or remotely original application made with Claude or Gemini. Not one.

    • Hawk@lemmynsfw.com · 23 days ago

      My last employer had many internal tools that were fine.

      They had only a moderate amount of oversight.

      I had to find a new job, and I’m actually thinking of walking away from software development now that there are so few jobs :(

      It sucks but there’s no sense pretending this won’t have a large impact on the job landscape.

      • slimarev92@lemmy.world · edited · 12 days ago

        What did these tools do? I don’t see any LLM being used to create anything that works from scratch, without the human prompter doing most of the heavy lifting.

    • UnderpantsWeevil@lemmy.world · 24 days ago

      I will put down a solid grand that this exact same article will be printed by the exact same website 24 months from now and it will receive the exact same reception.

  • qarbone@lemmy.world · 24 days ago

    If, 24 months from now, most people aren’t coding, it’ll be because people like him cut jobs to make a quicker buck. Or nickel.

    • Cringe2793@lemmy.world · edited · 23 days ago

      Well, if it works, it means that job wasn’t that important, and the people doing it should improve themselves to stay relevant.

      Edit: wow what a bunch of hypersensitive babies. I swear, y’all just allergic to learning or something. I just said people need to improve themselves to stay relevant, and people freak out and send me death threats. What a joke.

      • UnderpantsWeevil@lemmy.world · 24 days ago

        job wasn’t that important

        I keep telling you that changing out the battery in the smoke alarm isn’t worth the effort and you keep telling me that the house is currently on fire, we need to get out of here immediately, and I just roll my eyes because you’re only proving my point.

        • Cringe2793@lemmy.world · 24 days ago

          Sure, believe what you want to believe. You can either adapt to what’s happening, or just get phased out. AI is happening whether you like it or not. You may as well learn to use it.

  • UnderpantsWeevil@lemmy.world · edited · 24 days ago

    Translation: “We’re going to make the suite for building, testing, and deploying so obnoxiously difficult to integrate with your work environment that in two years nobody in your DevOps team will be able to get anything to a release state.”

    Me, fiddling with a config file for a legacy Perl script that’s been holding up the ass-end of the business since 1996: “Uh, yeah that’s great.”