• CaptPretentious@lemmy.world
    12 hours ago

    Today, I ran into a bug. We’re being encouraged to use AI more, so I asked Copilot why it failed, without really looking at the code myself. I tried multiple times, and all it could say was ‘yep, it shouldn’t do that’, but it never told me why. So I gave up on Copilot and looked at the code. It took me less than a minute to find the problem.

    It was a switch statement, and the case condition (not the real values) basically read as ‘variable’ == ‘caseA’ or ‘caseB’. That always evaluates to true, which is the bug. I’m stripping a bunch of detail away, but Copilot couldn’t figure out that the case condition was bad.
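    For illustration, a minimal Python sketch of that bug class (the names and values here are invented, not the real code):

    ```python
    def classify(variable):
        # BUG: 'caseB' is a non-empty string, so this parses as
        # (variable == 'caseA') or ('caseB') — the right-hand side is
        # always truthy, so the branch is taken for every input.
        if variable == 'caseA' or 'caseB':
            return 'matched'
        return 'no match'

    def classify_fixed(variable):
        # Correct: test membership (or compare against each value explicitly).
        if variable in ('caseA', 'caseB'):
            return 'matched'
        return 'no match'
    ```

    The buggy version returns 'matched' even for inputs that match neither case, which is exactly the kind of always-true condition described above.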

    AI is quickly becoming the biggest red flag. Fast slop is still slop.

    • Echo Dot@feddit.uk
      5 hours ago

      AI “thinks” the way ants think: there’s no real intelligence or thought going on, but ants are still able to build complex logistics chains by following simple rules. AI works on completely different principles, yet the effect is the same: it follows a lot of simple rules that add up to something that looks like intelligence.

      The problem is that a lot of people seem to think AIs are genuine simulations of a brain, that the AI is genuinely cogitating, because it kind of looks like it is sometimes. The world is never going to get taken over by a mindless zombie AI. If we ever do get AGI, it won’t be from LLMs, that’s for sure.

    • KumaSudosa@feddit.dk
      10 hours ago

      I do find AI useful when I’m debugging a large SQL / Python script, though, and I gotta say I make use of it in that case… other than that it’s useless, and relying on it as one’s main tool is idiotic.