• alezyn@lemm.ee · 20 days ago

    From my understanding, a vibe coder is someone who builds software using mainly AI-generated code. It doesn’t necessarily mean they’re a bad coder, but AI-generated code at that scale is often just hard to process, and people end up with no clue what exactly is going on in their own project.

      • Photuris@lemmy.ml · 20 days ago

        I’ve tried this on personal projects, but not work projects.

        My verdict:

        1. To be a good vibe coder, one must first be a good coder.

        2. Vibe coding is faster for drafting and proof-of-concepts, but slower to debug and polish. Not as much time savings as one might think.

        • vrighter@discuss.tchncs.de · 20 days ago

          Exactly. You can only really verify the code if you were capable of writing it in the first place.

          And it’s an old, well-known fact that reading code is much harder than writing it.

          • ulterno@programming.dev · 20 days ago

            A tangential but interesting take: this has an analogue in a lot of the electronics space.

            • It is harder to receive data than to transmit it, because you need to do things like:
              • match your receiver’s frequency to that of the transmission (which might be minutely different from the agreed-upon frequency) in order to understand it
              • know how long the data will be before feeding it into digital variables, or you might merge multiple messages or drop parts of one without realising (see the sketch after this list)
            • This gets even harder when it is wireless, because now you also have noise, which is often just valid communication between other devices.
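
            A rough Python sketch of that second point (the length-prefix framing here is an assumption for illustration, not from anything in the thread): over a raw byte stream the receiver only sees bytes, so it has to rediscover the message boundaries the sender already knew, or it will glue messages together or cut one short.

            import struct

            def send_message(buffer: bytearray, payload: bytes) -> None:
                # Transmitting is easy: prepend a 4-byte length, append the bytes.
                buffer += struct.pack(">I", len(payload)) + payload

            def recv_messages(buffer: bytearray) -> list[bytes]:
                # Receiving is harder: we only get a stream of bytes and must
                # work out where each message ends before using it.
                messages = []
                while len(buffer) >= 4:
                    (length,) = struct.unpack(">I", buffer[:4])
                    if len(buffer) < 4 + length:
                        break  # message not fully arrived yet; wait for more bytes
                    messages.append(bytes(buffer[4:4 + length]))
                    del buffer[:4 + length]
                return messages

            # Two messages arrive back to back in one chunk, as often happens on the wire.
            stream = bytearray()
            send_message(stream, b"hello")
            send_message(stream, b"world")
            print(recv_messages(stream))  # [b'hello', b'world'], not one merged blob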

            Getting back to code: you now need to get on the same “wavelength” as the one who wrote the code, at the time they wrote it.

          • brygphilomena@lemmy.dbzer0.com · 20 days ago

            I weirdly love reading code and figuring out what it’s doing. Debugging is cathartic.

            It might take a while, and I might be cussing up a storm: wtf is this shit? Why the fuck would you do it this way? Why the fuck did you make this so convoluted for no reason?

            Right now it’s unfucking some vibe-coded BS where, instead of just fixing an API call to get the info we needed accurately, it’s trying to infer it from other data. There is a super direct and simple route, but instead there are hundreds of lines working around hitting the wrong endpoint and getting back data missing the details we need.

            Plus the vibe coding added so much that is literally never used, was never needed, and on top of that returns incorrect information.
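
            For illustration, the difference being described looks roughly like the hypothetical sketch below (the endpoint names and fields are made up, and the responses are stubbed as plain dicts; none of this is from the actual project):

            ORDER = {  # what the direct endpoint would return for one order
                "id": "A17",
                "shipping_status": "in_transit",
            }
            SHIPMENTS = [  # a broader endpoint that is missing the detail we need
                {"order_ids": ["A17", "B02"], "dispatched_at": "2024-05-01", "delivered_at": None},
            ]

            def status_direct(order: dict) -> str:
                # The super direct and simple route: the field already exists, read it.
                return order["shipping_status"]

            def status_inferred(shipments: list[dict], order_id: str) -> str:
                # The workaround: reconstruct the status from other data, which is
                # where the missing details and wrong answers come from.
                for shipment in shipments:
                    if order_id in shipment.get("order_ids", []):
                        if shipment.get("delivered_at"):
                            return "delivered"
                        if shipment.get("dispatched_at"):
                            return "in_transit"  # a guess, not the recorded status
                return "unknown"

            print(status_direct(ORDER))               # in_transit
            print(status_inferred(SHIPMENTS, "A17"))  # in_transit, but only by inference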

            • vrighter@discuss.tchncs.de · 20 days ago

              Enjoying it is a different issue. You probably enjoy it because it’s more difficult, which is perfectly valid reasoning.

      • Kbobabob@lemmy.world · 20 days ago

        Even if you’re the one that built, programmed, and trained the AI when nothing else like it existed?

        • vrighter@discuss.tchncs.de · 20 days ago

          So? Some of the people pushing out AI slop would be perfectly capable of writing their own LLM out of widely available free tools. Contrary to popular belief, they are not complex pieces of software, just extremely data-hungry. That doesn’t mean they magically understand the code when the LLM spits something out.
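
          As a toy illustration of “not complex, just data hungry”: the core of a language model fits in a couple of functions. The sketch below is a character-level bigram model in plain Python, nothing like a real LLM (no neural network, no transformer), just enough to show how little machinery “predict the next token” needs; what real models add is mostly scale and data.

          import random
          from collections import defaultdict

          def train(text: str) -> dict[str, list[str]]:
              # Record which character tends to follow which; this table is the whole "model".
              model = defaultdict(list)
              for current, following in zip(text, text[1:]):
                  model[current].append(following)
              return model

          def generate(model: dict[str, list[str]], seed: str, length: int) -> str:
              out = seed
              for _ in range(length):
                  followers = model.get(out[-1])
                  if not followers:
                      break
                  out += random.choice(followers)  # sample the next character
              return out

          corpus = "vibe coding is faster to draft and slower to debug " * 50
          print(generate(train(corpus), "v", 40))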

          • Honytawk@feddit.nl · 20 days ago

            Stark would have developed their own way of training their AI. It wouldn’t be an LLM in the first place.

            • vrighter@discuss.tchncs.de · 20 days ago

              And he still wouldn’t understand its output, because as we clearly see, he doesn’t even try to look at it.

                • vrighter@discuss.tchncs.de · 20 days ago

                  Given that expert systems are pretty much just a big ball of if-then statements, he might arguably be considered to have written the app. Just with way more extra steps.
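
                  For anyone unfamiliar, that “big ball of if-then statements” is close to literal: a classic expert system is a set of rules plus an engine that keeps firing whichever rules match until nothing new can be concluded. A minimal forward-chaining sketch in Python (the rules here are made up for illustration):

                  RULES = [  # (conditions, conclusion): if all conditions hold, conclude the fact
                      (frozenset({"engine_cranks", "no_spark"}), "ignition_fault"),
                      (frozenset({"ignition_fault"}), "check_coil"),
                      (frozenset({"engine_cranks", "no_fuel_pressure"}), "fuel_fault"),
                  ]

                  def forward_chain(facts: set[str]) -> set[str]:
                      # Keep firing any rule whose conditions are satisfied until the
                      # set of known facts stops growing.
                      changed = True
                      while changed:
                          changed = False
                          for conditions, conclusion in RULES:
                              if conditions <= facts and conclusion not in facts:
                                  facts.add(conclusion)
                                  changed = True
                      return facts

                  print(forward_chain({"engine_cranks", "no_spark"}))
                  # adds 'ignition_fault' and 'check_coil' to the starting facts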