• Eggyhead@fedia.io · 16 hours ago

    This brings up an interesting question I like to ask my students about AI. A year or so ago, Meta talked about people making personas of themselves for business: if a customer needs help, they can do a video chat with an AI that looks like you and is trained to give the responses you want it to give. But what if we did that for ourselves instead, letting an AI shadow us for a number of years so it can mimic the language we use and the thoughts we have well enough to effectively stand in for us in casual conversations?

    If the murder victim in this situation had trained his own AI in such a manner, after years of shadowing and training, would that AI be able to mimic its master’s behavior well enough to give his most likely response to this situation? Would the AI in the video still have forgiven the murderer, and would that forgiveness hold more significant meaning?

    If you could snapshot yourself as you are right now and keep it as a “living photo” AI that would behave and talk like you when interacted with, what would you do with it? And if you could have a snapshot AI of anyone in the world in a picture frame on your desk, someone you could talk to and interact with, who would you choose?

    • lime!@feddit.nu · 14 hours ago

      it would hold the same meaning as now, which is nothing.

      this is automatic writing with a computer. no matter what you train it on, you’re using a machine built to produce things that match other things. the machine can’t hold opinions, can’t remember, and can’t recall specifics from its training data. all it can do is generate a plausible transcript of a conversation and be steered by its input.

      one person does not generate enough data in a lifetime, so you’re necessarily using aggregated data from millions of people as a base. there’s also no meaning ascribed to anything in the training data. if you give it all of a person’s memories, the output conforms to that data the way water conforms to a shower nozzle: it’s just a filter on top.
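
      a minimal sketch of what that “filter on top” amounts to (the base_model object and its generate method here are hypothetical stand-ins for whatever aggregate model actually does the generating):

```python
# Hypothetical sketch: a "persona" is nothing more than conditioning text
# prepended to the input of a general-purpose model trained on aggregated data.
# `base_model` and its `generate` method are stand-ins, not a real API.
def persona_reply(base_model, persona_corpus: str, conversation: str) -> str:
    prompt = (
        "Continue the conversation below, writing in the style of the person "
        "whose collected messages follow.\n\n"
        f"--- collected messages ---\n{persona_corpus}\n\n"
        f"--- conversation so far ---\n{conversation}\n"
    )
    # All the actual generation comes from the aggregate model; the personal
    # data only steers which plausible transcript comes out.
    return base_model.generate(prompt)
```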

      as for the final paragraph, i want computers to exhibit as little personhood as possible, because i’ve read the transcripts of the ELIZA experiments. it literally could only figure out subject-verb-object and respond with the same noun it was fed, and people were saying it should replace psychologists.
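
      for reference, a toy sketch of the kind of pattern-matching ELIZA did (heavily simplified; the real DOCTOR script was essentially a larger rule list of this kind):

```python
import re

# Toy ELIZA-style responder: match a surface pattern, swap the pronouns,
# and hand the user's own words back as a question. No memory, no understanding.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

RULES = [
    (re.compile(r"\bi need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.+)", re.I), "Tell me more about your {0}."),
]

def reflect(phrase: str) -> str:
    # Swap first- and second-person words so the echo reads as a reply.
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in phrase.split())

def respond(sentence: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # canned fallback when nothing matches

print(respond("I am worried about my brother"))
# -> "How long have you been worried about your brother?"
```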

      • Saik0@lemmy.saik0.com · 6 hours ago

        The deceased’s sister wrote the script. AI/LLMs didn’t write anything; it’s in the article. So the assumptions you made in the middle two paragraphs don’t really apply to this specific news article.

        • lime!@feddit.nu · 6 hours ago

          i was responding to the questions posted in the comment i replied to.

          also, doesn’t that make this entire thing worse?

          • Saik0@lemmy.saik0.com · 6 hours ago

            also, doesn’t that make this entire thing worse?

            No? This is literally a Victim Impact Statement. We see these all the time after guilt has been determined and before sentencing. It’s the opportunity granted to the victims to outline how they feel about the matter.

            There have been countless court cases where victims say things like “I know my husband would have understood and forgiven [… drone on for a six-page essay],” or have even done this exact thing without the “AI” video/audio: home videos with a dubbed overlay of a loved one talking about what the deceased would have wanted or thought. It’s not abnormal, and it has been accepted as a way for the aggrieved to voice their wishes to the court.

            All that’s changed here is the presentation. This didn’t affect the finding of whether the person was guilty; it was played only after that finding and before sentencing, which is also the customary time for impact statements. The “AI” didn’t write the script. This is just a mildly fancier impact statement, and that’s it. She could have dubbed it over home video with a Fiverr voice actor; would that change how you feel about it? I see no evidence that the court treated this any differently from any other impact statement, and I don’t think anyone would be fooled into believing the dead man is magically alive and making the statement himself. It’s clear who made it the whole time.

            • lime!@feddit.nu · 4 hours ago

              i had no idea this was a thing in american courts. it just seems like an insane thing to include in a murder trial.

  • Phoenixz@lemmy.ca · 17 hours ago

    WTF?

    That man did not say anything. A computer algorithm smashed together a video that incidentally uses his likeness, nothing more.

  • orca@orcas.enjoying.yachts · 18 hours ago

    There is absolutely zero chance I would allow anyone to theorize what they think I would say using AI. Hell, I don’t like AI in its current state, and that’s the least of my issues with this.

    It’s immoral. Regardless of your relation to a person, you shouldn’t be acting like you know what they would say, let alone using that to sway a decision in a courtroom. Unless he specifically wrote something down and it was then recited using the AI, this is absolutely wrong.

    It’s selfish. They used his likeness to make an apology they had no possible way of knowing he would give, and they did it to make themselves feel better. They could’ve written a letter in their own voices instead of turning this into some weird dystopian spectacle.

    “It’s just an impact statement.”

    Welcome to the slippery slope, folks. We’re allowing the use of AI in courtrooms, and not even for something cool (like quickly producing a 3D animation of a car accident, explained with actual human voices, to show what happened at the scene). Instead, we’re using it to sway a judge’s sentencing while making an apology on behalf of a dead person (using whatever tech you want, because that is not the main problem here) without his consent or even any of his written thoughts (you know, like in a will).

    Writing these arguments off as “AI bad” is lazy and reductive; the tech itself isn’t even remotely the main gripe.

    • shalafi@lemmy.world · 60 minutes ago

      allowing the use of AI in courtrooms

      Surprised the judge didn’t kick that shit to the curb. There was one case where a defendant made an AI avatar, with AI-generated text, to represent himself, and the judge said, “Fuck outta here with that nonsense.”

    • corsicanguppy@lemmy.ca · 4 hours ago

      There is absolutely zero chance I would allow anyone to theorize what they think I would say using AI.

      If they based it on my Reddit history, it’s got the potential to be needlessly harsh to certain groups of life-underachievers, that’s for sure.

  • solrize@lemmy.world · 1 day ago

    This is awesome. Next we can have AI Jesus endorsing Trump, AI Nicole Simpson telling us who the real killer was, and AI Abraham Lincoln saying that whole Civil War thing was a big misunderstanding and the Confederacy was actually just fine. The possibilities are endless. I can hardly wait!

  • besselj@lemmy.ca · 1 day ago

    I’d rather have somebody puppet my corpse like in Weekend at Bernie’s. Basically the same thing, but more authentic.

  • tetris11@lemmy.ml · 1 day ago

    Unless stated otherwise, please do not use my likeness for legal proceedings in the event of my untimely passing. Please.

  • JTskulk@lemmy.world · 21 hours ago

    Why even do an impact statement? All Christian victims should be assumed to forgive their attackers, right?

  • partial_accumen@lemmy.world · 1 day ago

    An AI version of Christopher Pelkey appeared in an eerily realistic video to forgive his killer… “In another life, we probably could’ve been friends. I believe in forgiveness, and a God who forgives.”

    “…and while it took my murder to get my wings as an angel in heaven, you still on Earth can get close with Red Bull ™. Red Bull ™ gives you wings!” /s