• 0 Posts
  • 29 Comments
Joined 1 year ago
Cake day: July 1st, 2023


  • Oh it gets worse with Shadiversity. Huge AI art guy; his brother's an actual artist too, so it's hard seeing Shad brag to him. Very "anti-woke", and he paints his conservative Mormon beliefs onto everything.

    The worst, unforgivable part is that at the end of his book, impregnated rape victims step up to defend the rapist protagonist because he "gave them" a child, while the ones who didn't get pregnant are jealous.

    He loves to bring up that the book is supposed to explore this immoral character. But this isn't the protagonist's viewpoint; this is just how Shad thinks the world works. This is how Shad believes rape victims think.

    Very sad to see, I followed him for swords and castles but Jesus Christ.



  • Ironically the business people are terrible at business. I genuinely think LLMs (despite their economic evils) are stunning pieces of technology.

    But they are money sinks, and the only plans for profit are subscriptions or advertisements. It's the Social Media/Streaming/Tech Startup panicked hype investing all over again. Subscriptions and advertising simply do not pay the bills for huge server and GPU farms.

    But sustainability isn't what they want, is it? They want the stock to go up so they can cash out right before it falls. sigh


  • Microsoft’s bread and butter has been selling and servicing to businesses.

    So with that in mind, the hell are they thinking? Windows 10's end of life guarantees that businesses specifically will have to switch. Then the next option in line is one that will, by default, vacuum up all your proprietary information to feed into an AI, effectively "copyright laundering" it?

    Even if there are ways to deactivate the feature, the non-tech-savvy managers will just go off of the headlines, and the tech-savvy ones will recognize the security risk. And government/healthcare computers might just end up on a fork of Linux, even a non-open-source one.

    Ironically it feels like they’re focusing too much on consumers (on extorting them) and shooting themselves in the foot for their business clientele.




  • In terms of LLM hallucination, it feels like the name very aptly describes the behavior and severity. It doesn’t downplay what’s happening because it’s generally accepted that having a source of information hallucinate is bad.

    I feel like the alternatives would downplay the problem. A “glitch” is generic and common, “lying” is just inaccurate since that implies intent to deceive, and just being “wrong” doesn’t get across how elaborately wrong an LLM can be.

    Hallucination fits pretty well and is also pretty evocative. I doubt that AI promoters want to effectively call their product schizophrenic, which is what most people think of when they hear "hallucination."

    Ultimately, all the sciences are full of analogous names to make conversations easier; it's not always marketing. No different than when physicists say particles have "spin" or "color" or that spacetime is a "fabric" or [insert entirety of String theory]…


  • Wirlocke@lemmy.blahaj.zone to Technology@lemmy.world · *Permanently Deleted* · edited, 1 month ago

    On Discord, though, there's a lot of unchecked predation. Theoretically, if this were implemented, it would let them see the most suspicious users, the ones that interact with an unusual number of children, and review whether the messages are inappropriate.

    But all that's unlikely, because if they actually cared they'd implement other, simpler solutions first. So this idea is hypothetical at best, and not ideal.


  • I’m a bit annoyed at all the people being pedantic about the term hallucinate.

    Programmers use preexisting concepts as metaphors for computer concepts all the time.

    Your file isn’t really a file, your desktop isn’t a desk, your recycling bin isn’t a recycling bin.

    [Insert the entirety of Object Oriented Programming here]

    Neural networks aren't really neurons, genetic algorithms aren't really genetics, and the LLM isn't really hallucinating.

    But it easily conveys what the bug is. It only personifies the LLM because the English language almost always personifies the subject. The moment you apply a verb to an object you imply it performed an action, unless you limit yourself to esoteric words/acronyms or use several words to overexplain every time.