misk@sopuli.xyz to Technology@lemmy.world · English · 11 months ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
234 comments
GlitzyArmrest@lemmy.world · 11 months ago
Is there any punishment for violating the TOS? From what I’ve seen, it just tells you that and stops the response, but it doesn’t actually do anything to your account.
NeoNachtwaechter@lemmy.world · 11 months ago
Should there ever be a punishment for making a humanoid robot vomit?