[–]NastyWetSmear 2 insightful - 1 fun -  (1 child)

It ends the chat?? They programmed their answer bot to get pissy?? HA!
Why this, though? Why program it to draw the line at being called a common name?

"Okay, we've programed our answer bot to scour the internet for relevant words and phrases to help our customers, but we know what the internet is like. Let's talk about things it won't allow."
"Underaged pornography."
"Yes, obviously, good one."
"Videos of people dying."
"Not actually illegal to watch or look up, I say we allow it, but add a warning that the content may be graphic."
"Asking to call it Bob."
"Oh! Ugrh!... I mean, yes, clearly. Sorry, I threw up in my mouth a little. Fuck... Could you at least warn me before you say things like that??"

[–]Musky[S] 2 insightful - 1 fun -  (0 children)

It's annoying. I was playing around with a project in ChatGPT, had put in quite a bit of input to get it churning out what I wanted, and then suddenly it didn't like one of my prompts and killed the whole thing. A couple days of work up in smoke because of a moody AI.