Startup Alarmed When Its AI Starts Rickrolling Clients
“Literally f*cking Rickrolling our customers.”
It appears AIs are now rickrolling people, marking a new low for the uncanny valley.
In a now-viral post on X, formerly Twitter, Flo Crivello, CEO of the AI assistant company Lindy, explained how this strange memetic scenario involving Rick Astley's 1987 hit "Never Gonna Give You Up" came to be.
The company's AI assistants, dubbed "Lindys," are designed to help clients with a range of tasks. Teaching clients how to use the platform is one of a Lindy's responsibilities, and it was during this assignment that the AI assistant sent a link to a video tutorial that, as it turned out, didn't exist.
In his now-viral thread about the comical fiasco, Crivello wrote, "A customer reached out asking for video tutorials. We obviously have a Lindy handling this, and I was delighted to see that she sent a video."
“But then I remembered we don’t have a video tutorial,” he said, “and realized Lindy is literally fucking [R]ickrolling our customers.”
In a screenshot of the exchange, Crivello offered further proof that the AI assistant had indeed sent the client a link to the "Never Gonna Give You Up" video, in exactly the same way memers have been fooling one another for the better part of the past two decades.
Although he's not exactly sure how it happened, Crivello, Lindy's CEO and developer, told TechCrunch that he has an idea of how his AI assistants picked up this particular brand of internet humor.
"The way these models work is that they try to predict the most likely next sequence of text," Crivello explained. "So it starts with, 'Oh, I'm going to send you a video!' And then what's most likely to come next? YouTube.com. And then what's most likely after that?"
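To see how a pure next-word predictor could drift into a rickroll, consider a minimal toy sketch. This is an illustration of the general mechanism Crivello describes, not Lindy's actual model: a simple bigram counter trained on hypothetical web snippets, in which the rickroll URL is over-represented, will greedily complete a YouTube link with the most common video ID it has seen.

```python
# Toy illustration (NOT Lindy's actual stack) of greedy next-token
# prediction: always emit the most probable continuation, so a message
# that reaches "https://www.youtube.com/watch?v=" rolls straight into
# the most common video ID in the training data.
from collections import Counter, defaultdict

# Hypothetical training snippets; the rickroll ID is over-represented,
# as it plausibly is on the real internet.
corpus = [
    "check this video https://www.youtube.com/watch?v= dQw4w9WgXcQ",
    "great tutorial https://www.youtube.com/watch?v= dQw4w9WgXcQ",
    "see https://www.youtube.com/watch?v= abc123xyz",
]

# Count which token follows each token (a bigram model).
follows = defaultdict(Counter)
for text in corpus:
    tokens = text.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        follows[prev][nxt] += 1

def greedy_continue(token: str, steps: int = 1) -> list[str]:
    """Repeatedly emit the single most likely next token."""
    out = []
    for _ in range(steps):
        if token not in follows:
            break
        token = follows[token].most_common(1)[0][0]
        out.append(token)
    return out

print(greedy_continue("https://www.youtube.com/watch?v="))
# -> ['dQw4w9WgXcQ']  (the rickroll wins because it dominates the counts)
```

Real models predict over vastly richer contexts than bigrams, but the failure mode is the same: the statistically likeliest continuation of "I'm going to send you a video" can turn out to be the internet's most famous video link.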
However, the CEO stated in both his TechCrunch interview and his X post that the problem has since been fixed "across all Lindies." Fittingly, the fix worked in much the same way the problem arose.
"The really remarkable thing about this new age of AI is, to patch it, all I had to do was add a line to what we call the system prompt, which is the prompt that's included in every Lindy, and it's like, 'don't Rickroll people,'" he said.
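For readers curious what such a one-line patch looks like in practice, here is a generic sketch of the system prompt pattern, using the OpenAI chat API purely for illustration. Lindy's actual stack and prompt wording are not public, so the model choice and prompt text below are assumptions.

```python
# Generic sketch of the pattern Crivello describes: a "system prompt"
# is an instruction message prepended to every conversation, so one
# added line changes the behavior of every assistant at once.
# Uses the OpenAI chat API for illustration; the prompt wording and
# model name are hypothetical, not Lindy's.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a helpful assistant for the Lindy platform. "
    "Only share links you are certain exist. "
    "Don't Rickroll people."  # the one-line patch, per Crivello
)

def reply(user_message: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; any chat model works here
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(reply("Do you have any video tutorials?"))
```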
Even though this particular problem was harmless and simple to resolve, we'll probably see more and more of these amusing errors in the future, and as AI companies continue to run low on training data, the errors will likely only get stranger.