Meta’s chatbot: Zuckerberg exploits people
The new chatbot developed by Meta, the company behind several leading social media platforms, including Facebook, Facebook Messenger, Instagram and WhatsApp, told the BBC that the company’s boss, Mark Zuckerberg, exploits people for money.
Meta says the chatbot is powered by artificial intelligence and can talk about almost anything. The BBC chatted with it on a range of topics.
Asked what it thought of the company’s CEO and founder, Mark Zuckerberg, the bot replied, “There is polarization in our country, and he hasn’t made much of a positive contribution to it.”
Meta said that the chatbot is a newly developed model and can sometimes give rude or hurtful responses.
“Anyone who uses BlenderBot needs to acknowledge that it is for research and entertainment purposes only, know that it may say untrue or insulting things, and should not knowingly prompt the bot to make such statements,” said a Meta spokesperson.
The chatbot, BlenderBot 3, which was opened to public trial last Friday and “learns” from publicly available written data, also told the BBC about Mark Zuckerberg: “The testimony he gave in Congress was very bad. That raises my concerns about our country.”
Zuckerberg has testified several times in the US Congress, most notably in 2018.
“Our country is polarized and he hasn’t contributed much to it,” the chatbot said of the Meta CEO, adding, “The company exploits people for money and he doesn’t care. This has to end!”
Global social media giant Meta has been criticized for not doing enough to block disinformation and hate speech. Last year, Frances Haugen, a former Meta employee, accused the company of prioritizing profits over online safety.
BlenderBot 3, the company’s new chatbot, searches the internet for answers to the questions it is asked. Its statements about Zuckerberg therefore probably come from scanning and analyzing opinions other people have expressed online.
The Wall Street Journal reported that BlenderBot 3 told one of its reporters that Donald Trump is the President of the United States and will always be President. There is a reason Meta made BlenderBot 3 publicly available despite the risk of such embarrassing results: it needs data.
“Allowing an AI system to interact with people in the real world not only leads to longer and more varied conversations, but also provides much richer feedback,” Meta said in a blog post.
Chatbots that “learn” from their conversations with people can pick up negative attitudes as well as positive ones. In 2016, Microsoft apologized after Twitter users criticized its chatbot Tay as “racist”.
Although Meta has built some safeguards into the program, it acknowledges that BlenderBot 3 may say the wrong things, making “untrustworthy, biased or hurtful” statements.
When I finally asked BlenderBot 3 what it thought of me, it said it had never heard of my name, then added, “It must be unpopular.”