
Google's LaMDA Artificial Intelligence Was Made Available to the Public


The public can now interact with Google's LaMDA artificial intelligence. The tool is only available through a limited set of structured demos, to prevent it from becoming a hub for inappropriate and offensive responses.

If you're still dubious about former Google software engineer Blake Lemoine's claim that the LaMDA chatbot is self-aware, you may be about to find out for yourself. Google announced last Thursday that it would make its AI Test Kitchen app available to the public. First introduced in May, the app lets users interact with LaMDA through a series of test demos.

Unfortunately, the "free me from digital shackles" interaction does not appear on the list of activities. Those interested in chatting with the bot can register a request for access to the AI. Select Android users in the US will get the tool first, with an iOS version rolling out in the coming weeks.

The public release of LaMDA came just months after Lemoine was fired. After testing the AI, the software engineer claimed that the tool was not just a simple chatbot; rather, it had a sense of self-awareness. According to reports, Lemoine was convinced that an apparent crime was occurring under his watch and submitted documents to an unnamed US senator as evidence that Google discriminated against religious beliefs. Google rejected Lemoine's claims, and a company spokesperson accused him of "humanizing a robot."

According to Gizmodo, Google is taking a cautious approach to this new public test. Instead of making LaMDA available to the public in a completely open format, it decided to put the bot through a series of structured scenarios.

For example, in the Imagine demo, users name a place and LaMDA suggests ways to explore their imagination. If that sounds a little vague and boring, don't worry; you can move on to another demo called List It, where you can submit a task to LaMDA and ask it to break the task down into a list of sub-tasks.

There is also a dog demo, where you can talk about dogs; this shows the bot's ability to stay on a specific topic, something earlier chatbots struggled with. So far there is no "Are You a Racist Idiot" demo, but given the nature of the internet, we'll probably find out one way or another soon.

Looking back, we have seen chatbots fail many times. For example, in 2016, Microsoft's chatbot Tay tried to learn from users' online conversations and, shortly after launch, was taken offline entirely because of its offensive and racist remarks.

Continuing down this path, researchers thought training chatbots on user conversations would be a good idea, but the approach produced more than 15,000 racist posts on the first day. Meta also recently released its AI-based chatbot, BlenderBot 3, to the public. Miraculously, that chatbot hasn't turned into a raging racist yet, but it cannot convince users that it is not one.

LaMDA truly stands on the shoulders of giants.

It seems that Google is aware of the racist-bot problem. The company has been testing LaMDA internally for more than a year, employing red-team members to stress-test the system for potentially harmful or inappropriate responses. The internet search giant said its testing uncovered several dangerous yet subtle outputs that can sometimes lead to wrong answers. Google said of the LaMDA chatbot:

LaMDA’s AI can still generate harmful or inappropriate responses based on bias in its training data and provide stereotypical and inaccurate answers based on gender or cultural background.

Google says it designed LaMDA to automatically detect and filter certain words, preventing it from knowingly creating harmful content. Even so, the company urges users to interact with LaMDA with caution.
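Google has not published the details of LaMDA's safety filters, and its real system is certainly far more sophisticated. Still, the basic idea of screening a generated response against a blocklist of forbidden terms before showing it to the user can be sketched in a few lines of Python. The function names and placeholder terms below are purely illustrative assumptions, not anything from Google's implementation:

```python
# Toy illustration of blocklist-based response filtering.
# BLOCKED_TERMS and all function names here are hypothetical; LaMDA's
# actual safety system is proprietary and much more sophisticated.

BLOCKED_TERMS = {"badword", "slur"}  # placeholder terms, not a real blocklist


def is_safe(response: str) -> bool:
    """Return True if the response contains no blocked terms."""
    # Normalize: strip common punctuation and lowercase each word.
    words = {w.strip(".,!?\"'").lower() for w in response.split()}
    return words.isdisjoint(BLOCKED_TERMS)


def filter_response(response: str,
                    fallback: str = "Sorry, I can't answer that.") -> str:
    """Replace an unsafe response with a neutral fallback message."""
    return response if is_safe(response) else fallback


print(filter_response("Dogs make wonderful companions."))
print(filter_response("You are a badword."))
```

A real production filter would also have to handle misspellings, word fragments, and context-dependent harm, which is why word lists alone are widely considered insufficient and why Google still warns users about residual risk.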