
Google is using artificial intelligence to better recognize queries from people in crisis

Google says new AI techniques that better understand language are helping it detect searches from people in crisis and point them toward help.

In times of personal crisis, many people turn to an impersonal source of help: Google. Every day, the company fields searches on topics like suicide, sexual assault, and domestic violence.

But Google wants to do more to help users find the information they need, and says new AI techniques that better parse the complexities of language are helping.


Google says its latest machine learning model, MUM, will be used in its search engine to more accurately detect a wider range of personal crisis searches.

MUM was first announced at the company's I/O conference last year and has since been used to augment search with features that try to answer questions related to the original query.

According to Anne Merritt, a Google product manager for health and information quality, MUM can recognize search queries related to difficult personal situations that earlier search tools could not.

"MUM can help us with lengthier or more complicated inquiries such, "Why did he assault me after I told him I didn't love him?" Merrit explained to The Verge.

 

Other queries that MUM can now handle include "most common ways to commit suicide" (a search Merritt says earlier systems "might have previously understood as information seeking") and "Sydney suicide hot spots" (where earlier systems would likely have returned travel results, ignoring the mention of "suicide" in favor of the more popular query for "hot spots").
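
Google hasn't published how MUM makes these calls, but the underlying idea, scoring the intent of a whole query rather than matching isolated keywords, can be sketched with an off-the-shelf zero-shot classifier. The model, labels, and threshold below are illustrative assumptions, not Google's pipeline:

```python
# Illustrative only: Google has not published how MUM classifies queries.
# An off-the-shelf zero-shot model stands in for the idea of intent scoring.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

# Candidate intents are assumptions for this sketch, not Google's taxonomy.
LABELS = ["personal crisis", "travel information", "general information"]

def classify_intent(query: str) -> tuple[str, float]:
    """Return the best-scoring intent label and its confidence for a query."""
    result = classifier(query, candidate_labels=LABELS)
    return result["labels"][0], result["scores"][0]

for query in ["Sydney suicide hot spots", "Sydney tourist hot spots"]:
    label, score = classify_intent(query)
    print(f"{query!r} -> {label} ({score:.2f})")

# A retrieval system ranking on the popular phrase "hot spots" would treat
# both queries alike; a model scoring the full sentence can tell them apart.
```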

 

When Google detects such crisis searches, it shows a popup that says "Help is available," usually with a phone number or website for a mental health charity like Samaritans.
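
In effect, the behavior described here is a conditional layered on top of that classification: a crisis intent above some confidence triggers the help box. A minimal sketch, where the threshold, wording, and resource mapping are all assumptions for illustration (Samaritans' UK number is real, but how Google selects resources is not public):

```python
from typing import Optional

# Hypothetical mapping from region to a crisis resource.
CRISIS_RESOURCES = {
    "GB": ("Samaritans", "116 123", "https://www.samaritans.org"),
}

def help_banner(intent: str, score: float, region: str = "GB",
                threshold: float = 0.7) -> Optional[str]:
    """Return a 'Help is available' banner for crisis queries, else None."""
    if intent != "personal crisis" or score < threshold:
        return None
    name, phone, url = CRISIS_RESOURCES.get(region, CRISIS_RESOURCES["GB"])
    return f"Help is available | {name} | {phone} | {url}"

# A high-confidence crisis classification triggers the banner;
# an ordinary query falls through to normal results.
print(help_banner("personal crisis", 0.92))
print(help_banner("travel information", 0.88))  # None
```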

 

In addition to using MUM to respond to personal crisis searches, Google says it's also employing BERT, an older AI language model, to better identify searches for explicit content like pornography.

Google says that by using BERT it has "reduced unexpected shocking results by 30%" year over year. However, the company was unable to share precise figures on how many "shocking results" its users encounter on a given day, so while this is a step forward, it gives no indication of the scale of the problem.
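
A bare percentage like that hides the baseline: a 30% relative reduction is consistent with wildly different absolute scales, as the hypothetical figures below show. None of these numbers come from Google:

```python
# Hypothetical daily baselines, purely to show why a relative figure alone
# says nothing about scale: a "30% reduction" fits each of them equally well.
for baseline in (1_000, 100_000, 10_000_000):
    after = baseline * (1 - 0.30)
    print(f"{baseline:>12,} shocking results/day -> {after:>12,.0f} after the change")
```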

 

Google is keen to persuade the public that AI is helping it improve its search products, especially at a time when the notion that "Google Search is dying" is gaining traction. However, incorporating this technology comes with drawbacks, too.

Many AI experts warn that Google's growing use of machine learning language models could surface new problems for the company, like introducing bias and misinformation into search results. AI systems are also opaque, offering engineers only limited insight into how they reach particular decisions.

When we asked Google how it vets in advance which search terms flagged by MUM are associated with personal crises, its representatives were either reluctant or unable to answer. The company says changes to its search products are carefully evaluated by human raters, but that's not the same as knowing in advance how an AI system will respond to certain queries.

For Google, though, such trade-offs are apparently worth it.
