Google Lens Can Now Identify Skin Problems by Snapping a Picture

If you’re suffering from a rash or another skin condition, you can now use Google Lens, the company’s visual search function, to help you diagnose the problem.

Simply take a picture of the affected area using the Google Lens app, and the company will try to identify the problem for you. “Describing an odd mole or rash on your skin can be hard to do with words alone,” the company wrote in a blog post. “Fortunately, there’s a new way Lens can help, with the ability to search skin conditions that are visually similar to what you see on your skin.”

It's best to see a doctor for the most accurate assessment, but the Google Lens feature could point users in the right direction when it comes to treating less serious conditions. The technology works by analyzing a photo of the affected skin and surfacing visually similar images and the conditions associated with them.
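
Google hasn't published details of how Lens performs this matching, but visual search features of this kind are commonly built on image embeddings: a neural network converts the query photo into a feature vector, which is then compared against a catalog of labeled reference images. The sketch below illustrates that general idea only; the ResNet backbone, file names, and condition labels are placeholder assumptions, not Google's actual pipeline.

```python
# Illustrative sketch of embedding-based visual similarity search.
# NOT Google's pipeline: the model, reference catalog, and labels are
# placeholder assumptions used purely for demonstration.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# General-purpose pretrained backbone as a stand-in feature extractor.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = torch.nn.Identity()  # drop the classifier, keep the 512-d embedding
model.eval()

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path: str) -> torch.Tensor:
    """Return a unit-length feature vector for one image file."""
    img = Image.open(path).convert("RGB")
    with torch.no_grad():
        vec = model(preprocess(img).unsqueeze(0)).squeeze(0)
    return vec / vec.norm()

# Hypothetical catalog of labeled reference photos.
catalog = {
    "eczema_example.jpg": "eczema",
    "psoriasis_example.jpg": "psoriasis",
    "hives_example.jpg": "hives",
}

query = embed("my_skin_photo.jpg")
scores = {
    # Dot product of unit vectors = cosine similarity.
    label: torch.dot(query, embed(path)).item()
    for path, label in catalog.items()
}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```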

“This feature also works if you're not sure how to describe something else on your body, like a bump on your lip, a line on your nails, or hair loss on your head,” Google added.

Although we can’t vouch for the technology’s accuracy, it may offer a preview of future healthcare analysis. Users are already pointing out that the Google Lens capability could be paired with an AI-powered chatbot, such as Google Bard, trained on medical knowledge to help diagnose health problems.

The feature is intended to encourage more people to try out Google Lens. In Wednesday’s blog post, the tech giant also points out that Google Lens can be used to help solve homework problems, translate street signs in a foreign country, or identify clothes you recently saw and want to buy. The Google Lens app is available on Android; on iOS, it's part of the Google app.

Tags: mobile apps