Google is adding new features to Google Lens, its computer vision-powered app that identifies objects and displays related information.
Lens can now surface skin conditions that are visually similar to what you see on your own skin, such as moles and rashes. Uploading a photo through Lens kicks off a search for visual matches, which also works for other physical conditions you might not know how to describe in words, such as a bump on your lip, a line on your nails, or hair loss.
It's not quite the AI-driven app for diagnosing skin, hair, and nail conditions that Google launched in 2021. That app, which debuted first in the European Union, faced barriers to entry in the United States, where it would have needed approval from the Food and Drug Administration. (Google did not seek approval.)
Still, the Lens feature might be helpful for people who are deciding between going to the doctor and using over-the-counter remedies.
As previously announced at I/O, Lens is integrating with Bard, Google's AI-powered chatbot experience. Users will be able to include images in their Bard prompts, and Lens will work behind the scenes to help Bard make sense of what is being shown. For instance, if Bard is shown a picture of shoes and asked what they are called, it will respond based on Lens' analysis.
It's the latest update to Bard, Google's answer to ChatGPT, as the company focuses intently on generative AI technologies. Just last week, Google introduced a capability that lets Bard write, execute, and test its own code behind the scenes, improving its ability to program and solve complex mathematical problems. In May, Google also partnered with Adobe to bring AI art generation to Bard.