Google Lens began widely rolling out to Android in March and got a big update at I/O 2018 with real-time results and smart text selection. The visual search feature is now rolling out to the Google app for iOS.

In March, Google Lens came to iOS via Google Photos. This new implementation in the Google app features a live viewfinder instead of requiring users to first snap a picture and add it to the camera roll. The search bar at the top of the Google app now includes a Lens icon next to voice search.

On initial launch, users will be asked to grant camera permission to the app and agree to other terms of service, and are shown a tutorial noting the ability to “Scan Text” and “Shop Smarter.”

The former has Google Lens grab the text from an image for analysis, including looking up words, saving an email address, or starting a phone call. The latter helps find similar-looking products, like shoes or other clothing.

Google Lens has a very straightforward interface. Users can tap on an object or text in the viewfinder to begin an analysis. Points of interest will be highlighted, with results sliding up from the bottom sheet. Controls include turning on the flash to better illuminate a scene and opening your photo gallery to have Lens analyze an existing image.

The overflow menu in the top-right allows users to send feedback. Other capabilities include:

  • Copy Text: Try phone numbers, dates, and addresses
  • Search Similar Products: Try clothing and furniture
  • Identify Plants and Animals: Try flowers and dog breeds
  • Discover Books & Media: Try books, movies, music albums, and video games
  • Scan Codes: Try barcodes and QR codes

Google Lens is beginning to widely roll out today, with users already spotting it in the Google app for iOS.

You’ve always wanted to know what type of 🐶 that is. With Google Lens in the Google app on iOS, now you can → https://t.co/xGQysOoSug pic.twitter.com/JG4ydIo1h3

— Google (@Google) December 10, 2018
