app or plugin to search by place name for photos on iOS and OSX?

jonyotten

Smack-Fu Master, in training
44
Upon closer examination of the search results… My Seattle photos were surfaced, but I have no explanation for the first photo. Those Polar Bears reside at the Como Park Zoo in Saint Paul, the file isn’t otherwise geotagged, and its capture date doesn’t overlap any of the Seattle files. A mystery for another time — it’s not holding me back.
thanks again for your help. i'm also digging more here.
so it looks like my 80 results for seattle are in part based upon some kind of OCR recognition? it even highlights the word it is keying on in the image as you can see below.
but does anyone know how the GPS results work though? like aside from this OCR type result which seems rather straightforward?
like where is it pulling its data from? Apple Maps? like if i search for something in Apple Maps and get a result, should i expect a picture taken with GPS data to also be shown when searching in Photos?(!)
or does anyone have any more info on this topic?
 

Attachments

  • IMG_6570.jpeg (152.5 KB)
  • IMG_6571.jpeg (105.9 KB)

jaberg

Ars Praefectus
3,660
Subscriptor
Search for photos on iPad said:
Swipe from the left edge of the screen or tap to show the sidebar, then tap Search. View photos in the suggested categories, or use the search field at the top of the screen to search by any of the following:
Date (month or year)
Place (city or state)
Business names (museums or restaurants, for example)
Category (beach or sunset, for example)
Events (sports games or concerts, for example)
A person identified in your People album (see Find and identify people in Photos)
Text (an email address or phone number, for example)
Caption (see See photo and video information)
The person who added the photo to the library (see Set up or join an iCloud Shared Photo Library in Photos on iPad)
 

jonyotten

Smack-Fu Master, in training
44
Yes, a search of photos will search for text that appears within the photo. It’s been a feature for several years.
ok. thanks. it's taking me some time obviously. so i appreciate your help.
i wonder if anyone has any good info on what data it is using to provide results for Place searches. is it tied to Apple Map data? would i expect similar terms used in both apps to both return results? what kind of Places searches can you run beyond a proper City name? is it all - aside from the image text read results - being tied to a latitude and longitude that is embedded in the photo?
is it searching metadata (IPTC i guess) info placed inside the image "manually" for instance from within software like Aperture?
 

cateye

Ars Legatus Legionis
11,760
Moderator
jonyotten said:
what kind of Places searches can you run beyond a proper City name? is it all - aside from the image text read results - being tied to a latitude and longitude that is embedded in the photo?

Yes, it is completely dependent on the embedded location data (latitude/longitude) in the photo itself. What that returns (a city, a region, or a specific location like a restaurant or building) depends on how Apple's heuristics interpret that location data, and on how closely you are zoomed into the map: a wide zoom resolves general locations, while zooming in resolves more specific ones.
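For the curious, the embedded geotag is just a handful of EXIF fields: GPSLatitude and GPSLongitude stored as degree/minute/second rationals, plus N/S and E/W hemisphere references. A minimal sketch (Python, with made-up downtown-Seattle coordinates) of converting those rationals into the decimal degrees a map lookup would actually use:

```python
from fractions import Fraction

def dms_to_decimal(dms, ref):
    """Convert EXIF-style (degrees, minutes, seconds) rationals plus a
    hemisphere reference ('N'/'S'/'E'/'W') into decimal degrees."""
    degrees, minutes, seconds = (float(Fraction(*r)) for r in dms)
    decimal = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative in decimal notation.
    return -decimal if ref in ("S", "W") else decimal

# Example: a photo tagged 47° 36' 21.6" N, 122° 19' 55.2" W.
lat = dms_to_decimal([(47, 1), (36, 1), (216, 10)], "N")
lon = dms_to_decimal([(122, 1), (19, 1), (552, 10)], "W")
print(round(lat, 4), round(lon, 4))  # 47.606 -122.332
```

Reading those raw fields out of a file takes a third-party library (Pillow, exifread, etc.); the conversion itself is just this arithmetic.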

For example: You're on vacation in Seattle, and go have a nice dinner at Anthony's. You take several pictures of your meal. Apple's heuristics may (or may not) recognize that those photos were taken at Anthony's. After dinner, stuffed, you take a walk down the pier and keep taking photos. You're no longer at Anthony's, but depending on the location data your phone attaches to the photo, the heuristics may still indicate the photo was taken at Anthony's. Or they may just generically say Pier 66, Seattle. You could then manually edit the location data of the photos to indicate an exact location, separating the ones that were taken while you were having dinner from the ones taken while you were on your walk.
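The kind of snapping described above can be sketched as a nearest-neighbor lookup against known points of interest. This is only an illustration, not Apple's actual algorithm, and the coordinates and place names below are invented: match the photo to the nearest POI within some radius, otherwise fall back to a generic label.

```python
import math

# Hypothetical points of interest near Seattle's waterfront (lat, lon).
POIS = {
    "Anthony's Pier 66": (47.6106, -122.3480),
    "Seattle Aquarium": (47.6076, -122.3430),
}

def haversine_m(a, b):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def label_photo(coord, radius_m=75):
    """Return the nearest POI within radius_m meters, else a generic label."""
    name, dist = min(((n, haversine_m(coord, p)) for n, p in POIS.items()),
                     key=lambda t: t[1])
    return name if dist <= radius_m else "Seattle"

print(label_photo((47.6107, -122.3481)))  # a few meters from Anthony's
print(label_photo((47.6030, -122.3350)))  # well down the pier walk
```

The radius threshold is the interesting knob: too wide and your pier-walk photos all "stay" at the restaurant, which is exactly the misattribution described above.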

Or, as you've already noticed, you may take a picture of your menu and on that menu are the words "Anthony's Pier 66, Seattle". Searching for Seattle would find that photo regardless of its location data because it is able to search intelligently for words in photos (as others have noted) on any modern iPhone, iPad, or Mac.
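That dual matching — geotag-derived place label on one side, text recognized in the image on the other — can be sketched as a search over two index fields. The filenames and index structure below are hypothetical (the real Photos index is opaque):

```python
# Hypothetical per-photo index: a place label derived from the geotag,
# plus any text recognized in the image.
PHOTOS = {
    "IMG_0001.jpeg": {"place": "Anthony's Pier 66, Seattle", "text": ""},
    "IMG_0002.jpeg": {"place": "",  # no geotag at all
                      "text": "Anthony's Pier 66 Seattle dinner menu"},
    "IMG_0003.jpeg": {"place": "Saint Paul", "text": "Como Park Zoo"},
}

def search(term):
    """Case-insensitive match against either the place label or the OCR text."""
    t = term.lower()
    return sorted(name for name, meta in PHOTOS.items()
                  if t in meta["place"].lower() or t in meta["text"].lower())

print(search("seattle"))  # hits via the geotag label AND via recognized text
print(search("como"))     # hits only via recognized text
```

This is why a search for "Seattle" can surface a photo with no location data at all, as long as the word appears somewhere in the image.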

This Apple Knowledge Guide article explains broadly how to view locations (on a Mac).

This Apple Knowledge Guide article explains how to manually change that location data (among other things).

How any of this works in Aperture is unknown, because Aperture was developed (and abandoned) before any of these heuristics or machine-learning-based systems existed. As I mentioned in your locked thread, it is not relevant to this conversation in 2024.
 

jaberg

Ars Praefectus
3,660
Subscriptor
You could then manually edit the location data of the photos to indicate an exact location, separating the ones that were taken while you were having dinner, and the ones taken while you were on your walk.
Yes…with some weird Apple Black Box exceptions. Try as I might, there are photos in my own collection that were taken at the Minneapolis Institute of Art that insist they were made at the adjacent Minneapolis College of Art and Design — despite multiple location edits to correct the (minor) error. This is the most prominent example in my mind — there are others.


Still faster than a card file and drawers of negatives. ¯\_(ツ)_/¯
 

continuum

Ars Legatus Legionis
94,897
Moderator
As with everything Apple, if you want to use an Apple solution, you have to accept that the many Apple Black Boxes you have no control over will get their assigned task or process right 9 times out of 10, but that 1 time will drive you absolutely bonkers.
++;

And when you try other products you may find the black boxes are still black boxes, but they often work less well, and you will still be frustrated!