Questions about matching

If identification is still running for an encounter, will the options on the photo include ‘no matchable detection’ rather than ‘match results’? If so, is there ever a case where matching fails and the photo continues to say ‘no matchable detection’, and if so, is there a way to distinguish the two situations?

For one encounter that was bulk imported with IDs already provided, I am noticing that the top proposed match is the same individual. My assumption is that I do not need to accept this match, since the software should already know it is the same individual given that the IDs provided in the upload were the same. Is this correct? Or is further action required before I can use the photos I have bulk uploaded (with IDs provided) as a catalog to match new photos against?

For that same encounter I linked to above, which is a left-side image, the image shown for the third-best match from the PIE algorithm is a right-side image. Is this expected behaviour of the software? It makes it challenging to visually assess the match; I suppose the computer may be able to mirror the image to assess the fin shape match, but it’s not as easy for a human verifier.

Something else I just noticed, which may be related to my second question: I just did a second bulk import (it is in the process of being sent to annotation), for which the IDs were also included in the metadata. It included some additional photos of animals from the first import. I am noticing now that when I go to My Data > View My Individuals, the second bulk import has created a second individual with the same ID, rather than merging the new photos into the existing individual. Is this the intended behaviour or a bug? How can I merge these duplicate individuals?

If I recall correctly, “match results” is greyed out while identification is still underway. Generally, you can start another match if the results aren’t loading correctly and you know there are no big jobs in the queue.

In today’s case, we’re still in the midst of addressing disk space issues in Flukebook, so you may want to hold off on re-running matches for now.

You will always need to confirm matches. While the algorithm does a good job of identifying matches, we rely on users to give the final confirmation on accepting a match to ensure accuracy.

Yes, this is intended behavior. I’ve seen this in Internet of Turtles when it tries to match the scales on one side of the face with a turtle facing the opposite direction. The algorithm is essentially saying, “I think these images are similar, can you tell me if I’m right?” and then we are free to dismiss anything that’s not an obvious match. You can ignore matches that show the opposite side of your animal when you’re unsure that there’s a clear match between them.

To clarify: even in the case where the IDs are already provided for both images being matched? I just uploaded 2,000 photos of ~75 animals (so many photos of some animals, but with the ID provided in the metadata for all) and am confused about whether I need to go in for each of those photos and say ‘yes, I gave these photos the same ID because they are the same animal’, or whether Flukebook does that automatically.

If I am understanding correctly, I may submit this as a feature request then: since Flukebook has already detected L vs. R, it would be ideal if it could show an image of the matching side. This is especially relevant when matching by individual scores instead of image scores, since the only image shown for that individual could be of the opposite side.

Ah, thanks for clarifying that. In that case, you can view those match results to see if any new matches come up for that individual. Otherwise, you don’t need to do anything extra.

I don’t see why not. I did find an older feature request related to viewpoint issues in matches. It’s helpful for us to get this feedback so that we can plan improvements accordingly.

Hi @Anastasia - just flagging this question from above, in case it got missed in the back and forth.

I did miss that one; thanks for calling it out.

Can you post the URLs for the duplicated individuals so I can research this?

Sure thing: [Flukebook link]

Or, if this is more helpful, these are the two pages for the same animal (J16): [Flukebook link] and [Flukebook link] - though there are >60 individuals with the same issue.
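
For context on that count, here is a rough way the overlapping IDs could be tallied from the two import spreadsheets. This is only a sketch: it assumes the sheets are exported as CSV, and the filenames and the ID column name are placeholders rather than my actual headers.

```python
import csv

# Sketch: list individual IDs that appear in both bulk-import spreadsheets
# (exported to CSV). Filenames and the ID column name are placeholders.
ID_COL = "MarkedIndividual.individualID"  # adjust to the actual column header

def ids_in(path):
    """Collect the non-empty individual IDs from one import spreadsheet."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[ID_COL].strip() for row in csv.DictReader(f) if row.get(ID_COL)}

overlap = sorted(ids_in("bulk_import_1.csv") & ids_in("bulk_import_2.csv"))
print(f"{len(overlap)} IDs appear in both imports and may have duplicate pages:")
for individual_id in overlap:
    print(" ", individual_id)
```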

Thanks! This is helpful. Let me look into this and I’ll get back to you.

Can you email me the spreadsheet for the second bulk import you did? services at wildme dot org.

Sent - related to that import, the upload didn’t accept Encounter.submitter0.fullName - do you know why that would be?

Thanks for sending that! According to the Fields available section of the Bulk Import doc, Encounter.submitter0.fullName doesn’t save unless submitter0.emailAddress is also provided.
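
If it helps for future imports, here is a quick way to pre-check a spreadsheet for that pairing before uploading. This is just a sketch: it assumes the sheet is exported to CSV, the filename is a placeholder, and the column headers should be double-checked against your actual sheet.

```python
import csv

# Sketch: flag rows where submitter0.fullName is filled in but
# submitter0.emailAddress is empty, since the name won't be saved without it.
# "bulk_import.csv" is a placeholder filename; column headers are assumed to
# follow the Bulk Import field naming and should be checked against your sheet.
NAME_COL = "Encounter.submitter0.fullName"
EMAIL_COL = "Encounter.submitter0.emailAddress"

with open("bulk_import.csv", newline="", encoding="utf-8") as f:
    # Data rows start at 2 because row 1 is the header.
    for row_num, row in enumerate(csv.DictReader(f), start=2):
        name = (row.get(NAME_COL) or "").strip()
        email = (row.get(EMAIL_COL) or "").strip()
        if name and not email:
            print(f"Row {row_num}: {NAME_COL} is set but {EMAIL_COL} is empty, "
                  f"so the name will not be saved.")
```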

I’m not seeing anything odd in your spreadsheet, so I’m going to report this as a bug to our devs. I don’t have an ETA for a fix at this time, but when we have one, I’ll share it here.

Oh, and for general advice on merging individuals, we have different instructions for that depending on which page you’re accessing.

You don’t have to do that for this bug report, though. We need to be able to see those duplicates for now in order to troubleshoot.

Hi @Anastasia - just wondering if you have any updates on the timeline for fixing this bug?

No, not at this time, unfortunately. This shouldn’t prevent you from working on your encounters.

Hi Anastasia, the next step I was hoping to take is a formal quantification of how well the matching works for our population, and that wouldn’t be a proper evaluation if there are duplicates of individuals. So, unfortunately, I will be holding off on further processing until this is resolved. Alternatively, if you think it will be some time, I could manually resolve the matches, but as you mentioned, that may not be ideal if those duplicates are needed for troubleshooting.

I understand. I’ll follow up as soon as I have more info on next steps.

Good morning! I need to get more information from you so we can try to reproduce the duplicate individual page issue:

  • What operating system were you using? (e.g., macOS 10.15.3)
  • What web browser were you using? (e.g., Chrome 79)