I hope you are well. I’m afraid we may have detected a new bug on the platform. Since yesterday afternoon, it seems impossible to start a match after creating new annotations; the platform displays “cannot start match”. This seems to be the case for all species and for all bulk imports.
Example of bulk import: Gm.Expe2023.7.15.xlsx
For the melon-headed whale (Peponocephala electra), the problem seems to be more serious: no “start match” button appears at all after creating an annotation.
Example of bulk import: Pe.Expe2021.5.9.xlsx
Hi @rebeca
I looked at these imports, and they’ve only gone through detection; they haven’t been sent to identification yet. You’ll need to send them to identification before you’ll see the option at the encounter level to start another match.
Okay, I understand. But previously, during the detection phase, if we added annotations before sending them to identification, we could start matches; that is no longer the case… Can we now add annotations without having to launch a match before sending them to identification?
As far as melon-headed whales are concerned, isn’t this a problem for the future?
Digging into this a bit more, it doesn’t look like any of the encounters in this import were actually annotated, despite detection being complete. I’m going to re-send this through detection to see if it behaves this time.
I suspect this is related to one of the bug fixes from the latest release notes:
Removed ‘Start Match’ / ‘Start Another Match’ from unmatchable manual annotation #477
If Flukebook considers the annotation unmatchable, it won’t let you start a match from it. In this case, it means that we don’t have an identification algorithm yet for that species. It looks like we are still working on this, and I’ll check in with the machine learning team for an update.
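To clarify what “unmatchable” means here, a very rough hypothetical sketch (Python, names invented for illustration; this is not Flukebook’s actual code): the “Start Match” option is only offered when an identification algorithm is configured for the annotation’s species.

```python
# Hypothetical sketch for illustration only -- not Flukebook's actual code.
# It mimics the behavior described above: an annotation counts as
# "unmatchable" when no identification algorithm is configured for its
# species, so no "Start Match" button is offered for it.

# Assumed example set of species with an ID algorithm configured.
SPECIES_WITH_ID_ALGORITHM = {
    "Globicephala macrorhynchus",  # assumed entry, for illustration
    # "Peponocephala electra" absent -> its annotations appear unmatchable
}

def can_start_match(species: str) -> bool:
    """Return True if the 'Start Match' option should be shown."""
    return species in SPECIES_WITH_ID_ALGORITHM

print(can_start_match("Globicephala macrorhynchus"))  # True  -> button shown
print(can_start_match("Peponocephala electra"))       # False -> button hidden
```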
Quick update: ID for melon-headed whales was already in place, but the ticket I linked to had the unintended result of making otherwise matchable annotations unmatchable. When we’re done writing up a ticket for it, I’ll share the link here. Thanks for finding this bug!
Hi Anastasia,
First of all, thanks a lot for the updates and the quick responses.
I had time to look at the bulk imports, and it looks like we can now manually start matches after creating manual annotations.
However:
-For the melon-headed whale, the “start match” option is still missing, regardless of whether the annotation was generated manually or by the detection algorithm. Can it be added?
-For the pilot whale, we can now create annotations and manually start matches, but we had sent a bulk import where the manual annotations were created while the start match problem was occurring. The ID results for those manual annotations don’t display properly: we can’t visualize the fins and compare them to the proposed matches. Is there a simpler solution than re-importing the bulk import and redoing the manual annotations?
This bulk import is: Gm.Expe2023.7.13.xlsx
An example of a line where we have an ID problem (manual detection): 21ba22a9-74f3-430a-8f1f-5298630e37a3
And one without an ID problem (detection by the algorithm): c44eacc8-dc5a-4489-8751-18df1220f98f
This should be resolved now. Can you try re-running matches on the encounters with manual annotations? In our testing, the source image loaded correctly when the match was re-run after the code fix.
The missing “start match” option for melon-headed whales has been a trickier problem to solve, and we’ll need more time to continue working on it.
We’ve found the error that was preventing manual annotations on melon-headed whales from being matchable. You should be able to start matches on your manually annotated images now!