White Shark Dorsal Edge Annotator not functioning

What Wildbook are you working in?

SharkBook

What is the entire URL out of the browser, exactly where the error occurred?

Part 1:

Part 2:

Can you describe what the issue is you’re experiencing?

Part 1:
I have uploaded about 20 individual images of White Shark dorsal fins, and the AI is rarely able to annotate the presence of a dorsal fin. I have waited multiple days in case the processing queue is backed up by other users. After refreshing the page several times, deleting the encounter, and re-uploading the image, I click the box with three lines on the fin image to open the ‘Match Results’ tab, and the message ‘cannot start match’ is displayed, along with no green annotation bounding box. This is part 1 of the issue: what are possible causes of the AI annotations not functioning correctly?

Part 2:
From this point, I work through the process manually: adding an annotation with the ‘white_shark+fin_dorsal’ iaClass, drawing an accurate bounding box around the fin, then clicking Save. The manual process works; however, when I click ‘Start Match’, the algorithm gets stuck at either ‘Waiting for results. The machine learning queue is working.’ or ‘attempting to fetch results’, typically the latter. The parameters I set for the match are: Location = Pacific; Project Selection = California White Sharks (my project, with ~2,300 encounters); MiewID matcher; CurvRank v2 dorsal edge-matcher; PIE v2 pattern-matcher. The CurvRank algorithm tends to work, while MiewID and PIE tend to get stuck as described above. I understand the size of my project can lead to slow speeds, but I have waited over a week and still see the ‘waiting…’ messages. Is there a way to get these algorithms functioning correctly from my end?

I have attached two of our dorsal fin images in case anyone would like to try to reproduce these errors:
AN20112402_M_12 (54063)

AN20112403_U_13.5

Hi @Dmoran, welcome!

I’ll need some time to research why detection hasn’t been working for you and also why identification for certain algorithms is incomplete after all this time. It’s not typical for it to take that long.

I’m also going to look into correcting the spelling we have in Sharkbook for Año Nuevo.

I’ll post here if I have any follow-up questions or an update.

Thank you so much for the help!


I tested uploading a new encounter with your photos. The second one was detected correctly and had an annotation added automatically. The first one did not get detected.

I suspect it’s a combination of the image being too closely cropped and its low resolution. Images with a side under 480px are scaled up, which can introduce blurriness and in turn makes detection and ID a little more challenging. Both of these images are pretty small (503 x 412 and 644 x 469). You can refer to the Photography Guidelines in our docs to improve identification success.
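If it helps to screen images before uploading, here’s a minimal sketch (assuming the Pillow library; the 480px threshold comes from the note above, and the filename is just a placeholder) that checks whether an image’s shorter side meets the guideline:

```python
# Pre-upload resolution check: a hedged sketch, not part of Sharkbook itself.
# Requires Pillow (pip install Pillow).
from PIL import Image

MIN_SIDE = 480  # per the note above, smaller images get scaled up and may blur


def meets_guideline(path: str) -> bool:
    """Return True if the image's shorter side is at least MIN_SIDE pixels."""
    with Image.open(path) as img:
        width, height = img.size
    ok = min(width, height) >= MIN_SIDE
    status = "OK" if ok else "below guideline; may be upscaled and blurred"
    print(f"{path}: {width}x{height} -> {status}")
    return ok


# Example: an image of 503 x 412 would fail this check, since 412 < 480.
meets_guideline("AN20112402_M_12.jpg")  # placeholder filename
```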

I was able to get match results for PIE v2 and MiewID in testing for a single location ID, but I also re-ran the match for this Encounter using your match parameters to see whether a hiccup in WBIA prevented the initial matches from completing or whether something else is going on. I’m going to revisit this in the morning to check on its progress and continue investigating.

@Anastasia Thank you for getting back to us! This might be an ongoing issue, as our database consists of thousands of these cropped images, which are screengrabs from video. Do you think we will just be limited when our images fall below those size guidelines?

It’s a possibility. I’ve been able to upload screenshots from videos before, but I typically do it from my desktop computer, so the larger viewing area results in a cropped image that is still within the recommended size guidelines.
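As a general workaround (not a Sharkbook feature), if the original video files are available, frames can be pulled at the video’s native resolution with ffmpeg instead of taking OS screenshots. A hedged sketch; the ffmpeg binary must be installed separately, and the video path and timestamp below are placeholders:

```python
# Extract a single full-resolution frame from a video with ffmpeg.
# This sidesteps the screen-size limit of OS screenshots entirely.
import subprocess


def extract_frame(video_path: str, timestamp: str, out_path: str) -> None:
    """Save the frame at `timestamp` (HH:MM:SS) to `out_path` as an image."""
    subprocess.run(
        [
            "ffmpeg",
            "-ss", timestamp,   # seek before decoding for speed
            "-i", video_path,
            "-frames:v", "1",   # grab exactly one frame
            out_path,
        ],
        check=True,
    )


extract_frame("shark_survey.mp4", "00:01:23", "frame_000123.png")  # placeholders
```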

I think the most likely hurdle will be having to manually annotate images that can’t complete detection. Based on what I’ve seen so far, they should still work for identification.

I’m seeing some issues in WBIA with a handful of Sharkbook jobs, so I’m working with my teammates to resolve that so we can get those other algorithm results displayed for you.

I just wanted to check in to let you know we’re still looking into this. Thanks for your patience.

Thank you! I have been continuing uploads with the known caveat that some images may be too cropped for annotation and recognition to function perfectly. Thanks again!


Have you run into any other issues with new Encounters not displaying your match results? I learned that MiewID was temporarily disabled on Sharkbook this weekend, but let me know if you’re not seeing any PIE v2 results.

Thanks for hanging in there! There was an issue specifically with the way identification was configured for white sharks that was causing these errors in WBIA. You should now see match results for PIE v2 and CurvRank. You may need to start a new match on any encounters that aren’t displaying match results.

MiewID will remain temporarily disabled on Sharkbook until we make some GPU upgrades to Sharkbook on our side late next week.

Matches seem to have been working well since that fix, thank you! I still run into cases where annotation does not work on lower-resolution images, so I might have to stick to manual annotations. Our database is mainly screengrabs of videos, so some data attrition will likely remain, but it’s good to know the cause. I have some small additional feature requests regarding quality reports in bulk imports, but I will start a separate thread for that. Thanks again for the help!
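In case it’s useful for the bulk-import workflow, here is a small pre-flight sketch (again assuming Pillow; the folder name and extensions are placeholders) that lists screengrabs below the 480px guideline so they can be earmarked for manual annotation:

```python
# Flag low-resolution screengrabs before a bulk import: a hedged sketch.
from pathlib import Path

from PIL import Image

MIN_SIDE = 480
folder = Path("screengrabs")  # placeholder folder of images to be imported

for path in sorted(folder.glob("*")):
    if path.suffix.lower() not in {".jpg", ".jpeg", ".png"}:
        continue
    with Image.open(path) as img:
        width, height = img.size
    if min(width, height) < MIN_SIDE:
        print(f"{path.name}: {width}x{height} -> likely to need manual annotation")
```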
