Detect Frames with Green Dot?

stevenkan

Ars Legatus Legionis
15,662
My time-lapse camera rig takes interior photos of my bee hive every 10 minutes. I installed a new queen 10 days ago, and she was released from her tiny cage about 5 days ago. See the little wooden box at the bottom of the photo, here:

2024-05-21-11-00-11.jpg


Now that she is out and running around, she might be visible in one of the subsequent photos, but she's very difficult to find by eye when scanning through hundreds or thousands of images. But, because she is a commercially-purchased queen, she is conveniently marked with a green dot of paint.

In 2021, with a vastly inferior setup, I did get lucky and find her once:

CombBuilderTimeLapse2021_Queen1.jpg


Is there a tool that can detect a frame of video that has a region of at least M x N pixels that is at least XX% “green,” for various values of green?

Thanks!
 

cerberusTI

Ars Tribunus Angusticlavius
6,449
Subscriptor++
There are at least two easy ways I can think of to do this (for some definition of easy):

1) Get something like https://imagemagick.org/index.php, filter the image by color, and check the total luminosity of the result.

You may need to play around with it a bit, but that is how I would do it in most scripting environments unless I have something specific in mind for the implementation. Getting it to find a blob of the right size may involve a few transforms if that color appears anywhere else, but it should be possible. It has a lot of options.
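The same filter-then-count idea can be sketched in Python with Pillow (ImageMagick's `-fuzz`/`-opaque` options do the equivalent on the command line). The target green, tolerance, and threshold below are placeholders, to be replaced with values sampled from a real frame:

```python
# Sketch of the "filter by color, then measure" idea from option 1.
# TARGET and TOL are hypothetical; sample the real dot color first.
from PIL import Image

TARGET = (140, 200, 110)   # placeholder paint green, RGB
TOL = 40                   # per-channel tolerance

def green_pixel_count(path):
    """Count pixels within TOL of TARGET on every channel."""
    img = Image.open(path).convert("RGB")
    count = 0
    for r, g, b in img.getdata():   # slow but simple; numpy would be faster
        if all(abs(c - t) <= TOL for c, t in zip((r, g, b), TARGET)):
            count += 1
    return count

# A ~16 px diameter dot covers roughly 200 px; flag frames above half that:
# print(green_pixel_count("frame_0001.jpg") > 100)
```

Run over a directory of frames, this gives a shortlist of candidate images rather than a definitive hit.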

2) Convert to appropriately sized images in slices, get an API subscription for OpenAI or Anthropic, then ask the latest GPT or Claude 3 to tell you whether there is a bee with a green dot of paint in each picture.
 
Last edited:
  • Like
Reactions: stevenkan

KingKrayola

Ars Scholae Palatinae
1,077
Subscriptor
Another thought: in Photoshop there is a black & white filter adjustment where you can apply a colour filter at the same time as converting to monochrome (as one would with a coloured lens filter and black and white film).

If you apply a green filter it should make the oranges and reds darker, and greens lighter. There may not be enough contrast or green tone to see the difference.
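A rough Python analogue of that green-filter monochrome conversion, for anyone without Photoshop: weight the channels toward green before flattening to grayscale, so greens come out light and oranges/reds dark. The 0.1/0.8/0.1 mix is an arbitrary choice, not Photoshop's preset:

```python
# Simulate a green lens filter + black & white conversion:
# boost the green channel's contribution to the grayscale result.
import numpy as np
from PIL import Image

def green_filtered_mono(path):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    # Heavier green weighting than the usual luma formula (0.299/0.587/0.114)
    mono = 0.1 * rgb[..., 0] + 0.8 * rgb[..., 1] + 0.1 * rgb[..., 2]
    return Image.fromarray(mono.clip(0, 255).astype(np.uint8), "L")

# green_filtered_mono("frame.jpg").save("frame_mono.png")
```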

I'd try it myself, but I'm on my phone.
 

stevenkan

Ars Legatus Legionis
15,662
Your first image is almost monochromatic. Is it lit that orange, or is it bad colour balance on the photo? As continuum said, the image quality is not conducive to this task. The photo is basically just shades of orange, so "green dot" is more like "dot the same colour as everything else".

Can you change the lighting or balance the camera?
I set up the camera and lighting poorly, but I don't want to change it now, because this is for a time-lapse, and I don't want a big color shift in the middle of the video. Once I'm done taking the photos, I'll bring the time-lapse into DaVinci Resolve and see if I can improve the color balance.

But the bees are generally yellow with black stripes, and the wax is yellow, and the wood is all light brown, so it's actually pretty monochromatic in real life as well, just not as badly as in the first photo.

I think a green dot would look significantly different from everything else, but I won't be able to tell until I spot her, which is what I'm trying to do, goto 1. 🤣
 

cerberusTI

Ars Tribunus Angusticlavius
6,449
Subscriptor++
Can you put a similar green dot under the same lighting to use as a sample?

I would start by sampling that color so you can set a color range to detect, at which point you can see how much cleanup it needs to remove any smaller dots in that color range (through despeckle or lowering the resolution to take an average color per larger area). That is hard to do without having an image where you can see the dot to test though.
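One way to sketch this recipe in Python with Pillow: build a binary mask from the sampled color, then despeckle it with a median filter so a few stray matching pixels don't trigger a false positive. The sample color and tolerance here are stand-ins for values read off a real test shot containing the dot:

```python
# Color-range mask + despeckle, per the recipe above.
import numpy as np
from PIL import Image, ImageFilter

def dot_mask(path, sample_rgb, tol=35):
    """White where the image is within tol of sample_rgb, black elsewhere,
    with isolated single-pixel matches removed by a 3x3 median filter."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=int)
    close = np.abs(rgb - np.array(sample_rgb)).max(axis=-1) <= tol
    mask = Image.fromarray((close * 255).astype(np.uint8), "L")
    return mask.filter(ImageFilter.MedianFilter(3))  # kills lone pixels

# mask = dot_mask("frame.jpg", (140, 200, 110))  # placeholder sample color
# print(np.count_nonzero(np.asarray(mask)))      # surviving "dot" pixels
```

Downscaling the image first (so each pixel is an average over a larger area) would be the other cleanup route mentioned above.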
 
  • Like
Reactions: stevenkan

cerberusTI

Ars Tribunus Angusticlavius
6,449
Subscriptor++
Another thought: in Photoshop there is a black & white filter adjustment where you can apply a colour filter at the same time as converting to monochrome (as one would with a coloured lens filter and black and white film).

If you apply a green filter it should make the oranges and reds darker, and greens lighter. There may not be enough contrast or green tone to see the difference.

I'd try it myself, but I'm on my phone.
The benefit of a command-line program like ImageMagick is that you can script it through a program or shell commands (bash, PowerShell, etc.): select the range of colors, do any image manipulation necessary to remove noise, then count pixels so you can tell whether the image had that color in any volume after processing, and, if so, circle it in red and write that image to a different directory.

It is much more automated if there are many images to process.
 

stevenkan

Ars Legatus Legionis
15,662
Can you put a similar green dot under the same lighting to use as a sample?

I would start by sampling that color so you can set a color range to detect, at which point you can see how much cleanup it needs to remove any smaller dots in that color range (through despeckle or lowering the resolution to take an average color per larger area). That is hard to do without having an image where you can see the dot to test though.
I would, except that this particular colony is super, super defensive, probably because of some Africanized Honey Bee (AHB) genetics, which is why I installed a new queen.

So I don't want to open up the hive for at least another month, by which time the new queen's children will have hatched out and moderated the "mood" of the colony a bit. By 3 months' time the whole colony should be gentle and manageable, like my other hives.

But I can fake it by editing in a green dot that I feel would be representative of what I think I'm looking for. Look dead-center:

1716352058759.png

That's downsized 50%. Full scale, it would be a ~circle with diameter ~16 pixels on an image that's 3008 x 2000.
 

cogwheel

Ars Tribunus Angusticlavius
6,691
Subscriptor
I set up the camera and lighting poorly, but I don't want to change it now, because this is for a time-lapse, and I don't want a big color shift in the middle of the video. Once I'm done taking the photos, I'll bring the time-lapse into DaVince Resolve and see if I can improve the color balance.

But the bees are generally yellow with black stripes, and the wax is yellow, and the wood is all light brown, so it's actually pretty monochromatic in real life as well, just not as badly as is shown in the first photo.

I think a green dot would look significantly different from everything, but I won't be able to tell until I spot her, which is what I'm trying to do, goto 1. 🤣
Could you take a second picture a fraction of a second before or after the one you want to use for the time lapse, but using a green light that'd make the dot stand out with much more contrast? You'd use the images with the current setup for the time lapse, and the green light image for queen locating only. Processing-wise, you could probably convert the green light image to black&white (1-bit) with a cutoff to only make the dot visible, then layer that on the timelapse frame to highlight the queen if the images were taken closely enough together.
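A sketch of this two-exposure idea in Python with Pillow, assuming the frame pairs line up closely enough: threshold the green-lit locator frame to a 1-bit dot mask, then paint that mask onto the time-lapse frame as a highlight. The filenames, threshold, and marker color are all placeholders:

```python
# Two-exposure queen highlighter: locator frame -> 1-bit mask -> overlay.
import numpy as np
from PIL import Image

def highlight_queen(timelapse_path, greenlit_path, thresh=200):
    frame = np.asarray(Image.open(timelapse_path).convert("RGB")).copy()
    locator = np.asarray(Image.open(greenlit_path).convert("L"))
    dot = locator >= thresh            # 1-bit cutoff: only the bright dot
    frame[dot] = (255, 0, 255)         # magenta marker over the queen
    return Image.fromarray(frame)

# highlight_queen("frame.jpg", "frame_green.jpg").save("frame_marked.png")
```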
 
  • Like
Reactions: stevenkan

w00key

Ars Praefectus
5,907
Subscriptor
I would, except that this particular colony is super, super defensive, probably because of some Africanized Honey Bee (AHB) genetics, which is why I installed a new queen.

So I don't want to open up the hive for at least another month, by which time the new queen's children will have hatched out and moderated the "mood" of the colony a bit. By 3 months' time the whole colony should be gentle and manageable, like my other hives.

But I can fake it by editing in a green dot that I feel would be representative of what I think I'm looking for. Look dead-center:

1716352058759.png

That's downsized 50%. Full scale, it would be a ~circle with diameter ~16 pixels on an image that's 3008 x 2000.
That seems doable: with an eye dropper I get HSL = 92, 65, 72. Open the video and, for each frame, search HSL space for H = 92 ± 10, S > 50, L > 50, then log when that happens and save the frame as a JPG.

Non-greens are H = 30-40, so the gap is significant enough to be easily detectable.


In Python, OpenCV and Pillow can do it with a ~100-line script, I think, but finding an off-the-shelf project to do so may be hard.

Something like https://gist.github.com/SebOh/5d2438c7987591757a3591495720a5e7

to read the video, and then either count pixels in Pillow, or do some thresholding in HSL space (https://stackoverflow.com/questions/48182791/how-do-you-lightness-thresh-hold-with-hsl-on-opencv), count pixels (https://stackoverflow.com/questions/45836214/opencv-python-count-pixels), and alert when the green count exceeds X.
 

stevenkan

Ars Legatus Legionis
15,662
Could you take a second picture a fraction of a second before or after the one you want to use for the time lapse, but using a green light that'd make the dot stand out with much more contrast? You'd use the images with the current setup for the time lapse, and the green light image for queen locating only. Processing-wise, you could probably convert the green light image to black&white (1-bit) with a cutoff to only make the dot visible, then layer that on the timelapse frame to highlight the queen if the images were taken closely enough together.
I could definitely rig up something different for next year's swarm season. Right now I'm using an EZOutlet5 to turn a pair of lightbulbs on, 10 seconds before firing the camera, but in previous years I was using the GPIO pins on the Pi to turn on a bank of white LEDs that I'd wired up to a breakout board with a power supply. I was hoping that having AC power and big-ass light bulbs would improve my images, but apparently I didn't do enough testing to get bulbs with good color rendition.

But reviving the LED board with green LEDs is a neat idea. Although next year the queens will be marked Blue, as the 5-year color cycle is WYRGB, or "Will You Raise Good Bees?"
 
  • Like
Reactions: continuum

stevenkan

Ars Legatus Legionis
15,662
That seems doable: with an eye dropper I get HSL = 92, 65, 72. Open the video and, for each frame, search HSL space for H = 92 ± 10, S > 50, L > 50, then log when that happens and save the frame as a JPG.

Non-greens are H = 30-40, so the gap is significant enough to be easily detectable.


In Python, OpenCV and Pillow can do it with a ~100-line script, I think, but finding an off-the-shelf project to do so may be hard.

Something like https://gist.github.com/SebOh/5d2438c7987591757a3591495720a5e7

to read the video, and then either count pixels in Pillow, or do some thresholding in HSL space (https://stackoverflow.com/questions/48182791/how-do-you-lightness-thresh-hold-with-hsl-on-opencv), count pixels (https://stackoverflow.com/questions/45836214/opencv-python-count-pixels), and alert when the green count exceeds X.
OMG. I hope you just copied and pasted that from somewhere else!

I'll see if I can grok what that code is doing.
 

cogwheel

Ars Tribunus Angusticlavius
6,691
Subscriptor
I could definitely rig up something different for next year's swarm season. Right now I'm using an EZOutlet5 to turn a pair of lightbulbs on, 10 seconds before firing the camera, but in previous years I was using the GPIO pins on the Pi to turn on a bank of white LEDs that I'd wired up to a breakout board with a power supply. I was hoping that having AC power and big-ass light bulbs would improve my images, but apparently I didn't do enough testing to get bulbs with good color rendition.

But reviving the LED board with green LEDs is a neat idea. Although next year the queens will be marked Blue, as the 5-year color cycle is WYRGB, or "Will You Raise Good Bees?"
You could make the LED board with multiple LED channels: one with high-CRI white LEDs for the time lapse itself, and either one RGB LED channel (technically three channels, unless you use something like WS2812s) or five separate color channels (one optimized for each year's queen-dot color) if the RGBs have problems with the W and Y dots.
 

w00key

Ars Praefectus
5,907
Subscriptor
OMG. I hope you just copied and pasted that from somewhere else!

I'll see if I can grok what that code is doing.
Haha, yeah, I linked to a Gist and the forums expanded it... and refused to leave it as a link.

But the key lines are easy to spot: just open the video, then extract the frames. In the other examples, they convert it to HSL and then do stuff with it.
 

Xelas

Ars Praefectus
5,444
Subscriptor++
Does the ink fluoresce under UV? If so, then a quick shot of UV can make it pop right out, or perhaps keeping a low-intensity UV light on will be enough to keep the ink "charged" and glowing. There seem to be some inks on the market specifically for marking bees that are fluorescent.
If you take a black-light shot and then very quickly afterwards a normal shot (before the queen moves away or gets hidden), and then superimpose them, that might be enough to make the queen stand out very well. The glow may even be enough to make the queen easier to find if she is partially or mostly obscured.

If you maintain that WYRGB scheme, finding the W or Y will be especially hard, but if the ink glows under UV, then it will be easier to filter for and detect regardless of what its normal color is, and then you won't need to substantially change your process every year.
I really recommend including a color calibration chart of some sort within the setup. There are a bunch commercially available, but even a DIY one will make things like setting the white point and calibrating colors MUCH easier. Whatever color/ink is used for the queen must also be added to that card so that you can tell exactly what to look for in the frame.
Color calibration cards are used anywhere color accuracy is important. Even spacecraft carry them, and they are in every shot (although sometimes edited or cropped out for publication). Even the venerable Voyagers had them.
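As an illustration of why the card helps: with a known neutral patch in every frame, white balance reduces to a per-channel scale. The patch location below is a placeholder for wherever the card sits in frame, and a real version would average a small region rather than read a single pixel:

```python
# Gray-card white balance: scale channels so a known neutral patch
# comes out neutral. PATCH_XY is a hypothetical card position.
import numpy as np
from PIL import Image

PATCH_XY = (10, 10)

def white_balance(path):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    x, y = PATCH_XY
    patch = rgb[y, x]                 # sampled card color (one pixel here)
    gains = patch.mean() / patch      # per-channel correction factors
    return Image.fromarray((rgb * gains).clip(0, 255).astype(np.uint8))

# white_balance("frame.jpg").save("frame_wb.png")
```

Applied consistently, this would also pull the orange cast out of the time-lapse frames before any green-dot detection runs.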