
A New Era of Markerless Insect Tracking Technology Has Been Unlocked by Retro-ID


Abstract and 1. Introduction

  2. Related Works
  3. Method
  4. Results and Discussion
  5. Conclusion and References

2. Related Works

Explicit recognition of retro-id’s value as distinct from re-id, and the need to test its performance, are, to the best of our knowledge, novel. Re-id, however, is well researched for human faces [12, 13, 19, 20, 24], and somewhat so for insects [2–4, 11, 14–16]. Insect re-id algorithms may rely on small markers or tags attached to an insect to track it over separate observations [2, 4, 14, 15]. Six ant colonies were monitored using tags over 41 days, collecting approximately nine million social interactions to understand their behaviour [14]. BEETag, a tracking system using bar codes, was used for automated honeybee tracking [4], and Boenisch et al. [2] developed a QR-code system for honeybee lifetime tracking. Meyers et al. [15] demonstrated automated honeybee re-id by marking their thoraxes with paint, while demonstrating the potential of markerless re-id using their unmarked abdomens. Markerless re-id has been little explored. The study of giant honeybees’ wing patterns using size-independent characteristics and a self-organising map was a pioneering effort in non-invasive re-id [11]. Convolutional neural networks have been used for markerless fruit fly re-id [16], and triplet-loss-based similarity learning approaches have also been used to re-id bumble bees returning to their nests [3].

All these studies adopt chronological re-id despite many highly relevant scenarios where this is inefficient. Our study therefore explores retro-id as a novel complementary approach to tracking individual insects for ecological and biological research.

3. Method

3.1. Data Collection

We obtained reed bees from the Dandenong Ranges National Park, Victoria, Australia (lat. -37.90, long. 145.37)[1]. These bees exhibit semi-social behaviour and construct their nests within the pithy stems of fern fronds and other plants [5]. Each nest can consist of several females who share brood-rearing and defence responsibilities. We placed each insect in a separate container to facilitate individual identification during testing. To run the experiment over several days, insects were refrigerated overnight below 4°C. After warming, each bee was recorded individually each day in an arena, where it was illuminated by an overhead ring light and videoed using a Dino-Lite digital microscope for 30–50 seconds per session at 30 fps. We followed the process listed below to create our final datasets.


  1. Video Processing: Bee videos were processed frame by frame. To automate this, we trained a YOLO-v8 model to detect a bee’s entire body, head, and abdomen in each frame. This enabled the bee’s orientation in each frame to be established automatically.


  2. Image Preparation: Upon detection, bees were cropped from the frames using the coordinates provided by Step 1. To align bees, we rotated frames using a bee’s orientation before cropping. A 400×400 pixel region (determined empirically for our bee/microscope setup), centred on the detected bee body, was cropped and then resized to 256×256.


  3. Contrast Adjustment: To enhance image quality and ensure uniform visibility across all samples, Contrast Limited Adaptive Histogram Equalisation (CLAHE) [18] was applied.


  4. Quality Control: Misidentified objects were removed by manual inspection to maintain dataset integrity and ensure that only bee images were included.


  5. Dataset Segregation: The final dataset was divided into image subsets, each from a single session, to avoid temporal data leakage.

Using Steps 1–5, we curated a dataset of daily bee recording sessions across five consecutive days. Each session included the same 15 individuals, videoed to yield approximately 1,200 images per session (approximately 90K images in total). A code sketch of Steps 1–3 follows.
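The sketch below assumes the ultralytics YOLO-v8 API and OpenCV; the class names ("body", "head", "abdomen"), the weight file, the crop-boundary handling, and the CLAHE parameters are illustrative assumptions rather than an exact record of our pipeline.

```python
# Minimal sketch of Steps 1-3: detect, align/crop, and contrast-adjust one frame.
import cv2
import numpy as np
from ultralytics import YOLO

detector = YOLO("bee_body_head_abdomen.pt")  # hypothetical fine-tuned YOLO-v8 weights

def centre_of(box):
    x1, y1, x2, y2 = box
    return (x1 + x2) / 2.0, (y1 + y2) / 2.0

def process_frame(frame):
    result = detector(frame, verbose=False)[0]
    boxes = {result.names[int(c)]: b.tolist()
             for c, b in zip(result.boxes.cls, result.boxes.xyxy)}
    if not {"body", "head", "abdomen"} <= boxes.keys():
        return None  # full detection required to establish orientation

    # Step 1: orientation from the head-abdomen axis (convention assumed here).
    hx, hy = centre_of(boxes["head"])
    ax, ay = centre_of(boxes["abdomen"])
    angle = np.degrees(np.arctan2(hy - ay, hx - ax))

    # Step 2: rotate about the body centre, crop 400x400, resize to 256x256.
    bx, by = centre_of(boxes["body"])
    h, w = frame.shape[:2]
    rot = cv2.getRotationMatrix2D((bx, by), angle, 1.0)
    rotated = cv2.warpAffine(frame, rot, (w, h))
    x0, y0 = int(bx) - 200, int(by) - 200
    if x0 < 0 or y0 < 0 or x0 + 400 > w or y0 + 400 > h:
        return None  # bee too close to the frame edge for a full crop
    crop = cv2.resize(rotated[y0:y0 + 400, x0:x0 + 400], (256, 256))

    # Step 3: CLAHE on the luminance channel for uniform contrast.
    lab = cv2.cvtColor(crop, cv2.COLOR_BGR2LAB)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab[:, :, 0] = clahe.apply(lab[:, :, 0])
    return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)
```

Frames in which any of the three body parts is missed, or in which the bee sits too close to the frame edge for a full 400×400 crop, are simply skipped in this sketch.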

3.2. Network Architecture, Training, Evaluation

We used a transfer-learning-based approach for re-/retro-id of the reed bees. All models were pre-trained on the ImageNet dataset [6] and subsequently fine-tuned on our own dataset. To identify suitable transfer-learning models, we selected 17 models spanning 10 different architectures, with parameter counts ranging from 49.7 million (swinv2s) to 0.73 million (squeezenet1_0). To evaluate the models, we collected a second set of Day 5 data, “set-2”, four hours after the first set, using Steps 1–5 (above). We trained all 17 models on the first set of Day 5 data and then evaluated them on their ability to re-id individuals in the Day 5 set-2 data. From these, we selected the seven models with the highest Accuracy (and F1) scores for further consideration. We then trained this top-7 on our original Day 1 and Day 5 data. For our main experiments, we evaluated Day 1 models forward in time on Day 2–5 data, and Day 5 models backward in time on Day 4–1 data. These forward and backward evaluations allowed comparison of markerless re- and retro-id of individual insects. The training process was similar for all of the models we considered: we used the Adam optimiser with a learning rate of 0.001 and a weight decay of 0.0001, training for a total of 100 epochs on the training dataset with cross-entropy loss.
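Below is a minimal sketch of the fine-tuning and forward/backward evaluation protocol, assuming torchvision and per-session ImageFolder directories with one sub-folder per bee identity; the directory layout, batch size, and image transforms are illustrative assumptions, and the regnet_y_3_2gf backbone is shown only because it is the model reported in Figure 2.

```python
# Minimal sketch of fine-tuning (Adam, lr 1e-3, weight decay 1e-4, 100 epochs,
# cross-entropy) and the forward (re-id) / backward (retro-id) evaluation.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"
tfm = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def loader(day):
    # Hypothetical per-session directory layout: data/day<N>/<bee_id>/<image>.png
    return DataLoader(datasets.ImageFolder(f"data/day{day}", transform=tfm),
                      batch_size=64, shuffle=True)

def fine_tune(train_day, num_ids=15, epochs=100):
    # ImageNet-pretrained backbone with a new 15-way identity head.
    model = models.regnet_y_3_2gf(weights="IMAGENET1K_V1")
    model.fc = nn.Linear(model.fc.in_features, num_ids)
    model.to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader(train_day):
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model

@torch.no_grad()
def accuracy(model, day):
    model.eval()
    correct = total = 0
    for x, y in loader(day):
        pred = model(x.to(device)).argmax(1).cpu()
        correct += (pred == y).sum().item()
        total += y.numel()
    return correct / total

# Re-id: train on Day 1, evaluate forward; retro-id: train on Day 5, evaluate backward.
reid_model, retro_model = fine_tune(1), fine_tune(5)
print([accuracy(reid_model, d) for d in (2, 3, 4, 5)])   # forward in time
print([accuracy(retro_model, d) for d in (4, 3, 2, 1)])  # backward in time
```

Other backbones are substituted by changing the model constructor and, where necessary, the attribute holding the final classification layer (e.g. `classifier` rather than `fc`).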

Figure 2. Re-/retro-identification accuracy of the regnet_y_3_2gf model, where re-identification is shown as forward identification from Day 1–5, and retro-identification is shown as backward identification from Day 5–1.


:::info Authors:

(1) Asaduz Zaman, Dept. of Data Science and Artificial Intelligence, Faculty of Information Technology, Monash University, Australia ([email protected]);

(2) Vanessa Kellermann, Dept. of Environment and Genetics, School of Agriculture, Biomedicine, and Environment, La Trobe University, Australia ([email protected]);

(3) Alan Dorin, Dept. of Data Science and Artificial Intelligence, Faculty of Information Technology, Monash University, Australia ([email protected]).

:::


:::info This paper is available on arxiv under CC BY 4.0 DEED license.

:::
