In January this year, Cefas installed a Remote Electronic Monitoring system on our research vessel, the Cefas Endeavour.
Why, you might ask... Do you really need to monitor a vessel that is already doing monitoring?
Remote Electronic Monitoring
Let's start with some background.
Remote Electronic Monitoring, or REM for short, is a data collection tool that can be installed on vessels to collect information on fishing activities and catches. The REM system consists of sensors and cameras collecting various data (video, GPS, etc.), which can then be interpreted to generate other data, such as when and where a vessel is fishing and what it is catching.
The Fisheries Act 2020 requires the UK to manage fisheries in a sustainable way. This requires data and evidence to support the development and implementation of UK and internationally agreed policies. REM has been identified as a tool that can help meet the increasing demand for data collection during fishing activities.
However, for the data generated by REM systems to be useful, it must be reviewed and interpreted by skilled analysts. This review process can be very time consuming and is often a limiting factor in how much data can be generated. Applying Artificial Intelligence (AI) and Machine Learning (ML) algorithms to this review could help unlock the full potential of REM.
AI and REM
REM data is a good candidate for interpretation using AI/ML algorithms, but fishing isn't the easiest scenario to apply them to. Although some aspects are very predictable (e.g. vessels tow and haul at slower speeds than they steam), the sorting and processing of fish can be messy and, to an untrained eye, chaotic. But it isn't impossible, and with the right training, AI/ML solutions could greatly improve the efficacy of REM as a data collection tool.
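To give a flavour of the "predictable" side of this, here is a minimal Python sketch of how GPS speed alone might be used to flag likely fishing activity. The speed thresholds, names and structure are purely illustrative assumptions for this post, not the logic used by any actual REM system.

```python
# Illustrative only: a toy rule for flagging likely fishing activity from
# GPS speed. The thresholds are assumptions, not values from a real REM system.
from dataclasses import dataclass


@dataclass
class GpsFix:
    timestamp: str      # time of the position fix (ISO 8601)
    speed_knots: float  # speed over ground reported by the GPS


def classify_activity(fix: GpsFix,
                      towing_max_knots: float = 4.0,
                      steaming_min_knots: float = 6.0) -> str:
    """Classify a single GPS fix as 'towing', 'steaming' or 'uncertain'.

    Trawlers typically tow and haul at slower speeds than they steam,
    so simple speed bands give a first, rough indication of activity.
    """
    if fix.speed_knots <= towing_max_knots:
        return "towing"
    if fix.speed_knots >= steaming_min_knots:
        return "steaming"
    return "uncertain"


# Example: a fix at 3.2 knots would be flagged as likely towing.
print(classify_activity(GpsFix("2024-02-12T10:15:00Z", 3.2)))
```

In practice this kind of rule is only a starting point; the hard part is the messy, visual side of the work, which is where the trained algorithms come in.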
A key limitation in developing AI and Machine Learning solutions is training data, with both quality and volume being important factors.
For a number of years, Cefas has been undertaking work to help develop these solutions, first in a Horizon 2020 project called SMARTFISH and more recently in a Horizon Europe project called EVERYFISH. In these projects, we have been collecting data from the types of commercial fishing vessels that we have already fitted with REM and that we would like to apply the developed solutions to.
It's important to get "real" data, that is, data from the situations in which you want your solutions to work, but this is often data that is less than ideal. Fish can be overlapping and obscured, cameras may not always be clean (fishing can be messy), and without changing what the vessels are doing, different species are all mixed together.
Idea to fruition
But not all training data needs to come from the vessels you intend to use the trained algorithm on, and that got us thinking about the Endeavour. The Endeavour fishes all year round and encounters all the species we want to train our algorithms on; the fish onboard are always separated out by species, and the "environment" is much more controlled than on a commercial fishing vessel. Gathering images from our research vessel just made sense.
As this idea was forming, I was approached by a colleague who was looking for projects to collaborate on with The Alan Turing Institute within its Environment and Sustainability Grand Challenge area. As part of a partnership with The Alan Turing Institute, installing the REM system on the Endeavour became one element of the proposed Catch Monitoring work package, which aimed to improve catch monitoring with REM by developing Cefas's AI capabilities.
And so, with the idea formed and funding available, I started having some conversations. There were some questions, but these were easily answered, and it became clear that there was no reason not to install the system. So, in January 2024, while the vessel was in port for its maintenance period, a small team fitted the REM system on the Endeavour. Cameras were positioned over the most frequently used sampling stations, where baskets of fish are weighed on the scales. This would allow us to generate images of individual fish, piles of fish and fish in baskets, all of which would really help with other ongoing work.
We had some initial teething problems... lesson learnt: always put a "do not switch off" sign on anything that shouldn't be turned off. But overall, the system worked well. The first fisheries survey went out in February and came back at the end of the month with our first lot of data.
The vessel sailed from Lowestoft on the 12th of February and landed in Swansea 17 days later on the 29th of February. Although we had some issues receiving GPS data for the first few days (including the departure from Lowestoft), the cameras still recorded, and we "fully" captured (video and GPS) 80% of the trip, which, for a first trip, was really good.
Over the 17-day survey, scientists onboard weighed ~18 tonnes of fish and measured around 53,000 individuals, with much of this activity captured on one of the 5 cameras installed.
Upon initial review of the captured footage, we have saved 232 still images. These images contain everything from a single fish, to a couple, to a lot!
We have started the process of annotating these images so they can be used to train our algorithms. This includes drawing around the fish to create "polygons" or "masks", taking the image from the one on the left to the one on the right.
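For anyone curious what that annotation turns into behind the scenes, here is a small Python sketch of how a hand-drawn polygon can be rasterised into a binary mask, the kind of label used to train segmentation models. The libraries, image size and coordinates are assumptions chosen for the example; this is not the annotation tooling we actually use.

```python
# Illustrative sketch: converting a hand-drawn polygon annotation into a
# binary mask for segmentation training. Coordinates below are invented.
import numpy as np
from PIL import Image, ImageDraw


def polygon_to_mask(polygon, width, height):
    """Rasterise a polygon (list of (x, y) points) into a 0/1 mask array."""
    mask_img = Image.new("L", (width, height), 0)            # blank single-channel image
    ImageDraw.Draw(mask_img).polygon(polygon, outline=1, fill=1)
    return np.array(mask_img, dtype=np.uint8)


# A made-up outline of one fish in a 640x480 frame.
fish_outline = [(120, 200), (180, 190), (260, 210), (250, 250), (150, 240)]
mask = polygon_to_mask(fish_outline, width=640, height=480)
print(mask.sum(), "pixels labelled as fish")
```

Each annotated fish produces a mask like this, and together they give the algorithm examples of what "a fish" looks like in our images.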