Show simple item record

dc.contributor.author       Peng Zhang
dc.contributor.author       Zongyi Yang
dc.contributor.author       Hong Yu
dc.contributor.author       Wan Tu
dc.contributor.author       Chencheng Gao
dc.contributor.author       Yue Wang
dc.contributor.other        College of Information Engineering, Dalian Ocean University, Dalian, China
dc.contributor.other        Dalian Key Laboratory of Smart Fisheries, Dalian Ocean University, Dalian, China
dc.contributor.other        Key Laboratory of Facility Fisheries, Ministry of Education (Dalian Ocean University), Dalian, China
dc.contributor.other        Liaoning Provincial Key Laboratory of Marine Information Technology, Dalian, China
dc.date.accessioned         2024-11-11T04:36:48Z
dc.date.available           2025-10-02T04:33:04Z
dc.date.issued              01-11-2024
dc.identifier.issn          -
dc.identifier.uri           https://www.frontiersin.org/articles/10.3389/fmars.2024.1471312/full
dc.description.abstract     Fish segmentation in underwater videos can be used to accurately determine the silhouette size of fish objects, which provides key information for fish population monitoring and fishery resource surveys. Some researchers have used underwater optical flow to improve the fish segmentation accuracy of underwater videos. However, existing works do not evaluate and screen the underwater optical flow, and its predictions are easily disturbed by non-fish motion. Therefore, in this paper, by analyzing underwater optical flow data, we propose a robust underwater segmentation network, RUSNet, with adaptive screening and fusion of input information. First, to enhance the robustness of the segmentation model to low-quality optical flow inputs, a global optical flow quality evaluation module is proposed for evaluating and aligning the underwater optical flow. Second, a decoder is designed that roughly localizes the fish object and then applies the proposed multidimension attention (MDA) module to iteratively refine the rough localization map along the spatial and edge dimensions of the fish. Finally, a multioutput selective fusion method is proposed for the testing stage, in which the mean absolute error (MAE) of the prediction obtained from a single input is compared with that obtained from the multisource input; the information with the highest confidence is then selected for predictive fusion, which yields the final underwater fish segmentation result. To verify the effectiveness of the proposed model, we trained and evaluated it on a publicly available joint underwater video dataset and the separate public DeepFish dataset. Compared with advanced underwater fish segmentation models, the proposed model is more robust to low-quality background optical flow on the DeepFish dataset, with mean pixel accuracy (mPA) and mean intersection over union (mIoU) values reaching 98.77% and 97.65%, respectively. On the joint dataset, the mPA and mIoU of the proposed model are 92.61% and 90.12%, respectively, which are 0.72% and 1.21% higher than those of the advanced underwater video object segmentation model MSGNet. The results indicate that the proposed model can adaptively select its input and accurately segment fish in complex underwater scenes, providing an effective solution for surveying fishery resources.
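The multioutput selective fusion step described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name `selective_fusion`, the two single-input branches (`rgb`, `flow`), and the choice to fuse by averaging are all assumptions; the abstract specifies only that MAE between single-input and multisource predictions guides which information is selected for fusion.

```python
import numpy as np

def selective_fusion(pred_rgb, pred_flow, pred_joint):
    """Pick whichever single-input prediction agrees best (lowest MAE)
    with the multisource prediction, then fuse the two by averaging.

    All inputs are H x W arrays of per-pixel foreground probabilities.
    Returns the name of the selected branch and the fused prediction.
    """
    candidates = {"rgb": pred_rgb, "flow": pred_flow}
    # MAE against the joint prediction acts as the confidence proxy:
    # a branch that deviates strongly (e.g. flow corrupted by non-fish
    # motion) is screened out of the final fusion.
    maes = {k: float(np.mean(np.abs(p - pred_joint)))
            for k, p in candidates.items()}
    best = min(maes, key=maes.get)
    fused = (candidates[best] + pred_joint) / 2.0
    return best, fused
```

Under this sketch, a low-quality optical flow prediction produces a large MAE against the multisource output and is simply excluded, which matches the robustness behavior the abstract reports on DeepFish.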
dc.format                   -
dc.language.iso             EN
dc.publisher                Frontiers Media S.A.
dc.rights                   CC BY; CC BY-NC-ND; CC BY-NC
dc.subject.lcc              Science
dc.title                    RUSNet: Robust fish segmentation in underwater videos based on adaptive selection of optical flow
dc.type                     Article
dc.description.keywords     underwater video processing
dc.description.keywords     motion evaluation
dc.description.keywords     adaptive output selection
dc.description.keywords     robust segmentation
dc.description.keywords     deep learning
dc.description.pages        -
dc.description.doi          10.3389/fmars.2024.1471312
dc.title.journal            Frontiers in Marine Science
dc.identifier.e-issn        2296-7745
dc.identifier.oai           oai:doaj.org/journal:a490fb3f46ea4b27a7c2ad21e5274c85
dc.journal.info             -

