Accelerated muscle mass estimation from CT images through transfer learning
| Published in: | BMC medical imaging, Vol. 24, No. 1, Article 271 (18 pages) |
|---|---|
| Main Authors: | Yoon, Seunghan; Kim, Tae Hyung; Jung, Young Kul; Kim, Younghoon |
| Format: | Journal Article |
| Language: | English |
| Published: | London: BioMed Central (BioMed Central Ltd; Springer Nature B.V.; BMC), 09.10.2024 |
| Subjects: | Medical image segmentation; CT image segmentation; Deep learning; Convolutional neural network |
| ISSN: | 1471-2342 |
| Online Access: | https://doi.org/10.1186/s12880-024-01449-4 |
| Abstract | Background: The cost of labeling to collect training data sets for deep learning is especially high in medical applications compared with other fields. Furthermore, because images vary across computed tomography (CT) devices, a deep-learning-based segmentation model trained on one device often does not work on images from a different device. Methods: In this study, we propose an efficient learning strategy for deep learning models in medical image segmentation. We aim to overcome the difficulties of segmentation in CT images by training a VNet segmentation model that enables rapid labeling of organs in CT images, starting from a model obtained by transfer learning on a small number of manually labeled images, called SEED images. We established a process for generating SEED images and conducting transfer learning of a model, and we evaluate the performance of several segmentation models: vanilla UNet, UNETR, Swin-UNETR, and VNet. Furthermore, assuming a scenario in which a model is repeatedly trained with CT images collected from multiple devices, where catastrophic forgetting often occurs, we examine whether the performance of our model degrades. Results: We show that transfer learning can produce a model that segments muscles well from a small number of images. In addition, VNet outperformed existing semi-automated segmentation tools and other deep learning networks on muscle and liver segmentation tasks, and it was the model most robust to catastrophic forgetting. Conclusion: In the 2D CT image segmentation task, the CNN-based network outperforms the existing semi-automatic segmentation tool and recent transformer-based networks. |
|---|---|
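To make the transfer-learning strategy summarized in the Methods more concrete, the sketch below shows how a segmentation model pretrained on one CT device might be fine-tuned on a handful of manually labeled SEED slices from a new device. This is a minimal illustration, not the authors' code: the model class, the presence of an `encoder` attribute, and the data tensors are assumptions, and only generic PyTorch APIs are used.

```python
# Minimal sketch of the transfer-learning recipe described in the abstract:
# start from a segmentation model pretrained on one CT device, then fine-tune
# it on a few manually labeled "SEED" slices from another device.
# The model class and the SEED tensors are hypothetical.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss for binary masks; pred is expected to be in [0, 1]."""
    pred = pred.flatten(1)
    target = target.flatten(1)
    inter = (pred * target).sum(dim=1)
    union = pred.sum(dim=1) + target.sum(dim=1)
    return 1.0 - ((2.0 * inter + eps) / (union + eps)).mean()

def fine_tune_on_seed(model: nn.Module,
                      seed_images: torch.Tensor,   # (N, 1, H, W) CT slices
                      seed_masks: torch.Tensor,    # (N, 1, H, W) binary labels
                      freeze_encoder: bool = True,
                      epochs: int = 50,
                      lr: float = 1e-4) -> nn.Module:
    # Optionally freeze the early (encoder) layers so the small SEED set
    # only adapts the decoder/head, which reduces overfitting.
    if freeze_encoder and hasattr(model, "encoder"):
        for p in model.encoder.parameters():
            p.requires_grad = False

    loader = DataLoader(TensorDataset(seed_images, seed_masks),
                        batch_size=2, shuffle=True)
    opt = torch.optim.Adam(filter(lambda p: p.requires_grad,
                                  model.parameters()), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            logits = model(x)               # raw network output
            loss = dice_loss(torch.sigmoid(logits), y)
            loss.backward()
            opt.step()
    return model
```

In practice the pretrained weights would first be loaded with `model.load_state_dict(torch.load(...))`; the batch size, learning rate, and epoch count above are placeholders rather than values reported by the paper.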
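The Results also describe a sequential-training scenario across devices in which catastrophic forgetting can occur. A minimal way to check for it, again an illustrative sketch with hypothetical data loaders rather than the paper's protocol, is to measure the Dice score on a held-out set from the original device before and after fine-tuning on a new device:

```python
# Sketch of a catastrophic-forgetting check: fine-tune on a new device's
# SEED data, then re-measure Dice on held-out data from the original device.
# The loaders/tensors named below are hypothetical.

import torch

@torch.no_grad()
def dice_score(model, loader, threshold=0.5, eps=1e-6):
    """Mean Dice over a held-out loader of (image, mask) batches."""
    model.eval()
    scores = []
    for x, y in loader:
        pred = (torch.sigmoid(model(x)) > threshold).float()
        inter = (pred * y).sum(dim=(1, 2, 3))
        union = pred.sum(dim=(1, 2, 3)) + y.sum(dim=(1, 2, 3))
        scores.append(((2 * inter + eps) / (union + eps)).mean().item())
    return sum(scores) / len(scores)

# Usage sketch, reusing fine_tune_on_seed from the previous block:
# before = dice_score(model, device_a_test_loader)
# model = fine_tune_on_seed(model, device_b_seed_images, device_b_seed_masks)
# after = dice_score(model, device_a_test_loader)
# drop = before - after   # a large positive drop indicates forgetting
```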
| ArticleNumber | 271 |
| Audience | Academic |
| Author | Jung, Young Kul; Yoon, Seunghan; Kim, Younghoon; Kim, Tae Hyung |
| Author Details | 1. Yoon, Seunghan (Department of Computer Science & Engineering (Major in Bio Artificial Intelligence), Hanyang University at Ansan); 2. Kim, Tae Hyung (Division of Gastroenterology and Hepatology, Hallym University Sacred Heart Hospital); 3. Jung, Young Kul (Division of Gastroenterology and Hepatology, Department of Internal Medicine, Korea University Ansan Hospital; free93cool@gmail.com); 4. Kim, Younghoon (Department of Computer Science & Engineering (Major in Bio Artificial Intelligence), Hanyang University at Ansan; nongaussian@hanyang.ac.kr) |
| Copyright | The Author(s) 2024. COPYRIGHT 2024 BioMed Central Ltd. This work is licensed under http://creativecommons.org/licenses/by-nc-nd/4.0/ (the "License"). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
| DOI | 10.1186/s12880-024-01449-4 |
| Discipline | Medicine |
| EISSN | 1471-2342 |
| Genre | Journal Article |
| GeographicLocations | South Korea |
| GrantInformation | Artificial Intelligence Convergence Innovation Human Resources Development (Hanyang University ERICA), grant RS-2022-00155885; Ministry of Trade, Industry and Energy (MOTIE) and Korea Institute for Advancement of Technology (KIAT), grant P0025661 |
| ISSN | 1471-2342 |
| Issue | 1 |
| Keywords | Deep learning; Convolutional neural network; Medical image segmentation; CT image segmentation |
| Language | English |
| License | 2024. The Author(s). Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/. |
| OpenAccessLink | https://doaj.org/article/f2eff519ed7242228fc73ff9d254bfb9 |
| PMID | 39385108 |
| PageCount | 18 |
| PublicationDate | 2024-10-09 |
| PublicationPlace | London |
| PublicationTitle | BMC medical imaging |
| PublicationTitleAbbrev | BMC Med Imaging |
| PublicationYear | 2024 |
| Publisher | BioMed Central; BioMed Central Ltd; Springer Nature B.V.; BMC |
| StartPage | 271 |
| SubjectTerms | Abdomen; Analysis; Artificial intelligence; Comparative analysis; Computed tomography; Convolutional neural network; CT image segmentation; CT imaging; Datasets; Deep Learning; Diagnostic imaging; Humans; Image processing; Image Processing, Computer-Assisted - methods; Image segmentation; Imaging; Knowledge transfer; Labeling; Liver; Machine learning; Medical image segmentation; Medical imaging; Medical imaging equipment; Medicine; Medicine & Public Health; Methods; Muscle, Skeletal - diagnostic imaging; Muscles; Neural networks; Performance degradation; Performance evaluation; Radiology; Technology application; Tomography, X-Ray Computed - methods; Transfer learning |
| URI | https://link.springer.com/article/10.1186/s12880-024-01449-4 https://www.ncbi.nlm.nih.gov/pubmed/39385108 https://www.proquest.com/docview/3115122807 https://www.proquest.com/docview/3115094986 https://pubmed.ncbi.nlm.nih.gov/PMC11465928 https://doaj.org/article/f2eff519ed7242228fc73ff9d254bfb9 |
| Volume | 24 |