Representation of ophthalmology concepts by electronic systems: intercoder agreement among physicians using controlled terminologies
| Published in: | Ophthalmology (Rochester, Minn.) Vol. 113; No. 4; p. 511 |
|---|---|
| Authors: | Hwang, John C; Yu, Alexander C; Casper, Daniel S; Starren, Justin; Cimino, James J; Chiang, Michael F |
| Format: | Journal Article |
| Language: | English |
| Published: | United States, 01.04.2006 |
| ISSN: | 1549-4713 |
| Abstract | Objective: To assess intercoder agreement for ophthalmology concepts by 3 physician coders using 5 controlled terminologies (International Classification of Diseases 9, Clinical Modification [ICD9CM]; Current Procedural Terminology, fourth edition; Logical Observation Identifiers, Names, and Codes [LOINC]; Systematized Nomenclature of Medicine, Clinical Terms [SNOMED-CT]; and Medical Entities Dictionary).
Design: Noncomparative case series.
Participants: Five complete ophthalmology case presentations selected from a publicly available journal.
Methods: Each case was parsed into discrete concepts. Electronic or paper browsers were used independently by 3 physician coders to assign a code for every concept in each terminology. A match score representing adequacy of assignment for each concept was assigned on a 3-point scale (0, no match; 1, partial match; 2, complete match). For every concept, the level of intercoder agreement was determined by 2 methods: (1) exact code matching, with complete agreement when all coders assigned the same code, partial agreement when 2 coders assigned the same code, and no agreement when all coders assigned different codes; and (2) manual review of all assigned codes for semantic equivalence by an independent ophthalmologist, classifying intercoder agreement for each concept as complete, partial, or none. Subsequently, intercoder agreement was calculated in the same manner for the subset of concepts judged to have adequate coverage by each terminology, based on receiving a match score of 2 from at least 2 of the 3 coders.
Main Outcome Measures: Intercoder agreement in each controlled terminology: complete, partial, or none.
Results: Cases were parsed into 242 unique concepts. When all concepts were analyzed by manual review, the proportion of complete intercoder agreement ranged from 12% (LOINC) to 44% (SNOMED-CT), and the difference in intercoder agreement between LOINC and all other terminologies was statistically significant (P<0.004). When only concepts with adequate terminology coverage were analyzed by manual review, the proportion of complete intercoder agreement ranged from 33% (LOINC) to 64% (ICD9CM), and there were no statistically significant differences in intercoder agreement among any pairs of terminologies.
Conclusions: The level of intercoder agreement for ophthalmic concepts in existing controlled medical terminologies is imperfect. Intercoder reproducibility is essential for accurate and consistent electronic representation of medical data. |
|---|---|
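The two classification rules described in the Methods (exact-code-matching agreement and the adequate-coverage criterion) are simple enough to sketch in code. The following is a minimal illustration under our own naming, not the authors' actual analysis code:

```python
from collections import Counter

def intercoder_agreement(codes):
    """Classify agreement among the 3 coders' assigned codes for one concept.

    'complete' if all 3 coders assigned the same code, 'partial' if exactly
    2 did, 'none' if all 3 codes differ. (Illustrative sketch; function and
    variable names are ours, not from the study.)
    """
    # Size of the largest group of identical codes.
    top_count = Counter(codes).most_common(1)[0][1]
    if top_count == 3:
        return "complete"
    if top_count == 2:
        return "partial"
    return "none"

def has_adequate_coverage(match_scores):
    """A concept is judged adequately covered by a terminology when at least
    2 of the 3 coders gave it the top match score of 2 (complete match)."""
    return sum(score == 2 for score in match_scores) >= 2
```

The study then recomputes `intercoder_agreement` over only the concepts for which `has_adequate_coverage` holds, which is why the agreement proportions rise in the second analysis.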
| Author | Hwang, John C; Yu, Alexander C; Casper, Daniel S; Starren, Justin; Cimino, James J; Chiang, Michael F |
| Author affiliation | Department of Ophthalmology, Columbia University College of Physicians and Surgeons, New York, New York 10032, USA |
| DOI | 10.1016/j.ophtha.2006.01.017 |
| Discipline | Medicine |
| EISSN | 1549-4713 |
| Genre | Comparative Study Research Support, Non-U.S. Gov't Journal Article Research Support, N.I.H., Extramural |
| GeographicLocations | United States |
| GrantInformation | NLM NIH HHS, grant LM07079; NEI NIH HHS, grant EY13972 |
| ISICitedReferencesCount | 20 |
| ISSN | 1549-4713 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 4 |
| Language | English |
| PMID | 16488013 |
| PublicationDate | 2006-04-01 |
| PublicationPlace | United States |
| PublicationTitle | Ophthalmology (Rochester, Minn.) |
| PublicationTitleAlternate | Ophthalmology |
| PublicationYear | 2006 |
| StartPage | 511 |
| SubjectTerms | Decision Support Systems, Clinical Humans Medical Records Systems, Computerized - standards Medical Records, Problem-Oriented Observer Variation Ophthalmology - standards Quality Assurance, Health Care Reproducibility of Results Terminology as Topic United States Vocabulary, Controlled |
| Title | Representation of ophthalmology concepts by electronic systems: intercoder agreement among physicians using controlled terminologies |
| URI | https://www.ncbi.nlm.nih.gov/pubmed/16488013 https://www.proquest.com/docview/67827369 |
| Volume | 113 |