I just added another publication to my web site. This 1994 paper, published in the American Journal of Clinical Pathology as a government work, provides some early data on the accuracy of medical autocoders.
The abstract:
Many pathology departments rely on the accuracy of computer-generated diagnostic coding for surgical specimens. At present, there are no published guidelines for assuring the quality of coding devices. To assess the performance of SNOMED coding software, manual coding was compared with automated coding in 9,353 consecutive surgical pathology reports at the Baltimore VA Medical Center. Manual SNOMED coding produced 13,454 diagnostic entries comprising 519 distinct diagnostic entities; 209 were unique diagnoses (assigned to only one of the 9,353 reports). Automated coding obtained 23,744 diagnostic entries comprising 498 distinct diagnostic entities, of which 129 were unique diagnoses. There were only 44 instances (0.5%) where automated coding missed key diagnoses on surgical case reports. In summary, automated coding compared favorably with manual coding. To achieve the maximum performance from software coding applications, departments should monitor the output from automatic coders. Modifications in reporting style, code dictionaries, and coding algorithms can lead to improved coding performance.
The last line of the abstract, "Modifications in reporting style, code dictionaries, and coding algorithms can lead to improved coding performance," has been my mantra for the past 14 years.
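To make the idea concrete, here is a minimal sketch of what a dictionary-based autocoder does: it scans the text of a pathology report for terms found in a code dictionary and emits the matching codes. The terms and codes below are illustrative placeholders I chose for the example, not an actual SNOMED dictionary, and the matching strategy (longest term first, whole-word matches) is just one plausible design.

```python
import re

# Illustrative term-to-code dictionary; the codes are placeholders,
# not an actual SNOMED nomenclature.
code_dictionary = {
    "adenocarcinoma": "M-81403",
    "squamous cell carcinoma": "M-80703",
    "chronic inflammation": "M-43000",
}

def autocode(report_text):
    """Return (term, code) pairs for dictionary terms found in the report."""
    text = report_text.lower()
    found = []
    # Try longer terms first so multiword phrases are matched before
    # any shorter terms they might contain.
    for term in sorted(code_dictionary, key=len, reverse=True):
        if re.search(r"\b" + re.escape(term) + r"\b", text):
            found.append((term, code_dictionary[term]))
    return found

print(autocode("Colon biopsy: adenocarcinoma with chronic inflammation."))
```

In this toy form it is easy to see why the abstract's last line matters: expanding the dictionary, standardizing reporting style, and refining the matching algorithm each directly change what the coder finds.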
- Jules Berman
key words: biomedical informatics, medical informatics, medical record retrieval, medical record indexing, biomedical autocoding, biomedical autocoder, surgical pathology reports