Computerized versus hand-scored health literacy tools: A comparison of Simple Measure of Gobbledygook (SMOG) and Flesch-Kincaid in printed patient education materials

Kelsey Grabeel, Jennifer Russomanno, Sandra Oelschlegel, Emily Tester, Robert Heidel

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Objective: The research compared and contrasted hand-scoring and computerized methods of evaluating the grade level of patient education materials that are distributed at an academic medical center in east Tennessee and sought to determine if these materials adhered to the American Medical Association’s (AMA’s) recommended reading level of sixth grade. Methods: Librarians at an academic medical center located in the heart of Appalachian Tennessee initiated the assessment of 150 of the most used printed patient education materials. Based on the Flesch-Kincaid (F-K) scoring rubric, 2 of the 150 documents were excluded from statistical comparisons due to the absence of text (images only). Researchers assessed the remaining 148 documents using the hand-scored Simple Measure of Gobbledygook (SMOG) method and the computerized F-K grade level method. For SMOG, 3 independent reviewers hand-scored each of the 150 documents. For F-K, documents were analyzed using Microsoft Word. Reading grade level scores were entered into a database for statistical analysis. Inter-rater reliability was calculated using intra-class correlation coefficients (ICC). Paired t-tests were used to compare readability means. Results: Acceptable inter-rater reliability was found for SMOG (ICC=0.95). For the 148 documents assessed, SMOG produced a significantly higher mean reading grade level (M=9.6, SD=1.3) than F-K (M=6.5, SD=1.3; p<0.001). Additionally, when using the SMOG method of assessment, 147 of the 148 documents (99.3%) scored above the AMA’s recommended reading level of sixth grade. Conclusions: Computerized health literacy assessment tools, used by many national patient education material providers, might not be representative of the actual reading grade levels of patient education materials. This is problematic in regions like Appalachia because materials may not be comprehensible to the area’s low-literacy patients. Medical librarians have the potential to advance their role in patient education to better serve their patient populations.
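
For readers who want the mechanics behind the two scores, the following is a minimal Python sketch (not part of the article) of the published formulas the study compares: Flesch-Kincaid grade level = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59, and SMOG grade = 3.1291 + 1.0430*sqrt(polysyllabic words * 30/sentences). The helper functions, the naive vowel-group syllable counter, and the sample text below are illustrative assumptions only; the study itself used three human raters for SMOG and Microsoft Word's built-in readability statistics for F-K, and differences in how syllables and sentences are counted are exactly where hand scoring and software can diverge.

import math
import re

def count_syllables(word: str) -> int:
    """Rough vowel-group heuristic (assumption); real tools use dictionaries or better rules."""
    word = word.lower()
    groups = re.findall(r"[aeiouy]+", word)
    n = len(groups)
    if word.endswith("e") and n > 1:  # drop a likely silent final "e"
        n -= 1
    return max(n, 1)

def tokenize(text: str):
    """Split text into sentences and words with simple regexes (assumption)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    return sentences, words

def flesch_kincaid_grade(text: str) -> float:
    """F-K grade level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences, words = tokenize(text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

def smog_grade(text: str) -> float:
    """SMOG grade: 3.1291 + 1.0430*sqrt(words of 3+ syllables * 30/sentences).
    Hand scoring normally samples 30 sentences (10 each from the beginning,
    middle, and end of a document) and counts the polysyllabic words in them."""
    sentences, words = tokenize(text)
    polysyllabic = sum(1 for w in words if count_syllables(w) >= 3)
    return 3.1291 + 1.0430 * math.sqrt(polysyllabic * 30 / len(sentences))

# Hypothetical patient-instruction snippet used only to exercise the functions.
sample = ("Take one tablet by mouth every morning. "
          "Contact your physician immediately if you experience dizziness, "
          "shortness of breath, or an irregular heartbeat.")
print("F-K grade:", round(flesch_kincaid_grade(sample), 1))
print("SMOG grade:", round(smog_grade(sample), 1))

Given per-document grade levels from each method, the paired comparison reported in the results could be reproduced with, for example, scipy.stats.ttest_rel; that step is not shown here.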

Original language: English (US)
Pages (from-to): 38-45
Number of pages: 8
Journal: Journal of the Medical Library Association
Volume: 106
Issue number: 1
DOI: 10.5195/jmla.2018.262
State: Published - Jan 1 2018

All Science Journal Classification (ASJC) codes

  • Health Informatics
  • Library and Information Sciences

Cite this

@article{c10f5f6e396646d9b67d2d13e7448bab,
title = "Computerized versus hand-scored health literacy tools: A comparison of Simple Measure of Gobbledygook (SMOG) and Flesch-Kincaid in printed patient education materials",
author = "Kelsey Grabeel and Jennifer Russomanno and Sandra Oelschlegel and Emily Tester and Robert Heidel",
year = "2018",
month = "1",
day = "1",
doi = "10.5195/jmla.2018.262",
language = "English (US)",
volume = "106",
pages = "38--45",
journal = "Journal of the Medical Library Association : JMLA",
issn = "1536-5050",
publisher = "Medical Library Association",
number = "1",

}

TY - JOUR

T1 - Computerized versus hand-scored health literacy tools

T2 - A comparison of Simple Measure of Gobbledygook (SMOG) and Flesch-Kincaid in printed patient education materials

AU - Grabeel, Kelsey

AU - Russomanno, Jennifer

AU - Oelschlegel, Sandra

AU - Tester, Emily

AU - Heidel, Robert

PY - 2018/1/1

Y1 - 2018/1/1

UR - http://www.scopus.com/inward/record.url?scp=85040911392&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85040911392&partnerID=8YFLogxK

U2 - 10.5195/jmla.2018.262

DO - 10.5195/jmla.2018.262

M3 - Article

C2 - 29339932

AN - SCOPUS:85040911392

VL - 106

SP - 38

EP - 45

JO - Journal of the Medical Library Association : JMLA

JF - Journal of the Medical Library Association : JMLA

SN - 1536-5050

IS - 1

ER -