
Assessing the Quality of Mobile Health-Related Apps: Interrater Reliability Study of Two Guides

  • Identifying data

    Identifier: imarina:9216855
    Authors:
    Miro, Jordi; Llorens-Vernet, Pere
  • Other data:

    Authors as listed in the article: Miro, Jordi; Llorens-Vernet, Pere
    Department: Psicologia
    URV author(s): Llorens Vernet, Pere / Miró Martínez, Jordi
    Keywords: Telemedicine; Reproducibility of results; Reproducibility; Rating-scale; Rating; Privacy; Phone apps; Mobile health; Mobile apps; Mobile applications; Mobile application; Mhealth; Mars; Mag; Interrater reliability; Humans; Human; Health care delivery; Evaluation studies, rating; Evaluation studies; Delivery of health care
    Abstract: Background: There is a huge number of health-related apps available, and the number is growing fast. However, many of them have been developed without any kind of quality control. In an attempt to contribute to the development of high-quality apps and enable existing apps to be assessed, several guides have been developed. Objective: The main aim of this study was to study the interrater reliability of a new guide, the Mobile App Development and Assessment Guide (MAG), and compare it with one of the most widely used guides in the field, the Mobile App Rating Scale (MARS). Moreover, we also examined whether the interrater reliability of the measures is consistent across multiple types of apps and stakeholders. Methods: To study the interrater reliability of the MAG and MARS, we evaluated the 4 most downloaded health apps for chronic health conditions in the medical category of iOS and Android devices (ie, App Store and Google Play). A group of 8 reviewers, representative of the individuals who would be most knowledgeable about and interested in the use and development of health-related apps, and including different types of stakeholders such as clinical researchers, engineers, health care professionals, and end users as potential patients, independently evaluated the quality of the apps using the MAG and MARS. To study the interrater reliability, we calculated the Krippendorff alpha for every category in the 2 guides, for each type of reviewer and every app, separately and combined. Results: Only a few categories of the MAG and MARS demonstrated high interrater reliability. Although the MAG was found to be superior, there was considerable variation in the scores between the different types of reviewers. The categories with the highest interrater reliability in the MAG were "Security" (alpha=0.78) and "Privacy" (alpha=0.73). In addition, 2 other categories, "Usability" and "Safety," were very close to compliance (health care professionals: alpha=0.62 and 0.61, respectively). The total interrater reliability of the MAG (ie, for all categories) was 0.45, whereas the total interrater reliability of the MARS was 0.29. Conclusions: This study shows that some categories of the MAG have significant interrater reliability. Importantly, the data show that the MAG scores are better than those provided by the MARS, which is the most commonly used guide in the area. However, there is great variability in the responses, which seems to be associated with subjective interpretation by the reviewers.
    Subject areas: Collective health; Medical informatics; Health informatics; Health care sciences & services; Computer science
    License access: https://creativecommons.org/licenses/by/3.0/es/
    Author email: pere.llorens@urv.cat jordi.miro@urv.cat
    Author identifier: 0000-0002-3073-7885 0000-0002-1998-6653
    Record creation date: 2024-10-12
    Journal volume: 9
    Version of the deposited article: info:eu-repo/semantics/publishedVersion
    Link to the original source: https://mhealth.jmir.org/2021/4/e26471
    License document URL: https://repositori.urv.cat/ca/proteccio-de-dades/
    Article reference according to the original source: JMIR mHealth and uHealth. 9 (4): e26471
    Item reference in APA style: Miro, Jordi; Llorens-Vernet, Pere (2021). Assessing the Quality of Mobile Health-Related Apps: Interrater Reliability Study of Two Guides. JMIR mHealth and uHealth, 9(4), e26471. DOI: 10.2196/26471
    Article DOI: 10.2196/26471
    Institution: Universitat Rovira i Virgili
    Journal publication year: 2021
    Publication type: Journal Publications
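The abstract above reports per-category Krippendorff alpha values computed from the ratings of 8 reviewers across 4 apps. As a rough illustration only (the authors do not publish their code, and the function name and data layout here are assumptions), the statistic for one category could be computed from a list of per-app reviewer scores like this, using the interval-level disagreement metric:

```python
from itertools import permutations

def krippendorff_alpha(units, delta=lambda a, b: (a - b) ** 2):
    """Krippendorff's alpha for a list of units (e.g. apps), each given as a
    list of the ratings that unit received; missing ratings are simply omitted.
    `delta` is the disagreement metric (default: interval, squared difference)."""
    # Only units with at least two ratings contribute pairable values.
    pairable = [u for u in units if len(u) >= 2]
    n = sum(len(u) for u in pairable)  # total number of pairable values
    if n <= 1:
        raise ValueError("need at least two pairable ratings")

    # Observed disagreement: ordered pairs within each unit, weighted 1/(m_u - 1).
    d_o = sum(
        sum(delta(a, b) for a, b in permutations(u, 2)) / (len(u) - 1)
        for u in pairable
    ) / n

    # Expected disagreement: ordered pairs over all pooled ratings.
    pooled = [v for u in pairable for v in u]
    d_e = sum(delta(a, b) for a, b in permutations(pooled, 2)) / (n * (n - 1))

    return (1.0 - d_o / d_e) if d_e else 1.0

# Hypothetical example: four reviewers rating one category of three apps.
ratings = [[4, 4, 5, 4], [2, 2, 2, 3], [5, 5, 4, 5]]
print(krippendorff_alpha(ratings))
```

Perfect agreement yields alpha = 1, chance-level agreement yields 0, and systematic disagreement can produce negative values, which is why per-category values such as 0.78 ("Security") versus a total of 0.45 are directly comparable.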
  • Keywords:

    Health Care Sciences & Services, Health Informatics, Medical Informatics
    Telemedicine
    Reproducibility of results
    Reproducibility
    Rating-scale
    Rating
    Privacy
    Phone apps
    Mobile health
    Mobile apps
    Mobile applications
    Mobile application
    Mhealth
    Mars
    Mag
    Interrater reliability
    Humans
    Human
    Health care delivery
    Evaluation studies, rating
    Evaluation studies
    Delivery of health care
    Collective health
    Medical informatics
    Health informatics
    Health care sciences & services
    Computer science
  • Documents:
