The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation

Identification of psychometrically strong instruments for the field of implementation science is a high priority underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project's objectives address these limitations by identifying and applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopt a systematic search process (using standard search strings), and engage an international team of experts to assess the full range of psychometric criteria (reliability, construct and criterion validity). Although this work focuses on implementation of psychosocial interventions in mental health and health-care settings, the methodology and results will likely be useful across a broad spectrum of settings. This effort has culminated in a centralized online open-access repository of instruments depicting graphical head-to-head comparisons of their psychometric properties. This article describes the methodology and preliminary outcomes. The seven stages of the review, synthesis, and evaluation methodology include (1) setting the scope for the review, (2) identifying frameworks to organize and complete the review, (3) generating a search protocol for the literature review of constructs, (4) literature review of specific instruments, (5) development of evidence-based assessment rating criteria, (6) data extraction and rating of instrument quality by a task force of implementation experts to inform knowledge synthesis, and (7) the creation of a website repository. To date, this multi-faceted and collaborative search and synthesis methodology has identified over 420 instruments related to 34 constructs (48 in total, including subconstructs) that are relevant to implementation science. Despite numerous constructs having more than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold-standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.

Journal Title: Implementation Science : IS, 08 January 2015, Vol. 10, p. 2
Main Author: Lewis, Cara C
Other Authors: Stanick, Cameo F , Martinez, Ruben G , Weiner, Bryan J , Kim, Mimi , Barwick, Melanie , Comtois, Katherine A
Format: Electronic Article
Language: English
Subjects: Cooperative Behavior; Program Evaluation -- Methods; Translational Medical Research -- Methods
ID: E-ISSN: 1748-5908 ; PMID: 25567126 Version:1 ; DOI: 10.1186/s13012-014-0193-x
Link: http://pubmed.gov/25567126