Process and Outcome Evaluation of Two Phased Innovation Award Programs

Authors

Liberty A. Walton, Brandie K. Taylor, Krystal A. Tomlin, Dione I. Washington, and Jane C. Lockmuller

Abstract

The National Institutes of Health (NIH) uses several approaches to promote and support high-risk, high-reward research. The National Institute of Allergy and Infectious Diseases (NIAID) at the NIH promotes such research through a biphasic R21/R33 grant mechanism, which begins with an R21 grant that may lead to an R33 grant. Through these grants, NIAID aims to support high-risk, high-reward, product-oriented research within the Division of AIDS (DAIDS). Two DAIDS programs in particular use this mechanism: the AIDS Vaccine Research (AVR) Program and the Microbicide Innovation Program (MIP). To determine whether this biphasic grant mechanism is achieving its goal, The Madrillon Group Inc., under contract and in collaboration with NIAID’s Policy, Planning, & Evaluation Branch (PP&E) and DAIDS, conducted a process and outcome evaluation of these R21/R33 research programs. This poster illustrates the evaluation methodology, provides key study findings, and outlines lessons learned.

Evaluation Questions

  1. Is the Phased Innovation Award (PIA) mechanism appropriate for the desired microbicide and prophylactic vaccine research?
  2. Is the PIA mechanism a valuable component of the DAIDS research portfolio?
  3. What was the overall impact of the milestone-driven research supported by the PIA mechanism?

Data Sources

  • Archival Data (e.g. NIH grant and application records)
  • Bibliometric Data
  • Web-based Principal Investigator (PI) Survey
  • Interviews with Principal Investigators
  • Interviews with Federal Stakeholders
  • Case Studies

Methods

Archival Data
  • Source: NIH databases (e.g., QVR, RePORTER)
  • Pros:
    • Quick access to large datasets (e.g., demographic and professional characteristics, science content of projects, grant funding history, and key personnel and collaborations)
  • Cons:
    • Difficult to repeat data pulls because the databases are updated frequently
    • Data entry practices have changed over time, so it can be difficult to compare certain data across years

Bibliometric Data
  • Source: Scientific Publication Information Retrieval & Evaluation System (SPIRES)
  • Pros:
    • Quick access to large datasets
    • Commonly used outcome measures
  • Cons:
    • Requires time for data cleaning
    • Publications do not always accurately acknowledge their funding sources
    • Not all journals are included in PubMed
    • It can be challenging to match grants with publications due to differences in formatting
    • If a grant changes its number (e.g., from an R21 to an R33), it can be difficult to correctly match publications and grants (see the matching sketch after this table)

Web-based Principal Investigator Survey
  • Source: Principal Investigators
  • Pros:
    • Cost-efficient
    • Reaches people internationally
    • Immediate data accessibility
    • Easily allows for skip patterns
    • Easy to track respondents and send tailored reminders to non-respondents
    • Short time required to complete (~20 minutes)
  • Cons:
    • Does not provide an opportunity to clarify questions
    • Needs to be tested on multiple platforms
    • Some email addresses on record are no longer valid

Telephone Interviews
  • Source: Principal Investigators
  • Pros:
    • Semi-structured interview protocol allowed for follow-up questions and clarification of questions and responses
    • Reaches people nationally
  • Cons:
    • Length of interview (~48 minutes)
    • Interviewer bias

In-Person and Telephone Interviews
  • Source: Federal Stakeholders (Program Officers and Directors, Grants Management Officers, and Scientific Review Officers)
  • Pros:
    • Semi-structured interview protocol allowed for follow-up questions and clarification of questions and responses
    • Telephone option allowed for more flexible scheduling
    • Good response rates
  • Cons:
    • Variable length of interview depending on programmatic area (12-43 minutes)

Case Studies
  • Source: Four similar grant programs at other NIH Institutes
  • Pros:
    • Semi-structured interview protocol allowed for follow-up questions and clarification of questions and responses
    • Enabled a detailed examination of how other Institutes implemented PIA grant mechanisms
  • Cons:
    • Length of interview (90-120 minutes)
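
One recurring bibliometric difficulty noted above is matching publications to grants when an award's number changes between the R21 and R33 phases. A minimal Python sketch of one workaround appears below; it is illustrative, not part of the evaluation. It normalizes grant identifiers found in funding acknowledgments to the institute code plus serial number, on the assumption that the serial number carries over from the R21 to the R33 phase; the regex, function name, and grant number are hypothetical.

```python
import re

# Minimal sketch: reduce NIH grant identifiers found in publication
# funding acknowledgments to a phase-independent key, so the R21 and
# R33 phases of the same award can be matched. The grant number
# AI012345 below is hypothetical and used only for illustration.
GRANT_RE = re.compile(
    r"(?:\d\s*)?"     # optional application type digit, e.g. the "5" in 5R33...
    r"(R2[13])"       # activity code: R21 or R33
    r"[\s/-]*"        # acknowledgments vary in spacing and punctuation
    r"([A-Z]{2})"     # institute/center code, e.g. AI for NIAID
    r"[\s/-]*"
    r"(\d{5,6})"      # serial number
)

def core_project_keys(acknowledgment: str) -> set[str]:
    """Return phase-independent project keys (IC code + serial number).

    Dropping the activity code lets the R21 and R33 phases of one award
    collapse to the same key, assuming the serial number is retained
    across the phase transition (verify against the grant records).
    """
    return {ic + serial.zfill(6)
            for _activity, ic, serial in GRANT_RE.findall(acknowledgment)}

# Both phases of the (hypothetical) award reduce to the same key.
assert core_project_keys("Supported by NIH grant 1R21 AI-012345-01") == {"AI012345"}
assert core_project_keys("This work was funded under 5R33AI012345-03") == {"AI012345"}
```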

Challenges & Lessons Learned

  • The evaluation would have benefited from a longer timeline (i.e., more than nine months)
  • It can be difficult to find appropriate comparison groups when evaluating a grant mechanism, rather than a scientific program
  • The funded research was too recent to study long-term outcomes
  • Self-reported data carry inherent limitations
  • No established benchmarks exist for assessing bibliometric measures

Successes

  • High response rates
    • PI interviews (n=9): 100%
    • Federal stakeholder interviews (n=15): 93%
    • PI surveys (n=64): 95%
      • Initial email sent from NIH
      • Announced at professional meeting
  • Case studies allowed methods and results to be generalized to other Institutes
  • Qualitative and quantitative data allowed for stronger analyses and interpretation
  • With the evaluation, DAIDS was able to:
    • Document funding of high-risk projects
    • Better document outcomes
    • Identify program areas for improvement

Key Findings

The DAIDS PIA mechanism:

  • Achieved the AVR and MIP program goals
  • Provided the ability to evaluate research progress
  • Supported research that led to new scientific hypotheses, models, methods, tools, etc.
  • Stimulated multidisciplinary collaborations

Use of Results

  • Prompted other DAIDS grant mechanisms to adopt a biphasic approach with go/no-go milestones
  • Informed the design of an evaluation of a similar grant mechanism at the National Cancer Institute

Contact Information

NIAIDEvaluation2@niaid.nih.gov

Acknowledgements

The Madrillon Group, Inc.

Poster Presented at the 2015 American Evaluation Association (AEA) Annual Meeting:

Walton, L., Taylor, B., Tomlin, K., Washington, D., & Lockmuller, J. (2015, November). Process and Outcome Evaluation of Two Phased Innovation Award Programs at the National Institute of Allergy and Infectious Diseases. Poster presented at the American Evaluation Association Annual Meeting, Chicago, IL.
