If an application includes an evaluation research component (or consists entirely of evaluation research), the application is expected to propose the most rigorous evaluation design appropriate for the research questions to be addressed. If the primary purpose of the evaluation is to determine the effectiveness or impact of an intervention (e.g., program, practice, or policy), the most rigorous evaluation designs may include random selection and assignment of participants (or other appropriate units of analysis) to experimental and control conditions. In cases where randomization is not feasible, applicants should propose a strong quasi-experimental design that can address the risk of selection bias.
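For illustration only (the solicitation does not prescribe any tooling), simple random assignment of units to experimental and control conditions can be sketched as follows; the unit labels, group sizes, and seed are assumptions, not requirements:

```python
import random

def randomly_assign(units, seed=42):
    """Randomly split a list of units into treatment and control arms.

    A fixed seed makes the assignment reproducible and auditable,
    which supports the design rigor the solicitation calls for.
    """
    rng = random.Random(seed)
    shuffled = list(units)
    rng.shuffle(shuffled)          # unbiased random ordering of units
    half = len(shuffled) // 2      # even split; other allocation ratios are possible
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

# Hypothetical participant identifiers for illustration
arms = randomly_assign([f"participant_{i}" for i in range(10)])
```

In practice, assignment may occur at another unit of analysis (e.g., sites or classrooms) and may use stratification or blocking; the sketch shows only the core idea of chance-based allocation.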
Applications that propose evaluation research should also include an implementation evaluation component to ensure that implementation fidelity metrics are collected and analyzed. Applicants are encouraged to propose research methodologies that provide interim feedback on implementation fidelity so that program providers can make ongoing improvements in program delivery.
Applications that propose evaluation research should also discuss how the independence and integrity of the evaluation will be safeguarded. If an application proposes an evaluation involving program staff, the applicant must demonstrate research/evaluation independence by providing: a description of and justification for the roles of program staff as distinct from evaluation activities; a discussion of any potential risks to independence and integrity; and a description of the safeguards that will be employed to ensure research independence.
NIJ also encourages applicants to consider the feasibility of including:
- Cost/benefit and cost-effectiveness analyses. Where an evaluation finds that an intervention has produced its intended benefit, cost/benefit or cost-effectiveness analyses provide valuable, practical information that aids decision-making by practitioners and policymakers.
- Field-initiated action research. These are research partnership proposals that meet the needs and missions of local justice and service provider entities. These partnerships should apply a data-driven, problem-solving approach to challenges prioritized by agency partners; identify actionable and measurable responses; implement changes; and employ an action research evaluation approach, emphasizing scientific rigor and meaningful stakeholder engagement, to assess the impact of interventions on desired outcomes. Such proposals should also include a statement of institutional partnership.
- Implementation science. Research that leverages implementation science knowledge and promotes evidence-based policy and practice — most directly by developing, supporting, and evaluating efforts to improve the use of research evidence by policymakers, agency leaders, intermediaries, and other decision-makers who shape justice outcomes. A multitude of theories, models, and frameworks (TMFs) — mostly developed in other disciplines — exist to facilitate the implementation of evidence-based programs and practices.[1] NIJ is interested in evaluating the impact of applying existing TMFs in criminal and juvenile justice settings.
- Core components research. Core program components are the essential functions and principles underlying a program's design that are necessary to produce outcomes in a typical service setting; they include the parts, features, attributes, or characteristics that influence a program's success. Core program components can be identified through, for example, systematic reviews, meta-analyses, or multisite evaluations.[2]
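The cost/benefit and cost-effectiveness analyses encouraged above reduce, at their simplest, to ratios of monetized benefits to costs, or of costs to outcome units achieved. A minimal sketch, using purely hypothetical figures (no dollar amounts or outcomes here come from the solicitation):

```python
def benefit_cost_ratio(total_benefits, total_costs):
    """Benefit-cost ratio: monetized benefits produced per dollar spent."""
    return total_benefits / total_costs

def cost_effectiveness_ratio(total_costs, units_of_outcome):
    """Cost per unit of outcome achieved (e.g., cost per re-arrest averted)."""
    return total_costs / units_of_outcome

# Hypothetical program figures, for illustration only
bcr = benefit_cost_ratio(1_500_000, 500_000)    # $3 in benefits per $1 spent
cer = cost_effectiveness_ratio(500_000, 250)    # $2,000 per outcome unit
```

A full analysis would also discount future benefits and costs to present value and report uncertainty around the estimates; the ratios above show only the basic arithmetic.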
Evaluation research projects may address a wide range of research questions beyond those focused on the effectiveness or impact of an intervention. Different research designs may be more appropriate for different research questions and at different stages of program development. The intervention strategies, setting, other contextual factors, and resources should be considered when selecting an evaluation design.
In all cases, applications are expected to propose the most rigorous evaluation design appropriate for the research questions to be addressed. Applicants are encouraged to review evidence-rating criteria at How We Rate Programs for further information on high-quality evaluation design elements.