The Call For Abstracts Submission Deadline is Closed.

The Abstract Review Process

All abstracts that meet the minimal acceptance requirements (i.e., were submitted by the deadline and contain sufficient information in each of the required elements for reviewers to make rating decisions) will be reviewed by a minimum of three qualified individuals. Proposals will be scored on each of the required elements.

The Children's Bureau (CB) reserves the right to make all final decisions regarding which proposals to accept. We anticipate that those submitting abstracts will be notified as to whether their proposal has been accepted no later than mid-April 2019.

Deadline for Submissions: February 1, 2019 (11:59 p.m. ET)

The Call for Abstracts for the 2019 National Child Welfare Evaluation Summit is broad and inclusive. Any proposal that substantively contributes to one or more of the three primary goals—building evidence, strengthening practice, and informing policy—will be considered. We expect the submission selection process to be highly competitive.

Proposals that advance the Children's Bureau's (CB's) prevention-focused vision for the future of child welfare, inform efforts to implement the Family First Prevention Services Act (FFPSA), or promote the routine and practical use of data and evaluation to continue to improve child welfare practice are strongly encouraged. We also have an interest in proposals that demonstrate productive and meaningful collaboration, such as interjurisdictional and cross-disciplinary approaches to data collection, sharing, and analysis and/or partnerships between evaluators and child welfare agencies or communities of interest.

In general, we seek sessions that focus on longstanding and emerging evaluation issues and practice challenges, with special attention to points of debate, gaps in knowledge, innovative methods, and new findings that contribute to stakeholders’ understanding of how to prevent and respond to child maltreatment more effectively. Proposals that engage multiple conference audiences, invite constructive dialogue, and present multiple points of view may be given preference.

We hope to receive proposals that will address a wide array of topics, including submissions in the following areas:

Proposals in this area should explore the purposes of evaluation; the choices stakeholders make about evaluation approach, design, and measurement; and stakeholder analyses based on these choices. Presenters are encouraged to discuss strengths, limitations, and implications and how these affect what stakeholders can learn and how evaluation findings should be used.

In addition to presentations that describe innovative and practical solutions to evaluation challenges, proposals in this area could explore a variety of topics and issues. For example:

  • Examining key factors that guide the selection of research approach, design, and methods
  • Defining and measuring nebulous constructs or outcomes (e.g., quality hearings, meaningful family engagement, healthy culture and climate)
  • Addressing measurement and sampling challenges at the child, family, organizational, community, or population levels
  • Weighing the relative advantages of quantitative, qualitative, and mixed methods
  • Using evaluation approaches driven by stakeholders and/or participants

Proposals in this area should examine how effective data collection, analysis, and interpretation facilitate understanding of the populations served by child welfare workers, the services these populations receive, and how worker interventions affect the lives of children, youth, and families. Presenters are encouraged to discuss the use of child welfare administrative data, other quantitative datasets, and qualitative data to identify and explore challenges, performance, and outcomes through evaluation and continuous quality improvement (CQI).

In addition to presentations that describe the nimble use of data to understand populations and performance, proposals in this area could explore a variety of topics and issues. For example:

  • Using child welfare administrative data to identify and assess agency needs and strengths
  • Analyzing data at the child, family, organizational, community, or population levels to understand and improve performance
  • Exploring opportunities, risks, and technical choices when using predictive analytics in child welfare
  • Developing agreements and models for sharing data across systems and agencies
  • Using quantitative and qualitative data to identify and confirm possible root causes
  • Exploring variation in the data, including overrepresentation and disparities in service delivery and outcomes

Proposals in this area should explore effective means of translating, communicating, and using research and evaluation findings to improve practice. Presenters are encouraged to describe how they have made findings more consumable and useful and how these efforts have successfully facilitated data-driven decision-making and improvements.

In addition to presentations that explore efforts to bridge the research-to-practice gap by improving the accessibility and applicability of findings, proposals in this area could explore a variety of topics and issues. For example:

  • Attempting to make findings more relevant, meaningful, consumable, and useful (e.g., through data visualization or the use of Bayesian methods)
  • Tailoring communication and dissemination approaches for intended audiences and communities (e.g., courts, tribes)
  • Communicating and using negative and null findings
  • Teaming to use data to drive decision-making and program improvement, including between agencies and courts
  • Integrating research and evaluation with ongoing CQI processes
  • Evaluating dissemination, reach, consumption, and use of findings in child welfare

Proposals in this area should focus on efforts to define “evidence” and how tests of efficacy and effectiveness are conducted in a variety of complex contexts with diverse populations. Presenters are encouraged to discuss dilemmas, challenges, and implications associated with balancing research “rigor” with feasibility, considering resource realities; the time-sensitive needs of children, youth, and families; and the FFPSA requirements.

In addition to presentations that describe efforts to test specific prevention and child welfare interventions and share formative and summative evaluation findings, proposals in this area could explore a variety of topics and issues. For example:

  • Understanding evidence continuums and rating criteria in child welfare and other fields
  • Operationalizing programs and practices in child welfare and performing rigorous formative evaluation (e.g., casework practice models, workforce strategies)
  • Making randomized controlled trials more accessible, practical, and feasible
  • Examining factors that make quasi-experimental designs necessary and most appropriate
  • Answering research questions about effectiveness and cost
  • Setting standards and requirements for evaluation that will promote evidence building and progression on evidence continuums

Proposals in this area should explain innovative evaluation approaches, including the use of technology to enhance and facilitate evaluation, as well as present results from the evaluation of new technologies and practice innovations. Presenters are encouraged to identify and discuss security and privacy issues associated with the use of emerging technologies in research, evaluation, or CQI efforts.

In addition to presentations that share novel evaluation and performance improvement strategies and technologies, proposals in this area could explore a variety of topics and issues. For example:

  • Exploring new frontiers of innovation in child welfare research and evaluation (e.g., neuroscience, epigenetics, pharmacology, artificial intelligence)
  • Evaluating innovations and technological advancements in workforce development (e.g., simulation training, virtual reality, and distance learning)
  • Using geospatial mapping to better understand context, services, and performance
  • Weighing opportunities and risks associated with data collection from widespread and emerging information technologies (e.g., search engines, social media, mobile applications)
  • Ensuring human subject protection, privacy, and security when collecting and using data in a rapidly changing technological and research environment
  • Building innovations into data infrastructure, including through the Comprehensive Child Welfare Information System

Proposals in this area should describe efforts to understand the needs and experiences of specific demographic groups and communities and to evaluate the effectiveness of programs and services designed to support them. Presenters are encouraged to discuss opportunities, challenges, and considerations when selecting methods, collecting and analyzing data, and interpreting findings, especially when focused on potentially vulnerable subpopulations.

In addition to presentations that highlight promising approaches and important considerations for population-specific research and evaluation, proposals in this area could explore a variety of topics and issues. For example:

  • Empowering and protecting vulnerable groups throughout the evaluation process
  • Designing or adapting methods and measures for specific communities (e.g., children who are victims of sex trafficking, youth in transition, prospective resource families, immigrant caregivers)
  • Examining differences in worldviews and their implications for building evidence
  • Defining and promoting rigorous evaluation with tribal communities
  • Collaborating with the legal and judicial community to evaluate practice
  • Using evaluation to support the development or cultural adaptation of interventions
  • Conducting research and evaluation to understand and address disparity and disproportionality