Cost-effectiveness Analysis to Inform BC Screening Policy
Institution: University of California, Davis
Joy Melnikow, M.D., M.P.H.
Award Cycle: 2011 (Cycle 17) | Grant #: 17IB-0007 | Award: $149,996
Community Impact of Breast Cancer > Health Policy and Health Services: better serving women's needs
Initial Award Abstract (2011)
Non-technical overview of the research topic and relevance to breast cancer:
Cost-effectiveness analysis (CEA) is a tool that policymakers and others can use to project the future health benefits and costs of various policy options. Breast cancer health programs for underserved women face increasing need for services and declining budgets. Carefully constructed computer modeling can help project the potential outcomes of policy and budgetary choices. This project builds on a cost-effectiveness model designed specifically for the Every Woman Counts (EWC) Program, a California safety net program providing breast cancer screening and diagnostic services to uninsured and underinsured women.
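The core calculation behind a CEA comparison can be sketched very simply. The function and all figures below are illustrative only, not EWC program data: the incremental cost-effectiveness ratio (ICER) divides the extra cost of one policy over another by the extra health benefit it produces.

```python
# Minimal sketch of the incremental cost-effectiveness ratio (ICER),
# the standard CEA summary measure. All numbers are hypothetical.

def icer(cost_new, cost_base, effect_new, effect_base):
    """Extra cost per extra unit of health outcome (e.g., per
    life-year gained) for a new policy versus a comparator."""
    return (cost_new - cost_base) / (effect_new - effect_base)

# Illustrative comparison of two hypothetical screening policies:
ratio = icer(cost_new=1_200_000, cost_base=800_000,
             effect_new=120.0, effect_base=100.0)
print(f"ICER: ${ratio:,.0f} per life-year gained")  # prints "ICER: $20,000 per life-year gained"
```

A full policy model layers disease natural history, screening performance, and program costs on top of this ratio, but the interpretation users see is ultimately of this form.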
The question(s) or central hypotheses of the research:
The central research question of this proposal is, “How can a user-friendly computer interface best facilitate the integration of cost-effectiveness analysis into breast cancer health policymaking?” This project will develop a computer interface that enables breast cancer policymakers, advocates, and researchers to choose program parameters and receive immediate feedback on the costs and outcomes of the policy alternatives they are considering. The aim is a user-friendly tool that helps integrate research evidence into health policymaking.
The general methodology:
We will hold a structured meeting with breast cancer policy stakeholders to identify priority questions related to breast cancer screening for underserved women in California, to increase their familiarity with CEA, and to begin to assess the user-friendly interface. Based on the policy priorities identified in the meeting, we will modify the EWC model and develop the computer interface for use by policy stakeholders. We will conduct follow-up interviews and additional testing of the model with the stakeholders who participated in the initial group meeting. Participant interviews will provide insight into how users perceive, understand, and use the policy model and how they interpret its output. At every step of the research process, the research team will work closely with a Project Advisory Board composed of representatives of breast cancer advocacy organizations and state and county public health administrators, who will advise the team on all aspects of the project.
Innovative elements of the project:
This research project applies technology innovatively to an issue of growing importance in breast cancer screening. As health organizations, particularly those serving uninsured and underinsured women, are pressed to provide more services to more women, cost-effectiveness analysis offers a strategy for coping with limited resources. The project is innovative not only in applying CEA to policy for a specific breast cancer screening program, but also in providing a computer interface that lets non-technical users directly estimate the results of their policy choices.
Advocacy involvement and relevance to the human issues associated with breast cancer:
Breast cancer advocates on the Advisory Board will be directly involved in the design of the project, interpretation of the findings, and development of the model. Advocates will also be involved in the study as participants in the structured meeting, as well as the subsequent testing and interviews about the model.
Final Report (2015)
Cost-effectiveness analysis (CEA) is an established analytic method for decision-making; however, policymakers have been slow to adopt and integrate this evidence-based tool. Obstacles to widespread adoption of CEA include limited knowledge and understanding of CEA methods (e.g., model inputs, assumptions, and terminology); difficulty interpreting technical findings; and lack of access to a user-friendly interface. Building upon an existing cost-effectiveness model tailored to the Every Woman Counts (EWC) program, a California breast cancer screening safety net program, this project designed and tested a user-friendly CEA computer model interface for breast cancer policymakers, advocates, and researchers to obtain evidence-based, relevant estimates of the impacts of user-defined program policies. Specifically, the model allows users to choose multiple program input parameters and to receive immediate feedback on the costs and health outcomes of the proposed policy.
All five project aims were achieved. Through an iterative process with the EWC staff and our project Advisory Board, we created an initial list of relevant policy questions, program challenges, and options for breast cancer screening in this underserved population; this list was further refined through discussions at a half-day stakeholder meeting of 29 participants and follow-up discussions with the project Advisory Board (Aim #1). Based on the feedback from these three sources, we identified and incorporated key functions that would make the model most relevant to stakeholders (Aim #2) and adapted the model to function under both fixed and variable budget conditions (Aim #3). We developed a user-friendly model interface (Aim #4) and tested it with stakeholders through one-on-one cognitive interviews (Aim #5).
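The distinction between the two budget modes in Aim #3 can be illustrated with a toy sketch. This is not the project's actual model; the per-woman cost and all figures are hypothetical. A variable-budget run takes demand (number of women) as input and reports total cost, while a fixed-budget run takes the budget as input and reports how many women can be served.

```python
# Illustrative sketch of fixed- vs. variable-budget model modes.
# COST_PER_WOMAN is a hypothetical average screening + diagnostic cost.

COST_PER_WOMAN = 250

def variable_budget(n_women: int) -> dict:
    """Cost follows demand: serve everyone, report the total cost."""
    return {"women_served": n_women,
            "total_cost": n_women * COST_PER_WOMAN}

def fixed_budget(budget: float) -> dict:
    """Service level follows budget: serve as many women as funds allow."""
    n = int(budget // COST_PER_WOMAN)
    return {"women_served": n,
            "total_cost": n * COST_PER_WOMAN}

print(variable_budget(10_000))   # {'women_served': 10000, 'total_cost': 2500000}
print(fixed_budget(2_000_000))   # {'women_served': 8000, 'total_cost': 2000000}
```

A safety net program like EWC typically operates in the fixed-budget mode, which is why stakeholder feedback (described below) prioritized adding that perspective to the model.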
The primary barrier to this project was recruiting key stakeholders for the interview phase. Despite completing fewer interviews than planned (15 rather than 25), we reached saturation, and interviewee comments were remarkably consistent and informative.
Major accomplishments included recruiting five Advisory Board members with breast cancer and/or policy expertise and meeting with them bimonthly or monthly, and convening a stakeholder meeting of 29 diverse California participants who provided input on key policy questions and model inputs and outputs. Based on this feedback, we modified the model to include a fixed-budget perspective; added new outputs such as the number of women served and the cost of screening/diagnosis; provided interpretive text for input and output variables; and added FAQ entries about the model’s limitations and assumptions regarding screening barriers, geographic differences, and new screening technologies. Following the model update, 15 interviewees with breast cancer screening and/or policy responsibility were recruited for input on the design and usability of the revised model interface. Interviewees comfortable with math and/or computers, and those with EWC-specific responsibility (rather than any particular stakeholder group), were most likely to report that the model was useful and relevant and to say they would recommend it to other decision makers. A few wished similar models would be developed for other policy issues. Additionally, based on the first seven interview responses, we created a tutorial guide to accompany the model, orienting users to CEA terminology and the model’s functions and providing more detailed interpretations of CEA graphs and tables. We used the guide in the last five interviews and gathered feedback on both the guide and the model. Final revisions to both were made based on this feedback.
Future plans include further revisions for clarity, review by several health economists to ensure model validity, adding printer-friendly capability, and initiating discussions with the EWC staff about the best arrangements for making the model and tutorial accessible to policymakers and their support staff. We plan to seek funding for a study of model use by policymakers and advocates in real-world settings.