Co-production & Actionable Science Panel

May 4, 2021 Panel Hosted by the Partners & Knowledge Transfer Working Group

This panel focused on best practices and lessons learned for the co-production of scientific materials with stakeholders and decision makers.

Below you will find a list of the panelists and their backgrounds, a video recording of the panel with times listed when each panelist spoke, resources shared by the panelists, and answers to some of the questions we did not have time to get to during the Q&A session.

Our panelists were:

  • Alyssa Rosemartin (USA National Phenology Network)
  • Mahmud Farooque (Consortium for Science, Policy and Outcomes)
  • Ellen Mecray (NOAA)

Recording of the Panel

Timing of Activities in the Video

  • 0:00 Welcome and Overview, Jody Peters (University of Notre Dame)
  • Christopher Brown (NOAA) introduced each of the panelists listed below before their presentation
  • 5:27 Panelist: Alyssa Rosemartin (USA National Phenology Network)
  • 18:10 Panelist: Mahmud Farooque (Consortium for Science, Policy and Outcomes)
  • 30:04 Panelist: Ellen Mecray (NOAA)
  • 45:10 Q&A Session moderated by Kira Sullivan-Wiley (University of Notre Dame)

Resources Shared by Panelists

Co-production definitions

  1. Information users and providers collaborate in the generation of knowledge. Key features include building and sustaining relationships and acknowledging each partner’s differences: what they want out of the partnership and the project, and what their strengths and limitations are. As shown in David-Chavez and Gavin 2018, co-production can take different forms depending on the project and funding (contractual, consultative, collaborative, collegial, Indigenous).
  2. Participatory Technology Assessment is an engagement model that seeks to improve the outcomes of science and technology decision-making through dialogue with informed citizens. It engages a group of non-experts who are representative of the general population but who, unlike political, academic, and industry stakeholders, are generally underrepresented in science- and technology-related policymaking.
  3. Building a set of trusted experts and engaging with external and internal customers, with continuous engagement and communication that foster mutual learning and facilitate a joint dedication to achieving agreed-upon needs and goals.


Alyssa Rosemartin:

  1. Meadow et al. 2015. Moving toward the deliberate coproduction of climate science knowledge. Weather, Climate, and Society 7(2): 179-191. https://doi.org/10.1175/WCAS-D-14-00050.1
  2. Biggs, S. D. 1989. Resource-poor farmer participation in research: A synthesis of experiences from nine national agricultural research systems. International Service for National Agricultural Research, 37 pp. https://vtechworks.lib.vt.edu/handle/10919/66367
  3. Enquist et al. 2017. Foundations of translational ecology. Frontiers in Ecology and the Environment 15(10): 541-550. https://doi.org/10.1002/fee.1733
  4. David-Chavez and Gavin. 2018. A global assessment of Indigenous community engagement in climate research. Environmental Research Letters 13(12): 123005. https://doi.org/10.1088/1748-9326/aaf300 (This publication has the framework for levels of community participation, i.e., contractual, consultative, collaborative, collegial, and Indigenous.)
  5. Wall et al. 2017. Developing evaluation indicators to improve the process of coproducing usable climate science. Weather, Climate, and Society 9(1): 95-107. https://doi.org/10.1175/WCAS-D-16-0008.1
  6. Monahan et al. 2016. Climate change is advancing spring onset across the U.S. national park system. Ecosphere 7(10):e01465. https://doi.org/10.1002/ecs2.1465
  7. NPS page on the Spring Advancing Project 
  8. Info sheet on Forest Pest Risk Project 
  9. National Conservation Training Center Webinar series about indigenous approaches to phenology
  10. Indigenous Phenology Network 
  11. Google Spreadsheet for Assessing Copro Efforts relative to the Indicators of Successful Knowledge CoProduction – this is a blank example of the spreadsheet Alyssa’s team uses to guide their co-production efforts. It is based on Wall et al. 2017, with some modifications (which the team has written up and submitted to Conservation Science & Practice). Alyssa is happy to share what the spreadsheet looks like filled out; contact eco4cast.initiative@gmail.com to be put in touch with her.

Mahmud Farooque: 

  1. Participatory Technology Assessment and ECAST pTA Method – The assessment is a powerful tool for bringing public perspectives to bear on critical science and technology decisions. The ECAST pTA Method is a reflexive, adaptable, and scalable model that consists of three phases: problem framing, ECAST deliberations, and results and integration.
  2. Funtowicz and Ravetz. 1993. Science for the post-normal age. Futures 25(7): 739-755. https://doi.org/10.1016/0016-3287(93)90022-L
  3. Schuurbiers & Fisher. 2009. Lab-scale intervention. Science & Society Series on Convergence Research. EMBO (European Molecular Biology Organization) Reports 10(5): 424-427. https://doi.org/10.1038/embor.2009.80
  4. U.S. National Academies of Sciences, Engineering, and Medicine. 2016. Gene drives on the horizon: Advancing science, navigating uncertainty, and aligning research with public values. Washington, DC: The National Academies Press. https://doi.org/10.17226/23405
  5. Lang et al. 2012. Transdisciplinary research in sustainability science: practice, principles, and challenges. Sustainability Science 7: 25-43. https://link.springer.com/article/10.1007/s11625-011-0149-x
  6. NSF I-Corps – experiential education to help researchers gain valuable insight into entrepreneurship, starting a business, or industry requirements.
  7. Business Model Canvas. https://www.strategyzer.com/canvas/business-model-canvas
  8. ESA Sustaining Biological Infrastructure Course – The online Strategies for Success course provides the financial management, strategic planning, communication, and fundraising skills and tools you need to make your project or program more successful and financially sustainable. This course teaches the Business Model Canvas.

Ellen Mecray:

  1. NOAA Water Initiative Vision and Five-year Plan, December 2016. The mission of the NOAA Water Initiative is to improve the Nation’s water security by providing science-based information and services that address vulnerability to water risks and enable greater efficiency and effectiveness in the management of water resources. NOAA will advance this mission primarily by transforming integrated water prediction services in collaboration with decision makers, partners, and users.
  2. NOAA Office of Education – The Office of Education provides support for education projects and programs through grants, manages partnerships that advance NOAA’s mission and draw on the assets of the agency, and provides scholarships for undergraduates to gain hands-on experience while pursuing research and educational training in NOAA-mission sciences.
    1. NOAA Hollings Undergraduate Scholarship – provides academic assistance (up to $9,500 per year) for two years of full-time study and a 10-week, full-time paid summer internship at any NOAA facility nationwide, with travel expenses included.
    2. Educational Partnership Program Undergraduate Scholarship – provides awards of up to $9,500 per year for two years of undergraduate study to rising juniors at Minority Serving Institutions who are majoring in STEM fields that directly support NOAA’s mission. Scholars conduct research at NOAA facilities during two 10-week, fully paid summer internships.

Additional Answers to Questions We Did Not Get to During the Q&A Session

  1. How can we support an iterative approach to co-production and to the development and refinement of decision support? How can this work best be supported when there can be a mismatch between grant funding cycles and the continual refinement users need?
    • Alyssa: In some cases, I have switched my mindset – from the grant being the primary organizing system to the people/relationships/working group being the primary organizing system, with grants coming in to support that group’s work. I do think that we should keep thinking about all these awkward/poorly incentivized pieces and try to change them structurally to make it easier to do this work – longer time frames on grants, more seed grants that foster partnership development without requiring outcomes.
    • Mahmud: I was asked a similar question in another forum and my detailed answer is here (https://pewsconf.wordpress.com/program/mahmud-farooque/). The short version is that, as a practical matter, achieving our goals required my colleagues and me to be pragmatic, strategic, and proactive about the opening and closing of opportunity windows.
  2. What recommendations do you have for moving ‘service delivery’ and ‘decision support’ to the front end of the process, rather than treating them as a secondary component of the forecast production process?
    • Alyssa: Basically, following the framework from Wall et al. 2017 – especially the input and process indicators – that’s where the foundation is laid. I tend to think that if application is a primary goal of a forecast project, then you have to start with the users in the room and work on understanding their problems and what information they are lacking.
    • Mahmud: I will refer to my co-production diagram and say that we need to include service delivery and decision support in the “problem framing” step of the co-production process. This is also consistent with the recommended practice for usable and actionable science: start with the problem, not the most interesting forecast/research question. In a recent article, Gluckman et al. (https://www.nature.com/articles/s41599-021-00756-3#Sec5) said: “Clearly, the success of brokerage requires a policy-making process that is receptive to evidence—a process that begins with a question rather than an answer.”
  3. How is an agency like NOAA thinking about sunsetting a project or data product that is only useful to a small constituent group, so that resources can be redirected to delivering products in a more agile manner?
  4. I’m interested in the panelists’ perspectives on the impacts for forecasters of framing co-production as: a) development of scientific products, adding input from stakeholders and decision makers, vs. b) implementing decision science, adding stakeholders and relevant science and scientists.
    • Alyssa: Yes – this is good to think about – there is even a third option, right? – c) a manager initiates the process and adds decision science experts and natural science experts as needed. I operate in a), basically because of circumstances; I’m a scientist/tool-developer type. I think b) or c) make as much or more sense. Science as it is practiced is a legacy of history and has a lot of inequity baked in. What would it look like to redesign the whole thing? Imagine if People with Problems were at the center: they take the first action, and when they need info they reach out to a trusted cadre of ‘science translators’ or info finders or librarians who identify whether the needed info exists and, if not, call on people trained in the methods appropriate to the problem to tackle it.
  5. Measuring/valuation is hard! How is value effectively demonstrated to admin/program officials, outside of end-user support/lobbying?
    • Alyssa: I think this depends so much on the context and the funder – maybe also on relationships: they know you and see that you are doing good work and are accountable to stakeholders. I also think that as the culture shifts, maybe NSF or other science funders could come to see some of the hallmarks of good process as proof of success. As a society we are very focused on outputs/outcomes, when the process is just as important.
  6. How do you ensure that you build the time and capacity for assessing and meeting user needs into a project from the start?
    • Alyssa: I’ve been trying to write that kind of time into grants – thinking about it at the earliest stages of planning. I think it might also be a slow mindset change – it becomes a natural focus with practice. There is a culture of overpromising products to get funding, so it’s tough.
  7. In your opinion, what are the knowledge gaps faced by climate change mitigators that scientists in ecology could help fill but might not be aware of?
    • Alyssa: The Climate Adaptation Science Centers have good info on this regionally. I would also say the subseasonal-to-seasonal and 5-10 year timeframes are of high interest to managers and climate change practitioners, but we don’t yet have the science/methods to deliver on these.
  8. EFI is a nascent field and there are a lot of opportunities to “get it right”. What recommendations do you have for the development of co-production processes? For decision support tools? For research questions and community research priorities?
    • Alyssa: It would be interesting to talk more about this – there is lots of potential, and I’m not caught up on what’s going on at EFI. My first thoughts are about centering equity – maybe having some flagship/focal efforts that are focused on serving environmental justice communities? Maybe a collaboration between EFI and CAKE (the Climate Adaptation Knowledge Exchange) or a similar group, to share practices and ways of thinking?
  9. Even when not directly involved in co-production of knowledge projects, what can ecologists in academia still do to make their work more useful to climate change mitigators?
    • Alyssa: I think ecologists could look to the priorities of boundary organizations that are working with managers, like the Climate Adaptation Science Centers, and go after questions that match those priorities. Part of my journey has been about figuring out the balance between broad-scale work, what could maybe be called ‘desktop ecology’, and being truly place-based – so for me, I try to stay connected to the ecology of the place where I live in a day-to-day way and to follow what my state and local leaders are doing for climate adaptation locally.
  10. Do you weight any of the criteria used in your prioritization process?
    • Alyssa: If this was about the Forest Pest Risk project – yes, weighting is definitely part of it. The framework works for different management contexts because each decision-maker can weight the columns based on their priorities (see the sketch below).
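The sketch below is a minimal, hypothetical illustration of the kind of column-weighted prioritization Alyssa describes: each decision-maker supplies their own weights for the criteria columns, and the same scores can produce different rankings under different priorities. The criteria names, scores, and weights are invented for illustration and are not taken from the Forest Pest Risk framework.

```python
# Minimal sketch of weighted multi-criteria prioritization.
# All criteria, scores, and weights are hypothetical examples,
# not values from the Forest Pest Risk framework.

# Candidate items scored (0-1) against example criteria ("columns").
scores = {
    "pest_A": {"ecological_impact": 0.9, "spread_risk": 0.4, "treatability": 0.7},
    "pest_B": {"ecological_impact": 0.5, "spread_risk": 0.8, "treatability": 0.6},
    "pest_C": {"ecological_impact": 0.7, "spread_risk": 0.6, "treatability": 0.2},
}

# Each decision-maker supplies their own weights for the columns,
# reflecting their management context and priorities.
weights = {"ecological_impact": 0.5, "spread_risk": 0.3, "treatability": 0.2}

def weighted_score(criteria_scores, weights):
    """Weighted sum of criteria scores, normalized by the total weight."""
    total_weight = sum(weights.values())
    return sum(criteria_scores[c] * w for c, w in weights.items()) / total_weight

# Rank candidates from highest to lowest priority under these weights.
ranked = sorted(scores, key=lambda item: weighted_score(scores[item], weights), reverse=True)
for item in ranked:
    print(item, round(weighted_score(scores[item], weights), 3))
```

Changing the weights (for example, weighting spread risk most heavily) can reorder the ranking without changing any of the underlying scores, which is what lets one framework serve different management contexts.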