July 28, 2020
In conjunction with the release of each Newsletter, we are launching a new series called “Forecast Spotlight”. The goal is to highlight operational forecasts being run by EFI members, how those members got into forecasting, and the lessons they have learned along the way.
The inaugural spotlight features ecocaster, by Nicholas Record at Bigelow Laboratory for Ocean Sciences. You can also see Nicholas’ ESA 2020 presentation, “Using citizen observations to forecast ecosystems from jellyfish to moose to whales,” at the EFI watch party on August 4, starting at 1:45pm Eastern Time: https://youtu.be/42nZ4yAwG1o
1. How did you get interested in ecological forecasting?
I joined a NASA project in the early aughts, funded by their Ecosystem Forecasting program. The goal of the project was to forecast right whale movements to help manage this endangered species. In addition to the value of the project, I was drawn to the interdisciplinary nature of forecasting, including math, biology, and geoscience, as well as the social sciences and visual arts.
2. What are you trying to forecast?
In my Ocean Forecasting Center, we work on everything from viruses to whales. We try to experiment with new methods as often as we can, embracing the “Cambrian explosion” of forecasting approaches that Payne et al. (2017)¹ described. At the moment, I’m really interested in using crowd-science approaches to build ecosystem forecasts for human-wildlife interactions. For example, we have real-time forecasts for jellyfish sightings, moose-car interactions, and tick encounters.
3. Who are the potential users or stakeholders for the forecasts you create?
For the crowd-science forecasts, like the jellyfish forecast, the audience is the general public. I need hundreds of people to participate for the forecasts to work, and giving them a daily forecast to look at is a fun way to keep them engaged. For other projects, like our harmful algal bloom forecasts, the stakeholders are more specific, including industry members and management agencies. Some of those forecasts are also viewable by the public.
4. What are the key lessons you have learned from your forecasts?
If the general public is involved, any barrier to entry (e.g. a webform or login) can result in a major drop in the amount of data that comes in.
The human dimension is just as important a part of the system as the wildlife, environment, etc.
If you want your forecast to be perfect, it might take forever before anyone outside your group ever sees it. Failure is the greatest teacher (I also learned this from Yoda).
5. What was the biggest or most unexpected challenge you faced while operationalizing your forecast?
Data repositories periodically change their data formats or data pipelines. Keeping things automated is constant work.
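As a concrete illustration (not from the Ocean Forecasting Center’s codebase), here is a minimal sketch of one way an automated pipeline can guard against the kind of upstream format changes described above: validate the columns of a downloaded file before the rest of the forecast runs, and fail loudly when the repository’s schema has drifted. The URL and column names below are hypothetical placeholders.

```python
"""Minimal sketch: guard an automated forecast pipeline against upstream
format changes. The feed URL and column names are hypothetical."""

import csv
import io
import sys
import urllib.request

# Columns the downstream forecast code expects from the (hypothetical) repository.
EXPECTED_COLUMNS = {"date", "latitude", "longitude", "sightings"}

DATA_URL = "https://example.org/observations.csv"  # placeholder, not a real feed


def fetch_rows(url: str) -> list[dict]:
    """Download a CSV feed and return its rows as dictionaries."""
    with urllib.request.urlopen(url, timeout=30) as response:
        text = response.read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))


def validate_schema(rows: list[dict]) -> None:
    """Fail loudly if the repository's column names have drifted."""
    if not rows:
        raise ValueError("Feed returned no rows; upstream pipeline may have changed.")
    missing = EXPECTED_COLUMNS - set(rows[0].keys())
    if missing:
        raise ValueError(f"Feed is missing expected columns: {sorted(missing)}")


if __name__ == "__main__":
    try:
        rows = fetch_rows(DATA_URL)
        validate_schema(rows)
    except Exception as exc:
        # Alert a human instead of silently producing a bad forecast.
        print(f"Data check failed, skipping today's forecast: {exc}", file=sys.stderr)
        sys.exit(1)
    print(f"Schema check passed; {len(rows)} observations ready for the forecast step.")
```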
6. Is there anything else you want to share about your forecast?
Check out some of the experimental crowd-science forecasts at eco.bigelow.org.
…And if you have a crowd-sourced forecast idea that you’d like to tinker with, I’d be happy to collaborate.
¹ Payne, M.R., A.J. Hobday, B.R. MacKenzie, D. Tommasi, D.P. Dempsey, S.M.M. Fassler, A.C. Haynie, R. Ji, G. Liu, P.D. Lynch, D. Matei, A.K. Miesner, K.E. Mills, K.O. Strand, and E. Villarino. 2017. Lessons from the First Generation of Marine Ecological Forecast Products. Frontiers in Marine Science. https://doi.org/10.3389/fmars.2017.00289