As natural disasters unfold, first responders, disaster response professionals, and utility companies use near-real-time imagery to understand and predict the effects and extent of the crisis. One key set of sensors is the GOES Advanced Baseline Imager (ABI), which currently operates from two geostationary satellites that image the entire western hemisphere every ten minutes. These sensors can serve as a key resource for rapidly visualizing natural disasters.
In addition to the “full disk” images that capture almost an entire half of the globe, the ABI also captures two “mesoscale” domains every sixty seconds. These high-temporal-frequency data provide detailed information about ongoing weather events and other phenomena of interest. During the United States’ hurricane season, it’s common for one of the mesoscale domains to focus on a large ongoing hurricane event.
GOES mesoscale data are available with low temporal latency, but using these products in near-real-time decision support tools is non-trivial. The data require post-processing to convert them into meaningful red-green-blue (RGB) imagery, and during nighttime, dawn, and dusk, additional processing is required to blend the RGB imagery with data from the infrared bands. Element 84’s FilmDrop system of tools provides near-real-time data processing that can respond to new data events and produce web-ready products with extremely short turnaround.

To demonstrate how this might work in the event of an ongoing crisis, we created a small data pipeline to process GOES mesoscale data of the recent Hurricane Hilary and display it on a web map along with a simple vector layer. The data are sourced from Microsoft’s Planetary Computer, and the RGB images have been produced via a simplified version of the GeoColor product available from the GOES Imagery Viewer. The image metadata are available as a STAC Item collection, similar to what you would get as a search response from a STAC API server such as Earth Search. The code is available on GitHub.
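The day-night blending described above can be sketched as a per-pixel weighted average between the daytime RGB image and a grayscale infrared image, with the weight driven by solar zenith angle. This is a minimal illustration, not the actual GeoColor algorithm; the 80–90 degree twilight transition band is an assumed, illustrative choice.

```python
import numpy as np


def blend_day_night(rgb, ir_gray, solar_zenith_deg, lo=80.0, hi=90.0):
    """Blend a daytime RGB image with a nighttime grayscale IR image.

    rgb: (H, W, 3) float array in [0, 1]
    ir_gray: (H, W) float array in [0, 1]
    solar_zenith_deg: (H, W) solar zenith angle per pixel, in degrees
    lo, hi: zenith angles bounding the twilight ramp (assumed values)
    """
    # Weight is 1 in full daylight, 0 at night, with a linear ramp in between.
    w = np.clip((hi - solar_zenith_deg) / (hi - lo), 0.0, 1.0)[..., np.newaxis]
    # Broadcast the single IR band to three channels so shapes match.
    ir_rgb = np.repeat(ir_gray[..., np.newaxis], 3, axis=-1)
    return w * rgb + (1.0 - w) * ir_rgb


# Tiny example: left column in daylight (zenith 40°), right column at night (100°).
rgb = np.full((2, 2, 3), 0.8)
ir = np.full((2, 2), 0.3)
zenith = np.array([[40.0, 100.0], [40.0, 100.0]])
blended = blend_day_night(rgb, ir, zenith)
# Day pixels keep the RGB value (0.8); night pixels take the IR value (0.3).
```

A real pipeline would compute the solar zenith angle from each pixel's coordinates and the image timestamp, but the blending step itself reduces to this weighted sum.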
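Because the image metadata are published as a STAC Item collection (a GeoJSON FeatureCollection of Items), a web map client only needs to sort the Items by datetime and read the asset hrefs to animate the sequence. Here is a small sketch of that consumption step; the item ids, datetimes, and the `image` asset key are illustrative assumptions, not the pipeline's actual schema.

```python
# A toy STAC ItemCollection, shaped like a STAC API search response.
item_collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "id": "goes-meso-frame-0001",
            "properties": {"datetime": "2023-08-20T12:01:00Z"},
            "assets": {"image": {"href": "https://example.com/frame-0001.png"}},
        },
        {
            "type": "Feature",
            "id": "goes-meso-frame-0000",
            "properties": {"datetime": "2023-08-20T12:00:00Z"},
            "assets": {"image": {"href": "https://example.com/frame-0000.png"}},
        },
    ],
}

# Sort the frames by acquisition time, then collect the web-ready asset hrefs.
# ISO 8601 timestamps in UTC sort correctly as plain strings.
frames = sorted(
    item_collection["features"], key=lambda f: f["properties"]["datetime"]
)
hrefs = [f["assets"]["image"]["href"] for f in frames]
```

The same structure comes back from a STAC API search endpoint (such as Earth Search), which is why serving the metadata this way lets existing STAC tooling consume the products directly.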