Project Lead: David Diaz, Engineering Postdoctoral Fellow, School of Environmental & Forest Sciences
eScience Liaison: Valentina Staneva
Maps that delineate and classify forest conditions remain indispensable prerequisites for forest stewardship planning. By developing new reproducible, open-source methods to automate forest mapping, our effort is designed to facilitate conservation and management planning among the 40,000+ non-industrial forest landowners in Oregon and Washington who control over 3 million acres of land. Social science research suggests that more than 70% of these owners have strong stewardship attitudes but are not (yet) engaged in any conservation or management activities. For many of them, developing a written stewardship plan is a critical bottleneck that must be cleared before they can adopt new practices or access state and federal cost-share and incentive programs for conservation.
Through public records requests, we have gathered forest stand boundaries hand-drawn by state and federal foresters across several million acres spanning the Pacific Northwest’s diverse ecoregions. These data provide the targets we will use to train modern computer vision models to delineate and classify forest conditions from publicly available aerial and satellite imagery. The maps we generate will be served to landowners and forest managers across Oregon and Washington through an open-source web app designed to auto-populate Forest Management Plans following widely used state and federal templates.
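As a rough illustration of how hand-drawn stand boundaries can become training targets for a segmentation model, the sketch below rasterizes stand polygons onto the pixel grid of an aerial image tile to produce a per-pixel label mask. The file names, the `cover_class` attribute, and the use of geopandas and rasterio are assumptions for illustration only, not a description of the project's actual pipeline.

```python
# Sketch: turn hand-drawn stand polygons into per-pixel label masks
# aligned with an aerial image tile. Paths, the `cover_class` attribute,
# and the integer class codes are hypothetical.
import geopandas as gpd
import rasterio
from rasterio.features import rasterize

def make_label_mask(image_path, stands_path, class_attr="cover_class"):
    """Rasterize stand polygons onto the grid of an aerial image tile."""
    with rasterio.open(image_path) as src:
        transform, shape, crs = src.transform, (src.height, src.width), src.crs

    # Reproject the stand polygons to match the imagery's coordinate system.
    stands = gpd.read_file(stands_path).to_crs(crs)

    # Pair each polygon geometry with its integer class code.
    shapes = zip(stands.geometry, stands[class_attr].astype("int16"))

    # Pixels not covered by any stand polygon receive the fill value 0.
    mask = rasterize(shapes, out_shape=shape, transform=transform,
                     fill=0, dtype="int16")
    return mask

# Example usage with hypothetical file names:
# mask = make_label_mask("tile_0413.tif", "stand_boundaries.gpkg")
```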
Over the Winter Quarter in the Incubator, we organized and formatted a 2TB dataset for easy loading and model training, and set up a framework for model iteration and comparison using TensorBoard and Neptune. With the support of an AI for Earth grant from Microsoft, we spun up a virtual machine on Azure and began training convolutional neural networks built with PyTorch to segment land cover types. We were thrilled to see the model learning to generalize, detecting streams and roads that are absent from the human-drawn annotations but apparent in the aerial imagery and terrain models.
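A minimal sketch of the kind of training loop described above is shown below, assuming a dataset that yields (image, mask) tensor pairs and using a torchvision FCN head as a stand-in segmentation model, with metrics logged to TensorBoard via `torch.utils.tensorboard` (Neptune logging would follow its own API and is omitted here). The model choice, dataset, and hyperparameters are placeholders, not the project's actual configuration.

```python
# Sketch of a PyTorch segmentation training loop with TensorBoard logging.
# The dataset, model architecture, and hyperparameters are placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter
import torchvision

def train(dataset, num_classes, epochs=10, lr=1e-3, device="cuda"):
    loader = DataLoader(dataset, batch_size=8, shuffle=True, num_workers=4)

    # Any pixel-wise classifier works here; a torchvision FCN is one option.
    model = torchvision.models.segmentation.fcn_resnet50(
        num_classes=num_classes
    ).to(device)

    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    writer = SummaryWriter(log_dir="runs/forest-segmentation")

    step = 0
    for epoch in range(epochs):
        model.train()
        for images, masks in loader:       # images: (B, 3, H, W), masks: (B, H, W)
            images, masks = images.to(device), masks.to(device).long()

            logits = model(images)["out"]  # (B, num_classes, H, W)
            loss = criterion(logits, masks)

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

            # Log the training loss so runs can be compared in TensorBoard.
            writer.add_scalar("train/loss", loss.item(), step)
            step += 1

    writer.close()
    return model
```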