TreeTect
Can we use very high-resolution satellite imagery to support and enhance our tree inventory?
Our collaborative work on TreeTect was featured in 2020 at The Hague's annual innovation conference, Impact Fest. Watch the recording of the session.
The experiment
We worked with an open source initiative, Green City Watch, to use Very High-Resolution (VHR) satellite imagery. We used this imagery to remotely “detect” the locations of trees, as well as a few health indicators for those trees.
We started with part of one neighborhood (Nubian Square in Roxbury) to understand:
- what the technology was capable of in an urban context, and
- whether or how it might be a useful tool for internal and external stakeholders on a larger scale.
Why we did this
One of our local nonprofits, Speak For the Trees, has cataloged a third of Boston's public street trees. However, there remain barriers to getting a more timely and robust picture of the full urban tree canopy. We were curious to explore whether the use of a novel technology could help us do so. We wanted to learn by doing.
Often, urban tree inventories are incomplete even when they seem "done". That's because many of them focus only on publicly owned trees. However, in many cities around the world, between 50 and 70 percent of trees and canopy cover are on private land. A city may have limited jurisdiction over these trees but still be interested in the community benefits healthy trees can offer.
So, we engaged in this small-scale, collaborative research prototype to:
- Test whether a technology (Very High-Resolution satellite imagery) can be applied in a novel way to increase the City’s ability to maintain better information about our urban trees and tree canopy (in a cost-effective way)
- Test whether a more frequently updated tree inventory data layer is useful to stakeholders
- Build on the “equitable infrastructure” work of StreetCaster and the Streets’ Cabinet’s continued exploration of equitable street redesigns
What we learned
Feasibility of technology in an urban setting
- The algorithm achieved 98% recall (with 89% precision) at a 10-foot tolerance, and 98% recall (with 92% precision) at a 20-foot tolerance. In our context, recall is the share of actual trees the algorithm located, and precision is the share of the algorithm's detections that correspond to actual trees. High marks on both mean we can use the results reliably in conjunction with our municipal GIS.
- We believe we could have gotten to species identification via satellite imagery alone if we had more time and budget. Usually, you need LiDAR data — which is often expensive — to remotely detect tree species.
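To make the precision and recall figures above concrete, here is a minimal sketch of how detections can be scored against a ground-truth inventory at a distance tolerance. This is an illustrative reimplementation, not Green City Watch's evaluation code; the greedy nearest-match strategy and the coordinate convention are assumptions.

```python
import math

def match_detections(detected, actual, tolerance_ft):
    """Greedily match detected tree points to ground-truth tree locations
    within a distance tolerance (feet). Points are (x, y) tuples in a
    feet-based planar coordinate system. Returns (precision, recall)."""
    unmatched_actual = list(actual)
    true_positives = 0
    for dx, dy in detected:
        # Find the nearest still-unmatched ground-truth tree within tolerance.
        best_idx = None
        best_dist = tolerance_ft
        for i, (ax, ay) in enumerate(unmatched_actual):
            dist = math.hypot(dx - ax, dy - ay)
            if dist <= best_dist:
                best_idx, best_dist = i, dist
        if best_idx is not None:
            true_positives += 1
            unmatched_actual.pop(best_idx)  # each real tree matches at most once
    precision = true_positives / len(detected) if detected else 0.0
    recall = true_positives / len(actual) if actual else 0.0
    return precision, recall
```

Widening the tolerance (10 feet to 20 feet) lets detections that are slightly offset from the true trunk location still count as hits, which is why precision rises from 89% to 92% while recall stays at 98%.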
Relevance of results to stakeholders
- External partners were excited by this small research project. They were supportive of future work in this area.
- There could be an interesting opportunity to improve the accuracy of the algorithm in other Boston neighborhoods and simultaneously engage residents in local tree stewardship. We could do this by inviting residents to contribute “hand annotations” (the data points that inform the training dataset).
Potential future 'equitable infrastructure' applications
- We could provide the City’s Arborists with a more holistic “empty tree pit” inventory. This could help us identify new planting sites proactively, layered with requests we receive from residents via 311.
- Using the algorithm’s health indicators, we could proactively identify dying or dead trees. We would equip the Arborists with a list of trees to remove before they are reported via 311, and potentially before they become at risk of falling over in high winds.
Tree Health Assessment
This displays the health analysis of trees in part of Nubian Square:
- the green are in good health
- the yellow in fair health, and
- the red in poor health.
This is based on spectral vegetation indices computed for each individual tree, as identified by the algorithm.
(Image courtesy: Green City Watch)
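The per-tree health classes above come from spectral vegetation indices. The source does not specify which index was used; as a hedged sketch, one common choice is NDVI, computed from the red and near-infrared bands of the imagery. The thresholds below are illustrative only, not the values used in the project.

```python
import numpy as np

def ndvi_health(red, nir, good=0.6, fair=0.4):
    """Classify a tree's health from the mean NDVI over its crown pixels.

    NDVI = (NIR - Red) / (NIR + Red), ranging roughly from -1 to 1;
    healthy vegetation reflects strongly in near-infrared, so higher
    NDVI suggests better health. Thresholds here are illustrative.
    """
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    ndvi = (nir - red) / (nir + red + 1e-9)  # small epsilon avoids divide-by-zero
    mean_ndvi = float(ndvi.mean())
    if mean_ndvi >= good:
        return "good", mean_ndvi    # shown green on the map
    if mean_ndvi >= fair:
        return "fair", mean_ndvi    # shown yellow
    return "poor", mean_ndvi        # shown red
```

In practice the crown pixels for each tree would be clipped from the VHR imagery using the crown polygons the detection algorithm produced, then summarized per tree as above.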