The power grid that delivers electricity to our homes and businesses is outdated and vulnerable to failures. In the past decade, we’ve seen grid failures in Texas that left over 4.5 million homes and businesses without power, wide-scale droughts and wildfires in the western U.S. coupled with rolling blackouts, and even headlines about overgrown trees knocking out power for millions. In 2021, the American Society of Civil Engineers gave our energy sector a C- grade. Plus, incorporating renewable energy sources complicates a utility's energy balance because their energy production is inherently variable.
To prevent widespread failures, the energy sector is deploying smart grids that use advanced sensors and controls to detect and correct imbalances, but sensors and controls alone don't optimize energy sources and distribution networks. Regular monitoring for maintenance needs, risks and changes in demand is also essential.
Satellite data, combined with geospatial datasets and machine learning, helps enhance efficiency and reduce costs in smart grid implementation and management. In fact, the use of geospatial data in smart grid management is projected to grow by 19.6% annually from 2021 to 2026, reaching over $130 billion. With this rate of growth, I believe that leveraging satellite data and geospatial technology will be critical in optimizing smart grid development and upgrades in six key ways.
When energy companies search for the best locations for solar or wind farm projects, geospatial data can help, using factors like solar irradiance, land availability and proximity to substations. One clear way is in determining the distance between an energy source and the nearest substation, which is a significant cost factor. The Energy Institute at the University of Texas reported that it costs nearly $960,000 per mile for a single-circuit 230 kV overhead line and over $3 million per mile for a double-circuit 500 kV overhead line, with the cost only increasing for underground lines. One 2020 study highlights challenges with substation data accuracy, noting that records for existing substations across the globe are often significantly outdated or incomplete. This poor data quality leads to suboptimal investment decisions and increased operational costs. However, during our work using satellite imagery, artificial intelligence and other techniques, we've seen a substantial increase in substation count compared to standard sources.
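To make the cost factor concrete, here is a minimal sketch of how a siting tool might estimate interconnection cost for a candidate site: compute the great-circle distance to the nearest known substation and multiply by a per-mile line cost. The coordinates and the cost constants below are illustrative only (the per-mile figures echo the Energy Institute numbers cited above); a real analysis would use routed line length, terrain and permitting factors, not straight-line distance.

```python
import math

# Illustrative per-mile costs echoing the figures cited above (assumptions)
COST_PER_MILE_SINGLE_230KV = 960_000    # single-circuit 230 kV overhead line
COST_PER_MILE_DOUBLE_500KV = 3_000_000  # double-circuit 500 kV overhead line

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def interconnection_cost(site, substations, cost_per_mile):
    """Estimate line cost from a candidate site to its nearest substation."""
    nearest = min(haversine_miles(site[0], site[1], s[0], s[1]) for s in substations)
    return nearest * cost_per_mile

# Hypothetical candidate site and substation coordinates in west Texas
site = (31.0, -102.0)
substations = [(31.2, -102.1), (30.5, -101.4)]
cost = interconnection_cost(site, substations, COST_PER_MILE_SINGLE_230KV)
```

Even in this toy version, a roughly 15-mile gap to the nearest substation implies an eight-figure interconnection cost, which is why accurate, current substation data matters so much at the screening stage.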
Once a site is identified, construction follows close behind. During land development and buildout, real-time site monitoring gives the business enhanced visibility from above. Satellite imagery can provide an objective source of information on the progress of the new construction project, keeping decision-makers informed as to whether the project is on schedule.
The utility industry has shifted the way it monitors and protects equipment and assets since the widespread adoption of sophisticated satellite imagery. By analyzing changes in their assets and infrastructure over time, including power lines, substations and transformers, utility managers can detect potential issues and plan maintenance or repairs accordingly. This helps reduce downtime and improve reliability. If transmission lines are at risk, such as from new construction or other encroachment on the corridor, those threats can be flagged. The key is accessing sophisticated data, in real time, to ensure optimal results.
One American utility company suffered over $30 billion in liability from fires caused by electricity transmission lines. Satellite imagery is a valuable tool for responding to natural disasters and emergencies, allowing utilities to assess infrastructure damage and prioritize repairs. Companies are using satellite data and analytics to develop resiliency roadmaps, including weather-predictive services, fire risk monitoring and flood modeling. Although deployed sensors and high-definition cameras offer real-time situational awareness, satellite imagery's value is growing because of its scalability and efficiency.
Vegetation management is crucial for utility maintenance and operations, as overgrown trees and shrubs can cause power outages. In 2003, overgrown tree branches triggered an outage in Ohio that left 50 million people without power. That event underscored the need for near real-time monitoring across the grid to catch vegetation-related risks. Monitoring this risk both in near real time and at scale, however, is difficult: the U.S. electric power transmission grid comprises around 700,000 circuit miles of lines. Advances in geospatial analytics, such as LiDAR (light detection and ranging) combined with near-infrared (NIR) imagery from satellites, can assist with vegetation monitoring along transmission corridors, allowing companies to identify potential risks early and take preventive measures against fire hazards.
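One common way NIR imagery supports vegetation monitoring is through an index such as NDVI (normalized difference vegetation index), which contrasts near-infrared and red reflectance to estimate vegetation density. The sketch below is a deliberately simplified, hypothetical example: the reflectance values and the 0.6 threshold are made up, and a production system would work on full raster bands and combine NDVI with LiDAR-derived canopy height before flagging a corridor segment.

```python
# Minimal NDVI sketch over reflectance samples along a transmission corridor.
# Inputs are parallel lists of red and near-infrared band values per pixel.
def ndvi(red, nir):
    """NDVI per pixel: (NIR - red) / (NIR + red), in [-1, 1]."""
    return [(n - r) / (n + r) if (n + r) else 0.0 for r, n in zip(red, nir)]

def flag_dense_vegetation(red, nir, threshold=0.6):
    """Return indices of pixels whose NDVI exceeds a density threshold.

    The 0.6 threshold is illustrative; real programs calibrate it per
    region and season, and pair it with canopy-height data from LiDAR.
    """
    return [i for i, v in enumerate(ndvi(red, nir)) if v > threshold]

# Hypothetical reflectance samples: dense canopy scores high NDVI
red = [0.10, 0.30, 0.05, 0.25]
nir = [0.80, 0.35, 0.70, 0.30]
hotspots = flag_dense_vegetation(red, nir)  # pixels 0 and 2 flagged
```

Flagged segments can then be prioritized for trimming crews, turning a 700,000-circuit-mile inspection problem into a ranked worklist.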
By analyzing imagery and spatiotemporal data, utilities can better understand patterns of energy use and predict future demand. They are then able to plan for peak usage periods and optimize their energy production and distribution processes. As the nation experiences more frequent, intense and longer-duration extreme weather events, including severe temperature and precipitation events, stronger hurricanes and storm surge, droughts and wildfires, energy demand will shift accordingly. In other words, utility companies can't simply take current use rates and project them forward using population growth rates, but must also look at how our changing climate will impact demand. Satellite data products such as temperature models and ozone maps may prove vital for capturing both near-term usage patterns and these longer-term, climate-driven shifts.
Satellite imagery can be used to reduce risk and enhance predictability, visibility and industry efficiency. As demand for safety and efficiency grows, I believe that geospatial analytics and smart grid management will become increasingly important for utility operators and energy providers.