Image: a field of solar panels. Credit: Dennis Schroeder/NREL

After 11 months and nearly 20 meetings, a new regulatory working group report is born. And this one is special.

For those who have been involved in or following California’s nearly two-year Distributed Resources Plan (DRP) proceeding (which is a scaled-back west coast version of the New York Reforming the Energy Vision effort), this report represents an important milestone not only in the proceeding, but also for an issue near and dear to IREC’s heart: “hosting capacity” analyses.

Issued by the Integration Capacity Analysis (ICA) Working Group — convened by the California Public Utilities Commission (CPUC) to guide the development and execution of the California utilities’ hosting capacity demonstration effort — this notably wonky report assesses the findings of the demonstration project and outlines recommended next steps. IREC’s policy and technical team participated actively in all phases of the Working Group, wherein we learned a great deal more about hosting capacity analyses (hence, near and dear).

While perhaps lower on the radar than some other clean energy policy issues, hosting capacity analysis certainly warrants the full attention of stakeholders, policymakers and regulators working to transform the electricity grid. As a national organization that works state-by-state on these important issues, IREC sees hosting capacity analyses as foundational building blocks of the grid of the future.

For the deep dive you’ve been waiting for on hosting capacity, search no further. Herein you will find an overview of hosting capacity analyses, why they matter, a summary of the key findings from the California Working Group effort and important insights that are transferable to other states seeking to enable a more modern grid.

First, let’s start with the basics.

Hosting What?
The “Integration Capacity Analysis,” or ICA, is California’s term for what is more commonly known as a hosting capacity analysis. The point of this type of analysis is to simulate the ability of individual distribution circuits, or even individual “nodes” on a circuit, to accommodate additional distributed energy resources (DERs) while maintaining system safety and reliability and without requiring significant upgrades.

At its core, the ICA is a complex modeling exercise that gathers detailed information about the distribution grid, including the physical infrastructure (the wires, voltage regulating devices, substations, transformers, etc.), the type and performance of load on the grid (load curves showing maximum and minimum load), and the existing generators and load control measures on the grid (including rooftop PV, demand response, etc.).

This data is fed into a model to create a “base case” representing existing grid conditions, and then simulations are run to see how the grid would perform if new DERs were added. The process of developing the baseline model, and the methodology for running the simulations, involves hundreds of different decisions that can significantly shape the outcome, both in terms of the final hosting capacity figure and its accuracy when compared to real-life conditions. So, this is one of those times when it is really important to check the ingredients list on a product before consuming.
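For readers who like to see the mechanics, the sketch below illustrates the general idea in simplified form. It is not the utilities’ ICA implementation: the power-flow routine is a toy stand-in, and the node data, limits and step size are hypothetical placeholders.

```python
# Minimal illustrative sketch of a node-level hosting capacity check.
# NOT the California utilities' ICA code; limits and data are placeholders.

VOLTAGE_LIMIT_PU = 1.05   # hypothetical upper voltage bound (per unit)
THERMAL_LIMIT = 1.00      # hypothetical loading limit (fraction of rating)

def run_power_flow(base_case, node, added_der_kw):
    """Toy stand-in for a real power-flow engine: assumes voltage and
    equipment loading rise linearly with added DER (purely illustrative)."""
    voltage = base_case["voltage_pu"][node] + 0.004 * added_der_kw / 1000
    loading = base_case["loading"][node] + added_der_kw / base_case["rating_kw"][node]
    return voltage, loading

def hosting_capacity_kw(base_case, node, step_kw=10, max_kw=10_000):
    """Increase DER at a single node until a limit is violated; the last
    size that passed every check is the node's hosting capacity."""
    capacity = 0
    for added_kw in range(step_kw, max_kw + step_kw, step_kw):
        voltage, loading = run_power_flow(base_case, node, added_kw)
        if voltage > VOLTAGE_LIMIT_PU or loading > THERMAL_LIMIT:
            break              # first violation: stop here
        capacity = added_kw    # still within limits
    return capacity

# Hypothetical base case for one node on one circuit
base = {
    "voltage_pu": {"node_42": 1.02},
    "loading":    {"node_42": 0.60},
    "rating_kw":  {"node_42": 5_000},
}
print(hosting_capacity_kw(base, "node_42"))  # limited by thermal loading here
```

A real analysis repeats something like this for every node on every circuit, against the full set of power system criteria, which is where the methodological choices discussed below come in.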

A Wonky Working Group is Born
Following the passage of California’s AB 327 (2013), which required the investor-owned utilities to prepare Distributed Resources Plans that identify optimal locations for distributed energy resources on the grid, the CPUC initiated a proceeding in which it required the utilities to prepare a hosting capacity analysis of their systems. At the time, this was new territory for the CPUC, the utilities and involved stakeholders. As such, the CPUC required the utilities to conduct a demonstration project to further develop and test the hosting capacity methodology before it was fully implemented across each utility’s entire distribution system.

The CPUC created the ICA Working Group, open to all interested stakeholders, to monitor and provide input into the development of these demonstration projects.

The Working Group proved unexpectedly popular: nearly 60 stakeholders participated at some point, representing over a dozen companies and organizations, including IREC, the California Office of Ratepayer Advocates and the utilities. The core Working Group consisted of about 15 diverse entities. Nearly three years after the passage of AB 327, the Working Group commenced its efforts in May 2016, the utilities filed their demonstration project results in December 2016, and the Working Group report was filed just last week.

Defining Use Cases for the ICA
Following IREC’s suggestion, the Working Group organized its evaluation of the demonstration project results by first identifying the core “use cases” for the ICA, since knowing how the analysis will be used is essential to evaluating the results. At a high level, the Working Group reached broad consensus that there were two core use cases:

  • Interconnection Process: The first use case for the ICA is in utility interconnection processes, both as an informational tool to guide projects to appropriate locations with a clear understanding of the interconnection options, and as a tool utilities can use directly to make decisions in the interconnection review process.
  • Distribution Planning Process: In this use case, the hosting capacity analysis would be used to help make broader decisions about how to plan for and operate the distribution grid to cost effectively accommodate growing amounts of DERs.

The group discovered that although there was agreement at a high level on the use cases, there were differing opinions on the details of the interconnection use case, and not enough detail on the planning use case to enable the group to fully evaluate whether the methodologies developed would adequately serve those needs. Nonetheless, the group was able to come to consensus on an outline of the interconnection use case, and it agreed to further define the planning use case in coming months.

Lessons Learned
States should begin their hosting capacity analysis process by convening stakeholders to develop a clear and shared vision for how the analysis should be used before the modeling effort begins. It is likely the use cases will evolve over time, but knowing how the analysis will be used will significantly shape the methodological choices that are made and could prevent some need to significantly redesign the methodology later on.

Selecting the Hosting Capacity Methodology
In their initial effort, before the demonstration project began, the California utilities started with two different methodologies. San Diego Gas & Electric (SDG&E) and Southern California Edison (SCE) chose to use what is known as the “iterative” method. Pacific Gas and Electric (PG&E) chose to use what is known as the “streamlined” method, based upon a technique first developed by the Electric Power Research Institute (EPRI). However, in authorizing the demonstration projects, the Commission ordered the utilities to converge around a single methodology.

Since each was known to have pros and cons, the Working Group decided that each utility would test both methodologies in two of their distribution planning areas, and would also apply each methodology to a common reference circuit to enable the utilities to ensure consistent application of the methodologies.

The iterative method “is based on iterations of successive power flow simulations at each node on the distribution system, whereas the streamlined method uses a set of equations and algorithms to evaluate power system criteria at each node on the distribution system.”

It should be noted that there is also a third methodology, known as the “stochastic method,” which is currently in use by Pepco in its mid-Atlantic territories, and that EPRI has further developed its streamlined methodology into something it now calls the “DRIVE” tool. Neither of these was tested in California.
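To give a loose sense of how the streamlined approach differs from running power flows in a loop, the sketch below replaces the simulation with a single closed-form estimate based on a common voltage-rise approximation, ΔV ≈ (R·P + X·Q)/V. To be clear, this is an illustrative simplification with made-up feeder values, not the EPRI streamlined equations, the DRIVE tool or any utility’s actual screen.

```python
# Illustrative streamlined-style estimate (NOT the EPRI/utility equations):
# instead of iterating power flows, solve a simple voltage-rise
# approximation for the DER size that would reach the voltage limit.
import math

def voltage_limited_der_kw(r_ohm, x_ohm, v_nom_v, v_base_pu,
                           v_limit_pu=1.05, power_factor=1.0):
    """Approximate DER size (kW) that raises voltage from v_base_pu to
    v_limit_pu using dV ~= (R*P + X*Q) / V; all values are illustrative."""
    headroom_v = (v_limit_pu - v_base_pu) * v_nom_v
    tan_phi = math.tan(math.acos(power_factor))   # Q = P * tan(phi)
    p_watts = headroom_v * v_nom_v / (r_ohm + x_ohm * tan_phi)
    return p_watts / 1000.0

# Example with made-up impedances for a 12.47 kV feeder segment
print(round(voltage_limited_der_kw(r_ohm=0.5, x_ohm=0.8,
                                   v_nom_v=12_470, v_base_pu=1.02)))
```

The appeal is obvious: one equation per criterion per node is far cheaper than thousands of power-flow solves. But the answers are only as good as the simplifying assumptions behind those equations, which is exactly the accuracy trade-off the demonstration projects set out to measure.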

The final reports the utilities filed in December documented the results of the two methodologies:

  • The streamlined methodology was significantly faster to run (from a computing standpoint); however, it had accuracy issues (both over- and underestimating hosting capacity) in a not-insignificant number of cases, particularly on complex circuits and with respect to two of the four power system criteria evaluated: power quality/voltage and protection.
  • The iterative methodology produced sufficiently accurate results, but running the model was computationally intensive; it would therefore require more resources to deploy and might not be able to be run as often as needed for the type of scenario analysis envisioned for planning.

Since neither methodology was better in every way, the Working Group had to make a choice about which methodology to recommend for full rollout across the entire utility territories. The Working Group had a clearly defined sense of the interconnection use case and, for many participants, this was the more immediate priority (though all agreed on the value of the planning use case). Accuracy is paramount in interconnection decisions, since the safety and reliability of the grid relies upon it and project developers need to make financial decisions based upon their ability to connect at low cost at a particular location. Thus, the Working Group overall recommended use of the iterative methodology alone for the entire interconnection use case, with PG&E including an alternate proposal for a “blended approach.”

PG&E separately proposed to use the streamlined method for the publication of the hosting capacity results in a map (i.e., the informational portion of the use case), but to use the iterative method to make actual interconnection decisions on a project-by-project basis. The non-utility stakeholders, including IREC, determined that this approach would undermine one of the more important goals of the interconnection use case: the ability of potential applicants to accurately determine their interconnection results from the publicly available map when the proposed project falls below the indicated hosting capacity. SDG&E and SCE also agreed that they did not want to deploy both methodologies.

One of the reasons behind the difference in the utilities’ positions relates to the current state of their distribution system model and the type of software and other tools they currently rely on to complete the analysis.

The Working Group determined that it did not have sufficient information about the types of decisions that would be made in the planning use case to enable the same level of detailed evaluation of the methodologies for that case. The group recognized that the iterative methodology might be problematic for use where multiple different scenarios needed to be run system-wide, but also was hesitant to recommend the deployment of two different methodologies.

In addition, if the results were to eventually be used to make specific investment decisions (i.e. to upgrade a particular line section or transformer, etc. vs. to authorize an overall spending plan for types of upgrades), then the inaccuracies in the streamlined method might continue to be problematic.

Lessons Learned
The methodology selected matters immensely and should be carefully considered and evaluated (and well-vetted) before proceeding to a system-wide analysis. The comparative research and results from the California utilities’ demonstration projects and ICA Working Group report offer tremendously valuable information that is now publicly available about two possible methodologies. Other utilities and state commissions should examine and understand the lessons gleaned from the California demonstration projects, as these lessons are likely very applicable across the United States.

Computational Efficiency Refinements and Cost Estimates
During the Working Group process, the utilities identified a couple of methodological changes that could be made to the iterative method to reduce the computational burden while still providing accurate and useful results. Since most of the Working Group recommended the iterative method, these changes were especially important to consider. Two factors were identified as having a particularly significant impact on processing time and costs: the number of hours the load profile should include (i.e., 24, 96, 576, etc.) and the frequency with which the model would be updated (yearly, monthly, weekly, etc.).
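To make the computational stakes concrete, the back-of-the-envelope sketch below shows how these two choices multiply the number of power-flow solves. Every number in it (feeder count, nodes per feeder, solve time) is a hypothetical assumption for illustration, not a figure from the utilities’ estimates, and it assumes a single solve per node-hour even though the iterative method may require several.

```python
# Back-of-the-envelope only: all inputs are hypothetical assumptions.
NODES_PER_FEEDER = 300     # hypothetical
FEEDERS = 3_000            # hypothetical system size
SECONDS_PER_SOLVE = 0.5    # hypothetical time per power-flow solve

def annual_compute_hours(profile_hours, updates_per_year):
    solves = NODES_PER_FEEDER * FEEDERS * profile_hours * updates_per_year
    return solves * SECONDS_PER_SOLVE / 3600

for profile_hours in (24, 96, 576):
    for label, updates in (("yearly", 1), ("monthly", 12), ("weekly", 52)):
        hours = annual_compute_hours(profile_hours, updates)
        print(f"{profile_hours:>3}-hour profile, {label:>7} updates: "
              f"{hours:,.0f} compute-hours per year")
```

In this toy example, the jump from a 24-hour yearly run to a 576-hour weekly run is more than a thousandfold, which helps explain why these two knobs dominated the cost discussion.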

The group requested a base estimate for a plausible scenario for each methodology and asked the utilities to then identify the cost factors associated with a set of defined sensitivities for each scenario. The utilities prepared estimates, but the non-utility members of the Working Group found they lacked sufficient information to be particularly meaningful in guiding the decision making process.

Partly due to the short timeframe, the utilities were unable to isolate the sensitivities or break the costs down into categories, and, in most cases, the high end of the range provided was twice the low end. The Working Group report contains a discussion of how the cost estimates were considered and what conclusions could be drawn from them now, along with recommendations for next steps on how to resolve the cost questions going forward.

The non-utility Working Group members unanimously recommended moving ahead with a 576-hour profile and weekly updates, and asked the Commission to implement this approach for three years along with detailed cost tracking. At the end of that time, they recommended the Commission evaluate the actual costs and consider whether scaling back the granularity was necessary.

Lessons Learned
There remain considerable unknowns about the computing costs of the hosting capacity analysis and about the utilities’ ability to develop a more efficient way to manage the process. However, the Working Group reasoned that since the increased granularity is important to achieving the use case goals, it is likely worth having the California utilities attempt full deployment; a cost cap and an opportunity to reconsider the benefits and costs after a three-year trial period could help avoid cost overruns. Commission oversight will also be important to improve cost efficiencies and ensure increased transparency of costs over the long term.

Additional Technical Details Still to be Resolved
The Working Group also issued recommendations regarding certain aspects of the methodology that should be altered or require further consideration in the coming months. The full list is featured in the Working Group report, but a few are particularly important.

One of the power system criteria evaluated in the demonstration project was “safety/reliability,” though IREC believes it would be more accurate to define this criterion as “operational flexibility” at this time. In many cases, “operational flexibility” was by far the most limiting factor on DER hosting capacity. Though the Commission had directed the utilities to avoid heuristic (i.e., rule-of-thumb) approaches in their methodology, the utilities were unable to identify a method for evaluating safety and reliability impacts that relied on actual power-flow modeling rather than a heuristic measurement.

As a result, the utilities used a rough test to determine whether there would be any limitation on their operational flexibility by examining whether there was any backfeed past a SCADA device. The reasoning here is that if the utilities are constrained in their ability to shift loads and generation between circuits in the case of a fault or emergency condition, then a safety or reliability issue could arise.
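In rough code terms, that heuristic boils down to asking whether the simulated flows reverse direction at any SCADA-monitored device. The sketch below is just one way to picture the check, with hypothetical device names and data, not the utilities’ actual screen.

```python
# Illustrative sketch of a "backfeed past a SCADA device" style check.
# Device names, data structure and sign convention are hypothetical.

def operational_flexibility_limited(simulated_flows_kw, scada_devices):
    """Return True if the simulated case pushes reverse (negative) flow
    through any SCADA-monitored device."""
    return any(simulated_flows_kw[device] < 0 for device in scada_devices)

# Flows after adding a candidate DER (positive = normal direction of flow)
flows = {"recloser_7": 120.0, "switch_12": -35.0, "breaker_3": 410.0}
print(operational_flexibility_limited(flows, ["recloser_7", "switch_12"]))  # True
```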

While the Working Group members understood the utilities’ reasoning behind the need to protect some operational flexibility in order to maintain system safety and reliability, there was no evidence presented to show that absolute protection of operational flexibility necessarily results in meaningful safety and reliability gains. For example, does the utility need 30 different configuration options, or would, say, 20 suffice to protect reliability? Or could the safety and reliability concerns be addressed in a different manner without restricting hosting capacity?

Thus, the Working Group agreed to continued use of the criterion at this time (with a possibility of modifications in the interconnection process), but recommended that the utilities continue to look for more precise ways of identifying safety and reliability risk. The group also flagged for the Commission that there may need to be a broader policy discussion around ways of better addressing operational flexibility when it comes to DER integration.

Another important issue that was tabled for further consideration is how to appropriately model and integrate the use of smart inverters into the hosting capacity analysis. It is widely recognized that the use of smart inverters can help to increase hosting capacity in certain cases and thus any hosting capacity analysis that is intended to replicate field conditions will need to take their capabilities into account as more are deployed on the system.

However, the utilities need more time to work with their software vendors to define a way to do this. The utilities also committed to working with the software providers to identify methods to allow voltage regulating devices to “float” in the model, like they would in the field.
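For a flavor of why smart inverter modeling matters, consider a volt-var style function: as voltage rises, the inverter absorbs reactive power, offsetting some of the voltage rise that would otherwise cap hosting capacity. The breakpoints and maximum reactive output below are placeholder values for illustration, not California’s Rule 21 default settings or any vendor’s implementation.

```python
# Simplified volt-var curve: reactive power as a function of local voltage.
# Breakpoints and q_max_pu are placeholders, not actual default settings.

def volt_var_q(voltage_pu, q_max_pu=0.44):
    """Piecewise-linear volt-var response in per-unit of inverter rating.
    Positive = injecting vars (raises voltage); negative = absorbing vars."""
    if voltage_pu <= 0.96:
        return q_max_pu
    if voltage_pu < 0.99:
        return q_max_pu * (0.99 - voltage_pu) / 0.03    # ramp down to zero
    if voltage_pu <= 1.01:
        return 0.0                                      # deadband
    if voltage_pu < 1.04:
        return -q_max_pu * (voltage_pu - 1.01) / 0.03   # ramp to full absorption
    return -q_max_pu

for v in (0.95, 1.00, 1.03, 1.05):
    print(f"V = {v:.2f} p.u. -> Q = {volt_var_q(v):+.2f} p.u.")
```

Representing this kind of behavior, and letting voltage regulating devices respond dynamically rather than staying fixed, is exactly the kind of refinement the utilities and their software vendors still need to work through.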

Finally, it is important to recognize that while the utilities are aiming to model their entire system down to the nodal level, this effort currently does not include the single-phase lines that connect directly to the customer’s site. In order to use the hosting capacity results to more fully automate the interconnection process, it will be necessary to include these single-phase lines in the analysis, so the Working Group identified this as a high-priority action for the coming year. The Working Group will meet for at least another six months to work through some of these issues.

Lessons Learned
Hosting capacity analyses are complex and multi-faceted, and they represent a new frontier in grid planning and the integration of distributed energy resources. While the important lessons learned from leading states should help other states leapfrog past initial learning gaps, it is likely that these analyses will continue to evolve as new techniques are discovered and new needs or uses are developed.

A True “Working” and Collaborative Effort
In closing, IREC offers some additional observations about the California Working Group process and how it has compared to other efforts we have participated in, both in California and across the 30 or so states in which IREC has intervened over the last 10 years. While the summary above highlighted some of the areas where the Working Group did not quite reach consensus, those points of difference should not be the defining headline of this Working Group report. Rather, this was a process in which the utilities and the stakeholders truly worked collaboratively, agreeing on many crucial points along the way on a very dense and complicated subject, and it will ultimately lead to a truly valuable, if not transformative, tool.

The California utilities should be commended for the pioneering work they have done to test the different methodologies, for their collaboration with each other to develop a methodology capable of producing relatively consistent results, and for their willingness to take into account and respond to stakeholder feedback. This kind of close collaboration between stakeholders and utilities, which in our experience is relatively rare, results in an outcome that most stakeholders can stand behind.

IREC appreciates the time and effort that the utilities, stakeholders, facilitators and the Commission put into making this a truly meaningful process. We hope the Commission moves expeditiously to issue a decision that takes into account the Working Group’s findings and allows the utilities to get to work on deploying the ICA across their systems. We look forward to helping others learn from the many lessons drawn out through this process and to seeing this first-of-its-kind tool deployed to greatly ease DER integration in the Golden State in coming years.