Just over a decade ago, Sandia National Laboratories founded the PV Performance Modeling Collaborative (PVPMC), an effort to increase transparency and accuracy in PV system performance modeling by bringing stakeholders together to improve modeling practices. Joshua Stein, senior scientist at Sandia, thinks it is time for a similar effort to shed light on data analytics for PV plant performance monitoring.
“There is a lack of standardization and harmonization in the way people are talking about fault analytics and monitoring,” he explained to pv magazine. Currently, some companies are offering services that include data analytics and fault identification and classification based on monitoring data from PV power plants. However, these companies each use their own set of failure definitions, which means that it is very difficult to compare and contrast the value of these services.
“Imagine if every doctor or hospital you visited used their own set of diagnoses and clinical practices,” said Stein. “It would be very difficult and confusing to get a second opinion. Most mature industries standardize their definitions of faults and mitigation procedures. Solar PV is not quite there yet.”
The introduction of machine learning techniques to automate fault detection in PV systems has further muddied the picture. Stein was careful to clarify that he is not against machine learning. “Automated ML-based fault detection is beneficial in that it can lead to highly scalable analytics,” he said.
Rethinking KPIs
Technical key performance indicators (KPIs) are essential for evaluating PV power plants, from the development stage to contractual agreements between asset owners and O&M providers.
“While IEC and ASTM standards define some KPIs, such as performance ratio (PR) and capacity tests, their calculation methods often vary or rely on user interpretation, leading to uncertain results,” said Stein’s Sandia Labs colleague Marios Theristis.
“For instance, there is significant flexibility in data handling, which can introduce biases that impact contracts and financial decisions. Without clear KPI definitions and harmonized calculations, contracts may be unfairly affected – not due to actual under-performance or over-performance, but because of calculation bias,” he added.
Bad data is data that cannot be trusted: it was obtained through inadequate means, contains errors, or has not been cleaned. This creates problems for PV stakeholders who rely on it to understand their system’s actual performance.
Standard industry practice is for asset owners to subcontract an O&M provider to handle operation and maintenance, under a contractual agreement dictating that the power plant be maintained appropriately and meet certain KPIs.
“If you have bad data quality, then there will be some bias introduced in the KPI estimation. This bias can be positive, or it can be negative. In a hypothetical scenario where the PR bias is negative, then the O&M provider would be unfairly paying penalties caused by data quality, and not under-performance,” said Theristis.
This also works in reverse where an O&M provider could benefit, due to bias caused by data quality and not plant over-performance. Theristis continued: “In some cases, the operational PR is compared to the pre-construction PR generated by a PV design software, leading to an apples-to-oranges comparison: Is the PR difference due to an overly optimistic pre-construction simulation or actual underperformance?”
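The kind of calculation bias Theristis describes is easy to reproduce. The sketch below uses made-up hourly numbers and a simplified IEC 61724-style performance ratio formula; the plant size, data values, and filtering threshold are all illustrative assumptions. It shows how a single data-handling choice, filtering out low-irradiance hours, shifts the headline PR without any change in the plant itself:

```python
# Hypothetical hourly records (plane-of-array irradiance in W/m^2, AC energy
# in kWh) for a nominal 100 kW array. All values are illustrative only.
records = [
    (50, 2.0), (200, 15.0), (400, 34.0), (600, 54.0),
    (800, 74.0), (1000, 92.0), (900, 83.0), (300, 25.0),
]

P_STC_KW = 100.0   # nameplate rating at Standard Test Conditions
G_STC = 1000.0     # STC reference irradiance, W/m^2

def performance_ratio(data, min_irradiance=0.0):
    """PR = actual AC energy / irradiance-weighted reference yield
    (simplified IEC 61724 style), after an optional low-light filter."""
    kept = [(g, e) for g, e in data if g >= min_irradiance]
    reference_kwh = sum(g / G_STC * P_STC_KW for g, _ in kept)  # 1 h steps
    actual_kwh = sum(e for _, e in kept)
    return actual_kwh / reference_kwh

pr_all = performance_ratio(records)                           # keep everything
pr_filtered = performance_ratio(records, min_irradiance=100)  # drop low light
```

Both numbers are “the PR” of the same plant over the same period, yet the filtered value comes out higher, because low-light hours, where inverters run less efficiently, were silently excluded. Whether that filter is applied is exactly the sort of choice that contracts rarely pin down.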
Program transparency
Theristis has begun a new program to tackle these issues, the PV O&M Analytics Collaborative (PVMAC). He introduced the project at the EU PVSEC conference held in Vienna in September 2024.
“Our objective is to create a collaborative network to improve transparency in PV analytics software and services, engage with monitoring/analytic companies, O&M providers, asset owners, insurers, and help these stakeholders come together to agree on what standards are needed and improve the overall market’s transparency,” he said.
The team is also using Sandia Labs’ supercomputers to run simulations across the United States to reduce uncertainties in KPI estimation.
PVMAC is still very new, but over the next few months the researchers will meet with industry to carry out interviews and tests such as blind analytic comparisons. This involves giving stakeholders real or synthetic performance data and asking them to identify and classify faults and calculate KPIs.
“Blind comparisons are a great way to evaluate the state of the industry practice and the consistency between different providers.
“We expect that these comparisons will highlight discrepancies in how different companies define faults and calculate KPIs,” said Stein.
“We plan to host dedicated sessions on O&M analytics, failure and KPI harmonization at our upcoming PVPMC workshops,” added Theristis.
Standardizing performance
The two scientists hope that shining light on these inconsistencies will make it easier to build consensus on the steps needed for standardization. For example, an O&M provider that offers a performance guarantee could be incentivized not to clean irradiance sensors well, because a dirty pyranometer makes PV performance appear better than it actually is. The goal for Stein and Theristis is to quickly determine why a PV plant is under-performing and to suggest strategies for addressing the problems.
A lot is at stake: even a small percentage of underperformance at a large PV plant results in significant financial losses. Stein reckons that measuring the performance of a PV system is not so different from measuring the health of a person.
“You collect data about your patient, such as vital signs and medical history, and if these indicate a more serious problem, you order more tests. At least in medicine, standards are pretty much international, so it doesn’t matter where you go in the world; you will be evaluated in a similar manner.
“I’m hoping that with PV, we can have a similar system. The oil and gas industry, for instance, has been standardized. If you go to an oil rig or a gas turbine in any country in the world, the standards are similar,” he said.
Tangled web
Poor power plant management can also pose problems. Theristis warned that although many O&M products and services are available, asset owners may not have “a quantitative knowledge” of the potential benefits of investing in data quality, O&M, and analytic capabilities. “These solutions have not achieved transparency and lack independent validation,” he said.
Using supervised machine learning methods for classification is preferable to unsupervised, according to Stein, because the former allows you to tell the algorithm what the categories are.
“But many people opt for unsupervised, which basically says there’s a problem without necessarily specifying the problem,” said Stein. “It can get tangled up in clustering algorithms which say all these problems are similar somehow, and then if you use the same practice on multiple data sets you can end up with clustered data, but it is clustered for different reasons.
“That’s one of the dangers of machine learning; it works for a particular data set, but it’s hard to extend it to a different data set that you haven’t trained it on.”
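Stein’s distinction can be sketched in a few lines. In the toy example below, the fault names, feature choices, and numbers are illustrative assumptions, not an industry taxonomy: a supervised classifier returns a named diagnosis drawn from labels someone defined in advance, while a minimal k-means clustering only says which events look alike, without saying what any group means.

```python
import math

# Toy feature vectors per fault event: (power ratio, DC voltage ratio)
# relative to expected values. The labels stand in for the shared fault
# taxonomy the article argues for; the names are illustrative only.
labeled = [
    ((0.50, 1.00), "string_outage"),
    ((0.55, 0.98), "string_outage"),
    ((0.85, 1.00), "soiling"),
    ((0.88, 0.99), "soiling"),
]

def supervised_classify(x, training):
    """Nearest-neighbour classifier: returns a *named* fault class."""
    return min(training, key=lambda t: math.dist(x, t[0]))[1]

def unsupervised_cluster(points, k=2, iters=20):
    """Minimal k-means (k=2): groups points but only returns anonymous
    cluster ids; nothing says what each cluster represents."""
    # Deterministic seeding: first point, then the point farthest from it.
    centers = [points[0], max(points, key=lambda p: math.dist(p, points[0]))]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: math.dist(p, centers[i]))].append(p)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return [min(range(k), key=lambda i: math.dist(p, centers[i])) for p in points]

event = (0.52, 0.99)
diagnosis = supervised_classify(event, labeled)                  # a named fault
cluster_ids = unsupervised_cluster([p for p, _ in labeled] + [event])  # just ids
```

The supervised path can only work, of course, if the label set exists in the first place, which is precisely the standardization gap Stein describes.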
If machine learning is to be used effectively for fault detection, the industry must first agree on how those faults are defined. Not all companies selling and using AI or machine learning solutions to identify faults apply these methods consistently and transparently, Stein said.
“What we’ve seen is that not all companies are doing the same thing. Some companies are doing very good work and have very valuable products, but there’s nobody out there actually coming up with the standards and validation so those companies can be properly valued, right?”
The scientist warned that ‘AI hype’ might be to blame. “Many companies advertise that their software uses AI technology, but few are willing to describe exactly how it is being used. And so, each company has their own sort of ‘diagnostic manual’ for what can go wrong with the PV system,” he said.
“Artificial intelligence is just a buzzword sometimes.”