
Table 1 Summary and evaluation of new energy uncertainty modeling methods

From: Planning of distributed renewable energy systems under uncertainty based on statistical machine learning

| Method | References | Characteristics | Evaluation |
| --- | --- | --- | --- |
| Parametric probability prediction | [22, 23, 24] | Assumes the data follow a known probability model; hypothesis testing is required | The assumed probability model and the estimation of its parameters determine the accuracy of the generated scenarios |
| Nonparametric probability prediction | [26, 27, 56] | No global parametric assumptions are needed; computation is simple and applicability is wide | Cannot handle massive sample data; data that suit parametric estimation may lose some scenario characteristics |
| MCS sampling | [28, 29, 57] | Generated samples are close to the actual data | Accuracy is low for small samples, and efficiency is low for large samples |
| LHS sampling | [30, 31] | Suited to uniform sampling in multi-dimensional spaces and to small samples; high sampling efficiency | Requires the correlation coefficient matrix (CCM) of the random variables as an input, but the CCM is difficult to extract |
| Copula and its improvements | [32, 33, 34, 36, 37, 58] | Captures the correlation characteristics of weather-sensitive factors | Difficult to handle high-dimensional, complex scenario samples |
| Autoregressive moving average (ARMA) | [38, 40, 59] | Data are preprocessed with simple mean-variance normalization | Prone to over-fitting and pattern-recognition errors; insufficient data diversity |
| Markov stochastic process | [41, 44] | Accounts for the daily cycle and seasonal variation of scenarios; wide scope of application | Memoryless, so only short-term autocorrelation characteristics are retained |
| Radial basis function | [45] | Fits actual scenarios with high accuracy | For large samples, many hidden layers and a complex network structure lead to low computational efficiency |
| Artificial neural networks | [46] | Training is stable and convergence is fast | Network design is complex; interpretability is weak; feature selection strongly affects the fitting quality |
| GANs and their improvements | [48, 55, 60, 61, 62, 63] | High accuracy on high-dimensional samples; no manual data labelling needed; captures correlation features | Suffer from mode collapse and weak interpretability |
| Variational autoencoder | [47] | Clear mathematical derivation; strong interpretability; captures both long-term and short-term characteristics | Can hardly characterize the correlation of historical scenario data |
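
To make the sampling rows of the table concrete, the minimal sketch below contrasts plain Monte Carlo sampling (MCS) with Latin hypercube sampling (LHS) on a generic two-dimensional uncertain input. It is not taken from the referenced works; the sample size, dimensionality, and the numpy-based implementation are illustrative assumptions.

```python
# Minimal sketch (not from the cited studies): MCS versus LHS for a 2-D
# uncertain input (e.g. wind speed and solar irradiance mapped to uniform
# margins). The 200-sample size and 2 dimensions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=0)
n_samples, n_dims = 200, 2

# MCS: independent uniform draws; clusters and gaps appear for small n.
mcs = rng.random((n_samples, n_dims))

# LHS: stratify each dimension into n equal-probability bins, draw one point
# per bin, then shuffle the bin order independently per dimension.
def latin_hypercube(n, d, rng):
    samples = np.empty((n, d))
    for j in range(d):
        # one draw inside each of the n strata [i/n, (i+1)/n)
        samples[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return samples

lhs = latin_hypercube(n_samples, n_dims, rng)

# Stratification guarantees every marginal bin is covered the same number of
# times, which is why LHS needs fewer samples than MCS for comparable coverage.
for name, s in (("MCS", mcs), ("LHS", lhs)):
    counts = np.histogram(s[:, 0], bins=10, range=(0, 1))[0]
    print(f"{name}: per-bin counts of dim 0 = {counts}")
```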
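
The memorylessness noted in the Markov stochastic process row can also be illustrated with a short sketch: the next state depends only on the current one, so only short-term autocorrelation survives. The three-state transition matrix, power levels, and 24-hour horizon below are hypothetical placeholders, not values from the cited studies.

```python
# Minimal sketch (illustrative, not the paper's model): generating a daily
# wind-power scenario with a first-order Markov chain over discretized power
# levels. The transition matrix below is made up; in practice it would be
# estimated from historical data by counting observed transitions.
import numpy as np

rng = np.random.default_rng(seed=1)

states = np.array([0.2, 0.5, 0.8])            # normalized power levels (assumed)
P = np.array([[0.70, 0.25, 0.05],             # hypothetical transition matrix
              [0.20, 0.60, 0.20],
              [0.05, 0.25, 0.70]])

def markov_scenario(hours, start_state, P, rng):
    """Sample a state path; the next state depends only on the current one,
    which is exactly why only short-term autocorrelation is preserved."""
    path = [start_state]
    for _ in range(hours - 1):
        path.append(rng.choice(len(states), p=P[path[-1]]))
    return states[np.array(path)]

scenario = markov_scenario(hours=24, start_state=1, P=P, rng=rng)
print(scenario)
```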
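
Finally, the copula row refers to coupling weather-sensitive variables through their ranks while keeping each variable's own marginal distribution. The sketch below uses a Gaussian copula as an assumption (the cited works may use other copula families), with made-up Weibull and Beta marginals and a 0.6 correlation.

```python
# Minimal sketch (illustrative assumption, not the paper's method): a Gaussian
# copula that couples wind and PV samples so their rank correlation is
# preserved while each keeps its own marginal. Marginals and correlation are
# placeholders, not fitted values from the cited references.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2)
n = 1000
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])

# 1) correlated standard-normal draws, 2) map to uniforms via the normal CDF,
# 3) push the uniforms through each variable's inverse marginal CDF.
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=n)
u = stats.norm.cdf(z)
wind = stats.weibull_min(c=2.0, scale=8.0).ppf(u[:, 0])   # wind speed [m/s]
pv = stats.beta(a=2.0, b=2.0).ppf(u[:, 1])                # normalized PV output

rho, _ = stats.spearmanr(wind, pv)
print("sample rank correlation:", rho)
```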