Method | References | Characteristic | Evaluation |
---|---|---|---|
Parametric probabilistic prediction | | Assumes the known data conform to a given probability model; hypothesis testing is required | The probability-model assumptions and the estimated parameters affect the precision of the generated scenarios. |
Nonparametric probabilistic prediction | | No global parametric assumption is needed; simple to compute and widely applicable | Cannot handle massive sample data; data suited to parameter estimation may lose some scenario characteristics. |
MCS sampling | | Samples are close to the actual data | Accuracy is low when the sample is small, and efficiency is low when the sample is large. |
LHS sampling | | Suitable for uniform sampling in multi-dimensional spaces and for small samples; high sampling efficiency | Requires the correlation coefficient matrix (CCM) of the random variables as an input, but the CCM is difficult to extract. |
Copula and its improvements | | Captures the correlation characteristics of weather-sensitive factors | Difficult to handle high-dimensional, complex scenario samples. |
Autoregressive moving average | | Preprocesses the data with mean-variance normalization, which is simple | Prone to over-fitting and pattern-recognition errors; insufficient data diversity. |
Markov stochastic process | | Considers the daily cycle and seasonal variation of scenarios; wide scope of application | Being memoryless, it retains only short-term autocorrelation characteristics. |
Radial basis function | [45] | Fits actual scenarios with high accuracy | When the sample size is large, the many hidden layers and complex network structure lead to low computational efficiency. |
Artificial neural networks | [46] | Stable training process and fast convergence rate | Network design is complex; interpretability is weak; the choice of machine-learning features strongly affects the fitting quality. |
GANs and their improvements | | High accuracy on high-dimensional samples; no manual data labeling needed; captures correlation features | GANs suffer from mode collapse and weak interpretability. |
Variational autoencoder | [47] | Clear mathematical derivation; strong interpretability; captures long-term and short-term characteristics | Can hardly characterize the correlations in historical scenario data. |
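The contrast between MCS and LHS sampling in the table can be illustrated with a minimal sketch (the sample size and dimension below are illustrative assumptions, not values from any cited study): plain Monte Carlo draws points independently and may leave gaps at small sample sizes, while Latin hypercube sampling places exactly one point in each of the n equal-width strata per dimension.

```python
# A minimal sketch contrasting plain Monte Carlo sampling (MCS) with Latin
# hypercube sampling (LHS) for a 2-D uniform space, using NumPy and SciPy.
# n and d are illustrative assumptions.
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(0)
n, d = 16, 2

# MCS: independent uniform draws; points may cluster, leaving gaps.
mcs = rng.random((n, d))

# LHS: each of the n equal-width strata per dimension receives exactly
# one point, so the marginals are covered evenly even for small n.
lhs = qmc.LatinHypercube(d=d, seed=0).random(n)

# Check the stratification property: in every dimension, each of the
# n bins [k/n, (k+1)/n) contains exactly one LHS point.
for dim in range(d):
    bins = np.floor(lhs[:, dim] * n).astype(int)
    assert sorted(bins) == list(range(n))
```

The stratification check is exactly why LHS achieves high sampling efficiency at small n: no bin of any marginal is left empty, which plain MCS cannot guarantee.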
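The copula row's claim of capturing correlation between weather-sensitive factors can be sketched with a Gaussian copula (the 0.8 correlation and exponential marginals below are illustrative assumptions): correlated Gaussians are mapped to correlated uniforms, which are then pushed through arbitrary marginal inverse CDFs while the dependence structure survives.

```python
# A minimal Gaussian-copula sketch: generate correlated uniform samples
# and map them through chosen marginals. The 0.8 latent correlation and
# exponential marginals are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])            # target correlation of latent Gaussians
L = np.linalg.cholesky(corr)

z = rng.standard_normal((5000, 2)) @ L.T  # correlated Gaussian pairs
u = norm.cdf(z)                           # correlated uniforms in (0, 1)

# Apply marginal inverse CDFs, here unit-rate exponentials for both factors.
x = -np.log(1.0 - u)

# The dependence survives the marginal transform.
assert np.corrcoef(u[:, 0], u[:, 1])[0, 1] > 0.6
```

This separation of dependence (the copula) from marginals is what makes the approach attractive for weather-sensitive factors, and the table's caveat also follows: estimating and sampling such a structure becomes hard as the dimension of the scenario sample grows.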