
[1] Xiong Zhihua, Zhu Feng, Shao Huihe. Application of thermal parameter soft sensor in power plant [J]. Journal of Southeast University (English Edition), 2005, 21 (1): 44-47. [doi:10.3969/j.issn.1003-7985.2005.01.010]

Journal of Southeast University (English Edition) [ISSN: 1003-7985 / CN: 32-1325/N]

Volume:
21
Issue:
1
Page:
44-47
Research Field:
Energy and Power Engineering
Publishing date:
2005-03-30

Info

Title:
Application of thermal parameter soft sensor in power plant
Author(s):
Xiong Zhihua 1, Zhu Feng 2, Shao Huihe 1
1 Institute of Automation, Shanghai Jiaotong University, Shanghai 200030, China
2 Henan Electric Power Research Institute, Zhengzhou 450052, China
Keywords:
Gaussian process; soft sensor; sparse approximation; online learning; economical monitoring
PACS:
TK39
DOI:
10.3969/j.issn.1003-7985.2005.01.010
Abstract:
In order to address the problems of invalid thermal parameter measurements and of optimal operation, we present an efficient soft sensor approach based on sparse online Gaussian processes (GP). The approach combines a Bayesian online algorithm with the sequential construction of a relevant subsample of the data that specifies the prediction of the GP model. Through an appealing parameterization and projection techniques based on the reproducing kernel Hilbert space (RKHS) norm, recursions for the effective parameters and a sparse Gaussian approximation of the posterior process are obtained. This sparse representation makes the GP-based soft sensor practical for large datasets and real-time applications. The proposed thermal parameter soft sensor is of importance for the economical operation of the power plant.
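
For readers who want to experiment with the idea, the following is a minimal Python sketch of a sparse online GP regressor in the spirit of the abstract: streamed samples are admitted into a small basis-vector set only when their RKHS residual (a novelty score) exceeds a tolerance, and predictions are made from that basis alone. This is not the authors' algorithm; the class name SparseOnlineGP, the RBF kernel hyperparameters, the novelty tolerance, and the basis budget are illustrative assumptions, and a full Csato-Opper style implementation would additionally project discarded samples onto the existing basis via the RKHS norm rather than drop them.

import numpy as np

def rbf_kernel(A, B, length_scale=1.0, signal_var=1.0):
    # Squared-exponential kernel between the rows of A and B.
    d2 = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2.0 * A @ B.T
    return signal_var * np.exp(-0.5 * d2 / length_scale ** 2)

class SparseOnlineGP:
    """Toy sparse online GP regressor (subset-of-data flavour, illustrative only)."""

    def __init__(self, max_basis=30, noise_var=1e-2, novelty_tol=1e-3,
                 length_scale=1.0, signal_var=1.0):
        self.max_basis = max_basis        # budget on the basis-vector set size
        self.noise_var = noise_var        # observation noise variance
        self.novelty_tol = novelty_tol    # RKHS-residual threshold for admitting a sample
        self.kern = dict(length_scale=length_scale, signal_var=signal_var)
        self.X = None                     # basis inputs, shape (m, d)
        self.y = None                     # basis targets, shape (m,)

    def _K(self, A, B):
        return rbf_kernel(A, B, **self.kern)

    def update(self, x, y):
        x = np.atleast_2d(np.asarray(x, dtype=float))
        if self.X is None:
            self.X = x
            self.y = np.array([y], dtype=float)
            return
        # Novelty score: squared RKHS distance of k(x, .) from the span of the basis.
        k_b = self._K(self.X, x)                                    # (m, 1)
        K_bb = self._K(self.X, self.X) + 1e-8 * np.eye(len(self.X))
        gamma = (self._K(x, x) - k_b.T @ np.linalg.solve(K_bb, k_b)).item()
        if gamma > self.novelty_tol and len(self.X) < self.max_basis:
            self.X = np.vstack([self.X, x])
            self.y = np.append(self.y, y)
        # Otherwise the sample is simply skipped here; the full sparse online GP
        # recursions would project its contribution onto the existing basis instead.

    def predict(self, Xs):
        Xs = np.atleast_2d(np.asarray(Xs, dtype=float))
        K = self._K(self.X, self.X) + self.noise_var * np.eye(len(self.X))
        k_s = self._K(self.X, Xs)                                   # (m, n)
        alpha = np.linalg.solve(K, self.y)
        mean = k_s.T @ alpha
        var = np.diag(self._K(Xs, Xs)) - np.sum(k_s * np.linalg.solve(K, k_s), axis=0)
        return mean, var

# Toy usage with a synthetic one-dimensional "thermal parameter" stream.
rng = np.random.default_rng(0)
X_stream = rng.uniform(-3.0, 3.0, size=(200, 1))
y_stream = np.sin(X_stream[:, 0]) + 0.1 * rng.standard_normal(200)
gp = SparseOnlineGP(max_basis=30, noise_var=0.01)
for x, y in zip(X_stream, y_stream):
    gp.update(x, y)
mean, var = gp.predict(np.linspace(-3.0, 3.0, 5)[:, None])
print(len(gp.X), np.round(mean, 2))

Because the basis set is capped at m vectors, the per-sample cost is bounded by the basis size rather than by the length of the data stream, which is what makes GP-based soft sensing feasible for long-running plant measurements.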

References:

[1] McAvoy T J. Contemplative stance for chemical process [J]. Automatica, 1992, 28(2): 441-442.
[2] MacKay D J C. Introduction to Gaussian processes [R]. Cambridge: Cambridge University, 1998.
[3] Williams C K I. Prediction with Gaussian processes: from linear regression to linear prediction and beyond [A]. In: Jordan M I, ed. Learning and Inference in Graphical Models [C]. Kluwer Academic Press, 1998. 599-621.
[4] Wahba G. Spline models for observational data [M]. Philadelphia: SIAM, 1990.
[5] Andrieu C, de Freitas N, Doucet A, et al. An introduction to MCMC for machine learning [J]. Machine Learning, 2003, 50(1): 5-43.
[6] Csato L. Gaussian processes — iterative sparse approximations [D]. Birmingham: Department of Computer Science and Applied Mathematics, Aston University, 2002.
[7] Csato L, Opper M. Sparse online Gaussian processes [J]. Neural Computation, 2002, 14(3): 641-668.
[8] Smola A J, Scholkopf B. Sparse greedy matrix approximation for machine learning [A]. In: Proceedings of the 17th International Conference on Machine Learning [C]. San Francisco, 2000. 911-918.
[9] Williams C K I, Seeger M. Using the Nystrom method to speed up kernel machines [A]. In: Leen T K, Diettrich T G, Tresp V, eds. Advances in Neural Information Processing Systems [C]. Cambridge: MIT Press, 2001, 13: 682-688.
[10] Seeger M. Bayesian model selection for support vector machines, Gaussian processes and other kernel classifiers [A]. In: Solla S A, Leen T K, Muller K R, eds. Advances in Neural Information Processing Systems [C]. Cambridge: MIT Press, 2000, 12: 603-609.
[11] Scholkopf B, Smola A J. Learning with kernels [M]. Cambridge: MIT Press, 2002.
[12] Seeger M. Gaussian processes for machine learning [R]. Berkeley: University of California, 2004.

Memo

Memo:
Biographies: Xiong Zhihua (1979—), male, graduate student; Shao Huihe (corresponding author), male, professor, hhshao@sjtu.edu.cn.
Last Update: 2005-03-20