Prediction of combustion efficiency using multiple neural networks
Ahmad, Z, Bahadori, A & Zhang, J 2017, 'Prediction of combustion efficiency using multiple neural networks', Chemical Engineering Transactions, vol. 56, pp. 85-90.
To improve the generalisation capability of neural network based models, this paper proposes combining multiple neural networks (MNN), with application to predicting boiler combustion efficiency. Single feed-forward artificial neural networks (FANN) lack robustness because of model overfitting, and combining networks has been introduced by researchers as a way of avoiding the overfitting of a single FANN. In this study, individual FANNs are trained using different training data sets and/or different initial weights, and are then combined. Instead of choosing the best single FANN among the trained networks, all of the neural networks are combined. The result can be described as a network architecture consisting of several sub-models together with a mechanism that combines the outputs of these sub-models. The bootstrap technique was applied to replicate the initial raw data, that is, to create different training and testing data sets. Bootstrapping deals with random sampling to create the data sets for training and testing; by giving good and bad data points an equal chance of being sampled, it improves the generalisation capability of the FANNs. The simple averaging method was applied to combine the MNN. The data for modelling were taken from an energy management handbook, with 66 data points in total: 39 samples were used for training and testing, and 27 samples were kept as unseen data. The results show that the MNN combination with simple averaging slightly improved the prediction of combustion efficiency using two inputs, the stack temperature rise, dT, and the excess air. For the unseen data, the MNN achieved a coefficient of determination (R²) of 0.9999 and a root mean squared error (RMSE) of 0.0105.
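The pipeline described above — bootstrap resampling of the training pool, training several feed-forward networks from different data sets and initial weights, and combining their outputs by simple averaging — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic two-input data set stands in for the 66 handbook data points (which are not reproduced here), and the one-hidden-layer tanh network and its hyperparameters are assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the boiler data: two inputs (stack temperature
# rise dT and excess air, scaled to [0, 1]) and an efficiency-like target.
X = rng.uniform(0.0, 1.0, size=(66, 2))
y = 0.9 - 0.3 * X[:, 0] - 0.2 * X[:, 1] + 0.01 * rng.standard_normal(66)

train_X, train_y = X[:39], y[:39]     # training/testing pool (39 points)
unseen_X, unseen_y = X[39:], y[39:]   # unseen data (27 points)

def train_fann(X, y, hidden=5, epochs=2000, lr=0.05, seed=0):
    """Train one feed-forward ANN (single tanh hidden layer) by batch
    gradient descent on squared error. A sketch only; the paper's exact
    architecture and training algorithm are not specified here."""
    r = np.random.default_rng(seed)
    W1 = 0.5 * r.standard_normal((X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = 0.5 * r.standard_normal(hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)
        err = h @ W2 + b2 - y
        # Backpropagate the mean-squared-error gradient
        gW2 = h.T @ err / len(y)
        gb2 = err.mean()
        gh = np.outer(err, W2) * (1.0 - h ** 2)
        gW1 = X.T @ gh / len(y)
        gb1 = gh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Z: np.tanh(Z @ W1 + b1) @ W2 + b2

# Bootstrap: each network sees a resample (with replacement) of the
# training pool and starts from different initial weights; every trained
# network is kept rather than selecting a single best one.
K = 10
nets = []
for k in range(K):
    idx = rng.integers(0, len(train_y), size=len(train_y))
    nets.append(train_fann(train_X[idx], train_y[idx], seed=k))

# Simple averaging combination of the multiple neural networks (MNN)
mnn_pred = np.mean([net(unseen_X) for net in nets], axis=0)

rmse = np.sqrt(np.mean((mnn_pred - unseen_y) ** 2))
r2 = 1.0 - np.sum((unseen_y - mnn_pred) ** 2) / np.sum(
    (unseen_y - unseen_y.mean()) ** 2)
print(f"unseen RMSE = {rmse:.4f}, R^2 = {r2:.4f}")
```

Simple averaging is the least elaborate combination rule; the benefit comes from the bootstrap replicates disagreeing in their overfitted regions, so their errors partially cancel when averaged.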