Qiangfu Zhao, Professor
Yong Liu, Associate Professor
The main research stream in our lab is computational intelligence. There are three keywords: recognition, learning, and understanding. The goal of our research is to develop learning models that are flexible enough to adapt to changing environments, yet simple enough to be realized, interpreted, and re-used. The final goal is to design a system that can think, and decide what to do and how to grow up based on its own thinking. For this purpose, many approaches have been studied, e.g., neuro-computation, evolutionary computation, reinforcement learning, and so on. Of course, results proposed in conventional symbol-based artificial intelligence are also included. So far we have used or proposed a variety of learning models.
[qf-zhao-01:2004]
Hazem M. El-Bakry and Q. F. Zhao. Face detection using fast neural processors and image decomposition. International Journal of Computational Intelligence, 1(4):313-316, 2004.
In this paper, an approach is presented for reducing the number of computation steps required by fast neural networks during the searching process. The principle of divide and conquer is applied through image decomposition: each image is divided into small sub-images, and each sub-image is then tested separately using a fast neural network. Compared with existing fast neural networks, experimental results show that a large speed-up ratio can be achieved when applying this technique to locate human faces in cluttered scenes.
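To make the divide-and-conquer step concrete, here is a minimal Python sketch (not code from the paper): it tiles an image into sub-images and tests each one independently with an arbitrary detector. The tile size, the non-overlapping tiling, and the `detector` callable are illustrative assumptions; the paper's fast neural processors and their speed-up analysis are not reproduced here.

```python
import numpy as np

def decompose(image, tile_h, tile_w):
    """Yield (top, left, sub_image) tiles of a 2-D grayscale image.
    Non-overlapping tiles are an assumption made for brevity."""
    h, w = image.shape
    for top in range(0, h - tile_h + 1, tile_h):
        for left in range(0, w - tile_w + 1, tile_w):
            yield top, left, image[top:top + tile_h, left:left + tile_w]

def locate_faces(image, detector, tile_h=20, tile_w=20):
    """Test each sub-image separately; `detector` is any callable that
    returns True when its input sub-image contains a face."""
    return [(top, left)
            for top, left, sub in decompose(image, tile_h, tile_w)
            if detector(sub)]
```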
[qf-zhao-02:2004]
Hazem M. El-Bakry and Q. F. Zhao. A modified cross correlation in the frequency domain for fast pattern detection using neural networks. International Journal on Signal Processing, 1(3):188-194, 2004.
Recently, neural networks have shown good results for detecting a given pattern in an image. In our earlier work, we proposed a fast algorithm for pattern detection based on cross correlation in the frequency domain between the input image and the weights of the neural network. In this paper, the algorithm is improved to remove the symmetry condition previously imposed on the input image. Two new ideas are introduced; both accelerate the fast neural network, since there is no longer any need to convert the input image into a symmetric one. The new ideas are verified through both theoretical analysis and experimental results.
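The underlying frequency-domain trick, cross-correlating the input image with a neuron's weight window via the FFT, can be sketched as below. This is the textbook construction, not the authors' modified algorithm; the zero-padding scheme and function name are assumptions.

```python
import numpy as np

def cross_correlate_fft(image, weights):
    """Linear cross-correlation of `image` with `weights` computed in the
    frequency domain.  Correlation is convolution with a flipped kernel,
    so the kernel's spectrum is conjugated instead of flipped spatially."""
    ih, iw = image.shape
    kh, kw = weights.shape
    H, W = ih + kh - 1, iw + kw - 1       # pad to avoid circular wrap-around
    F_img = np.fft.rfft2(image, s=(H, W))
    F_ker = np.fft.rfft2(weights, s=(H, W))
    return np.fft.irfft2(F_img * np.conj(F_ker), s=(H, W))
```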
[qf-zhao-03:2004]
Q. F. Zhao. Recognition, learning and understanding. International Scientific Journal of Computing, 3(1):84-92, 2004.
Symbolic and non-symbolic approaches are two popular categories of machine learning. In general, symbolic approaches can provide comprehensible rules but cannot adapt to changing environments efficiently; non-symbolic approaches, on the contrary, can adapt to changing environments but cannot provide comprehensible rules. In this study, we introduce a hybrid learning model called the neural network tree (NNTree). An NNTree is a decision tree (DT) in which each non-terminal node is an expert neural network (ENN). Experimental results have shown that NNTrees can be used both for learning and for understanding. In this paper, we first summarize our results obtained in recent years, and then propose some topics for future study.
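As a rough illustration of the NNTree structure: a decision tree whose internal nodes delegate the branching decision to a small trained model. The class names and the plain-callable stand-in for the expert neural network are assumptions made for this sketch.

```python
from dataclasses import dataclass
from typing import Callable, List, Union

@dataclass
class Leaf:
    label: int                            # class decided at a terminal node

@dataclass
class Node:
    # In the papers the expert is a small neural network; here it is any
    # callable mapping a feature vector to a child index.
    expert: Callable[[List[float]], int]
    children: List[Union["Node", Leaf]]

def classify(tree: Union[Node, Leaf], x: List[float]) -> int:
    """Walk the tree, letting each node's expert choose the branch."""
    while isinstance(tree, Node):
        tree = tree.children[tree.expert(x)]
    return tree.label

# Example: a one-level tree whose root expert thresholds the first feature.
root = Node(expert=lambda x: 0 if x[0] < 0.5 else 1,
            children=[Leaf(0), Leaf(1)])
assert classify(root, [0.3]) == 0
```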
[qf-zhao-04:2004]
Q. Z. Xu, Q. F. Zhao, W. J. Pei, L. X. Yang, and Z. Y. He. Interpretable neural network tree for continuous-feature data sets. Neural Information Processing - Letters and Reviews, 3(3):77-84, 2004.
The neural network tree (NNTree) is a hybrid learning model. We have previously proposed a genetic algorithm based on multiple-objective optimization for evolving NNTrees, and shown through experiments that an NNTree can be interpreted easily if the number of attributes used in each non-terminal node is limited. One remaining problem is that, for problems with continuous attributes, interpretation is still very complex. In this paper, we propose to solve this problem through self-organized learning of the attributes. We show through experiments that NNTrees built from the training data after self-organized learning are as good as those obtained from the original data.
[qf-zhao-05:2004]
Q. F. Zhao. Design smart NNTrees based on the R4-rule. In T. Shih, editor, Proc. of the IEEE 19th International Conference on Advanced Information Networking and Applications (AINA2005), volume 2, pages 547-551, Tamkang University, Taipei, Mar. 2005. IEEE.
The neural network tree (NNTree) is a hybrid learning model whose overall structure is a decision tree (DT), with each non-terminal node containing an expert neural network (ENN). Generally speaking, NNTrees outperform conventional DTs because the ENNs can extract more complex, and possibly better, features. So far we have studied several genetic algorithms (GAs) for designing NNTrees. These algorithms are computationally expensive, and the NNTrees obtained are often very large. In this paper, we propose a new approach based on the R4-rule, a non-genetic evolutionary algorithm proposed by the author several years ago. The key point is a heuristic method for defining the teacher signals for the examples assigned to a non-terminal node. Once the teacher signals are defined, the ENNs can be trained quickly using the R4-rule. Experiments with several public databases show that the new approach can produce smart NNTrees quickly and effectively.
[qf-zhao-06:2004]
Q. F. Zhao, C. Lu, W. J. Pei, and Z. Y. He. Evolution and Interpretation of MTM-NNTrees. In Proc. IEEE International Conference on Systems, Man and Cybernetics (SMC'04), pages 5702-5707, The Hague, The Netherlands, Oct. 2004. IEEE.
The neural network tree (NNTree) is a hybrid learning model whose overall structure is a decision tree (DT), with each non-terminal node being an expert neural network (ENN). So far we have shown through experiments that NNTrees are not only learnable but also interpretable if the number of inputs for each ENN is limited. NNTrees might therefore be an efficient model for unifying learning and understanding. One important problem is that even if an NNTree is interpretable, the rules extracted from it may not be understandable, because they may contain too many details. To solve this problem, we propose a new type of NNTree in which a multi-template matcher (MTM) is used at each non-terminal node instead of a multilayer perceptron (MLP). In this model, each template can be used as a previous case, so an MTM-NNTree can be understood straightforwardly. In this paper, we provide an evolutionary algorithm for designing MTM-NNTrees, and show through experiments that MTM-NNTrees are as powerful as MLP-NNTrees.
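A multi-template matcher can be pictured as a nearest-template branch selector; the sketch below assumes Euclidean distance and one template per child branch, details the abstract does not specify. Such a matcher could serve as the `expert` in the NNTree sketch given earlier.

```python
import numpy as np

class MultiTemplateMatcher:
    """Route an input to the child whose stored template it most
    resembles (nearest template under Euclidean distance)."""
    def __init__(self, templates):
        self.templates = np.asarray(templates, dtype=float)  # one per branch

    def __call__(self, x):
        d = np.linalg.norm(self.templates - np.asarray(x, dtype=float), axis=1)
        return int(np.argmin(d))          # index of the chosen child

# Example: route inputs to the closer of two prototype vectors.
mtm = MultiTemplateMatcher([[0.0, 0.0], [1.0, 1.0]])
assert mtm([0.2, 0.1]) == 0
```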
[yliu-01:2004]
Y. Liu. A Hybrid Neural Network Learning System. In Proceedings of the Fourth International Conference on Computer and Information Technology (CIT2004), pages 1016-1021. Wuhan University and University of Aizu, IEEE Computer Society, September 2004.
This paper presents a hybrid system for learning and designing neural network ensembles, based on both supervised and unsupervised learning. There are two terms in the performance function: one is optimised by supervised learning, and the other by unsupervised learning. Through supervised learning, each neural network in the ensemble learns the target output as closely as possible from the training data. Through unsupervised learning, all neural networks learn simultaneously to cover different parts of the training data, i.e., to subdivide the whole training set among themselves. The learning behaviour of the hybrid learning system is studied based on the correlations among the individual networks in the ensemble.
[yliu-02:2004]
Y. Liu. How to Find Different Neural Networks by Negative Correlation Learning. In Proceedings of the International Joint Conference on Neural Networks (IJCNN 2005). IEEE Neural Networks Society, IEEE Press, 2005.
Two penalty functions are introduced into negative correlation learning for finding different neural networks in an ensemble. One is based on the average output of the ensemble; the other is based on the classification. The idea of the penalty function based on the average output is to make each individual network's output differ from that of the ensemble on the same input. In comparison, the penalty function based on the classification leads each individual network to assign a different class from that of the ensemble on the same input. Experiments on a classification task show how negative correlation learning generates different neural networks under the two different penalty functions.
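For the average-output case, the standard negative correlation learning loss can be sketched as follows. In this form the penalty p_i = (F_i - F_bar) * sum_{j != i} (F_j - F_bar) simplifies to -(F_i - F_bar)^2, pushing each network away from the ensemble mean. The function and parameter names are invented for illustration, and the classification-based penalty from the paper is not reproduced.

```python
import numpy as np

def ncl_losses(outputs, target, lam=0.5):
    """Per-network negative correlation learning loss for one example.
    outputs[i] is network i's output; lam weights the diversity penalty
    p_i = (F_i - F_bar) * sum_{j != i} (F_j - F_bar) = -(F_i - F_bar)**2."""
    F = np.asarray(outputs, dtype=float)
    F_bar = F.mean()                      # ensemble (average) output
    penalty = -(F - F_bar) ** 2           # average-output-based penalty
    return 0.5 * (F - target) ** 2 + lam * penalty
```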
[yliu-03:2004]
Y. Liu. Generate Different Neural Networks by Negative Correlation Learning. In Lecture Notes in Computer Science. Xiangtan University, Springer, 2005.
This paper studies how to combine supervised learning and unsupervised learning to train a set of neural networks as a neural network ensemble. There are two terms in the performance function: one is the mean-squared error, optimized by supervised learning, and the other is a correlation penalty function, learned by unsupervised learning. Through supervised learning, each neural network in the ensemble learns the target output as closely as possible. Through unsupervised learning, all neural networks learn simultaneously to cover different parts of the training data. The learning behavior of such a hybrid learning system is studied based on the correlations among the individual networks in the ensemble.
[yliu-04:2004]
X. Yao and Y. Liu. Machine learning, page 32. Introductory Tutorials in Optimisation, Decision Support and Search Methodologies. Kluwer Academic Publishers, 2005.
[yliu-05:2004]
Y. Liu, 2004-. Editor
[yliu-06:2004]
Y. Liu, 2005. Co-Chair of the 2005 International Symposium on Intelligence Computation and Applications (ISICA2005)
[yliu-07:2004]
Y. Liu, 2005. Keynote speaker at the 2005 International Symposium on Intelligence Computation and Applications (ISICA2005)
[yliu-08:2004]
Y. Liu, 2005. One of the editors of the book Evolvable Hardware, Springer
[qf-zhao-07:2004]
Asako Nihei. Graduation Thesis: A study on lip segmentation, University of Aizu, 2005. Thesis Advisor: Qiangfu Zhao
[qf-zhao-08:2004]
Sota Harada. Graduation Thesis: Feature extraction for neural network based face recognition, University of Aizu, 2005. Thesis Advisor: Qiangfu Zhao
[qf-zhao-09:2004]
Akihiko Tamura. Graduation Thesis: An experimental study on neural network ensembles, University of Aizu, 2005. Thesis Advisor: Qiangfu Zhao
[qf-zhao-10:2004]
Hiroyuki Kobayashi. Graduation Thesis: A study on extraction of face and eyes, University of Aizu, 2005. Thesis Advisor: Qiangfu Zhao
[qf-zhao-11:2004]
Satomi Nishigaya. Graduation Thesis: A study on image morphing, University of Aizu, 2005. Thesis Advisor: Qiangfu Zhao
[qf-zhao-12:2004]
Takashi Harada. Master Thesis: Feature extraction for Kanji evaluation using GA, University of Aizu, 2005. Thesis Advisor: Qiangfu Zhao
[qf-zhao-13:2004]
Shigeru Haruyama. Master Thesis: Create simple NNTrees with MOO-based GA, University of Aizu, 2005. Thesis Advisor: Qiangfu Zhao
[qf-zhao-14:2004]
Weiwei Du. Master Thesis: An improved R4-rule for designing the smallest NN-MLP, University of Aizu, 2005. Thesis Advisor: Qiangfu Zhao
[qf-zhao-15:2004]
Takaharu Takeda. Master Thesis: A study on retraining of neural network trees, University of Aizu, 2004. Thesis Advisor: Qiangfu Zhao
[yliu-09:2004]
Yutaka Kooriyama. Graduation Thesis: Evolutionary Learning for Robot Using RNN, University of Aizu, 2004. Thesis Advisor: Yong Liu
[yliu-10:2004]
Takumi Nakamura. Graduation Thesis: Ear Segmentation and Recognition, University of Aizu, 2004. Thesis Advisor: Yong Liu
[yliu-11:2004]
Rihito Nagae. Graduation Thesis, University of Aizu, 2004. Thesis Advisor: Yong Liu
[yliu-12:2004]
Yo Takahashi. Master Thesis: Design of Control Strategies for a Multi-agent System, University of Aizu, 2004. Thesis Advisor: Yong Liu