Please use this identifier to cite or link to this item:
http://dspace.cityu.edu.hk/handle/2031/8957
Title: | Kernel Selection in Machine Learning |
Authors: | Chan, Yui Lik |
Department: | Department of Electronic Engineering |
Issue Date: | 2018 |
Supervisor: | Prof. Leung, Andrew C S; Assessor: Dr. Po, Lai Man |
Abstract: | This project aims at improving the algorithm of the Node Incremental Extreme Learning Machine (NI-ELM), a special type of single-hidden-layer feed-forward network (SLFN), in terms of computational complexity and accuracy under fault-free and faulty environments. NI-ELM has been shown to perform better on both training and test datasets than other incremental ELM variants, such as the original I-ELM, CI-ELM, and EM-ELM. However, it lacks a decay parameter to handle weight faults and overfitting, so its test-set performance varies across datasets and is poor in a faulty network. Adopting a block-matrix method and weight decay reduces the computation cost of calculating the Moore-Penrose inverse recursively and improves the generalization of NI-ELM. This project compares the results of three algorithms, I-ELM, the original NI-ELM, and the new NI-ELM, on small and medium-sized classification and regression datasets in both fault-free and faulty networks. In the fault-free network, the training error of the new NI-ELM is similar to that of the original NI-ELM and lower than that of I-ELM, while the test error of the new NI-ELM is the lowest. In the faulty network, both the training and test errors of the new NI-ELM and FT-IELM are much lower than those of the original NI-ELM. |
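
The decay parameter described in the abstract corresponds to a ridge (weight-decay) term added to the ELM output-weight solution. The following is a minimal sketch, not the project's code: the function names, the tanh activation, and the single-batch solve are all assumptions, and the project's new NI-ELM instead updates this solution recursively with a block-matrix formulation as hidden nodes are added one at a time.

```python
import numpy as np

def train_elm_weight_decay(X, T, n_hidden, lam=1e-3, seed=0):
    """Single-hidden-layer ELM with a weight-decay (ridge) regularizer.

    X: (N, d) inputs, T: (N, m) targets, lam: decay parameter.
    Input weights and biases are random and fixed; only the output
    weights beta are solved for, replacing the plain Moore-Penrose
    inverse H^+ T with the regularized solution (H^T H + lam*I)^{-1} H^T T.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression usage: fit y = sin(x) on a small sample.
if __name__ == "__main__":
    X = np.linspace(0, np.pi, 50).reshape(-1, 1)
    T = np.sin(X)
    W, b, beta = train_elm_weight_decay(X, T, n_hidden=20, lam=1e-2)
    print("train MSE:", np.mean((elm_predict(X, W, b, beta) - T) ** 2))
```

The sketch re-solves the regularized least-squares problem from scratch; the computational saving claimed in the abstract comes from avoiding exactly this full recomputation when a new hidden node is appended.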
Appears in Collections: | Electrical Engineering - Undergraduate Final Year Projects |
Files in This Item:
File | Size | Format
---|---|---
fulltext.html | 148 B | HTML
Items in Digital CityU Collections are protected by copyright, with all rights reserved, unless otherwise indicated.