Paper Search Console


Journal Title

Title of Journal: International Journal of Machine Learning and Cybernetics

Abbreviation: Int J Mach Learn Cyber

Publisher

Springer Berlin Heidelberg


DOI

10.1016/0022-1759(86)90337-6


ISSN

1868-808X


Bi-firing deep neural networks

Authors: Jin-Cheng Li, Wing W. Y. Ng, Daniel S. Yeung, Patrick P. K. Chan
Publish Date: 2013/09/26
Volume: 5, Issue: 1, Pages: 73-83

Abstract

Deep neural networks provide more expressive power than shallow ones. However, current activation functions cannot propagate error efficiently via gradient descent as the number of hidden layers increases. Current activation functions, e.g. the sigmoid, have large saturation regions that are insensitive to changes in a hidden neuron's input and yield gradient diffusion. To relieve these problems, we propose a bi-firing activation function in this work. The bi-firing function is a differentiable function with a very small saturation region. Experimental results show that deep neural networks with the proposed activation function yield faster training, better error propagation, and better testing accuracies on seven image datasets.
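The gradient-diffusion problem the abstract attributes to saturating activations can be illustrated with a short sketch. This is not the paper's bi-firing function (its definition is not reproduced here); it only shows why the sigmoid's gradient, which is at most 0.25 and near zero in the saturation regions, shrinks the backpropagated error geometrically with depth.

```python
import math

def sigmoid(x):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    """Derivative of the sigmoid: s * (1 - s), peaking at 0.25 when x = 0."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Near zero the gradient is at its maximum of 0.25; in the saturation
# region (|x| large) it is close to zero, so the error signal stalls.
print(sigmoid_grad(0.0))  # 0.25
print(sigmoid_grad(5.0))  # ~0.0066

# Backpropagation multiplies roughly one such factor per layer, so even
# the best case (0.25 per layer) decays geometrically with depth.
depth = 10
print(0.25 ** depth)      # ~9.5e-07
```

An activation with a very small saturation region, as proposed in the paper, keeps these per-layer factors away from zero over most inputs, which is the mechanism behind the faster training the abstract reports.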

