Evolution Of Activation Functions for Neural Architecture Search
The introduction of the ReLU function in neural network architectures yielded substantial improvements over sigmoidal activation functions and allowed for the training of deep networks. Ever since, the search for new activation functions in neural networks has been an active research topic. However, to the best of our knowledge, the design of new activation functions has mostly been done by hand. In this work, we propose the use of a self-adaptive evolutionary algorithm that searches for new activation functions using a genetic programming approach, and we compare the performance of the obtained activation functions to ReLU. We also analyze the shape of the obtained activations to see if they have any common traits such as monotonicity or piece-wise linearity, and we study the effects of the self-adaptation to see which operators perform well in the context of a search for new activation functions. We perform a thorough experimental study on datasets of different sizes and types, using different types of neural network architectures. We report favorable results obtained from the mean and standard deviation of the performance metrics over multiple runs.
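The abstract above describes evolving activation functions with genetic programming: candidates are expression trees over a primitive set, varied by mutation, and compared against a ReLU baseline. The following is a minimal illustrative sketch of that idea only; the primitive set, the `random_tree`/`mutate`/`evaluate` helpers, and all parameters are hypothetical stand-ins, not the thesis's actual self-adaptive algorithm or training loop.

```python
import math
import random

# Toy primitive set over which candidate activation trees are built.
UNARY = {
    "tanh": math.tanh,
    "relu": lambda x: max(0.0, x),
    "neg": lambda x: -x,
}
BINARY = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "max": max,
}

def random_tree(depth=2):
    """Build a random expression tree; leaves are the activation's input x."""
    if depth <= 0:
        return "x"
    if random.random() < 0.5:
        return (random.choice(list(UNARY)), random_tree(depth - 1))
    op = random.choice(list(BINARY))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Apply the activation encoded by `tree` to a scalar input x."""
    if tree == "x":
        return x
    if len(tree) == 2:  # unary node
        return UNARY[tree[0]](evaluate(tree[1], x))
    return BINARY[tree[0]](evaluate(tree[1], x), evaluate(tree[2], x))

def mutate(tree, depth=2):
    """Point mutation: replace a randomly chosen subtree with a fresh one."""
    if tree == "x" or random.random() < 0.3:
        return random_tree(depth)
    if len(tree) == 2:
        return (tree[0], mutate(tree[1], depth - 1))
    if random.random() < 0.5:
        return (tree[0], mutate(tree[1], depth - 1), tree[2])
    return (tree[0], tree[1], mutate(tree[2], depth - 1))

# The ReLU baseline the abstract compares against, in tree form:
relu_tree = ("relu", "x")
```

In a full system, each tree would be installed as a layer's activation, the network trained, and validation accuracy used as fitness; the sketch only shows the representation and variation operators.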
Saved in:
| Main Author: | Nader, Andrew |
|---|---|
| Format: | masterThesis |
| Published: | 2020 |
| Subjects: | Computer network architectures; Neural networks (Computer science); Machine learning; Lebanese American University -- Dissertations; Dissertations, Academic |
| Online Access: | http://hdl.handle.net/10725/13847 https://doi.org/10.26756/th.2022.373 http://libraries.lau.edu.lb/research/laur/terms-of-use/thesis.php |
| Field | Value |
|---|---|
| author | Nader, Andrew |
| author_role | author |
| dc.creator.none.fl_str_mv | Nader, Andrew |
| dc.date.none.fl_str_mv | 2020 2020-05-18 2022-07-21T08:40:45Z 2022-07-21T08:40:45Z |
| dc.identifier.none.fl_str_mv | http://hdl.handle.net/10725/13847 https://doi.org/10.26756/th.2022.373 http://libraries.lau.edu.lb/research/laur/terms-of-use/thesis.php |
| dc.language.none.fl_str_mv | en |
| dc.publisher.none.fl_str_mv | Lebanese American University |
| dc.rights.*.fl_str_mv | info:eu-repo/semantics/openAccess |
| dc.subject.none.fl_str_mv | Computer network architectures Neural networks (Computer science) Machine learning Lebanese American University -- Dissertations Dissertations, Academic |
| dc.title.none.fl_str_mv | Evolution Of Activation Functions for Neural Architecture Search |
| dc.type.none.fl_str_mv | Thesis info:eu-repo/semantics/publishedVersion info:eu-repo/semantics/masterThesis |
| description | The introduction of the ReLU function in neural network architectures yielded substantial improvements over sigmoidal activation functions and allowed for the training of deep networks. Ever since, the search for new activation functions in neural networks has been an active research topic. However, to the best of our knowledge, the design of new activation functions has mostly been done by hand. In this work, we propose the use of a self-adaptive evolutionary algorithm that searches for new activation functions using a genetic programming approach, and we compare the performance of the obtained activation functions to ReLU. We also analyze the shape of the obtained activations to see if they have any common traits such as monotonicity or piece-wise linearity, and we study the effects of the self-adaptation to see which operators perform well in the context of a search for new activation functions. We perform a thorough experimental study on datasets of different sizes and types, using different types of neural network architectures. We report favorable results obtained from the mean and standard deviation of the performance metrics over multiple runs. |
| eu_rights_str_mv | openAccess |
| format | masterThesis |
| id | LAURepo_664f3edea6c5fa156fae77d249aba407 |
| network_acronym_str | LAURepo |
| network_name_str | Lebanese American University repository |
| oai_identifier_str | oai:laur.lau.edu.lb:10725/13847 |
| physical description | 1 online resource (xii, 108 leaves): col. ill. Bibliography: leaf 98-108. |
| status_str | publishedVersion |
| title | Evolution Of Activation Functions for Neural Architecture Search |