Optimization problems using the particle swarm optimization algorithm
Saved in:

| First author: | |
|---|---|
| Format: | bachelorThesis |
| Language: | eng |
| Published: | 2023 |
| Subjects: | |
| Online Access: | http://repositorio.yachaytech.edu.ec/handle/123456789/592 |
Abstract:

Several techniques and models exist for training artificial neural networks, such as the multi-layer perceptron (MLP) architecture, which performs a non-linear mapping between the inputs and outputs of the network. Backpropagation is the traditional algorithm for training a neural network; in recent years, however, nature-inspired metaheuristic algorithms have also been used to optimize ANN parameters. A popular algorithm for this task is particle swarm optimization (PSO), which has a quantum variant (QDPSO). This thesis therefore proposes integrating QDPSO into a multi-layer perceptron for classification problems and compares it with PSO, PSO-bound, L-BFGS, Adam, and SGD. The contributions of this work are: the architecture and integration of QDPSO; validation of the proposed model against metaheuristic- and gradient-based optimizers on benchmark datasets; and an analysis of training behavior as the number of classes and samples in the circle dataset increases. In addition, we propose a technique for image classification that uses Isomap as a dimensionality-reduction algorithm; Isomap reduces the number of image features for the input layer by a factor of six. It is also compared with MDS, t-SNE, and PCA on the iris and breast cancer datasets. Finally, the validation and comparison results show that the architecture and technique proposed in this thesis classify the benchmark and MCW datasets well. Moreover, the QDPSO optimizer converges faster and behaves well during training on balanced datasets.
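To illustrate the core idea the abstract describes, here is a minimal sketch of training an MLP with plain (non-quantum) global-best PSO instead of gradient descent, on a toy circle-style dataset. All names, hyperparameters, and the network size are illustrative assumptions, not the thesis's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class "circle"-style data: label = point lies inside the unit circle.
X = rng.uniform(-2, 2, size=(200, 2))
y = (np.linalg.norm(X, axis=1) < 1.0).astype(float)

H = 8                        # hidden units (assumed, for illustration)
DIM = 2 * H + H + H + 1      # W1 (2xH) + b1 (H) + w2 (H) + b2 (1)

def unpack(p):
    """Split a flat particle vector into MLP weights."""
    W1 = p[:2 * H].reshape(2, H)
    b1 = p[2 * H:3 * H]
    w2 = p[3 * H:4 * H]
    b2 = p[-1]
    return W1, b1, w2, b2

def loss(p):
    """Binary cross-entropy of the MLP encoded by particle p."""
    W1, b1, w2, b2 = unpack(p)
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    z = h @ w2 + b2                          # output logit
    prob = 1.0 / (1.0 + np.exp(-z))          # sigmoid
    eps = 1e-9
    return -np.mean(y * np.log(prob + eps) + (1 - y) * np.log(1 - prob + eps))

# Standard global-best PSO over the flattened weight vector.
N_PARTICLES, ITERS = 30, 200
w, c1, c2 = 0.72, 1.49, 1.49                 # commonly used PSO coefficients
pos = rng.normal(0, 1, size=(N_PARTICLES, DIM))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((2, N_PARTICLES, DIM))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

W1, b1, w2, b2 = unpack(gbest)
pred = (1 / (1 + np.exp(-(np.tanh(X @ W1 + b1) @ w2 + b2))) > 0.5).astype(float)
acc = (pred == y).mean()
print(f"final loss {loss(gbest):.3f}, train accuracy {acc:.2f}")
```

The quantum variant (QDPSO) replaces the velocity update with a position sampled from a quantum delta-potential-well model around an attractor, but the swarm bookkeeping (personal bests, global best) is the same as above.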
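The abstract also describes reducing input features with Isomap before classification and comparing it with other reducers such as PCA. A hedged sketch of that step on the iris dataset, using scikit-learn; the component count, neighbor counts, and the k-NN classifier are assumptions for illustration, not the thesis's setup.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

scores = {}
for name, reducer in [("Isomap", Isomap(n_components=2, n_neighbors=10)),
                      ("PCA", PCA(n_components=2))]:
    Z_tr = reducer.fit_transform(X_tr)      # learn the embedding on training data
    Z_te = reducer.transform(Z_tr if False else X_te)  # out-of-sample projection
    clf = KNeighborsClassifier(n_neighbors=5).fit(Z_tr, y_tr)
    scores[name] = clf.score(Z_te, y_te)
    print(f"{name}: test accuracy {scores[name]:.2f}")
```

Fitting the reducer on the training split and projecting the test split with `transform` keeps the comparison honest; both reducers support out-of-sample transformation in scikit-learn.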