Please use this identifier to cite or link to this item: https://gnanaganga.inflibnet.ac.in:8443/jspui/handle/123456789/5540
Full metadata record
DC Field                  Value  Language
dc.contributor.author     Kumar, Dilip  -
dc.contributor.author     Ghantasala, G. S. Pradeep  -
dc.contributor.author     Rathee, Manisha  -
dc.contributor.author     Kallam, Suresh  -
dc.contributor.author     Bathla, Priyanka  -
dc.date.accessioned       2024-01-31T09:58:37Z  -
dc.date.available         2024-01-31T09:58:37Z  -
dc.date.issued            2023  -
dc.identifier.isbn        9798350341737  -
dc.identifier.uri         https://doi.org/10.1109/CSET58993.2023.10346225  -
dc.identifier.uri         http://gnanaganga.inflibnet.ac.in:8080/jspui/handle/123456789/5540  -
dc.description.abstract   Machine learning models have been successfully applied in numerous fields. Training a model is the most important aspect of machine learning for its successful application to a problem. To improve the training of a machine learning model, and thereby its performance, the selection of features and the setting of optimal parameters are crucial. Two kinds of parameters must be dealt with: internal and external parameters. Internal parameters, such as the weights of a neural network, are model parameters that can be estimated from the data set. Hyperparameters, such as the learning rate, layer size, number of layers, and loss function, are external parameters; their values cannot be determined from the data set and they are not part of the model itself. They are typically set by a domain expert or through trial-and-error techniques until acceptable values are reached. However, these techniques are highly time-consuming and cannot guarantee optimal hyperparameter values. In recent years, various metaheuristic techniques have been applied to determine optimal hyperparameter values for machine learning models. In this paper, we conduct a brief comparative study of a few popular metaheuristic approaches applied to hyperparameter optimization for various machine learning models, considering several evaluation measures in the comparative analysis for deep learning models. © 2023 IEEE.  en_US
dc.language.iso           en  en_US
dc.publisher              2023 International Conference on Computer Science and Emerging Technologies, CSET 2023  en_US
dc.subject                Deep Learning  en_US
dc.subject                Genetic Algorithms  en_US
dc.subject                Ant Colony Optimization  en_US
dc.subject                Hyperparameter Optimization  en_US
dc.subject                Metaheuristic Techniques  en_US
dc.subject                Particle Swarm Optimization  en_US
dc.title                  A Brief Comparative Study of Metaheuristic Approaches for Hyperparameter Optimization of Machine Learning Model  en_US
dc.type                   Article  en_US
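
The abstract above distinguishes internal parameters (e.g. neural-network weights, estimated from the data set) from external hyperparameters (e.g. learning rate and layer sizes) whose values metaheuristics can search for. As a hedged illustration of that idea, and not the paper's own method or code, the following minimal Python sketch runs a small genetic-algorithm-style search over two hyperparameters of a scikit-learn MLPClassifier on a synthetic data set; the population size, value ranges, and mutation scheme are arbitrary assumptions chosen only for brevity.

# Hypothetical sketch (not from the paper): genetic-algorithm-style search over
# two external hyperparameters (learning rate, hidden-layer size) that, unlike
# the network's internal weights, cannot be estimated directly from the data.
import random

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

random.seed(0)
X, y = make_classification(n_samples=400, n_features=20, random_state=0)

def fitness(lr, hidden):
    """Mean 3-fold cross-validated accuracy for one (learning rate, hidden units) candidate."""
    model = MLPClassifier(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                          max_iter=300, random_state=0)
    return cross_val_score(model, X, y, cv=3).mean()

# Initial population: random (learning rate, hidden units) pairs; ranges are arbitrary.
population = [(10 ** random.uniform(-4, -1), random.randint(8, 64)) for _ in range(6)]

for generation in range(5):
    # Selection: rank candidates by fitness and keep the best three as parents.
    scored = sorted(population, key=lambda c: fitness(*c), reverse=True)
    parents = scored[:3]
    children = []
    while len(parents) + len(children) < len(population):
        a, b = random.sample(parents, 2)
        # Crossover (take each gene from either parent) plus a small random mutation.
        lr = random.choice([a[0], b[0]]) * 10 ** random.uniform(-0.2, 0.2)
        hidden = max(4, random.choice([a[1], b[1]]) + random.randint(-8, 8))
        children.append((lr, hidden))
    population = parents + children

best_lr, best_hidden = max(population, key=lambda c: fitness(*c))
print(f"best learning_rate_init ~ {best_lr:.4g}, best hidden units = {best_hidden}")

Each generation keeps the best candidates and perturbs copies of them, which mirrors the selection, crossover, and mutation steps that the metaheuristics surveyed in the paper (genetic algorithms, particle swarm optimization, ant colony optimization) build on.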
Appears in Collections: Conference Papers

Files in This Item:
There are no files associated with this item.

