Please use this identifier to cite or link to this item:
https://gnanaganga.inflibnet.ac.in:8443/jspui/handle/123456789/15659
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Dilip, Kumar | - |
dc.contributor.author | Ghantasala, G S Pradeep | - |
dc.contributor.author | Rao, D Nageswara | - |
dc.contributor.author | Rathee, Manisha | - |
dc.contributor.author | Bathla, Priyanka | - |
dc.date.accessioned | 2024-05-29T08:51:27Z | - |
dc.date.available | 2024-05-29T08:51:27Z | - |
dc.date.issued | 2024 | - |
dc.identifier.citation | pp. 883-887 | en_US |
dc.identifier.isbn | 9798350383522 | - |
dc.identifier.uri | http://dx.doi.org/10.1109/IC2PCT60090.2024.10486744 | - |
dc.identifier.uri | http://gnanaganga.inflibnet.ac.in:8080/jspui/handle/123456789/15659 | - |
dc.description.abstract | Machine learning models can emulate human intelligence to perform complex tasks effectively. Among them, deep learning models have been applied successfully in several domains. However, this efficacy depends on the effective design and training of the deep learning model, which is a complex task requiring considerable domain expertise. Finding optimal hyperparameter values is crucial for high-performance machine learning models, especially tree-based and neural-network-based models. Internal model parameters, such as the weights of a neural network, are estimated from data sets; hyperparameters, such as the number of layers, layer sizes, and learning rates, are external to the model. Hyperparameter values are typically either set intuitively by domain experts or found by manually and randomly trying possible configurations until acceptable values are reached. Both approaches have severe limitations for machine learning models with many hyperparameters. Several automatic hyperparameter optimization methods have been proposed to reduce the human effort involved in designing machine learning models, and nature-inspired computing has gained considerable popularity in this direction in recent years. In this paper, we apply ant colony optimization, a nature-inspired approach, to hyperparameter optimization of a deep learning model for the prediction of lung cancer. © 2024 IEEE. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Proceedings - International Conference on Computing, Power, and Communication Technologies, IC2PCT 2024 | en_US |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | en_US |
dc.subject | Ant Colony Optimization | en_US |
dc.subject | Combinatorial Optimization | en_US |
dc.subject | Deep Learning Model | en_US |
dc.subject | Hyperparameter Optimization | en_US |
dc.subject | Metaheuristic | en_US |
dc.title | ACO-Based Hyperparameter Tuning of a DL Model for Lung Cancer Prediction | en_US |
dc.type | Article | en_US |
Appears in Collections: Conference Papers
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
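The abstract above describes using ant colony optimization (ACO) to tune the hyperparameters of a deep learning model. Below is a minimal sketch of that idea, assuming a small discrete search space in which each ant picks one candidate value per hyperparameter with probability proportional to pheromone, and pheromone is evaporated and reinforced along the best ant's choices. The hyperparameter names, candidate values, ACO settings, and the `evaluate` stand-in are illustrative assumptions only; they do not reproduce the paper's actual model, dataset, or ACO variant.

```python
# Sketch of ACO-style hyperparameter search over a discrete space (illustrative only).
import random

SEARCH_SPACE = {                      # assumed candidate values per hyperparameter
    "num_layers":    [1, 2, 3, 4],
    "layer_size":    [32, 64, 128, 256],
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "dropout":       [0.0, 0.2, 0.4],
}

def evaluate(config):
    """Stand-in for training the DL model on the lung cancer data and
    returning a validation score; a real run would replace this entirely."""
    score = 1.0 / (1.0 + abs(config["num_layers"] - 3))
    score += 1.0 / (1.0 + abs(config["layer_size"] - 128) / 64)
    score += 1.0 / (1.0 + abs(config["learning_rate"] - 1e-3) * 1e3)
    return score + random.uniform(0.0, 0.1)   # noise mimics training variance

def aco_search(space, n_ants=10, n_iters=20, evaporation=0.3, q=1.0):
    # One pheromone value per (hyperparameter, candidate value) pair.
    pheromone = {k: [1.0] * len(v) for k, v in space.items()}
    best_config, best_score = None, float("-inf")

    for _ in range(n_iters):
        ants = []
        for _ in range(n_ants):
            # Each ant selects one candidate index per hyperparameter,
            # weighted by the pheromone on that candidate.
            choice = {k: random.choices(range(len(v)), weights=pheromone[k])[0]
                      for k, v in space.items()}
            config = {k: space[k][i] for k, i in choice.items()}
            score = evaluate(config)
            ants.append((choice, score))
            if score > best_score:
                best_config, best_score = config, score

        # Evaporate pheromone, then deposit along the iteration-best ant's path.
        for k in pheromone:
            pheromone[k] = [(1.0 - evaporation) * p for p in pheromone[k]]
        top_choice, top_score = max(ants, key=lambda a: a[1])
        for k, i in top_choice.items():
            pheromone[k][i] += q * top_score

    return best_config, best_score

if __name__ == "__main__":
    config, score = aco_search(SEARCH_SPACE)
    print("best config:", config, "score:", round(score, 3))
```

In a real experiment, `evaluate` would train and validate the deep learning model on the lung cancer dataset, which dominates the cost of the search; the ACO loop itself only decides which configurations are worth that training expense.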