Applied and Computational Engineering
- The Open Access Proceedings Series for Conferences
Series Vol. 2, 22 March 2023
* Author to whom correspondence should be addressed.
Advances in deep learning have made automatic feature extraction and the processing of large volumes of data possible. Transformer-based language models such as Generalized Autoregressive Pre-training for Language Understanding (XLNet) and Bidirectional Encoder Representations from Transformers (BERT) have been proposed to pre-train on large corpora and capture language features for sentiment classification tasks. These language models learn context in both directions. In the proposed work, we fine-tuned and evaluated the BERT-base model on our text dataset of skin cancer cases. The model achieves 97.3 percent accuracy in determining whether a patient's symptoms are compatible with cancer.
Keywords: Bidirectional, Sentiment Classification, Transformers, Reviews, Encoders
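As an illustrative sketch (not the authors' released code), the binary decision described in the abstract can be framed as taking the argmax over the two class logits produced by a fine-tuned BERT-base classification head, with accuracy computed over held-out cases. The logits and labels below are hypothetical placeholders:

```python
import math

def softmax(logits):
    """Convert raw classifier logits to class probabilities."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict(logits):
    """Return 1 (compatible with cancer) if its logit dominates, else 0."""
    probs = softmax(logits)
    return max(range(len(probs)), key=probs.__getitem__)

def accuracy(predictions, labels):
    """Fraction of cases where the predicted class matches the label."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# Hypothetical two-class logits from a fine-tuned BERT-base head.
logit_pairs = [[-1.2, 2.3], [0.8, -0.5], [-0.1, 1.7], [2.0, 0.1]]
labels = [1, 0, 1, 0]          # 1 = compatible with cancer, 0 = not
preds = [predict(lp) for lp in logit_pairs]
print(accuracy(preds, labels))  # prints 1.0 on this toy sample
```

Since softmax is monotone, taking the argmax of the raw logits gives the same prediction; the probabilities are shown only to make the decision rule explicit.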
The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. Authors who publish this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open Access Instruction).