Research Article

An Empirical Examination of the Relationship between Code Smells and Vulnerabilities

by Aakanshi Gupta, Bharti Suri, Vijin Vincent
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 176 - Number 32
Year of Publication: 2020
Authors: Aakanshi Gupta, Bharti Suri, Vijin Vincent
10.5120/ijca2020920362

Aakanshi Gupta, Bharti Suri, Vijin Vincent. An Empirical Examination of the Relationship between Code Smells and Vulnerabilities. International Journal of Computer Applications 176, 32 (Jun 2020), 1-9. DOI=10.5120/ijca2020920362

@article{ 10.5120/ijca2020920362,
author = { Aakanshi Gupta, Bharti Suri, Vijin Vincent },
title = { An Empirical Examination of the Relationship between Code Smells and Vulnerabilities },
journal = { International Journal of Computer Applications },
issue_date = { Jun 2020 },
volume = { 176 },
number = { 32 },
month = { Jun },
year = { 2020 },
issn = { 0975-8887 },
pages = { 1-9 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume176/number32/31405-2020920362/ },
doi = { 10.5120/ijca2020920362 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Aakanshi Gupta
%A Bharti Suri
%A Vijin Vincent
%T An Empirical Examination of the Relationship between Code Smells and Vulnerabilities
%J International Journal of Computer Applications
%@ 0975-8887
%V 176
%N 32
%P 1-9
%D 2020
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Software quality is a crucial concern as a software system evolves, and managing code smells and vulnerabilities in the source code contributes to that quality. Metrics are commonly used to classify code smells in source code; this paper presents an empirical examination of the correlation between code smells and vulnerabilities. SonarCloud, a platform for continuous inspection of code quality, is used to run automated static code analysis that detects code smells and vulnerabilities, and the analysis results are collected with a web scraping technique. Web scraping (also called web harvesting or web data extraction) is data scraping used to extract data from websites; here, the Selenium library provides sufficient tooling to scrape the data from SonarCloud. A statistical correlation approach, taking one measure as the dependent and the other as the independent variable, is then used to quantify the relationship between code smells and vulnerabilities through correlation coefficients. The study concludes that there exist code smell and vulnerability pairs whose correlation coefficient reaches 0.93, which is sufficient to justify the results.
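
The pipeline described above can be illustrated with a minimal sketch in Python (Selenium 4): per-project code smell and vulnerability counts are scraped from SonarCloud, and the Pearson correlation coefficient is computed over the paired counts. The URL pattern, CSS selectors, and project keys below are hypothetical placeholders for illustration, not the authors' actual scraper or dataset.

    # Minimal sketch: scrape per-project code smell and vulnerability counts
    # from SonarCloud with Selenium, then compute a Pearson correlation.
    # The URL pattern, CSS selectors, and project keys are hypothetical placeholders.
    from math import sqrt
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def scrape_counts(driver, project_key):
        """Return (code_smell_count, vulnerability_count) for one project."""
        driver.get("https://sonarcloud.io/summary/overall?id=" + project_key)
        smells = int(driver.find_element(By.CSS_SELECTOR, "[data-metric='code_smells']").text)
        vulns = int(driver.find_element(By.CSS_SELECTOR, "[data-metric='vulnerabilities']").text)
        return smells, vulns

    def pearson(xs, ys):
        """Pearson correlation coefficient of two equal-length samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    if __name__ == "__main__":
        driver = webdriver.Chrome()
        try:
            projects = ["org:project-a", "org:project-b", "org:project-c"]  # hypothetical keys
            pairs = [scrape_counts(driver, key) for key in projects]
        finally:
            driver.quit()
        smell_counts = [float(s) for s, _ in pairs]
        vuln_counts = [float(v) for _, v in pairs]
        print("correlation coefficient = %.2f" % pearson(smell_counts, vuln_counts))

Given counts for enough projects, the same correlation helper yields the kind of smell/vulnerability coefficients the abstract reports (up to 0.93 for some pairs).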

References
  1. Felivel Camilo, Andrew Meneely, and Meiyappan Nagappan. Do bugs foreshadow vulnerabilities?: a study of the chromium project. In Proceedings of the 12th Working Conference on Mining Software Repositories, pages 269–279. IEEE Press, 2015.
  2. Istehad Chowdhury and Mohammad Zulkernine. Using complexity, coupling, and cohesion metrics as early indicators of vulnerabilities. Journal of Systems Architecture, 57(3):294–313, 2011.
  3. Gabriela Czibula, Zsuzsanna Marian, and Istvan Gergely Czibula. Detecting software design defects using relational association rule mining. Knowledge and information systems, 42(3):545–577, 2015.
  4. Marco D’Ambros, Alberto Bacchelli, and Michele Lanza. On the impact of design flaws on software defects, 2010.
  5. Jiang Dexun, Ma Peijun, Su Xiaohong, and Wang Tiantian. Detecting bad smells with weight based distance metrics theory. In 2012 Second International Conference on Instrumentation, Measurement, Computer, Communication and Control, pages 299–304. IEEE, 2012.
  6. Davide Falessi and Alexander Voegele. Validating and prioritizing quality rules for managing technical debt: An industrial case study, 2015.
  7. Francesca Arcelli Fontana, Mika V. Mäntylä, Marco Zanoni, and Alessandro Marino. Comparing and experimenting machine learning techniques for code smell detection. Empirical Software Engineering, 21(3):1143–1191, 2016.
  8. Rajeev Gopalakrishna, E. Spafford, and Jan Vitek. Vulnerability likelihood: A probabilistic approach to software assurance. CERIAS, Purdue University Tech. Rep, 6:2005, 2005.
  9. Aakanshi Gupta, Bharti Suri, Vijay Kumar, Sanjay Misra, Tomas Blažauskas, and Robertas Damaševičius. Software code smell prediction model using Shannon, Rényi and Tsallis entropies. Entropy, 20(5):372, 2018.
  10. Aakanshi Gupta, Bharti Suri, and Sanjay Misra. A systematic literature review: Code bad smells in java source code. In International Conference on Computational Science and Its Applications, pages 665–682. Springer, 2017.
  11. Elmar Juergens, Florian Deissenboeck, Benjamin Hummel, and Stefan Wagner. Do code clones matter?, 2009.
  12. Marija Katić and Krešimir Fertalj. Challenges and discussion of software redesign, 2009.
  13. Marija Katić and Krešimir Fertalj. Challenges and discussion of software redesign. In Proceedings of the 4th International Conference on Information Technology, 2009.
  14. Wael Kessentini, Marouane Kessentini, Houari Sahraoui, Slim Bechikh, and Ali Ouni. A cooperative parallel search-based software engineering approach for code-smells detection. IEEE Transactions on Software Engineering, 40(9):841–861, 2014.
  15. Ivan Victor Krsul. Software vulnerability analysis. Purdue University West Lafayette, IN, 1998.
  16. Wei Li and Raed Shatnawi. An empirical study of the bad smells and class error probability in the post-release object-oriented system evolution, 2007.
  17. Akito Monden, Daikai Nakae, Toshihiro Kamiya, Shin-ichi Sato, and Ken-ichi Matsumoto. Software quality analysis by code clones in industrial legacy software, 2002.
  18. Fabio Palomba, Gabriele Bavota, Massimiliano Di Penta, Rocco Oliveto, Andrea De Lucia, and Denys Poshyvanyk. Detecting bad smells in source code using change history information. In Proceedings of the 28th IEEE/ACM International Conference on Automated Software Engineering, pages 268–278. IEEE Press, 2013.
  19. Luca Pellegrini, Andrea Alexander Janes, and Davide Taibi. On the fault proneness of SonarQube technical debt violations: an empirical study, 2018.
  20. Foyzur Rahman, Christian Bird, and Premkumar Devanbu. Clones: What is that smell?, 2012.
  21. Dag IK Sjøberg, Aiko Yamashita, Bente CD Anda, Audris Mockus, and Tore Dybå. Quantifying the effect of code smells on maintenance effort, 2013.
  22. Aiko Yamashita. Assessing the capability of code smells to explain maintenance problems: an empirical study combining quantitative and qualitative data, 2014.
  23. Thomas Zimmermann, Nachiappan Nagappan, and Laurie Williams. Searching for a needle in a haystack: Predicting security vulnerabilities for Windows Vista. In 2010 Third International Conference on Software Testing, Verification and Validation, pages 421–428. IEEE, 2010.
  24. Steffen Olbrich, Daniela S. Cruzes, Victor Basili, and Nico Zazworka. The evolution and impact of code smells: A case study of two open source systems. In 2009 3rd International Symposium on Empirical Software Engineering and Measurement, pages 390–400. IEEE, 2009.
  25. Liu Ping, Su Jin, and Yang Xinfeng. Research on software security vulnerability detection technology. In Proceedings of 2011 International Conference on Computer Science and Network Technology, pages 1873–1876. IEEE, 2011.
  26. Umme Ayda Mannan, Iftekhar Ahmed, Rana Abdullah M. Almurshed, Danny Dig, and Carlos Jensen. Understanding code smells in Android applications. In 2016 IEEE/ACM International Conference on Mobile Software Engineering and Systems (MOBILESoft), pages 225–236. IEEE, 2016.
Index Terms

Computer Science
Information Sciences

Keywords

Software Quality, Code Smell, Vulnerability