Open Access

Analysing the Analysers: An Investigation of Source Code Analysis Tools

Aug. 15, 2024

D. Baca, K. Petersen, B. Carlsson, and L. Lundberg, “Static code analysis to detect software security vulnerabilities – does experience matter?” in 2009 International Conference on Availability, Reliability and Security, Fukuoka, Japan, Mar. 2009, pp. 804–810. https://doi.org/10.1109/ARES.2009.163

A. G. Bardas et al., “Static code analysis,” Journal of Information Systems & Operations Management, vol. 4, no. 2, pp. 99–107, 2010.

B. A. Kitchenham, T. Dyba, and M. Jorgensen, “Evidence-based software engineering,” in Proceedings of the 26th International Conference on Software Engineering, Edinburgh, UK, Jun. 2004, pp. 273–281. https://doi.org/10.1109/ICSE.2004.1317449

T. B. C. Arias, P. Avgeriou, and P. America, “Analyzing the actual execution of a large software-intensive system for determining dependencies,” in 2008 15th Working Conference on Reverse Engineering, Antwerp, Belgium, Oct. 2008, pp. 49–58. https://doi.org/10.1109/WCRE.2008.11

F. Angerer, “Variability-aware change impact analysis of multi-language product lines,” in Proceedings of the 29th ACM/IEEE International Conference on Automated Software Engineering, Sep. 2014, pp. 903–906. https://doi.org/10.1145/2642937.2653472

G. Booch, “Object-oriented development,” IEEE Transactions on Software Engineering, vol. SE-12, no. 2, pp. 211–221, Feb. 1986. https://doi.org/10.1109/TSE.1986.6312937

D. De Champeaux, A. Anderson, and E. Feldhousen, “Case study of object-oriented software development,” ACM SIGPLAN Notices, vol. 27, no. 10, pp. 377–391, Oct. 1992. https://doi.org/10.1145/141937.141967

K. Lieberherr and C. Xiao, “Object-oriented software evolution,” IEEE Transactions on Software Engineering, vol. 19, no. 4, pp. 313–343, Apr. 1993. https://doi.org/10.1109/32.223802

M. Gabbrielli and S. Martini, Programming Languages: Principles and Paradigms. Springer Nature, 2023.

M. Mantere, I. Uusitalo, and J. Roning, “Comparison of static code analysis tools,” in 2009 Third International Conference on Emerging Security Information, Systems and Technologies, Athens, Greece, Jun. 2009, pp. 15–22. https://doi.org/10.1109/SECURWARE.2009.10

R. Lammel, M. Leinberger, T. Schmorleiz, and A. Varanovich, “Comparison of feature implementations across languages, technologies, and styles,” in 2014 Software Evolution Week – IEEE Conference on Software Maintenance, Reengineering, and Reverse Engineering (CSMR-WCRE), Antwerp, Belgium, Feb. 2014, pp. 333–337. https://doi.org/10.1109/CSMR-WCRE.2014.6747188

N. F. Schneidewind, “The state of software maintenance,” IEEE Transactions on Software Engineering, vol. SE-13, no. 3, pp. 303–310, Mar. 1987. https://doi.org/10.1109/TSE.1987.233161

G. A. Di Lucca, A. R. Fasolino, F. Pace, P. Tramontana, and U. De Carlini, “WARE: A tool for the reverse engineering of web applications,” in Proceedings of the Sixth European Conference on Software Maintenance and Reengineering, 2002, pp. 241–250.

L. Coyle, M. Hinchey, B. Nuseibeh, and J. L. Fiadeiro, “Guest editors’ introduction: Evolving critical systems,” Computer, vol. 43, no. 05, pp. 28–33, May 2010. https://doi.org/10.1109/MC.2010.139

S. Olbrich, D. S. Cruzes, V. Basili, and N. Zazworka, “The evolution and impact of code smells: A case study of two open source systems,” in 2009 3rd International Symposium on Empirical Software Engineering and Measurement, Lake Buena Vista, FL, USA, Oct. 2009, pp. 390–400. https://doi.org/10.1109/ESEM.2009.5314231

A. S. Cairo, G. d. F. Carneiro, and M. P. Monteiro, “The impact of code smells on software bugs: A systematic literature review,” Information, vol. 9, no. 11, Nov. 2018, Art. no. 273. https://doi.org/10.3390/info9110273

D. Binkley, “Source code analysis: A road map,” in Future of Software Engineering (FOSE’07), Minneapolis, MN, USA, May 2007, pp. 104–119. https://doi.org/10.1109/FOSE.2007.27

J. Cruz-Benito, S. Vishwakarma, F. Martin-Fernandez, and I. Faro, “Automated source code generation and auto-completion using deep learning: Comparing and discussing current language model-related approaches,” AI, vol. 2, no. 1, pp. 1–16, Jan. 2021. https://doi.org/10.3390/ai2010001

N. Harrison, Code Generation with Roslyn. Springer, 2017.

F. Nagel, G. M. Bierman, and S. D. Viglas, “Code generation for efficient query processing in managed runtimes,” Proceedings of the VLDB Endowment (PVLDB), vol. 7, no. 12, pp. 1095–1106, Aug. 2014. https://doi.org/10.14778/2732977.2732984

D. Steidl, B. Hummel, and E. Juergens, “Quality analysis of source code comments,” in 2013 21st International Conference on Program Comprehension (ICPC), San Francisco, CA, USA, May 2013, pp. 83–92. https://doi.org/10.1109/ICPC.2013.6613836

R. Plosch, H. Gruber, A. Hentschel, G. Pomberger, and S. Schiffer, “On the relation between external software quality and static code analysis,” in 2008 32nd Annual IEEE Software Engineering Workshop, Kassandra, Greece, Oct. 2008, pp. 169–174. https://doi.org/10.1109/SEW.2008.17

D. Singh, V. R. Sekar, K. T. Stolee, and B. Johnson, “Evaluating how static analysis tools can reduce code review effort,” in 2017 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC), Raleigh, NC, USA, Oct. 2017, pp. 101–105. https://doi.org/10.1109/VLHCC.2017.8103456

I. Stamelos, L. Angelis, A. Oikonomou, and G. L. Bleris, “Code quality analysis in open source software development,” Information Systems Journal, vol. 12, no. 1, pp. 43–60, Jan. 2002. https://doi.org/10.1046/j.1365-2575.2002.00117.x

M. Harman, “Why source code analysis and manipulation will always be important,” in 2010 10th IEEE Working Conference on Source Code Analysis and Manipulation, Timisoara, Romania, Sep. 2010, pp. 7–19. https://doi.org/10.1109/SCAM.2010.28

S. Mukherjee, Source Code Analytics with Roslyn and JavaScript Data Visualization. Springer, 2016.

T. W. Thomas, H. Lipford, B. Chu, J. Smith, and E. Murphy-Hill, “What questions remain? An examination of how developers understand an interactive static analysis tool,” in Twelfth Symposium on Usable Privacy and Security (SOUPS 2016), Jun. 2016.

A. R. Yazdanshenas and L. Moonen, “Crossing the boundaries while analyzing heterogeneous component-based software systems,” in 2011 27th IEEE International Conference on Software Maintenance (ICSM), Williamsburg, VA, USA, Sep. 2011, pp. 193–202. https://doi.org/10.1109/ICSM.2011.6080786

P. Emanuelsson and U. Nilsson, “A comparative study of industrial static analysis tools,” Electronic Notes in Theoretical Computer Science, vol. 217, pp. 5–21, Jul. 2008. https://doi.org/10.1016/j.entcs.2008.06.039

P. Louridas, “Static code analysis,” IEEE Software, vol. 23, no. 4, pp. 58–61, Jul.–Aug. 2006. https://doi.org/10.1109/MS.2006.114

N. E. Fenton and M. Neil, “Software metrics: roadmap,” in Proceedings of the Conference on the Future of Software Engineering, May 2000, pp. 357–370. https://doi.org/10.1145/336512.336588

F. G. Toosi, J. Buckley, and A. R. Sai, “Source-code divergence diagnosis using constraints and cryptography,” in Proceedings of the 13th European Conference on Software Architecture, ECSA ’19, vol. 2, New York, NY, USA, Sep. 2019, pp. 205–208. https://doi.org/10.1145/3344948.3344983

Z. Zhioua, S. Short, and Y. Roudier, “Static code analysis for software security verification: Problems and approaches,” in 2014 IEEE 38th International Computer Software and Applications Conference Workshops, Vasteras, Sweden, Jul. 2014, pp. 102–109. https://doi.org/10.1109/COMPSACW.2014.22

A. Hovsepyan, R. Scandariato, W. Joosen, and J. Walden, “Software vulnerability prediction using text analysis techniques,” in Proceedings of the 4th International Workshop on Security Measurements and Metrics, Sep. 2012, pp. 7–10. https://doi.org/10.1145/2372225.2372230

M. Gegick, L. Williams, J. Osborne, and M. Vouk, “Prioritizing software security fortification through code-level metrics,” in Proceedings of the 4th ACM Workshop on Quality of Protection, Oct. 2008, pp. 31–38. https://doi.org/10.1145/1456362.1456370

E. L. Vargas, J. Hejderup, M. Kechagia, M. Bruntink, and G. Gousios, “Enabling real-time feedback in software engineering,” in Proceedings of the 40th International Conference on Software Engineering: New Ideas and Emerging Results, Sydney, NSW, Australia, May 2018, pp. 21–24. https://doi.org/10.1145/3183399.3183416

W. Maalej and D. Pagano, “On the socialness of software,” in 2011 IEEE Ninth International Conference on Dependable, Autonomic and Secure Computing, Dec. 2011, pp. 864–871. https://doi.org/10.1109/DASC.2011.146

E. Soares, G. Sizilio, J. Santos, D. A. da Costa, and U. Kulesza, “The effects of continuous integration on software development: a systematic literature review,” Empirical Software Engineering, vol. 27, no. 3, Mar. 2022, Art. no. 78. https://doi.org/10.1007/s10664-021-10114-1

M. Shahin, M. A. Babar, and L. Zhu, “Continuous integration, delivery and deployment: a systematic review on approaches, tools, challenges and practices,” IEEE Access, vol. 5, pp. 3909–3943, Mar. 2017. https://doi.org/10.1109/ACCESS.2017.2685629

S. Arachchi and I. Perera, “Continuous integration and continuous delivery pipeline automation for agile software project management,” in 2018 Moratuwa Engineering Research Conference (MERCon), Moratuwa, Sri Lanka, May–Jun. 2018, pp. 156–161. https://doi.org/10.1109/MERCon.2018.8421965

Y. Oda, H. Fudaba, G. Neubig, H. Hata, S. Sakti, T. Toda, and S. Nakamura, “Learning to generate pseudo-code from source code using statistical machine translation,” in 2015 30th IEEE/ACM International Conference on Automated Software Engineering (ASE), Lincoln, NE, USA, Nov. 2015, pp. 574–584. https://doi.org/10.1109/ASE.2015.36

D. A. Plaisted, “Source-to-source translation and software engineering,” Journal of Software Engineering and Applications, vol. 6, no. 4A, pp. 30–40, Apr. 2013. https://doi.org/10.4236/jsea.2013.64A005

M. Harsu, “Identifying object-oriented features from procedural software,” Nordic Journal of Computing, vol. 7, no. 2, pp. 126–142, 2000.

Synopsys Corporation, “Coverity: Static analysis tool,” Synopsys Documentation Portal. [Online]. Available: https://www.synopsys.com/software-integrity/static-analysis-tools-sast/coverity.html

I. Gomes, P. Morgado, T. Gomes, and R. Moreira, “An overview on the static code analysis approach in software development,” Faculdade de Engenharia da Universidade do Porto, Portugal, 2009. [Online]. Available: https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=ce3c584c906eea668954f6a1a0ddbb295c6ec5a2

T. D. Oyetoyan, B. Milosheska, M. Grini, and D. Soares Cruzes, “Myths and facts about static application security testing tools: An action research at Telenor Digital,” in Agile Processes in Software Engineering and Extreme Programming. XP 2018. Lecture Notes in Business Information Processing, J. Garbajosa, X. Wang, and A. Aguiar, Eds., vol. 314. Springer, Cham, 2018. https://doi.org/10.1007/978-3-319-91602-6_6

M. Zitser, “Securing software: An evaluation of static source code analyzers,” Master’s Thesis, Massachusetts Institute of Technology, Cambridge, MA, 2003.

T. Hofer, “Evaluating static source code analysis tools,” Tech. Rep., Ecole Polytechnique Fédérale de Lausanne, 2010. [Online]. Available: https://core.ac.uk/download/pdf/147963417.pdf

M. Ashouri, “Practical dynamic taint tracking for exploiting input sanitization error in Java applications,” in Information Security and Privacy: 24th Australasian Conference, ACISP 2019, Christchurch, New Zealand, Jul. 2019, pp. 494–513. https://doi.org/10.1007/978-3-030-21548-4_27

N. Manzoor, H. Munir, and M. Moayyed, “Comparison of static analysis tools for finding concurrency bugs,” in 2012 IEEE 23rd International Symposium on Software Reliability Engineering Workshops, Dallas, TX, USA, Nov. 2012, pp. 129–133. https://doi.org/10.1109/ISSREW.2012.28

F. Thung, Lucia, D. Lo, L. Jiang, F. Rahman, and P. T. Devanbu, “To what extent could we detect field defects? An empirical study of false negatives in static bug finding tools,” in Proceedings of the 27th IEEE/ACM International Conference on Automated Software Engineering, Sep. 2012, pp. 50–59. https://doi.org/10.1145/2351676.2351685

M. G. Nanda, M. Gupta, S. Sinha, S. Chandra, D. Schmidt, and P. Balachandran, “Making defect-finding tools work for you,” in Proceedings of the 32nd ACM/IEEE International Conference on Software Engineering, vol. 2, May 2010, pp. 99–108. https://doi.org/10.1145/1810295.1810310

E. E. Schultz Jr, D. S. Brown, and T. A. Longstaff, “Responding to computer security incidents: Guidelines for incident handling,” Tech. Rep., Lawrence Livermore National Lab., CA (USA), 1990.

SonarSource, “Code quality tool & secure analysis with SonarQube.” [Online]. Available: https://www.sonarsource.com/products/sonarqube/

J. Novak, A. Krajnc, and R. Zontar, “Taxonomy of static code analysis tools,” in The 33rd International Convention MIPRO, 2010, pp. 418–422. [Online]. Available: https://www.researchgate.net/publication/251940397_Taxonomy_of_static_code_analysis_tools

E. E. Mills, Software Metrics. Software Engineering Institute, 1988.

L. Rosenberg, T. Hammer, and J. Shaw, “Software metrics and reliability,” in 9th International Symposium on Software Reliability Engineering, Nov. 1998.

S. R. Chidamber and C. F. Kemerer, “A metrics suite for object oriented design,” IEEE Transactions on Software Engineering, vol. 20, no. 6, pp. 476–493, Jun. 1994. https://doi.org/10.1109/32.295895

S. G. MacDonell, D. Buckingham, A. R. Gray, and P. J. Sallis, “Software forensics: Extending authorship analysis techniques to computer programs,” Journal of Law and Information Science, vol. 13, pp. 34–69, 2002. [Online]. Available: https://openrepository.aut.ac.nz/server/api/core/bitstreams/963770a7-cd78-4105-8385-99c6b0c4f64e/content

“PYPL PopularitY of Programming Language index.” [Online]. Available: https://pypl.github.io/PYPL.html

J. Zheng, L. Williams, N. Nagappan, W. Snipes, J. Hudepohl, and M. Vouk, “On the value of static analysis for fault detection in software,” IEEE Transactions on Software Engineering, vol. 32, no. 4, pp. 240–253, Apr. 2006. https://doi.org/10.1109/TSE.2006.38

D. Guaman, P. Sarmiento, L. Barba-Guaman, P. Cabrera, and L. Enciso, “SonarQube as a tool to identify software metrics and technical debt in the source code through static analysis,” in 7th International Workshop on Computer Science and Engineering, WCSE, Jan. 2017, pp. 171–175.

J. García-Munoz, M. García-Valls, and J. Escribano-Barreno, “Improved metrics handling in SonarQube for software quality monitoring,” in Distributed Computing and Artificial Intelligence, 13th International Conference, S. Omatu et al., Eds., Springer, Cham, Jun. 2016, pp. 463–470. https://doi.org/10.1007/978-3-319-40162-1_50

V. Lenarduzzi, F. Lomio, H. Huttunen, and D. Taibi, “Are SonarQube rules inducing bugs?” in 2020 IEEE 27th International Conference on Software Analysis, Evolution and Reengineering (SANER), London, ON, Canada, Feb. 2020, pp. 501–511. https://doi.org/10.1109/SANER48275.2020.9054821

C. Vassallo, F. Palomba, A. Bacchelli, and H. C. Gall, “Continuous code quality: are we (really) doing that?” in Proceedings of the 33rd ACM/IEEE International Conference on Automated Software Engineering, Sep. 2018, pp. 790–795. https://doi.org/10.1145/3238147.3240729

D. Marcilio, R. Bonifacio, E. Monteiro, E. Canedo, W. Luz, and G. Pinto, “Are static analysis violations really fixed? A closer look at realistic usage of SonarQube,” in 2019 IEEE/ACM 27th International Conference on Program Comprehension (ICPC), Montreal, Canada, May 2019, pp. 209–219. https://doi.org/10.1109/ICPC.2019.00040

M. A. Al Mamun, A. Khanam, H. Grahn, and R. Feldt, “Comparing four static analysis tools for Java concurrency bugs,” in Third Swedish Workshop on Multi-Core Computing (MCC-10), 2010, pp. 18–19.

V. Lenarduzzi, F. Pecorelli, N. Saarimaki, S. Lujan, and F. Palomba, “A critical comparison on six static analysis tools: Detection, agreement, and precision,” Journal of Systems and Software, vol. 198, Apr. 2023, Art. no. 111575. https://doi.org/10.1016/j.jss.2022.111575

R. P. Jetley, P. L. Jones, and P. Anderson, “Static analysis of medical device software using CodeSonar,” in Proceedings of the 2008 Workshop on Static Analysis, Jun. 2008, pp. 22–29. https://doi.org/10.1145/1394504.1394507

M. Beller, R. Bholanath, S. McIntosh, and A. Zaidman, “Analyzing the state of static analysis: A large-scale evaluation in open source software,” in 2016 IEEE 23rd International Conference on Software Analysis, Evolution, and Reengineering (SANER), vol. 1, Osaka, Japan, Mar. 2016, pp. 470–481. https://doi.org/10.1109/SANER.2016.105

Snyk, “Snyk analysis tool for code security & code quality scanning,” 2023. [Online]. Available: https://snyk.io/product/snyk-code/

S. A. Licorish and M. Wagner, “Combining GIN and PMD for code improvements,” in Proceedings of the Genetic and Evolutionary Computation Conference Companion, Jul. 2022, pp. 790–793. https://doi.org/10.1145/3520304.3528772

N. Ayewah and W. Pugh, “The Google FindBugs fixit,” in Proceedings of the 19th International Symposium on Software Testing and Analysis, Trento, Italy, Jul. 2010, pp. 241–252. https://doi.org/10.1145/1831708.1831738

T. Sharma, M. Kechagia, S. Georgiou, R. Tiwari, I. Vats, H. Moazen, and F. Sarro, “A survey on machine learning techniques for source code analysis,” arXiv preprint arXiv:2110.09610, 2021.

Language:
English
Publication timeframe:
1 issue per year
Journal subjects:
Computer Science, Artificial Intelligence, Information Technology, Project Management, Software Development