Open Source and Legacy code

There are more and more code analysis solutions that let you measure the quality of your applications. Most are sold by software vendors, and we have had the opportunity to verify that these solutions are expensive to buy, to implement and to use (see Disposable software). In response, the last decade has seen the rise of Open Source alternatives to proprietary software.

I often hear the following about Open Source solutions:

  • They only analyze Java.
  • They require strong Open Source / J2EE technical expertise.
  • They are tools for developers.

It is true that the first Open Source code analysis tools – such as PMD, FindBugs or Checkstyle – focus on Java code and are mainly used by developers. But the Sonar platform that we used in our previous posts to analyze Cobol code (Sonar – First Analysis, Sonar – Cobol analysis with Jenkins) has many plugins (Open Source or commercial) for many languages, covering not only new technologies but also Legacy technologies such as Cobol, Abap or the Microsoft world.

We were also able to verify that no special technical expertise is required. We used:

  • A Windows / Oracle environment, very common in the business world.
  • Tomcat running as a Windows service, to host Sonar and Jenkins.
  • Transparent integration between Sonar and Jenkins, thanks to the Sonar plugin for Jenkins.
  • The Sonar Java Runner, which lets us run an analysis in one go, without other tools such as Ant or Maven that do require some J2EE knowledge (see the sketch below).
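As a rough sketch, here is what such a one-shot analysis with the Sonar Java Runner can look like: a sonar-project.properties file at the root of the delivered sources, then a single command launched by hand or from Jenkins. The project key, name and paths are illustrative, and exact property names may vary slightly between Sonar Runner versions.

    # sonar-project.properties (at the root of the delivered Cobol sources)
    # Key, name and paths below are examples only.
    sonar.projectKey=acme:billing-cobol
    sonar.projectName=Billing (Cobol)
    sonar.projectVersion=1.0
    sonar.language=cobol
    sonar.sources=src
    sonar.sourceEncoding=UTF-8

    # Then, from that same directory (or as a Jenkins build step):
    sonar-runner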

Finally, we saw in our last post how to implement an Analysis Server for Legacy code based on Sonar / Jenkins, and how we can organize our analysis environment in different ways. Here are some use cases for such a server.

Quality Gate

Every new version of an application is delivered into a specific directory. Analyses are already configured and run automatically from Jenkins. An alert can be defined in the Sonar portal on the occurrence of certain defects of a given criticality, which triggers a Go / NoGo decision, i.e. acceptance or rejection of this version for the next phase (test or production).
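Besides the alert defined in the portal, a Jenkins job can also make the Go / NoGo explicit by failing the build when a threshold is exceeded. The following is only a rough sketch: it assumes the Sonar web service exposes /api/resources and the blocker_violations metric (to be checked against your Sonar version), and the URL, project key and JSON parsing are purely illustrative.

    # Rough Go / NoGo sketch (e.g. as a shell build step in Jenkins)
    SONAR_URL=http://localhost:9000      # illustrative
    PROJECT_KEY=acme:billing-cobol       # illustrative
    BLOCKERS=$(curl -s "$SONAR_URL/api/resources?resource=$PROJECT_KEY&metrics=blocker_violations&format=json" \
      | grep -o '"val":[0-9]*' | head -n1 | cut -d: -f2)
    if [ "${BLOCKERS:-0}" -gt 0 ]; then
      echo "NoGo: $BLOCKERS blocker violations"
      exit 1    # fails the Jenkins build, rejecting the delivery
    fi
    echo "Go: no blocker violations"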

Benchmarking of providers

A set of metrics is incorporated into the Service Level Agreements (SLA) of different providers working on the maintenance of various applications. The code for each fix, change or new version is delivered into our environment. Analyses can be performed manually or automatically (overnight, for example). If the metrics that constitute the service agreement are not met, the provider may have to make a new delivery or incur penalties. At the end of each month, the manager in charge of these providers can use the Sonar portal to get objective data for comparing the quality of service of each of them, as sketched below.
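One minimal way to organize this, as an illustration, is one Jenkins job and one Sonar project per provider-maintained application, so the portal can show them side by side. All project keys, names and paths below are hypothetical.

    # sonar-project.properties for the job analyzing provider A's deliveries
    # (keys, names and paths are examples only)
    sonar.projectKey=sla:provider-a:payroll
    sonar.projectName=Payroll (Provider A)
    sonar.projectVersion=2012-06
    sonar.language=cobol
    sonar.sources=D:/deliveries/provider-a/payroll

    # sonar-project.properties for the job analyzing provider B's deliveries
    sonar.projectKey=sla:provider-b:invoicing
    sonar.projectName=Invoicing (Provider B)
    sonar.projectVersion=2012-06
    sonar.language=cobol
    sonar.sources=D:/deliveries/provider-b/invoicing

At the end of the month, both projects can then be compared in the Sonar portal against the metrics written into the SLA.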

Application Governance

In each project, code analyses are conducted regularly (every week, for example; see the scheduling sketch below), regardless of any new version. The managers of these projects can monitor over time factors such as maintainability, ability to evolve or technical debt, factors that directly impact project costs and time to market. These data are aggregated monthly or quarterly in a Sonar application portfolio and reviewed by IT management.
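For the scheduling part, the "Build periodically" trigger of each Jenkins analysis job takes a cron-like expression; the value below is just an example (every Sunday at 03:00).

    # Jenkins "Build periodically" schedule (minute hour day-of-month month day-of-week)
    0 3 * * 0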

These are just a few examples demonstrating that tools such as Sonar and Jenkins are not just for developers, J2EE gurus or Java applications.

We will have the opportunity to detail these use cases in future posts, with Open Source tools and Legacy code.

This post is also available in Spanish (Leer este articulo en castellano) and in French (Lire cet article en français).
