Today we continue our evaluation of the quality of Cobol code with Sonar.
In the previous post, we worked with the metrics measuring code size, complexity, level of documentation and duplication, which allowed us to formulate some initial recommendations for the people responsible for this application. Continue reading
Code quality has been a constant concern for ages. Bad practices generate defects that impact users and drive up maintenance costs. Technical Debt, at first a simple metaphor, has since become a tool for measuring application quality and costs.
A few years ago, tools that helped identify these defects were rare and expensive. Today, Open Source tools such as Sonar allow everyone – project teams, providers, consultants, etc. – to detect these bad practices easily and cheaply.
The Open Source world has long suffered from its ‘geek’ image, because these tools were first used by J2EE enthusiasts. But times have changed, and it is now possible to analyze Legacy code, such as Cobol and ABAP, with Sonar.
This is the objective of our series of posts: to show that it is possible to assess the quality of Cobol applications without knowing anything of the Mainframe world.
The previous posts about the preparation and analysis of Cobol code with Sonar and Jenkins drew some anxious comments about the results of the analysis and the rules available in the Sonar dashboard.
Do these results allow an evaluation of the quality of Cobol applications? What value do we deliver to teams, stakeholders and management? And for those who are not familiar with the Mainframe world, what are the best – and worst – practices in terms of Cobol code?
Many questions, and we are not going to answer all of them in one post. This one is dedicated to presenting different rules and quality defects frequently encountered in Cobol applications.
The objective is simple: you have run an analysis, and the results appear in the Sonar dashboard. Now, where do you start?
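To give a concrete idea of the kind of defect these rules flag, here is an illustrative snippet (the data and paragraph names are hypothetical, not taken from any real analysis): an EVALUATE statement with no WHEN OTHER clause, a classic Cobol bad practice because unexpected values are silently ignored instead of being handled.

```cobol
      * Illustrative only: an EVALUATE without a WHEN OTHER
      * clause lets unexpected return codes fall through
      * silently, a defect typically flagged by quality rules.
           EVALUATE WS-RETURN-CODE
               WHEN 0
                   PERFORM 2000-PROCESS-RECORD
               WHEN 4
                   PERFORM 2100-LOG-WARNING
      *        Missing: WHEN OTHER to trap unexpected codes
           END-EVALUATE.
```

Adding a WHEN OTHER branch that logs or aborts on unexpected values makes the program's behavior explicit and removes the violation.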
Let’s continue our series on the analysis of Cobol code, with the objective of demonstrating that it is simple and easy to initiate a process of evaluating the quality of this Legacy code without being a Mainframe expert.
You already have a code analysis platform with Sonar and Jenkins. If this is not the case, an earlier series of posts explains how to install these tools in our environment.
You are used to analyzing Java or .NET code with this platform, and you got the idea, or were asked, to do the same for Cobol applications. The problem: you know nothing of the Mainframe world.
Don’t panic. Our two previous posts explained:
Now it is time to implement the analysis process on our Sonar-Jenkins platform. Let’s play. Continue reading
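As a sketch of what such a setup can look like, here is a minimal sonar-project.properties for a Cobol analysis. The project key, name and paths are hypothetical placeholders; check the file suffixes with your Mainframe team, as they vary from shop to shop.

```properties
# Hypothetical project identifiers - replace with your own
sonar.projectKey=my.company:cobol-app
sonar.projectName=Cobol Application
sonar.projectVersion=1.0

# Directory where the delivered Cobol sources were unloaded
sonar.sources=src/cobol

# Tell Sonar this is Cobol, not Java
sonar.language=cobol

# File extensions used by your shop (programs and copybooks)
sonar.cobol.file.suffixes=cbl,cpy
```

With this file at the root of the project directory, the analysis can then be launched by the runner, either manually or from a Jenkins job.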
Last week we saw what you need to know about Mainframe-Cobol before going into a meeting with the specialists of this technology and preparing a process of Cobol application analysis with Sonar.
We will now examine the questions you need to ask in order to organize these analyses (which we will present in a future article).
These questions will also help you specify the rules for the delivery of the Cobol source code.
I wrote some posts a few months ago (Sonar – Cobol analysis with Jenkins, Open Source & Legacy code) to show that you don’t need to be a J2EE guru or an Open Source expert to analyze Legacy code like Cobol (or ABAP) with Sonar and Jenkins.
And I see people coming to my blog to look at these pages, which shows some interest in this subject.
Now, as someone said recently: “We’ve got Sonar analysis well under way there. We’re contemplating putting Cobol under analysis as well, but I don’t understand the mainframe side and the Cobol-ers don’t understand the Jenkins & Sonar side”.
So, let’s try to help. This post (and the following ones) aims to describe what you should know to implement a process of Cobol code analysis with Sonar and Jenkins.
We saw in the previous post how to install the Sonar plugin for Jenkins. It is now time to run our first analysis from Jenkins. It will focus on the Cobol source code already used in our first analysis with Sonar and the Java Runner. Continue reading
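As an illustration of what the Jenkins build step can contain, here is a sketch of an “Execute shell” step launching the analysis with the standalone runner. The checkout path and server URL are assumptions; older setups invoke sonar-runner, while more recent ones use sonar-scanner.

```shell
# Sketch of a Jenkins "Execute shell" build step (paths are
# hypothetical). Run the analysis from the directory that
# holds sonar-project.properties, pointing at the Sonar server.
cd /var/lib/jenkins/workspace/cobol-app
sonar-runner -Dsonar.host.url=http://localhost:9000
```

When the step succeeds, the results appear in the Sonar dashboard under the project key declared in sonar-project.properties.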