A rule is either known or it is not. A ‘best practice’ is either applied or it is not. But when it is not applied, is that because it is not applicable?
You must present the results of your initial Cobol analysis, and of course you want them to be as relevant as possible in order to prove their value to the project teams, providers, stakeholders, etc.
This requires defining a Quality model – a set of rules and severity levels – that allows the rapid identification of the most costly and dangerous ‘bad practices’. Obviously, it would be a failure to flag a violation of a ‘best practice’ that is not actually one – for example, the use of SQL code (see our last post).
Which rules can you use, and which should you avoid? What critical thresholds should you choose? How do you adjust the Quality model in a Sonar Quality profile for your Cobol applications?
In this post we will show how to set up your own Quality model, using a Sonar View and a very useful widget. Continue reading
The previous posts about the preparation and analysis of Cobol code with Sonar and Jenkins drew some anxious comments about the results of the analysis and the rules available in the Sonar dashboard.
Do these results allow an evaluation of the quality of Cobol applications? What value do we deliver to teams, stakeholders and management? And for those who are not familiar with the Mainframe world, what are the ‘best / bad practices’ for Cobol code?
Many questions, and we’re not going to answer all of them in one post. This one will be dedicated to presenting the rules and quality defects most frequently encountered in Cobol applications.
The starting point: you have run an analysis, and the results appear in the Sonar dashboard. Now, where do you begin?
We have seen in our previous post how to analyze Cobol code with Sonar and Jenkins.
But in fact we have not used all the existing Cobol rules. Why? Some rules are disabled because they are specific to a particular context and therefore require some setup. For example, naming rules are not standardized in Cobol, and often differ between departments, or even between teams within the same department.
So we need to manage different Quality models corresponding to different sets of rules, depending on the project. Sonar gives us that opportunity, thanks to ‘Quality profiles’, which define the rules active during a code analysis.
In this post we will see how to create a new profile with all the Cobol rules, and how to assign it to an existing project. Continue reading
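As a concrete sketch of the second step, older versions of Sonar let you assign a Quality profile to a project directly from the analysis configuration via the `sonar.profile` property (since deprecated in favor of assigning profiles in the web interface). The profile name below is a placeholder, not one the post defines:

```properties
# sonar-project.properties (fragment) - hypothetical example
# 'Cobol - All rules' stands for whatever name you gave your new profile
sonar.profile=Cobol - All rules
```

If your Sonar version no longer supports this property, the same assignment is done in the web interface under Quality Profiles.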
Let’s continue our series about the analysis of Cobol code, with the objective of demonstrating that it is simple and easy to initiate a process of evaluating the quality of this Legacy code, without being a Mainframe expert.
You already have a code analysis platform with Sonar and Jenkins. If this is not the case, an earlier series of posts explains how to install these tools in our environment.
You are used to analyzing Java or .NET code with this platform, and you got the idea, or were asked, to do the same for Cobol applications. The problem: you know nothing about the Mainframe world.
Don’t panic. Our two previous posts explained:
Now it is time to implement the analysis process on our Sonar-Jenkins platform. Let’s play. Continue reading
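To give an idea of what such an analysis setup can look like, here is a hypothetical `sonar-project.properties` for a Cobol project. Every key/value below is a placeholder or an assumption for illustration; the exact property names (in particular the file suffixes) depend on the version of the Sonar Cobol plugin you use:

```properties
# sonar-project.properties - hypothetical Cobol analysis configuration
sonar.projectKey=myorg:cobol-app
sonar.projectName=Cobol Application
sonar.projectVersion=1.0
sonar.language=cobol
# Directory where the Cobol sources delivered from the Mainframe are stored
sonar.sources=src
# File extensions for programs and copybooks - adjust to your delivery conventions
sonar.cobol.file.suffixes=cbl,cpy
sonar.sourceEncoding=UTF-8
```

A Jenkins job would then invoke the Sonar runner against this file as part of the build.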
Last week we saw what you need to know about Mainframe-Cobol before going into a meeting with the specialists of this technology to prepare a process of Cobol application analysis with Sonar.
We will now examine the questions you need to ask in order to organize these analyses (which we will present in a future article).
These questions also help you specify the rules for the delivery of the Cobol source code.