Category Archives: Application Quality

Legacy application – Refactoring or reengineering? (VII)

We have seen in the previous post how to use the SonarQube dashboard to estimate the effort of the characterization tests recommended by Michael Feathers in his book ‘Working Effectively with Legacy Code’.

We categorized the various components of our Legacy application (Microsoft Word 1.1a) into different groups: the simplest functions, with a Cyclomatic Complexity (CC) below 20 points; the complex and very complex functions, up to 200 points of CC; and finally 6 ‘monster’ components.

We built a formula based on Cyclomatic Complexity and a readability factor in order to evaluate the testing effort for each of these groups.
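To make that idea concrete, here is a minimal sketch in C, assuming an effort formula of the form: number of functions × average CC × readability factor. The group sizes, CC values and factors below are illustrative placeholders, not the actual figures from the analysis.

    #include <stdio.h>

    /* Hypothetical sketch of the testing-effort estimate:
       effort = functions x average CC x readability factor.
       All figures are illustrative placeholders. */
    struct group {
        const char *name;
        int functions;       /* number of functions in the group    */
        double avg_cc;       /* average Cyclomatic Complexity       */
        double readability;  /* 1.0 = easy to read, higher = harder */
    };

    int main(void) {
        struct group groups[] = {
            { "simple (CC < 20)",    3000,   8.0, 1.0 },
            { "complex (CC 20-200)",  400,  60.0, 1.5 },
            { "monsters (CC > 200)",    6, 300.0, 2.0 },
        };
        for (int i = 0; i < 3; i++) {
            double effort = groups[i].functions * groups[i].avg_cc
                          * groups[i].readability;
            printf("%-20s effort: %.0f units\n", groups[i].name, effort);
        }
        return 0;
    }

Continue reading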

Legacy application – Refactoring or reengineering? (VI)

In the two previous posts, we presented the definition of characterization tests as proposed by Michael Feathers in his book “Working Effectively with Legacy Code”.

We showed briefly how such tests can be used to acquire knowledge of the application’s behavior. I say briefly because, ideally, we would have developed and presented some tests as examples, but that would require several posts, and this series is already very long. Have a look at Michael Feathers’ book if you want to go deeper into this.
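Still, to give a flavor of what a characterization test looks like, here is a minimal sketch in C; word_count() is a hypothetical stand-in for a legacy routine, not a function from the Word 1.1a code base.

    #include <assert.h>
    #include <ctype.h>

    /* Hypothetical stand-in for an untouched legacy routine;
       in a real characterization test we would call the
       existing production code as-is. */
    static int word_count(const char *s) {
        int count = 0, in_word = 0;
        for (; *s; s++) {
            if (!isspace((unsigned char)*s)) {
                if (!in_word) { count++; in_word = 1; }
            } else {
                in_word = 0;
            }
        }
        return count;
    }

    int main(void) {
        /* Characterization: we first ran these calls, observed
           the actual results, then froze them as assertions.
           The test documents what the code does, not what it
           is supposed to do. */
        assert(word_count("hello legacy world") == 3);
        assert(word_count("") == 0);
        return 0;
    }

The point is that the expected values are recorded from actual runs, not derived from a specification: if a later refactoring breaks one of these assertions, we know we have changed the observed behavior.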

We just have to remember that these tests will facilitate the transfer of knowledge from our Legacy application (Microsoft Word 1.1a), and that any subsequent refactoring or reengineering operation will be faster and safer. Continue reading

Legacy application – Refactoring or reengineering? (V)

In our previous post, we presented Michael Feathers and his book “Working Effectively with Legacy Code”, according to which the absence of unit tests is what defines a Legacy application.

He proposes the concept of characterization tests to understand the behavior of the application, in order to qualify what it actually does, which is not quite the same as discovering from the code what it is supposed to do.

So what about our Legacy application, which has no unit tests? Can we address one of our three scenarios – transferring the knowledge of the application to another team – with unit tests? Would that be easier, especially since we also have to evaluate the other two strategies: refactoring and reengineering? Continue reading

Legacy application – Refactoring or reengineering? (IV)

Back from summer vacation, and back to this series of posts about using SonarQube with a Legacy C application, in this case the first version of Word published by Microsoft in 1990.

We posed the following hypothesis: Microsoft has just been sold and its new owner asks you, as a Quality consultant, to recommend a strategy for this software.

Do not think that this never happens: software vendors are bought every day, and R&D people and source code are at the heart of these acquisitions.

Continue reading

Legacy C application – Refactoring or reengineering? (II)

We continue this series about the use cases that may arise with a Legacy C application: Word 1.1a, the first version of this word processor, released by Microsoft in 1990.

The first two posts were dedicated to the metrics of size, complexity, level of comments and duplication, as well as to the various issues: Blocker, Critical, Major and Minor. Continue reading

Madrid DevOps – Continuous Integration

Madrid DevOps is a group of professionals dedicated to … DevOps, as you can imagine. There is a ‘Meetup Group’ where you can find news, mainly about the new meetings organized each month.

On April 10, the talk was titled ‘Continuous Integration’, with Manuel Recena Soto and Antonio Manuel Muñiz from ClinkerHQ. I asked them some questions about their presentation, which you can find at https://speakerdeck.com/clinkerhq/integracion-continua (in Spanish). Continue reading

Legacy C application – Refactoring or reengineering? (I)

Let’s continue our series about assessing the quality of the source code of Word 1.1a, the first version of the word processor released by Microsoft in 1990.

In the first post, we looked at the quantitative metrics: size, complexity, level of comments and duplication. The second post was devoted to the various issues: Blocker, Critical, Major and Minor. Continue reading

Audit of a Legacy C application – Microsoft Word 1.1a (II)

In the previous post, we examined the first results of the analysis of the source code of Word 1.1a (1990).

We counted 349 files, which is not huge, but the files are rather large: more than 470 LOC (Lines Of Code) on average, and many of them well beyond 1,000 LOC. The complexity metrics are also high, and the rate of comments quite low, but that was probably normal at the time, almost 25 years ago.

The very low level of duplication made me think that all the components needed to implement a feature can be found in the same file, which would explain the size and complexity of many of them. It seems that the motto was: priority to efficiency; readability and understandability are less critical. Continue reading

Audit of a Legacy C application – Microsoft Word 1.1a (I)

This week, Microsoft released the source code of Word 1.1a (1990) to the Computer History Museum: http://www.computerhistory.org/atchm/microsoft-word-for-windows-1-1a-source-code/.

This is an early version of Word for Windows, January 1991:
http://blogs.technet.com/b/microsoft_blog/archive/2014/03/25/microsoft-makes-source-code-for-ms-dos-and-word-for-windows-available-to-public.aspx.

I was interested in analyzing the source code of this release. I was curious to see what the results would be, both from a quantitative point of view – number of lines of code, complexity, etc. – and from a qualitative one: violations of best programming practices, Blockers, Critical defects, etc. Continue reading