Tuesday, April 2, 2019

In cognitive psychology, is refactoring a cost or a benefit?


Did you ever get frustrated when someone refactored your code? Or did you refactor someone's code, only to find out that the original developer wasn't too happy about it? This has certainly happened to me a few times. Here's my understanding of what's happening, from the point of view of cognitive psychology.

In cognitive psychology there's a concept called Long-Term Memory (LTM), which is roughly what we usually just call memory. What's interesting about our LTM is that it's optimised for storing large amounts of data, but not for changing existing data.

When we refactor some code, we're forcing the developers who know that code to rewrite their long-term memory. Remember that LTM is made to store a lot, not to be updated. That's a cost we're imposing on the original author. Sure, the intention is to make future features cheaper to implement, but that's still a certain cost now versus a potential benefit later.

Given this, we can better understand the frustration a developer might feel when their code is thoroughly refactored. The bigger the change, the higher the cost for the developers who know the code. In particular, if the only motivation for changing the code is to make it cleaner or better, that frustration might well be justified. After all, better for whom? Certainly not for the original developer!

On the other hand, whenever we have to evolve the code we're forced to rewrite both the code and the corresponding LTM anyway. If the code is not well decoupled, it's quite unclear which part of our mental model of the code has to be rewritten. Should our brain hang on to the original image, replace it, or replace only a part of it? The resulting LTM rewrite is probably imprecise, or requires extra effort. The more complex the code, the harder it is to remodel, and if it contains accidental complexity, we pay an accidental remodelling cost. In short, with bad code that keeps changing, it's expensive to maintain a good mental model of the code in our long-term memory.

If instead the code respects the Single Responsibility Principle (SRP: small, cohesive parts) and the Open-Closed Principle (OCP: new features add code instead of changing existing code), then for any change in functionality we'd only have to swap one well-identified part of the code. It becomes clear to our brain which part has to be rewritten in LTM, so the cost of a rewrite is low. By respecting SRP and OCP we minimise the cost of maintenance in our LTM.
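To make this concrete, here is a minimal sketch (the `ExportFormat` example and all its names are my own, not from the post) of code shaped by SRP and OCP. Each format is one small, cohesive part, and adding a new format means adding a class rather than editing existing code, so only one small, well-identified piece of the mental model needs updating:

```python
import json
from abc import ABC, abstractmethod


class ExportFormat(ABC):
    """Each format is a small, cohesive part (SRP)."""

    @abstractmethod
    def render(self, data: dict) -> str:
        ...


class JsonExport(ExportFormat):
    def render(self, data: dict) -> str:
        return json.dumps(data)


class CsvExport(ExportFormat):
    def render(self, data: dict) -> str:
        return ",".join(f"{k}={v}" for k, v in data.items())


def export(data: dict, fmt: ExportFormat) -> str:
    # Closed for modification: supporting a new format means adding
    # another ExportFormat subclass, not editing this function (OCP).
    return fmt.render(data)


print(export({"id": 1}, CsvExport()))  # id=1
```

Supporting, say, XML would only add an `XmlExport` class; neither `export` nor the existing formats change, so the LTM of everyone who knows them stays valid.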

Conclusion and disclaimer


Cognitive psychology tells us there's an additional cost to refactoring, in the form of (cache) invalidation of our mental model of the code. Of course this doesn't mean we should avoid refactoring: it also tells us that we need well-factored code in order to minimise that cost as new features get added. By understanding the forces at play, we can better choose when and what to refactor.

I'm nowadays very wary of refactoring code that doesn't need new behaviour. It hasn't always been so; for several years I happily refactored code just because I could. Learning the basics of cognitive psychology has taught me to concentrate my refactoring efforts on the moments when I need to change the code anyway.

As a last word, there are valid reasons for refactoring code without a need for change. For instance, readability: naming things, extracting methods. Another good example is removing accidental complexity.

Would you like to hear more about this? For example, how our LTM is fed through chunking and Short-Term Memory, and how to optimise for that?

2 comments:

  1. I love your conclusion "I'm very wary of refactoring code that does not need new features".

    I've also moved away from the Red-Green-Refactor loop in favor of the Refactor-Red-Green loop, or even Red-DisableTest-Green-Refactor-Red-Green... (https://philippe.bourgau.net/dont-stick-to-tdds-red-green-refactor-loop-to-the-letter/)
    Kent Beck said: 'make the change easy, then make the easy change'

    It's very nice to have a bit more theory to back best practices :)

    1. Thanks for the feedback. I'd never even thought of Preparatory Refactoring (https://martinfowler.com/articles/workflowsOfRefactoring/#preparatory) as opposed to TDD. Reading the comments on your post, you don't think of it that way either now. It's a very clear post!

      Anyway I certainly try to do that as often as I can. In fact it's the cornerstone of my favorite approach to TDD in legacy code that I call the 3Ps (Protect, Prepare, Produce). It reminds me I should write that up really soon, thanks!
