6 Comments
Mar 17, 2023 · Liked by Maarten Dalmijn

Rebuilds are sometimes necessary when the technology used to build the application has been superseded by a newer one. I was Scrum Master of a team rewriting a B2B web application originally written in AngularJS. It was decided to rewrite the application in Angular for two very good reasons. First, Angular is a better framework, built on TypeScript. Second, over time it would have become increasingly difficult to find AngularJS developers to support the application.

The rewrite was 100% successful and introduced some better design elements. Over time, the new application also achieved 100% automated test coverage for everything that could be automated.

This really spoke to me and my experience. I am looking forward to Part 2!

I enjoyed the insight of this article, but in the end it fell into the trap of misquoting that famous quote, "Premature optimization is the root of all evil".

https://ubiquity.acm.org/article.cfm?id=1513451

That quote should be used in the context it was intended for. I think we've taught a couple of generations of programmers to just repeat it as a mantra, without doing the due diligence and applying some critical thinking to it.

author

Thanks!

I've read the article, and I don't see the conflict.

But please elaborate; maybe I'm missing something, and I'm happy to amend if that's the case.

I support your perspective; that's also what I stress in the article.

author

If your point is:

"Its usually not worth spending a lot of time micro-optimizing code before its obvious where the performance bottlenecks are."

Then you are correct that I'm using the quote beyond micro-optimization.

But I don't believe it's an unwarranted stretch: in one case we are talking about performance bottlenecks, and in the other about system problems / bottlenecks.

You are right that I'm talking about macro-optimization, not micro-optimization. But when the quote was made, performance was a much bigger consideration (generally speaking), due to slower CPUs.

IMHO that quote should never be repeated (a bit of a strong statement, I know), because most of the people who stumble on it don't have the full context (I've seen that first hand with people I've worked with) and take it as confirmation of some truth they thought they possessed.

When we design, we should optimize towards the local minima of our design space, from selecting the tech stack to adjusting our design to the access patterns of our data. When developing, it makes a difference whether our data structures or algorithms have O(n) complexity or amortized O(1), and that is thinking about optimization. I think that in the craft of software development we should actually do early optimization as much as we can, and we should iterate on it as fast as we can. I have seen that quote misused and used as an excuse so many times, to justify treating optimization as an afterthought and then delivering poorly performing software. 😃

And I will disagree with you one more time, if you'll allow me. Performance is still a big consideration today. AWS is running more and more workloads on a different CPU architecture (Graviton) because it performs better, with a better perf/Watt ratio, and it's gearing the implementation of its internal services towards more efficient programming languages (Rust) because the resulting programs consume fewer CPU cycles, hence less power, which at scale has huge implications in terms of costs. So I am of the opinion that we should always keep optimization in mind, and as early as possible.
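A minimal TypeScript sketch of the complexity point above (a hypothetical illustration, not from the original comment): checking membership in an array is O(n), while the same check against a Set is amortized O(1).

// Hypothetical example: the same membership check with two data structures.
const ids: number[] = [1, 2, 3 /* ...imagine thousands of entries... */];

// O(n): includes() scans the array element by element.
function hasIdLinear(id: number): boolean {
  return ids.includes(id);
}

// Amortized O(1): a Set hashes the key and looks it up directly.
const idSet = new Set<number>(ids);
function hasIdConstant(id: number): boolean {
  return idSet.has(id);
}

Picking the right structure up front is the kind of early optimization the comment argues for; swapping it in later often means touching every call site.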
