My programming career started in the late 80s, when I was about 10 years old. It was a time when personal computers were really rare. My father took me to his work at the Dutch National Railways. They had one brand-new personal computer in its own special room, fully equipped with a Hercules card, a Turbo button, and himem and emm386 installed. It was there that I made my first programming steps in BASICA and later GW-BASIC. Text games, yet another start menu, ASCII-art hangman — how cool was that!
It was not long before I moved to Turbo Pascal, the predecessor of the current Delphi (what a product — still a pity they missed the turn somewhere). At the time, object orientation did not yet exist (at least not in Pascal). All code was procedural, and peeks, pokes and interrupt handlers were the way to communicate with hardware and the OS. But then version 5.5 of Turbo Pascal came out in 1989, which finally included object orientation! Not nearly as complicated as we have it now, but it worked! And it was brilliantly simple and beautiful, without dependency inversion containers, decorators, overloadable operators and all the other modern language features and libraries that make life overly complicated and only add to the confusion — unless you're more of a fitter than a real engineer.
After that, it was time for my studies: artificial intelligence. This was long before the time of the large neural networks and language models we have nowadays. It formed the basis for the next 20 years of my working career, at a leading supplier of OCR systems for the logistics market.
They had just moved from procedural Pascal to object-oriented Delphi, so as a software engineer I could work to my heart's content bringing structure to the code base of the pattern recognition algorithms they used to read addresses on mail pieces. It was not long before our projects became more demanding, and I was involved more and more in creating new algorithms that were ever more flexible and powerful, while still being highly performant — even when performing fuzzy search over very large data sets of addresses with very poor handwriting recognition results as input.
We introduced Scrum to work together more constructively and to be in closer contact with the customer. It brought us freedom and responsibility at the same time, which was a major boost for our productivity.
As the company grew, we also started to automate the logistics within the sorting center (such as connecting to the sorting machines and processing the images of hundreds of mail pieces per second). Using our own message bus (message buses were not yet as hot as they are today, nor were they performant enough for our needs), we managed to create a massive microservice platform that was scalable, flexible, and never let us down. Remember: this was in the era when monoliths were the only type of application we knew, and the cloud (let alone the current generation of microservices) was not even invented.
We were still using Delphi for this. The software ran as Windows services, with all data stored on the file system in embedded SQLite databases. Life was so simple compared to present-day cloud deployments. Software engineers wrote the software, ops engineers did the deployment, and it all worked!
Then came the time for my first cloud project. We built a similar system (albeit much smaller), but now "scalable" (hey, was our old system not scalable?) and "high in the cloud" (buzzword alert). Our architects came up with the most complicated design ever for a system that did little more than receive mail piece events and handle them one by one. Well, one by one… The architecture was such that all events were processed in random order, blindly overwriting the outputs of any concurrent events. As pragmatic engineers we immediately noticed that in the diagrams, but boy, what an effort to convince the architects. And what an incredible number of hours we spent creating a microservice implementation that got it all right.
Well — it was not just writing the code that took a lot of time. It was also the deployment complexity. And the mindset of some other engineers who dogmatically threw in all kinds of unnecessary design patterns. For a 30-line lambda! I was shocked to realize that we spent 20% of our time on programming and 80% on cloud annoyances — and that everybody, including architects and managers, was happy with that!
Then someone pointed me towards the game changer of my professional life: the stateful cloud. Virtual actors. Just write a stateful monolith that automatically scales like hell! After having tried the existing frameworks, all of which disappointed me because of their complexity, their incompleteness, or the way they fail to decouple domain code from the underlying infrastructure (actors being gRPC handlers instead of plain objects, for example), I decided to write one myself. One that fits my way of thinking. That is simple but powerful, unopinionated but complete, and that nicely decouples domain logic from technical details. And there she was: Darlean.
Despite successful experiments with one of our largest logistics partners, it was considered too scary to try something new. After all those years in which we had always successfully reinvented the wheel because no existing tooling was good enough for what we needed, the game was over. No more reinventing wheels; stick to existing tools because… well… because of that. Because that's the way we fit.
So it was time for me to move on. And here I am, happily working at a great company in Zwolle, the Netherlands, that does embrace my vision, that does embrace Darlean, and that is not scared to reinvent the wheel when the wheel does not yet exist. How cool is that!