Every coder has parts of programming they love, and parts they hate. Additionally, they have a mental model of what is risky, and what isn't. Languages, by their design, will make some things easier and some things harder, and so when people "love a language," they typically mean "it maps to what gives me dopamine."
A good example is mutable state: for a bunch of people, functional languages that enforce immutability have this calming effect, since you know your data isn't mutating where you can't see it. You've been burned by C++ code that passes references around, and you don't know whether the list you received as an argument will still be the same list after you've passed it to a different function. You don't know if you can futz with that list without making a problem for someone somewhere else.
But most people much prefer how "intuitive" it is to have mutable state, where they just change the thing in front of them to be what they need. This is especially true in the context of for loops vs. recursion: "why can't I just use a for loop and increment a counter!" A lot of Golang folks love that it explicitly rejects functional mapping primitives and that "all you need is a for loop."
It's a very personal decision, and while IMO it doesn't really matter for the ultimate business success (usually companies fail because of something that's not tech-related in the least), it does shape _how_ it feels to work on a tech stack, and I'd argue, what kinds of technical problems you run into.
I'm really torn -- you and your engineers should be excited to work on your codebase. You should enter it and be like "yes, I've made good choices and this is a codebase I appreciate, and it has promise." If you have a set of storylines that make this migration appropriate, and it's still early enough in the company's life that you can even do this in 3 days, then by all means, do it! And good luck. It'll never be cheaper to do it, and you are going to be "wearing" it for your company's lifetime.
But a part of me is reading this and thinking "friend... if PostHog was able to do what they're doing on the stack you're abandoning, do you think that stack is actually going to limit your scalability in any way that matters?" Like, you have the counterexample right there! Other companies are making the "technically worse" choice but making it work.
I love coding and I recognize that human beings are made of narratives, but this feels like 3 days you could have spent on customer needs or feature dev or marketing, and instead you rolled around in the code mud for a bit. It's fine to do that every now and then, and if this were a more radical jump (e.g. a BEAM language like Elixir or Gleam, or hell, even Golang, which has that preemptive scheduler + fast compiles/binary deploys + a design built around a type system...) then I'd buy it more. And I'm not in your shoes, so it's easy to armchair quarterback. But it smells a bit like getting in your head on technical narratives that are more fun to apply your creativity to, instead of the ones your company really needs.
The author addresses that in the article. Python can scale but then developers would have to work with unintuitive async code. You can think of it as a form of tech debt - every single decision they make will take longer because they have to learn something new and double check if they're doing it the right way.
> The author addresses that in the article. Python can scale but then developers would have to work with unintuitive async code
Python didn't cause their problems, Django did. They wanted async, but chose a framework that doesn't really support it. And they weren't even running it on an async app server.
Python didn't work for them because every subsequent choice they made was wrong.
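To make the "async app server" point concrete: Django has supported async views since 3.1, but they only buy you real concurrency when the project is served over ASGI (e.g. daphne or uvicorn); under a plain WSGI setup each request still ties up a worker for its full duration. A minimal sketch, with an invented view name:

    import asyncio
    from django.http import HttpRequest, JsonResponse

    async def proxy_upstream(request: HttpRequest) -> JsonResponse:
        # While this await is pending, an ASGI server can serve other requests
        # on the same process; a sync WSGI worker just sits blocked here.
        await asyncio.sleep(0.2)  # stand-in for awaiting a slow external API
        return JsonResponse({"ok": True})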
I think you're saying the same thing that I am. Python didn't work for them because they didn't use it correctly, and so they accelerated the amount of tech debt they created. PostHog is using Django and they've scaled, so clearly they've figured something out about using Python/Django with async, but it probably isn't intuitive, because neither you nor the author knows of a good way to support it.
I’m surprised the article doesn’t make more of TypeScript.
From a technical perspective, I find both python and node.js to be pretty underwhelming. If I had to pick a shiny new thing it would probably be one of the usual suspects like Rust.
But last time I worked with Python (2022), types in python were pretty uninspiring. In 2022 typescript was already very powerful and it just keeps improving.
What's "really good"? Pydantic? Mypy with dataclasses and built-in typings? Is the integration with Django okay?
Genuinely curious, not sarcastic.
I'm coming from static typing and learning the Python ecosystem. I'm still figuring out how to make it work for me.
Pydantic is good. Mypy and pyright are good enough for type checking real projects. I run mypy as a pre-commit hook. It takes time, but it has saved me from real bugs.
The type system, coupled with pydantic for validation, is more expressive and ergonomic than Java / Go. But it's also lousy when working with people who don't have type-oriented thinking (especially juniors). You need to require people to type public signatures and enable strict linter settings.
Mixed:
Library-wise, the FastAPI ecosystem is type-first and works well. But with older ecosystems like Django I don't have first-hand experience.
SQLAlchemy seems to be getting better. But I wish something type-first similar to sqlc or Room or Micronaut Data JDBC existed for Python, where I could just use pydantic-validated DTOs and a query builder, rather than dealing with SQLAlchemy's proxy objects. It's workable though. I would suggest keeping SQLAlchemy objects in only the layer that touches the DB and converting them to pydantic models at the boundary (rough sketch below).
Library support is hit or miss. In common web dev, I get good typings as long as I stick to popular and "boring" libraries. Sometimes I have to read docstrings and use typing.cast on return types.
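A minimal sketch of that boundary pattern, assuming SQLAlchemy 2.0-style declarative models and pydantic v2; the UserRow/UserDTO names are made up for illustration:

    from pydantic import BaseModel, ConfigDict
    from sqlalchemy import String, select
    from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

    class Base(DeclarativeBase):
        pass

    class UserRow(Base):  # SQLAlchemy model: stays inside the DB layer
        __tablename__ = "users"
        id: Mapped[int] = mapped_column(primary_key=True)
        email: Mapped[str] = mapped_column(String(255))

    class UserDTO(BaseModel):  # pydantic DTO: what the rest of the app sees
        model_config = ConfigDict(from_attributes=True)
        id: int
        email: str

    def get_user(session: Session, user_id: int) -> UserDTO | None:
        # query with SQLAlchemy, but hand back a validated, detached DTO
        row = session.scalar(select(UserRow).where(UserRow.id == user_id))
        return UserDTO.model_validate(row) if row is not None else None

The upside is that nothing outside the DB layer can trigger lazy loads or mutate ORM state by accident, since it only ever sees plain pydantic objects.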
Cons:
New type-checking solutions like pyrefly aren't there yet for my use cases. Ruff is good as a linter and as a VS Code extension.
IDE extensions and mypy still miss some bugs which should never happen in typed languages, especially with coroutines. (I wish there were a way to make it an error to call a coroutine without await, unless it's submitted to asyncio.run or gather etc.; Dart has very good async linting in comparison. Toy example of the footgun after this list.)
Writing a: dict[tuple[str, str, str], int] = {} is no fun. But it guarantees that if I use the wrong type of key further down in the function, I get a red squiggle.
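For the coroutine point, a toy example with invented names: calling an async function without await just builds a coroutine object that never runs; at runtime CPython only emits a "coroutine ... was never awaited" RuntimeWarning, and how reliably your checker flags it depends on the tool and its settings.

    import asyncio

    async def save_user(name: str) -> None:
        await asyncio.sleep(0.01)  # stand-in for an async DB write

    async def main() -> None:
        save_user("alice")      # bug: coroutine created but never awaited, so it never runs
        await save_user("bob")  # fine: actually awaited

    asyncio.run(main())  # the first call only surfaces as a RuntimeWarning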
It's insane to me how making the technically worse choice is okay to you because some company out there is "making it work"
Also what could you do in 3 mere days that would pay off more than having the code in a language that the team is much more efficient with, one which doesn't need hacks to "make it work"?
It would keep saving you days on feature work forever, compared to a one-time cost of just 3 days.
Basically, you have said it: It is about what the team is knowledgeable about.
In my book Node.js doesn't belong on the server, but that's the choice they made. Python at least is thought out as a backend language, though it can also be criticized in many respects. If a team is more knowledgeable about modern languages, there are of course many technically better choices than either Node.js or Python.
I have to spend 3 days working on someone else's "narratives that are more fun to apply their creativity to" all the time, even when my intuition and experience tells me it isn't a good idea. Sometimes my intuition is wrong. I've yet to meet a product manager that isn't doing this even when they claim to have all the data in the world to support their narrative.
Personally I don't think there's anything wrong with scratching that itch, especially if it's going to make you/your team more comfortable long term. 3 days is probably not make-or-break.
>if PostHog was able to do what they're doing on the stack you're abandoning, do you think that stack is actually going to limit your scalability in any way that matters?
Also, considering the project is an AI framework, do you think the language ChatGPT is built on is a worse choice than the language we use because it's in the browser?
I was just thinking... "BugHog? The platform famously broken more often than not?"
We have a whole posthog interface layer to mask over their constant outages and slowness. (Why don't we ditch them entirely? I, too, often ask this, but the marketing people love it)
It sounds like the author and their team were more comfortable with Node.js than Python. They acknowledge FastAPI was a good alternative that could have solved their issues and allowed some code reuse, but they decided against it because they just wanted to use Node.
The gist of this blog post is that this company knew and understood Node better than Python, so they migrated to what they knew.
Similar topic explored in videos from a friend of mine I call the "Adventure Game sommelier," because he's played so many of them and can recommend you one for precisely your needs. The first
Sierra's Thexder haunted me for years, because I couldn't understand the tinny Japanese at the start of it at all or even tell what language that was, but I remembered the sound from playing it on an Apple ][ GS as a kid despite that.
And tell me if you can hear "Sierra ga ookuri suru Thexder" (Thexder, presented by Sierra) there. Even now, at most I can make out Sierra and a distorted bit of Thexder, even knowing how Thexder is rendered into katakana.
lmaoooooo buddy "oracle" implies they know something; the people brought in to fire people at Twitter spent almost no time in determining who was good or even how the company worked, completely undermining your thesis. It was madness: people were instructed to print paper copies of their code to bring into an office, like it was 1995. Remember geohot in a Spaces saying "the main problem with Twitter is that you can't run it locally?", as if any company of that size has run that way at any point in the last decade? Additionally, the horrible communication and chaos made it even harder for performers to perform. As others have pointed out, its stock lost a ton of value, it performs worse financially, and as a product by virtually every other metric has gone to hell (outages, CSAM safety, spam, bots...).
You tell a decent story at the start, but your choice of example couldn't be worse.
It's a thing tech (and tech-adjacent) people say when they rub their temples and decide to try to work out the solution to a problem from their existing ideas and biases instead of considering what anyone else has said about it.
He won the popular vote by less than Hillary won it in 2016, _when she lost the election._ Every time he's been a candidate or president, he's been the least popular of either for as long as we've polled popularity or approval. Many groups that constitute "America" (felons, especially when you consider how many people we imprison, and immigrants) are subject to US law but don't get to vote. And many institutions (DC not having senators, Puerto Rico, the Senate generally) are barely representative and barely serve the function of democracy.
So I know it makes you feel intelligent and cool to say "ah, but you see: this is what they wanted," but in every other way you can measure it besides the very narrow way you're focused on, it's as untrue as it could be.
It's not a panacea, and the way people talk about it drives me crazy. There are many different modalities, with very different levels of effectiveness for any given person. CBT is awful for me, for example, and it's the most popular modality. I also did ketamine-assisted therapy and it absolutely changed my life.
There are definitely people who won't get anything from it, but the reflexive "therapy is useless" is a weird thing to perform when it's obviously helped a ton of people.
Haha I was checking the comments precisely to see if this was the case. This happens nearly everywhere a language that isn't Java, Python, Ruby, Go, PHP, or JavaScript is used. IMO it has more to do with tech labor arbitrage[1] than anything technical. Even if a system is punching above its weight, over time, the Weird Language Choice spooks people enough that they get the rewrite bug.
Bleacher Report is a funny example: it used to be a darling example of Elixir, where a migration from Ruby -> Elixir claimed a move from "150 Ruby servers to 5 (probably overprovisioned) Elixir servers."[2] But then management and politics got scared, moved it all to more conventional tech, and the whole system suffered (see this legendary post[3]).
Fred Hebert describes a similar thing happening with a migration from Erlang deployments to Go/Docker/immutable, where you lose some pretty valuable capabilities by migrating to more conventional tech.[4]
I don't see this changing anytime soon -- we came of age when it was viable to attract investment with the promise of tech innovation. These days, unconventional tech choices are liabilities, because managers misunderstood the "Use Boring Technology" post the way consultants bastardized "Agile" (taking decent advice and twisting it into something wholly different and horrifying). The result is you've got companies with customers in the 1000s using k8s, calling it "simple" and "Boring," whereas that same company would be called amateur if they did things like stateful deploys on-prem.[5]
Love your take on it. As a fellow programming-languages enthusiast who has matured, I can safely say that you are right.
I've coded in so many languages in my life, but the job-market pull of Ruby always drags me back. With time, I've come to really love and appreciate what Ruby is.
I still find Ruby a bit more niche than other mainstream languages like Java and Python. I bet that if I had >5 years of Java, the Java market pull would be stronger than Ruby's and I'd be doing Java.