Shame I had to scroll this far to get to this comment. I have absolutely no idea why, in 2024, people still think we should let the government decide what is good/what we can see/what others can say.
Sure, but the framework cares about that for me. I don’t use rails personally but that’s the whole point — someone upstream of me is paying attention and making everything work together.
In contrast, I have work apps made in React that need regular piecemeal updating — routers, form libraries, query managers, CSS — because we’ve chosen to cobble that together ourselves. That’s fine, that’s the path we chose knowingly when we picked the tech we picked, but the point isn’t that frameworks don’t have dependencies — it’s that they take on more of the burden of managing them for you.
Well, Next is kinda like that then. It takes care of the sub-dependencies for you and when you upgrade, you just upgrade to the next major Next version (which isn't necessarily easy, but more so than upgrading 100 individual packages). They provide codemods for some stuff too.
I suspect that most Rails or Next projects add dependencies beyond just the framework. Generally the framework isn't the issue, in my experience.
Sure, but it's not an either/or situation. Every big project adds dependencies, but using Next means you have some basic, common functionality included out of the box by default/by convention (like TypeScript, linting, testing, routing, caching, SSR, static builds, serverless definitions, etc.) all done in a predefined way. Maybe your project has 200 deps, but Next would replace like 50 of the big ones that you'd otherwise have to separately install and maintain. Just having a basic page/app router and minimal state system (via contexts and RSC and props and such) reduces a lot of the headaches of the bad old React Router days.
It replaces "React soup of the day" with a more standard "recipe" shared by most Next projects – like "Grandma Vercel's secret React minestrone", I guess. But yes, projects would typically still add their own "spices" on top of those basics.
This post is so interesting because it highlights the people that don't know anything about the requirements or state of cheats/anticheat. What you're describing is 10 years out of date. Every modern cheat has a toggle, and (almost) every modern cheater masks augmented behavior with misses/native behavior.
This thread is full of armchair developers who see a problem and immediately think, "Oh, it's easy, just do this simple thing I just thought of," as if there haven't been billions of dollars and decades of research spent on this problem.
According to the latest study [1] estimating how much money cheat developers make annually, the upper limit is ~$75M. I would say a very liberal estimate puts anti-cheat efforts at maybe $100M annually, and that includes not only research but the actual cost of tackling cheats (extra compute, reviewers, etc.). Even so, to reach the billions ($2-3B) in total, gaming companies would have had to spend an average of $100M per year on research alone since the beginning of the personal-computer era. That is hard to believe even with the most liberal interpretation.
So I think it is fair to say that there haven't been billions of dollars of research spent on this problem.
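The back-of-the-envelope arithmetic above can be sketched in a few lines (the $100M/year figure and the $2-3B target are the comment's own liberal assumptions, not measured data):

```python
# Sanity check: how long would gaming companies need to sustain a
# (liberally estimated) $100M/year anti-cheat spend to accumulate
# the claimed "billions" in total?

annual_research_spend = 100e6       # assumed liberal estimate: $100M/year
target_low, target_high = 2e9, 3e9  # "billions" read as $2-3B total

years_low = target_low / annual_research_spend
years_high = target_high / annual_research_spend

print(f"Years needed at $100M/year: {years_low:.0f}-{years_high:.0f}")
# -> Years needed at $100M/year: 20-30
# i.e. that spend rate would have to hold for decades, roughly since
# the beginning of the personal-computer era.
```

Under those assumptions, the "billions" claim requires 20-30 years of sustained spending at the upper bound of the liberal estimate, which is the implausibility the comment is pointing at.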
That's only looking at western audiences. In 2020, Tencent said that the cheating market in China is worth $293M annually [1]. In China there are many individual games making billions in annual revenue. PUBG bans over a hundred thousand cheaters every week. I don't think adding up to billions is too farfetched, if you count globally over the decades, although it'd be close.
There is also the opportunity cost of what cheating prevents from happening: development would be much faster and more types of games could be made.
So you can essentially skip half your character's progression arc by entering a credit card number.
Now, you can argue that the best gear is BOP (Bind on Pickup) so this isn't a huge factor, but there's still definitely an aspect of "pay to win", since there are plenty of other things you need gold for that payment skips.
You can also argue that WoW isn't competitive, but all multiplayer games have a light competition of being ahead of others in progression, even if it's not direct competition. (I'm ignoring PvP because actual PvP is a tiny minority interest.)
This description of the pay to win properties of WoW is slightly dated. Gold buys you very little in the way of gear these days. They have de-emphasized the role of gold over time because players kept buying it.
That didn't stop players from figuring out how to pay to win, though. They now pay for "boosting" and "carry" services - other people who group up with you and then clear dungeons while you just follow along behind them and collect the loot as it drops.
There are advertisers spamming ads for these carry services all over the place inside the game even though they're against ToS. It does still have its charms but on balance WoW really has become a train wreck.
This shows up in fighting games, where DLC (i.e. paid) characters often have increasingly overpowered properties or even entirely new mechanics that the rest of the cast struggles to deal with.
That's only a third of that list. Any multiplayer game is competitive to some degree. As my previous comment specifies, it's about "pay for advantage". Some games you have to pay to unlock gear or xp boosters to make it really playable.
> Some games you have to pay to unlock gear or xp boosters to make it really playable.
Again, can you provide an example? Also
> Any multiplayer game is competitive to some degree
Is just blatantly incorrect, unless you just mean "One player is further in the game than the other", in which case literally all games are "competitive", including single player.
You can either take the route of doing financial software, data science, ML and web dev _for a game company_, or you can start making games in your free time and then attempt a pivot.
I think the first path is probably easier and more lucrative, but will take much longer to get you where you want to go.
Having had lots of friends work there, the approach seems to be "Complain about working at Amazon for literal years but never really do anything about it", followed by "Get laid off"
I don't believe they can actually do anything about it as this "culture" comes from the very top.
I remember a few years ago an Amazon worker died in the workplace and his supervisor watched him die instead of helping him because "these were the rules" (see the related HN thread[0]). You can imagine what kind of place that is.
Warehouse is very different from corporate. also "watched him die instead of helping him" is a lie. More correct would be "the supervisor walked him to medical staff instead of calling medical staff".
> also "watched him die instead of helping him" is a lie. More correct would be "the supervisor walked him to medical staff instead of calling medical staff".
The worker, who had previously asked for help and been refused, reports a stabbing pain in his chest and asks to see a doctor. He has already walked a long distance to reach his manager and cannot walk any more. The manager refuses to call a doctor and says he can walk with the worker to the doctor, but doesn't help him in any way, not even giving him a hand. So the worker tries his best, walking slower and slower and trying to catch his breath, and finally dies.
What defines harm? Or more specifically, who?