Being forced to live with more HW restrictions usually results in more reliance on SW creativity and better optimizations instead of lazy developers bloating SW to fill all available resources.
Just like how it's no surprise that websites developed where everyone has the latest and greatest fully loaded Apple silicon MacBooks also suffer from a horrible lack of optimization because "it works on my machine," while being a stuttery mess everywhere else.
Websites and the like are a different world from ML training, where devs seem to be more performance-conscious. But there's a weird reliance on CUDA because devs (rightfully) don't trust the alternatives.
Yeah, seen the same with PC gaming. Minimum specs absolutely exploding for no real reason other than the fact most gamers were buying the latest top-tier cards. Then the Steam Deck came out and devs were forced to consider the fact that a 2D pixel art game shouldn't be lagging out on an SoC capable of producing stunning 3D graphics in a properly optimized game.
Well, the Steam Deck's release coincided with the popularity of reconstruction techniques. The Deck didn't "force" devs to consider optimization so much as it just gave them a low-end reconstruction target to play with. Without FSR and XeSS, there's no doubt that the Deck would be a solidly last-gen console.
Strictly speaking, a lot of games really shouldn't be playable on the Steam Deck. Baldur's Gate III and Cyberpunk 2077 are both CPU-bound before reaching 60fps and can barely keep their heads above 30fps even running at 360p internal resolution. The Deck's saving grace is that it can tap into the same dynamic resolution modes that last-gen consoles depend on for consistent framerates.