Hey, $DEITY did its absolute best with the constraints and the requirements. But hey, can't please everyone apparently. Be happy you can relieve yourself well past the intended warranty period. The parts were designed to be easily _aftermarket_ replaceable with sufficient advances in technology, retaining the fundamental design without changes.
First, taking the opportunity this discussion presents, I'd like to state for the record, AGAIN, that I have long appreciated the Win32 API and still do -- not because it's great in and of itself necessarily (it certainly has more warts than your average toad native to the Amazon), but because it de-facto worked for a long while through simple iteration (which grew warts too, admittedly), _and_ while it didn't demand that Microsoft have everything for _everyone_, it kept Win32 development stable "at the bottom", as the "assembly" layer of Windows development, which everything else was free to build on, _in peace_. Ironically -- looking at the volume of APIs and SDKs Microsoft is churning out today, by comparison, through sheer mass and velocity -- they've proven utterly unable to be sole guardians of their own operating system. There's a plethora of articles shared on Hacker News about this inability of theirs to converge on some subset of software that a Windows developer can use to just get a window or two of their own on the screen. Win32 _gave you exactly that_. Even a `CreateWindow2` export would have worked where `CreateWindow` or `CreateWindowEx` fell short, because you could count on someone who loved it more to abstract it with a _thin_ layer like wxWidgets etc. Things _worked_. Now there's internal strife between the .NET and "C++ or bust" teams at Microsoft, and downstream developers are everything between confused and irritated. This is entirely self-inflicted, Microsoft. It's also a sign of bloat -- if the company could split these groups into subsidiaries, they could compete on actual value delivered, but under the Microsoft umbrella the result is entirely different.
Second -- and this is a different point entirely -- not two weeks ago there were at least _two_ articles shared here which I read with a mix of mild amusement and sober agreement, about the _opposite_ of what the author of the article linked above advocates for -- _idiomatic_ design (usually one that's internally consistent):
What I am getting at is that this is clearly different people vocally preferring different -- _opposite_ -- user experiences. From my brief stint with graphic design, I know there's no silver bullet there either -- consistency is, on some level, in a locked-horns conflict with creativity (which in part suggests _defiance_) -- but it's just funny that we now have examples of both with the above, to which I should add:
> This is why we can't have nice things!
Also, while we "peasants" argue about which way good design should lean -- someone likes their WinAmp-like alpha-blended non-uniform windows and someone else maintains that anything not defined by the OS is sheer heresy -- the market for one or the other is kept well fueled, and another round on the carousel we all go (money happily changing hands).
For my part I wish we'd settle, as much as settling can be done. The APIs should support both, but the user should get to decide, not the developer. Which is incidentally what CSS was _ideally_ kind of supposed to give us, but we're not really there with that, and I am digressing.
The ugly truth indeed. It sucks to die for the world you won't enjoy, but sometimes it's the only viable solution. Much of our progress has been about minimising casualties and human suffering in order to sustain the world most can agree is better (than the alternatives), but it seems the period of the wave just puts the troughs farther apart -- when it hits one, it's like taking a breath before the water swallows you, and without training it's quite the panic and suffering (and prospect of death). We know it's in our bones but we want to forget, because our bodies are made to interpret pain in the most direct and literal sense -- re-conditioning is always painful too. Strong people create weak people who create strong people, etc.
So yeah _we_ will be fine, but some of us definitely won't, and with the growth in our numbers on Earth, the proportion of martyrs may be growing. Quantifying personal suffering is not possible, especially if the prospect is death.
I don't want to stir up the hornet's nest here, but in my humble opinion the entire problem rests on the unabated and unchecked modern, "late-stage" capitalism model, championed by the U.S. and since exported to, and having sprung good roots, everywhere else -- even in Europe, which as of yet has a few more checks and balances (which unsurprisingly draws a lot of ire from the model's acolytes and priests across the Atlantic).
The Soviet Union lost due to an inferior societal model, but this one too has gone too far beyond what was once a relatively sustainable path. The American dream is now a parody of itself, as it takes more and more to end up with the rest of them. I could go on about the irony of wanting to escape the pit while not wanting to acknowledge that the pit is the 99% of the U.S. -- not the Altmans, Bezoses, Musks or Trumps, or their hordes of peripheral elites.
Point being, the model doesn't work _today_ with its cancerous appetite and correspondingly absurd neglect of the human, _any_ human. We can't have both humanism and the kind of AI we're about to "enjoy".
The growth of wealth disparity may prove to be nearly geometric, as the common man is further stripped of any capacity to effect change on the "system". I hope I am wrong, but for all their crimes, anarchy, and -- in a twist of irony -- inhumane treatment of opponents, the October revolutionaries in Russia, yes the Bolsheviks, were merely a natural response to a similar atmosphere in Russia at the turn of the previous century. It's just that they didn't have mass surveillance used against them to the degree our gadgets allow the "governments" of today, nor were their adversaries aided by AI, which is _also_ something that can be used against an entire slice of the populace (a perfect application of general principles put into action). So although the situation may become similar, we're increasingly in no position to change it. The difference may be counted in _generations_, as in it will take multiple generations to dismantle the power structures we allow to be put in place now, with the Altmans etc. These people may not be evil, but history proves they only have to be short-sighted enough for evil to take root and thrive.
Sorry for the wall of text, but I do agree with the point of the blog post in a way -- demanding that people become civilised and refrain from throwing eggs (or Molotovs) at celebrities who are about to swing _entire governments_ is not seeing the forest for the trees.
There's also no precedent, in a way -- the historical cataclysms we have created ourselves have been on a smaller scale, so we're spiraling outwards, and not all of the tools we think we have are going to have the effect required to enact the change we want. In the worst case, of course.
I started using Git around 2008, if memory serves. I have made myself more than familiar with the data model and the "plumbing" layer, as they call it, but it was only a year ago -- after all those years of using Git, in retrospect -- that a realisation started dawning on me that most folks probably have a much easier time with Git than I do, _due_ to them not caring as much about how it works, _or_ because they just trust the porcelain layer and ignore how "the sausage is made". For me it was always an either-or situation -- I still don't trust the high-level switches I discover trawling Git's manpages unless I understand what the effect is on the _data_ (_my_ data). Consequently, I am very surgical with Git, treating it as a RISC processor -- most often at the cost of development velocity, for that reason.

It's started to bug me really badly, because in my latest employment I am expected to commit things throughout the day, but my way of working just doesn't align with that, it seems. I frequently switch context between features or even projects (unrelated to one another by Git), and when someone looks at me waiting for an answer as to why it takes half a day to create 5 commits, I look back at them with the same puzzled look they give me. Neither of us is satisfied. I spend most of the development time _designing_ a feature, then I implement it, and occasionally it proves to be a dead end so everything needs to be scrapped or stashed "for parts"; rinse, repeat. At the end of the road developing a feature I often end up with a bunch of unrelated changes -- especially if it's a neglected code base, which isn't out of the ordinary in my place of work, unfortunately. The unrelated changes must be dealt with, so I sit there with diff hunks trying to decide which ones to include, occasionally resorting to hunk _editing_ even. There's a lot of stashing, too. Rebasing is the least of my problems, incidentally (someone said rebasing is hard on Git users), because I know what it is supposed to do (for me), so I deal with it head on and just reduce the whole thing to a series of simpler merge conflict resolution problems.
But even with all the Git tooling under my belt, I seem to have all but concluded that Git's simplicity is its biggest strength but also not a small weakness. I wish I didn't have to account for the fact that Git stores snapshots (trees), after all -- _not_ the patches it shows, nor the differences between those snapshots. Rebasing creates copies or near-copies, and it's impossible to isolate features from the timeline their development intertwines with. Changes in Git aren't commutative, so when my human brain naively thinks I could "pick" features A, B, and C for my next release, ideally with bugfixes D, E and F too, Git just wants me to pick individual commits, except that the features and/or bugfixes may not all neatly lie along a single shared ancestral stem, so either merging is non-trivial (divergence of content compounded with time) or I solve it by assembling the tree _manually_ and using `git commit-tree`, just to not have to deal with the more esoteric merge strategies. All these things _do_ tell me there is something "beyond Git", but it's just intuition, so maybe I am just stupid (or too stupid for Git)?
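To be concrete about the `git commit-tree` escape hatch: a sketch only, with made-up branch names, assuming the working tree has already been assembled by hand into the release state I want:

```
# Stage the hand-assembled release state and turn the index into a tree object:
git add -A
TREE=$(git write-tree)

# Create a commit pointing at that tree, recording whichever parents I choose
# (here the release branch plus two feature branches), without running any merge strategy:
COMMIT=$(git commit-tree "$TREE" -p release -p feature-a -p feature-b -m "release: A + B + bugfixes")

# Move the branch ref to the new commit:
git update-ref refs/heads/release "$COMMIT"
```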
I started looking at [Pijul](https://pijul.org/) a while ago, but I feel like a weirdo who found a weird thing no one is ever going to adopt because it's, well, weird. I thought relying on a "theory of patches" was more aligned with how I imagined a VCS might represent a software project in time, but I also haven't gotten far with Pijul yet. It's just that somewhere between Git and Pijul there is the better VCS [than Git] I desire to find, and I suspect I am not the only one -- hence the point of the article, I guess.
Between the genuine weirdos, the autistic and/or the neuro-divergent, is there anyone left, really? Do the "normies" genuinely exist? Happy-go-lucky, knows a bit about everything but doesn't nerd out on anything, picks up every conversation subject and listens and holds their own in a manner that is just right? I am genuinely curious about the existence of these "superhumans".
There are many many of these socially-skilled normies. But, by virtue of being socially skilled, most have already pretty much filled up their social capacity and don't tend to show up at the kind of venues dedicated to helping under-socialized people meet up.
While there is often a "normal" (bell-curve) distribution for each individual factor, putting several of them together can be counter-intuitive: if "average" means, say, the middle third of each of three independent dimensions, only about (1/3)^3 ≈ 3.7% of people are average in all three.
> Even when considering just three dimensions, fewer than 5% of pilots were “average” in all. [1]
I would guess many/most people probably think they fall into either (1) the normal bucket or (変) the weird/fringe bucket. Either "I am pretty normal" or "I am an outsider". How many think "We're all fairly different once you cluster in any 3 interesting dimensions!"?
But people feel that dichotomy, which makes me think it is largely about perception relative to a dominant culture: the in-group versus out-group feeling. For example, atheists might feel like outsiders in many parts of the U.S., but less so in big cities and in other countries. In dense urban walkable cities (like NYC), people see diversity more directly and more often. Seeing a bunch of people is different than seeing a bunch of cars.
I think it should be fairly easy to determine if atheists really are outsiders in parts of the US or if it's just perception: just look at voting results, and church attendance for any given area. I don't think it's merely perception at all; visit any rural area and you'll likely see a surprising number of churches relative to the population.
Also, seeing people walking around in public doesn't tell you anything about their religious beliefs unless they're in some sect where they make it obvious with their clothing or hairstyle.
"Just"? How would you build a predictive model that inferred aggregate individual qualities such as
"% atheists" based on voting results? That would be a rather indirect and distorted path for estimation. There are better ways.
It's not a great way, admittedly, but there is a very high correlation between Republican voters and religiosity. Very high turnout for Republican candidates, plus lots of active churches in an economically poor area, is I think a reliable indication that atheism in that area is low.
It's a Japanese word for "weird". I'm guessing that OP is a bit of an Otaku (aka "obsessed with Japan") -- which is either ironic or completely appropriate.
My first thought was that they were an LLM, but then checking their profile it seems they've been around since 2012 and have a comment expressing that they seem to get accused of being an LLM a lot, and suggesting people don't do that.
> Quite soon these accusations will nearly always be accurate.
/headscratching They don't have to be, do they? It is possible that some people will build identity systems with norms that e.g. humans type with their own hands. These could become popular, at least conceivably, in certain areas. Hard to enforce for sure. And getting harder and harder to distinguish reliably.
The "normie" doesn't really exist. Everyone is kind of weird in some aspect, which might not be obvious on a surface level.
But having gone to a bunch of programming meetups, the majority of people are perfectly pleasant and good to socialise with. The weirdos are usually non-tech people who have an app or crypto idea they want help with. Or just total crazy people who showed up to the first event they could find, regardless of topic.
The hero image on the linked page, which consists of a muted teal background with the words "Introducing Muse Spark", weighs in at 3.5 MB. I don't even...
"Please don't complain about tangential annoyances—e.g. article or website formats, name collisions, or back-button breakage. They're too common to be interesting."
Maybe they did get their models to test their pages, but they didn't tell their models to pretend that they're browsing on mobile using a 3G connection.
Good catch - looks like it's a PNG image, with an alpha channel for the rounded corners, and a subtle gradient in the background. The gradient is rendered with dithering, to prevent colour banding. The dither pattern is random, which introduces lots of noise. Since noise can't be losslessly compressed, the PNG is an enormous 6.2 bits per pixel.
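If anyone wants to see the effect for themselves, here's a rough illustration with ImageMagick (not the actual Meta asset; colours, dimensions and file names are made up, and it assumes ImageMagick 7's `magick` command -- use `convert` on version 6). The smooth gradient compresses to almost nothing, while the same gradient with a little per-pixel noise balloons:

```
# Smooth two-colour gradient: PNG's filtering + deflate handle this very well.
magick -size 1200x630 gradient:'#0e3b43'-'#2a9d8f' smooth.png

# The "same" image with faint random noise sprinkled on top (a stand-in for
# random dithering): every pixel differs from its neighbours, so it barely compresses.
magick smooth.png -attenuate 0.2 +noise Gaussian noisy.png

ls -lh smooth.png noisy.png
```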
While working on a web-based graphics editor, I've noticed that users upload a lot of PNG assets with this problem. I've never tracked down the cause... is there a popular raster image editor which recently switched to dithered rendering of gradients?
My reasoning is that once upon a time I was using Macromedia Fireworks, and PNGs gave far, far better results than JPGs did at the time, at least in terms of output quality. Almost certainly because I didn't understand JPG compression, but for web work in the mid-2000s PNGs became my favourite. Not to mention proper alpha channels!
I am simply offended. By Meta's lack of sensibilities (or ability) towards use of images on the Web while touting their new flavour of artificial intelligence as a product.
> But I suspect that only creatures that have hopes and dreams and fears similar to our own would actually have much impact on human loneliness. And finding that sort of creature may be a long shot indeed.
I've been saying this for many years, and have had the same suspicion for longer -- first of all, for most people beyond (with good reason) myopic zoologists or biologists, it's not about "alien" life -- there's plenty of that, arguably, in the Mariana Trench etc., and no one's noticing, because it's an answer to the wrong question. Extrapolating, it's not even about _extra-terrestrial_ life necessarily -- not for some of us. Finding living bacteria of non-terrestrial origin on Mars would be amazing and an epic discovery, but it's still an answer to the wrong question -- our search is, deep down, motivated by a desire to communicate, to ask as if into our own mirror: "what is going on?", "why do we exist?", "have you guys figured it out?", and last but not least -- "we are excited to meet you; for all our numbers we've been feeling lonely with so much space, thinking we were alone". Many a sci-fi author expresses the question much better, because that's the one that matters.
But to placate the level-headed empiricists -- yes, discovering bacteria or an alien jellyfish in the interstellar void is of course scientifically a big thing. But I suspect we are just being cautious, not wanting to admit that in discovering these we just want to get _more_ excited about the possibilities such a discovery opens up -- that we _will_ meet sentient, intelligent beings who will at the very least understand us (with due effort), sort of like the extended family we suspect we have and always wanted to meet, but the meeting keeps getting postponed.
For all his infamy, Jobs held Apple together in large part through his uncompromising perfectionism and attention to the kind of details that have since been demoted to "we'll fix it in the next version" or the equivalent of "# temporary". Every company is a bit of an ant-farm, but this one either has no single queen to lay down the law, or the queen is "trying things out" :P
Jobs used to laugh at Microsoft for all manner of inconsistencies in behaviour and user experience with Windows, but now Apple is contending with the same problem, in part due to exposure, as macOS has never been so popular and prevalent, and there is now an ever-growing number of eyes calling them out for the inconsistencies that have been appearing more and more frequently without Jobs' style of leadership.
I see your point, but I don't think it was Jobs per se who held Apple together. Tim Cook is doing that as well, and arguably at a far larger scale.
The one thing that has distinguished Jobs from the rest ever since is the fact that he was Apple's greatest fanboy. If you have a look at the iTunes introduction, Jobs sits there and for around two hours showcases every feature and function. He was so into the product that this keynote is, for me, the nerdiest he ever conducted.
The other keynotes likewise always show him as the company's No. 1 fan and the host of every feature there is.
Imagine having a boss like this. He set the standard for product development in every regard.
And this is what has slipped. Consistency is lacking, and according to biographies about Cook, he has a very strong focus on himself as a person. This is always wrong. It is about the product, nothing else.
There will never be a Jobs again. And it is getting worse from here: the old guard is mostly gone. Even the myth of Steve Jobs is not something Gen Z cares about.
We live in the post-Jobs phase, and Cook seems to be overshadowing Jobs, as sad as that is. All the innovations except the headphones date back to Jobs; all the scale Apple has reached is down to Cook.
I bet Jobs would rather have had a far smaller scale with great products. This luxury lifestyle is not something Jobs liked.
You have to realise every single UI up to that point was solid white or grey and unable to access alpha channels. And the fact that they expanded upon this design "language" with the transparent iMac cases made it all cohesive and Y2K hip.
Having used `jq` and `yq` (which followed from the former, in spirit), I have never had reason to complain about the performance of the _latter_, which is an order of magnitude (or several) _slower_ than the former. So if there's something faster than `jq`, it's laudable that the author of the faster tool accomplished such a goal, but in the broader context I'd say the performance benefit would be required by a niche slice of the userbase. People who analyse JSON-formatted logs, perhaps? Then again, newline-delimited JSON reigns supreme in that particular kind of scenario, making the point of a faster `jq` moot again.
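To illustrate the ND-JSON case I mean (field and file names are invented): with one JSON object per line, `jq` streams through the file and the per-record work is trivial, so raw speed is rarely the bottleneck:

```
# Filter newline-delimited JSON logs: keep errors, emit compact one-line objects.
jq -c 'select(.level == "error") | {ts, msg}' app.ndjson > errors.ndjson
```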
However, as someone who has always loved faster software and is a bit of an optimisation nerd -- hats off!
When integrating with server software, the performance is nice to have, as you can have, say, 100 kRPS of requests coming in that need some jq-like logic. For a CLI tool, like you said, the performance of any of them is OK for most cases.
Indeed, thanks for spotting that, as I myself remember discovering there are at least two. Thing is, I had learned and started with Mike Farah's `yq`, not the pass-through-to-`jq` variant written in Python that's often more easily available (read: via the system package manager). Both semantics and syntax are a bit different between the two.
A bit of a fun fact: there's a quote by Farah where he said that the language and semantics of the tool he was writing didn't really "click" until he was well into writing it :-) I myself have on occasion been pulling my hair out trying to wield `yq`'s language; there are some inconsistencies here and there which I think are related to the novel nature of the language (not novel to everyone, but it's uncommon even for those well versed with e.g. SQL). `jq` suffers from similar woes, but to a lesser degree.
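For anyone else bitten by the name collision, a minimal sketch of the two tools side by side (the file name and path are made up):

```
# Mike Farah's Go yq: its own jq-like expression language, evaluates YAML natively.
yq '.services[].image' docker-compose.yml

# kislyuk's Python yq: transcodes YAML to JSON and hands the expression to the real jq.
yq -r '.services[].image' docker-compose.yml
```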