You don’t need to do anything complicated. Take a couple screenshots and crop to a slice where the rings are expanding horizontally. There is your barcode.
The article reads like a naïve person coming to terms with the idea of consciousness… like “whoa maaan, my fingers… they’re finging!”
“Extra-cerebral” neurons are optimized for different functions than neurons proper in the brain.
It’s unlikely that the gut has thoughts and feelings, given that its neuronal tissue is distributed throughout the viscera (rather than concentrated in one spot like the brain). The neurons are distributed so that smooth muscle can contract appropriately and push food and waste down the line.
The author compares the number of GI neurons to the number of neurons in a dog’s brain, but glosses over the number of neurons in a dog’s GI tract, which is probably similar, or proportionally smaller because the tract itself is physically smaller.
The neurons the author highlights in the heart are concentrated at the base, because the shape of the heart is optimized for coordinated contraction via electrical impulse propagation.
The Idea Factory is a worthwhile read. One of the principles AT&T operated on during its monopoly years was providing a “gold-plated” premium service.
My understanding was that MCI rolled in and set up a dollar-store version of Long Lines, with sound quality to match, and used that as the basis to challenge AT&T’s monopoly and steamroll shitty service everywhere.
The cream-skimming that happened with profit-first MCI resulted in the loss of a resilience mindset and long-term planning for a national network.
I've gotten sucked into 1950s/60s/70s Pan Am (the de facto U.S. flag carrier of the era) advertising videos, and the concept is similar to AT&T: premium service, basically a monopoly.
Problem is: prices were REALLY high.
Competition worked out in Part 121 airlines and telco, in the long run.
Fun fact for the young: Sprint (long before being bought by T-Mobile) was primarily a long-distance company, and they advertised that the sound quality was "so good you could hear a pin drop". Many ads featured this bouncing pin (e.g., https://www.youtube.com/watch?v=Z-cbzf9amfo from 1986).
The logo they used until just before the buyout was a stylized image of a pin falling down.
You have to painfully force quit Safari by going through the motions of the gestures, wait for the phone to respond, and then quit Safari from the card task switcher.
I get the sense the Arch wiki pages have more detail than the man pages themselves.
The wiki captures knowledge that the developers of said apps assume to be common, but that doesn’t actually make sense unless you’ve been bootstrapped into the paradigm.
Most man pages are written for someone who knows pretty precisely what they want to do, but doesn't recall which knobs to turn in the program to get that done. The Arch wiki instead focuses on someone who has a vague idea of what tools to use but doesn't know how those tools operate.
I've found that with an intermediate understanding, the Arch wiki is so much better that I often won't even check the man pages. But on the occasions where I know the tool pretty well, the wiki can be quite spotty, especially when it's a tool that's weird or niche among Arch users. So, depending on how you define "more detail", that might be an illusion.
Man pages were always intended to be concise reference material, not tutorials or full docs. More akin to commented header files or a program's --help output, before the latter became common.
(GNU info tried to be a more comprehensive CLI documentation system but never fully caught on.)
Anecdotally, the Arch wiki expands on the vague man pages, often with examples for cases people actually hit. It's also much more accessible to modify, with the instant gratification of publishing changes immediately; getting a fix into a project's upstream man pages means waiting for it to trickle down.
Arch wiki is far better than most man pages. I've referred to Arch for my own non-Arch systems and when building Yocto systems. Most Arch info applies.
In the ancient days I used TLDP to learn about Linux stuff. Arch wiki is now the best doc. The actual shipped documentation on most Linux stuff is usually terrible.
GNU coreutils have man pages that are correct and list all the flags at least, but suffer from GNU jargonisms and usually a lack of any concise overview or example sections. Most man pages are a very short description of what the program does, and an alphabetic list of flags. For something as versatile and important as dd the description reads only "Copy a file, converting and formatting according to the operands" and there's not even one example of a full dd command given. Yes, you can figure it out from the man page, but it's like an 80s reference, not good documentation.
man pages for util-linux are my go-to example for bad documentation. Dense, require a lot of implicit knowledge of concepts, make references to 90s or 80s technology that are now neither relevant nor understandable to most users.
Plenty of other projects have typical documentation written by engineers for other engineers who already know this. man pipewire leaves you completely in the dark as to what the thing even does.
Credit to systemd, that documentation is actually comprehensive and useful.
It was the accessibility. You learned computing concepts from scratch, and, if you were actively engaged, they increased in complexity in real time as your learning caught up.
- Pre-LED Jumbotrons used CRT pixels called "Trinilite" elements. This was a proprietary Sony technology in which each sub-pixel "cell" was a miniaturized CRT assembly, each resolving a single dot.
- A "maximum" NTSC configuration 40 units wide would result in a horizontal resolution of just 640 dots.
- The display needed a calibration using a “Screen Alignment Unit” (the JME-SA200). This unit used a remote modem chain involving a "cellular phone" and "digital data card." This means that Jumbotron techs could dial in over 1998-era mobile networks to geometrically align a stadium-sized wall of vacuum tubes as they sat in the middle of said stadium.
I also found the format of the manual interesting, because it follows the same style of consumer-grade Sony devices from that period.
I believe these were supported from 2001-2011, which makes it amusing to think of some guy sitting in Times Square adjusting a Jumbotron and changing the inputs from the sidewalk.
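The quoted resolution figure is easy to sanity-check. A minimal sketch, assuming (this is my inference, not something stated in the manual) that each module is 16 Trinilite cells wide, which is what the 40-module / 640-dot configuration implies:

```python
# Back-of-envelope check on the JumboTron resolution figure above.
# ASSUMPTION: 16 Trinilite cells horizontally per module -- inferred
# from the quoted "40 units wide = 640 dots" configuration.
CELLS_PER_MODULE_H = 16
modules_wide = 40

horizontal_dots = modules_wide * CELLS_PER_MODULE_H
print(horizontal_dots)  # 640
```

For comparison, broadcast NTSC is usually digitized at roughly 704-720 active samples per line, so even the "maximum" wall undersampled the source slightly.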
Wait, is that available? The best clock would be UTC + decimal time anyway; metric time is a good name for that.
Decimal time: you divide the day into powers of ten. A 'deci' is 2.4 hours, a 'centi' is 14.4 minutes ~= 15 minutes, a 'milli' is 1.44 minutes ~= 86 seconds, and so on.
Great system with convenient lengths, and easy to add duration + date, and convert between different units.
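The conversions above are just multiplication by powers of ten. A minimal sketch (unit names like "deciday" follow the comment above; the helper function is mine, not part of any standard):

```python
# Decimal time: fractions of a day, converted back to h/m/s for comparison.
SECONDS_PER_DAY = 86_400  # standard SI seconds in one day

def decimal_to_hms(days: float) -> tuple[int, int, float]:
    """Convert a fraction of a day (e.g. 0.1 = one deciday) to (h, m, s)."""
    total_seconds = days * SECONDS_PER_DAY
    h, rem = divmod(total_seconds, 3600)
    m, s = divmod(rem, 60)
    return int(h), int(m), round(s, 6)

print(decimal_to_hms(0.1))    # deciday  -> (2, 24, 0.0)
print(decimal_to_hms(0.01))   # centiday -> (0, 14, 24.0)
print(decimal_to_hms(0.001))  # milliday -> (0, 1, 26.4)
```

Adding durations really is easier in this scheme: 3 decidays + 4 decidays is just 0.7 of a day, with no base-60 carrying.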
While the deciday is a workable hour replacement and the milliday a decent minute replacement, metric time would be missing a decent second replacement. The next named prefix, the microday, is 86.4 ms, which is very small for humans.
Of course it is not an insurmountable problem, but for a lot of other units milli- tends to represent the lower end of "human scale": millimeters, milligrams, milliliters, milliwatts, etc. are about as small as most people will ever use in day-to-day life.
In some ways it would be more convenient if the base unit were e.g. 1e-4 days (or some other power of 10), so a day would be 10 kilo-units. Understandably that would be a lot less elegant, but more practical. Deci- would then be 0.864 seconds, which is kinda nice.
I think what likely happens here is that everyone just starts calling 10^(-5) days (= 0.864 current seconds) a "second". (And then there's confusion over which second people mean, analogous to what happens now with tons.)