These days POTS lines are usually only used for last mile communications so the calls get converted to VoIP on the telco side. Basically it's for backwards compatibility — the phone lines are already there, a lot of people have phone wiring in their houses and no configuration is required on the consumer's end.
In general, the days of having a direct electrical connection between two distant telephones are long gone. The telcos scrapped that once they realized they could trunk calls from a local branch to the central office as PCM streams over a single cable.
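As a back-of-the-envelope illustration (the T1 figures are a standard textbook example, not something from the comment above), time-division multiplexing 24 PCM voice channels onto one line works out like this:

```python
# Classic T1 trunk arithmetic (assumed example): each voice channel is
# sampled at 8 kHz with 8-bit PCM, and 24 channels plus one framing bit
# per frame are time-division multiplexed onto a single line.
channels = 24
sample_rate_hz = 8000
bits_per_sample = 8

payload_bps = channels * sample_rate_hz * bits_per_sample  # 1,536,000
framing_bps = sample_rate_hz * 1                           # one framing bit per frame
line_rate_bps = payload_bps + framing_bps

print(line_rate_bps)  # 1544000, the familiar 1.544 Mbit/s T1 rate
```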
A metallic path between two stations not terminated in the same CO has been dead for a long time! I suspect that nowadays you're unlikely to have a metallic path outside the frame you land on, if that, unless you're paying for dry pairs.
This is a terminal emulator within vim. I believe they meant that a terminal emulator would be required for Vim to handle ANSI escape codes and the like, which is something a VGA graphics card couldn't handle on its own.
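For context, these are the kinds of in-band byte sequences at issue (a generic illustration, nothing specific from the thread):

```python
# A terminal emulator parses ANSI escape sequences like these; a raw VGA
# text buffer only has characters and attribute bytes, with no parser
# for ESC [ ... m sequences.
ESC = "\x1b"
red = ESC + "[31m"    # set foreground color to red
reset = ESC + "[0m"   # reset all attributes

line = red + "error:" + reset + " something failed"
print(repr(line))     # the raw bytes, escapes and all
```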
No, it's a different issue. I did my BSc in Poland, then an MSc in Germany, and I'm now finishing a PhD in Switzerland. There is a huge difference in the quality of teaching, caused by decades of bad practices, a lack of respect towards students, a conservative mindset among faculty that blocks modernization efforts, and a lack of motivation caused by high teaching loads and poor compensation. On top of that, you get the negative selection in Polish academia that fills positions with mediocre and passive researchers.
Our students are not less independent or less intelligent than the Germans and Swiss. It's our faculty and teaching staff who are often neither able nor willing to implement good teaching practices.
Can't speak for Poland but further east this extends to basic education too. The primary mode of education is memorization, teachers routinely insult pupils, and there is just a startling lack of understanding of how to enable a person to learn. They would much rather blame the student than look at their own methods.
I have often wondered what causes this huge difference in efficacy of teaching between eastern and western Europe. Past influence from the Soviets? Wealth difference? Other cultural influences?
Toit and Ruby are both inspired by Smalltalk, and there are certainly similarities in the syntax but we felt that starting 20 years later we could do a lot better than Ruby syntax.
In particular, Python basically won the significant-indentation argument: since all programs are formatted with correct indentation anyway, the closing punctuation is redundant clutter.
So for a new language it just feels right to go with indentation instead of curlies.
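The point can be seen in Python (used here purely as an illustration; Toit's own syntax differs): the structure below is unambiguous from indentation alone, so closing braces would carry no extra information.

```python
# Illustration in Python (not Toit): with mandatory correct indentation,
# block structure is fully determined, and closing punctuation like `}`
# or `end` would be pure clutter.
def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"

print(classify(-5), classify(0), classify(7))  # negative zero positive
```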
We also wanted to be free to add type annotations and other enhancements, so it was never going to be compatible with Ruby anyway. And we didn't want to raise expectations of Ruby-compatibility that would immediately be dashed.
Phones have just an image sensor with a direct interface to the CPU, plus a driver and a ton of software running on the CPU to enhance quality. You can get good cameras built around modern image sensors with a USB interface, but they need a local controller to, well, control the sensor and present a USB interface; that controller needs firmware, and the vendor has to provide a driver or support a standard API at the USB end. The market is tiny compared to phones, so for those reasons you can't buy a USB camera with the same low cost and high performance as what's in your smartphone.
That being said, you can buy good USB cameras based on many modern image sensors from a company like e-con[1], but you have to research which features the driver actually enables.
I'm not sure why actual webcams including a way to mount on your monitor are so far behind and expensive. Logitech C920 is still a common recommendation, and it's now 10 years old!
Phone cameras are very good, but they owe much of that to the DSP and software. An iPhone camera will not produce iPhone-quality photos without the chipset and OS.
Oh, that's for economic reasons. The industrial and desktop/consumer computer-vision markets are orders of magnitude smaller, and their development cycle times are orders of magnitude longer.
I looked into this a while ago (trying to use gcam technology for scientific imaging) when I worked at Google, and there was zero interest from those teams. They were 100% focused on next-gen camera tech (and it showed: that was the period when phones got unbelievably good at taking high-quality images using computational photography).
For some reason, I just can't remember things for the life of me.
Despite being fairly experienced and having a rather solid grasp of Python/Go/JS/etc., I still have to look up basic things like file modes (bash syntax is especially forgettable to me).
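For what it's worth, file modes are one of those look-up-every-time things for me too. A quick sketch of the octal notation in Python (the temp file here is just a scratch example):

```python
# POSIX file modes in octal (same idea as bash's chmod): one digit each
# for owner/group/other, where each bit is r=4, w=2, x=1.
import os
import stat
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

os.chmod(path, 0o644)  # rw-r--r--: owner 6 = 4+2, group/other 4 = read-only
mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))       # 0o644

os.remove(path)
```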
What I'm good at, however, is keeping references in my head, i.e. I know precisely what to Google to get the exact answer that I want. This way I compensate for my bad memory.
And for the record, I'm in my 20s. Sometimes I worry that it will take a toll on my career in the long run, but so far it hasn't been much of a problem (apart from being a source of insecurity).
Your post reminds me of a joke punchline which was something along the lines of "an engineer doesn't actually have to remember very much, they only need to know how to efficiently find the information when required"
It's true for people working in technology due to the rapid pace of progress. As soon as we learn something it's out of date!
This is a very good example of how you can save yourself the mental hurdle of remembering how to configure something etc.
I can only imagine how much time and trial and error this must've taken. This is my main issue with IaC. The concept really lends itself to any kind of modern infra; however, I'm really put off by the sheer amount of time it takes me to whip up a bunch of Ansible playbooks and helper scripts, and on top of that make sure all of them are idempotent.
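For what it's worth, most of the idempotency comes for free when you stick to state-describing modules; a minimal sketch (the host group and package names are hypothetical):

```yaml
# Hypothetical minimal playbook: it describes desired state, so
# re-running it is a no-op when the state already matches.
- hosts: webservers
  become: true
  tasks:
    - name: Ensure nginx is installed
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Ensure nginx is running and enabled at boot
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

The trouble starts with `shell`/`command` tasks and helper scripts, where idempotency is entirely on you.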
Maybe I'm doing something wrong and this should be easy?
Pretty sure you're not doing anything wrong - well, if you are then I am too :-)
What's often overlooked (I believe) is that when you're doing this work in your day job, you've got existing infra to support your infra, along with other folks you can rely on for help.
With home infra, you must first invent the universe (apologies to Carl). Having built 3 variations of home infra (Swarm on x64/Beelink, K3s on Pi, K3s on x64/Odroid), I've gained a strong admiration for anyone who takes this on, regardless of their relative success.
What I've learnt over time is to add as little accidental complexity as possible, which I think is what you're getting at. One incarnation of the Pi K3s cluster was provisioned by Ansible (on its own standalone Pi that the cluster would netboot from). Was Ansible better than imaging the USB drives manually and running a small script on each node? Probably a wash. I did, however, learn a bit of Ansible.
Today's stack is way too complex, and fragile. The entire stack relies on the goodwill of other projects to continue to be maintained/not have breaking changes. Even with the initial setup cost s(h)aved there will be a continuous maintenance burden. What you get in abstraction and automation you pay back in babysitting the additional connecting interfaces.
> The entire stack relies on the goodwill of other projects to continue to be maintained/not have breaking changes
I agree with the sentiment of your comment, but when was this ever not the case, other than the days when you built your own OS and tooling from scratch?
No. It never becomes a habit. It is torture, and you have to look up the commands every time, because you do it infrequently enough never to learn them by heart.
Commands you use every day in the CLI? In your editor?
I can trivially enter Emacs and modify Apache config when it breaks backwards compat between distribution upgrades, but when it's infrastructure as code...
I can't think of a single time I've had an Apache config break with upgrades, other than having to recompile the web-service proxy module to change the case of the Upgrade header for a device that ignores the standards and is case-sensitive.
(Major upgrades every couple of years break the binary compatibility)
Projects are making this easier. For example, moving from large monolithic config files to conf.d/ directories, where you can drop in extra files and manage each one independently, is great for IaC.
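With Apache's stock `IncludeOptional conf.d/*.conf` layout, for instance, IaC can own one small file per concern instead of templating a monolithic httpd.conf (the paths and names below are hypothetical):

```apache
# Hypothetical drop-in: /etc/httpd/conf.d/myapp.conf
# The main httpd.conf is never touched; this file is wholly owned by IaC.
<VirtualHost *:80>
    ServerName myapp.example.com
    DocumentRoot /srv/myapp
</VirtualHost>
```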
While it's challenging up front I do enjoy being able to freshly install the latest version of Fedora, run my playbook and be more or less up and running.
It feels cleaner and more reliable than trying to upgrade packages in place across major release versions (at least until this week, when a broken SELinux policy made it to stable).
<hat material="tinfoil"> If I've somehow acquired secret hidden malware, or someone has opened up something to come back through later, that's also going to get flushed out at least once every six months. </hat>