Hacker News | extinctpotato's comments

You might appreciate this: https://antmicro.github.io/myst-editor/


Yes.


These days POTS lines are usually only used for last mile communications so the calls get converted to VoIP on the telco side. Basically it's for backwards compatibility — the phone lines are already there, a lot of people have phone wiring in their houses and no configuration is required on the consumer's end.

In general the days of having direct electrical connections between two distant telephones are long gone. The telcos scrapped it when they realized that they could trunk the phone calls from a local branch to the central office as PCM streams over a single cable.
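The arithmetic behind those PCM trunks is simple; a back-of-the-envelope sketch using the standard DS0/E1 figures (these numbers are textbook values, not from the comment above):

```python
# Back-of-the-envelope maths for a digital voice trunk (standard PCM figures).
sample_rate_hz = 8_000      # voice band sampled at 8 kHz
bits_per_sample = 8         # 8-bit companded PCM (A-law / mu-law)

channel_bps = sample_rate_hz * bits_per_sample
print(channel_bps)          # 64000 -> the classic 64 kbit/s DS0 channel

e1_timeslots = 32           # an E1 frame carries 32 timeslots (30 voice + 2 overhead)
e1_bps = channel_bps * e1_timeslots
print(e1_bps)               # 2048000 -> 2.048 Mbit/s over one cable
```

So one E1 line replaces 30 separate metallic voice paths, which is why the telcos moved.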


A metallic path between two stations that aren't terminated in the same CO has been dead for a long time! I suspect nowadays you're unlikely to have a metallic path outside the frame you land on, if that, unless you're paying for dry pairs.


This is a terminal emulator within vim. I believe that they meant that a terminal emulator for Vim would be required so that it could handle ANSI escape codes, etc. This is something that a VGA graphics card couldn't handle on its own.


In the case of Poznań, it's the lecturer who fails to teach this subject in a sane way.


Do Polish universities not have an expectation that students attempt to figure stuff out on their own?

What do these students expect to happen when they enter the workforce?


No, it's a different issue. I did my BSc in Poland, then an MSc in Germany, and am now finishing a PhD in Switzerland. There is a huge difference in the quality of teaching, caused by decades of bad practices, lack of respect towards students, a conservative mindset among faculty that blocks modernization efforts, and a lack of motivation caused by high teaching loads and poor compensation. On top of that, you get the negative selection in Polish academia that results in positions being filled with mediocre and passive researchers.

Our students are not less independent or less intelligent than the Germans and the Swiss. It's our faculty and teaching staff that are often neither able nor willing to implement good teaching practices.


Can't speak for Poland but further east this extends to basic education too. The primary mode of education is memorization, teachers routinely insult pupils, and there is just a startling lack of understanding of how to enable a person to learn. They would much rather blame the student than look at their own methods.

I have often wondered what causes this huge difference in efficacy of teaching between eastern and western Europe. Past influence from the Soviets? Wealth difference? Other cultural influences?


But didn't the USSR also produce some of the best minds in arts, science and technology?


The best minds produce themselves, really.


It feels very Ruby-like. In fact, it makes me wonder why not reuse Ruby's syntax if you're making a thing like that.


Toit and Ruby are both inspired by Smalltalk, and there are certainly similarities in the syntax, but we felt that, starting 20 years later, we could do a lot better than Ruby's syntax.

In particular, Python basically won the significant-indentation argument. Since all programs are formatted with correct indentation anyway, the punctuation is redundant clutter. So for a new language it just feels right to go with indentation instead of curlies.

We also wanted to be free to add type annotations and other enhancements, so it was never going to be compatible with Ruby anyway. And we didn't want to raise expectations of Ruby-compatibility that would immediately be dashed.



Could somebody do an ELI5 on why some phones have very good cameras but for some reason there's no standalone USB version of them?


Phones have just an image sensor with a direct interface to the CPU, plus a driver and a ton of software running on the CPU to enhance quality. You can get good cameras built around modern image sensors with a USB interface, but note that they need a local controller to, well, control the sensor and provide the USB interface, firmware for that controller, and a driver or support for a standard API at the USB end. The market is tiny compared to phones, so for those reasons you can't buy a USB camera with the same low cost and high performance as what is in your smartphone.

That being said, you can buy good usb cameras based on many modern image sensors from a company like e-con[1], but you have to do research about what features are enabled by the driver.

I'm not sure why actual webcams, including a way to mount on your monitor, are so far behind and expensive. The Logitech C920 is still a common recommendation, and it's now 10 years old!

[1]https://www.e-consystems.com/See3CAM-USB-3-Camera.asp


Phone cameras are very good, but they owe much of that to the DSP and software. An iPhone camera will not produce iPhone-quality photos without the chipset and OS.


That still leaves the original question of why dedicated cameras aren’t doing this.


Probably cause Big Tech stole all the computer vision and DSP folks


GP's question wasn't that, but why you can't buy the "iPhone image processing pipeline to UVC/USB".


I think we agree but you don’t understand me.


There are cameras that do this; there are many UVC USB3 webcams with phone-grade sensors (medium quality).


“This” refers to the contribution of software processing described above, i.e. explicitly not the matter of sensor quality.


Oh, that's for economic reasons. The industrial and desktop consumer computer vision markets are orders of magnitude smaller and their development cycle times orders of magnitude longer.

I looked into this a while ago, when I worked at Google, trying to use gcam technology for scientific imaging, and there was zero interest from those teams. They were 100% focused on next-gen camera tech (and it showed: that was the period when phones got unbelievably good at taking high-quality images using computational photography).


That appears to be what the Opal C1 is doing.

https://opalcamera.com/


That looks great, except for being mac-only.

Have they given any indication about whether it'll be a standard UVC camera and Just Work on all platforms?


Also, it isn’t available to actually buy. I’m sure they’ll be ready just in time for the pandemic to end and MacBooks to get better webcams…


Software is subscription! Ridiculous!


For some reason, I just can't remember things for life.

Despite being fairly experienced and having a rather solid grasp of Python/Go/JS/etc., I still have to look up basic things like file modes (with bash syntax being especially forgettable to me).

What I'm good at, however, is keeping references in my head, i.e. I know precisely what to Google to get the exact answer that I want. This way I compensate for my bad memory.

And for the record, I'm in my 20s. Sometimes I worry that it's going to take a toll on my career in the long run, but so far it hasn't been much of a problem (apart from being a source of insecurities).


Your post reminds me of a joke punchline which was something along the lines of "an engineer doesn't actually have to remember very much, they only need to know how to efficiently find the information when required"

It's true for people working in technology due to the rapid pace of progress. As soon as we learn something it's out of date!


This is a very good example of how you can save yourself the mental hurdle of remembering how to configure something etc.

I can only suspect how much time and trial and error this must've taken. This is my main issue with IaC: the concept really lends itself to any kind of modern infra, but I'm really put off by the sheer amount of time it takes me to whip up a bunch of Ansible playbooks and helper scripts, and on top of that make sure that all of them are idempotent.

Maybe I'm doing something wrong and this should be easy?
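For what it's worth, most of the idempotency comes for free if you stick to Ansible's declarative modules rather than shell/command tasks: each task describes a desired state, so re-runs are no-ops. A minimal sketch (the host group, package name and paths here are made up for illustration):

```yaml
# site.yml -- every task states "what should be true", not "what to run"
- hosts: homelab                      # hypothetical inventory group
  become: true
  tasks:
    - name: Ensure nginx is installed
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Ensure nginx is enabled and running
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true

    - name: Drop in a vhost (changes only when the template changes)
      ansible.builtin.template:
        src: vhost.conf.j2
        dest: /etc/nginx/conf.d/vhost.conf
      notify: Reload nginx

  handlers:
    - name: Reload nginx
      ansible.builtin.service:
        name: nginx
        state: reloaded
```

The pain usually starts when something has no module and you fall back to `command:` with hand-rolled `creates:`/`changed_when:` guards.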


Pretty sure you're not doing anything wrong - well, if you are then I am too :-)

What's often overlooked (I believe) is that when you're doing this work in your day job, you've got existing infra to support your infra, along with other folks you can rely on for help.

With home infra, you first must invent the universe (apologies to Carl). Having built 3 variations of home infra (Swarm on x64/Beelink, K3s on Pi, K3s on x64/Odroid), I've gained a strong admiration for anyone who takes this on, regardless of their relative success.

What I've learnt over time is to add as little accidental complexity as possible, which I think is what you're getting at. One incarnation of the Pi K3s cluster was provisioned by Ansible (on its own standalone Pi that the cluster would netboot from). Was Ansible better than imaging the USB drives manually and running a small script on each of the nodes? Probably a wash. I did, however, learn a bit of Ansible.


Today's stack is way too complex and fragile. The entire stack relies on the goodwill of other projects to continue to be maintained and not introduce breaking changes. Even with the initial setup cost s(h)aved, there will be a continuous maintenance burden. What you gain in abstraction and automation you pay back in babysitting the additional connecting interfaces.


> The entire stack relies on the goodwill of other projects to continue to be maintained/not have breaking changes

I agree with the sentiment of your comment, but when was this ever not the case, other than the days when you built your own OS and tooling from scratch?


It’s a good habit to always use tools like terraform for cloud or Ansible/Salt/Puppet for machines instead of directly doing something.

Especially cloud setups that just run containers are relatively easy to get idempotent with terraform.
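As a sketch of what that looks like (the provider choice and resource names here are illustrative, not anything specific from this thread):

```hcl
# main.tf -- declare one container as desired state; `terraform apply`
# converges to it, and a second apply is a no-op.
terraform {
  required_providers {
    docker = {
      source = "kreuzwerker/docker"
    }
  }
}

provider "docker" {}

resource "docker_container" "web" {
  name  = "web"
  image = "nginx:stable"

  ports {
    internal = 80
    external = 8080
  }
}
```

Because the provider owns the full lifecycle, drift (someone deleting or editing the container by hand) shows up in `terraform plan` and gets corrected on the next apply.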


No. It never becomes a habit. It is torture, and you have to look up the commands every time you do it, because you do it infrequently enough not to learn them by heart.


Huh? Compared to... running commands infrequently to administer servers? IaC is strictly and comically better.


Commands you use every day in the CLI? In your editor?

I can trivially enter Emacs and modify Apache config when it breaks backwards compat between distribution upgrades, but when it's infrastructure as code...


I can’t think of a single time I’ve had an Apache config break with upgrades, other than having to recompile the webservice proxy module to change the case of the Upgrade header for a device that doesn’t follow the standards and is case-sensitive.

(Major upgrades every couple of years break the binary compatibility)


I am pretty confident it broke for me between the Ubuntu 12.04 and 14.04 releases, and I am pretty sure both were already in the 2.x series.

Most likely the change in NameVirtualHost behaviour as listed at https://httpd.apache.org/docs/2.4/upgrading.html, but there are a bunch of other changes too.
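For reference, the 2.2 → 2.4 changes are easy to see side by side: `NameVirtualHost` became a no-op, and the `Order`/`Allow` access-control directives were replaced by `Require`. A made-up vhost in both styles:

```apache
# Apache 2.2 style -- warns or breaks on 2.4
NameVirtualHost *:80
<VirtualHost *:80>
    ServerName example.com
    <Directory /var/www/example>
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>

# Apache 2.4 equivalent
<VirtualHost *:80>
    ServerName example.com
    <Directory /var/www/example>
        Require all granted
    </Directory>
</VirtualHost>
```

The old directives only keep working on 2.4 if `mod_access_compat` happens to be loaded, which is exactly the kind of thing a distro upgrade can silently change.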


Oh sure every few years when you upgrade to a new OS, but not a normal security upgrade.


You still need to install the machine by hand, or with Satellite or Anaconda.


(Repo owner here) You're not wrong; it's still not easy for me even though I do this in my day job.


Projects are making this easier. For example moving from large mono config files to using conf.d/ directories where you can drop in extra files and manage each one independently is great for IaC.

While it's challenging up front I do enjoy being able to freshly install the latest version of Fedora, run my playbook and be more or less up and running.

It feels cleaner and more reliable (at least until this week, when a broken SELinux policy made it to stable) rather than trying to upgrade packages across major release versions in place.

<hat material="tinfoil"> If I've somehow acquired secret hidden malware or someone has opened up something to come back in later that's also going to get flushed out at least once every six months.

