Hacker News | masklinn's comments

> well, sudo-rs had a few privilege escalation CVEs recently. So there has been some recent evidence in favor of the stability argument.

It would probably be a much stronger argument if sudo hadn’t also had a few privilege escalation CVEs recently.


> Go won’t put large allocations on the stack even if escape analysis would permit it

Depends what you mean by “large”. As of 1.24, Go will put slices of up to 64 KiB on the stack:

    make([]byte, 65536)

goes on the stack if it does not escape (you can see Go request a large stack frame), while

    make([]byte, 65537)

goes on the heap (Go calls runtime.makeslice).

Interestingly, arrays have a different limit: they respect MaxStackVarSize, which was lowered from 10 MB to 128 KB in 1.24.

If you use indexed slice literals, gc does not even check, and you can create megabyte-sized slices on the stack.
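The cutoff above is easy to check empirically. A minimal sketch (function names are mine); build it with `go build -gcflags=-m` to see the escape-analysis decisions, assuming the 1.24 behavior described above:

```go
package main

import "fmt"

// onStack allocates exactly 64 KiB; as of Go 1.24 this is eligible for
// the stack when it does not escape.
func onStack() int {
	b := make([]byte, 65536)
	return len(b)
}

// onHeap is one byte over the limit, so the compiler emits a call to
// runtime.makeslice instead.
func onHeap() int {
	b := make([]byte, 65537)
	return len(b)
}

func main() {
	fmt.Println(onStack(), onHeap())
}
```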


There is an option, -smallframes, that seems intended for conservative use cases. Below are the related configs and a test of the point at which each allocation escapes (+1).

  // -smallframes
  // ir.MaxStackVarSize = 64 * 1024
  // ir.MaxImplicitStackVarSize = 16 * 1024
  a := [64 * 1024 +1]byte{}
  b := make([]byte, 0, 16 * 1024 +1)
  // default
  // MaxStackVarSize = int64(128 * 1024)
  // MaxImplicitStackVarSize = int64(64 * 1024)
  c := [128 * 1024 +1]byte{}
  d := make([]byte, 0, 64 * 1024 +1)
Not sure how to verify this, but the assumption you can allocate megabytes on the stack seems wrong. The output of the escape analysis for arrays is different than for the make statement:

  test/test.go:36:2: moved to heap: c
Maybe an oversight, because it is a bit sneaky?

> Not sure how to verify this, but the assumption you can allocate megabytes on the stack seems wrong.

    []byte{N: 0}

doesn't make sense.

And yet it does: https://godbolt.org/z/h9GW5v3YK

And creates an on-stack slice whose size is only limited by Go's 1GB limit on individual stack frames: https://godbolt.org/z/rKzo8jre6 https://godbolt.org/z/don99e9cn


Yea with more context it suddenly makes sense :p

Interesting, [...] syntax works here as expected. So escape analysis simply doesn't look at the element list.
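For the record, the indexed-literal form under discussion is ordinary Go: the largest index implies the size of the backing array. A minimal sketch (the function name is mine):

```go
package main

import "fmt"

// bigLiteralLen builds a slice via an indexed literal; the index 1<<20
// implies a backing array of 1<<20 + 1 elements (just over 1 MiB).
// Per the thread, gc does not size-check this form, so if it does not
// escape it can land on the stack.
func bigLiteralLen() int {
	s := []byte{1 << 20: 0} // stays local to this function
	return len(s)
}

func main() {
	fmt.Println(bigLiteralLen())
}
```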


A copy of what? It’s returning a pointer, so i has to be on the heap[0].

gc could create i on the stack then copy it to the heap, but if you plug that code into godbolt you can see that it is not that dumb: it creates a heap allocation, then writes the literal directly into it.

[0] unless Foo is inlined and the result does not escape the caller’s frame, then that can be done away with.
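A sketch of the shape being discussed (the names are illustrative, not the actual code from the thread): returning a pointer to a local means the local must outlive the frame, so escape analysis moves it to the heap, and `go build -gcflags=-m` reports "moved to heap: i".

```go
package main

import "fmt"

// Foo returns a pointer to its local, so i escapes and is
// heap-allocated (unless Foo is inlined and the result does not
// escape the caller's frame).
func Foo() *int {
	i := 42
	return &i
}

func main() {
	fmt.Println(*Foo())
}
```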


That’s the one.

Since 1.17 it’s not impossible for escape analysis to come into play for slices but afaik that is only a consideration for slices with a statically known size under 64KiB.


That breaks existing usages of innerHTML, which may legitimately need its more dangerous features.

It seems obvious enough that parent is talking about changing the behavior of innerHTML within their own application, not for browser makers to change the implementation. It's unfair to take the most uncharitable interpretation and upbraid the other commenter for being insufficiently defensive[1] when they pressed "reply".

1. <https://pchiusano.github.io/2014-10-11/defensive-writing.htm...>


> It seems obvious

Doesn't seem obvious unless you're Dutch.

Especially as the first thing I would think obvious is: if breaking the behaviour of innerHTML is not a concern for your software, why keep it at all? Delete the property or make it read-only.


> Doesn't seem obvious unless you're Dutch.

I don't know what that means.

> if breaking the behaviour of innerHTML is not a concern for your software why keep it at all?

For the reason that they said.


> Doesn't seem obvious unless you're Dutch.

I believe this is a reference to the old "Zen of Python," which one gets by running `import this` in the Python REPL.

"""

There should be one-- and preferably only one --obvious way to do it.

Although that way may not be obvious at first unless you're Dutch.

"""


Maybe I should have included the word “monkeypatch” in the comment.

> Are there any examples where the first approach (sanitize to string and set inner html) is actually dangerous?

The term to look for is “mutation xss” (or mxss).


> The only downside of CDs was that you couldn’t record from the radio and Napster eventually solved that better than radio ever did.

This was far from the only drawback of CDs, especially early on, at least in mobile applications: the media (and thus the player) is bulky, cases are fragile (in part through increased leverage), it has low resilience to physical damage, and before memory prices dropped low enough for significant buffering, the slightest g-forces would lead to skips.

MDs were real progress on that front. Shame they were quite expensive and the digital models were hobbled by horrendous software. And obviously flash-based PMPs, then smartphones, ate their lunch entirely.


I remember my first “portable” was so bulky it came with its own carry case like a handbag.

You had to step very lightly when using it as it was just itching to skip.

It would also eat through batteries like no one’s business.


MDs are just another example of Sony screwing it up by making things proprietary and keeping them to themselves instead of creating an ecosystem (memory sticks were another example, although they didn't offer quite the same advantages). It's really a shame; I think if Sony had gone about this differently they likely would have put off the emergence of MP3 players for a long time.

It’s a funny comment because those formats only existed to be proprietary. Sony learnt the wrong lessons from the CD, which they co-developed with Philips. They saw the success of that format and wished they were getting royalties on the underlying tech.

They then wasted billions and decades on formats other companies wouldn’t touch because they had fees attached. Minidisc being a prime example: sounded worse than CDs, cost the same, and had a recording feature people already had with cassettes.


MD was quite convenient for recording (interviews, ambient ...) and, with random access, much better than cassettes.

And it could have been the successor to the floppy.

There were already plenty of “successors to the floppy” in the dustbin of history (floptical, Iomega zip, LS120, …). None of them was competitive as a distribution format, or at all once CD-R became widely available.

Yeah, and the MiniDisc was the only one that could have come close. Sony already had computer MiniDisc readers/writers, mass production with pre-recorded content, (fairly) large volumes.

They just never connected these things to each other. It could have been a great standard and we would have been plagued to this day with them. :)

In some ways it's even better than USB flash. There are no read-only flash drives, for instance. It's also a problem that you mosh "data" in the same port you mosh "keyboard" or "spy device". We gained a lot with the USB paradigm but we lost some things, too.


MiniDisk! I loved that format. Great physical size. I suspect my love is all about nostalgia for the future, because when they came out they were foreign (at least in the US) and fly.

After using minidisk I was sure that LS120 would succeed. The formats of cartridged optical disks mostly removed the annoyance of scratched disks. Now the only place I see optical disks in a cartridge is at the library where they put some CDs in a cartridge to use in a special drive.

I was in college during the time, but I remember all of these digital art students had iMacs and these clear+blue FireWire zip drives they used to carry around between classes and home.


In principle, maybe, but Zip disks were error-prone, didn't store music for portable players, and were rather large and cumbersome. Minidiscs were even smaller than floppies and more robust.

> has low resilience to physical damage

No it doesn't. As a child, one time I tried to make a CD unplayable and literally couldn't do it. (Sandpaper didn't do the trick.)

The real issue was the skipping when you tried to use a portable CD player.


> No it doesn't.

Yes it does.

> As a child, one time I tried to make a CD unplayable and literally couldn't do it. (Sandpaper didn't do the trick.)

Either child you was incompetent or your player was very good at error recovery, because I personally saw a number of car CDs thrown out as the car’s stereo was unable to read them anymore.


you were probably scraping the thick transparent side, not the side with the label? the data is immediately under the label. the clear side can be surprisingly scraped up and still read properly, though I'm not sure how!! I have some CDs that I thought were ruined because of how scratched up the underside is, and they play just fine. Pretty sweet!

Then I have one or two where the label side got a scratch taken out of it, and indeed, you can see right through the disc at those points - unrecoverable damage. Conversely a scratched up underside can simply be buffed/polished smooth and the disc will read good as new.

I actually have one disc that cracked in half (a singular crack from the center to the outside edge, not spanning the total diameter of the disc)... and it actually plays without any skips (though surely depending on quality of the player and its resilience to read errors). I couldn't believe it at the time. A single piece of masking tape to hold the edge together was a sufficient "repair".

I worked in a CD foundry in the early 1990s. Scratches that were not tangential (perpendicular to the radius) were irrelevant, as the basic CD encoding scheme provided something like (IIRC) 30+ bytes of data parity protection. If the scratch width along the track wasn't longer than that, it didn't exist.

If it did exist, some toothpaste rubbed tangentially around the CD on your fingertips was often enough to buff it out, at least as far as the 30-byte limit cared.

It was a phenomenal jump in data integrity, built in at the recording level. Sure, you could encode even floppies with that scheme... but your computer didn't, natively.


The CD pickup detects changes in the reflected light due to the reflective pits. As long as the scratches are significantly bigger than the pits, they will create a lower-frequency attenuation of the reflected light which won't affect the high-frequency signal coming off the pits. You will get occasional errors when crossing into and out of a scratch, but that's just a few samples; likely those won't even make it through the speakers. I have not tried, but I imagine a very fine sandpaper could create scratches at a high enough frequency to interfere with the pickup.

But the label side is indeed very fragile, as you can easily damage the reflective pits, which are covered only by a layer of paint. It's the same as a simple mirror, where the thin layer of reflective metal is very well protected from the front but is only covered with paint on the back.


They must’ve had a really robust kind of CD wherever you lived, then. Like everyone else, I wore out a lot of discs simply by storing them outside their case.

Do you mean the OG audio CDs made at the audio CD factory, or those newfangled CD-Rs?

Both, until I discovered the toothpaste-buffing trick.

Did that work? I've heard everything already, from it being a wonder solution to it destroying the discs even further (if I had to guess, they used the kind of toothpaste with little stones in it?)

CD goes in the microwave

Even outside of such cases, if contention is low and the critical section is short, spinning a few rounds to avoid a syscall is likely to be a win not just in latency but also in wasted cycles.

> I don't think Safari mattered much.

Apple was the first to publicly call out native plugins (Jobs did so on stage) and outright refused to support them on iOS, then everyone else followed suit.


>then everyone else followed suit

NPAPI's death in non-IE browsers started around 2015. Jobs announcing mobile Safari without Flash was 2010. Unfortunately, ActiveX still works to this very day.

Chrome built up a whole new PPAPI to support reasonably fast Flash after the Jobs announcement. Microsoft launched a major release of Silverlight long after Jobs' speech, but Silverlight (rightfully) died with Windows Phone, for which it was the main UI platform at the time of its practical death. Had Microsoft managed to launch a decent mobile operating system, we'd probably still be running Silverlight in some fashion today. Even so, Silverlight lasted until 2021 before it actually fell out of support.

Jobs may have had a hand in the death of Flash websites, but when it came to Java Applets/Silverlight, the decision had little impact. That plugin model was slowly dying on its own already.


> then everyone else followed suit

There was a Flash runtime on Android. It was terrible. Java applets were already dead anyway, outside of professional contexts, which are not relevant on phones anyway.


Yes, I’m sure Jobs went on stage calling out Flash as the main source of Safari crashes because of Microsoft’s Java plugin.
