
When I think about the two models, I have Linux as the dictator type and XML as committee designed. Both are functional enough, and while so few data points are hardly conclusive, I think it's generally indicative.

I'm not a particular fan of XML, even if it's functional enough to get the job done.

Of course you have to find a dictator who is ready to invest all the time and energy to care for a project over a prolonged period and is actually capable of doing so without alienating the user base. That's a pretty tall order.


> I'm not a particular fan of XML, even if it's functional enough to get the job done.

XML by itself is okay-ish. The true design-by-committee disasters are the specs surrounding it: XML Signature, SOAP, etc.


Why is it a binary value? What about masochists, or people who lost a bet and want to be stabbed just a little? Or strangled?


You could put up a window that covers the bottom half of the content, that defaults to all assaults being allowed, and that also has a way to customize which assaults you would like. It shouldn't be possible to uncheck necessary assaults, or the website might not work.


And “by not work” we mean “will work exactly as it should, but little Timmy in marketing will get a frowny face and won’t go out for drinks on Friday, so you have to tick it”.


Yea, comparing legitimate with illegitimate markets is a weird kind of calculation, as the risk with the illegitimate market is ending up in jail, and few people want to calculate the monetary value of the time lost to incarceration and all the fallout that comes with it.

The more interesting question would be whether the bug bounty is enough to keep legitimate researchers engaged in investigating and documenting the threats. But...

The bug bounty itself is only a drop in the bucket for security companies, as it's (a) unsteady and (b) not enough to cover even trivial research environment costs.

Practically, it's a nice monetary and reputation bonus (for having your name associated with the detection) on top of the regular business of providing baseline security intelligence, solutions, and services to enterprises, which is what earns the regular paycheck.

Living off quests and bounties is more the realm of fantasy.


Is it actually illegal to sell an exploit to the highest bidder? Obviously deploying or using the exploit violates any number of laws.

From a speech perspective, if I discovered an exploit and wrote a paper explaining it, what law prevents me from selling that research?


(I'm not a lawyer but) I think that would involve you in the conspiracy to commit the cybercrime, if you developed the exploit and sold it to an entity that used it with wrongful intent.

https://www.law.cornell.edu/uscode/text/18/1029 gives the definition and penalties for committing fraud and/or unauthorized access, and it includes the development of such tools.

A lot of it includes the phrasing "with intent to defraud" so it may depend on whether the court can show you knew your highest bidder was going to use it in this way.

(apologies for citing US-centric law, I figured it was most relevant to the current discussion but things may vary by jurisdiction, though probably not by much)


You only risk prison if you sell it to the "bad guys" on the black market. Sell it to people who can jail the bad guys instead; that is, our governments.


Yep. The US did drop around 160,800 tons of conventional bombs on Japan during WWII, though that's still relatively tame compared to the 623,000 tons they dropped on Germany. Though the two nukes more than made up for it, I guess.

Bomb finds during construction are nothing especially rare in these countries.


The conventional bombing of Japan was scheduled for massive increase. To quote Ian Toll's "Twilight of the Gods":

> If the war had lasted any longer than it did, the scale and ferocity of the conventional bombing campaign would have risen to inconceivable new heights. [...] At the height of the bombing campaign, between May and August 1945, a monthly average of 34,402 tons of high explosive and incendiary bombs were dropped on Japan. According to USAAF chief Hap Arnold, the monthly total would have reached 100,000 tons in September 1945, and then risen steadily month by month. By early 1946, if the Japanese were still fighting, eighty USAAF combat groups would be operating against Japan, a total of about 4,000 bombers. In January 1946, they would drop 170,000 tons of bombs on Japan, surpassing in one month the cumulative tonnage actually dropped on the country during the entire Pacific War. By March 1946, the anticipated date of the CORONET landings on the Tokyo plain, the monthly bombing figure would surpass 200,000 tons.


> The conventional bombing of Japan was scheduled for massive increase.

Allegedly.

It's possible it's true, but claims like this carry the incentive of selling the "atom bombing Hiroshima and Nagasaki was necessary and justified" narrative, so that should be taken into account as a factor.

It doesn't even have to be consciously disingenuous - the more one can convince oneself (and thus eventually others) of how destructive and costly conventional warfare would have been, the more digestible the nuclear option becomes, so there's a lot of incentive to fuel some motivated reasoning.


There's no reason to doubt it. The resources that had been devoted to Europe were freed up and now could be fully focused on Japan.


Professionals talk logistics indeed. Imagine what kind of pipeline would be required to enable such a venture: producing, assembling, and shipping millions of tons of explosives as a continual operation.


'Between 1965 and 1975, the United States and its allies dropped more than 7.5 million tons of bombs on Vietnam, Laos, and Cambodia—double the amount dropped on Europe and Asia during World War II.' - https://storymaps.arcgis.com/stories/2eae918ca40a4bd7a55390b...


>Though the two nukes more than made up for it, I guess.

Not if you go by the kiloton rating of those two bombs: they were each in the kiloton range (around 10-15 kT IIRC), so if you add a generous 30,000 tons to the 160,800 you mentioned before, that's 190,800 tons, still far short of the 623,000 tons dropped on Germany.


So like, is “no unexploded ordnance detected” a checkbox/service for those “call before you dig” organizations in those places?


In some parts of France, you can’t dig without first getting a specialized surveyor inspection and a certificate that it’s safe to dig that deep in that place.


Absolutely. In my country it is mandatory to submit a UXO report as part of getting the building permit for nontrivial stuff. Most of the time this is boring office work (Was there a strategic target nearby during WWII? Are there any records of bombing happening here? Have there been earthworks in the last 70 years significant enough to rule out anything still remaining?) and you get a report noting that there's no risk expected, but sometimes you have to call in the cavalry and go searching with ground-penetrating radar.

It's just part of doing business, really. Same story with archaeological remains, chemical contamination, or threatened animal species.


Oh, the analogy does work. Every construction needs to be adjusted at times. Sure, not as often as software, but new regulations and the passing of time eat at the substance. After a couple of decades most buildings tend to need major overhaul and that's not much different than software. Even the reasons are similar (e.g. new building codes, energy efficiency standards, obsolete tech stacks - think asbestos and lead pipes). Especially if you live in an area where the cityscape needs to be preserved for historical reasons, houses behave very similarly to software - just on a different time scale.


> After a couple of decades most buildings tend to need major overhaul and that's not much different than software

Respectfully disagree. Software is like building a house, and then needing to build more rooms every month forever, and every few years having to tear it all down or completely rework the foundation.


Guess it depends on the software. I have seen enough business-critical software that was built 15 years ago, with the developer having long left the scene and nobody having any idea how it works internally (much less the skill to actually change anything).


Yea. Many places in Europe have historical cityscape protection. Buildings that were built centuries ago are being rebuilt internally all the time to fit new purposes and regulations. Not to mention extreme cases like the Kowloon Walled City, which was basically a giant interconnected amalgamation of buildings that housed 35,000 people. Nobody envisioned what that would become when it started as an imperial fort, that's for sure. There are many reasons why buildings are remodelled to fit a new purpose without that purpose even having existed when the buildings were first conceived.

PS: And even modern buildings suffer from this, like projects where the requirements change all the time. Take Ireland's new children's hospital, which should have cost just a couple of million euros and ballooned to billions. Construction projects are sometimes run exactly like software development projects, with all the fallout that comes with it. Same story with the airport in Germany (BER).


As a GNOME user, I kinda understand what they want to achieve, but they are seriously short on resources, so there is really little substance to all the rosy aspirations. They are also very opinionated, which then turns away a lot of liberal developers who just want to scratch their own itches.

As for file manager usability, I grew up with Norton Commander and pretty much gave up on ever seeing a file manager that addresses power users. It's fine for the simple office-type stuff that I bother with a few times a month on my Linux system, but that's basically it.

When I have more elaborate needs, I fall back to the plain old terminal with something like git or maybe even Midnight Commander, because that's what gets the job done.

What I find really sad is that they have like a million bindings to every programming language there is (including one that they made up) and I have no idea how they want to maintain that codebase. The basic API still looks somewhat antiquated and disjointed, but now it's in JavaScript and Vala. So even the more OCD-type developers who would accept the design language constraints are frustrated that it looks so sad under the hood.

But I mean, I get it. Building a consistent desktop environment with a clean design language is hard and especially expensive. I'm impressed by what GNOME actually manages to get done with the few resources that they have. Is it anywhere close to being consistent and complete? I don't think so.

But then again, I mostly just use the desktop environment to open Chrome and the terminal, so for me it's perfectly fine.


> What I find really sad is that they have like a million bindings to every programming language there is (including one that they made up) and I have no idea how they want to maintain that codebase.

I believe that's down to GObject Introspection (see *). From what I understand, they mostly generate the bindings from .gir files. It's actually really cool what they've pulled off with it.

* https://viruta.org/the-magic-of-gobject-introspection.html
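For a concrete sense of what those generated bindings look like from the consumer side, here's a minimal sketch in Python using PyGObject, which resolves the Gtk namespace at runtime from the installed introspection data rather than from hand-written wrapper code (the application id and window title are made up for the example):

    import gi

    # Ask for the GTK 4 namespace; the binding is produced at runtime
    # from the introspection data (compiled from the .gir description).
    gi.require_version("Gtk", "4.0")
    from gi.repository import Gtk

    def on_activate(app):
        # Gtk.ApplicationWindow and its properties come straight from
        # the introspected API, not from a manually maintained wrapper.
        win = Gtk.ApplicationWindow(application=app, title="Hello from GI")
        win.present()

    app = Gtk.Application(application_id="org.example.HelloGI")  # arbitrary id
    app.connect("activate", on_activate)
    app.run(None)

The same .gir data feeds the JavaScript (GJS) and other language bindings, which is how one introspection layer can serve so many languages without a separate hand-maintained codebase per binding.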


CADT strikes again. With so few resources they’d have been much better served refining gnome2 than bulldozing it.


Sadly, doing the work is usually just half of the effort if you want to be independent. Maintaining business contacts is the other half. You seem to have done quite a bit of coding in past part-time jobs. Do you have a record of who you have been working for? Maybe they need some updates or maintenance on the stuff that you have built in the past? Or maybe they know someone who needs something done. Personal connections are what make any business work.

Not saying you have to be a sociable person. I'm a total introvert myself, but knowing a couple of insiders who act as multipliers for these kinds of jobs sounds like it could help you, especially if you have a proven track record.


I found that ChatGPT needs to be reined in with the prompts, and then it does a very impressive job. E.g. you can create a function prototype (with input and output expectations) and, in the body, describe the logic you are thinking about as meta-code. Then tell it to write the actual code. It's also good if you want to immerse yourself in a new programming language and outline what kind of program you want, and expect the results to be different from what you thought, but insightful.

Now if you throw a larger context or more obscure interface expectations at it, it'll start to discard code and hallucinate.
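To make the prototype-plus-meta-code idea concrete, here's a hypothetical Python example (the function name and behavior are made up for illustration): you would hand the model only the signature, the docstring, and the commented logic sketch, and ask it to fill in the body; the implementation below is the kind of thing you'd expect back.

    def dedupe_events(events: list[dict], key: str = "id") -> list[dict]:
        """Return events with duplicate `key` values removed,
        keeping the first occurrence and preserving order."""
        # Meta-code handed to the model:
        # - walk the list once
        # - remember which key values were already seen
        # - skip entries whose key was seen before
        seen = set()
        result = []
        for event in events:
            if event[key] in seen:
                continue
            seen.add(event[key])
            result.append(event)
        return result

The point is that the types, the docstring, and the commented sketch constrain the model enough that the generated body tends to match what you had in mind.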


There is a reason that Esperanto or one of its siblings is not the language of global understanding and we are discussing in English. The world at large generally does not care about this kind of fancy. It's half a miracle that we agreed on whatever the Chromium engine implements as a baseline that most folks build stuff on.

