Hacker News

I'm sorry, but the need for parallelism is exaggerated; it's mostly a fad. It greatly complicates language design and debugging, so it should be added and used with care. Systems software (OS, networking, database engines, etc.) does indeed need good support for it, but the majority of domain applications only need basic support for parallelism (unless you are doing something silly, like reinventing a database). If language designers don't learn to say no to fads, the language becomes a bloated mess that only a mother can love. Let other suckers test fads for viability, and only add features that are relevant and time-tested for the target audience.


Parallelism is certainly not a fad. We have such different conceptions of the reality around us that it's not even worth arguing with you (in particular, after seeing your other comment about the Evils of Lambdas). I'll happily play your role of the "sucker."

Moreover, you missed my point, unless you're genuinely making an argument for radical ludditism.


He said it's mostly a fad, not entirely. Obviously parallelism is frequently very useful.

I think what he was getting at is that the idea of languages integrating parallelism so that almost all code would be parallel all the time has proven to be a dead end. The functional language research world promised for a long time that a main benefit of FP would be automatic parallelism (no mutable state, you see). But it never really happened.

And in the more mainstream world, Java 8 invested heavily in parallel streams, but I've never seen an actual parallel stream in real code. I see threads all the time, despite everyone agreeing how awful they are. Parallel streams? Auto-parallelised FP code? No, it doesn't happen. The granularity of parallelism is too fine for programmers to think about. Most datasets are too small and most loops too unimportant to risk a bug creeping in through parallelism, however convenient the syntax is.
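For readers who haven't seen one, the Java 8 parallel streams mentioned above really are a one-method-call affair, which is part of why their near-absence from real code is striking. A minimal sketch (the class name is mine):

```java
import java.util.stream.LongStream;

public class ParallelStreamSketch {
    public static void main(String[] args) {
        // Sequential sum of 1..1000.
        long seq = LongStream.rangeClosed(1, 1000).sum();

        // Parallel sum: a single .parallel() call splits the work across
        // the common fork/join pool. For a dataset this small, the
        // split-and-merge overhead typically exceeds the work itself.
        long par = LongStream.rangeClosed(1, 1000).parallel().sum();

        System.out.println(seq); // 500500
        System.out.println(par); // 500500
    }
}
```

The convenience is real; the argument above is that the opportunities where it pays off are rarer than the feature's prominence suggests.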


> I think what he was getting at is that the idea of languages integrating parallelism such that almost all code would be parallel all the time has proven to be a dead end.

But that's obviously not what I was talking about. I wasn't even talking about parallelism in the context of language design. If an old tool is written in C and uses shared global mutable state so extensively that getting a performance boost from parallelism would require a rewrite, then that's exactly the thing stopping people from "just" redirecting their efforts to improving old software. Sometimes small incremental improvements aren't enough, and getting the buy-in to do a full rethink of an old tool takes tremendous social capital. And this is just about improving the old tool in its existing language; it isn't about switching to a new language with better support for parallelism. (Which, by the way, comes in many flavors.)

At no point did I talk about fancy parallelism related language features. At no point did I say that we should just go and add parallelism to an old program because that's what all the cool kids are doing. I framed it as a specific solution to a specific problem: making the old program faster. People like faster programs and speed ain't a fad. If you want to bring in examples of adding parallelism that don't improve performance, then great, but leave me out of it, because at no point did I advocate that.

I think your interpretation of tabtab's comment is a real reach, but whatever. Particularly given their other comments denouncing the evils of lambdas. Really? Give me a break.


I've been in a good many debates about the practicality of lambdas and parallelism for typical application development[1], and I stand by my claim. My opponents failed to give sufficient practical examples that stood up to scrutiny. Granted, some of it is subjective, but the fact that it's subjective means it's not a necessity.

True, sometimes it's good to have more than one way to do something, but sometimes it also creates a bigger and unnecessary learning curve. In the cases I looked at, making better OOP would result in a simpler language than adding lambdas, at least in my judgement. I'd be glad to debate it further, but this is not the best forum for such.

[1] Some argue the same languages should be used for systems software and applications development. I don't agree.


I see your debates and raise you practice. I've done plenty of typical application development that makes practical use of lambdas and parallelism for great benefit.


I'm not saying they "don't work", only that other features/designs can often accomplish the same thing without adding new constructs: parsimony of language features. Can you suggest a forum to continue such debates? If not, I'll see what I can find. (My past favorite forums died.)


You've pretty drastically changed topics. I have no interest in a language design debate with someone who thinks of lambdas as "magic Legos." As I said before, our conceptions of reality are so drastically different that it would be a gratuitous waste of my time.

You've gone from "parallelism is a fad" to "let's subjectively evaluate language design based on my ideas of parsimony." Parallelism isn't some kind of language feature I'm touting. It's an implementation tactic that can be used to make certain programs faster. Before such tactics were common, it was easy to write programs in ways that make parallelism difficult to bolt on later. I used parallelism as one possible example of something that would be incredibly difficult to add to older programs, to counter the notion that we should stop reinventing the world and instead improve older stuff.

Like most things in this world, parallelism can be misused. This is hardly interesting and really doesn't need to be pointed out. Moreover, the degree to which parallelism can be wielded effectively will, in part, depend on your tools. Some tools make parallelism easier to use or reason about than other tools do. With that said, even that was immaterial to my point, because I wasn't doing a comparative analysis of parallelism across programming languages. I was just using it as an example of a concrete improvement that one could make to older programs that is often not practical to do, precisely because they are older programs that are widely used. Parallelism isn't the only example of this kind of improvement; jerf provided other examples in this thread. But parallelism is an easy one to grasp because most programmers with a few years of experience can appreciate what it's like to refactor a large program that is deeply coupled to unsynchronized shared global mutable state to a program that isn't---which is generally a good idea if you want to add parallelism to it.
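A toy sketch of the coupling hazard described above, assuming Java for illustration (the class and field names are mine): unsynchronized writes to shared mutable state silently lose updates once the surrounding code runs in parallel, which is why such code must be refactored before parallelism can be bolted on.

```java
import java.util.concurrent.atomic.AtomicLong;
import java.util.stream.IntStream;

public class SharedStateSketch {
    // Legacy-style shared global mutable state. Parallel, unsynchronized
    // increments of this field race with each other and can lose updates.
    static long unsafeCounter = 0;

    public static void main(String[] args) {
        IntStream.range(0, 100_000).parallel()
                 .forEach(i -> unsafeCounter++); // data race

        // One possible refactoring: route the shared update through an
        // atomic, so parallel increments are safe.
        AtomicLong safeCounter = new AtomicLong();
        IntStream.range(0, 100_000).parallel()
                 .forEach(i -> safeCounter.incrementAndGet());

        // unsafeCounter is nondeterministic (often below 100000);
        // safeCounter is always exactly 100000.
        System.out.println(safeCounter.get()); // 100000
    }
}
```

In a small program this refactoring is one line; in a large old codebase where the global is touched from hundreds of call sites, it is the rewrite the comment above is describing.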

Your initial response to this was:

> I'm sorry, but the need for parallelism is exaggerated; it's mostly a fad.

But this is completely pointless. That parallelism is a useful way to make a program faster is taken as axiomatic in my comment. There is enough of my code (and others) out in the wild that uses parallelism to achieve some concrete measurable improvement that I feel it is obviously true and really doesn't need any further explanation. But here you are, nitpicking at an axiom with a bunch of nonsense about "fads," and completely missing my point in the process.


Re: "let's subjectively evaluate language design based on my ideas of parsimony." -- I didn't claim that. I don't know where you got it. Note that "less code to do X" is often the claim made, and "code size" is probably the most objective metric available to compare techniques (but still imperfect). "Number of lines needing changes per change request X" is another. Other claims, such as "elegant," are often in the eye of the beholder. There is no universally accepted ruler (metric) for "elegant".

And I'm not against SOME parallelism support in app languages. It's just my experience that if you "need" to do it A LOT in applications, you are probably doing something wrong.

Re: "Like most things in this world, parallelism can be misused. This is hardly interesting and really doesn't need to be pointed out." -- Fads tend to greatly increase the % of misuse.

And lambdas and OOP do fight over similar territory, or at least can if the language has certain features. For example, Java coders may say, "I need lambdas here because OOP can't do what I need." But it turns out it's Java's OOP that can't do what's needed; the limitation isn't inherent in OOP.
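The overlap is easy to see in Java itself, where a lambda is essentially shorthand for an anonymous class implementing a single-method interface. A minimal sketch (class and variable names are mine):

```java
import java.util.function.Function;

public class LambdaVsOopSketch {
    public static void main(String[] args) {
        // Pre-Java-8 "OOP" style: an anonymous class implementing
        // a single-method interface.
        Function<Integer, Integer> doubledOop = new Function<Integer, Integer>() {
            @Override
            public Integer apply(Integer x) {
                return x * 2;
            }
        };

        // Lambda style: the same behavior, expressed more tersely.
        Function<Integer, Integer> doubledLambda = x -> x * 2;

        System.out.println(doubledOop.apply(21));    // 42
        System.out.println(doubledLambda.apply(21)); // 42
    }
}
```

Both sides of the debate can point at this: one sees lambdas as redundant sugar over objects, the other sees the anonymous-class version as evidence the object form was too clumsy to use.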

Anyhow, English is often not sufficient by itself to clarify such debates; we'd need specific code samples and scenarios to explore.


> And I'm not against SOME parallelism support in app languages.

Once again, you said:

> I'm sorry, but the need for parallelism is exaggerated; it's mostly a fad.

So, what's your point? Every time someone talks about parallelism---whether it's legitimate or not---you see it as your duty to pontificate on the "fad" that is parallelism? You've completely and utterly missed my point, which had nothing at all to do with debating the legitimate use of parallelism in a specific case. That there are legitimate uses of parallelism is enough to make my point.

> It's just my experience that if you "need" to do it A LOT in applications, you are probably doing something wrong.

Again, so what? This has literally nothing to do with anything I've said in this thread.

So, umm, thanks for derailing the conversation into your own pet cause?

> And lambdas and OOP do fight over similar territory, or at least can if the language has certain features. For example, Java coders may say, "I need lambdas here because OOP can't do what I need." But it turns out Java's OOP can't do what's needed, not something inherent in OOP.

Again, so what? This seems like a useless academic point. At no point did I invite such navel gazing.


Re: "That there are legitimate uses of parallelism is enough to make my point." -- Yes, there are niches that need such; I don't dispute that. "Faddism" is a matter of degree. Every big language shoehorning in lots of X because niche Y needs X is probably a case of faddism. Again, we'd probably have to examine specific instances. English by itself won't settle this.



