Is Lisp a Blub Language? (derkeiler.com)
39 points by ehsanul on April 26, 2010 | 76 comments


I find in certain cases that I'm looking across the power spectrum from the Lisp point of view. I recently ported some code from Haskell to Clojure and occasionally found myself missing Haskell's amazingly expressive type system. Trying to give Lisp such a type system would make a lot of things that are currently easy in Lisp hard or impossible.

There are certain kinds of mutually exclusive features that make certain kinds of problems easier or harder. Examples that come to mind are mutability vs immutability and static typing vs dynamic. Most of the things I sometimes find lacking in the Lisps I'm familiar with are of this sort.


> There are certain kinds of mutually exclusive features that make certain kinds of problems easier or harder.

Right. The fundamental problem with the whole "blub" argument is that there isn't one linear continuum of language power. There are problems for which Erlang's supervision hierarchy and distribution primitives or Prolog's backtracking are a killer feature. This doesn't mean they're "More Powerful than Lisp", just better suited to certain kinds of problems because they committed to some very specific trade-offs.

But, these same trade-offs have far-reaching implications for the language semantics, so you can't just use macros to graft them on after the fact. You can embed a mini-Prolog in Lisp, sure, but adding fully native logic variables (as in Prolog or Oz) would be far from trivial.


Blubness of a language comes from its practitioners not knowing about useful features from a higher-level language (or not grasping the utility of such a feature).

If your favorite Lisp lacks a certain feature you want, it's easy to add, making Lisp the anti-Blub.

(And if your favorite Lisp makes it hard to add it, you've picked the wrong favorite!)


Some features are not easy to add. For example, one important feature of Python is that the language is designed with consistency and readability in mind, which, combined with the "preferably one obvious way to do it" ideal, means that code written by other people is easier to read and understand. This makes code and knowledge sharing easier, and the network effect creates a blooming ecosystem of libraries.

How do you easily add that feature to your favorite Lisp?


Your example is a feature of the Python philosophy (or community), not the Python language.


The philosophy is hardcoded in the design of the language. For example, the BDFL has explicitly stated he doesn't want macros because it will hurt the readability (1), and he rejected support for multi-line anonymous functions because he didn't find a syntax which he thought was clear and readable enough. Clearly this is a very different philosophy than the one behind Lisp.

(1) The quote from Guido: Programmable syntax is not in Python's future -- or at least it's not for Python 3000. The problem IMO is that everybody will abuse it to define their own language. And the problem with that is that it will fracture the Python community because nobody can read each other's code any more. (http://mail.python.org/pipermail/python-3000/2006-April/0002...)


Well it's easy to disagree with him on many points.

The use of macros that I've seen in CL source doesn't indicate that people are creating other programming languages out of Lisp. In fact, it makes the source easier to read by allowing a natural level of terseness by way of abstraction.

Python's method of abstraction is via encapsulation in its class/meta-class/object model.

However, this is an argument for another post.

Funny how these Lisp posts continue to flame across the net.


Programmable syntax is not in Python's future -- or at least it's not for Python 3000. The problem IMO is that everybody will abuse it to define their own language. And the problem with that is that it will fracture the Python community because nobody can read each other's code any more

I like Python and use it every day, and this is simply FUD. Sadly, it's the kind of argument I hear too often from people in the Python community unfamiliar with Lisp when attempting to critique powerful Lisps.


Is it really FUD though? Having gone through a few bruising experiences with different libraries having incompatible object systems built in Javascript, I have come to appreciate the advantages of only having one way to implement certain types of structures.

It doesn't need to be hard-coded into the language though - a decent Standard Library showing how things should be done, and a culture maintained by the community, would be enough. That would allow everyone the freedom to do what they want if they see a definite advantage in breaking with convention, whilst making it easy for people to generate libraries that are interoperable...


I think macros do have certain disadvantages (they can make debugging harder, add more syntax, etc.).

But I find that things like extensive use of the MOP also make maintenance of programs more challenging.

Common Lisp has never tried to take away 'power' from users.

Scheme had a different philosophy: reduce everything to the most basic and pleasing constructs. But that approach has its own disadvantages - if one arrives at the bottom of programming language constructs, working 'upwards' is a problem.

Take for example argument lists: Common Lisp has things like keyword, optional, and rest arguments. Plain Scheme only has rest arguments. Adding other argument interpretations is possible, but is only really useful if the language supported it and made use of it.


>Clearly this is a very different philosophy than the one behind Lisp.

Depends on which lisp you mean. There are many dialects of lisp and each have their own philosophies and ideals. Scheme, for example, is a minimalist lisp whose philosophy is probably not too different from python - although python has a far bigger standard library (but that is a separate issue).


I actually wound up in an email discussion about that with Guido. He made it clear that it wasn't just that he didn't find a syntax he liked, but that he didn't try hard because he's not convinced that code that makes heavy use of anonymous functions is a good idea. And the latter seems to me to be his real objection.


The line between the language and the philosophy driving the language's development is blurry, at best.


With Common Lisp, you'd create a library with a package, say, consistent-lisp, and clone all the features of common-lisp into it, except with consistent, readable names and with any other fixes like argument order, etc. The actual language part of this is not hard to do, it's the community part that's difficult.


Some things are difficult to add even to Lisp. Static typing being an example.


As MagV mentioned, Typed Scheme (http://www.ccs.neu.edu/home/samth/typed-scheme/) is an example of static typing in a lisp-like language. I've used it, and it is great. It interoperates smoothly with the normal dynamic typed Scheme. Typing is at the module level (a module must be either statically or dynamically typed) and contracts (http://doc.plt-scheme.org/reference/contracts.html) are used to enforce invariants at module boundaries.


Difficult, but not impossible. Qi and Typed Scheme are two examples.


It's hard to add static typing (and get much utility out of it) without forcing its use. If you do that, you lose dynamic typing.


... and that's bad because?

If you want static typing, then I don't think trading out dynamic typing is a big deal to you.

Though as pointed out elsewhere, many implementations allow you to give their compiler type hints... which is a sort of halfway "best of both worlds" system.


The halfway "best of both worlds" is impossible to attain, in my opinion. As far as I understand it, dynamic typing (meaning, extensive usage of runtime type information) has unmatched flexibility. Static typing (meaning, extensive analysis of syntactic types at compile time) completely prevents large classes of errors.

If you want a middle ground, you may lose some of the flexibility, and you still won't be able to prove as much as a full static type system. In the end, the "best of both worlds" could rapidly become the worst of both worlds.

As I see it, we have to compromise. When you design a type system, you want to maximize 3 virtues: flexibility, simplicity, and error sensitivity. Alas, of these 3, you can only have 2. Dynamic type systems are typically simple and flexible, but hardly prove anything (which explains why unit tests are so useful). Advanced type systems like Haskell's are quite flexible and prevent many errors, but they are complex. Others, like Java's, are simpler but not as flexible (nor as error-proof). And of course you have horrible type systems, like C++'s, which lacks all 3 virtues.


I find that with a good, type-aware compiler such as SBCL, Common Lisp hits a really sweet spot on the typing issue. It catches a crapload of mistakes at compile time that would be runtime errors in Python (the language, incidentally the SBCL compiler is also named Python) and most other dynamic languages. The type-language of CL is also more expressive than anything else I have seen, featuring unions, intersections, predicates, subtyping etc. But most importantly, it stands in the background, and doesn't stop you from executing programs that don't conform to some preconceived notion of what typing should be like.


right, that's a good example


On the other hand you could make an argument that static typing isn't actually a language feature as much as it is a step in the process of programming.

Static typing consists of (a) tagging variables with their type and (b) running a program that uses these type tags to check and/or rewrite the program.

It's easy to add type tags to a lisp program. It's just that Lisp doesn't specify that second program that checks and transforms the first. So I would say Lisp is half way there when it comes to static typing.
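To make (a) and (b) concrete, here is a toy sketch in Python (the helper name `check_call` is my own invention; a real checker like mypy works by static analysis rather than at call time): annotations play the role of the type tags, and a separate routine reads the tags and checks a call against them.

```python
# (a) The type "tags" are ordinary annotations on the function.
def twice(n: int) -> int:
    return n * 2

# (b) A second program reads those tags and checks a call against them.
# (Toy illustration only; a real checker analyzes the source statically.)
def check_call(fn, *args):
    hints = fn.__annotations__
    params = [name for name in hints if name != "return"]
    for name, arg in zip(params, args):
        if not isinstance(arg, hints[name]):
            raise TypeError(f"{name} should be {hints[name].__name__}")
    return fn(*args)

print(check_call(twice, 21))   # 42
```

(Checking at call time like this is of course still dynamic typing; the point is only to separate the tagging step from the checking step.)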

That's a pretty contrived argument though... :-)


no, adding type declarations to Lisp is the easy part. when it comes to static typing, standard Common Lisp offers very little.

Declaring types? That has been done. In Common Lisp:

    (defun twice (n)
      (declare (number n))
      (the number (* n 2)))
The difficult parts are:

* the type system and its capabilities

* make the operations of the type system sound

* determining sub-types

* type inference

* integration with the rest of the language (where data objects also have something like types)

Common Lisp provides lots of infrastructure for all kinds of things, but very little for a type system. For example in Lisp one can determine the value of an expression via EVAL, but there is no function to compute the type of an expression (other than a type of the computed value).


Pattern matching is the missing feature. I keep thinking about switching to Clojure from Mathematica, but then I think "How can anyone get anything done in Clojure? It doesn't even have pattern matching."

It's not something that can be patched in as a library, because the way symbols and evaluation need to work is different (and simpler) than in a Lisp. There is no distinction between macros and non-macros - everything is just a tree transformation.

The main benefit of first-class pattern matching is that your function definitions get a lot more succinct and expressive, since you can encode quite a lot of information in the structure of the arguments, and elegantly unfold the definition from the short and common case to a parameterized sequence of generalizations.


What exactly do you mean by pattern matching? I understand it in the ML sense. The Lisp language I use, PLT Scheme, has extensible pattern matching: http://docs.plt-scheme.org/reference/match.html The implementation isn't simple but the techniques are published (see "Pattern matching for Scheme" http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.53.2...) so any Lisp language could implement this. Clojure has less powerful pattern matching, but it does the most common stuff.

The linked article is really about Common Lisp, where the issue is a standard that hasn't been updated in a long time. This doesn't stop individual implementations from making their own advances but in my limited knowledge of the CL implementations this doesn't seem to be occurring.


This is actually occurring. Most advances are made in cross-platform libraries (networking, regexps, threading, pattern matching, lazy evaluation, etc.) but some implementations have their own extras that make them interesting (by no means an exhaustive list):

ABCL (http://common-lisp.net/project/armedbear/) runs on the JVM.

CCL (http://www.clozure.com/clozurecl.html) has excellent integration with Objective-C and Cocoa on OS X.

ECL (http://ecls.sourceforge.net/) is easily embeddable in C / C++.

The commercial offerings Allegro CL (http://www.franz.com/products/allegrocl/) and LispWorks (http://www.lispworks.com/) keep on improving and extending their own implementations in interesting ways.


Good question. Here is what I mean:

1. Pattern matching as the basis for function definition, to determine which code executes and expedite argument destructuring.

2. Patterns themselves should have first-class representation (preferably symbolic), so you can generate them in one place and use them in another.

3. Implicit in this is that the structure of the language is systematic enough to make this worthwhile, meaning something s-expression based, or perhaps something like Scala that achieves similar ends in a much different way.


How close does Qi (http://en.wikipedia.org/wiki/Qi_%28programming_language%29) come to what you desire?


you might check sometimes the documentation of CL implementations for the advances that they provide over the standard


A little bit of googling gave me the sense that pattern matching in Mathematica is different from the pattern matching present in several functional languages. Please let me know how these Clojure features differ from that:

Clojure has pattern matching for the arguments in a function definition similar to ML or Haskell. It's often used as a more verbose alternative to optional args, but it's more powerful fundamentally.

Clojure has destructuring in binding forms like let, described here: http://clojure.org/special_forms

Clojure has multimethods which let you pick the method to use based on an arbitrary dispatch function.

There's a pattern matching macro that seems to be the beginning of a DSL for predicates: http://www.brool.com/index.php/pattern-matching-in-clojure


There is 1 mechanism instead of 4, and it also happens to be the fundamental basis of evaluation.

There is no way to measure the "absolute power" of elegance, which is what is really at issue here.

Not to say Mathematica has gotten it perfect either, because it hasn't. But I have yet to see a superior implementation, and have seen a lot of clunky ones in my search.


I certainly miss ML-style pattern matching when I'm in Lisp. I sometimes find myself (poorly) simulating the idea with a whole bunch of multi-method specializers.

Based on my limited experience, it seems like you'd really need a more static type system to make the most of pattern matching though. Are there any dynamically typed languages with powerful/useful pattern matching?


As far as doing it in Lisp goes, this is an interesting attempt: http://common-lisp.net/project/cl-match/doc/clmatch.htm


Erlang of course.


Prolog


Prolog has unification, which is considerably more powerful than just pattern matching. The semantics of Prolog require the ability to unify data structures with some parts not-yet-specified, and backtrack later if they cannot be bound to something valid. In contrast, pattern matching happens all at once and is unidirectional. (Though even basic pattern matching is tremendously useful, IMHO.)

Erlang's pattern matching is somewhat like Prolog's, but with backtracking removed. (Backtracking clashes with Erlang's other semantics.)


Putting structure assumptions into function argument lists is not necessarily a good idea. One exposes the implementation.

Pattern matching function definitions are easy to do in Lisp.

But given the nature of Lisp, much of the pattern matching then needs to be done at runtime - which leads to less efficient code and invites people to write totally inefficient code.


Consider the Mathematica function Take[], which would save millions of man hours if it existed in other languages.

Take[{a,b,c,d},2] --> {a,b}

Take[{a,b,c,d},-2] -> {c,d}

Take[{a,b,c,d},{1,3}] -> {a,b,c}

Take[{a,b,c,d},{1,-1,2}] -> {a,c}

Take[{{a,b,c,d},{1,2,3,4},{5,6,7,8}},2,2] -> {{a, b}, {1, 2}}

etc. In my book this is clearly useful, and it's only scratching the surface. ( {} is actually List[] in Mathematica FullForm ... )
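For readers without Mathematica, the behaviour above can be approximated in a few lines of Python (a rough sketch of just the cases shown; the real Take handles much more, and the negative-index arithmetic here is my own approximation):

```python
# Rough sketch of Mathematica-style Take semantics for the cases shown above.
def take(expr, spec, *rest):
    if isinstance(spec, int):
        # Take[expr, n]: first n elements, or last n if n is negative.
        part = expr[:spec] if spec >= 0 else expr[spec:]
    else:
        # Take[expr, {i, j}] or {i, j, step}: 1-indexed, inclusive;
        # negative positions count from the end.
        i, j = spec[0], spec[1]
        step = spec[2] if len(spec) > 2 else 1
        i = i - 1 if i > 0 else len(expr) + i
        j = j if j > 0 else len(expr) + j + 1
        part = expr[i:j:step]
    # Extra specs apply at the next nesting level, as in Take[m, 2, 2].
    return [take(sub, *rest) for sub in part] if rest else part

print(take(['a', 'b', 'c', 'd'], 2))           # ['a', 'b']
print(take(['a', 'b', 'c', 'd'], -2))          # ['c', 'd']
print(take(['a', 'b', 'c', 'd'], (1, 3)))      # ['a', 'b', 'c']
print(take(['a', 'b', 'c', 'd'], (1, -1, 2)))  # ['a', 'c']
print(take([['a','b','c','d'], [1,2,3,4], [5,6,7,8]], 2, 2))
# [['a', 'b'], [1, 2]]
```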

The point of symbolic representation is to represent the intended meaning. It's nothing about the "internal" implementation. Using symbols and simple tree structures to compactly express stuff is extremely powerful, and not coincidentally it is the essential way that human language works.

Sure, with a sufficiently dynamic language you can implement such functionality. The problem with Lisp is that by default symbols want to evaluate, and if you want to treat them symbolically you have to operate in a "special" mode. This makes things too complicated. There is a reason you don't see meaning represented by structure in pretty much any other language besides Mathematica.


It looks like you have never programmed in Lisp.

A function like Take is easy to write in Lisp. Lisp has many similar functions like that - but with a better interface.

> There is a reason you don't see meaning represented by structure in pretty much any other language besides Mathematica.

Could it really be that you missed the AI software that has been written in Lisp in the last five decades?


Take is a really terrible example, even python has take built into the syntax (Take[lst, 2] -> lst[0:2]). But Mathematica does have nice pattern matching which few lisps have:

    SolvePoly[a_*x+b_=0] := { -b/a }
    SolvePoly[a_*x^2+b_*x+c_=0] := \
        { (-b + sqrt(b^2-4*a*c))/2a, (-b - sqrt(b^2-4*a*c))/2a }
    SolvePoly[_] := "I only took high school algebra"
Of course, in principle one could write a pattern matching macro in lisp, and I imagine there are already some halfway implementations of it. That's pretty much just Turing Equivalence.


The point was not lst[0:2] in Python. See the definition of Take in Mathematica - it is quite a bit more capable.

You also need to differentiate between 'pattern matching' and a 'rewrite system'. Pattern matching is just taking a pattern and some data and seeing if they match.

    (match '(+ (* ?a x) ?b) '(+ (* 123.0 x) z)) -> T
Routines like the above are found in many books about Lisp and have been provided in Lisp libraries for decades.

Specifying rewrite rules with patterns for mathematical purposes (simplification, integration, differentiation, ...) is also almost as old as Lisp. Norvig's book 'Paradigms of AI Programming' explains how it is implemented in Lisp. These things are at the heart of several computer algebra systems written in Lisp - like Macsyma.
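A matcher of exactly this kind fits in a few lines. Here is a toy Python version along the lines of Norvig's pat-match (the names and representation are my own; strings starting with "?" act as pattern variables that bind to subtrees):

```python
# Minimal pattern matcher: "?x" symbols bind to subtrees; a repeated
# variable must match the same subtree each time. Returns the bindings
# dict on success, or None on failure.
def match(pattern, data, bindings=None):
    b = dict(bindings or {})
    if isinstance(pattern, str) and pattern.startswith("?"):
        if pattern in b:
            return b if b[pattern] == data else None
        b[pattern] = data
        return b
    if isinstance(pattern, list) and isinstance(data, list):
        if len(pattern) != len(data):
            return None
        for p, d in zip(pattern, data):
            b = match(p, d, b)
            if b is None:
                return None
        return b
    return b if pattern == data else None

print(match(["+", ["*", "?a", "x"], "?b"], ["+", ["*", 123.0, "x"], "z"]))
# {'?a': 123.0, '?b': 'z'}
```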


how would one define SolvePoly or even Take for that matter in terms of Mathematica if it were not "built in"? I am not saying it is impossible (I really do not know and thus I am curious).


Same way you would do it in lisp - you'd build a function to do pattern matching and use conditionals to test which pattern is matched.

    (defun solve-poly (poly)
      (if (match '(+ (* ?a x) ?b) poly)
          (-b/a where b, a come from (extract-values-from-pattern-match poly))
          (some code for second order)))
(Some code borrowed from lispm's comment. Not sure which library the match function comes from.)


What is ' other than a special mode? The fact is that symbols are treated more systematically in Mathematica, and that makes it easier to assemble and disassemble symbolic structures of all sorts.

Sounds like a case of blub. You look at Mathematica and see some weird stuff that is probably equivalent in power to multimethods or whatever, I look at Lisp and think how can I possibly live without civilized pattern matching.

As far as Lisp-based AI goes, it's in fact very easy to miss it, but this is probably not the thread to get into Lisp's cultural issues.


Symbols are not treated more systematically in Mathematica.

QUOTE is a special operator in Lisp that causes the evaluator to return its argument unevaluated.

     (quote a) -> a
     (quote (+ 1 2)) -> (+ 1 2)
     (quote "abc") -> "abc"
Mathematica has a different strategy for computation, one that is based on rewrite rules. Expressions are rewritten until they can no longer be rewritten.

In Lisp evaluation

    (+ a a)
gives an error if the variable A has no value.

In Mathematica it would be reduced to

   (* 2 a)  ; in Lisp syntax
if A has no value.
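To make the contrast concrete, a single rewrite rule of that sort might look like this in Python (a toy illustration, not how Mathematica is actually implemented):

```python
# One toy rule in a rewrite system: (+ a a) -> (* 2 a).
# An expression that matches is rewritten; anything else is returned
# as-is, rather than raising an "unbound variable" error the way Lisp
# evaluation would.
def rewrite(expr):
    if (isinstance(expr, list) and len(expr) == 3
            and expr[0] == "+" and expr[1] == expr[2]):
        return ["*", 2, expr[1]]
    return expr

print(rewrite(["+", "a", "a"]))  # ['*', 2, 'a']
print(rewrite(["+", "a", "b"]))  # ['+', 'a', 'b']  (no rule applies)
```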

But that has little to do with the processing capabilities of Lisp. A function like Take can be written very naturally in Lisp - there is NO special mode needed. The function takes data and computes results.

You may want to get out of your Mathematica blub and learn some Lisp. Writing the 'Take' function is a good exercise. Stuff like that is a basic exercise in Lisp courses.


What is ' other than a special mode?

I usually implement it as a reader macro. :)


it's a shortcut to quote


it's a shortcut to quote

Yep.

My comment was about implementing one's own Lisp reader.

    (define *reader-macros* `(
      (#\' ,(lambda () (list 'quote (read))))))
instead of

    (case dispatch-char
      ((#\') (list 'quote (read)))
      ...)


> There is a reason you don't see meaning represented by structure in pretty much any other language besides Mathematica.

Actually, this is done in Lisp and Prolog all the time.

Off the top of my head: J, K, Haskell, and Joy have take functions, and writing them in Lisp and Prolog is not difficult.


> The problem with Lisp is that by default symbols want to evaluate, and if you want to treat them symbolically you have to operate in a "special" mode.

So you're saying that variable bindings should be lazy? That if a symbol is not bound to a value, it should be treated as an unevaluated symbol?

I'm not quite certain what your point is. The actual functionality of Take can be easily replicated in most languages, and indeed, most languages have a version of Take called "slice".

But if your example is meant to demonstrate why lazy symbol evaluation is good, then I'm afraid it fails on that point too. There doesn't seem to be anything in your example that is particularly noteworthy.

Could you perhaps explain your point in more detail?


Clojure and Scheme have take and drop, though the semantics are slightly different from those of Mathematica. They can be written in almost any language.


My point is that putting "structure assumptions in argument lists", though not a good idea 100% of the time, is often extremely useful, as illustrated by the example.

The greater point is that if this is so, one might as well deal in a model of computation that is a natural fit for that way of thinking.

I've been doing a lot of work in Scala and Actionscript. How I wish they had that family of functions in full generality. Why are they missing from most languages? Because that is not how people think in them.


Your example above did not show any special pattern matching capabilities whatsoever.


Wow, Lisp has Take. I wrote one. It took me a few minutes:

    CL-USER 58 > (take '(a b c d) 2)
    (A B)

    CL-USER 59 > (take '(a b c d) -2)
    (C D)

    CL-USER 60 > (take '(a b c d) '(0 3))
    (A B C)

    CL-USER 61 > (take '(a b c d) '(0 -1 2))
    (A C)

    CL-USER 62 > (take '((a b c d) (1 2 3 4) (5 6 7 8)) 2 2)
    ((A B) (1 2))
That was easy.


would you mind posting your definition?


    (defun take-1 (it what)
      (cond ((eq what :all) it)
            ((eq what :none) nil)
            ((and (numberp what) (plusp what))
             (subseq it 0 what))
            ((and (numberp what) (minusp what))
             (last it (- what)))
            ((and (consp what)
                  (= (length what) 1)
                  (numberp (first what)))
             (nth (first what) it))
            ((and (consp what)
                  (= (length what) 2)
                  (numberp (first what))              
                  (numberp (second what)))
             (let ((end (if (minusp (second what))
                            (+ (length it) (second what))
                          (second what))))
               (subseq it (first what) end)))
            ((and (consp what)
                  (= (length what) 3)
                  (numberp (first what))
                  (numberp (second what))
                  (numberp (third what)))
             (let ((end (if (minusp (second what))
                            (+ (length it) (second what))
                          (second what))))
               (loop for e = (subseq it (first what)) then (nthcdr (third what) e)
                     for i from (first what) below end by (third what)
                     collect (first e))))))


    (defun take (thing &rest description)
      (cond ((null description) nil)
            ((and (consp description)
                  (= (length description) 1))
             (take-1 thing (first description)))
            (t (loop for e in (take-1 thing (first description))
                     collect (apply #'take e (rest description))))))


If you don't know how to write a pattern matching library in Lisp, you don't really know Lisp and should probably refrain from spreading misinformation about it.


Aren't all languages blub languages depending on the use case?

If you write AI programs c++ is a blub language. How can you get anything done without macros?

If you program drivers PHP is a blub language. How can you get anything done without direct access to the hardware?

If you're doing webapps lisp is a blub language. How can you get anything done with a syntax that's so different from HTML and so difficult to read?

The whole premise of a blub language depends entirely on what you're trying to accomplish - the right tool for the right job.

Disclaimer - I'm not much of a programmer, so maybe my examples don't hold up, but you get the idea :-)


I actually picked up Clojure because I was:

Tired of having to write C++ to get decent graphics performance.

Tired of having to write C++ to get decent audio performance.

Tired of my cool dynamic web programming languages not being fast enough to do the above.

Tired of Python, Ruby, C++, C, Java, Objective-C, JavaScript having so little to offer in the way of elegant concurrency.

So I picked a Lisp because it could do more.


Disclaimer noted, but as to your lisp/HTML example, HTML and lisps actually have a lot in common - if you strip away the end tags and the angle brackets, you get something very lispy looking

    <div>
      <span>Hello</span>
    </div>
has the same (prefix) order of operators and operands as

    (div
      (span "Hello"))
Off the top of my head, the "hiccup" package for Clojure has an 'html' function (macro?) that translates just that sort of stuff directly into html
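The same trick is easy to try in any language with nested lists. Here is a toy Python renderer (my own sketch, not hiccup's actual API), where a nested list plays the role of the s-expression:

```python
# Render a nested-list "s-expression" straight to HTML:
# the first element names the tag, the rest are children.
def render(node):
    if isinstance(node, str):
        return node                      # text content
    tag, *children = node
    inner = "".join(render(c) for c in children)
    return f"<{tag}>{inner}</{tag}>"

print(render(["div", ["span", "Hello"]]))
# <div><span>Hello</span></div>
```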


Interesting. Do you by any chance have links to some newbie resources? Maybe it's worth reading up on some sort of lisp. Clojure maybe?


Generating HTML from Lisp is very easy and natural. Of course, there are situations where it might be better to use an HTML template, and there are libraries for that too. Here are a few more examples of HTML generation in Lisp:

A tutorial about writing an HTML-generating DSL in Common Lisp: http://gigamonkeys.com/book/practical-an-html-generation-lib...

The Common Lisp HTML generation library most people actually use[0]: http://weitz.de/cl-who/

HTML generation in PLT Scheme's Continue framework: http://docs.plt-scheme.org/continue/index.html#(part._.Rende...

Arc's built-in HTML generation (powers this site): http://files.arcfn.com/doc/html.html

[0] I'm not actually certain about that, but I think it's the most popular.


clj-html - http://github.com/mmcgrana/clj-html

which is deprecated by

hiccup - http://github.com/weavejester/hiccup

though the latter has pretty lame examples


Also, AFAIK people generally use Lisp to generate HTML, not as a replacement to HTML.


A "blub language" isn't an inferior language, or one with fewer features. It's a language in the middle of the ladder that goes from assembly to the ideal language. All the languages you described except Lisp are, in PG's mind, blub languages. Notice that Lisp can be applied to every one of your examples: just use it as a code generator (which is very commonly the case for HTML, and I've seen it done for C as well), or use an appropriate dialect (eg. bitc for low-level, arc or clojure for web programming, CL for AI).

What you're describing is just a less powerful or a less appropriate language.


So how many here can take any problem and use any language to solve it language-optimally with all known best practices with the least amount of code?

How do you figure out that one language is better than another for a given context? Would hundreds of people with 10 years of experience in language X really solve a problem faster and more elegantly with language Y, in which they have zero experience?

Are the people who find certain languages unreadable and ugly really the people who should decide which language to use for the problem at hand? Or the experienced craftsmen with expertise in their old, ugly and "unreadable" language?

Personally I find it rather funny and ironic that noobs are the ones driving the language of choice because they have learned how good their new and powerful language is - and because they are having trouble reading anything else.

In this way the noobs get an edge over Old-timers with years of experience that actually know how to write beautiful code in their old ugly language.

The noobs don't realize this until their favourite language is considered old and obsolete by even newer noobs, and they start to call themselves Old-timers..

I think most people have a native language in which they express themselves better and more elegantly than in any other language. They will probably solve any problem faster with their native language than with the optimally correct besserwisser language. I bet their solution would be more readable and elegant too, compared to using a language they don't have any experience with.


No.

I'll explain why: all languages are in some sense equally expressive, because they are Turing-complete. But some languages don't have particular abstractions, for example classes; so in that sense they are not expressive, because you can't express those abstractions.

But Lisp has macros. This means that any abstraction it doesn't yet support, you can add.


No, because you can't add language invariants (guarantees that thing X will never happen), and without invariants such as pervasive immutability, certain features are impossible to add.

While it's possible to add features to Lisp that are at odds with its fundamental model of evaluation, it's usually done by adding an interpreter (or, occasionally, compiler) for a nested sublanguage. There are several interpreters in SICP and EoPL, several Lisp books have a mini Prolog, etc. This is handy (and Lisps do make it relatively easy), but you can't graft something like Erlang's entire semantics onto Lisp with just macros.


> No, because you can't add language invariants (guarantees that thing X will never happen)

That's true. Macros aren't a perfect solution.

> without invariants such as pervasive immutability, certain features are impossible to add.

As are certain optimisations.

> While it's possible to add features to Lisp that are at odds with its fundamental model of evaluation, it's usually done by adding an interpreter (or, occasionally, compiler) for a nested sublanguage.

Indeed, which is in a certain sense cheating.

If one is designing a language, and one hopes that one's language will become popular, then it's likely (in fact inevitable) that the language will be used for tasks that the designer hasn't anticipated. So how can a designer cater to this? Macros are one way, and IMO a powerful one.

Another is to make the language easy to mix and match with other languages, so that it can call code in other languages, be called by other languages, use common data structures and serialisation formats, etc. Implementing on the JVM goes some way towards meeting this goal.


Macros aren't always a great solution. I think Haskell is the best example: many features, such as lazy evaluation, the type system, and especially referential transparency, would be very hard to add to Lisp without making it very verbose and ugly. Here's an example:

http://marijn.haverbeke.nl/monad.html


Were there any interesting suggestions in the reply to the op?


(@lispm Looks like our symbolic language flame war has exceeded yc metrics)

Yes, I understand what ' does. You are manually controlling evaluation. The same way, once upon a time, people manually controlled garbage collection.

Having a+a explode by default means that the whole time you have to be juggling what is intended to be used symbolically or not. This seems to not be a 100% perfect realization of the code == data paradigm.

I'm glad you now agree that structure is a good way to encode meaning. Now, what is a more idiomatic way to manipulate that information? Walking the tree manually, or expressing those patterns of structure directly?

Unfortunately, in Lisp, you need to "evaluation manage" those structures. And it's not just quote, it's the whole macro language with its own idiosyncrasies. It's just a lot easier to have a single elegant system with the right defaults.

I'm the kind of person who implements models of computation as a recreational activity. I've probably wished for more granular evaluation control 1% of the time, but having civilized pattern matching (and representation) has vastly increased productivity and code density.


Do you know Prolog? It sounds like it might appeal to you. Rather than evaluating expressions, it does pattern matching (unification, really) on data structures, rewriting and evaluating them as specified. It can pass around uninstantiated variables and do depth-first search* through known facts/rules to find complete matches, backtracking when it hits dead ends or alternative solutions are requested.

Rather than saying Prolog has pattern matching, it almost makes more sense to say it is pattern matching. It's very central to its model of computation.

* Breadth-first and other search techniques are easy to write, depth-first is just the default.


Yes, I understand what ' does. You are manually controlling evaluation. The same way, once upon a time, people manually controlled garbage collection.

Having a+a explode by default means that the whole time you have to be juggling what is intended to be used symbolically or not. This seems to not be a 100% perfect realization of the code == data paradigm.

I don't think you've thought this through.

If (+ a a) evaluated to (+ a a), then code would not be data, because there would be no code, only data.

There is a compiler for this ideal, pure-declarative language: it's called cat.

Alternatively, you could have a separate bracket type for evaluation, but that doesn't solve your complaint.
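Concretely, the same list is code or data depending only on whether it is evaluated, and quoted data can be handed back to the evaluator at will:

```lisp
;; With a bound to 3, contrast evaluated, quoted, and re-evaluated forms.
(let ((a 3))
  (list (+ a a)            ; evaluated: the number 6
        '(+ a a)           ; quoted: the list (+ A A), plain data
        (eval '(+ 2 2))))  ; data handed back to the evaluator: 4
;; => (6 (+ A A) 4)
```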

Unfortunately, in Lisp, you need to "evaluation manage" those structures. And it's not just quote, it's the whole macro language with its own idiosyncrasies.

The macro language is called Lisp.

It's just a lot easier to have a single elegant system with the right defaults.

If you don't grasp macros, then Lisp will seem useless to you. This is actually the origin of the term Blub.

I'm the kind of person who implements models of computation as a recreational activity. I've probably wished for more granular evaluation control 1% of the time, but having civilized pattern matching (and representation) has vastly increased productivity and code density.

I'm the kind of person who implements working Lisps as a recreational activity. I guarantee you that adding a (def/pat ) pattern matching function definition form is just a macro away (and already exists in some Lisps).

At that point, you don't need to manually quote your patterns, which I believe addresses the remainder of your objections.
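For illustration, usage of such a hypothetical def/pat form might look like the following. The name comes from the comment above; the concrete syntax is invented here, loosely echoing pattern-matching forms that do exist in some Lisps. Clauses are tried in order, and no manual quoting of patterns is needed.

```lisp
;; Hypothetical pattern-matching definition form (syntax invented
;; for illustration): each clause is (pattern-list result-expr).
(def/pat my-length
  ((nil)         0)                      ; empty list
  (((_ . tail))  (+ 1 (my-length tail)))) ; cons cell: recurse on the tail

;; (my-length '(a b c)) would return 3.
```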


I haven't said anything about encoding of meaning, you are dreaming.

You are the same kind of person as Xah Lee...



