2012-04-16

Lisp Hackers: Pascal Costanza

Pascal Costanza is a researcher, an active Common Lisp programmer, and a community enthusiast:


  • he's the maintainer of the Closer to MOP library, which provides a common facade to the MOP implementations in different Lisps and is the basis of some of his more advanced libraries, such as ContextL and FilteredFunctions;

  • the originator of the Common Lisp Document Repository (CDR) project, which collects proposals for improving the language (à la JCP for Java or PEP for Python);

  • and the author of a Highly Opinionated Guide to Lisp, which can serve as an introductory text for those who come from other languages. (It was quite a useful text for me when I started studying Lisp.)


In the interview Pascal shares a lot of insight into his main topic of interest — programming language design — grounded in his experience with Lisp, Java, C++ and other languages.

Tell us something interesting about yourself.

I share a birthday with Sylvester Stallone and George W. Bush. In the past, I was a DJ for goth and industrial music in the Bonn/Cologne area in Germany. I once played a gay Roman emperor in a comedic theatre play. I played a few live shows with a band called "Donner über Bonn" ("Thunder over Bonn"). The first rock concert I ever attended was Propaganda in Cologne in 1985. The first programming language I really liked was Oberon. I often try to hide pop culture references in my scientific work, and I wonder if anybody ever notices. My first 7" single was "Major Tom" by Peter Schilling, my first 12" single was "IOU" by Freeez, my first vinyl album was "Die Mensch-Maschine" by Kraftwerk, and my first CD album was "Slave to the Rhythm" by Grace Jones. I don't remember what my first CD single was.


What's your job? Tell us about your company.

I currently work for Intel, a company whose primary focus is on producing CPUs, but that also does business in a lot of other hardware and software areas. (Unfortunately, Intel's legal department requires me to mention that the views expressed in this interview are my own, and not those of my employer.)

I work on a project that focuses on exascale computing, that is, high-performance computers with millions of cores that will be on the market by the end of the decade, if everything goes well. I am particularly involved in developing a scheduler for parallel programs that can survive hardware failures, which, due to the enormous scale of such machines, can no longer be handled by hardware alone and also need to be dealt with at the software level. The scheduler is based on Charlotte Herzeel's PhD thesis, and you can find more information about it in a paper about her work and at http://www.exascience.com/cobra/.


Do you use Lisp at work? If yes, how did you make it happen? If not, why not?

At Intel, I do all software prototyping in Lisp. The scheduler I mentioned above is completely developed and tested in Lisp, before we port it to C++, so that other people in the same project and outside can use it as well. It didn't require a major effort to convince anybody to do this in Lisp. It is actually quite common in the high-performance computing world that solutions are first prototyped in a more dynamic and flexible language, before they are ported to what is considered a "production" language. Other languages that are used in our project are, for example, MATLAB, Python and Lua. (Convincing people to use Lisp beyond prototyping would probably be much harder, though.)

The implementation we use for prototyping is LispWorks, which is really excellent. It provides a complete, well-designed and efficient API for parallel programming, which turns LispWorks into one of the best systems for parallel programming in any language, not just in the Lisp world. The only more complete system that I am aware of is Intel's Threading Building Blocks for C++.
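For readers who have not used it, here is a rough sketch of what that style of parallel programming looks like, written from memory of the LispWorks mp package; the exact argument conventions may differ between releases, so consult the LispWorks documentation for the authoritative signatures:

    ;; A rough sketch of LispWorks-style parallelism: worker processes
    ;; communicating through a mailbox. Illustrative only.
    (defun parallel-squares (numbers)
      (let ((results (mp:make-mailbox)))
        ;; Spawn one lightweight process per input number.
        (dolist (n numbers)
          (mp:process-run-function
           (format nil "square-~a" n)   ; process name
           '()                          ; process creation options
           (lambda (x) (mp:mailbox-send results (* x x)))
           n))
        ;; Collect one result per input, in completion order.
        (loop repeat (length numbers)
              collect (mp:mailbox-read results))))

    ;; (parallel-squares '(1 2 3 4)) ; => (1 4 9 16), possibly reordered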


What brought you to Lisp? What keeps you there?

I participated in one of the first Feyerabend workshops, organized by Richard Gabriel, one of the main drivers behind the original Common Lisp effort. I also read his book Patterns of Software around that time. Later we had a small discussion on the patterns discussion mailing list. He tried to promote Lisp as a language that has the "quality without a name", and I made some cursory remarks about Lisp's unnecessarily complicated syntax, just like anybody else who doesn't get it yet.

To me, the most important comment he made in that discussion was: "True, only the creatively intelligent can prosper in the Lisp world." The arrogance I perceived in that comment annoyed me so much that it made me want to learn Lisp seriously, just to prove him wrong and show him that Lisp is not as great as he thought it was. As they say, the rest is history.

I had actually dabbled a little bit in Lisp much earlier, trying out a dialect called XLisp on an Atari XL computer at the end of the '80s. Unfortunately, it took too long to start up XLisp, and there was not enough RAM left to do anything interesting beyond toy examples; plus, I was probably not smart enough yet to really get it. I was just generally curious about programming languages. For example, I also remember trying out some Prolog dialect on my Atari XL.

In the end, Lisp won me over because it turns out that it is the mother of all languages. You can bend it and turn it into whatever language you want, from the most flexible and reflective interpreted scripting language to the most efficient and static compiled production system. For example, the scheduler mentioned above easily gets into the range of equivalent C/C++-based schedulers (Cilk+, TBB, or OpenMP, for example), typically only a factor of 1.5 away for typical benchmarks, sometimes even better. On the other hand, ContextL uses the reflective features of the CLOS Metaobject Protocol to bend the generic function dispatch in really extreme ways. I am not aware of any other programming language that covers such a broad spectrum of potential uses.
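To give a flavor of what ContextL lets you do, here is a minimal sketch of context-oriented programming with layers; the layer and function names are made up for illustration, and it assumes ContextL has been loaded (e.g. via Quicklisp):

    ;; Behavior is grouped into layers that can be activated at runtime,
    ;; which changes how a layered generic function dispatches.
    (defpackage :contextl-demo (:use :cl :contextl))
    (in-package :contextl-demo)

    (deflayer logging)

    (define-layered-function compute (x))

    (define-layered-method compute (x)
      (* x x))

    ;; This method only participates in dispatch while LOGGING is active.
    (define-layered-method compute :in-layer logging :before (x)
      (format t "computing with ~a~%" x))

    (compute 4)                     ; => 16, silently
    (with-active-layers (logging)
      (compute 4))                  ; prints a trace line, then => 16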


What's the most exciting use of Lisp you had?

When I decided to make a serious attempt at learning Common Lisp, I was looking for a project that would be large enough to prove to myself that it is actually possible to use it for serious projects, but that would also be manageable in a reasonable amount of time. At that time, I was intimately familiar with the Java Virtual Machine architecture, because I had developed compilers for Java language extensions as part of my Diploma and PhD theses. So I decided to implement a Java Virtual Machine in Common Lisp. Under normal circumstances, I wouldn't have dared to do this, because it is quite a complex undertaking, but I had read in several places that Lisp is suitable for projects that you would normally not dare to attempt, so I thought I would give it a try.

Over the course of 8 weeks, at something like 2 hours per day (because I was still doing other stuff during the day), I was able to get a first prototype that would execute a simple "Hello, World!" program. On top of that, it was a portable (!) just-in-time compiler: it loaded the bytecode from a classfile, translated it into s-expressions that resemble the bytecodes, and then just called Common Lisp's compile function to compile those s-expressions, relying on macro and function definitions for realizing these "bytecodes as s-expressions." I was really impressed that this was all so easy to do.
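To illustrate what "bytecodes as s-expressions" means, here is a minimal sketch with made-up opcode handlers, not the actual code from that project:

    ;; Each "bytecode" becomes a macro that manipulates an operand stack;
    ;; a method body is translated into a list of such forms and handed to
    ;; COMPILE, so macroexpansion plus the native Lisp compiler act as a JIT.
    (defvar *stack* '())

    (defmacro jvm-iconst (n)  `(push ,n *stack*))
    (defmacro jvm-iadd ()     `(push (+ (pop *stack*) (pop *stack*)) *stack*))
    (defmacro jvm-ireturn ()  `(return-from method-body (pop *stack*)))

    (defun translate-method (bytecodes)
      "Compile a list of (opcode . args) into a Lisp function."
      (compile nil
               `(lambda ()
                  (block method-body
                    ,@(loop for (op . args) in bytecodes
                            collect `(,(intern (format nil "JVM-~a" (symbol-name op)))
                                      ,@args))))))

    ;; Roughly the bytecode for computing 2 + 3:
    ;; (funcall (translate-method '((iconst 2) (iconst 3) (iadd) (ireturn)))) ; => 5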

The real moment of revelation was this: to reuse as many of the built-in Common Lisp features as possible, I actually translated Java classes into CLOS classes and Java methods into CLOS methods. Java's super calls posed a problem, because it was not straightforward to handle them with plain call-next-method calls. Then I discovered user-defined method combinations, which seemed like the right way to solve this issue, but I was still stuck for a while, until I discovered that moving a backquote and a corresponding unquote around finally fixed everything. That was a true Eureka moment: in every other programming language that I am aware of, all the problems I encountered until that stage would have required a dozen redesigns, and several attempts to restart the implementation completely from scratch, until I found the right way to get everything in the right places. But Common Lisp is so flexible that at every stage in your development you can tweak and twist things left and right, and in the end you still get a convincing, clean, and efficient design. As far as I can tell, this is not possible in any other language (not even Scheme).
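For readers who have never seen one, here is a toy user-defined method combination, just to show the mechanism; it is not the one used in the JVM project:

    ;; A method combination that runs every applicable primary method,
    ;; most specific first, and collects all of their results.
    (define-method-combination collect-all ()
      ((primary () :required t))
      `(list ,@(mapcar (lambda (method) `(call-method ,method))
                       primary)))

    (defgeneric describe-value (x)
      (:method-combination collect-all))

    (defmethod describe-value ((x integer)) :integer)
    (defmethod describe-value ((x number))  :number)
    (defmethod describe-value ((x t))       :object)

    (describe-value 42)   ; => (:INTEGER :NUMBER :OBJECT)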


What do you dislike the most about Lisp?

There is not much to dislike about Lisp itself. There are some technical details here and there, some minor inconsistencies, but nothing that cannot be fixed in easy and straightforward ways. From a purely conceptual point of view, Common Lisp is one of the most complete and best-integrated programming languages, if not the most, and it covers a lot of ground. Some rough edges are just to be expected, because nothing is ever perfect.

What concerns me a lot more is that there is too much unwarranted arrogance in the Lisp community. I don't know exactly where this comes from, but some Lispers seem to believe, just because they understand some Lisp concepts, that they are much smarter than anybody else in the universe. What they are forgetting is that computer science as a discipline is very, very young. In one or two hundred years from now, Lisp concepts will be common knowledge, just like elementary algebra. There is no reason to be arrogant just because you know that the earth is round, even if most other people still believe that it is flat.


Describe your workflow, give some productivity tips to fellow programmers.

I strongly believe that the one thing that made me most productive as a programmer is my interest in doing some form of art. I used to spend a lot of time making my own music, both by myself with synthesizers and computers and in bands. I was also an actor in an amateur theater group. Art gives you a sense of making parts (notes, chords, melodies, rhythms, or acts, characters, plot lines) relate to each other and form a coherent whole. It also makes you aware that there is an inner view on a piece of music or a play, as seen by the artist, but also an outer view, as seen or heard by an audience, and if you want to make good art, you need to be able to build a bridge between those two views.

Programming is exactly the same: you need to make parts (functions, data structures, algorithms) relate to each other, and you need to bridge the inner view (as seen by the designer and implementer) and the outer view (as seen by the user of a library or the end user of the final software).

The important aspect here is that you need to be able to change perspectives a lot and shift between the local, detailed view, the global, architectural view, and the many different levels of a layered design. This is especially important when it comes to designs that incorporate meta-programming techniques and reflective approaches, but it matters for simpler designs as well.

Like in art, the concrete workflow and the concrete tools that work best vary a lot between people. It's also a good idea to just play around with ideas, expecting that most of them will turn out useless and need to be thrown away. Artists do this all the time. Artificial, seemingly nonsensical rules and restrictions can be especially enlightening (use only effect-free functions; use only functions that receive exactly one argument, not more, not less; make every function pass around an additional environment; use only classes with exactly two slots; and so on): try to build larger programs strictly following such rules, and this will make your mind a lot more flexible and train you to see new potential solutions that you wouldn't see otherwise.

(I actually believe that this is what makes fans of static typing so excited: static type systems always impose some artificial restrictions on your programs, and enforce them to the extent that programs violating these rules are rejected. If you then program in such a statically typed programming language, you will indeed have some interesting insights and see new solutions that you would otherwise miss. However, the category error that fans of static typing often seem to make is that they ascribe the results to the static type system itself, and therefore usually get stuck with one particular set or kind of rules.)

Apart from that, I believe that LispWorks is a really good development environment.


Among software projects you've participated in what's your favorite?

I don't know how to answer that. At any point in time, I'm always most excited by the one I'm currently working on. So far, I am quite proud of ContextL and the ClassFilters project for Java, because they are or were both used in one way or another in "real" applications. Closer to MOP is a favorite project of mine, because it makes me feel like I can give something back to the Lisp community from which I otherwise benefit so much. But this doesn't mean I dislike anything else I have done in the past.


One of your papers, "Reflection for the Masses", explores the ideas behind 3-Lisp, "a procedurally reflective dialect of LISP which uses an infinite tower of interpreters". Can you summarize them here?

That paper was actually mostly the work of Charlotte Herzeel; I was only her sounding board for detailing the ideas. Reflection is one of the essential concepts that was born out of Lisp. Every program is about something: for example, a financial application is about bank accounts, money and interest rates, and a shopping application is about shopping items, shopping carts and payment methods. A program can also be about programs, which turns it into a meta-program. Lisp macros are meta-programs, because they transform pieces of code in the form of s-expressions into other pieces of code. C++ templates are also meta-programs in the same sense. Some meta-programs are about "themselves," and thus become reflective programs.
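To make the point about macros as meta-programs concrete, here is a tiny example; the cart-empty-p and charge-customer names are just hypothetical stand-ins echoing the shopping application above:

    ;; A macro is a program about programs: it receives code as an
    ;; s-expression and returns another s-expression to be compiled.
    (defmacro my-unless (test &body body)
      `(if ,test nil (progn ,@body)))

    (macroexpand-1 '(my-unless (cart-empty-p cart) (charge-customer cart)))
    ;; => (IF (CART-EMPTY-P CART) NIL (PROGN (CHARGE-CUSTOMER CART)))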

3-Lisp is "procedurally reflective" in two senses: On the one hand, it allows you to inspect and change the body of procedures (functions). Common Lisp, for example, also gives you that, in the form of function-lambda-expression and compile/eval, among others. On the other hand, 3-Lisp also allows you to inspect and alter the control flow. Scheme, for example, gives you that in the form of call/cc, but in 3-Lisp this is actually part of the eval interface. 3-Lisp goes further than Common Lisp and Scheme combined, in that it provides not only first-class access to function bodies and continuations, but also to lexical environments, always both with facilities to inspect and modify them, and provides a clean and integrated interface to all of these features. Unfortunately, because 3-Lisp goes that far, it cannot be fully compiled but always needs to be able to resort to interpretation, if necessary.
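As a small illustration of the first, Common Lisp style of procedural reflection mentioned above (a sketch only; implementations are allowed to return NIL here and discard the source):

    ;; Inspecting and rebuilding a function from its source, where available.
    (defun double (x) (* 2 x))

    (function-lambda-expression #'double)
    ;; => (LAMBDA (X) ...) in some implementations, NIL in others.

    ;; If the source is available, transform it and compile a new function:
    (let ((source (function-lambda-expression #'double)))
      (when source
        (compile 'quadruple (subst 4 2 source))
        (quadruple 10)))   ; => 40, when the source was recoverable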

The reason for the requirement to always have an interpreter around is that the eval interface in 3-Lisp is so flexible that you can run programs inside an eval invocation that in turn can inspect and change (!) the environment in which eval runs, and can then invoke further evals in those changed environments, to arbitrary levels of recursive invocations of eval. These recursive invocations of eval build what is called a reflective tower, where every level in the tower is conceptually an interpreter being executed by an interpreter one level up in the tower of interpreters. The amazing thing about 3-Lisp is that an implementation of 3-Lisp can actually collapse the tower into one level of interpretation, and arrange that one interpreter in such a way that the several different levels of interpretation are only simulated, so the tower is actually just an "illusion" created by the 3-Lisp implementation.

This may all sound quite esoteric, but it is actually practically relevant, because there is strong evidence that all meta-programming approaches eventually need such towers, and some ad-hoc way to collapse them. For example, you can find the tower in Racket's and R6RS's macro systems, where it is explicitly mentioned; in Common Lisp's macros, where eval-when is used to control which part of the tower sees which definitions; in the CLOS Metaobject Protocol, where the generic function dispatch can be influenced by other generic functions, which can in turn be modified by some meta-classes at a higher level; in the template metaprogramming system of C++, where "concepts" were devised for C++11 (and rejected) to introduce a type system for the template interpreter; and so on. If you understand the concept of a reflective tower better, you can also better understand what is behind these other meta-programming approaches, and how some of their sometimes confusing semantic difficulties can be resolved.
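A small Common Lisp illustration of that eval-when point, with hypothetical helper names: a helper function used by a macro has to be made visible to the compile-time level of the tower, or the macro cannot expand while the file is being compiled:

    ;; Without the EVAL-WHEN, a file compiler would not yet know
    ;; BUILD-GETTER-NAME when it expands DEFINE-GETTER below.
    (eval-when (:compile-toplevel :load-toplevel :execute)
      (defun build-getter-name (slot)
        (intern (format nil "GET-~a" (symbol-name slot)))))

    (defmacro define-getter (slot)
      ;; Runs at macro-expansion time, i.e. one tower level "up".
      `(defun ,(build-getter-name slot) (plist)
         (getf plist ,(intern (symbol-name slot) :keyword))))

    (define-getter price)
    ;; (get-price '(:price 42)) ; => 42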


If you had all the time in the world for a Lisp project, what would it be?

I have some ideas about how to design reflection differently for a Lisp dialect, which I believe have not been tried before. If I had all the time in the world, I would try to do that. I also have many other ideas, so I'm not sure if I would be able to stick to a single one.


Anything else I forgot to ask?

I think one of the most underrated and most underused Lisp dialects is ISLISP. People should take a much closer look at it and consider using it more often. In particular, it would be an excellent basis for a good teaching language.


Discussion on HackerNews
