Monday, January 7th, 2008 10:33 pm
This afternoon I gratefully took possession of the copy of Iverson's A Programming Language which the library had been holding for me. Within two sentences, he'd managed to say something so thought-provoking that I felt compelled to post about it here:
Applied mathematics is concerned with the design and analysis of algorithms or programs. The systematic treatment of complex algorithms requires a suitable programming language for their description, and such a programming language should be concise, precise, consistent over a wide area of application, mnemonic, and economical of symbols; it should exhibit clearly the constraints on the sequence in which operations are performed; and it should permit the description of a process to be independent of the particular representation chosen for the data.
(Italics his; bold mine1).

Whoa. I can't think of any mainstream language that "exhibits clearly the constraints on the sequence in which operations are performed" - the only languages I'm aware of that take this as a central concept are things like make and ant. Most languages force you to specify an explicit sequence for all operations; Haskell, schizophrenic beast that it is, gives you a choice of no constraints2 (in the purely functional sublanguage) or explicit sequencing (in the IO monad). You can probably get something more complicated by using unsafePerformIO, but that would as like as not cause demons to fly out of your nose. Scheme comes closest: forms in a begin have an explicit sequence, and the value of a function call depends on the values of its arguments, but the arguments themselves may be evaluated in any order (unlike in Common Lisp, where argument evaluation is left-to-right). But even that doesn't let you clearly express something like "A depends on B and C; B depends on D, and C depends on D and E":
          A
         / \
        B   C
         \ / \
          D   E
Now a language like that would indeed make parallel programming easy. And, I think, it could make programming in general a lot easier and more bug-free. I have no idea to what extent APL deals with this kind of thing (having only read a couple of sentences of the book...), but I'm excited to find out!
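As a rough sketch of that parallelism claim (my own illustration, not from the book): since B and C in the graph above don't depend on each other, they can be computed concurrently in plain GHC Haskell using forkIO and MVars from base. The functions f, g, h and the leaves d, e are hypothetical stand-ins.

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

-- Hypothetical stand-ins for the graph's functions.
g :: Int -> Int
g d = d * 2          -- B = g D

h :: Int -> Int -> Int
h d e = d + e        -- C = h D E

f :: Int -> Int -> Int
f b c = b + c        -- A = f B C

main :: IO ()
main = do
  let d = 10
      e = 5
  mb <- newEmptyMVar
  mc <- newEmptyMVar
  -- B and C are mutually independent, so each runs in its own thread;
  -- both may read D, and A simply waits for both results.
  _ <- forkIO (putMVar mb (g d))
  _ <- forkIO (putMVar mc (h d e))
  b <- takeMVar mb
  c <- takeMVar mc
  print (f b c)
```

With these stand-in definitions this prints 35; the point is only that the data dependencies, not the program text, dictate what must wait for what.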

Do any of you know of any languages where this kind of thing is possible? Which ones? And how?


1 I'll just note in passing that there's a lot of food for thought in the bit I haven't emphasised...
2 This is, of course, not quite right: the value of x + y depends on both the value of x and that of y. But thanks to laziness, a call to f x y might result in the evaluation of x, or y, or both, or neither, and you have no way of knowing without inspecting the code of f.
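To make that footnote concrete (my own illustration): the same call shape can force both arguments, one, or neither, depending only on the body of the function being called.

```haskell
addBoth :: Int -> Int -> Int
addBoth x y = x + y       -- forces both x and y

firstOnly :: Int -> Int -> Int
firstOnly x _ = x         -- never forces its second argument

neither :: Int -> Int -> Int
neither _ _ = 0           -- forces nothing at all

main :: IO ()
main = do
  print (addBoth 1 2)                  -- 3: both arguments evaluated
  print (firstOnly 1 undefined)        -- 1: undefined is never touched
  print (neither undefined undefined)  -- 0: neither argument is touched
```

If evaluation were strict, the last two calls would crash on undefined; under laziness they run fine, which is exactly why you can't know what gets evaluated without reading f.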
Thursday, January 10th, 2008 04:36 pm (UTC)
I think to express "A depends on B and C; B depends on D, and C depends on D and E" in Haskell, you do something like:

a = f b c
  where
    b = g d
    c = h d e

Yesterday I saw a talk called "XML Stream Processing Using a Lazy Concurrent Language". That guy presented an extension to Haskell that allows you to explicitly process some of your mutually independent arguments in parallel, using syntax along the lines of:

a = let b = g d
        c = h d e
    in f b c

So, I guess you might like that :)