pozorvlak: (Default)
Thursday, March 6th, 2008 12:16 pm
There's a meme out there that Java is the new COBOL. The rationale is summarised nicely in the first couple of paragraphs of this Wiki page, before they ramble off into a rant about how Smalltalk should have taken over the world:
Is Java the new Cobol? A good argument can be made that yes, it is. Considering the billions of lines of Cobol code that have been (or still are) in production; that observation should in no way be considered a slur. Java has succeeded in the marketplace (partially) because it is a language of its time; not one behind the times - or ahead of them.

Java is the new COBOL because it's becoming the new de facto standard for enterprise applications. Like COBOL, it's a good language for its time, but there are better ones. Like COBOL, it's bloated with features. Like COBOL, it's being used beyond its original view.
This comparison is appealing, but false. As I've been saying for years, Java is the new Ada.

Consider:
~~Java~~ Ada is a compiled, structured, statically typed, imperative, and object-oriented high-level computer programming language. It was originally designed by a small team led by ~~James Gosling~~ Jean Ichbiah. ~~Java~~ Ada emphasises "programming in the large", portability, multiprogramming using ~~threads~~ the rendezvous mechanism, and, in particular, the development of reliable software, which is aided by a combination of strong static typing and runtime checks inserted by the compiler. The restrictions of static typing are partially alleviated by the ability to create "generic" types and subroutines. ~~Java~~ Ada is particularly intended for development of software for embedded systems and hard real-time environments. The familiar ~~C++~~ Pascal-like syntax aids new learners, and a large standard library provides much common functionality as standard.
It's instructive to ask why Java took off when Ada largely didn't. There's no single answer, but I think the following factors explain a lot:
  • More programmers were familiar with C syntax than Pascal syntax (or associated Pascal syntax with limited Pascal environments they'd used at school).
  • Java came from Sun, a company with programmer cred, and had the names of famous and respected hackers (James Gosling, Guy Steele) attached to it. Influence leaders in the hacker community took it seriously. Ada came from the DoD and Honeywell, and some guy that nobody'd ever heard of.
  • Sun marketed the hell out of Java, and did so to the right people.
  • Early Ada compilers were very poor, and by the time good compilers were available, the language had acquired a bad reputation.
  • Mostly, though, I think it was timing. Ada was developed just before OOP went mainstream, so missed out on that bandwagon. By the time Java came along, OOP was well-established, and the masses were ready for a language in which OOP was built-in rather than bolted on. They were also familiar with the horrible complexity of C++, and were ready for something like C++ but less painful. "Reliable software" was a less effective rallying call back in 1980, unless you were developing control systems for aircraft (Ada's original niche). Ada was ahead of its time in many ways; Java, as the Wiki poster said above, was very much of its time.
There are some other suggestions here, from someone who appears to have been around at the time and is thus more reliable than me.

The new COBOL is, of course, Visual Basic.
pozorvlak: (babylon)
Thursday, February 21st, 2008 04:53 pm
I went to an all-day sales-pitch conference sponsored by Wolfram Research yesterday, all about just how damn cool the new version of Mathematica is. There's a lot to like: as a language, it seems to have a nice blend of Lisp-like and APL-like features, so you can do all your standard functional programming tricks and what looks like a decent subset of array programming tricks, as well as writing normal imperative code. The standard library is, of course, vast, with loads of clever symbolic, numerical, graphics and GUI code built in; in the new version there's also lots of standard geographical/scientific/financial/etc data available, import and export filters for loads of standard formats, and other niceties.

One thing I really liked was the Manipulate[] function: hand it an expression (which can evaluate to a number, a symbolic form, a graph, a 3D plot, a sound file, an animation, or whatever) and a list of parameters, and it will automagically construct a GUI widget with sliders and checkboxes that let you manipulate the parameters interactively and observe the result. You can even control the parameters using a gamepad, if you want... They seem to have made a major effort to make everything interoperate smoothly in the new version - one slightly silly demo they showed us put slider bars in the limits of an integral, and changed the value of the result as the bars were dragged about.

That sort of interoperation was always the major problem with open source mathematics software, in my limited experience - nothing does everything, so you have to learn N different incompatible sublanguages, write loads of glue code, and constantly switch applications. The Sage guys seem to be working on this, though - I'll have to check it out.

Have a look at the big collection of Mathematica demos at http://demonstrations.wolfram.com, which includes a lot of examples of Manipulate[]. There are videos, or you can download a free-as-in-beer notebook reader.
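
If you want to see the shape of the idea without a Mathematica licence, here's a toy, console-bound analogue in Haskell - entirely my own sketch, nothing to do with how Wolfram actually implement it. You hand it a function and a parameter range, and it re-evaluates the expression at each setting; a real Manipulate gives you a live slider instead of a sweep.

-- A crude console analogue of Manipulate[]: sweep a single
-- parameter over a range and re-evaluate the expression each time.
manipulate :: Show b => (Double -> b) -> (Double, Double, Double) -> IO ()
manipulate f (lo, hi, step) = mapM_ render [lo, lo + step .. hi]
  where render a = putStrLn (show a ++ " -> " ++ show (f a))

main :: IO ()
main = manipulate (\a -> sin a / a) (1, 5, 0.5)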

In other news, I've been having a bit of a play with the NetBeans IDE for Java, and really liking it. I've got used to doing everything in vi and the command-line, which has its upsides, but IDEs can make life so much easier for the beginner. In particular, NetBeans' wiggly red underlining has been a huge help in learning the language, and the integrated documentation browser is very nice. I haven't needed the automated refactoring support yet, but it's fun to play with - select! Click! Extract Method! :-)

But here's my question - why is it so slow? I know it's written in an interpreted language, but so is Emacs, and that runs without too much complaint on 1980s hardware. And the compiler's written in C, unless I'm much mistaken, and that's slow as hell too. Or, conversely, why were the compilers that shipped with Delphi and Turbo Pascal so fast? Simple Java programs take several seconds to compile on my 1GHz machine, where their Pascal equivalents would have compiled in an eyeblink on its predecessor's predecessor[1]. Is there something about Pascal that makes it especially easy to compile, and if so, what is it? Java seems at least as regular to me, and generating bytecode ought to be easier than generating native code. Or is Anders Hejlsberg just a ninja?

Thesis now at 63 pages and 22563 words, according to wc *.tex, which means that I've written 20 pages and, um, several thousand words in the last sixteen days (nearly 1000 words today, but many of those were "XXX proof here"). Progress is being made, though there's an awful lot still to do.

[1] I'm sure I've mentioned our fifteen-minute link times for our medium-sized C++ app when I was working at $company: while our network of file dependencies wasn't as bad as it could have been, it still resulted in the linker having to do a lot of work. And while compiling can be distributed easily around a network, linking can't :-(
pozorvlak: (kittin)
Saturday, February 16th, 2008 01:20 am
No thesis today - I had a six-hour job interview through in Edinburgh, for the maths consultancy people (who, it turns out, are university friends of my Canadian friend Jeff - small world!). The interview went reasonably well, though it could have been better - I was a bit stressed out by the whole interview situation and wasn't as sharp as I could have been. The maths questions (all based on tasks they've actually encountered in the field) were completely outside my area, and hence somewhat challenging for me, but I managed to come up with not-entirely-stupid answers to them without too many hints. The programming questions were mostly straightforward (what does this bit of recursive Prolog do, implement the following standard mathematical functions in Java, write a simple method against this random spec), apart from one where I had to find an error in some threaded Java code. Did I mention that I've never written threaded code before, and I don't speak Java very well? Then they gave me lunch, then brought me back to the office and asked me all the questions from the first round again. Apparently the answer I gave to the "where would you like to be in five years?" question was very good :-)

Anyway, the company looks really cool, and the people all seemed to be good guys, so fingers crossed on that one. They've got another couple of people to interview, and they'll get back to me some time in March.
pozorvlak: (gasmask)
Tuesday, February 5th, 2008 02:35 am
[livejournal.com profile] totherme kindly drew my attention to this blog post today. The author, Slava Akhmechet, tries to explain away that horrible feeling of unproductivity and frustration that I know so well from my attempts to use Haskell: his claim is that it's just that my expectations are miscalibrated from using imperative languages. A Haskell solution to a given problem, he claims, will take the same amount of thought as a solution in another language, but much less typing: by simple statistics, therefore, you're going to spend a lot of time staring at the screen not doing anything visible, and this can feel very unproductive if you're not used to it.

As an example, he gives the code
extractWidgets :: (Data a) => a -> [String]
extractWidgets = nub . map (\(HsIdent a)->a) . listify isWidget
    where isWidget (HsIdent actionName)
              | "Widget" `isSuffixOf` actionName = True
          isWidget _ = False
Bleh! :-( This function takes a parse tree representing some Haskell code, and extracts all the identifiers ending in "Widget", then removes the duplicates. Slava challenges any Java programmers reading to do the same in five lines or fewer.
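
For reference, here's roughly how you'd drive it - a minimal sketch, assuming the old haskell-src and syb packages (which is where parseModule, HsIdent and listify live), plus a hypothetical Widgets.hs to scan; these are the imports Slava's snippet leaves out:

import Data.Generics (Data, listify)
import Data.List (nub, isSuffixOf)
import Language.Haskell.Parser (parseModule, ParseResult(..))
import Language.Haskell.Syntax (HsName(..))

main :: IO ()
main = do
    src <- readFile "Widgets.hs"  -- hypothetical input file
    case parseModule src of
        ParseOk ast         -> mapM_ putStrLn (extractWidgets ast)
        ParseFailed loc err -> error (show loc ++ ": " ++ err)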

( Pointless language pissing match, including some Arc coding )

On language concision in general: I typically find that Haskell requires fewer lines for a given program, but Perl requires fewer characters, and they both use about the same number of tokens. Lisp is longer and messier than Haskell for easy problems, but quickly gains ground as the problems get harder.[1] The APL family own on all three axes. This is extremely unscientific, of course, and because I don't know much APL/J I can't be sure how it extends to harder problems. I did once email Paul Graham asking why, if succinctness was power, he didn't switch to a member of the APL family; he has yet to reply, but I don't attach any great significance to this.

And having got all that stuff out of my head, I'm going back to bed. Hopefully I'll be able to sleep this time :-)

[1] Fun exercise: translate the example code in On Lisp into Haskell (or Perl, Ruby, etc.). It really helps you to get to grips with the material in the book. I found that the Haskell solutions were shorter and cleaner than the Lisp solutions up until about a third of the way through the book, at which point Lisp started to catch up with a vengeance: shortly thereafter, he was doing things in a few lines of Lisp that simply couldn't be done in finitely many lines of Haskell. I'd be very interested to see solutions written by someone who knew what they were doing, though!
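
To give a flavour of the early chapters: Graham's group utility, which chops a list into length-n chunks and takes about ten lines of Lisp in the book, comes out something like this in Haskell (my translation, so caveat lector):

-- On Lisp's `group`: split a list into chunks of length n,
-- e.g. group 2 [1..5] == [[1,2],[3,4],[5]].
group :: Int -> [a] -> [[a]]
group n _ | n <= 0 = error "group: chunk length must be positive"
group _ []         = []
group n xs         = take n xs : group n (drop n xs)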
pozorvlak: (Default)
Wednesday, December 19th, 2007 08:14 pm
An email from the people to whom I applied:
( I missed the point entirely, it seems :-( )
I wouldn't mind so much if they'd asked for a "Personal Statement" or something - to me, the term "Research Proposal" suggests that it should mainly be about the research that you propose to undertake. But apparently not. I've got until, er, tomorrow to submit another one :-(

In other news, hearken ye programmers unto Steve Yegge's latest drunken blog rant. I've been having similar thoughts myself, related to Bad Things that have happened to me with big codebases: it's a large part of why I'm so interested in the APL family*. But I'd like to stick my neck out and say that the way Steve feels about Java is the way I feel about Haskell.

* I note in passing that that page comes up first in a Google search for "APL lesson" - epic win!