Archive for the ‘History of Computing’ Category

On porting an ACE program to HTML5 (among other things)

March 1, 2017

In recent times I’ve been thinking about ACE BASIC, a compiler for the Amiga I stopped working on just over 20 years ago; nostalgia’s setting in I guess. A few years ago I wrote a bit about ACE in relation to BASIC’s 50th; there’s more in these 1994 and 1997 articles.

As mentioned in the 50th post, a wonderfully thriving community built up around ACE between 1991 and 1996 centred upon an active mailing list and contributions of ideas, example programs and tools. I have been mildly (but pleasantly) surprised by a couple of things since I stopped development:

  1. Continued use of ACE on Amiga hardware and emulators for at least a decade afterward.
  2. A project to modify the code generator for post-mid-90s Amiga operating systems and additional targets such as PowerPC and Intel.

Among other things, I’ve been thinking about a re-write for a modern platform or target, e.g. HTML5. The world of the 90s was still very platform-centric, but in the same year I stopped developing ACE, version 1.0 of the Java Development Kit was released, putting the power of Java and its virtual machine into the hands of eager programmers. Java and JavaScript helped to consolidate the browser as an important platform and to define the shape of web development in a post-CGI (Common Gateway Interface, not Computer Generated Imagery) world.

A new compiler or interpreter is a non-trivial task, especially in my current spare-time-poor state, but I wanted to explore how an ACE program could be rewritten for an HTML5 context.

One of my favourite ACE programs was an implementation of IFS (Iterated Function Systems) to generate simple 2D structures such as ferns, trees, the Sierpinski Triangle and so on. So I started with this. It’s simple yet complex enough to allow for a comparison of approaches.
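
For readers unfamiliar with the technique, the heart of such a program is the so-called chaos game: repeatedly pick one of a handful of affine transforms at random (weighted by probability), apply it to the current point, and plot the result. Here is a minimal JavaScript sketch of that idea against an HTML5 canvas; it is not a port of ifs.b, and the coefficients are the standard published Barnsley fern set rather than whatever the original DATA statements contained.

    // A minimal chaos-game sketch of the IFS idea (not the original ifs.b code).
    // Each transform row is [a, b, c, d, e, f, probability] for
    // x' = a*x + b*y + e, y' = c*x + d*y + f.
    const transforms = [
        [ 0.00,  0.00,  0.00, 0.16, 0, 0.00, 0.01],
        [ 0.85,  0.04, -0.04, 0.85, 0, 1.60, 0.85],
        [ 0.20, -0.26,  0.23, 0.22, 0, 1.60, 0.07],
        [-0.15,  0.28,  0.26, 0.24, 0, 0.44, 0.07]
    ];

    function drawFern(canvas, iterations) {
        const ctx = canvas.getContext("2d");
        ctx.fillStyle = "green";
        let x = 0, y = 0;
        for (let i = 0; i < iterations; i++) {
            // Pick a transform at random, weighted by its probability.
            let r = Math.random(), t;
            for (t of transforms) {
                if (r < t[6]) break;
                r -= t[6];
            }
            const [a, b, c, d, e, f] = t;
            const nx = a * x + b * y + e;
            const ny = c * x + d * y + f;
            x = nx;
            y = ny;
            // Map the fern's natural coordinate range onto the canvas and plot one pixel.
            const px = Math.round((x + 2.5) * canvas.width / 5.5);
            const py = Math.round(canvas.height - y * canvas.height / 10.5);
            ctx.fillRect(px, py, 1, 1);
        }
    }

    // Usage: drawFern(document.getElementById("ifsCanvas"), 50000);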

Here are a few observations on the original IFS ACE source code (ifs.b) and my initial HTML5 port of the code.

  • JavaScript in the browser appears to be faster than ACE on the Amiga. Sure, processors and clock speeds have improved since the mid-90s, but ACE generated native 68000 assembly code. Then again, thinking of JavaScript as an interpreted language is simplistic now that just-in-time compilation is in widespread use.
  • The ACE code is quite data-centric. DATA statements containing comma-separated values are read into two-dimensional arrays, so the original data is not close to where it’s used and it’s not clear what the numbers represent. I could have taken this approach in the port, but chose instead to create a data object, a map, close to the point of use, copying the original array names (bad in some cases: a, b, c; okay in others: xoffset, yscale) from the ACE program for use as map key names, to make the correspondence easier to see.
    • This meant first transposing the data (in Excel) so that DATA columns became rows.
    • Preserving the existing DATA organisation could be accomplished by introducing functions such as data() and read() that create and read, respectively, a pool of data values; a sketch of this idea appears after this list. For future DATA-centric ACE program ports, I’ll try that approach.
  • In ACE, the creation of a menu and its items is simple, as shown by the creation of the Project menu below; this menu name is an Amiga convention. Shortcut key letters are optional.
    • menu 1,0,1,"Project"
      menu 1,1,1,"Sierpinski Triangle"
      menu 1,2,1,"Square"
      menu 1,3,1,"Fern"
      menu 1,4,1,"Tree #1"
      menu 1,5,1,"Tree #2"
      menu 1,6,1,"Sunflower"
      menu 1,7,0,"-------------------"
      menu 1,8,1,"Help...","H"
      menu 1,9,1,"About...","A"
  • Compare this with the odd combination of HTML, CSS and JavaScript in this initial attempt at a port.
  • On the other hand, ACE’s (and so AmigaBASIC’s) reliance upon numeric identifiers is almost as unreadable as a collection of DATA statements. The MENU statements above declare the Project menu to be the first (from the left of a menu bar), with each menu item numbered in order of desired appearance and 1 or 0 enabling or disabling the menu item. Subsequent enable/disable operations on menus must refer to the same combination of numeric IDs, e.g. menu 1,2,0 would disable the Square item. Also, menu item selection handling is a bit awkward in ACE.
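
Here is the data()/read() sketch mentioned above: a pair of hypothetical helper functions (not currently part of the port) that emulate BASIC’s DATA and READ statements with a simple pool of values.

    // Hypothetical data()/read() helpers emulating BASIC's DATA and READ
    // statements with a simple pool of values; not part of the current port.
    const dataPool = [];
    let readIndex = 0;

    // data() appends values to the pool, as a DATA statement would.
    function data(...values) {
        dataPool.push(...values);
    }

    // read() returns the next value from the pool, as READ would.
    function read() {
        if (readIndex >= dataPool.length) {
            throw new Error("Out of DATA");
        }
        return dataPool[readIndex++];
    }

    // Usage, mirroring: DATA 0.85, 0.04, -0.04, 0.85 followed by READ a, b, c, d
    data(0.85, 0.04, -0.04, 0.85);
    const a = read(), b = read(), c = read(), d = read();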

The code eventually morphed into what I’ve dubbed ACEjs, in the spirit of some other JavaScript library/frameworks. I’m not claiming any novelty here. The idea was to answer the question: how might ACE code look in a modern context? I’m less concerned with slavishly preserving the look and feel of the program, i.e. I’m not trying to make it look like it’s running on an Amiga. I just want to make it functionally equivalent.

Here’s a screenshot of the simple example ifs.b program in ACEjs form:

IFS in ACEjs

I don’t currently have a screenshot of ifs.b running on an Amiga or an emulator.

In any case, the outcome so far is that I have made progress toward an ACE-inspired JavaScript library for HTML5. Here are some key aspects:

  • CSS, DOM, and jQuery (so JavaScript) serve as the assembly language of the web here, but only JavaScript code needs to be written.
  • Functions like menu(), window(), dialog() manipulate the DOM to add elements (canvas, list etc) via jQuery under the hood; a rough sketch of the idea appears after this list.
    • A menu declaration corresponding to the ACE code’s Project menu (with the Help and separator items omitted) follows. One key difference is that menu items are paired with callback functions (e.g. see sierpinski below); another is that there is currently no support for shortcut keys:
      • acejs.menu("Project", [
            ["Sierpinski Triangle", sierpinski],
            ["Square", square],
            ["Fern", fern],
            ["Tree #1", tree1],
            ["Tree #2", tree2],
            ["Sunflower", sunflower],
            ["About...", about]
        ]);
    • A window declaration that adds a framed canvas looks like this:
      • wdw_id = acejs.window("IFS", 640, 400);
    • and operations on the window make use of an ID:
      • acejs.clear(wdw_id);
      • acejs.pset(wdw_id, outX, outY, pixelColor);
    • Multiple menus and windows can be created.
    • acejs.css is almost non-existent currently. I’m sure someone who delights in CSS could make it look suitably dark and brooding with their eyes closed. I make no claim to have any special talent in web design.
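
For illustration, here is one plausible way a menu() helper could add a menu bar entry to the DOM with jQuery. It’s only a sketch of the general approach, not the actual ACEjs implementation, and the #menubar id is made up.

    // Illustrative only: one way a menu() helper might add a menu to a menu
    // bar with jQuery. The real acejs.menu() may be structured differently.
    function menu(title, items) {
        // Ensure there is a menu bar to attach to.
        let $bar = $("#menubar");
        if ($bar.length === 0) {
            $bar = $('<ul id="menubar"></ul>').appendTo("body");
        }
        // Add the menu title with an initially hidden list of items beneath it.
        const $menu = $("<li>").text(title).appendTo($bar);
        const $items = $("<ul>").appendTo($menu).hide();
        $menu.on("click", () => $items.toggle());
        // Each item is a [label, callback] pair, as in the Project menu above.
        for (const [label, callback] of items) {
            $("<li>").text(label).on("click", callback).appendTo($items);
        }
    }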

There’s arguably no need for a compiler or interpreter. JavaScript’s iteration, selection, and expressions are adequate. Having said that, ACEjs could form the basis of a target if I ever chose to write another ACE compiler or interpreter (with all that spare time of mine).

With ACEjs you only have to write an app.js source file for your application and use a standard index.html that brings in your code and whatever else is needed, in particular acejs.css (trivial right now) and acejs.js. The only special thing you have to do is to define an init() function in app.js to be invoked by the framework. The best way to see how this works is to look at the example.
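
To give a feel for the shape of such an app.js, here is a cut-down sketch based only on the calls shown above; the real example application does more (one menu item per IFS shape, an About dialog, the actual plotting loop, and so on).

    // A cut-down app.js sketch based only on the ACEjs calls shown above.
    let wdw_id;

    function sierpinski() {
        acejs.clear(wdw_id);
        // ... iterate the IFS and plot points with acejs.pset(wdw_id, x, y, color) ...
    }

    // init() is the entry point that the framework invokes.
    function init() {
        acejs.menu("Project", [
            ["Sierpinski Triangle", sierpinski]
        ]);
        wdw_id = acejs.window("IFS", 640, 400);
    }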

You can either download the contents of the public_html directory and open index.html in your browser or see the example application running here.

In early 2000 I wrote an article for Sky & Telescope (S&T) magazine’s astronomical computation column entitled Scripting: a programming alternative, which proposed JavaScript as a suitable alternative to BASIC, the language long used by S&T and others to disseminate astronomical programs. Even at that time, JavaScript was arguably the only programming language interpreter available on almost every personal computer, by virtue of the ubiquity of web browsers.

In essence, JavaScript had become the equivalent of the BASIC interpreter every old personal computer (formerly called microcomputers, especially in the 80s) once had. I made the example programs from the article available and experimented further; some examples show the original BASIC listing along with the JavaScript implementation.

A variant of the ideas that led to ACEjs is revealed in what I said on this page:

Peter Girard has suggested the creation of an ECMAScript code library for astronomical algorithms.

An idea I’ve had is to write a BASIC (which dialect: GWBASIC, QBasic, etc?) to ECMAScript translator, written in ECMAScript or Java. One could paste BASIC code into a text area on a web page, and have ECMAScript and HTML code generated on the fly. This would make the BASIC code on Sky & Telescope’s web site available as interactive programs. Or, it could generate a listing, making Peter Girard’s idea of a code library easier to achieve.

Of course, there are now plenty of examples of BASIC interpreters written in JavaScript, e.g. here’s a QBasic implementation that generates bytecode and uses canvas. Then again, as I have noted, my aim was not to slavishly recreate the exact look & feel of the original platform.

S&T showed some initial interest in JavaScript, and again in 2005 regarding an orbit viewer page I wrote that combined JavaScript, a Java applet and cross-domain AJAX, back when Internet Explorer allowed that and before CORS was a thing.

Of course since then and for all kinds of reasons, JavaScript has come to dominate rich client browser applications, especially after the introduction of AJAX, and has generally become the assembly language of the web. More recently we’ve seen the rise of Node.js, an explosion of JavaScript web frameworks (Angular, React, …), and mobile JavaScript development frameworks like Apache Cordova. JavaScript has good points and bad along with detractors aplenty, but it’s hard to argue with its success.

History has shown that a programming language does not have to be perfect to succeed. I love C, but it’s far from perfect and holes in its type system allow one to, as the saying goes, “shoot one’s foot off”. Additionally, these same holes are responsible for security vulnerabilities in the operating systems we rely upon. Notice, I’m not saying that C itself is responsible (it’s not a person or a company) for exploits of those vulnerabilities; that’s attributable to the moral barrenness of the people involved. It’s unlikely that we’ll see the sum total of mainstream OS-land rewritten in safer languages (Rust, Haskell, …), to save us from ourselves, anytime soon.

But I digress…

I could repurpose ACE to generate JavaScript, but we are living in a time of “programming language plenty”. Creating a new language today should be considered a last resort. Domain Specific Languages, sure. Libraries and frameworks, perhaps. New languages? Looking at what’s available first before reinventing the wheel should be considered a responsibility. Also, a language is no longer enough by itself. You need an ecosystem of tools (IDE, debugger at least) and libraries for anyone to care enough to want to use your shiny new language beyond very simple programs. ACE had a couple of IDEs but no debugger. Heck, I didn’t even use a debugger when writing the compiler! Now I seem to live in source level debuggers. I’m obviously getting soft. 🙂

When I was a junior academic in the computing department at UTAS in the mid-90s, upon learning about my development of ACE, a senior and sometimes less-than-tactful colleague remarked that creating a new language was, as he so delicately put it, “a wank”. I disagreed. ACE was about providing the power of a compiled language for a particular platform (Amiga) to people who knew an interpreted language (AmigaBASIC), wanted to leverage that experience and existing code and didn’t feel confident enough to learn the dominant systems-level language of the time (C). It was also about improving the precursor language.

Now, I would agree that the decision to create a new programming language or library requires some circumspection, at the very least. But the programming language landscape has expanded a lot since the mid-90s. There is of course value in writing an interpreter or compiler just for the learning as an end in itself, and every computer science or software engineering student should do so.

So, in the end: why ACEjs?

In part because I wanted to explore simple ways to write or port a certain class of application (e.g. old ACE programs) to client-side web applications.

Partly out of a sense of nostalgia.

In part because I want to learn more JavaScript, Canvas, jQuery and jQuery-ui and the subtle differences between JavaScript versions.

Mostly, I wanted to get a bundle of ideas out of my system, which I’ve more or less done.

ACEjs is a simple starting point and if it’s useful to me, I’ll continue to improve it; if not, it will happily fade away. So far, I’ve tested it using Chrome version 56 and Safari version 9 and ECMAScript (the underlying JavaScript standard) 5 and 6.

Finally, look in the About box of the example application for a small dedication, also present in the even simpler About box of ifs.b; my wife spent far too long listening to me talk about programming languages and compilers in the 90s. Now the talk is more likely to be about variable stars. Thanks Karen. We both like ferns as well, IFS generated or natural. 🙂

In any case, enjoy. Feedback welcome.

Marvin Minsky (1927 to 2016)

January 31, 2016

MIT Artificial Intelligence (AI) pioneer Marvin Minsky died at the age of 88 on January 24th in Boston.

Marvin Minsky

See this EE Times blog post for a good summary.

Marvin Minsky’s participation in the 1956 Dartmouth Conference along with John McCarthy, Nathaniel Rochester, and Claude Shannon gave rise to the term artificial intelligence. While considerable progress has been made in the domain of machine intelligence, Minsky’s book The Emotion Machine deals with some of the more recalcitrant aspects of human-level intelligence.

His work ranged widely, from early work on neural networks (perceptrons) and connectionism to symbolic or classical AI, including expert systems. I took university classes in classical AI and enjoyed experimenting with neural networks in the 90s, but my knowledge representation Master’s thesis was very much in the symbolic camp.

Minsky provided advice for the movie 2001: A Space Odyssey regarding the delusional HAL 9000 computer. He made this remark about the film:

Kubrick’s vision seemed to be that humans are doomed, whereas Clarke’s is that humans are moving on to a better stage of evolution.

I’ll end with more Minsky quotes that provide some insight into this influential man’s thought processes. I particularly like the final tongue-in-cheek comment.

No computer has ever been designed that is ever aware of what it’s doing; but most of the time, we aren’t either.

If you just have a single problem to solve, then fine, go ahead and use a neural network. But if you want to do science and understand how to choose architectures, or how to go to a new problem, you have to understand what different architectures can and cannot do.

I believed in realism, as summarized by John McCarthy’s comment to the effect that if we worked really hard, we’d have an intelligent system in from four to four hundred years.

Peter Naur’s passing

January 21, 2016

Danish computer science pioneer Peter Naur died on January 3, 2016, after a short illness, aged 87.

Peter Naur (source: http://www.naur.com)

 

Peter Naur received the ACM Turing award in 2005 for “…fundamental contributions to programming language design and the definition of Algol 60, to compiler design, and to the art and practice of computer programming”.

He is best known as the original editor of the Algol 60 Report, and as the “N” in BNF or Backus-Naur Form (with John Backus, of Fortran and other fame), first used to describe the syntax of Algol 60. He objected to this and thought BNF should denote Backus-Normal Form instead. Nevertheless, MacLennan (1983), in Principles of Programming Languages: Evaluation and Implementation, notes the following about the connection between BNF and Naur:

Peter Naur, then the editor of the Algol Bulletin, was surprised because Backus’s definition of Algol-58 did not agree with his interpretation of the Algol-58 report. He took this as an indication that a more precise method of describing syntax was required and prepared some samples of a variant of the Backus notation. As a result, this notation was adopted for the Algol-60 report…

I gave examples of BNF from the Report, along with Algol code fragments, in a talk to the Australian Computer Society about the 50th anniversary of Algol 60. Compiler construction tools like lex & yacc arose from the creation of BNF, and variations such as EBNF (with which I have spent more time) led to more expressive and concise programming language grammars and still more powerful tools.

Alan Perlis commented in 1978, with a pun on the begin and end keywords used to delimit code blocks, that:

Algol’s is the linguistic form most widely used in describing the new exotic algorithms…Where important new linguistic inventions have occurred, they have been imbedded usually within an Algol framework, e.g. records, classes, definitions of types and their operations,…, modules. Algol has become so ingrained in our thinking about programming that we almost automatically base investigations in abstract programming on an Algol representation to isolate, define, and explicate our ideas…It was a noble begin but never intended to be a satisfactory end.

Others have remarked upon the contribution of Algol:

Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors. (1980 Turing Award Lecture, C.A.R. Hoare)

Lisp and Algol are built around a kernel that seems as natural as a branch of mathematics. (Metamagical Themas, Douglas Hofstadter)

Algol 60 lives on in the genes of Scheme and Pascal. (SICP, Abelson & Sussman)

Block structure, lexical scope, and recursion are just a few features that no Pascal, C/C++, Python, Java, or C# programmer would find surprising. Naur and his collaborators played a large part in shaping the programming languages we think and code in today. Scheme is a dialect of Lisp (a language predating Algol) that was nevertheless influenced by Algol, e.g. in its lexical scope and in its “language report” document approach (see the Revised Report on the Algorithmic Language Scheme).

As Perlis alludes to above, many of the beneficiaries of the descendants of Algol are unaware of how their language constrains the way in which they think about programs and programming, the Sapir-Whorf hypothesis in action.

The late Dutch computer scientist Edsger Dijkstra remarked:

Several friends of mine, when asked to suggest a date of birth for Computing Science, came up with January 1960, precisely because it was Algol 60 that showed the first ways in which automatic computing could and should and did become a topic of academic concern.

Naur started his career as an astronomer, but changed his profession after encountering computers. He was not fond of the idea of programming as a branch of mathematics and saw it as very much a human activity, the sub-title of his 1992 book, Computing: A Human Activity. Section 1.4, entitled Programming as Theory Building, challenges the simplistic Agile mantra that source code is all that matters; in fact, like a JPEG image file, source code can be seen as a lossy format, a distillation of a programmer’s thoughts with lost context, mitigated only in part by appropriate comments (an art form in itself).

Peter Naur (source: https://en.wikipedia.org/wiki/Peter_Naur)

In Programming as Theory Building, Naur outlines scenarios relating to writing a compiler for similar languages and the development of a large real-time industrial monitoring system. In his words:

The conclusion seems inescapable that at least with certain kinds of large programs, the continued adaption, modification, and correction of errors in them, is essentially dependent on a certain kind of knowledge possessed by a group of programmers who are closely and continuously connected with them.

Naur was offered a professorship in computer science at the University of Copenhagen in 1969. He thought that computer science was fundamentally about the nature and use of data, didn’t much like that phrase, and coined the term datalogy, which he thought had a more human orientation, giving rise to what has been called the Copenhagen interpretation of computer science (as opposed to that of quantum physics).

Later in his career, Peter Naur was critical of contemporary conceptions of science and philosophy, developed a theory of human thinking called Synapse-State Theory of Mental Life, contrasted human vs computer thinking (see this video) and rejected the possibility of strong AI. I may not agree with all of Naur’s ideas in this area, but consider them worth hearing.

As an aside, this Y-Combinator post about Naur’s passing makes the interesting observation that a number of widely used programming languages other than Algol 60 have Danish origins, e.g.

  • PHP (Rasmus Lerdorf)
  • Turbo Pascal, Delphi, C# (Anders Hejlsberg)
  • C++ (Bjarne Stroustrup)

It has occurred to me increasingly in the last few years that many of our field’s pioneers are reaching the end of their lives. Even confining oneself to those associated with programming language design and implementation, since 2010, at least the following have died:

  • Dennis Ritchie: C
  • John McCarthy: Lisp
  • Robin Milner: ML
  • Peter Naur: Algol

Before I started writing this post, I knew about Naur’s association with Algol 60 and BNF, but that’s about all. Now I’ve discovered that, like so many pioneers, he had depths I have only just started to explore, even if confining myself to his work Computing: A Human Activity.

 

BASIC’s 50th, early micros, and ACE BASIC for the Amiga

May 4, 2014

I enjoyed reminiscing about BASIC when it recently turned 50, on May 1 2014. I learned more about the events surrounding the creation of Dartmouth BASIC from the Dartmouth web pages and especially from interview videos with co-inventors John Kemeny and Thomas Kurtz. Given my development of ACE BASIC for the Amiga in the mid-90s, the BASIC programming language has a special place in my heart. More about ACE shortly. My first experience with BASIC and programming in general was in 1977, in my second year of high school (Norwood High). Our class marked up a deck of cards (in pencil) with a BASIC program and submitted them to Angle Park Computing Centre. A week or two later I remember receiving a printout of a partial run, showing an ASCII plot of some function (a deceleration curve I think) tantalisingly cut short by an error, the details of which I don’t recall.

At the time I thought that was an interesting experience but set it aside. As I described here, in 1978, the school bought a PDP-11 and installed it in an air-conditioned room complete with a card reader, printer, and terminals. I remember seeing the machine for the first time, gawking in wonder through the glass window in the door to the room. For the first 6 months most students were only allowed to create card decks rather than using a terminal. At least the turnaround time was less than for Angle Park: you could get your program run’s print-out, correct errors in your card deck and submit it again via the card reader’s hopper.

Apart from a small amount of class-time exposure to the machine, I became a “computer monitor”, assigned on a roster to be there while others used the computer, given a modicum of responsibility for looking after the machine (e.g. card reader or printer problems, booting) but I didn’t learn too much more about the PDP-11 that way.

What really hooked me was eventually being allowed to use the terminals (pictured at right) and the interactive BASIC programming that entailed. There was plenty of competition for terminal time! One of the first interactive programs I wrote was a simple guess-the-number game in which the user was told whether a guess was less than or greater than the number the machine was “thinking” of. It seems trivial now, but that experience of interacting with an “artificial intelligence” (as it seemed to me at the time) was intoxicating and has stayed with me. Some fellow students started playing around with machine language on the PDP-11; that was a little beyond me at the time but an understanding of that level would become important for me later.

In the late ’70s, Tandy had a computer store in Gawler Place, Adelaide. I used to catch a bus into town on Friday nights, pull up a chair at a TRS-80 Model 1 on display and sit there for an hour or two typing in BASIC source code for games from a book; the sales people didn’t seem to mind too much. 🙂

When I’d finished year 12 of high school, had started working as a nurse in 1981, and was earning money, I bought a CASIO FX-702P: essentially a calculator, programmable in BASIC, with an interface for a cassette recorder (for programs and data) and a printer. Within a year or so, I had a Sinclair ZX-81 connected to my parents’ old HMV black and white TV in the country (where I visited most weekends): a big screen indeed! This odd little machine fired my imagination via its space-age programming manual cover. Adding the 16K RAM expansion pack (shown below at rear) allowed much larger programs to be written compared to the unexpanded 1K machine. Programming in BASIC while listening to music like Kraftwerk’s Computer World, with simplistic, German-accented lyrics like these:

I program my home computer. Beam myself into the future.

it somehow seemed that the future was coming fast and that it was going to be overwhelmingly positive. This was a time of innocent joy when nothing was standardised (hardware or operating systems), the term micro-computer was more likely to be used than personal computer, the sterile DOS-based IBM PC “business computer” was barely beginning to emerge and the Macintosh was not yet in sight.

The pages of magazines like Australian Personal Computer and Compute! were filled with BASIC program listings for specific machines just crying out to be adapted to other BASIC dialects. Reading books such as Christopher Evans’ The Mighty Micro (1979) filled me with optimism for the future. Reading Isaac Asimov’s I, Robot and the like fired my imagination, as did TV shows like Dr Who and Blake’s 7. To be honest, all of this was also somewhat of a welcome escape from the daily realities of being a young nurse.

My next machine was a Commodore PET (CBM 4016), built like a Sherman tank. I added a 5.25″ floppy disk drive (that had cooling problems!) and a dot matrix printer via the PET’s IEEE interface. I happily spent many weekends creating games in BASIC on this computer. I also wrote a version of Eliza-the-psychotherapist that kindled an interest in artificial intelligence and language processing. Occasionally entering the PET’s machine language monitor and programming in it got me thinking more about low-level concepts (processor registers etc). Reading a book called Programming the 6502 by Rodnay Zaks (1981) helped further my understanding.

That PET was followed by a VIC-20 and C-64 into the mid-80s, both of which I (of course) programmed in BASIC and a bit of hand-assembled 6502/6510 machine code POKEd into obscure areas of memory (such as the cassette buffer, not in use when a 5.25″ floppy disk drive was the secondary storage device). I started to gain some exposure to databases (SuperBase 64), word processors and other programming languages like Pascal. Interfacing with relay boards and sensors was also something I enjoyed using BASIC for with these machines, achieved by POKEing values into and PEEKing values from I/O memory locations.

In 1987, after a couple of years going off in various directions, I moved from Adelaide to Tasmania to work as a nurse (in ICU, Recovery etc) where I met my future wife, Karen. I didn’t have any computer with me there because I initially thought I’d only stay for a year or so, but ended up staying for a decade. My first computer purchase in Tasmania was an Acorn Electron, little brother to the BBC Micro and programmable in a BBC dialect of BASIC. I also learned a bit of LISP (from a cassette-loaded interpreter) using the Electron.

By far the most important computer purchase ever for me was a Commodore Amiga 500. I learned so much from that machine, initially programming it in AmigaBASIC and smatterings of machine code, then in C and a handful of other languages. The Amiga’s pre-emptive multi-tasking operating system and state-of-the-art graphics and sound capabilities were fantastic. It was also to this machine that I connected my first hard disk drive. I wrote simple astronomy programs, a simple drawing program for Karen, and created an alarm and security system with infra-red sensors, keypad, strobe light etc. It even woke me up (or more likely Karen, so she could come to my aid) if I went sleepwalking. 🙂 I also used the Amiga and C64 for the pre-Internet Viatel (Australia’s videotex system), bulletin boards, and CompuServe.

I took a statistics course at UniTas (Launceston) for fun in 1987 and a year or so later had started an Applied Computing degree there. I took a double major in computer science and philosophy. This ultimately led me away from a career in nursing and onto a software engineering career (after stints as a computer systems officer and a junior academic post-graduation). One of the subjects I took as an undergraduate was “Advanced Programming”, in which we wrote a compiler for a subset of Pascal that generated p-codes (similar to UCSD p-codes and not unlike Java VM bytecodes) rather than assembly or machine code for the native (Intel) machine. One outcome is that I became increasingly interested in programming language translation and programming paradigms (especially object oriented, functional, logic and concurrent). Another outcome is that I resolved to take that knowledge and write a compiler for the Amiga for a language that I myself would want to use, not just as an academic exercise.

In October 1991, I started development of ACE BASIC for the Commodore Amiga computer. It was released to testers in March 1992 and made available for public consumption in February 1993. Like the original Dartmouth BASIC, ACE was compiled, unlike many implementations that have been interpreters. ACE was a compiler for the interpreted Microsoft AmigaBASIC that shipped with the Amiga.

This article written for the online Amiga Addicts journal gives some idea of the history and motivations for ACE and here is an interview I gave in 1997 about ACE. Although the instruction set of the Amiga’s 68000 processor was not quite as orthogonal as the PDP-11’s, it was still really nice. ACE compiled BASIC source into peephole optimised 68000 assembly code.

A 68000 processor (Thomson TS68000)

 

This was assembled to machine code by Charlie Gibbs’ A68K assembler and linked against library code with the Software Distillery’s Blink linker (later I also used PhxAsm and PhxLnk). I wrote 75% of ACE’s runtime libraries in 68000 assembly, later waking up to the idea that C would have been a more productive choice. One upside is that I became quite comfortable working with assembly language. I’ve made use of that comfort in recent years when working with hardware simulator testing (ARM, PowerPC etc) and micro-controller compilers.

A wonderful community of enthusiastic users built up around ACE. I wrote an integrated development environment, and a community member wrote one too (Herbert Breuer’s ACE IDE is shown below).

Another member wrote a “super-optimiser” that rewrote parts of ACE’s generated assembly code to be even faster than I managed with my simple optimisations.

Herbert Breuer’s ACE IDE

ACE was guilty of a criticism levelled by the Dartmouth BASIC co-inventors (John Kemeny and Tom Kurtz) at many BASICs since their first: being machine-specific. But then that was the intent for ACE: to make programming the Amiga more approachable to more people, combining the simple abstractions of BASIC with the unique features of the Amiga and the run-time efficiency of a compiled language like C.

Given the Amiga’s demise, around 1996 I moved on to other platforms. I wrote a LISP interpreter for the Newton PDA (also doomed; I can pick ’em!) between 1998 and 2000. That was fun and had a nice small community associated with it, but it didn’t match ACE and its community.

I eventually came to possess PCs, programming them with a smattering of GW-BASIC, quite a lot of Turbo Pascal, some Microsoft Quick C, a little assembly, and Visual BASIC.

When Java appeared in 1996 I greeted it with enthusiasm and have been an advocate of it and the Java Virtual Machine, as a professional and spare-time software developer, on and off ever since. These days I’m more likely to code in Java, C/C++, Python (where once I would have used Perl) or perhaps R rather than a BASIC dialect, none of which denigrates BASIC.

The fact is that BASIC made early microcomputers accessible such that many of us interacted with them in ways more directly than is possible with modern computers (PCs and Macs), despite all their advantages and power. Arguably, we expected less from the machines yet engaged in highly creative relationships with them. Anyone who has spent much time programming will recognise the allure. The interactive nature of these early BASIC machines only added to this.

I agree with the famous Dutch computer scientist Edsger Dijkstra when he says that:

Computing Science is no more about computers than astronomy is about telescopes.

dijkstra

I also sympathise with his declaration that the BASIC statement GOTO could be considered harmful, due to the “spaghetti code” it leads to. But I don’t really agree with his assessment that:

It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

I heard the same propaganda from a University lecturer. Apparently some of us were able to be “rehabilitated”. Then again, along with his comments about BASIC, Dijkstra made some unkind comments about other programming languages, including COBOL, Fortran, and APL, for example:

The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.

With apologies to Grace Murray Hopper, I have more sympathy with this last one. 🙂

archives_hopper

The truth is that all programming languages are crude approximations of the Platonic ideal of bringing together two minds: one artificial, one natural. There are no programming languages about which I can say: there are no improvements possible here, and there are very few languages that make communion with the machine a beautiful experience. All are to be reviled and admired for different qualities. But BASIC, in all its dialects, with all its flaws, and with the benefit of 20/20 hindsight and large gobs of nostalgia, was mostly to be admired and was directly responsible for many of us falling in love with the idea and activity of programming.

The programming languages Tower of Babel

Viva la BASIC! Happy 50th anniversary!

Turing CODEBREAKER film available

November 13, 2013

In November 2012 I posted about the centenary of Alan Turing’s birth and mentioned the film CODEBREAKER about his life and tragic death. It’s finally available for general sale!

Update: I’ve just ordered my copy; looking forward to having this resource!

ACS History SIG talks (slides from 2010)

April 25, 2013

In 2010 I gave two talks in the context of the ACS History SIG, one about the 50th anniversary of the programming language Algol 60 and another that explored the scope of Computing History.

Alan Turing: a household name?

November 25, 2012

A hundred years have elapsed since the birth of the mathematician, codebreaker, and father of computer science, Alan Turing.

Due to space restrictions, a drastically shorter version of what follows appeared on page 16 of the November/December ACS Information Age magazine.

In response to an online petition in the lead-up to the centenary of Alan Turing’s birth, the British PM, Gordon Brown, said in 2009: “Turing was dealt with under the laws of the time, and we can’t put the clock back, his treatment was utterly unfair. On behalf of the British government and all those who live freely thanks to Alan’s work, I am very proud to say: we’re sorry. You deserved so much better.” [1] This statement concerned the appalling treatment Turing received for “gross indecency”, the charge made against him as a homosexual person living in the UK in the mid-20th century. His choices were chemical castration and jail. He chose the former, which affected his concentration and self-esteem, undoubtedly contributing to his apparent suicide via a cyanide-dipped apple in 1954.

It would be an understatement to say that Turing achieved much in his 42 years. He contributed to a fundamental problem in mathematics, in the process becoming the father of computer science prior to the existence of general purpose computing machines. He played a pivotal role in the Second World War as a Bletchley Park cryptanalyst for which he was awarded an OBE, wrote a seminal paper on the modeling of biological growth, worked on pioneering computer projects, and founded the field of Artificial Intelligence (AI).

For anyone in the computing field, Turing’s most important contribution was his 1936 paper “On Computable Numbers” and in particular, the abstraction he described and used to present the halting problem, now known as the Turing Machine, the conceptual essence of a general purpose computer.

Turing was keenly interested in algorithms and applications, independently arriving at the utility of the subroutine library. He wrote and optimised early library routines, e.g. for long division and random number generation, and investigated numerical analysis problems such as rounding errors. He wrote code relating to number theory for the Manchester computer. He also wrote a chess program that was only ever simulated on paper due to a lack of available computer time.

For those with more pragmatic inclinations: from 1945 to 1951, after his time at Bletchley Park and contemporaneous with ACS founder John Bennett’s work on CSIRAC, Turing was involved with pioneering computer projects including the design of the Automatic Computing Engine (ACE, later built as the Pilot ACE), the Manchester Baby or Small Scale Experimental Machine (SSEM), and the Ferranti Mark I, for which he wrote the Programmer’s Manual in 1951. The Pilot ACE is on display in the London Science Museum. His ACE design changed frequently; it was optimal in terms of hardware but complex to program. Turing said: “In working on the ACE I am more interested in the possibility of producing models of the brain than in the practical applications to computing”. [2]

Early insights into the nature of AI set down in a paper entitled “Computing Machinery and Intelligence”, published in the philosophy journal Mind in 1950, led to his notion of an “Imitation Game”, the now famous “Turing Test”, a means by which to determine whether a questioner is communicating with an entity with human-level AI.

The ACM presents the Turing Award annually to someone who has contributed something that is judged to be of major and lasting importance to the computing science field. One of its recipients, Alan Perlis, in 1966 said: “On what does and will the fame of Turing rest? That he proved a theorem showing that for a general computing device—later dubbed a “Turing Machine”—there existed functions which it could not compute? I doubt it. More likely it rests on the model he invented and employed: his formal mechanism. This model has captured the imagination and mobilized the thoughts of a generation of scientists”.

Arguably, Turing is to Computing as Einstein is to Physics. In 2005, there were celebrations worldwide of Einstein’s “year of miracles”. This year there have been similar celebrations of Turing’s birth 100 years ago. [3] Einstein and E=mc² are well known, but can the same be said of Turing and his Machine? Is he a household name along with Einstein? Many take for granted the existence of the computer, smart phones, and a myriad other computationally-enabled devices found in virtually every facet of our modern lives. We, as computing professionals, should strive to make better known the work of Turing and his contemporaries, and more generally, the broader history of our field.

I looked into the possibility of an Adelaide cinema screening of the film, CODEBREAKER, about Turing’s life (via TodPix) but received a response to say that there are no plans for a theatrical release in Australia; it was screened on SBS One in June 2012 on a 3 year contract, so perhaps it will be aired again.

Update (November 2013): CODEBREAKER is now available for sale on DVD!

References

  1. http://www.abc.net.au/radionational/programs/scienceshow/alan-turing-e28093-thinker-ahead-of-his-time/4034006
  2. Lavington, S. (ed.), 2012, “Alan Turing and his Contemporaries”, BCS
  3. http://amturing.acm.org/acm_tcc_webcasts.cfm
  4. http://www.turingfilm.com
  5. http://www.turing.org.uk/turing/
  6.  “The ACM Turing Award Lectures: The First Twenty Years”, 1987, ACM Press

Explaining how computers work with the TEC-1

May 30, 2010

I was recently asked to give a talk to my son’s primary school class about how computers work.

The PowerPoint slides from the talk consist mostly of pictures and, towards the end, a small Z80 machine code program (to add a number to itself) for the TEC-1 single board computer.

TEC-1, from the cover of Issue 10 of Talking Electronics Magazine

My wife and I created a short video showing the program being entered and executed multiple times via the TEC-1’s hex keypad.

As I told the kids during that talk, if you want to understand how a computer really works, you need to get close to the machine-level and talk about processors, memory, buses and so on. So we did, and despite leaving out a lot of details, I think the idea of going from X = X+X to a sequence of simple instructions and a numeric representation palatable to a Z80 made some sense to many of the kids, and at least provided a source of fascination to most. Apart from that, I think it was fun.

We also spent a lot of time talking about the extent to which computers now pervade our lives and how much we take that and the people whose ideas and work made it all possible for granted, including Babbage and Lovelace, Leibniz, Boole, Turing, and so many hardware and software pioneers.

Like many hobbyists in the 70s, 80s and beyond, I found the idea of building a simple computer from components in a kit alluring. I’ve been doing paid software development for almost 30 years but was a hobbyist for a decade or more before that. I was introduced to the Joy of Computing in Year 10 due to the purchase of a PDP-11/04 by my school (Norwood High in Adelaide) in the late 70s. Along with a love of astronomy that continues to this day, I maintained an interest in programming throughout the 80s, during which time I was a nurse. I eventually decided to convert one of my hobbies into a profession, but still maintain the attitude of a hobbyist, developing open source software such as my current project: VStar.

My hope is that I’ve instilled in at least some of those kids a hunger to know more about computers and programming.

Lisp’s 50th birthday

October 29, 2008

John McCarthy’s Lisp programming language is 50 years old (October 2008). Lisp is the second oldest programming language still in use today, next to Fortran.

John McCarthy

Lisp50 at OOPSLA 2008 celebrated Lisp’s contributions.

I celebrated by giving a talk to the Australian Java User Group in Adelaide about Clojure, a new dialect of Lisp for the JVM.

There’s a lot of interesting material to be found by Googling, but here are a few relevant links:

A decade ago I developed LittleLisp for the ill-fated Newton PDA.

There’s a nice parody song called The Eternal Flame which is all about Lisp, and there are some amusing xkcd Lisp cartoons.

Lisp still looms large:
  • in Emacs as Emacs Lisp (elisp);
  • in mature free implementations (e.g. take a look at PLT Scheme);
  • and in active commercial implementations (e.g. the LispWorks mailing list is very active).
Lisp refuses to lie down and die. In his 1979 History of Lisp paper, John McCarthy said:

One can even conjecture that LISP owes its survival specifically to the fact that its programs are lists, which everyone, including me, has regarded as a disadvantage. 

In ANSI Common Lisp, Paul Graham points out that Lisp has always put its evolution into the hands of its programmers, and that this is why it survives, especially via the macro systems as found in some dialects (e.g. Common Lisp, Clojure), which make the full power of the language available to generate Lisp code at compile time.

Irrespective of how widely used Lisp dialects are today, we should continue to remember its contributions to programming: code as data, higher order functions, application of functions to the elements of a list, an emphasis upon recursive solutions to problems, erasure of abandoned data (garbage collection), the Read-Eval-Print Loop (REPL), to name a few.

As for the future, it’s always uncertain. Here are some notes about the future of Lisp from the OOPSLA Lisp50 session, which suggests that Clojure may be a big part of that. Next year’s International Lisp Conference has the working title “Lisp: The Next 50 Years”. 
 
I’ll end with a quote from Edsger Dijkstra:

Lisp has jokingly been called “the most intelligent way to misuse a computer”. I think that description is a great compliment because it transmits the full flavor of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.

On the importance of pure research

January 14, 2007

I recently finished reading the book Engines of Logic (2000) by Martin Davis (apparently published as The Universal Computer in some countries) of Davis-Putnam SAT-solver algorithm fame, a book about the origins of computer science from the viewpoint of the mathematicians who founded it, in particular: Leibniz, Boole, Frege, Cantor, Hilbert, Godel and Turing.

Leibniz had the notion that it ought to be possible to write down ideas in a language (he called this a universal characteristic) such that “serious men of good will” could sit together to solve some problem by calculation, using an algebra of logic he referred to as the calculus ratiocinator.

Despite attempts at such a language and algebra of logic by Leibniz, it was ultimately the work of his successors that gave rise to the logic that made automated computation possible.

Of Leibniz’s work Davis said that “What Leibniz has left us is his dream, but even this dream can fill us with admiration for the power of human speculative thought and serve as a yardstick for judging later developments.” 

In the epilogue, Davis had this to say:

The Dukes of Hanover thought they knew what Leibniz should be doing with his time: working on their family history. Too often today, those who provide scientists with the resources necessary for their lives and work try to steer them in directions deemed most likely to provide quick results. This is not only likely to be futile in the short run, but more importantly, by discouraging investigations with no obvious immediate payoff, it shortchanges the future. 

These days universities and, it seems, too many other aspects of society are becoming shackled to the oft-times short-sighted and petty expectations of business, as if business mattered as an end in itself. We would do well to pay attention to history.

On the subject of history, it occurs to me increasingly that most of what we study is in fact historical in nature. Incremental advances in computer science, software engineering, astronomy, and Science in general are mere blips on the vast landscape of accumulated knowledge. When I read books such as Engines of Logic and The Art of Electronics, I am overwhelmed by the contributions of countless scientists and engineers over decades, to say nothing of the work of the founders of Science such as Copernicus, Galileo, Kepler, Newton, and Einstein.