The Smile Position Paper

A Thought from a Tweet

This morning I bumped into a quote from Paul Ford, and it was so good I just had to expand on it.  In designing Smile, I’ve thought about this for years — what matters in computation, and have I made the language focus on that? — but I think it’s worth writing it up as a position paper.  Ford hits the nail on the head:  Smile has an opinion, and it’s a strong opinion, just one that’s very different from most languages in use today.

Let’s first see how Ford describes the position of JavaScript (the only language he includes a fraction of a position paper for):

https://twitter.com/ftrain/status/1071149526747152389

The DOM, the user, running in a sandbox.  Dev speed, isomorphism, frameworks.  IMPATIENCE.  These are interesting foci, to be sure.  But not the ones I’d choose.

The Fifty-Year Question

Smile’s beginnings in the ’90s were prompted by the fifty-year question:  What would it take for a programming language to survive fifty years?  What would have to be part of the design in order for it to last?

We have no shortage of languages that were designed for the moment.  Or, worse, designed in the moment, like VBA.  But most languages are designed around a current need:  Perl came into existence because shell scripts and AWK weren’t quite good enough to munge the log-file data Larry Wall kept bumping into as a sysadmin.  C came into existence because Dennis Ritchie needed a language that he and Ken Thompson could build an OS in, and they didn’t have the hardware resources for a “big” or “proper” language.

Some of these have scaled and grown surprisingly well.  Some, like SNOBOL, didn’t last long when better languages or tools came along and made them obsolete.

At the time I posed it, the fifty-year question didn’t even make sense:  The computer industry hadn’t even existed for fifty years!  In 1938, there weren’t enough vacuum tubes in the world to even handle TinyC, much less NodeJS.  But as I write this, 2018 is drawing to a close, and that means enough time has elapsed that we can actually see what fifty years looks like in computing.

Fifty Years Ago

So what was the computing world of 1968?  Let’s take a look at the hardware:

  • The integrated circuit had been invented nine years before.  By 1968, a chip could have around 500 transistors on it.  (The computer I’m typing this on has 4.8 billion in just its CPU.)
  • IBM System/360 mainframes were the powerhouses of the era.  These were 32-bit machines running at 16 MHz (0.016 GHz) with about a megabyte of memory, and a large enough institution could buy one for just a few million dollars (maybe a mere $10 million in today’s dollars).
  • DEC PDP minicomputers were making inroads.  For just $72,000 (a cool half million today), you too could own an 18-bit PDP-7, running at 250 kHz (0.25 MHz), with 4K of memory, and you only needed one room to put it in.
A PDP-7 under restoration.  Courtesy Wikipedia:  By en:User:Toresbe, CC SA 1.0. [link]

That’s the hardware, but what about languages?

  • ALGOL was the popular mainframe structured language, despite Niklaus Wirth storming out of the design committee the year before to begin designing a simpler variant of it that he would name Pascal.
  • COBOL was used by businesses for accounting work.  At nine years old, COBOL had been cemented by IBM as the right way to do business, if you could afford the mainframe to run it.
  • FORTRAN had just reached its 10th birthday, and was growing entrenched in the scientific and engineering communities as the tool of choice, again, if you worked for a university that could afford the mainframe to run it.
  • LISP was popular in the burgeoning new field of computer science at the big universities, but was too slow for anything “practical.”
  • CPL and BCPL had been invented, but CPL was too big and unwieldy to be fully implemented, and even BCPL was too heavy to be practical on the small machines.  Over the next five years, Ken Thompson would strip BCPL down to make B, and Dennis Ritchie would then expand B into C.

Notably, if you wanted to code on the PDP minicomputers, your choices were DEC’s FOCAL language or assembly.  BASIC would eventually be ported to them, but a PDP-8 wasn’t capable enough to handle any of the “real” languages.

So what was on the horizon in 1968?

  • Dijkstra’s famous “Go To Statement Considered Harmful” letter was published in March 1968.  It would eventually lead to modern structured programming, but by the end of 1968 it was still a massive controversy, with most programmers believing “goto” was a critical language construct.
  • Object-oriented programming was introduced in SIMULA-67, but it was only a year old, and it wouldn’t make a really solid dent in the computer industry for another twenty years.
  • Scheme, Prolog, and ML were still a half-decade away from being invented.
  • The first commercial microprocessor, the Intel 4004, wouldn’t be invented for another three years.
  • Personal computers were nearly a full decade away, and not even a gleam in anyone’s eye.

The Point of Fifty Years

What’s the point of all this?  To show just how much things change in fifty years in this business!  In the last fifty years —

  • Chips have roughly ten million times more transistors (4.8 billion versus about 500; see the quick arithmetic right after this list).
  • The CPU of a million-dollar IBM mainframe that took up several large rooms in 1968 is less capable than the CPU in a cheap $20 knockoff Chinese Android watch in 2018.
  • The computer I’m typing on has 16 gigabytes of RAM; that’s more than sixteen thousand times as much as on that 1968 IBM mainframe, which cost a hundred times more than my house in today’s dollars.
  • Networks went from the world’s first packet transmission in August 1968 to refrigerators on the Internet in 2018.
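
That quick arithmetic, using the numbers from earlier in this post (the 2018 figures are just the ones from my own desk, and Python is used here purely for convenience, not because it has anything to do with Smile):

    # Back-of-the-envelope check of the 1968-vs.-2018 ratios above.
    transistors_1968 = 500              # a good integrated circuit in 1968
    transistors_2018 = 4_800_000_000    # the CPU I'm typing this on
    print(transistors_2018 // transistors_1968)   # 9,600,000 -- about ten million times

    ram_1968 = 1 * 1024**2              # roughly a megabyte on a big System/360
    ram_2018 = 16 * 1024**3             # 16 gigabytes in my desktop
    print(ram_2018 // ram_1968)         # 16,384 -- more than sixteen thousand times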

In short, what fifty years means is that things change not just by hundreds of times, or thousands of times, but by millions and billions of times.

We have to assume that in fifty years, the computing hardware of 2068 will be a billion times more capable than the computing hardware of 2018, that there will be a billion times as many computing devices, and that they will do things we can’t conceive of.

That’s what fifty years looks like in computing.  Heck of a business to be in.

The Fifty-Year Survivors

So then what does it take for a programming language to survive fifty years’ worth of change?  Or, perhaps a better question:  What does it take for a programming language to be not only useful but meaningful after fifty years of change?

Let’s look, then, at some of the languages that survived the last fifty years, and see how they did it!  In this, I’m going to focus not on the languages that are “just barely hanging on,” in institutional places that don’t handle change very well, but on languages that still have new, active projects being developed in them, and that are themselves being expanded on and refined.  It’s a short list.

  • FORTRAN is one of the granddaddies, and it just keeps hanging on.  Today’s Fortran is a far cry from the punch-card FORTRAN of the late 1950s, but scientists and engineers are still using it, still building things with it, and still sending rockets into space with it.  It’s grown structured-programming constructs and object orientation, and while it’s still pretty clunky from most computer scientists’ perspective, it’s survived by being useful to a critical group of very demanding people, and by adapting when new hardware or new concepts came along and were unavoidable.
  • LISP is not only still with us, but survives in more forms than ever.  From Clojure to AutoCAD to Emacs to the aliens in the video game Abuse, Lisp keeps finding new ways to be relevant.  (Brendan Eich, the creator of JavaScript, originally wanted Scheme to become the de-facto scripting language of the web, but his bosses at Netscape wanted a language that “looked more like Java.”)  Lisp survived by being adaptable:  Very little is required to be part of its design other than the parentheses and a few basic concepts that come directly from lambda calculus itself.  You can describe the language on a page (McCarthy’s Lisp), make a Lisp in a day (SIOD), and call nearly anything with parentheses-for-syntax a Lisp and get away with it.  (A sketch of just how small that core is follows this list.)
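
Here’s that sketch:  a deliberately tiny Lisp-flavored evaluator.  It’s written in Python rather than Smile, it’s a rough illustration rather than a faithful reproduction of McCarthy’s paper, and it cuts every corner it can, but it shows how few moving parts a parenthesized, symbol-driven core actually needs:

    # A deliberately tiny Lisp-flavored evaluator (a sketch, not Smile, and not a
    # faithful McCarthy reproduction).  S-expressions are plain Python lists, and
    # symbols are plain Python strings.
    def evaluate(x, env):
        if isinstance(x, str):                       # a symbol: look it up
            return env[x]
        if not isinstance(x, list):                  # a literal (number, etc.)
            return x
        op, *args = x
        if op == "quote":                            # (quote e) -> e, unevaluated
            return args[0]
        if op == "atom":                             # (atom e) -> true if e isn't a list
            return not isinstance(evaluate(args[0], env), list)
        if op == "eq":                               # (eq a b) -> equality test
            return evaluate(args[0], env) == evaluate(args[1], env)
        if op == "car":                              # first element of a list
            return evaluate(args[0], env)[0]
        if op == "cdr":                              # rest of a list
            return evaluate(args[0], env)[1:]
        if op == "cons":                             # prepend an element to a list
            return [evaluate(args[0], env)] + evaluate(args[1], env)
        if op == "cond":                             # (cond (test expr) ...): first true branch
            for test, expr in args:
                if evaluate(test, env):
                    return evaluate(expr, env)
            return []
        if op == "lambda":                           # (lambda (params) body) -> a closure
            params, body = args
            return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
        fn = evaluate(op, env)                       # anything else: apply a function
        return fn(*[evaluate(a, env) for a in args])

    # ((lambda (x) (cons x (quote (b c)))) (quote a))  =>  ['a', 'b', 'c']
    print(evaluate([["lambda", ["x"], ["cons", "x", ["quote", ["b", "c"]]]],
                    ["quote", "a"]], {}))

SIOD and real Lisps do enormously more than this, of course, but the shape of the core is the same:  a handful of primitives and an apply loop.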

Aaaaand…  that’s about it.  COBOL still exists in a few places, and BASIC still does too, but neither looks much like its original form, and it’s rare to see new software projects being started in either.

So what lets a language survive?  Even if you look at languages from forty years, or thirty years back, there are only a few traits that seem to be common to all of them:

  • Adaptability.  To survive in a changing world, the language must be able to change too.  You can’t be beholden to your punch cards when keyboards get invented.
  • Non-involvement of the designer.  In the most adaptable languages, you can bend, mutate, and tweak them to handle new circumstances without involving the designer.  Threads were bolted onto C without discussing it with Dennis Ritchie.  Objects were added to Lisp without McCarthy approving the designs.
  • Minimalism.  Fortran mostly does math.  Lisp has six primitives.  C has 17 keywords.  The less you build in, the less people have to work around what you built in.
  • Simple, pure foundations.  Again, Fortran is just math on steroids.  Lisp is a half-step up from the untyped lambda calculus.  C is “high-level assembly language.”
  • Be really good at something.  Do one thing, and do it well.  Even when strong contenders like Rust come along, C will still be preferred by a lot of people for its simplicity and purity.

These are the guidelines I focused on when I designed Smile.  I can’t guarantee Smile is a fifty-year language, and I may not be alive in fifty years to find out (92 is old by anybody’s definition) — but I can try.

Smile’s Position Paper

So now that I have the backstory, I can list my bullets to answer Paul Ford’s philosophical question.  What matters in Smile?  This is my answer:

  • Build in as little as possible.  To survive a long time, I can’t be opinionated about what language features should or shouldn’t exist.  async/await is hot today, but it might be dead tomorrow.  So the core of the language has as little as I can put in it, while still being useful enough to build what you need on top of it.
  • Syntax matters.  Humans speak natural languages, not computer languages, and that’s not going to change.  So the language must adapt to us, not the other way around.  Every construct in programming is a tiny domain-specific language, so Smile should not only focus on having ways to easily create DSLs that precisely match our needs and ways of thinking, but should make sure that all of those DSLs can play in harmony with each other.
  • Functions are the foundation.  The lesson of Lisp is that pure-functional computation is a good way to reason about thinking.  Functions (proper lexical closures) are one of the “deep primitives” of Smile, and I think that’s okay:  Math is built on functions, and math isn’t going away.
  • Symbols and data structures are critical.  Every program that manipulates non-trivial data has to store it in non-trivial ways.  Arrays, structures, trees, graphs — and all of those things are populated with not just strings but symbolic values.  Computation is rarely about pure math; it has almost always been about manipulating relationships between pieces of information.
  • Text matters.  Half of computation’s history has been littered with new tools, or even whole languages, for munging text.  Human beings are speech-oriented, communication-oriented creatures, and that won’t change.  So Smile has not just a String type but a robust set of text operations, Unicode operations, regex operations, and parsing operations, because a lot of what we do consists of manipulating text.  (There’s a small, non-Smile sketch of these last few ideas right after this list.)
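
Here’s that sketch.  None of it is Smile syntax; it’s Python, used purely as a language-agnostic illustration of what closures plus symbols plus ordinary data structures plus a little text machinery buy you:  a tiny, throwaway DSL for classifying lines of text.

    import re

    # Not Smile -- just a generic illustration of the bullets above: a tiny "DSL"
    # for classifying lines of text, built from nothing but closures (functions),
    # symbols (plain strings), and an ordinary data structure (a list of pairs).

    def matches(pattern):
        """Return a closure over a compiled regex: line in, True/False out."""
        rx = re.compile(pattern)
        return lambda line: bool(rx.search(line))

    # The "program" itself is just data: an ordered list of (rule, symbol) pairs.
    rules = [
        (matches(r"ERROR|FATAL"), "error"),
        (matches(r"WARN"),        "warning"),
        (lambda line: True,       "info"),      # default catch-all rule
    ]

    def classify(line):
        """Return the symbol attached to the first rule that matches the line."""
        for rule, tag in rules:
            if rule(line):
                return tag

    print(classify("2068-01-01 FATAL: reactor offline"))   # -> error
    print(classify("2068-01-01 everything is fine"))       # -> info

The point isn’t the example itself; it’s that the rule table reads like a little language of its own, and closures, symbols, and data structures are all it took to build it.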

Almost nothing built in; adaptable syntax; functions, symbols, and data structures at the core of the design; and text-crunching tools.  These are the core philosophical principles of Smile, and if you study them carefully enough, you’ll see that they all have something in common:  human language and structured knowledge.

Not types, not objects, not classes, not the web, not frameworks, not tools, not distributed servers, not a ton of other fads that have come and may yet go:  Smile is rooted in simple concepts that are thousands of years old, rooted in philosophy, mathematics, and thought:  Language and knowledge themselves.

So, then, what do I — and what does Smile — think matters most in computation?  Making working with knowledge easy, simple, and human-oriented.  And I think those are choices that can last fifty years.
