Sunday, January 06, 2019

Swift: Origins and My Personal History with It

Once upon a time there was a graduate student named Chris Lattner, who had a bright idea. That bright idea turned into LLVM (originally an acronym for Low Level Virtual Machine), essentially compiler technology, the popularity of which has been on a tear ever since.

Lattner went to work for Apple, but continued to be heavily involved in the development of LLVM, notably in the extension of Clang (LLVM's C-family front-end) to support C++. (In the LLVM world, a front-end turns what we commonly think of as computer code into LLVM-IR, the LLVM Intermediate Representation, from which, after a bit of polishing while still in LLVM-IR, it can then be further transformed by a back-end into machine code for some specific platform.)

As I understand it, in the wake of that effort, Lattner thought there had to be a better way, and set out to create what has become known as Swift. As you might guess from its name, one of the design goals for Swift was that it should run quickly, especially as compared with scripting languages like JavaScript, which are interpreted or just-in-time compiled as they are executed, rather than being compiled ahead of time.

Another primary design goal was that it should be safe, immune to many of the categories of defects that find their way into computer software. Yet another was that it should be modern, bringing together features from an assortment of other programming languages, features that make code easier to write and comprehend.

Something that may never have been a goal so much as an underlying assumption was that it should leverage LLVM. Of course it should; who would even think to question that? I certainly didn't. Swift's most tangible existence is as a front-end compiler for LLVM, implemented and evolving in lockstep with the evolving language specification. (That front-end compiler determines what will and will not be allowed to progress to the LLVM-IR stage.)

But to get back to the story of its origin, Lattner worked alone on this for a while, then showed it to a few others within Apple, where it at first became something of a skunkworks project, then a more substantial project involving a growing number of people, but still keeping a low profile. In fact, it kept such a low profile that when it was publicly introduced at WWDC 2014, nearly everyone was taken completely off guard.

That public introduction brings to mind another design goal, or maybe design constraint, which was that Swift had to be interoperable with Apple's existing APIs (Application Programming Interfaces), otherwise it would have had a much more difficult time gaining traction. Being interoperable with the APIs meant being interoperable with Objective-C, the language Apple began using when it acquired NeXT in 1997. I can't speak to how Swift might have turned out differently if that were not the case, but I'm relatively confident this requirement served to accelerate its development by dictating certain decisions, obviating the need for extended discussion. (Swift also inherited some of the features of Objective-C, notably including automatic reference counting and the use of both internal and external parameter labels in functions, initializers, and methods, which contribute to its readability.)
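
As a concrete illustration of those parameter labels (a toy example of my own, not anything from Apple's APIs), a Swift function can give each parameter an external label for readability at the call site and a separate internal name for use in the body:

    // 'from' and 'to' are the external (argument) labels seen at the call site;
    // 'source' and 'destination' are the internal names used inside the body.
    func copyItems(from source: [String], to destination: inout [String]) {
        destination.append(contentsOf: source)
    }

    var favorites = ["Reverb"]
    copyItems(from: ["Chorus", "Delay"], to: &favorites)
    print(favorites)    // ["Reverb", "Chorus", "Delay"]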

So it's June 2014, and Swift has just been announced to the world. Despite the vast majority of the existing code base being in Objective-C, C, or C++, both at Apple and at other companies providing software for Apple's platforms, the writing was plainly on the wall that Swift would eventually largely, if not entirely, displace Objective-C, just not right away. Since I didn't personally have a large Objective-C code base, and what I did have I'd basically neglected for over three years, I saw nothing to hold me back from diving into Swift; well, nothing other than having very limited time to give to it.

However, as I got further into it, I discovered some details that muted my enthusiasm. Most important for my purposes was Swift's initial unsuitability for hard real-time use, like synthesizing complex sound on the fly (my primary use case). It still isn't really suitable for such use, but it is getting closer.

I also had quibbles. One was the initial lack of a Set type (a collection of unique elements), even though much of the functionality of sets had already been developed as part of the Dictionary type; and then, when a Set type was introduced, it felt like a stepchild, with an initializer based on Array syntax (arrays being ordered collections which may contain duplicate elements). I remember thinking that if Swift had started out with a Set type with its own syntax, it would have made far more sense for Dictionary syntax to be based on that rather than on arrays, since a Dictionary is essentially a Set with associated values, all of the same type. (There are only so many options for symbolic enclosure on most keyboards — parentheses, curly braces, square brackets, and angle brackets — and these were already fully subscribed — by expressions, blocks, arrays, and type specifications, respectively. Other symbols and combinations of symbols are available but would not be as straightforward to utilize, and, in any case, it's a bit late in the game to be making anything other than additive changes, alternative syntax that does not replace what already exists.)
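
To make the syntax quibble concrete (literals of my own choosing): Array and Dictionary each have their own literal forms, while Set borrows the array literal and simply discards duplicates:

    let anArray = [1, 2, 2, 3]            // Array<Int>: ordered, duplicates allowed
    let aSet: Set = [1, 2, 2, 3]          // Set<Int>: built from an array literal;
                                          // the duplicate 2 is silently dropped
    let aDictionary = ["a": 1, "b": 2]    // Dictionary<String, Int>: its own literal form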

Another quibble revolved around numeric values not being directly usable as Booleans (true/false). In C, in a Boolean context, zero evaluates to false and any other value evaluates to true. This can be very convenient and is one of my favorite features of that language. Yes, in Swift one can always use a construction like numericValue != 0, but when you're accustomed to just placing a numeric value in a Boolean context and having it evaluated as a Boolean, that feels slightly awkward. I get that using numerics as Booleans invites complicated Boolean constructions, which can make code more difficult to read. There have been many times when I've had to break an expression I'd strung out onto a single line into several lines just to understand it and gauge its correctness. Even so, it initially annoyed me that I would have to give this up in Swift. (I've long since gotten over this and now prefer the absence of implicit type conversions. In Swift, with regard to types, what you see is what you get, and that's a good thing!)
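
A trivial illustration of the difference (my own example):

    let count = 3

    // In C one could write: if (count) { ... }
    // Swift insists on an explicit Boolean expression:
    if count != 0 {
        print("non-empty")
    }

    // This does not compile, because Int is not a Bool:
    // if count { ... }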

But there were also aspects of Swift I liked right away! Optionals, for example, just made so much sense I was amazed they hadn't already been incorporated into every language in existence, and enums with the choice of raw or associated values were clearly a huge improvement over how enumerations work in C and Objective-C. Also, once it finally settled down, function/method declaration and call-site syntax hit the sweet spot for me. Likewise access control, with the exception of the descriptive but nevertheless slightly awkward 'fileprivate' keyword; but given how rarely I'm likely to ever need it, I can live with it, and I certainly have no intention of attempting to reopen that can of worms at this late date!
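
A small sketch of the first two of those features, optionals and enums with associated values (a toy example of my own):

    // An optional makes the possibility of 'no value' explicit in the type.
    let maybeNumber: Int? = Int("42")      // Int("forty-two") would yield nil
    if let number = maybeNumber {
        print("Parsed \(number)")
    }

    // An enum whose cases carry associated values of different types.
    enum NetworkResult {
        case success(bytes: Int)
        case failure(message: String)
    }

    let result = NetworkResult.success(bytes: 1_024)
    switch result {
    case .success(let bytes):
        print("Received \(bytes) bytes")
    case .failure(let message):
        print("Failed: \(message)")
    }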

Even though it initially seemed overdone, I like the emphasis on value types (value semantics). I'm comfortable with a little bit of indirection, but begin to get nervous when dealing with pointers to pointers, and get positively fidgety when dealing with pointer arithmetic. (You know you want to keep those stack frames small, and pointers to objects allocated on the heap can be the most direct means to that end, but it can get out of hand.) Happily, Swift not only doesn't take away the reference option, it makes using it easier and safer, while also preventing value types from resulting in overly large stack frames and excessive copying.
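
A minimal sketch of the distinction (my own example): assigning a struct copies the value, while assigning a class instance merely copies the reference. (As I understand it, the standard-library collections stay cheap to copy despite being value types by using copy-on-write internally, which is how the excessive-copying problem is kept at bay.)

    struct PointValue { var x = 0 }
    final class PointRef { var x = 0 }

    var v1 = PointValue()
    var v2 = v1          // an independent copy (value semantics)
    v2.x = 10
    print(v1.x, v2.x)    // 0 10

    let r1 = PointRef()
    let r2 = r1          // both names refer to the same instance (reference semantics)
    r2.x = 10
    print(r1.x, r2.x)    // 10 10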

I also like the notions of protocols and generics, but in a fuzzier way, since I really don't completely comprehend either. There are also certain fundamental protocols, like Sequence, whose existence I am vaguely aware of, but not much more than that. I suppose these are the sorts of things I'll be delving into here going forward.
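
For what it's worth, here is roughly the shape of the thing (a toy example of my own, not anything from the standard library): a protocol describes a capability, and a generic function can then accept any type that provides it.

    protocol Resettable {
        mutating func reset()
    }

    struct Counter: Resettable {
        var value = 0
        mutating func reset() { value = 0 }
    }

    // Generic over any element type that conforms to Resettable.
    func resetAll<T: Resettable>(_ items: inout [T]) {
        for index in items.indices {
            items[index].reset()
        }
    }

    var counters = [Counter(value: 3), Counter(value: 7)]
    resetAll(&counters)
    print(counters.map { $0.value })    // [0, 0]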

But to climb back up to my preferred edge-of-space point of view, I've had this sneaking suspicion all along that Swift is composed from primitives of some sort, although you might have to be a compiler adept to really understand what they are and how they fit together. To use the example of sets, dictionaries, and arrays from above, the essential characteristic of a set is that its elements are unique; no two are identical. The Set type, as implemented, is unordered, but you might also have OrderedSet, and it is in fact quite possible to create such a type; moreover, Apple's Objective-C APIs already include both NSOrderedSet and NSMutableOrderedSet. Likewise you might want an unordered collection, the elements of which are not necessarily unique. Uniqueness and ordering are independent attributes of collections.
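
To make the uniqueness/ordering distinction concrete (my own example; a Set's iteration order is not guaranteed):

    import Foundation    // only needed for NSOrderedSet below

    let samples = [3, 1, 3, 2, 1]

    let unique = Set(samples)            // unordered, duplicates removed: 1, 2, 3 in some order
    let sortedCopy = samples.sorted()    // ordered, duplicates kept: [1, 1, 2, 3, 3]

    // Foundation's NSOrderedSet covers the remaining combination:
    // unique elements that also remember insertion order (3, 1, 2 here).
    let orderedUnique = NSOrderedSet(array: samples)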

Besides Sequence, the protocols I'm vaguely aware of include such things as Collection, MutableCollection, Numeric, and so forth. While these are straining in this direction, they really aren't the primitives that exist (only?) in my imagination, which at this point I would expect to be written in C++ rather than Swift, or only expressible in LLVM-IR.
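
To give one small example of how those standard protocols layer (a sketch of my own, not drawn from the standard library's actual internals): a single generic function written against Sequence and Numeric works for arrays, sets, ranges, and any numeric element type.

    // Sums the elements of any sequence whose element type is numeric.
    func total<S: Sequence>(_ values: S) -> S.Element where S.Element: Numeric {
        return values.reduce(0, +)
    }

    print(total([1, 2, 3]))          // 6, from an Array<Int>
    print(total(Set([1.5, 2.5])))    // 4.0, from a Set<Double>
    print(total(1...10))             // 55, since ClosedRange<Int> is a Sequence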

I'm more curious about this than is good for me, considering how unprepared I am to understand things at this level, but there's no point in fighting it. Show me bread and I see the ingredients and the processes applied to them to make the finished product. Show me a car and I see parts and the assembly process. It's in my nature, and so it's also likely to show up here, to the extent I make any headway in wrapping my head around such esoterica.

(17Feb2019) I just realized that there's a general principle to be extracted from this, which is that there's no point in fighting a tendency to dream about what might be. Instead, it's better to choose dreams that are both achievable and worth the effort, and build bridges to them from current reality.

Update: As if to add credibility to my hunch about Swift being built up from abstract and/or compiler-level primitives, on January 10th Jeremy Howard published High Performance Numeric Programming with Swift: Explorations and Reflections, in which he states, "Chris [Lattner] described the language to me as 'syntax sugar for LLVM', since it maps so closely to many of the ideas in that compiler framework."
