Not quite sure how this conversation came about, but I got into an online chat with a fellow developer over the underpinnings of the Swift programming language. I always thought that it took most of its cues from Scala and C#, with perhaps a passing nod towards JetBrains' fledgling Kotlin language.

'Nope,' the other guy said. 'This Swift language has Python written right through it.'

Have to say, I wasn't really convinced. I mean, the most obvious difference is that Swift doesn't rely on indentation to delimit its blocks.

But my learned colleague assured me that was just superficial: 'That's just compiler stuff; you have to look much deeper.'

After a few minutes though, I could see what he was driving at. When you look at the design of a programming language, you need to go beyond the statements and the symbols and look deeply at what the language designers were trying to achieve. In this regard, the designers of Python and Swift shared the same goal: simplicity. One of the credos I keep running across in the Python forums is that wherever possible there should be just one way to do one thing. This idea keeps the language clean and simple. Now, Swift may look different, but the recent decisions they've taken for version 3 of the language make me think they want to bring it closer to the ideals Python was created under. For example, rather than adding features to the language, it looks like the Cupertino mothership is looking to take some out.

Exhibit A – The Deprecated Decrement

Most modern languages have shortcut operators for incrementing/decrementing values (usually something like ++ and --). Java has them. C# has them, and at the moment, Swift has them. But I'll tell you what language doesn't support increment/decrement operators: Python. They're a handy shortcut to be sure, but that's all they are: a shortcut. The problem from Apple's point of view is that the operators can lead to this sort of nonsense:

while (++x < y++)

Yeah, it's valid code, but you can see how it'll cause something of a headache for the next programmer who has to come in and fix it. Besides which, there's a way to achieve the same results without making the code a nightmare to debug: just use the regular operators to add 1 to the variables.

x = x + 1
y = y + 1

will do the same job, and will be much easier to follow. It also fits with the notion of immutability. For a few years now, software development has been leaning towards functional programming and the idea that any operation should, as far as possible, return an immutable result. The problem with the increment/decrement operators is that they hide a mutation inside an expression.
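Python makes the point concrete: its integers are immutable, so even x = x + 1 doesn't change a value in place, it creates a whole new one. A quick illustrative sketch (the variable names here are just for demonstration):

```python
# Python integers are immutable: "incrementing" rebinds the name
# to a brand-new object rather than mutating the old one.
x = 1000
before = id(x)   # identity of the original int object
x = x + 1
after = id(x)    # identity of the new int object

print(x)                 # 1001
print(before != after)   # True: x now names a different object
```

Both objects are alive while the assignment runs, so the two identities are guaranteed to differ.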

Exhibit B – The Stolen Curry

The newer pseudo-functional languages, such as Scala and Kotlin, include shortcuts for currying functions. Swift used a similar notation: a second parameter list in parentheses straight after the first. The old code would look like this:

func curried(x: Int)(y: String) -> Float {
  return Float(x) + Float(y)!
}

If you know how currying works under the hood, the code probably isn't difficult to work out: the function curried(x) returns another function that captures x, takes another parameter y and returns a Float value. Okay, that's great if you know currying, but what if you're new to the concept? Remember that Swift is designed to be a teaching language as well as a systems language, an application language and a server-side language. The currying notation is going, and will be replaced by an explicit declaration of what is actually happening.

func curried(x: Int) -> (String) -> Float {
  return {(y: String) -> Float in
    return Float(x) + Float(y)!
  }
}

As you can see, the function is explicitly declared as taking an integer (x) as a parameter and then returning a function that takes a string as a parameter and returns a Float. And in the code we can see explicitly that we are in fact returning a function. Yes, a little more verbose, but a lot easier for the layperson to see what's going on. Of course, the function is called in exactly the same way:

let floatAdd = curried(x: 10)("5")

See? The function call curried(x: 10) returns a function that is then called with "5" as its parameter.

So what does this have to do with Python?

Python is a language that favours simplicity and ease of understanding over brevity. Like Swift 3, it doesn't have increment/decrement operators, and there's no shortcut for declaring curried functions:

def curried(x):

    def _addY(y):
        return x + y

    return _addY
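Calling the Python version mirrors the Swift call. This snippet just exercises it (the definition is repeated so the example runs on its own, and the add_ten name is purely illustrative):

```python
def curried(x):
    def _addY(y):
        return x + y
    return _addY

add_ten = curried(10)   # the inner function, with x fixed at 10
print(add_ten(5))       # 15
print(curried(10)(5))   # 15 - called in one go, like the Swift version
```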

So going forward, Swift 3 means you'll have to write a few extra lines of code, but six months down the line, when you read it again, chances are you'll understand what it's doing.

