@simonwhitaker
Last active August 29, 2015 14:02

I was intrigued by this example from the Extensions chapter of The Swift Programming Language:

extension Double {
    var km: Double { return self * 1_000.0 }
    var m: Double { return self }
    var cm: Double { return self / 100.0 }
    var mm: Double { return self / 1_000.0 }
    var ft: Double { return self / 3.28084 }
}

let threeFeet = 3.ft
println("Three feet is \(threeFeet) meters")
// prints "Three feet is 0.914399970739201 meters"

Surely 3 would be an Int, so this would fail to compile because ft isn't defined on Int? But it compiles fine. Numeric literals in Swift don't have a fixed type of their own: the compiler infers one from context, and only defaults to Int when nothing else constrains it. Here, the type of the 3 literal is inferred from the instance property called on it. The compiler is saying, "he wants to call .ft on that number; the only numeric type with a .ft property is Double, so let's make it a Double."
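
To see the inference in action, here's a minimal sketch (assuming the Double extension above is in scope; the names inline, n and explicit are mine for illustration). The literal only defaults to Int when nothing else pins its type down:

let inline = 3.ft            // literal inferred as Double; inline == 0.914...
let n = 3                    // no other constraint, so n defaults to Int
// n.ft                      // error: 'Int' does not have a member named 'ft'
                             // (at least before any Int.ft exists; see below)
let explicit: Double = 3     // an explicit annotation also makes the literal a Double
let sameAsInline = explicit.ft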

You could imagine this getting a bit dangerous; if I subsequently declare a .ft property on Int, the compiler will use that instead:

extension Double {
    var km: Double { return self * 1_000.0 }
    var m: Double { return self }
    var cm: Double { return self / 100.0 }
    var mm: Double { return self / 1_000.0 }
    var ft: Double { return self / 3.28084 }
}

extension Int {
    var ft: Int { return self * 2 } // because why not
}

let threeFeet = 3.ft
println("Three feet is \(threeFeet) meters")
// prints "Three feet is 6 meters"
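
If you want to sidestep that kind of shadowing, one option (my sketch, not part of the original example) is to make the literal's type explicit, so the compiler never has to pick a type for you:

let threeFeet = (3 as Double).ft   // always resolves to Double's .ft
let alsoThreeFeet = 3.0.ft         // a Double literal gets Double's .ft too
println("Three feet is \(threeFeet) meters")
// prints "Three feet is 0.914399970739201 meters"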