#DOT with higher-kinded types
A version of DOT with higher-kinded types that's based on the Wadlerfest formulation.
##Syntax

```
Var         x, y, z

Value       v, w ::= lambda(x:T)t     lambda
                   | new(x:T)d        object

Term        s, t ::= x                variable
                   | v                value
                   | x y              term application
                   | x.a              term member selection
                   | let x = s in t   let

Definition  d    ::= { a = t }        term member
                   | { A : K }        type member
                   | d & d            aggregate definition

RefType     R    ::= x.A              type member selection
                   | R x              type application

Type     S, T, U ::= R                reference type
                   | Top              top
                   | Bot              bottom
                   | S & T            intersection
                   | all(x:S)T        dependent function type
                   | rec(x:T)         recursive record type
                   | { a : T }        term member declaration
                   | { A : K }        type member declaration

Kind        K    ::= all(x:S)K        dependent function kind
                   | S..T             type interval kind

Context     Γ    ::= ∅                empty
                   | Γ, x: T          extension
```
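Since DOT is the theoretical core of Scala, the syntax above has loose Scala analogues. The sketch below (all names invented for illustration) shows how interval kinds and dependent function kinds correspond to bounded and parameterized abstract type members. Note that the analogy is imperfect: the calculus applies type operators to term variables, DOT-style, whereas Scala applies them to types directly.

```scala
// Rough Scala analogues of the syntax above (illustration only):
//   { A : S..T }       ~  bounded abstract type member   `type A >: S <: T`
//   { A : all(x:S)K }  ~  parameterized type member      `type A[X <: S]`
//   R y                ~  type application               `A[Y]`

trait Graph {
  type Node                        // { Node : Bot..Top }
  type Edge[N <: Node]             // roughly { Edge : all(n: Node)(Bot..Top) }
  def mkEdge(a: Node, b: Node): Edge[Node]
}

object PairGraph extends Graph {
  type Node = Int
  type Edge[N <: Node] = (N, N)    // refines the member's kind to a singleton
  def mkEdge(a: Int, b: Int): (Int, Int) = (a, b)
}

val e: PairGraph.Edge[Int] = PairGraph.mkEdge(1, 2)
```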
###Term typing Γ ⊢ t: T
These are exactly as in original DOT.
```
x: T in Γ
-----------
Γ ⊢ x : T

Γ, x: T ⊢ t: U
-------------------------------
Γ ⊢ lambda(x:T)t : all(x: T)U

Γ, x: T ⊢ d: T
-------------------
Γ ⊢ new(x:T)d : T

Γ ⊢ x: all(z:T)U    Γ ⊢ y: T
------------------------------
Γ ⊢ x y : [z:=y]U

Γ ⊢ t: T    Γ, x: T ⊢ u: U    x ∉ fv(U)
-----------------------------------------
Γ ⊢ let x = t in u : U

Γ ⊢ x: {a: T}
---------------
Γ ⊢ x.a : T

Γ ⊢ x: T
------------------
Γ ⊢ x: rec(x: T)

Γ ⊢ x: rec(x: T)
------------------
Γ ⊢ x: T

Γ ⊢ x: T    Γ ⊢ x: U
-----------------------
Γ ⊢ x: T & U

Γ ⊢ x: T    Γ ⊢ T <: U
------------------------
Γ ⊢ x: U
```
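The dependent application rule, which substitutes the argument for the bound variable in the result type, has a direct counterpart in Scala's dependent method types. A small sketch with invented names:

```scala
// The application rule gives: from open : all(b: Box)(b.A) and
// ib : Box { type A = Int }, the application has type [b := ib](b.A) = ib.A = Int.
trait Box { type A; def value: A }

def open(b: Box): b.A = b.value    // a dependent function all(b: Box)(b.A)

val ib = new Box { type A = Int; def value: Int = 42 }
val n: Int = open(ib)              // [b := ib](b.A)  =  Int
```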
###Definition typing Γ ⊢ d: T
The second rule is adapted, the other two are as in original DOT.
```
Γ ⊢ t : T
----------------------
Γ ⊢ {a = t} : {a: T}

K singleton
-------------------
Γ ⊢ {A:K} : {A:K}

Γ ⊢ d1: T1    Γ ⊢ d2: T2
--------------------------
Γ ⊢ d1 & d2 : T1 & T2
```
###Singleton kinds: K singleton

```
T..T singleton

K singleton
---------------------
all(x:S)K singleton
```
###Kinding: Γ ⊢ R: K

```
Γ ⊢ x: {A: K}
---------------
Γ ⊢ x.A: K

Γ ⊢ R: all(x:S)K    Γ ⊢ y: S
-------------------------------
Γ ⊢ R y: [x:=y]K
```
###Sub-kinding: Γ ⊢ K1 <: K2
(These rules are new.)

```
Γ ⊢ S2 <: S1    Γ ⊢ T1 <: T2
------------------------------
Γ ⊢ S1..T1 <: S2..T2

Γ ⊢ S2 <: S1    Γ, x: S2 ⊢ K1 <: K2
------------------------------------
Γ ⊢ all(x:S1)K1 <: all(x:S2)K2
```
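In Scala terms, the interval rule says that narrowing the bounds of an abstract type member yields a refinement. A sketch with invented names:

```scala
// Bot..Dog <: Bot..Animal, so a trait that narrows the member's bounds
// refines the original trait.
trait Animal { def name: String }
class Dog extends Animal { def name = "dog" }

trait Pen    { type Occupant <: Animal; def occupant: Occupant } // { Occupant : Bot..Animal }
trait DogPen extends Pen { type Occupant <: Dog }                // { Occupant : Bot..Dog }

class Kennel(d: Dog) extends DogPen {
  type Occupant = Dog
  def occupant: Dog = d
}

val pen: Pen = new Kennel(new Dog)   // uses DogPen <: Pen, i.e. the interval rule
val who: String = pen.occupant.name
```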
###Subtyping Γ ⊢ T1 <: T2
The rules are as in original DOT, except for the last three rules, which are straightforward adaptations.
```
Γ ⊢ T <: T

Γ ⊢ S <: T    Γ ⊢ T <: U
--------------------------
Γ ⊢ S <: U

Γ ⊢ T <: Top

Γ ⊢ Bot <: T

Γ ⊢ T & U <: T

Γ ⊢ T & U <: U

Γ ⊢ S <: T    Γ ⊢ S <: U
--------------------------
Γ ⊢ S <: T & U

Γ ⊢ S2 <: S1    Γ, x: S2 ⊢ T1 <: T2
-------------------------------------
Γ ⊢ all(x:S1)T1 <: all(x:S2)T2

Γ ⊢ T1 <: T2
---------------------
Γ ⊢ {a:T1} <: {a:T2}

Γ ⊢ K1 <: K2
----------------------
Γ ⊢ {A:K1} <: {A:K2}

Γ ⊢ R: S..T
-------------
Γ ⊢ S <: R

Γ ⊢ R: S..T
-------------
Γ ⊢ R <: T
```
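The last two rules are what make path-dependent types usable: a selection R sits between its declared bounds. A Scala sketch with invented names:

```scala
// From c.Content : Nothing..AnyRef we get c.Content <: AnyRef (rule R <: T)
// and Nothing <: c.Content (rule S <: R).
trait Cell { type Content >: Nothing <: AnyRef; def get: Content }

def asRef(c: Cell): AnyRef = c.get   // relies on c.Content <: AnyRef

class StrCell extends Cell { type Content = String; def get = "hi" }

val r: AnyRef = asRef(new StrCell)
```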
I think you are right that we don't need the subtyping rule for fully applied type operators. I'm not sure there's a workaround for partial applications, though. This case might crop up in the meta theory, but I'm not sure.
##Meta theory
As for the meta theory being a "straightforward adaptation of the original proof", I'm afraid that's not the case. Here's why.
Recall that the original proof relies on a "tightened" subtyping judgment

```
Γ ⊢# S <: T
```

with more stringent subtyping rules for type selections:

```
Γ ⊢! x: {A: T..T}
-------------------
Γ ⊢# x.A <: T

Γ ⊢! x: {A: T..T}
-------------------
Γ ⊢# T <: x.A
```

where ⊢! denotes strict typing, which only allows very limited use of subsumption. The original rules are then shown to be admissible in typing contexts Γ that correspond to stores. The admissibility proof makes essential use of the "has-member" judgment, which is used to "unzip"/"invert" typings of the form Γ ⊢# x: {A: S..T} into Γ ⊢! x: {A: U..U} and Γ ⊢# S <: U, U <: T. The requirement that Γ correspond to a store is crucial, because otherwise we might be in a situation where Γ, x: {A: S..T} ⊢# x: {A: S..T} and the interval S..T cannot be "unzipped" further.

In the presence of HK types, we might want to introduce a similar tight subtyping judgment where only such strict selection rules are allowed, and a similar "has-member" (or rather "is-interval") judgment for types. However, things quickly get complicated. Firstly, the premise of the above rules might itself be derived through kinding of a type application, so one needs mutually inductively defined "has-member" judgments for both terms and types. Secondly, the judgment for types also needs to handle application, i.e. we need one or more rules for inverting the kinding of applied types R y.
Whatever the precise formulation of these rules, they will have to deal with the fact that S might have been kinded with a dependent function kind rather than with an interval: such an S is not an interval, and its kind may depend on the bound variable (here y: T₂).

This situation is reminiscent of the trouble caused by the presence of a subtyping rule for recursive types in DOT. However, unlike that rule, the (∀-<::-∀) subkinding rule is crucial in this calculus. Without it, we can't type even basic refinements of higher-kinded type members.
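For instance, the following Scala-flavoured sketch (invented names) only kind-checks because the singleton kind of the concrete operator is subsumed into the interval kind of the abstract member, which needs exactly a (∀-<::-∀)-style rule:

```scala
// Seqs declares (loosely) { Seq : all(x)(Bot..Iterable[x]) }; Lists refines it
// with a member of singleton kind all(x)(List[x]..List[x]). Accepting the
// refinement requires subkinding between the two all-kinds.
trait Seqs  { type Seq[X] <: Iterable[X] }
object Lists extends Seqs { type Seq[X] = List[X] }

val s: Lists.Seq[Int] = List(1, 2, 3)
```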
I'm currently looking into possible solutions to this issue. Ideas are welcome.
##Other issues
There are a number of issues with the above proposal which I tried to address in the updated/extended version (see https://gist.github.com/sstucki/3fa46d2c4ce6f54dc61c3d33fc898098). Here's a short discussion of these issues:
###ANF in types
Just as Wadlerfest DOT, this proposal uses ANF throughout. I don't think this is helpful. At the very least, it requires `let` binders in types, since type-level computations need to be "boxed" and bound to term variables before they can be used in type applications. E.g. HK type definitions that compose type operators can't be encoded using ANF without type-level `let` binders. The intuitive encodings don't work because type applications of the form R1 (R2 x) are not permitted syntactically. We also can't bind the inner applications to helper variables outside of the definitions, since they depend on the X parameter bound in the abstraction.

I think introducing `let` binders (along with corresponding kinding and subtyping rules) would unnecessarily complicate the calculus. Let's just not use ANF.

###Type-level lambdas vs. singleton kinds
This is not so much an issue as a clarification: in my extended proposal I suggest using type-level lambdas to inhabit arrow/forall-kinds. This is consistent with the way term-level lambdas inhabit arrow types and with the literature on HK types. You seem to prefer the use of singleton arrow-kinds and subkinding instead, which is probably closer to what happens in the compiler? Of course, lambdas are still there, they are just disguised as singleton kinds now. So I'm perfectly happy with either proposal, just as long as it doesn't complicate the meta theory.

To simplify the exposition, I'd treat singleton kinds syntactically (rather than through a separate judgment).
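Concretely, the singleton-kind reading treats a type alias as a member whose kind is a point interval. A Scala sketch of the two views (invented names):

```scala
// Lambda view:    Rep = lambda(x)(List[x]),  a member of kind all(x)(Bot..Top)
// Singleton view: { Rep : all(x)(List[x]..List[x]) }
// In Scala, both collapse to an ordinary type alias:
trait Repr { type Rep[X] }
object ListRepr extends Repr { type Rep[X] = List[X] }

val r: ListRepr.Rep[Int] = List(1, 2, 3)
```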
###Kinding of non-RefTypes
Probably the biggest source of extra rules in my extended proposal comes from the fact that it gives kinds to all types.
You make a syntactic distinction between Types and RefTypes, and only RefTypes are kinded. I find that rather odd, could you maybe elaborate a bit on that? Why would you have types that are unkinded?
In any case, it leads to rather absurd results. Since RefTypes are included in Types, R..R is a well-formed singleton kind, which means any RefType R, irrespective of its actual kind, can be treated as if it had kind R..R and thus (by subsumption) ⊥..⊤. This means we can encode and type-check rather strange definitions. I don't know whether this constitutes a soundness issue or not, but it certainly seems strange.

The best solution I could come up with is to assign to every "proper" type T its singleton kind T..T, at the expense of eight additional kinding rules (one for each non-RefType plus one for lambdas). Obviously, I'm open to suggestions on how this could be simplified.

###Minor issues
- The type in the conclusion of the typing rule for recursive records is wrong (x might be free in T). It should be

  ```
  Γ, x: T ⊢ d: T
  --------------------------
  Γ ⊢ new(x:T)d : rec(x: T)
  ```

- The typing rule for `let` binders is missing.

- The subsumption rule for kinding is also missing:

  ```
  Γ ⊢ R: K1    Γ ⊢ K1 <: K2
  ---------------------------
  Γ ⊢ R: K2
  ```