Package-level generics may look less confusing.
package abc(Key, Value)

contract CompileCheck(a1 Key, a2 Value) {
    //...
}

type Pair struct {
    K Key
    V Value
}

func NewPair(k Key, v Value) Pair {
    return Pair{K: k, V: v}
}

package xyz

import "abc(int32,float32)" myabc

func newPair() myabc.Pair {
    return myabc.NewPair(1, 1.0)
}
Introduce a two-step compilation. First, create abc[Key,Value].lib from the source. Then create abc(int32,float32).a from abc[Key,Value].lib. Both abc[Key,Value].lib and abc(int32,float32).a can be cached to reduce subsequent compilation time.
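For example, in the proposed syntax (the importing packages and the second instantiation below are only made up to illustrate the caching), every instantiation would reuse the same abc[Key,Value].lib and only add its own archive:

package foo

import "abc(int32,float32)" myabc   // reuses abc[Key,Value].lib, produces abc(int32,float32).a

package bar

import "abc(string,float64)" strabc // reuses abc[Key,Value].lib, produces abc(string,float64).a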
Hi, I've got some comments about your idea.
Why not do it directly?
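One possible shape (just a sketch; attaching the contract body to the package clause is only my illustration of the idea, not syntax from the proposal above):

package abc(Key, Value) {
    // the contract body goes right here, so Key and Value are listed only once
    //...
}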
So it is self-documenting and you don't need to declare the types twice.
It would work really well for the container packages in std. The elephant in the room is the rest of the standard library.
sync.Pool and sync.Map would have to be placed in a new package.
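For example, a generic sync.Map could become its own package (just a sketch; the syncmap name and the simplified mutex-based implementation are mine, not part of the proposal):

package syncmap(Key, Value)

import "sync"

type Map struct {
    mu sync.Mutex
    m  map[Key]Value
}

func (m *Map) Store(k Key, v Value) {
    m.mu.Lock()
    defer m.mu.Unlock()
    if m.m == nil {
        m.m = make(map[Key]Value)
    }
    m.m[k] = v
}

func (m *Map) Load(k Key) (Value, bool) {
    m.mu.Lock()
    defer m.mu.Unlock()
    v, ok := m.m[k]
    return v, ok
}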
I wonder how filter/map/reduce would be written. You don't want to import the package N times.
Perhaps it could be written like this:
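Say there is a hypothetical fmp package providing a generic Filter (the fmp(int32).Filter spelling and the signature below are only a guess at how it might look):

package xyz

import "fmp" // imported once, without fixing the types in the import path

func positive(v []int32) []int32 {
    f := func(x int32) bool { return x > 0 }
    // the element type is specified at the call site instead
    return fmp(int32).Filter(v, f)
}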
And if you can infer the types, you could skip the type specification and just do:
return fmp.Filter(v, f)
(But I expect this is the hard part.)

I'm glad there is some discussion about package-level generics, because it feels quite a bit simpler:
For the package author, put everything at the beginning of your file and the rest of the code is the same as today. You don't need to add the type parameter to each of your methods.
For the package user, either the package is a single file like container/list, and it's really easy to understand just by reading the first line, or, for a bigger package, the declaration could be placed in doc.go.
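As an illustration (the Elem parameter name is made up), a generic container/list would start with a first line that already tells you everything:

package list(Elem)

// Element then carries a concrete value instead of interface{};
// the rest of the package reads the same as today.
type Element struct {
    next, prev *Element
    Value      Elem
}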
P.S. I would not put the types in the import path, and I would place the name at the beginning, like this:
import myabc "abc"(int32,float32)