GPT-2 345M Config
{
  "initializer_range": 0.02,
  "layer_norm_epsilon": 1e-05,
  "n_ctx": 1024,
  "n_embd": 1024,
  "n_head": 16,
  "n_layer": 24,
  "n_positions": 1024,
  "vocab_size": 50257
}
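
For context, the sketch below shows one way these hyperparameters map onto an actual model object. It assumes the Hugging Face transformers library (GPT2Config and GPT2LMHeadModel), which the gist itself does not reference; the resulting model is randomly initialized, not the pretrained 345M checkpoint.

# Minimal sketch, assuming the Hugging Face `transformers` package is installed.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    vocab_size=50257,        # BPE vocabulary size
    n_positions=1024,        # maximum sequence length (context window)
    n_embd=1024,             # hidden / embedding dimension
    n_layer=24,              # number of transformer blocks
    n_head=16,               # attention heads per block
    layer_norm_epsilon=1e-5,
    initializer_range=0.02,  # stddev of the weight initializer
)

model = GPT2LMHeadModel(config)  # weights are randomly initialized, not pretrained
print(f"{model.num_parameters():,} parameters")  # roughly 350M, matching the "345M" name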