Thanks for all your contributions @aidando73 -- we want to keep supporting `completions`, at least for now, because we believe having raw access to a model is just as important. Unlike other providers' models, Llamas are open source, and people play with them and iterate in a variety of ways. The manipulations we do internally in a `chat_completion` endpoint may not always be what users intend; sometimes they just want a carefully formatted prompt to hit the model directly. On that theme, I think it would be great if Groq could build completions endpoints on their end too. But until that time, `NotImplementedError()` will have to do.
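For context, a minimal sketch of what "NotImplementedError() will have to do" might look like in a provider adapter. The class and method names below are illustrative assumptions, not the actual Llama Stack or Groq interface:

```python
# Hypothetical provider adapter. Chat completions are supported, but raw
# completions are stubbed out until the upstream provider exposes them.
class GroqInferenceAdapter:
    def chat_completion(self, messages: list[dict], model: str) -> dict:
        # Forward the structured chat request to the provider (stubbed here).
        return {"model": model, "content": "..."}

    def completion(self, prompt: str, model: str) -> str:
        # Fail loudly rather than silently reformatting the raw prompt
        # into a chat request the user did not ask for.
        raise NotImplementedError(
            "This provider does not expose a raw completions endpoint yet"
        )
```

The design choice here is to surface the gap explicitly instead of emulating `completion` on top of `chat_completion`, which would reintroduce exactly the prompt manipulation the user was trying to avoid.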