@renaud
Last active May 13, 2023 18:17
run LLM locally
# install guidance and the extra packages needed to run MPT-7B locally
!pip install guidance flash_attn einops transformers
# https://medium.com/@marcotcr/exploring-chatgpt-vs-open-source-models-on-slightly-harder-tasks-aa0395c31610
import guidance

# A guidance program: a chat-style prompt template with a {{gen}} slot
# that the model fills in at run time.
find_roots = guidance('''
{{#system~}}
{{llm.default_system_prompt}}
{{~/system}}
{{#user~}}
Please find the roots of the following equation: {{equation}}
Think step by step, find the roots, and then say:
ROOTS = [root1, root2...]
For example, if the roots are 1.3 and 2.2, say ROOTS = [1.3, 2.2].
Make sure to use real numbers, not fractions.
{{~/user}}
{{#assistant~}}
{{gen 'answer'}}
{{~/assistant~}}''')
# load MPT-7B-Chat as a local backend (device=1 selects the second GPU)
mpt = guidance.llms.transformers.MPTChat('mosaicml/mpt-7b-chat', device=1)
# alternative backends:
# vicuna = guidance.llms.transformers.Vicuna('yourpath/vicuna-13b', device_map='auto')
# chatgpt = guidance.llms.OpenAI("gpt-3.5-turbo")
equation = 'x^2 + 3.0x = 0'
roots = [0, -3]  # expected roots, for reference
# run the program against the local MPT model
answer_mpt = find_roots(llm=mpt, equation=equation)
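
# A minimal sketch of inspecting the result, assuming the guidance 0.0.x API
# used above, where the executed program exposes generated variables by name.
# The regex parsing of the "ROOTS = [...]" line is an illustrative assumption,
# not part of the original gist.
import re

answer_text = answer_mpt['answer']  # text generated in the {{gen 'answer'}} slot
print(answer_text)

match = re.search(r'ROOTS\s*=\s*\[(.*?)\]', answer_text)
if match:
    try:
        found_roots = [float(x) for x in match.group(1).split(',')]
        print('parsed roots:', found_roots, 'expected:', roots)
    except ValueError:
        print('could not parse numbers from:', match.group(1))
else:
    print('no ROOTS line found in the answer')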