@mberman84
Created August 14, 2023 14:07
MetaGPT
# Must have conda installed
# It costs approximately $0.2 (in GPT-4 API fees) to generate one example with analysis and design, and around $2.0 for a full project.
conda create -n metagpt python=3.11.4
conda activate metagpt
npm --version # to check you have npm installed
# optional: install node if you don't have it
npm install -g @mermaid-js/mermaid-cli
git clone https://github.com/geekan/metagpt
cd metagpt
python -m pip install -r requirements.txt
python setup.py install
# create openai API key
# insert API key into config/config.yaml
# make sure to set the model correctly depending on which model you have access to
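# rough sketch of the relevant config/config.yaml lines (key names may differ by MetaGPT version):
#   OPENAI_API_KEY: "sk-..."
#   OPENAI_API_MODEL: "gpt-4"  # or "gpt-3.5-turbo" if that is all you have access to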
python startup.py "Write a cli snake game based on pygame" # optional --code_review True (better code, costs more)
# if you get ModuleNotFoundError: No module named 'bs4', run "python -m pip install bs4"
python startup.py "Write a cli snake game based on pygame" # do this again if you got the ModuleNotFoundError
@bewithdhanu

How do I use this with GPT-3.5?

@Mansoury22

How do I use this with GPT-3.5?

OPENAI_API_MODEL: "gpt-3.5-turbo"

@bewithdhanu

With gpt-3.5-turbo, MAX_TOKENS is set to 1500 in config.yaml, but I am getting this error:

This model's maximum context length is 4097 tokens. However, you requested 4238 tokens (2738 in the messages, 1500 in the completion). Please reduce the length of the messages or completion.
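
A hedged workaround, assuming MAX_TOKENS in config/config.yaml is what ends up as the completion budget: lower it so messages plus completion stay under the 4097-token limit, e.g.

MAX_TOKENS: 1300  # 2738 (messages) + 1300 (completion) = 4038 < 4097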

@rsfutch77

It failed to generate any code for me and broke each time it reached the code generation step; the requirements worked fine. After I commented out all the mermaid_to_file calls in design_api.py, it finally generated code.
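
If the break happens at the mermaid step, it may just be that the mermaid CLI is not on PATH; a quick check before editing design_api.py (assuming the npm install from the gist above):

which mmdc || npm install -g @mermaid-js/mermaid-cli
mmdc --version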

@kbolt commented Sep 25, 2023

For line #7, the command to install Node.js within conda from the command line is:
conda install -c conda-forge nodejs
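
A quick follow-up check that both tools resolve inside the activated conda env (generic commands, nothing MetaGPT-specific):

node --version
npm --version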

@brown7477

Will this generate ReactJS web apps?
I am only seeing Python results even after requesting ReactJS in the prompt.

@kbolt commented Oct 10, 2023

Will this generate ReactJS web apps? I am only seeing Python results even after requesting ReactJS in the prompt.

The prompt for the programmer persona is set to output only Python. You will have to tweak it on your end to use anything different.

@brown7477

Will this generate ReactJS web apps? I am only seeing Python results even after requesting ReactJS in the prompt.

The prompt for the programmer persona is set to output only Python. You will have to tweak it on your end to use anything different.

Any idea how to do this?
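
A hedged starting point, not verified against the current repo layout: the language is baked into the code-writing prompt template under metagpt/actions/, so grepping for it and rewording the template there is the usual tweak, e.g.

grep -rni "python" metagpt/actions/ | grep -i prompt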

@iconixgroups

TypeError: '>' not supported between instances of 'float' and 'str'
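
One hedged guess, without seeing the full traceback: a numeric setting in config/config.yaml (for example MAX_TOKENS, mentioned above) quoted as a string would get compared against a float and raise exactly this, so it is worth checking that numbers are unquoted:

MAX_TOKENS: 1500  # a number, not "1500"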

@JianJinglin commented Oct 26, 2023

With gpt-3.5-turbo, MAX_TOKENS is set to 1500 in config.yaml, but I am getting this error:

This model's maximum context length is 4097 tokens. However, you requested 4238 tokens (2738 in the messages, 1500 in the completion). Please reduce the length of the messages or completion.

I ran into the same situation with gpt-3.5-turbo. When I switched to GPT-4, it worked and created a program exactly like the one in Berman's video.
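
For reference, the switch is the same single config/config.yaml line mentioned earlier in the thread:

OPENAI_API_MODEL: "gpt-4"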
