@louis-she · Last active September 26, 2023
One-click deployment of various internlm models
#!/usr/bin/env bash
# One-click deployment of internlm chat models via lmdeploy.
set -e

cd ~

available_models=(internlm/internlm-chat-7b internlm/internlm-chat-7b-8k internlm/internlm-chat-20b lmdeploy/turbomind-internlm-chat-20b-w4)

# Use every GPU visible to nvidia-smi for tensor parallelism.
num_gpus=$(nvidia-smi -L | wc -l)

model=$1
# The "org" half of "org/repo" is later passed to turbomind deploy as the model name.
model_name=$(dirname "$1")

found=false
for m in "${available_models[@]}"; do
    if [ "$m" == "$model" ]; then
        found=true
        break
    fi
done

if [ "$found" == false ]; then
    echo "Usage: $0 \$model_name"
    echo "Available model_name is: ${available_models[*]}"
    exit 1
fi

if pip show lmdeploy >/dev/null 2>&1; then
    echo "Package lmdeploy is already installed"
else
    echo "Installing Python dependencies..."
    pip install lmdeploy socksio
fi

if git lfs version >/dev/null 2>&1; then
    echo "git-lfs is already installed"
else
    echo "Installing git-lfs extension..."
    curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | sudo bash
    sudo apt-get install -y git-lfs
    git lfs install
fi

echo "Downloading repo and weights..."
if [ ! -d "${model}" ]; then
    git clone "https://huggingface.co/${model}" "${model}"
fi

# Models under lmdeploy/ are already in turbomind format and can be served directly;
# everything else is converted to a turbomind workspace first.
if [[ $model == lmdeploy/* ]]; then
    python3 -m lmdeploy.serve.gradio.app --tp="${num_gpus}" --server_name=0.0.0.0 "${model}"
else
    echo "Converting model to turbomind..."
    python3 -m lmdeploy.serve.turbomind.deploy --tp="${num_gpus}" --dst_path="${model}_workspace" "${model_name}" "${model}"
    echo "Launching..."
    python3 -m lmdeploy.serve.gradio.app --tp="${num_gpus}" --server_name=0.0.0.0 "${model}_workspace"
fi
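A minimal usage sketch, assuming the gist above is saved as `deploy.sh` (a hypothetical filename, not part of the gist). It also shows how `dirname` and `basename` split the single `org/repo` argument, which is what the script relies on for the turbomind model name and the clone directory:

```shell
# Hypothetical invocation:
#   bash deploy.sh internlm/internlm-chat-7b

# How the script splits its argument:
model="internlm/internlm-chat-7b"
dirname "$model"    # prints "internlm" (used as the turbomind model name)
basename "$model"   # prints "internlm-chat-7b" (the directory git clone creates)
```

Passing anything outside `available_models` triggers the usage message and exits with status 1.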