@yugokamo
Created July 23, 2023 08:10
Llama.ipynb
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"provenance": [],
"machine_shape": "hm",
"gpuType": "T4",
"toc_visible": true,
"authorship_tag": "ABX9TyN/f+82WFsMwXKo1l3PyZYc",
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/yugokamo/928f1818bae6e008d3baa7774f71ab57/llama.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"source": [
"## Install packages"
],
"metadata": {
"id": "Q2SCa2dtIv6n"
}
},
{
"cell_type": "code",
"source": [
"# Install the required packages\n",
"!pip install transformers sentencepiece accelerate xformers"
],
"metadata": {
"id": "aRJoBKi8IRK5"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## Log in to Hugging Face"
],
"metadata": {
"id": "5kQSH99SeTpM"
}
},
{
"cell_type": "code",
"source": [
"!git config --global credential.helper store"
],
"metadata": {
"id": "W-K8u_Osd-4b"
},
"execution_count": 2,
"outputs": []
},
{
"cell_type": "code",
"source": [
"!huggingface-cli login"
],
"metadata": {
"id": "676l498JdSwS"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## Prepare the tokenizer and pipeline"
],
"metadata": {
"id": "CSifHtgFOU3C"
}
},
{
"cell_type": "code",
"source": [
"from transformers import AutoTokenizer\n",
"import transformers\n",
"import torch\n",
"\n",
"# Model ID\n",
"model = \"meta-llama/Llama-2-7b-chat-hf\"\n",
"\n",
"# Prepare the tokenizer and pipeline\n",
"tokenizer = AutoTokenizer.from_pretrained(model)\n",
"pipeline = transformers.pipeline(\n",
" \"text-generation\",\n",
" model=model,\n",
" torch_dtype=torch.float16,\n",
" device_map=\"auto\",\n",
")"
],
"metadata": {
"id": "Yx7HEV-vOZEL"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## Run inference"
],
"metadata": {
"id": "DqGqFbVmfNEp"
}
},
{
"cell_type": "code",
"source": [
"prompt = \"\"\"USER: What is the highest mountain in Japan?\n",
"\"\"\"\n",
"\n",
"sequences = pipeline(\n",
" prompt,\n",
" do_sample=True,\n",
" top_k=5,\n",
" num_return_sequences=1,\n",
" eos_token_id=tokenizer.eos_token_id,\n",
" max_length=50,\n",
")\n",
"print(sequences[0][\"generated_text\"])"
],
"metadata": {
"id": "v8l0fmz1fPMC"
},
"execution_count": null,
"outputs": []
}
]
}
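One note on the inference cell above: it sends a plain `USER:`-prefixed string, but the Llama-2 chat models were fine-tuned on a specific `[INST]` / `<<SYS>>` prompt template, and following it tends to give better-behaved responses. A minimal sketch of building that template in plain Python (the helper name `build_llama2_prompt` is just for illustration; no model download is needed to run it):

```python
def build_llama2_prompt(user_message: str,
                        system_prompt: str = "You are a helpful assistant.") -> str:
    # Llama-2-chat template:
    # <s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]
    return (
        f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_llama2_prompt("What is the highest mountain in Japan?")
print(prompt)
```

The resulting string could be passed to the `pipeline(...)` call in place of the plain prompt; the model's answer then follows the closing `[/INST]` marker in `generated_text`.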