{
  "config_general": {
    "lighteval_sha": "?",
    "num_fewshot_seeds": 1,
    "override_batch_size": 4,
    "max_samples": null,
    "job_id": "",
    "start_time": 1163608.425196265,
    "end_time": 1173616.769654949,
    "total_evaluation_time_secondes": "10008.34445868386",
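The reported `total_evaluation_time_secondes` can be sanity-checked against the two timestamps above, since it should simply be `end_time - start_time`:

```python
# Timestamps copied from the config snippet above (clock seconds).
start_time = 1163608.425196265
end_time = 1173616.769654949
reported_total = float("10008.34445868386")  # stored as a string in the JSON

elapsed = end_time - start_time
print(f"elapsed: {elapsed:.8f} s")  # ~10008.3444587 s, i.e. just under 2.8 hours
assert abs(elapsed - reported_total) < 1e-6
```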
""" | |
First install: pip install datasets pandas rich transformers | |
Usage: | |
# Loglikelihood evals | |
python view_details.py --filepath path/to/parquet/details | |
# Generative evals | |
python view_details.py --filepath path/to/parquet/details --is_generative |
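As a rough sketch of what a details viewer like this does under the hood: lighteval writes per-sample details as Parquet, which pandas reads directly via `pd.read_parquet(filepath)`. The column names below (`gold`, `prediction`, etc.) are illustrative assumptions about the schema, not guaranteed by lighteval:

```python
import pandas as pd

# Toy stand-in for a details table; a real one would come from
# pd.read_parquet(filepath). Column names here are assumptions.
details = pd.DataFrame(
    {
        "full_prompt": ["Q: 2+2?", "Q: capital of France?"],
        "gold": ["4", "Paris"],
        "prediction": ["4", "Lyon"],
    }
)

# Per-sample exact match: the kind of per-row view such a script renders.
details["exact_match"] = (details["gold"] == details["prediction"]).astype(int)
print(details[["gold", "prediction", "exact_match"]])
print("accuracy:", details["exact_match"].mean())  # 0.5 for this toy table
```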
# This is a modified version of TRL's `SFTTrainer` example (https://github.com/huggingface/trl/blob/main/examples/scripts/sft_trainer.py),
# adapted to run with DeepSpeed ZeRO-3 and Mistral-7B-v0.1. The settings below were run on 1 node of 8 x A100 (80GB) GPUs.
#
# Usage:
# - Install the latest transformers & accelerate versions: `pip install -U transformers accelerate`
# - Install DeepSpeed: `pip install deepspeed==0.9.5`
# - Install TRL from main: `pip install git+https://github.com/huggingface/trl.git`
# - Clone the repo: `git clone https://github.com/huggingface/trl.git`
# - Copy this Gist into trl/examples/scripts
# - Run from the root of the trl repo with: `accelerate launch --config_file=examples/accelerate_configs/deepspeed_zero3.yaml --gradient_accumulation_steps 8 examples/scripts/sft_trainer.py`
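One detail worth spelling out: with 8 GPUs and `--gradient_accumulation_steps 8`, the effective global batch size is `per_device_batch_size x num_gpus x gradient_accumulation_steps`. The per-device batch size of 1 below is a placeholder assumption, not a value taken from the script:

```python
# Effective global batch size under data parallelism with gradient accumulation.
# per_device_batch_size = 1 is a placeholder; the real value lives in sft_trainer.py.
per_device_batch_size = 1
num_gpus = 8                     # 1 node of 8 x A100 (80GB), as noted above
gradient_accumulation_steps = 8  # from the accelerate launch command

effective_batch_size = per_device_batch_size * num_gpus * gradient_accumulation_steps
print(effective_batch_size)  # 64 sequences per optimizer step
```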
# coding=utf-8
# Copyright 2023 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# coding=utf-8
# Copyright 2023 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
import torch
from m4.training.packing import image_attention_mask_for_packed_input_ids, incremental_to_binary_attention_mask
from m4.training.utils import build_image_transform
from io import BytesIO
from PIL import Image
import requests
from transformers import AutoTokenizer, AutoModelForCausalLM

MAX_SEQ_LEN = 2048
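For context, here is a toy illustration of the kind of transformation `incremental_to_binary_attention_mask` performs: a per-token image index becomes a one-hot row over the images in the sequence. This is a pure-Python sketch of the idea only, not the m4 implementation, which operates on batched torch tensors:

```python
def incremental_to_binary(incremental_ids, num_images):
    """Toy sketch: map per-token image indices (-1 = attends to no image)
    to a (seq_len, num_images) binary mask. Illustrative only; the real
    m4 function works on batched torch tensors."""
    mask = []
    for image_idx in incremental_ids:
        row = [0] * num_images
        if image_idx >= 0:
            row[image_idx] = 1
        mask.append(row)
    return mask

# Tokens 0-1 attend to no image, tokens 2-3 to image 0, token 4 to image 1.
print(incremental_to_binary([-1, -1, 0, 0, 1], num_images=2))
# [[0, 0], [0, 0], [1, 0], [1, 0], [0, 1]]
```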
{"id":"628dfaf7554de818ab126e2d","dataset":{"name":"glue","type":"glue","config":"sst2","split":"validation"},"metric":{"type":"accuracy","value":0.8967889908256881,"name":"Accuracy"}} | |
{"id":"628dfaf7554de818ab126e2d","dataset":{"name":"glue","type":"glue","config":"sst2","split":"validation"},"metric":{"type":"precision","value":0.8898678414096917,"name":"Precision"}} | |
{"id":"628dfaf7554de818ab126e2d","dataset":{"name":"glue","type":"glue","config":"sst2","split":"validation"},"metric":{"type":"recall","value":0.9099099099099099,"name":"Recall"}} | |
{"id":"628dfaf7554de818ab126e2d","dataset":{"name":"glue","type":"glue","config":"sst2","split":"validation"},"metric":{"type":"auc","value":0.9672186789593331,"name":"AUC"}} | |
{"id":"628dfaf7554de818ab126e2d","dataset":{"name":"glue","type":"glue","config":"sst2","split":"validation"},"metric":{"type":"f1","value":0.8997772828507795,"name":"F1"}} | |
{"id":"628dfaf7554de818ab126e2d","dataset":{"name":"glue","type":"glue","config":"sst2","split":"validation"},"metric":{"ty |