Derry Pratama (ibndias)
@MohamedAlaa
MohamedAlaa / tmux-cheatsheet.markdown
Last active May 7, 2024 06:03
tmux shortcuts & cheatsheet

start new:

tmux

start new with session name:

tmux new -s myname
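
A few more everyday session commands in the same style, added here for convenience beyond the truncated preview (the session name myname follows the example above):

attach to a named session:

tmux attach -t myname

list sessions:

tmux ls

kill a session:

tmux kill-session -t myname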
@gbaman
gbaman / HowToOTGFast.md
Last active May 1, 2024 08:26
Simple guide for setting up OTG modes on the Raspberry Pi Zero, the fast way!

Setting up Pi Zero OTG - The quick way (No USB keyboard, mouse, HDMI monitor needed)

More details - http://blog.gbaman.info/?p=791

For this method, alongside your Pi Zero, MicroUSB cable and MicroSD card, you only need one additional computer, which can be running Windows (with Bonjour, iTunes or QuickTime installed), Mac OS, or Linux (with the Avahi daemon installed; Ubuntu, for example, has it built in).
1. Flash Raspbian Jessie full or Raspbian Jessie Lite onto the SD card.
2. Once Raspbian is flashed, open up the boot partition (in Windows Explorer, Finder etc.) and add dtoverlay=dwc2 on a new line at the bottom of the config.txt file, then save the file.
3. If using a recent release of Jessie (Dec 2016 onwards), also create a new, empty file simply called ssh (no extension) on the boot partition. By default SSH is disabled, so this empty file is needed to enable it (see the illustration after this list).
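
A quick illustration of steps 2 and 3; the mount point shown is only an example and depends on your OS:

# end of config.txt after step 2
dtoverlay=dwc2

# step 3: create an empty file named ssh (no extension) on the boot partition
touch /Volumes/boot/ssh   # macOS example; on Linux the partition often mounts under /media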

@lewtun
lewtun / sft_trainer.py
Last active April 27, 2024 21:28
Fine-tuning Mistral 7B with TRL & DeepSpeed ZeRO-3
# This is a modified version of TRL's `SFTTrainer` example (https://github.com/huggingface/trl/blob/main/examples/scripts/sft_trainer.py),
# adapted to run with DeepSpeed ZeRO-3 and Mistral-7B-V1.0. The settings below were run on 1 node of 8 x A100 (80GB) GPUs.
#
# Usage:
# - Install the latest transformers & accelerate versions: `pip install -U transformers accelerate`
# - Install deepspeed: `pip install deepspeed==0.9.5`
# - Install TRL from main: `pip install git+https://github.com/huggingface/trl.git`
# - Clone the repo: `git clone https://github.com/huggingface/trl.git`
# - Copy this Gist into trl/examples/scripts
# - Run from root of trl repo with: accelerate launch --config_file=examples/accelerate_configs/deepspeed_zero3.yaml --gradient_accumulation_steps 8 examples/scripts/sft_trainer.py
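
For orientation, here is a minimal sketch of the kind of SFTTrainer call the full script builds up to. It assumes the late-2023 TRL API (where SFTTrainer still accepts dataset_text_field, max_seq_length and packing directly) and uses an illustrative dataset and hyperparameters rather than the script's actual values; ZeRO-3 itself is driven entirely by the accelerate/DeepSpeed config file, not by this code.

# Minimal SFT sketch with TRL (illustrative values; the real script adds argument parsing, etc.)
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer

dataset = load_dataset("timdettmers/openassistant-guanaco", split="train")  # example dataset

args = TrainingArguments(
    output_dir="sft-mistral-7b",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,   # matches the launch command above
    learning_rate=2e-5,
    num_train_epochs=1,
    bf16=True,
)

trainer = SFTTrainer(
    model="mistralai/Mistral-7B-v0.1",  # assumed Hub id for the Mistral 7B base model
    args=args,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    packing=True,
)
trainer.train()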
@austinmarton
austinmarton / sendRawEth.c
Created February 27, 2012 08:40
Send a raw Ethernet frame in Linux
/*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*/
#include <arpa/inet.h>
#include <linux/if_packet.h>
#include <stdio.h>
@mimoo
mimoo / erase_from_memory.h
Last active April 8, 2024 21:17
Include this file to get the `erase_from_memory` function that zeros memory. See https://www.cryptologie.net/article/419/zeroing-memory-compiler-optimizations-and-memset_s/
#ifndef __ERASE_FROM_MEMORY_H__
#define __ERASE_FROM_MEMORY_H__ 1
#define __STDC_WANT_LIB_EXT1__ 1
#include <stdlib.h>
#include <string.h>
void *erase_from_memory(void *pointer, size_t size_data, size_t size_to_remove) {
#ifdef __STDC_LIB_EXT1__
    /* C11 Annex K: memset_s must not be optimized away by the compiler */
    memset_s(pointer, size_data, 0, size_to_remove);
#else
    /* fallback when memset_s is unavailable: zero through a volatile pointer
       so the writes are not optimized away */
    if (size_to_remove > size_data) size_to_remove = size_data;
    volatile unsigned char *p = pointer;
    while (size_to_remove--) *p++ = 0;
#endif
    return pointer;
}
#endif /* __ERASE_FROM_MEMORY_H__ */
@Vini2
Vini2 / SimpleDemoGA.java
Last active March 27, 2024 15:58
A simple implementation of a genetic algorithm
import java.util.Random;
/**
*
* @author Vijini
*/
//Main class
public class SimpleDemoGA {
@oskar456
oskar456 / wgcf.py
Last active February 17, 2024 12:47
Cloudflare WARP linux client (using wg-quick for actual tunnel setup)
#!/usr/bin/env python3
import subprocess
import json
import os
from pathlib import Path
import requests
from requests.compat import urljoin
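
For context, a minimal sketch of the WireGuard key-generation step a client like this performs before registering with the WARP API; this is an illustration rather than the gist's actual code, and it assumes the wg tool is installed:

# Generate a WireGuard keypair by shelling out to the wg tool (illustrative sketch)
import subprocess

def generate_keypair():
    private_key = subprocess.run(
        ["wg", "genkey"], capture_output=True, text=True, check=True
    ).stdout.strip()
    public_key = subprocess.run(
        ["wg", "pubkey"], input=private_key, capture_output=True, text=True, check=True
    ).stdout.strip()
    return private_key, public_key

if __name__ == "__main__":
    priv, pub = generate_keypair()
    print("Private key:", priv)
    print("Public key:", pub)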
@ibndias
ibndias / sft_trainer.py
Created February 8, 2024 11:51 — forked from lewtun/sft_trainer.py
Fine-tuning Mistral 7B with TRL & DeepSpeed ZeRO-3
@naufalso
naufalso / pdf-color-pages-classifier-and-counter.ipynb
Last active January 3, 2024 06:27
pdf-color-pages-classifier-and-counter.ipynb