ZekiJohn / install_anaconda.md
Created October 24, 2023 01:17 — forked from kauffmanes/install_anaconda.md
Install Anaconda on Windows Subsystem for Linux (WSL)

Thanks everyone for commenting/contributing! I made this in college for a class and I no longer really use the technology. I encourage you all to help each other, but I probably won't be answering questions anymore.

This article is also on my blog: https://emilykauffman.com/blog/install-anaconda-on-wsl

Note: $ denotes the start of a command. Don't actually type the $.

Steps to Install Anaconda on Windows Ubuntu Terminal

  1. Install WSL (Ubuntu for Windows, available in the Microsoft Store). I recommend the latest version (I'm using 18.04) because some bugs present in 14.04/16.04 have since been worked out (microsoft/WSL#785)
  2. Go to https://repo.continuum.io/archive to find the list of Anaconda releases
  3. Select the release you want. I have a 64-bit computer, so I chose the latest release ending in x86_64.sh. If I had a 32-bit computer, I'd select the x86.sh version. If you accidentally try to install the wrong one, you'll get a warning in the terminal. I chose `Anaconda3-5.2.0-Linux-x86_64.sh` (see the sketch after this list).
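A hedged sketch of the download and install commands for that choice (the 5.2.0 filename comes from the selection above; substitute whichever release you actually picked from the archive):

$ wget https://repo.continuum.io/archive/Anaconda3-5.2.0-Linux-x86_64.sh
$ bash Anaconda3-5.2.0-Linux-x86_64.sh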
rain-1 / llama-home.md
Last active May 16, 2024 04:58
How to run Llama 13B with a 6GB graphics card

This worked on 14/May/23. The instructions will probably require updating in the future.

LLaMA is a text-prediction model similar to GPT-2, and to the version of GPT-3 that has not yet been fine-tuned. It should also be possible to run fine-tuned versions with this (like Alpaca or Vicuna, I think; those versions are more focused on answering questions).

Note: I have been told that this does not support multiple GPUs. It can only use a single GPU.

It is now possible to run LLaMA 13B with a 6GB graphics card (e.g. an RTX 2060), thanks to the amazing work on llama.cpp. The latest change is CUDA/cuBLAS support, which lets you pick an arbitrary number of transformer layers to run on the GPU. This is perfect for low VRAM.

  • Clone llama.cpp from git; I am on commit 08737ef720f0510c7ec2aa84d7f70c691073c35d (see the sketch after this list).
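The listing cuts the gist off here. For context, a hedged sketch of the build-and-run steps that follow, using flags llama.cpp supported around that commit (the model path and layer count are assumptions; tune --n-gpu-layers to whatever fits in 6GB of VRAM):

make LLAMA_CUBLAS=1
./main -m ./models/13B/ggml-model-q4_0.bin --n-gpu-layers 18 -p "Hello"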
# STEP 1: Load
# Load documents using LangChain's DocumentLoaders
# This is from https://langchain.readthedocs.io/en/latest/modules/document_loaders/examples/csv.html
from langchain.document_loaders.csv_loader import CSVLoader
loader = CSVLoader(file_path='./example_data/mlb_teams_2012.csv')
data = loader.load()
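Each CSV row comes back as one LangChain Document; a quick way to inspect the result (attribute names follow LangChain's Document class):

# one Document per CSV row
print(len(data))
# the row rendered as "column: value" lines
print(data[0].page_content)
# e.g. {'source': './example_data/mlb_teams_2012.csv', 'row': 0}
print(data[0].metadata)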
jjsquady / mysql-sail.md
Last active March 9, 2024 16:50
Laravel Sail - GRANT ALL PRIVILEGES to user

Run this command in the Sail project folder:

docker-compose exec mysql bash

Execute the mysql -u root -p command.

Provide the default password: password.

Then execute:
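The gist excerpt is truncated at this point. The statement it goes on to show is presumably along these lines (sail is Laravel Sail's default database user; the exact grant in the gist may differ):

GRANT ALL PRIVILEGES ON *.* TO 'sail'@'%';
FLUSH PRIVILEGES;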

0xjac / private_fork.md
Last active May 22, 2024 11:28
Create a private fork of a public repository

The repository for the assignment is public, and GitHub does not allow the creation of private forks of public repositories.

The correct way of creating a private fork by duplicating the repo is documented here.

For this assignment the commands are:

  1. Create a bare clone of the repository. (This is temporary and will be removed so just do it wherever.)

git clone --bare git@github.com:usi-systems/easytrace.git
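The listing truncates the gist here. The remaining steps in the full gist are roughly as follows (create the private repo on GitHub first; <your-username> is a placeholder, and the exact commands in the gist may differ):

cd easytrace.git
git push --mirror git@github.com:<your-username>/easytrace.git
cd ..
rm -rf easytrace.git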

xeoncross / composer.md
Last active March 26, 2024 02:59
How composer actually works

Composer: the missing quick-start guide

We will assume we have a package/project hosted at https://github.com/foo/bar

Most redistributable packages are hosted on a version control website such as GitHub or Bitbucket. Version control repositories often have a tagging system where we can define stable versions of our application. For example, with git we can use the command:

git tag -a 1.0.0 -m 'First version.'

With this we have created version 1.0.0 of our application. This is a stable release which people can depend on. If we wanted to include "foo/bar" then we could create a composer.json and specify the tagged release we wanted:
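The excerpt ends at the preview's cutoff; the composer.json the gist goes on to show would look something like this (standard Composer require syntax; the exact version constraint in the gist may differ):

{
    "require": {
        "foo/bar": "1.0.0"
    }
}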