Motoki Wu (tokestermw)

twolfson / README.rst
Last active April 29, 2024 01:46
Evaluation and comparison of various Python templating libraries

gist-python-templating-evaluation

Hellisotherpeople / blog.md
Last active May 4, 2024 01:57
You probably don't know how to do Prompt Engineering, let me educate you.

You probably don't know how to do Prompt Engineering

(This post could also be titled "Features missing from most LLM front-ends that should exist")

Apologies for the snarky title, but there has been a huge amount of discussion around so-called "Prompt Engineering" these past few months on all kinds of platforms. Much of it comes from individuals peddling an awful lot of "Prompting" and very little "Engineering".

Most of these discussions amount to users finding that writing more creative and complicated prompts can help them solve a task that a simpler prompt could not. I claim this is not Prompt Engineering. This is not to say that crafting good prompts is easy, but it does not involve any kind of sophisticated modification to the general "template" of a prompt.

Others, who I think do deserve to call themselves "Prompt Engineers" (and an awful lot more than that), have been writing about and utilizing the rich new ecosystem …

rain-1 / LLM.md
Last active May 5, 2024 07:13
LLM Introduction: Learn Language Models

Purpose

Bootstrap knowledge of LLMs ASAP, with a bias/focus toward GPT.

Avoid being a link dump. Try to provide only valuable, well-tuned information.

Prelude

Links on neural networks to read before starting on transformers.

import torch
import torch.nn as nn
import torch.nn.functional as F

# helpers
def make_unit_length(x, epsilon=1e-6):
    # L2-normalize x along its last dimension; epsilon guards against
    # division by zero for all-zero rows.
    norm = x.norm(p=2, dim=-1, keepdim=True)
    return x.div(norm + epsilon)
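
A quick usage sketch (the tensor shape here is arbitrary, chosen only for illustration): every row of the output has unit L2 norm.

x = torch.randn(3, 4)          # batch of 3 vectors, dimension 4
u = make_unit_length(x)
print(u.norm(p=2, dim=-1))     # each value is ~1.0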
worker_processes 1;

events {
    worker_connections 1024;
}

http {
    # Map the client's Upgrade header onto the Connection header value
    # used when proxying WebSocket handshakes.
    map $http_upgrade $connection_upgrade {
        default upgrade;
        ''      close;
    }
    # (remainder of the http block truncated in this preview)
}
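
This is the standard nginx idiom for proxying WebSockets: when a request carries an Upgrade header, $connection_upgrade resolves to "upgrade", and otherwise to "close", so a later proxy_set_header Connection $connection_upgrade; directive (not shown in this excerpt) forwards the right value upstream.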
rxwei / ad-manifesto.md
Last active November 9, 2023 09:58
First-Class Automatic Differentiation in Swift: A Manifesto

Feature Store

Uber Michelangelo

https://eng.uber.com/michelangelo/

Finding good features is often the hardest part of machine learning and we have found that building and managing data pipelines is typically one of the most costly pieces of a complete machine learning solution.

A platform should provide standard tools for building data pipelines to generate feature and label data sets for training (and re-training) and feature-only data sets for predicting. These tools should have deep integration with the company’s data lake or warehouses and with the company’s online data serving systems. The pipelines need to be scalable and performant, incorporate integrated monitoring for data flow and data quality, and support both online and offline training and predicting. Ideally, they should also generate the features in a way that is shareable across teams to reduce duplicate work and increase data quality. They should also provide strong guard rails and controls to encourage and empower users to adopt …
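
To make the pipeline shape above concrete, here is a minimal, hypothetical sketch in Python. Every name in it (FeatureStore, register, materialize, get_features, the driver feature) is invented for illustration and is not Michelangelo's API; a real system would back the offline side with a data lake and the online side with a low-latency serving store.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class FeatureStore:
    # Shared registry of named feature transforms (reduces duplicate work
    # across teams), plus a toy in-memory "online" store for serving.
    transforms: Dict[str, Callable[[dict], float]] = field(default_factory=dict)
    online: Dict[str, Dict[str, float]] = field(default_factory=dict)

    def register(self, name: str, fn: Callable[[dict], float]) -> None:
        # Register a named feature transform so any team can reuse it.
        self.transforms[name] = fn

    def materialize(self, entity_id: str, raw: dict) -> None:
        # Compute all registered features for one entity and publish
        # them to the online store for low-latency lookup at serving time.
        self.online[entity_id] = {n: fn(raw) for n, fn in self.transforms.items()}

    def get_features(self, entity_id: str, names: List[str]) -> Dict[str, float]:
        # Feature-only lookup at prediction time.
        row = self.online[entity_id]
        return {n: row[n] for n in names}

# Usage sketch with an invented feature:
store = FeatureStore()
store.register("trip_count_7d", lambda raw: float(raw["trips_last_7_days"]))
store.materialize("driver_42", {"trips_last_7_days": 18})
print(store.get_features("driver_42", ["trip_count_7d"]))  # {'trip_count_7d': 18.0}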

elmarhaussmann / text_classification_character_rnn.py
Created February 23, 2018 05:12
Character based text classification with TPUEstimator
# Based on the example from the TensorFlow repository: https://github.com/tensorflow/tensorflow/
# https://github.com/tensorflow/tensorflow/blob/671baf080238025da9698ea980cd9504005f727c/tensorflow/examples/learn/text_classification_character_rnn.py
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import argparse
import sys
awni / ctc_decoder.py
Last active April 18, 2024 19:14
Example CTC Decoder in Python
"""
Author: Awni Hannun
This is an example CTC decoder written in Python. The code is
intended to be a simple example and is not designed to be
especially efficient.
The algorithm is a prefix beam search for a model trained
with the CTC loss function.
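
The gist itself implements the prefix beam search; as a simpler point of reference, the sketch below shows greedy (best-path) CTC decoding: take the argmax at each frame, collapse repeats, then drop blanks. It assumes a probability matrix of shape (time, vocab) with the blank symbol at index 0, and it is a baseline for comparison, not the gist's algorithm.

import numpy as np

def greedy_ctc_decode(probs, blank=0):
    # probs: array of shape (time_steps, vocab_size) with per-frame
    # label probabilities; blank is the index of the CTC blank symbol.
    best_path = np.argmax(probs, axis=1)
    decoded = []
    prev = None
    for label in best_path:
        # Collapse consecutive repeats, then remove blanks.
        if label != prev and label != blank:
            decoded.append(int(label))
        prev = label
    return decoded

# Tiny example: 4 frames over a 3-symbol vocabulary (blank, 'a', 'b').
probs = np.array([
    [0.1, 0.8, 0.1],   # frame 0 -> label 1
    [0.1, 0.8, 0.1],   # frame 1 -> label 1 (repeat, collapsed)
    [0.7, 0.2, 0.1],   # frame 2 -> blank (dropped)
    [0.1, 0.1, 0.8],   # frame 3 -> label 2
])
print(greedy_ctc_decode(probs))  # [1, 2]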