Some remarks on Large Language Models

Yoav Goldberg, January 2023

Audience: I assume you've heard of ChatGPT, maybe played with it a little, and were impressed by it (or tried very hard not to be). And that you've also heard that it is "a large language model". And maybe that it "solved natural language understanding". Here is a short personal perspective on these (and similar) models, and on where we stand with respect to language understanding.

Intro

Around 2014-2017, right at the rise of neural-network-based methods for NLP, I was giving a semi-academic, semi-popsci lecture revolving around the story that achieving perfect language modeling is equivalent to being as intelligent as a human. Somewhere around the same time I was also asked on an academic panel "what would you do if you were given infinite compute and no need to worry about labour costs", to which I cockily responded "I would train a really huge language model, just to show that it doesn't solve everything!"
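For concreteness, "language modeling" here means assigning a probability to the next token given everything that came before it; a model is perfect when its predicted distribution matches the true one. As a rough sketch of the standard formulation (my framing, not part of the original lecture):

P(w_1, \dots, w_n) = \prod_{t=1}^{n} P(w_t \mid w_1, \dots, w_{t-1})

Training minimizes the cross-entropy between the model's next-token distribution and the observed text; the intuition behind the lecture's claim is that driving this loss all the way down requires everything that human intelligence brings to producing text.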

rnn-lstm.py
TensorFlow RNN-LSTM implementation to count the number of set bits in a binary string
# Source code accompanying the blog post at http://monik.in/a-noobs-guide-to-implementing-rnn-lstm-using-tensorflow/
# Original gist: https://gist.github.com/AlmostDan/a5f4b57104532a68bda2c274f8bcf16f
# TensorBoard inspiration: https://github.com/aymericdamien/TensorFlow-Examples/blob/master/examples/4_Utils/tensorboard_basic.py
import numpy as np
from random import shuffle
import tensorflow as tf

NUM_EXAMPLES = 10000  # number of binary strings used for training
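The snippet is truncated here. As a rough illustration of where it goes next, here is a minimal sketch of the data-generation step described in the linked blog post: enumerate every 20-bit binary string, shuffle, keep NUM_EXAMPLES of them, and pair each string with a one-hot target over its possible set-bit counts (0 through 20). The name make_dataset and the variable seq_len are my own for this sketch, not from the original gist.

seq_len = 20  # each input is a 20-bit binary string, fed one bit per time step

def make_dataset(num_examples):
    # Enumerate all 2**20 binary strings, shuffle, and keep num_examples of them.
    strings = ['{0:020b}'.format(i) for i in range(2 ** seq_len)]
    shuffle(strings)
    strings = strings[:num_examples]
    # Inputs: shape (num_examples, seq_len, 1), one bit per time step.
    inputs = np.array([[[int(bit)] for bit in s] for s in strings], dtype=np.float32)
    # Targets: one-hot over the 21 possible set-bit counts, shape (num_examples, 21).
    targets = np.zeros((num_examples, seq_len + 1), dtype=np.float32)
    counts = inputs.sum(axis=(1, 2)).astype(int)
    targets[np.arange(num_examples), counts] = 1.0
    return inputs, targets

train_input, train_output = make_dataset(NUM_EXAMPLES)

An LSTM then reads each string one bit per time step, and a 21-way softmax over its final state predicts the count; training that classifier is the whole exercise of the blog post.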