msmakhlouf / LLMs.md
Created January 3, 2023 16:34 — forked from yoavg/LLMs.md

Some remarks on Large Language Models

Yoav Goldberg, January 2023

Audience: I assume you have heard of chatGPT, maybe played with it a little, and were impressed by it (or tried very hard not to be). And that you have also heard that it is "a large language model". And maybe that it "solved natural language understanding". Here is a short personal perspective of my thoughts on this (and similar) models, and where we stand with respect to language understanding.

Intro

Around 2014-2017, right amid the rise of neural-network-based methods for NLP, I was giving a semi-academic, semi-popsci lecture revolving around the story that achieving perfect language modeling is equivalent to being as intelligent as a human. Somewhere around the same time I was also asked on an academic panel "what would you do if you were given infinite compute and no need to worry about labour costs?", to which I cockily responded "I would train a really huge language model, just to show that it doesn't solve everything!"

Keybase proof

I hereby claim:

  • I am msmakhlouf on github.
  • I am msmakhlouf (https://keybase.io/msmakhlouf) on keybase.
  • I have a public key ASAnAPRdjDPfax_-cWN35TEAfa3UnS1TKcGx_Y2wqA8SrQo

To claim this, I am signing this object: