@TikkunCreation
Last active August 9, 2023 03:35
OpenAI self-study / deep neural network learning

Sam Altman: "I think if you have a smart person who has learned to do good research and has the right sort of mindset, it only takes about six months to make them, you know, take a smart physics researcher and make them into a productive AI researcher. So we don't have enough talent in the field yet, but it's coming soon. We have a program at open AI that does exactly this. And I'm astonished how well it works."

Types of roles at AI companies

  • Software Engineer (not the focus of this Gist): build customer-facing features, optimize applications for speed and scale, and use AI APIs. Prompt engineering expertise is generally helpful, but AI experience beyond expert use of the APIs or ChatGPT is generally not needed.
  • Machine Learning Engineer: build pipelines for data management, model training, and model deployment to improve models (not the focus of this Gist), and/or implement cutting-edge research papers (a focus of this Gist).
  • Research Engineer (a focus of this Gist, though it's missing resources on massive scale): build massive-scale distributed machine learning systems.
  • Research Scientist (a focus of this Gist): develop new ML techniques that push the state of the art forward.

Research Science focused (though still helpful if you're interested in Research Engineering)

https://openai.com/research/spinning-up-in-deep-rl

https://iconix.github.io/notes/2018/10/07/what-i-learned

https://github.com/iconix/openai/blob/master/syllabus.md

See also other OpenAI fellows'/scholars' blog posts ("We ask all Scholars to document their experiences studying deep learning to hopefully inspire others to join the field too."), e.g. https://openai.com/blog/openai-scholars-2021-final-projects

https://80000hours.org/podcast/episodes/chris-olah-unconventional-career-path/

https://80000hours.org/podcast/episodes/richard-ngo-large-language-models/

John Schulman: https://www.youtube.com/watch?v=hhiLw5Q_UFg

Alec Radford: https://www.youtube.com/watch?v=BnpB3GrpsfM, https://www.youtube.com/watch?v=3X3EY2Fgp3g, https://www.youtube.com/watch?v=S75EdAcXHKk, https://www.youtube.com/watch?v=VINCQghQRuM, https://www.youtube.com/watch?v=KeJINHjyzOU

https://web.archive.org/web/20200813005847/http://wiki.fast.ai:80/index.php/Calculus_for_Deep_Learning and https://www.quantstart.com/articles/matrix-algebra-linear-algebra-for-deep-learning-part-2/ (via https://openai.com/blog/openai-scholars-2019)

https://www.deepmind.com/learning-resources/introduction-to-reinforcement-learning-with-david-silver

Research Engineering focused (though still helpful if you're interested in Research Science)

http://karpathy.github.io/2019/04/25/recipe/

https://karpathy.medium.com/yes-you-should-understand-backprop-e2f06eab496b
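Karpathy's post argues that it pays to understand what autograd frameworks do for you. As a minimal illustrative sketch (my own example, not code from the article): forward pass through y = (w·x + b)², then gradients by hand via the chain rule.

```python
# Hand-rolled backprop sketch (illustrative, not from the linked article).
# Forward: y = (w*x + b)^2; backward: chain rule through each step.

def forward_backward(w, x, b):
    # forward pass, saving the intermediate value
    z = w * x + b          # linear step
    y = z ** 2             # squaring step
    # backward pass, starting from dy/dy = 1
    dz = 2 * z             # dy/dz
    dw = dz * x            # dy/dw = dy/dz * dz/dw
    db = dz * 1            # dy/db = dy/dz * dz/db
    return y, dw, db

y, dw, db = forward_backward(w=3.0, x=2.0, b=1.0)
print(y, dw, db)  # z = 7, so y = 49, dw = 28, db = 14
```

Working through a few examples like this by hand is exactly the exercise the article recommends before trusting `.backward()`.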

https://course.fast.ai/

This HN thread: https://news.ycombinator.com/item?id=35114530

Andrew Ng's courses: https://www.deeplearning.ai/ and https://www.coursera.org/collections/machine-learning

https://github.com/karpathy/nanoGPT
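At the core of nanoGPT is causal self-attention. A rough single-head numpy sketch of the masking idea (a simplified assumption-laden illustration, not nanoGPT's actual PyTorch code; the weight matrices here are random, untrained stand-ins):

```python
import numpy as np

def causal_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention over a (T, d) sequence.

    Illustrative sketch of the mechanism nanoGPT implements in PyTorch.
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)                   # (T, T) attention logits
    mask = np.triu(np.ones((T, T)), k=1).astype(bool)
    scores[mask] = -np.inf                          # block attention to future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                              # (T, d) attended values

rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
W = [rng.normal(size=(d, d)) for _ in range(3)]
out = causal_attention(x, *W)
print(out.shape)  # (4, 8)
```

The triangular mask is what makes the model autoregressive: position i can only attend to positions ≤ i, so perturbing a later token leaves earlier outputs unchanged.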

Author's note: the most up-to-date version of this list is now here: https://llm-utils.org/AI+Learning+Curation
