
@aparrish
Last active October 11, 2024 01:03
Understanding word vectors: A tutorial for "Reading and Writing Electronic Text," a class I teach at ITP. (Python 2.7) Code examples released under CC0 https://creativecommons.org/choose/zero/, other text released under CC BY 4.0 https://creativecommons.org/licenses/by/4.0/
@Zaravanon

Great, thank you!

@tugcekizilltepe

Great, well-explained tutorial, thank you!

@prakashr7d

Not sure why I'm getting the following error; working on macOS with Jupyter Lab, Python 2.7, and spaCy 2.0.9:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-2-090b6e832a74> in <module>()
      3 # It creates a list of unique words in the text
      4 tokens = list(set([w.text for w in doc if w.is_alpha]))
----> 5 print nlp.vocab['cheese'].vector

lexeme.pyx in spacy.lexeme.Lexeme.vector.__get__()

ValueError: Word vectors set to length 0. This may be because you don't have a model installed or loaded, or because your model doesn't include word vectors. For more info, see the documentation: 
https://spacy.io/usage/models

You want to download the 'en_core_web_lg' model.
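
In case it helps others hitting the same error, here is a minimal sketch of the fix, assuming spaCy's standard CLI downloader is available in your environment. First, from the shell:

python -m spacy download en_core_web_lg

Then in the notebook:

import spacy

# The large English model ships with real word vectors; the small models
# (e.g. en_core_web_sm) do not, which is what triggers the
# "Word vectors set to length 0" error above.
nlp = spacy.load('en_core_web_lg')
print(nlp.vocab['cheese'].vector)  # now a 300-dimensional vector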

@saiankit

OMG!! Really had a great time reading this beautiful gist. Very well explained.

@DavidHarar

Thanks!

@mikeolubode

I was led here by a tutorial on word vectors from YouTube. Thanks for the simplicity!

@yishairasowsky

Very good.

@robertocsa

Thank you for sharing this. Excellent job!

@avneesh91

This is amazing, thank you for the explanation!!

@prateekcaire

Thanks!!

@adebiasi

Very nice tutorial!

One question:
A word near the origin (0, 0, 0, ...) of the n-dimensional space is less likely to be the result of adding words together. By contrast, a word very distant from the origin could be the sum of many different combinations of words. Does this mean that complex concepts lie far from the origin and basic concepts lie near it?
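
One quick way to probe this empirically (just a sketch, assuming the en_core_web_lg model with vectors is installed, not something from the tutorial itself): the distance of a word from the origin is simply the Euclidean norm of its vector, so you can compare norms for different words directly.

import numpy as np
import spacy

# Distance from the origin is the Euclidean norm of the word's vector.
# Comparing norms across words is one way to test whether "complex"
# words sit farther from the origin than "basic" ones.
nlp = spacy.load('en_core_web_lg')
for word in ['the', 'cheese', 'epistemology']:
    print(word, np.linalg.norm(nlp.vocab[word].vector))

In practice, vector norms in word2vec-style embeddings tend to track word frequency as much as any notion of conceptual complexity, so the picture is likely subtler than near = basic, far = complex.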
