spacyr crashes on tokenizer
corpus <- readr::read_csv("corpus.csv")
# Tokenizing works with the first 100 documents:
# t <- corpus$text[1:100]
# but crashes on document 124:
t <- corpus$text[124]
spacyr::spacy_tokenize(t)
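A minimal sketch for narrowing down a crash like this, assuming spacyr has been initialized (the `spacy_initialize()` call and the `"en_core_web_sm"` model name are assumptions, not part of the original gist). Malformed input is a common culprit, so inspecting the offending document before tokenizing may isolate the cause:

```r
library(spacyr)
# Assumption: a spaCy model is installed and initializable
spacy_initialize(model = "en_core_web_sm")

doc <- corpus$text[124]
is.na(doc)        # an NA document can break the tokenizer
nchar(doc)        # zero-length or extremely long strings are suspects
Encoding(doc)     # a non-UTF-8 encoding is a frequent cause of crashes

# Possible workaround: force UTF-8 before handing the text to spaCy
spacy_tokenize(enc2utf8(doc))
```

If re-encoding fixes it, the underlying issue is likely in how `readr::read_csv()` guessed the file's encoding rather than in spacyr itself.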