
@necronet
Created June 5, 2020 17:19
spacyr crashes on the tokenizer for a specific document
library(spacyr)
spacy_initialize()

corpus <- readr::read_csv("corpus.csv")

# Tokenizing a slice of the corpus works fine:
# t <- corpus$text[1:100]

# But this single document crashes the tokenizer:
t <- corpus$text[124]
spacyr::spacy_tokenize(t)
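Since spacyr delegates tokenization to the Python spaCy library via reticulate, one way to narrow down the crash is to run the offending text through spaCy directly in Python. A minimal sketch (the sample string here is a stand-in for `corpus$text[124]`, which is not reproduced in this gist): if tokenization succeeds in plain spaCy, the problem likely lies in the R-to-Python bridge rather than in spaCy itself.

```python
import spacy

# A blank pipeline carries only the tokenizer, so no trained model
# download is required for this check.
nlp = spacy.blank("en")

# Stand-in text; replace with the actual contents of corpus$text[124].
doc = nlp("Spacy R crashes on tokenizer")
tokens = [t.text for t in doc]
print(tokens)
```

If this runs cleanly on the real document text, the next step would be checking the spacyr/reticulate layer (e.g. encoding of the string passed from R).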