# What is Word Embedding (Word2Vec)?

Word embedding converts natural language into vectors that a computer can process, making it 'look like' the computer understands language. Because the meaning of each word is encoded numerically as a vector, it can measure the similarity between words, it makes words easier to handle computationally, and it even supports inference through vector arithmetic.
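A minimal sketch of these ideas, using tiny hand-made 3-dimensional vectors (hypothetical values for illustration; real embeddings such as Word2Vec are learned from data and typically have 100+ dimensions):

```python
import math

# Toy word vectors (hypothetical values, chosen by hand for illustration only;
# real embeddings are learned by a model such as Word2Vec).
vectors = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.8],
    "man":   [0.9, 0.1, 0.1],
    "woman": [0.8, 0.2, 0.8],
}

def cosine(a, b):
    """Cosine similarity: the cosine of the angle between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Measuring similarity between words as vector similarity.
print(cosine(vectors["king"], vectors["queen"]))

# Inference through vector arithmetic: king - man + woman lands near queen.
analogy = [k - m + w for k, m, w in
           zip(vectors["king"], vectors["man"], vectors["woman"])]
best = max(vectors, key=lambda word: cosine(analogy, vectors[word]))
print(best)  # → queen
```

With learned embeddings the same two operations (cosine similarity and vector arithmetic) are what libraries like gensim expose as `most_similar`.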
Paper Link: [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805)