# Word Embedding
Word embedding is a technique in natural language processing (NLP) that represents words as dense numerical vectors, or embeddings. The goal of word embedding is to capture the semantic meaning of words in a way that machine learning models can use for NLP tasks such as text classification, named entity recognition, and machine translation.
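For instance, semantic closeness between two embeddings is commonly measured with cosine similarity. The sketch below uses made-up 4-dimensional vectors for illustration (real embeddings typically have tens to hundreds of dimensions, and these particular values are assumptions, not trained output):

```python
import numpy as np

# Hypothetical 4-dimensional embeddings for three words.
king  = np.array([0.8, 0.1, 0.9, 0.3])
queen = np.array([0.7, 0.2, 0.9, 0.4])
apple = np.array([0.1, 0.9, 0.2, 0.8])

def cosine_similarity(a, b):
    """Similarity of two vectors: 1.0 = same direction, 0.0 = orthogonal."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine_similarity(king, queen))  # high: semantically related words
print(cosine_similarity(king, apple))  # low: unrelated words
```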
Word embeddings are typically generated by training a neural network on a large corpus of text. The network takes a window of words from the text as input and attempts to predict a nearby word (for example, the center word from its surrounding context, or vice versa). As the network is trained, it learns to associate each word with a dense vector of numerical values that captures the word's meaning in the context of the training data.
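As a concrete sketch of this training process, the snippet below uses gensim's `Word2Vec` (assuming gensim 4.x is installed) to learn skip-gram embeddings from a toy corpus; the corpus and hyperparameters are illustrative assumptions, not part of the original note:

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (assumed for illustration).
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "animals"],
]

# Train skip-gram embeddings (sg=1). vector_size is the embedding
# dimensionality; window is the context size around each word.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=100)

vec = model.wv["cat"]                # the learned 50-dimensional vector
print(model.wv.most_similar("cat")) # nearest neighbours in embedding space
```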
[[Word2Vec]]
## Source
- ChatGPT