From course:

Intro to AI 2

Question:

Transformer - Vector embeddings

Author: Christian N



Answer:

Key idea: Similar words should have similar representation vectors.

• Words (or images) are mapped into a multi-dimensional space called the embedding space, or latent space.
• In this space, similar concepts are close to each other.

In the example, the words "cat" and "dog" are often used in similar contexts, so they lie close together in the latent space. The word "car", although very similar to the word "cat" in spelling, is used in different contexts, and is therefore far away in the latent space.
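The "close in the latent space" idea above can be sketched with cosine similarity on toy vectors. The 3-D embeddings below are made-up illustrative values, not output from a real model; a trained transformer would use vectors with hundreds of dimensions.

```python
import math

# Hypothetical toy 3-D embeddings (illustrative only, not from a real model).
embeddings = {
    "cat": [0.90, 0.80, 0.10],
    "dog": [0.85, 0.75, 0.20],
    "car": [0.10, 0.20, 0.95],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(embeddings["cat"], embeddings["dog"]))  # high (close in latent space)
print(cosine_similarity(embeddings["cat"], embeddings["car"]))  # much lower (far apart)
```

"cat" and "dog" point in nearly the same direction (similarity near 1), while "cat" and "car" do not, mirroring the contexts they appear in.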

