Machine Learning Glossary
==A==
* Attention - A mechanism used in transformers that computes the product of query embeddings and key embeddings to measure the interaction between sequence elements; a minimal sketch is given below.
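A minimal sketch of the scaled dot-product form of attention using NumPy, assuming query, key, and value arrays of shape (seq_len, d_model); the function and variable names are illustrative and not taken from any particular library.

<syntaxhighlight lang="python">
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention over a single sequence.

    Q, K, V: arrays of shape (seq_len, d_model).
    Returns an array of shape (seq_len, d_model).
    """
    d = Q.shape[-1]
    # Interaction scores between every query position and every key position.
    scores = Q @ K.T / np.sqrt(d)
    # Softmax over the key dimension turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted sum of the value embeddings.
    return weights @ V

# Example usage: 4 tokens with embedding size 8.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = attention(Q, K, V)  # shape (4, 8)
</syntaxhighlight>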
==B==