Embeddings, Attention, FNN and Everything You Wanted to Know But Were Afraid to Ask

Introduction

With this article, I'm starting a series dedicated to LLMs (large language models), neural networks, and everything else hiding behind the AI abbreviation. My goals in writing it are, of course, self-serving: I started diving into these topics relatively recently myself and ran into a mass of information, articles, and documents written in small print, full of pretentious diagrams and formulas, where by the end of one paragraph you've already forgotten what the previous one said.

So here I will try to describe the subject area at a conceptual level. I promise to avoid, as much as possible, the mathematical formulas and tricky graphs that make a reader want to close the browser tab and visit the nearest liquor store. No formulas (except the simplest ones), no pretentiousness, no pretending to look smarter than I am. Your grandmother should be able to understand these articles; if that doesn't work out, then I have failed the task. ...

February 9, 2025 · 16 min · Anton Chirikalov