// AUTHOR
Elnaz R. Koupaei

Decoding_Basic_AI_Concepts

Wait, how do LLMs actually work?

<query_01>

Define: LLM

Large Language Model

Think of it as a prediction engine. It’s a complex program trained on massive amounts of text from the internet. Its job isn't "thinking"; it's calculating the probability of the next word in a sequence.
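That "next word" step can be sketched in a few lines. This is a toy illustration, not a real model: the candidate words and their scores are invented, but the mechanics (score every candidate, softmax the scores into probabilities, pick a winner) mirror what an LLM does at each step.

```python
import math

# Made-up scores (logits) for what follows "The cat sat on the".
# A real LLM produces one score per token in a vocabulary of ~100k.
logits = {"mat": 4.0, "roof": 2.5, "moon": 0.5}

# Softmax: turn raw scores into probabilities that sum to 1.
max_logit = max(logits.values())  # subtract the max for numerical stability
exps = {tok: math.exp(score - max_logit) for tok, score in logits.items()}
total = sum(exps.values())
probs = {tok: e / total for tok, e in exps.items()}

# Greedy decoding: pick the most probable next token.
next_token = max(probs, key=probs.get)
print(next_token)  # "mat" wins because it has the highest score
```

Real models usually sample from this distribution rather than always taking the top word, which is why the same prompt can produce different answers.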

<query_02>

G in GPT is for Generative

Creation vs. Retrieval

Search engines find what exists. Generative AI creates what could exist. It builds answers token by token (or, for images, pixel by pixel), creating something new every time you press Enter.

<query_03>

P in GPT is for Pre-trained

The Knowledge Cutoff

The AI read millions of pages before you ever met it. It knows how language works structurally, but its knowledge is frozen in the past (the moment training stopped).

<query_04>

T in GPT is for Transformer

Attention Mechanism

This is the secret sauce. A Transformer doesn't read left-to-right. It looks at the whole sentence at once, weighing how words relate to each other (e.g., connecting "Bank" to "River" vs "Money").
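The weighing described above can be sketched as scaled dot-product attention, the core operation of a Transformer. Everything numeric here is invented for illustration (real models learn these vectors); the point is that "bank" ends up weighting "river" more heavily than "money" when their vectors are more similar.

```python
import math

def attention(query, keys, values):
    """One attention step: score keys against the query, softmax, mix values."""
    d = len(query)
    # Similarity of the query to each key (dot product, scaled by sqrt(d)).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    # Softmax: scores -> attention weights that sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Output is the weighted blend of the value vectors.
    output = [sum(w * v[i] for w, v in zip(weights, values))
              for i in range(len(values[0]))]
    return output, weights

# Tiny made-up 2-d vectors: "bank" as the query, "river" and "money" as context.
query = [1.0, 0.0]                  # "bank"
keys = [[0.9, 0.1], [0.0, 1.0]]     # "river", "money"
values = [[1.0, 0.0], [0.0, 1.0]]
output, weights = attention(query, keys, values)
print(weights)  # "river" gets more attention weight than "money" here
```

A real Transformer runs this for every word against every other word, in parallel, across many "heads" at once; that all-at-once view is what lets it disambiguate "bank" from context.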

<query_05>

The Token Economy

Apple = 4721

Computers don't read words; they do math. Words are chopped into Tokens and assigned IDs. The AI processes these number sequences to find patterns in the chaos.
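A minimal sketch of that chopping step: text chunks map to integer IDs via a vocabulary lookup. The vocabulary below is invented (including reusing the slide's 4721 for "Apple"); real tokenizers such as BPE learn subword pieces from data rather than whole words.

```python
# Hypothetical vocabulary: chunk -> integer ID. Real vocabularies hold ~100k
# learned subword pieces; these four entries are made up for illustration.
vocab = {"Apple": 4721, " is": 318, " red": 2266, "<unk>": 0}

def encode(chunks):
    # Unknown chunks fall back to a special <unk> ID.
    return [vocab.get(c, vocab["<unk>"]) for c in chunks]

def decode(ids):
    reverse = {i: c for c, i in vocab.items()}
    return "".join(reverse[i] for i in ids)

ids = encode(["Apple", " is", " red"])
print(ids)          # [4721, 318, 2266]
print(decode(ids))  # Apple is red
```

The model never sees "Apple"; it only ever sees 4721 and learns which number sequences tend to follow which.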

<query_06>

Hallucinations

Confidently Wrong

Because they predict patterns, not facts, LLMs can lie convincingly. If the most probable next word creates a fake fact, the AI will write it without hesitation.

<query_07>

Algorithmic Bias

Mirroring Us

AI eats what we feed it. If the internet data contains stereotypes (it does), the AI will reproduce them. It is not neutral; it is a reflection of its training data.

<query_08>

The Trust Factor

Verify, Don't Trust

Use LLMs as a "Smart Intern," not a "Senior Expert." They are incredible at summarizing and formatting, but dangerous if treated as the sole source of truth.

<query_09>

Ghost in the Shell?

No Feelings

It might say "I'm sad," but it's just predicting that "sad" is the word that usually follows "I am" in that context. It is a simulation, not a sentient being.

<query_10>

The ELIZA Effect

Human Psychology

It is natural to bond with things that talk back. But remember: it's a one-way street. The AI is a mirror reflecting your own empathy back at you using math.