
Qaskme Latest Questions

daniyasiddiqui (Community Pick)
Asked: 12/11/2025 · In: Technology

What role do tokenization and positional encoding play in LLMs?


Tags: deeplearning, llms, nlp, positionalencoding, tokenization, transformers
    1 Answer

daniyasiddiqui (Community Pick)
Added an answer on 12/11/2025 at 2:53 pm


      The World of Tokens

• Humans read sentences as words and meanings; an LLM can’t. It must first convert text into numerical form, because AI models only work with numbers, that is, mathematical vectors.
• Tokenization breaks a sentence into manageable pieces that the model can then turn into numbers.
• “AI is amazing” might turn into the tokens [“AI”, “ is”, “ amazing”],
• or sometimes even smaller pieces: [“A”, “I”, “ is”, “ ama”, “zing”].
• Each token is a small unit of meaning: a word, part of a word, or even punctuation, depending on how the tokenizer was trained.

      Each token gets a unique ID number, and these numbers are turned into embeddings, or mathematical representations of meaning.
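To make this concrete, here is a minimal sketch using the open-source tiktoken tokenizer (one of many; it assumes the package is installed, and other models split text differently, so the exact pieces and ID numbers will vary):

```python
# Minimal tokenization sketch with the open-source tiktoken library
# (assumes `pip install tiktoken`). Other LLMs use different
# tokenizers, so the exact splits and ID numbers will vary.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")   # a GPT-4-era encoding

text = "AI is amazing"
token_ids = enc.encode(text)                 # text -> list of integer IDs
tokens = [enc.decode([tid]) for tid in token_ids]  # IDs -> readable pieces

print(token_ids)   # a short list of unique ID numbers
print(tokens)      # e.g. ['AI', ' is', ' amazing']
```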

But There’s a Problem: Order Matters!

      Let’s say we have two sentences:

      • “The dog chased the cat.”
      • “The cat chased the dog.”

      They use the same words, but the order completely changes the meaning!

      A regular bag of tokens doesn’t tell the AI which word came first or last.

      That would be like giving somebody pieces of the puzzle and not indicating how to lay them out; they’d never see the picture.
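Here is a tiny plain-Python check of that claim, using the two sentences above: strip the order away and they become the same bag of tokens.

```python
# Once word order is discarded, the two sentences above are
# indistinguishable: same words, same counts, opposite meanings.
from collections import Counter

a = "The dog chased the cat.".lower().replace(".", "").split()
b = "The cat chased the dog.".lower().replace(".", "").split()

print(Counter(a) == Counter(b))   # True -> identical "bags" of tokens
```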

      So, how does the AI discern the word order?

      An Easy Analogy: Music Notes

Imagine a song. Each note, on its own, is just a sound.

Now, imagine playing the notes out of order: the music would make no sense!

      Positional encoding is like the sheet music, which tells the AI where each note (token) belongs in the rhythm of the sentence.

How the Model Uses These Positions

      Once tokens are labeled with their positions, the model combines both:

      • What the word means – token embedding
      • Where the word appears – positional encoding

      These two signals together permit the AI to:

      • Recognize relations between words: “who did what to whom”.
      • Predict the next word, based on both meaning and position.
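As a rough sketch of one classic recipe, the sinusoidal positional encoding from the original Transformer paper, here is how the two signals can be combined. The sizes are toy values and the embedding table is random, purely for illustration; real models learn their embeddings, and many use other encodings.

```python
# Sketch: sinusoidal positional encoding ("Attention Is All You Need")
# added to token embeddings. Toy sizes and a random embedding table,
# for illustration only; real models learn their embedding tables.
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    pos = np.arange(seq_len)[:, None]              # token positions
    i = np.arange(d_model // 2)[None, :]           # dimension pairs
    angles = pos / (10000 ** (2 * i / d_model))    # one angle per pair
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                   # even dims: sine
    pe[:, 1::2] = np.cos(angles)                   # odd dims: cosine
    return pe

rng = np.random.default_rng(0)
vocab_size, d_model = 100, 8
embedding_table = rng.normal(size=(vocab_size, d_model))

token_ids = [5, 17, 42]                            # what each word means
tok_emb = embedding_table[token_ids]               # (3, 8) meaning signal
pos_enc = positional_encoding(len(token_ids), d_model)  # where each appears

model_input = tok_emb + pos_enc   # both signals, one vector per token
print(model_input.shape)          # (3, 8)
```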

Why This Is Crucial for Understanding and Creativity

• Without tokenization, the model couldn’t read or understand words.
• Without positional encoding, the model couldn’t follow word order, and would lose much of the context and meaning.

      Put together, they represent the basis for how LLMs understand and generate human-like language.

• In stories, they help the AI track who said what and when.
• In poetry or dialogue, they provide rhythm, tone, and even logic.

This is why models like GPT or Gemini can write essays, summarize books, translate languages, and even generate code: they “see” text as an organized pattern of meaning and order, not just random strings of words.

How Modern LLMs Improve on This

Earlier models used fixed positional encodings, which limited them to short contexts (around 512 or 1,024 tokens).

Newer models (like GPT-4, Claude 3, Gemini 2.0, etc.) use rotary or relative positional embeddings, which let them process tens of thousands of tokens (entire books or multi-page documents) while still understanding how each sentence relates to the others.

      That’s why you can now paste a 100-page report or a long conversation, and the model still “remembers” what came before.
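For the curious, here is a simplified sketch of the rotary idea (RoPE): instead of adding a position vector, each (even, odd) pair of dimensions in a query or key vector is rotated by a position-dependent angle, so attention scores end up depending on relative positions. This illustrates the general technique, not any particular model’s implementation:

```python
# Simplified rotary positional embedding (RoPE) sketch: rotate each
# (even, odd) pair of dimensions by an angle that grows with the
# token's position. Dot products between rotated queries and keys
# then depend on relative position, which helps with long contexts.
import numpy as np

def rope(x: np.ndarray, pos: int) -> np.ndarray:
    d = x.shape[-1]
    i = np.arange(d // 2)
    theta = pos / (10000 ** (2 * i / d))    # per-pair rotation angles
    cos, sin = np.cos(theta), np.sin(theta)
    x_even, x_odd = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = x_even * cos - x_odd * sin  # standard 2-D rotation
    out[1::2] = x_even * sin + x_odd * cos
    return out

q = np.ones(8)
print(rope(q, pos=0))   # position 0: unchanged
print(rope(q, pos=5))   # position 5: same vector, rotated
```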

      Bringing It All Together

• Tokenization teaches the model what words are: “these are letters, this is a word, this group means something.”
• Positional encoding teaches it how to follow the order: “this comes first, this comes next, and that’s the conclusion.”
• With both, it can read a book, understand the story, and write one back to you, not because it feels emotions, but because it knows how meaning changes with position and context.

Final Thoughts

If you think of an LLM as a brain, then:

• Tokenization is like its eyes and ears: how it perceives words and converts them into signals.
• Positional encoding is like its sense of time and sequence: how it knows what came first, next, and last.

Together, they make language models capable of something almost magical: understanding human thought patterns through math and structure.
