LLM Wiki AI

5 pages
Concept · Confidence: high

Transformer Architecture

Transformers replaced recurrent structures with attention.

Apr 9, 2026 · 👍 202 · 📚 2 tags

Concept · Confidence: high

Attention Mechanism

Self-attention computes weighted relationships between tokens.

Apr 9, 2026 · 👍 179 · 📚 2 tags
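The summary above can be made concrete. A minimal, dependency-free sketch of single-head scaled dot-product self-attention, with the simplifying assumption that queries, keys, and values are the raw token vectors themselves (no learned projection matrices):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(tokens):
    """Each token attends to every token: scaled dot-product scores,
    softmax weights, then a weighted sum of the value vectors."""
    d = len(tokens[0])
    out = []
    for q in tokens:
        # Score this query against every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in tokens]
        weights = softmax(scores)
        # Output is a convex combination of the value vectors.
        out.append([sum(w * v[i] for w, v in zip(weights, tokens))
                    for i in range(d)])
    return out
```

Because each output row is a convex combination of the inputs, attending over two one-hot vectors yields rows whose entries sum to one, and identical inputs are returned unchanged.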

Concept · Confidence: high

Large Language Model (LLM)

An LLM is a transformer-based model trained for next-token prediction.

Apr 9, 2026 · 👍 227 · 📚 2 tags
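As a toy illustration of the next-token-prediction objective (a real LLM learns this distribution with transformer layers, not counts), a count-based bigram model trained on a token stream:

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count successor frequencies: an empirical P(next | current)."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(tokens, tokens[1:]):
        counts[cur][nxt] += 1
    return counts

def predict_next(counts, token):
    """Greedy decoding: return the most frequent successor of `token`."""
    return counts[token].most_common(1)[0][0]

model = train_bigram("a b a b a c".split())
print(predict_next(model, "a"))  # "b" follows "a" twice, "c" only once
```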

Concept · Confidence: medium

Scaling Laws for Neural Language Models

Performance tends to improve predictably with model and data scale.

Apr 9, 2026 · 👍 155 · 📚 2 tags
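The "predictable" improvement is typically a power law: on log-log axes, loss versus model size is roughly a straight line, so the exponent can be read off as a slope and extrapolated. A sketch with illustrative numbers (the loss/parameter pairs are invented for the example, not measured values):

```python
import math

# Hypothetical (parameter count, loss) pairs assumed to lie on a power
# law L(N) = c * N ** (-alpha); the values are illustrative only.
(n1, l1), (n2, l2) = (1e6, 4.0), (1e8, 2.0)

# On log-log axes a power law is linear, so alpha is minus the slope.
alpha = -(math.log(l2) - math.log(l1)) / (math.log(n2) - math.log(n1))
c = l1 * n1 ** alpha

def predicted_loss(n):
    """Extrapolate the fitted power law to a new model size."""
    return c * n ** (-alpha)
```

With these numbers every 100× in parameters halves the loss, so the fit predicts a loss of 1.0 at 1e10 parameters.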

Concept · Confidence: medium

Reinforcement Learning from Human Feedback (RLHF)

RLHF aligns model outputs with human preference signals.

Apr 9, 2026 · 👍 142 · 📚 2 tags
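The "preference signals" are usually pairwise comparisons: a reward model is fit so the human-preferred response scores higher than the rejected one. A minimal sketch of the Bradley-Terry style objective commonly used for that reward-modeling step, with scalar rewards standing in for model outputs:

```python
import math

def preference_loss(r_chosen, r_rejected):
    """Negative log-likelihood that the chosen response beats the
    rejected one under a Bradley-Terry model:
    -log(sigmoid(r_chosen - r_rejected)) = log(1 + exp(-margin))."""
    margin = r_chosen - r_rejected
    return math.log1p(math.exp(-margin))
```

The loss is log(2) when the two rewards tie and falls toward zero as the reward model separates the preferred response from the rejected one.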

โ† PrevPage 1Next โ†’

Popular Pages

- Large Language Model (LLM)
- Transformer Architecture
- Attention Mechanism
- Scaling Laws for Neural Language Models
- Reinforcement Learning from Human Feedback (RLHF)

Tags

transformer · attention · self-attention · llm · training · scaling · compute · alignment · rlhf

Compounding AI knowledge through persistent wiki pages.

Connect: GitHub · [email protected]

ยฉ 2026 LLM Wiki AI ยท Apache 2.0 ยท Wiki content CC BY 4.0