awesome_LLMs_interview_notes vs. llm_interview_note

These are **competitors**: both are curated note repositories for LLM engineer interview preparation covering similar technical domains (algorithms, applications, interview questions), so a learner would typically choose one comprehensive resource over the other rather than use both in tandem.

| Metric | awesome_LLMs_interview_notes | llm_interview_note |
|---|---|---|
| Maintenance | 0/25 | 2/25 |
| Adoption | 10/25 | 10/25 |
| Maturity | 16/25 | 8/25 |
| Community | 25/25 | 20/25 |
| Stars | 1,310 | 13,130 |
| Forks | 327 | 1,304 |
| Downloads | — | — |
| Commits (30d) | 0 | 0 |
| Language | — | HTML |
| License | MIT | — |
| Flags | Stale 6m, No Package, No Dependents | No License, Stale 6m, No Package, No Dependents |

About awesome_LLMs_interview_notes

jackaduma/awesome_LLMs_interview_notes

LLMs interview notes and answers: this repository mainly collects interview questions and reference answers for large language model (LLMs) algorithm engineers.

About llm_interview_note

wdndev/llm_interview_note

Mainly records knowledge and interview questions relevant to large language model (LLMs) algorithm (application) engineers.

The repository covers Transformer architecture fundamentals (attention mechanisms, positional encoding, tokenization) alongside practical model internals such as LLaMA and ChatGLM, with dedicated sections on distributed training strategies (data/tensor/pipeline parallelism), inference optimization via vLLM and quantization, and alignment techniques including RLHF and DPO. It also touches on frameworks like DeepSpeed and Megatron, while companion projects (tiny-llm-zh, tiny-rag, tiny-mcp) provide hands-on implementations of pretraining, RAG systems, and MCP-based agents on resource-constrained hardware.
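As a taste of the Transformer fundamentals both repositories cover, scaled dot-product attention can be sketched in a few lines of NumPy. This is a generic illustration of the standard formula, not code taken from either repository:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # (seq_q, seq_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over the key dimension
    return weights @ V                             # weighted sum of value vectors

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8): one output vector per query
```

Interview questions in these notes typically probe exactly these details: why the `1/sqrt(d_k)` scaling is needed, and why the softmax is taken over the keys.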

Scores updated daily from GitHub, PyPI, and npm data.