awesome_LLMs_interview_notes and llm_interview_note
These are **competitors**: both are curated note repositories for LLM engineer interview preparation covering similar technical domains (algorithms, applications, interview questions), so a learner would typically choose one comprehensive resource over the other rather than use both in tandem.
About awesome_LLMs_interview_notes
jackaduma/awesome_LLMs_interview_notes
LLMs interview notes and answers: this repository mainly collects interview questions and reference answers for large language model (LLM) algorithm engineers.
About llm_interview_note
wdndev/llm_interview_note
Mainly collects knowledge points and interview questions relevant to large language model (LLM) algorithm and application engineers.
Covers Transformer architecture fundamentals (attention mechanisms, positional encoding, tokenization) alongside practical implementations like LLaMA and ChatGLM model internals, with dedicated sections on distributed training strategies (data/tensor/pipeline parallelism), inference optimization via vLLM and quantization, and alignment techniques including RLHF and DPO. Integrates with frameworks like DeepSpeed and Megatron, while companion projects (tiny-llm-zh, tiny-rag, tiny-mcp) provide hands-on implementations for pretraining, RAG systems, and MCP-based agents on resource-constrained hardware.
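Since both repositories open with Transformer attention fundamentals, a minimal sketch of the core operation they cover, scaled dot-product attention, may help. This is a generic NumPy illustration of the standard formula softmax(QK^T / sqrt(d_k))V, not code taken from either repository:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # (seq_q, seq_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax over keys
    return weights @ V                             # attention-weighted sum of value vectors

# Toy example: 3 query/key/value vectors of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per query
```

The 1/sqrt(d_k) scaling keeps the dot products from saturating the softmax as the head dimension grows, which is the detail interview questions on attention most often probe.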