AI / ML Engineer
AI/ML engineers build systems that learn from data. This path covers the vocabulary for discussing model architectures, training pipelines, evaluation metrics, and the rapidly evolving language of LLMs, embeddings, and RAG — used daily in design reviews, paper discussions, and stakeholder updates.
Topics covered
- ML fundamentals
- LLMs & foundation models
- RAG & embeddings
- MLOps & deployment
- Model evaluation
- Responsible AI
Vocabulary spotlight
4 terms every AI / ML Engineer should know in English:
Hallucination
When an LLM generates plausible-sounding but factually incorrect information
"The model hallucinated a citation — always verify AI-generated references."
Fine-tuning
Adapting a pre-trained model to a specific task by training it on domain-specific data
"We fine-tuned GPT on our support tickets to improve its product knowledge."
Embeddings
Vector representations of text that capture semantic meaning for similarity search
"We store document embeddings in a vector database for semantic retrieval."
Context window
The maximum amount of text an LLM can process in a single inference call
"The 200k-token context window lets us pass the entire codebase to the model."
📚 Vocabulary Reference
Key terms organised by category for AI / ML Engineers:
ML Fundamentals
LLMs & Transformers
RAG & Retrieval
MLOps
Recommended exercises
Real-world scenarios you'll practise
- Explaining model evaluation metrics to a non-technical product manager
- Presenting RAG architecture trade-offs in a system design review
- Writing an ML model card for internal governance
- Discussing responsible AI concerns with a compliance officer
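For the first scenario, it helps to have the metrics themselves at your fingertips before translating them into plain English. A minimal sketch, using hypothetical spam-filter results: precision answers "of the items we flagged, how many were right?" and recall answers "of the items we should have flagged, how many did we catch?"

```python
def precision_recall(y_true, y_pred):
    """Compute precision and recall for binary labels (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Made-up example: 1 = spam, 0 = not spam.
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 0, 0, 1, 0, 0, 0]
p, r = precision_recall(y_true, y_pred)
```

In this example precision is 2/3 (one innocent email was flagged) and recall is 1/2 (two spam emails slipped through) — the kind of concrete trade-off a product manager can reason about.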
🎯 Interview questions specific to this role
Practise answering these questions out loud — or in writing. Each question targets a real interviewer concern for AI / ML Engineers.
- What is the difference between a transformer and an RNN?
- How do you evaluate the quality of an LLM-based application?
- What is retrieval-augmented generation and when would you use it?
- How do you detect and mitigate bias in a machine learning model?
- Walk me through your MLOps process from model training to production deployment.
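When answering the retrieval-augmented generation question, it can help to sketch the core loop: retrieve the most relevant documents, then ground the model's answer in them. This toy version uses word overlap as a stand-in for embedding similarity search and builds a prompt string rather than calling a real LLM; the document texts are invented for illustration.

```python
def retrieve(query, docs, k=2):
    """Rank documents by word overlap with the query -- a crude stand-in
    for cosine similarity against a vector database."""
    q = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    """Assemble a grounded prompt so the model answers from retrieved context."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

docs = [
    "Refunds are processed within 5 business days.",
    "Our API rate limit is 100 requests per minute.",
    "Support is available 24/7 via chat.",
]
prompt = build_prompt("What is the API rate limit?", docs)
```

The production version swaps the overlap scorer for embedding search, but the shape of the answer — retrieve, then generate from the retrieved context — is the same.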