Quickstart
This section provides a quickstart example for creating an AI Agent with Llama Stack.
Prerequisites
- Python 3.12 or higher (if not satisfied, refer to FAQ: How to prepare Python 3.12 in Notebook)
- Llama Stack Server installed and running via the Operator (see Install Llama Stack), with `VLLM_URL` pointing at a vLLM-served model endpoint (see install notes)
- Access to a Notebook environment (e.g., Jupyter Notebook, JupyterLab)
- Python environment with `llama-stack-client`, `fastmcp` (for the MCP section), and the other notebook dependencies installed
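The client-side dependencies from the list above can be installed with pip; a minimal sketch, assuming the interpreter backing your notebook kernel is on `PATH` as `python3`:

```shell
# Install the notebook's client-side dependencies into the active
# Python environment (use the full path to your Python 3.12 interpreter
# if it is not the default -- see the FAQ below).
python3 -m pip install llama-stack-client fastmcp
```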
Quickstart Example
A simple example of creating an AI Agent with Llama Stack is available in the following resources:
- Notebook: Llama Stack Quick Start Demo
Download the notebook and upload it to a Notebook environment to run.
The notebook demonstrates:
- Two tool options: client-side tools (`@client_tool`) and MCP tools (FastMCP + `toolgroups.register`)
- Shared agent flow: connect to the Llama Stack Server, select a model, create an `Agent` with `tools=AGENT_TOOLS`, then run sessions and streaming turns
- Streaming responses and event logging
- Optional FastAPI deployment of the agent
FAQ
How to prepare Python 3.12 in Notebook
1. Download the pre-compiled Python installation package.
2. Extract the package.
3. Install it and register the kernel.
4. Switch the kernel in the notebook page:
- Open your Notebook environment (e.g., Jupyter Notebook or JupyterLab) in the browser, then open an existing notebook or create a new one.
- In the notebook interface, find the current kernel name (usually shown in the top-right corner of the page, e.g., "Python 3" or "python3").
- Click that kernel name, or use the menu Kernel → Change Kernel.
- In the kernel list, select "Python 3.12" (the display name registered in step 3).
- After switching, new cells will run with Python 3.12.
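Steps 1-3 above can be sketched as shell commands. The download URL and the `~/python312` install location are placeholders; substitute the actual package URL and a directory you can write to. The `ipykernel install` display name must match the name you later pick in the kernel list.

```shell
# 1) Download the pre-compiled Python 3.12 build (URL is a placeholder).
wget https://example.com/python-3.12.tar.gz -O /tmp/python-3.12.tar.gz

# 2) Extract it to a writable directory.
mkdir -p ~/python312
tar -xzf /tmp/python-3.12.tar.gz -C ~/python312 --strip-components=1

# 3) Install ipykernel with the new interpreter and register it as a
#    notebook kernel with the display name "Python 3.12".
~/python312/bin/python3.12 -m pip install ipykernel
~/python312/bin/python3.12 -m ipykernel install --user \
    --name python312 --display-name "Python 3.12"
```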
Note: `python` and `pip` commands executed directly in a notebook cell still use the default interpreter, regardless of the selected kernel. To run the Python 3.12 installation from a cell, invoke it by its full path.
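For example, assuming the placeholder install location `~/python312` from the FAQ steps (in a notebook cell, prefix each command with `!`):

```shell
# Bare "python"/"pip" would hit the default interpreter; use the full
# path to reach the Python 3.12 installation instead.
~/python312/bin/python3.12 --version
~/python312/bin/python3.12 -m pip install llama-stack-client
```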
Additional Resources
For more resources on developing AI Agents with Llama Stack, see:
- Llama Stack Documentation - The official Llama Stack documentation covering all usage-related topics, API providers, and core concepts.
- Llama Stack Core Concepts - Deep dive into Llama Stack architecture, API stability, and resource management.
- Llama Stack GitHub Repository - Source code, example applications, distribution configurations, and how to add new API providers.
- Llama Stack Example Apps - Official examples demonstrating how to use Llama Stack in various scenarios.