Code

cookbook/models/ollama/basic_stream.py
from typing import Iterator  # noqa
from agno.agent import Agent, RunResponse  # noqa
from agno.models.ollama import Ollama

agent = Agent(model=Ollama(id="llama3.1:8b"), markdown=True)

# Get the response in a variable
# run_response: Iterator[RunResponse] = agent.run("Share a 2 sentence horror story", stream=True)
# for chunk in run_response:
#     print(chunk.content)

# Print the response in the terminal
agent.print_response("Share a 2 sentence horror story", stream=True)
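
If you want to work with the streamed output instead of printing it directly, the commented-out lines above can be expanded into a small loop that collects the chunks. A minimal sketch, assuming each RunResponse chunk carries the streamed text on its content attribute, as in the commented example:

from typing import Iterator

from agno.agent import Agent, RunResponse
from agno.models.ollama import Ollama

agent = Agent(model=Ollama(id="llama3.1:8b"), markdown=True)

# Run with stream=True to get an iterator of RunResponse chunks,
# then assemble the chunk contents into one string.
run_response: Iterator[RunResponse] = agent.run("Share a 2 sentence horror story", stream=True)
full_text = ""
for chunk in run_response:
    if chunk.content is not None:
        full_text += chunk.content
print(full_text)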

Usage

1. Create a virtual environment

Open the Terminal and create a Python virtual environment:

python3 -m venv .venv
source .venv/bin/activate
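
On Windows the activation command differs; a commonly used equivalent (shown here as an assumption about your shell) is:

.venv\Scripts\activate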
2. Install Ollama

Follow the Ollama installation guide, then run:

ollama pull llama3.1:8b
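
To confirm the model was downloaded, list the locally available models:

ollama list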
3. Install libraries

pip install -U ollama agno
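
As an optional sanity check that both packages are importable in the active environment:

python -c "import agno, ollama; print('ok')"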
4. Run the Agent

python cookbook/models/ollama/basic_stream.py
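
The script assumes a local Ollama server is reachable. If the Ollama desktop app is not already running, start the server first:

ollama serve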