# Quickstart
## Install
```shell
pip install staso
```

To upgrade to the latest version:

```shell
pip install -U staso
```

Python 3.11+ is required.
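If you're unsure which interpreter you're on, you can check the 3.11 floor before installing (standard library only):

```python
import sys

# staso requires Python 3.11 or newer.
ok = sys.version_info >= (3, 11)
print("Python 3.11+:", ok)
```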
## Initialize
Get your API key from the Staso dashboard, then:
```python
import staso as st

st.init(
    api_key="ak_...",
    agent_id="my-agent",
    environment="production",
)
```

## Add Decorators
Put `@st.agent` on your agent's entry point. Put `@st.tool` on its tools. Done.
```python
@st.agent(name="support-agent")
def handle_request(user_message: str) -> str:
    context = search_faq(user_message)
    return generate_response(context, user_message)

@st.tool(name="search_faq")
def search_faq(query: str) -> str:
    return db.search(query)
```

Every call to `handle_request` now shows up on your dashboard — which tools ran, what they returned, how long each step took, and any errors.
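The snippet above references a `db` handle and a `generate_response` helper that aren't defined in this quickstart. A minimal stand-in — purely illustrative, not part of the SDK — so the example runs locally:

```python
class FakeDB:
    """Illustrative stand-in for the FAQ store behind search_faq."""

    def __init__(self, entries: dict[str, str]) -> None:
        self.entries = entries

    def search(self, query: str) -> str:
        # Naive keyword match; a real store would use full-text search.
        for keyword, answer in self.entries.items():
            if keyword in query.lower():
                return answer
        return "No matching FAQ entry."

db = FakeDB({"password": "Go to Settings > Security > Reset Password."})

def generate_response(context: str, user_message: str) -> str:
    # Stand-in for an LLM call: echoes the retrieved context.
    return f"Based on our FAQ: {context}"

print(generate_response(db.search("How do I reset my password?"), ""))
# → Based on our FAQ: Go to Settings > Security > Reset Password.
```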
## Group by Conversation
Tie multiple agent runs into a conversation:
```python
with st.conversation("conversation-123"):
    handle_request("How do I reset my password?")
    handle_request("That didn't work")
```

## Shut Down
Flush remaining traces before your process exits:
```python
st.shutdown()
```

## Full Example
```python
import staso as st

st.init(api_key="ak_...", agent_id="support-agent", environment="production")
st.integrations.patch_anthropic()  # auto-trace Anthropic calls

@st.tool(name="search_faq")
def search_faq(query: str) -> str:
    return "To reset your password, go to Settings > Security > Reset Password."

@st.agent(name="support-agent")
def run_agent(user_message: str) -> str:
    import anthropic

    faq = search_faq(user_message)
    client = anthropic.Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        messages=[{"role": "user", "content": f"Context: {faq}\n\nQuestion: {user_message}"}],
    )
    return response.content[0].text

with st.conversation("customer-jane-001"):
    print(run_agent("How do I reset my password?"))

st.shutdown()
```

On the dashboard you'll see:
- support-agent (agent) — total duration, status
- search_faq (tool) — input query, returned result
- anthropic.messages.create (llm) — model, token usage, latency
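For reference, the prompt that `run_agent` assembles is plain f-string formatting; for the example inputs above it resolves to:

```python
faq = "To reset your password, go to Settings > Security > Reset Password."
user_message = "How do I reset my password?"

prompt = f"Context: {faq}\n\nQuestion: {user_message}"
print(prompt)
```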
## Next
- Tracing your code — `@agent`, `@tool`, `@trace`
- Conversations — Conversation grouping
- LLM integrations — Auto-instrument Anthropic, OpenAI
- Claude Code — Trace Claude Code CLI conversations
- Configuration — All options