Overview
AI Agent Observability provides full visibility into how the Tasks Management AI Agent processes requests. Using LangSmith integration, operations teams can monitor performance, trace decision-making, and quickly identify issues.
Key Benefit: Understand exactly what the AI “thinks” when processing task requests, enabling faster troubleshooting and continuous improvement.
This feature adds monitoring capabilities only. The AI agent continues to work exactly as before with no changes to existing functionality.
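This is also why enabling observability requires no code changes to the agent itself: if the agent is instrumented with the LangSmith SDK, tracing is typically switched on through configuration. A minimal sketch in Python, assuming the standard LangChain/LangSmith environment variables (the API key value is a placeholder):

```python
import os

# Tracing is additive: the agent's request handling does not change,
# runs are simply recorded and forwarded to LangSmith.
os.environ["LANGCHAIN_TRACING_V2"] = "true"                       # turn tracing on
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"      # placeholder credential
os.environ["LANGCHAIN_PROJECT"] = "Tasks Management Agent - Dev"  # target dashboard/project
```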
What You Can See
Request Processing
Full visibility into how the AI agent processes each task request from start to finish
Decision Transparency
See exactly what the AI “thinks” when filling out task forms, including category and service selection
Performance Metrics
Response times, success rates, and throughput metrics for monitoring agent health
Error Tracking
Quickly identify and resolve issues with detailed error logs and stack traces
Benefits
Faster Troubleshooting
When issues occur with AI-assisted task creation, observability data helps identify (a query sketch follows this list):
- What input the AI received
- How the AI interpreted the request
- Where in the process an error occurred
- What caused unexpected behavior
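As a rough illustration of how such an investigation might start with the LangSmith Python SDK (the project name matches the Environments table further down; treat this as a sketch rather than the team's actual tooling):

```python
from datetime import datetime, timedelta
from langsmith import Client

client = Client()  # reads the LangSmith API key from the environment

# Root runs from the last 24 hours that ended in an error.
failed_runs = client.list_runs(
    project_name="Tasks Management Agent - Prod",
    is_root=True,
    error=True,
    start_time=datetime.now() - timedelta(days=1),
)

for run in failed_runs:
    print(run.id, run.name)
    print("  input:", run.inputs)  # what the AI received
    print("  error:", run.error)   # where the process failed
```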
Better Understanding of AI Decision-Making
See the reasoning behind AI choices (a simplified sketch follows this list):
- Why a specific category was selected
- How priority levels are determined
- What factors influenced service selection
- How schedule suggestions are generated
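This level of transparency comes from each reasoning step being recorded as its own child run. A simplified sketch, assuming the agent exposes its decisions as separate traced functions (the matching logic below is illustrative, not the production agent's implementation):

```python
from langsmith import traceable

@traceable(run_type="chain", name="select_category")
def select_category(request_text: str) -> str:
    # Illustrative keyword match; the real agent's logic may differ.
    return "Maintenance" if "repair" in request_text.lower() else "General"

@traceable(run_type="chain", name="determine_priority")
def determine_priority(request_text: str) -> str:
    return "High" if "urgent" in request_text.lower() else "Normal"

@traceable(run_type="chain", name="fill_task_form")
def fill_task_form(request_text: str) -> dict:
    # Each call below appears as a child run with its own inputs and outputs,
    # which is what makes "why was this category selected?" answerable
    # from the dashboard.
    return {
        "category": select_category(request_text),
        "priority": determine_priority(request_text),
    }
```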
Performance Monitoring
Track key metrics to ensure optimal performance (a reporting sketch follows this list):
- Average response times
- Success/failure rates
- Request volume trends
- Resource utilization
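LangSmith's built-in monitoring charts surface these numbers directly; for custom reporting they can also be pulled through the SDK. A hedged sketch:

```python
from datetime import datetime, timedelta
from langsmith import Client

client = Client()
runs = list(client.list_runs(
    project_name="Tasks Management Agent - Prod",
    is_root=True,
    start_time=datetime.now() - timedelta(days=1),
))

if runs:
    failures = [r for r in runs if r.error]
    finished = [r for r in runs if r.end_time is not None]
    print(f"requests (24h): {len(runs)}")
    print(f"success rate:   {1 - len(failures) / len(runs):.1%}")
    if finished:
        avg = sum(
            (r.end_time - r.start_time).total_seconds() for r in finished
        ) / len(finished)
        print(f"avg duration:   {avg:.2f}s")
```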
Environments
The AI agent is monitored in both development and production environments:

| Environment | Dashboard Name | Purpose |
|---|---|---|
| Development | Tasks Management Agent - Dev | Testing and debugging new features |
| Production | Tasks Management Agent - Prod | Monitoring live user interactions |
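Traces land in whichever dashboard matches the tracing project configured for that deployment. A minimal sketch of the routing, assuming an environment variable selects the deployment (the ENVIRONMENT flag is illustrative; only LANGCHAIN_PROJECT is a LangSmith setting):

```python
import os

# Hypothetical deployment flag set by the hosting environment.
environment = os.environ.get("ENVIRONMENT", "development")

# Route traces to the dashboard that matches the deployment.
os.environ["LANGCHAIN_PROJECT"] = (
    "Tasks Management Agent - Prod"
    if environment == "production"
    else "Tasks Management Agent - Dev"
)
```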
LangSmith Dashboard
Access the LangSmith observability dashboard
How It Works
Trace Flow
- User initiates request — User describes a task in the AI-assisted form
- Agent receives input — The request is logged and traced
- Processing begins — Each step of AI reasoning is recorded
- Decisions are made — Category, service, priority selections are logged
- Response returned — Final output and timing are captured
- Metrics updated — Performance data is aggregated
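A condensed sketch of how that flow can look in code when the agent's entry point is wrapped with LangSmith's traceable decorator. The function names, scheduling logic, and metadata keys are illustrative; inputs, nested steps, outputs, timing, and errors are captured by the decorator:

```python
from langsmith import traceable

@traceable(run_type="chain", name="suggest_schedule")
def suggest_schedule(request_text: str) -> str:
    # Step 4: an individual decision, recorded as a child run.
    return "tomorrow 09:00" if "asap" in request_text.lower() else "next available slot"

@traceable(run_type="chain", name="handle_task_request")
def handle_task_request(request_text: str) -> dict:
    # Steps 2-5: the input is logged, reasoning steps appear as nested runs,
    # and the returned dict is captured as the run's output along with total
    # duration. Step 6 (metric aggregation) happens inside LangSmith.
    return {"schedule": suggest_schedule(request_text)}

# Step 1: the user's description arrives from the AI-assisted form.
# Context such as facility and user role can be attached as run metadata.
handle_task_request(
    "Replace the broken bed rail in room 12, asap",
    langsmith_extra={"metadata": {"facility": "North Wing", "user_role": "nurse"}},
)
```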
What Gets Traced
Input Processing
- Raw user input text
- Parsed entities and keywords
- Context information (facility, user role)
- Request timestamp
AI Reasoning
- Category matching logic
- Service selection criteria
- Priority determination factors
- Schedule suggestion reasoning
Output Generation
- Selected category and service
- Assigned priority level
- Suggested schedule
- Confidence scores (when available)
Performance Data
- Total request duration
- Individual step timings
- Token usage (for LLM calls)
- Memory utilization
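Timings and token counts are available on the run records themselves once the underlying LLM calls are traced. A sketch of reading them back through the SDK (field availability depends on how the model calls are instrumented):

```python
from datetime import datetime, timedelta
from langsmith import Client

client = Client()
for run in client.list_runs(
    project_name="Tasks Management Agent - Prod",
    run_type="llm",
    start_time=datetime.now() - timedelta(hours=1),
):
    duration = (
        (run.end_time - run.start_time).total_seconds() if run.end_time else None
    )
    # Token fields are populated when the traced LLM call reports usage.
    print(run.name, duration, run.prompt_tokens, run.completion_tokens, run.total_tokens)
```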
Errors and Exceptions
- Error messages and stack traces
- Failed step identification
- Retry attempts
- Fallback activations
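Exceptions raised inside a traced function are attached to that run automatically (message and stack trace), which is what makes the failed step identifiable. A sketch of the pattern, with an illustrative fallback step that is also visible in the trace:

```python
from langsmith import traceable

@traceable(run_type="chain", name="select_service")
def select_service(request_text: str) -> str:
    # If this raises, the child run is marked failed and the exception
    # message and stack trace are recorded on it.
    raise ValueError("no matching service found")

@traceable(run_type="chain", name="select_service_with_fallback")
def select_service_with_fallback(request_text: str) -> str:
    try:
        return select_service(request_text)
    except ValueError:
        # The failed child run remains in the trace, so fallback
        # activations are visible per request.
        return "General Maintenance"
```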
Use Cases
Investigating User-Reported Issues
When a user reports that the AI made an incorrect suggestion (a lookup sketch follows these steps):
- Find the specific request in LangSmith
- Review the input the AI received
- Trace the decision-making steps
- Identify where the logic diverged
- Determine if it’s a training issue or edge case
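Once the request's run ID is known (copied from the LangSmith UI or captured at call time), the trace can also be pulled programmatically. A hedged sketch:

```python
from langsmith import Client

client = Client()

# "<run-id>" is a placeholder for the trace identified in the LangSmith UI.
run = client.read_run("<run-id>", load_child_runs=True)

print("input: ", run.inputs)    # what the AI received
print("output:", run.outputs)   # what it finally suggested
print("error: ", run.error)     # populated if the request failed

# Walk the decision-making steps to see where the logic diverged.
for child in run.child_runs or []:
    print(child.name, child.inputs, child.outputs, child.error)
```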
Monitoring System Health
Regular health checks using observability data:
- Review daily success rates
- Check average response times
- Identify any error spikes
- Monitor resource utilization trends
Improving AI Performance
Use trace data to enhance the AI (a dataset-building sketch follows this list):
- Identify common misclassifications
- Find patterns in failed requests
- Discover edge cases for training
- Measure impact of model updates
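One common pattern, sketched here with the LangSmith SDK (the dataset name and run IDs are illustrative), is to collect problematic traces into a dataset that can later be used for evaluation or to measure the impact of model updates:

```python
from langsmith import Client

client = Client()
dataset = client.create_dataset(
    dataset_name="tasks-agent-misclassifications",  # illustrative name
    description="Requests where the suggested category was corrected by a user",
)

# These IDs would come from reviewing traces in the LangSmith UI.
for run_id in ["<run-id-1>", "<run-id-2>"]:
    run = client.read_run(run_id)
    client.create_example(
        inputs=run.inputs,
        outputs=run.outputs,  # or the corrected values, once known
        dataset_id=dataset.id,
    )
```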
Best Practices
Access Requirements
Access to the LangSmith dashboard requires appropriate permissions. Contact your system administrator if you need access to:
- View traces and logs
- Access performance dashboards
- Configure alerts and notifications
FAQ
Does this change how the AI works?
No. Observability is purely monitoring—it records what the AI does without changing any functionality. The AI agent continues to work exactly as before.
Is user data stored in traces?
Traces may include task request content for debugging purposes. All data is handled in accordance with AllCare’s privacy and security policies.
Who can access the observability dashboard?
Access is restricted to authorized AllCare operations and engineering staff. Contact your administrator for access requests.
How long are traces retained?
Trace retention follows LangSmith’s default policies. Contact the engineering team for specific retention periods.