
Model: data/AISettings

System-wide AI and LLM settings. Controls LLM summarization parameters and LangSmith diagnostics tracing.

Model Details: data/AISettings

Fields
Name *
  Name of this AI settings instance. Default: Global
  • Field Name: name
  • Type: String
  • Default: Global
  • MaxLength: 1024
Summarization Timeout (seconds)
  Maximum time in seconds to wait for an LLM summarization call before timing out. Default: 300
  • Field Name: summarization_timeout
  • Type: Integer
  • Minimum: 30
  • Maximum: 900
  • Default: 300
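
How this timeout might be enforced around a summarization call can be sketched as follows. This is a minimal illustration, not the actual implementation; `call_llm` is a hypothetical async LLM client function, and the use of `asyncio.wait_for` is an assumption about how the timeout is applied.

```python
import asyncio

SUMMARIZATION_TIMEOUT = 300  # seconds; the settings default


async def summarize_with_timeout(call_llm, prompt: str,
                                 timeout: int = SUMMARIZATION_TIMEOUT) -> str:
    """Run a (hypothetical) async LLM call, cancelling it after `timeout` seconds."""
    try:
        return await asyncio.wait_for(call_llm(prompt), timeout=timeout)
    except asyncio.TimeoutError:
        raise TimeoutError(f"LLM summarization exceeded {timeout}s") from None
```

Raising a plain `TimeoutError` lets callers distinguish a slow model from a failed one and retry or skip summarization accordingly.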
Max Summary Length (characters)
  Maximum number of characters for a generated LLM summary before truncation. Default: 2000
  • Field Name: max_summary_length
  • Type: Integer
  • Minimum: 500
  • Maximum: 10000
  • Default: 2000
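
The truncation described above might look like the sketch below. The exact cut behavior (hard cut with an ellipsis marker) is an assumption; the source only states that summaries are truncated at this length.

```python
MAX_SUMMARY_LENGTH = 2000  # characters; the settings default


def truncate_summary(summary: str, limit: int = MAX_SUMMARY_LENGTH) -> str:
    """Cap a generated summary at `limit` characters, marking the cut with an ellipsis."""
    if len(summary) <= limit:
        return summary
    marker = "..."
    return summary[: limit - len(marker)] + marker
```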
Max Input Tokens
  Maximum estimated token budget for the summarization LLM input prompt. Large detection outputs are compressed or capped to fit within this limit. Token counts use the chars/4 approximation. Default: 100000
  • Field Name: max_input_tokens
  • Type: Integer
  • Minimum: 10000
  • Maximum: 500000
  • Default: 100000
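
The chars/4 token estimate mentioned in the field description can be sketched as below. Note the source says oversized inputs are "compressed or capped"; this sketch only shows the capping path, and the compression strategy is not specified here.

```python
MAX_INPUT_TOKENS = 100_000  # the settings default


def estimate_tokens(text: str) -> int:
    """Estimate token count using the chars/4 approximation from the field description."""
    return len(text) // 4


def cap_to_budget(text: str, max_tokens: int = MAX_INPUT_TOKENS) -> str:
    """Truncate `text` so its estimated token count fits within the input budget."""
    max_chars = max_tokens * 4
    return text if len(text) <= max_chars else text[:max_chars]
```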
Diagnostics
  LangSmith tracing and diagnostics settings for LLM observability.
  • Field Name: diagnostics
  • Type: Object
Enable LangSmith Tracing
  Enable LangSmith tracing for LLM calls. Do not enable in production unless actively debugging.
  • Field Name: diagnostics.langsmith_tracing_enabled
  • Type: Boolean
LangSmith Project Name
  Project name in LangSmith for organizing traces.
  • Field Name: diagnostics.langsmith_project
  • Type: String
  • MaxLength: 1024
LangSmith Workspace ID
  Workspace ID in LangSmith for organizing traces.
  • Field Name: diagnostics.langsmith_workspace_id
  • Type: String
  • MaxLength: 1024
LangSmith API Key
  API key for authenticating with LangSmith. Stored encrypted.
  • Field Name: diagnostics.langsmith_api_key
  • Type: String
  • Is Password: True
  • Store Encrypted: True
  • MaxLength: 1024
Hide Inputs
  When enabled, LLM input payloads are not sent to LangSmith. Recommended for production to protect sensitive data. Default: True
  • Field Name: diagnostics.langsmith_hide_inputs
  • Type: Boolean
  • Default: True
Hide Outputs
  When enabled, LLM output payloads are not sent to LangSmith. Recommended for production to protect sensitive data. Default: True
  • Field Name: diagnostics.langsmith_hide_outputs
  • Type: Boolean
  • Default: True
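
One plausible way the diagnostics fields feed into LangSmith is via the environment variables the LangSmith SDK conventionally reads. This is a sketch only: the environment-variable names below are assumptions about the LangSmith SDK's configuration surface, not something this settings model documents, and this model may wire the values through differently.

```python
def langsmith_env(diagnostics: dict) -> dict:
    """Map the diagnostics settings to (assumed) LangSmith environment variables.

    Returns an empty dict when tracing is disabled, so nothing is exported.
    """
    env: dict = {}
    if not diagnostics.get("langsmith_tracing_enabled"):
        return env
    env["LANGSMITH_TRACING"] = "true"
    if diagnostics.get("langsmith_project"):
        env["LANGSMITH_PROJECT"] = diagnostics["langsmith_project"]
    if diagnostics.get("langsmith_workspace_id"):
        env["LANGSMITH_WORKSPACE_ID"] = diagnostics["langsmith_workspace_id"]
    if diagnostics.get("langsmith_api_key"):
        env["LANGSMITH_API_KEY"] = diagnostics["langsmith_api_key"]
    # Hide flags default to True, matching the settings defaults above.
    if diagnostics.get("langsmith_hide_inputs", True):
        env["LANGSMITH_HIDE_INPUTS"] = "true"
    if diagnostics.get("langsmith_hide_outputs", True):
        env["LANGSMITH_HIDE_OUTPUTS"] = "true"
    return env
```

Keeping the hide flags on by default means payloads stay out of traces unless an operator explicitly opts in, which matches the production guidance in the field descriptions.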