memoryscope.scheme.model_response

pydantic model memoryscope.scheme.model_response.ModelResponse [source]

Bases: BaseModel

JSON schema:
{
   "title": "ModelResponse",
   "type": "object",
   "properties": {
      "message": {
         "anyOf": [
            {
               "$ref": "#/$defs/Message"
            },
            {
               "type": "null"
            }
         ],
         "default": null,
         "description": "generation model result"
      },
      "delta": {
         "default": "",
         "description": "New text that just streamed in (only used when streaming)",
         "title": "Delta",
         "type": "string"
      },
      "embedding_results": {
         "anyOf": [
            {
               "items": {
                  "items": {
                     "type": "number"
                  },
                  "type": "array"
               },
               "type": "array"
            },
            {
               "items": {
                  "type": "number"
               },
               "type": "array"
            }
         ],
         "default": [],
         "description": "embedding vector",
         "title": "Embedding Results"
      },
      "rank_scores": {
         "additionalProperties": {
            "type": "number"
         },
         "default": {},
         "description": "The rank score of each document. key: document index, value: rank score",
         "title": "Rank Scores",
         "type": "object"
      },
      "m_type": {
         "$ref": "#/$defs/ModelEnum",
         "default": "generation_model",
         "description": "One of LLM, EMB, RANK."
      },
      "status": {
         "default": true,
         "description": "Indicates whether the model call was successful.",
         "title": "Status",
         "type": "boolean"
      },
      "details": {
         "default": "",
         "description": "Detailed information about the model call, usually used to store the raw response or failure messages.",
         "title": "Details",
         "type": "string"
      },
      "raw": {
         "default": "",
         "description": "Raw response from model call",
         "title": "Raw"
      },
      "meta_data": {
         "default": {},
         "description": "meta data for model response",
         "title": "Meta Data",
         "type": "object"
      }
   },
   "$defs": {
      "Message": {
         "description": "Represents a structured message object with details about the sender, content, and metadata.\n\nAttributes:\n    role (str): The role of the message sender (e.g., 'user', 'assistant', 'system').\n    role_name (str): Optional name associated with the role of the message sender.\n    content (str): The actual content or text of the message.\n    time_created (int): Timestamp indicating when the message was created.\n    memorized (bool): Flag to indicate if the message has been saved or remembered.\n    meta_data (Dict[str, str]): Additional data or context attached to the message.",
         "properties": {
            "role": {
               "description": "The role of the message sender (user, assistant, system)",
               "title": "Role",
               "type": "string"
            },
            "role_name": {
               "default": "",
               "description": "Name describing the role of the message sender",
               "title": "Role Name",
               "type": "string"
            },
            "content": {
               "description": "The primary content of the message",
               "title": "Content",
               "type": "string"
            },
            "time_created": {
               "description": "Timestamp marking the message creation time",
               "title": "Time Created",
               "type": "integer"
            },
            "memorized": {
               "default": false,
               "description": "Indicates if the message is flagged for memory retention",
               "title": "Memorized",
               "type": "boolean"
            },
            "meta_data": {
               "additionalProperties": {
                  "type": "string"
               },
               "default": {},
               "description": "Supplementary data attached to the message",
               "title": "Meta Data",
               "type": "object"
            }
         },
         "required": [
            "role",
            "content"
         ],
         "title": "Message",
         "type": "object"
      },
      "ModelEnum": {
         "description": "An enumeration representing different types of models used within the system.\n\nMembers:\n    GENERATION_MODEL: Represents a model responsible for generating content.\n    EMBEDDING_MODEL: Represents a model tasked with creating embeddings, typically used for transforming data into a\n        numerical form suitable for machine learning tasks.\n    RANK_MODEL: Denotes a model that specializes in ranking, often used to order items based on relevance.",
         "enum": [
            "generation_model",
            "embedding_model",
            "rank_model"
         ],
         "title": "ModelEnum",
         "type": "string"
      }
   }
}

Fields:
field message: Message | None = None

Result returned by the generation model.

field delta: str = ''

New text that just streamed in (only used when streaming)
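During streaming, each response chunk carries only the newly generated text in `delta`, so the full output is recovered by concatenating deltas in order. A minimal sketch of that accumulation (the `Chunk` dataclass and `accumulate` helper are illustrative stand-ins for streamed `ModelResponse` objects, not part of memoryscope):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Chunk:
    # Stand-in for a streamed ModelResponse: only the delta field matters here.
    delta: str = ""

def accumulate(chunks: List[Chunk]) -> str:
    """Concatenate the delta of each streamed chunk into the full text."""
    text = ""
    for chunk in chunks:
        text += chunk.delta
    return text

stream = [Chunk(delta="Hel"), Chunk(delta="lo, "), Chunk(delta="world")]
print(accumulate(stream))  # Hello, world
```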

field embedding_results: List[List[float]] | List[float] = []

The embedding vector(s): either a single vector (List[float]) or a batch of vectors (List[List[float]]).
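Because `embedding_results` may hold either a single vector or a batch of vectors, downstream code typically needs to normalize it to one shape first. A minimal sketch (the `as_batch` helper is illustrative, not part of memoryscope):

```python
from typing import List, Union

def as_batch(embedding_results: Union[List[float], List[List[float]]]) -> List[List[float]]:
    """Normalize embedding_results to a batch shape, List[List[float]].

    A flat list of numbers (a single embedding vector) is wrapped into a
    one-element batch; a list of vectors is returned as-is.
    """
    if embedding_results and isinstance(embedding_results[0], (int, float)):
        return [list(embedding_results)]
    return [list(vec) for vec in embedding_results]

print(as_batch([0.1, 0.2, 0.3]))   # [[0.1, 0.2, 0.3]]
print(as_batch([[0.1], [0.2]]))    # [[0.1], [0.2]]
```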

field rank_scores: Dict[int, float] = {}

The rank score of each document. key: document index, value: rank score.
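Since `rank_scores` maps each document's index to its score, reordering the original documents by relevance is a small dictionary sort. A minimal sketch (the `rank_order` helper is illustrative, not part of memoryscope):

```python
from typing import Dict, List

def rank_order(documents: List[str], rank_scores: Dict[int, float]) -> List[str]:
    """Return documents ordered by descending rank score.

    rank_scores mirrors the Dict[int, float] shape of ModelResponse.rank_scores:
    each key is an index into `documents`, each value is that document's score.
    """
    indexed = sorted(rank_scores.items(), key=lambda kv: kv[1], reverse=True)
    return [documents[i] for i, _score in indexed]

docs = ["apple", "banana", "cherry"]
print(rank_order(docs, {0: 0.2, 1: 0.9, 2: 0.5}))  # ['banana', 'cherry', 'apple']
```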

field m_type: ModelEnum = ModelEnum.GENERATION_MODEL

The model type: GENERATION_MODEL (LLM), EMBEDDING_MODEL (EMB), or RANK_MODEL (RANK).

field status: bool = True

Indicates whether the model call was successful.

field details: str = ''

Detailed information about the model call, usually used to store the raw response or failure messages.

field raw: Any = ''

The raw response from the model call.

field meta_data: Dict[str, Any] = {}

Metadata attached to the model response.
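Putting the fields together: `status` flags success, `details` holds failure information, and `m_type` tells the caller which payload field is populated. A minimal plain-Python sketch of that dispatch, assuming a dataclass stand-in mirroring the fields and defaults above (the `ModelResponseSketch` class and `unwrap` helper are hypothetical, not part of memoryscope; the real class is a pydantic BaseModel that also validates types):

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class ModelResponseSketch:
    # Mirrors the fields and defaults of ModelResponse documented above.
    message: Any = None
    delta: str = ""
    embedding_results: List = field(default_factory=list)
    rank_scores: Dict[int, float] = field(default_factory=dict)
    m_type: str = "generation_model"
    status: bool = True
    details: str = ""
    raw: Any = ""
    meta_data: Dict[str, Any] = field(default_factory=dict)

def unwrap(resp: ModelResponseSketch) -> Any:
    """Return the payload matching the response's model type, or raise on failure."""
    if not resp.status:
        raise RuntimeError(f"model call failed: {resp.details}")
    if resp.m_type == "embedding_model":
        return resp.embedding_results
    if resp.m_type == "rank_model":
        return resp.rank_scores
    return resp.message  # generation_model

ok = ModelResponseSketch(m_type="rank_model", rank_scores={0: 0.7})
print(unwrap(ok))  # {0: 0.7}
```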