memoryscope.core.worker

class memoryscope.core.worker.BaseWorker(name: str, context: ~typing.Dict[str, ~typing.Any], memoryscope_context: ~memoryscope.core.utils.singleton.singleton.<locals>._singleton, context_lock=None, raise_exception: bool = True, is_multi_thread: bool = False, thread_pool: ~concurrent.futures.thread.ThreadPoolExecutor | None = None, **kwargs)[source]

Bases: object

BaseWorker is an abstract class that defines a worker with common functionalities for managing tasks and context in both asynchronous and multi-thread environments.

__init__(name: str, context: ~typing.Dict[str, ~typing.Any], memoryscope_context: ~memoryscope.core.utils.singleton.singleton.<locals>._singleton, context_lock=None, raise_exception: bool = True, is_multi_thread: bool = False, thread_pool: ~concurrent.futures.thread.ThreadPoolExecutor | None = None, **kwargs)[source]

Initializes the BaseWorker with the provided parameters.

Parameters:
  • name (str) – The name of the worker.

  • context (Dict[str, Any]) – Shared context dictionary.

  • memoryscope_context – The global MemoryScope context singleton shared across workers.

  • context_lock (optional) – Lock for synchronizing access to the context in multithread mode.

  • raise_exception (bool, optional) – Flag to control whether exceptions should be raised.

  • is_multi_thread (bool, optional) – Flag indicating if the worker operates in multithread mode.

  • thread_pool (ThreadPoolExecutor, optional) – Thread pool executor for managing multithread tasks.

  • kwargs – Additional keyword arguments.

submit_async_task(fn, *args, **kwargs)[source]

Submits an asynchronous task to the worker.

Parameters:
  • fn (callable) – The function to be executed.

  • args – Positional arguments for the function.

  • kwargs – Keyword arguments for the function.

Raises:

RuntimeError – If called in multithread mode.

gather_async_result()[source]

Executes all asynchronous tasks and gathers their results.

Returns:

A list of results from the asynchronous tasks.

Raises:

RuntimeError – If called in multithread mode.

submit_thread_task(fn, *args, **kwargs)[source]

Submits a task to be executed in a separate thread.

Parameters:
  • fn (callable) – The function to be executed.

  • args – Positional arguments for the function.

  • kwargs – Keyword arguments for the function.

gather_thread_result()[source]

Gathers results of all submitted multithread tasks.

Yields:

The result of each completed task.
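Because gather_thread_result yields results as tasks complete, completion order is not guaranteed to match submission order. A hedged sketch of this pattern using the standard library (assuming an injected thread pool, as in BaseWorker's constructor):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed


class ThreadTaskSketch:
    """Hypothetical sketch of the submit_thread_task / gather_thread_result pattern."""

    def __init__(self, thread_pool):
        self.thread_pool = thread_pool
        self.futures = []

    def submit_thread_task(self, fn, *args, **kwargs):
        # Schedule the call on the shared pool immediately.
        self.futures.append(self.thread_pool.submit(fn, *args, **kwargs))

    def gather_thread_result(self):
        # Yield each result as its task finishes (completion order, not submit order).
        for future in as_completed(self.futures):
            yield future.result()
        self.futures.clear()


with ThreadPoolExecutor(max_workers=2) as pool:
    worker = ThreadTaskSketch(pool)
    for i in range(3):
        worker.submit_thread_task(lambda x: x * x, i)
    print(sorted(worker.gather_thread_result()))  # [0, 1, 4]
```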

run()[source]

Executes the worker’s main logic and manages execution flow and exception handling.

Uses a Timer to log the execution time of the worker.

get_workflow_context(key: str, default=None)[source]

Retrieves a value from the shared context.

Parameters:
  • key (str) – The key for the context value.

  • default (optional) – Default value if the key is not found.

Returns:

The value from the context or the default value.

set_workflow_context(key: str, value: Any)[source]

Sets a value in the shared context.

Parameters:
  • key (str) – The key for the context value.

  • value (Any) – The value to be set.

has_content(key: str)[source]

Checks if the context contains a specific key.

Parameters:

key (str) – The key to check in the context.

Returns:

True if the key is in the context, otherwise False.

Return type:

bool
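Taken together, the three context methods form a small lock-guarded key-value interface over the shared dictionary passed to the constructor. A sketch of how such access might be guarded in multithread mode (the lock-handling details are an assumption, not MemoryScope's verbatim code):

```python
import threading


class ContextSketch:
    """Hypothetical sketch of the shared-context accessors described above."""

    def __init__(self, context, context_lock=None):
        self.context = context
        self.context_lock = context_lock  # only needed in multithread mode

    def set_workflow_context(self, key, value):
        # Guard writes with the lock when one is provided.
        if self.context_lock:
            with self.context_lock:
                self.context[key] = value
        else:
            self.context[key] = value

    def get_workflow_context(self, key, default=None):
        return self.context.get(key, default)

    def has_content(self, key):
        return key in self.context


worker = ContextSketch({}, context_lock=threading.Lock())
worker.set_workflow_context("retrieved_nodes", [1, 2, 3])
print(worker.has_content("retrieved_nodes"))            # True
print(worker.get_workflow_context("missing", "n/a"))    # n/a
```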

class memoryscope.core.worker.DummyWorker(embedding_model: str = '', generation_model: str = '', rank_model: str = '', **kwargs)[source]

Bases: MemoryBaseWorker

class memoryscope.core.worker.MemoryBaseWorker(embedding_model: str = '', generation_model: str = '', rank_model: str = '', **kwargs)[source]

Bases: BaseWorker

FILE_PATH: str = '/home/runner/work/MemoryScope/MemoryScope/memoryscope/core/worker/memory_base_worker.py'
__init__(embedding_model: str = '', generation_model: str = '', rank_model: str = '', **kwargs)[source]

Initializes the MemoryBaseWorker with specified models and configurations.

Parameters:
  • embedding_model (str) – Identifier or instance of the embedding model used for transforming text.

  • generation_model (str) – Identifier or instance of the text generation model.

  • rank_model (str) – Identifier or instance of the ranking model used to sort retrieved memories by semantic similarity.

  • **kwargs – Additional keyword arguments passed to the parent class initializer.

The constructor also initializes key attributes related to memory store, monitoring, user and target identification, and a prompt handler, setting them up for later use.

property chat_messages: List[List[Message]]

Property to get the chat messages.

Returns:

List of chat message lists.

Return type:

List[List[Message]]

property chat_messages_scatter: List[Message]

Property to get the chat messages as a single flattened list.

Returns:

List of chat messages.

Return type:

List[Message]

property chat_kwargs: Dict[str, Any]

Retrieves the chat keyword arguments from the context.

This property getter fetches the chat-related parameters stored in the context, which are used to configure how chat interactions are handled.

Returns:

A dictionary containing the chat keyword arguments.

Return type:

Dict[str, Any]

property user_name: str
property target_name: str
property workflow_name: str
property language: LanguageEnum
property embedding_model: BaseModel

Property to get the embedding model. If the model is currently stored as a string, it will be replaced with the actual model instance from the global context’s model dictionary.

Returns:

The embedding model used for converting text into vector representations.

Return type:

BaseModel
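The embedding_model, generation_model, and rank_model properties all follow the same resolve-on-first-access pattern: a string identifier is swapped for the registered instance from the global context's model dictionary. A hedged sketch of that pattern (the registry name and lookup are assumptions for illustration):

```python
class ModelResolvingSketch:
    """Hypothetical sketch of the string -> model-instance resolution described above."""

    def __init__(self, embedding_model, model_registry):
        self._embedding_model = embedding_model
        self._model_registry = model_registry  # stands in for the global context's model dict

    @property
    def embedding_model(self):
        # On first access, replace the string identifier with the registered instance;
        # subsequent accesses return the cached instance directly.
        if isinstance(self._embedding_model, str):
            self._embedding_model = self._model_registry[self._embedding_model]
        return self._embedding_model


registry = {"mock_embedder": object()}
worker = ModelResolvingSketch("mock_embedder", registry)
print(worker.embedding_model is registry["mock_embedder"])  # True
```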

property generation_model: BaseModel

Property to access the generation model. If the model is stored as a string, it retrieves the actual model instance from the global context’s model dictionary.

Returns:

The model used for text generation.

Return type:

BaseModel

property rank_model: BaseModel

Property to access the rank model. If the stored rank model is a string, it fetches the actual model instance from the global context’s model dictionary before returning it.

Returns:

The rank model instance used for ranking tasks.

Return type:

BaseModel

property memory_store: BaseMemoryStore

Property to access the memory vector store. If not initialized, it fetches the global memory store.

Returns:

The memory store instance used for inserting, updating, retrieving and deleting operations.

Return type:

BaseMemoryStore

property monitor: BaseMonitor

Property to access the monitoring component. If not initialized, it fetches the global monitor.

Returns:

The monitoring component instance.

Return type:

BaseMonitor

property prompt_handler: PromptHandler

Lazily initializes and returns the PromptHandler instance.

Returns:

An instance of PromptHandler initialized with specific file path and keyword arguments.

Return type:

PromptHandler
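"Lazily initializes" here means the handler is constructed on first property access and then cached. A minimal sketch of that idiom, with a dict standing in for the real PromptHandler (whose constructor arguments are not shown in this reference):

```python
class PromptWorkerSketch:
    """Hypothetical sketch of the lazy prompt_handler property."""

    def __init__(self, file_path, **kwargs):
        self.file_path = file_path
        self.kwargs = kwargs
        self._prompt_handler = None
        self.init_count = 0  # instrumentation to show single initialization

    @property
    def prompt_handler(self):
        # Construct once on first access, then reuse the cached instance.
        if self._prompt_handler is None:
            self.init_count += 1
            self._prompt_handler = {"file_path": self.file_path, **self.kwargs}
        return self._prompt_handler


w = PromptWorkerSketch("memory_base_worker.py", language="en")
_ = w.prompt_handler
_ = w.prompt_handler
print(w.init_count)  # 1
```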

property memory_manager: MemoryManager

Lazily initializes and returns the MemoryManager instance.

Returns:

An instance of MemoryManager.

Return type:

MemoryManager

get_language_value(languages: dict | List[dict]) Any | List[Any][source]

Retrieves the value(s) corresponding to the current language context.

Parameters:

languages (dict | list[dict]) – A dictionary or list of dictionaries containing language-keyed values.

Returns:

The value or list of values matching the current language setting.

Return type:

Any | list[Any]
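A plausible re-implementation of this lookup, assuming the current language is a key such as "en" or "cn" in each dictionary (the key names are an assumption for illustration):

```python
from typing import Any, Dict, List, Union


def get_language_value(languages: Union[Dict, List[Dict]],
                       language: str = "en") -> Union[Any, List[Any]]:
    # Pick the value keyed by the current language, either from a single
    # dict or from each dict in a list.
    if isinstance(languages, list):
        return [d[language] for d in languages]
    return languages[language]


prompts = [{"en": "Hello", "cn": "你好"}, {"en": "Bye", "cn": "再见"}]
print(get_language_value(prompts))                     # ['Hello', 'Bye']
print(get_language_value(prompts[0], language="cn"))   # 你好
```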

prompt_to_msg(system_prompt: str, few_shot: str, user_query: str, concat_system_prompt: bool = True) List[Message][source]

Converts input strings into a structured list of message objects suitable for AI interactions.

Parameters:
  • system_prompt (str) – The system-level instruction or context.

  • few_shot (str) – An example or demonstration input, often used for illustrating expected behavior.

  • user_query (str) – The actual user query or prompt to be processed.

  • concat_system_prompt (bool) – Whether to additionally concatenate the system prompt into the user message, a simple trick that improves effectiveness for some LLMs. Defaults to True.

Returns:

A list of Message objects, each representing a part of the conversation setup.

Return type:

List[Message]
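A sketch of how such a conversion could work, with a dataclass standing in for MemoryScope's Message class (the exact message layout and concatenation format are assumptions):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Message:
    """Hypothetical stand-in for memoryscope's Message class."""
    role: str
    content: str


def prompt_to_msg(system_prompt: str, few_shot: str, user_query: str,
                  concat_system_prompt: bool = True) -> List[Message]:
    # With concat_system_prompt=True, the system text is repeated inside the
    # user message, which helps some LLMs follow instructions.
    user_content = few_shot + "\n" + user_query
    if concat_system_prompt:
        user_content = system_prompt + "\n" + user_content
    return [Message(role="system", content=system_prompt),
            Message(role="user", content=user_content)]


msgs = prompt_to_msg("You extract memories.", "Example: ...", "What did I say?")
print([m.role for m in msgs])  # ['system', 'user']
```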

class memoryscope.core.worker.MemoryManager(memoryscope_context: ~memoryscope.core.utils.singleton.singleton.<locals>._singleton, workerflow_name: str = 'default_worker')[source]

Bases: object

The MemoryManager class manages memory nodes via the memory store.

__init__(memoryscope_context: ~memoryscope.core.utils.singleton.singleton.<locals>._singleton, workerflow_name: str = 'default_worker')[source]
property memory_store: BaseMemoryStore

Property to access the memory store. If not initialized, it fetches the memory store from the global context.

Returns:

The memory store instance associated with this worker.

Return type:

BaseMemoryStore

clear()[source]

Clear all memory nodes cached, reset the class instance.

add_memories(key: str, nodes: MemoryNode | List[MemoryNode], log_repeat: bool = True)[source]

Add the memories.

Parameters:
  • key (str) – The key mapping to memory nodes.

  • nodes (List[MemoryNode]) – A single memory node or a list of memory nodes to be added.

  • log_repeat (bool) – Log duplicated memory node or not.

set_memories(key: str, nodes: MemoryNode | List[MemoryNode], log_repeat: bool = True)[source]

Set the memories in ‘_id_memory_dict’ and ‘_key_id_dict’.

Parameters:
  • key (str) – The key mapping to memory nodes.

  • nodes (List[MemoryNode]) – A single memory node or a list of memory nodes to be updated.

  • log_repeat (bool) – If True, log duplicated memory nodes.

get_memories(keys: str | List[str]) List[MemoryNode][source]

Fetch the memories by keys.

Parameters:

keys (str | List[str]) – The key mapping to memory nodes.

Returns:

Memories mapped to the key.

Return type:

List[MemoryNode]
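The ‘_id_memory_dict’ and ‘_key_id_dict’ internals mentioned under set_memories suggest two-level bookkeeping: keys map to memory IDs, and IDs map to nodes. A hedged sketch of that structure (the MemoryNode stand-in and dict shapes are assumptions for illustration):

```python
import uuid
from dataclasses import dataclass, field
from typing import List, Union


@dataclass
class MemoryNode:
    """Minimal stand-in for memoryscope's MemoryNode."""
    content: str
    memory_id: str = field(default_factory=lambda: uuid.uuid4().hex)


class MemoryDictSketch:
    """Hypothetical sketch of the key -> id -> node bookkeeping described above."""

    def __init__(self):
        self._id_memory_dict = {}   # memory_id -> MemoryNode
        self._key_id_dict = {}      # key -> [memory_id, ...]

    def add_memories(self, key: str, nodes: Union[MemoryNode, List[MemoryNode]]):
        if isinstance(nodes, MemoryNode):
            nodes = [nodes]
        ids = self._key_id_dict.setdefault(key, [])
        for node in nodes:
            self._id_memory_dict[node.memory_id] = node
            if node.memory_id not in ids:  # skip duplicate registrations
                ids.append(node.memory_id)

    def get_memories(self, keys: Union[str, List[str]]) -> List[MemoryNode]:
        if isinstance(keys, str):
            keys = [keys]
        return [self._id_memory_dict[_id]
                for key in keys
                for _id in self._key_id_dict.get(key, [])]


store = MemoryDictSketch()
store.add_memories("observation", [MemoryNode("likes tea"), MemoryNode("lives in Paris")])
print([n.content for n in store.get_memories("observation")])
# ['likes tea', 'lives in Paris']
```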

delete_memories(nodes: MemoryNode | List[MemoryNode], key: str | None = None)[source]

Delete the memories.

Parameters:
  • nodes (List[MemoryNode]) – A single memory node or a list of memory nodes to be deleted.

  • key (str, optional) – The key mapping to memory nodes.

update_memories(keys: str = '', nodes: MemoryNode | List[MemoryNode] | None = None) dict[source]

Update the memories.

Parameters:
  • keys (str) – The key(s) mapping to the memory nodes to be updated.

  • nodes (List[MemoryNode]) – A single memory node or a list of memory nodes to be updated.