trinity.explorer package#
Subpackages#
Submodules#
- trinity.explorer.explorer module
  - Explorer
    - Explorer.__init__()
    - Explorer.setup_weight_sync_group()
    - Explorer.prepare()
    - Explorer.get_weight()
    - Explorer.explore()
    - Explorer.explore_step()
    - Explorer.need_sync()
    - Explorer.need_eval()
    - Explorer.eval()
    - Explorer.benchmark()
    - Explorer.save_checkpoint()
    - Explorer.sync_weight()
    - Explorer.shutdown()
    - Explorer.is_alive()
    - Explorer.serve()
    - Explorer.get_actor()
- trinity.explorer.explorer_client module
- trinity.explorer.scheduler module
- trinity.explorer.workflow_runner module
Module contents#
- class trinity.explorer.Explorer(config: Config)[source]#
Bases: object
Responsible for exploring the taskset.
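A minimal construction sketch is shown below; only the `Explorer(config: Config)` signature comes from this page, and how the `Config` object is built or loaded is assumed to happen elsewhere.

```python
# Illustrative sketch only: constructing an Explorer from an existing Config.
# Building/loading the Config itself is outside the scope of this page.
from trinity.explorer import Explorer


def build_explorer(config) -> Explorer:
    # `config` is assumed to be a fully populated trinity Config object.
    return Explorer(config)
```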
- async setup_weight_sync_group(master_address: str, master_port: int, state_dict_meta: List | None = None)[source]#
- async get_weight(name: str) → Tensor[source]#
Get the weight of the loaded model (For checkpoint weights update).
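A hedged usage sketch follows; the parameter name and the surrounding async scaffolding are illustrative assumptions, while only the `async get_weight(name: str) → Tensor` signature is documented above.

```python
import asyncio


async def fetch_one_weight(explorer):
    # `explorer` is assumed to be a prepared trinity.explorer.Explorer instance.
    # The parameter name below is hypothetical; real names depend on the loaded model.
    tensor = await explorer.get_weight("model.embed_tokens.weight")
    print(tensor.shape, tensor.dtype)
    return tensor

# asyncio.run(fetch_one_weight(explorer))  # requires a prepared Explorer
```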
- async explore() → str[source]#
The timeline of the exploration process:

```text
         | <------------------------------ one period ------------------------------> |
explorer | <- step_1 -> | <- step_2 -> | ... | <- step_n -> | <-- eval --> | <- sync -> |
         |-------------------------------------------------------------------------------|
trainer  | <-- idle --> | <- step_1 -> | <- step_2 -> | ... | <- step_n -> | <- sync -> |
```
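In practice, one invocation of `explore()` drives the loop pictured above; the sketch below shows how a driver might await it. Only `async explore() → str` is documented here; the scaffolding around it is assumed.

```python
import asyncio


async def run_exploration(explorer):
    # explore() runs the exploration loop shown in the timeline above
    # (rollout steps, periodic eval, and weight sync with the trainer)
    # and returns a string; its exact content is not documented on this page.
    status = await explorer.explore()
    return status

# asyncio.run(run_exploration(explorer))  # `explorer` must be a prepared Explorer
```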
- async serve() → None[source]#
Run the explorer in serving mode.
In serving mode, the explorer starts an OpenAI-compatible server to handle requests. Agent applications can be deployed separately and interact with the explorer via the API, for example:
```python
import openai

client = openai.OpenAI(
    base_url=f"{explorer_server_url}/v1",
    api_key="EMPTY",
)
response = client.chat.completions.create(
    model=config.model.model_path,
    messages=[{"role": "user", "content": "Hello!"}],
)
```
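For completeness, a minimal server-side sketch is given below; only `async serve() → None` comes from the API above, and how the Explorer instance is created and prepared is assumed to happen elsewhere.

```python
import asyncio


async def serve_forever(explorer):
    # Starts the OpenAI-compatible server (documented `async serve() -> None`).
    # Agent applications then interact with it using client code like the
    # snippet above.
    await explorer.serve()

# asyncio.run(serve_forever(explorer))
```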