trinity.explorer package#
Subpackages#
Submodules#
- trinity.explorer.explorer module
  - Explorer
    - Explorer.__init__()
    - Explorer.setup_weight_sync_group()
    - Explorer.prepare()
    - Explorer.get_weight()
    - Explorer.explore()
    - Explorer.explore_step()
    - Explorer.need_sync()
    - Explorer.need_eval()
    - Explorer.eval()
    - Explorer.benchmark()
    - Explorer.save_checkpoint()
    - Explorer.sync_weight()
    - Explorer.shutdown()
    - Explorer.is_alive()
    - Explorer.serve()
    - Explorer.get_actor()
- trinity.explorer.explorer_client module
- trinity.explorer.scheduler module
- trinity.explorer.workflow_runner module
Module contents#
- class trinity.explorer.Explorer(config: Config)[source]#
Bases: object
Responsible for exploring the taskset.
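A minimal lifecycle sketch, assuming a prepared Config instance (construction not shown) and assuming that prepare(), explore(), and shutdown() from the method list above are awaitable like the other coroutine methods on this page; it illustrates the intended call order rather than a documented entry point:
```python
import asyncio

from trinity.explorer import Explorer


async def run_explorer(config) -> None:
    # `config` is a trinity Config instance; how it is built is out of scope here.
    explorer = Explorer(config)
    await explorer.prepare()    # prepare the explorer before running
    await explorer.explore()    # run the explore/eval/sync loop described under explore()
    await explorer.shutdown()   # release resources once exploration finishes


# asyncio.run(run_explorer(config))
```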
- async setup_weight_sync_group(master_address: str, master_port: int, state_dict_meta: List | None = None)[source]#
- async get_weight(name: str) → Tensor [source]#
Get a named weight tensor from the loaded model (used for checkpoint-based weight updates).
- async explore() → str [source]#
The timeline of the exploration process:
```
         | <------------------------------ one period ------------------------------> |
explorer | <---------- step_1 ---------->  |                                          |
         | <---------- step_2 ---------->  |                                          |
         ...
         | <---------- step_n ---------->  |                                          |
         | <---------------------- eval ---------------------->      |  <-- sync -->  |
|---------------------------------------------------------------------------------------------|
trainer  | <-- idle --> | <-- step_1 --> | <-- step_2 --> | ... | <-- step_n --> | <-- sync --> |
```
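As an illustration of the period structure above, the sketch below strings together the per-step, eval, and sync methods listed for Explorer; the exact signatures, their arguments, and whether need_eval() and need_sync() are synchronous are assumptions here, not the actual implementation.
```python
async def run_one_period(explorer, steps_per_period: int) -> None:
    # Rollout steps of the current period.
    for _ in range(steps_per_period):
        await explorer.explore_step()

    # Evaluate if this period is scheduled for evaluation.
    if explorer.need_eval():
        await explorer.eval()

    # Pull the latest trainer weights before the next period starts.
    if explorer.need_sync():
        await explorer.sync_weight()
```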
- async serve() → None [source]#
Run the explorer in serving mode.
In serving mode, the explorer starts an OpenAI-compatible server to handle requests. Agent applications can be deployed separately and interact with the explorer via this API.
```python
import openai

client = openai.OpenAI(
    base_url=f"{explorer_server_url}/v1",
    api_key="EMPTY",
)
response = client.chat.completions.create(
    model=config.model.model_path,
    messages=[{"role": "user", "content": "Hello!"}],
)
```
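In this snippet, `explorer_server_url` stands for the address of the explorer's OpenAI-compatible endpoint, and the `model` field should match the model being served; both depend on your deployment configuration.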