This is the official repository of the paper: Tool-Planner: Dynamic Solution Tree Planning for Large Language Model with Tool Clustering
Set up your environment with the required dependencies:

```bash
conda create --name tool-planner python=3.11 -y
conda activate tool-planner
pip install -r requirement.txt
```
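Optionally, verify that the core dependencies import correctly. The package list below is an assumption based on how this repository uses SimCSE (torch/transformers) and the OpenAI API; adjust it to whatever requirement.txt actually installs.

```python
# Hypothetical sanity check: the package list is an assumption, not taken from requirement.txt.
import importlib

for pkg in ["torch", "transformers", "openai", "requests"]:
    try:
        module = importlib.import_module(pkg)
        print(f"{pkg}: {getattr(module, '__version__', 'ok')}")
    except ImportError as exc:
        print(f"{pkg}: MISSING ({exc})")
```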
Use the following links to download the ToolBench dataset: Google Drive or Tsinghua Cloud. It is used to extract the tool-calling methods and descriptions from RapidAPI and to complete the subsequent setup of the toolkit.

- Put the downloaded files in the main directory:

```bash
mv your_path/datas Tool-Planner
```

- Make sure you have access to the ToolBench API. Refer to ToolBench to apply for permission.
Download the SimCSE model to your local directory:

```bash
mv your_path/sup-simcse-roberta-base Tool-Planner/model
```

Or load it from Hugging Face:

```python
tokenizer = AutoTokenizer.from_pretrained("princeton-nlp/sup-simcse-roberta-base")
model = AutoModel.from_pretrained("princeton-nlp/sup-simcse-roberta-base")
```
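For reference, SimCSE embeddings are what the pipeline uses to cluster tools by the similarity of their descriptions. The snippet below is a minimal illustrative sketch of computing such embeddings and a cosine similarity between two hypothetical tool descriptions; it is not the exact code used in this repository.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("princeton-nlp/sup-simcse-roberta-base")
model = AutoModel.from_pretrained("princeton-nlp/sup-simcse-roberta-base")

# Two hypothetical tool descriptions to compare.
texts = [
    "Get the current weather for a given city.",
    "Look up the weather forecast by location.",
]

inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    # SimCSE uses the pooled [CLS] representation as the sentence embedding.
    embeddings = model(**inputs).pooler_output

similarity = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {similarity.item():.3f}")
```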
Use your ChatGPT API key and ToolBench API key, replace the API keys in run.sh, and run the following script:

```bash
bash script/run.sh
```

To obtain the corresponding result information for your query, change the number of toolkits and adjust the file location of the input query:

```bash
export TOOLBENCH_KEY=""
export OPENAI_KEY=""
export PYTHONPATH=./
python main.py \
--toolkit_dir src/toolkits \
--tool_api_dir datas/toolenv/tools/ \
--backbone_model gpt_3.5 \
--toolkit_num 20 \
--openai_key $OPENAI_KEY \
--tool_env datas/toolenv/tools/ \
--tool_output_file tool_lib/my_tool_library.json \
--toolkit_output_file tool_lib/my_toolkit_library.json \
--input_query_file data/instruction/G3_query.json \
--output_answer_file data/instruction/tool_result.json \
--simcse_file model_lib/sup-simcse-roberta-base \
--toolbench_key $TOOLBENCH_KEY
```

In tool_lib, we provide an example toolkit corresponding to the generated toolkit contents; set your own output files with --tool_output_file and --toolkit_output_file.
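After a run finishes, you can sanity-check the generated tool and toolkit libraries. The snippet below is a minimal sketch that only assumes the output files are valid JSON and reuses the paths from the command above; the exact schema depends on your run.

```python
import json

# Paths match --tool_output_file and --toolkit_output_file above.
paths = {
    "tools": "tool_lib/my_tool_library.json",
    "toolkits": "tool_lib/my_toolkit_library.json",
}

for name, path in paths.items():
    with open(path, "r", encoding="utf-8") as f:
        data = json.load(f)
    # Report the top-level container type and size without assuming a schema.
    size = len(data) if hasattr(data, "__len__") else "n/a"
    print(f"{name}: {type(data).__name__} with {size} entries")
```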
- Experiments demonstrate that our method achieves competitive performance and efficient configurations across different approaches.
[25/01/23] 🔥 Our paper was accepted at ICLR 2025!

[24/06/09] 🔥 We have released version 1.0.0 of Tool-Planner.
If you use this codebase, or if Tool-Planner inspires your work, we would greatly appreciate it if you could star the repository and cite it using the following BibTeX entry:
```bibtex
@inproceedings{
liu2025toolplanner,
title={Tool-Planner: Task Planning with Clusters across Multiple Tools},
author={Yanming Liu and Xinyue Peng and Jiannan Cao and Shi Bo and Yuwei Zhang and Xuhong Zhang and Sheng Cheng and Xun Wang and Jianwei Yin and Tianyu Du},
booktitle={The Thirteenth International Conference on Learning Representations},
year={2025},
url={https://openreview.net/forum?id=dRz3cizftU}
}
```
We thank the ZJU TRAIL Lab for their assistance in extending Tool-Planner!


