
Instruct LLAMA integration, more informative logging (metrics, API cost estimation, customization)

Released by @martiansideofthemoon on 06 Jun, 17:43

This update to our pip package adds a number of features requested by our beta users:

  • You can now use Instruct LLAMA in addition to ChatGPT! Set --model_name retrieval+llama+npm to use it. The factscore.download_data script has also been updated to install Instruct LLAMA when --llama_7B_HF_path is set, as shown in the README (see the example after this list).

  • We have added a lot more information to our logging, such as alternative metrics (respond ratio, number of generated facts) and an estimate of OpenAI API cost. We have also moved to the logging package and added a few flags (--print_rate_limit_error and --verbose) for a nicer logging experience.

  • Cache locations are now more customizable via the --model_dir and --data_dir flags.
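
For reference, here is a minimal command-line sketch of the new options. It assumes the python -m factscore.download_data and python -m factscore.factscorer entry points and the --input_path flag from the README; the HuggingFace model path and input file below are placeholders, and additional flags (e.g. --openai_key) may be needed depending on the estimator you choose.

```bash
# 1) Download the data, and also install Instruct LLAMA when a HuggingFace
#    path is provided (the path value here is only a placeholder).
python -m factscore.download_data --llama_7B_HF_path "huggyllama/llama-7b"

# 2) Score generations with the new Instruct LLAMA estimator.
#    --data_dir and --model_dir redirect the caches introduced in this release,
#    and --verbose enables the richer logging output.
python -m factscore.factscorer \
    --input_path "data/unlabeled/InstructGPT.jsonl" \
    --model_name "retrieval+llama+npm" \
    --data_dir ".cache/factscore" \
    --model_dir ".cache/factscore" \
    --verbose
```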