Address sporadic hanging of evals on certain samples (#1482)
As has been brought up before (#1384, #1292, #270), evals suffers from a hanging
issue, where an evaluation run will hang for a very long time (if not
indefinitely) at the end of a run (say, on the 99th sample out of 100).

This PR addresses this issue by replacing a seemingly redundant
single-threaded thread creation that happened on every request, nested
inside the already multi-threaded eval loop. My impression is that this
nested multithreading was causing overhead that resulted in the observed
hanging.
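To illustrate the nesting described above, here is a minimal sketch (the outer loop, worker count, and function names are simplified assumptions, not the actual evals code): the eval loop already fans samples out across a thread pool, and the old `request_with_timeout` then created a second, single-worker executor for every request inside each of those worker threads.

import concurrent.futures

def run_eval(samples, eval_fn, n_threads=10):
    # Outer pool: the already multi-threaded eval loop, one task per sample.
    with concurrent.futures.ThreadPoolExecutor(max_workers=n_threads) as pool:
        return list(pool.map(eval_fn, samples))

def old_request_with_timeout(func, *args, timeout=40, **kwargs):
    # Old pattern (simplified): each request creates yet another single-worker
    # executor, nested inside one of the outer pool's worker threads.
    while True:
        with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
            future = executor.submit(func, *args, **kwargs)
            try:
                return future.result(timeout=timeout)
            except concurrent.futures.TimeoutError:
                # Note: leaving the with-block still waits for the in-flight
                # call to finish (shutdown(wait=True)), so a timed-out request
                # is never truly abandoned before retrying.
                continue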

I had also noticed this hanging issue in `EVALS_SEQUENTIAL=1` mode
(where it no longer occurs at the end, but instead randomly in the
middle of the run).

I was able to identify the source of this issue through debugging print
statements that ultimately pointed to the `request_with_timeout`
function as the culprit.

We have tested the new `request_with_timeout` code on a fork where we
have run multiple new and pre-existing evals, including with third-party
solvers, and found no change in behaviour or errors, and a clear
improvement on the hanging issue.
thesofakillers authored Mar 25, 2024
1 parent 5805c20 commit bfe3925
Showing 1 changed file with 6 additions and 9 deletions.
15 changes: 6 additions & 9 deletions evals/utils/api_utils.py
@@ -1,7 +1,6 @@
 """
 This file defines various helper functions for interacting with the OpenAI API.
 """
-import concurrent
 import logging
 import os

@@ -38,16 +37,14 @@ def openai_completion_create_retrying(client: OpenAI, *args, **kwargs):

 def request_with_timeout(func, *args, timeout=EVALS_THREAD_TIMEOUT, **kwargs):
     """
-    Worker thread for making a single request within allotted time.
+    Function for making a single request within allotted time.
     """
     while True:
-        with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
-            future = executor.submit(func, *args, **kwargs)
-            try:
-                result = future.result(timeout=timeout)
-                return result
-            except concurrent.futures.TimeoutError:
-                continue
+        try:
+            result = func(*args, timeout=timeout, **kwargs)
+            return result
+        except openai.APITimeoutError as e:
+            continue
 
 
 @backoff.on_exception(
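For context, a hypothetical caller-side sketch of the new code path (the model name and prompt are placeholders, and importing `request_with_timeout` from evals.utils.api_utils is an assumption based on the file shown above): the function now simply forwards the timeout to the OpenAI SDK call and retries when the SDK raises openai.APITimeoutError, rather than policing the timeout with its own watchdog thread.

from openai import OpenAI

from evals.utils.api_utils import request_with_timeout

client = OpenAI()
response = request_with_timeout(
    client.chat.completions.create,  # the SDK accepts a per-request timeout kwarg
    model="gpt-4o-mini",             # placeholder model name
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)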
