
[genai-module][models] Support automatic function calling in generate_content_stream #106

Open
lmsh7 opened this issue Jan 9, 2025 · 10 comments
Labels
priority: p2 Moderately-important priority. Fix may not be included in next release. type: feature request ‘Nice-to-have’ improvement, new feature or different behavior or design.

Comments

@lmsh7

lmsh7 commented Jan 9, 2025

Environment details

  • Programming language: Python
  • OS: macOS
  • Language runtime version: 3.12.7
  • Package version: 0.4.0

Steps to reproduce

  1. import os
    from google import genai
    from google.genai import types
    from google.genai.types import (
        GoogleSearch,
    )
    
    
    client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])
    
    
    def search(query: str) -> str:
        return query
    
    
    # mytools = [{"google_search": GoogleSearch()}]
    mytools = [search]
    
    
    generation_config = types.GenerateContentConfig(
        temperature=1,
        top_p=0.95,
        top_k=40,
        max_output_tokens=8192,
        tools=mytools,
    )
    params = {
        "model": "gemini-2.0-flash-exp",
        "config": generation_config,
        "contents": "Search something about hongkong",
    }
    for chunk in client.models.generate_content_stream(**params):
        print(chunk.text)
  2. Traceback (most recent call last):
      File "/Users/lmsh7/Code/project8/./bug_report.py", line 33, in <module>
        print(chunk.text)
              ^^^^^^^^^^
      File "/Users/lmsh7/Code/project8/.venv/lib/python3.12/site-packages/google/genai/types.py", line 2483, in text
        raise ValueError(
    ValueError: GenerateContentResponse.text only supports text parts, but got function_call part
    video_metadata=None thought=None code_execution_result=None executable_code=None file_data=None function_call=FunctionCall(id=None, args={'query': 'hongkong'}, name='search') function_response=None inline_data=None text=None
@lmsh7 lmsh7 added priority: p2 Moderately-important priority. Fix may not be included in next release. type: bug Error or flaw in code with unintended results or allowing sub-optimal usage patterns. labels Jan 9, 2025
@Giom-V

Giom-V commented Jan 9, 2025

Hello @lmsh7, I think your issue is that you can't simply use chunk.text. You need to do something like:

if chunk.text is not None:
  print(chunk.text)
elif chunk.function_call is not None:
  # do something, or maybe nothing.
  pass

@lmsh7
Author

lmsh7 commented Jan 9, 2025

Hello @lmsh7, I think your issue is that you can't simply use chunk.text. You need to do something like:

if chunk.text is not None:
  print(chunk.text)
elif chunk.function_call is not None:
  # do something, or maybe nothing.
  pass

Could you provide a working example? When I attempt this...

Traceback (most recent call last):
  File "/Users/lmsh7/Code/project8/bug_report.py", line 33, in <module>
    if chunk.text is not None:
       ^^^^^^^^^^
  File "/Users/lmsh7/Code/project8/.venv/lib/python3.12/site-packages/google/genai/types.py", line 2483, in text
    raise ValueError(
ValueError: GenerateContentResponse.text only supports text parts, but got function_call part
video_metadata=None thought=None code_execution_result=None executable_code=None file_data=None function_call=FunctionCall(id=None, args={'query': 'hongkong'}, name='search') function_response=None inline_data=None text=None

@Giom-V

Giom-V commented Jan 9, 2025

Try using chunk.candidates[0].content.parts[0] instead of chunk. That solves your error, but as far as I can see you still do not get the grounding data.

@lmsh7
Author

lmsh7 commented Jan 9, 2025

Try using chunk.candidates[0].content.parts[0] instead of chunk.

I tried this approach but the results were not what I expected:

# Using streaming approach
for chunk in client.models.generate_content_stream(**params):
    if chunk.candidates[0].content.parts[0] is not None:
        print(chunk.candidates[0].content.parts[0])

Output:

video_metadata=None thought=None code_execution_result=None executable_code=None file_data=None function_call=FunctionCall(id=None, args={'query': 'hongkong'}, name='search') function_response=None inline_data=None text=None
video_metadata=None thought=None code_execution_result=None executable_code=None file_data=None function_call=None function_response=None inline_data=None text=''

In comparison, when using the non-streaming approach:

# Using regular approach
response = client.models.generate_content(**params)
print(response.text)

Output:

I searched for "hongkong" and the API returned a result "hongkong". Is there anything else I can help you with?

@Giom-V

Giom-V commented Jan 9, 2025

You forgot the .text in your code.

Here's a more complete loop you can use:

for chunk in client.models.generate_content_stream(**params):
    for candidate in chunk.candidates:
        for part in candidate.content.parts:
            print(part.text)
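
Note that print(part.text) will still print None for function-call parts, since those parts carry a function_call and no text (see the Part reprs in the tracebacks above). A small sketch that branches on the part's fields instead — render_part is a hypothetical helper name, not part of the SDK:

```python
# Sketch: summarize a streamed Part, branching on whether it carries
# text or a function_call (field names follow the Part repr above).

def render_part(part) -> str:
    """Return a printable summary of a streamed Part."""
    if getattr(part, "text", None):
        return part.text
    if getattr(part, "function_call", None):
        fc = part.function_call
        return f"[function_call] {fc.name}({dict(fc.args)})"
    return "[empty part]"

# Usage inside the streaming loop:
# for chunk in client.models.generate_content_stream(**params):
#     for candidate in chunk.candidates:
#         for part in candidate.content.parts:
#             print(render_part(part))
```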

@lmsh7
Author

lmsh7 commented Jan 10, 2025

You forgot the .text in your code.

Here's a more complete loop you can use:

for chunk in client.models.generate_content_stream(**params):
    for candidate in chunk.candidates:
        for part in candidate.content.parts:
            print(part.text)

Here is the full code to reproduce; the result is still None.

import os
from google import genai
from google.genai import types
from google.genai.types import (
    GoogleSearch,
)


client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])


def search(query: str) -> str:
    return query


# mytools = [{"google_search": GoogleSearch()}]
mytools = [search]


generation_config = types.GenerateContentConfig(
    temperature=1,
    top_p=0.95,
    top_k=40,
    max_output_tokens=8192,
    tools=mytools,
)
params = {
    "model": "gemini-2.0-flash-exp",
    "config": generation_config,
    "contents": "Search something about hongkong",
}

# response = client.models.generate_content(**params)
# print(response.text)

for chunk in client.models.generate_content_stream(**params):
    for candidate in chunk.candidates:
        for part in candidate.content.parts:
            print(part.text)

@Giom-V

Giom-V commented Jan 13, 2025

@lmsh7 You get no output because you override the google_search tool with your own search function. Remove mytools = [search] and uncomment mytools = [{"google_search": GoogleSearch()}] and it should work.

@Giom-V

Giom-V commented Jan 15, 2025

@lmsh7 I think I see what the issue is. If you check the full output, the function_call part is filled, but function_response is empty, so I think the model is not running your function.

I'll check whether that's how it is expected to work (I don't think so). In the meantime you might have to handle the function calls yourself, as we do in the live-api tool notebook, or, more simply, use the former SDK.
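
For reference, the manual handling amounts to dispatching the streamed function_call to your own Python function and feeding the result back to the model. A minimal sketch — the FUNCTIONS registry and handle_function_call helper are hypothetical names, and the commented loop assumes the client and params from the repro above:

```python
# Hypothetical sketch: dispatch a streamed function_call part to a local
# Python function yourself, since automatic function calling is not yet
# supported in generate_content_stream.

def search(query: str) -> str:
    return query

# Registry mapping tool names (as declared to the model) to local callables.
FUNCTIONS = {"search": search}

def handle_function_call(name: str, args: dict):
    """Invoke the named local tool with the model-supplied arguments."""
    return FUNCTIONS[name](**args)

# Shape of the streaming loop (needs a live client; shown for structure):
# for chunk in client.models.generate_content_stream(**params):
#     for part in chunk.candidates[0].content.parts:
#         if part.function_call:
#             result = handle_function_call(part.function_call.name,
#                                           dict(part.function_call.args))
#             # Feed `result` back to the model, e.g. via
#             # types.Part.from_function_response(...) in a follow-up
#             # generate_content call.
#         elif part.text:
#             print(part.text)
```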

@Giom-V Giom-V reopened this Jan 15, 2025
@sasha-gitg sasha-gitg added type: feature request ‘Nice-to-have’ improvement, new feature or different behavior or design. and removed type: bug Error or flaw in code with unintended results or allowing sub-optimal usage patterns. labels Jan 15, 2025
@sasha-gitg sasha-gitg changed the title Streaming generation is not working with automatic function calling [genai-module][models] Support automatic function calling in generate_content_stream Jan 15, 2025
@Giom-V

Giom-V commented Jan 15, 2025

@lmsh7 I got confirmation that automatic function calling doesn't currently work with streaming in the new SDK, so, as I said in my previous message, you'll either have to handle the call yourself or use the former SDK.

@lmsh7
Author

lmsh7 commented Jan 15, 2025

@lmsh7 I got confirmation that automatic function calling doesn't currently work with streaming in the new SDK, so, as I said in my previous message, you'll either have to handle the call yourself or use the former SDK.

Hopefully this will be resolved in the future. Thank you for your confirmation.
