
feat: add JSON format option to ollama #647

Merged 2 commits into tmc:main on Mar 6, 2024

Conversation

@corani (Contributor) commented on Mar 6, 2024

feat: add JSON format option to ollama

This adds the ability to set a specific data format (currently Ollama only
supports JSON) via the new WithFormat option. The change provides more
flexibility and control over the format of the data processed by the client.

A new test, TestWithFormat, has been added to verify that the option works
as expected.

Using the JSON format allows you to simulate Functions using prompt
injection, as it forces Ollama to respond in valid JSON.
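
A minimal sketch of how the option might be used, assuming WithFormat takes the format name as a string and a locally pulled model such as llama2 (both assumptions, not confirmed by this PR):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/ollama"
)

func main() {
	// Build an Ollama client that forces JSON output. The model name is
	// only an example and assumes it has been pulled locally.
	llm, err := ollama.New(
		ollama.WithModel("llama2"),
		ollama.WithFormat("json"),
	)
	if err != nil {
		log.Fatal(err)
	}

	// With the JSON format enabled, the completion is a valid JSON document.
	resp, err := llms.GenerateFromSinglePrompt(context.Background(), llm,
		`List three colors as a JSON object with a "colors" array.`)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp)
}
```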

example: add new Ollama functions example

Add an example showing how the Ollama JSON format in combination with
prompt injection can be used to simulate Functions.

Output:

$ go run .
2024/03/06 11:03:24 Call: getCurrentWeather
2024/03/06 11:03:25 Call: finalResponse
2024/03/06 11:03:25 Final response: The forecast for Beijing is sunny and windy with a temperature of 6 degrees Celsius.
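
Roughly, the pattern looks like the sketch below: the system prompt describes the available "functions", WithFormat("json") forces a parseable reply, and the caller dispatches on the parsed call. The function names, the call struct, and the JSON shape here are illustrative, not part of the langchaingo API:

```go
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/ollama"
)

// call mirrors the structure the system prompt asks the model to emit.
// The field names are illustrative only.
type call struct {
	Function string            `json:"function"`
	Args     map[string]string `json:"args"`
}

// systemPrompt "injects" the available functions and the required reply
// shape. Because WithFormat("json") forces valid JSON, the reply can be
// unmarshalled directly.
const systemPrompt = `You can call one of these functions:
- getCurrentWeather(location): returns the weather for a location
- finalResponse(text): returns the final answer to the user
Reply ONLY with JSON of the form {"function": "...", "args": {...}}.`

func main() {
	llm, err := ollama.New(
		ollama.WithModel("llama2"),
		ollama.WithFormat("json"),
	)
	if err != nil {
		log.Fatal(err)
	}

	resp, err := llm.GenerateContent(context.Background(), []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeSystem, systemPrompt),
		llms.TextParts(llms.ChatMessageTypeHuman, "What is the weather in Beijing?"),
	})
	if err != nil {
		log.Fatal(err)
	}

	var c call
	if err := json.Unmarshal([]byte(resp.Choices[0].Content), &c); err != nil {
		log.Fatal(err)
	}
	log.Printf("Call: %s", c.Function)
	// A full example would dispatch on c.Function, feed the result back to
	// the model, and loop until it returns finalResponse.
	fmt.Println(c.Args)
}
```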

@tmc (Owner) left a comment

Can you address the lint issue, please, and comment the new exported symbol (WithFormat)?

@corani (Contributor, Author) commented on Mar 6, 2024

> Can you address the lint issue, please, and comment the new exported symbol (WithFormat)?

Done. I've also added an example of how to use this to simulate functions with Ollama.

@corani (Contributor, Author) commented on Mar 6, 2024

@eliben this may allow for some fun projects for your blog series 😄

@tmc (Owner) left a comment

LGTM

@tmc merged commit 81643a8 into tmc:main on Mar 6, 2024
3 checks passed
@corani deleted the corani/ollamajson branch on March 7, 2024, 01:58