
✨ feat(ollama): improve connection check method and provide selector for user to control model options #1397

Merged
merged 10 commits into lobehub:main from fix/ollama_checker
Mar 15, 2024

Conversation

sjy
Contributor

@sjy sjy commented Feb 27, 2024

💻 Change Type

  • ✨ feat: port ollama-js with browser request support; enable downloading uninstalled models via the pull service (client request)
  • 🐛 fix: Ollama service check (client request)
  • ♻️ refactor
  • 💄 style
  • 🔨 chore
  • ⚡️ perf
  • 📝 docs

🔀 Description of Change

📝 Additional Information

[ollama-js updated (browser support)](https://github.com/ollama/ollama-js/releases/tag/v0.4.9)
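The pull service streams newline-delimited JSON progress events from the Ollama REST API. A minimal sketch of parsing such a stream on the client, assuming the documented event shape (`status`, `total`, `completed`); the helper name is illustrative, not this PR's actual code:

```typescript
// Sketch: parse the NDJSON progress stream emitted while pulling a model.
// The event shape (status / total / completed) follows the Ollama REST API;
// this helper is illustrative and not the code added in this PR.
interface PullProgress {
  status: string;
  total?: number;
  completed?: number;
}

function parseProgressChunk(chunk: string): PullProgress[] {
  return chunk
    .split('\n')                                 // one JSON object per line
    .filter((line) => line.trim().length > 0)    // skip blank trailing lines
    .map((line) => JSON.parse(line) as PullProgress);
}
```

Each parsed event can then drive a progress bar while an uninstalled model downloads.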


vercel bot commented Feb 27, 2024

Someone is attempting to deploy a commit to the LobeHub Team on Vercel.

A member of the Team first needs to authorize it.

@lobehubbot
Member

👍 @sjy

Thank you for raising your pull request and contributing to our community.
Please make sure you have followed our contributing guidelines. We will review it as soon as possible.
If you encounter any problems, please feel free to contact us.

@sjy sjy changed the title from "Fix/ollama checker" to "fix(ollama): change connection check way and provide selector for user to control model options" on Feb 27, 2024

codecov bot commented Feb 27, 2024

Codecov Report

Attention: Patch coverage is 81.00000%, with 19 lines in your changes missing coverage. Please review.

Project coverage is 92.96%. Comparing base (e472d6e) to head (fa8847c).

Files Patch % Lines
src/services/ollama.ts 70.31% 19 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1397      +/-   ##
==========================================
- Coverage   93.06%   92.96%   -0.10%     
==========================================
  Files         234      235       +1     
  Lines       12646    12730      +84     
  Branches     1528     1537       +9     
==========================================
+ Hits        11769    11835      +66     
- Misses        877      895      +18     

☔ View full report in Codecov by Sentry.

Contributor

@arvinxx arvinxx left a comment


I haven't finished reviewing everything yet; from a quick look, a few details need adjusting first.

src/app/settings/llm/components/Checker.tsx (resolved)
src/app/settings/llm/components/ModelSelector.tsx (resolved)
src/services/ollama.ts (resolved)
@sjy sjy marked this pull request as draft February 27, 2024 08:01
@sjy sjy force-pushed the fix/ollama_checker branch 6 times, most recently from 014bf58 to 57ec878 Compare March 3, 2024 13:29
@sjy sjy requested a review from arvinxx March 3, 2024 13:34
@sjy sjy marked this pull request as ready for review March 3, 2024 13:35
```ts
try {
  const response = await this.getOllamaClient().list();
  return response;
} catch {
  response = createErrorResponse(ChatErrorType.ServiceUnavailable, {
```
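For reference, a tags-based connectivity check (one of this PR's commits switches the checker to Ollama's tags API) can be sketched as below. The injectable `fetchImpl` parameter is an assumption added for testability; the real code uses the ollama-js client instead:

```typescript
// Sketch: probe GET /api/tags to decide whether the local Ollama service is up.
// fetchImpl is injectable purely for illustration/testing; it is not the
// project's actual signature.
type FetchLike = (url: string) => Promise<{ ok: boolean }>;

async function isOllamaAvailable(
  host = 'http://127.0.0.1:11434',
  fetchImpl: FetchLike = fetch,
): Promise<boolean> {
  try {
    const res = await fetchImpl(`${host}/api/tags`);
    return res.ok; // a reachable service answers the tags endpoint
  } catch {
    return false; // network error: the service is most likely not running
  }
}
```

Probing a cheap read-only endpoint like `/api/tags` avoids the false negatives a heavier request could produce.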
Contributor


I suggest adding a dedicated Ollama error type, and showing a clear "Ollama is not running" hint for that type.


As it stands, this isn't very user-friendly; users won't necessarily understand what actually happened. The genuinely useful information is the 'please check whether your ollama service is available' message.
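A dedicated error type could be wired up roughly as follows; the names (`OllamaServiceUnavailable`, the payload shape, the helper) are hypothetical sketches, not the error type that was actually added:

```typescript
// Hypothetical sketch of a dedicated Ollama error type so the UI can render a
// specific "Ollama is not running" hint instead of a generic failure message.
const OllamaServiceUnavailable = 'OllamaServiceUnavailable' as const;

interface OllamaErrorPayload {
  errorType: typeof OllamaServiceUnavailable;
  message: string;
  host: string; // the host that failed, so the hint can show it
}

function makeOllamaUnavailableError(host: string): OllamaErrorPayload {
  return {
    errorType: OllamaServiceUnavailable,
    message: 'Please check whether your Ollama service is available',
    host,
  };
}
```

The UI can then switch on `errorType` and render a targeted hint rather than a generic service error.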

src/app/settings/llm/Ollama/Checker.tsx (resolved)
src/features/Conversation/Error/InvalidOllamaModel.tsx (resolved)
Contributor

@arvinxx arvinxx left a comment


The rest looks about right to me; I'll tweak anything that still needs adjusting later.

src/app/settings/llm/Ollama/Checker.tsx (resolved)
src/hooks/useDownloadMonitor.ts (resolved)
@arvinxx arvinxx changed the title from "fix(ollama): change connection check way and provide selector for user to control model options" to "✨ feat(ollama): improve connection check method and provide selector for user to control model options" on Mar 13, 2024
@arvinxx
Contributor

arvinxx commented Mar 15, 2024

Polished the download UI in this iteration:


But I also found some issues:

  1. Users don't know how large the model is, or how much has been downloaded so far;
  2. If a download resumes from a breakpoint, the reported download speed is inaccurate;
  3. Canceling a download is not supported;
  4. Allowing the prompt to be dismissed while a download is in progress may not be reasonable.
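The speed/ETA math behind such a download monitor can be sketched as below (illustrative only; `computeStats` is not the hook's real API). It also shows how issue 2 above arises: if elapsed time is counted from the resume point but `completed` includes previously downloaded bytes, the speed is overestimated:

```typescript
// Sketch: derive speed, ETA and percentage for a model download.
// completed/total are bytes; elapsedSeconds is time since the pull started.
interface DownloadStats {
  speedBps: number;   // bytes per second
  etaSeconds: number; // estimated remaining time
  percent: number;    // 0..100
}

function computeStats(completed: number, total: number, elapsedSeconds: number): DownloadStats {
  const speedBps = elapsedSeconds > 0 ? completed / elapsedSeconds : 0;
  const remaining = Math.max(total - completed, 0);
  const etaSeconds = speedBps > 0 ? remaining / speedBps : Infinity;
  const percent = total > 0 ? (completed / total) * 100 : 0;
  return { speedBps, etaSeconds, percent };
}
```

Subtracting the bytes already present at resume time from `completed` before dividing by elapsed time would make resumed-download speeds accurate.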


Contributor

@arvinxx arvinxx left a comment


The points mentioned above are follow-up optimizations; I think this version can be merged first~


vercel bot commented Mar 15, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| lobe-chat | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Mar 15, 2024 2:28pm |

@arvinxx arvinxx merged commit 675902f into lobehub:main Mar 15, 2024
4 of 6 checks passed
@lobehubbot
Member

❤️ Great PR @sjy ❤️

The growth of the project is inseparable from user feedback and contributions. Thank you for your contribution! If you are interested in the LobeHub developer community, please join our Discord and then DM @arvinxx or @canisminor1990. They will invite you to our private developer channel, where we discuss lobe-chat development and share AI news from around the world.

github-actions bot pushed a commit that referenced this pull request Mar 15, 2024
## [Version 0.137.0](v0.136.0...v0.137.0)
<sup>Released on **2024-03-15**</sup>

#### ✨ Features

- **ollama**: Improve connection check method and provide selector for user to control model options.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### What's improved

* **ollama**: Improve connection check method and provide selector for user to control model options, closes [#1397](#1397) ([675902f](675902f))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>
@lobehubbot
Member

🎉 This PR is included in version 0.137.0 🎉

The release is available on:

Your semantic-release bot 📦🚀

github-actions bot pushed a commit to bentwnghk/lobe-chat that referenced this pull request Mar 16, 2024
## [Version&nbsp;1.19.0](v1.18.0...v1.19.0)
<sup>Released on **2024-03-16**</sup>

#### ✨ Features

- **ollama**: Improve connection check method and provide selector for user to control model options.
- **misc**: Support groq as a model provider.

#### 🐛 Bug Fixes

- **misc**: Fix rename, Fix URL typo.

#### 💄 Styles

- **misc**: Update Markdown in ChatItem.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### What's improved

* **ollama**: Improve connection check method and provide selector for user to control model options, closes [lobehub#1397](https://github.com/bentwnghk/lobe-chat/issues/1397) ([675902f](675902f))
* **misc**: Support groq as a model provider, closes [lobehub#1569](https://github.com/bentwnghk/lobe-chat/issues/1569) [lobehub#1562](https://github.com/bentwnghk/lobe-chat/issues/1562) [lobehub#1570](https://github.com/bentwnghk/lobe-chat/issues/1570) ([a04c364](a04c364))

#### What's fixed

* **misc**: Fix rename ([2faf6cf](2faf6cf))
* **misc**: Fix URL typo, closes [lobehub#1590](https://github.com/bentwnghk/lobe-chat/issues/1590) ([97137a9](97137a9))

#### Styles

* **misc**: Update Markdown in ChatItem ([be75549](be75549))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>
denvey pushed a commit to denvey/lobe-chat that referenced this pull request Mar 17, 2024
…for user to control model options (lobehub#1397)

* 🐛 fix(ollama): change checker with ollama's tags api

* ✨ feat(ollama): add error card to pull model

* 🚚 chore: move files

* 💄 style: update llava logo

* 🐛 fix: add ollama service unavailable error type

* 🐛 fix: ollama show passed with error message exists

* ✨ feat(ollama): add download monitor to show speed and ETA remaining time

* 🚨 ci: fix lint

* 💄 style: improve download style

* 🌐 style: add i18n

---------

Co-authored-by: shijianyue <[email protected]>
Co-authored-by: arvinxx <[email protected]>
denvey pushed a commit to denvey/lobe-chat that referenced this pull request Mar 17, 2024
## [Version&nbsp;0.137.0](lobehub/lobe-chat@v0.136.0...v0.137.0)
<sup>Released on **2024-03-15**</sup>

#### ✨ Features

- **ollama**: Improve connection check method and provide selector for user to control model options.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### What's improved

* **ollama**: Improve connection check method and provide selector for user to control model options, closes [lobehub#1397](lobehub#1397) ([675902f](lobehub@675902f))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>
@sjy
Contributor Author

sjy commented Mar 21, 2024

Polished the download UI in this iteration:

But I also found some issues:
  1. Users don't know how large the model is, or how much has been downloaded so far;
  2. If a download resumes from a breakpoint, the reported download speed is inaccurate;
  3. Canceling a download is not supported;
  4. Allowing the prompt to be dismissed while a download is in progress may not be reasonable.

Create another PR here: https://github.com/lobehub/lobe-chat/pull/1659/checks


miroshar-success added a commit to miroshar-success/OpenAI_Integraion_platform that referenced this pull request Apr 5, 2024
## [Version&nbsp;0.137.0](lobehub/lobe-chat@v0.136.0...v0.137.0)
<sup>Released on **2024-03-15**</sup>

#### ✨ Features

- **ollama**: Improve connection check method and provide selector for user to control model options.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### What's improved

* **ollama**: Improve connection check method and provide selector for user to control model options, closes [#1397](lobehub/lobe-chat#1397) ([675902f](lobehub/lobe-chat@675902f))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>