
Groq requests failed with status code: 403 #310

Closed
6 tasks done
Har-Kuun opened this issue Dec 8, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@Har-Kuun

Har-Kuun commented Dec 8, 2024

  • I have confirmed there is no similar existing issue
  • I have confirmed I have upgraded to the latest version
  • I have read through the project README and documentation in full and found no solution
  • I understand and am willing to follow up on this issue, assisting with testing and providing feedback
  • I will ask questions politely and respectfully, without uncivil language (this also applies to everyone commenting here; those who do not comply will be blocked)
  • I understand and accept the above, and I understand that the project maintainers' time is limited; issues that do not follow the rules may be ignored or closed directly

Problem description
For some reason, all Groq model requests return a 403 error. Models from other providers, such as OpenAI and Anthropic Claude, work fine. Also, when going through a third-party API relay, the Groq models (llama etc.) can be requested correctly; only requests to the official Groq server fail.

Running the POST request captured with MITM directly via curl on the server returns a correct result, but the same request produces the 403 error inside coai.

The curl request is as follows; it gets a correct response when run over SSH, so the server's IP does not appear to be blocked.

curl -X POST https://api.groq.com/openai/v1/chat/completions \
  -H 'User-Agent: Go-http-client/1.1' \
  -H 'Authorization: Bearer gsk_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx' \
  -H 'Content-Type: application/json' \
  --compressed \
  -d '{"model":"llama-3.3-70b-versatile","messages":[{"role":"user","content":"test"}],"max_tokens":2000,"stream":true,"presence_penalty":0,"frequency_penalty":0,"temperature":0.6,"top_p":1}'
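The `User-Agent: Go-http-client/1.1` in the captured request is Go's default client UA, which suggests coai sends it through the standard `net/http` stack. As a rough sketch (not coai's actual code; the token and the trimmed request body are placeholders), an equivalent request would be built like this:

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
)

// buildGroqRequest constructs a request equivalent to the captured one.
// Go's transport adds "User-Agent: Go-http-client/1.1" automatically at
// send time when no UA is set, matching the MITM capture above.
func buildGroqRequest() (*http.Request, error) {
	body := []byte(`{"model":"llama-3.3-70b-versatile","messages":[{"role":"user","content":"test"}],"max_tokens":2000,"stream":true}`)
	req, err := http.NewRequest("POST", "https://api.groq.com/openai/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer gsk_xxx") // placeholder token
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	req, err := buildGroqRequest()
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.Host)
}
```

Since curl with identical headers succeeds while the Go client gets a 403, the difference most likely lies below the HTTP layer (e.g. how the two clients negotiate TLS), not in the request itself.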

Log output

coai error log:

[WARNING] - [2024-12-08 12:34:32] - [channel] caught error request failed with status code: 403 for model llama-3.3-70b-versatile at channel Groq
[INFO] - [2024-12-08 12:34:32] - [channel] channels are exhausted for model llama-3.3-70b-versatile
[WARNING] - [2024-12-08 12:34:32] - request failed with status code: 403 (model: llama-3.3-70b-versatile, client: xx.xx.xx.xx)

Thank you.

@Har-Kuun Har-Kuun added the bug Something isn't working label Dec 8, 2024
@Har-Kuun
Author

Har-Kuun commented Dec 8, 2024

Problem solved by setting up an HTTP proxy on the server itself, then routing all requests through that local proxy. Not sure why it worked, but I'll call it a day.

@Har-Kuun Har-Kuun closed this as completed Dec 8, 2024