[WARNING] - [2024-12-08 12:34:32] - [channel] caught error request failed with status code: 403 for model llama-3.3-70b-versatile at channel Groq
[INFO] - [2024-12-08 12:34:32] - [channel] channels are exhausted for model llama-3.3-70b-versatile
[WARNING] - [2024-12-08 12:34:32] - request failed with status code: 403 (model: llama-3.3-70b-versatile, client: xx.xx.xx.xx)
Problem solved by setting up an HTTP proxy on the server itself and routing all requests through the local proxy. I'm not sure why this works, but I'll call it a day.
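A minimal sketch of that kind of workaround in Go, assuming a hypothetical local forward proxy at 127.0.0.1:7890 (coai's own configuration may differ): the HTTP client is built with an explicit proxy URL, so every outbound request goes through the local proxy; http.ProxyFromEnvironment could be used instead to honor the standard HTTPS_PROXY variable.

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"time"
)

func main() {
	// Hypothetical local forward proxy address; not taken from the issue.
	proxyURL, err := url.Parse("http://127.0.0.1:7890")
	if err != nil {
		panic(err)
	}

	client := &http.Client{
		Timeout: 30 * time.Second,
		Transport: &http.Transport{
			// Route every outbound request from this client through the local proxy.
			// Alternatively: Proxy: http.ProxyFromEnvironment to honor HTTPS_PROXY.
			Proxy: http.ProxyURL(proxyURL),
		},
	}

	// Example upstream call; any HTTPS URL works to verify that traffic goes via the proxy.
	resp, err := client.Get("https://api.groq.com/openai/v1/models")
	if err != nil {
		fmt.Println("request error:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```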
Problem description
For some reason, every request to a Groq model returns a 403 error. Models from other providers, such as OpenAI and Anthropic Claude, work fine. Groq models (llama and so on) can also be requested correctly through a third-party API relay; only requests to the official Groq server fail.
Replaying the POST request captured with MITM directly via curl on the server returns a correct result, but the same request produces a 403 error inside coai.
The curl request is as follows; it gets the correct response when run over SSH, so the IP is probably not blocked.
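As a rough illustration of the kind of request involved (not the author's captured curl command), a Groq-style chat completion call against the OpenAI-compatible endpoint, using the model name from the logs and a placeholder API key, might look like this sketch in Go:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	// Illustrative payload; llama-3.3-70b-versatile is the model named in the logs.
	payload, err := json.Marshal(map[string]any{
		"model": "llama-3.3-70b-versatile",
		"messages": []map[string]string{
			{"role": "user", "content": "Hello"},
		},
	})
	if err != nil {
		panic(err)
	}

	req, err := http.NewRequest(http.MethodPost,
		"https://api.groq.com/openai/v1/chat/completions", bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	// GROQ_API_KEY is a placeholder environment variable, not a value from the issue.
	req.Header.Set("Authorization", "Bearer "+os.Getenv("GROQ_API_KEY"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Println("request error:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(body))
}
```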
Log information
Coai error log (the warning lines quoted at the top of this issue):
Thank you.