Utilize console logs for log drains #107
Conversation
In order to be consistent with other SDKs and repos, I am renaming the check-format script and CI job to lint. It's easier to remember and use across repos.
@bahlo I think we should sacrifice having the response status be part of every log; instead it would be part of the report at the end, or maybe a separate log if needed.
@bahlo I see that the lambda functions produce a report with the status code in there, so maybe that's enough; we could implement the same for edge functions and other backend functions as well. This way we don't hold the logs until the end of the function, and we don't slow down the execution. {
"report": {
"durationMs": 585.48,
"maxMemoryUsedMb": 94
},
"request": {
"host": "next-axiom-integration-test.vercel.app",
"id": "fra1::iad1::xhzvt-1675955393623-726036485fe3",
"method": "GET",
"path": "/api/api_log",
"scheme": "https",
"statusCode": 200
},
"vercel": {
"route": "/api/api_log",
"source": "lambda"
}
}
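The report idea above could be sketched roughly like this. Note that `buildReport` and the exact report shape are illustrative assumptions, not next-axiom's actual API:

```typescript
// Illustrative sketch: rather than attaching statusCode to every log line,
// build a single report object once the function finishes. The names and
// the shape here are assumptions for illustration only.
interface FunctionReport {
  report: { durationMs: number };
  request: { path: string; method: string; statusCode: number };
}

function buildReport(
  path: string,
  method: string,
  statusCode: number,
  startTime: number,
  endTime: number
): FunctionReport {
  return {
    report: { durationMs: endTime - startTime },
    request: { path, method, statusCode },
  };
}
```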
I pushed a new commit to test separating the reports from the logger; now the logger starts throttling logs right away, it won't wait until the function returns.
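A minimal sketch of what "throttling logs right away" could look like, assuming a simple batch-size trigger (`ThrottledLogger` and `send` are hypothetical names, not the PR's actual implementation):

```typescript
// Hypothetical sketch: buffer log lines and ship a batch as soon as it
// fills, instead of holding everything until the function returns.
// `send` stands in for the actual network/stdout call.
class ThrottledLogger {
  private buffer: string[] = [];

  constructor(
    private batchSize: number,
    private send: (lines: string[]) => void
  ) {}

  log(line: string): void {
    this.buffer.push(line);
    // Flush immediately once the batch is full, not at function end.
    if (this.buffer.length >= this.batchSize) this.flush();
  }

  flush(): void {
    if (this.buffer.length === 0) return;
    this.send(this.buffer);
    this.buffer = [];
  }
}
```

A final `flush()` at function exit still ships any partial batch, so no logs are lost.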
separate the function reports from the logger
LGTM
Co-authored-by: Arne Bahlo <[email protected]>
@bahlo I am hesitant about the flush calls I removed, I will put them back. Regarding the payload, we still need to implement something in the backend to parse the incoming JSON. This is an example of the payload received by this PR: {
"level": "info",
"message": "{\"level\":\"error\",\"message\":\"NEXT_AXIOM::API_LOG\",\"_time\":\"2023-03-20T17:08:33.281Z\",\"fields\":{\"filename\":\"api_log.ts\"},\"vercel\":{\"environment\":\"production\",\"region\":\"iad1\",\"source\":\"lambda\",\"route\":\"/api/api_log\"},\"request\":{\"startTime\":1679332113281,\"path\":\"/api/api_log\",\"method\":\"GET\",\"host\":\"next-axiom-integration-test.vercel.app\",\"userAgent\":\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36\",\"scheme\":\"https\",\"ip\":\"****\",\"region\":\"iad1\"}}",
"request": {
"host": "next-axiom-integration-test.vercel.app",
"id": "lhr1::iad1::znfrf-1679332111452-6f9bf7f7fa48",
"ip": "105.33.205.37",
"method": "GET",
"path": "/api/api_log",
"scheme": "https",
"statusCode": 304,
"userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36"
},
"vercel": {
"deploymentId": "dpl_Aj7SLvL9kLCkCFKMhfBpmQQnZzJp",
"deploymentURL": "next-axiom-integration-test-oytkaxazq-islam-axiomco.vercel.app",
"environment": "production",
"projectId": "prj_lE4FKQwJP2j6OHU64KI4W4URp3kV",
"projectName": "next-axiom-integration-test",
"region": "lhr1",
"route": "/api/api_log",
"source": "lambda-log",
"userName": "islam-axiomco"
}
}
Is there a better way other than prepending text?
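One hedged sketch of what the backend-side parsing might look like, given a payload like the one above: try to JSON-parse the `message` field and unwrap it when it looks like a next-axiom record. The `NEXT_AXIOM::` prefix check and the `unwrapMessage` name are assumptions for illustration, not an existing API:

```typescript
// Hypothetical sketch: if the `message` field of an incoming log-drain
// record is itself a JSON-stringified next-axiom log, unwrap it; otherwise
// pass the record through unchanged.
function unwrapMessage(record: { message: string }): unknown {
  try {
    const inner = JSON.parse(record.message);
    if (
      inner &&
      typeof inner === "object" &&
      typeof inner.message === "string" &&
      inner.message.startsWith("NEXT_AXIOM::") // assumed marker, see payload above
    ) {
      return inner;
    }
  } catch {
    // message was plain text, not JSON — fall through
  }
  return record;
}
```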
@schehata sad to see this being closed, I'd love to see those changes land. The issue I recently posted (#128) would probably be resolved if I could use the log drain instead. I have also asked a related question on Discord. I am curious about why this effort has been stopped. Thanks for the effort though 🙏🏼
hey @JannikWempe, thanks for reaching out, really appreciate your feedback on this. We stopped because of the 4 KB limit on Vercel, but we clearly see the performance win. I will put this back on track and tackle it soon. Right now we are focusing on supporting Next.js 13; after that I will have the time to revisit this feature.
Closes DX-467.
This PR introduces transports: the logger decides which transport to use based on environment variables. A special transport for log drain support is included, to improve logging performance on lambda/edge functions running on the Vercel platform where the integration is enabled. Users who want this behavior should set an environment variable to select that transport.
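The transport selection described above could be sketched like this. The env var name `AXIOM_USE_LOG_DRAIN` and the transport names are hypothetical, chosen only to illustrate the mechanism:

```typescript
// Hypothetical sketch of env-var-based transport selection; the names here
// are illustrative, not the PR's actual API.
interface Transport {
  name: string;
  log(line: string): void;
}

const consoleTransport: Transport = {
  name: "console",
  // On Vercel with the Axiom integration enabled, writing to stdout lets
  // the log drain pick the line up without an extra network request.
  log: (line) => console.log(line),
};

const fetchTransport: Transport = {
  name: "fetch",
  log: (_line) => {
    /* send to the ingest endpoint over HTTP */
  },
};

function pickTransport(env: Record<string, string | undefined>): Transport {
  return env.AXIOM_USE_LOG_DRAIN === "true" ? consoleTransport : fetchTransport;
}
```

The console transport avoids the per-request HTTP overhead, which is where the performance win on lambda/edge functions comes from.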
TODO: