
Utilize console logs for log drains #107

Closed
wants to merge 8 commits

Conversation

@dasfmi dasfmi commented Feb 1, 2023

Closes DX-467.

This introduces transports, so the logger can decide which transport to use based on environment variables. A special transport for Log Drain support is included to improve logging performance on lambda/edge functions running on the Vercel platform where the integration is enabled. Users who want this behavior should export an environment variable to enable that transport:

export ENABLE_AXIOM_LOG_DRAIN=true
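
For illustration, here is a minimal sketch of what env-based transport selection could look like; the names Transport, LogDrainTransport, FetchTransport, and the ingest URL are hypothetical, not this PR's actual code.

// Hypothetical sketch, not the PR's implementation: pick a transport
// from the environment. The log drain transport writes to stdout so
// Vercel's drain forwards entries without any fetch call.
interface Transport {
  log(event: Record<string, unknown>): void;
  flush(): Promise<void>;
}

class LogDrainTransport implements Transport {
  log(event: Record<string, unknown>): void {
    // Console output is picked up by the platform's log drain.
    console.log(JSON.stringify(event));
  }
  async flush(): Promise<void> {
    // Nothing buffered: entries were already written out.
  }
}

class FetchTransport implements Transport {
  private buffer: Record<string, unknown>[] = [];
  log(event: Record<string, unknown>): void {
    this.buffer.push(event);
  }
  async flush(): Promise<void> {
    if (this.buffer.length === 0) return;
    const events = this.buffer.splice(0);
    await fetch('https://example.com/ingest', { // illustrative endpoint
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(events),
    });
  }
}

function createTransport(): Transport {
  return process.env.ENABLE_AXIOM_LOG_DRAIN === 'true'
    ? new LogDrainTransport()
    : new FetchTransport();
}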

TODO:

  • find a way to attach response status to console logs

To be consistent with the other SDKs and repos, I am
renaming the check-format script and CI job to lint. It's
easier to remember and use across repos.

vercel bot commented Feb 1, 2023

The latest updates on your projects.

next-axiom-example: ✅ Ready (updated Mar 20, 2023 at 5:00PM UTC)

@dasfmi dasfmi marked this pull request as ready for review February 2, 2023 08:54
@dasfmi dasfmi requested a review from bahlo February 2, 2023 08:58

dasfmi commented Feb 9, 2023

@bahlo I think we should sacrifice having the response status as part of every log; instead it would be part of the report at the end, or maybe a separate log if needed.

dasfmi commented Feb 9, 2023

@bahlo I see that the lambda functions produce a report with the status code in it, so maybe that's enough; we could implement the same for edge functions and other backend functions as well. This way we don't hold the logs until the end of the function, and we don't slow down the execution.

{
  "report": {
    "durationMs": 585.48,
    "maxMemoryUsedMb": 94
  },
  "request": {
    "host": "next-axiom-integration-test.vercel.app",
    "id": "fra1::iad1::xhzvt-1675955393623-726036485fe3",
    "method": "GET",
    "path": "/api/api_log",
    "scheme": "https",
    "statusCode": 200
  },
  "vercel": {
    "route": "/api/api_log",
    "source": "lambda"
  }
}
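
For illustration, here is a hypothetical sketch of how an edge handler could emit a comparable report line itself; the field names mirror the lambda report above, and none of this is part of the PR's code.

// Hypothetical sketch: emit one report-style console log at the end
// of an edge handler, so the status code lands in a single summary
// entry instead of being attached to every log line.
export default async function handler(req: Request): Promise<Response> {
  const start = Date.now();
  const res = new Response('ok', { status: 200 });
  const url = new URL(req.url);
  console.log(JSON.stringify({
    report: { durationMs: Date.now() - start },
    request: {
      method: req.method,
      path: url.pathname,
      statusCode: res.status,
    },
    vercel: { source: 'edge', route: url.pathname },
  }));
  return res;
}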

dasfmi commented Feb 10, 2023

I pushed a new commit to test separating the reports from the logger; now the logger starts throttling logs right away and won't wait until the function returns.
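
As a rough sketch of what that means in practice (hypothetical names, not the actual commit): each log call is written out immediately, so nothing is buffered until the function returns.

// Hypothetical sketch: a drain-backed logger emits each entry as it
// happens instead of holding a buffer until the response is sent.
class DrainLogger {
  info(message: string, fields: Record<string, unknown> = {}): void {
    console.log(JSON.stringify({
      level: 'info',
      message,
      _time: new Date().toISOString(),
      fields,
    }));
  }
}

// Usage: the entry reaches the drain while the handler is still running.
const log = new DrainLogger();
log.info('processing request', { filename: 'api_log.ts' });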

separate the function reports from the logger

@bahlo bahlo left a comment

LGTM

README.md review thread (outdated, resolved)
Co-authored-by: Arne Bahlo <[email protected]>

dasfmi commented Mar 20, 2023

@bahlo I am hesitant about the flush calls I removed; I will put them back.

Regarding the payload, we still need to implement something in the backend to parse the incoming JSON. Here is an example of the payload received with this PR:

{
  "level": "info",
  "message": "{\"level\":\"error\",\"message\":\"NEXT_AXIOM::API_LOG\",\"_time\":\"2023-03-20T17:08:33.281Z\",\"fields\":{\"filename\":\"api_log.ts\"},\"vercel\":{\"environment\":\"production\",\"region\":\"iad1\",\"source\":\"lambda\",\"route\":\"/api/api_log\"},\"request\":{\"startTime\":1679332113281,\"path\":\"/api/api_log\",\"method\":\"GET\",\"host\":\"next-axiom-integration-test.vercel.app\",\"userAgent\":\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36\",\"scheme\":\"https\",\"ip\":\"****\",\"region\":\"iad1\"}}",
  "request": {
    "host": "next-axiom-integration-test.vercel.app",
    "id": "lhr1::iad1::znfrf-1679332111452-6f9bf7f7fa48",
    "ip": "105.33.205.37",
    "method": "GET",
    "path": "/api/api_log",
    "scheme": "https",
    "statusCode": 304,
    "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36"
  },
  "vercel": {
    "deploymentId": "dpl_Aj7SLvL9kLCkCFKMhfBpmQQnZzJp",
    "deploymentURL": "next-axiom-integration-test-oytkaxazq-islam-axiomco.vercel.app",
    "environment": "production",
    "projectId": "prj_lE4FKQwJP2j6OHU64KI4W4URp3kV",
    "projectName": "next-axiom-integration-test",
    "region": "lhr1",
    "route": "/api/api_log",
    "source": "lambda-log",
    "userName": "islam-axiomco"
  }
}

Is there a better way other than prepending a text marker?
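
For illustration, here is a hypothetical sketch of the backend-side parsing this needs, assuming the marker stays in the nested message field; unwrap, MARKER, and DrainEnvelope are made-up names.

// Hypothetical sketch: detect the NEXT_AXIOM:: marker inside the drain
// envelope's message field and unwrap the nested JSON log.
const MARKER = 'NEXT_AXIOM::';

interface DrainEnvelope {
  level: string;
  message: string;
}

function unwrap(envelope: DrainEnvelope): Record<string, unknown> | null {
  let inner: Record<string, unknown>;
  try {
    inner = JSON.parse(envelope.message);
  } catch {
    return null; // not JSON; leave it to the generic pipeline
  }
  const msg = inner['message'];
  // Only treat it as a next-axiom log when the marker is present.
  if (typeof msg !== 'string' || !msg.startsWith(MARKER)) return null;
  return inner;
}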

@dasfmi dasfmi closed this May 22, 2023

JannikWempe commented Jun 28, 2023

@schehata Sad to see this being closed. I'd love to see these changes in next-axiom, especially the ability to choose the transport and just utilize the log drain instead of using fetch.

The issue I recently posted (#128) would probably be resolved if I could use the log drain instead of sending the logs via fetch.

I have also asked a question on Discord that is related to these changes.

I am curious why this effort was stopped.

Thanks for the effort though 🙏🏼

dasfmi commented Jul 5, 2023

Hey @JannikWempe, thanks for reaching out; I really appreciate your feedback on this. We stopped because of the 4kb limit on Vercel, but we clearly see the performance win. I will put this back on track and tackle it soon. Right now we are focusing on supporting Next.js 13; after that I will have time to revisit this feature.
