
Add streaming response support #204

Merged (3 commits) on Apr 12, 2023

Conversation

@bnusunny (Contributor) commented on Apr 12, 2023

This PR adds support for Lambda streaming responses. I switched the response compression to tower_http::compression::CompressionLayer because it supports streaming bodies.

Issue #, if available:

close #203

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@bnusunny bnusunny requested a review from calavera April 12, 2023 11:35
@metaskills

First, thanks a ton for this work. I just did my first test with LWA in front of an Express server and it worked amazingly well.

[Screenshot: streaming response test, 2023-08-06]

I have a question: is there an explicit interface to this, or does it naturally work with any backend? For example, in an AWS runtime image I would be able to use awslambda.streamifyResponse and do something like this, which I saw from @aidansteele:

    responseStream = awslambda.HttpResponseStream.from(responseStream, metadata);
    responseStream.write("Streaming with Helper \n");
    await new Promise(r => setTimeout(r, 1000));
    responseStream.write("Hello 0 \n");
    await new Promise(r => setTimeout(r, 1000));
    responseStream.write("Hello 1 \n");
    await new Promise(r => setTimeout(r, 1000));
    responseStream.write("Hello 3 \n");
    await new Promise(r => setTimeout(r, 1000));
    responseStream.end();
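For context, the snippet above runs inside a handler wrapped by awslambda.streamifyResponse on the managed Node.js runtime. A minimal sketch of that shape, assuming a local stub for the `awslambda` global (which only exists inside Lambda itself), might look like:

```javascript
// Stub of the runtime-provided `awslambda` global, for local illustration
// only; on the managed Node.js runtime Lambda injects the real object,
// and HttpResponseStream.from also prepends metadata framing.
const awslambda = {
  streamifyResponse: (fn) => fn,
  HttpResponseStream: {
    from: (stream, _metadata) => stream,
  },
};

// Hypothetical handler showing where the quoted snippet lives.
const handler = awslambda.streamifyResponse(async (event, responseStream) => {
  responseStream = awslambda.HttpResponseStream.from(responseStream, {
    statusCode: 200,
    headers: { "Content-Type": "text/plain" },
  });
  responseStream.write("Streaming with Helper \n");
  responseStream.end();
});
```

This is a sketch of the managed-runtime pattern, not LWA's interface; behind LWA the web framework's own response stream plays this role instead.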

Is there a way to use the stream within LWA, or is it simply getting everything from the proxied web server in a single batch? Hope that makes sense.

@metaskills

Nevermind, it works just fine! I did this in Express and it worked amazingly well. Y'all are great! I'm used to having to write all this up on the Ruby side.

app.get("/stream", async (_req, res) => {
  res.write("Streaming...\n");
  await new Promise((r) => setTimeout(r, 1000));
  res.write("Hello 0 \n");
  await new Promise((r) => setTimeout(r, 1000));
  res.write("Hello 1 \n");
  await new Promise((r) => setTimeout(r, 1000));
  res.write("Hello 2 \n");
  await new Promise((r) => setTimeout(r, 1000)); 
  res.write("Hello 3 \n");
  res.end();
});

Successfully merging this pull request may close these issues.

Add support for response streaming