Fix for not having to chunk-decode the answer from the server #7

Open · wants to merge 1 commit into master
Conversation

@X-Ryl669 (Contributor) commented Aug 9, 2016

By default, any HTTP/1.1 client must accept chunked-encoded responses. For HTTP/1.0 this is not mandatory, and downgrading the HTTP version to 1.0 is a lot less work than implementing chunked decoding on the ESP8266.
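
As a rough illustration of the client-side idea, here is a minimal sketch; the socket setup, host, and path are placeholders, not the firmware's actual code:

```c
/* Hypothetical sketch: send the request as HTTP/1.0 so the server cannot
 * reply with Transfer-Encoding: chunked, then read the body until the peer
 * closes the connection instead of parsing chunk headers.
 * Assumes a connected TCP socket `sock`; host and path are placeholders. */
#include <stdio.h>
#include <sys/socket.h>
#include <sys/types.h>

static int fetch_http10(int sock, const char *host, const char *path)
{
    char req[256];
    int len = snprintf(req, sizeof(req),
                       "GET %s HTTP/1.0\r\n"   /* 1.0: no chunked replies */
                       "Host: %s\r\n"
                       "Connection: close\r\n"
                       "\r\n", path, host);
    if (send(sock, req, len, 0) != len)
        return -1;

    char buf[512];
    ssize_t n;
    /* With HTTP/1.0 the body simply ends when the server closes the socket. */
    while ((n = recv(sock, buf, sizeof(buf), 0)) > 0) {
        /* process buf[0..n) here */
    }
    return n < 0 ? -1 : 0;
}
```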

Also reduce the UART speed so the output can be captured with a serial terminal and a UART adapter (115200 baud is a standard terminal rate; 460800 is not supported on mine). The UART is no longer used for data transfer anyway, so it's a safe change.
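
For the UART side, a minimal sketch of what the baud rate change amounts to, assuming the ESP8266 ROM's uart_div_modify() divider helper and the usual 80 MHz UART input clock (the repository's actual init code may differ):

```c
/* Hypothetical sketch: drop UART0 from 460800 to 115200 baud so a standard
 * serial terminal and USB-UART adapter can capture the debug output.
 * uart_div_modify() is the ESP8266 ROM divider helper. */
extern void uart_div_modify(unsigned char uart_no, unsigned int divider);

#define UART_INPUT_CLK 80000000u   /* UART input clock on standard ESP8266 boards */

void debug_uart_init(void)
{
    uart_div_modify(0, UART_INPUT_CLK / 115200);  /* was 460800 before */
}
```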

Reduce UART speed used so it's possible to capture it via a serial terminal and UART adapter
@davidgfnet (Owner) commented:
The server already takes care of not sending chunked responses. I agree that using HTTP/1.0 works in any case, but it seems a bit dodgy to me. Also, the baud rate change is unrelated to this fix.

@X-Ryl669 (Contributor, Author) commented Aug 9, 2016

It does for Apache, because Apache does not chunk when the content length is known beforehand (this is not the case with other servers, however).

Because the compressed size is not known beforehand, it's not very clean to set a fixed Content-Length there. With the upcoming patches that remove the "fixed" Content-Length on the server side, this change is required to avoid chunking.
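
For context, a rough illustration of the two framings involved (made-up headers and sizes, not an actual capture from this server): with HTTP/1.1 and no Content-Length the body arrives in hex-sized chunks, while with HTTP/1.0 it simply ends when the connection closes.

```
# HTTP/1.1 request, length unknown (compressed on the fly) -> chunked framing:
HTTP/1.1 200 OK
Transfer-Encoding: chunked

1a0
<first 0x1a0 body bytes>
...
0

# Same request sent as HTTP/1.0 -> no chunking allowed, body ends on close:
HTTP/1.0 200 OK
Connection: close

<body bytes until the server closes the connection>
```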

Right, without the baud rate change I can't debug this, but I can split it into two PRs.
