The file size exceeds the limitation set for your server. #1674

Open · jiriks74 opened this issue Mar 9, 2022 · 43 comments
jiriks74 commented Mar 9, 2022

What is the current behavior?
Opening an .odp file larger than 100 MB in Nextcloud fails, even after I set the file size limits in default.json and ds-common.conf.

If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem.
Open an .odp file larger than 100 MB (the same happens with .pptx).

What is the expected behavior?
File opens

DocumentServer version:
7.0.1

Operating System:
Debian Bullseye

Browser version:
Firefox 97.0.2

@ShockwaveNN (Contributor)

Hi, you should change the limit in several places:

ONLYOFFICE/Docker-DocumentServer#354 (comment)

I think this would help you


jiriks74 commented Mar 9, 2022

@ShockwaveNN

> Hi, you should change the limit in several places:
>
> ONLYOFFICE/Docker-DocumentServer#354 (comment)
>
> I think this would help you

Well, I backed up my default.json and copied over the one from documentserver-example (I had made modifications, so the script wouldn't have worked). I ran the script, and then nginx refused to start: the script had put client_max_body_size in a completely wrong place in nginx.conf. I fixed that and it seems to be running now, but I still get the max file size error...

@ShockwaveNN (Contributor)

Could you share a link to your file? I think some convert limits need to be increased too, but I need your file to be sure.


jiriks74 commented Mar 9, 2022

> Could you share a link to your file? I think some convert limits need to be increased too, but I need your file to be sure.

I may have found the problem. The server is behind a proxy. I had set the limit to 1G for it, but it wasn't in the location / directive. I've changed it and it looks like it works, but I can't test right now as I'm building the Docker image for the RPi and it's taking all the server's resources. I'll update you once I know the results.
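
For anyone hitting the same wall: the directive has to be in (or inherited by) the location block that actually proxies to the Document Server, not just sit elsewhere in the proxy config. A minimal sketch of the reverse-proxy side (hostname and upstream address are placeholders):

server {
    listen 443 ssl;
    server_name office.example.com;          # placeholder

    location / {
        client_max_body_size 1000M;          # must cover the proxied location
        proxy_pass http://127.0.0.1:8080;    # Document Server upstream, placeholder
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}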

@ShockwaveNN (Contributor)

I think it's a really bad idea to try to convert a 100 MB file on an RPi.

Those files require a lot of processing power even on a desktop PC, so a lot of trouble may arise.

If you share the file I can check it on regular x86, but sorry, we officially do not support the ARM version, so you're on your own if the x86 version works and ARM does not.


jiriks74 commented Mar 9, 2022

> I think it's a really bad idea to try to convert a 100 MB file on an RPi.
>
> Those files require a lot of processing power even on a desktop PC, so a lot of trouble may arise.
>
> If you share the file I can check it on regular x86, but sorry, we officially do not support the ARM version, so you're on your own if the x86 version works and ARM does not.

I'm totally aware of that. The problem is that I don't know the config files very well, so I don't know what to change.

I'll post the file once I have time to do so

Also, since the server is running on an external HDD and I'm running heavy operations on that poor HDD, ONLYOFFICE has trouble loading. I'll see once the packages are done installing.

PS: I should really get an SSD...


jiriks74 commented Mar 9, 2022

> I think it's a really bad idea to try to convert a 100 MB file on an RPi.
>
> Those files require a lot of processing power even on a desktop PC, so a lot of trouble may arise.
>
> If you share the file I can check it on regular x86, but sorry, we officially do not support the ARM version, so you're on your own if the x86 version works and ARM does not.

Here is the file: https://we.tl/t-O3QESks2zz

I'm running it in a Docker container right now. I tried the script again, without success. The script doesn't change client_max_body_size in /etc/onlyoffice/documentserver/nginx/includes/ds-common.conf, but increasing that to 5000M doesn't help either. The convert limits are changed by the script: the FileConverter inputLimits entries (type / zip / uncompressed) are increased to around 500M, and maxDownloadBytes is increased as well.
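
For context, the limits the script touches live in default.json under FileConverter; on a 7.x install the section looks roughly like this (the exact type lists vary by version, but the 104857600, 50MB and 300MB defaults are the values the sed lines rewrite):

"FileConverter": {
  "converter": {
    "maxDownloadBytes": 104857600,
    "inputLimits": [
      { "type": "docx;dotx;docm;dotm", "zip": { "uncompressed": "50MB" } },
      { "type": "xlsx;xltx;xlsm;xltm", "zip": { "uncompressed": "300MB" } },
      { "type": "pptx;ppsx;potx;pptm;ppsm;potm", "zip": { "uncompressed": "50MB" } }
    ]
  }
}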


jiriks74 commented Mar 9, 2022

Ehm... the thing just opened. I ran the script, modified /etc/nginx/nginx.conf since client_max_body_size wasn't in the proper place, and increased the limit in /etc/onlyoffice/documentserver/nginx/includes/ds-common.conf. I'll recreate the container and post the steps I had to take, with a modified version of the script.


jiriks74 commented Mar 9, 2022

So this worked for me:

#!/usr/bin/env bash

# Raise the 100 MB (104857600-byte) limit to ~10 GB in the example app's config
sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver-example/production-linux.json

# Insert client_max_body_size into the example vhost and the main nginx config
sed -i '9iclient_max_body_size 1000M;' /etc/onlyoffice/documentserver-example/nginx/includes/ds-example.conf
sed -i '16iclient_max_body_size 1000M;' /etc/nginx/nginx.conf

# Raise maxDownloadBytes and the uncompressed input limits in default.json
sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/50MB/5000MB/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/300MB/3000MB/g' /etc/onlyoffice/documentserver/default.json

# Raise the body-size limit shared by the Document Server vhosts
sed -i 's/^client_max_body_size 100m;$/client_max_body_size 1000m;/' /etc/onlyoffice/documentserver/nginx/includes/ds-common.conf

service nginx restart
supervisorctl restart all
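
If nginx refuses to start afterwards, the two inserted lines are the usual culprit (they are written to fixed line numbers, so any layout change in nginx.conf can land them in the wrong context); nginx -t points at the offending file and line:

nginx -t   # validate the configuration; prints the file/line of any misplaced directive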

PS: The performance is surprisingly good.

PPS: But I have problems setting up the secret key in Nextcloud. It refuses to download the test file during setup. I'll probably open another issue, as I cannot find any solutions for it.

@jiriks74 jiriks74 closed this as completed Mar 9, 2022
@jiriks74 (Author)

PS: the file is so big that it crashes my phone's browser and sometimes it even restarts the UI lol

@ShockwaveNN (Contributor)

> PS: the file is so big that it crashes my phone's browser and sometimes it even restarts the UI lol

Yeah, it may be; if the file is too big I don't think we have a way to render it with less memory usage.

I see the file is not very complicated, but it has huge JPG images inside it; there's no point in using 5 MB JPG images in a presentation...

@jiriks74 (Author)

> > PS: the file is so big that it crashes my phone's browser and sometimes it even restarts the UI lol
>
> Yeah, it may be; if the file is too big I don't think we have a way to render it with less memory usage.
>
> I see the file is not very complicated, but it has huge JPG images inside it; there's no point in using 5 MB JPG images in a presentation...

Yup, but I can't tell that to my teacher XD. They don't really understand PCs. It's for my maturita exam; they print these images on paper and they probably just didn't care to downscale them. Anyway, it probably isn't that big of a deal, as I can open the file in the mobile app just fine, and after reopening Firefox on the phone it struggles but doesn't crash. I may not have enough RAM left, or Firefox may have been restricted by Android, but it's working.

Also, I've modified the script you mentioned so that it changes everything: you can just run it and it works.

I'm thinking of adding it to my repo and making it so that you can edit some variables in docker-compose.yml and the script runs on container creation. It would be a one-time thing, and to change the values again you'd have to recreate the container, but that's less of a hassle than getting into the container, opening all the configuration files, and figuring out what to change (or finding the script in the first place, getting into the container, creating a file, pasting the script in, and running it).

I don't know docker-compose and Dockerfiles that well, but I'd figure it out.

What do you think?

@ShockwaveNN (Contributor)

I think creating some kind of all-in-one script is a good idea.
I'm really not sure whether it will have many users, but since you're in school right now, I think it will be great experience for you and for your future career )


jiriks74 commented Mar 10, 2022

> I think creating some kind of all-in-one script is a good idea.
> I'm really not sure whether it will have many users, but since you're in school right now, I think it will be great experience for you and for your future career )

Yeah, I'm thinking of something like this in the docker-compose.yml

SetLargerLimits = true
MaxSizeXML = xM
MaxSizePPT = yM
MaxSizeDOC = zM
MaxDownloadSize = (maxDownloadBytes in default.json, if I'm correct)
MaxDownloadSizeNGINX = (probably equal to MaxDownloadSize, so it wouldn't be needed, but I'd have to look at that)

This would allow you to set the limits individually, like in default.json. nginx would probably be set to the same limit as maxDownloadBytes in default.json: a file larger than the default.json limit would get blocked by ONLYOFFICE anyway, so there's no need for larger limits in nginx.

I think this would be simple enough, and setting it on container creation should be enough IMO. The script would create a lock file so it wouldn't run again and mess up the configuration files; to change the limits, you'd recreate the container.
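
A minimal sketch of how that could look in docker-compose.yml; the variable names are the ones proposed above and are hypothetical, not options the official image understands:

services:
  onlyoffice:
    image: onlyoffice/documentserver
    environment:
      - SET_LARGER_LIMITS=true     # hypothetical: run the limit script once on creation
      - MAX_DOWNLOAD_SIZE=1000M    # hypothetical: would map to maxDownloadBytes in default.json
    ports:
      - "8080:80"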

@ShockwaveNN (Contributor)

I don't think there's a point in making the limits different for each file type: if you have a lot of huge pptx files, chances are the same teachers make huge docx files too, so one shared (big) limit for all file types is good.

@jiriks74 (Author)

> I don't think there's a point in making the limits different for each file type: if you have a lot of huge pptx files, chances are the same teachers make huge docx files too, so one shared (big) limit for all file types is good.

Good point. I just wanted to preserve the configuration options you have. Even the script you linked sets one limit for ppt (500M), another for xml (300M), and another for docx (500M) - don't quote me on the numbers, but they were set differently.

I honestly don't know the reasons for the different limits set in the project; if XML files compress better because they're only text, they could be 20% smaller than a pptx of the same compressed size, etc. I don't want to mess around with things I don't understand, so that was the mindset behind it.

So we'd be left with something like:

MaxDownloadSize = x
MaxUncompressedSize = y 

@suishaojian

> So this worked for me:
>
> [the script from the comment above, quoted verbatim]
>
> PS: The performance is surprisingly good.
>
> PPS: But I have problems setting up the secret key in Nextcloud. It refuses to download the test file during setup. I'll probably open another issue, as I cannot find any solutions for it.

Hi jiri,
I ran the script just as you posted to change the configs, then restarted the container, but it doesn't work; I still get the limit error. Should I do something else?

@jiriks74 (Author)

@suishaojian
You have to do docker exec -it onlyoffice /bin/bash, then something like nano /app/ds/largeFiles.sh, paste the script in, and run sh /app/ds/largeFiles.sh. That's it; no restart needed.

You don't have to restart the container.

Did you do docker-compose down and docker-compose up -d? That removes all the changes you made in the container, leaving only the data in folders set up as volumes.
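
The same thing without an interactive shell, assuming the script is saved locally as largeFiles.sh and the container is named onlyoffice:

docker cp largeFiles.sh onlyoffice:/tmp/largeFiles.sh
docker exec onlyoffice bash /tmp/largeFiles.sh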

@suishaojian

> @suishaojian You have to do docker exec -it onlyoffice /bin/bash, then something like nano /app/ds/largeFiles.sh, paste the script in, and run sh /app/ds/largeFiles.sh. That's it; no restart needed.
>
> You don't have to restart the container.
>
> Did you do docker-compose down and docker-compose up -d? That removes all the changes you made in the container, leaving only the data in folders set up as volumes.

No, it doesn't work. I did not run the docker-compose commands; I executed each line separately, as below:
[screenshot]
and then restarted nginx and supervisorctl.

To make it clear, the versions are as below:

[screenshot]

@jiriks74 (Author)

I see that lines 46 and 36 are the same. I suggest recreating the container, since this script can only be run once (it will mess things up if run twice), and pasting the script into a file as I wrote above. If that doesn't work, I'll try to find another solution.
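
A sketch of how the script could refuse a second run, along the lines of the lock-file idea discussed earlier (the lock path is arbitrary):

#!/usr/bin/env bash
LOCK=/etc/onlyoffice/.larger-limits-applied
if [ -f "$LOCK" ]; then
    echo "Limits already raised once; refusing to run again." >&2
    exit 0
fi
# ... the sed commands from the script above ...
touch "$LOCK"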

@suishaojian

> I see that lines 46 and 36 are the same. I suggest recreating the container, since this script can only be run once (it will mess things up if run twice), and pasting the script into a file as I wrote above. If that doesn't work, I'll try to find another solution.

OK. I did what you told me to do: created a new container and followed your shell script exactly, but unfortunately I still got an error.
[screenshot]
[screenshot]

@suishaojian

> I see that lines 46 and 36 are the same. I suggest recreating the container, since this script can only be run once (it will mess things up if run twice), and pasting the script into a file as I wrote above. If that doesn't work, I'll try to find another solution.

> OK. I did what you told me to do: created a new container and followed your shell script exactly, but unfortunately I still got an error. [screenshots]

@jiriks74 Sorry to bother you... could you please offer any ideas? I'm stuck on this now... Thank you so much.


jiriks74 commented Mar 17, 2022

@suishaojian
How large is the file? Would it be possible for you to send it to me so I could try it out?

Could you also show me your docker-compose.yml (without secrets of course)?

@suishaojian

> @suishaojian How large is the file? Would it be possible for you to send it to me so I could try it out?
>
> Could you also show me your docker-compose.yml (without secrets of course)?

I'm sorry, but I tried another, larger file and found that it worked... which means it's probably my big old PowerPoint's problem. Let me check the file again. Thank you for your assistance. So far it doesn't look like a configuration problem.

@jiriks74 (Author)

> > @suishaojian How large is the file? Would it be possible for you to send it to me so I could try it out?
> >
> > Could you also show me your docker-compose.yml (without secrets of course)?
>
> I'm sorry, but I tried another, larger file and found that it worked... which means it's probably my big old PowerPoint's problem. Let me check the file again. Thank you for your assistance. So far it doesn't look like a configuration problem.

What are the file types?

@suishaojian

> > @suishaojian How large is the file? Would it be possible for you to send it to me so I could try it out?
> > Could you also show me your docker-compose.yml (without secrets of course)?
>
> > I'm sorry, but I tried another, larger file and found that it worked... which means it's probably my big old PowerPoint's problem. Let me check the file again. Thank you for your assistance. So far it doesn't look like a configuration problem.
>
> What are the file types?

PPT. The strange thing is that after I re-edit the PPT and upload it, it can be edited online normally. I'm sorry, but it's not convenient to send it to you, as it's an internal company document. I can only assume there's a problem with that particular file.

@jiriks74 (Author)

I understand it's not sendable; you don't have to be sorry about that.

From what I understand, the files are compressed, so maybe it was too large after decompressing? But the limit for that is at 5 GB.

Or it could just be some error that appeared when saving the file (a checksum mismatch, maybe?) that got solved by opening it in the desktop app and saving it again.

Why do you still use the old formats, btw? The best compatibility is with the .***x files.

Take everything with a grain of salt; I don't really understand the server or the format, these are my assumptions from what I know about them.

@suishaojian

> I understand it's not sendable; you don't have to be sorry about that.
>
> From what I understand, the files are compressed, so maybe it was too large after decompressing? But the limit for that is at 5 GB.
>
> Or it could just be some error that appeared when saving the file (a checksum mismatch, maybe?) that got solved by opening it in the desktop app and saving it again.
>
> Why do you still use the old formats, btw? The best compatibility is with the .***x files.
>
> Take everything with a grain of salt; I don't really understand the server or the format, these are my assumptions from what I know about them.

OK, I understand. Actually, I didn't write it down clearly: I did use PPTX files. The file in question is 40 MB. So far only that file has problems; I can edit files over 50 MB without any issues. I didn't actually see any problems in the logs. For now I can only keep monitoring whether other large files have similar problems.

@jiriks74 (Author)

@suishaojian

This script lets me open odp presentations from my teacher that are over 100 MB (and my Raspberry Pi runs the thing pretty well LOL), so you'll probably be all right at twice the size.

@suishaojian

> @suishaojian
>
> This script lets me open odp presentations from my teacher that are over 100 MB (and my Raspberry Pi runs the thing pretty well LOL), so you'll probably be all right at twice the size.

That would be great!!! I'll go find a file that's over 100M and try it out.

@suishaojian

Well, I've got another problem:
When I tried to open a historical version at a 512 KB/s download speed, I got the following problem and finally an error.
[screenshot]
[screenshot]

But I had no problem at a 1.25 MB/s download speed. From this I can tell that internet speed is a factor. But I was wondering if there's anything I didn't configure. Why can I edit a document online, but can't open a historical version at a very low internet speed?

I have not found any corresponding issues. If there are similar ones, please send me the link. Thank you very much.

@jiriks74 (Author)

If you'd like, I'm maintaining a Docker image that has the option for larger files built in; it can be enabled by a variable in docker-compose.yml. Everything can be found in my GitHub repository if you're interested in how it works.

@jiriks74 (Author)

@ShockwaveNN
I've successfully added the option for larger files to my Docker image and docker-compose.yml; I've made some modifications to run-document-server.sh to make it work, if you'd like to take a look. I'm unable to have the limit set by a variable right now; it's the one used in the script that was run from inside the container. Maybe I'll be able to mess around with variables in the strings, but I think it's unnecessary, as this max size shouldn't really ever be reached...

@ShockwaveNN (Contributor)

@jiriks74 Looks good to me; I really hope someone finds this useful, thanks.

P.S. I don't think we'll change our default limits right now: they were chosen as the best trade-off between performance and file-opening support, and bigger limits may cause more trouble for some of our customers. So it's up to admins to change those limits for specific user scenarios.

@jiriks74 (Author)

@ShockwaveNN
I didn't change the defaults, and I understand that you chose them for a reason. I just added a - LARGER_FILE_LIMITS='true' entry to docker-compose.yml (and a script, triggered by that variable, to run-document-server.sh) so that raising the limits isn't as hard. You just uncomment the variable, run docker-compose up, and you have file limits that you shouldn't hit (the ones we put into the simple script you had to run inside the container).

The point of this is that you no longer have to run the script inside the container every time you recreate it, which is pretty annoying (if you do a task repetitively all the time, just make a script, right?).
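
For reference, the compose side of this is just one environment entry; the image line below is a placeholder for my modified image, and LARGER_FILE_LIMITS is not supported by the official image:

services:
  onlyoffice:
    # image: <modified Document Server image>   # placeholder, see the repository mentioned above
    environment:
      - LARGER_FILE_LIMITS=true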

I'm also thinking of adding a variable to regenerate all presentation themes. I'd have to think about that, as it would make the startup process way longer; I'd probably leave it to the user. I'd block the port or shut down the server while it generates, so there would be no data loss.

@ShockwaveNN (Contributor)

Sorry if you got me wrong.

> The point of this is that you no longer have to run the script inside the container every time you recreate it, which is pretty annoying (if you do a task repetitively all the time, just make a script, right?).

Yeah, I totally agree with you; I've just clarified that our defaults will stay the same in the future.

> I'm also thinking of adding a variable to regenerate all presentation themes.

There is one already, GENERATE_FONTS - ignore the name )

If it's false, no fonts or presentation themes are generated.
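
So, going by the comment above, keeping theme generation enabled should just be a matter of not turning that variable off, e.g.:

services:
  onlyoffice:
    environment:
      - GENERATE_FONTS=true   # despite the name, also controls presentation theme generation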

@jiriks74 (Author)

> Sorry if you got me wrong.
>
> > The point of this is that you no longer have to run the script inside the container every time you recreate it, which is pretty annoying (if you do a task repetitively all the time, just make a script, right?).
>
> Yeah, I totally agree with you; I've just clarified that our defaults will stay the same in the future.
>
> > I'm also thinking of adding a variable to regenerate all presentation themes.
>
> There is one already, GENERATE_FONTS - ignore the name )
>
> If it's false, no fonts or presentation themes are generated.

OK, but I still have to manually trigger presentation theme generation when I recreate the container. Is there something wrong?

@ShockwaveNN (Contributor)

> OK, but I still have to manually trigger presentation theme generation when I recreate the container. Is there something wrong?

You shouldn't need to generate presentation themes each time if GENERATE_FONTS is false and you didn't change the list of themes, so I need some clarification on the issue.

@jiriks74 (Author)

Hello, it seems a change has broken the script that sets larger file limits:

The script from above
#!/usr/bin/env bash

sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver-example/production-linux.json

sed -i '9iclient_max_body_size 1000M;' /etc/onlyoffice/documentserver-example/nginx/includes/ds-example.conf
sed -i '16iclient_max_body_size 1000M;' /etc/nginx/nginx.conf

sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/50MB/5000MB/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/300MB/3000MB/g' /etc/onlyoffice/documentserver/default.json

sed -i 's/^client_max_body_size 100m;$/client_max_body_size 1000m;/' /etc/onlyoffice/documentserver/nginx/includes/ds-common.conf

service nginx restart
supervisorctl restart all

I tried running it multiple times, and in a new container. The values do get changed, so there's probably an extra variable or something. I looked through my proxy config (the Document Server is behind an nginx proxy) and the body size was set to 1000M there.

It behaves as if the default limits were still set: presentations over roughly 60-70 MB won't open.

@jiriks74 jiriks74 reopened this Mar 15, 2023
@jiriks74 (Author)

Never mind, it was a proxy timeout, as the file took too long to download. It would be useful if the error message reflected that rather than talking about file limits. @ShockwaveNN
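
For anyone else landing here: if the size limits are already raised and you still see this error, check the proxy timeouts too. A sketch with example values (the upstream address is a placeholder; nginx's default proxy_read_timeout is 60s):

location / {
    client_max_body_size 1000M;
    proxy_read_timeout   3600s;   # slow downloads of big files can exceed the 60s default
    proxy_send_timeout   3600s;
    proxy_pass http://127.0.0.1:8080;   # placeholder upstream
}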

@webdesign7

Hello, I am still having this issue! :(

I am trying to open a PDF of 2.4 GB. I managed to increase the limits, and I am not getting the size limit error,

so the file loads (shows the loader for a few minutes), then I get this error popup:

[Screenshot 2024-03-14 at 18 51 43]

I spun up an xlarge instance on AWS for this.

Any other ideas?

@jiriks74 (Author)

> Hello, I am still having this issue! :(
>
> I am trying to open a PDF of 2.4 GB. I managed to increase the limits, and I am not getting the size limit error,
>
> so the file loads (shows the loader for a few minutes), then I get this error popup:
>
> [Screenshot 2024-03-14 at 18 51 43]
>
> I spun up an xlarge instance on AWS for this.
>
> Any other ideas?

It is probably not the same issue, since you don't get the file limit error.

Check your server's resources: does it have enough RAM and CPU power to handle such a large file? If you're using something like a Raspberry Pi, it will not be enough.

I think it's in your best interest to open a separate issue, as this is most likely unrelated.


jiriks74 commented Apr 9, 2024

After upgrading to 8.0.1 I get this issue again. I've tried running the file limit increase script from this comment again, and it no longer works.
