The file size exceeds the limitation set for your server. #1674
Hi, you should change the limit in several places: ONLYOFFICE/Docker-DocumentServer#354 (comment). I think this would help you.
Well, I backed up my …
Could you share a link to your file? I think some convert limits should be increased too, but I need your file to be sure.
I may have found the problem. The server is behind a proxy. I have set the limit to 1G for it, but it wasn't in the …
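For reference, the reverse proxy in front of DocumentServer needs its own `client_max_body_size`, not just the one inside the container; a hypothetical example (the vhost path and server name below are illustrative, not from this thread):

```bash
# Add a matching body-size limit to the reverse proxy's vhost
# (file path and server_name are examples only)
sed -i '/server_name office.example.com;/a \    client_max_body_size 1000M;' \
  /etc/nginx/sites-available/onlyoffice-proxy
nginx -t && service nginx reload
```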
I think it's a really bad idea to try to convert a 100MB file on an RPi. Those files require a lot of processing power even on a desktop PC, so a lot of trouble may arise. If you share the file, I can check it on regular x86, but sorry - we officially do not support the ARM version, so you're on your own if the x86 version works and ARM does not.
I'm totally aware of that. The problem is that I don't know the config files very well, so I don't know what to change. I'll post the file once I have time to do so. Also, as the server is running on an external HDD and I'm running heavy operations on that poor HDD, ONLYOFFICE has trouble loading. I'll see once the packages are done installing. PS: I should really get an SSD...
Here is the file: https://we.tl/t-O3QESks2zz. I'm running it in a Docker container right now. I tried the script again, without any luck. The script doesn't change …
Ehm... that thing just opened. I ran the script, modified …
So this worked for me:

```bash
#!/usr/bin/env bash

# Raise the 100MB (104857600 bytes) limit in the example app to ~10GB
sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver-example/production-linux.json

# Allow large request bodies in the example's nginx include and the main nginx config
sed -i '9iclient_max_body_size 1000M;' /etc/onlyoffice/documentserver-example/nginx/includes/ds-example.conf
sed -i '16iclient_max_body_size 1000M;' /etc/nginx/nginx.conf

# Raise the DocumentServer limits (the byte value and the 50MB/300MB string values)
sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/50MB/5000MB/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/300MB/3000MB/g' /etc/onlyoffice/documentserver/default.json

# Raise the request body limit in DocumentServer's own nginx include
sed -i 's/^client_max_body_size 100m;$/client_max_body_size 1000m;/' /etc/onlyoffice/documentserver/nginx/includes/ds-common.conf

# Apply the changes
service nginx restart
supervisorctl restart all
```

PS: The performance is surprisingly good.

PPS: But I have problems with setting up the secret key in Nextcloud. It refuses to download the test file on setup. I'll probably open another issue, as I cannot find any solutions for it.
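A quick way to verify the script took effect, using the paths from the script above:

```bash
# Count the rewritten byte limits and show the nginx body-size line
grep -c '10485760000' /etc/onlyoffice/documentserver/default.json
grep 'client_max_body_size' /etc/onlyoffice/documentserver/nginx/includes/ds-common.conf
```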
PS: The file is so big that it crashes my phone's browser, and sometimes it even restarts the UI lol.
Yeah, it may be. If the file is too big, I don't think we have a way to render it with less memory usage. I see this file is not very complicated but has huge JPG images inside it; there's no point in using 5MB JPG images in a presentation...
Yup, but I can't tell that to my teacher XD. They don't really understand PCs. It's for my maturita exam; they print these images on paper and probably just didn't care to downscale them. Anyway, it probably isn't that big of a deal, as I can open it in the mobile app just fine, and after reopening Firefox on the phone it struggles, but it doesn't crash. I may not have enough RAM left, or Firefox may have been restricted by Android, but it's working. Also, I've modified the script you mentioned so it will change everything - you can just run it and it works. I'm thinking of adding it to my repo and making it so that you can edit some variables in the docker-compose file. I don't know docker-compose and Dockerfiles that well, but I'd figure it out. What do you think?
I think creating some kind of …
Yeah, I'm thinking of something like this in the …

This would allow you to set the limits individually, like in … I think this would be simple enough, and setting it on container creation should be enough IMO. The script would create a lock file so it wouldn't run again and mess up the configuration files; to change the limits you'd recreate the container.
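The snippet itself didn't survive in this thread, but as a purely hypothetical sketch of the lock-file idea described above (the variable name, paths, and lock-file location are illustrative, not from any real image):

```bash
#!/usr/bin/env bash
# Hypothetical entrypoint fragment: apply an env-configurable limit exactly once
LIMIT_BYTES="${FILE_SIZE_LIMIT_BYTES:-104857600}"
LOCK=/var/lib/onlyoffice/.limits-applied

if [ ! -f "$LOCK" ]; then
  # Rewrite the default 100MB byte limit, as in the script earlier in the thread
  sed -i -e "s/104857600/${LIMIT_BYTES}/g" /etc/onlyoffice/documentserver/default.json
  # Leave a marker so a second run can't mangle the already-edited config
  touch "$LOCK"
fi
```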
I don't think there is a point in making the limits different for each file type. I think if you have a lot of huge pptx files, chances are the same teachers make huge docx files too, so one single (big) limit for all file types is good.
Good point. I just wanted to preserve the options you have for configuring. Even the script you sent the link to has one limit for ppt (500M), another for xml (300M), and another for docx (500M) - don't quote me on the numbers, but they were set differently. I honestly don't know the reasons for the different limits set in the project; if xml files compress better because they're only text, they could be 20% smaller than a same-sized compressed pptx, etc. I don't want to mess around with things I don't understand, so that was the mindset behind it. So we'd be left with something like:
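The example itself was lost from this comment; given the discussion above it was presumably a single shared limit, something like (names purely illustrative):

```bash
# One hypothetical limit for all file types, plus the matching nginx body limit
FILE_SIZE_LIMIT=5000MB
CLIENT_MAX_BODY_SIZE=1000M
```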
Hi jiri, …
@suishaojian You don't have to restart the container. Did you do …
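The end of that sentence is lost here, but it presumably refers to executing the script inside the running container; a hypothetical example (container name and script path are illustrative):

```bash
# Copy the limits script into the running container and run it there,
# instead of recreating the container
docker cp increase-limits.sh onlyoffice-document-server:/tmp/
docker exec -it onlyoffice-document-server bash /tmp/increase-limits.sh
```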
No... it doesn't work. I did not run the docker-compose command; I executed each line separately, like below: … I want to make it clear that the versions are as below: …
I see that lines 46 and 36 are the same. I suggest recreating the container, as this script can only be run once (it will mess things up if it's run twice), and pasting it into a file as I wrote above. If this doesn't work, I'll try to find another solution.
@jiriks74 Sorry to bother you... could you please offer any ideas? I'm stuck on this now... Thank you so much.
@suishaojian Could you also show me your …
I'm sorry, but I tried another, larger file and found it worked... which means it's probably my big old PowerPoint's problem. Let me check the file again. Thank you for your assistance. So far, it's not a configuration problem.
What are the file types?
PPT. The strange thing is that after I re-edit the PPT and upload it, it can be edited online normally. I am sorry that it is not convenient to send it to you, because it is an internal company document. I can only assume that there is a problem with this file.
I understand it's not sendable; you don't have to be sorry about that. From what I understand, the files can be/are compressed, so maybe it was too large after decompressing? But the limit is at 5GB for that. Or it could just be some error that appeared when saving the file (a checksum mismatch, maybe?) that got solved by opening it in the desktop app and saving it again. Why do you still use old formats, btw? The best compatibility is with the … Take everything with a grain of salt; I don't really understand the server or the format - these are my assumptions from what I know about them.
Ok, I understand. Actually, I didn't write it clearly - I did use PPTX files. The file in question is 40MB in size. So far only that file has problems; I can edit files over 50MB without any problems. I didn't actually see any problems in the logs. For now, I can only continue to monitor whether other large files have similar problems.
This script allows me to open odp presentations from my teacher that are over 100MB (and my Raspberry Pi is able to run that thing pretty well LOL), so you'll probably be alright for twice the size.
That would be great!!! I'll go find a file that's over 100M and try it out.
If you'd like, I'm maintaining a Docker image that has the option for larger files built in; it can be set by a variable in …
@ShockwaveNN …
@jiriks74 Looks good to me, I really hope someone finds this useful, thanks. P.S. I don't think we'll change our default limits right now, because they were chosen for the best ratio of performance to file-opening support, so bigger limits may cause more trouble for some of our customers; it's up to admins to change those limits for specific user scenarios.
@ShockwaveNN The point of this is that you now don't have to run the script inside the container every time you recreate it, which is pretty annoying (if you do a task repetitively all the time, just make a script, right?). I'm thinking of also adding a variable to regenerate all presentation themes. I'd have to think about that, as it would make the startup process much longer; I'd probably leave that to the user. I'd block the port or shut down the server when generation starts so there would be no data loss.
Sorry if you got me wrong. Yeah, I totally agree with you; I've just clarified that our defaults will stay the same in the future.
There is one already. If it's false, no fonts or presentation themes are generated.
Ok, but I still have to manually trigger presentation theme generation when I recreate the container. Is there something wrong?
You shouldn't need to generate presentation themes each time if GENERATE_FONTS is false and you didn't change the list of themes, so I need some clarification on the issue.
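Assuming this refers to the image's GENERATE_FONTS variable, it can be set at container creation, e.g.:

```bash
# Skip font/theme generation on startup (behavior may differ between image versions)
docker run -d -e GENERATE_FONTS=false onlyoffice/documentserver
```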
Hello, it seems that there was a change that broke the script for setting larger file limits. The script from above:

```bash
#!/usr/bin/env bash
sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver-example/production-linux.json
sed -i '9iclient_max_body_size 1000M;' /etc/onlyoffice/documentserver-example/nginx/includes/ds-example.conf
sed -i '16iclient_max_body_size 1000M;' /etc/nginx/nginx.conf
sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/50MB/5000MB/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/300MB/3000MB/g' /etc/onlyoffice/documentserver/default.json
sed -i 's/^client_max_body_size 100m;$/client_max_body_size 1000m;/' /etc/onlyoffice/documentserver/nginx/includes/ds-common.conf
service nginx restart
supervisorctl restart all
```

I tried to run it multiple times and in a new container. The values are changed, so there's probably an extra variable or something. I was looking through my proxy config (the DocumentServer is behind an nginx proxy) and the body size was set to … It behaves as if the default limits are still set - presentations roughly over 60-70MB won't open.
Nevermind, it was a proxy timeout, as the file took too long to download. It would be useful if the error message reflected that, rather than talking about file limits. @ShockwaveNN
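For anyone hitting the same symptom, raising the proxy timeouts on the reverse proxy should help; a hypothetical example (the file path and values are illustrative):

```bash
# Raise nginx proxy timeouts so large files have time to transfer
cat >> /etc/nginx/conf.d/onlyoffice-timeouts.conf <<'EOF'
proxy_connect_timeout 600s;
proxy_send_timeout    600s;
proxy_read_timeout    600s;
EOF
nginx -t && service nginx reload
```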
After upgrading to 8.0.1 I get this issue again. I've tried running the file-limit increase script in this comment again, and it no longer works.
What is the current behavior?
Opening an odp file in Nextcloud that's bigger than 100MB fails, even if I set the file limits in `default.json` and `ds-common.conf`.
If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem.
Open an odp file greater than 100MB (it's the same with pptx)
What is the expected behavior?
File opens
DocumentServer version:
7.0.1
Operating System:
Debian Bullseye
Browser version:
Firefox 97.0.2