File size exceeds the limitation #1812

Closed
codesmaker opened this issue Jun 30, 2022 · 6 comments
Labels
waiting feedback (issues that are waiting for an answer from the issue author)

Comments

@codesmaker

codesmaker commented Jun 30, 2022

Hi,

I have OnlyOffice Document Server integrated with Nextcloud. When I try to open big document files in Nextcloud (~70 MB), I get the following error:
[screenshot of the error message]

Here is what I have in the 'default.json' file:

"FileConverter": {
"converter": {
"maxDownloadBytes": 314572800,
"downloadTimeout": {
"connectionAndInactivity": "2m",
"wholeCycle": "2m"
},
"downloadAttemptMaxCount": 3,
"downloadAttemptDelay": 1000,
"maxprocesscount": 1,
"fontDir": "null",
"presentationThemesDir": "null",
"x2tPath": "null",
"docbuilderPath": "null",
"docbuilderAllFontsPath": "null",
"docbuilderCoreFontsPath": "",
"args": "",
"spawnOptions": {},
"errorfiles": "",
"streamWriterBufferSize": 8388608,
"maxRedeliveredCount": 2,
"inputLimits": [
{
"type": "docx;dotx;docm;dotm",
"zip": {
"uncompressed": "100MB",
"template": ".xml"
}
},
{
"type": "xlsx;xltx;xlsm;xltm",
"zip": {
"uncompressed": "1000MB",
"template": "
.xml"
}
},
{
"type": "pptx;ppsx;potx;pptm;ppsm;potm",
"zip": {
"uncompressed": "100MB",
"template": "*.xml"
}
}
]
}
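
For a quick sanity check, these byte values can be converted to human-readable sizes with numfmt (assuming GNU coreutils is installed); 314572800 bytes is 300 MiB, comfortably above a ~70 MB file:

numfmt --to=iec 314572800   # maxDownloadBytes       -> 300M
numfmt --to=iec 8388608     # streamWriterBufferSize -> 8.0M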

I have no errors in the '/var/log/onlyoffice/documentserver/converter/out.log' or '/var/log/onlyoffice/documentserver/converter/err.log'. Any idea what's causing it?

The OnlyOffice Document Server version is 7.1.1-23, running on Ubuntu 20.04.

Thanks a lot for your help.

@ShockwaveNN
Contributor

I think you need to change the nginx configuration.

See here
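
For instance, a quick way to check which client_max_body_size value nginx actually applies (a rough sketch; nginx -T dumps the full effective configuration) is:

nginx -T 2>/dev/null | grep -n client_max_body_size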

@codesmaker
Author

codesmaker commented Jun 30, 2022

Hi @ShockwaveNN,

I followed what you suggested, but unfortunately I'm still getting the same error message. Here is what I did:

#!/usr/bin/env bash

# Raise the 104857600-byte (100 MB) limit in the bundled example app to ~10 GB
sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver-example/production-linux.json

# Insert client_max_body_size 1000M; into the example's nginx include (at line 9) and the main nginx.conf (at line 16)
sed -i '9iclient_max_body_size 1000M;' /etc/onlyoffice/documentserver-example/nginx/includes/ds-example.conf
sed -i '16iclient_max_body_size 1000M;' /etc/nginx/nginx.conf

# Raise the Document Server limits in default.json: 104857600-byte (100 MB) values to ~10 GB,
# and the 50MB/300MB uncompressed input limits to 5000MB/3000MB
sed -i -e 's/104857600/10485760000/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/50MB/5000MB/g' /etc/onlyoffice/documentserver/default.json
sed -i -e 's/300MB/3000MB/g' /etc/onlyoffice/documentserver/default.json

# Apply the changes
service nginx restart
supervisorctl restart all

Here is my 'default.json' file after the changes:

"FileConverter": {
                "converter": {
                        "maxDownloadBytes": 10485760000,
                        "downloadTimeout": {
                                "connectionAndInactivity": "2m",
                                "wholeCycle": "2m"
                        },
                        "downloadAttemptMaxCount": 3,
                        "downloadAttemptDelay": 1000,
                        "maxprocesscount": 1,
                        "fontDir": "null",
                        "presentationThemesDir": "null",
                        "x2tPath": "null",
                        "docbuilderPath": "null",
                        "docbuilderAllFontsPath": "null",
                        "docbuilderCoreFontsPath": "",
                        "args": "",
                        "spawnOptions": {},
                        "errorfiles": "",
                        "streamWriterBufferSize": 8388608,
                        "maxRedeliveredCount": 2,
                        "inputLimits": [
                                {
                                "type": "docx;dotx;docm;dotm",
                                "zip": {
                                        "uncompressed": "5000MB",
                                        "template": "*.xml"
                                }
                                },
                                {
                                "type": "xlsx;xltx;xlsm;xltm",
                                "zip": {
                                        "uncompressed": "3000MB",
                                        "template": "*.xml"
                                }
                                },
                                {
                                "type": "pptx;ppsx;potx;pptm;ppsm;potm",
                                "zip": {
                                        "uncompressed": "5000MB",
                                        "template": "*.xml"
                                }
                                }
                        ]
                }
        }

I also added "maxFileSize" and set it to 10485760000, as you can see below, but it made no difference.

"services": {
                "CoAuthoring": {
                        "server": {
                                "maxFileSize": 10485760000,
                                "port": 8000,
                                "workerpercpu": 1,
                                "mode": "development",
                                "limits_tempfile_upload": 10485760000,
                                "limits_image_size": 26214400,
                                "limits_image_download_timeout": {
                                        "connectionAndInactivity": "2m",
                                        "wholeCycle": "2m"
                                },
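
One more thing worth checking (an assumption on my side about how the configuration files are layered): if a /etc/onlyoffice/documentserver/local.json exists, its values override default.json, so changes made only in default.json can be masked. A quick way to compare the relevant keys in both files:

grep -nE 'maxDownloadBytes|maxFileSize|uncompressed' \
    /etc/onlyoffice/documentserver/default.json \
    /etc/onlyoffice/documentserver/local.json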

Thanks again for your help.

@ShockwaveNN
Contributor

Could you share a file you've tried to open?

@codesmaker
Author

I'm afraid that I can't because it contains some personal data. I'll try to find a big generic Excel file.

@ShockwaveNN
Contributor

ShockwaveNN commented Jul 1, 2022

Also, my colleagues tell me that if a file failed to open because of the limit, you should try re-uploading it after changing the limits.

Or you could check the Document Server integrated example (enabled at http://docserver/welcome) and see whether the file opens fine there. If it does, it's not a Document Server fault.
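
As a minimal check (docserver below is just the placeholder host name from the sentence above), you can confirm the example page is reachable before retrying the file there:

curl -I http://docserver/welcome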

ShockwaveNN added the waiting feedback label on Jul 6, 2022
@ShockwaveNN
Contributor

This issue was closed due to lack of response.
