
can not backup data to s3 #1065

Open
saidkind opened this issue Dec 18, 2024 · 13 comments

Comments

@saidkind commented Dec 18, 2024

Here is the error info:

2024-12-17 22:05:00.149 ERR pkg/backup/upload.go:561 > UploadCompressedStream return error: context canceled
2024-12-17 22:05:00.149 ERR pkg/backup/upload.go:561 > UploadCompressedStream return error: context canceled
2024-12-17 22:05:00.149 ERR pkg/backup/upload.go:561 > UploadCompressedStream return error: context canceled
2024-12-17 22:05:00.149 ERR pkg/backup/upload.go:561 > UploadCompressedStream return error: context canceled
2024-12-17 22:05:00.149 ERR pkg/backup/upload.go:561 > UploadCompressedStream return error: context canceled
2024-12-17 22:05:00.149 ERR pkg/backup/upload.go:561 > UploadCompressedStream return error: context canceled
2024-12-17 22:05:00.149 ERR pkg/backup/upload.go:561 > UploadCompressedStream return error: context canceled
2024-12-17 22:05:00.150 ERR pkg/backup/upload.go:561 > UploadCompressedStream return error: context canceled
2024-12-17 22:05:00.150 INF pkg/clickhouse/clickhouse.go:327 > clickhouse connection closed
2024-12-17 22:05:00.150 FTL cmd/clickhouse-backup/main.go:668 > error="one of upload table go-routine return error: can't upload: operation error S3: PutObject, https response error StatusCode: 400, RequestID: 1811FBDD150B7034, HostID: 03f6d7ba09b0531a178059659f12e65ab6a75adddf2f548b1f37624d55d95fba, api error IncompleteBody: You did not provide the number of bytes specified by the Content-Length HTTP header."
Good bye!

Here is my config.yml

general:
    remote_storage: s3
    backups_to_keep_local: 7
    backups_to_keep_remote: 31
clickhouse:
    username: default
    password: ""
    host: localhost
    port: 19000
    timeout: 5h
s3:
    access_key: "*****"
    secret_key: "*****"
    bucket: "clickhouse"
    max_parts_count: 400000
    endpoint: "http://localhost:9000"
    part_size: 5242880
    allow_multipart_download: true

There is no problem executing clickhouse-backup create, but when I run clickhouse-backup create_remote or clickhouse-backup upload, it uploads part of the backup files and then, after a few minutes, the error appears.
It seems like an HTTP problem, but I couldn't find any config option about HTTP.

@Slach (Collaborator) commented Dec 18, 2024

remove

s3:
  max_parts_count: 400000
  part_size: 5242880

these options have good default values
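For illustration, the s3 section would then look roughly like this (a minimal sketch derived from the config posted above, with only those two keys removed; credentials, bucket, and endpoint are the original placeholders):

s3:
    access_key: "*****"
    secret_key: "*****"
    bucket: "clickhouse"
    endpoint: "http://localhost:9000"
    allow_multipart_download: true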

@Slach closed this as completed Dec 18, 2024
@saidkind (Author) commented Dec 18, 2024

@Slach thanks for your help, but the same error came out after I removed those two options.

@Slach reopened this Dec 18, 2024
@Slach (Collaborator) commented Dec 18, 2024

Could you share the full log for the following command?

LOG_LEVEL=debug S3_DEBUG=1 clickhouse-backup upload <your-backup-name>

@Slach (Collaborator) commented Dec 18, 2024

hmm, interesting
could you share

LOG_LEVEL=debug S3_DEBUG=1 UPLOAD_CONCURRENCY=1 S3_CONCURRENCY=1 clickhouse-backup upload <your-backup-name>

I need to figure out which request exactly failed.
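
For reference, those two environment overrides appear to correspond to config keys, so the same single-threaded upload can be expressed in config.yml (a sketch, assuming the standard clickhouse-backup env-to-config mapping; values match the command above):

general:
    upload_concurrency: 1   # equivalent of UPLOAD_CONCURRENCY=1
s3:
    concurrency: 1          # equivalent of S3_CONCURRENCY=1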

@saidkind (Author) commented Dec 18, 2024

I have been running this command for about an hour; it runs slowly. I will reply to you tomorrow.
Thanks for your help.

@saidkind (Author) commented:

> hmm, interesting
> could you share
>
> LOG_LEVEL=debug S3_DEBUG=1 UPLOAD_CONCURRENCY=1 S3_CONCURRENCY=1 clickhouse-backup upload <your-backup-name>
>
> I need to figure out which request exactly failed.

It succeeded, but I don't know why.
I will run this command again and see what happens.
Here is the log:
backup.log

@Slach (Collaborator) commented Dec 19, 2024

maybe data in your source table was changed

@saidkind (Author) commented:

It succeeded again.

> maybe data in your source table was changed

But I just uploaded the backup files; there is no problem running 'clickhouse-backup create'.
So when I upload the files, will it check the data once again?

@Slach (Collaborator) commented Dec 23, 2024

Your original error happened during upload

2024-12-17 22:05:00.150 FTL cmd/clickhouse-backup/main.go:668 > error="one of upload table go-routine return error: can't upload: operation error S3: PutObject, https response error StatusCode: 400, RequestID: 1811FBDD150B7034, HostID: 03f6d7ba09b0531a178059659f12e65ab6a75adddf2f548b1f37624d55d95fba, api error IncompleteBody: You did not provide the number of bytes specified by the Content-Length HTTP header."

so you need to reproduce it and provide the full logs so we can figure it out
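
One way to capture the full debug log into a file for sharing (a plain shell sketch; <your-backup-name> is the placeholder from the earlier commands, and the log file name is arbitrary):

LOG_LEVEL=debug S3_DEBUG=1 clickhouse-backup upload <your-backup-name> 2>&1 | tee upload-debug.log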

@Slach (Collaborator) commented Dec 31, 2024

any news from your side?

@saidkind (Author) commented Dec 31, 2024

@Slach I can run this command successfully:

LOG_LEVEL=debug S3_DEBUG=1 UPLOAD_CONCURRENCY=1 S3_CONCURRENCY=1 clickhouse-backup upload <your-backup-name>

but I cannot run

clickhouse-backup create_remote

or

clickhouse-backup upload

It started happening after I moved my database to a new server.

For now I am running the first command to back up my data.

If you need more information, just tell me what you want me to do.

@Slach (Collaborator) commented Jan 1, 2025

Which error message do you see?
Please provide full logs.

@Slach (Collaborator) commented Jan 8, 2025

any news from your side?
