Update to latest iev-data release #66
@skalee ping on this.
@ronaldtse Anytime, I guess. However, we have some pending discussion in concept-model on how to represent a few things, so it won't be complete. Also, we need to display multiple sources on the page (geolexica/geolexica-server#151), which is simple, but some clarification there would be welcome.
@skalee we have a demo on Feb 25 (your morning) and must get the full site deployed (on Feb 24). Can you please help make that work?
Oops, I overlooked this one. I can try if it's not too late. @ronaldtse?
Yes please do. Thanks!
Caveats:
Built a new concepts ZIP, now deploying the site. Hopefully it won't take too long.
The new problem is that a full IEV deploy takes more than 20 minutes. We'll have to figure something out for everyday use. A subset of concepts, perhaps.
@skalee is it deployed? It's not showing up. Do you know why it's slow? Is it the copying of files to S3?
It's still deploying. One hour and counting. Building the site took half an hour; copying to S3 took all the rest. I hope we can tweak... Oh, it just completed.
@skalee I noticed that the site builds in 16 minutes, but the upload takes more than 30. If we reduce the output (the quiet option), it will be much faster. Let me check if S3 has an option for uploading a compressed archive.
@ronaldtse I doubt it's due to logging. I mean, it may be a contributor, but I doubt it's the biggest one. We do these uploads in quite a weird way, with several passes: https://github.com/geolexica/geolexica-server/blob/master/lib/tasks/deploy.rake. Maybe this is the reason. As far as I understand, we do that because we want to set our content type headers via CLI arguments. AFAIR there are other ways to set them too.
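One of those "other ways" could be computing the Content-Type per file locally and uploading each object once with its header, instead of one sync pass per file class. A minimal sketch of the mapping part, with an illustrative extension table (not taken from deploy.rake):

```ruby
# Hypothetical extension-to-Content-Type table; the real site may need more
# entries. Anything not listed falls back to a generic binary type.
CONTENT_TYPES = {
  ".html" => "text/html; charset=utf-8",
  ".css"  => "text/css",
  ".js"   => "application/javascript",
  ".json" => "application/json",
}.freeze

def content_type_for(path)
  CONTENT_TYPES.fetch(File.extname(path), "application/octet-stream")
end

# A single-pass upload loop could then set this header per object, e.g. with
# the aws-sdk-s3 gem: bucket.object(key).upload_file(path, content_type: ...)
```

This would let the deploy touch each file exactly once rather than re-scanning the tree for every content type.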
https://aws.amazon.com/premiumsupport/knowledge-center/s3-upload-large-files/
I will enable S3 Transfer Acceleration on the bucket.
And this might work! https://www.genui.com/open-source/s3p-massively-parallel-s3-copying |
Welp, not possible: see Requirements for using Transfer Acceleration.
So increasing the number of parallel requests is the only way forward... |
Unfortunately, it looks like it's a tool for copying files between buckets, not for uploading local files to a bucket. On the other hand, some of its ideas will be very handy if we decide to craft our own tool using the AWS API.
The AWS article above is mainly focused on uploading large files, not large numbers of files.
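The "more parallel requests" idea could be sketched roughly like this; all names are illustrative, and the upload itself is stubbed out (a real version would call the AWS SDK, e.g. `Aws::S3::Object#upload_file`). For the stock AWS CLI, `aws configure set default.s3.max_concurrent_requests N` raises its internal parallelism without any custom code.

```ruby
require "thread"

# Rough sketch of a parallel uploader: N worker threads drain a queue of
# local files. Returns the list of files that were "uploaded".
def parallel_upload(files, workers: 8)
  queue = Queue.new
  files.each { |f| queue << f }
  done = Queue.new

  threads = Array.new(workers) do
    Thread.new do
      loop do
        file = begin
          queue.pop(true)   # non-blocking pop; raises ThreadError when empty
        rescue ThreadError
          break             # queue drained, worker exits
        end
        # ... real code would upload `file` to S3 here ...
        done << file
      end
    end
  end
  threads.each(&:join)
  Array.new(done.size) { done.pop }
end
```

Since each S3 PUT is mostly network-bound, even Ruby's threads (despite the GVL) should overlap the requests effectively.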
We can close this one I guess. Summing it all up:
See geolexica/geolexica-server#151.
See #68.
See https://github.com/glossarist/iev-demo-site/issues/69.
See geolexica/geolexica-server#161.
See geolexica/geolexica-server#162.
Oh, I'd better close it with a nice pull request so that we have a cleaner Git history.