h5repack issues including from static install #5290
(Update from the author) FWIW, I believe I have successfully compiled a static test-writer HDF5 client that does properly load and use my shared library plugin. So, I believe it is possible to get a statically compiled h5repack to behave as Quincey suggested it should. If you want more details, let me know. I'd be happy to jump on a Zoom or WebEx or Teams meeting too if useful. I am also able to successfully use the h5ls binary tool from a static-only build of HDF5, and it is able to read zfp-compressed data using my plugin. So, whatever is going on seems like it may be somewhat specific to h5repack.
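For anyone who wants to reproduce that experiment, here is a minimal sketch of such a static test writer. The filter id (32013) and cd_values are lifted from the h5repack invocations shown later in this issue; the file name, dataset name, and data are made up for illustration. Link it statically against libhdf5.a and point HDF5_PLUGIN_PATH at the plugin directory before running:

```c
#include "hdf5.h"
#include <stdio.h>

int main(void)
{
    hsize_t  dims[2]  = {64, 64};
    hsize_t  chunk[2] = {16, 16};
    /* cd_values copied from the UD=32013,0,4,... h5repack invocations below */
    unsigned cd_values[4] = {1, 0, 0, 1074528256};
    static double data[64][64];

    for (int i = 0; i < 64; i++)
        for (int j = 0; j < 64; j++)
            data[i][j] = 0.1 * i + 0.01 * j;

    hid_t fid  = H5Fcreate("test_zfp.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t sid  = H5Screate_simple(2, dims, NULL);
    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 2, chunk);

    /* Mandatory filter: the write below must fail if the plugin cannot be
     * located via HDF5_PLUGIN_PATH and loaded into this static executable. */
    if (H5Pset_filter(dcpl, 32013, H5Z_FLAG_MANDATORY, 4, cd_values) < 0) {
        fprintf(stderr, "H5Pset_filter failed\n");
        return 1;
    }

    hid_t did = H5Dcreate2(fid, "zfp_data", H5T_NATIVE_DOUBLE, sid,
                           H5P_DEFAULT, dcpl, H5P_DEFAULT);
    if (did < 0 || H5Dwrite(did, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL,
                            H5P_DEFAULT, data) < 0) {
        fprintf(stderr, "write through the zfp filter failed\n");
        return 1;
    }

    H5Dclose(did); H5Pclose(dcpl); H5Sclose(sid); H5Fclose(fid);
    return 0;
}
```

Because the filter is mandatory, H5Dwrite fails loudly when the plugin never loads, which makes this a useful contrast to the silent h5repack behavior reported below.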
I will link here to a relevant comment on the H5Z-ZFP repo... LLNL/H5Z-ZFP#137 (comment)
I've been exchanging emails with @lindstro about this. One thing he reminded me of is that in a static-only scenario, when an executable like h5repack is linked against libhdf5.a, the linker pulls in only those object files needed to resolve the symbols the executable actually references. Now, I suspect anything as sophisticated as h5repack winds up referencing most of the library anyway. But, if you do not, that means there is a non-zero chance that a compression plugin developer could use some symbol in HDF5 that is obscure enough that it never got linked into the executable, in which case the plugin cannot load. I suspect the chances of this happening are small. But, I don't like that we might just be getting lucky a lot of the time. I think that one of two possible choices may need to be made here for static-only builds: either forgo support for dynamically loaded plugins, or force all of libhdf5.a into the statically linked tools.
I did some analysis on which HDF5 symbols a statically linked h5repack actually pulls in from libhdf5.a.
So, using the information above, I constructed a bad case to see what would happen: I used a symbol in my plugin that h5repack itself never references, and the plugin fails to load.
But, that example works fine when I don't reference that symbol.
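To make the failure mode concrete, here is a sketch of the kind of bad case described (not the actual test code): a filter callback that references an HDF5 routine the host tool plausibly never uses. H5resize_memory is a real HDF5 function, used here purely as a stand-in for the "obscure" symbol; the callback itself is hypothetical.

```c
#include "hdf5.h"

/* Hypothetical filter callback.  A plugin .so is typically built with
 * unresolved HDF5 symbols (see the -undefined,dynamic_lookup link line
 * later in this issue), so those symbols must be satisfied by the host
 * process.  If a static h5repack never pulled in the object file that
 * defines H5resize_memory, loading or first use of this plugin fails
 * with an unresolved-symbol error; that is the "bad case" above. */
static size_t obscure_filter(unsigned flags, size_t cd_nelmts,
                             const unsigned cd_values[], size_t nbytes,
                             size_t *buf_size, void **buf)
{
    (void)flags; (void)cd_nelmts; (void)cd_values;

    /* Grow the buffer with an HDF5 allocation routine instead of realloc() */
    void *p = H5resize_memory(*buf, nbytes * 2);
    if (!p)
        return 0;           /* 0 signals failure to the filter pipeline */
    *buf      = p;
    *buf_size = nbytes * 2;
    return nbytes;          /* pass the data through unchanged */
}
```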
@markcmiller86 Just to chime in on forcing the whole library into the statically linked tools:
When I last looked at trying to do this, I remember that it would be simple to do for CMake builds and a bit harder to do for Autotools builds. Though if we get rid of Autotools…
For static-only builds, ChatGPT tells me there are simple ways to force an executable like h5repack to include all of libhdf5.a:

Yes, you can force the linker to load all object files from a static library (rather than only the ones needed to resolve undefined references):

1. Use -Wl,--whole-archive libhdf5.a -Wl,--no-whole-archive with GNU ld (gold and lld accept the same flags).
2. Use -Wl,-force_load,libhdf5.a with Apple's linker (or -all_load to apply to every archive).
3. Use /WHOLEARCHIVE:libhdf5.lib with MSVC's link.exe.
To summarize then: a statically linked h5repack contains only the HDF5 object files its own references pull in, so a plugin can depend on an HDF5 symbol that is simply absent from the executable, and when that happens the plugin silently fails to load. The apparent choices are to link the tools against the whole archive (as above) or to drop plugin support from static-only builds, and in either case h5repack should report the failure rather than staying silent.
(Created from a helpdesk issue)
I have been trying to use h5repack from HDF5 1.14.0 and 1.14.5, in particular with a static-only build of libhdf5.a and its associated binary tools.
I had the impression that a compression plugin should be able to work with a static-only install of HDF5 and, in particular, that h5repack should be able to use the plugin when requested via the -f argument. However, that is turning out not to be my experience.
Here is how I am building my plugin:

```
gcc -I. -I/usr/gapps/silo/zfp/1.0.1/blueos_3_ppc64le_ib_p9-gcc-8.3.1/include -I/usr/gapps/silo/hdf5/1.14.5/blueos_3_ppc64le_ib_p9_gcc.8.3.1/include -shared -fPIC -o plugin/libh5zzfp.so H5Zzfp.c -L/usr/gapps/silo/zfp/1.0.1/blueos_3_ppc64le_ib_p9-gcc-8.3.1/lib -lzfp -lsz -lm -lz -ldl -Wl,-Bsymbolic -Wl,-undefined,dynamic_lookup
```
I will identify a few issues I am seeing in bullets below…
* Specifying the -f argument with the mandatory filter flag AND naming a specific dataset to apply the filter to DOES NOT issue any error message or error code when it fails. Specifically, I tried…
```
env HDF5_PLUGIN_PATH=`pwd`/foo /g/g11/miller86/tmp/build-hdf5-1.14.5-blueos_3_ppc64le_ib_p9-gcc-8.3.1/my_install/bin/h5repack -f .silo/#15:UD=32013,0,4,1,0,0,1074528256 /usr/gapps/visit/data/wave0000.silo foo.silo
```

* This is very strange, but I've attempted to compress a specific dataset with the filter both as mandatory and as optional, and I get behavior that is kind of the reverse of what I would expect. It appears to be in a "limbo" state: h5ls reports that the dataset has been filtered, but it shows the filter only by id, not by name, and the dataset is the same size as uncompressed. I show 2 runs below. The first uses 0 for the optional flag (meaning mandatory); the second uses 1 (meaning optional). Note, however, that h5ls on the dataset from the first run shows it was never filtered, whereas for the second run it reports the dataset was filtered but not in the way I would normally see it had the plugin truly loaded correctly. I think it failed to load the plugin.
```
lassen708{miller86}835: env HDF5_PLUGIN_PATH=`pwd`/plugin /g/g11/miller86/tmp/build-hdf5-1.14.5-blueos_3_ppc64le_ib_p9-gcc-8.3.1/my_install/bin/h5repack -f .silo/#15:UD=32013,0,4,1,0,0,1074528256 /usr/gapps/visit/data/wave0000.silo foo.silo
lassen708{miller86}836: h5ls -vlr foo.silo/.silo/#15
Opened "foo.silo" with sec2 driver.
.silo/#15                Dataset {101/101, 11/11, 16/16}
```
```
lassen708{miller86}837: env HDF5_PLUGIN_PATH=`pwd`/plugin /g/g11/miller86/tmp/build-hdf5-1.14.5-blueos_3_ppc64le_ib_p9-gcc-8.3.1/my_install/bin/h5repack -f .silo/#15:UD=32013,1,4,1,0,0,1074528256 /usr/gapps/visit/data/wave0000.silo foo.silo
lassen708{miller86}838: h5ls -vlr foo.silo/.silo/#15
Opened "foo.silo" with sec2 driver.
.silo/#15                Dataset {101/101, 11/11, 16/16}
```
* Finally, I put an abort() call in my compression plugin as part of the can_apply() method, and it never aborts. So, I believe it is never loading the plugin.