
ch.psi.imagej.hdf5's People

Contributors

simongregorebner


ch.psi.imagej.hdf5's Issues

Error when opening files with "Virtual stack" disabled

I need to open an HDF5 file as a regular stack, not a virtual stack. The reason is that the dataset is signed 16-bit, so to view it correctly in ImageJ I need to apply a calibration using the following macro:

run("XOR...", "value=1000000000000000 stack");
run("Calibrate...", "function=[Straight Line] unit=[Gray Value] text1=[0 65535] text2=[-32768 32767]");
run("Enhance Contrast", "saturated=0.5");

This works fine when I open signed 16-bit netCDF files with the netCDF plugin because it reads them as a "real" stack, rather than a virtual stack. The calibration does not work when reading HDF5 files into a virtual stack, because virtual stacks are read-only.
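As a side note on the arithmetic involved, the XOR-plus-calibration trick can be sketched in plain Python (an illustration of the bit manipulation only, independent of ImageJ):

```python
# Sketch of the signed-to-unsigned 16-bit remapping the macro performs.
# XOR with 0b1000000000000000 (0x8000) flips the sign bit, mapping the signed
# range [-32768, 32767] onto the unsigned range [0, 65535]; the straight-line
# calibration then maps displayed values back to the original signed ones.

def to_unsigned16(signed_value):
    """Map a signed 16-bit value to its XOR-0x8000 unsigned counterpart."""
    return (signed_value & 0xFFFF) ^ 0x8000

def calibrate(raw_value):
    """Straight-line calibration: [0, 65535] -> [-32768, 32767]."""
    return raw_value - 32768

for s in (-32768, -1, 0, 32767):
    raw = to_unsigned16(s)
    assert calibrate(raw) == s  # round-trips back to the signed value
```

The point is that the XOR step rewrites pixel data in place, which is only possible on a real (writable) stack, not a read-only virtual one.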

When I uncheck the "Virtual stack" box in the dialog window for this plugin I get an error message in the main ImageJ window saying there is an error reading the file.

Is this checkbox actually supported so that reading into regular stacks is allowed? If so, how do I figure out what the error is?

Migrate to ImageJ2 plugin

This would make the tool much more flexible and allow integration with KNIME and CellProfiler. It would also be worthwhile to integrate with Bio-Formats.

no such method error

I am using ImageJ version 1.53f and Java 1.8.0_112 (64-bit) on Ubuntu 16.04 LTS. When I copy the HDF5_Viewer-0.13.0.jar file into the ImageJ/plugins folder, restart ImageJ, and try to load an HDF5 file, I get the following error:

ImageJ 1.53f; Java 1.8.0_112 [64-bit]; Linux 4.15.0-122-generic; 95MB of 100000MB (<1%)
java.lang.NoSuchMethodError: org.slf4j.Logger.trace(Ljava/lang/String;Ljava/lang/Object;Ljava/lang/Object;)V
at hdf.object.FileFormat.&lt;init&gt;(FileFormat.java:315)
at hdf.object.h5.H5File.&lt;init&gt;(H5File.java:174)
at hdf.object.h5.H5File.&lt;init&gt;(H5File.java:127)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at hdf.object.FileFormat.&lt;clinit&gt;(FileFormat.java:228)
at ch.psi.imagej.hdf5.HDF5Reader.open(HDF5Reader.java:82)
at ch.psi.imagej.hdf5.HDF5Reader.open(HDF5Reader.java:51)
at ch.psi.imagej.hdf5.HDF5Reader.run(HDF5Reader.java:36)
at ij.IJ.runUserPlugIn(IJ.java:241)
at ij.IJ.runPlugIn(IJ.java:204)
at ij.Executer.runCommand(Executer.java:150)
at ij.Executer.run(Executer.java:68)
at java.lang.Thread.run(Thread.java:745)

These files work with HDF5_Vibez, so I know it isn't a corrupt file issue. I also removed the HDF5_Vibez.jar file from the plugins folder.

hdf.hdf5lib.exceptions.HDF5Exception: Invalid int size

Hi,

I convert reconstructed TIFF stacks to a netCDF4 file containing the image data as a 4D uint16 array, using Python with xarray and NumPy.
Trying to import it into ImageJ with HDF5_Vibez, I get a "Length is too large" error (the length is roughly 13 billion). It works for smaller datasets.

However, with the PSI HDF5 loader on ra in the "TOMCAT" Fiji, as well as locally, I get the following "Invalid int size" error:

Dec 02, 2022 10:57:04 AM ch.psi.imagej.hdf5.HDF5Reader open
WARNING: Error while opening: /das/home/fische_r/DASCOELY/Data10/disk1/T_3III_scan_04_test.nc
java.lang.Exception: Failed to read scalar dataset: Invalid int size
at hdf.object.h5.H5ScalarDS.read(H5ScalarDS.java:820)
at ch.psi.imagej.hdf5.HDF5Reader.open(HDF5Reader.java:221)
at ch.psi.imagej.hdf5.HDF5Reader.open(HDF5Reader.java:51)
at ch.psi.imagej.hdf5.HDF5Reader.run(HDF5Reader.java:36)
at ij.IJ.runUserPlugIn(IJ.java:241)
at ij.IJ.runPlugIn(IJ.java:204)
at ij.Executer.runCommand(Executer.java:151)
at ij.Executer.run(Executer.java:69)
at java.lang.Thread.run(Thread.java:748)
Caused by: hdf.hdf5lib.exceptions.HDF5Exception: Invalid int size
at hdf.object.h5.H5Utils.getTotalSelectedSpacePoints(H5Utils.java:118)
at hdf.object.h5.H5ScalarDS.scalarDatasetCommonIO(H5ScalarDS.java:902)
at hdf.object.h5.H5ScalarDS.read(H5ScalarDS.java:816)
... 8 more

Is this something you have encountered before, and is there a workaround? I will try playing around with the data format and data size to find the issue and a solution.

Best,
Robert
[email protected]
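The reported numbers are consistent with a 32-bit overflow: judging from the stack trace, HDF-Java appears to count the selected data points in a Java int, and ~13 billion elements exceeds Integer.MAX_VALUE. A quick sanity check in plain Python, using hypothetical dimensions (the reporter's actual shape is not given):

```python
# Hedged sanity check: a 4D uint16 volume with ~13 billion elements cannot be
# addressed with a Java int, which is the likely source of "Invalid int size".
# The shape below is a made-up example, not the reporter's actual dimensions.

INT_MAX = 2**31 - 1  # Java Integer.MAX_VALUE

def total_elements(shape):
    n = 1
    for d in shape:
        n *= d
    return n

shape = (800, 2000, 2000, 4)  # hypothetical 4D shape, ~1.3e10 elements
n = total_elements(shape)
print(n > INT_MAX)  # True: the element count overflows a 32-bit int
```

If that is the cause, reading the dataset in slabs (one slice or chunk at a time) rather than as one selection would stay under the limit.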

Multiple virtual stacks from one file go stale after first one is closed

When opening multiple datasets from one HDF5 file as virtual stacks during the same import, the underlying file seems to be closed as soon as the first stack is closed in Fiji. This leaves all remaining stacks stale (unable to read their image data from the file any more). The HDF5 file should only be closed once all of its active readers have been closed.
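A common way to get the desired behavior is to reference-count the shared file handle so it is released only when the last reader closes. A minimal Python sketch of the idea (hypothetical names, not the plugin's actual code):

```python
# Sketch of reference-counted sharing of one underlying file handle.
# close_file() stands in for whatever actually releases the HDF5 file.

class SharedFile:
    def __init__(self, path, close_file):
        self.path = path
        self._close_file = close_file
        self._readers = 0
        self.closed = False

    def acquire(self):
        self._readers += 1  # one more virtual stack reads from this file

    def release(self):
        self._readers -= 1
        if self._readers == 0:  # last reader gone: now it is safe to close
            self._close_file()
            self.closed = True

shared = SharedFile("example.h5", close_file=lambda: None)
shared.acquire()  # stack 1 opens
shared.acquire()  # stack 2 opens
shared.release()  # closing stack 1 must NOT close the file
print(shared.closed)  # False: the file stays open while stack 2 is active
shared.release()  # closing stack 2 releases the file
print(shared.closed)  # True
```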

Feature request: Specify the dataset to be loaded from the Fiji macro editor

When loading an HDF5 file with the plugin from Fiji's Python scripting interface, it is possible to specify the name of the dataset to be loaded:

from ch.psi.imagej.hdf5 import HDF5Reader
reader = HDF5Reader()
stack = reader.open("",False, "/Users/ebner/Desktop/A8_d_400N030_.h5", "/exchange/data_dark", True)

However, I cannot figure out whether it is possible to do this when using Fiji's macro editor (for native ImageJ macros):

run("HDF5...", "open=<filename>.h5");

Am I missing something here or is that option not available? It would certainly be very helpful.

Installing in latest version of ImageJ (not Fiji)

I am trying to install with the latest version of ImageJ, not Fiji.

I think I installed it correctly, following the instructions on the web site.

This is the listing of the top-level ImageJ directory:

corvette:/usr/local/ImageJ>ls -l
total 2400
drwxrwxrwx  4 epics domain users      30 Apr 16 13:38 HDF5_Viewer-0.12.0
-rwxr-xr-x  1 epics epics          81005 Jul  2  2013 ImageJ
-rw-r--r--  1 root  root              47 Feb  1  2017 ImageJ.cfg
-rwxr-xr-x  1 root  root             341 Feb  1  2017 ImageJ.desktop
-rw-r--r--  1 root  root         2348096 Apr  6 17:15 ij.jar
drwxr-xr-x  2 root  root              21 Dec 27  2016 images
drwxr-xr-x  5 root  root            4096 Dec 27  2016 jre
drwxr-xr-x  2 root  root            4096 Dec 27  2016 luts
drwxr-xr-x  4 root  root            4096 Dec 27  2016 macros
drwxr-xr-x 14 root  root            4096 Apr 16 13:42 plugins

This is the HDF5_Viewer-0.12.0/ directory:

corvette:/usr/local/ImageJ>ls -l HDF5_Viewer-0.12.0/
total 0
drwxrwxrwx 4 epics domain users 32 Apr 16 13:38 lib
drwxrwxrwx 2 epics domain users 35 Apr 16 13:38 plugins

This is the plugins directory:

corvette:/usr/local/ImageJ>ls -l HDF5_Viewer-0.12.0/plugins/
total 248
-rwxrwxr-x 1 epics domain users 252486 Apr 16 13:35 HDF5_Viewer-0.12.0.jar

I think that when I open ImageJ and do File/Import, I should see a choice for HDF5, but I don't.

Can you tell me what I am doing wrong?

I also see that the lib directory only contains this:

corvette:/usr/local/ImageJ>ls -l HDF5_Viewer-0.12.0/lib/
total 8
drwxrwxrwx 2 epics domain users 4096 Apr 16 13:38 linux64
drwxrwxrwx 2 epics domain users 4096 Apr 16 13:38 mac64

I assume this means there is no support for Windows? How hard would it be to add, given that I have a working version of the 1.10 HDF5 library on Windows?

Use the plugin in "plain" ImageJ

I am running on Windows.

This plugin is working fine when I use Fiji.

I would like it to also work in plain ImageJ. When I try that and select a file, I get this error message:

java.lang.UnsatisfiedLinkError: hdf.hdf5lib.H5.H5dont_atexit()I
	at hdf.hdf5lib.H5.H5dont_atexit(Native Method)
	at hdf.hdf5lib.H5.loadH5Lib(H5.java:246)
	at hdf.hdf5lib.H5.<clinit>(H5.java:230)
	at hdf.hdf5lib.HDF5Constants.<clinit>(HDF5Constants.java:29)
	at hdf.object.h5.H5File.<init>(H5File.java:94)
	at hdf.object.h5.H5File.<init>(H5File.java:127)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at java.lang.Class.newInstance(Class.java:442)
	at hdf.object.FileFormat.<clinit>(FileFormat.java:228)
	at ch.psi.imagej.hdf5.HDF5Reader.open(HDF5Reader.java:82)
	at ch.psi.imagej.hdf5.HDF5Reader.open(HDF5Reader.java:51)
	at ch.psi.imagej.hdf5.HDF5Reader.run(HDF5Reader.java:36)
	at ij.IJ.runUserPlugIn(IJ.java:241)
	at ij.IJ.runPlugIn(IJ.java:204)
	at ij.Executer.runCommand(Executer.java:150)
	at ij.Executer.run(Executer.java:68)
	at java.lang.Thread.run(Thread.java:745)

Any idea what the problem is?
