hdfgroup / hdf4

Official HDF4 Library Repository

License: Other


hdf4's Introduction

HDF version 4.3.1-1 currently under development


DOCUMENTATION

Full Documentation and Programming Resources for this release can be found at

https://portal.hdfgroup.org/hdf4/

See the RELEASE.txt file in the release_notes/ directory for information specific to the features and updates included in this release of the library.

Additional files within the release_notes/ directory give specific details for several common platforms and configurations.

INSTALL - Start here. General instructions for compiling and installing the library.
INSTALL_CMAKE - Instructions for building with CMake (Kitware.com).
INSTALL_WINDOWS and INSTALL_CYGWIN - MS Windows installations.

FORUM and NEWS

The following public forums are provided for public announcements and discussions of interest to the general HDF4 Community.

These forums are provided as an open and public service for searching and reading. Posting requires completing a simple registration and allows you to join the conversation. Please read the following instructions on the Forum's use and configuration: https://forum.hdfgroup.org/t/quickstart-guide-welcome-to-the-new-hdf-forum

RELEASE SCHEDULE

HDF4 release schedule

HDF4 does not release on a regular schedule. Instead, releases are driven by new features and bug fixes, though we try to have at least one release of each maintenance branch per year. Future HDF4 releases indicated on this schedule are tentative.

Release   New Features
4.4.0     Drop FORTRAN 77 support; drop netCDF 2.3.2 API + tools; unified library (maybe)

HDF 4.4.0 (February 2025)

  • We will drop support for FORTRAN 77 and move to modern Fortran (2003 or 2008)
  • HDF4 includes an ancient netCDF 2.3.2 API along with HDF4-built ncdump and ncgen tools. Support for these will be dropped in 4.4.0. netCDF APIs and tools should be obtained from Unidata.
  • (maybe) libdf and libmfhdf will be merged into a single libhdf4 library, Fortran will be built as a separate library

The goal of the HDF 4.4.0 release is to address long-standing deficiencies and bring HDF4 in line with HDF5's build practices. This should allow HDF4 to work better with modern systems and compilers and be more easily maintained.

The 4.3 maintenance line will be retired when 4.4.0 releases. There are no more planned HDF 4.3.x releases.

SNAPSHOTS, PREVIOUS RELEASES AND SOURCE CODE

Development code snapshots are periodically provided at the following URL:

https://github.com/HDFGroup/hdf4/releases/tag/snapshot

Source packages for current and previous releases are located at:

https://portal.hdfgroup.org/downloads/

Development code is available at our Github location:

https://github.com/HDFGroup/hdf4.git

Source Distribution Layout

The top level of the source code distribution contains the following subdirectories:

bin -- Scripts for maintenance.

config -- Configuration files used by the configure script.

doc -- HDF 4.2 to 4.3 Migration Guide

hdf -- The source code for the HDF 'base library', the multi-file annotation interface, the multi-file raster image interface, HDF command line utilities, and a test suite. Please see the README in each directory for further information on each package.

java -- The Java HDF JNI library

m4 -- Autotools macros and libtool files for building with autotools.

mfhdf -- The netCDF(mfhdf) part of the HDF/mfhdf distribution and additional HDF utilities, such as hdp, hrepack, hdfimport, etc.

release_notes -- Installation instructions for UNIX and Windows. Descriptions of new features and bug fixes in this release. Files in this sub-directory can be used as supplemental documentation for HDF.

Third Party Software Requirements

  • JPEG distribution release 6b or later.

  • ZLIB 1.1.4 (libz.a) or later.

System Requirements

To build the HDF library from source, you need:

  • C and Fortran compilers. For a list of the supported compilers, see the release_notes/RELEASE.txt file.

hdf4's People

Contributors

bljhdf, bmribler, brtnfld, byrnhdf, cgohlke, dependabot[bot], derobins, elmarco, fbaker, hyoklee, jhendersonhdf, loricooperhdf, lrknox, markedwardevans, mfolk, mike-mcgreevy, pimborman, qkoziol, schwehr, scivision, sebastic


hdf4's Issues

Possible problems with Vattrinfo2

Hello friends:

I'm looking at what may be a bug in HDF4.2r0-Beta, downloaded 2/15/2023 from https://github.com/HDFGroup/hdf4.git and built locally on Ubuntu 22.04.

On the attached file, ./hdp dumpvg gives:

    attr12: name=start_latlon type=5 count=1 size=4
0.041719
    attr13: name=end_latlon type=5 count=1 size=4
-0.004751

but the correct answer is:

   start_latlon = 0.04171858f, -172.38489f ;
   end_latlon = -0.00475051f, 162.88487f ;

which I found using an independent library I am working on at https://github.com/JohnLCaron/cdm-kotlin (note: work in progress!).

It's likely I'm missing something; for example, what is the meaning of nfields in this interface:

intn Vattrinfo2(int32 vgroup_id, intn attr_index, char *attr_name, int32 *data_type,
                int32 *count, int32 *size, int32 *nfields, uint16 *refnum)

I was guessing that there's only one field in an attribute, so count should equal vh.nelems * fld[0].nelems. It looks like a bug where Vattrinfo2 is returning count = fld[0].nelems instead.

If you really are supporting more than one field, it seems likely that other calls would need to be made, but I haven't found them among the Vattr* API calls.

Maybe there's a workaround (that hdp is not using either)? It would be great to have a working example of how to handle vgroup attributes in a general way.

To reproduce:

./hdp dumpvg -r 401 2006166131201_00702_CS_2B-GEOPROF_GRANULE_P_R03_E00.hdf

on this file:

2006166131201_00702_CS_2B-GEOPROF_GRANULE_P_R03_E00.hdf.tar.gz

Thanks for your help!

John

ncgen test fails on Ubuntu aarch64 + Autotools

I believe Apple M1 will have the same failure.

@lrknox or @bmribler , can you confirm it?
I don't have M1 access.
@lkurz, may I access M1?

2023-01-19T20:18:02.7441099Z =============================
2023-01-19T20:18:02.7441330Z Running ncgen tests
2023-01-19T20:18:02.7441546Z =============================
2023-01-19T20:18:02.7442123Z testncgen.sh: line 113: [: -ne: unary operator expected
2023-01-19T20:18:02.7442438Z *** test1.cdl	Thu Jan 19 20:18:01 2023
2023-01-19T20:18:02.7442764Z --- test2.cdl	Thu Jan 19 20:18:01 2023
2023-01-19T20:18:02.7443009Z ***************
2023-01-19T20:18:02.7443220Z *** 50,56 ****
2023-01-19T20:18:02.7443408Z   
2023-01-19T20:18:02.7443602Z    cross =
2023-01-19T20:18:02.7443811Z     4, 5, 0.000244140625,
2023-01-19T20:18:02.7444016Z !   7, 8, 10000000000 ;
2023-01-19T20:18:02.7444220Z   
2023-01-19T20:18:02.7444416Z    i = 10, 20 ;
2023-01-19T20:18:02.7444608Z   
2023-01-19T20:18:02.7444842Z --- 50,56 ----
2023-01-19T20:18:02.7445060Z   
2023-01-19T20:18:02.7445241Z    cross =
2023-01-19T20:18:02.7445448Z     4, 5, 0.000244140625,
2023-01-19T20:18:02.7445666Z !   7, 8, 2147483647 ;
2023-01-19T20:18:02.7445858Z   
2023-01-19T20:18:02.7446050Z    i = 10, 20 ;
2023-01-19T20:18:02.7446251Z   
2023-01-19T20:18:02.7446500Z *** ncgen -b test failed ***
2023-01-19T20:18:02.7446819Z *** ncgen -c test successful ***
2023-01-19T20:18:02.7447067Z =============================
2023-01-19T20:18:02.7447288Z ncgen tests failed: 1
2023-01-19T20:18:02.7447520Z =============================

66 - MFHDF_TEST-hdftest fails on Ubuntu latest aarch64 + CMake.

See

https://github.com/HDFGroup/hdf4/actions/runs/3950638509/workflow

and

https://github.com/HDFGroup/hdf4/blob/5dc383710c8b314c3f1960da76d852622b36fa08/.github/workflows/aarch64.yml

  The following tests FAILED:
  	 66 - MFHDF_TEST-hdftest (Subprocess aborted)
  Errors while running CTest
  Output from these tests are in: /home/runner/work/hdf4/hdf4/hdf4/build/Testing/Temporary/LastTest.log
  Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.
  make: *** [Makefile:91: test] Error 8
  Error: The process '/home/runner/work/_actions/uraimo/run-on-arch-action/v2/src/run-on-arch.sh' failed with exit code 2

New version?

Looking at hdf-4_2_15...master, I think it would be good to flush all those commits and make a new release.
Any plans to do that soon? 🤔

Fix more C warnings

There are still a lot of folks using hdf4 (I'm one of them). I've seen a few fuzzer bugs in prior versions and am having trouble getting all the platforms I have to support (e.g. x86-64, ppc, and arm) working with the version at head. Before trying to dig in further, I figured I'd start with my current default compiler and do some low-risk code cleanup, if folks are up for it. My initial run at seeing which compiler warnings show up:

rm -rf build-ninja; time (mkdir -p build-ninja && cd build-ninja && cmake -DCMAKE_ANSI_CFLAGS:STRING="-Wall -Wextra -Werror -Wno-implicit-fallthrough -Wno-address -Wno-sign-compare -Wno-pedantic -Wno-stringop-truncation -Wno-type-limits -Wno-use-after-free -Wno-pointer-to-int-cast -Wno-pedantic -Wno-strict-aliasing -Wno-parentheses -Wno-tautological-compare -Wno-unused-parameter -Wno-strict-aliasing -Wno-discarded-qualifiers -Wno-implicit-function-declaration -Wno-alloc-size-larger-than -Wno-int-to-pointer-cast -Wno-array-bounds -Wno-unused-function -Wno-no-unused-const-variable= -Wno-unused-variable -Wno-memset-elt-size -Wno-maybe-uninitialized -Wno-switch -Wno-format-overflow -Wno-unused-but-set-variable" -GNinja .. && cmake --build . && ctest -V .)

I had trouble with being able to turn off -pedantic, so I commented that out in

set (CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -pedantic -Wall -Wextra")

and again in the second place it appears:

set (CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -pedantic -Wall -Wextra")

I will try to make some manageable pull requests that just address one of those compiler warnings at a time (as I have time).

Also, I see references to things like VAX and $Id that are easy cleanups.

possible cleanup: Remove comment at the end of function closing braces

If folks are okay with this idea, I can make a PR. I'm trying to find non-controversial things that tighten up the code without risk of breaking anything.

Consider:

int32
HCPcskphuff_stread(accrec_t *access_rec)
{
    int32 ret;

    if ((ret = HCIcskphuff_staccess(access_rec, DFACC_READ)) == FAIL)
        HRETURN_ERROR(DFE_CINIT, FAIL);
    return ret;
} /* HCPcskphuff_stread() */    // <---- This comment here

Some functions have a closing function-name comment and others do not. With modern editors these comments shouldn't be needed, as there is brace matching and code folding/hiding. The code base is currently a mix of with and without these comments. If it feels like a function really needs one, I'd hazard a guess that that's a strong signal the function is too large and should be broken up.

I propose removing all of these end of function comments.

Currently:

cd hdf/src

egrep '^}$' *.c | wc -l
449

egrep '^}.+' *.c | wc -l
839

Build fails on x86_64 OpenBSD 7.2 clang

In file included from /home/runner/work/hdf4/hdf4/hdf4/mfhdf/libsrc/array.c:18:
/home/runner/work/hdf4/hdf4/hdf4/mfhdf/libsrc/local_nc.h:487:62: error: unknown type name 'u_int'
HDFLIBAPI bool_t xdr_shorts    PROTO((XDR * xdrs, short *sp, u_int cnt));
                                                             ^
/home/runner/work/hdf4/hdf4/hdf4/mfhdf/libsrc/array.c:480:11: error: expected ';' after expression
    u_long  count = 0, *countp = NULL;
          ^

Does HDF4 require C++ compiler?

Why does CMake look for a CXX compiler?

-- The CXX compiler identification is unknown
  CMake Error at mfhdf/test/CMakeLists.txt:2 (project):
    No CMAKE_CXX_COMPILER could be found.

    Tell CMake where to find the compiler by setting either the environment
    variable "CXX" or the CMake cache entry CMAKE_CXX_COMPILER to the full path
    to the compiler, or to the compiler name if it is in the PATH.

hdf 4.2.16-2 fails to build on i686 on Fedora rawhide

I'm looking at updating the Fedora hdf4 package and getting the following:

make[2]: Entering directory '/builddir/build/BUILD/hdf-4.2.16-2/build-shared/mfhdf/libsrc'
/bin/sh ../../libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I../../../mfhdf/libsrc -I../../hdf/src  -I../../../hdf/src -I../../../mfhdf/libsrc  -DHDF -D_POSIX_C_SOURCE=200809L  -UNDEBUG   -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wno-complain-wrong-lang -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1  -m32 -march=i686 -mtune=generic -msse2 -mfpmath=sse -mstackrealign -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/include/tirpc  -c -o putget.lo ../../../mfhdf/libsrc/putget.c
libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../../../mfhdf/libsrc -I../../hdf/src -I../../../hdf/src -I../../../mfhdf/libsrc -DHDF -D_POSIX_C_SOURCE=200809L -UNDEBUG -O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wno-complain-wrong-lang -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m32 -march=i686 -mtune=generic -msse2 -mfpmath=sse -mstackrealign -fasynchronous-unwind-tables -fstack-clash-protection -I/usr/include/tirpc -c ../../../mfhdf/libsrc/putget.c  -fPIC -DPIC -o .libs/putget.o
make[2]: Leaving directory '/builddir/build/BUILD/hdf-4.2.16-2/build-shared/mfhdf/libsrc'
../../../mfhdf/libsrc/putget.c: In function 'xdr_NCv1data':
../../../mfhdf/libsrc/putget.c:596:36: error: passing argument 2 of 'xdr_long' from incompatible pointer type [-Wincompatible-pointer-types]
  596 |             return (xdr_long(xdrs, (nclong *)values));
      |                                    ^~~~~~~~~~~~~~~~
      |                                    |
      |                                    nclong * {aka int *}
In file included from ../../../mfhdf/libsrc/local_nc.h:56,
                 from ../../../mfhdf/libsrc/putget.c:18:
/usr/include/tirpc/rpc/xdr.h:291:33: note: expected 'long int *' but argument is of type 'nclong *' {aka 'int *'}
  291 | extern bool_t   xdr_long(XDR *, long *);
      |                                 ^~~~~~
../../../mfhdf/libsrc/putget.c: In function 'sd_ncvarput1':
../../../mfhdf/libsrc/putget.c:1641:45: warning: passing argument 4 of 'NCvar1io' discards 'const' qualifier from pointer target type [-Wdiscarded-qualifiers]
 1641 |     return (NCvar1io(handle, varid, coords, value));
      |                                             ^~~~~
../../../mfhdf/libsrc/putget.c:1562:59: note: expected 'char *' but argument is of type 'const void *'
 1562 | NCvar1io(NC *handle, int varid, const long *coords, Void *value)
../../../mfhdf/libsrc/putget.c: In function 'NC_fill_buffer':
../../../mfhdf/libsrc/putget.c:2124:8: warning: suggest explicit braces to avoid ambiguous 'else' [-Wdangling-else]
 2124 |     if (attr != NULL)
      |        ^
make[2]: *** [Makefile:598: putget.lo] Error 1

Drop FORTRAN 77 support

In a future release we will drop support for FORTRAN 77 and endeavor to make the Fortran wrappers behave more like HDF5's (Fortran 90/95 or even 2003 would be required to build the wrappers). This should greatly simplify maintenance as we'll be able to move things like compiler support and build system improvements from HDF5 much more easily.

CVE Issued against HDF4

Hi,

I am seeking approval of HDF4 at my place of employment and there is a report of a CVE issued against this library. I don't fully understand the CVE and see no references to HDF4 itself in it. Would you be able to elaborate if this security vulnerability actually exists? Or is it a false positive?

https://nvd.nist.gov/vuln/detail/CVE-2018-8088

(HDFFR-1607) hdf4 'HXsetdir(NULL)' unsets target directory variable

Date: Tue, 25 Jan 2022 15:34:16 +0000
From:
To: prioritysupport [email protected], HDF Helpdesk [email protected]
Subject: hdf4 'HXsetdir(NULL)' unsets target directory variable
Attachments: "hdf4_repro.cpp" "hdfsdstest1.hdf" "extfile.hdf"

Hello all,

I am seeing an issue with the hdf4 function 'HXsetdir', which I thought I would confirm with you whether it is an expected
behavior or a bug. We are currently using hdf4 version 4.2.5.

The question I have is if the use of 'HXsetdir(NULL)' is expected to reset the target directory variable and remove a
directory which was added previously using 'HXsetdir'.

I am attaching a C repro code to this email where I am trying to do the following:

  1. Open an hdf4 file 'hdfsdstest1.hdf'. I want to attempt and read the dataset 'WrapperDataSet' from this file. However,
    the data pertaining to this dataset is contained at an external file called 'extfile.hdf'.
    'extfile.hdf' is not located on the same folder as 'hdfsdstest1.hdf'.
  2. I execute 'HXsetdir(NULL) ' to unset the directory variable. Next, I execute 'SDreaddata'. The status for this
    operation is -1.
  3. Next, I execute 'HXsetdir' to set the target directory of the external file. After this, when I execute 'SDreaddata'
    again, the status is returned as 0. I believe this is expected.
  4. The issue comes on this step. I execute 'HXsetdir(NULL) ' to unset the directory variable again, and then execute
    'SDreaddata'. However, this time, the status is still returned as 0. I would have expected the status here to be returned
    as -1, since we are unsetting the directory variable (using 'HXsetdir(NULL)') as per the documentation here:
    https://support.hdfgroup.org/release4/doc/RM_PDF.pdf
    I was wondering if anyone can confirm if this is expected behavior. As I mentioned in point (4) above, I would have
    expected 'SDreaddata' to fail after 'HXsetdir(NULL)' is called to unset the directory. However, it still performs the read
    successfully.

Kindly let me know if you have any questions regarding the same, or if you need any more information.

Thanks,

JIRA issue: https://jira.hdfgroup.org/browse/HDFFR-1607

HDF4 installation

Hi,

I was installing the hdf4 library on my new system and encountered the following error during "make >& gmake.out" (and the same for "gmake check >& check.out").
The error in gmake.out:

libtool: compile: /usr/bin/gfortran -O -c mfgrff.f -o mfgrff.o
mfgrff.f:155:48:

132 | mgsnatt = mgisattr(riid, name, nt, count, data, len(name))
| 2
......
155 | mgsattr = mgisattr(riid, name, nt, count, data, len(name))
| 1
Error: Type mismatch between actual argument at (1) and actual argument at (2) (CHARACTER(0)/INTEGER(4)).
gmake[3]: *** [Makefile:679: mfgrff.lo] Error 1
gmake[3]: Leaving directory '/home/piyush/myHdfNcStuff/hdf4-master/hdf/src'
gmake[2]: *** [Makefile:485: all] Error 2
gmake[2]: Leaving directory '/home/piyush/myHdfNcStuff/hdf4-master/hdf/src'
gmake[1]: *** [Makefile:425: all-recursive] Error 1
gmake[1]: Leaving directory '/home/piyush/myHdfNcStuff/hdf4-master/hdf'
gmake: *** [Makefile:515: all-recursive] Error 1

Duplicate strdup symbol

I'm hitting an error in some sanitizer builds with the second strdup in hdf4, caused by #442, which changed HDstrdup to strdup in hdfalloc.c:

strdup(const char *s)

backward reference detected: strdup in third_party/gdal/_objs/port_lib/json_object.pic.o refers to third_party/hdf4/_objs/libdf/hdfalloc.pic.o

Windows oneAPI CI fails examples

COMMAND Result: Stack overflow
348 - hdf_HDP-h4ex_GR_create_and_write_chunked_image (Failed)
351 - hdf_HDP-h4ex_GR_create_and_write_image (Failed)
354 - hdf_HDP-h4ex_GR_write_palette (Failed)
398 - mf_HDP-h4ex_VG_add_sds_to_vgroup (Failed)

hdftest fails on Fedora ppc64le/s390x

I'm working on updating the Fedora hdf package to version 4.2.16-2. I'm getting the following test failure on ppc64le and s390x:

Testing hdftest 
----------------------------
hdftest  Test Log
----------------------------
Testing create/read/write compressed datasets (tcomp.c)                PASSED
Testing create/read/write chunked datasets (tchunk.c)                  PASSED
Testing reading of netCDF file using the SDxxx interface (tnetcdf.c)  *** Routine netCDF Read Test 1. SDstart failed on file test1.nc FAILED at line 173 ***
*** Routine SDstart FAILED at line 82 ***
Testing functionality of dimensions (tdim.c)                           PASSED
Testing functions involving empty data sets (temptySDSs.c)             PASSED
Testing various setting attribute features (tattributes.c)             PASSED
Testing getting data size of special data (tdatasizes.c)              *** UNEXPECTED VALUE from SDgetdatasize: SDS named (CompressedData) is 56 at line   70 in ../../../mfhdf/test/tdatasizes.c
 PASSED
Testing getting location info of data (tdatainfo.c)                    PASSED
Testing getting location info of attr and annot data (tattdatainfo.c)  PASSED
Testing a mix of SD, V, and VS functions (tmixed_apis.c)               PASSED
Testing miscellaneous file related functions (tfile.c)                 PASSED
Testing various SDS' properties (tsdsprops.c)                          PASSED
Testing various coordinate variable features (tcoordvar.c)             PASSED
Testing external file functions (texternal.c)                          PASSED
Testing szip compression for datasets (tszip.c)                        PASSED
*** HDF-SD test fails ***

Ensure XDR copyright/license appears in release

The content of mfhdf/xdr/NOTICE.h and mfhdf/xdr/README needs to appear in the distribution root when the HDF4 XDR library is built in. We may want to merge/rename them into a file called XDR_LICENSE and copy that over.

`_FillValue` is not a valid C symbol

In newer compilers, _FillValue is triggering issues.

See also: Unidata/netcdf-c#2728

Use a recent version of GCC or Clang and enforce standards compliance:

https://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf

All identifiers that begin with an underscore and either an uppercase letter or another
underscore are always reserved for any use.

/* For SD interface  */
#define _FillValue "_FillValue"
In file included from third_party/gdal/frmts/hdf4/hdf4multidim.cpp:28:
In file included from third_party/gdal/frmts/hdf4/hdf4dataset.h:37:
In file included from third_party/gdal/gcore/gdal_pam.h:35:
In file included from third_party/gdal/gcore/gdal_priv.h:77:
In file included from third_party/stl/cxx17/vector:30:
In file included from include/c++/v1/vector:316:
include/c++/v1/__bit_reference:268:16: error: expected ',' or '>' in template-parameter-list
  268 | template <bool _FillValue, class _Cp>
      |                ^
third_party/hdf4/hdf/src/hlimits.h:275:20: note: expanded from macro '_FillValue'
  275 | #define _FillValue "_FillValue"
      |                    ^
In file included from third_party/gdal/frmts/hdf4/hdf4multidim.cpp:28:
In file included from third_party/gdal/frmts/hdf4/hdf4dataset.h:37:
In file included from third_party/gdal/gcore/gdal_pam.h:35:
In file included from third_party/gdal/gcore/gdal_priv.h:77:
In file included from third_party/stl/cxx17/vector:30:
In file included from include/c++/v1/vector:316:
include/c++/v1/__bit_reference:268:26: error: expected unqualified-id
  268 | template <bool _FillValue, class _Cp>
      |                          ^
/include/c++/v1/__bit_reference:1102:18: error: expected ',' or '>' in template-parameter-list
 1102 |   template <bool _FillValue, class _Dp>
      |                  ^
third_party/hdf4/hdf/src/hlimits.h:275:20: note: expanded from macro '_FillValue'
  275 | #define _FillValue "_FillValue"
      |                    ^

xdrposix.c: excess elements in struct initializer

I'm seeing an issue with the latest compiler versions in xdrposix.c. I think the warning below indicates that something is wrong with the contents of static struct xdr_ops xdrposix_ops. It's also possible that my defines are not quite right, as I am building with Bazel.

INFO: From Compiling third_party/hdf4/mfhdf/libsrc/xdrposix.c:
third_party/hdf4/mfhdf/libsrc/xdrposix.c:292:5: warning: excess elements in struct initializer [-Wexcess-initializers]
    xdrposix_putint    /* serialize a 32-bit int */
    ^~~~~~~~~~~~~~~

I have

static struct xdr_ops   xdrposix_ops = {
    xdrposix_getlong,   /* deserialize a 32-bit int */
    xdrposix_putlong,   /* serialize a 32-bit int */
#if (_MIPS_SZLONG == 64)
    /* IRIX64 has 64 bits long and 32 bits int. */
    /* It defines two extra entries for get/put int. */
    xdrposix_getint,   /* deserialize a 32-bit int */
    xdrposix_putint,   /* serialize a 32-bit int */
#endif
    xdrposix_getbytes,  /* deserialize counted bytes */
    xdrposix_putbytes,  /* serialize counted bytes */
    xdrposix_getpos,    /* get offset in the stream */
    xdrposix_setpos,    /* set offset in the stream */
    xdrposix_inline,    /* prime stream for inline macros */
#if (defined __sun && defined _LP64) || defined __x86_64__ || defined __powerpc64__
    xdrposix_destroy,   /* destroy stream */
#if !(defined __x86_64__) && !(defined __powerpc64__) || (defined  __sun && defined _LP64) /* i.e. we are on SUN/Intel in 64-bit mode */
    NULL,               /* no xdr_control function defined */
#endif
    /* Solaris 64-bit (arch=v9 and arch=amd64) has 64 bits long and 32 bits int. */
    /* It defines the two extra entries for get/put int. here */
    xdrposix_getint,   /* deserialize a 32-bit int */
    xdrposix_putint    /* serialize a 32-bit int */
#else
#ifdef AIX5L64
    xdrposix_destroy,
    NULL,
    NULL,
    xdrposix_getint,
    xdrposix_putint
#else /*AIX5L64 */
    xdrposix_destroy    /* destroy stream */
#endif /*AIX5L64 */
#endif
};

That's out of date. The current head is:

static struct xdr_ops   xdrposix_ops = {
    xdrposix_getlong,   /* deserialize a 32-bit int */
    xdrposix_putlong,   /* serialize a 32-bit int */
    xdrposix_getbytes,  /* deserialize counted bytes */
    xdrposix_putbytes,  /* serialize counted bytes */
    xdrposix_getpos,    /* get offset in the stream */
    xdrposix_setpos,    /* set offset in the stream */
    xdrposix_inline,    /* prime stream for inline macros */
#if (defined __sun && defined _LP64) || defined __x86_64__ || defined __powerpc64__
    xdrposix_destroy,   /* destroy stream */
#if !(defined __x86_64__) && !(defined __powerpc64__) || (defined  __sun && defined _LP64) /* i.e. we are on SUN/Intel in 64-bit mode */
    NULL,               /* no xdr_control function defined */
#endif
    /* Solaris 64-bit (arch=v9 and arch=amd64) has 64 bits long and 32 bits int. */
    /* It defines the two extra entries for get/put int. here */
    xdrposix_getint,   /* deserialize a 32-bit int */
    xdrposix_putint    /* serialize a 32-bit int */
#else
#ifdef AIX5L64
    xdrposix_destroy,
    NULL,
    NULL,
    xdrposix_getint,
    xdrposix_putint
#else /*AIX5L64 */
    xdrposix_destroy    /* destroy stream */
#endif /*AIX5L64 */
#endif
};

getopt issues with Autotools on older Linux distros

There appears to be a problem with older Linux systems not being able to find getopt functionality when building with the Autotools. On affected systems, configure will succeed but make will not due to the linker not finding the getopt globals (e.g., optind). Handling getopt can be complicated as there are GNU/BSD differences and platforms like Windows lack getopt entirely. HDF4 provides a version of getopt for platforms where it's missing, which is (sadly) duplicated in the repository.

We will fix this by using the same scheme we use in HDF5:

  • Removing the duplicate getopt code
  • Updating the getopt standin to use a H4 prefix
  • Using the getopt standin everywhere, even on systems where getopt is present

Disable Fortran on 64-bit builds

The ancient Fortran 77 interface assigns pointers to Fortran INTEGER values (usually 32-bit), which is not portable to systems where pointers are 64 bits.

CMake/configure should determine the sizes of pointers and Fortran INTEGERs and prevent building the Fortran interface when they are incompatible. Alternatively, the Fortran interface could be retired, or updated to a newer version of Fortran that allows 64-bit integers.

From dff.c:

/*-----------------------------------------------------------------------------
 * Name:    dfiopen
 * Purpose: call DFopen to open HDF file
 * Inputs:  name: name of file to open
 *          acc_mode: access mode - integer with value DFACC_READ etc.
 *          defdds: default number of DDs per header block
 *          namelen: length of name
 * Returns: 0 on success, -1 on failure with DFerror set
 * Users:   HDF Fortran programmers
 * Invokes: DFopen
 * Method:  Convert filename to C string, call DFopen
 * Note:    DFopen actually returns a DF *. On machines where a pointer is
 *          bigger than a Fortran INTEGER, this routine will fail. This is a
 *          design error and has no easy portable solution.
 *---------------------------------------------------------------------------*/

HDF4 specification of Image dimensions

https://support.hdfgroup.org/release4/doc/DS.pdf

p 9-125

DFTAG_LUT
Lookup table
xdim * ydim * elements * NTsize bytes (xdim, ydim, elements,
and NTsize are specified in the corresponding DFTAG_ID)
301 (0x012D)

Should probably say "... are specified in the corresponding DFTAG_LD". It would make no sense to have the lookup table be the same dimensions as the image.

I have at least one example file that has a LUT without an LD, so all I can do is ignore it. There is also an IP8 tag, so I can use that instead.

p 9-124

DFTAG_RI
Raster image
xdim * ydim * elements * NTsize bytes (xdim, ydim, elements,
and NTsize are specified in the corresponding DFTAG_ID)
302 (0x012E)

This one is more speculative on my part, but it doesn't make sense for elements to be anything other than 1, so that could simply be left out.
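The spec's byte-count formula is simple enough to compute directly. A minimal sketch (the function name is mine, not part of the HDF4 API), widened to 64 bits to avoid overflow on large images:

```c
#include <stdint.h>

/* Byte count of a raster element per the spec's formula
 * xdim * ydim * elements * NTsize, computed in 64 bits so that large
 * dimensions cannot overflow. (Function name is illustrative only.) */
uint64_t ri_byte_count(uint32_t xdim, uint32_t ydim,
                       uint32_t elements, uint32_t nt_size)
{
    return (uint64_t)xdim * ydim * elements * nt_size;
}
```

For a typical 8-bit palette this gives 256 * 1 * 3 * 1 = 768 bytes, i.e., the LUT's own dimensions from the DFTAG_LD rather than the image's dimensions.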

(HDFFR-1573) SDwritedata fails after calling SDgetchunkinfo

SDwritedata fails after calling SDgetchunkinfo

Version: hdf-4.2.14

We have existing code developed with HDF4.2r1 that is failing with the latest hdf versions.

Attached sample program demonstrates the problem. Build commands used are in the comments.

Reported through helpdesk: SUPPORT-251

From blj: I created a C version, myhdftest.c, just in case I was not seeing something in the C++ code.
It also shows the issue with SDwritedata failing.
Comment out the call to SDgetchunkinfo and then the code works properly.

JIRA issue: https://jira.hdfgroup.org/browse/HDFFR-1573

4.2.15: cmake build with `HDF4_BUILD_XDR_LIB=OFF` fails on linking

cmake settings:

[tkloczko@devel-g2v x86_64-redhat-linux-gnu]$ cmake -L
CMake Warning:
  No source or binary directory provided.  Both will be assumed to be the
  same as the current working directory, but note that this warning will
  become a fatal error in future CMake releases.


CMake Error: The source directory "/home/tkloczko/rpmbuild/BUILD/hdf-4.2.15/x86_64-redhat-linux-gnu" does not appear to contain CMakeLists.txt.
Specify --help for usage, or press the help button on the CMake GUI.
-- Cache values
BUILD_SHARED_LIBS:BOOL=ON
BUILD_STATIC_LIBS:BOOL=ON
BUILD_TESTING:BOOL=ON
BUILD_USER_DEFINED_LIBS:BOOL=OFF
CMAKE_ARCHIVE_OUTPUT_DIRECTORY:PATH=/home/tkloczko/rpmbuild/BUILD/hdf-4.2.15/x86_64-redhat-linux-gnu/bin
CMAKE_BUILD_TYPE:STRING=RelWithDebInfo
CMAKE_Fortran_MODULE_DIRECTORY:PATH=/home/tkloczko/rpmbuild/BUILD/hdf-4.2.15/x86_64-redhat-linux-gnu/bin
CMAKE_INSTALL_PREFIX:PATH=/usr
CMAKE_Java_ARCHIVE:FILEPATH=/usr/bin/jar
CMAKE_Java_RUNTIME:FILEPATH=/usr/bin/java
CMAKE_LIBRARY_OUTPUT_DIRECTORY:PATH=/home/tkloczko/rpmbuild/BUILD/hdf-4.2.15/x86_64-redhat-linux-gnu/bin
CMAKE_RUNTIME_OUTPUT_DIRECTORY:PATH=/home/tkloczko/rpmbuild/BUILD/hdf-4.2.15/x86_64-redhat-linux-gnu/bin
CTEST_TEST_TIMEOUT:STRING=3600
HDF4_ALLOW_EXTERNAL_SUPPORT:STRING=ON
HDF4_BUILD_EXAMPLES:BOOL=OFF
HDF4_BUILD_FORTRAN:BOOL=OFF
HDF4_BUILD_JAVA:BOOL=ON
HDF4_BUILD_TOOLS:BOOL=ON
HDF4_BUILD_UTILS:BOOL=ON
HDF4_BUILD_XDR_LIB:BOOL=OFF
HDF4_DISABLE_COMPILER_WARNINGS:BOOL=ON
HDF4_ENABLE_COVERAGE:BOOL=OFF
HDF4_ENABLE_DEPRECATED_SYMBOLS:BOOL=OFF
HDF4_ENABLE_JPEG_LIB_SUPPORT:BOOL=ON
HDF4_ENABLE_NETCDF:BOOL=ON
HDF4_ENABLE_SZIP_SUPPORT:BOOL=OFF
HDF4_ENABLE_Z_LIB_SUPPORT:BOOL=ON
HDF4_JAVA_PACK_JRE:BOOL=OFF
HDF4_PACKAGE_EXTLIBS:BOOL=ON
HDF4_PACK_EXAMPLES:BOOL=OFF
HDF_ENABLE_LARGE_FILE:BOOL=ON
JPEG_DIR:PATH=JPEG_DIR-NOTFOUND
XDR_INCLUDE_DIR:PATH=/usr/include/tirpc
XDR_INT_LIBRARY:FILEPATH=/usr/lib64/libtirpc.so
ZLIB_DIR:PATH=ZLIB_DIR-NOTFOUND

and linking fails on

[ 28%] Linking C executable ../../bin/hdftest
cd /home/tkloczko/rpmbuild/BUILD/hdf-4.2.15/x86_64-redhat-linux-gnu/mfhdf/test && /usr/bin/cmake -E cmake_link_script CMakeFiles/hdftest.dir/link.txt --verbose=1
/usr/bin/gcc  -std=c99 -O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none -fstdarg-opt -w -pedantic -Wall -Wextra -fmessage-length=0 -DNDEBUG -Wl,-z,relro -Wl,--as-needed -Wl,--gc-sections -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -flto=auto -flto-partition=none -fuse-linker-plugin -Wl,--build-id=sha1 CMakeFiles/hdftest.dir/hdftest.c.o CMakeFiles/hdftest.dir/tchunk.c.o CMakeFiles/hdftest.dir/tcomp.c.o CMakeFiles/hdftest.dir/tcoordvar.c.o CMakeFiles/hdftest.dir/tdim.c.o CMakeFiles/hdftest.dir/temptySDSs.c.o CMakeFiles/hdftest.dir/tattributes.c.o CMakeFiles/hdftest.dir/tfile.c.o CMakeFiles/hdftest.dir/tmixed_apis.c.o CMakeFiles/hdftest.dir/tnetcdf.c.o CMakeFiles/hdftest.dir/trank0.c.o CMakeFiles/hdftest.dir/tsd.c.o CMakeFiles/hdftest.dir/tsdsprops.c.o CMakeFiles/hdftest.dir/tszip.c.o CMakeFiles/hdftest.dir/tattdatainfo.c.o CMakeFiles/hdftest.dir/tdatainfo.c.o CMakeFiles/hdftest.dir/tdatasizes.c.o CMakeFiles/hdftest.dir/texternal.c.o CMakeFiles/hdftest.dir/tutils.c.o -o ../../bin/hdftest  -Wl,-rpath,"\$ORIGIN/../lib:\$ORIGIN/" ../../bin/libmfhdf.so.4.15.2 ../../bin/libhdf.so.4.15.2
/usr/bin/ld: ../../bin/libmfhdf.so.4.15.2: undefined reference to `xdr_enum'
/usr/bin/ld: ../../bin/libmfhdf.so.4.15.2: undefined reference to `xdr_int'
/usr/bin/ld: ../../bin/libmfhdf.so.4.15.2: undefined reference to `xdr_u_long'
/usr/bin/ld: ../../bin/libmfhdf.so.4.15.2: undefined reference to `xdr_long'
/usr/bin/ld: ../../bin/libmfhdf.so.4.15.2: undefined reference to `xdr_bytes'
/usr/bin/ld: ../../bin/libmfhdf.so.4.15.2: undefined reference to `xdr_opaque'
/usr/bin/ld: ../../bin/libmfhdf.so.4.15.2: undefined reference to `xdr_double'
/usr/bin/ld: ../../bin/libmfhdf.so.4.15.2: undefined reference to `xdr_float'
collect2: error: ld returned 1 exit status
make[2]: *** [mfhdf/test/CMakeFiles/hdftest.dir/build.make:390: bin/hdftest] Error 1
make[2]: Leaving directory '/home/tkloczko/rpmbuild/BUILD/hdf-4.2.15/x86_64-redhat-linux-gnu'
make[1]: *** [CMakeFiles/Makefile2:1567: mfhdf/test/CMakeFiles/hdftest.dir/all] Error 2
make[1]: Leaving directory '/home/tkloczko/rpmbuild/BUILD/hdf-4.2.15/x86_64-redhat-linux-gnu'
make: *** [Makefile:169: all] Error 2

Possible to get h4fc through CMake?

I've been trying to build HDF4 using CMake and I got close, but when I built it, I didn't get out h4fc. I did compile for Fortran and I see libmfhdf_fortran.a in my install dir, but no h4fc. If I look around I see code for h4cc:

if (NOT WIN32)
  set (_PKG_CONFIG_COMPILER ${CMAKE_C_COMPILER})
  configure_file (
      ${HDF_RESOURCES_DIR}/libh4cc.in
      ${HDF4_BINARY_DIR}/CMakeFiles/h4cc
      @ONLY
  )
  install (
      FILES ${HDF4_BINARY_DIR}/CMakeFiles/h4cc
      DESTINATION ${HDF4_INSTALL_BIN_DIR}
      PERMISSIONS OWNER_READ OWNER_WRITE OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE
      COMPONENT libraries
  )
endif ()

but I see no references to h4fc in the CMake files. Help?

XDR Undefined References

Hi there,

I'm attempting to build a package for Alpine Linux from v4.2.15. However, when trying to link in the XDR types, ld is failing with many undefined references, for example to xdr_u_long.

I initially suspected that this was the incompatibility mentioned in section 4.4 of this document, meaning my system's version of xdr (portablexdr) is incompatible with this project.

Therefore I supplied configure with --enable-hdf4-xdr, and removed portablexdr from my system. The bundled version of XDR gets compiled, but I get the same behaviour.

The full linker output can be found here. I'd appreciate any input you can give as to how I might proceed :)

Autotools Java is broken on Ubuntu Linux

Building HDF4 from master w/ --enable-java is broken on Ubuntu Linux 22.04.3 LTS w/ OpenJDK 11.0.21. The tests pass on CMake.

Java was installed from the package manager, with no other setup.

I feel like this should work out of the box. Do we need to munge the paths differently?

============================                                                     
Testing junit.sh                                                                 
============================                                                     
junit.sh  Test Log                                                               
============================                                                     
java  -Xmx1024M -Dorg.slf4j.simpleLogger.defaultLog=trace -Djava.library.path=../../hdf4/lib -cp .:../../hdf4/lib/jarhdf-4.2.17-1.jar:../../hdf4/lib/junit.jar:../../hdf4/lib/hamcrest-core.jar:../../hdf4/lib/slf4j-api-1.7.33.jar:../../hdf4/lib/slf4j-simple-1.7.33.jar:jarhdftest.jar: -ea org.junit.runner.JUnitCore test.TestH4
Testing JUnit-TestH4                                                  [main] INFO hdf.hdflib.HDFLibrary - HDF4 library: hdf_java
[main] INFO hdf.hdflib.HDFLibrary -  NOT successfully loaded from java.library.path
**FAILED**    JUnit-TestH4                                                       
    Expected result differs from actual result                                   
    *** JUnit-TestH4.txt    2024-02-03 16:30:28.268215328 -0800                  
    --- JUnit-TestH4.out    2024-02-03 16:30:28.428216065 -0800                  
    ***************                                                              
    *** 1,13 ****                                                                
      JUnit version 4.11                                                         
      .testCreateCloseOpen                                                       
    ! .testDFKNTsize                                                             
    ! .testHgetlibversion                                                        
    ! .testJ2C                                                                   
      .testHnumber                                                               
    ! .testHishdf                                                                
    ! .testHDgetNTdesc                                                           
    !                                                                            
      Time:  XXXX                                                                
                                                                                 
    ! OK (7 tests)                                                               
                                                                                 
    --- 1,249 ----                                                               
      JUnit version 4.11                                                         
      .testCreateCloseOpen 

error: conflicting types for ‘xdrposix_inline’

I am trying to build on Linux-ARM64 platform. Getting an error:

hdf4/mfhdf/libsrc/xdrposix.c:530:1: error: conflicting types for ‘xdrposix_inline’

Is there a change required to hdfi.h or xdrposix.c to build on Linux-ARM64? (x86 platforms are not an issue)

Thanks!

CMake Fortran is broken on Ubuntu Linux

The Fortran ftest test fails on Ubuntu Linux 22.04.3 w/ gfortran 11.4.0. It passes on the Autotools. The Fortran interface has problems due to the way we jam 64-bit addresses into 32-bit Fortran integers, but we should investigate this since it works on the Autotools. We'll at least want to understand the problem.

        Start  69: ftest
 69/475 Test  #69: ftest ............................................................***Exception: SegFault  0.12 sec
99% tests passed, 1 tests failed out of 475

Label Time Summary:
HDF4Examples_SD         =   0.22 sec*proc (17 tests)
HDF4_HDF_TEST           =   1.19 sec*proc (64 tests)
HDF4_MFHDF_DUMPER       =   1.59 sec*proc (172 tests)
HDF4_MFHDF_FORTRAN      =   0.02 sec*proc (4 tests)
HDF4_MFHDF_HDFIMPORT    =   0.25 sec*proc (35 tests)
HDF4_MFHDF_HDIFF        =   0.28 sec*proc (16 tests)
HDF4_MFHDF_HREPACK      =   2.99 sec*proc (36 tests)
HDF4_MFHDF_NCDUMP       =   0.04 sec*proc (5 tests)
HDF4_MFHDF_NCGEN        =   0.06 sec*proc (6 tests)
HDF4_MFHDF_NCTEST       =   0.01 sec*proc (1 test)
HDF4_MFHDF_TEST         =   0.08 sec*proc (3 tests)
JAVA                    =   1.20 sec*proc (20 tests)
MFHDF                   =   5.24 sec*proc (275 tests)

Total Test time (real) =   9.41 sec

The following tests FAILED:
	 69 - ftest (SEGFAULT)
Errors while running CTest

(HDFFR-1593) Crash in VSinquire on a customer file

Date: Tue, 31 Aug 2021 14:42:20 +0000
From:
To: "[email protected]" [email protected]
Subject: Crash in VSinquire on a customer file
Parts/Attachments:
"(customer file).hdf"
"repro.cpp"

Hi HDF Priority Support!

We have a customer-reported crash on their HDF4 file (see (customer file).hdf attached), and I was able to
reproduce it outside (company name/product). Attached is the C repro code, repro.cpp, and the customer file, (customer file).hdf
(the customer said the file should only be shared for necessary debugging/investigation). When I run the repro code,
I see the following output (this was run on a Mac, but we see this crash on other platforms too):

file_id after Hopen: 536870912
status after Vstart: 0
count after Vlone (1): 3
count after Vlone (2): 3
refs[0]: 46
refs[1]: 47
refs[2]: 1427
vgroup_id after Vattach: 805306374
nChild after Vntagrefs: 6
nChild after Vgettagrefs: 6
vdata_id after VSattach: 1073741824
Bus error: 10

The crash seems to happen during VSinquire call. This is the relevant portion of the (company name/product) crash stack trace, in
case it is helpful:

[ 7] 0x00000000610f51c3 [unknown function] at [unknown module] (no module specified)
[ 8] 0x0000000197942d71 scanattrs+00000236
[ 9] 0x000000019793aa05 VSsizeof+00000288
[ 10] 0x000000019793b365 VSinquire+00000176

Note that we are using the 4.2.5 HDF4 version.

Is this a (known) bug? Is there some corruption in the customer file? We would appreciate any information about this
issue.

Thank you very much!

JIRA issue: https://jira.hdfgroup.org/browse/HDFFR-1593

(HDFFR-1597) SegV with hdf4 function 'DF24getimage'

From the user:

"I am seeing a segV with the use of the HDF4 function ‘DF24getimage’ with a corrupt HDF4 file and I was wondering if there is a way to mitigate this.

We are using hdf4 version 4.2.15 and could reproduce the segV in Windows 11 as well as Debian 11.

I have a corrupt hdf4 file with 24 bit raster image data, and I am trying to read the image data using the function ‘DF24getimage’. I am attaching my repro code to this email as ‘DF24getimage_segV.cpp’ and am also attaching the corrupt hdf file (rose24ap_type0_schema2_sample3.hdf) .
During the execution of the repro code, the process aborts unexpectedly with the following message during the execution of the function ‘DF24getimage’:

“Bogus marker length”

Is there any way in which the segV can be avoided and this can be caught as an error code, which can be returned to the calling function?

Feel free to let me know if you have any follow up questions regarding this issue."

JIRA issue: HDFFR-1597

Address issues raised in #260

Code review in PR #260 raised some concerns about memory usage (among other things). We should go back over those comments and address code smells.

Errors building on RH 8

I don't seem to be able to come up with the right combination of configure flags to get HDF4 to build on Red Hat 8.

I am also using GCC 10.3.0 for gcc/g++ and gfortran, so I must use -fallow-argument-mismatch with gfortran.

$ uname -a
Linux build.stage.arc-ts.umich.edu 4.18.0-305.25.1.el8_4.x86_64 #1 SMP Mon Oct 18 14:34:11 EDT 2021 x86_64 x86_64 x86_64 GNU/Linux

First, it seems that RH 8 no longer installs rpc.h as part of the glibc-devel package as it did with RH 7

glibc-headers-2.17-324.el7_9.x86_64 : Header files for development using
                                    : standard C libraries.
Repo        : updates
Matched from:
Filename    : /usr/include/rpc/rpc.h
$ dnf whatprovides /usr/include/rpc/rpc.h
Last metadata expiration check: 0:37:23 ago on Tue 26 Apr 2022 02:04:56 PM EDT.
Error: No Matches found

Following some web pages, I installed libtirpc-devel, which provides /usr/include/tirpc/rpc/rpc.h, but that turned out not to matter (I think), as the configure that finally did work still works after libtirpc-devel is erased.

I tried this to configure HDF4

./configure --prefix=/tmp/bennetsw/local \
    --enable-fortran --enable-production \
    --with-jpeg=$IMG_LIBS_HOME
. . . .
checking jpeglib.h usability... yes
checking jpeglib.h presence... yes
checking for jpeglib.h... yes
checking for jpeg_start_decompress in -ljpeg... yes
checking for szlib... suppressed
checking for xdr library support... checking rpc/rpc.h usability... no
checking rpc/rpc.h presence... no
checking for rpc/rpc.h... no
checking for rpc/rpc.h... (cached) no
configure: error: couldn't find rpc headers

I can get past ./configure if I add --enable-hdf4-xdr, which gives me this summary

Post process libhdf4.settings
config.status: executing depfiles commands
config.status: executing libtool commands
config.status: executing .classes commands
	    SUMMARY OF THE HDF4 CONFIGURATION
	    =================================

General Information:
-------------------
		   HDF4 Version: 4.2.15
		  Configured on: Tue Apr 26 14:47:22 EDT 2022
		  Configured by: [email protected]
		 Configure mode: production
		    Host system: x86_64-unknown-linux-gnu
              Uname information: Linux gls8-build.stage.arc-ts.umich.edu 4.18.0-305.25.1.el8_4.x86_64 #1 SMP Mon Oct 18 14:34:11 EDT 2021 x86_64 x86_64 x86_64 GNU/Linux
		      Libraries: 
	     Installation point: /tmp/bennetsw/local

Compiling Options:
------------------
               Compilation Mode: production
                     C compiler: /sw/pkgs/arc/gcc/10.3.0/bin/gcc ( gcc (GCC) 10.3.0)
                         CFLAGS:  -O3 -fomit-frame-pointer
                       CPPFLAGS:  -I/sw/pkgs/coe/o/libimage/gcc_10_3/220318.1/include 
               Shared Libraries: no
               Static Libraries: yes
  Statically Linked Executables: no
                        LDFLAGS:  -L/sw/pkgs/coe/o/libimage/gcc_10_3/220318.1/lib
 	 	Extra libraries: -ljpeg -lz 
 		       Archiver: ar
 		 	 Ranlib: ranlib

Languages:
----------
                        Fortran: yes
               Fortran Compiler: /sw/pkgs/arc/gcc/10.3.0/bin/gfortran ( GNU Fortran (GCC) 10.3.0)
                         FFLAGS:  -fallow-argument-mismatch -O

                           Java: no

Features:
---------
               SZIP compression: disabled
   Support for netCDF API 2.3.2: yes

I then get link errors about undefined references to a number of xdr related functions.

make[2]: Entering directory '/tmp/bennetsw/build/hdf-4.2.15/mfhdf/ncgen'
./ncgen -c -o ctest0.nc ./test0.cdl > test0.c
/sw/pkgs/arc/gcc/10.3.0/bin/gcc -DHAVE_CONFIG_H -I. -I../../hdf/src  -I../../hdf/src -I../../mfhdf/libsrc -I../../mfhdf/libsrc -I/tmp/bennetsw/build/hdf-4.2.15/mfhdf/xdr -DNDEBUG -DHDF -I/sw/pkgs/coe/o/libimage/gcc_10_3/220318.1/include   -O3 -fomit-frame-pointer -c -o ctest0.o test0.c
/bin/sh ../../libtool  --tag=CC   --mode=link /sw/pkgs/arc/gcc/10.3.0/bin/gcc  -O3 -fomit-frame-pointer  -L/sw/pkgs/coe/o/libimage/gcc_10_3/220318.1/lib -o ctest0 ctest0.o ../../mfhdf/libsrc/libmfhdf.la ../../hdf/src/libdf.la /tmp/bennetsw/build/hdf-4.2.15/mfhdf/xdr/libxdr.la -L/sw/pkgs/coe/o/libimage/gcc_10_3/220318.1/lib ../../mfhdf/libsrc/.libs/libmfhdf.a ../../hdf/src/.libs/libdf.a -ljpeg -lz 
libtool: link: /sw/pkgs/arc/gcc/10.3.0/bin/gcc -O3 -fomit-frame-pointer -o ctest0 ctest0.o  -L/sw/pkgs/coe/o/libimage/gcc_10_3/220318.1/lib /tmp/bennetsw/build/hdf-4.2.15/mfhdf/xdr/.libs/libxdr.a ../../mfhdf/libsrc/.libs/libmfhdf.a ../../hdf/src/.libs/libdf.a -ljpeg -lz
../../mfhdf/libsrc/.libs/libmfhdf.a(cdf.o): In function `xdr_numrecs':
cdf.c:(.text+0x34f3): undefined reference to `xdr_u_long'
../../mfhdf/libsrc/.libs/libmfhdf.a(cdf.o): In function `xdr_cdf':
cdf.c:(.text+0x35f8): undefined reference to `xdr_u_long'
../../mfhdf/libsrc/.libs/libmfhdf.a(cdf.o): In function `xdr_NC_fill':
cdf.c:(.text+0x3e76): undefined reference to `xdr_bytes'
cdf.c:(.text+0x3ed6): undefined reference to `xdr_int'
cdf.c:(.text+0x3ee6): undefined reference to `xdr_float'
cdf.c:(.text+0x3ef6): undefined reference to `xdr_double'
../../mfhdf/libsrc/.libs/libmfhdf.a(cdf.o): In function `xdr_numrecs':
cdf.c:(.text+0x3527): undefined reference to `xdr_u_long'
../../mfhdf/libsrc/.libs/libmfhdf.a(dim.o): In function `xdr_NC_dim':
dim.c:(.text+0x791): undefined reference to `xdr_long'
../../mfhdf/libsrc/.libs/libmfhdf.a(putget.o): In function `xdr_NCvbyte':
putget.c:(.text+0x301): undefined reference to `xdr_opaque'
putget.c:(.text+0x37b): undefined reference to `xdr_opaque'
putget.c:(.text+0x3de): undefined reference to `xdr_opaque'
../../mfhdf/libsrc/.libs/libmfhdf.a(putget.o): In function `xdr_NCvshort':
putget.c:(.text+0x482): undefined reference to `xdr_opaque'
putget.c:(.text+0x4fe): undefined reference to `xdr_opaque'
../../mfhdf/libsrc/.libs/libmfhdf.a(putget.o):putget.c:(.text+0x56c): more undefined references to `xdr_opaque' follow
../../mfhdf/libsrc/.libs/libmfhdf.a(putget.o): In function `xdr_NCvdata':
putget.c:(.text+0x7c0): undefined reference to `xdr_float'
putget.c:(.text+0x808): undefined reference to `xdr_int'
putget.c:(.text+0x818): undefined reference to `xdr_double'
putget.c:(.text+0x839): undefined reference to `xdr_opaque'
../../mfhdf/libsrc/.libs/libmfhdf.a(putget.o): In function `xdr_NCv1data':
putget.c:(.text+0x681): undefined reference to `xdr_int'
putget.c:(.text+0x699): undefined reference to `xdr_float'
putget.c:(.text+0x6b1): undefined reference to `xdr_double'
../../mfhdf/libsrc/.libs/libmfhdf.a(sharray.o): In function `NCxdr_shortsb':
sharray.c:(.text+0x426): undefined reference to `xdr_opaque'
../../mfhdf/libsrc/.libs/libmfhdf.a(sharray.o): In function `NCxdr_shortsb.constprop.0':
sharray.c:(.text+0x9d7): undefined reference to `xdr_opaque'
../../mfhdf/libsrc/.libs/libmfhdf.a(string.o): In function `xdr_NC_string':
string.c:(.text+0x566): undefined reference to `xdr_u_long'
string.c:(.text+0x5a9): undefined reference to `xdr_u_long'
string.c:(.text+0x5be): undefined reference to `xdr_opaque'
string.c:(.text+0x645): undefined reference to `xdr_opaque'
string.c:(.text+0x666): undefined reference to `xdr_u_long'
../../mfhdf/libsrc/.libs/libmfhdf.a(var.o): In function `xdr_NC_var':
var.c:(.text+0xd0c): undefined reference to `xdr_enum'
var.c:(.text+0xd26): undefined reference to `xdr_u_long'
var.c:(.text+0xd5e): undefined reference to `xdr_u_long'
../../mfhdf/libsrc/.libs/libmfhdf.a(array.o): In function `xdr_NC_array':
array.c:(.text+0x9bd): undefined reference to `xdr_enum'
array.c:(.text+0x9da): undefined reference to `xdr_u_long'
array.c:(.text+0xaa1): undefined reference to `xdr_opaque'
array.c:(.text+0xb62): undefined reference to `xdr_int'
array.c:(.text+0xc02): undefined reference to `xdr_double'
array.c:(.text+0xc22): undefined reference to `xdr_float'
../../mfhdf/libsrc/.libs/libmfhdf.a(iarray.o): In function `xdr_NC_iarray':
iarray.c:(.text+0x110): undefined reference to `xdr_u_long'
iarray.c:(.text+0x13b): undefined reference to `xdr_int'
iarray.c:(.text+0x186): undefined reference to `xdr_u_long'
iarray.c:(.text+0x1d3): undefined reference to `xdr_int'
collect2: error: ld returned 1 exit status
make[2]: *** [Makefile:1220: ctest0] Error 1
make[2]: Leaving directory '/tmp/bennetsw/build/hdf-4.2.15/mfhdf/ncgen'
make[1]: *** [Makefile:430: all-recursive] Error 1
make[1]: Leaving directory '/tmp/bennetsw/build/hdf-4.2.15/mfhdf'
make: *** [Makefile:515: all-recursive] Error 1

This is the compile line, as far as I can tell,

/sw/pkgs/arc/gcc/10.3.0/bin/gcc -O3 -fomit-frame-pointer -o ctest0 ctest0.o  -L/sw/pkgs/coe/o/libimage/gcc_10_3/220318.1/lib /tmp/bennetsw/build/hdf-4.2.15/mfhdf/xdr/.libs/libxdr.a ../../mfhdf/libsrc/.libs/libmfhdf.a ../../hdf/src/.libs/libdf.a -ljpeg -lz

and I can get past the undefined xdr_ functions simply by adding additional -Ls in front of the three .a files that are not prefixed with it.

$ /sw/pkgs/arc/gcc/10.3.0/bin/gcc -O3 -fomit-frame-pointer -o ctest0 ctest0.o  -L/sw/pkgs/coe/o/libimage/gcc_10_3/220318.1/lib -L/tmp/bennetsw/build/hdf-4.2.15/mfhdf/xdr/.libs/libxdr.a -L../../mfhdf/libsrc/.libs/libmfhdf.a -L../../hdf/src/.libs/libdf.a -ljpeg -lz

ctest0.o: In function `main':
test0.c:(.text.startup+0x19): undefined reference to `nccreate'
test0.c:(.text.startup+0x2c): undefined reference to `ncdimdef'
test0.c:(.text.startup+0x40): undefined reference to `ncdimdef'
test0.c:(.text.startup+0x51): undefined reference to `ncdimdef'
test0.c:(.text.startup+0x62): undefined reference to `ncdimdef'
test0.c:(.text.startup+0x9e): undefined reference to `ncvardef'
test0.c:(.text.startup+0xc1): undefined reference to `ncvardef'
test0.c:(.text.startup+0xe5): undefined reference to `ncvardef'
test0.c:(.text.startup+0x109): undefined reference to `ncvardef'
test0.c:(.text.startup+0x12b): undefined reference to `ncvardef'
ctest0.o:test0.c:(.text.startup+0x14e): more undefined references to `ncvardef' follow
ctest0.o: In function `main':
test0.c:(.text.startup+0x1bd): undefined reference to `ncattput'
test0.c:(.text.startup+0x1e6): undefined reference to `ncattput'
test0.c:(.text.startup+0x20d): undefined reference to `ncattput'
test0.c:(.text.startup+0x243): undefined reference to `ncattput'
test0.c:(.text.startup+0x26f): undefined reference to `ncattput'
ctest0.o:test0.c:(.text.startup+0x291): more undefined references to `ncattput' follow
ctest0.o: In function `main':
test0.c:(.text.startup+0x298): undefined reference to `ncendef'
test0.c:(.text.startup+0x2b2): undefined reference to `ncvarput'
test0.c:(.text.startup+0x2d1): undefined reference to `ncvarput'
test0.c:(.text.startup+0x2f0): undefined reference to `ncvarput'
test0.c:(.text.startup+0x309): undefined reference to `ncvarput'
test0.c:(.text.startup+0x323): undefined reference to `ncvarput'
ctest0.o:test0.c:(.text.startup+0x33d): more undefined references to `ncvarput' follow
ctest0.o: In function `main':
test0.c:(.text.startup+0x379): undefined reference to `ncclose'
collect2: error: ld returned 1 exit status

but that is really just flailing at the problem.

I do note that this configure command creates a configuration that builds everything and passes all tests.

$ ./configure --prefix=/tmp/bennetsw/local     --disable-fortran --enable-shared --enable-production    --with-jpeg=$IMG_LIBS_HOME --enable-hdf4-xdr

but this one does not,

$ ./configure --prefix=/tmp/bennetsw/local     --enable-fortran --disable-shared --enable-production    --with-jpeg=$IMG_LIBS_HOME --enable-hdf4-xdr

so it seems that enabling Fortran leads to some missing libraries somewhere along the way.

Has anyone successfully built HDF4 on Red Hat 8?

All hints welcome!

hdf/test/comp.c msan failure on write

With master's current head with a custom linux build...

My guess is that init_buffers(void) is not fully initializing all the buffers

InitTest("comp", test_comp, "COMPRESSED ELEMENTS"); 

Fails here:

  if (HI_WRITE(file_rec->file, buf, bytes) == FAIL)

Fails with:

Testing  -- COMPRESSED ELEMENTS (comp) 
Uninitialized bytes in __interceptor_fwrite at offset 513 inside [0x72000000e000, 514)
==5664==WARNING: MemorySanitizer: use-of-uninitialized-value
    #0 0x7fa69ea9df0b in HP_write third_party/hdf4/hdf/src/hfile.c:3593:9

SUMMARY: MemorySanitizer: use-of-uninitialized-value third_party/hdf4/hdf/src/hfile.c:3593:9 in HP_write
Exiting

Remove the dependency on XDR

XDR will be removed and replaced with a thin I/O layer that handles long integer data correctly.

XDR is currently not in the public API and is not otherwise exposed. Users should experience no change in behavior from this change.
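The core of such a thin layer is XDR's on-disk integer encoding, which is simply 4-byte big-endian. A minimal sketch (the helper names are illustrative, not the actual HDF4 replacement API):

```c
#include <stdint.h>

/* XDR stores integers on disk as 4-byte big-endian quantities. These
 * helpers reproduce that encoding portably, independent of the host byte
 * order. (Names are illustrative, not the actual replacement API.) */
void encode_u32_be(uint32_t value, uint8_t out[4])
{
    out[0] = (uint8_t)(value >> 24);
    out[1] = (uint8_t)(value >> 16);
    out[2] = (uint8_t)(value >> 8);
    out[3] = (uint8_t)value;
}

uint32_t decode_u32_be(const uint8_t in[4])
{
    return ((uint32_t)in[0] << 24) | ((uint32_t)in[1] << 16) |
           ((uint32_t)in[2] << 8)  |  (uint32_t)in[3];
}
```

Because the layer does its own widening and narrowing through fixed-width types, it can also sidestep the historical xdr_long/xdr_u_long pitfalls on platforms where long is 64 bits.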

(HDFFR-1586) Problem w/HXsetcreatedir and VSsetexternalfile

Date: Wed, 18 Nov 2020 21:33:24 +0000
From:
To: prioritysupport [email protected]
Subject: Bug with 'VSsetexternalfile' in hdf4 library
Attachments:
"hdf4_bug_repro.cpp"

Hello,

I am a developer with <company name>, and I am writing to you regarding a bug that might exist in the hdf4 library on Windows.
Recently, some of our (company product) test suites pertaining to hdf4 started failing sporadically on Windows, and by investigating further, I have (hopefully)
isolated and narrowed the issue to the function 'VSsetexternalfile' in the VS interface of hdf4, called after the function 'HXsetcreatedir'. The issue
that I am seeing happens only in Windows.

I am attaching a C++ standalone code to highlight the issue, but the short summary of the issue is as follows.

  1. Basically, the repro code that I am attaching builds and executes correctly on Linux (none of the hdf4 functions return a -1 error code), with the
    small caveat that the 'dim_sizes' argument to SDcreate (line 33 in the attached code) needs to be 'int' for Linux and 'int32' (typedef long int32)
    for Windows. Otherwise the code fails to compile. I was wondering if you can help me understand why the same argument to the same function needs
    to be of a different type on Windows and Linux.
  2. The second and bigger issue I am seeing is that the attached standalone code, when executed (with 'int32 dim_sizes' for Windows and 'int
    dim_sizes' for Linux), returns status -1 from the function call 'VSsetexternalfile' (line 70 in the attached code) on Windows, whereas the same
    line returns status '0' on Linux. Furthermore, I have seen that, on Windows, if I comment out line 36 (status =
    HXsetcreatedir(pathname.c_str())), the status returned by 'VSsetexternalfile' is 0 (success).

My understanding of the hdf4 library is limited, but I was under the impression that the SD and the VS interfaces are independent and before moving
to the VS interfaces from the SD interfaces, the library should reset. I do not get the same errors if I unload and reload the library after line 41
(between the SD interface code and the VS interface code) which leads me to suspect that the library is not resetting correctly in Windows.

I would be really glad if someone could take a look at this and let me know if there is something that we are doing wrong. Kindly let me know if you
need further information from us.

Thanks,

JIRA issue: https://jira.hdfgroup.org/browse/HDFFR-1586


Date: Thu, 19 Nov 2020 21:40:38 +0000
From:
To: "[email protected]" [email protected]
Subject: RE: Bug with 'VSsetexternalfile' in hdf4 library

Hello Barbara,

Thank you for looking into the same and registering a bug report.
It would be really helpful to have a workaround for the HXsetcreatedir issue to avoid seeing the sporadic failures in our test suite.

Also, since we don't have the latest versions of the hdf4 library, I was wondering if someone at your end can test it with the latest version of hdf4.
If the issue is not reproducible in a newer version of the library, we can upgrade to one of the newer versions to resolve this issue.

Thanks,

`libdf` missing in CMake build

Hi there!

It looks like the CMake build is missing libdf, which I was able to build with autogen.
Is that possible?

Thanks!

CMake soversion differs from Autotools

The autotools build creates a library with a soversion of 0 (soname of libdf.so.0), but the cmake build makes one with a soversion of 4 (soname of libdf.so.4). Why? Shouldn't the libraries be equivalent?

Stop deploying internal header files

Starting in 4.3.0, internal header files with undocumented API calls will no longer be deployed.

We'll update this issue with a full list of changes in the early fall.

Add an --enable-netcdf-tools configure option

The --enable-netcdf configure option determines if HDF4-built netCDF-2 calls will have their names mangled to avoid conflicts with the real netCDF library. Users have also asked that we provide a way to not build the netCDF tools ncdump, ncgen, and nctest since these will also conflict with the real netCDF.

The --enable-netcdf configure option was recently updated to provide that functionality, but that is probably the wrong thing to do. Instead we will add an --enable-netcdf-tools option to the Autotools (and HDF4_ENABLE_NETCDF_TOOLS to CMake) to control the building of the netCDF tools.
