
Problem writing big-endian data to existing file #1802

@neishm

Description

Writing big-endian data into an existing netCDF4 file (on a little-endian system) does not seem to swap the bytes properly.

The following example from Unidata/netcdf4-python#1033 (comment) triggers the problem:

#include <netcdf.h>
int main() {
    int i, iret, dimid, varid, ncid;
    int data[10];
    for (i = 0; i < 10; i++)
        data[i] = i;
    iret = nc_create("test.nc", NC_NETCDF4, &ncid);
    iret = nc_def_dim(ncid, "x", 10, &dimid);
    iret = nc_def_var(ncid, "v", NC_INT, 1, &dimid, &varid);
    iret = nc_def_var_endian(ncid, varid, NC_ENDIAN_BIG);
    /*iret = nc_put_var_int(ncid, varid, data);*/
    iret = nc_close(ncid);
    iret = nc_open("test.nc", NC_WRITE, &ncid);
    /* varid from the first session happens to remain valid for this file */
    iret = nc_put_var_int(ncid, varid, data);
    iret = nc_close(ncid);
    return 0;
}

The resulting file has the data:

 v = 0, 16777216, 33554432, 50331648, 67108864, 83886080, 100663296, 
    117440512, 134217728, 150994944 ;

where the values should be 0,1,2,3,4,5,6,7,8,9. This was tested with netCDF version 4.7.3 on an Ubuntu 20.04 system; the latest development snapshot (b9bb44f) was also tested, with the same result.

If the intermediate nc_close and nc_open calls are commented out (i.e. still writing into a new file), then the data is written correctly.
