Load NetCDF data

Hi everyone

I wonder whether somebody could lend a helping hand with loading NetCDF data (Norwegian climate data) into a PostGIS installation. You can find the data at:

Precipitation NetCDF file.

When I try to read it with the R model reader, ignoring that a zip file is required, it rightly complains that the file has no zip entry.

If I zip the .nc file and try to read that, it rightly complains that there is no content.xml file, so it is unable to extract the specs.
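The zip errors are expected: NetCDF is a binary format, not a zip archive, so a reader that expects a zipped model file cannot open it. One quick way to confirm what kind of file you actually have is to inspect its magic bytes. Classic NetCDF files begin with the bytes CDF, while netCDF-4 files are HDF5 containers starting with \x89HDF. A small sketch — sample.nc here is a generated stand-in, not the real precipitation file:

```shell
# Create a stand-in file carrying the classic-NetCDF magic bytes ("CDF" + version byte)
printf 'CDF\001' > sample.nc

# Inspect the first three bytes: "CDF" means classic NetCDF, a \x89HDF prefix would mean netCDF-4/HDF5
head -c 3 sample.nc && echo
```

If the check prints CDF (or the file starts with \x89HDF), the file is fine as-is and simply needs a NetCDF-aware reader rather than a zip-based one.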

I have seen two posts talking about R and NetCDF, but I set them aside, assuming that in seven years things might have changed a bit.

Does somebody have experience with NetCDF data they could share?

Kind regards


Hi Thiemo -

Is this post addressing the same topic as ERROR: row is too big: size 9592, maximum size 8160?

It seems like it might be - if so, I will close it.

Hi Scott

It has the same origin: trying to load Norwegian climate data. However, this post is about reading NetCDF data, whereas the one you linked is about loading their textual representation (an .asc file), which led to the error mentioned in that topic. The latter post is actually more general, as it is about solutions for loading very wide data into a database, specifically PostgreSQL.

To me, they are only loosely connected.



Ah, OK, we’ll leave the topic open then - maybe someone else will know more. I haven’t used NetCDF data myself. I know this is an old topic, but is the suggestion by Aaron Hart useful? Maybe the NetCDF base format hasn’t changed much in the intervening years:

Thanks for pointing to Aaron’s reply. I have never coded in R, so I am reluctant to try. It might solve my original task, but it does not answer the question of this post. I believe I was able to load the NetCDF data with raster2pgsql, but I could not verify that yet. The database size increased enormously, but I have not been able to locate the data in the database. I might try to get help on the PostGIS mailing list.
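For reference, loading a NetCDF variable with raster2pgsql typically means pointing it at the variable via GDAL's NETCDF subdataset syntax and piping the generated SQL into psql. A sketch — the file name rr.nc, variable name rr, target table and database name are all assumptions for illustration:

```shell
# Import the "rr" variable from rr.nc as tiled rasters into public.precipitation
# -s 4326: assign SRID 4326; -I: build a spatial index; -C: apply raster constraints
# -t 128x128: split the grid into 128x128 tiles so individual rows stay small
raster2pgsql -s 4326 -I -C -t 128x128 NETCDF:"rr.nc":rr public.precipitation \
  | psql -d climate
```

The tiling flag (-t) is what keeps rows small, and the -C constraints are also what register the table in PostGIS's raster_columns catalog, which makes it much easier to find the data afterwards.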

From what I see, there should be a way to use R to help you with your tasks.

The documentation recommends an R package, “raster”:


The package is especially useful for large datasets that don’t fit into memory, because data is processed in chunks.

Then there seems to be a package that loads the data into a Postgres database. You would have to see whether you can combine the two and whether they can handle the size.

And then you might see if you could use KNIME as an interface to control your R code. Or you could see whether the database nodes can handle the output from the first package (I have not checked that).

Or this package promises to provide an interface between R and PostGIS.


A quick follow-up. I played around with the data; R is able to load the structure and display it, and there are several pages that describe how to go about it. But my knowledge of geodata is limited (as is my time :slight_smile: )

# https://www.markcherrie.net/post/netcdf-in-r/

# reading NetCDF in R
library(ncdf4)

# ---> EDIT:
workpath_r <- "/Users/m_lauber/Dropbox/knime-workspace/forum/kn_forum_import_netcdf/data/"
setwd(workpath_r) # set work directory

# get all NetCDF files (match the .nc extension case-insensitively)
flist <- list.files(path = ".", pattern = "\\.nc$", ignore.case = TRUE)

# Open a connection to the first file in our list
nc <- nc_open(flist[1])

# print the file's structure: dimensions, variables and attributes
print(nc)


This would give you some insight into the structure - so you might further address the elements in question:

One idea would be to extract the tables you are interested in, save them as Parquet files, and load them into a PostGIS database through the KNIME connector.

You could try that by using Postgres.app, which comes with PostGIS integrated (or your own database).

I have collected some links and resources on the hub - you might explore that, and I may expand it. The rudimentary R script is also there.


Many, many thanks for your time, patience, research work and opinion! My checks on the database revealed that I was not successful in loading the NetCDF data, so getting into R seems to be the way to go.
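For checking whether a raster load actually succeeded, PostGIS maintains a catalog view, raster_columns, that lists every registered raster table. A sketch of the check — the database name climate and table name are assumptions:

```shell
# List the raster tables PostGIS knows about, with their schema and SRID
psql -d climate -c "SELECT r_table_schema, r_table_name, srid FROM raster_columns;"
```

Note that a table loaded without raster2pgsql's -C (constraints) flag may not show up in this view; in that case a plain SELECT count(*) on the table itself still tells you whether any rows arrived.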

I think there are ways to get the data into Postgres/PostGIS via R. The point is that you would have to know how to present the data so it fits - that is the tricky part.

Your example seems to hold at least two major sources. I think it would make sense to use R to untangle them and then use something like rpostgis to load them into PostGIS.

Or extract the information as a table and load that into Postgres (via Parquet, maybe). Here KNIME could do its thing, and it could also function as a wrapper for some R code you might build.

Other than that, this is not strictly speaking a KNIME question :slight_smile:


This topic was automatically closed 182 days after the last reply. New replies are no longer allowed.