I wonder whether somebody could lend a helping hand with loading NetCDF data (Norwegian climate data) into a PostGIS installation. You can find the data at
When I try to read it with the R model reader, ignoring that a zip file is required, it rightfully complains that the file has no zip entry.
If I zip the .nc file and try to read that, it rightfully complains that there is no content.xml file, so it is unable to extract the specs.
I have seen two posts about R and NetCDF, but I set them aside, assuming that in seven years things might have changed a bit.
Does somebody have experience with NetCDF data they could share?
It has the same origin: trying to load Norwegian climate data. However, this post is about reading NetCDF data, whereas the one you linked is about loading their textual representation (an .asc file), which leads to the error mentioned in the topic. The latter post is actually more general, as it is about loading very broad data into a database, specifically PostgreSQL.
Ah, OK, we’ll leave the topic open then - maybe someone else will know more. I haven’t used NetCDF data myself. I know this is an old topic, but is the suggestion by Aaron Hart useful? Maybe the NetCDF base format hasn’t changed much in the intervening years:
Thanks for pointing at Aaron’s reply. I have never coded in R, so I am reluctant to try. It might solve my original task, but it does not answer the question of this post. I believe I was able to load the NetCDF data with raster2pgsql, but I could not verify it yet. The database size increased enormously, yet I could not locate the data in the database. I might ask for help on the PostGIS mailing list.
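For reference, the raster2pgsql call I mean looks roughly like this (file name, table name, SRID, and database are placeholders for my setup; NetCDF support depends on PostGIS being built against a GDAL with the NetCDF driver):

```shell
# Sketch only: load a NetCDF file as a raster table.
# -s sets the SRID, -I builds a spatial index, -C applies standard
# raster constraints, -t tiles the raster into 100x100 blocks.
raster2pgsql -s 4326 -I -C -t 100x100 climate_data.nc public.climate_raster \
  | psql -d mydb
```

Whether the variables inside the file end up as separate bands is something I still need to check.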
The package is especially useful for large datasets that don’t fit into memory, because data is processed in chunks.
Then there seems to be a package that does load the data into a Postgres database. You would have to see if you could combine the two and if they can handle the size.
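If a flat table is all you need, the standard DBI route might already be enough. A minimal sketch, assuming you have already extracted a data frame `climate_df` from the NetCDF file (connection details and table name here are made up):

```r
library(DBI)
library(RPostgres)

# Hypothetical connection details -- adjust to your own setup
con <- dbConnect(RPostgres::Postgres(),
                 dbname = "climate", host = "localhost",
                 user = "postgres", password = "secret")

# 'climate_df' is assumed to be a data frame built from the NetCDF file
dbWriteTable(con, "climate_obs", climate_df, overwrite = TRUE)

dbDisconnect(con)
```

For very large data you would write in chunks (e.g. with `append = TRUE`) rather than in one go.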
And then you might see if you could use KNIME as an interface to control your R code. Or you could see if you could use the database nodes to handle the output from the first package (I have not checked that).
Or this package promises to provide an interface between R and PostGIS.
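If that package is rpostgis, it has a `pgWriteRast()` helper for pushing a raster object straight into a PostGIS raster table. A sketch, assuming the file opens cleanly via the raster package (the variable name and connection details are guesses):

```r
library(RPostgreSQL)
library(raster)
library(rpostgis)

# Hypothetical connection -- adjust to your own setup
con <- dbConnect(PostgreSQL(), dbname = "climate", user = "postgres")

# Read one NetCDF variable as a raster layer
# (the varname is an assumption -- check the file structure first)
r <- raster("climate_data.nc", varname = "precipitation")

# Write it into PostGIS as a raster table
pgWriteRast(con, name = c("public", "climate_rast"), raster = r)

dbDisconnect(con)
```

That would sidestep raster2pgsql entirely and keep everything inside R.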
A quick follow-up. I played around with the data, and R is able to load the structure and display it, and there are several pages that describe how to go about it. But my knowledge of geodata is limited (as is my time).
# https://www.markcherrie.net/post/netcdf-in-r/
# reading ncdf in R
library(ncdf4)
library(reshape2)
library(dplyr)
library(raster)
# ---> EDIT:
workpath_r <- "/Users/m_lauber/Dropbox/knime-workspace/forum/kn_forum_import_netcdf/data/"
setwd(workpath_r) # Set work directory
# get all netcdf files
flist <- list.files(path = ".", pattern = "\\.nc$", ignore.case = TRUE)
# Open a connection to the first file in our list
nc <- nc_open(flist[1])
print(nc)
This would give you some insight into the structure - so you might further address the elements in question:
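Continuing from the `nc` handle opened above, pulling out individual elements might look like this (the variable and dimension names are assumptions - use whatever `print(nc)` actually reports for your file):

```r
library(ncdf4)

# Dimension vectors -- names are guesses, check print(nc)
lon  <- ncvar_get(nc, "lon")
lat  <- ncvar_get(nc, "lat")
time <- ncvar_get(nc, "time")

# A data variable, typically a lon x lat x time array
precip <- ncvar_get(nc, "precipitation")
dim(precip)

# Attributes (e.g. units) are often needed to interpret the values
ncatt_get(nc, "precipitation", "units")

nc_close(nc)
```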
Many, many thanks for your time, patience, research work, and opinion! My checks on the database revealed that I was not successful in loading the NetCDF data, so getting into R seems to be the way to go.
I think there are ways to get the data into Postgres/PostGIS via R. The point is that you would have to know how to present the data so it fits - that is the tricky part.
Your example seems to contain at least two major data sources. I think it would make sense to use R to untangle them and then use something like rpostgis to load them into PostGIS.
Or extract the information as a table and load that into Postgres (via Parquet, maybe). Here KNIME could do its thing. And it could also function as a wrapper for some R code you might build.
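The "extract as a table" route could look roughly like this: flatten the NetCDF array into a long lon/lat/time table and write it to Parquet with the arrow package, for KNIME or Postgres tooling to pick up. File, variable, and dimension names are assumptions:

```r
library(ncdf4)
library(reshape2)
library(arrow)

nc <- nc_open("climate_data.nc")  # file name is a placeholder

# Variable and dimension names are guesses -- check print(nc)
precip <- ncvar_get(nc, "precipitation")
dimnames(precip) <- list(lon  = ncvar_get(nc, "lon"),
                         lat  = ncvar_get(nc, "lat"),
                         time = ncvar_get(nc, "time"))
nc_close(nc)

# Flatten the 3-D array into a long lon/lat/time/value table
climate_df <- melt(precip, value.name = "precipitation")

# Write to Parquet; KNIME's Parquet Reader could take it from here
write_parquet(climate_df, "climate_data.parquet")
```

This keeps the heavy reshaping in R and leaves the database loading to whichever tool you prefer.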
Other than that, this is not, strictly speaking, a KNIME question.