SAS Data file

WARN SAS7BDAT Reader 2:19 Node created an empty data table. I am importing a SAS data file with 2 million records, but the KNIME SAS reader reads it as an empty table. Please assist.

@wilbert_1 it is difficult to tell much from this information. The first thing to check would be whether compression was disabled (COMPRESS=NO) when the SAS file was created.

https://documentation.sas.com/?cdcId=pgmsascdc&cdcVersion=9.4_3.5&docsetId=ledsoptsref&docsetTarget=n014hy7167t2asn1j7qo99qv16wa.htm&locale=en
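
If you want to rule out the file itself, one quick sanity check outside of KNIME is to try reading it with pandas in Python. This is only a sketch; the file name is a placeholder:

```python
import pandas as pd

# Placeholder path - point this at the actual .sas7bdat file.
path = "bigfile.sas7bdat"

# Read only the first 10,000 rows to check that the file is readable at all
# outside of KNIME and to get a feel for its structure.
reader = pd.read_sas(path, format="sas7bdat", chunksize=10_000)
first_chunk = next(reader)
reader.close()

print(first_chunk.shape)
print(first_chunk.dtypes)
```

If this already fails or returns no rows, the problem is probably the file; if it works, the focus shifts to the KNIME reader or memory.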

Hi. I did not use compression at all, so I don't know what the problem is. Could it be that KNIME is not able to read very big files, like 34 GB?

Are you able to read any SAS files at all? Have you checked whether compression is set to NO?

34 GB sounds like a challenge, but that depends on your machine. Can you tell us something about your machine and configuration? If this reader is not working, you might also want to consider using other export formats to get the data from SAS to KNIME.
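
If you go down that route, one option is to convert the file to CSV in chunks outside of KNIME and then load it with the CSV Reader node. A minimal sketch with pandas, assuming the placeholder file names below:

```python
import pandas as pd

# Placeholder file names - adjust to your environment.
src = "bigfile.sas7bdat"
dst = "bigfile.csv"

# Stream the SAS file in chunks so the 34 GB never has to fit into memory,
# appending each chunk to a single CSV that KNIME's CSV Reader can consume.
for i, chunk in enumerate(pd.read_sas(src, format="sas7bdat", chunksize=100_000)):
    chunk.to_csv(dst, mode="w" if i == 0 else "a", header=(i == 0), index=False)
```

This only needs to run once; afterwards the CSV Reader node can read the CSV directly.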

Memory: 64 GB
Hard drive: 3 TB

OK, how much of this has been allocated to KNIME? Without further information it is very hard to get an idea of what might be going on, so please give us a little more insight into what is happening.
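
For reference, the memory available to KNIME is controlled by the -Xmx entry in the knime.ini file in the KNIME installation folder. With 64 GB of RAM you could allocate a larger heap, for example (the value is only an illustration, not a recommendation):

```
-Xmx48g
```

Restart KNIME after changing it.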

A 34 GB file, possibly compressed, sounds like a lot of data. Even if you were able to load it into KNIME somehow, you would still have to see whether your machine can handle it. Do you have an SSD or an HDD?

You could set the log level to DEBUG, clear the log file, and try again to see if that gives us any insight.

The problem does not seem to be the file itself; it seems to be KNIME itself. I sampled 20,000 records and converted the file to text, but the problem still persists.

Hmm, this sounds strange. Can you test this sample and see whether it works?

And then again: it might help if you could provide us with more information, maybe even a sample file without sensitive information.
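
If you do share something, a small anonymized extract can be produced along the same lines (the file name and column names below are placeholders):

```python
import pandas as pd

# Placeholders - replace with the real file and whatever columns are sensitive.
src = "bigfile.sas7bdat"
sensitive_cols = ["customer_name", "account_no"]

# Take only the first 1,000 rows and drop sensitive columns before sharing.
sample = next(pd.read_sas(src, format="sas7bdat", chunksize=1_000))
sample = sample.drop(columns=[c for c in sensitive_cols if c in sample.columns])
sample.to_csv("sample_for_forum.csv", index=False)
```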

Hi @wilbert_1,
which memory policy did you set for the SAS Reader node? Writing all data to the hard disk slows down the process, but it could finally resolve the issue.
If it works with a small file but not with the large one, that may be a hint that there is not enough memory. In that case, though, you would normally see entries in the log file.

BR
