I have a working workflow that ends with a CSV Writer.
The node just before the CSV Writer outputs Arabic text with no character loss. However, when the workflow ends with the CSV Writer, the resulting CSV file loses the Arabic characters. My CSV Writer node is configured with UTF-8 encoding, but the characters are still lost.
@pyroo56323 welcome to the KNIME forum. You might want to check what encoding you use to import the data in the first place. If that still does not work, you could provide us with an example so we can investigate further.
@mlauber71 thx, I initially had this in mind and have double-checked many times.
All the imports use UTF-8 encoding, which is probably why the last node before the CSV Writer shows all the text with no character loss.
I have also added `-Dfile.encoding=UTF8` to my knime.ini file, still no luck.
@pyroo56323 could you create an example where this happens within the workflow so we might investigate further. Maybe it is a special node that is responsible.
@mlauber71 I double-checked the node monitor of every node, going backward to the first node. Everything looks OK, no character loss or anything; the text is displayed in the correct character set. Only at the end of the workflow, with the CSV Writer (encoding set to UTF-8), are the characters lost when I open the output CSV file.
Created the example using the snips, anything else I should do for more elaboration?
However, another problem arises: the text in the second row is placed on the left cell margin, although it was originally on the right margin. This happens only in the local Excel file; in KNIME the table is displayed correctly. Again, I suspect this is a problem with how Excel reads and somehow sorts characters. Searching the Internet for how this can be customized should solve your issue.
Hi @roberto_cadili, thx for the help. The Excel Writer does work, no characters are lost.
But I wanted the output as a CSV, so maybe I will try reading the Excel file and writing a CSV from it.
In my example there is a CSV Writer using UTF-16, and the file gets read back in. You might also try that with UTF-8. You will have to set the encoding for the writer as well.
Unfortunately, I don't think that will work either. The problem is how Microsoft Excel automatically reads and decodes CSV files. Apparently, you can use macros to change how Excel decodes CSV files by default (for example, using UTF-8).
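As an illustration outside of KNIME (a minimal Python sketch, with a made-up file name), the usual workaround is to prepend a UTF-8 byte-order mark to the CSV file. Excel uses the BOM to detect that the file is UTF-8; without it, Excel falls back to a legacy code page and Arabic characters appear garbled. Python's `utf-8-sig` codec writes the BOM automatically:

```python
import csv

rows = [["name", "greeting"], ["Arabic", "مرحبا"]]

# "utf-8-sig" prepends the BOM bytes EF BB BF, which Excel
# reads as a signal that the file content is UTF-8.
with open("greetings.csv", "w", encoding="utf-8-sig", newline="") as f:
    csv.writer(f).writerows(rows)
```

Opening the resulting file in Excel should then show the Arabic text correctly, because the BOM removes the encoding guesswork.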
To prove that the problem lies with Microsoft Excel and the way it reads CSV files, try opening the CSV files created by the CSV Writer node with any CSV reader other than Microsoft Excel, for example plain Notepad or Notepad++.
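The same check can be done programmatically. This hedged sketch (the helper name is my own invention) simply tries to decode the raw bytes of the written file as UTF-8: if that succeeds, the CSV Writer produced the data intact, and the garbling happens only in Excel's default decoding step.

```python
def is_valid_utf8(path):
    """Return True if the file's bytes decode cleanly as UTF-8."""
    with open(path, "rb") as f:
        raw = f.read()
    try:
        raw.decode("utf-8")
        return True
    except UnicodeDecodeError:
        return False
```

If this returns `True` for the file your workflow wrote, the Arabic text is still there byte-for-byte, and the fix belongs on the Excel side rather than in the workflow.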