Separating data and melting

Hi everyone,
I have a data frame with a column like this:


I want to separate the data in this column into several sub-columns (the values are separated by ‘;’) and then melt the table. I have already done this with R code.
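As a cross-check on the logic, the same separate-then-melt operation can be sketched in pandas (a KNIME Python node could run this instead of R). The column names `id`/`values` and the sample data are made up to mirror the description, not taken from the actual table:

```python
import pandas as pd

# Hypothetical sample mimicking the described column: one string cell
# holding several values joined by ';' (names "id"/"values" are made up).
df = pd.DataFrame({
    "id": ["A", "B"],
    "values": ["1;2;3", "4;5;6"],
})

# "Separate": split the ';'-delimited column into sub-columns.
parts = df["values"].str.split(";", expand=True)
parts.columns = [f"value_{i + 1}" for i in parts.columns]
wide = pd.concat([df[["id"]], parts], axis=1)

# "Melt": reshape from wide to long format, the pandas analogue
# of tidyr::pivot_longer / reshape2::melt.
long = wide.melt(id_vars="id", var_name="variable", value_name="value")
print(long)  # 6 rows: one per (id, value_i) pair
```

In pandas, `melt` plays the same role as R's `melt`/`pivot_longer`, so the result should be directly comparable to the R output.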
However, when I run it in KNIME with a conda environment, I get this error:

How can I fix this issue? I install the packages in my code as well as in the conda environment.
Thank you in advance.

@cuongnguyen it is possible to use R with the Conda Environment Propagation if you have Python/conda configured on your machine. You will also have to install additional packages like ‘tidyr’ or ‘pillar’ if they are not already there.

In general I think it would make sense to familiarize yourself with the use of KNIME and R so you can use them to your benefit.

https://docs.knime.com/latest/r_installation_guide/index.html#_introduction


Thank you so much @mlauber71 .
But when I check the installed packages in the conda environment, ‘tidyr’ and ‘pillar’ already exist:

Or should I set up a new environment?

@cuongnguyen the question is whether you have configured the R node to use the correct environment.

How do your conda and R settings look? Maybe you can provide screenshots.

Also, you will have to load all the libraries with library() calls in your script.

I can’t help with the R part, but instead of processing the table with R you could split by the delimiter into a collection column and then ungroup it. Or maybe unpivoting is the KNIME equivalent of melt.
br


@mlauber71 here is my workflow — could you please run it with your environment to help me check it?
Test_data_wrangling.knwf (23.1 KB)

Thank you @Daniel_Weikert, I will try it.

I already fixed this by installing a new R environment.


@cuongnguyen maybe try again with some more basic settings and a newer R version (4.1.3), which is available via conda-forge. Your environment propagation has a lot of Linux-specific packages that I cannot install on Windows/macOS, so I set up a more generic YAML file — you could add other packages and update the environment later.

The YML/YAML file would look like this:

# conda env create -f py39_knime_r.yml
# conda env update -f py39_knime_r.yml --prune
# https://forum.knime.com/t/python-environment-creation-problem/41577/7?u=mlauber71
# https://docs.knime.com/latest/r_installation_guide/index.html
#
# To activate this environment, use
#
#     $ conda activate py39_knime_r
#
# To deactivate an active environment, use
#
#     $ conda deactivate
name: py39_knime_r       # Name of the created environment
channels:                # Repositories to search for packages
- conda-forge
dependencies:            # List of packages that should be installed
- python=3.9             # Python
- py4j                   # used for KNIME <-> Python communication
- nomkl                  # Prevents the use of Intel's MKL
- pandas                 # Table data structures
- jedi<=0.17.2           # Python script autocompletion
- python-dateutil        # Date and Time utilities
- numpy                  # N-dimensional arrays
- cairo                  # SVG support
- pillow                 # Image inputs/outputs
- matplotlib             # Plotting
- pyarrow=6.0            # Arrow serialization
- IPython                # Notebook support
- nbformat               # Notebook support
- scipy                  # Scientific computing
- python-flatbuffers<2.0 # because tensorflow expects a version before 2
- h5py<3.0 # must be < 3.0 because they changed whether str or byte is returned
- protobuf>3.12          # Lower protobuf versions do not work with TensorFlow 2
- libiconv               # MDF Reader node
- asammdf=5.19.14        # MDF Reader node
# --------------- basic R packages -------------------------------------
- r-base>=4.1.3
- r-rserve>=1.8_7        # RServe to communicate between R and KNIME
- r-essentials
- r-cairo
- r-ggplot2
- r-sessioninfo
- r-foreign
- r-readr
- r-readxl
- pip
- pip:
  - JPype1 # Databases
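Once the environment is created (`conda env create -f py39_knime_r.yml`) and the R node is pointed at it, a quick sketch like this, run with the environment's Python, can confirm the Python side is wired up. The package list here is just an assumption taken from the YAML above:

```python
import importlib.util

# Hedged sanity check: run inside the activated py39_knime_r environment
# to confirm the Python-side packages from the YAML are importable.
for pkg in ["pandas", "numpy", "py4j", "pyarrow"]:
    status = "OK" if importlib.util.find_spec(pkg) else "MISSING"
    print(f"{pkg}: {status}")
```

If any package reports MISSING, the node is likely using a different environment than the one created from the YAML.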

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.