Collection domain values are not saved (or loaded)?

When I create a domain (for example with a Domain Calculator node), save the workflow, close and reopen it, the domain information seems to be gone.
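To be concrete, this is roughly how I inspect the domain after loading (a simplified sketch; "myListColumn" is just a placeholder for my collection column, not the real name):

```java
import org.knime.core.data.DataCell;
import org.knime.core.data.DataColumnDomain;
import org.knime.core.node.BufferedDataTable;

// Helper used inside a node's execute() to see whether the possible values
// survived the save/reload ("myListColumn" is a placeholder column name).
static void dumpDomain(final BufferedDataTable table) {
    DataColumnDomain domain =
        table.getDataTableSpec().getColumnSpec("myListColumn").getDomain();
    if (domain.hasValues()) {
        for (DataCell value : domain.getValues()) {
            System.out.println(value); // listed right after the Domain Calculator
        }
    } else {
        System.out.println("no possible values"); // what I see after closing and reopening
    }
}
```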

Additionally, I got this exception when I tried to open the output table with the missing domain information:

Exception in thread "OutPortView-Updater-1" java.lang.NullPointerException
at org.knime.core.node.workflow.DataTableSpecView.addPossValuesRowsToDataContainer(DataTableSpecView.java:320)
at org.knime.core.node.workflow.DataTableSpecView.createTableSpecTable(DataTableSpecView.java:168)
at org.knime.core.node.workflow.DataTableSpecView.updateDataTableSpec(DataTableSpecView.java:128)
at org.knime.core.node.workflow.DataTableSpecView.&lt;init&gt;(DataTableSpecView.java:112)
at org.knime.core.data.DataTableSpec.getViews(DataTableSpec.java:1000)
at org.knime.core.node.workflow.OutPortView$3.run(OutPortView.java:225)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:619)

I have to add that I do not always get this exception and do not know the exact circumstances needed to reproduce it; it seems to be independent of the type of the node.
Should I do something to prevent this situation?
Thanks, gabor

Additional information: It looks like the problem is with saving:

java.io.NotSerializableException: org.knime.core.data.collection.BlobSupportDataCellList
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1156)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1474)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1392)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1150)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:326)
at org.knime.core.node.config.Config.writeObject(Config.java:1920)
at org.knime.core.node.config.Config.addDataCell(Config.java:699)
at org.knime.core.node.config.Config.addDataCellArray(Config.java:1692)
at org.knime.core.data.DataColumnDomain.save(DataColumnDomain.java:340)
at org.knime.core.data.DataColumnSpec.save(DataColumnSpec.java:358)
at org.knime.core.data.DataTableSpec.save(DataTableSpec.java:797)
at org.knime.core.node.BufferedDataTable.saveSpec(BufferedDataTable.java:390)
at org.knime.core.node.BufferedDataTable.save(BufferedDataTable.java:369)
at org.knime.core.node.NodePersistorVersion200.savePort(NodePersistorVersion200.java:258)
at org.knime.core.node.NodePersistorVersion200.savePorts(NodePersistorVersion200.java:179)
at org.knime.core.node.NodePersistorVersion200.save(NodePersistorVersion200.java:137)
at org.knime.core.node.workflow.SingleNodeContainerPersistorVersion200.save(SingleNodeContainerPersistorVersion200.java:212)
at org.knime.core.node.workflow.WorkflowPersistorVersion200.saveNodeContainer(WorkflowPersistorVersion200.java:516)
at org.knime.core.node.workflow.WorkflowPersistorVersion200.save(WorkflowPersistorVersion200.java:387)
at org.knime.core.node.workflow.WorkflowManager.save(WorkflowManager.java:4171)
at org.knime.workbench.editor2.SaveWorkflowRunnable.run(SaveWorkflowRunnable.java:128)
at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:121)
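For reference, the exception itself is standard Java serialization behaviour: an object that is Serializable but holds a field whose class does not implement Serializable cannot be written by ObjectOutputStream. A minimal stand-alone analogue (the class names here are hypothetical stand-ins, not the actual KNIME classes):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

// Stands in for BlobSupportDataCellList: note there is no "implements Serializable".
class CellList {
    final List<String> cells = new ArrayList<>();
}

// Stands in for the serialized cell that carries the list as a field.
class CollectionCell implements Serializable {
    private static final long serialVersionUID = 1L;
    final CellList list = new CellList(); // non-serializable field -> failure
}

public class NotSerializableDemo {
    public static void main(String[] args) throws IOException {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            // Throws java.io.NotSerializableException: CellList,
            // analogous to the stack trace above.
            out.writeObject(new CollectionCell());
        }
    }
}
```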

Hi Gabor,

This is a bug. Actually, we didn’t think of use cases where one wants to attach entire collections to the domain (that’s what you are trying to do, right?). This could increase the memory requirements of the table meta information (DataTableSpec) tremendously. Does this prevent you from working with that workflow? I guess the framework should catch those exceptions and still load the data?

We will open a bug report on it. Thanks for reporting it.
Bernd

Hi Bernd,
Yes, it is about saving the collections in the domain; sorry if that was not clear.
I thought it was a supported option (since the Domain Calculator computes those domains too), so I started to use that information. (And it looks like it will be fixed, so still good. :))

Yes, the load is done, but since I use the domain information in other nodes to generate new columns, it causes the subsequent nodes to fail to restore their previous state (for example, the Threshold imaging nodes fail noisily with an ArrayIndexOutOfBoundsException when they do not find the previously selected column).
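For now I work around these follow-up failures with an explicit check in the consuming node's configure method instead of relying on the domain being there; a rough sketch (m_columnName is just the settings field in my node model):

```java
import org.knime.core.data.DataColumnSpec;
import org.knime.core.data.DataTableSpec;
import org.knime.core.node.InvalidSettingsException;

// Simplified guard from the consuming node's configure():
// fail with a readable message instead of an ArrayIndexOutOfBoundsException
// when the previously selected column or its domain values are gone.
static void checkColumn(final DataTableSpec spec, final String m_columnName)
        throws InvalidSettingsException {
    DataColumnSpec colSpec = spec.getColumnSpec(m_columnName);
    if (colSpec == null) {
        throw new InvalidSettingsException(
            "Previously selected column '" + m_columnName + "' is missing");
    }
    if (!colSpec.getDomain().hasValues()) {
        throw new InvalidSettingsException(
            "No possible values in the domain of column '" + m_columnName + "'");
    }
}
```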
Thanks for your help, gabor

PS: I use collections in the LociReaderNodeModel to generate that information, and in ConvertToImageNodeModel to consume it. Maybe not the best use case/design.