Python error: NegativeArraySizeException or "Serializing to a byte array threw an IOException"

Hello,

I have a workflow that uses the ImageJ2 'Image Calculator' node to subtract the background from a list of images. The resulting image is of type double, since there can be negative intensities (properly handled with 32- or 64-bit float images).
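
To illustrate why a float type is needed after the subtraction, here is a minimal NumPy sketch (the pixel values are made up): subtracting in an unsigned integer type wraps around instead of going negative.

import numpy as np

raw = np.array([10, 20], dtype=np.uint16)  # made-up raw pixel values
bg  = np.array([15, 15], dtype=np.uint16)  # made-up background values

print(raw - bg)                     # [65531     5] -> unsigned wrap-around instead of -5
print(raw.astype(np.float32) - bg)  # [-5.  5.]     -> float keeps the negative value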

After this node I have a Python node that further processes those images (curve fitting).

The workflow worked with a dozen images, but when I tried a larger number of images (48) the Python node returns the error: 'Execute failed: Serializing to a byte array threw an IOException (should never happen).'

I thought that the double format was maybe somehow overflowing the memory, so I tried converting back to float. This time the Python node returns: 'Execute failed: ("NegativeArraySizeException"): null'
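
For scale, this is the rough arithmetic behind my memory suspicion (2048x2048 is just an example size, not necessarily that of my images):

import numpy as np

img64 = np.zeros((2048, 2048), dtype=np.float64)  # hypothetical image size
print(img64.nbytes)                               # 33554432 bytes (~33.5 MB) per double image
print(img64.astype(np.float32).nbytes)            # 16777216 bytes, half as much
# 48 such double images already amount to ~1.6 GB of raw pixel data.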

Thanks for your help,

Here is my Python code:

from __future__ import division
from KNIPImage import KNIPImage
from scipy.optimize import curve_fit
import numpy as np

# Copy input to output
output_table = input_table.copy()

# Create empty output_column
ColumnModel  = [] # Model Image
ColumnAmp    = [] # Parameters
ColumnOffset = []
ColumnX0 = []
ColumnY0 = []
ColumnSigmaX = []
ColumnSigmaY = []

#%% Model function for the fit
def Gaussian2D(XY, Amp, Offset, x0, y0, Sigma_x, Sigma_y):
    '''
    Expression of a 2D Gaussian
    (X,Y)  : variables (pixel positions)
    Fit parameters:
    Amp    : Amplitude
    Offset : "Intensity Background"
    x0,y0  : position of the peak
    Sigma_x, Sigma_y : proportional to the width of the Gaussian
    '''
    X, Y = XY  # unpack here: tuple unpacking in the signature is Python 2 only
    out = Amp*np.exp(-(X-x0)**2/(2*Sigma_x**2))*np.exp(-(Y-y0)**2/(2*Sigma_y**2)) + Offset # Expression of a 2D-Gaussian
    return out.ravel() # turn the result into a 1D matrix, needed for curve fit

# Loop over every cell in the 'output' column
for index,input_cell in input_table['output'].iteritems():
  
    # get image from cell
    Image = input_cell.array
    
    #%% Open image and get metadata
    shape = Image.shape
    X,Y = np.indices(shape)
    Max = Image.max()
    Min = Image.min()

    #%% Initial Guess and boundaries
    Offset = Image.mean()
    Amp = Max - Offset
    x0,y0 = np.array(shape)//2 # initialising in the middle of the image
    Sigma_x = Sigma_y = 100
    MyGuess = (Amp,Offset,x0,y0,Sigma_x,Sigma_y)
    
    BoundDown = (Min,Min,0,0,0,0) # we might have negative intensity values with a 32-bit float image since the image results from a subtraction
    BoundUp   = (Max,Max,shape[0],shape[1],np.inf,np.inf)
    
    #%% Fit
    Parameters, Covariance = curve_fit(Gaussian2D, (X, Y), Image.ravel(), p0=MyGuess, bounds=(BoundDown, BoundUp))
    
    #%% Generate image of model
    Model = Gaussian2D((X,Y),*Parameters)
    Model = Model.reshape(shape)

    # Write result back into a KNIPImage
    output_cell = KNIPImage(Model)
  
    # Append output_cell to output array
    ColumnModel.append(output_cell)
    ColumnAmp.append(Parameters[0])
    ColumnOffset.append(Parameters[1])
    ColumnX0.append(Parameters[2])
    ColumnY0.append(Parameters[3])
    ColumnSigmaX.append(Parameters[4])
    ColumnSigmaY.append(Parameters[5])

# Append columns to table
output_table['Gaussian Model'] = ColumnModel
output_table['Amp']            = ColumnAmp
output_table['Offset']         = ColumnOffset
output_table['x0']             = ColumnX0
output_table['y0']             = ColumnY0
output_table['SigmaX']         = ColumnSigmaX
output_table['SigmaY']         = ColumnSigmaY
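
One thing I have not tried yet: shrinking the output table by writing the model images back at float32 instead of float64 precision, which would halve their size (untested idea):

# untested: cast the model to float32 before wrapping it in a KNIPImage
output_cell = KNIPImage(Model.astype(np.float32))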

Hey Jindil,

Sorry for the late response. We've been quite busy with our annual summit in Austin. Could you provide us with a bit more information about your installation and which KNIME Python nodes you are actually using, or maybe even the workflow itself, so that we can easily reproduce the problem?

Thanks in advance,


Christian

Hey Jindil,

You may also have a look at our nightly builds (https://www.knime.com/nightly-build-downloads). If you are using the Python nodes from the Labs category (those with the "(Labs)" suffix in their name), you can choose a chunk size in the "Options" tab of the node's configuration dialog. Set it to 10 and try again. The idea is that smaller chunks are serialized separately, which keeps each serialized byte array small; a likely cause of both errors is that the byte array for the full table outgrows what Java can address (array lengths are signed 32-bit integers, so an oversized length computation can overflow into a negative number, hence the NegativeArraySizeException). If the problem persists, please get back to me.


Best,

Clemens

Dear Christian and Clemens,

Thanks for your answers, and sorry for the late reply on my side as well.

I am now using the parallel chunk loop nodes with my Python node. I don't get the error anymore, and it really helps speed up the computation.

However, the Loop End node sometimes tries to concatenate the tables before all the Python nodes in the loop are done, so I have to manually execute the metanode (the one that pops up when the loop starts) to finish the few chunks that were not processed.

It is not a big deal, but an easy fix would be welcome. Also, on a less powerful machine the workflow tends to freeze and raise a "deadlock error".

Best

Laurent
