Error: BERT Embedder with fine-tuned models

Hi there,

I am trying to use the BERT Embedder node with a fine-tuned model, but I run into the following error. The Embedder node works fine with models from the BERT Model Selector node; only when I fine-tune a model with the BERT Classification Learner do I get an error message.

I run KNIME 4.6.3 on a Windows 10 machine and use the bundled Python environment. The max sequence length in the Embedder node is set to the same value (100) as in the Classification Learner.

Could someone please help me with this error message?

Here is the error message:

ERROR BERT Embedder 4:225:406 Execute failed: Executing the Python script failed: Traceback (most recent call last):
File "", line 2, in
File "C:\Program Files\KNIME\plugins\se.redfield.bert_0.2.0.202208310115\py\BertEmbedder.py", line 95, in run_from_classifier
embedder = cls.from_saved_model(model_type, saved_model, sentence_column, second_sentence_column, max_seq_length)
File "C:\Program Files\KNIME\plugins\se.redfield.bert_0.2.0.202208310115\py\BertEmbedder.py", line 117, in from_saved_model
return BertEmbedder(bert_layer, tokenizer)
File "C:\Program Files\KNIME\plugins\se.redfield.bert_0.2.0.202208310115\py\BertEmbedder.py", line 23, in __init__
self.pooled_output, self.sequence_output = bert_layer([input_ids, input_masks, input_segments])
File "C:\Program Files\KNIME\plugins\se.redfield.bert.channel.bin.win32.x86_64_1.0.0.202208310115\env\lib\site-packages\keras\utils\traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "C:\Program Files\KNIME\plugins\se.redfield.bert.channel.bin.win32.x86_64_1.0.0.202208310115\env\lib\site-packages\tensorflow\python\saved_model\function_deserialization.py", line 286, in restored_function_body
raise ValueError(
ValueError: Exception encountered when calling layer "bert" (type Custom>TFBertMainLayer).

Could not find matching concrete function to call loaded from the SavedModel. Got:
Positional arguments (14 total):
* [<tf.Tensor 'input_ids:0' shape=(None, 128) dtype=int32>,
<tf.Tensor 'input_ids_1:0' shape=(None, 128) dtype=int32>,
<tf.Tensor 'input_ids_2:0' shape=(None, 128) dtype=int32>]
* None
* None
* None
* None
* None
* None
* None
* None
* None
* None
* None
* None
* False
Keyword arguments: {}

Expected these arguments to match one of the following 2 option(s):

Option 1:
Positional arguments (14 total):
* TensorSpec(shape=(None, 100), dtype=tf.int32, name='input_ids')
* TensorSpec(shape=(None, 100), dtype=tf.int32, name='attention_mask')
* TensorSpec(shape=(None, 100), dtype=tf.int32, name='token_type_ids')
* None
* None
* None
* None
* None
* None
* None
* None
* None
* None
* False
Keyword arguments: {}

Option 2:
Positional arguments (14 total):
* TensorSpec(shape=(None, 100), dtype=tf.int32, name='input_ids')
* TensorSpec(shape=(None, 100), dtype=tf.int32, name='attention_mask')
* TensorSpec(shape=(None, 100), dtype=tf.int32, name='token_type_ids')
* None
* None
* None
* None
* None
* None
* None
* None
* None
* None
* True
Keyword arguments: {}

Call arguments received by layer "bert" (type Custom>TFBertMainLayer):
• args=(['tf.Tensor(shape=(None, 128), dtype=int32)', 'tf.Tensor(shape=(None, 128), dtype=int32)', 'tf.Tensor(shape=(None, 128), dtype=int32)'],)
• kwargs={'training': 'False'}

Cheers,
Daniel
