BERT Model Selector - Import SavedModel Error

Hi,

I downloaded “BERT for patents” from here to a local directory. Trying to load this model from the local directory with the BERT Model Selector node, I get the following error message:

ValueError: Importing a SavedModel with tf.saved_model.load requires a tags= argument if there is more than one MetaGraph. Got tags=None, but there are 2 MetaGraphs in the SavedModel with tag sets: [['serve'], ['serve', 'tpu']]. Pass a tags= argument to load this SavedModel.

Is there any solution for this problem?

Hi @sommer_sun,

I’ve never encountered this issue before but from a quick Google search, it looks like it might be connected to this.
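If you only need the model in Python rather than in the node itself, the error message already points to a workaround: pass an explicit tags= argument to tf.saved_model.load. Here's a minimal, self-contained sketch (it builds a tiny throwaway model to stay runnable, since I don't have the patents checkpoint at hand — the same load call with tags=["serve"] should apply to your local directory):

```python
import tempfile

import tensorflow as tf

# Build and save a tiny SavedModel just to demonstrate loading with tags=.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def double(self, x):
        return x * 2.0

export_dir = tempfile.mkdtemp()
tf.saved_model.save(Doubler(), export_dir)

# The BERT-for-patents SavedModel has two MetaGraphs (tag sets ['serve']
# and ['serve', 'tpu']); tags=["serve"] selects the CPU/GPU serving graph
# instead of the TPU one.
restored = tf.saved_model.load(export_dir, tags=["serve"])
print(restored.double(tf.constant([1.0, 2.0])).numpy())  # [2. 4.]
```

Of course this only helps if you can work with the raw SavedModel; it doesn't make the node itself accept the model.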

At the same time, note that the node's documentation reports the following:

SETTINGS tab

  • Remote URL
    An arbitrary link to a model. Use with caution since there is no guarantee that the model will be compatible with the node. Only active when Advanced tab checkbox is active. This feature is experimental, use with discretion.
  • Local folder
    A path to the local model that was downloaded not by this node. Only active when Advanced tab checkbox is active. This feature is experimental, use with discretion.

ADVANCED tab

  • Enable Remote URL and Local path selection modes
    Enable Remote URL and Local path selection modes - an experimental feature that makes it possible to pick any model from a remote or local source. There is no guarantee that the models will be compatible with the BERT extension.

Since this feature is still experimental, I guess it is what causes the problem (at least when reading models locally).

Depending on what you need the model for, one option is to access it from the Hugging Face model repository using Python in the Python Script node. I think I’ve found the same model you’re interested in here: anferico/bert-for-patents · Hugging Face. On Hugging Face, the task this model is enabled for is “fill-mask”. Find below the essential code to use the mask filler:

from transformers import pipeline

# Load the fill-mask pipeline with the patents BERT checkpoint
pipe = pipeline("fill-mask", model="anferico/bert-for-patents")

# Returns a list of candidate fillings, each a dict with "token_str" and "score"
predictions = pipe("Paris is the capital city of [MASK].")

Find more info on fill-mask pipelines here: Pipelines.

However, if you’re interested in using it for classification, this option from Hugging Face won’t work well, as the task the model is associated with is “fill-mask” rather than “zero-shot-classification” or “text-classification”. You can force the task to do classification, but the results will most likely be poor.
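For completeness, here is a sketch of what zero-shot classification looks like with a model that is actually enabled for that task. Note that facebook/bart-large-mnli is just a generic example model, not one tuned for patents — the candidate labels below are made up for illustration:

```python
from transformers import pipeline

# Example only: a general-purpose NLI model, not a patents-specific one
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = classifier(
    "The invention relates to a rotor blade for a wind turbine.",
    candidate_labels=["mechanical engineering", "chemistry", "software"],
)
# result["labels"] is sorted by descending score; result["scores"] sums to 1
```

If you need patent-specific classification, you would still have to fine-tune a model on labelled patent data rather than rely on a generic zero-shot checkpoint.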

Hope this helps!

Roberto

