Content Based Movie Recommender System using BERT Embeddings – KNIME Community Hub

This workflow demonstrates the use of BERT embeddings for content-based movie recommendations. The text is prepared so that it contains the movie title, plot, and other information. The content-based recommender system recommends movies that are semantically similar to the movies the user has watched.
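For context, the recommendation step behind such a workflow can be sketched as plain cosine similarity over precomputed embeddings. This is an illustrative stand-in, not the workflow's actual nodes: the toy two-dimensional vectors below take the place of the 768-dimensional BERT outputs, and the function names are my own.

```python
import math

# Minimal sketch of content-based recommendation over precomputed
# embeddings; the toy vectors stand in for the 768-dimensional BERT
# outputs the workflow produces.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recommend(watched_embedding, catalog, k=2):
    # catalog maps movie title -> embedding; rank titles by cosine
    # similarity to the embedding of the movie the user watched.
    scores = {t: cosine(watched_embedding, e) for t, e in catalog.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

catalog = {
    "Movie A": [1.0, 0.0],
    "Movie B": [0.9, 0.1],
    "Movie C": [0.0, 1.0],
}
print(recommend([1.0, 0.0], catalog))  # -> ['Movie A', 'Movie B']
```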

This is a companion discussion topic for the original entry at

Hello KNIMErs,

I can run successfully this workflow when bert_en_cased_L-12_H-768_A-12 is selected in the BERT Model Selector node.

Default model bert_en_wwm_cased_L-24_H-1024_A-16 triggers the following exception (both when using TensorFlow Hub and Remote URL options):

```
ERROR BERT Model Selector 4:418 Execute failed: Executing the Python script failed: Traceback (most recent call last):
  File "<string>", line 2, in <module>
  File "path_to_KNIME/plugins/se.redfield.bert_1.0.2.202212230230/py/", line 42, in load_bert_layer
    return model_type.load_bert_layer(bert_model_handle, cache_dir)
  File "path_to_KNIME/plugins/se.redfield.bert_1.0.2.202212230230/py/", line 30, in load_bert_layer
    return hub.KerasLayer(bert_model_handle, trainable=True)
  File "path_to_KNIME/plugins/", line 153, in __init__
    self._func = load_module(handle, tags, self._load_options)
  File "path_to_KNIME/plugins/", line 449, in load_module
    return module_v2.load(handle, tags=tags, options=set_load_options)
  File "path_to_KNIME/plugins/", line 92, in load
    module_path = resolve(handle)
  File "path_to_KNIME/plugins/", line 47, in resolve
    return registry.resolver(handle)
  File "path_to_KNIME/plugins/", line 51, in __call__
    return impl(*args, **kwargs)
  File "path_to_KNIME/plugins/", line 67, in __call__
    return resolver.atomic_download(handle, download, module_dir,
  File "path_to_KNIME/plugins/", line 418, in atomic_download
    download_fn(handle, tmp_dir)
  File "path_to_KNIME/plugins/", line 63, in download
    response = self._call_urlopen(request)
  File "path_to_KNIME/plugins/", line 522, in _call_urlopen
    return urllib.request.urlopen(request)
  File "path_to_KNIME/plugins/", line 214, in urlopen
    return, data, timeout)
  File "path_to_KNIME/plugins/", line 517, in open
    response = self._open(req, data)
  File "path_to_KNIME/plugins/", line 534, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "path_to_KNIME/plugins/", line 494, in _call_chain
    result = func(*args)
  File "path_to_KNIME/plugins/", line 1389, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "path_to_KNIME/plugins/", line 1349, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)>
```
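For anyone debugging a failure like this, a quick first step is to check which CA bundle the Python environment behind the BERT nodes actually uses. This is a generic stdlib sketch (the printed paths are environment-specific):

```python
import ssl

# Print the CA bundle locations this interpreter's default SSL context
# relies on; CERTIFICATE_VERIFY_FAILED usually means the environment
# running the BERT nodes points at a missing or outdated bundle.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.openssl_cafile)
print("capath:", paths.openssl_capath)

ctx = ssl.create_default_context()
print("verification enabled:", ctx.verify_mode == ssl.CERT_REQUIRED)
```

Running this inside the same Conda environment that KNIME uses shows whether the certificate fix actually reached the interpreter doing the download.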

Any suggestion about how to tackle this issue?

Many thanks in advance,

Can anyone please reply to this?

Let’s see if @Mpattadkal or @Artem are able to assist.

Hello @Stef

It seems the problem is that you cannot download this particular model; the issue is connected to SSL.
I can suggest two options:

I hope this helps.


Thanks a lot Artem, but sorry, no, it doesn't work yet!

Option 1: I manually downloaded the model from Hugging Face and have it as "bert_en_wwm_cased_L-24_H-1024_A-16.h5" in the local directory "/home/stefano/Downloads". When I try to load it with the selector by browsing the file system and choosing this very directory (see image below),

[Screenshot from 2023-06-08 19-22-29]

I get the following error: "OSError: SavedModel file does not exist at: /home/stefano/Downloads/{saved_model.pbtxt|saved_model.pb}". I tried renaming the model to "bert_en_wwm_cased_L-24_H-1024_A-16.pbtxt" or "bert_en_wwm_cased_L-24_H-1024_A-16.pb", but to no avail.
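For what it's worth, that OSError is consistent with how TensorFlow identifies a SavedModel: it expects a directory containing saved_model.pb (or saved_model.pbtxt) plus the variables it references, so a single renamed .h5 file (a Keras weights format) will never match. A minimal sketch of that check (my own helper, not Redfield's code):

```python
import os

def looks_like_saved_model(path):
    # TensorFlow treats a *directory* as a SavedModel only if it holds
    # saved_model.pb or saved_model.pbtxt; a lone .h5 file will trigger
    # "SavedModel file does not exist at: ..." no matter how it is named.
    return any(
        os.path.isfile(os.path.join(path, name))
        for name in ("saved_model.pb", "saved_model.pbtxt")
    )
```

So the directory you point the selector at must itself contain a saved_model.pb, not just an .h5 file.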

Option 2: After issuing "conda activate py3_knime" in my shell to activate the Python 3 Conda environment that I use in my KNIME 4.7.3 (and that I assume all your BERT nodes also rely on), I did the following:

  1. made sure the indicated packages are there (e.g. "pip install --upgrade certifi"); but where exactly do you guys import "certifi" in your code? I was unsuccessfully skimming through,,;
  2. updated certificates in my “/etc/ssl/certs” directory with the shell command “sudo update-ca-certificates --fresh”;
  3. exported an environment variable pointing to the certificates’ directory (“export SSL_CERT_DIR=/etc/ssl/certs”) as suggested.
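One possible gap in step 3: an environment variable exported in an interactive shell is not inherited by KNIME if KNIME was launched from a desktop menu rather than from that same shell. This generic sketch (not KNIME-specific) shows how to verify whether a child interpreter actually sees the export:

```python
import os
import subprocess
import sys

# Launch a child Python with SSL_CERT_DIR set explicitly and check
# that the child actually sees it; if KNIME is started outside the
# shell where the export happened, its Python will not inherit it.
env = dict(os.environ, SSL_CERT_DIR="/etc/ssl/certs")
out = subprocess.run(
    [sys.executable, "-c",
     "import os; print(os.environ.get('SSL_CERT_DIR'))"],
    env=env, capture_output=True, text=True,
)
print(out.stdout.strip())  # -> /etc/ssl/certs
```

If the variable does not survive into the process running the BERT nodes, the certificate fix never takes effect there.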

Unfortunately, nothing has really changed, I’m afraid. Same bugging [SSL: CERTIFICATE_VERIFY_FAILED] exception, dammit!

Any way you can be more specific and straight-to-the-point? Sorry for nagging you about this, but I wish I could finally crack it :slight_smile:

Hello @Stef

Regarding your first option: it seems you are pointing to the folder that merely contains the model file, not to the model folder itself. Could you please try pointing to that instead?

As for your second approach, I have no more ideas. Could a proxy or other network restrictions be the reason?

Hi Artem, great reading you!

Look, I’m not entirely clear about what you mean, sorry for that :frowning:

I have a local copy of the Hugging Face model on my machine in the "/home/stefano/Downloads" folder. That is indeed "a folder with the model", to use your very words.

You write I should use a “model folder” instead and, I assume, put the model file there and select that folder/directory in your BERT Model Selector node. If it’s so, that’s brilliant!

Now, is what you call “model folder” a specific folder I must look out for (e.g. in the specific workflow folder in the knime-workspace directory) to then transfer my model file there? Where is it exactly?

Thanks a ton for your time and patience, perhaps we’re close this time :wink:

No final answer yet? Thanks in advance for one last useful piece of advice here… :pray: