Apologies if this is not the correct forum. I am trying to use Google’s Universal Sentence Encoder just to get the embeddings (https://tfhub.dev/google/universal-sentence-encoder/2). I have tried downloading the module (as a tar.gz) and then converting it to a .zip for use in the TensorFlow Network Reader, but it would not work, as no “Tags” were found.
Naively, I just want to go from sentence -> 512-dimensional vector so that I can compare sentences, without having to go down the rabbit hole of Keras/TensorFlow.
Does anyone have any advice?
Hi, and welcome to our forum!
Regarding your question: our TensorFlow integration does not directly support modules from TensorFlow Hub in its current state.
However, in many cases it should be possible to convert a TensorFlow Hub module into a compatible SavedModel in a DL Python Network Creator node.
Unfortunately, the model you are referring to is not one of them, as it requires a special preprocessing library (SentencePiece) that is not part of TensorFlow and therefore has to be applied separately from the TensorFlow model.
Therefore, your only option right now is our Python integration, where you can run a slightly adapted version of the example script from the TensorFlow Hub site to apply the module to sentences provided by a KNIME table.
Thank you very much for your reply! I was hoping to avoid the onnx_tf issues I am currently having with that node (module ‘tensorflow’ has no attribute ‘ceil’) - but those are Python issues (I think).
Thanks again (And thanks for the welcome!)