Embedding layer in Keras

Hi, is it possible to initialize the Keras Embedding Layer with a pretrained embedding? In plain Keras this is possible via the weights argument (it is not documented here: https://keras.io/layers/embeddings/#embedding, but it is shown here: https://blog.keras.io/using-pre-trained-word-embeddings-in-a-keras-model.html).
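
I mean something along these lines (a sketch based on the linked blog post; the matrix here is just a random placeholder for real pretrained vectors):

```python
import numpy as np
from keras.layers import Embedding

# Placeholder for a pretrained embedding matrix, e.g. built from GloVe vectors:
# shape = (vocabulary size, embedding dimension).
embedding_matrix = np.random.rand(10000, 100)

embedding_layer = Embedding(input_dim=embedding_matrix.shape[0],
                            output_dim=embedding_matrix.shape[1],
                            weights=[embedding_matrix],  # initialize with pretrained vectors
                            trainable=False)             # optionally freeze them
```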

Thanks,
Francesco

Hi Francesco,

Unfortunately, pretrained weights aren’t supported by the layer nodes at the moment. You could manually create a network with a single embedding layer that is initialized with custom weights by using the DL Python Network Creator. You can then append the rest of the layers using regular Keras layer nodes.
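
A rough sketch of what the script inside the DL Python Network Creator could look like (assuming the node picks up the created model from a variable named output_network; the embedding matrix and sequence length are just placeholders):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Embedding

# Placeholder pretrained embedding matrix of shape (vocab_size, embedding_dim),
# e.g. loaded from a GloVe file on disk.
embedding_matrix = np.random.rand(10000, 100)

model = Sequential()
model.add(Embedding(input_dim=embedding_matrix.shape[0],
                    output_dim=embedding_matrix.shape[1],
                    weights=[embedding_matrix],  # initialize with the pretrained vectors
                    input_length=500))           # example sequence length

# Hand the created network to the downstream Keras layer nodes
# (assumption about the expected variable name).
output_network = model
```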

Initializing layers with custom weights (and maybe even getting and setting the weights of existing layers) would certainly be an interesting feature. I’ll open a feature request for that 🙂.

Marcel

Thank you, Marcel. And in the feature request, please also add a trainable/non-trainable flag 🙂

Good point, I did that 🙂.