NLP Relation extraction

Hello, dear community.

I have a question about NLP, specifically about the task of extracting relationships from unstructured text.

I can find various topics about Named Entity Recognition with spaCy models, but what about Relation Extraction for the recognized entities?

Thanks a lot!

Hi @vlad28,

As far as I know, spaCy does not ship an out-of-the-box Relation Extraction component. It looks like you have to manually define rules over the dependency parse (I’m sure there are some blog posts out there) or implement your own relation extraction component (check the linked blog post by spaCy). Hence, a corresponding spaCy node doesn’t exist in KNIME either. Basically, this portion of the analysis requires you to use Python.
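
To give you an idea of what the rule-based route can look like in Python, here’s a minimal, hedged sketch that pairs spaCy’s NER with its dependency parse. The (subject, verb, object) pattern, the example sentences and the use of the small English pipeline are my own illustrative assumptions, not a ready-made relation extraction component:

```python
import spacy

# Minimal rule-based sketch: pair spaCy's NER with the dependency parse.
# Assumes the small English pipeline is installed:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")
doc = nlp("Google acquired DeepMind in 2014. Satya Nadella leads Microsoft.")

# Map each entity to the index of its syntactic head token, so entities
# can be matched against the dependency tree.
ent_by_root = {ent.root.i: ent for ent in doc.ents}

for token in doc:
    if token.pos_ != "VERB":
        continue
    # Entities acting as subject / object of this verb
    subjects = [ent_by_root[t.i] for t in token.children
                if t.dep_ in ("nsubj", "nsubjpass") and t.i in ent_by_root]
    objects = [ent_by_root[t.i] for t in token.children
               if t.dep_ in ("dobj", "obj") and t.i in ent_by_root]
    for subj in subjects:
        for obj in objects:
            print((subj.text, token.lemma_, obj.text))

# If the model tags the entities as expected, this prints something like:
#   ('Google', 'acquire', 'DeepMind')
#   ('Satya Nadella', 'lead', 'Microsoft')
```

How well this works depends on the language-specific pipeline and on how many verb patterns you encode, which is exactly why a dedicated component or an LLM can be less painful for non-English text.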

A possible alternative for English (for which we do have a node) is the StanfordNLP Relation Extractor – KNIME Community Hub. This node supports a limited number of entity and relation types (check the node docs for the full list).

If you’re interested in other languages and don’t want to get caught up in complex coding, I’d suggest using LLMs (even local ones, which have improved significantly lately). Check my previous answer in a similar forum thread: NLP questions - #2 by roberto_cadili

Hope it helps!
Roberto

Thanks for the answer, Roberto :heart_eyes:

I think I might try the Python solution; maybe I can figure it out, since English is out of the question in my case. :slight_smile:

Maybe the folks at Redfield should take this into consideration and create a node for it.

Regarding the LLMs out there, have you come across a lightweight one that works well for relation extraction?

Thanks and have a great day ahead!

Hi @vlad28, you’re welcome! :slight_smile:

A local LLM that is relatively light and performs well is Llama 3.2 3B Instruct. It’s only 1.79 GB, so it should be possible to run it on an average home PC, and you can access it for free via GPT4All (and use the corresponding node to import it into KNIME: Local GPT4All Chat Model Connector – KNIME Community Hub).

Other alternatives for local LLMs can be found via Ollama. In the Ollama model repo, you usually find the same model in different sizes, where available. On top of the aforementioned model, you can give Gemma a try (1.7 GB): gemma:2b.
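
If you end up scripting this yourself, here’s a hedged sketch of how you could prompt a local Ollama model for relation extraction via its REST API (it listens on http://localhost:11434 by default). The prompt wording and the expected output format are just illustrative choices, and it assumes you’ve already pulled gemma:2b:

```python
import requests

# Ask a locally running Ollama model (pulled with `ollama pull gemma:2b`)
# to extract relation triples from a piece of text.
prompt = (
    "Extract (subject, relation, object) triples from the text below and "
    "answer with a JSON list of triples only.\n\n"
    "Text: Google acquired DeepMind in 2014."
)

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "gemma:2b", "prompt": prompt, "stream": False},
    timeout=300,
)
resp.raise_for_status()
# The model's answer is returned in the "response" field,
# e.g. [["Google", "acquired", "DeepMind"]]
print(resp.json()["response"])
```

Small models don’t always stick to the requested JSON format, so you may need to tighten the prompt or post-process the answer a bit.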

In KNIME, you can connect to Ollama models with the KNIME AI Extensions. Check the tutorial for the details: How to leverage open source LLMs locally via Ollama | KNIME.

Hope it helps! :slight_smile:
Roberto