Re-use Custom Port Objects from existing Python-based extensions

Hey there,

I’m currently experimenting with developing nodes with LLM functionality.

Is there a way to “tap into” what is already implemented in the existing Python-based Gen AI extension?

E.g. I wanted to make my node “compatible” with the OpenAI Chat Model Connector Ports (I am working on a Prompter node) and I observed the following:

When including the relevant port definitions directly in my extension I get this error:

ValueError: There is already a port type with the provided object class '<class 'utils.ports.ChatModelPortObject'>' registered.
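For context, this is roughly what I mean by “including the port definitions directly” — a stripped-down sketch based on the knime.extension port API, with class names mirroring the AI extension’s utils/ports.py (not its actual implementation):

```python
import knime.extension as knext

# Sketch only: bodies are simplified, names mirror the AI extension's module.
class ChatModelPortObjectSpec(knext.PortObjectSpec):
    def __init__(self, data: dict):
        self._data = data

    def serialize(self) -> dict:
        return self._data

    @classmethod
    def deserialize(cls, data: dict):
        return cls(data)


class ChatModelPortObject(knext.PortObject):
    def __init__(self, spec: ChatModelPortObjectSpec):
        super().__init__(spec)

    def serialize(self) -> bytes:
        return b""

    @classmethod
    def deserialize(cls, spec: ChatModelPortObjectSpec, storage: bytes):
        return cls(spec)


# Registering a port type for an object class that is already registered
# elsewhere (here: the AI extension's utils.ports.ChatModelPortObject)
# raises the ValueError shown above.
chat_model_port_type = knext.port_type(
    name="Chat Model",
    object_class=ChatModelPortObject,
    spec_class=ChatModelPortObjectSpec,
)
```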

Is there a way to “point” my extension to the existing extension’s Custom Port Objects? How do I import Custom Port Objects correctly?

I’ve followed this part of the documentation and have added “org.knime.python.features.llm” as a dependency in my knime.yml:

Feature dependencies: if your extension depends on another extension, you can specify it as a bullet point of feature_dependencies. Optionally, you can add a specific minimum version to it.

Example: You use data types like SmilesValue of the KNIME Base Chemistry Types & Nodes extension in your extension. You have that extension already installed and want to make sure that everybody who uses your extension will also have this extension installed. Then you can go to Help > About KNIME Analytics Platform > Installation Details and check the id of KNIME Base Chemistry Types & Nodes, which is org.knime.features.chem.types.feature.group. Take the id without .feature.group and you have the string of the feature dependency: org.knime.features.chem.types

Hey @MartinDDDD,

It is really cool to see that you are building an extension that interfaces with our AI Extension! But by doing so, you are running into the boundaries of what is currently possible in KNIME from Python :wink:

Using port objects defined in other extensions is something that we do not support yet.

There are a few complications that explain why that is the case.

Assume extension A defines a port object type PA. A uses library LA in its code, and LA is also used for serializing some internals of PA. Note that port objects are always passed between nodes in serialized form.
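To make this concrete, a port object definition roughly follows this shape (simplified sketch; `liba` is a made-up stand-in for LA, not a real package):

```python
import knime.extension as knext
import liba  # stand-in for LA; only available in extension A's environment


class PASpec(knext.PortObjectSpec):
    def serialize(self) -> dict:
        return {"info": "lightweight, JSON-serializable metadata"}

    @classmethod
    def deserialize(cls, data: dict):
        return cls()


class PA(knext.PortObject):
    def __init__(self, spec: PASpec, internal_state):
        super().__init__(spec)
        self._internal_state = internal_state

    def serialize(self) -> bytes:
        # The internals are turned into bytes via LA ...
        return liba.dumps(self._internal_state)

    @classmethod
    def deserialize(cls, spec: PASpec, storage: bytes):
        # ... and can only be restored if LA is importable in the
        # environment of the node that receives the port object.
        return cls(spec, liba.loads(storage))
```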

Importing the Port Object definition

To use the port object PA defined in A in another extension B, we would have to be able to import the module defining PA into B. We haven’t built a means of module discoverability between extensions yet, also because whatever libraries are imported in the module defining PA would need to be available in B as well, which brings us to…
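Conceptually, B would have to be able to write an import like the one below, which has no way to resolve today (`extension_a.ports` is a made-up module path):

```python
# What extension B would conceptually need to write - and why it does not
# work today: A's modules are not discoverable from B's Python process,
# so there is nothing on the path for this import to find.
from extension_a.ports import PA  # -> ModuleNotFoundError
```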

Environment Compatibility
We don’t know yet how to ensure compatibility of the Python environments:

Now extension B wants to use PA. It will only be able to call the deserialization of PA if it also has library LA in its Python environment. There is currently no way to enforce that, but we’d like to be able to at least check that the library is available when B depends on PA.
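A check along these lines is roughly what I have in mind (just a sketch, nothing we currently ship; `liba` again stands in for LA):

```python
import importlib.util


def check_port_object_dependency(module_name: str) -> None:
    """Fail with a readable message instead of crashing during deserialization."""
    if importlib.util.find_spec(module_name) is None:
        raise RuntimeError(
            f"This port object requires the Python package '{module_name}', "
            "which is not installed in this extension's environment."
        )


# Extension B would run something like this before deserializing PA:
check_port_object_dependency("liba")
```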

If you have input around how we could nicely resolve these issues, I’m all ears :wink:

Best,
Carsten

Thanks a lot for the detailed response - got it.

That part of the docs and the error I got had made me hope that there is a way (given that the port object has already been registered). You would pretty much need to allow extending an extension rather than extending KNIME… but then again, what happens if two people extend the same extension?

Solving this is absolutely not within my realm of expertise ;-). Glad I managed to build something with the integration as is!

Takeaway for me: I will take those parts that are already implemented very well and re-use them in my extensions… glad it is all open source :slight_smile:

If you are interested in what I am cooking up: I successfully built my extension, which allows using Structured Outputs from OpenAI, including a Table-to-JSON-Schema generator:
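Not the actual node code, but the rough idea behind the Table-to-JSON-Schema part is just mapping column names and types to a strict JSON Schema that OpenAI’s structured outputs accept (type names and the mapping here are illustrative):

```python
# Illustrative mapping from column types to JSON Schema types.
_TYPE_MAP = {
    "string": "string",
    "int": "integer",
    "long": "integer",
    "double": "number",
    "boolean": "boolean",
}


def table_to_json_schema(columns: dict[str, str], name: str = "row") -> dict:
    """Turn {column_name: column_type} into a strict JSON Schema payload
    for structured outputs (all fields required, no extra properties)."""
    return {
        "name": name,
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                col: {"type": _TYPE_MAP.get(ctype, "string")}
                for col, ctype in columns.items()
            },
            "required": list(columns),
            "additionalProperties": False,
        },
    }


# Example: table_to_json_schema({"title": "string", "score": "double"})
```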
