How to use a model in an Android app

I am exploring ML. I want to use a model created with KNIME in an Android app for predictions. The model format is PMML.

What is the correct way to proceed?
Can KNIME provide our model in a TensorFlow-compliant format?

Hi @Trivium - welcome to the forum! I’ll try to briefly address both your questions.

1) If you already have a PMML model built in one KNIME workflow, you could build a separate, simple workflow dedicated to deploying that model.

Using the REST API provided by KNIME Server, you can make REST requests from your Android app to get predictions in real time. For example, your app might send some JSON data to your KNIME Server; a deployment workflow would then return a JSON model prediction.
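To make that concrete, here is a minimal sketch of what the client side of such a call could look like in Kotlin. Everything specific in it is an assumption: the endpoint URL, the credentials, and the JSON field names are placeholders you would replace with the values from your own KNIME Server deployment and the configuration of the workflow's JSON input node.

```kotlin
import java.io.OutputStreamWriter
import java.net.HttpURLConnection
import java.net.URL
import java.util.Base64

// Build the JSON body the deployment workflow expects.
// The feature names are placeholders -- they must match
// whatever your workflow's JSON input is configured to read.
fun buildPayload(features: Map<String, Double>): String =
    features.entries.joinToString(prefix = "{", postfix = "}") {
        "\"${it.key}\": ${it.value}"
    }

// POST the payload to a KNIME Server REST endpoint and return the
// raw JSON response. The endpoint URL and basic-auth credentials
// are hypothetical; use the ones from your own server setup.
fun predict(endpoint: String, user: String, password: String, payload: String): String {
    val conn = URL(endpoint).openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "application/json")
    val auth = Base64.getEncoder().encodeToString("$user:$password".toByteArray())
    conn.setRequestProperty("Authorization", "Basic $auth")
    OutputStreamWriter(conn.outputStream).use { it.write(payload) }
    return conn.inputStream.bufferedReader().use { it.readText() }
}
```

On Android you would run this off the main thread (for example inside a coroutine), and in practice many apps use a client library such as OkHttp or Retrofit instead of `HttpURLConnection`; the request/response shape stays the same either way.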

If you’re interested in a KNIME Server trial, let us know and we can arrange that for you.

2) Depending on how your model is originally developed, we have nodes that can translate Keras and ONNX models to TensorFlow. You can then use a few methods to predict with the saved TensorFlow model. Check the hub for more on our TensorFlow nodes and associated workflows.

If you have more specific questions, please don’t hesitate to ask! 🙂


Hello @trivium,

As Scott mentioned, the REST API is part of the KNIME Server; feel free to reach out to me if you would like a trial.
