I am new to KNIME and I am trying to understand what must be given as Hostname and Database name in case we want to access data from Google BigQuery.
In my workflow I plan on using a Google Authentication API Key followed by a Google BigQuery Connector. Please direct me to resources or links that can help me understand how to define these fields.
I tried to follow the steps in the tutorial. To configure the Google BigQuery Connector, I used bigquery.cloud.google.com as the hostname and bigquery-public-data as the database name. It throws an "access denied" error. I am attaching a screenshot of the error I am facing. Please let me know how to resolve these permission issues.
It looks like you're trying to create something within bigquery-public-data, which is read-only. The error message alone isn't enough for me to diagnose where you might be going wrong, and I don't have experience with BigQuery myself. Let me see if @emilio_s can help.
Hi @skondur
There is a mistake in the configuration of the Google BigQuery Connector node.
As the database name, you must provide the Project ID of your Google Cloud Platform project. You can find it in the Google Cloud Platform dashboard.
Thank you for the reply. I plugged in the Project ID and executed the node, and it was successful. I have a follow-up question: how do we access a dataset from the public BigQuery datasets? I plan to import a table from patents-public-data --> ebb_chembl into my KNIME workflow. What nodes would you suggest using once the Google Authentication API Key node and Google BigQuery Connector node have executed?
Hi @skondur,
As suggested in the blog post, the next step is using a DB Table Selector node.
To query the ebb_chembl dataset you should build a custom query like this (please mind the backticks ``):
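As a rough sketch of the shape such a query takes (the table name `table_name` below is a placeholder, not a real table; check the dataset in the BigQuery console for the actual table names):

```sql
-- Hypothetical example: fetch a sample of rows from a table in the
-- ebb_chembl dataset. BigQuery requires the fully qualified
-- `project.dataset.table` name to be wrapped in backticks.
SELECT *
FROM `patents-public-data.ebb_chembl.table_name`
LIMIT 100
```

You would paste a query of this form into the Custom Query field of the DB Table Selector node, then follow it with a DB Reader node to pull the result into your KNIME workflow.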