I managed to get BigQuery and Knime working together for a project last year.
I was never able to get the Simba JDBC driver to work correctly; the StarSchema one sort of worked, but the CData one worked perfectly. Sadly it's not free, but you can get a 30-day trial to prove it works before you buy. The trick was in the method of access, so the same approach may also work with the newest Simba driver.
I had to connect to BigQuery with a Google Cloud Platform Service Account, and I had to create a Client ID tied to that Service Account before the JDBC connector would work.
Once I imported the CData driver into Knime, the settings in the database connector were as follows:
Database driver: cdata.jdbc.googlebigquery.GoogleBigQueryDriver
Database URL: jdbc:googlebigquery:InitiateOAuth=GETANDREFRESH;OAuthClientId=<client ID>;OAuthClientSecret=<OAuth key generated by Google for the client ID>;ProjectId=<your BigQuery project ID>;DatasetId=<BigQuery dataset name>
Tick "Use username and password"
Username: <BigQuery Service Account ID>
Password: <Service Account password>
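For reference, the same settings can be sketched outside Knime with plain JDBC. This is only a sketch: the driver class and URL layout come from the settings above, while the client ID, secret, project, and dataset values are placeholders you'd substitute with your own.

```java
public class BigQueryConnect {

    // Build the CData-style JDBC URL from its parts.
    // All four arguments are placeholders for your own values.
    static String buildUrl(String clientId, String clientSecret,
                           String projectId, String datasetId) {
        return "jdbc:googlebigquery:InitiateOAuth=GETANDREFRESH"
             + ";OAuthClientId=" + clientId
             + ";OAuthClientSecret=" + clientSecret
             + ";ProjectId=" + projectId
             + ";DatasetId=" + datasetId;
    }

    public static void main(String[] args) {
        String url = buildUrl("my-client-id", "my-secret",
                              "my-project", "my_dataset");
        System.out.println(url);
        // With the CData jar on the classpath, the actual connection
        // would then be (service account ID/password as the credentials):
        // java.sql.Connection conn = java.sql.DriverManager.getConnection(
        //         url, "<service account ID>", "<service account password>");
    }
}
```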
Note that the URL structure above only works for the CData JDBC connector. If you use the StarSchema one, the URL is structured differently, but I can't find my notes on that.
I only used BigQuery as a data source. I think I did manage to get Knime to write new tables, but remember that an RDBMS-style UPDATE has no meaning in a BigQuery context: you can't update a row; you can only read it and write a new table with the altered data. I recall that BigQuery worked well and quickly as a data source, but trying to write back to it was painful.
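To illustrate that read-and-rewrite pattern, here is a sketch in plain JDBC. The dataset, table, and column names (`sales.orders`, `price`) are hypothetical, and it assumes a SQL dialect that supports CREATE TABLE ... AS SELECT; the point is only that the "update" happens inside the SELECT and lands in a new table rather than modifying rows in place.

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class BigQueryRewrite {

    // Emulate an RDBMS UPDATE: apply the change inside the SELECT and
    // materialize the result as a brand-new table. Names are hypothetical.
    static String emulatedUpdate(String dataset, String src, String dst) {
        return "CREATE TABLE " + dataset + "." + dst
             + " AS SELECT id, price * 1.1 AS price"
             + " FROM " + dataset + "." + src;
    }

    // Run the generated statement over an already-open JDBC connection.
    static void run(Connection conn, String sql) throws SQLException {
        try (Statement st = conn.createStatement()) {
            st.execute(sql);
        }
    }
}
```

The old table is left untouched; if you need the original name, you drop it and rename (or re-create) afterwards, which is part of why writing back felt painful.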
I'd expect that the drivers have improved in the last 12 months, and Knime has had a number of updates, so it may be easier now, but I'm now using Azure rather than Google, and so far it's a much happier experience. BigQuery isn't really suited to the usage scenario I'm working with.