2024-11-22 16:05:23,090 : DEBUG : main : : DelegatingNodeModel : : : Registering view at model (total count 1) 2024-11-22 16:05:23,579 : DEBUG : main : : FreshPythonGatewayFactory : OpenAI Chat Model Connector : 4:6 : Using CA cert mode ENV 2024-11-22 16:05:23,579 : DEBUG : python-gateway-creator-15 : : FreshPythonGatewayFactory : : : Using CA cert mode ENV 2024-11-22 16:05:24,262 : DEBUG : main : : DefaultPythonGateway : OpenAI Chat Model Connector : 4:6 : Connected to Python process with PID: 18176 after ms: 675 2024-11-22 16:05:24,997 : DEBUG : python-gateway-creator-15 : : DefaultPythonGateway : : : Connected to Python process with PID: 15924 after ms: 727 2024-11-22 16:05:25,406 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpx:load_ssl_context verify=True cert=None trust_env=True http2=False 2024-11-22 16:05:25,410 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpx:load_verify_locations cafile='C:\\Users\\sntra\\Documents\\knime_5.4.0\\bundling\\envs\\org_knime_python_llm_5.4.0\\Lib\\site-packages\\certifi\\cacert.pem' 2024-11-22 16:05:25,798 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : openai._base_client:Request options: {'method': 'get', 'url': '/models', 'post_parser': ._parser at 0x000001EC869AF600>, 'json_data': None} 2024-11-22 16:05:25,827 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : openai._base_client:Sending HTTP Request: GET https://api.openai.com/v1/models 2024-11-22 16:05:25,827 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.connection:connect_tcp.started host='api.openai.com' port=443 local_address=None timeout=5.0 socket_options=None 2024-11-22 16:05:25,852 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.connection:connect_tcp.complete return_value= 2024-11-22 16:05:25,852 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.connection:start_tls.started ssl_context= server_hostname='api.openai.com' timeout=5.0 2024-11-22 16:05:25,867 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.connection:start_tls.complete return_value= 2024-11-22 16:05:25,867 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:send_request_headers.started request= 2024-11-22 16:05:25,868 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:send_request_headers.complete 2024-11-22 16:05:25,868 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:send_request_body.started request= 2024-11-22 16:05:25,868 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:send_request_body.complete 2024-11-22 16:05:25,868 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:receive_response_headers.started request= 2024-11-22 16:05:26,903 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:receive_response_headers.complete return_value=(b'HTTP/1.1', 200, b'OK', [(b'Date', b'Fri, 22 Nov 2024 13:05:27 GMT'), (b'Content-Type', b'application/json'), (b'Transfer-Encoding', b'chunked'), (b'Connection', b'keep-alive'), (b'openai-version', 
b'2020-10-01'), (b'x-request-id', b'8dbe1d9a95f8f789ea8f2f6a933986c7'), (b'openai-processing-ms', b'192'), (b'strict-transport-security', b'max-age=31536000; includeSubDomains; preload'), (b'CF-Cache-Status', b'DYNAMIC'), (b'Set-Cookie', b'__cf_bm=ylMSIWsGORW3w2Om.xbod9ZTEtbF.wiLQWDzwhhwE3E-1732280727-1.0.1.1-6TjA2PYryMIvrwA3Rhx83JaVCXpNgx8q_sKYydsf2BBo6xaRNDivn.d1FDpBXX8jmrbpGvwwPPBiT0t5HYYvhw; path=/; expires=Fri, 22-Nov-24 13:35:27 GMT; domain=.api.openai.com; HttpOnly; Secure; SameSite=None'), (b'X-Content-Type-Options', b'nosniff'), (b'Set-Cookie', b'_cfuvid=JHqPVUuT8JkWOFEMzE7l2heikAwQTSNFAce8KUut0S0-1732280727084-0.0.1.1-604800000; path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None'), (b'Server', b'cloudflare'), (b'CF-RAY', b'8e692189dd9b9f5c-DOH'), (b'Content-Encoding', b'br'), (b'alt-svc', b'h3=":443"; ma=86400')]) 2024-11-22 16:05:26,904 : INFO : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpx:HTTP Request: GET https://api.openai.com/v1/models "HTTP/1.1 200 OK" 2024-11-22 16:05:26,905 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:receive_response_body.started request= 2024-11-22 16:05:26,905 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:receive_response_body.complete 2024-11-22 16:05:26,905 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:response_closed.started 2024-11-22 16:05:26,906 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : httpcore.http11:response_closed.complete 2024-11-22 16:05:26,906 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : openai._base_client:HTTP Response: GET https://api.openai.com/v1/models "200 OK" Headers([('date', 'Fri, 22 Nov 2024 13:05:27 GMT'), ('content-type', 'application/json'), ('transfer-encoding', 'chunked'), ('connection', 'keep-alive'), ('openai-version', '2020-10-01'), ('x-request-id', '8dbe1d9a95f8f789ea8f2f6a933986c7'), ('openai-processing-ms', '192'), ('strict-transport-security', 'max-age=31536000; includeSubDomains; preload'), ('cf-cache-status', 'DYNAMIC'), ('set-cookie', '__cf_bm=ylMSIWsGORW3w2Om.xbod9ZTEtbF.wiLQWDzwhhwE3E-1732280727-1.0.1.1-6TjA2PYryMIvrwA3Rhx83JaVCXpNgx8q_sKYydsf2BBo6xaRNDivn.d1FDpBXX8jmrbpGvwwPPBiT0t5HYYvhw; path=/; expires=Fri, 22-Nov-24 13:35:27 GMT; domain=.api.openai.com; HttpOnly; Secure; SameSite=None'), ('x-content-type-options', 'nosniff'), ('set-cookie', '_cfuvid=JHqPVUuT8JkWOFEMzE7l2heikAwQTSNFAce8KUut0S0-1732280727084-0.0.1.1-604800000; path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None'), ('server', 'cloudflare'), ('cf-ray', '8e692189dd9b9f5c-DOH'), ('content-encoding', 'br'), ('alt-svc', 'h3=":443"; ma=86400')]) 2024-11-22 16:05:26,906 : DEBUG : main : : CloseablePythonNodeProxyFactory : OpenAI Chat Model Connector : 4:6 : openai._base_client:request_id: 8dbe1d9a95f8f789ea8f2f6a933986c7 2024-11-22 16:05:44,557 : DEBUG : comm-pool-thread-12 : : Node : Agent Prompter : 4:37 : reset 2024-11-22 16:05:44,688 : DEBUG : comm-pool-thread-12 : : Node : Agent Prompter : 4:37 : clean output ports. 2024-11-22 16:05:44,688 : DEBUG : comm-pool-thread-12 : : WorkflowDataRepository : : : Removing handler 0ed03372-e03e-4e3c-936a-82fbfea3b9e7 (Agent Prompter 4:37: ) - 15 remaining 2024-11-22 16:05:44,688 : DEBUG : comm-pool-thread-12 : : SelectionEventBus : : : Selection event emitter removed for node 4:37. 
Num emitters: 6 2024-11-22 16:05:44,688 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : Agent Prompter 4:37 has new state: IDLE 2024-11-22 16:05:44,688 : DEBUG : comm-pool-thread-12 : : Node : Vector Store to Tool : 4:27 : reset 2024-11-22 16:05:44,812 : DEBUG : comm-pool-thread-12 : : Node : Vector Store to Tool : 4:27 : clean output ports. 2024-11-22 16:05:44,812 : DEBUG : comm-pool-thread-12 : : WorkflowDataRepository : : : Removing handler 4079f774-3a85-4eed-9c35-0082624ce427 (Vector Store to Tool 4:27: C:\Users\sntra\AppData\Local\Temp\knime_My Dream Home_18793\fs-Vecto_4-27-71901) - 14 remaining 2024-11-22 16:05:44,844 : DEBUG : comm-pool-thread-12 : : WriteFileStoreHandler : : : Disposing file store "4079f774-3a85-4eed-9c35-0082624ce427 (Vector Store to Tool 4:27: C:\Users\sntra\AppData\Local\Temp\knime_My Dream Home_18793\fs-Vecto_4-27-71901)" - folder successfully deleted 2024-11-22 16:05:44,844 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : Vector Store to Tool 4:27 has new state: IDLE 2024-11-22 16:05:44,844 : DEBUG : comm-pool-thread-12 : : Node : OpenAI Functions Agent Creator : 4:36 : reset 2024-11-22 16:05:44,960 : DEBUG : comm-pool-thread-12 : : Node : OpenAI Functions Agent Creator : 4:36 : clean output ports. 2024-11-22 16:05:44,960 : DEBUG : comm-pool-thread-12 : : WorkflowDataRepository : : : Removing handler 2dbe97ee-ef54-4d59-9027-0e2d7c45c743 (OpenAI Functions Agent Creator 4:36: C:\Users\sntra\AppData\Local\Temp\knime_My Dream Home_18793\fs-OpenA_4-36-71902) - 13 remaining 2024-11-22 16:05:44,965 : DEBUG : comm-pool-thread-12 : : WriteFileStoreHandler : : : Disposing file store "2dbe97ee-ef54-4d59-9027-0e2d7c45c743 (OpenAI Functions Agent Creator 4:36: C:\Users\sntra\AppData\Local\Temp\knime_My Dream Home_18793\fs-OpenA_4-36-71902)" - folder successfully deleted 2024-11-22 16:05:44,965 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : OpenAI Functions Agent Creator 4:36 has new state: IDLE 2024-11-22 16:05:44,965 : DEBUG : comm-pool-thread-12 : : Node : Chat Model Prompter : 4:39 : reset 2024-11-22 16:05:45,083 : DEBUG : comm-pool-thread-12 : : Node : Chat Model Prompter : 4:39 : clean output ports. 2024-11-22 16:05:45,084 : DEBUG : comm-pool-thread-12 : : WorkflowDataRepository : : : Removing handler 2f2783fc-10bd-4d44-bb5a-b74852137073 (Chat Model Prompter 4:39: ) - 12 remaining 2024-11-22 16:05:45,084 : DEBUG : comm-pool-thread-12 : : SelectionEventBus : : : Selection event emitter removed for node 4:39. Num emitters: 5 2024-11-22 16:05:45,084 : DEBUG : comm-pool-thread-12 : : SelectionEventBus : : : Selection event emitter removed for node 4:39. Num emitters: 5 2024-11-22 16:05:45,084 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : Chat Model Prompter 4:39 has new state: IDLE 2024-11-22 16:05:45,084 : DEBUG : comm-pool-thread-12 : : Node : OpenAI Chat Model Connector : 4:6 : reset 2024-11-22 16:05:45,230 : DEBUG : comm-pool-thread-12 : : Node : OpenAI Chat Model Connector : 4:6 : clean output ports. 
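The entries up to this point show the OpenAI Chat Model Connector validating its credentials before the workflow is reset and reconfigured: httpx loads the certifi CA bundle from the bundled Python environment (CA cert mode ENV) and GET https://api.openai.com/v1/models returns 200 OK at 16:05:26, so the API key and TLS setup are working here. For reference, the same check can be reproduced outside KNIME with the standalone openai client; the snippet below is only an illustrative sketch (the client object and the OPENAI_API_KEY environment variable are assumptions for the example, not values taken from this log):

    # Illustrative reproduction of the "GET /v1/models" check seen above.
    # Assumes the openai>=1.x client and an OPENAI_API_KEY environment variable.
    import os
    from openai import OpenAI

    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    models = client.models.list()            # corresponds to GET https://api.openai.com/v1/models
    print([m.id for m in models.data][:5])   # a successful call mirrors the 200 OK logged at 16:05:26

If a standalone check like this behaves differently from the log above, the difference usually comes down to the CA bundle or proxy settings of the environment it runs in.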
2024-11-22 16:05:45,230 : DEBUG : comm-pool-thread-12 : : WorkflowDataRepository : : : Removing handler b35b6080-ca53-4b74-a912-a16bcda64ba1 (OpenAI Chat Model Connector 4:6: C:\Users\sntra\AppData\Local\Temp\knime_My Dream Home_18793\fs-OpenA_4-6-71900) - 11 remaining 2024-11-22 16:05:45,234 : DEBUG : comm-pool-thread-12 : : WriteFileStoreHandler : : : Disposing file store "b35b6080-ca53-4b74-a912-a16bcda64ba1 (OpenAI Chat Model Connector 4:6: C:\Users\sntra\AppData\Local\Temp\knime_My Dream Home_18793\fs-OpenA_4-6-71900)" - folder successfully deleted 2024-11-22 16:05:45,234 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : OpenAI Chat Model Connector 4:6 has new state: IDLE 2024-11-22 16:05:45,274 : INFO : comm-pool-thread-12 : : CloseablePythonNodeProxy : OpenAI Chat Model Connector : 4:6 : models.openai:Selected model: o1-mini-2024-09-12 2024-11-22 16:05:45,278 : DEBUG : comm-pool-thread-12 : : Node : OpenAI Chat Model Connector : 4:6 : Configure succeeded. (OpenAI Chat Model Connector) 2024-11-22 16:05:45,278 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : OpenAI Chat Model Connector 4:6 has new state: CONFIGURED 2024-11-22 16:05:45,308 : DEBUG : comm-pool-thread-12 : : Node : Vector Store to Tool : 4:27 : Configure succeeded. (Vector Store to Tool) 2024-11-22 16:05:45,308 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : Vector Store to Tool 4:27 has new state: CONFIGURED 2024-11-22 16:05:45,335 : DEBUG : comm-pool-thread-12 : : Node : OpenAI Functions Agent Creator : 4:36 : Configure succeeded. (OpenAI Functions Agent Creator) 2024-11-22 16:05:45,335 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : OpenAI Functions Agent Creator 4:36 has new state: CONFIGURED 2024-11-22 16:05:45,364 : DEBUG : comm-pool-thread-12 : : Node : Chat Model Prompter : 4:39 : Configure succeeded. (Chat Model Prompter) 2024-11-22 16:05:45,364 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : Chat Model Prompter 4:39 has new state: CONFIGURED 2024-11-22 16:05:45,398 : DEBUG : comm-pool-thread-12 : : Node : Agent Prompter : 4:37 : Configure succeeded. (Agent Prompter) 2024-11-22 16:05:45,398 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : Agent Prompter 4:37 has new state: CONFIGURED 2024-11-22 16:05:45,398 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : My Dream Home 4 has new state: CONFIGURED 2024-11-22 16:05:45,416 : DEBUG : main : : DelegatingNodeModel : : : Unregistering view from model (0 remaining). 2024-11-22 16:05:45,430 : DEBUG : main : : DesktopAPI : : : Desktop API function successfully called: openNodeDialog 2024-11-22 16:05:47,394 : DEBUG : comm-pool-thread-12 : : ExecutionContext : : : No file store handler set on "Chat Model Prompter" (possibly running in 3rd party executor) 2024-11-22 16:05:47,394 : DEBUG : comm-pool-thread-12 : : DataValueImageRendererRegistry : : : New batch of to-be-rendered images started for table with id 'spec_4_39'. 2024-11-22 16:05:48,684 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : OpenAI Chat Model Connector 4:6 has new state: CONFIGURED_MARKEDFOREXEC 2024-11-22 16:05:48,684 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : OpenAI Chat Model Connector 4:6 has new state: CONFIGURED_QUEUED 2024-11-22 16:05:48,685 : DEBUG : comm-pool-thread-12 : : SelectionEventBus : : : Selection event emitter removed for node 4:39. Num emitters: 5 2024-11-22 16:05:48,685 : DEBUG : comm-pool-thread-12 : : DataValueImageRendererRegistry : : : Cached image data cleared for table with id 'spec_4_39'. 
There is still image data cached for 2 tables 2024-11-22 16:05:48,685 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : Chat Model Prompter 4:39 has new state: CONFIGURED_MARKEDFOREXEC 2024-11-22 16:05:48,685 : DEBUG : comm-pool-thread-12 : : NodeContainer : : : My Dream Home 4 has new state: EXECUTING 2024-11-22 16:05:48,687 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : WorkflowManager : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 doBeforePreExecution 2024-11-22 16:05:48,687 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : NodeContainer : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 has new state: PREEXECUTE 2024-11-22 16:05:48,687 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : WorkflowDataRepository : OpenAI Chat Model Connector : 4:6 : Adding handler 87cdfc50-637a-48a9-91df-52786ee42fa9 (OpenAI Chat Model Connector 4:6: ) - 12 in total 2024-11-22 16:05:48,687 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : WorkflowManager : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 doBeforeExecution 2024-11-22 16:05:48,687 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : NodeContainer : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 has new state: EXECUTING 2024-11-22 16:05:48,687 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : LocalNodeExecutionJob : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 Start execute 2024-11-22 16:05:48,689 : DEBUG : python-gateway-creator-15 : : FreshPythonGatewayFactory : : : Using CA cert mode ENV 2024-11-22 16:05:48,853 : INFO : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : CloseablePythonNodeProxy : OpenAI Chat Model Connector : 4:6 : models.openai:Selected model: o1-mini-2024-09-12 2024-11-22 16:05:48,854 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : WriteFileStoreHandler : OpenAI Chat Model Connector : 4:6 : Assigning temp directory to file store "87cdfc50-637a-48a9-91df-52786ee42fa9 (OpenAI Chat Model Connector 4:6: C:\Users\sntra\AppData\Local\Temp\knime_My Dream Home_18793\fs-OpenA_4-6-71903)" 2024-11-22 16:05:48,859 : INFO : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : LocalNodeExecutionJob : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 End execute (0 secs) 2024-11-22 16:05:48,859 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : WorkflowManager : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 doBeforePostExecution 2024-11-22 16:05:48,859 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : NodeContainer : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 has new state: POSTEXECUTE 2024-11-22 16:05:48,859 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : WorkflowManager : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 doAfterExecute - success 2024-11-22 16:05:48,859 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : NodeContainer : OpenAI Chat Model Connector : 4:6 : OpenAI Chat Model Connector 4:6 has new state: EXECUTED 2024-11-22 16:05:48,861 : DEBUG : KNIME-Node-Usage-Writer : : NodeTimer$GlobalNodeStats : : : Successfully wrote node usage stats to file: C:\Users\sntra\Documents\knime_5.4.0\knime-workspace\.metadata\knime\nodeusage_3.0.json 2024-11-22 16:05:48,906 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : Node : Vector Store to Tool : 4:27 : Configure succeeded. 
(Vector Store to Tool) 2024-11-22 16:05:48,941 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : Node : OpenAI Functions Agent Creator : 4:36 : Configure succeeded. (OpenAI Functions Agent Creator) 2024-11-22 16:05:48,982 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : Node : Chat Model Prompter : 4:39 : Configure succeeded. (Chat Model Prompter) 2024-11-22 16:05:48,982 : DEBUG : KNIME-Worker-72-OpenAI Chat Model Connector 4:6 : : NodeContainer : OpenAI Chat Model Connector : 4:6 : Chat Model Prompter 4:39 has new state: CONFIGURED_QUEUED 2024-11-22 16:05:48,983 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : WorkflowManager : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 doBeforePreExecution 2024-11-22 16:05:48,983 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : NodeContainer : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 has new state: PREEXECUTE 2024-11-22 16:05:48,983 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : WorkflowDataRepository : Chat Model Prompter : 4:39 : Adding handler bd8af007-92f6-42a6-b836-2354926ef0af (Chat Model Prompter 4:39: ) - 13 in total 2024-11-22 16:05:48,983 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : WorkflowManager : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 doBeforeExecution 2024-11-22 16:05:48,983 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : NodeContainer : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 has new state: EXECUTING 2024-11-22 16:05:48,983 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : LocalNodeExecutionJob : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 Start execute 2024-11-22 16:05:48,986 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : FreshPythonGatewayFactory : Chat Model Prompter : 4:39 : Using CA cert mode ENV 2024-11-22 16:05:48,986 : DEBUG : python-gateway-creator-16 : : FreshPythonGatewayFactory : : : Using CA cert mode ENV 2024-11-22 16:05:49,459 : DEBUG : python-gateway-creator-15 : : DefaultPythonGateway : : : Connected to Python process with PID: 3336 after ms: 763 2024-11-22 16:05:50,225 : DEBUG : python-gateway-creator-16 : : DefaultPythonGateway : : : Connected to Python process with PID: 13964 after ms: 758 2024-11-22 16:05:50,980 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : DefaultPythonGateway : Chat Model Prompter : 4:39 : Connected to Python process with PID: 16712 after ms: 745 2024-11-22 16:05:51,356 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : BufferCache : Chat Model Prompter : 4:39 : KNIME Buffer cache statistics: 2024-11-22 16:05:51,356 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : BufferCache : Chat Model Prompter : 4:39 : 4 tables currently held in cache 2024-11-22 16:05:51,356 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : BufferCache : Chat Model Prompter : 4:39 : 49 distinct tables cached 2024-11-22 16:05:51,356 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : BufferCache : Chat Model Prompter : 4:39 : 40 tables invalidated successfully 2024-11-22 16:05:51,356 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : BufferCache : Chat Model Prompter : 4:39 : 5 tables dropped by garbage collector 2024-11-22 16:05:51,356 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : BufferCache : Chat Model Prompter : 4:39 : 87 cache hits (hard-referenced) 2024-11-22 16:05:51,356 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : BufferCache : Chat Model Prompter : 4:39 : 30 cache hits (softly referenced) 2024-11-22 16:05:51,356 : DEBUG : 
KNIME-Worker-73-Chat Model Prompter 4:39 : : BufferCache : Chat Model Prompter : 4:39 : 0 cache hits (weakly referenced) 2024-11-22 16:05:51,356 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : BufferCache : Chat Model Prompter : 4:39 : 0 cache misses 2024-11-22 16:05:51,361 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : ArrowBatchWriter : Chat Model Prompter : 4:39 : Closing file C:\Users\sntra\AppData\Local\Temp\knime_My Dream Home_18793\knime_container_20241122_9101071897418933707.knable (2 KB) 2024-11-22 16:05:51,362 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.DefaultRowKeyValueFactory"} 2024-11-22 16:05:51,363 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:51,363 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:51,364 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.DefaultRowKeyValueFactory"} 2024-11-22 16:05:51,365 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.DefaultRowKeyValueFactory"} 2024-11-22 16:05:51,365 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:51,365 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:51,366 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:51,366 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:51,366 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:51,367 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value 
factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:51,367 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:51,367 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,435 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,435 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,435 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,435 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,436 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,436 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,436 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,436 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,437 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.DefaultRowKeyValueFactory"} 2024-11-22 16:05:52,437 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,437 : 
DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,437 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.DefaultRowKeyValueFactory"} 2024-11-22 16:05:52,437 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.DefaultRowKeyValueFactory"} 2024-11-22 16:05:52,438 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,438 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,438 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,438 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,438 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.DefaultRowKeyValueFactory"} 2024-11-22 16:05:52,439 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:52,439 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : knime.api.types:The fallback value factory is used for the following type: {"value_factory_class":"org.knime.core.data.v2.value.StringValueFactory"} 2024-11-22 16:05:53,786 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpx:load_ssl_context verify=True cert=None trust_env=True http2=False 2024-11-22 16:05:53,787 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpx:load_verify_locations cafile='C:\\Users\\sntra\\Documents\\knime_5.4.0\\bundling\\envs\\org_knime_python_llm_5.4.0\\Lib\\site-packages\\certifi\\cacert.pem' 2024-11-22 16:05:53,866 : DEBUG : python-output-redirector-135 : : CloseablePythonNodeProxyFactory : Chat Model Prompter : 4:39 : 
C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\langchain_core\utils\utils.py:161: UserWarning: WARNING! seed is not default parameter. 2024-11-22 16:05:53,866 : DEBUG : python-output-redirector-135 : : CloseablePythonNodeProxyFactory : Chat Model Prompter : 4:39 : seed was transferred to model_kwargs. 2024-11-22 16:05:53,866 : DEBUG : python-output-redirector-135 : : CloseablePythonNodeProxyFactory : Chat Model Prompter : 4:39 : Please confirm that seed is what you intended. 2024-11-22 16:05:53,866 : DEBUG : python-output-redirector-135 : : CloseablePythonNodeProxyFactory : Chat Model Prompter : 4:39 : warnings.warn( 2024-11-22 16:05:54,108 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpx:load_ssl_context verify=True cert=None trust_env=True http2=False 2024-11-22 16:05:54,109 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpx:load_verify_locations cafile='C:\\Users\\sntra\\Documents\\knime_5.4.0\\bundling\\envs\\org_knime_python_llm_5.4.0\\Lib\\site-packages\\certifi\\cacert.pem' 2024-11-22 16:05:54,420 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : openai._base_client:Request options: {'method': 'post', 'url': '/chat/completions', 'files': None, 'json_data': {'messages': [{'content': 'You are a physics expert', 'role': 'system'}, {'content': 'Explain quantum physics in 2 sentences', 'role': 'user'}], 'model': 'o1-mini-2024-09-12', 'max_tokens': 4000, 'n': 1, 'seed': None, 'stream': False, 'temperature': 0.2}} 2024-11-22 16:05:54,446 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : openai._base_client:Sending HTTP Request: POST https://api.openai.com/v1/chat/completions 2024-11-22 16:05:54,446 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.connection:connect_tcp.started host='api.openai.com' port=443 local_address=None timeout=None socket_options=None 2024-11-22 16:05:54,471 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.connection:connect_tcp.complete return_value= 2024-11-22 16:05:54,472 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.connection:start_tls.started ssl_context= server_hostname='api.openai.com' timeout=None 2024-11-22 16:05:54,486 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.connection:start_tls.complete return_value= 2024-11-22 16:05:54,486 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:send_request_headers.started request= 2024-11-22 16:05:54,486 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:send_request_headers.complete 2024-11-22 16:05:54,486 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:send_request_body.started request= 2024-11-22 16:05:54,487 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:send_request_body.complete 2024-11-22 16:05:54,487 : DEBUG : 
KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:receive_response_headers.started request= 2024-11-22 16:05:55,574 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:receive_response_headers.complete return_value=(b'HTTP/1.1', 400, b'Bad Request', [(b'Date', b'Fri, 22 Nov 2024 13:05:55 GMT'), (b'Content-Type', b'application/json'), (b'Content-Length', b'221'), (b'Connection', b'keep-alive'), (b'access-control-expose-headers', b'X-Request-ID'), (b'openai-organization', b'user-nffph5zhu9wxygy8bg1vgfu0'), (b'openai-processing-ms', b'15'), (b'openai-version', b'2020-10-01'), (b'x-ratelimit-limit-requests', b'500'), (b'x-ratelimit-limit-tokens', b'200000'), (b'x-ratelimit-remaining-requests', b'499'), (b'x-ratelimit-remaining-tokens', b'195982'), (b'x-ratelimit-reset-requests', b'120ms'), (b'x-ratelimit-reset-tokens', b'1.205s'), (b'x-request-id', b'req_488b3b183473e9a5b072bc7348bf58c0'), (b'strict-transport-security', b'max-age=31536000; includeSubDomains; preload'), (b'CF-Cache-Status', b'DYNAMIC'), (b'Set-Cookie', b'__cf_bm=LFvAe0KHQg_4lXz5mg7gGY3arIlm1nUcg1w36O9HdP4-1732280755-1.0.1.1-0SZu1vDjpP7x2Cwom4vt3zOIrNCYg3G.xgFbGpMoIq23AJsVTEeBoB3s.VBgvaMqJyC73Nis.j57ZcBEEIjowg; path=/; expires=Fri, 22-Nov-24 13:35:55 GMT; domain=.api.openai.com; HttpOnly; Secure; SameSite=None'), (b'X-Content-Type-Options', b'nosniff'), (b'Set-Cookie', b'_cfuvid=Yy1rYbvKmlxWfR7as8RTqIK0.ET0n1OWBs_kWV9ADGI-1732280755755-0.0.1.1-604800000; path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None'), (b'Server', b'cloudflare'), (b'CF-RAY', b'8e69223cba2b9f5f-DOH'), (b'alt-svc', b'h3=":443"; ma=86400')]) 2024-11-22 16:05:55,576 : INFO : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 400 Bad Request" 2024-11-22 16:05:55,576 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:receive_response_body.started request= 2024-11-22 16:05:55,576 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:receive_response_body.complete 2024-11-22 16:05:55,576 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:response_closed.started 2024-11-22 16:05:55,576 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : httpcore.http11:response_closed.complete 2024-11-22 16:05:55,577 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : openai._base_client:HTTP Response: POST https://api.openai.com/v1/chat/completions "400 Bad Request" Headers([('date', 'Fri, 22 Nov 2024 13:05:55 GMT'), ('content-type', 'application/json'), ('content-length', '221'), ('connection', 'keep-alive'), ('access-control-expose-headers', 'X-Request-ID'), ('openai-organization', 'user-nffph5zhu9wxygy8bg1vgfu0'), ('openai-processing-ms', '15'), ('openai-version', '2020-10-01'), ('x-ratelimit-limit-requests', '500'), ('x-ratelimit-limit-tokens', '200000'), ('x-ratelimit-remaining-requests', '499'), ('x-ratelimit-remaining-tokens', '195982'), ('x-ratelimit-reset-requests', '120ms'), ('x-ratelimit-reset-tokens', '1.205s'), ('x-request-id', 
'req_488b3b183473e9a5b072bc7348bf58c0'), ('strict-transport-security', 'max-age=31536000; includeSubDomains; preload'), ('cf-cache-status', 'DYNAMIC'), ('set-cookie', '__cf_bm=LFvAe0KHQg_4lXz5mg7gGY3arIlm1nUcg1w36O9HdP4-1732280755-1.0.1.1-0SZu1vDjpP7x2Cwom4vt3zOIrNCYg3G.xgFbGpMoIq23AJsVTEeBoB3s.VBgvaMqJyC73Nis.j57ZcBEEIjowg; path=/; expires=Fri, 22-Nov-24 13:35:55 GMT; domain=.api.openai.com; HttpOnly; Secure; SameSite=None'), ('x-content-type-options', 'nosniff'), ('set-cookie', '_cfuvid=Yy1rYbvKmlxWfR7as8RTqIK0.ET0n1OWBs_kWV9ADGI-1732280755755-0.0.1.1-604800000; path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None'), ('server', 'cloudflare'), ('cf-ray', '8e69223cba2b9f5f-DOH'), ('alt-svc', 'h3=":443"; ma=86400')])
2024-11-22 16:05:55,577 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : openai._base_client:request_id: req_488b3b183473e9a5b072bc7348bf58c0
2024-11-22 16:05:55,579 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : openai._base_client:Encountered httpx.HTTPStatusError
Traceback (most recent call last):
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\openai\_base_client.py", line 1038, in _request
    response.raise_for_status()
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\httpx\_models.py", line 763, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.openai.com/v1/chat/completions'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
2024-11-22 16:05:55,579 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : openai._base_client:Not retrying
2024-11-22 16:05:55,579 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : openai._base_client:Re-raising status error
2024-11-22 16:05:55,587 : WARN : KNIME-Worker-73-Chat Model Prompter 4:39 : : CloseablePythonNodeProxy : Chat Model Prompter : 4:39 : Traceback (most recent call last):
  File "C:\Users\sntra\Documents\knime_5.4.0\plugins\org.knime.python3.nodes_5.4.0.v202411131616\src\main\python\_node_backend_launcher.py", line 1016, in execute
    outputs = self._node.execute(exec_context, *inputs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\plugins\org.knime.python3.nodes_5.4.0.v202411131616\src\main\python\knime\extension\nodes.py", line 1237, in wrapper
    results = func(*args, **kwargs)
              ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\plugins\org.knime.python.llm_5.4.0.v202411180921\src\main\python\src\models\base.py", line 654, in execute
    answer = chat.invoke(conversation_messages)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\langchain_core\language_models\chat_models.py", line 158, in invoke
    self.generate_prompt(
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\langchain_core\language_models\chat_models.py", line 560, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\langchain_core\language_models\chat_models.py", line 421, in generate
    raise e
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\langchain_core\language_models\chat_models.py", line 411, in generate
    self._generate_with_cache(
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\langchain_core\language_models\chat_models.py", line 632, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\langchain_openai\chat_models\base.py", line 522, in _generate
    response = self.client.create(messages=message_dicts, **params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\openai\_utils\_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\openai\resources\chat\completions.py", line 829, in create
    return self._post(
           ^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\openai\_base_client.py", line 1278, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\openai\_base_client.py", line 955, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\sntra\Documents\knime_5.4.0\bundling\envs\org_knime_python_llm_5.4.0\Lib\site-packages\openai\_base_client.py", line 1059, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
2024-11-22 16:05:55,735 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : Node : Chat Model Prompter : 4:39 : reset
2024-11-22 16:05:55,879 : ERROR : KNIME-Worker-73-Chat Model Prompter 4:39 : : Node : Chat Model Prompter : 4:39 : Execute failed: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
org.knime.python3.nodes.PythonNodeRuntimeException: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
    at org.knime.python3.nodes.CloseablePythonNodeProxy$FailureState.throwIfFailure(CloseablePythonNodeProxy.java:803)
    at org.knime.python3.nodes.CloseablePythonNodeProxy.execute(CloseablePythonNodeProxy.java:566)
    at org.knime.python3.nodes.DelegatingNodeModel.lambda$4(DelegatingNodeModel.java:180)
    at org.knime.python3.nodes.DelegatingNodeModel.runWithProxy(DelegatingNodeModel.java:237)
    at org.knime.python3.nodes.DelegatingNodeModel.execute(DelegatingNodeModel.java:178)
    at org.knime.core.node.NodeModel.executeModel(NodeModel.java:596)
    at org.knime.core.node.Node.invokeFullyNodeModelExecute(Node.java:1284)
    at org.knime.core.node.Node.execute(Node.java:1049)
    at org.knime.core.node.workflow.NativeNodeContainer.performExecuteNode(NativeNodeContainer.java:603)
    at org.knime.core.node.exec.LocalNodeExecutionJob.mainExecute(LocalNodeExecutionJob.java:98)
    at org.knime.core.node.workflow.NodeExecutionJob.internalRun(NodeExecutionJob.java:198)
    at org.knime.core.node.workflow.NodeExecutionJob.run(NodeExecutionJob.java:117)
    at org.knime.core.util.ThreadUtils$RunnableWithContextImpl.runWithContext(ThreadUtils.java:369)
    at org.knime.core.util.ThreadUtils$RunnableWithContext.run(ThreadUtils.java:223)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
    at org.knime.core.util.ThreadPool$MyFuture.run(ThreadPool.java:123)
    at org.knime.core.util.ThreadPool$Worker.run(ThreadPool.java:246)
2024-11-22 16:05:55,879 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : WorkflowManager : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 doBeforePostExecution
2024-11-22 16:05:55,879 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : NodeContainer : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 has new state: POSTEXECUTE
2024-11-22 16:05:55,879 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : WorkflowManager : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 doAfterExecute - failure
2024-11-22 16:05:55,879 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : Node : Chat Model Prompter : 4:39 : reset
2024-11-22 16:05:56,019 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : Node : Chat Model Prompter : 4:39 : clean output ports.
2024-11-22 16:05:56,019 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : WorkflowDataRepository : Chat Model Prompter : 4:39 : Removing handler bd8af007-92f6-42a6-b836-2354926ef0af (Chat Model Prompter 4:39: ) - 12 remaining
2024-11-22 16:05:56,019 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : NodeContainer : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 has new state: IDLE
2024-11-22 16:05:56,057 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : Node : Chat Model Prompter : 4:39 : Configure succeeded. (Chat Model Prompter)
2024-11-22 16:05:56,057 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : NodeContainer : Chat Model Prompter : 4:39 : Chat Model Prompter 4:39 has new state: CONFIGURED
2024-11-22 16:05:56,058 : DEBUG : KNIME-Worker-73-Chat Model Prompter 4:39 : : NodeContainer : Chat Model Prompter : 4:39 : My Dream Home 4 has new state: CONFIGURED
2024-11-22 16:05:56,074 : DEBUG : comm-pool-thread-12 : : ExecutionContext : : : No file store handler set on "Chat Model Prompter" (possibly running in 3rd party executor)
2024-11-22 16:05:56,075 : DEBUG : comm-pool-thread-12 : : DataValueImageRendererRegistry : : : New batch of to-be-rendered images started for table with id 'spec_4_39'.
2024-11-22 16:05:56,100 : ERROR : main : : KnimeBrowserView : : : %cerror%c Error captured hook :: background: #c0392b; border-radius: 0.5em; color: white; font-weight: bold; padding: 2px 0.5em; [object Object] (source: http://org.knime.ui.java/assets/index-BjMlukdx.js; line: 85)
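The failure is fully explained by the traceback above: the Chat Model Prompter sends the configured system message ('You are a physics expert') together with the user prompt, but the selected model o1-mini-2024-09-12 rejects the 'system' role, so the API answers 400 Bad Request with code 'unsupported_value' and the node aborts. Below is a minimal sketch of the failing payload and of one commonly used adjustment; it uses the standalone openai client and is an illustration under stated assumptions, not the KNIME node's own code:

    # Sketch only: mirrors the request logged at 16:05:54 and one workaround.
    # Assumes the openai>=1.x client with OPENAI_API_KEY set in the environment.
    from openai import OpenAI

    client = OpenAI()

    # Message list from the logged payload; o1-mini rejects the 'system' entry.
    messages_failing = [
        {"role": "system", "content": "You are a physics expert"},
        {"role": "user", "content": "Explain quantum physics in 2 sentences"},
    ]

    # Adjustment often suggested for models without system-prompt support:
    # fold the instructions into the user message (or pick a model, e.g. gpt-4o,
    # that accepts the 'system' role).
    messages_adjusted = [
        {"role": "user",
         "content": "You are a physics expert. Explain quantum physics in 2 sentences"},
    ]

    response = client.chat.completions.create(
        model="o1-mini-2024-09-12",
        messages=messages_adjusted,
        # The logged payload also sets temperature=0.2 and max_tokens=4000; o1-mini
        # may reject those as well, so they are omitted here.
    )
    print(response.choices[0].message.content)

In the workflow itself the equivalent adjustments would be to leave the system-message field of the Chat Model Prompter empty (or move its text into the user prompt), or to pick a chat model in the OpenAI Chat Model Connector that supports system messages.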