ERROR Table to Spark. Status code: 413 reason=Request Entity Too Large

I am trying to use the Spark Executor to run some MLlib functionality. My tests with small examples work well; however, when I try to upload a big file, I get this error in the KNIME console:

ERROR Table to Spark       4:2        Spark request failed.    Failed to upload file to server. Path: C:\XX\XX\XX\job-server5221361226461977652.tmp    Status code: 413    Status family: CLIENT_ERROR    Reason: Request Entity Too Large     Response:InboundJaxrsResponse{ClientResponse{method=POST, uri=http://xx:8090/data/dataTable-4e879a6c-9dab-46f6-bdf3-49707bc02f55, status=413, reason=Request Entity Too Large}}
ERROR Table to Spark       4:2        Execute failed: Executing Spark request failed. Error: Failed to upload file to server. Path: C:\XX\XX\XX\job-server5221361226461977652.tmp. Reason: Request Entity Too Large. Error code: 413. For more details see the KNIME log file.


Is there any parameter in the Spark configuration or the Spark Job Server that increases the amount of data that can be uploaded to Spark?

Thanks for your help.



Hi Eloy,

yes, there is. You need to append the following lines to the environment.conf of your Job Server:

spray {
  can {
    server {
      parsing {
        max-content-length = 50m
      }
    }
  }
}

This sets the limit for the request entity size to 50 MB (adjust as necessary).
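For reference, the nested block above collapses to a single HOCON dot-path line, which some prefer to append instead. A minimal sketch (the config file path is an assumption — adjust it to your installation, and restart the Job Server afterwards so the new limit takes effect):

```shell
#!/bin/sh
# Sketch: append the one-line HOCON form of the setting to environment.conf.
# CONF path is hypothetical -- point it at your Job Server's config file.
CONF=./environment.conf

cat >> "$CONF" <<'EOF'
# Raise the HTTP request entity limit to 50 MB
spray.can.server.parsing.max-content-length = 50m
EOF

# Show the appended setting to confirm it was written
grep max-content-length "$CONF"
```

Both forms are equivalent in HOCON; the dot-path version is just harder to get wrong when appending to an existing file, since there are no braces to balance.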