Kafka Producer and Consumer error

Dear team,

I am relatively new to integrating applications with Kafka.
I am trying to get KNIME integrated with our Kafka cluster within the company.
I successfully configured the Kafka Connector node. However, connecting it to the Kafka Producer and Kafka Consumer nodes gives me the errors below.

Kafka Producer --> ERROR Kafka Producer 7:31 Execute failed: org.apache.kafka.common.errors.CorruptRecordException: This message has failed its CRC checksum, exceeds the valid size, or is otherwise corrupt.

Kafka Consumer --> ERROR Kafka Consumer 7:33 Execute failed: org.apache.kafka.common.errors.SerializationException: Size of data received by LongDeserializer is not 8

I have already tried reducing the maximum number of messages per poll and increasing the poll timeout.
In the advanced settings I have also tried setting the key auto.offset.reset to earliest.
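
For reference, the settings I changed correspond roughly to the following Kafka consumer client properties (a sketch; the broker address and values are placeholders, and the poll timeout is an argument to poll() rather than a property):

```java
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;

public class ConsumerSettings {
    public static Properties settings() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // placeholder
        // Reduced max. number of messages per poll
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "10");
        // Start from the earliest offset instead of the latest
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return props;
    }
}
```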

Any help in this regard is highly appreciated.

Best Regards,
Jeff

Hi Jeff,
Could this be due to the topic’s cleanup policy as described here: https://stackoverflow.com/questions/49098274/kafka-stream-get-corruptrecordexception? Can you post a bit more info about your setup?
Kind regards,
Alexander

@putheje_1,

Currently KNIME assumes that your records are pairs of <long, string>. Does this apply to your Kafka setup?
Can you read data from your topic using the Kafka console-consumer?
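
If it helps, a minimal stand-alone check with the same <long, string> assumption KNIME makes would look roughly like this (a sketch; broker address, group id, and topic name are placeholders):

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class TopicCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "topic-check");          // placeholder
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        // The <long, string> assumption: 8-byte long keys, string values
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, LongDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        try (KafkaConsumer<Long, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("your-topic")); // placeholder
            // A SerializationException here means the keys in the topic are not 8-byte longs
            for (ConsumerRecord<Long, String> record : consumer.poll(Duration.ofSeconds(5))) {
                System.out.println(record.key() + " -> " + record.value());
            }
        }
    }
}
```

If that already fails, the records in the topic were written with different serializers than the ones the KNIME nodes expect.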

Best
Mark

@Mark_Ortmann @AlexanderFillbrunn
Thanks a lot for your replies! The problem with the producer was that the data it attempted to send to an existing Kafka topic differed in type from what already existed in that topic. Since the producer did not transmit any data, the consumer also received nothing with the offset mode set to latest.

I tried with a new topic that I created ad hoc, and I was able to both produce and consume the message!
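
For anyone running into the same thing, what worked for me corresponds roughly to this (a sketch; broker address and topic name are placeholders):

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.LongSerializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class FreshTopicProduce {
    public static void main(String[] args) throws Exception {
        Properties adminProps = new Properties();
        adminProps.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // placeholder
        // Create a fresh topic so it contains no records of a different type
        try (AdminClient admin = AdminClient.create(adminProps)) {
            admin.createTopics(Collections.singletonList(new NewTopic("knime-test", 1, (short) 1)))
                 .all().get();
        }

        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // placeholder
        // Serializers matching the <long, string> records the KNIME nodes expect
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, LongSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        try (KafkaProducer<Long, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("knime-test", 1L, "hello")).get();
        }
    }
}
```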
Thanks a lot for your inputs.

Happy to hear!

Best
Mark