This workflow parses logfiles created during the operation of KNIME Server. The workflow was originally developed using KNIME Server 4.6.2, based on Apache TomEE 7.0.3, and is known to work on the KNIME Server 4.10.x release line using Apache TomEE 7.0.5. The workflow was designed to be run on the KNIME WebPortal.

The purpose of this workflow is to read the logfiles created during server operation and transform them into a format that is both easier for humans to read and easier for machines to analyze. The workflow extracts the most relevant events that are logged.

The initial data preparation consists of six steps:

1. Select source: Select from which source logfiles should be read. Possible options are:
   - KNIME Server REST API (requires admin privileges)
   - Directory on a local filesystem
2. Path to logs: The REST API option requires entering the necessary credentials as well as the URL of the KNIME Server REST API. Reading from a directory requires entering the path to the directory.
3. Select date range: KNIME Server creates a new logfile every day. Since a large amount of log data accumulates over time, this metanode lets you specify a start and end date so that only data within the selected timeframe is read.
4. Read logs: This metanode loops over the individual logfiles and collects the results in a single table.
5. Filter relevant events: This workflow focuses on a number of selected events related to the execution of workflows on the KNIME Server. Other events, such as those related to Executor startup or configuration, are therefore filtered out. To consider additional events, please add them in the Table Creator node.
6. Preprocessing: At this stage, all data is stored in a single string column, with one row for each logged event. For further processing, this string is split into separate columns: date, event type, log level, source, and the actual log message.
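The date filtering and preprocessing steps can be sketched outside of KNIME as well. The snippet below is a minimal Python illustration, not the workflow's actual implementation: the log line layout, column names, and regular expression are assumptions, since the exact format of KNIME Server logfiles can vary between versions.

```python
import re
from datetime import date, datetime

# Assumed log line layout (hypothetical -- the real KNIME Server layout may differ):
# "2023-05-04 09:15:02,123 INFO com.knime.server.Jobs : Job finished"
LOG_PATTERN = re.compile(
    r"^(?P<date>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\s+"
    r"(?P<level>[A-Z]+)\s+"
    r"(?P<source>\S+)\s+:\s+"
    r"(?P<message>.*)$"
)

def parse_line(line):
    """Split one raw log line into date, log level, source, and message columns."""
    m = LOG_PATTERN.match(line)
    if m is None:
        return None  # line does not match the assumed layout
    return m.groupdict()

def in_range(record, start, end):
    """Keep only events whose timestamp falls within the selected date range."""
    ts = datetime.strptime(record["date"], "%Y-%m-%d %H:%M:%S,%f").date()
    return start <= ts <= end
```

In the workflow itself these roles are played by the "Select date range" metanode and the preprocessing metanode; the list of relevant event types is maintained separately in the Table Creator node.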
This is a companion discussion topic for the original entry at https://kni.me/w/aRz-V1tUNuzdmv28