Installing KNIME Analytics Platform

For step-by-step videos on how to install KNIME Analytics Platform, please take a look at our KNIMETV YouTube channel.
  1. Go to the download page on the website to start installing KNIME Analytics Platform.

  2. The download page shows three tabs which can be opened individually:

    • Register for Help and Updates: here you can optionally provide some personal information and sign up for our mailing list to receive the latest KNIME news

    • Download KNIME: this is where you can download the software

    • Getting Started: this tab gives you information and links about what you can do after you have installed KNIME Analytics Platform

  3. Now open the Download KNIME tab and click the installation option that fits your operating system. KNIME Analytics Platform can be installed on Windows, Linux, or macOS.

    Notes on the different options for Windows:

    • The Windows installer extracts the compressed installation folder, adds an icon to your desktop, and suggests suitable memory settings.

    • The self-extracting archive simply creates a folder containing the KNIME installation files. No additional software is needed to extract it.

    • The zip archive can be downloaded, saved, and extracted in your preferred location on a system to which you have full access rights.

      Figure 1. KNIME Analytics Platform versions
  4. Read and accept the privacy policy and terms and conditions. Then click Download.

  5. Once downloaded, proceed with installing KNIME Analytics Platform:

    • Windows: Run the downloaded installer or self-extracting archive. If you have chosen to download the zip archive instead, unpack it to a location of your choice. Run knime.exe to start KNIME Analytics Platform.

    • Linux: Extract the downloaded tarball to a location of your choice. Run the knime executable to start KNIME Analytics Platform.

    • macOS: Double-click the downloaded dmg file and wait for the verification to finish. Then move the KNIME icon to Applications. Double-click the KNIME icon in the list of applications to launch KNIME Analytics Platform.

Configuration settings and knime.ini file

When KNIME Analytics Platform is installed, configuration settings are set to their defaults; they can later be changed in the knime.ini file. The configuration settings, i.e. the options used by the Java Virtual Machine when KNIME Analytics Platform is launched, range from memory settings to system properties required by some extensions.

You can find knime.ini in the installation folder of KNIME Analytics Platform.

On macOS: To locate knime.ini, open Finder and navigate to your installed applications. Next, right-click the KNIME application, select Show Package Contents in the menu, and navigate to Contents → Eclipse.

The knime.ini file can be edited with any plain text editor, such as Notepad (Windows), TextEdit (macOS) or gedit (Linux).
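For orientation, the contents of knime.ini typically look something like the following. This is an illustrative sketch only: the launcher jar name and the exact set of entries vary between KNIME versions and installations.

```ini
; Illustrative knime.ini fragment (entries and paths vary per installation)
-startup
plugins/org.eclipse.equinox.launcher_<version>.jar
; Everything after -vmargs is passed to the Java Virtual Machine
-vmargs
-Xmx1024m
```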

Allocating memory in knime.ini file

The entry -Xmx1024m in the knime.ini file specifies how much memory KNIME Analytics Platform is allowed to use. The setting for this value will depend on how much memory is available in your machine. KNIME recommends setting it to approximately one half of your available memory, but you can modify the value based on your needs. For example, if your computer has 16 GB of memory, you might set the entry to -Xmx8192m.
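The rule of thumb above (roughly half of the machine's RAM, expressed in megabytes) can be sketched as a small calculation; the function name and the GB-to-MB conversion below are illustrative, not part of KNIME:

```python
def recommended_xmx(total_ram_gb: int) -> str:
    """Suggest a -Xmx entry of roughly half the machine's RAM, in megabytes."""
    heap_mb = (total_ram_gb * 1024) // 2
    return f"-Xmx{heap_mb}m"

# A machine with 16 GB of RAM gets roughly half as JVM heap:
print(recommended_xmx(16))  # -Xmx8192m
```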

Installing Extensions and Integrations

If you want to add capabilities to KNIME Analytics Platform, you can install extensions and integrations. The available extensions range from free open source extensions and integrations provided by KNIME to free extensions contributed by the community and commercial extensions including novel technology nodes provided by our partners.

The KNIME extensions and integrations developed and maintained by KNIME contain deep learning algorithms provided by Keras, high performance machine learning provided by H2O, big data processing provided by Apache Spark, and scripting provided by Python and R, just to mention a few.

Install extensions from:

  • KNIME Hub:

    • Search for the Extension or Integration you want to install in the search bar

    • Click Extensions on the results page

    • Click the extension you want to install; on the extension page, drag and drop the squared yellow icon, shown in Figure 2, onto the workbench of KNIME Analytics Platform. A window will open asking if you want to search for and install the extension or integration. Click Yes and follow the instructions.

      Figure 2. Install the KNIME Integrated Deployment Extension from KNIME Hub
    • Restart KNIME Analytics Platform.

  • KNIME Analytics Platform:

    • Click File on the menu bar and then Install KNIME Extensions…. The dialog shown in Figure 3 opens.

      Figure 3. Installing Extensions and Integrations from KNIME Analytics Platform
    • Select the extensions you want to install

    • Click Next and follow the instructions

    • Restart KNIME Analytics Platform.

The Install KNIME Extensions menu provides the extensions that are available via the update sites you have enabled.

For more information, take a look at our video on How to Install Extensions in KNIME Analytics Platform. Also see the Extensions and Integrations Guide.

To uninstall an extension, click Help → About KNIME Analytics Platform, and then Installation Details. The dialog shown in Figure 4 opens. Now, select the extension that you want to uninstall, and click Uninstall….

Figure 4. Uninstalling Extensions and Integrations

Updating KNIME Analytics Platform and Extensions

Make sure that you always use the latest version of KNIME Analytics Platform and its extensions.

Do this by:

  1. Clicking File → Update KNIME…. In the dialog that opens, select the available updates you want to install and then click Next.

  2. Proceed by following the instructions. KNIME Analytics Platform has to be restarted in order to apply the updates.

Update Sites

The update sites are where KNIME retrieves additional software in the form of extensions as well as updates. To see or edit the available update sites, select File → Preferences → Install/Update → Available Software Sites.

Default Update Sites

These four update sites are provided by KNIME and are always available:

Figure 5. Available Update Sites

KNIME Analytics Platform 4.3 Update Site: Provides all extensions and integrations maintained by KNIME: R, Python, H2O Machine Learning, Apache Spark for big data, and many more. Contains KNIME Labs Extensions, which are extensions that are not yet part of the set of stable KNIME extensions because their functionality may not yet be finalized.

KNIME Community Extensions (Trusted): Provides trusted community extensions, i.e. extensions created by the KNIME community, which have been tested for backward compatibility and compliance with KNIME quality standards.

KNIME Partner Extensions: Provides extensions created by KNIME partners.

Community Extensions (Experimental): Provides additional extensions created by the KNIME community.

KNIME Analytics Platform 4.3 Update Site and KNIME Community Extensions (Trusted) are enabled by default.

Adding External Update Sites

To install extensions that are not part of the above update sites, click Add to manually add the relevant update site, inserting the Name and Location as shown in Figure 6.

Figure 6. Add Update Sites

After adding a new update site you will see it listed in the Available Software Sites. You must now enable it by selecting it from the list.

Adding Local Update Sites

If your working environment has limited internet access or you receive an error message “Proxy Authentication Required” when connecting to a remote update site (provided by a URL), you can install extensions from a local zip file.

  1. Download KNIME update sites as zip files at the following links:

  2. Save the zip file containing the extensions to your local system

  3. Select File → Preferences → Install/Update → Available Software Sites and enter the path to the zip file by clicking Add → Archive… as shown in Figure 7.

    Figure 7. Adding Update Sites from Zip Archive
  4. Now click Apply and Close.

    If the same extensions are also provided by a remote update site, you will first have to disable that update site by deselecting its entry in the Available Software Sites dialog and confirming via Apply and Close.

Working with the Nightly Builds

Once a night, a new version of KNIME Analytics Platform is created directly from our development branch. These nightly builds provide insight into what’s coming up in the next regular release. However, for real work, always use a version of a standard KNIME release. Also read the following disclaimer before proceeding:

Really, really, really important disclaimer

This is most definitely not production quality code. These nightly builds are what we use internally to validate and test recent developments, so they are not tested as thoroughly as standard KNIME releases. Furthermore new nodes or functionality may change substantially (or disappear entirely) from one build to the next. It’s even possible that workflows you edit or create with nightly builds stop being readable by future (or past) versions…​

These nightlies are a great way to get a sneak peek at what may be coming in the next version of KNIME and provide feedback and suggestions. They are not a particularly safe way to do real work.

Changelog (KNIME Analytics Platform 4.3)

Detailed changelog for v4.3.x releases

KNIME Analytics Platform 4.3.0

(see highlight summary)
Release date: December 06, 2020

New nodes

  • AP-15781: String to Path (Variable) (New File Handling)

  • AP-15730: Path to String (Variable) (New File Handling)

  • AP-15703: Excel Reader (New File Handling)

  • AP-15614: HTTP(S) Connector (New File Handling)

  • AP-15576: FTP Connector (New File Handling)

  • AP-15571: Excel Sheet Reader (New File Handling)

  • AP-15518: Call Workflow (Table Based) (New File Handling)

  • AP-15499: Excel Writer (New File Handling)

  • AP-15464: Google Drive Connector (New File Handling)

  • AP-15367: Path to String (New File Handling)

  • AP-15355: Azure Blob Storage Connector (New File Handling)

  • AP-15283: SSH Connector (New File Handling)

  • AP-15257: Parquet Writer (New File Handling)

  • AP-15198: Binary Objects to File (New File Handling)

  • AP-14855: KNIME Server Connector (New File Handling)

  • AP-14083: Line Reader (New File Handling)

  • AP-13784: Compress Files (New File Handling)

  • AP-13783: De-compress Files (New File Handling)

  • AP-13729: Create File/Folder Variables (New File Handling)

  • AP-13721: String to Path (New File Handling)

  • AP-13479: Files/Folders Meta Info (New File Handling)

  • AP-13434: Create folder (New File Handling)

  • AP-13433: Transfer Files (New File Handling)

  • AP-13432: Delete files/folder (New File Handling)

  • AP-13429: ORC Reader (New File Handling)

  • AP-13428: Parquet Reader (New File Handling)

  • AP-13427: ORC Writer (New File Handling)

  • AP-15469: Conda Environment Propagation

  • AP-13482: Python Script (with dynamic inputs and outputs)

  • AP-14864: Table Manipulator

  • AP-14860: Variable to Credentials

  • AP-11405: PATCH request

  • BD-1083: (Big Data Extensions): Databricks File System Connector (New File Handling)

  • BD-1080: (Big Data Extensions): Create Databricks Environment (New File Handling)

  • BD-1079: (Big Data Extensions): Create Spark Context (Livy) (New File Handling)

  • BD-1078: (Big Data Extensions): Create Local Big Data Environment (New File Handling)

  • BD-1076: (Big Data Extensions): Spark to Text (New File Handling)

  • BD-1075: (Big Data Extensions): Spark to Parquet (New File Handling)

  • BD-1074: (Big Data Extensions): Spark to ORC (New File Handling)

  • BD-1073: (Big Data Extensions): Spark to JSON (New File Handling)

  • BD-1072: (Big Data Extensions): Spark to CSV (New File Handling)

  • BD-1071: (Big Data Extensions): Spark to Avro (New File Handling)

  • BD-1070: (Big Data Extensions): Text to Spark (New File Handling)

  • BD-1069: (Big Data Extensions): Parquet to Spark (New File Handling)

  • BD-1068: (Big Data Extensions): ORC to Spark (New File Handling)

  • BD-1067: (Big Data Extensions): JSON to Spark (New File Handling)

  • BD-1066: (Big Data Extensions): CSV to Spark (New File Handling)

  • BD-1065: (Big Data Extensions): Avro to Spark (New File Handling)

  • BD-1054: (Big Data Extensions): HDFS Connector (KNOX) (New File Handling)

  • BD-1046: (Big Data Extensions): HDFS Connector (New File Handling)


Enhancements

  • AP-15850: "Columnar Table Backend" as part of KNIME Labs — enables faster execution of workflows

  • AP-15777: Configuration Dialogue Layout should display message instead of blank page when empty

  • AP-15704: Add transformation support to CSV Reader (New File Handling)

  • AP-15672: Amazon S3 Connector: Add SSE-S3 and SSE-KMS support

  • AP-15649: Enrich Node Monitor with table dimensions

  • AP-15488: Add hub view to default build

  • AP-15483: Dynamic ports: Allow to specify default ports for extendable and optional groups

  • AP-15477: Layout editor for configuration nodes

  • AP-15442: Add system property to disable storing weakly encrypted passwords in workflows

  • AP-15432: Update Apache CXF from 3.0.7 to 3.2.14

  • AP-15328: Transpose Node should make use of ColumnFilter API

  • AP-15299: Add Column Filter to Table Row to Variable

  • AP-15270: Support list columns in Parquet Reader

  • AP-15259: Support multi-file writing for ORC/Parquet Writer (new filehandling)

  • AP-15201: Allow workflow diff between current version and snapshot on server mount point

  • AP-15140: Support FSLocation and Collection FlowVariables in Variable Loop End

  • AP-15134: Support Paths and ListCells in Table to Variable Loop Start

  • AP-15061: Handle drag on drop of workflow from Hub to workbench

  • AP-15047: Support Paths in Table to Variable and vice versa

  • AP-15039: Set Python node executable’s "current working directory" to Workflow directory

  • AP-14942: Additional type mappings for NULL DB type

  • AP-14874: Design Read "Enduser API" for Fast Tables

  • AP-14834: Update ChemAxon Marvin extension to x.y.z and re-categorize into "Partner Extension" (now with more stringent license checking)

  • AP-14742: Make Chromium.SWT part of KNIME AP build

  • AP-14428: SAP Theobald Reader: CSV Options

  • AP-14231: Show KNIME Hub in AP Panel

  • AP-14193: (Semi-) Automatically add and enable update sites on node drag&drop from the Hub

  • AP-14132: "Open in KNIME Hub" Links: Use AP integration instead of external browser

  • AP-14035: Enable file download widget to work in the AP

  • AP-14034: Enable file upload widget to work in the AP

  • AP-13883: Excel Reader enhancement when reading multiple files with different structure

  • AP-13827: Google Authentication Should Only Involve The Google Analytics ReadOnly scope Instead Of The Full Scope Set

  • AP-13130: Call Workflow (Table Based) should allow calling workflows without any input or output nodes

  • AP-13019: Input Filtering in Column Selection and Column Filter Configuration

  • AP-11813: More efficient update check of (web-)linked components

  • AP-11549: Excel Reader (XLS) automatically update data preview

  • AP-10689: Configurable load timeout in "Call Workflow" nodes

  • AP-9451: Changing the order of dialog elements in Component Dialogs

  • AP-5400: XLS Reader should allow changing column type in dialog

  • BD-1087: (Big Data Extensions): Update Spark version in Create Local Big Data Environment to Spark 2.4.7

  • WEBP-615: New File Upload Widget should additionally export a path flow variable new File Handling nodes can read

  • WEBP-567: Toggle component

  • WEBP-391: File Download Widget does not support new flow variable type for Path

  • WEBP-182: Frontend: rewrite the file upload input widget (input)

  • WEBP-178: Frontend: rewrite file download widget (output)

  • WEBP-172: Frontend: rewrite date input widget (input)

  • WEBP-170: Frontend: rewrite credentials input widget (input)

Bug Fixes

  • AP-15044: On Linux, node dialog input fields require extra click to become active

  • AP-14945: "Shift + key" in Table Creator doesn’t work any more

  • AP-15782: Force-Enable Create Snapshot does not create a snapshot

  • AP-15758: Configuration Layout Editor shows missing nodes when only views are present

  • AP-15723: DB Connection Closer node should log a warning instead of an error if connection is already closed

  • AP-15721: Support newOutputStream with append option in URIFS

  • AP-15720: If exists options must not be disabled for Costum/KNIME URL

  • AP-15691: Layout Editor of Composite View: Drag&Drop is broken on Windows

  • AP-15645: Tableau Integration: Write hyperd.log to temporary file

  • AP-15641: File and line reader don’t read size of .gz files correctly (incorrect progress report)

  • AP-15638: Mountpoints cannot be changed in case content provider factory isn’t known.

  • AP-15585: Transformation Table disappears if names are wrong and other setting is changed

  • AP-15582: ORC Writer corrupts lists if a list cell is missing

  • AP-15550: NullPointerException in Call Workflow (Table based) node

  • AP-15529: Transformation tab does not restore changed columns orders

  • AP-15527: Port type mix-up when changing port order on components and metanodes (which are connected to a downstream metanode)

  • AP-15501: Call Workflow (Table Based) throws exception while cleaning delegate table

  • AP-15476: Writing workflow summary locks up executor

  • AP-15473: Save Workflow Node does not work in Components

  • AP-15440: File extensions in filter mode are not parsed correctly if they are separated by spaces

  • AP-15357: KNIME Server Connection Builds Incorrect Endpoint with Email-based Usernames

  • AP-15339: Fix broken error message for empty input file paths

  • AP-15244: Errors while loading data in workflows sometimes not properly handled in downstream components

  • AP-15190: DB Loader occasionally fails in a loop

  • AP-15128: DL Keras Learner: View throws an error if loss is negative

  • AP-15098: Excel Sheet Appender truncates sheet names with more than 25 characters

  • AP-15065: Workflow Writer sets inputs and outputs to None if the dialog is canceled when it is opened for the first time

  • AP-15029: Component inside encrypted component is lost after reopening workflow

  • AP-15010: Call Workflow nodes on Windows use '\' — should be '/' to be cross platform compatible

  • AP-14997: Flow variable controlled settings using flow variables from unexecuted components are lost when configuring

  • AP-14894: Create Temp folder dialog does not warn about missing folders

  • AP-14752: Binary Object to File node giving NullPointerException when run on server

  • AP-14748: Closing (some) node configuration dialogs may occasionally cause freeze on MacOS

  • AP-14642: Excel Reader fails in case column cell has only space in column row

  • AP-14568: Workflow Annotation Edit Toolbar color chooser is buggy

  • AP-14408: DL Keras Learner creates many temporary files which are not deleted during execution

  • AP-13932: Potential StackOverflow in NodeContext/NodeLogger in development mode (assertions enabled)

  • AP-13919: Configuration Dialog of Components does not increase width when scaling the windows

  • AP-13067: 'Select Loop' shown in context menu of component input node

  • AP-12595: Metanode reconfiguration does not allow addition of Network port type

  • AP-11847: Excel Reader extracts false Value for Date&Time Column

  • AP-9051: Excel Sheet Appender Hits 65536 Row Limit for XLSM File Type

  • AP-3746: XLS reader sets error on execution when the xls file has slightly changed (e.g. different column names)

  • BD-1094: (Big Data Extensions): Pyspark nodes can not be edited with remote editing

  • BD-1058: (Big Data Extensions): Disable transactions only on Impala BigData/HiveLoader tmp table

  • BD-1057: (Big Data Extensions): Parquet/ORC reader nodes fail on S3 bucket in new regions and buckets created after June 24, 2020

  • WEBP-634: Date&Time Widget is broken

  • WEBP-624: Support local file upload with non-legacy file upload

  • WEBP-578: File Upload Widget - Validation of File extension should not be case sensitive

  • WEBP-477: Fix 'WizardExecutionController#hasCurrentPage' in case the page is still in execution and part of a loop

KNIME Analytics Platform 4.3.1

Release date: January 29, 2021


Enhancements

  • AP-16070: Publish old file handling utility and connector nodes as legacy nodes

  • AP-16053: Add option to open excel after excel writer execution

  • AP-16024: Display warning if file system is not browsable

  • AP-16022: Refer to file/folder instead of path in error message

  • AP-15891: Improve icon of the molecule sketcher

  • AP-15486: Retry option for Call Workflow node

  • BD-1099: (Big Data Extensions): Create Local Big Data Environment node should provide several options for working directory

Bug Fixes

  • AP-16042: TensorFlow2 Network Writer node cannot be configured

  • AP-15922: A flow variable of the Local File Browser Configuration Node was removed

  • AP-15907: (Teamspace Extension:) Custom node repository extension missing in build

  • AP-15885: REST nodes (Get, Post, …​) cause unexpected warning message when loading flows in v4.3.0

  • AP-15870: Linked metanode updates may fail in installations updated from 4.2 (new installations work fine)

  • AP-15864: Flow Variables from (new) "Table Row to Variable Loop Start" can’t be used in Rule Engine Variable, String Manipulation Variable, Math Formula Variable

  • WEBP-668: Date&Time widget outputs inactive fixed date instead of activated execution date by default

  • AP-16082: Amazon S3 connection must refresh default credentials chain on reset

  • AP-16069: Apply as new default button is not working for the molecule widget

  • AP-16059: Transfer Files node should delete source folder if folder option is selected and include selected source folder is not selected

  • AP-16044: Excel Writer throws ArrayIndexOutOfBoundsException when writing images

  • AP-16038: Chrome/Chromium view can open outside of display

  • AP-16034: SharePoint Online Connector: Occasional ClassCastException when accessing a SharePoint site

  • AP-16033: New Readers fail if file structure changes

  • AP-15980: Component Editor can’t save component when it contains variables

  • AP-15976: Auto-guessing of cache sizes can output negative numbers

  • AP-15963: Excel Reader showing "Zip bomb detected!" error

  • AP-15936: Node monitor doesn’t show table dimensions (under some circumstances)

  • AP-15925: Workflow summary doesn’t include workflow metadata

  • AP-15915: Row Filter (Labs): Incorrect implementation of ColumnFilter

  • AP-15875: XLSX files cannot be opened on OneDrive and Google Drive

  • AP-15868: Thread leakage in Logfile appender

  • AP-15865: Slider Widget fails with validation error message when value is greater than 100

  • AP-15857: Table Manipulator does not work with fast table backend

  • AP-15852: Excel reader preview showing error that a column cannot be converted

  • AP-15800: Conda Env Prop fails to create env with unhelpfully terse error message

  • AP-15798: Conda Env Prop node dialog button does nothing for Include only explicitly installed

  • AP-15796: Conda Env Prop reports conda warnings as errors in log

  • AP-15626: “Creation Date” in workflow description is always current date

  • BD-1085: (Big Data Extensions): Spark reader/writer nodes with new file system port should provide better error message for incompatible connector

KNIME Analytics Platform 4.3.2

Release date: March 08, 2021

New nodes

  • AP-13870: (Indexing and Searching extension:) Index Writer

  • AP-13867: (Indexing and Searching extension:) Index Reader


Enhancements

  • AP-16242: Table Indexer to support index reusing

  • AP-16127: Implement non-filestore FSLocationCell

  • BD-1120: (Big Data Extensions): Create Databricks ENV Node should allow user to overwrite the AuthMech JDBC parameter

Bug Fixes

  • AP-16186: SharePoint Online Connector: Reading a file occasionally fails with "Error during http request"

  • AP-16097: Dynamic JS view node framework: dependencies of dependencies are not set to local

  • AP-15893: Python: System entries missing in PATH variable after removing conda activate (Windows)

  • AP-16294: FTP Connector: Problem with underlying Apache Commons library and STRU command

  • AP-16293: Excel Writer cannot handle missing images

  • AP-16263: Amazon S3 Connector should be able to access buckets from other regions

  • AP-16236: Deserialization of "StringListValueFactory" causes exception with new columnar table backend

  • AP-16233: FTP Connector does not check the server hostname against the SSL certificate when using FTPS

  • AP-16222: Oracle Connector does not work with Oracle Wallet

  • AP-16190: Excel Writer causes Java Heap Space exception when writing in append mode

  • AP-16152: File Upload does not escape special characters

  • AP-16147: Only half of the application window is usable

  • AP-16142: Endless loop when creating variable from column called RowID

  • AP-16137: Cannot read/write files that contain percent-encoding in their name with Custom/KNIME URL (new file handling)

  • AP-16122: FTP Connectors causes NullPointerException in List Files/Folders node

  • AP-16098: DB Loader: Path flow variable button forgets state

  • AP-16088: Excel Writer shows confusing error when no path is specified

  • AP-16045: Relative paths are not correctly opened by local file browser

  • AP-16036: Component description is not matching new reordering of configuration nodes

  • AP-15867: Columnar Table Preferences UI Glitch

  • AP-15797: Conda Env Prop discards packages from node dialog when unchecked

  • AP-14930: NullPointerException when loading workflow under heavy load

  • AP-14591: Opening any workflow after cancellation of application close gives error

  • BD-1121: (Big Data Extensions): Handle Databricks File System REST API changes

  • WEBP-563: Use node model to set server credentials (Credentials Widget)

KNIME Analytics Platform 4.3.3

Release date: May 26, 2021

New nodes

  • AP-16252: Azure Data Lake Storage Gen2 Connector

  • AP-14838: Path to URI

  • AP-12700: Delete Files/Folders (Table based) (new file handling)


Enhancements

  • AP-16718: Update built-in mysql5 driver to version 5.1.49 to support TLS 1.2

  • AP-16519: File Readers & File Handling Nodes: Add option to follow links to Filter Options

  • AP-16130: New option for JavaScript-based nodes: Sanitize HTML/JS Script

  • AP-15973: Add option to include special files to Filter Options

  • AP-15927: Add gzip as supported file format in Decompress Files node

  • AP-15874: Improve empty column header handling in Excel Reader

  • AP-15821: Add support for generating Shared Access Signature (SAS) URLs to Azure Blob Storage Connector (Path to URI)

  • AP-15820: Add support for generating signed URLs to Google Cloud Storage Connector (Path to URI)

  • AP-15819: Add support for generating presigned URLs to Amazon S3 Connector (Path to URI)

  • BD-1107: (Big Data Extensions): Add support for Spark 3.0 to Create Databricks Environment node

  • BD-1104: (Big Data Extensions): Update H2O Sparkling Water to

Bug Fixes

  • AP-16689: Amazon S3 access does not work without s3:getBucketLocation permission

  • AP-16370: IllegalArgumentException in Reference Row Filter and Splitter Nodes on Low Memory and Empty Input Table

  • AP-16628: Called workflows are not removed by Call Local Workflow node when run on KNIME Server

  • AP-16619: KNIME Server Connector and Relative to file systems should not increase user count in KNIME Server

  • AP-16598: DB Merge node ignores rows for dbs that use multi statement execution (e.g. PostgreSQL, H2) if not all input table columns are used and if the first columns contain the same values in subsequent rows within the same batch

  • AP-16577: Python: Conda Env Prop node does not output detailed error message when pip fails

  • AP-16576: Decompress Node remove unsupported extension error messages

  • AP-16547: Excel Writer cannot overwrite files on network drives

  • AP-16499: Executing multiple excel writers at the same time causes NPE

  • AP-16485: SharePointOnlineConnector: NullPointerException when reading file

  • AP-16414: Race condition in the retrieval of the local file system preferences causes readers to fail

  • AP-16353: CSV Writer does not write column header when overwriting and disabled "don’t write column header" option is enabled

  • AP-16340: Closing the View (CEF) of a component after it’s reset leads to the error

  • AP-16300: String to Path (Variable) does not store variable suffix

  • AP-16206: Send Email node does not encode Chinese characters in subject line

  • AP-16116: CSV file reader does not autodetect on drag-n-drop

  • AP-16061: Must not check for regular file when reading a file (new file handling)

  • AP-15964: CSV Reader does not reset line separator normalizer

  • AP-15953: Socket connection setup failing due to premature executor shutdown

  • AP-15929: Google Analytics fails with Timeout

KNIME Analytics Platform 4.3.4

Release date: July 14, 2021

Bug Fixes

  • AP-16730: Table to PDF & HTML cannot write pngs

  • AP-16591: Hard crash of KNIME AP when entering code in Java Snippet node dialog

  • AP-16486: External NPM version dependencies cause IE11 to fail

  • AP-17031: Bundle org.knime.ext.ftp.filehandling fails to initialize

  • AP-16887: NullPointerException in Call Local Workflow (row based) node when report is generated

  • AP-16880: Excel Reader cannot handle numeric cells with missing values but without cell type

  • AP-16844: SSH Connector: Fix file attribute cache issues after copy/move

  • AP-16825: DynamoDB nodes lose rows when table spec changes in-between read batches

  • AP-16824: FTP Connector cannot browse or list files on Windows IIS FTP server

  • AP-16783: Path to URI with presigned URLs should use normalize

  • AP-15972: KNIME Server Connector and Relative to file systems treat workflows differently