Extract Tasks from Meeting Notes with a Local LLM
In this tutorial, you will build a workflow that reads unstructured meeting notes from various file formats and automatically extracts actionable tasks into a clean table using a local LLM.
Why Local?
This tutorial uses a local LLM so you don’t need to configure API keys or send data to the cloud.
If you already have an API key for a cloud-based model, you can easily replace the local model nodes with the appropriate nodes to connect to your preferred provider.
The Use Case
Modern video conferencing platforms such as Zoom and Google Meet often provide automated meeting transcripts. However, these transcripts are usually long and not structured for tracking technical tasks or tickets.
In this tutorial, you will build a workflow that processes the raw transcription data and extracts the relevant task information such as status, ownership, and deadlines into a structured format your team can act on.
Before you start
Make sure that you have the following installed and running on your computer:
- KNIME Analytics Platform: Download it from knime.com/downloads.
- Ollama: Download and install Ollama. It must be running in your system tray before you start.
- The Local LLM: Open your terminal and run this command to download the model used in this lesson:
```bash
ollama run llama3.2:3b
```

1. Download the Action Item Extractor Folder
Before starting KNIME Analytics Platform, you need the workflow files and example data on your computer. To download the Action Item Extractor folder:
- Visit the Extract Tasks from Meeting Notes folder on the KNIME Hub.
- Click the Download icon to save the entire space to your machine.
- Open KNIME Analytics Platform and import this folder into your Local Workspace.
- Verify: Ensure you see the `data` folder containing `transcript.docx` in your explorer.
2. Prepare your Environment
Ensure your KNIME workspace has the necessary nodes installed to handle AI tasks.
- Open the Task Extractor workflow on your local machine.
- If prompted, install the required extensions.
- Result: After KNIME restarts, your workspace is ready to create your action item extraction workflow.
3. Connect the Local LLM
The connection to your local Ollama server is already set up in the workflow.
- Locate the Connect to Ollama box.
- The OpenAI Authenticator (using `http://localhost:11434/v1`) and the OpenAI LLM Selector (set to `llama3.2:3b`) are ready.
- Execute these nodes.
- Verify: When the status turns green, KNIME is successfully talking to your local model.
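Behind these two nodes, KNIME is simply talking to Ollama's OpenAI-compatible REST API. If you ever want to check the connection outside KNIME, a short Python sketch like the following (assuming Ollama's default base URL) lists the models the local server exposes; the try/except keeps it harmless when the server is not running.

```python
import json
import urllib.request
import urllib.error

# Ollama exposes an OpenAI-compatible API at this base URL by default.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def list_local_models(base_url=OLLAMA_BASE_URL):
    """Return model ids served by the local Ollama instance, or [] if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
            payload = json.load(resp)
        return [m["id"] for m in payload.get("data", [])]
    except (urllib.error.URLError, OSError):
        return []  # server not running; KNIME's node execution would fail here too

models = list_local_models()
print("llama3.2:3b available" if "llama3.2:3b" in models else "model not found (is Ollama running?)")
```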
4. Parse the Document
The workflow includes a Tika Parser node that can read various file formats. In this case, it will extract the text from `transcript.docx`.
- Locate the Tika Parser node connected to the document input.
- Execute the node to parse the document.
- Result: The unstructured text from the meeting notes is now ready to be processed by the local LLM.
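Tika handles the parsing inside KNIME, but it can help to see what reading a .docx actually involves: the file is a ZIP archive whose body text lives in `word/document.xml`. The following standalone Python sketch (purely illustrative, not part of the workflow) extracts roughly the same plain text that Tika hands downstream as the Content column.

```python
import re
import zipfile

def extract_docx_text(path):
    """Return the plain text of a .docx file.

    A .docx is a ZIP archive; the body text sits in word/document.xml
    inside <w:t> elements. Stripping the XML tags from that part yields
    approximately the unstructured text a parser produces.
    """
    with zipfile.ZipFile(path) as zf:
        xml = zf.read("word/document.xml").decode("utf-8")
    # Turn paragraph boundaries into spaces, then drop all remaining tags.
    xml = xml.replace("</w:p>", " ")
    text = re.sub(r"<[^>]+>", "", xml)
    return re.sub(r"\s+", " ", text).strip()
```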
5. Configure the Extraction Logic
Now you will link the document text to the LLM prompter and define what it should do with the unstructured text.
- Locate the LLM Prompter node.
- Connect the Tika Parser to the data input and the OpenAI LLM Selector to the model input.
- In the LLM Prompter settings, ensure the following are configured:
  - Prompt column: `Content`
  - Output format: `Structured`
  - Target object name: `Actionable tasks`
  - Target object description: `Tasks mentioned in the meeting that should be added to the board.`
  - Target objects per input row: `Multiple`
6. Define the Task Structure
You must now define the specific table headers you want the local LLM to fill. In the Output columns section, configure the following columns.
| Column | Data Type | Category | Description |
|---|---|---|---|
| Project | String | Single | The project this task belongs to. |
| Responsible | String | Single | Person owning the task (can be null). |
| Deadline | String | Single | Due date or time reference (can be null). |
| Stakeholders | String | Single | People or groups involved or impacted. |
| Status | String | Single | Current task state (open, planned, in progress, tentative, confirmed). |
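Choosing the Structured output format effectively asks the model to fill one fixed record per task. As an illustration only (the helper function and sample record below are invented, not part of the workflow), here is how such records could be checked in Python:

```python
# Expected columns for each extracted task, mirroring the table above.
TASK_COLUMNS = ["Project", "Responsible", "Deadline", "Stakeholders", "Status"]
VALID_STATUSES = {"open", "planned", "in progress", "tentative", "confirmed"}

def is_valid_task(task):
    """Check a record has exactly the expected columns and a known status.

    Responsible and Deadline may be None, as noted in the column table.
    """
    if set(task) != set(TASK_COLUMNS):
        return False
    status = task.get("Status")
    return status is None or status.lower() in VALID_STATUSES

# Hypothetical record of the kind the LLM Prompter would emit.
sample = {
    "Project": "Website relaunch",
    "Responsible": None,
    "Deadline": "next Friday",
    "Stakeholders": "Marketing",
    "Status": "planned",
}
print(is_valid_task(sample))  # prints: True
```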
7. Execute and Inspect
Launch the extraction and view your structured data.
- Execute the LLM Prompter node.
- Note: Because the model is running on your local hardware, it may take a few moments to process the document.
- Open the output table of the LLM Prompter node to inspect the extracted tasks.
Conclusion
Congratulations! You've built a Task Extractor that can parse complex documents and generate structured data without ever sending your private information to the cloud.
Next Steps
If you want to keep learning about AI and data workflows in KNIME, take one of our free courses and earn a microcredential badge to showcase your skills: