Jun 14, 2024


Resolved an issue that prevented Sanctum from running properly on macOS (Intel).

Jun 10, 2024


We are excited to share a new release update with new features, improvements, and bug fixes! Here’s what’s new:

File manager (beta)

This is the first step toward our comprehensive RAG solution. View all imported files in a single place, use it to organize your workspace, and start chatting with files in a single click.

Message editing

Want to make edits to your prompt? Use the edit icon and get a new answer from the model.

App Improvements
  • Increased model context size in the model settings sidebar (e.g., Llama 3 8B now has a max context size of 8192 enabled by default).

  • Implemented caching and performance improvements to allow for much faster interactions and search on the Models page.

  • Added support for IBM Granite Code model preset & improved Mistral model preset.

Bug fixes
  • Fixed a bug with missing line breaks after submitting a message.

  • Fixed a bug where long chat names overlapped with the model dropdown on smaller resolutions.

  • Fixed an issue with generating chat titles.

  • Tweaked UI to avoid duplicating the left sidebar icon.

May 20, 2024


Added support for deep links for opening Sanctum from HuggingFace.

May 14, 2024


Updated llama.cpp to support pre-tokenizers in new models.

Resolved a bug that caused system prompt words (e.g., <context>) to occasionally appear in AI responses.

May 7, 2024


Fixed a bug where chats failed to process subsequent PDF attachments after an initial OCR failure.

Added detailed logging to assist with troubleshooting application runtime issues.

Apr 30, 2024


Fixed a bug that caused the app to crash on Windows 10.

Apr 29, 2024


Added support for Microsoft's Phi-3-Mini model.

Added Account ID to Settings page.

Fixed bugs and made improvements.

Apr 23, 2024


HuggingFace integration

Access any .GGUF open-source LLM on HuggingFace directly from the Sanctum app for your specific use case.

Llama 3 support

Download and run the latest Llama 3 8B model from Meta.

Hardware compatibility check

Quickly check whether your system's video memory meets the requirements to run specific AI models efficiently.

Improved “New Chat” interface

The new chat view allows for immediate adjustments to chat settings. Switch between model presets, modify hardware settings, and set the context length, all from one place.

Enhancements and bug fixes

Mar 5, 2024


Bug fixes and improvements:
  • Improved CUDA installation process on Windows - no need to restart the app after installation.

  • Added an available GPU memory indicator on macOS & Windows.

  • Fixed an issue with running Sanctum on macOS (Intel).

  • Improved error handling on Windows & macOS.

  • Minor bug fixes & improvements.

Feb 22, 2024


Bug fixes and improvements:
  • Added support for Gemma 7B model from Google.

  • Tuned default GPU settings for Windows and macOS.

  • Updated color scheme of the scroll line in the chats.

Feb 20, 2024


Sanctum v1.3.0 now supports Windows. Here’s what’s new:

Windows ready

Download and run Sanctum on Windows using CPU or GPU. Supports Nvidia CUDA 11 and 12 (support for AMD GPUs coming next).

PDF chat viewer

Seamlessly view PDFs in the Sanctum UI and see the highlighted passages your answers are drawn from.

Expanded file support

Added support for a variety of formats like .docx, .pptx, .js, .html, .css, and more!

Editable chat names

Rename chats directly from the sidebar or chat header.

In-chat file management

Easily unlink files from the chat to refine the model's response sources.

Sidebar flexibility

Toggle the sidebar for an uncluttered, focused chat view.

Jan 10, 2024


In this release, we have made the following updates and improvements:

File drag & drop

Enhanced file interaction: Seamlessly drag and drop files directly into Sanctum for a more intuitive user experience.

Device usage information
  • Optimized layout: Relocated Memory Usage information from the left sidebar to beneath the input area for better visibility.

  • New addition: Introduced a CPU Load metric for real-time performance monitoring.

Model additions

Added support for Microsoft's Phi-2 model.

Bug fixes & general improvements

Various bug fixes and improvements to enhance the stability and performance of Sanctum.

Dec 21, 2023


Sanctum leaps forward with a major upgrade 🚀

Introducing Sanctum Pro, a powerful enhancement to your Sanctum experience, packed with new and exciting features:

  • Secure, 100% Private PDF Chatting: Now chat, ask questions, and summarize PDF files in a secure and completely private environment.
  • Advanced Sanctum Vault Search: Effortlessly search through your encrypted chat history.
  • Early Access to New Features: Get exclusive early access to upcoming features and model updates.

Everyone on the Base plan will continue to enjoy all the essentials needed for private chatting.

We have also implemented additional updates and improvements, including a vital new feature:

Account Recovery: Secure your account with a unique 24-word recovery phrase. Use it to regain access if you forget your password or when migrating to a new device.

Dec 15, 2023


Added support for Mixtral-8x7B (32GB+ memory recommended):

  • Mixtral-8x7B-Low-Specs (Q3_K_M)
  • Mixtral-8x7B-Low-Specs (Q4_K_M)
  • Mixtral-8x7B-Low-Specs (Q6_K)

Nov 20, 2023


This update improves app performance by up to 5×, introduces new features, and makes minor tweaks.

  • Speed Boost: Our switch from the .ggml to the .gguf model format supercharges the app and expands model compatibility.
  • Model Additions: Welcoming Mistral-7B, TinyLlama & Llama-2-13B.
  • Bookmarks: Save important chat messages, ensuring they're always within reach.
  • Enhanced Navigation: Seamless transitions with our new “Go back” and “Go forward” buttons.
  • Chat Grouping: Added “Recent” and “Older” groups for your chat history.
  • Model Icons: Easily identify models with distinct icons in the "New chat" dropdown and chat headers.

Oct 10, 2023


Initial release.

Notable features include:

  • Chat with AI without the internet.
  • Launch Llama-2 7B LLM locally.
  • Maintain control of all your data.
  • Lock your data in Sanctum Vault, an encrypted local database.