LocalChat is a ChatGPT-like chat application that runs locally on your computer
GPL-3.0 License
In this release, we have made several improvements and bug fixes to enhance the user experience. We upgraded our dependencies for better performance and added a configuration option for light/dark mode. Additionally, we introduced global styles for a consistent look and feel across the application. We also fixed issues with input/output sanitization and corrected the documentation.
Furthermore, we implemented features such as allowing users to create new conversations directly from the start window and not pre-selecting a conversation on load. Lastly, we resolved an issue where the same conversation would be re-selected. Overall, this release aims to provide a more seamless and enjoyable experience for our users.
These release notes have been generated using openchat_openchat-3.5-0106.
Full Changelog: https://github.com/nathanlesage/local-chat/compare/v0.10.0...v0.11.0
Published by github-actions[bot] 8 months ago
This release updates node-llama-cpp to v3.0.0-beta.13. Furthermore, it introduces the ability for users to specify custom prompt templates. See the documentation to learn how to set up custom prompts.
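For illustration, here is a hypothetical sketch of what a ChatML-style prompt template (the format used by models such as OpenChat 3.5) expands to. LocalChat's actual template syntax is described in its documentation; the marker strings and the `ChatMessage` shape below are assumptions, not LocalChat's API.

```typescript
// Hypothetical illustration of a ChatML-style prompt template expansion.
// This is NOT LocalChat's template engine, only the general shape of the idea.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function renderChatML(messages: ChatMessage[]): string {
  // Each turn is wrapped in <|im_start|>/<|im_end|> markers, and the
  // rendered prompt ends with an opened assistant turn for the model
  // to complete.
  const turns = messages
    .map((m) => `<|im_start|>${m.role}\n${m.content}<|im_end|>`)
    .join("\n");
  return `${turns}\n<|im_start|>assistant\n`;
}

const prompt = renderChatML([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hello!" },
]);
console.log(prompt);
```

A custom template essentially swaps these marker strings for whatever the chosen model expects.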
Full Changelog: https://github.com/nathanlesage/local-chat/compare/v0.9.0...v0.10.0
Published by github-actions[bot] 8 months ago
This release updates node-llama-cpp (and llama.cpp alongside it) to the most recent release. LocalChat also now shows you how much VRAM is being used in the status bar. Lastly, we've implemented Electron fuses support to prevent unauthorized access to the binary.
Full Changelog: https://github.com/nathanlesage/local-chat/compare/v0.8.0...v0.9.0
Published by github-actions[bot] 9 months ago
LocalChat 0.8.0 is now available for download! This release updates dependencies (node-llama-cpp ⇾ v3.0.0-beta.9), adds conversation search, shows only populated conversation groups, uncollapses the most recent month's conversations by default, and disables the model selector widget while the model is generating. Additionally, it fixes issues with button styling and sidebar header positioning. Enjoy the improved user experience!
These release notes have been partially written with the help of OpenChat 3.5 0106 Q4_K_M in LocalChat.
Published by github-actions[bot] 9 months ago
This patch fixes an issue where the sidebar header was hidden, as well as a bug when renaming conversations.
Published by github-actions[bot] 9 months ago
We are excited to announce the latest changes to our application. The first change fixes conversation styling and autofocuses the description input. Next, we have introduced automatic assignment of conversation descriptions. We have also added features to sort conversations by time, group them by month, and make the groups collapsible. Additionally, we have fixed an issue with highlighting code blocks on mount. Lastly, a bug in the update process has been resolved. Stay tuned for more updates!
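The sort-and-group behavior described above can be sketched as follows. The `Conversation` shape here is an assumption for illustration, not LocalChat's actual data model.

```typescript
// Sketch: sort conversations newest-first, then bucket them by month.
// The Conversation interface is hypothetical, for illustration only.

interface Conversation {
  description: string;
  lastModified: number; // Unix timestamp in milliseconds
}

function groupByMonth(
  conversations: Conversation[]
): Map<string, Conversation[]> {
  // Sort newest first, then bucket by "YYYY-MM". Because a Map preserves
  // insertion order, the most recent month's group comes first, and only
  // populated groups ever exist, since keys are created on demand.
  const sorted = [...conversations].sort(
    (a, b) => b.lastModified - a.lastModified
  );
  const groups = new Map<string, Conversation[]>();
  for (const conv of sorted) {
    const d = new Date(conv.lastModified);
    const key = `${d.getFullYear()}-${String(d.getMonth() + 1).padStart(2, "0")}`;
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key)!.push(conv);
  }
  return groups;
}
```

A sidebar can then render one collapsible section per map entry, with only the first (most recent) month expanded.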
These release notes have been generated with OpenChat 3.5 0106 Q4_K_M in LocalChat.
Published by github-actions[bot] 9 months ago
This update contains a few improvements over 0.5.0. Specifically, you can now customize the system prompts for your chats. Additionally, we have armed the auto-updater for Windows and macOS installations, which means that apps on these operating systems should be able to auto-update themselves. Note, however, that even with the auto-updater turned on, there may be issues that we can only debug in production. Thus, if your installation does not notify you of the next release, please open an issue so that we can tend to it.
Published by github-actions[bot] 9 months ago
LocalChat 0.5.0 is an exciting release that brings a host of new features and improvements to the platform. One of the most notable additions is the ability to copy messages to the clipboard, making it easier than ever to share interesting or important conversations with others.
The sidebar has also been updated for a more modern look and feel, and LocalChat now remembers its window position across different screen configurations, ensuring a seamless user experience.
In addition to these improvements, LocalChat 0.5.0 introduces support for tables in messages, providing better previews and styling options.
The documentation has been expanded with proper guides and instructions, making it easier for new users to get started with LocalChat. The start guide has been replaced with a welcome message.
LocalChat 0.5.0 also includes several bug fixes, such as ensuring that models load correctly in new conversations and providing a fallback model if necessary. The app now defaults to a context size of 512 tokens instead of 2,048.
Overall, LocalChat 0.5.0 is a significant update that enhances the user experience and adds valuable new features. Users can expect improved performance, better styling, and greater flexibility in managing their conversations.
These release notes have been generated with openchat-3.5-0106.Q4_K_M using LocalChat, based on the changelog.
Published by github-actions[bot] 9 months ago
The following release notes have been generated from the changelog using openchat-3.5-0106.Q4_K_M in LocalChat.
We are excited to announce the latest updates to our platform, which include several bug fixes and new features. Firstly, we have fixed a typo in the release preparation script. We have also added a copy button to code blocks for easier sharing of code snippets.
Additionally, we have improved the visibility of inline code by using a better code font, making it easier to read and understand. The sidebar resizer is now properly hidden when the sidebar is not shown, providing a cleaner interface. We have also fixed an issue where the regenerate button was shown even when it wasn't possible to use it.
New features include a context menu that offers more options and control over the platform. Furthermore, the sidebar now shows the model name instead of the conversation ID for better clarity.
Lastly, we have bumped node-llama-cpp to version 3.0.0-beta.5, ensuring that our platform stays up-to-date with the latest developments in the technology. These updates are aimed at improving user experience and providing a more efficient and feature-rich platform for all users.
Published by github-actions[bot] 9 months ago
This version again fixes minor issues. Specifically, the Windows executable is now properly code-signed. Additionally, you can now regenerate a model response, and delete individual conversation messages.
Note that there is still no auto-update mechanism in place, so make sure to check the GitHub releases page from time to time, or subscribe to notifications.
Published by github-actions[bot] 9 months ago
This release contains various small fixes. Remember that the app still has no auto-update mechanism, so make sure you subscribe to GitHub releases to receive information about new releases. To do so, go to the main page of this repository, click the small arrow next to "Watch" (or "Unwatch", if you already watch the repository), then choose Custom → Releases.
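Alternatively, until an auto-update mechanism lands, you could poll GitHub's public releases REST API yourself. The endpoint below is GitHub's standard `releases/latest` route; the `isNewer` helper is a simplified numeric tag comparison for illustration, not a full semver parser.

```typescript
// Minimal sketch: check GitHub's releases API for a newer LocalChat tag.
// isNewer() compares "vX.Y.Z"-style tags numerically, component by component.

function isNewer(latestTag: string, installedTag: string): boolean {
  const parse = (tag: string) =>
    tag.replace(/^v/, "").split(".").map((n) => parseInt(n, 10) || 0);
  const [a, b] = [parse(latestTag), parse(installedTag)];
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    if ((a[i] ?? 0) !== (b[i] ?? 0)) return (a[i] ?? 0) > (b[i] ?? 0);
  }
  return false;
}

async function checkForUpdate(installedTag: string): Promise<void> {
  // GitHub's public REST API; no authentication needed for public repos.
  const res = await fetch(
    "https://api.github.com/repos/nathanlesage/local-chat/releases/latest"
  );
  const release = (await res.json()) as { tag_name: string };
  if (isNewer(release.tag_name, installedTag)) {
    console.log(`New release available: ${release.tag_name}`);
  }
}
```

Run `checkForUpdate("v0.2.0")` from a cron job or shell alias to get a console notice when a newer tag is published.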
Full Changelog: https://github.com/nathanlesage/local-chat/compare/v0.1.0...v0.2.0
Published by github-actions[bot] 9 months ago
This is the first release of LocalChat. It provides the basic functionality that I intended it to have. Note that because both the GGUF format and the underlying library, llama.cpp, are not yet very mature, you should expect some crashes here and there. Also note that I did not have success setting context lengths of more than 2,048 tokens, but your mileage may vary.
Please feel free to read the documentation at https://nathanlesage.github.io/local-chat/.
Have fun chatting!