Show HN: MinimalChat – A Simple and Customizable LLM Chat Application

Hello everyone! I have a hobby project that has become fairly full featured, so I figured I'd share it. The idea behind MinimalChat is to be a lightweight, dead-simple chat application that can be deployed locally in a few seconds (with Docker), while still having most of the nice-to-have features and looking pretty nice.

A nice bonus is that it's a Progressive Web App, so it can be installed like a normal application on your mobile device, and it has a full mobile UI.

If you're on Chrome or Edge, you can also download, load, and host models like Llama-3-8B entirely in your browser, with hardware acceleration via WebGPU. It's pretty experimental, but it does work!

I won't bloat this post by reiterating its features; the GitHub README gives a good idea of what the application can do. I know chat applications are a dime a dozen, but... here's another one hah!

https://ift.tt/YFwTNnA

May 11, 2024 at 06:49AM
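
P.S. If you're curious what in-browser, WebGPU-accelerated model hosting roughly looks like, here's a minimal sketch using the @mlc-ai/web-llm library. It's just an illustration of the general approach under my own assumptions (the library choice and the prebuilt model ID are examples), not necessarily how MinimalChat wires it up internally:

    // Minimal sketch: load a quantized Llama-3-8B build in the browser with
    // @mlc-ai/web-llm (WebGPU-accelerated). The model ID below is an example
    // prebuilt ID and may differ from what any given app actually ships.
    import { CreateMLCEngine } from "@mlc-ai/web-llm";

    async function main() {
      // Downloads the weights into the browser cache and compiles WebGPU kernels.
      const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f16_1-MLC", {
        initProgressCallback: (report) => console.log(report.text),
      });

      // OpenAI-style chat completion, running entirely client-side.
      const reply = await engine.chat.completions.create({
        messages: [{ role: "user", content: "Say hello in one sentence." }],
      });
      console.log(reply.choices[0].message.content);
    }

    main();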