This gist contains a proof-of-concept project (hacked together in one evening) that connects an AI chat assistant to the Meshtastic network. The assistant is designed to provide concise, helpful responses within the constraints of the Meshtastic environment.
This project brings together two of my favorite things: the Meshtastic network and an AI chat assistant. The assistant is based on the "FuseChat-7B-VaRM-Q5_K_M" model, and the code is written in Python.
The code is optimized to run on an ordinary PC without a GPU. I tested it on my old i5-2400 CPU with 16 GB of RAM; on that hardware, a response takes roughly 30-60 seconds.
The assistant receives the packet payload, Received Signal Strength Indication (RSSI), and Signal-to-Noise Ratio (SNR) from the Meshtastic network. It generates responses of at most 237 characters, so each reply fits in a single Meshtastic text message while staying concise and helpful.
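The receive path can be sketched roughly like this. This is a minimal sketch, not the project's actual code: `build_prompt` and `truncate_reply` are hypothetical helper names, and the packet fields (`decoded`, `rxRssi`, `rxSnr`) follow the dictionaries the Meshtastic Python library passes to receive callbacks.

```python
MAX_REPLY_CHARS = 237  # keep the reply within a single Meshtastic text message

def build_prompt(packet: dict) -> str:
    """Turn a received Meshtastic packet into a prompt for the model."""
    text = packet.get("decoded", {}).get("text", "")
    rssi = packet.get("rxRssi")
    snr = packet.get("rxSnr")
    return (
        f"Message: {text}\n"
        f"RSSI: {rssi} dBm, SNR: {snr} dB\n"
        "Answer concisely (the reply must fit in one short radio message):"
    )

def truncate_reply(reply: str, limit: int = MAX_REPLY_CHARS) -> str:
    """Hard-cap the model's answer so it fits in one packet."""
    reply = reply.strip()
    return reply if len(reply) <= limit else reply[: limit - 1] + "…"
```

The truncated reply could then be handed to the serial interface's `sendText` to go back out over the mesh.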
To run the project, ensure you have Python installed along with the AI model and the dependencies listed in requirements.txt. Additionally, make sure your Meshtastic devices are set up and connected via USB or over the network.
- Clone this repository to your local machine.
- Install dependencies: `pip install -r requirements.txt`
- Download the model:
  `huggingface-cli download LoneStriker/FuseChat-7B-VaRM-GGUF FuseChat-7B-VaRM-Q5_K_M.gguf --local-dir ./models --local-dir-use-symlinks False`
- Run the Python script: `python main.py`
This project is just a proof of concept and may require further tweaking for optimal performance. Any feedback is welcome; feel free to reach out. Enjoy exploring Meshtastic with AI assistance!
If you enjoy this project, be sure to check out my other creation, MakerDeals.eu. This website simplifies the process of finding, filtering, and comparing 3D printing filaments available on Amazon.
Thanks for sharing this. I was planning on trying this myself; you saved me some effort.
This works pretty well if you want to send the whole response in multiple messages, as long as you're not connected to anything but your local devices.
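For reference, splitting a long response into multiple messages could look roughly like this. This is a minimal sketch: `split_reply` is a hypothetical helper, and 237 is the per-message character budget mentioned in the project description.

```python
def split_reply(reply: str, limit: int = 237) -> list[str]:
    """Split a long answer into chunks that each fit in one Meshtastic
    message, preferring to break on word boundaries."""
    chunks, current = [], ""
    for word in reply.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= limit:
            current = candidate
        else:
            if current:
                chunks.append(current)
            # A single word longer than the limit gets hard-split.
            while len(word) > limit:
                chunks.append(word[:limit])
                word = word[limit:]
            current = word
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can then be sent in order with the interface's `sendText`.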
I am also using TheBloke/Mistral-7B-Instruct-v0.2-GGUF, which runs great on my RTX 3090 under Win11.