As a bonus their mom briefly became socialist when she freely shared some crabs and HIV with you :)


Either my reference wasn’t clear enough or Trailer Park Boys has seriously faded in cultural relevance.



This is from the scene where Ricky graduated from the eighth grade, right?
Use a local model, learn some tool calling, and have it retrieve factual answers from a source like Wolfram Alpha if needed (rough sketch after the links below). We have a community over at c/[email protected] all about local models. If you're not very techy, I recommend starting with a simple llamafile, which is a one-click executable that packages the engine and model together in a single file.
Then move on to a real local model engine like koboldcpp running a quantized model that fits on your machine, especially if you have a graphics card and want to offload via CUDA or Vulkan. Feel free to reply/message me if you need further clarification/guidance.
https://github.com/mozilla-ai/llamafile
https://github.com/LostRuins/koboldcpp
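Here's a minimal sketch of the tool-calling idea, assuming your engine (llamafile or koboldcpp) is running its built-in server with an OpenAI-compatible /v1/chat/completions endpoint. The URL/port, the LOOKUP convention, and the lookup_fact() helper are just placeholders for illustration, not part of either project.

```python
# Minimal tool-calling sketch against a local OpenAI-compatible endpoint.
# Adjust LOCAL_API to your engine's port; lookup_fact() is a stub to replace
# with whatever factual source you trust (Wolfram Alpha, a local database, etc).
import re
import requests

LOCAL_API = "http://localhost:8080/v1/chat/completions"  # port varies by engine

SYSTEM = (
    "Answer the user's question. If you need a factual lookup, reply with "
    'exactly LOOKUP("your query") and nothing else.'
)

def ask_model(messages):
    payload = {"model": "local", "messages": messages, "temperature": 0.2}
    resp = requests.post(LOCAL_API, json=payload)  # "model" is often ignored locally
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def lookup_fact(query):
    # Placeholder: plug in your real lookup here.
    return f"(result for: {query})"

def answer(question):
    messages = [{"role": "system", "content": SYSTEM},
                {"role": "user", "content": question}]
    reply = ask_model(messages)
    match = re.search(r'LOOKUP\("(.+?)"\)', reply)
    if match:  # the model asked for a tool call; run it and feed the result back
        fact = lookup_fact(match.group(1))
        messages.append({"role": "assistant", "content": reply})
        messages.append({"role": "user", "content": f"Lookup result: {fact}"})
        reply = ask_model(messages)
    return reply

print(answer("What is the boiling point of ethanol?"))
```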
I would start with a 7B Q4_K_M quant and see if your system can run that.
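If you want a rough sense of whether that fits, here's a back-of-envelope estimate. The ~4.5 bits/weight figure for Q4_K_M and the overhead allowance are rough assumptions, not exact numbers.

```python
# Rough size estimate for a quantized model: params * bits_per_weight / 8 bytes.
def model_size_gb(params_billion, bits_per_weight=4.5):
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

size = model_size_gb(7)      # roughly 3.9 GB of weights for a 7B Q4_K_M
budget = size + 1.5          # plus a rough allowance for context/KV cache
print(f"~{size:.1f} GB weights, plan for ~{budget:.1f} GB RAM/VRAM")
```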