• 0 Posts
  • 14 Comments
Joined 2 years ago
Cake day: June 17th, 2023

  • I’m not at all opposed to using LLMs for such purposes; however, please consider a solution that aligns with the values of GNU/Linux and the Free Software Movement. If you have sufficient RAM and a reasonably modern CPU, you can run inference on your own machine, locally, with no connection to any external servers — and at very respectable speed.