tinwhiskers@lemmy.world to Free Open-Source Artificial Intelligence@lemmy.world · English · 1 year ago

New technique to run 70B LLM Inference on a single 4GB GPU (ai.gopubby.com)