@joojmachine@lemmy.ml to Linux@lemmy.ml · English · 1 month ago
Alpaca: an ollama client to easily interact with an LLM locally or remotely (flathub.org)
Possibly linux · 1 month ago
This is what I needed. I will run this locally and run ollama in a VM, although podman and Distrobox look tempting.
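For reference, the container route mentioned above can be sketched roughly as follows. This is a minimal sketch, not an endorsed setup: the `docker.io/ollama/ollama` image and port 11434 are ollama's upstream defaults, and `com.jeffser.Alpaca` is assumed to be Alpaca's Flathub app id; the model name is just an example.

```shell
# Run the ollama server in a podman container
# (official upstream image; 11434 is ollama's default API port)
podman run -d --name ollama \
    -p 11434:11434 \
    -v ollama:/root/.ollama \
    docker.io/ollama/ollama

# Pull a model inside the running container (example model name)
podman exec ollama ollama pull llama3

# Launch Alpaca from Flathub and point it at the instance,
# e.g. http://localhost:11434 (or the VM's address for a remote setup)
flatpak run com.jeffser.Alpaca
```

Because Alpaca only needs the ollama HTTP endpoint, the same setup works whether the server runs in a container, a VM, or on another machine.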