Wyre@lemmy.world to Asklemmy@lemmy.ml • "I'm increasingly unhappy with the limits on AI text generation and I have heard that it's not that hard to do it on a laptop oneself. What is the best path forward?" • 8 months ago

I've been playing a bit with llama2 in Ollama; it doesn't have any restrictions. Perhaps using Ollama to run models locally would solve some of these problems for you?
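For anyone wanting to try this, here's a rough sketch of what scripting against a local model looks like. It assumes Ollama is installed, the daemon is running on its default port (11434), and you've already pulled the model with `ollama pull llama2`; the call targets Ollama's documented /api/generate route, but double-check the details against the docs for your version:

```python
# Minimal sketch: query a locally running Ollama server (default port 11434).
# Assumes the Ollama daemon is running and `ollama pull llama2` has been done.
import requests

def generate(prompt: str, model: str = "llama2") -> str:
    """Send a prompt to the local /api/generate endpoint and return the full reply."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(generate("Summarize why running an LLM locally avoids provider-side restrictions."))
```

If you just want to chat interactively rather than script it, `ollama run llama2` in a terminal does the same thing without any code.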
Wyre@lemmy.world to Technology@lemmy.world • "Reddit: Return Of The Junk Stock IPO" • 8 months ago

Oh, just Forbes totally destroying Reddit.