BurnedDonutHole@ani.social · Free Open-Source Artificial Intelligence@lemmy.world · Re: Totally new to running local LLM. How are you connecting to your model via mobile?
Use https://anythingllm.com/ on your mobile with its own local models? You'll be limited by your phone's hardware, but it will do in a pinch when you can't reach or use your laptop. It's open source and everything runs locally.