Duck.ai doesn’t data mine, and has o3-mini, which I have found to be very good. It’s got some extra functionality too, like lines to break up text.
Yeah, Duck is all I’ve bothered with since it came out, since you don’t even need to log in to use it.
Why would I use this over Ollama?
Ollama can’t run on Android
god i can’t wait for the ai bubble to pop
Google hosting their shit on Microsoft’s servers, and telling you to sideload instead of using their own software distribution method for their own OS, is kind of crazy if you think about it.
Excellent, I will be sure not to use this, like all Google shit.
All the time I spent trying to get rid of gemini just to now download this. Am I stupid?
You never heard of Ollama or Docker Model Runner?
Android and iOS.
Is the chat uncensored?
And unmonitored? Don’t trust anything from Google anymore.
What makes this better than Ollama?
Quote: “all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!”
So you can download it and set the device to airplane mode, never go online again - they won’t be able to monitor anything, even if there’s code for that included.
Sounds counter-intuitive on a smart phone where you most likely want to be online again at some point in time.
So trust them. If you don’t, and you still want to use this, buy a separate device for it, or use a VM.
Can’t? This is not for you.
I’m not gonna use my smartphone as a local LLM machine.
Okay man.
everything is unmonitored if you don’t connect to the network.
But not everything works in those conditions.
it does if you make it work in those conditions.
software that “phones home” is easy to fool.
Just firewall the software, or is there anything fancier I would need to do?
typically the phone home is looking for a response to unlock.
use a packet sniffer to see what the request/response is and replicate it with a proxy or response server.
this is also known as a man-in-the-middle (MITM).
takes skill and knowledge to do, but once you do a few dozen it’s pretty easy since most software “phone homes” are looking for static non-encrypted responses.
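The response-spoofing step described above can be sketched in a few lines of Python. This is a minimal stand-in, not any specific tool: the `/license` path and the plain-text `OK` body are hypothetical placeholders for whatever request/response your packet capture actually shows.

```python
# Minimal local "response server" that answers a phone-home check.
# Assumption: the app expects a static, unencrypted "OK" from a GET
# to /license (replace path and body with what the sniffer shows).
import http.server
import threading
import urllib.request

class FakeLicenseHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Replay the static response observed in the packet capture.
        body = b"OK"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

def serve(port=8080):
    """Start the fake endpoint on localhost in a background thread."""
    server = http.server.HTTPServer(("127.0.0.1", port), FakeLicenseHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = serve()
    # Point the app's phone-home hostname at 127.0.0.1 (e.g. via
    # /etc/hosts) and it receives its expected "OK".
    print(urllib.request.urlopen("http://127.0.0.1:8080/license").read())
    server.shutdown()
```

In practice you'd redirect the app's phone-home hostname to 127.0.0.1 (hosts file or local DNS) so its request lands on this server instead of the vendor's; this only works for the unencrypted, static responses mentioned above, since TLS with certificate pinning defeats a simple replay.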
Thanks for the info!
There is already GPT4All.
Convenient graphical interface, any model you like (for Llama fans - of course it’s there), fully local, easy to opt in or out of data collection, and no fuss to install - it’s just a Linux/Windows/MacOS app.
For Linux folks, it is also available as flatpak for your convenience.
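For reference, installing from Flathub is typically a one-liner; the application ID below is an assumption, so verify the exact ID on flathub.org before running it:

```shell
# Install GPT4All from Flathub (app ID may differ; check flathub.org)
flatpak install flathub io.gpt4all.gpt4all
# Then launch it:
flatpak run io.gpt4all.gpt4all
```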
So it doesn’t require internet access at all? I would only use these on a disconnected part of my network.
Yes, it works perfectly well without internet. Tried it both on a physically disconnected PC and a laptop in airplane mode.