There is no difference; flat-earth was the test. It showed that duping people online takes little effort because our educational system does not teach critical thinking.
Becoming??? It has been for almost a decade. Anyone else remember the idiocy of flat-earthers??
I was a terrible typist as a kid, a two-finger hunt-and-pecker. Then I got a job that necessitated fast typing while listening or reading, and I learned how to touch type, or fake it well enough, really quick. Humans are adaptable; that’s why we are everywhere. We just need the motivation to learn the skill.
Reliability of the connection to the drives, especially during unscheduled power cycles. USB is known for random drops, or for not picking the drive up before all your other services have started, which can mean extra troubleshooting. It can run fine… or it might not. This is in reference to storage drives, not OS drives.
TrueNAS with Tailscale/Headscale/NetBird as far as software and security. As far as hardware, you want storage that is not attached via USB. Either an off-the-shelf NAS solution or a DIY NAS would work. There are a few YouTubers that have touched on this: Hardware Haven and RaidOwl, I think.
“Brought to you by Carl’s Jr.”
That statement alone says he is a candidate to be bought. If Oscar Mayer gave him a billion dollars, I’m sure he would have no choice but to make sure hot dogs are served as the only protein in grocery stores.
No shit, because we all see that AI is just technospeak for “harvest all your info”.
The newer Omada routers are pretty good, and their software is getting better. Personally, I use OPNsense on a Chinese fanless router from eBay. I paid for an N100 and got an i3-1113 with dual-channel memory; it does everything I need with no issues, and it has helped me learn A LOT. However, if I had the $200 just laying around today, I would stick with Omada just for simplicity.
Jeff Geerling did a video on them, which got me super interested and thinking about how to implement and use them with family.
Proxmox is based on KVM/QEMU and is very light on resources. There is virtually no performance impact from the hypervisor, even on older processors, and CPU scheduling in the hypervisor makes running multiple VMs at the same time trivial as well. RAM and I/O bandwidth are the two things that can affect performance. Running out of RAM due to too many VMs will grind you to a halt, but so would running too many applications or containers on bare metal. Running everything off of one spinning SATA disk will make it miserable, but again, same downfall on bare metal.
Those minimal impacts to performance are a minor nuisance compared to the ability to run experiments and learn on sandboxed VMs. Now that TrueNAS has better virtualization support, it has caught my eye as a better homelab solution, but I will always have a Proxmox server running somewhere in my stack just for the versatility it gives me.
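To make the RAM-budgeting point concrete, here is a toy sketch; the host size, the reserve for the hypervisor, and the VM names and allocations are all hypothetical numbers, not Proxmox defaults:

```python
# Toy RAM budget for a hypothetical 32 GB Proxmox host.
HOST_RAM_GB = 32
HOST_RESERVE_GB = 2   # rough allowance for the hypervisor itself (assumption)

# Hypothetical VMs and their allocated memory in GB.
vms = {"nas": 8, "jellyfin": 4, "pihole": 1, "sandbox": 4}

used = sum(vms.values())
free = HOST_RAM_GB - HOST_RESERVE_GB - used
print(f"allocated {used} GB, {free} GB headroom")

# Allocating past physical RAM pushes the host into swap and grinds
# every guest to a halt -- same failure mode as overloading bare metal.
assert free >= 0, "over-allocated: shrink a VM or add RAM"
```

Same arithmetic applies to bare metal; the hypervisor just makes the per-guest allocations explicit.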
So if I see it on the “open web”, I’m free to use it however I please? Oh, right, I get thrown in jail and everything I own gets taken away.
If companies are people per Citizens United, why doesn’t the same apply to them?
Just my opinions here:
I have essential services running on a separate computer, an 8 GB Pi 4 right now. Stuff like netboot.xyz, Homepage, etc.: lightweight, low-resource services that need to be always up. That way, if your main server needs to go down, you still have those services running.
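As a sketch of keeping tabs on those always-up services, here is a tiny reachability check you could run from another box; the hostnames and ports are hypothetical placeholders, not real defaults:

```python
"""Minimal TCP reachability check for always-on homelab services.
Hosts and ports below are hypothetical -- substitute your own."""
import socket

SERVICES = {
    "homepage":    ("pi4.lan", 3000),   # hypothetical host/port
    "netboot.xyz": ("pi4.lan", 3001),   # hypothetical host/port
}

def is_up(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, (host, port) in SERVICES.items():
        print(f"{name}: {'up' if is_up(host, port) else 'DOWN'}")
```

A cron job running something like this from your main server is an easy way to notice when the Pi’s services drop out.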
I have bought second-hand enterprise equipment for most of the hardware I have. Basically anything with DDR4 and PCIe 3 or above will crush most things you would like to do. Grabbing an Intel chip with Quick Sync will help with Jellyfin, but you can add a graphics card for transcoding if you want; a Quadro P2000 or higher will be fine. Building is a viable option as well, but you may spend more for less powerful but more efficient hardware.
Software is probably the most controversial. I went with Proxmox on my main server, giving me the ability to run whatever I want whenever I want. It’s not perfect, but it gets the job done and has helped me learn A LOT. But flip a coin or roll a die on what software to run as a newbie; it will all be a learning curve, and everyone will tell you why what they use is superior.
Whatever you do, you’re not wrong. Run things that tickle your fancy and move at your pace. You’ll mess up, step back, and punt a lot. Remember to back up essential data before you wipe. Have fun, and good luck on your travels.
This is exactly what I was thinking. There are plenty of smart people who work there who would have said something before release. The yes-men told them not to rock the boat, and now Microsoft has to backpedal and pretend no one there thought about THOSE implications.
He touches on my major issue with all these companies: data mining without compensating the people who created that data. I have to pay for the operating system, get served ads, AND you get to make extra money off my information too? This kind of shenanigans would be tolerable with a free OS, or maybe one that compensated you like the Brave browser does. The blatant fleecing of the consumer here is sickening. I’m glad data mining your screenshots is the last straw for people.
Waiting on a merge, it looks like.
How about ROCm support for your own inference cards? I’ve got an Instinct MI25 I can’t do a damn thing with because it’s the only Instinct card ROCm does not support.
It has been. I started in this because I liked picking up kick-ass enterprise hardware really cheap and playing around with what it can do. Used enterprise hardware is so damn expensive now that it’s cheaper and easier to do everything with consumer products and use the RX 6700 in my gaming rig. I just don’t want that running LLMs and always on.
Picked up an AMD Instinct MI25 to try to do just that. I can get Easy Diffusion working after some cussing and voodoo, but I cannot get ROCm to run ANY LLM of any kind; it feels like a waste of video RAM.
I also have a Tesla P4 that runs most text-to-image models rather well, but I have been unsuccessful with any LLM on it too; even Oobabooga can’t seem to run on it.
I have given up because the software stack keeps advancing and leaving my hardware behind. I don’t have $3,000 for an A100 or $1,300 for an MI100, sooo… until the models can run on older/less powerful hardware, I’m probably sitting out of this game. Even though I’d love to be elbow deep in this one.
Very good, Joshua.