

That’s just a remarkable level of talent and dedication. Mind blowing considering how much of it she did while working and studying.
I had a lot of problems when I used Ubuntu in the past. To be fair, that was 2009–2012 and it was a much less mature product. But whether it’s snaps, Unity, or Ubuntu One integrations, they always seem to be doing their own thing in a way that’s not particularly helpful.
I’ve had a much more “just works” experience with Fedora and Mint.
Which is all well and good, except that for now it’s just a baseless paranoid fantasy. And if that had been laid out up front I would have no notes.
Over here in reality, if Canonical deployed a closed-source, paid, spyware-laden version of its OS, it might take a little while for some of the server business to disappear, but they’d lose almost all their market share overnight. They’d be a cautionary tale in the FOSS community and the software industry.
I’m struggling to connect the dots between “X person used to work in electronic surveillance” and an immediate risk to the open source software being developed by a different employer. Is there some reason to think this person is still working for their old employer? Or is the speculation that they’re an ideologue out to destroy Linux from the inside?
If there’s something unsafe in the code, especially in a Rust rewrite of the coreutils, I’d expect it to be found immediately. People are going to go over that code with a fine-toothed comb.
If the central idea of the article is “I don’t think there’s a place in the FOSS community for people with different ideas/beliefs/history than me,” then the author should come out and say that (many have in the past). Claiming we’re at risk because of some wild speculation about a nefarious plot between the military and Microsoft to attack Linux and privacy really does require something firmer than this.
I think the hardware compatibility issues may be overstated. It seems (to me) that, apart from Apple silicon, support for most consumer hardware is pretty robust. That seems especially true of the kinds of hardware casuals use. I’m not a tester, but I haven’t seen a Dell, HP, or Lenovo with a hardware issue in ages.
While I think that could be really helpful, it is worth pointing out that schools in the US have been shoving Chromebooks into the hands of kids for over a decade and the market share sits at about 4%. Now Google’s planning to merge Chrome OS into Android.
I think the gap between what the average Linux user thinks is ease of use and what the average non-Linux user thinks is ease of use is probably much larger than many devs seem to understand.
I think it would be beneficial to have a completely idiot-proof installer that doesn’t ask you about partitions or formatting or basically anything: just point it at a drive and it sets up a default installation.
More GUI-based means of doing basic stuff. A casual who wants to access some photos from their laptop does not want to figure out how to manually configure Samba shares by editing config files in a terminal-based text editor.
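For context, this is roughly what that casual is being asked to do today: a minimal share stanza added by hand to /etc/samba/smb.conf. The share name, path, and username here are made-up placeholders, not anything from a real setup:

```ini
; Hypothetical example: expose a Pictures folder over the network.
; Added by hand to /etc/samba/smb.conf.
[photos]
    path = /home/alice/Pictures
    read only = yes
    guest ok = no
    valid users = alice
```

And that’s before setting a Samba password with `smbpasswd -a alice` and restarting the smbd service. A file-manager checkbox labeled “Share this folder” does the same job with none of that.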
I think codecs are a much bigger pain in the ass than is ideal. I understand there are legal reasons for this, but the first time some casual goes to play a video and gets an error message, their first thought may not be “let me search Google and figure out what this error message means”; their first thought may be “Linux sucks and can’t play videos.”
The permission structure that makes Linux so secure makes it a little annoying for casuals. For example: you actively and intentionally go to the default software store, navigate to the updates tab, update a package you’ve already installed and clearly want, and do so from the official OS repository… and this requires your password to protect you from what, exactly? It’s not a big deal; it takes one second to type my password. But how would you explain it to a casual in a way that makes sense? Your OS is protecting you from potentially rogue acts of official patches to your default text editor.
I think the folder structures are a pretty big challenge for converts. On Windows you can find most of the files associated with any given program in your Program Files folder. On a Mac there’s an Applications folder. On Linux… it’s somewhere, don’t worry about it. That’s not really fixable; it just is what it is.
I feel like there really are just 2 or 3 main distros for Linux adoption. Every article, forum, discussion, etc… it’s always Mint, followed by either Fedora or Ubuntu. IMO distro is less important for converts than desktop environment.
I think the most important thing for adoption is actually little quality of life stuff.
You don’t need a high level of technical skill. You can learn everything you need to get started in a few minutes of tutorials or walkthroughs. The rest you learn as you go.
Bear in mind not every Linux user has memorized every terminal command and the whole file structure. Lots of people are just casual users who learn what they need.
One of the things I wish someone had told me at the start of using Linux is that initially your desktop environment will affect how you feel about Linux more than the distribution or the specific architecture of the OS.
The good news is they’re all free. Try a few things and see what you like. IMO Fedora is a great, beginner-friendly GNOME or KDE experience. Mint has excellent Cinnamon and XFCE desktops, either of which will feel somewhat familiar to a Windows user. Mint will also run on just about anything.
Also, it’s not binary. You can dual boot. If there’s something you need Windows for, you can use it. Over time you’ll find that you don’t really need Windows anymore.
Imagine you’re finishing in 8K, so you want to shoot at a higher resolution to give yourself some options for reframing and cropping. I don’t think Red, Arri, or Panavision even make a cinema camera with a resolution over 8K; I think Arri is still 4K max. You’d pretty much be limited to Blackmagic cameras for 12K production today.
Plus there are the storage requirements for keeping raw footage redundantly backed up. Easy enough for a studio, but we’re YEARS from 8K being a practical resolution for most filmmakers.
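To put rough numbers on that, here’s a back-of-the-envelope sketch. Every parameter is an assumption I’m picking for illustration (an 8192×4320 sensor, 12-bit raw, 24 fps, roughly 8:1 compressed raw, and one redundant copy), not any specific camera’s spec:

```python
# Back-of-envelope storage math for 8K raw acquisition.
# All parameters are illustrative assumptions, not a real camera's spec sheet.

WIDTH, HEIGHT = 8192, 4320   # assumed 8K sensor resolution
BIT_DEPTH = 12               # assumed bits per photosite of raw data
FPS = 24                     # standard cinema frame rate
COMPRESSION = 8              # assumed ~8:1 compressed-raw setting
COPIES = 2                   # original plus one redundant backup

bits_per_frame = WIDTH * HEIGHT * BIT_DEPTH
# bits/frame * frames/hour, divided by compression ratio, converted to GB
gb_per_hour = bits_per_frame * FPS * 3600 / COMPRESSION / 8 / 1e9
total_gb_per_hour = gb_per_hour * COPIES

print(f"~{gb_per_hour:.0f} GB per hour shot, "
      f"~{total_gb_per_hour:.0f} GB per hour with redundancy")
```

Under those assumptions you’re looking at somewhere around half a terabyte per hour of footage before redundancy, and more than a terabyte per hour with a single backup copy; loosen any assumption (higher bit depth, lighter compression, more copies) and the totals climb fast.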
My guess is most of the early consumer 8k content will be really shoddy AI upscaled content that can be rushed to market from film scans.