The Steam Deck was the motivation for the collaboration, since it is based on Arch Linux. But as a desktop client they only officially support Ubuntu, which makes level 1 tech support easier, as supporting every distro can be very complex.
Arch normally immediately updates to the latest version of every program
This is not true though. Arch packages new program versions as soon as they can - for popular packages this happens quickly, but not everything updates quickly. And when they do publish a new package, it goes to the testing repo for a short time before being promoted to the stable repos. If they notice a problem with a package, it will be held back until it can be solved. There is not a huge amount of testing done here, as that is very time consuming and Arch does not have enough manpower for it. But they also release very little that is broken. I have seen other distros like Ubuntu cause far more havoc with a broken update than Arch ever has.
The devs from ΔV: Rings of Saturn give a completely different story. Yeah, most bug reports come from Linux - but platform-specific ones are vanishingly rare: https://www.reddit.com/r/gamedev/comments/qeqn3b/despite_having_just_58_sales_over_38_of_bug/
Do you know how many of these 400 bug reports were actually platform-specific? 3. Literally only 3 things were problems that came out just on Linux. The rest of them were affecting everyone - the thing is, the Linux community is exceptionally well trained in reporting bugs. That is just the open-source way. This 5.8% of players found 38% of all the bugs that affected everyone. Just like having your own 700-person strong QA team. That was not 38% extra work for me, that was just free QA!
Not to mention the reports from the Linux users were vastly more detailed and useful to them.
One of the fabricated battery pouch cells was even able to work after being folded and cut off. “That proves its high safety for practical application,” the researchers emphasized.
If you can cut it in half and it still works I doubt piercing it will do much.
C is easier to get a program to compile. Rust is easier to get a program working correctly.
REPLs are basically shells. They behave the same way in every essential respect. So the real question is supported vs non-supported shells. But that is easy - non-supported shells fall back to just what normal commands do ATM to process input/output. Other applications like TUIs are also easy to deal with, as they already enter a different mode called raw mode - when an application requests that, it can do what they currently do: switch to a new buffer and give full control to that one application.
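To make "requesting raw mode" concrete: under the hood it is just the application flipping termios flags on its TTY, and the alternate buffer is a plain escape sequence. A minimal sketch using Python's `pty`/`termios` modules (Unix only), with a pty pair so it runs even when stdin is not a real terminal:

```python
import pty
import termios
import tty

# Open a pty pair so we have a TTY to inspect even when stdin is not one.
master_fd, slave_fd = pty.openpty()

before = termios.tcgetattr(slave_fd)
tty.setraw(slave_fd)  # roughly what vim/curses apps do on startup
after = termios.tcgetattr(slave_fd)

LFLAGS = 3  # index of the local-mode flags in the tcgetattr list
canon_before = bool(before[LFLAGS] & termios.ICANON)
canon_after = bool(after[LFLAGS] & termios.ICANON)

# In raw mode the line discipline stops "cooking" input: line buffering
# (ICANON) and automatic echo (ECHO) are cleared, so every keystroke
# goes straight to the application.
print("canonical mode before raw:", canon_before)
print("canonical mode after raw:", canon_after)

# Switching to the alternate screen buffer is just another escape
# sequence the application writes; the terminal restores the previous
# buffer contents when it sees the matching "leave" sequence.
ENTER_ALT = "\x1b[?1049h"
LEAVE_ALT = "\x1b[?1049l"
```

This is why TUIs are easy for a smarter terminal to handle: the mode switch is already explicit and visible to it.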
I can’t think of any, but I’m not the most creative person… what do you have in mind?
Having smarter scrollback that knows the difference between a prompt/command and the output would let you do quite a few things that would be nice to have. Such as collapsing the output so you only see the commands, or keeping the command pinned at the top of the screen even as its output scrolls off the top, so you can always see what is running. Extra support for other UI elements could be nice to have as well - like tooltip support for blocks or similar.
All the shell - or really any application - needs to do is tell the terminal which bits of the output are which. Like marking the start/end of the prompt, the command and the command output. Then the fallback is basically to ignore the markers and print things out like it currently does.
And those are just random thoughts I have had over the last few days. These can be implemented in backwards-compatible ways, I believe, and don't need special support for specific shells - we just need to expand the VT100 protocols to send more information between the terminal and the shells/applications that are running. Much like how color, cursor positioning, entering/exiting raw mode etc. are already done. Though I think some tight, specialized integration might be a good way to explore these ideas before the protocols are formalized.
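For what it's worth, markers along these lines already exist in draft form: the FinalTerm/OSC 133 "semantic prompt" sequences, which some terminals (iTerm2, kitty, WezTerm) understand. A small Python sketch of a shell annotating one prompt/command/output round trip - the helper names here are my own, only the escape sequences are the real OSC 133 ones:

```python
# OSC 133 semantic markers: A = prompt start, B = command start,
# C = output start, D;<exit-code> = command finished.
def osc133(code: str) -> str:
    return f"\x1b]133;{code}\x07"

def annotate(prompt: str, command: str, output: str, exit_code: int = 0) -> str:
    """Wrap one prompt/command/output round trip in semantic markers
    so the terminal can tell the parts apart."""
    return (
        osc133("A") + prompt
        + osc133("B") + command + "\n"
        + osc133("C") + output
        + osc133(f"D;{exit_code}")
    )

line = annotate("$ ", "echo hello", "hello\n")
# A terminal that understands the markers can collapse the output or pin
# the command; one that does not simply discards the unknown OSC
# sequences and prints the text as before - the fallback described above.
print(repr(line))
```

The fallback behaviour comes for free because terminals already consume OSC sequences they don't recognise without printing them.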
What I would love to see is a terminal that builds its own shell from scratch too, rejecting the ancient ideas we have with bash. I still love bash, but I'm curious what could come of it.
Thinking about it some more, I am not sure we would have to go that far. Well, not in the longer term - short term we might, for experimenting with ideas. I think one of the bigger problems ATM is that the terminal does not understand what a shell is or its component parts. Terminals just display characters, move the cursor around the screen and send keyboard (and now mouse) input back to the command they run. They are also aware of alternative buffers and raw input mode and know about echoing characters back to the screen.
If we extended the terminals to also understand some shell concepts like the prompt, commands being typed and output from the commands, and gave the shell some markers it could send along with these (like we do with color information ATM), the terminal would be able to use them to change how it displays each part, which would open up a lot of new and interesting features. You could even add things like tooltip support or actions on clicking some bits of the text.
I am starting to see these terminals as experiments in what features could be enabled if we were not stuck on the current VT100 protocols. Though whether, if these ideas ever get wider adoption, a generalisation of them gets baked back into the protocols is another question to consider.
Thinking about it some more, I don't think we would need to abandon the whole TTY to get a good set of these features. What a lot of the features basically require is more communication between the shell and the terminal. There is already some communication for basic things - like raw mode and alternative buffers, colors and even images. These are how TUI programs like vim or screen/tmux function, and how you can exit them without losing what was previously in the buffer.
I suspect markers for the prompt and the start/end of command output would enable a lot smarter virtual terminals with only some minor additions to the VT100 protocols. Possibly some extra data could be sent as well - optional tooltip data maybe, or even support for actions where clicking something sends a response back to the shell. A retry button on previous commands, for example.
There is quite a lot that could be done if terminals and shells had better protocols to communicate with each other. I don't think these will change overnight though, so seeing terminal emulators try out these features to find what people like/want to use is IMO a good thing, to see where we can take things in the longer term.
Would we have to abandon SSH or always X forward
No, we would not. At the end of the day a TTY is just an input and output pipe between the terminal and the program running in the shell, with a specific protocol: VT100 (or some bastardized version of it - looking at you, xterm). That is what network protocols are as well - just different protocols in play. So you can do a lot over that connection with changes to the protocols. No need for X forwarding at all.
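A quick sketch of the "TTY is just a byte pipe" claim, using Python's `pty` module: whatever escape sequences the application writes arrive at the terminal side as plain bytes, which is also why they survive an SSH hop unchanged - SSH just forwards the same byte stream.

```python
import os
import pty

# Open a pty pair: the "terminal" side (master) and the "application"
# side (slave). This is all a TTY fundamentally is - a bidirectional
# byte pipe with some line-discipline settings attached.
master_fd, slave_fd = pty.openpty()

# The "application" prints red text using a VT100/ANSI colour sequence.
os.write(slave_fd, b"\x1b[31mhello\x1b[0m\n")

# The "terminal" reads raw bytes. The kernel never interprets the escape
# codes; it only applies line-discipline tweaks (e.g. "\n" -> "\r\n").
# Interpreting VT100 is entirely the terminal emulator's job.
data = os.read(master_fd, 1024)
print(data)

os.close(master_fd)
os.close(slave_fd)
```

Since the protocol lives in the byte stream, not in the transport, extending it works over local ptys and SSH alike.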
Might I add the idea that your terminal emulator must support your shell is utterly ridiculous?
TBH I am starting to come around to the idea of tightly integrated shell and terminal emulator support. There are just things you cannot do with these being separate. I am very tempted to explore the idea from the other end though - writing a shell that has an emulator built into it (like screen/tmux basically do). But I do think this integration is needed for any per-command features beyond just printing out a prompt.
It would be interesting to see what could be done with this type of integration, but it will likely break support for existing shells. Unless you maybe launch a shell for each command you run or something 🤔. Would like to see more people experimenting with stuff like this to find what new things we could drive forward. We have been stuck with the current TTY system since like the 80s, to support devices that just don't exist anymore.
I think the issue fundamentally is that this isn’t what terminal emulators are. The terminal emulator initializes a TTY session and enters a shell environment (sh, zsh, fish, etc). The medium is text and cannot be anything else.
This is 90s thinking. Why must terminal emulators only be text and only do things that a physical terminal could? What makes terminals so nice is not that they work on 90s technology. Some terminal emulators can already display images, which is great. And the ideas they are introducing are still fundamentally text based, but are geared towards structuring that text a bit more than a constant stream of characters on the screen.
Skill issue. Pipe your output to something (like a file or the “less” command)
This is a convenience issue, not a skill issue. Yes, you can pipe output to things, but you need to know beforehand that you want to do that. With less you lose the output once you close it, and with files you have to clean them up after the fact. Both are inconvenient and need to be thought of before you run a command. IMO it would be nice to just run a command and then be able to collapse, expand or search its output after the fact, without having to preempt everything beforehand.
The argument that you can already do that in a much less convenient way is not a very good argument.
Konsole can display images, as can kitty, alacritty, WezTerm, iTerm2, etc.
They can now? I knew it was possible in some niche terminals but never knew it was as widespread as that.
but it’s not exactly game changing
None of these features on their own are game changing I agree. But lots of small nice to haves can end up being game changing overall. Again - I don’t think these terminals offer anywhere near enough to warrant their IMO massive downsides though. But I would love to see more innovation in the terminal emulator space.
Lastly, searching explicitly your last command for a term with context would be much better suited to the shell to solve as it’d be terminal independent.
I had a similar thought TBH. But the more I thought about it, the more I came to see that in order to do this nicely - i.e. with inline scrollback or being able to collapse command output like these terminals do - you would basically need to implement a terminal emulator inside the shell. Either way you are breaking down the wall between what a shell and a terminal emulator do. I would be interested in exploring this from the shell side, though I cannot fault them for doing it from the emulator side either.
couldn’t be solved at the shells level or with supplementary applications
I think the key benefit here is integration rather than the technical ability to do something. Making it easy and convenient goes a long way. There is a lot that can be made much nicer with things tightly integrated together than by trying to string up a bunch of disparate applications - even if you can do it, the integrated approach will give you a much more refined experience.
I doubt they’re outright rejecting any idea of progress
It sounds like an outright dismissal of new features to me.
That is the Luddite argument against progressing anything. There are many problems with current terminal emulators that these newer ones are trying to fix, to make the terminal experience better overall. Terminals as they currently work were designed to talk to dumb typewriters with a screen (that's right - not keyboards, digital typewriters). And they have barely changed at all since then.
Personally, looking at these terminals, they have a lot of niceties I would love to use. But IMO those benefits are not worth the costs these particular terminals have: one is closed source and requires an account, and the other is Electron - no benefit is worth that. But to bury your head in the sand and claim they have no benefits at all is wrong.
Being able to view images in the terminal would be amazing on its own - just like you can cat a text file. I would hate to have to launch a GUI program every time I wanted to see what was inside a text file, but that is exactly what I have to do for images or PDFs.
Being able to collapse the output of a command would be nice as well. It would save me quite a bit of annoyance from the number of times I have had to scroll for days to get to the output of a previous command because I happened to run a noisy one but still wanted to check what something had done earlier. And being able to search just the last command's output would be great - like an after-the-fact, interactive grep with context. Being able to quickly copy the output of the last command would also be great.
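To make the collapse idea concrete, here is a small sketch of how a terminal might model scrollback as per-command blocks once the shell emits begin/end markers. The marker format and all names here are illustrative, not any real terminal's implementation:

```python
from dataclasses import dataclass

# Hypothetical markers a shell could emit (OSC 133-flavoured for the sketch).
COMMAND_START = "\x1b]133;B\x07"
OUTPUT_START = "\x1b]133;C\x07"
OUTPUT_END = "\x1b]133;D\x07"

@dataclass
class CommandBlock:
    """One command and its output, as a unit the terminal can fold."""
    command: str
    output: str
    collapsed: bool = False

    def render(self) -> str:
        if self.collapsed:
            hidden = len(self.output.splitlines())
            return f"$ {self.command}  [+ {hidden} lines hidden]"
        return f"$ {self.command}\n{self.output}"

def parse_stream(stream: str) -> list[CommandBlock]:
    """Split a marker-annotated stream into per-command blocks."""
    blocks = []
    for chunk in stream.split(COMMAND_START)[1:]:
        cmd, _, rest = chunk.partition(OUTPUT_START)
        out, _, _ = rest.partition(OUTPUT_END)
        blocks.append(CommandBlock(cmd.strip(), out.strip("\n")))
    return blocks

stream = (
    COMMAND_START + "make all\n" + OUTPUT_START
    + "lots\nof\nnoisy\noutput\n" + OUTPUT_END
    + COMMAND_START + "ls\n" + OUTPUT_START + "src tests\n" + OUTPUT_END
)
blocks = parse_stream(stream)
blocks[0].collapsed = True  # fold the noisy command's output away
for block in blocks:
    print(block.render())
```

Once scrollback is a list of blocks rather than a flat character grid, collapsing, per-command search and "copy last output" all become simple operations on one block.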
Almost, but with one key difference. PPAs are precompiled binaries where you cannot inspect the source - you have to trust the maintainer of the PPA. The AUR is a repository of source packages which you can download and inspect yourself (or hope others have done so). This makes the AUR more community focused than PPAs, I feel. The AUR is also a central repo managed by people who don't own the vast majority of the packages hosted on it, and where packages can be taken down if found malicious. PPAs are lots of separate repositories, each managed by different people who generally maintain all the packages for their own PPA.
Though in both cases anyone can upload anything, so they are not 100% trustworthy. But I do think the way the AUR works puts it ahead of PPAs.
There is probably a forced arbitration clause and class action waiver in the TOS…
I believe that VKD3D can give you DirectX support. Proton should be able to run most games these days - it is essentially a bundle of Wine + VKD3D and other things, and is what Valve created to run games through Steam on Linux/the Steam Deck. https://www.protondb.com/ shows what is able to run on it, and that is most things that do not have some form of incompatible anticheat.
You might have more luck not using Wine directly (if that is what you are doing) and instead using things like Steam (you can add external games to it to run them in a Proton context), Lutris or Heroic Games Launcher.
The goal ATM is simply to allow people to write new drivers in Rust, not convert the whole kernel to Rust. It will be a very long time before more core parts are allowed to be written in Rust, let alone before any existing core kernel code is rewritten. Which is all fine, as new drivers are a large part of where bugs are introduced - older parts have had a long time for bugs to be found and fixed, so there is far less need to rewrite them.
Or wait for Rust to support the extra architectures, with LLVM adding new ones or projects like gccrs. But all of these options are a way out, and Rust will remain device-driver-only for a long time I suspect - it is still experimental after all. I would hope that as Rust in the kernel matures, so do the architectures that Rust supports.
Secure Boot is meant to help protect you against the evil maid attack - i.e. someone with physical access to your computer compromising your boot loader with a keylogger that captures your encryption password, so that when they return they can gain access to your computer as they now know your password. Though the vast majority of people just don't need to worry about that level of attack, so I have never really bothered with Secure Boot.
I would say it is more so they can advertise a lower price. But then expect you to get the more expensive ones as the bare minimum is just not enough.
Probably nothing. This is more Steam Deck related, since SteamOS is based on Arch Linux. And even then, it does not mean much for Steam Deck users - they won't notice much at all really. It might help with development a bit on Valve's end. The big news is really for Arch Linux users and maintainers, who will see more effort put into the development of the distro.
There is some wild speculation that maybe this makes ARM support for Arch Linux more official in the future. That is based off the other recent news that Valve is creating an ARM emulation layer for running games on ARM devices. Which means maybe they are working on an ARM device and need to start getting ARM support into Arch. Though again, this is all wild speculation.