• 0 Posts
  • 23 Comments
Joined 1 year ago
Cake day: June 12th, 2023

  • I’m not arguing for anything in the post above, just pointing out that a broken (or badly repaired) insulin pump is genuinely more dangerous than having no insulin pump at all. That doesn’t have to count against the right to repair one: if you’ve got the right to repair an insulin pump and do so badly, you’re not legally forced to use it afterwards, just like I’ve got the right to inject all the insulin in my fridge with an insulin pen back to back, but am not legally forced to do so.

    I do think the right to repair should be universal, but as I also think medical devices should be paid for by the state, NHS-style, that would end up meaning the NHS could repair medical devices itself, and recertify them as safe, whenever it deemed that more economical than getting the manufacturer to repair or replace them. The NHS buys the devices, so it gets the right to repair them, and that saves the taxpayer money: even if nothing actually gets repaired in-house, it stops manufacturers price gouging for repairs and replacements, and if a manufacturer goes bust or refuses to repair something, there are still ways to keep things working. It doesn’t stop unqualified end users using their new right to repair their own medical devices and risking getting it wrong, but given the option of a free repair or replacement, most people would choose the safe, certified repair over their own bodge.


  • If you’ve got a broken insulin pump, assuming you’re in a country with a functioning healthcare system, you should have been given a spare pump with the original, and probably some insulin pens, so when one breaks, you fall back to the spare, and get given a new one to be the new spare (or could get the broken one repaired). Using the spare is completely safe.

    If you don’t have a spare, your sugars would go up over several hours, but you’d have a day or two to get yourself to a hospital, and potentially several days after that for someone to find you and get you there, so it’s not safe, but it’s also not something you’d die from if you had any awareness that there was a problem.

    If you’ve got an incorrectly-repaired pump, you could have it fail to give you enough insulin, and end up with higher sugars, notice the higher sugars, and then switch to the spare. That’d be inconvenient, but not a big deal. However, you could also have it dump its entire cartridge into you at once, and have your sugars plummet faster than you can eat. If you don’t have someone nearby, you could be dead in a couple of hours, or much less if you were, for example, driving. That’s much more dangerous than having no insulin at all.

    Prosthetic legs don’t have a failure mode that kills you, so a bad repair can’t make them worse than not having them at all, but insulin pumps do, so a bad repair could.



  • It’s easy to get pressured into thinking it’s your responsibility. There’s also the risk that an unhappy company will make a non-copyleft clone of your project, pump resources into it until it’s what everyone uses by default, and then add proprietary extensions so no one uses the open-source version anymore, which, if you believe in the ideals of Free Software, is a bad thing.





  • You can’t trust users to make informed decisions about cybersecurity, as most users don’t have the necessary background knowledge, so won’t think beyond “this popup is annoying me and has a button to make it go away” and “I am smart and therefore immune to malware”. Microsoft don’t want Windows to have the reputation for being infested with malware like it used to have, and users don’t want their bank details stolen. If something’s potentially going to be a bad idea, it’s better to restrict the decision to people capable of making it an informed one. That’s why we don’t let children opt into surgery or decide whether to have ice cream for dinner, and have their parents decide instead.

    The comment you’re quoting was replying to someone suggesting a warning popup, and saying that would be a bad idea, rather than suggesting the UEFI Secure Boot option should be taken away. You need at least a little bit more awareness of the problem to know to toggle that setting.


  • If you’re doing things properly, you’ll know your Microsoft account password or have it in a password manager (and maybe have other account recovery options available like getting a password reset email etc.), and have a separate password for the PC you’re locked out of, which would be the thing you’d forgotten. If someone isn’t computer-literate, it’s totally plausible that they’d forget both passwords, have no password manager, and not have set up a recovery email address, and they’d lose all their data if they couldn’t get into their machine.



  • If you give a chip more voltage, its transistors will switch faster, but they’ll degrade faster. Ideally, you want just barely enough voltage that everything’s reliably finished switching and all signals have propagated before it’s time for the next clock cycle, as that makes everything work and last as long as possible. When the degradation happens, at first it means things need more voltage to reach the same speed, and then they totally stop working. A little degradation over time is normal, but it’s not unreasonable to hope that it’ll take ten or twenty years to build up enough that a chip stops working at its default voltage.
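
    To make the voltage/lifetime trade-off concrete, here’s a toy back-of-the-envelope sketch using a generic exponential voltage-acceleration model; the constants are invented purely for illustration, and real silicon ageing (electromigration, TDDB, BTI) is far more complicated:

```python
import math

# Toy model: lifetime shrinks exponentially as voltage rises above nominal.
# GAMMA and BASE_LIFETIME_YEARS are made-up illustrative constants, not
# anything from Intel's (proprietary) reliability models.
GAMMA = 12.0              # hypothetical sensitivity, per volt
BASE_LIFETIME_YEARS = 20  # hypothetical lifetime at the nominal voltage

def expected_lifetime_years(overvoltage_v: float) -> float:
    """Expected lifetime given a sustained overvoltage above nominal."""
    return BASE_LIFETIME_YEARS * math.exp(-GAMMA * overvoltage_v)

for ov in (0.0, 0.05, 0.10, 0.20):
    print(f"+{ov:.2f} V over nominal -> ~{expected_lifetime_years(ov):.1f} years")
```

    The exact numbers are meaningless, but the shape isn’t: a small, sustained overvoltage multiplies the wear rate rather than adding to it, which is why a misbehaving voltage request can quietly burn through years of expected lifetime.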

    The microcode bug they’ve identified and are fixing applies too much voltage to part of the chip under specific circumstances, so if an individual chip hasn’t experienced those circumstances very often, it could well have built up some degradation, but not enough that it’s stopped working reliably yet. That could range from having burned through a couple of days of lifetime, which won’t get noticed, to having a chip that’s in the condition you’d expect it to be in if it was twenty years old, which still could pass tests, but might keel over and die at any moment.

    If they’re not doing a mass recall, and can’t come up with a test that says how badly an individual CPU has been affected before it’s so damaged that it’s no longer reliable, then they’re betting that most people’s chips aren’t damaged enough to die until after the warranty expires. There’s still a big difference between the three years of their warranty and the ten to twenty years that people expect a CPU to function for, and customers whose parts die after thirty-seven months will lose out compared to what they thought they were buying.





  • That would be annoying for people who work on files with a double extension for legitimate reasons, e.g. .tar.gz (see the sketch after the list below), and (this can’t be stressed strongly enough) Windows users do not pay attention to warning popups, so it wouldn’t actually help. Despite it being eighteen years since Windows Vista released, and therefore vanishingly unlikely that any given piece of software was written assuming that Windows didn’t have a permissions system, it’s still most people’s first troubleshooting step to try running things as admin, and you still get loads of people (including ones who should know better, e.g. ones who also use Linux and would never log in as root) who disable UAC as one of the first things they do when setting up a Windows install, and end up running everything as the equivalent of root just to suppress the mildly annoying popup when something asks for elevated permissions.

    So, your proposed popup:

    • would be annoying, including for legitimate uses
    • wouldn’t help, as anyone who already ignores the SmartScreen popup that shows up when running a dodgy application will ignore the new popup, too
    • would be disabled by huge swathes of users anyway
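
    To see why it would misfire for legitimate files, here’s a minimal sketch of the naive double-extension check such a popup would have to rely on (the filenames are hypothetical):

```python
from pathlib import Path

def has_double_extension(name: str) -> bool:
    """Naive rule: flag any filename carrying two or more suffixes."""
    return len(Path(name).suffixes) >= 2

# The deceptive case the popup is meant to catch...
print(has_double_extension("invoice.pdf.exe"))  # True
# ...but perfectly legitimate archives trip the same rule...
print(has_double_extension("backup.tar.gz"))    # True
# ...while ordinary files pass.
print(has_double_extension("report.docx"))      # False
```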


  • It’s a silly flag to use, as it only works when running 32-bit Windows applications on 64-bit Windows, and if you’re compiling from source, you should also have the option to just build a 64-bit binary in the first place. It made a degree of sense years ago, when people actually used 32-bit Windows sometimes (which was usually just down to OEMs installing the wrong version on prebuilt PCs that could have supported 64-bit), if you really wanted to ship only one binary or you consumed a precompiled third-party library and had to match its architecture.
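
    Assuming the flag being discussed is MSVC’s /LARGEADDRESSAWARE (the description matches: it only changes anything for 32-bit binaries running on 64-bit Windows, where it unlocks a 4 GB user address space instead of 2 GB), here’s a minimal sketch that checks whether an existing binary has the bit set by reading its COFF header; the path is hypothetical:

```python
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # documented COFF characteristics bit

def is_large_address_aware(path: str) -> bool:
    """Read a PE file's COFF header and test the LARGEADDRESSAWARE bit."""
    with open(path, "rb") as f:
        data = f.read()
    # Offset 0x3C of the DOS header points at the "PE\0\0" signature.
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]
    if data[pe_offset:pe_offset + 4] != b"PE\0\0":
        raise ValueError("not a PE file")
    # Characteristics is the last field of the 20-byte COFF header, 18 bytes
    # in: Machine (2) + NumberOfSections (2) + TimeDateStamp (4) +
    # PointerToSymbolTable (4) + NumberOfSymbols (4) + SizeOfOptionalHeader (2).
    characteristics = struct.unpack_from("<H", data, pe_offset + 4 + 18)[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

print(is_large_address_aware("some_app.exe"))  # hypothetical binary
```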


  • It doesn’t necessarily work that way, though. If tests tell you immediately that you broke something, you don’t have time to forget how anything works, so identifying the problem and fixing it is much faster. For the kind of minor bug that’s potentially acceptable to launch a game with, if it’s something tests detect, it’s probably easier to fix than it is to determine whether it’s viable to just ignore it. If it’s something tests don’t detect, it’s just as easy to ignore, whether that’s because there are no tests or because, despite there being tests, none of them cover the situation.
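
    As a generic illustration of that feedback loop (the function and test are hypothetical, not from any real game):

```python
def fall_damage(height_m: float) -> int:
    """Hypothetical game rule: no damage below 3 m, then 10 HP per metre."""
    return max(0, int((height_m - 3.0) * 10))

def test_fall_damage():
    # Runs in milliseconds on every change, so a regression in this rule is
    # flagged while the edit is still fresh in the author's head, instead of
    # surfacing weeks later in manual playtesting.
    assert fall_damage(2.0) == 0
    assert fall_damage(3.0) == 0
    assert fall_damage(5.5) == 25
```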

    The games industry is rife with managers doing things that give developers a worse time and have the opposite effect to their stated goals. A good example is crunch. It can genuinely help to do extra hours right before a launch when there’s the promise of a holiday afterwards to recuperate, but it’s now common for games studios to be in crunch for months or years at a time, despite the evidence being that after a couple of weeks, everyone’s so tired from crunch that they’re less productive than if they worked normal hours.

    Games are complicated, and building something complicated in a mad rush because of an imposed deadline is less effective than taking the time to think things through, and typically ends up failing or taking longer anyway.