Onno (VK6FLAB)

Anything and everything Amateur Radio and beyond. Heavily into Open Source and SDR, working on a multi band monitor and transmitter.

#geek #nerd #hamradio VK6FLAB #podcaster #australia #ITProfessional #voiceover #opentowork

  • 0 Posts
  • 54 Comments
Joined 4 months ago
Cake day: March 4th, 2024

  • I absolutely love the question and I’m going to attempt to answer it in a way that is not just reminiscing by an “old” internet citizen, but rather shares some of the magic and wonder that I have been fortunate enough to experience.

    My first time really connecting to the Internet was in 1990. I didn’t have my own account, so with permission I used the account that belonged to my boss at the time, Brian Murphy. He was a statistician and winemaker who had employed me to convert a statistics program (NANOVA) he had written for a mainframe into something that could run in Wingz, a desktop spreadsheet program that was new and exciting at the time.

    At the time, the way “the Internet” worked was much more fragmented than the almost integrated experience we have today. Protocols (ways of getting information) like “telnet”[1], “ftp”[2] and “finger”[3] were how you got around, using programs that only knew how to do one thing. All of it was text-only. If you’ve heard of “gopher”[4], it didn’t exist yet. The “Wide Area Information Server”[5] (WAIS) had only just been invented but hadn’t made it to my desk.

    You used text-only email much like today, but addressing required that you knew how to get your message from your system to the recipient, using a so-called bang path[6] addressing scheme. This was not fun, but it got the job done. You could use tools like “finger” to determine how to get email to a person, which was a great help but still non-trivial. It’s like putting an address on an envelope that says: send this message from Perth to Kalgoorlie, then to Adelaide, then to Sydney, then to Ultimo, then to Harris Street, then to number 500.
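
    Written as an actual bang path, that envelope example would look something like perth!kalgoorlie!adelaide!sydney!recipient (the host names are purely illustrative), with each machine in the chain handing the message on to the next name it knew how to reach.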

    Much simpler was to use “Usenet News”[7], a global messaging system where you connected to your local news server and participated in discussions, while behind the scenes your messages were shared with other news servers doing the same.

    So, I’m sitting at my desk in Brian’s office with a brand new Apple Macintosh SE/30. This is leading-edge hardware. I have a text window open that is emulating a terminal (probably a VT220[8]), and using telnet I’m connected to the local VAX cluster[9] that is running (among other things) our local news server.

    I am not certain, but I think that this is my first ever message. It’s 4 September 1990, I’m having an issue with MPW Pascal, and the piles of paper documentation surrounding me have no answers. There is no “Google” or anything like it at this point, so I had to find answers elsewhere.

    I found the message in one of the “comp” groups[10], “comp.sys.mac.programmer”, as opposed to an “alt” group[11] like alt.best.of.internet. These names are how you navigated the massive hierarchy of information that Usenet represents. Just like with domain names today, you build the name by adding more dot-separated components.

    In today’s terms this could be expressed as a Lemmy community or a Reddit sub. And just like with those today, each Usenet group was a community with its “in” jokes, people who knew what they were talking about and those who didn’t, the whole enchilada.

    Anyway, I posted to the group and asked a question about how to achieve the thing I wanted to fix. I went home and the next day I had a reply … from Brazil, where they too had discovered this issue and had found a solution.

    It … blew … my … mind.

    This started me on the journey I’m still on today. There is plenty more to tell to cover the 34 years since then. Perhaps a story for another day.

    I debated providing links to some of the things I mention, but given that links didn’t exist in 1990 and finding information was HARD, I thought it would be a nice ‘meta’ joke to include them.

    Today I am going to do something much more mundane: set up a backup job for a virtual server that was cloned from an older system, running a website and database on a cloud provider platform that I can use and access as if it’s sitting on my desk while it is thousands of kilometres away. If my fingers were small enough, I could do this from my mobile phone.

    So, yeah, things have changed.

    o

    [1] https://en.wikipedia.org/wiki/Telnet
    [2] https://en.wikipedia.org/wiki/File_Transfer_Protocol
    [3] https://en.wikipedia.org/wiki/Finger_(protocol)
    [4] https://en.wikipedia.org/wiki/Gopher_(protocol)
    [5] https://en.wikipedia.org/wiki/Wide_area_information_server
    [6] https://en.wikipedia.org/wiki/UUCP#Bang_path
    [7] https://en.wikipedia.org/wiki/Usenet
    [8] https://en.wikipedia.org/wiki/VT220
    [9] https://en.wikipedia.org/wiki/VMScluster
    [10] https://en.wikipedia.org/wiki/Comp.*_hierarchy
    [11] https://en.wikipedia.org/wiki/Alt.*_hierarchy







  • The business model of requiring paid credits in order to interact with bots is, in my opinion, a thing of sheer bastardry.

    Apparently, this is how it works: (*)

    Women were on the site for free; men were required to pay for and use credits in order to interact with women.

    It appears that there weren’t anywhere near the numbers of women claimed by the company. Instead bots would communicate with men, using their credits in the process.

    (*) I say “works” because apparently the company still exists today, and I’m not aware whether they ever admitted to using bots, let alone discontinued their use. The Netflix series goes into detail, which is where I got this understanding from.

    Disclaimer: I’m not a customer, have never been one and my comments are based on a single source as described above.


  • It’s a package management system in the same way that Flatpak, yum, apt-get, snap and dozens of others are.

    If you use MacOS and Linux, it’s not inconceivable that you might want to use the same package management system across both.

    I’ve used it, didn’t particularly warm to it and didn’t install it on my most recent MacOS install after it shat all over itself on a previous installation.

    I didn’t know that it was available for Linux. Not tempted to try.

    I’m a firm believer in apt-get and, failing that, Docker, with side journeys into podman.




  • In my opinion, you’re solving the wrong problem with the wrong solution.

    The user base for Canonical, Red Hat and SUSE is not the general public watching traditional TV and deciding to install Linux across their enterprise data centre; it’s ICT professionals who talk to other ICT professionals, read white papers and implementation guidelines, then pay installation, management and subscription fees to get ongoing support across their shiny new data centre.

    Growing the user base with mums and dads is not something that Linux vendors are interested in, since it only costs money instead of generating an income stream.

    Linux as a commodity comes from rolling out Android phones and tablets, from deploying embedded Linux on network routers, security cameras, in-car entertainment systems, set-top boxes, etc.

    The final hurdle for general desktop Linux is not resolved by getting more users through advertising, it’s through having a product that can be purchased. Chromebooks were promising, but missed the mark.

    System76 are trying, but the scale is too small and Linux isn’t ready as a general computing platform yet. I say that having been a Linux user for 25 years.

    If you don’t agree with that last statement, consider what every computer manufacturer would do at the drop of a hat if they thought it would be cheaper: they’d drop Windows like the hot mess it is.

    Unfortunately, it’s still cheaper to pay the Microsoft tax because the associated support network is already in place for the general public.

    That’s not there, yet, for Linux.

    It remains to be seen if it ever will be.






  • Are the people who work at OpenAI smoking crack?

    “Over the last year and a half there have been a lot of questions around what might happen if influence operations use generative AI,” Ben Nimmo, principal investigator on OpenAI’s Intelligence and Investigations team, told members of the media in a press briefing.

    Here’s a clue, look around you.

    ChatGPT isn’t the only fish in the sea, and state actors using a public service like it deserve to be caught. Running your own system privately, without scrutiny, without censorship, without constraints, is so trivial that teenagers are doing it on their laptops; so much so that you can docker pull your way into any number of LLM images.
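
    To give a sense of how low that bar is, here is a minimal sketch of pulling and running a local model with Docker. The ollama image and the llama3 model are just illustrative picks, one of many possible options:

      # pull and start a local LLM server (CPU-only example)
      docker pull ollama/ollama
      docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

      # download and chat with a model inside the running container
      docker exec -it ollama ollama run llama3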

    Seriously, this is so many levels of absurd that it’s beyond comprehension…




  • Not just burnout; opportunism also features among several users I’ve spoken with. The level of ignorance surrounding ChatGPT is staggering.

    One egregious use I know of was a developer who used it to write software to analyse a government dataset, despite their department having put in place specific and targeted restrictions against exactly such activities.

    Their workaround was to use their private email to exfiltrate data and subsequently introduce the code.

    Their rationale was that it didn’t harm anyone and their ICT department would vet any code. They were not concerned about this private data showing up on the ChatGPT public log, nor were they concerned about the accuracy of their code.

    I think that this is just the tip of the iceberg and I think it’s going to take a serious data breach of identifying information before people lose their jobs over this type of misuse.


  • My semi-immutable OS is based around a Debian installation where every application is installed in a separate Docker container.

    When you launch the application, it volume mounts an appropriate directory that contains only the data related to that application.

    Chrome, for example, launches with a single subdirectory inside ~/Downloads, so each instance can only see its own directory.
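
    As a rough sketch of what such a per-application launch can look like (the image name and container paths here are hypothetical, and a real wrapper would also deal with things like audio, GPU and X11 permissions):

      # launch a containerised browser that can only see its own download directory
      docker run --rm \
        -e DISPLAY="$DISPLAY" \
        -v /tmp/.X11-unix:/tmp/.X11-unix \
        -v "$HOME/Downloads/chrome:/home/user/Downloads" \
        my-chrome-image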

    I can also test compilation of random repositories inside a container, without affecting the underlying OS.

    The OS itself has only got a minimal Debian and Docker installed.

    Been using it for several years. I can’t recall when I last rebooted it.