• Tja@programming.dev

        We cannot have two standards, that’s ridiculous! We need to develop one universal standard that covers everyone’s use cases.

          • ABCDE@lemmy.world

            Can it use other connectors, and is there a benefit? USB-C makes a lot of sense: lower material usage, small, carries data and power, and connects to almost everything now.

            • BetaDoggo_@lemmy.world

              I believe USB-C is the only connector supported for carrying DisplayPort signals other than DisplayPort itself.

              The biggest issue with USB-C for display, in my opinion, is that cable specs vary so much. A cable with a Type-C end could carry anywhere from 60 to 10,000 MB/s and deliver anywhere from 5 to 240 W (rough tiers are sketched below). What’s worse, most cables aren’t labeled, so even if you know which spec you need, you’re going to have a hell of a time finding it in a pile of identical black cables.

              Not that I dislike USB-C. It’s a great connector, but the branding of USB has always been a mess.
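
              For a rough sense of that spread, here is a sketch of the published cable tiers; the tier names and figures are the usual spec maximums, not anything printed on a real cable:

              ```python
              # Illustrative only: rough maximum data rates and power levels a
              # USB-C cable might support, depending on which spec it was built
              # to. Any given unlabeled cable could be any of these.
              USB_C_CABLE_TIERS = {
                  "USB 2.0":           {"gbps": 0.48, "watts": 60},   # 60 MB/s
                  "USB 3.2 Gen 1":     {"gbps": 5,    "watts": 60},
                  "USB 3.2 Gen 2":     {"gbps": 10,   "watts": 100},
                  "USB4 (40 Gbps)":    {"gbps": 40,   "watts": 240},
                  "USB4 v2 (80 Gbps)": {"gbps": 80,   "watts": 240},  # 10,000 MB/s
              }

              for name, spec in USB_C_CABLE_TIERS.items():
                  mb_per_s = spec["gbps"] * 1000 / 8
                  print(f"{name:18s} {mb_per_s:8,.0f} MB/s, up to {spec['watts']} W")
              ```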

              • strawberry@kbin.run

                would be neat to somehow have a standard color coding. kinda like how USB 3 is (usually) blue; maybe there could be thin bands of color on the connector?

                better yet, maybe some raised bumps so visually impaired people could feel what type it is. for example, one dot for USB 2, two for USB 3, etc.
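
                Something like this hypothetical mapping; nothing like it exists in any USB spec, and the colors and dot counts here are invented purely for illustration:

                ```python
                # Hypothetical marking scheme for the suggestion above -- not part
                # of any USB spec; colors and dot counts are made up.
                MARKINGS = {
                    "USB 2.0": {"band": "white", "dots": 1},
                    "USB 3.x": {"band": "blue",  "dots": 2},
                    "USB4":    {"band": "red",   "dots": 3},
                }

                for gen, m in MARKINGS.items():
                    print(f"{gen}: {m['band']} band, {m['dots']} raised dot(s)")
                ```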

                • Flipper@feddit.de

                  Have you looked at the naming of the USB standards? No you haven’t, otherwise you wouldn’t make this sensible suggestion.

              • cum@lemmy.cafe

                Yeah, I have multiple USB cables, some at 30 W and some at 140 W. I get them mixed up all the time! More companies need to at least print the wattage on the connectors.

            • frezik@midwest.social

              There’s some really high-bandwidth stuff that USB-C isn’t rated for. You have to really push the limits, though; something like 4K + 240 Hz + HDR.

              • ABCDE@lemmy.world

                That doesn’t even seem so unreasonable. Is that the limit, though? My cable puts a gigabyte a second down it, so I wouldn’t have imagined that would hit the limit.

                • frezik@midwest.social

                  USB-C with Thunderbolt currently has a limit of 40 Gbit/s. Wikipedia has a table of what DisplayPort can do at that bandwidth:

                  https://en.wikipedia.org/wiki/DisplayPort

                  See the section “Resolution and refresh frequency limits”. The table there shows it’d be able to do 4K at 144 Hz with 10-bit color just fine, but can’t keep above 60 Hz at 8K.

                  It’s an uncompressed video signal, and that takes a lot of bandwidth, though there is a “visually lossless” compression mode (DSC).
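
                  A quick back-of-envelope check of those two claims, computing the raw bitrate as width × height × refresh × bits per pixel (this ignores blanking intervals and link-encoding overhead, so real requirements run somewhat higher):

                  ```python
                  # Raw (uncompressed) video bitrate for a given display mode.
                  def raw_gbps(w: int, h: int, hz: int, bpp: int) -> float:
                      return w * h * hz * bpp / 1e9

                  LINK = 40  # Gbit/s -- the Thunderbolt/USB4 link rate mentioned above

                  for label, mode in {
                      "4K 144 Hz, 10-bit color": (3840, 2160, 144, 30),
                      "8K  60 Hz, 10-bit color": (7680, 4320, 60, 30),
                  }.items():
                      need = raw_gbps(*mode)
                      verdict = "fits" if need < LINK else "needs DSC or lower settings"
                      print(f"{label}: {need:.1f} Gbit/s -> {verdict}")
                  ```

                  4K144 comes in around 36 Gbit/s raw, while 8K60 already needs the compression mode (or reduced-blanking tricks) to squeeze under 40 Gbit/s.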

                • GeniusIsme@lemmy.world

                  It is trivial arithmetic: 4.5 bytes/pixel × 240 Hz × 3840 × 2160 ≈ 9 GB/s. Not even close. Even worse, that cable will struggle to deliver even ordinary 4K at 60 Hz.
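
                  The same arithmetic in a couple of lines, which also shows why plain 4K60 is too much for a 1 GB/s cable:

                  ```python
                  # 4.5 bytes/pixel assumes 36-bit HDR color, as in the figure above.
                  for hz in (240, 60):
                      print(f"4K @ {hz} Hz: {3840 * 2160 * hz * 4.5 / 1e9:.1f} GB/s")
                  # 4K @ 240 Hz: 9.0 GB/s -- roughly 9x a 1 GB/s cable
                  # 4K @ 60 Hz:  2.2 GB/s -- still more than double it
                  ```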

        • admiralteal@kbin.social

          I love having mysterious cables that may or may not do things I expect them to when plugged into ports that may or may not support the features I think they do.

          • trafficnab@lemmy.ca

            If the implementation is so broad that I have to break out my label maker, can we even really call it a “standard”?

        • Player2@lemm.ee

          USB-C is just a connector; you might be referring to DisplayPort over USB-C, which is basically the same standard with a different connector on the end. That, or Thunderbolt, I guess.

        • trafficnab@lemmy.ca

          USB-C seems like a good idea, but in reality all it really did was take my 5 different, non-interchangeable, but visually distinct cables and make them all look identical and require labeling.

    • lolcatnip@reddthat.com

      As already mentioned, DisplayPort exists. The problem is adoption. Even getting DisplayPort adopted as the de facto standard for PC monitors hasn’t done anything to get it built into TVs.

        • Scrollone@feddit.it

          DisplayPort supports CEC.

          From Wikipedia:

          The DisplayPort AUX channel is a half-duplex (bidirectional) data channel used for miscellaneous additional data beyond video and audio, such as EDID (I2C) or CEC commands.
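
          If you want to see that side-channel data for yourself: on Linux the kernel exposes each connector’s EDID blob through sysfs, so a few lines of Python can decode it. The connector name below is an assumption; check /sys/class/drm/ for your actual card and port.

          ```python
          # Read and sanity-check a monitor's EDID as exposed by the Linux kernel.
          from pathlib import Path

          edid = Path("/sys/class/drm/card0-DP-1/edid").read_bytes()  # hypothetical port

          # Every EDID block begins with the fixed header 00 FF FF FF FF FF FF 00.
          assert edid[:8] == bytes.fromhex("00ffffffffffff00"), "no/invalid EDID"

          # Bytes 8-9 pack a three-letter manufacturer ID, 5 bits per letter.
          word = (edid[8] << 8) | edid[9]
          mfg = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
          print("Manufacturer:", mfg)
          ```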

          • wjrii@kbin.social

            When you’re trying to get into DPs, the outside can be slippery and the screw part can be tight! Very dangerous for the workplace.

  • Blackmist@feddit.uk

    So why is it rejected?

    Just because they’re still trying to use HDMI to prevent piracy? Who in fuck’s name is using HDMI capture for piracy? On a 24 fps movie, that’s 237 MB of data to process every second just for the video; a 2-hour movie would be 1.6 TB. Add the audio and you’d likely be over 2 TB.

    I’ve got a Jellyfin server packed with 4K Blu-ray rips that suggest there are easier ways to get at that data.
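
    The exact per-second figure depends on the pixel format you assume; here is the arithmetic with 8-bit 4:2:0 (the lightest pixel format HDMI commonly carries), which lands in the same ballpark:

    ```python
    # Uncompressed 4K 24 fps capture at 8-bit 4:2:0 (12 bits per pixel).
    width, height, fps, bpp = 3840, 2160, 24, 12
    bytes_per_sec = width * height * fps * bpp / 8

    print(f"{bytes_per_sec / 1e6:.0f} MB/s of raw video")             # ~299 MB/s
    print(f"{bytes_per_sec * 7200 / 1e12:.1f} TB for a 2-hour film")  # ~2.1 TB
    ```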

    • buddascrayon@lemmy.world

      The CEOs of the media companies are all fucking dinosaurs who still think VCRs should have been made illegal. You will never convince them that built-in copy protection is a dumb idea and a waste of time.

    • CCF_100@sh.itjust.works

      Even setting aside that HDMI capture is simply an awful way of obtaining that data, it’s even more pathetic that the “protection” can be defeated by a $30 capture card on Amazon…

    • lengau@midwest.social

      The profiles HDMI 2.1 enables make even less sense for piracy - 4K@120fps-type stuff. Not exactly something needed for a movie.

    • Kairos@lemmy.today

      HDMI Splitter + capture card.

      No video produced for a streaming service in the next 40 years will need HDMI 2.1 to display.

    • sarmale@lemmy.zip

      Can’t you compress the HDMI output in real time so that it ends up a normal size?

      • Blackmist@feddit.uk

        Sure. But why bother when you can rip it right from the disc in higher quality than you could ever hope to capture in real time?

    • dangblingus@lemmy.dbzer0.com

      Most people don’t pirate 4K media due to file size and internet speed constraints. Most people pirate 1080p video. There’s also the prospect of people pirating live television, which HDMI capture would be perfect for.

      • Psythik@lemmy.world

        Then most people need to get a better ISP. My crappy $60/mo fixed 5G can download an entire 4K film in under 10 minutes, or start streaming it within a second. Y’all should see if there are any options beyond cable and DSL in your town. You might be pleasantly surprised by what’s available these days.

        • nymwit@lemm.ee

          Is that not a compressed stream, though? Genuinely asking. A 4K Blu-ray rip and a 4K stream from a service (or whatever it saves for offline viewing in an app) are pretty different. I think capturing live 4K television and capturing a 4K Blu-ray as it plays are getting conflated; both might use an HDMI cable.

          • Psythik@lemmy.world

            I use Stremio and only stream full 4K Blu-ray rips, with HDR and Dolby Atmos and all, so nothing is recompressed. They’re 50-70 GB files, but they start streaming almost instantly.

            I have a poor 5G signal due to a tree blocking my view of the antenna, so I get anywhere between 400 Mbps and 1400 Mbps (I’m supposed to get a gigabit, but it’s usually closer to 500). Even with a poor signal, it’s still way faster than any other ISP in my town.
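
            The arithmetic backs that up: assuming a roughly 2-hour runtime, even a 70 GB rip only needs a fraction of that link.

            ```python
            # Sustained bitrate needed to stream a 70 GB file over ~2 hours.
            size_bits = 70 * 8e9
            runtime_s = 2 * 3600
            needed_mbps = size_bits / runtime_s / 1e6
            print(f"needs ~{needed_mbps:.0f} Mbit/s")                   # ~78 Mbit/s
            print(f"headroom at 400 Mbit/s: {400 / needed_mbps:.1f}x")  # ~5x
            ```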

    • LifeInMultipleChoice@lemmy.world

      Is there a reason DisplayPort has so many connection issues specifically on port replicators (docking stations), or a way to prevent them?

      In corporate environments I often find that you plug everything in over and over, unplug over and over, and check the connections a million times, before finally turning everything off, holding the power button on everything (kind of like an SMC reset), and then booting it all up exactly as you did originally, and the displays come up. Is this a result of the devices trying to remember a previous setup, or is there an easy way to avoid it?

      I’ve hooked up dozens of them and still ran into issues when a family member brought a setup home to work from while they were sick last week.

      • General_Shenanigans@lemmy.world

        We use Dell WD19 docks; not sure if you use something similar. Updated dock firmware and laptop drivers made a difference for us with connection issues. Sometimes you have to reset the dock to make it behave (disconnect dock power and USB-C, then hold the power button for just over 15 seconds). Sometimes the laptop NVRAM needs to be reset instead (for Dell: power off, disconnect all devices and power, then hold the power button for just over 30 seconds). Overall, though, no huge issues with DP specifically as long as the dock and laptop firmware are up to date. Third-party docks/replicators definitely have way more issues, though.

  • NoLifeGaming@lemmy.world

    I always thought DisplayPort was better anyway lol. Is there anything HDMI does or has that DisplayPort doesn’t?

  • Justin@lemmy.jlh.name

    This is really frustrating. It’s the only thing holding Linux gaming back for me, as someone who games on an AMD GPU and an OLED TV. On Windows, 4K120 works fine, but on Linux I can only get 4K60. I’ve been trying to use an adapter, but it crashes a lot.

    AMD seemed to be really trying to bring this feature to Linux, too. It’s tragic that they were trying to support us and some anti-open-source goons shot them down.

    • ichbinjasokreativ@lemmy.world

      I’m a bit confused by your comment. I have a 120 Hz monitor and use an AMD GPU on Linux without issues, connected from the DisplayPort on my GPU to the HDMI port on my monitor (because Samsung does not enable DDC on the DisplayPort input, for some reason).

        • 🐍🩶🐢@lemmy.world

          I have one too. Go take a look at Cable Matters. I’m able to play games at 4K120 with my Mac. See if something will work for you, and you can always send their customer support a message to ask questions.

          • Justin@lemmy.jlh.name

            Yeah, that’s the one I have. Maybe I’ll ask their support. It has the latest firmware, but it’s so flaky about doing high bandwidth.

        • Cosmic Cleric@lemmy.world

          I’m using an LG C2 OLED TV that doesn’t have DisplayPort.

          Connected via the display port on my GPU to the HDMI Port on my monitor

          ichbinjasokreativ seems to be saying that they connect to their viewing device via HDMI, the same as your OLED TV.

          Unless I’m missing something?

          Edit: You discuss this issue further down in the thread, so no need to reply.

          I could have saved myself the time of replying if I’d scrolled all the way through first and then backtracked, but that’s kind of unintuitive to do, especially in a phone browser.

    • Sudomeapizza@lemm.ee

      I’ve found that the issue, in my experience, is that X11 only supports a max of 4K60, but Wayland supports 4K120 and beyond. I don’t think the cable matters, as the same cable I’m using works on Windows at 4K160.

      • Celestus@lemm.ee

        The OLED TV probably only has HDMI. TVs don’t normally have DisplayPort.

  • nivenkos@lemmy.world

    This sucks, as all new TVs use HDMI 2.1 for modern features, and modern game consoles rely on those for 4K 60 Hz HDR, etc.

    So now Valve can’t just make their own home console with SteamOS for TVs directly (at least not one that supports the high-end features).

    • thedirtyknapkin@lemmy.world

      Easier piracy is the only risk, and once again they’re ruining the experience of legitimate customers to try to stop something they’ve had no success at even slowing down.

      • Acters@lemmy.world

        Even further, it made products more expensive to buy, thanks to all the dumb licensing fees the middlemen try to shoehorn in.

  • Godort@lemm.ee

    What’s the over/under that this was about preventing people from getting around HDCP using a modified driver?

  • csolisr@hub.azkware.net

    If we had to rely exclusively on non-proprietary protocols, I doubt GNU/Linux would have gone anywhere beyond the Commodore 64.

    • Riskable@programming.dev

      Linux never ran on the Commodore 64 (1982). That was way before Linux was released by Linus Torvalds (1991).

      I’d also like to point out that we do all rely on non-proprietary protocols. Examples you used today: TCP and HTTP.

      If we didn’t have free and open-source protocols, we’d all still be using Prodigy and AOL. “Smart” devices couldn’t talk to each other, the world of software would be 100-10,000x more expensive, and we’d probably have about 1/1,000,000th of what we have available today.

      Every little thing we rely on every day from computers to the Internet to cars to planes only works because they’re not relying on exclusive, proprietary protocols. Weird shit like HDMI is the exception, not the rule.

      History demonstrates that proprietary protocols and connectors like HDMI stick around only as long as they’re convenient, easy, and cheap. As soon as they lose one of those properties, a competitor springs up and eventually replaces the proprietary nonsense. It’s only a matter of time. This news about HDMI being rejected is just another shove, moving the world away from that protocol.

      There is actually one way for proprietary bullshit to persist even when it’s the worst option: when it’s mandated by government.

    • tabular@lemmy.world

      Why? Most software wasn’t proprietary before companies realized they could make more money at your expense (not all of that profit goes into making a better product).

      Given the choice between an uncomfortable dormitory and a comfortable jail, at least the residents of the former can improve their living quarters.

      • rtxn@lemmy.world

        The parent is right, though. Unix being proprietary is why the GNU project was started, and why the Linux kernel and the BSDs rose above it.