For three years, there has been an open bug report about 4K@120Hz being unavailable via HDMI 2.1 on the AMD Linux driver.

The wait continues…

  • f00f/eris
    239 points · 3 months ago

    This really bothers me. Closed standards locked behind a licensing fee may as well not be standards at all, in my opinion.

    • TurboWafflz
      70 points · 3 months ago

      I don’t understand why any hardware uses HDMI anymore anyway. What does it have that DisplayPort doesn’t?

      • @Dudewitbow@lemmy.zip
        69 points · 3 months ago (edited)

        The HDMI Forum was founded by the companies that own the home theatre environment (mainly movie companies and television makers), who put DRM on HDMI to make it harder to illegally copy content like movies, so they will always want to be anti open source; that’s what the streaming services and movie businesses demand. It’s why, for example, mobile devices have Widevine levels. Those levels basically determine how “unlocked” the device is, and services will refuse to offer full functionality to unlocked devices because of it, be it audio or video.

        Members of VESA, which controls the DisplayPort standard, are generally computer companies that are mostly not in the business of media, so they value specs over DRM when making changes; one example use case is that DisplayPort allows for daisy-chaining displays.

        • @nivenkos@lemmy.world
          36 points · 3 months ago

          The DRM is so stupid: now, in the era of streaming, you can get literally anything webripped day 1.

          DRM is obsolete (and it always was, tbh).

          • @Dudewitbow@lemmy.zip
            13 points · 3 months ago

            It’s the attempt that matters more to investors than the pirates. It’s why a shit ton of games have Denuvo, even if the version of Denuvo they use has already been cracked. It’s not there for the end user; it’s there for the investors, to show they are at least attempting to fight off piracy.

            • @leopold@lemmy.kde.social
              5 points · 3 months ago

              Denuvo is actually very effective, relatively speaking. Several popular games that use it have never been cracked. They haven’t made it impossible, just sufficiently difficult and tedious that no one wants to bother.

            • @Auli@lemmy.ca
              1 point · 3 months ago

              Isn’t DRM in games working, though? With Denuvo only being cracked by one person, it sounds like a win for the corporations to me.

              • @Dudewitbow@lemmy.zip
                1 point · 3 months ago

                It’s working in the sense that it slows piracy down. However, the way Denuvo works, there are generations of Denuvo, and once one game in a generation gets cracked, there’s a handful that will be cracked with it. If a company is using an older generation of Denuvo, you may well see day 1 cracks, which ultimately means the company paid Denuvo for nothing. But the point is, Denuvo wasn’t meant to stop piracy first; it was meant to appease investors that require Denuvo to be implemented.

        • @n3m37h@lemmy.dbzer0.com
          13 points · 3 months ago

          I don’t know a single person who has ever used HDMI to steal copyrighted content. Seriously, who would rip a 2 hr movie by watching it versus the 10 min it takes to rip a movie digitally?

          Like shit, ya got CAM, WebRip, BRRip and scene releases. I doubt HDMI was used in any of those scenarios.

      • @MiltownClowns@lemmy.world
        50 points · 3 months ago

        Decades of being the standard in A/V. That’s like asking, why don’t we get rid of gas stations and just install electric chargers? Well, everybody’s got gas-powered cars.

        • TurboWafflz
          19 points · 3 months ago

          AV things, sure, since they stick around longer, but computers? When was the last time you saw a high-end GPU with VGA or DVI? And they already have mostly DisplayPort, with just one or two HDMI ports.

          • @MiltownClowns@lemmy.world
            22 points · 3 months ago (edited)

            Well, I wasn’t referring to that ecosystem. That ecosystem is already on DisplayPort. The reason HDMI is so prevalent is because it’s the standard in audio-visual equipment. Why would I talk about computer equipment when it’s not the standard there?

            The point still stands. Everybody has equipment that has HDMI, and to phase out that standard in equipment going forward is phasing out equipment people already own.

            • @MonkderZweite@feddit.ch
              1 point · 3 months ago (edited)

              and to phase out that standard in equipment going forward is phasing out equipment people already own.

              And where’s the problem in that? My parents still use a plasma TV that’s almost 20 years old. But they’re getting old too.

          • krolden
            7 points · 3 months ago

            Computers are AV things.

          • Dog
            1 point · 3 months ago

            Today. Every time I go downstairs.

        • TimeSquirrel
          9 points · 3 months ago

          HDMI only had about four good years to itself before DisplayPort showed up. In contrast, the RCA port stuck around for damn near 100 years.

      • @Flaky@iusearchlinux.fyi
        22 points · 3 months ago

        Probably a lot more hardware using HDMI than DisplayPort? Just throwing out a guess, tbh.

        That being said, I might consider looking towards DisplayPort when I can get a new monitor…

      • @virr@lemmy.world
        9 points · 3 months ago

        CEC (technically I think DisplayPort could support it, but it generally isn’t implemented) and Ethernet up to 100 Mbps.
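
        For instance, on Linux you can drive CEC through libcec’s cec-client tool. A rough sketch, assuming a CEC-capable GPU or USB-CEC adapter with libcec installed (the TV sits at CEC logical address 0):

          import subprocess

          def cec_command(cmd: str) -> str:
              # -s: single-command mode (read one command from stdin)
              # -d 1: keep log output to a minimum
              result = subprocess.run(
                  ["cec-client", "-s", "-d", "1"],
                  input=cmd, capture_output=True, text=True,
              )
              return result.stdout

          print(cec_command("standby 0"))  # put the TV into standby
          print(cec_command("on 0"))       # wake it back up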

      • @narc0tic_bird@lemm.ee
        7 points · 3 months ago

        Feature-wise probably next to nothing, and it’s usually a generation or two behind in terms of bandwidth. HDMI is often the only port available on TVs though, so GPU makers likely can’t afford to just leave it out.

        • @Grass@sh.itjust.works
          8 points · 3 months ago (edited)

          They should anyway. New TVs are all smart these days, and the dumb ones are built with tech from two decades ago. At this point we are better off with a PC monitor and separate speakers; built-in speakers are shit, seemingly as a requirement. I use a video port switch to get extra inputs without needing to use the on-screen menus or running out of built-in ports.

        • Hyperreality
          2 points · 3 months ago (edited)

          Yep. Very common.

          A lot of people use their PC like a console or media server, i.e. use it to watch/play stuff from their bed or couch.

        • @SuperIce@lemmy.world
          3 points · 3 months ago

          Your info is outdated. DP 2.0 is 80 Gbps and can do 4K@240Hz without Display Stream Compression. It can do up to 16K@60Hz using DSC.
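
          A rough sanity check of those figures from the raw pixel math alone (this ignores blanking intervals and line-coding overhead, so real link requirements run somewhat higher):

            def raw_gbps(width, height, hz, bpc=10, channels=3):
                # uncompressed video data rate in Gbit/s
                return width * height * hz * bpc * channels / 1e9

            print(f"4K@120: {raw_gbps(3840, 2160, 120):.1f} Gbps")  # ~29.9, beyond HDMI 2.0 (18 Gbps)
            print(f"4K@240: {raw_gbps(3840, 2160, 240):.1f} Gbps")  # ~59.7, beyond HDMI 2.1 (48) but within DP 2.0 (80)

          Which is also why 4K@120 needs HDMI 2.1 in the first place.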

      • @Catsrules
        1 point · 3 months ago

        My guess is it has something to do with the DRM protection in the HDMI spec. I have no proof, but it seems like it is always DRM that screws over open source.

  • @Sentau@discuss.tchncs.de
    5 points · 3 months ago (edited)

    So I see people on the Phoronix forums complaining that this is a bad thing because they have TVs which are HDMI-only. From what I read, the HDMI 2.1+ spec is only needed to support extreme cases like 4K@120Hz and above. So my question is: how many people are there who have a TV old enough to lack DisplayPort, but of that outrageous specification?

    Edit: it seems I was mistaken in thinking that new TVs have DisplayPort.

    • @Catsrules
      5 points · 3 months ago

      So my question is: how many people are there who have a TV old enough to lack DisplayPort, but of that outrageous specification?

      As far as I know, no consumer TV has DisplayPort.

      I bought a TV maybe 2–3 years ago that supports 4K@120 and it doesn’t have DisplayPort, only HDMI.

    • ruffsl (OP)
      4 points · 3 months ago

      I’m using a recent 42" LG OLED TV as a large, affordable PC monitor in order to get 4K@120Hz + 10-bit HDR, which is great for gaming or content creation that can appreciate the screen real estate. Anything in the proper PC monitor market that’s similarly sized, or even slightly smaller, costs way more per unit of screen area at feature parity.

      Unfortunately, such TVs rarely include anything other than HDMI for digital video input, despite the growing trend of connecting gaming PCs in the living room, e.g. over fiber-optic HDMI cables. I actually went with a GPU with more than one HDMI output so I could display to both TVs in the house simultaneously.

      Also, having an API as well as a remote to control my monitor is kind of nice. Enough folks are using LG TVs as monitors in this midsize range that there are even open-source projects to entirely mimic conventional display behaviors:

      I also kind of like using the TV as a simple KVM with fewer cables. For example, with audio I can independently control the volume and mux the output to either speakers or multiple Bluetooth devices from the TV, without having to fiddle around with re-pairing Bluetooth peripherals to each PC or gaming console. That’s particularly nice when swapping from playing games on the PC to watching movies on a Chromecast with a friend over two pairs of headphones, while still keeping the house quiet for the family. That kind of KVM functionality and connectivity is still kind of a premium feature for modestly priced PC monitors. Of course, others find their own use cases for hacking the TV remote APIs:
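
      To give a flavor of the kind of call those remote-control projects make, here is a rough sketch against the WebSocket (“SSAP”) interface that LG webOS TVs expose; the pairing handshake and payload details vary by firmware, so treat the address, port, and key below as assumptions:

        import asyncio
        import json

        import websockets  # pip install websockets

        TV_IP = "192.168.1.50"  # assumption: the TV's LAN address
        CLIENT_KEY = "..."      # pairing key the TV issues on first registration

        async def set_volume(level: int) -> None:
            async with websockets.connect(f"ws://{TV_IP}:3000") as ws:
                # Re-register with a previously paired client key; first-time
                # pairing instead pops a confirmation prompt on the TV screen.
                await ws.send(json.dumps({
                    "type": "register",
                    "payload": {"client-key": CLIENT_KEY},
                }))
                await ws.recv()  # registration response
                await ws.send(json.dumps({
                    "id": "volume_1",
                    "type": "request",
                    "uri": "ssap://audio/setVolume",
                    "payload": {"volume": level},
                }))
                print(await ws.recv())  # request status

        asyncio.run(set_volume(15))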

    • @cobra89@beehaw.org
      2 points · 3 months ago

      TVs don’t have DisplayPort. I just bought a new TV, and none of the options I looked at had DisplayPort.