• bluelander@lemmy.ml · ↑15 · 20 hours ago

    I’m a pretty avid old video game enjoyer and own multiple CRTs. Also had the pleasure of owning and maintaining a 19" Sony PVM until I traded it to a friend for a mountain of GBA games. Still keep a 13" hooked up for the occasional VHS or old game.

    That said, I feel like a lot of the old sentiment toward emulation and modern display tech is rooted in internet opinions from 2010 or earlier.

    Yes, an old, cheap LCD with a Bluetooth controller and dated emulation tech pales in comparison to a Super Nintendo hooked up to the cheapest CRT ever. But both display tech and emulation tech have come a long way. High-quality upscalers, ultra-deep blacks, low-latency game modes, insane refresh rates, FPGA, RetroArch run-ahead, cycle-accurate emulators, and a dozen other breakthroughs have made retro gaming on modern panels extremely enjoyable.

  • ubergeek@lemmy.today · ↑9 · 20 hours ago

    Dear god, no.

    HOWEVER!!!

    What I’d like to see are CRT-esque LCDs with the proper lenses, and blur, to emulate a CRT display.

  • Tattorack@lemmy.world · ↑19 ↓1 · 1 day ago

    No please. I want CRTs to stay in the past. The sound they make always gave me a headache, or just irritated me.

    • bigfondue@lemmy.world · ↑5 ↓1 · 5 hours ago

      That sound is above 15 kHz. If you’re old enough to remember CRTs, there’s a pretty good chance you can’t hear it anymore.

      • Tattorack@lemmy.world · ↑2 · 2 hours ago

        I can, unfortunately. My younger brother bought a small CRT a couple of months ago because he’s getting into retro gaming. He hardly touches the thing, but I can hear it across the apartment whenever he turns the damn thing on.

    • harmsy@lemmy.world · ↑2 · 18 hours ago

      The tinnitus in my right ear sounds exactly like a CRT. Imagine being stuck with that sound for life.

  • YeahIgotskills2@lemmy.world · ↑16 · 1 day ago

    I wouldn’t use my Amiga, ST, ZX Spectrum or Mega Drive on anything other than my CRT. They were designed for that pixel blur, and playing on a modern TV is just not the same. I hadn’t realised the difference it made until I tried it, and now I can never go back to using an LCD for any of my 80s/90s devices.

    However, beyond that somewhat niche use, CRTs are otherwise entirely pointless and basically a worse display experience in every conceivable way when your source is anything produced after the advent of HDMI/DisplayPort.

    • bluelander@lemmy.ml · ↑1 · 21 hours ago

      Have you tried any of the various shaders available? I find that a good shader set gets so close to my CRTs that I honestly can’t tell the difference. I have a RetroTINK for hardware scaling and it also has very good shader options.

  • rozodru@piefed.social · ↑36 ↓2 · 2 days ago

    Look I miss CRTs too but no, don’t bring those things back.

    I used to go to LAN parties at the local university and… man, no. In the middle of winter? Yeah, very nice and toasty. But in the humidity-filled Canadian summers? Eff off, buddy.

  • earthworm@sh.itjust.works · ↑55 ↓1 · 2 days ago

    It will not. The article is nostalgia and hopium-baiting.

    Restarting a mass-manufacturing production line for something like the once-ubiquitous CRT TV would require a major investment that, so far, nobody is willing to front.

    Meanwhile LCD and OLED technology have hit some serious technological dead-ends, while potential non-organic LED alternatives such as microLED have trouble scaling down to practical pixel densities and yields.

    There’s a chance that Sony and others can open some drawers with old ‘thin CRT’ plans, dust off some prototypes and work through the remaining R&D issues with SED and FED for potentially a pittance of what alternative, brand-new technologies like MicroLED or quantum dot displays would cost.

    Will it happen? Maybe not. It’s quite possible that we’ll still be trying to fix OLEDs and LCDs for the next decade and beyond, while waxing nostalgic about how much more beautiful the past was, and the future could have been, if only we hadn’t bothered with those goshdarn twisting liquid crystals.

    • orclev@lemmy.world · ↑54 ↓4 · 2 days ago

      It’s also utter garbage. We abandoned CRTs because they sucked. They’re heavy, waste tons of space, guzzle power, and have terrible resolution. Even the best CRT ever made is absolutely destroyed by the worst of modern LCDs. The only advantage you could possibly come up with is that in an emergency you could beat someone to death with a CRT. Well, that and the resolution was so garbage they had a natural form of antialiasing, but that’s a really optimistic way of saying they were blurry as shit.

        • Frezik@lemmy.blahaj.zone · ↑6 · 21 hours ago

          If you are measuring fairly, then CRTs do have input lag. You have to take into account the time it takes to stream a frame of video from the connector to the beam to complete the draw. Not doing this is giving CRTs an unfair advantage.

          The industry standard is to take the input lag measurement from the middle of the screen. A complete NTSC frame takes about 16.7 ms to stream, so the middle of the screen is reached at half that. That is, even if you press a button to shoot the moment before the frame is drawn and the software miraculously updates the scene before it’s streamed out, a CRT on NTSC still has about 8 ms of input lag.

          For PAL, it’s about 20ms to draw a frame, or 10ms of input lag.

          Which is interesting, because a lot of LCDs have around 2ms pixel response time. The difference between NTSC and PAL input lag is also 2ms.
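          The mid-screen figures above fall straight out of the field rates; a quick illustrative sketch (the function name is mine, using the standard 59.94 Hz NTSC and 50 Hz PAL refresh rates):

```python
# Mid-screen input lag for a CRT: by the industry-standard convention,
# lag is measured at the middle of the screen, i.e. half the time it
# takes to stream one full frame out to the beam.

def mid_screen_lag_ms(refresh_hz: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz  # time to draw one complete frame
    return frame_time_ms / 2.0           # beam reaches mid-screen halfway in

print(f"NTSC: {mid_screen_lag_ms(59.94):.1f} ms")  # ~8.3 ms
print(f"PAL:  {mid_screen_lag_ms(50.0):.1f} ms")   # 10.0 ms
```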

        • nuggie_ss@lemmings.world · ↑4 ↓3 · 1 day ago

          There is literally 1 game I can think of that uses CRTs for competitions, and I guarantee nobody in this thread plays it at a level to require one.

          Motion blur will always be subjective.

          I remember when we were transitioning from CRTs to “flat screens.” Everyone and their grandma wanted a flatscreen.

          Anyone yearning for a return of CRTs that doesn’t play that one game is most likely trying to be quirky rather than fulfill an actual need. It’ll be fun for 1 or 2 sessions because of the novelty, and then never get used again.

          • YiddishMcSquidish@lemmy.today · ↑3 · 21 hours ago

            Holy shit, the memories! I got one of those wide flat-screen Sony guys out of a trash pile the garbage men left, I guess because it was too heavy. I grabbed my homie from down the street and we carried it the couple hundred yards to my house at about 50 feet a minute, stopping to rest. God, I wish I had that back again! (Both the TV and my actual teenage musculature.)

      • Buffalox@lemmy.world · ↑9 · edited · 2 days ago

        Absolutely. In the beginning there were pros and cons, with cheap TN LCDs having seriously annoying display issues.
        But with better LCD technologies like IPS arriving and improving fast, together with lower prices, there is no doubt that today even a cheap IPS display is way better than any CRT could ever be: better clarity, colors, and blacks, and even less ghosting, because CRTs definitely have ghosting too.
        Back in the day, my Sony 29" CRT TV weighed about 60 kg without speakers (the speakers could be detached).
        And CRT weight grows steeply with size, because with a bigger screen the glass needs to be thicker to withstand the significant pressure of the vacuum in the tube.
        So a 60" CRT TV would most likely weigh above 250 kg! The tube alone would be more expensive to make than an entire modern TV of similar size.
        But more than that, it would be very difficult to make a 60" CRT screen that doesn’t flicker, and the extreme speed needed for the beam to cross the entire screen would require enormous power to light the phosphor surface within the few nanoseconds it has for each pixel. Even normal full HD, 1920x1080 at 60 frames per second with 3 RGB subpixels per pixel, is 373,248,000 subpixels per second, so an analogue signal that needs to control the cathode ray at that speed would require enormous power.
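        The subpixel-rate arithmetic is easy to check; a small sketch (variable names are mine, assuming full-HD 1920x1080 at 60 Hz with RGB subpixels):

```python
# How many phosphor subpixels per second a hypothetical full-HD CRT
# beam would have to light, and how little time it gets for each one.

width, height = 1920, 1080   # full-HD pixel grid
refresh_hz = 60              # frames per second
subpixels_per_pixel = 3      # R, G, B phosphor dots per pixel

rate = width * height * refresh_hz * subpixels_per_pixel
ns_per_subpixel = 1e9 / rate

print(f"{rate:,} subpixels per second")          # 373,248,000
print(f"{ns_per_subpixel:.2f} ns per subpixel")  # ~2.68 ns
```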

        The result would be a 200 kg+ TV with smeared, blurry images and very poor color quality due to the inherent imprecision. And even with the clever tricks to make tubes slimmer, developed near the end of CRT popularity, it would need almost a meter of clearance from the wall to make room for the huge cathode ray tube.

        There is no way CRT is making a comeback. CRT is inferior in every way, at every display size, including blackness, contrary to what the article claims.

        Edit PS:
        https://en.wikipedia.org/wiki/Sony_PVM-4300
        The biggest CRT ever made was 43" and weighed 199.6 kg (440 lb).
        So a 60" would weigh way above 250 kg.
        Also notice that even this prestige project by Sony does NOT have a black screen, so the idea of perfect blacks on CRTs, as the article claims, is pure idiocy.

      • vacuumflower@lemmy.sdf.org · ↑4 ↓4 · 23 hours ago

        and have terrible resolution

        Now-now. With CRTs resolution is not an inherent trait anyway. You could trade off update frequency for better resolution and back.

        They’re heavy, waste tons of space, guzzle power,

        When CRTs were common, LCD displays also were heavy, wasted tons of space and guzzled power. And for some time after that they were crap for your eyes.

        Even the best CRT ever made is absolutely destroyed by the worst of modern LCDs.

        No, the best CRT ever made is really not that, but also costs like an airplane’s wing.

        Well, that and the resolution was so garbage they had a natural form of antialiasing, but that’s a really optimistic way of saying they were blurry as shit.

        An LCD has a fixed native resolution as a trait. A CRT has a range of resolutions realistically usable with it. It doesn’t have a matrix of pixels, only a phosphor surface at which electrons are shot.

        So, the point before I forget it. While CRTs as they existed are a thing of the past, it would be cool to have some sort of optical display based on interference (say, two lasers at the sides of the screen) or similar: something allowing the same agile resolution changes, more energy efficiency than LCDs, and less eye strain. I think some even exist, just very expensive. Removing the “one bad pixel” failure mode would do wonders. It could also be a better technology for foldable displays: today, if you scratch a screen, you have to replace the whole matrix, whereas here the screen surface would be cheap to replace and the lasers would be the expensive part.

        Anyway, just dreaming.

        • hamsterkill@lemmy.sdf.org · ↑3 · edited · 17 hours ago

          it would be cool to have some sort of optical displays based on interference (suppose, two lasers at the sides of the screen) or whatever, allowing similarly agile resolution change, and also more energy-efficient than LCDs, and also better for one’s eyes. I think there even are some, just very expensive

          I think you’re just describing laser projection TVs (though the projection is generally from the front or back). They’re not that expensive, just huge. For their size, they’re much cheaper than LCDs and OLEDs, but they only come in sizes around 100 inches and up.

          https://www.walmart.com/ip/Hisense-L5H-4K-UHD-Ultra-Short-Throw-Laser-TV-Projector-with-100-Light-Rejecting-Screen-Dolby-Vision-Dolby-Atmos-Google-TV/5003861077?classType=REGULAR

          Scanning laser projection is also used in virtual retinal displays, but that’s for stuff like HUDs or a head-mounted display since it projects on (or rather - into) a person’s eye instead of a screen.

          Any kind of scanning display will probably have poor latency compared to LCD/OLED flat panels, though.

            • vacuumflower@lemmy.sdf.org · ↑1 · 2 hours ago

            Yes, except with part of the screen itself being the optical medium, bent light and all that. So that it wouldn’t have to be huge. I’m thinking about portable, foldable, rollable things … Not sure.

  • Brickfrog@lemmy.dbzer0.com · ↑11 · edited · 2 days ago

    My Sony Trinitron served me well back in the day - But no, I don’t miss the CRT era. Just too huge and heavy. And honestly I don’t remember the generic non-Trinitron CRTs being anything special, they were kind of shitty.

    Anyways, I thought the CRT thing was just collectors and old-school gamers looking to display older media on a proper CRT? Obviously people with a lot of space: garages, basements, etc. People in tiny rooms and apartments need not apply, LOL.

    This whole article seems a bit off.

    • orclev@lemmy.world · ↑11 ↓1 · 2 days ago

      Literally the only reason old-school gamers play on CRTs is that old games were designed for the blurry, low-resolution displays they provided and so look kind of bad on modern crisp displays. You could just smear Vaseline on a modern LCD and get roughly the same effect, but using a CRT is less messy.

      • AstralPath@lemmy.ca · ↑2 ↓1 · 1 day ago

        The look of CRT is important to retro gaming but do you know what the most important characteristic of CRTs for retro gaming is?

        No input lag.

        Play OG Super Mario Bros on a modern TV and let me know how long it is before you wanna smash the controller in frustration. The game just feels incredibly sloppy.

        • MonkderVierte@lemmy.zip · ↑2 ↓1 · edited · 23 hours ago

          No input lag.

          That’s wrong. The display is only a small part of the chain there, and the player’s reaction time is leagues higher still. Btw, a lot of perceived lagginess is level design and psyche.

          • AstralPath@lemmy.ca · ↑3 ↓1 · 21 hours ago

            It’s not wrong. You can feel it.

            My wife is not a gamer and even she can feel it. She hated playing on our living room TV. Said she felt like she got really bad at Mario Bros over the years or something and was disappointed.

            Bought a CRT; she loves the game again and is still quite good at it actually.

            Reacting to a stimulus is completely different from timing inputs in a video game. A few ms of delay won’t really register in a reaction test, but if you’re using constant, time-sensitive information on screen to accurately time your movements in a game, you can easily feel lag in the sub-5ms range.

            As a guitarist, I can feel latency down to 2ms when playing through a modeling amp on my PC, especially at high tempos. The faster you play, the greater the fraction of the time between notes that the latency takes up. The effect is the same in high-speed video games.

            • Frezik@lemmy.blahaj.zone · ↑3 ↓2 · 21 hours ago

              It is wrong. I have a comment elsewhere in the thread breaking it down, but the short of it is that you have to include the time it takes to stream the frame to the beam, or else you’re giving CRTs an unfair advantage in measurements.

              A good gaming monitor with something like the Framemeister, RetroTINK, or OSSC can give properly unnoticeable amounts of input lag.

              • AstralPath@lemmy.ca · ↑1 · 18 hours ago

                A good gaming monitor with something like the Framemeister, RetroTINK, or OSSC can give properly unnoticeable amounts of input lag.

                Ok, so wait a second here. You’re suggesting that buying a “good” gaming monitor (hundreds to thousands of dollars) and an upscaler (the cheapest of the options you mentioned I found for $369 USD) is a better option than buying a CRT?

                I found a perfectly good 28" Panasonic CRT on Kijiji for $200 CAD.

                It makes the retro noises, it displays the games the way they were meant to be displayed, and there’s no perceptible input lag. It also just fits the visual aesthetic if you have a retro gaming area/room in your house. There’s no way I’m paying anywhere near $500-600 USD (up to $1k CAD, basically) to play retro games on a modern monitor when I can have a setup faithful to the experiences I had as a kid in the 90s for $200 CAD.

                • Frezik@lemmy.blahaj.zone · ↑1 · 17 hours ago

                  Good gaming monitors are not expensive. Anything that supports G-sync should be fine. I picked one up off Craigslist for like $100. New isn’t much more.

                  CRTs are not going to stay cheap for long. They’re slowly dying to attrition. If nothing else, the phosphors are dimming over time. In fact, we might be getting towards the end of cheap garage sale pickups already. Especially ones that can be easily RGB converted.

        • captainlezbian@lemmy.world · ↑3 · 1 day ago

          Yeah, Super Smash Bros. Melee is the peak example of this. It looks fine on an LCD, and if you suck at it, it’s fine, but there’s a skill level where your TV hinders your ability to improve, and you probably aren’t even winning local competition nights at that point.

        • ferret@sh.itjust.works · ↑1 · 1 day ago

          A modern TV is a really bad example. Most modern gaming monitors have grey-to-grey pixel response times of a millisecond or two. I would not be surprised if that beats the fade time of CRT phosphors.

            • Lfrith@lemmy.ca · ↑1 · 8 hours ago

              Still important for console gaming depending on the game, so some console gamers will opt for a monitor over the TV; PvP games and rhythm games are examples. I got Taiko and it felt off until I hooked my Switch up to the monitor.

              It’s more that people will use and adapt to what they have, as opposed to it not being relevant. Even TV gamers will try to get sets with lower input lag in game mode. But a monitor is still the preferred route.

  • Frezik@lemmy.blahaj.zone · ↑2 ↓2 · edited · 21 hours ago

    What might be useful is high quality, low latency 720p displays. A console that outputs 240p can have everything tripled to get 720p. Some of the effects applied by things like the OSSC, like scanlines, look pretty good when they do a 3x scaling.

    Most old consoles output something close enough to 240p (which was never a real standard, anyway). For the ones that aren’t quite on, upscaling can be done cleanly with only minimal blank space around the frame.

    1440p is 2x 720p, so that works, too. You don’t need your upscaling processor to be as powerful if you stick to 720p, though.
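    The clean scaling described above is plain pixel replication: 240p tripled gives 720p, and 720p doubled gives 1440p. A toy sketch (the function name is mine, with nested lists standing in for pixel buffers):

```python
# Integer (nearest-neighbor) upscaling: each source pixel becomes an
# s-by-s block, so a 320x240 frame at s=3 becomes 960x720.

def scale_integer(frame, s):
    """frame: list of rows, each row a list of pixel values."""
    out = []
    for row in frame:
        wide = [px for px in row for _ in range(s)]  # repeat horizontally
        out.extend([wide[:] for _ in range(s)])      # repeat vertically
    return out

src = [[1, 2],
       [3, 4]]
big = scale_integer(src, 3)  # 2x2 -> 6x6, every pixel now a 3x3 block
assert big[0] == [1, 1, 1, 2, 2, 2]
assert big[3] == [3, 3, 3, 4, 4, 4]
```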