These days, kids identify them by the aspect ratio.

  • jet@hackertalks.com

    And video quality. Watching some historical videos from my childhood, like TV shows on YouTube… the quality is pure potato. Either the archiving is terrible, or we just accepted much worse quality back then.

    • Hypersapien@lemmy.worldOP

      People always said that Betamax was better quality than VHS. What never gets mentioned is that regular consumer TVs at the time weren’t capable of displaying the difference in quality. To the average person they were the same.

      • jeffw@lemmy.world

        You kinda can tell, though. CRTs didn’t really use pixels, so it’s not like watching on today’s video equipment.

          • zero_gravitas@aussie.zone

            What they’re referring to is that analogue CRTs don’t really have a fixed horizontal resolution. The screen has a finite number of horizontal lines (i.e. rows), which the beam steps through at a fixed rate, but as it scans across each line the signal is essentially continuous (limited by the bandwidth of the signal and the radius of the beam). This is why screen resolutions are referred to by their vertical resolution alone (e.g. 360p = 360 lines, progressive scan [as opposed to interlaced]).

            I’m probably wrong on the specifics, but that gives the gist and enough keywords to find a better explanation.

            [EDIT: A word.]
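
The line structure of analogue TV described above can be made concrete with a small sketch. This is my own illustration, not from the comment; the numbers are the standard nominal NTSC/PAL figures.

```python
# Nominal analogue TV timings. Vertical resolution is a hard line count,
# while horizontal detail is limited by signal bandwidth and beam spot size.
FORMATS = {
    "NTSC": {"total_lines": 525, "visible_lines": 480, "fields_per_sec": 60},
    "PAL":  {"total_lines": 625, "visible_lines": 576, "fields_per_sec": 50},
}

def line_rate_hz(fmt: str) -> float:
    """Horizontal scan rate: each interlaced field draws half the lines."""
    f = FORMATS[fmt]
    return f["total_lines"] / 2 * f["fields_per_sec"]

print(line_rate_hz("NTSC"))  # 15750.0 (nominal; broadcast NTSC is ~15734)
print(line_rate_hz("PAL"))   # 15625.0
```

The point: the visible line count is baked into the scan pattern, but nothing in the format dictates how many distinct picture elements fit along one of those lines.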

        • NuPNuA@lemm.ee

          CRT screens definitely used pixels, but they updated per horizontal line rather than per pixel. This is why early flatscreen LCDs were worse than CRTs in a lot of ways: they had much more motion blur, because “sample and hold” meant a pixel wasn’t refreshed every frame if its colour info didn’t change. CRTs gave you a fresh image every frame regardless.
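
The sample-and-hold point can be put into rough numbers. This is a back-of-the-envelope sketch with made-up example figures, not a citation: when the eye tracks a moving object, the image smears for as long as the display holds the frame.

```python
def perceived_blur_px(speed_px_per_s: float, hold_time_s: float) -> float:
    # The eye tracks the moving object, but a sample-and-hold display keeps
    # the frame frozen for hold_time_s, smearing it across the retina.
    return speed_px_per_s * hold_time_s

speed = 960.0  # example: an object crossing the screen at 960 px/s
lcd_blur = perceived_blur_px(speed, 1 / 60)   # 60 Hz LCD holds the whole frame
crt_blur = perceived_blur_px(speed, 0.0015)   # CRT phosphor flash lasts ~1.5 ms
```

The LCD smears the object over roughly 16 px, versus under 2 px for the brief CRT phosphor flash, which is why early flat panels looked so much blurrier in motion.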

          • Psyduck_world@lemmy.world

            I’ve heard that pixels on CRTs are round and on LCD/LED they’re square, and that’s the reason aliasing isn’t very noticeable on CRTs. Is this true, or just more internet BS?

            • NuPNuA@lemm.ee

              They’re not round per se, but they aren’t as sharp, so more light bleeds from one into another, giving a natural anti-aliasing effect. This is why some old games, where the art was designed to account for this blurring, look wrong when played on pixel-perfect modern TVs.

      • fuckwit_mcbumcrumble@lemmy.world

        VHS was capable of decent quality; people just had a lot of bad equipment.

        Some TV shows (if they were crazy) were shot on film, so you could re-digitize them now in 4K or 8K and they’d look amazing. But there was also a lot of junk out there.

        And as others have mentioned, if you do an awful job of digitizing, you can take something that looked good and throw all that quality away. And if the tape wasn’t stored in good condition, it can struggle to be digitized properly in the first place.

    • Capt. Wolf@lemmy.world

      There’s a lot of archival video that is just terrible. Digital video compression has damaged a lot of old footage that’s been shared over the years, especially through YouTube’s encoders, which will just straight up murder a video to save bandwidth. There’s also a lot of stuff that just doesn’t look great when it’s upscaled from magnetic media that was 320×240 at best.

      However, there’s also a lot of stuff that was bad to begin with and just took advantage of things like scanlines and dithering to make up for poor video quality. Take old games, for example. A lot of developers took advantage of CRT TVs to create shading, smoothing, and the illusion of a higher resolution than the console was actually capable of. There’s a lot of contention in the retro gaming community over whether games looked better with scanlines or look better now without them.


      Personally, I prefer them without. I like the crisp pixelly edges, but I was also lucky enough to play most of my games on a high quality monitor instead of a TV back then. Then emulators, upscaling, and pixel smoothing became a thing…
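
The “crisp pixelly edges” versus smoothing tradeoff comes down to the upscaling filter. Here is a minimal nearest-neighbour sketch (my own toy example, not taken from any emulator): each source pixel becomes a solid block, which preserves hard edges, whereas smoothing filters interpolate between neighbouring pixels.

```python
def nn_scale(img, k):
    """Nearest-neighbour integer upscale: each pixel becomes a k-by-k block."""
    out = []
    for row in img:
        wide = [px for px in row for _ in range(k)]  # repeat horizontally
        out.extend(list(wide) for _ in range(k))     # repeat vertically
    return out

sprite = [[1, 0],
          [0, 1]]  # a tiny 2x2 checker "sprite"
print(nn_scale(sprite, 2))
# [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
```

Note the scaled checker keeps perfectly hard transitions; a CRT’s beam spot, or a bilinear filter, would blend those boundaries instead.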

    • Dandroid@dandroid.app

      I watch a lot of hockey, and games from the 2000s are full-on potato. I don’t remember them looking that bad back then.

        • NuPNuA@lemm.ee

          All sports have. There’s also the rise of faster-refresh LCDs; those early flat screens blurred a lot.

  • 🇨🅾️🇰🅰️N🇪@lemmy.world

    When I was a kid I used to think black and white meant the TV show or whatever used to be in color, but it turned black and white as it got old. My thought process was that they changed color just like old people’s hair turns grey… This was 35 years ago, before the internet.

      • barnsbauer@lemmy.world

        This video was exactly what first came to mind when I read “badly understandable dialogues”! It bothers me that as we got better mics, the actors became more unintelligible, instead of the other way around as one would predict.

      • Send_me_nude_girls@feddit.de

        Sure, microphones got better, but there is more to it. One huge factor is that films are mixed for cinemas, not for home theaters, or worse, TV speakers.

        • AggressivelyPassive@feddit.de

          No, the video actually goes into that. Directors think it’s “more real” to have mumbled dialogue. But they seem to misinterpret that as “more mumble = more good”.

          • CeruleanRuin@lemmy.world

            It’s a combination of both. Studios will typically mix the end result for the highest-end sound setups, which most people don’t actually have. If you’re lucky enough to have a full surround setup with the ability to properly dial in the equalizer and other settings, you probably won’t have a problem hearing the dialogue even when it’s mumbled. But on conventional TV speakers, it can easily get lost in the mix.
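
There’s a concrete mechanism behind dialogue getting lost like this: when a surround mix is folded down to stereo (or played through TV speakers), the dedicated centre channel carrying the dialogue is attenuated, typically by about 3 dB, before being added to the fronts. A rough sketch using the common ITU-style downmix coefficient; this is an illustration with example levels, not any studio’s actual chain:

```python
import math

ATTEN_3DB = 1 / math.sqrt(2)  # ~0.707, i.e. a -3 dB pad

def fold_to_stereo(left, right, center, coeff=ATTEN_3DB):
    """Fold a dialogue-carrying centre channel into the front pair."""
    return (left + coeff * center, right + coeff * center)

# Dialogue mixed at a modest level for a discrete centre speaker now has to
# compete with effects and music already occupying the front channels.
l_out, r_out = fold_to_stereo(left=0.8, right=0.8, center=0.5)
```

Dialogue that sat comfortably in its own speaker ends up padded down and buried under whatever the front channels were already carrying.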

    • Carighan Maconar@lemmy.world

      I noticed when watching Good Omens on Amazon Prime that they offer a language option “Original + Dialogue Boost”.

      It works wonders. It almost feels like back in the day, when TV shows wanted dialogue to be understood.

    • drz@lemmy.ca

      I think most people have given up and just leave subtitles on all the time.

    • Kiosade@lemmy.ca

      I hear this all the time, and maybe I just don’t watch THAT many shows/movies, but I haven’t come across anything where the actors sound like they’re mumbling. Do you have a few examples I could look up?

    • CeruleanRuin@lemmy.world

      I’ve used subtitles for most of my adult life, ever since having kids. First it was so I could watch without waking the baby, and then it was so I could follow along over all the noise in the house. And I never went back. So as sound mixing changed and got muddier, I guess I didn’t notice, because I was already used to not being able to hear half the dialogue anyway.

  • NuPNuA@lemm.ee

    Even early 16:9 stuff looks pretty dated now if it hasn’t been remastered to 1080p/4K.

  • 🇰 🔵 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net

    Lotta old shows are re-formatted just to get the wider screen, since they would still film at higher res for movies, or just because. It’s not just an indication of age if something is still only in 4:3; it’s an indication of thrift, or just a general lack of giving a shit about the future.

  • rm_dash_r_star@lemm.ee

    You can always tell when a show is 4:3 aspect. Recently I’ve noticed some modern TV shows adopting the theatrical aspects of flat (1.85:1) or scope (2.4:1), which I think is pretty cool. The last episode of Strange New Worlds I watched was in scope; that’s some high-end filming.
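
For a sense of what those ratios mean on a 16:9 panel, a quick bit of arithmetic (my own sketch; 1920x1080 is just the common example resolution):

```python
def letterboxed_rows(screen_w: int, content_aspect: float) -> int:
    """Rows of the panel actually filled when wider content is letterboxed."""
    return round(screen_w / content_aspect)

# On a 1920x1080 (16:9, i.e. ~1.78:1) panel:
print(letterboxed_rows(1920, 1.85))  # flat:  1038 of 1080 rows are picture
print(letterboxed_rows(1920, 2.4))   # scope:  800 of 1080 rows are picture
```

So a scope presentation gives up roughly a quarter of the panel to black bars, which is exactly the cinematic look those shows are going for.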

    • CeruleanRuin@lemmy.world

      SNW is really top-tier production quality across the board. The camera work, the sound, music, design, everything is goddamned impeccable, and that extends to the post-production. So much thought goes into every part of it, and I really have to give Paramount kudos for enabling that level of attention to detail in all aspects of the franchise right now. If I’d told a fellow Trekkie in the 90s that we would ever see the day, they would have laughed.

      • Hazdaz@lemmy.world

        Radio vs TV for Boomers

        B&W vs Color for Gen X

        SD vs HD for Millennials

        4K vs HD for Zoomers

        • drz@lemmy.ca

          I’m a millennial and I can’t really tell the difference between SD and HD. Do you mean like when YouTube switches to 360p instead of 1080?

    • bufordt@sh.itjust.works

      That’s a big problem for stuff that was originally shot on video. Old stuff shot on film can look pretty good when digitized.

      • CeruleanRuin@lemmy.world

        But then you also have that very specific window of time when a lot of stuff, especially SFX, was done on video that can’t be upscaled. Babylon 5 fans weep.

  • perviouslyiner@lemm.ee

    Asteroid City switched between aspect ratios, as well as between color and black & white, as it swapped between the TV story and the ‘real’/cinema story.

  • PhiAU@lemmy.world

    Re-watching Buffy the Vampire Slayer with my kids in new hi-def, you can clearly and easily see the stunt doubles, and the SFX look really dated now that you can see them properly.

    It’s amazing what old CRTs would let you get away with.

  • some_guy@lemmy.sdf.org

    I identified them by awkward haircuts and clothing styles. I knew something was off / wrong, but it wasn’t until adulthood that I was able to piece it together.

  • balance_sheet@lemmy.world

    I was a private tutor a few years ago, teaching a 16-year-old. I was 22.

    I still can’t forget his face, looking at me like I was a living fossil as I talked about how crazy it was to have a touch-screen phone for the first time…

    • Grimlo9ic@kbin.social

      That’s such a trip. Only a 6-year difference between the two of you, yet you experienced the dawn of something and they didn’t, and it shapes both of your perspectives so much.

      Even though it technically applies to transistors, Moore’s Law has been a good barometer for the increase in complexity and capability of technology in general. And now, because of your comment, I’m thinking that since the applicability of that law seems to be nearing its end, either tech will stagnate in the next decade (possible, but I think unlikely), or we may be due for another leapfrog into a higher level of sophistication (more likely).

    • Treczoks@lemmy.world

      Retaliate! Hand them a rotary phone and ask them to order a pizza.

      Bonus: If they actually manage to phone someone, ask them to send an SMS with it next ;-)

        • Treczoks@lemmy.world

          > Can’t send a text with a landline though.

          Of course, that was the joke ;-) Doesn’t hurt to watch them try to figure it out.