Forum - View topic: Answerman - Why Is Old Anime Still Released In Interlaced Format?
Note: this is the discussion thread for this article.
Shiflan
Posts: 418
As far as I am aware, that's exactly the issue.

I don't do much retro gaming myself, but even with my limited dabbling in the hobby I've run into both of the problems I described earlier. I bought a GameCube about a year ago because I wanted to play some old favorites on it. The last time I had a GameCube, it was hooked up to a Sony flatscreen CRT TV and looked beautiful; connected to my (relatively speaking) new HDTV, it looks downright awful.

I was also gifted a device called a RetroPie: basically a small computer that comes pre-loaded with various retro emulators. I found platformers very difficult to play because the lag between the console and the image appearing on the TV makes delicate jumping very hard. You see on screen that you are still standing on an edge, but in reality you've already fallen into a pit and don't yet realize it, so by the time you press the jump button it's too late. I can play emulated platformers like the various Mario or Mega Man games with no problem on an actual PC emulator, but once the signal goes through my TV, the lag makes jumping pits or dodging things very hard.

I had a similar problem playing the FFX/FFX-2 HD remaster. The graphics looked great on the HDTV, but there was a noticeable lag in the control inputs. I had trouble with the various mini-games (especially lightning dodging in FFX and the gunshot timing in X-2), whereas when I first played the old SD version on an analog TV, I had no issues with them. Clearly this sort of thing isn't a problem for every game, but it is for some games and situations.

I have a few friends who are really hardcore into retro gaming, and what they do is track down used or refurbished studio monitors: high-end CRTs meant for use in film studios and TV broadcast applications. Apparently they provide the very best performance for old-school consoles that only had analog output, and while they cost several thousand dollars new, they can be had for a lot less on the used market.

Sorry, I can't name the specific models they use, but I'm sure you can find more on a dedicated retrogaming site.
TheAncientOne
Posts: 1885
Location: USA (mid-south)
While the emulation itself could be introducing some delay, modern TV sets take advantage of the fact that, as long as they delay the audio by the same amount, they can spend an extended amount of time (still measured in mere milliseconds) processing the video signal. This is why ARC (audio return channel) was incorporated into later HDMI specifications, and why most AV receivers have an adjustable audio delay.

Some HDTV sets include a "gaming mode" that bypasses most or all of this processing, specifically to avoid or minimize the issue you describe. If you do a search on "best HDTV for gaming", you'll see some sites do features on which sets have the lowest input lag. One site (http://www.displaylag.com/display-database/) even includes monitors.
leafy sea dragon
Posts: 7163
Location: Another Kingdom
Thanks. I'm not a retro gamer specifically; I just never sell off any of my old systems or games, and I'll hook them up and play them if I feel like it.

I can say that, having worked at a donation center for a few years, CRT TVs come in every day; I have counted up to 18 in one day. Most of them were in good working order, though about half were missing their remotes, because the donors had upgraded to a flatscreen and wanted to get rid of their CRTs. Hence, I would recommend trying a thrift shop if you're going straight for a CRT: odds are they'll have a lot of them, and they'll be cheap (we priced ours between $7 and $30 depending on the brand, size, and condition).

Oh yes, my HDTV has a "Gaming Mode" (and it's called that on the menu). What are the disadvantages to that when doing something other than video games? That is, why aren't TVs in gaming mode all the time?
TheAncientOne
Posts: 1885
Location: USA (mid-south)
In gaming mode, the set does minimal or no processing of the signal. It probably won't de-interlace an interlaced signal, any upscaling might revert to the simplest available method (not necessarily the best the set can achieve), and noise filtering and sharpening might be deactivated. Frankly, you could always try A/B comparisons of different material with gaming mode turned on and off; using it won't harm the set. Feeding output from an old 480i gaming console through the composite port would probably see a drop in visual quality, but in return you gain better response time. Gaming mode was really meant for modern gaming consoles, where video processing is of little or no benefit in the first place.
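For the curious, the de-interlacing being skipped means reconstructing full frames from the alternating half-height odd/even scanline fields of a 480i signal. A minimal sketch of the simplest common method, "bob" de-interlacing (line doubling); the scanline representation and all names here are my own invention, not any TV's actual pipeline:

```python
# Toy model: a frame is a list of scanlines. An interlaced source sends
# only every other scanline per field, so the display must somehow fill
# the frame back out.

def split_fields(frame):
    """Split a full frame into its even and odd scanline fields."""
    return frame[0::2], frame[1::2]

def bob(field):
    """'Bob' de-interlacing: repeat each field line to restore full height."""
    out = []
    for line in field:
        out.extend([line, line])
    return out

frame = ["line0", "line1", "line2", "line3"]
even, odd = split_fields(frame)
print(bob(even))  # ['line0', 'line0', 'line2', 'line2']
```

Real sets use smarter motion-adaptive methods than this, which is exactly the processing that costs milliseconds; gaming mode trades that quality for responsiveness.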
leafy sea dragon
Posts: 7163
Location: Another Kingdom
All right, that's what I had suspected regarding video quality. I actually don't play games in gaming mode, but I seem to get by just fine, even with the frustrating kaizo stuff in Super Mario Maker. Maybe I've grown used to the bit of delay.
Shiflan
Posts: 418
Correct me if I'm wrong here, but that sounds like it would only help with syncing the audio to the video being displayed. What I was describing was lag between the console's control input and the video being displayed on the screen.

For example, suppose we're playing Super Mario Bros. and Mario has to run and jump over a pit. It's a wide pit, so we have to press the jump button when Mario is right at the edge: press too soon and the jump won't clear the pit; press too late and we'll walk off the edge before executing the jump. So we hold right on the controller, watch Mario run toward the pit, and hover a thumb over the jump button, waiting to see him at the very edge. The moment we see him there, we press the button. However, because the video signal is delayed, we're actually looking at what happened a fraction of a second ago. By the time we press jump, the console is thinking, "hey, Mario's already walked off the edge."
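That frame-delay arithmetic can be made concrete with a toy simulation. Everything here (the 4 frames of lag, pixel positions, speeds, and all names) is an invented illustration, not a measurement from any real set or game:

```python
# Toy model: the screen shows the console's state from LAG_FRAMES ago,
# so a jump pressed the instant Mario *appears* to reach the edge
# actually registers several frames after he has passed it.

LAG_FRAMES = 4       # ~67 ms of display lag at 60 fps (illustrative assumption)
EDGE_X = 99          # pixel column of the pit's edge
SPEED = 3            # pixels Mario moves per frame

def mario_x(frame):
    """Mario's true position on a given console frame."""
    return frame * SPEED

# Console frame on which Mario actually reaches the edge:
frame_at_edge = EDGE_X // SPEED
# The player reacts to the delayed image, so the press lands later:
press_frame = frame_at_edge + LAG_FRAMES

overshoot = mario_x(press_frame) - EDGE_X
print(f"Jump registers {overshoot}px past the edge")  # 12px too late
```

At these made-up numbers, four frames of lag puts the button press a full Mario-width past the edge, which matches the "already walked off" feeling described above.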
I've been familiar with "Gaming Mode" for a long time, but I had no idea it affected processing lag; I thought it just optimized brightness, contrast, and color balance for gaming rather than for film. Thanks for pointing that out; one learns something every day. Though to be honest, I do very little gaming. I'm into this more for the classic AV geek discussion!
Shiflan
Posts: 418
Yeah, same here!

Yeah, if I were more into playing old consoles, that's certainly what I would do too. I'm not serious enough about it to track down a special model; I was just mentioning what some of the really hardcore guys do. Apparently there are some who go so far as to mod their console to intercept the video signal off the actual video chips, before it gets converted to the TV output, and route it to the CRT over component video. Apparently this is the best option, though most consumer CRTs did not have component video inputs. That's the motivation for using a studio monitor: not only are they very high quality CRTs, but they have all sorts of fancy inputs used in the pro video industry. Again, just fodder for discussion, not something I've done personally or have any inclination to do.

Admittedly I'm not 100% up to speed on exactly what gaming mode does on the latest TVs, but what I have noticed is that it cranks the brightness and contrast up to what I would consider absurd levels. It looks good with bright colors and simple graphics, but it just looks "wrong" with film.
leafy sea dragon
Posts: 7163
Location: Another Kingdom
My TV's manual describes "Gaming Mode" as a mode as free of lag as possible, but it didn't explain any tradeoff. Hence why I was wondering why it's not perpetually on "Gaming Mode."

Well, there aren't any major gaming systems I can think of that support component without also supporting either composite or HDMI. If it supports HDMI, you might as well get a modern TV, and if it supports composite, most working CRTs will handle that. Go even older, and the game system might support coaxial only, which every CRT is compatible with. The only brief time I used component was with the Wii connected to an HDTV, as component is as high as the Wii got.

By the way, up until my Xbox 360 broke, it was connected to a Sony Trinitron from 1978. It's a weird thought now that I was playing stuff like BlazBlue on a 150-pound TV with dials and a wooden finish. Even though my 360 broke, the TV still works!
Shiroi Hane
Encyclopedia Editor
Posts: 7580
Location: Wales
The ones they showed on Tomorrow's World were from Japan, where they had analogue HD broadcasts. From Wikipedia:

"Japan had the earliest working HDTV system, with design efforts going back to 1979. The country began broadcasting wideband analog high-definition video signals in the late 1980s using an interlaced resolution of 1035 or 1080-lines active (1035i) or 1125-lines total supported by the Sony HDVS line of equipment. The Japanese system, developed by NHK Science & Technology Research Laboratories in the 1980s, employed filtering tricks to reduce the original source signal to decrease bandwidth utilization. MUSE was marketed as 'Hi-Vision' by NHK."
Shiflan
Posts: 418
Yeah, that's been my experience too, at least in the sense that there were no component connectors on the back of the console. However, what I'm talking about is a modification of the hardware itself. Inside the NES (or whatever), the video is processed as R, G, B channels in the graphics chip; the NES then converts that signal to the coax (RF) output used by TVs of the day. The really serious retro gamers will take the console apart and solder wires to the motherboard, picking up the RGB signals before they pass through the conversion to coax, then output that signal to a CRT TV. Apparently it's the best approach because it avoids any loss of quality in the coax conversion inside the console, and again in the TV. I'm sure the group of gamers who do this is pretty small, though!
leafy sea dragon
Posts: 7163
Location: Another Kingdom
1979! I most definitely didn't see any of those. The conventions my father took me to were around 1994 to 1997 or somesuch.

Modifying old systems to work with newer output types? That's pretty hardcore. I still have a working NES (for some reason, my father bought two of them, the second one discovered after the first was stolen), but I'm not going to risk doing anything that could render it unusable.
Shiflan
Posts: 418
Yes indeed! And it's not just an update to support a newer output; it's also an upgrade to the video quality, because it bypasses the coax conversion loss that was normally present. I am told it is a rather significant upgrade.

With a traditional NES hookup, the signal goes like this:

Video chip → RGB/coax converter inside NES → coax cable to TV → TV converts coax to RGB → CRT tube

With the mod, it goes:

Video chip → component cables to TV → CRT tube

It skips the RGB/coax conversion twice: once inside the NES and again inside the TV. The video processors in the NES (and other old-school consoles) natively work with RGB signals, so the more modern output is already right there in the circuitry. The thing is, back in the days of those consoles, hardly anybody owned a TV that could display it: consumer TVs didn't have component input, and only very expensive professional gear had RGB inputs. Therefore the NES contained a device which converted (lossily, of course) to the coax output that was standard at the time. The mod being discussed is really simple; it's just hooking wires up to the pre-existing RGB signal before the coax converter. If you know how to use a soldering iron, it's trivially easy.
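The loss being skipped can be sketched numerically. Composite-style encodings carry full-bandwidth luma (brightness) but heavily band-limited chroma (color). The crude neighbor-averaging "filter" below, and every name in it, is my own illustrative stand-in, not the real NTSC encoding chain:

```python
# Illustrative sketch: round-tripping a sharp red/blue edge through a
# luma + band-limited-chroma representation smears color across the
# edge, while a direct RGB path would leave the pixels untouched.

def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b   # standard luma weights
    return y, b - y, r - y                   # (luma, chroma U, chroma V)

def yuv_to_rgb(y, u, v):
    b, r = u + y, v + y
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

def bandlimit(values):
    """Crude chroma low-pass: average each sample with its neighbors."""
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - 1), min(len(values), i + 2)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# One scanline with a sharp red-to-blue edge.
line = [(255, 0, 0)] * 4 + [(0, 0, 255)] * 4

yuv = [rgb_to_yuv(*px) for px in line]
ys = [y for y, _, _ in yuv]                  # luma keeps full detail
us = bandlimit([u for _, u, _ in yuv])       # chroma loses detail here
vs = bandlimit([v for _, _, v in yuv])

composite_path = [yuv_to_rgb(y, u, v) for y, u, v in zip(ys, us, vs)]
# Pixels near the edge come back as muddy blends instead of pure red/blue.
```

Pixels far from the edge survive the round trip intact, but the ones beside the color transition come back blended, which is one reason RGB-modded consoles look so much sharper on color boundaries.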
Shiroi Hane
Encyclopedia Editor
Posts: 7580
Location: Wales
I was only born in 1979, so the ones I remember seeing were much later (and Wikipedia says that actual broadcasts didn't start till the late '80s). I'm sure I remember the sets featuring on Tomorrow's World more than once, but the only presenter I can specifically recall talking about them was Peter Snow, who only came on board in 1997.
Powered by phpBB © 2001, 2005 phpBB Group