Thanks for catching that. I didn’t watch the trailer and the articles I saw only mentioned Xbox Series, PS5, PC, and Mac.
> RetroArch beats the rest because of the shaders. Pixel-fidelity snobs be damned, Triple Eagle SuperSai is where older stuff can really shine.

Getting mostly* off topic, but pixel perfection isn't even the right look for a lot of retro games. Some explanation and examples here and here.
*The somewhat on-topic part being that fancy new screens can make for really nice filters, so a high-res, high-refresh OLED could pull off neat stuff! Even if it is to make things technically look worse on a certain level.
RetroArch came out today:
‘RetroArch’ Is Now Available on iOS and iPadOS, tvOS Support Being Evaluated – TouchArcade
After approving PPSSPP, Apple just approved *the* RetroArch. We knew it was submitted to Apple, but I didn't think it would be approved this soon.
And apparently Gamma (PS1) and PPSSPP (PSP if that wasn't obvious) are out too.
Looking forward to some Outrun 2 if the PSP emulator runs well on my iPad mini.
> Yes! Has anyone used high refresh displays to simulate CRT phosphor decay? Seems the obvious choice.

That would be rad, it would probably need a very high refresh though. Ideally, for a 60Hz update of an emulated CRT with e.g. 500 scanlines, each 'frame' would need to be drawn via 500 separate frames that each only drew a single line of emulated pixels (with previous lines faded out a little).
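To make that rolling-scan idea concrete, here is a toy numpy sketch. The resolution and decay constant are made-up knobs, and a real implementation would live in a shader rather than Python, but the loop shows the shape of the idea:

```python
import numpy as np

# Toy sketch of a rolling-scan CRT update: one emulated scanline drawn per
# display subframe, with everything drawn earlier dimmed a little each time,
# phosphor-style. LINES, WIDTH and DECAY are illustrative values only.
LINES, WIDTH = 500, 640
DECAY = 0.85   # per-subframe luminance multiplier (a knob, not a measurement)

def rolling_subframes(source_frame):
    """Yield LINES subframes for one 60Hz source frame.

    source_frame: (LINES, WIDTH) luminance array in [0, 1]. Each subframe
    lights one freshly scanned line while older lines fade toward black.
    """
    screen = np.zeros_like(source_frame)
    for y in range(LINES):
        screen *= DECAY               # fade whatever is already on screen
        screen[y] = source_frame[y]   # draw the freshly scanned line
        yield screen                  # would be presented immediately

frame = np.full((LINES, WIDTH), 0.5)
for sub in rolling_subframes(frame):
    pass                              # a real renderer would display each one
print(sub[0, 0])                      # the top line has decayed to ~0 by scan end
```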
Even a 1000Hz display only gets 1000/60 ≈ 16 refreshes per source frame, so for 500 scanlines each refresh would have to draw about 30 lines at once. That might still look decent though? Or would it visibly start to look like screen tearing?
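For the curious, here is the throwaway arithmetic behind these refresh-rate figures, assuming an idealized 60Hz source and ignoring blanking entirely:

```python
# How many emulated scanlines each display refresh must cover for a 60Hz
# source, at a few display refresh rates. Drawing exactly one line per
# refresh needs a display running at (line count x 60) Hz.
SOURCE_HZ = 60
for lines in (240, 500):              # NES-ish 240p, and the 500-line example
    for display_hz in (240, 1000, lines * SOURCE_HZ):
        refreshes = display_hz / SOURCE_HZ      # refreshes per source frame
        per_refresh = lines / refreshes         # scanlines drawn per refresh
        print(f"{lines} lines @ {display_hz:>5} Hz: "
              f"{per_refresh:5.1f} scanlines per refresh")
```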
> Maybe start simpler, with something along the lines of black frame insertion. Rather than emulating a projector shutter and cutting to full black, attempt to simulate the gradual decay/dimming of the entire, full-field image.

Black frame or alternating interlaced frames. Not sure if I'm thinking of things right, but I guess at 120Hz each field can have one output frame and one "decay" frame?
> That would be rad, it would probably need a very high refresh though. Ideally, for a 60Hz update of an emulated CRT with e.g. 500 scanlines, each 'frame' would need to be drawn via 500 separate frames that each only drew a single line of emulated pixels (with previous lines faded out a little).

Excellent! I wasn't even thinking about operating on individual scanlines, though that would be amazing. Modeling NES output (240p, 60fps) on a CRT, scanline by scanline, would take... a 14,400Hz display? And that's without accounting for vertical blanking. Heheh.
Maybe start simpler, with something along the lines of black frame insertion. Rather than emulating a projector shutter and cutting to full black, attempt to simulate the gradual decay/dimming of the entire, full-field image.
A 240Hz display, operating on a 60Hz signal, could offer three additional frames to work with. Assume the final frame is full black, and you have two additional, dimmed frames to insert. Experiment with the falloff in luminosity of those frames and see whether there's a combination that in any way evokes 'CRT'.
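A quick sketch of that experiment. The falloff constant here is a made-up knob you would tune by eye, not anything derived from phosphor data:

```python
# Four 240Hz subframes per 60Hz source frame: one full-brightness frame,
# two progressively dimmed frames, then black. 'falloff' is purely a
# tune-by-eye parameter, not a measured phosphor constant.
falloff = 0.4
weights = [1.0, falloff, falloff ** 2, 0.0]

def subframe_levels(frame_luma):
    """Luminance of each of the four 240Hz subframes for one source value."""
    return [frame_luma * w for w in weights]

print(subframe_levels(1.0))   # -> [1.0, 0.4, ~0.16, 0.0]
```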
> There is nothing special about CRT decay, so it's a waste of time trying to emulate decay.
There's a large community of retro gamers who beg to differ. Whether you consider it special or not, it's hard to deny that CRTs offer a viewing/gaming experience quite distinct from modern display tech.
If it were such a 'waste of time', how do you explain 30-year-old Trinitron PVMs selling for four-figure prices?
> There is nothing special about CRT decay, so it's a waste of time trying to emulate decay.

A fair number of console games used the decay (flashing a sprite or layer on/off) to simulate transparency or to get colors that wouldn't otherwise be possible. Nintendo had to remove the flashing from some games when they got Virtual Console rereleases.
It's probably excessive to try to get it exact, but it isn't a waste of time to try to do it in general.
> CRT offers some benefits.

Benefits don't matter; some folks just enjoy and/or nostalgize over the 'feel' of CRT.
> Motion clarity comes from the short on cycle, which can be done without emulating a phosphor decay curve.
> That doesn't mean they come from the phosphor decay curve.

I posted immediately following your longer post above, and I think I understand better where you're coming from.
Yes, and if maximal motion clarity is what you're after then find an OLED with BFI, not a CRT.
If the argument is that today's displays would fail at emulating CRTs in this respect, then heard and understood. I still think it would be worthwhile to experiment since it's more about tricking our brains (persistence of vision) than replicating CRT hardware precisely. But I fully respect the concern you're raising. Cheers.
> No. Minimizing S&H motion blur is all about minimizing "on time". OLED with BFI has much longer "on time" than CRT, and thus worse motion clarity.

Understood. Thank you for setting me straight on this – I think you tried once before but it didn't stick. Response time, aka switching speed, != "on time", right?
> Some of the LCDs with ~1ms strobes are comparable to CRT. See the two examples I mentioned back in post #3,212 above.

Got it. Current displays are several OOM too slow, so silly to bother. Thanks!
Really, the argument was more that you were not chasing an important characteristic. There are plenty more important ones. But yeah, no modern OLED/LCD can emulate the decay characteristics either.
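To put rough numbers on the "on time" point: perceived smear while eye-tracking a moving object is approximately pixel persistence multiplied by on-screen motion speed. A back-of-envelope sketch, with ballpark persistence values rather than measurements:

```python
# Back-of-envelope: perceived motion blur under eye tracking is roughly
# pixel persistence ("on time") times on-screen motion speed. The
# persistence figures below are ballpark illustrations, not measurements.
def blur_px(persistence_ms, speed_px_per_s):
    return persistence_ms / 1000 * speed_px_per_s

speed = 960   # px/s: an object crossing a 1920px screen in two seconds
for label, ms in [("sample-and-hold 60Hz", 16.7),
                  ("OLED + BFI, 50% duty", 8.3),
                  ("~1ms strobe / CRT-ish", 1.0)]:
    print(f"{label:22} -> {blur_px(ms, speed):4.1f} px of smear")
```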
> Last year there was a fair amount of (not-unwarranted) hand-wringing that GPTK would be nothing more than a one-and-done, throwaway effort by Apple. I am glad to see they've failed to meet those expectations.

And the noises Ubisoft was making about porting their engine (and by extension making it easier for any of their games on that engine to be ported) seem, dare I say, optimistic.
> If Windows users have to put up with Ubisoft's pile of shit launcher, I don't see why Mac users should get off lightly.

Speaking of which, when will Steam be Apple Silicon native....
Once Apple supports 32-bit games in Rosetta?
> I'm still bummed that the Mac was off the list for Civ VII (that would have made a major splash at WWDC, but Apple). At this point, I think more people game on the Quest 3 than they do on the Mac.

The teaser site only lists Steam as a platform, linking to the Steam page, which lists macOS system requirements as TBD. Is there information elsewhere that Steam actually means Windows-only?
> Speaking of which, when will Steam be Apple Silicon native....

Dropping 32-bit support is what it took to induce Valve to release a 64-bit version of Steam, so it'll probably take dropping Intel support to get an Apple Silicon version.
> The teaser site only lists Steam as a platform, linking to the Steam page, which lists macOS system requirements as TBD. Is there information elsewhere that Steam actually means Windows-only?
>
> Edit: The press release also mentions that it's coming to Mac (and Linux).
>
> "2K and Firaxis Games officially announced today Sid Meier's Civilization® VII, a revolutionary new chapter in the epic strategy video game franchise, will launch in 2025 on PlayStation®5 (PS5®), PlayStation®4 (PS4®), Xbox Series X|S, Xbox One, Nintendo™ Switch, and PC, Mac and Linux via Steam."

omg! I totally missed that! Thanks!!!!
Nice catch. Civ 6 is literally never not in the top 10 on Steam and the Mac App Store, so I would think porting Civ 7 is a business no-brainer.
It's also the perfect kind of game for the iPad. Would be lovely to see it immediately native on all the M-series chips, irrespective of form factor.
> I don't think a mainline Civilization has ever skipped the Mac, dating all the way to the original. The bigger question in my mind is who's handling the release? Aspyr has done the Mac releases starting with Civ IV, and I would assume will continue to do so. The recent spate of Mac releases Apple has been trumpeting have been handled by the original publishers rather than being outsourced ports, but Take-Two (who owns Firaxis) hasn't been one of them. To me it would be a sign of increased commitment if they were to handle publishing duties themselves, even if they outsource the actual work of the port.

I have to think Apple's objective is a state where major game engines, whether internal or third-party, can at the very least produce a non-optimized build for Apple Silicon targets. Even if final polish/test/release is a porting house's responsibility, continuous integration by the primary game studio is the most assured path toward day-and-date releases with consoles and Windows.
Agreed. To expound on that further: with very few exceptions, basically nobody is coding to the bare metal anymore. From below, the complexity of modern systems makes it far more difficult to get that low, while from above, modern development tools have gotten so good that it's much harder to improve on their output (plus they've gotten so powerful it's simply not as necessary to extract every single iota of efficiency). On top of that, the need to target as many platforms as possible means highly portable code is much more important than it used to be. The end result is that current big-budget game development is all about engines and middleware, not bespoke tailoring. The upside is that getting the engine running on Apple's platforms is the hard part; once it runs, any particular game afterwards is a much lower relative hurdle.
> This is when we'll know the Mac has reached the main stage of gaming: when we too can be enthralled by vague promises from companies like Ubisoft, only for them to then immediately lay off everyone who might have been responsible for whatever was promised.

How about Kickstarter projects quietly canning the Mac (and Linux) version as a footnote while announcing the release of other versions?
> Is System Shock still coming to macOS and Linux?
>
> Unfortunately no, plans for macOS and Linux releases of System Shock have been shelved.

Course, looking at the campaign, those promises were made all the way back in 2016. Things sure have changed since then!