Apple and Gaming

japtor

Ars Legatus Legionis
13,043

RetroArch beats the rest because of the shaders. Pixel-fidelity snobs be damned, filters like Super Eagle and Super 2xSaI are where older stuff can really shine.
Getting mostly* off topic, but pixel perfection isn't even the right look for a lot of retro games. Some explanation and examples here and here.

*Somewhat on topic part being cause fancy new screens can make for really nice filters, so high res high refresh OLED could pull off neat stuff! Even if it is to make stuff technically look worse on a certain level :judge:.
 

Bonusround

Ars Scholae Palatinae
1,060
Subscriptor
*Somewhat on topic part being cause fancy new screens can make for really nice filters, so high res high refresh OLED could pull off neat stuff! Even if it is to make stuff technically look worse on a certain level :judge:.

Yes! Has anyone used high refresh displays to simulate CRT phosphor decay? Seems the obvious choice.
 
  • Like
Reactions: VirtualWolf

Brute

Ars Tribunus Militum
2,563
Subscriptor
RetroArch came out today:


And apparently Gamma (PS1) and PPSSPP (PSP if that wasn't obvious) are out too.

Looking forward to some Outrun 2 if the PSP emulator runs well on my iPad mini.

I haven’t ahem found any ROMs that I could make work with Gamma (though I didn’t spend more than 10 minutes trying), but I was able to get PPSSPP up and running on my M2 iPad Pro. Works pretty well, particularly with a controller. Seems like it tears through battery.
 

Aleamapper

Ars Scholae Palatinae
1,284
Subscriptor
Yes! Has anyone used high refresh displays to simulate CRT phosphor decay? Seems the obvious choice.
That would be rad, it would probably need a very high refresh though. Ideally, for a 60 Hz update of an emulated CRT with e.g. 500 scanlines, each 'frame' would need to be drawn via 500 separate refreshes that each only drew a single line of emulated pixels (with previous lines faded out a little).

Even a 1000 Hz display only gets 1000/60 ≈ 16 refreshes per source frame, so each refresh would have to draw about 500/16 ≈ 30 scanlines at once. That might still look decent though? Or would it visibly start to look like screen tearing?
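A quick back-of-envelope sketch of that arithmetic. The numbers (500 lines, 60 Hz source) are the illustrative figures from above, not measurements of any real display:

```python
# Rough arithmetic for a "rolling scan" CRT emulation: how many emulated
# scanlines must each physical refresh advance, given the display's refresh
# rate? The 500-line / 60 Hz defaults are illustrative assumptions.

def lines_per_refresh(display_hz: float, source_hz: float = 60.0,
                      scanlines: int = 500) -> float:
    """Scanlines each refresh must draw to keep up with the source raster."""
    # Total scanline draws needed per second, divided by refreshes per second.
    return scanlines * source_hz / display_hz

print(lines_per_refresh(1000))   # 30.0 lines per refresh on a 1000 Hz panel
print(lines_per_refresh(30000))  # 1.0 -- true line-at-a-time needs 30 kHz
```

So a genuine one-line-at-a-time rolling scan of a 500-line source would need a 30,000 Hz panel; anything slower has to batch scanlines per refresh.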
 

Bonusround

Ars Scholae Palatinae
1,060
Subscriptor
That would be rad, it would probably need a very high refresh though. Ideally, for a 60 Hz update of an emulated CRT with e.g. 500 scanlines, each 'frame' would need to be drawn via 500 separate refreshes that each only drew a single line of emulated pixels (with previous lines faded out a little).

Even a 1000 Hz display only gets 1000/60 ≈ 16 refreshes per source frame, so each refresh would have to draw about 500/16 ≈ 30 scanlines at once. That might still look decent though? Or would it visibly start to look like screen tearing?
Excellent! I wasn't even thinking about operating on individual scan lines, though that would be amazing. To model an NES output (240p, 60fps) via CRT, scanline by scanline, would take... a 14,400Hz display? And no accounting for vertical blanking. Heheh.

Maybe start simpler, with something along the lines of black frame insertion. Rather than emulating a projector shutter and cutting to full black, attempt to simulate the gradual decay/dimming of the entire full-field image.

A 240Hz display, operating on a 60Hz signal, could offer three additional frames to work with. Assume the final frame is full black, and you have two additional, dimmed frames to insert. Experiment with the falloff in luminosity of those frames and see whether there's a combination that in any way evokes 'CRT'.
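That recipe (one full-brightness sub-frame, two dimmed ones, one black, at 240 Hz) is easy to sketch. The falloff exponent below is a hypothetical tuning knob to experiment with, not a measured phosphor value:

```python
# Sketch of the BFI-with-decay idea: each 60 Hz source frame becomes four
# 240 Hz sub-frames of decreasing brightness, the last one full black.
# `falloff` is a made-up tuning parameter, not a real phosphor constant.

def decay_frames(subframes: int = 4, falloff: float = 2.0) -> list[float]:
    """Relative luminance for each sub-frame of one source frame."""
    levels = []
    for i in range(subframes):
        if i == subframes - 1:
            levels.append(0.0)  # final sub-frame: full black
        else:
            # Fade toward black; higher falloff = faster initial dimming.
            levels.append((1 - i / (subframes - 1)) ** falloff)
    return levels

print(decay_frames())  # e.g. [1.0, 0.444..., 0.111..., 0.0]
```

Sweeping `falloff` is exactly the "experiment with the falloff in luminosity" step: a value near 1 gives a linear fade, larger values bias the brightness toward the first sub-frame.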
 
Last edited:
  • Like
Reactions: Aleamapper

japtor

Ars Legatus Legionis
13,043
Maybe start simpler, with something along the lines of black frame insertion. Rather than emulating a projector shutter and cutting to full black, attempt to simulate the gradual decay/dimming of the entire full-field image.
Black frame or alternating interlaced frames 🤔. Not sure if I'm thinking of things right, but I guess on 120Hz each field can have one output frame and one "decay" frame?

And of course there's probably enough resolution to emulate CRTs down to rendering subpixels and such. Not sure if/how you'd want to mess with that stuff as far as refresh/decay or whatever effect goes.

To bring this back to iOS, I've been making use of RetroArch's FinalBurn core to play some (well just two) arcade games on my iPad mini, with my USB controller I posted earlier in the thread. Just been using some basic scanline filter, but there's a crapload of others I haven't tried.

RetroArch controller setup has been a bit annoying, particularly since one button doesn't appear to work once remapped. And when playing a vertical game, the on-screen controller overlay is all jacked up, displaying the wrong orientation and stretched, while the actual touch targets are in the proper orientation. I just know the hide button is in the bottom right corner, and the menu button is vaguely in the middle.
 

ScifiGeek

Ars Legatus Legionis
16,351
Excellent! I wasn't even thinking about operating on individual scan lines, though that would be amazing. To model an NES output (240p, 60fps) via CRT, scanline by scanline, would take... a 14,400Hz display? And no accounting for vertical blanking. Heheh.

Maybe start simpler, with something along the lines of black frame insertion. Rather than emulating a projector shutter and cutting to full black, attempt to simulate the gradual decay/dimming of the entire full-field image.

A 240Hz display, operating on a 60Hz signal, could offer three additional frames to work with. Assume the final frame is full black, and you have two additional, dimmed frames to insert. Experiment with the falloff in luminosity of those frames and see whether there's a combination that in any way evokes 'CRT'.

There is nothing special about CRT decay, so it's a waste of time trying to emulate decay.

All that really matters for blur reduction is minimizing "on time" for the pixels, which means you want a shorter and brighter flash in your backlight strobe.

The best strobing-backlight LCDs would move the backlight strobe down the screen. This should be done because that is the way LCDs are updated: the strobing backlight is trying to match up the strobe with the LCD update. If they just strobe the whole backlight, you end up catching the LCD at different points in its update, making it only truly sharp in one screen area (usually they target the center of the screen). Not sure if any do this, but Nvidia's new ULMB 2 at least changes the overdrive depending on vertical position.

Backlight strobing with an "on time" in the ~1 ms range essentially gives CRT-like motion clarity.

If you feel you need that, look for a high-end strobing LCD that claims 1000 Hz/1 ms clarity.

Something like the Asus ULMB2 monitor:


View: https://www.youtube.com/watch?v=LZndZ7NWnZs

Or Benq Dyac+


View: https://www.youtube.com/watch?v=k8B4zxsMucs
 
Last edited:

Bonusround

Ars Scholae Palatinae
1,060
Subscriptor
There is nothing special about CRT decay, so it's a waste of time trying to emulate decay.
There's a large community of retro gamers who beg to differ. Whether you consider it special or not, it's hard to deny that CRTs offer a viewing/gaming experience quite distinct from modern display tech.

If it were such a 'waste of time', how do you explain 30-year old Trinitron PVMs selling for four-figure prices? ;)
 

ScifiGeek

Ars Legatus Legionis
16,351
There's a large community of retro gamers who beg to differ. Whether you consider it special or not, it's hard to deny that CRTs offer a viewing/gaming experience quite distinct from modern display tech.

If it were such a 'waste of time', how do you explain 30-year old Trinitron PVMs selling for four-figure prices? ;)

CRT offers some benefits. That doesn't mean they come from the phosphor decay curve.

What characteristic are you assuming CRTs get from the phosphor decay curve, that would warrant emulating it?

Motion clarity comes from the short on cycle, which can be done without emulating a phosphor decay curve.
 
  • Like
Reactions: Aleamapper

dspariI

Smack-Fu Master, in training
33
There is nothing special about CRT decay, so it's a waste of time trying to emulate decay.
A fair number of console games used the decay (flashing a sprite or layer on/off) to simulate transparency or to get colors that wouldn't otherwise be possible. Nintendo had to remove the flashing from some games when they got Virtual Console rereleases.

Some games used the "low quality" of composite video to simulate transparency using a checkerboard pattern which would get evened out into transparency. On top of this, some games also took the slightly off color of NTSC into account.

It's probably excessive to try to get it exact, but it isn't a waste of time to try to do it in general.
 

ScifiGeek

Ars Legatus Legionis
16,351
It's probably excessive to try to get it exact, but it isn't a waste of time to try to do it in general.

I disagree. What is being attributed to decay is most likely just related to the very short "on" cycle of CRTs.

To emulate the decay, you would need an even shorter "on" cycle than CRTs have, because you need to approximate a curve that occurs in about 1 ms or less. Are you going to curve-fit with fewer than 10 steps? 10 steps would require a 10,000 Hz monitor... And all for something that is almost certainly pointless, because it just comes from the short "on" cycle, not the decay.
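The refresh-rate arithmetic behind that objection is simple enough to write down (the 10-step count and ~1 ms decay duration are the assumed figures from the argument above):

```python
# Refresh rate needed to render an N-step approximation of a phosphor decay
# that completes in `decay_ms` milliseconds: one refresh per step, so the
# refresh interval must be decay_ms / steps.

def refresh_for_decay_steps(steps: int, decay_ms: float = 1.0) -> float:
    """Display refresh rate in Hz needed for `steps` samples of the decay."""
    return steps * 1000.0 / decay_ms

print(refresh_for_decay_steps(10))  # 10000.0 Hz -- the figure cited above
print(refresh_for_decay_steps(3))   # 3000.0 Hz even for a crude 3-step fit
```

Even a very coarse fit lands orders of magnitude beyond current consumer panels, which is the crux of the "waste of time" claim.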

CRTs do have several characteristics that make them sought after by some retro gamers:

  • Works with light guns = you can play with original retro light gun HW.
  • No input lag = responsive feel.
  • Short on cycle = extremely low S&H motion blur.
  • Scan lines = matches the low-resolution pixel art. Designers often took the scan lines into account for the artwork.
  • Blurring/smoothing pixels together = masks issues with low-resolution pixel art.
  • Multiple resolution support = great for running low-res games. This one is somewhat related to the previous point: CRTs are always slightly soft because of pixel blending.
I'm familiar with CRTs. I started gaming in the warm CRT glow of classic arcades, and later a ColecoVision on the family TV, then a C-64 on a small TV with composite input, then an Amiga 1000 with its dedicated Amiga 1080 monitor. Then PCs with several PC CRTs, with 20" Trinitrons in the final years before going LCD.

I just don't see the decay curve of CRT phosphors as being important, at all, to any element of the experience.
 

Bonusround

Ars Scholae Palatinae
1,060
Subscriptor
CRT offers some benefits.
Benefits don't matter, some folks just enjoy and/or nostalgize over the 'feel' of CRT.

Does an arc lamp and acetate have "benefits"? It certainly has a lot of drawbacks. But people still seek out the experience of projected film.

Motion clarity comes from the short on cycle, which can be done without emulating a phosphor decay curve.
Yes, and if maximal motion clarity is what you're after then find an OLED with BFI, not a CRT.

That doesn't mean they come from the phosphor decay curve.
I posted immediately following your longer post above, and think I understand better where you're coming from.

If the argument is that today's displays would fail at emulating CRTs in this respect, then heard and understood. I still think it would be worthwhile to experiment since it's more about tricking our brains (persistence of vision) than replicating CRT hardware precisely. But I fully respect the concern you're raising. Cheers.
 
Last edited:

ScifiGeek

Ars Legatus Legionis
16,351
Yes, and if maximal motion clarity is what you're after then find an OLED with BFI, not a CRT.

No. Minimizing S&H motion blur is all about minimizing "on time". OLED with BFI has much longer "on time" than CRT, and thus worse motion clarity.

Some of the LCD with ~1ms strobes are comparable to CRT. See the two examples I mentioned back in post #3,212 above.

If the argument is that today's displays would fail at emulating CRTs in this respect, then heard and understood. I still think it would be worthwhile to experiment since it's more about tricking our brains (persistence of vision) than replicating CRT hardware precisely. But I fully respect the concern you're raising. Cheers.

Really, the argument was more that you were not chasing an important characteristic. There are plenty more important ones. But yeah, no modern OLED/LCD can emulate the decay characteristics either.
 

kenada

Ars Legatus Legionis
17,112
Subscriptor
There have been a few posts in r/macgaming about using GPTK 2. One post reports successfully enabling ray tracing in Diablo IV. There are also some benchmarks of other games. Cyberpunk 2077 with ray tracing and no FSR gets 13 fps in the benchmark (compared to ~43 fps without RT). It gets 33 fps with RT + FSR, though the poster doesn’t say whether the FSR artifacts are fixed.

I’m currently working on the patchset for GPTK support in nixpkgs. It works with Wine 9.0 but breaks with 9.10 (the current unstable release in nixpkgs). If I can get that fixed, I’d like to see if GPTK 2 fixes the crashes with FFXIV, which would let me compare MoltenVK+DXVK versus GPTK. Even if performance isn’t better, I expect it to be more featureful and less glitchy.
 
Last edited:

Bonusround

Ars Scholae Palatinae
1,060
Subscriptor
No. Minimizing S&H motion blur is all about minimizing "on time". OLED with BFI has much longer "on time" than CRT, and thus worse motion clarity.
Understood. Thank you for setting me straight on this – I think you tried once before but it didn’t stick. Response time, aka switching speed != ‘on time’, right?

Some of the LCD with ~1ms strobes are comparable to CRT. See the two examples I mentioned back in post #3,212 above.

Really, the argument was more that you were not chasing an important characteristic. There are plenty more important ones. But yeah, no modern OLED/LCD can emulate the decay characteristics either.
Got it. Current displays are several OOM too slow, so silly to bother. Thanks!
 
Last edited:

wrylachlan

Ars Legatus Legionis
12,769
Subscriptor
Last year there was a fair amount of (not-unwarranted) hand-wringing that GPTK would be nothing more than a one-and-done, throwaway effort by Apple. I am glad to see they’ve failed to meet those expectations.
And the noises Ubisoft was making about porting their engine (and by extension making it easier for any of their games on that engine to be ported) seem, dare I say, optimistic.
 

cateye

Ars Legatus Legionis
11,760
Moderator
seems dare I say optimistic.

This is when we'll know the Mac has reached the main-stage of gaming: When we too can be enthralled by vague promises by companies like Ubisoft, only for them to then immediately lay off everyone who might have been responsible for whatever was promised.
 
  • Haha
Reactions: gabemaroz

kenada

Ars Legatus Legionis
17,112
Subscriptor
I'm still bummed that the Mac was off the list for Civ VII (that would have made a major splash at WWDC, but Apple). At this point, I think more people game on the Quest 3 than on the Mac.
The teaser site only lists Steam as a platform, linking to the Steam page, which lists macOS system requirements as tbd. Is there information elsewhere that Steam actually means Windows-only?

Edit: The press release also mentions that it’s coming to Mac (and Linux).

2K and Firaxis Games officially announced today Sid Meier’s Civilization® VII, a revolutionary new chapter in the epic strategy video game franchise, will launch in 2025 on PlayStation®5 (PS5®), PlayStation®4 (PS4®), Xbox Series X|S, Xbox One, Nintendo™ Switch, and PC, Mac and Linux via Steam.
 
Last edited:
  • Like
Reactions: Scud

Scud

Ars Legatus Legionis
12,314
The teaser site only lists Steam as a platform, linking to the Steam page, which lists macOS system requirements as tbd. Is there information elsewhere that Steam actually means Windows-only?

Edit: The press release also mentions that it’s coming to Mac (and Linux).

2K and Firaxis Games officially announced today Sid Meier’s Civilization® VII, a revolutionary new chapter in the epic strategy video game franchise, will launch in 2025 on PlayStation®5 (PS5®), PlayStation®4 (PS4®), Xbox Series X|S, Xbox One, Nintendo™ Switch, and PC, Mac and Linux via Steam.
omg! I totally missed that! Thanks!!!!
 
  • Like
Reactions: kenada

byrningman

Ars Tribunus Militum
2,023
Subscriptor
The teaser site only lists Steam as a platform, linking to the Steam page, which lists macOS system requirements as tbd. Is there information elsewhere that Steam actually means Windows-only?

Edit: The press release also mentions that it’s coming to Mac (and Linux).

2K and Firaxis Games officially announced today Sid Meier’s Civilization® VII, a revolutionary new chapter in the epic strategy video game franchise, will launch in 2025 on PlayStation®5 (PS5®), PlayStation®4 (PS4®), Xbox Series X|S, Xbox One, Nintendo™ Switch, and PC, Mac and Linux via Steam.
Nice catch. Civ 6 is literally never not in the top 10 on Steam and the Mac App Store, so I would think porting Civ 7 is a business no-brainer.
 
  • Like
Reactions: kenada

ant1pathy

Ars Tribunus Angusticlavius
6,461
Nice catch. Civ 6 is literally never not in the top 10 on Steam and the Mac app store, so I would think porting Civ 7 is a business no-brainer.
It's also the perfect kind of game for iPad as well. Would be lovely to see it immediately native on all the M-series chips irrespective of the form factor.
 

Chris FOM

Ars Legatus Legionis
10,001
Subscriptor
I don’t think a mainline Civilization has ever skipped the Mac, dating all the way back to the original. The bigger question in my mind is who’s handling the release? Aspyr has done the Mac releases starting with Civ IV, and I would assume they will continue to do so. The recent spate of Mac releases Apple has been trumpeting have been handled by the original publishers rather than being outsourced ports, but Take-Two (who owns Firaxis) hasn’t been one of them. But to me it would be a sign of increased commitment if they were to handle publishing duties themselves, even if they outsource the actual work of the port itself.

And yes, I’m definitely hoping for an iPad release of VII. The interface didn’t translate completely perfectly but overall Civ VI worked wonderfully on the iPad.
 

Bonusround

Ars Scholae Palatinae
1,060
Subscriptor
I don’t think a mainline Civilization has ever skipped the Mac, dating all the way back to the original. The bigger question in my mind is who’s handling the release? Aspyr has done the Mac releases starting with Civ IV, and I would assume they will continue to do so. The recent spate of Mac releases Apple has been trumpeting have been handled by the original publishers rather than being outsourced ports, but Take-Two (who owns Firaxis) hasn’t been one of them. But to me it would be a sign of increased commitment if they were to handle publishing duties themselves, even if they outsource the actual work of the port itself.
I have to think Apple's objective is a state where major game engines, whether internal or 3rd party, can at the very least produce a non-optimized build for Apple Silicon targets. Even if final polish/test/release is a porting house's responsibility, continuous integration by the primary game studio will be the most assured path toward day-and-date releases with consoles and Windows.
 

Chris FOM

Ars Legatus Legionis
10,001
Subscriptor
I have to think Apple's objective is a state where major game engines, whether internal or 3rd party, can at the very least produce a non-optimized build for Apple Silicon targets. Even if final polish/test/release is a porting house's responsibility, continuous integration by the primary game studio is the best path toward day-and-date releases with consoles and Windows.
Agreed. To expound on that further: with very few exceptions, basically nobody is coding to the bare metal anymore. From below, the complexity of modern systems means it’s far more difficult to get that low, while from above, modern development tools have gotten so good it’s much harder to improve on their output (plus they’ve gotten so powerful it’s simply not as necessary to extract every single iota of efficiency). On top of that, the need to target as many platforms as possible means that highly portable code is much more important than it used to be. The end result is that current big-budget game development is all about engines and middleware, not bespoke tailoring. The benefit there is that getting the engine running on Apple’s platforms is the hard part, but once it does, any particular game afterwards is a much lower relative hurdle.

Which of course means that good relationships with the guys that make the engines are critical. It’s a good thing that Apple hasn’t gone out of their way to repeatedly antagonize the maker of the biggest, most important game engine in the entire industry.
 

japtor

Ars Legatus Legionis
13,043
This is when we'll know the Mac has reached the main-stage of gaming: When we too can be enthralled by vague promises by companies like Ubisoft, only for them to then immediately lay off everyone who might have been responsible for whatever was promised.
How about Kickstarter projects quietly canning the Mac (and Linux) version as a footnote while announcing the release of other versions?

View: https://www.kickstarter.com/projects/1598858095/system-shock/posts/4106201

Is System Shock still coming to MacOS and Linux?

Unfortunately no, plans for MacOS and Linux releases of System Shock have been shelved.
Course looking at the campaign those promises were made all the way back in 2016. Things sure have changed since then!