Patent document showcases the cloud-only streaming Xbox console that never was

Really shows how bad Microsoft is as an org if they couldn't figure out how to release a $100 streaming box.
It's more likely that they realized that similar efforts like Stadia and Luna have been massive face-plants for those companies. Show me a successful streaming-only gaming console without significant local compute.

Streaming gaming really blows in terms of compressed visual quality and latency for many titles. This is to say nothing even of congestion as all those cloud GPUs get constantly eaten by AI use cases (yes yes I know they're not equivalent NVIDIA product lines, but in many cases they're competing for similar data-center quality chips and NVIDIA can only make so many GPUs so fast).

Even at 1080p it's trivial to see the compression without trying in their current best solution. Compression/bandwidth utilization might get better but latency and the resulting janky feeling is a really hard problem to crack reliably for a large section of the populace who don't have rocket-fast Fiber to the premises and live near major peering areas.
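For a sense of scale, here's a quick back-of-envelope sketch (Python; the 15 Mbps service bitrate is an assumed illustrative figure, not any provider's actual number):

```python
# Back-of-envelope: how aggressively a 1080p60 game stream must be compressed.
# The 15 Mbps figure is an assumed typical streaming bitrate, not a quoted spec.

width, height, fps = 1920, 1080, 60
bits_per_pixel = 24                      # 8 bits per RGB channel, uncompressed

raw_bps = width * height * bits_per_pixel * fps
stream_bps = 15e6                        # assumed 15 Mbit/s service bitrate

ratio = raw_bps / stream_bps
print(f"Raw video: {raw_bps / 1e9:.2f} Gbit/s")
print(f"Compression ratio needed: {ratio:.0f}:1")
```

Roughly a 200:1 compression ratio in real time, which is why the artifacts are so easy to spot.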
 
Upvote
84 (91 / -7)

HuntingManatees

Smack-Fu Master, in training
81
It's more likely that they realized that similar efforts like Stadia and Luna have been massive face-plants for those companies. Show me a successful streaming-only gaming console without significant local compute.

Streaming gaming really blows in terms of compressed visual quality and latency for many titles. This is to say nothing even of congestion as all those cloud GPUs get constantly eaten by AI use cases.

Even at 1080p it's trivial to see the compression without trying in their current best solution. Compression/bandwidth utilization might get better but latency and the resulting janky feeling is a really hard problem to crack reliably for a large section of the populace who don't have rocket-fast Fiber to the premises and live near major peering areas.
I gave Luna a whirl back when it came out, since they had "Control" in their library and my PC hardware at the time would have exploded trying to run that game. Graphics quality was terrible, but at least in my case the latency wasn't too much of an issue. Picture and connection aside, the deal-killer really was that a few months' subscription fees started approaching "that's about the price of an RTX, dummy" territory.

My sense is that Microsoft passing on cloud gaming is somewhat of the final say on the matter, since they're probably the company best-positioned to succeed in the cloud gaming market given their decades of console expertise and massive Game Pass subscriber base.
 
Upvote
53 (56 / -3)
It's more likely that they realized that similar efforts like Stadia and Luna have been massive face-plants for those companies. Show me a successful streaming-only gaming console without significant local compute.

Streaming gaming really blows in terms of compressed visual quality and latency for many titles. This is to say nothing even of congestion as all those cloud GPUs get constantly eaten by AI use cases (yes yes I know they're not equivalent NVIDIA product lines, but in many cases they're competing for similar data-center quality chips and NVIDIA can only make so many GPUs so fast).

Even at 1080p it's trivial to see the compression without trying in their current best solution. Compression/bandwidth utilization might get better but latency and the resulting janky feeling is a really hard problem to crack reliably for a large section of the populace who don't have rocket-fast Fiber to the premises and live near major peering areas.
Nvidia GeForce Now reportedly has decent latency with a wired connection. Even John Linneman from Digital Foundry approves. The issue as I see it is that cloud streaming is not particularly economical for the platform holders. Running servers isn't cheap!
 
Upvote
34 (34 / 0)
It's more likely that they realized that similar efforts like Stadia and Luna have been massive face-plants for those companies. Show me a successful streaming-only gaming console without significant local compute.

Streaming gaming really blows in terms of compressed visual quality and latency for many titles. This is to say nothing even of congestion as all those cloud GPUs get constantly eaten by AI use cases (yes yes I know they're not equivalent NVIDIA product lines, but in many cases they're competing for similar data-center quality chips and NVIDIA can only make so many GPUs so fast).

Even at 1080p it's trivial to see the compression without trying in their current best solution. Compression/bandwidth utilization might get better but latency and the resulting janky feeling is a really hard problem to crack reliably for a large section of the populace who don't have rocket-fast Fiber to the premises and live near major peering areas.
Stadia worked very well, even in a relatively rural area. I notice ping fluctuations as low as ~20-30ms in games I play, so I feel confident that I would have noticed any significant latency.

It flopped because of the payment/ownership model. Pretty much all of these companies fail miserably on some metric or other that is critical to consumers.

The basic idea is perfectly sound. There is definitely a market. Companies simply have yet to figure out a profitable business model.
 
Upvote
9 (15 / -6)
It's more likely that they realized that similar efforts like Stadia and Luna have been massive face-plants for those companies. Show me a successful streaming-only gaming console without significant local compute.

blah blah...

GeForce Now.

The problem with interactive game streaming is really three-fold, and two of the three have to do with network quality: latency and packet loss. The third is cost: once you get into the $100-$150 price range you're really cutting to the quick. NOT processor power, but system RAM. If you take a close look at Nvidia's solution, systems as far back as 2009 can be used, so it's not really a question of the CPU; it's the cost of the RAM and the system as a whole, the SoC. Sure, once you get into 4K territory the processing goes up, but that's mostly handled server-side: so long as your GPU can output a 4K stream at 60 FPS you're golden. The RAM requirements, though, go up with the square of the viewing dimensions, and remote compute doesn't significantly decrease how much RAM you need to display the image itself. It just mitigates the RAM needed for the compute part (anti-aliasing, for example).

You also GREATLY increase the quality of the network connection needed. Almost no ISPs can reliably handle interactive 2K+ streaming even on gigabit connections, because their networks are almost entirely tuned for burst traffic (web browsing and traditional MMOs), non-interactive streaming (Netflix, file downloads, etc.), and to a lesser degree latency (VoIP; buffer bloat is becoming less of an issue thanks to COVID and Zoom). Once you get into interactive setups, you can't have ANY significant packet loss (0.1% or more) or retransmits in either direction along the entire length of the pipe, or you run into problems. I can't tell you how many nights I've sat at my computer, started noticing problems in a certain interactive world, pulled up a network traffic diagnostic, and found a measurable percentage of packet loss between me and the remote server. Packet loss is a complex problem; no single cause accounts for ALL cases of it. And that's in a city with a significant data-center presence, so the ISP coverage in and out of the region is usually pretty good.

But again, it's not the kind of network designed for interactive back-and-forth. Traditional Internet usage patterns are forgiving of packet loss thanks to the design of TCP; even VoIP and video conferencing don't have the tight tolerances that interactive streaming gaming requires, and they can recover from mild packet loss. Remote-compute interactive gaming is definitely not something the current Internet is designed to handle, because it can't tolerate packet loss (or jitter, which is a different kettle of fish) very well.
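To put numbers on the RAM point, a rough sketch of just the raw display buffers (decode buffers, OS, and client overhead add considerably more; the triple-buffered RGBA layout is an assumed common setup, not any specific console's):

```python
# Sketch: client-side framebuffer memory as a function of output resolution.
# Pixel count (and thus buffer RAM) quadruples from 1080p to 4K, which is why
# even a "thin" streaming box can't skimp on memory as output targets scale up.

def framebuffer_bytes(width, height, bytes_per_pixel=4, buffers=3):
    """RAM for a triple-buffered RGBA framebuffer (an assumed common setup)."""
    return width * height * bytes_per_pixel * buffers

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    mb = framebuffer_bytes(w, h) / 2**20
    print(f"{name}: {mb:.1f} MiB")
```

The display buffers alone quadruple from 1080p to 4K, and the decode pipeline scales right along with them.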
 
Upvote
20 (27 / -7)
It's more likely that they realized that similar efforts like Stadia and Luna have been massive face-plants for those companies. Show me a successful streaming-only gaming console without significant local compute.

Streaming gaming really blows in terms of compressed visual quality and latency for many titles. This is to say nothing even of congestion as all those cloud GPUs get constantly eaten by AI use cases (yes yes I know they're not equivalent NVIDIA product lines, but in many cases they're competing for similar data-center quality chips and NVIDIA can only make so many GPUs so fast).

Even at 1080p it's trivial to see the compression without trying in their current best solution. Compression/bandwidth utilization might get better but latency and the resulting janky feeling is a really hard problem to crack reliably for a large section of the populace who don't have rocket-fast Fiber to the premises and live near major peering areas.
Not to mention the data consumption. I've already resorted to connecting my living room tv to my cellular hotspot to avoid going over my cox data cap. Going to cloud gaming would mean nearly doubling the cost of my internet connection to add on unlimited data.
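The arithmetic here is rough but telling (assumed figures: a ~1.25 TB monthly cap and a 15 Mbps stream; neither is Cox's actual number):

```python
# Rough arithmetic on cloud gaming vs. a typical data cap. The 1.25 TB cap
# and 15 Mbps stream are assumed illustrative figures, not any ISP's terms.

cap_gb = 1280                 # assumed ~1.25 TB monthly cap, in GB
stream_mbps = 15              # assumed game-stream bitrate

gb_per_hour = stream_mbps * 3600 / 8 / 1000   # Mbit/s -> GB per hour
hours_to_cap = cap_gb / gb_per_hour
print(f"{gb_per_hour:.2f} GB/hour -> cap reached after ~{hours_to_cap:.0f} hours")
```

Call it ~190 hours, a bit over six hours of play per day before hitting the cap, and that's before any other household streaming.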
 
Upvote
17 (20 / -3)
Stadia worked very well, even in a relatively rural area. I notice ping fluctuations as low as ~20-30ms in games I play, so I feel confident that I would have noticed any significant latency.

It flopped because of the payment/ownership model. Pretty much all of these companies fail miserably on some metric or other that is critical to consumers.

The basic idea is perfectly sound. There is definitely a market. Companies simply have yet to figure out a profitable business model.
I can also attest to Stadia working well. I played Assassin's Creed Odyssey during the Stadia beta and it played extremely well at 1920x1080 on a 125Mbit line.

When the beta ended, they gave you the local game, and it looked almost the same playing locally.

I was technically impressed, but then as you note: the ownership and pricing model were atrocious. I was never a paying customer.

I think if they did "all you can eat" for a flat subscription fee like game pass, they probably would have done well.
 
Upvote
9 (10 / -1)
If they just dropped the price of the Series S to about $150-200 Microsoft would capture most of the same audience that would have bought this.
Likewise, if Ford just dropped the price of the F150 to $25,000, they could capture most of the market looking for low-cost trucks. It can be done, but it wouldn't be financially responsible.

 
Upvote
29 (35 / -6)

goddog

Ars Scholae Palatinae
655
Subscriptor++
I'm still of the opinion that fragmenting their own minimum specs at generation launches, whether the 360 with no HDD or this gen with lower-spec hardware, is a bad idea. It hampers development of the games that make the jump between generations seem important, and negates the benefit of consoles being a single-configuration target. Late-gen boosts I'm more willing to see as a benefit, but even there I don't think they're necessary.
 
Upvote
7 (9 / -2)
It's more likely that they realized that similar efforts like Stadia and Luna have been massive face-plants for those companies. Show me a successful streaming-only gaming console without significant local compute.

Streaming gaming really blows in terms of compressed visual quality and latency for many titles. This is to say nothing even of congestion as all those cloud GPUs get constantly eaten by AI use cases (yes yes I know they're not equivalent NVIDIA product lines, but in many cases they're competing for similar data-center quality chips and NVIDIA can only make so many GPUs so fast).

Even at 1080p it's trivial to see the compression without trying in their current best solution. Compression/bandwidth utilization might get better but latency and the resulting janky feeling is a really hard problem to crack reliably for a large section of the populace who don't have rocket-fast Fiber to the premises and live near major peering areas.
Even streaming on a local network with Xbox Remote Play is garbage imho. Any detail in dark scenes is crushed into an inscrutable black glob of pixelation and lag is noticeable. I haven't checked in on the quality in a while, but it feels like Microsoft has completely stalled on game streaming while they get distracted by the next shiny thing.
 
Upvote
-3 (9 / -12)
I can also attest to Stadia working well. I played Assassin's Creed Odyssey during the Stadia beta and it played extremely well at 1920x1080 on a 125Mbit line.

When the beta ended, they gave you the local game, and it looked almost the same playing locally.

I was technically impressed, but then as you note: the ownership and pricing model were atrocious. I was never a paying customer.

I think if they did "all you can eat" for a flat subscription fee like game pass, they probably would have done well.
I would add to all your points.
At the time I only had a business laptop, and Stadia worked very well on it indeed.
The problem was the business and ownership model.
I ended up buying a Series S.
 
Upvote
3 (4 / -1)
If they just dropped the price of the Series S to about $150-200 Microsoft would capture most of the same audience that would have bought this.
So... nobody? I mean, the data already shows how badly MS is doing in console sales and GP subs. Good decision on their part to can this.
 
Upvote
-10 (2 / -12)

Emon

Ars Praefectus
3,921
Subscriptor++
Nvidia GeForce Now reportedly has decent latency with a wired connection. Even John Linneman from Digital Foundry approves. The issue as I see it is that cloud streaming is not particularly economical for the platform holders. Running servers isn't cheap!
GFN is fantastic with a good connection and decent location. I'm in Seattle, connect to the Portland server, and my pings with wired ethernet are consistently 7-9ms. On wifi it'll be 10-12 with occasional stuttering up to 25 then back down. This is at 3360x1440 and 120 fps. Visual fidelity is excellent, even in dark areas.

Microsoft first experimented with streaming a LONG time ago; their first prototype was literally modified, rack-mounted Xbox 360s. They should have been spending their money building out service locations and beating everyone else to it. Instead, Nvidia now owns it all.

It really is all about latency. It's literally the laws of physics that are the ultimate limitation here, so the service side needs to be really good. You cannot just optimize away these problems with clever software. You need physical presence. That whole c thing is something we can't change, ya know.
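A sketch of that physics floor (the city distances are rough straight-line figures, and the 0.67 fiber factor is an approximation):

```python
# The "c thing": minimum round-trip time imposed by light in fiber.
# Light in glass travels at roughly 2/3 of c, so distance alone sets a floor
# on ping before any routing, queuing, or encode/decode delay is added.

C = 299_792.458          # speed of light in vacuum, km/s
FIBER_FACTOR = 0.67      # approximate refractive-index slowdown in fiber

def min_rtt_ms(distance_km):
    """Lower bound on round-trip latency over fiber, ignoring all processing."""
    return 2 * distance_km / (C * FIBER_FACTOR) * 1000

# Rough straight-line distances, for illustration only
for route, km in [("Seattle->Portland", 280), ("Seattle->Chicago", 2790)]:
    print(f"{route}: >= {min_rtt_ms(km):.1f} ms")
```

So a nearby data center leaves comfortable headroom under a 7-9 ms ping, while a cross-country hop burns more than a full 60 fps frame (16.7 ms) before any encoding even happens. Hence the need for physical presence.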
 
Upvote
10 (13 / -3)

cks445

Smack-Fu Master, in training
62
Subscriptor
Not to mention the data consumption. I've already resorted to connecting my living room tv to my cellular hotspot to avoid going over my cox data cap. Going to cloud gaming would mean nearly doubling the cost of my internet connection to add on unlimited data.

So, you're saying that ISPs ought to give this away to their customers, to increase telco profits....
 
Upvote
-1 (1 / -2)

Eredus

Smack-Fu Master, in training
65
Subscriptor++
I've been trying GeForce NOW on and off for some time now. It's OK for games like Diablo 4, Age of Empires, and single-player titles (Starfield, Cyberpunk).

This was using the Ultimate membership, with 1-gig FiOS and maxed settings, including a 120Hz monitor to use their 120Hz option.

The input lag is noticeable, but negligible given the nature of those games.

I even enjoyed some Battlefield and CoD with it, but for those the input lag, even at 120fps/120Hz, was a bit much for me.

It was UN-playable (at any casually competitive level, not even actual comp) for a title like CS2, again with 120/120, IMO -- like an uber laggy wireless mouse/kb. I don't see it ever working for these kinds of games, and even CoD and Battlefield honestly, it's just fun to try with all the Ultra High graphics settings if you don't personally have a 5k+ gaming rig already.

I could see it working well perhaps for console gamers who do not ordinarily glean much advantage from high RR and may not notice the input lag on a TV/controller. Or, if they are able to do some client side magic akin to how the netcode became visually client side for these games (compared to say, CS 1.6/HL1/TFC with like 120ms ping, where you had to lead shots).

But all that aside, on rare occasions when my latency would be > 5ms, it would seem totally unusable.
 
Upvote
5 (6 / -1)
Even streaming on a local network with Xbox Remote Play is garbage imho. Any detail in dark scenes is crushed into an inscrutable black glob of pixelation and lag is noticeable. I haven't checked in on the quality in a while, but it feels like Microsoft has completely stalled on game streaming while they get distracted by the next shiny thing.
That's because interactive game streaming isn't a significant market. It's very niche and really doesn't bring a value proposition to most gamers even if you ignore the infrastructure issues related to poor data services. Microsoft knows this. The value prospect for consumers just isn't there if the hardware is significantly more than $100 considering their primary competition would likely be Nvidia and GeForce Now (which runs on that old PC in the basement), not Sony.
 
Upvote
7 (7 / 0)

Hichung

Ars Praetorian
455
Subscriptor++
So, you're saying that ISPs ought to give this away to their customers, to increase telco profits....

Not sure if this is /s but I’ll assume not. Cox uses caps to “fix” congestion because they sucked at building out their network for the last couple of decades.

But hey, throw over another $50 for unlimited and suddenly there's no network congestion to your home, apparently; just to the neighbors who won't fork out more money.

Fuck Cox.

In the decade since I moved, the place I moved from has had two fiber providers offering service to my old address (1Gbps+), while Cox in the same time has lowered their upload speed and offered "powered by fiber" bullshit.
 
Upvote
7 (7 / 0)
Right, which is why I'm shocked they couldn't get the pricing right. Seems like it should be doable with a controller for $125-150, with all the Roku-like stuff built in.

Phil Spencer (or maybe someone else at Microsoft) has talked about this. The problem they had is the Xbox OS is designed for an AMD APU, and AMD didn't have an APU inexpensive enough (and with low enough power consumption) to make a very low cost streaming box.

They could have gone and purchased another chip architecture from someone else (at significant expense unless it's completely off-the-shelf) but then they have the issue of no OS to run on it. Sure they could develop an OS (e.g. Linux or try and adapt Windows on ARM) but once again that involves a significant investment which doesn't make economic sense for an inexpensive set-top box while they need to continue to support and develop Xbox OS for the mainline consoles.

Now, there are ways this could still happen one day. Microsoft could choose a different partner for the next-gen Xbox, and part of that selection could be ensuring their partner has both a high-end and an inexpensive low-end chip which could run the same OS. Then, while it might be costly to port Xbox OS (to ARM or whatever they select) at least they'd get economies of scale running the same core operating system across their entire family of consoles, from a low end streaming box to a higher end traditional console.
 
Upvote
13 (13 / 0)
It's more likely that they realized that similar efforts like Stadia and Luna have been massive face-plants for those companies. Show me a successful streaming-only gaming console without significant local compute.
I can't speak so much to Luna, but I think it was pretty clear Stadia was going to face-plant for many reasons, most notably poor offerings and a lack of trust that Google wouldn't kill it.

I personally avoided it because I haven't experienced a Google product worth paying for.

I think at $100 tops, a MS streaming box could have a chance. Their streaming service is excellent, is seamless when jumping between streaming/console/PC play, and has a pretty good catalog.

Even then, $100 would be a lot to ask since you can so easily stream MS' offerings from many devices. Really it would have to be like $50 but consider that $50 is what they charge just for a controller.
 
Upvote
4 (4 / 0)
Phil Spencer (or maybe someone else at Microsoft) has talked about this. The problem they had is the Xbox OS is designed for an AMD APU, and AMD didn't have an APU inexpensive enough (and with low enough power consumption) to make a very low cost streaming box.

They could have gone and purchased another chip architecture from someone else (at significant expense unless it's completely off-the-shelf) but then they have the issue of no OS to run on it. Sure they could develop an OS (e.g. Linux or try and adapt Windows on ARM) but once again that involves a significant investment which doesn't make economic sense for an inexpensive set-top box while they need to continue to support and develop Xbox OS for the mainline consoles.

Now, there are ways this could still happen one day. Microsoft could choose a different partner for the next-gen Xbox, and part of that selection could be ensuring their partner has both a high-end and an inexpensive low-end chip which could run the same OS. Then, while it might be costly to port Xbox OS (to ARM or whatever they select) at least they'd get economies of scale running the same core operating system across their entire family of consoles, from a low end streaming box to a higher end traditional console.
Ah, that's cool, thank you for filling in the gap.

Honestly though what they should be doing is creating a Roku app and Smart TV app in general that connects to their game pass streaming service. Sell a controller and 3 months of game pass ultimate with streaming for $65 and all you have to have is any Smart TV or Roku.

I think someone else identified the real problem, which is that the economics of streaming just aren't adding up. Xbox server hardware sitting in a data center is expensive, especially the electric bill. They can't provide all the games and the console for only $15 a month at any kind of scale. At least that's my guess at the biggest problem.
 
Upvote
2 (3 / -1)
I'm not looking forward to the day this becomes economical. Most games aren't PC exclusives, which means a lot of what we get is designed around the lowest common denominator of games console. When game streaming becomes the norm, that means console ports designed around streaming's limitations like bad worst-case input latency, just like locked-30fps and bad KBM support used to be common.

That also means games designed around everyone being logged into the same mind-bogglingly massive render farm. Flight Simulator is an early taste of this; you can fly most anywhere you want in impossible detail because Microsoft will stream high-resolution maps of the entire fucking planet to you from Azure in a way that just wouldn't be practical to pre-install on your hard drive. Games designed around those capabilities will be hard enough to port that the PC version will probably be "just stream it on Nintendo Netflix, nerd."
 
Upvote
-3 (2 / -5)

Ianab

Wise, Aged Ars Veteran
119
Nvidia GeForce Now reportedly has decent latency with a wired connection. Even John Linneman from Digital Foundry approves. The issue as I see it is that cloud streaming is not particularly economical for the platform holders. Running servers isn't cheap!
Agreed. The platform has to buy the CPU/GPU capacity, pay for the electricity, AND cover the streaming bandwidth. I can't see how the business model really works out, compared to having the user buy the title and then pay for all those things themselves.

I can see how data centers and economies of scale might actually make it cheaper for the cloud provider, but the average user isn't going to count the extra $10 on their power bill from the new gaming PC, and will spread the cost of the hardware over "X" years of use.
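A toy comparison of the two models (every figure here is an illustrative assumption, not real pricing from any service):

```python
# Toy comparison: amortized cost of local hardware vs. a streaming subscription.
# All figures are illustrative assumptions, not real pricing.

gpu_price = 600             # assumed mid-range GPU/console cost
lifetime_years = 5          # assumed hardware lifespan
power_cost_per_month = 10   # the extra "$10 on the power bill" mentioned above

local_monthly = gpu_price / (lifetime_years * 12) + power_cost_per_month
subscription_monthly = 20   # assumed streaming tier price

print(f"Local hardware: ~${local_monthly:.0f}/month amortized")
print(f"Streaming sub:  ~${subscription_monthly:.0f}/month")
```

With these made-up numbers the two come out about even for the consumer, except that in the streaming case the provider is also paying for the GPU, power, and bandwidth out of that same subscription.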
 
Upvote
2 (3 / -1)
Really shows how bad Microsoft is as an org if they couldn't figure out how to release a $100 streaming box.
Yeah, unlike all those other streaming devices like Stadia that just flew off the shelves, proving that market all on their own with no complaints about functionality and performance.

Wait, what's that? Most of those devices failed, and Microsoft were probably right to hold off on plans for a dedicated device and concentrate on building the market through existing platforms like non-Windows PCs and phones/tablets? Huh.
 
Upvote
1 (2 / -1)
I'm still of the opinion that fragmenting their own minimum specs at generation launches, whether the 360 with no HDD or this gen with lower-spec hardware, is a bad idea. It hampers development of the games that make the jump between generations seem important, and negates the benefit of consoles being a single-configuration target. Late-gen boosts I'm more willing to see as a benefit, but even there I don't think they're necessary.
That's a weird take. When the Xbox 360 Arcade launched, there was no requirement to install games other than digitally purchased ones. The PS3 requiring game installs from discs (not a requirement for 360 discs) was because of a hardware limitation of the PS3's Blu-ray drive (a fixed data read speed, which was slow for large data).

Even the Series S "hampering development" doesn't make much sense. For the first two years of this gen, devs were creating games around Jaguar cores and SATA HDD transfer rates, because those were the limitations of the PS4. The majority of the PS5's best games are PS4 games with pretty-mode upgrades for the PS5.
 
Upvote
9 (9 / 0)