
Thread: The Xbone

  1. #241
    The Silent Photographer Zeph's Avatar
    Join Date
    Sep 2006
    Posts
    4,886

    Re: The Xbone

That's not how it works with games now, and it would actually be a step back.
It's pretty much all or nothing: commit everything to the cloud (OnLive failed miserably, by the way) or keep it all on the local client.
It's not about where the processing power is, it's about how long it takes to reach it. That round-trip time is crucial.
OnLive got its IPO funding only because of its strict testing environment, which pretty much amounted to requiring your ISP to have direct backbone access to wherever their datacenters were.
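A quick back-of-envelope sketch in Python of why that time is crucial (frame rate and round-trip values are illustrative, nothing more):

```python
import math

# Back-of-envelope: at 60 fps a frame lasts ~16.7 ms, so any work you
# send over the wire has to fit its round trip inside that window.
FRAME_RATE = 60
FRAME_BUDGET_MS = 1000.0 / FRAME_RATE  # ~16.7 ms per frame

def frames_of_lag(rtt_ms):
    """How many whole frames a given round-trip time costs."""
    return math.ceil(rtt_ms / FRAME_BUDGET_MS)

# A 5 ms LAN hop fits inside one frame; a 60 ms internet round trip
# already costs four frames before the server does any work at all.
print(frames_of_lag(5), frames_of_lag(60))  # 1 4
```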


Learn some C++ and start digging into an engine to get a better idea of how a game actually runs; Unity or CE3 should do.
As it stands, you say "cloud" without really comprehending the mechanics behind it.

  2. #242
    Posts, posts EVERYWHERE! Warsaw's Avatar
    Join Date
    May 2007
    Location
    State of Pandemonium
    Posts
    8,647

    Re: The Xbone

    Wall-o-text incoming:

Microsoft's new XBL cloud is a large number (300,000+) of servers built on the Azure framework (I assume you know what that is). They host your typical XBL features but are supposedly also capable of doing pretty much anything a developer gets Microsoft's permission to do, e.g. streaming content.

Events need to be synchronized between client and server, and you need a system in place to make the effects of latency invisible to the client. This generally involves compression and minimizing the amount of data that needs to be transmitted and synchronized, by choosing what gets computed where. OnLive chose to have everything about the game computed on their side, so all the user received were the audio and video streams, while all OnLive had to receive were user inputs and ID checks. What Microsoft is proposing is that developers can split the computing task between two machines by intelligently choosing which tasks can be done remotely with minimal impact from latency.
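A toy Python sketch of that split (the task names and latency-tolerance flags are invented for illustration, not Microsoft's actual API):

```python
# Tag each per-frame task with whether it can tolerate a round trip,
# then partition the frame's work between local hardware and the cloud.
TASKS = {
    "player_input":   {"latency_tolerant": False},  # must stay local
    "hit_detection":  {"latency_tolerant": False},
    "ambient_ai":     {"latency_tolerant": True},   # a few frames late is fine
    "physics_debris": {"latency_tolerant": True},
    "lighting_bake":  {"latency_tolerant": True},
}

def partition(tasks):
    """Split tasks into (local, remote) lists by latency tolerance."""
    local = [name for name, t in tasks.items() if not t["latency_tolerant"]]
    remote = [name for name, t in tasks.items() if t["latency_tolerant"]]
    return local, remote

local, remote = partition(TASKS)
print("local :", local)   # input and hit detection stay on the console
print("remote:", remote)  # slow-changing work is a candidate for the cloud
```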

Did I miss anything? It's straightforward enough. There are still details of what to store where and which side performs which task, but that all depends on what you're trying to do.

I'm familiar with C++ and Python by necessity (yay school), enough to follow what I'm looking at, though it's not my forte. I actually do understand the basics of how a current game works and is drawn. What I'm getting at, though, is that what you know about how a game currently works is not necessarily applicable to a game tailored to a hybrid remote-local computing solution.

You could all but forget things like LOD if you offload to remote render farms: your geometry gets calculated and rendered remotely with identifiers, so your machine knows which textures to employ where and with what lighting, and it can paint by numbers when it receives the feed. It's 3D on the server's end, but all your machine is doing is largely 2D work. By removing the colour information from the transmission, you cut down on packet size; you could even let the 3D world have a low resolution and use the identifiers and 2D painting to mask it, with the player none the wiser.

It's a new concept (and I may actually have it backwards as to which side does what), and that's really what makes this exciting: it's new territory. There is nothing on the market, to my knowledge, that can currently take advantage of this hybrid solution out of the box. Microsoft doesn't even have a system in mind, though they were the ones who said it was capable of letting developers use their servers to offload some of the work. And what's even better is that what they suggest we can do with it now is not all we will ever be able to do; the console hardware is stuck at what it is, while internet connections and the Xbox Live back-end can and will keep improving.
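A rough Python sketch of the paint-by-numbers idea (the palette, the grid, and the byte counts are all made up for illustration):

```python
# The server streams a small grid of material identifiers instead of
# full colour; the client looks up colour locally and "paints" the frame.
PALETTE = {1: (120, 90, 40), 2: (30, 140, 30)}  # material id -> local RGB

server_frame = [  # what the server transmits: one ID byte per cell
    [1, 1, 2],
    [1, 2, 2],
]

def paint(frame, palette):
    """Client side: turn a grid of IDs into a grid of RGB pixels."""
    return [[palette[cell] for cell in row] for row in frame]

image = paint(server_frame, PALETTE)

# 6 cells * 1 byte of ID beats 6 cells * 3 bytes of raw RGB: a 3x saving
# before any compression, which is the packet-size win described above.
print(image[0])
```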

    Really, the Xbox One is a much more exciting package than the PS4. Sure, Sony could theoretically do all of the above but from what I've read, they don't already have the software and hardware infrastructure in place to do anything like what Microsoft is suggesting the Xbox One can do. I, like you, prefer to have my games all rendered and stored locally, but that doesn't make the potential any less cool. Yes, it's more efficient to render everything server-side, but I think the hybrid idea they are putting forth is a way to get around the supposed bandwidth issue (also local storage/horsepower issues) and to slowly ease people into a world where all of their software is provided as a service. While I'm appalled at the latter thought, the former is neat.

By the way, OnLive also didn't really fail because of bandwidth issues (I tried it; it worked just fine). It failed because it didn't offer people anything they couldn't already do, at a price that made it worth switching. It wasn't convenient enough. Anybody who could afford a sub to OnLive, and didn't want to play on PC, already had a console with a larger library of games and no recurring fee. It could have had a promising future on handheld mobile devices if carriers hadn't all priced their data rates through the stratosphere; I would have subbed for that (though I'd rather be able to stream games from my already-capable home PC), and I loathe subscription-based business models.

  3. #243
    Senior Member =sw=warlord's Avatar
    Join Date
    Jan 2007
    Location
    Dalek Crucible
    Posts
    5,331

    Re: The Xbone

There was a time when people thought it wasn't feasible to put the memory controller on the CPU, and there was also a time when people thought offloading video processing to a dedicated controller wasn't worth it.
Both of those assumptions have since been proven wrong, so I'm reasonably sure that as connections speed up, offloading work to the "cloud" will become worth doing more and more.

  4. #244

    Re: The Xbone

    It's 3D on the server's end, but all your machine is doing is largely 2D work. By removing the colour information from the transmission, you cut down on packet size; you could even let the 3D world have a low resolution and use the identifiers and 2D painting to mask it with the player none the wiser. It's a new concept


You lose too much information if all you're doing is sending 2D data to the client. Paint-by-numbers isn't going to look good, and you might as well just do the painting on the server, at which point it becomes another streaming service like OnLive. Everything is pushing toward higher resolutions; why would you want to make it lower? It will probably end up worse than the current generation.

  5. #245

    Re: The Xbone

    Unless I can play without an always-on audio/visual recording device pointed at me I still won't even consider buying one.

  6. #246

    Re: The Xbone

    ( ͡° ͜ʖ ͡°)
    Last edited by DEElekgolo; June 26th, 2013 at 09:07 PM. Reason: ( ͡° ͜ʖ ͡°)

  7. #247
    Posts, posts EVERYWHERE! Warsaw's Avatar
    Join Date
    May 2007
    Location
    State of Pandemonium
    Posts
    8,647

    Re: The Xbone

    @Freelancer:
    Well me neither, but still.

    @Skyline:
Because your local hardware can't necessarily do high-resolution geometry AND high-resolution textures AND high-fidelity lighting, that's why. The numbers can be packed more or less densely, and you'd have an algorithm that smooths out the final image. The whole point is that you transmit only the bare minimum information needed to draw that 2D image, but you first need to calculate the detailed image before you can decide where to put your info points. This is why I said voxels (which I still maintain are an illusion) might become a thing again. I've seen voxel animation, and the only reason it looked bad was that the game came out in the mid-90s and PCs simply didn't have the horsepower. We now have the horsepower.
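Rough Python sketch of the "more or less densely packed numbers plus smoothing" idea, in one dimension (the values are invented; a real image would do this over a 2D grid):

```python
# Only every other sample gets transmitted; a cheap local pass
# interpolates the gaps so the final picture looks continuous.
coarse = [10, None, 30, None, 50]  # None = sample not transmitted

def smooth(samples):
    """Fill each missing sample with the average of its neighbours."""
    out = list(samples)
    for i, value in enumerate(out):
        if value is None:
            out[i] = (out[i - 1] + out[i + 1]) // 2
    return out

print(smooth(coarse))  # [10, 20, 30, 40, 50]
```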

  8. #248
    A Loose Screw Phopojijo's Avatar
    Join Date
    Dec 2006
    Location
    Ontario, Canada
    Posts
    2,749

    Re: The Xbone

    Going back to "cloud processing", you could also do things like re-render cubemaps or lightmaps while destruction happens and blast them out to all applicable clients before they're needed.
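A toy Python sketch of that push model (class and method names are invented; this is just the shape of the idea, not any real Azure API):

```python
# When destruction invalidates a lightmap, the server re-bakes it once
# and pushes the result to every subscribed client before it's needed.
class Client:
    def __init__(self):
        self.lightmap_cache = {}  # area_id -> baked version

    def receive_lightmap(self, area_id, version):
        self.lightmap_cache[area_id] = version

class CloudBaker:
    def __init__(self):
        self.clients = []
        self.versions = {}

    def subscribe(self, client):
        self.clients.append(client)

    def on_destruction(self, area_id):
        # One server-side bake...
        self.versions[area_id] = self.versions.get(area_id, 0) + 1
        # ...broadcast to all applicable clients ahead of need.
        for client in self.clients:
            client.receive_lightmap(area_id, self.versions[area_id])

server = CloudBaker()
a, b = Client(), Client()
server.subscribe(a)
server.subscribe(b)
server.on_destruction("warehouse")  # both clients now hold the new bake
```

The point of the shape: the expensive bake happens once, server-side, and fan-out to clients is just data transfer.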

The problem is that most of this stuff could just have been crammed into the spare cycles of a few light-load frames. I don't know...

  9. #249
    Posts, posts EVERYWHERE! Warsaw's Avatar
    Join Date
    May 2007
    Location
    State of Pandemonium
    Posts
    8,647

    Re: The Xbone

It's all about the detail. If you cram it into the spare cycles of a light-load frame, you have to make do with whatever elbow room you have at that moment, so you lower your fidelity target as a precaution against overreaching. If you offload it, latency becomes your main concern, but latency is far more consistent than the local hardware's dynamic load. You can therefore potentially set your targets higher and even scale them with the connection speed.
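Toy Python sketch of that trade-off (all thresholds and numbers invented): local headroom has to be planned around the worst recent frame, while a steady round-trip cost can be budgeted against exactly:

```python
# Local spare cycles vary frame to frame, so the fidelity target must
# be conservative; network latency is roughly constant, so the server
# budget is dependable once the round trip is known.
def local_budget_ms(spare_ms_history):
    """Safe local headroom: the worst recent frame sets the target."""
    return min(spare_ms_history)

def remote_budget_ms(rtt_ms, server_ms_available=50, rtt_limit_ms=100):
    """If latency is acceptable, the whole server allotment is usable."""
    return server_ms_available if rtt_ms < rtt_limit_ms else 0

print(local_budget_ms([6, 2, 5, 1]))  # 1 ms of safe local headroom
print(remote_budget_ms(60))           # 50 ms of dependable server time
print(remote_budget_ms(150))          # 0: too far away, keep it local
```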

  10. #250
    Neanderthal Dwood's Avatar
    Join Date
    Sep 2008
    Location
    Wouldn't u like to know?
    Posts
    4,186

    Re: The Xbone

Personally, I think the required bandwidth will only really be sustainable on ridiculous services like Google Fiber. HOORAY Google Fiber.
