Graphics Update, v1.1

Discussion in 'Planetary Annihilation General Discussion' started by varrak, February 20, 2014.

  1. Dementiurge

    Dementiurge Post Master General

    Messages:
    1,094
    Likes Received:
    693
    Nvidia made one mistake with the 5000 series. ATI blundered constantly after that, peaking with the HD 2000 series. It probably would have been better for AMD to let them go bankrupt and then buy the leftovers at auction, because ATI's luck seems to have been inherited by AMD, along with their enormous pile of debt.

    Between Nvidia's CUDA, 3D Vision and now G-Sync, or AMD's Mantle, I'm about ready to sit it out with Matrox. Even if they don't make GPUs.
  2. bmb

    bmb Well-Known Member

    Messages:
    1,497
    Likes Received:
    219
    It seems that AMD was the one struggling after the success of the Athlon. Intel got a head start with Core 2, another with Nehalem, and AMD just never caught up; game over. ATI seems to be carrying the business. I don't have any hard numbers on this, but that's what it looks like from the outside.

    I can't imagine them scoring all three consoles being anything less than a small miracle for the company.
    EdWood likes this.
  3. cdrkf

    cdrkf Post Master General

    Messages:
    5,721
    Likes Received:
    4,793
    You're missing a very overlooked ATI/AMD wonder card: the HD 5870. When that thing came out it was a beast, and what's really amazing is that, being fully DX11 compatible, it's still pretty much up to current spec. A friend of mine is still running one as his main card (he got it right after they came out, and it's been soldiering on all this time). There were a few benchmarks done recently comparing older-generation cards, and the 5870 is still up there (in the current AMD GPU stack it's about the same performance as a 260X). Full 1080p on high settings in all the latest games is not bad going for a card released in 2009.

    The other thing to note: Nvidia still hasn't matched the multi-screen capabilities of AMD/ATI. I do 3D design work for a living and have been building multi-screen setups for training systems. We regularly use the 7870 Eyefinity Edition with its six DisplayPort outputs. I'm not using it for the usual multi-panel wall, but rather a very bizarre mixture of independent and cloned screens (and given the slot limitations on 'standard' motherboards, being able to run it all off one graphics card is a real boon). What's quite amusing is that the configuration we've got working completely baffled AMD support; they don't realise how capable their own tech is... But I guess that's similar to the issues you're seeing with lack of developer support.
  4. thetrophysystem

    thetrophysystem Post Master General

    Messages:
    7,050
    Likes Received:
    2,874
    It was cost-effective. Poop out cheap hardware and consoles will buy it.

    One thing I will not miss about consoles as I game on my PC is their shoddy hardware. I've done so many repairs of controllers, consoles and other things, and had to fit so many sturdy RadioShack parts where the cheap plastic pieces broke.

    I wish one of the consoles had stepped up and competed on price with better hardware. Then again, it doesn't matter; even with identical specs, console fanboys will talk over you as they promote their favorite. It is IMPOSSIBLE to talk to a console fanboy about realistic specs; it's like they never consider how data is actually stored, processed and moved around to make the device function, they just chant numbers as if the numbers mean nothing.
  5. Quitch

    Quitch Post Master General

    Messages:
    5,856
    Likes Received:
    6,045
    I still run one in my rig; I got it when my 4870X2 died within warranty. Consoles have slowed the graphical upgrade cycle so much that I don't think I've run across anything it couldn't handle. If I were to upgrade, it would only be so I could throw higher-resolution modded textures at Skyrim.
    cdrkf likes this.
  6. bgolus

    bgolus Uber Alumni

    Messages:
    1,481
    Likes Received:
    2,299
    AMD "won" the console business because Intel and Nvidia intentionally avoided it.

    At the end of the original Xbox's lifespan, Microsoft was still paying the same price for most of the components as it had when the console was released. So Nvidia and Intel were selling Microsoft seven-year-old components at huge profits.

    For the Xbox 360, Microsoft made sure it owned the designs of the CPU and GPU so it could shop around for who made them, as well as modify them so they could be produced more cheaply. Intel doesn't allow people to do that, and the price Nvidia asked was too high for Microsoft, so they jumped ship to PowerPC and AMD.

    Sony had never planned to include a GPU in the PS3, as they had expected the Cell architecture to be much faster than it ended up being, so they ended up running to Nvidia at the last minute because Nvidia could provide a custom GPU in under a year.

    For this generation of consoles, both Microsoft and Sony wanted to own the design and wanted low power requirements. AMD provided solutions to both companies in the form of modifications to its existing APU line. Intel wanted to use a radically different type of architecture for a new console (Larrabee), which scared off both Sony and Microsoft, and neither Intel nor Nvidia would sell their chip designs anyway. I think there was also a lot of pressure from developers on both sides to go with an x86 CPU, so IBM's PowerPC was out too.
  7. EdWood

    EdWood Active Member

    Messages:
    533
    Likes Received:
    147
    Why not? AMD's APUs do a great job and are cost-effective. :)
    That can only be good for AMD... they need a break, and I don't want to lose AMD. Competition is good for us. :)
    The biggest joke right now is that AMD has a great line of GPUs, but instead of $350 for a 280X you have to pay around $550... not enough retail parts available. Incredible.
  8. bgolus

    bgolus Uber Alumni

    Messages:
    1,481
    Likes Received:
    2,299
    Yes, in all honesty an AMD APU is a great option. The Xbox 360 started out with separate CPU and GPU chips that Microsoft merged over time into a single chip for reduced production cost, power usage and heat output; AMD has already done that work. And AMD's graphics hardware has always been awesome; it's always been the software that's caused them issues. For the Xbox One and PlayStation 4, the software, while initially coming from AMD, is going to get a lot more hands on it and a lot more iteration, allowing access to features and performance you wouldn't be able to use on a PC.

    The Xbox 360's GPU does some amazing stuff, for example. It's technically a DX9-class GPU, but a lot of what people do with it resembles features the PC didn't see until DX11-class hardware.

    At the same time, a Larrabee console would have been an amazing thing. When I see the Killzone devs talk about "raytracing" reflections I chuckle a little; they're doing screen-space reprojection, which Crysis has been doing for a while (even on Xbox 360 and PS3). I have a different image in my head when I think of raytracing, mainly because we worked on real raytracing on Larrabee.
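    For anyone curious what that boils down to, here's a rough, CPU-side C++ sketch of the screen-space idea: reflect the view ray off the surface normal, march along it, and reproject each step into the depth buffer to see if it has passed behind stored geometry. The depth buffer, projection and scene below are made-up stand-ins, not any engine's actual code.

    ```cpp
    // CPU-side illustration of screen-space reflections: reflect the view ray off
    // the surface normal, then march the reflected ray in small steps, reprojecting
    // each step to see where it passes behind the stored depth. The helpers below
    // are placeholders, not real engine code.
    #include <array>
    #include <cstdio>
    #include <optional>

    struct Vec3 { float x, y, z; };

    Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
    float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    Vec3 reflect(Vec3 i, Vec3 n) { return i + n * (-2.0f * dot(i, n)); }

    // Placeholder depth buffer: pretend everything on screen sits at view depth 5.
    float sampleDepth(float /*u*/, float /*v*/) { return 5.0f; }

    // Placeholder projection: map a view-space point to normalized screen coords.
    void project(Vec3 p, float& u, float& v) { u = p.x / -p.z; v = p.y / -p.z; }

    // March the reflected ray in view space; return screen coords of the hit, if any.
    std::optional<std::array<float, 2>> traceReflection(Vec3 origin, Vec3 dir) {
        for (int step = 1; step <= 64; ++step) {
            Vec3 p = origin + dir * (0.1f * step);   // advance along the reflected ray
            float u, v;
            project(p, u, v);
            if (-p.z >= sampleDepth(u, v))           // ray went behind stored depth: a hit
                return std::array<float, 2>{u, v};   // reuse the color already on screen here
        }
        return std::nullopt;                         // no on-screen intersection found
    }

    int main() {
        Vec3 viewDir{0.0f, -0.3f, -1.0f};            // camera looking slightly down
        Vec3 normal{0.0f, 1.0f, 0.0f};               // an upward-facing surface
        Vec3 surfacePoint{0.0f, -1.0f, -2.0f};
        if (auto hit = traceReflection(surfacePoint, reflect(viewDir, normal)))
            std::printf("reflection samples screen at (%.2f, %.2f)\n", (*hit)[0], (*hit)[1]);
        else
            std::printf("no screen-space hit; fall back to a cube map\n");
    }
    ```

    If the march finds a hit you just reuse the color already on screen at that spot, which is also exactly why the technique falls apart for anything that isn't visible on screen.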
    http://www.polygon.com/features/2013/3/19/4094472/uber-hail-mary-monday-night-combat
    LavaSnake, EdWood and Quitch like this.
  9. bmb

    bmb Well-Known Member

    Messages:
    1,497
    Likes Received:
    219
    I can't say raytracing is a good idea for video games. It's not a good idea for commercial movies, which literally have hours to render a frame and more computing power than you can shake a stick at, so why would it ever be a good idea in realtime? It's the ugliest brute-force approach to rendering, and you can accomplish much the same thing with raster, only much faster. I see raster techniques replacing raytracing as they continue to get better and approach CG quality. Already UE4 and similar advanced engines are showing off things running in realtime with raster techniques that you couldn't have done with all the offline raytracing in the world 10 years ago.

    There are certain limitations, of course, but I would not trade the level of raster quality we have reached now for raytracing that looks worse than Toy Story and probably belongs on an Amiga.
  10. bgolus

    bgolus Uber Alumni

    Messages:
    1,481
    Likes Received:
    2,299
    UE4 is using raytracing: it's raytracing voxel data using GPU compute. I don't think pure raytracing is the way to go, and a hybrid approach like UE4's certainly seems more useful for games for the next few years, but something like Larrabee would have allowed this transition to happen much more easily.

    It should also be noted that Pixar, from the original Toy Story up until Brave, used Reyes rasterization and raycasting instead of raytracing. For Monsters University they switched to raytracing and have come to say, essentially, "we were wrong, we should have gone to raytracing years ago, it's better." Blue Sky, best known for their Ice Age movies, have been using raytracing since day one, rejecting Pixar's assertion that Reyes was the "correct" way. Similarly, ILM tossed Pixar's RenderMan some years ago and have been using Arnold (a raytracer) exclusively for some time now, as has Sony. Even Weta, who started out with RenderMan, have moved to 100% raytracing.

    Nvidia has shown off OctaneRender, OTOY's real-time GPU raytracer that runs on their hardware, doing film-quality rendering at 60 fps; granted, that was on a giant cloud server of 160 Titans, but still.

    A more real-world example is the Brigade engine, a real-time raytracer running on normal consumer-level hardware, written by one of the guys working on Octane.

    We have nearly two decades of game-related raster rendering techniques, and all our hardware has been purpose-built to accelerate them. But we're starting to reach the limit of what we can fake with raster alone. There's a lot of research left to bring our proficiency with raytracing up to where raster is now, but it'll be amazing when we get there.
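    To be concrete about terms: "pure" raytracing at its core is just one ray per pixel tested against scene geometry, with everything else (shadows, reflections, GI) built out of more rays. A toy C++ sketch, with a single made-up sphere, no shading and no acceleration structure:

    ```cpp
    // Toy raytracer: one primary ray per pixel, tested against a single sphere.
    // Real renderers add materials, acceleration structures and secondary rays
    // (shadows, reflections, GI) on top of exactly this loop.
    #include <cstdio>

    struct Vec3 {
        double x, y, z;
        Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
        double dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    };

    // Does a ray (origin o, direction d) hit a sphere of radius r centred at c?
    bool hitSphere(const Vec3& o, const Vec3& d, const Vec3& c, double r) {
        Vec3 oc = o - c;
        double a = d.dot(d);
        double b = 2.0 * oc.dot(d);
        double k = oc.dot(oc) - r * r;
        return b * b - 4.0 * a * k >= 0.0;   // quadratic discriminant test
    }

    int main() {
        const int W = 40, H = 20;
        Vec3 cam{0, 0, 0}, sphere{0, 0, -3};
        for (int y = 0; y < H; ++y) {
            for (int x = 0; x < W; ++x) {
                // Fire one ray through each pixel of a simple pinhole camera.
                Vec3 dir{(x - W / 2) / double(W), (H / 2 - y) / double(W), -1.0};
                std::putchar(hitSphere(cam, dir, sphere, 1.0) ? '#' : '.');
            }
            std::putchar('\n');
        }
    }
    ```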
    Last edited: February 22, 2014
    LavaSnake, EdWood and Quitch like this.
  11. Dementiurge

    Dementiurge Post Master General

    Messages:
    1,094
    Likes Received:
    693
    I've been curious about the difference between ray-tracing and the cone-tracing used in SVOGI. It's hard to find an easy explanation of it, unlike ray-tracing. It was immensely disappointing that UE4 abandoned SVOGI, but apparently third parties and Unity are picking up the slack. I'm not sure what UE4 is using now; maybe the same thing as Frostbite 2.

    I like the visuals Brigade resolves to, but it takes far too long to resolve in darker areas. That grittiness just isn't going to be practical.
  12. SXX

    SXX Post Master General

    Messages:
    6,896
    Likes Received:
    1,812
    Just wondering: what is your opinion on using technologies like Euclideon UD for gaming? For obvious reasons they don't have any lighting (or lighting would be expensive) or animation, but do you see any possibility of using point clouds in games?

    It would be interesting to know what you think about it now that it's not just marketing videos, but real technology that actually works.
    Last edited: February 22, 2014
  13. bmb

    bmb Well-Known Member

    Messages:
    1,497
    Likes Received:
    219
    Right, there are limited uses for raytracing in realtime, such as the global illumination in BF3. But it is still a raster game. Even GI is not necessarily best done with raytracing, as there are various techniques for faking it constantly under development. Again, it is a brute-force approach that will inevitably be replaced by a more elegant solution.
    UE4 isn't really using raytracing either, but a variant technique. The other big realtime GI engine, CryEngine, uses something that doesn't even resemble raytracing.

    That certain studios decide to waste processing power doing the same thing they could be doing with a lot less is probably more of a testament to the increasing power and decreasing cost of computers than to the "correctness" of raytracing.
    Theoretically the technology is there to render something like Monsters University in a matter of days instead of months. On a desktop machine. The only thing left is waiting for them to figure it out and put it into practice.

    A game engine is obviously a different kind of beast from an offline renderer no matter how fast, but the quality is still getting there, and it won't get there by regressing to the early 90's in terms of technology.

    It isn't a matter of raytracing needing development; it's a matter of raytracing being the most expensive, naive way to render something. Raster, Reyes, scanline and so on are the optimizations that replace it, developed for an age that needed them far more than we do now. If raytracing ever catches on, it'll only be because there is so much power to waste that it doesn't make a difference. And that won't happen in realtime.

    Consider that ambient occlusion was developed as a fast way to avoid raytracing, and that games still struggle to do even a pathetic facsimile of an effect that was made to be a cheap alternative to raytracing.
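    For context, the "real" effect being approximated is itself computed by casting rays: sample the hemisphere above a point and measure how much of it is blocked. A toy C++ sketch of that offline idea, with a single invented wall as the occluder:

    ```cpp
    // Toy "real" ambient occlusion: cast many rays over the hemisphere above a
    // point and count how many escape to open sky. Offline renderers trace actual
    // scene geometry; the single wall here is an invented stand-in.
    #include <cmath>
    #include <cstdio>
    #include <random>

    struct Vec3 { float x, y, z; };

    // Hypothetical occluder: an infinite wall at x = 0.5.
    bool rayBlocked(Vec3 dir, float maxDist) {
        if (dir.x <= 0.0f) return false;
        float t = 0.5f / dir.x;          // distance along the ray to the wall plane
        return t < maxDist;
    }

    // Fraction of the upper hemisphere that is blocked (0 = open, 1 = fully occluded).
    float ambientOcclusion(int samples, float maxDist) {
        std::mt19937 rng(42);
        std::uniform_real_distribution<float> uni(-1.0f, 1.0f);
        int blocked = 0;
        for (int i = 0; i < samples; ++i) {
            // Rejection-sample a direction, then flip it into the upper (y >= 0) hemisphere.
            Vec3 d{uni(rng), uni(rng), uni(rng)};
            float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
            if (len < 1e-4f || len > 1.0f) { --i; continue; }
            d = {d.x / len, std::abs(d.y) / len, d.z / len};
            if (rayBlocked(d, maxDist)) ++blocked;
        }
        return float(blocked) / samples;
    }

    int main() {
        std::printf("occlusion near the wall: %.2f\n", ambientOcclusion(10000, 4.0f));
    }
    ```

    Screen-space AO in games estimates this from the depth buffer instead of tracing the actual scene, which is where the "facsimile" comes in.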

    Oh they removed it? I wonder why. It couldn't possibly be performance.
    Last edited: February 22, 2014
  14. bgolus

    bgolus Uber Alumni

    Messages:
    1,481
    Likes Received:
    2,299
    I know some other developers are exploring the use of SVOGI, but it takes a lot of processing power and memory to get sharp detail. As long as you're okay with a soft look, it's still a viable technique. If I were to guess, UE4 is now using something like Crytek's Light Propagation Volumes.
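    Since the ray-tracing versus cone-tracing question came up earlier: a cone trace steps along a single direction much like a ray, but it samples a prefiltered (mipmapped) voxel volume at coarser and coarser levels as the cone footprint grows, which is exactly where the soft look comes from. A rough C++ sketch of that loop, with a made-up stand-in for the voxel volume:

    ```cpp
    // Sketch of a single voxel cone trace: step along one direction, widen the
    // footprint with distance, and sample coarser mips of a prefiltered voxel
    // grid as the cone grows. The voxel volume below is a made-up stand-in;
    // a real engine samples a mipmapped 3D texture.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

    // Stand-in for the prefiltered voxel volume: pretend there is dense geometry
    // beyond z = -4, and that higher mips blur it out.
    float sampleVoxelOcclusion(Vec3 p, float mip) {
        float distBehind = std::max(0.0f, -p.z - 4.0f);
        return std::min(1.0f, distBehind / (1.0f + mip));
    }

    // Accumulate occlusion along one cone (origin, direction, half-angle ratio).
    float coneTraceOcclusion(Vec3 origin, Vec3 dir, float coneRatio) {
        float occlusion = 0.0f;
        float dist = 0.1f;                               // start just off the surface
        while (dist < 10.0f && occlusion < 1.0f) {
            float radius = coneRatio * dist;             // cone footprint grows with distance
            float mip = std::log2(1.0f + radius * 8.0f); // wider footprint -> coarser mip
            Vec3 p = origin + dir * dist;
            float sample = sampleVoxelOcclusion(p, mip);
            occlusion += (1.0f - occlusion) * sample;    // front-to-back accumulation
            dist += std::max(radius, 0.1f);              // bigger steps as the cone widens
        }
        return occlusion;
    }

    int main() {
        Vec3 p{0.0f, 0.0f, 0.0f};
        // A wide cone gives a soft, cheap result; a narrow cone is sharper but needs more steps.
        std::printf("wide cone occlusion:   %.2f\n", coneTraceOcclusion(p, {0, 0, -1}, 0.5f));
        std::printf("narrow cone occlusion: %.2f\n", coneTraceOcclusion(p, {0, 0, -1}, 0.05f));
    }
    ```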

    The Frostbite engine stuff works by having two versions of the scene: a high-poly and a low-poly version. They render very low resolution cube maps of the low-poly world to get bounced lighting information in real time. No raytracing at all. Sharp reflections are still done with techniques we've been using since Half-Life 2. It's possible UE4 is doing something like this as well, but it's unlikely, as Epic likes making tech that allows for a very dynamic editor environment, which the Frostbite approach doesn't really allow for.
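    As a very loose sketch of that probe idea (the cube map contents here are faked; a real engine would rasterize the low-poly scene into them every frame), averaging each tiny face down gives you a directional bounced-light term per probe:

    ```cpp
    // Sketch of a low-resolution cube-map light probe: six tiny faces around a
    // probe point, each averaged down to one color that stands in for bounced
    // light arriving from that direction. The face contents are invented here.
    #include <array>
    #include <cstdio>

    struct Color { float r, g, b; };

    constexpr int kFaceSize = 4;                     // "very low resolution" faces
    using Face = std::array<Color, kFaceSize * kFaceSize>;

    // Average a whole face down to one color: the crudest possible prefilter.
    Color averageFace(const Face& face) {
        Color sum{0, 0, 0};
        for (const Color& c : face) { sum.r += c.r; sum.g += c.g; sum.b += c.b; }
        float n = float(face.size());
        return {sum.r / n, sum.g / n, sum.b / n};
    }

    int main() {
        // Six faces of one probe's cube map (+X,-X,+Y,-Y,+Z,-Z), filled with a
        // fake scene: cool sky above, a warm floor below, grey walls around.
        std::array<Face, 6> cube;
        Color wall{0.4f, 0.4f, 0.4f}, sky{0.3f, 0.5f, 0.9f}, floor{0.8f, 0.5f, 0.3f};
        for (int f = 0; f < 6; ++f) cube[f].fill(wall);
        cube[2].fill(sky);    // +Y face
        cube[3].fill(floor);  // -Y face

        // One averaged color per face ~= directional bounce light for this probe.
        for (int f = 0; f < 6; ++f) {
            Color c = averageFace(cube[f]);
            std::printf("face %d bounce: %.2f %.2f %.2f\n", f, c.r, c.g, c.b);
        }
    }
    ```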

    Euclideon is very impressive for what it is, but it's useless for games because its strengths all come from the data being 100% static. I think their recent change in direction toward large-scale LIDAR scan visualization is good for them. There's still the unsolved issue of how to do skinned characters with voxels efficiently; UE4's SVOGI and the movies that use voxel rendering techniques all take polygon meshes animated with traditional bone or morph-target animation and convert them to a static voxel representation each frame.
    LavaSnake and SXX like this.
  15. bradaz85

    bradaz85 Active Member

    Messages:
    532
    Likes Received:
    233
    Well, I just upgraded from an AMD 7970 BE to an EVGA 780 Ti Classified, and all I can say is just... WOW!
    I'm getting nearly double the performance in some games. In PA it's running very well: all settings maxed, about 110 FPS at the start, about 60 mid-game and about 40 in the late game with loads of units! That's about double the 7970. I can even use PiP without a massive hit. Very impressed; £600 well spent.
  16. tollman

    tollman Member

    Messages:
    93
    Likes Received:
    26
    What happened to the 7970? I might know some interested parties if it's going at a reasonable price :)
  17. bradaz85

    bradaz85 Active Member

    Messages:
    532
    Likes Received:
    233
    Sold it to a mate for £140 after I got it RMA'd and replaced. :)
    tollman likes this.
  18. v4skunk84

    v4skunk84 Active Member

    Messages:
    196
    Likes Received:
    64
    You can blame US retailers for artificially increasing the price of AMD cards due to mining.
    Here in Europe AMD is still way cheaper. I can get an MSI TF4 R9 290X for £120 less than the MSI TF4 780 Ti.
  19. tylerseacrest

    tylerseacrest Member

    Messages:
    56
    Likes Received:
    19
    @bmb
    First, what's your experience in graphics? I've seen a lot of posts from you on the subject.
    For reference, I do graphics programming as a hobby, and I'm currently a junior studying programming and animation.

    Pixar used raytracing in Monsters U and other films for two reasons. The first was ease of setup: setting up lighting and materials with raytracing takes very little time compared to other techniques, and compute power is far cheaper than man power. The second is memory usage: raytracers have lower memory usage than any other technique. The current top-of-the-line raytracer, Arnold, is capable of rendering billions of unique triangles with trillions of instanced triangles in the same scene on current render farms. Furthermore, Arnold is not that slow, and it isn't even using the GPU. I highly recommend looking it up.
  20. EdWood

    EdWood Active Member

    Messages:
    533
    Likes Received:
    147
    I read about the mining, but that's only one part of the story. AMD did not sell any crazy number of new video cards... obviously they did not order enough parts to satisfy the market, and obviously they completely underestimated demand, especially in a declining PC market. Just a shame; I wanted to get a 280X, but not at those prices.
