PAX Dev slides

Discussion in 'Planetary Annihilation General Discussion' started by varrak, September 2, 2014.

  1. varrak

    varrak Official PA

    Messages:
    169
    Likes Received:
    1,237
    Hey, so very quick one for all you technology junkies out there...

    I did a talk at PAX Dev (the PAX developers' conference that happens the two days before PAX Prime in Seattle). It was some general advice on optimizing PA's rendering engine (which is OpenGL-based), targeted at developers with Direct3D backgrounds (which is where I came from before Uber).

    I posted the slides here (https://www.dropbox.com/s/qjfgktemcomfriw/Optimizing OpenGL.pptx?dl=0) for anyone who is interested. I've put the talking points in the speaker notes on each slide, so they (may) make a little more sense.
    LavaSnake, Schulti, tatsujb and 26 others like this.
  2. mered4

    mered4 Post Master General

    Messages:
    4,083
    Likes Received:
    3,149
    Awesome. I'm reading it. Nice slides.
  3. feyder

    feyder Member

    Messages:
    88
    Likes Received:
    13
    Thanks for sharing these, love code and tech :D
  4. SXX

    SXX Post Master General

    Messages:
    6,896
    Likes Received:
    1,812
    Just wondering: before on the forum you mentioned that you wanted to adopt extensions that aren't yet included in the GL core. Did you mean direct_state_access back then? Also, given that it wasn't promoted to core until OpenGL 4.5, do you see any possibility of using it in PA? Thoughts?

    Or did you mean some other extensions too? Which ones?
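
    (For reference, the appeal of direct_state_access is that you can create and modify objects without binding them first. A rough side-by-side sketch, illustrative only and not PA code; the loader header and helper names are assumptions:)

    #include <GL/glew.h>  // or any loader exposing GL 4.5 / ARB_direct_state_access

    // Classic bind-to-edit style: the texture must be bound before it can be edited,
    // which also disturbs whatever was bound to that unit.
    GLuint create_texture_classic(int width, int height, const void* pixels) {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        return tex;
    }

    // DSA style: operate on the object handle directly, no binding required.
    GLuint create_texture_dsa(int width, int height, const void* pixels) {
        GLuint tex;
        glCreateTextures(GL_TEXTURE_2D, 1, &tex);
        glTextureParameteri(tex, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTextureStorage2D(tex, 1, GL_RGBA8, width, height);
        glTextureSubImage2D(tex, 0, 0, 0, width, height,
                            GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        return tex;
    }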
  5. varrak

    varrak Official PA

    Messages:
    169
    Likes Received:
    1,237
    I mention some in the slides. One is the multi-draw indirect stuff (with array textures), which will be a performance win. Another is moving some of the CPU load onto compute shaders. A lot of it is still to be determined, to be honest.
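
    For the curious, a rough sketch of what a multi-draw indirect path can look like in general (illustrative only, not PA's actual code; the struct and function names are placeholders): fill a buffer with draw commands once, then submit them all in a single call, with an array texture plus a per-draw layer index so everything can share one texture binding.

    #include <GL/glew.h>
    #include <vector>

    // Matches the command layout glMultiDrawElementsIndirect expects (GL 4.3+).
    struct DrawElementsIndirectCommand {
        GLuint count;          // index count for this draw
        GLuint instanceCount;
        GLuint firstIndex;
        GLuint baseVertex;
        GLuint baseInstance;   // handy for fetching per-draw data, e.g. the array-texture layer
    };

    void draw_batched(GLuint indirectBuffer, const std::vector<DrawElementsIndirectCommand>& cmds) {
        // Upload all the draw commands at once...
        glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuffer);
        glBufferData(GL_DRAW_INDIRECT_BUFFER,
                     cmds.size() * sizeof(DrawElementsIndirectCommand),
                     cmds.data(), GL_DYNAMIC_DRAW);

        // ...then replace N separate glDrawElements calls with a single submission.
        glMultiDrawElementsIndirect(GL_TRIANGLES, GL_UNSIGNED_INT,
                                    nullptr,                 // offset 0 into the indirect buffer
                                    (GLsizei)cmds.size(),
                                    0);                      // 0 stride = tightly packed commands
    }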
    cdrkf and SXX like this.
  6. selfavenger

    selfavenger Active Member

    Messages:
    128
    Likes Received:
    78
    Very cool stuff @varrak. Thanks for sharing mate.

    I was hoping I might read something in there about the future implementation of SLi? Is it still on the table as a future improvement down the track?

    Cheers,

    -Todd
    cdrkf likes this.
  7. varrak

    varrak Official PA

    Messages:
    169
    Likes Received:
    1,237
    SLI is handled in the driver, so support should be automatic. PA is mostly CPU-limited right now anyway, so there's little benefit to be had from SLI until I move more compute work to the GPU.
    tatsujb, cdrkf, Remy561 and 1 other person like this.
  8. selfavenger

    selfavenger Active Member

    Messages:
    128
    Likes Received:
    78
    Thanks for the response @varrak,

    Good to know!

    Cheers,

    -Todd
  9. Tormidal

    Tormidal Active Member

    Messages:
    243
    Likes Received:
    158
    This slideshow made me mentally aroused.
    I always love seeing the backbone stuff for games; it's so interesting.

    I have an old dual-socket Intel LGA2011 system (it still runs like a beast :) ), y'know, dual Xeons? It's fun stuff. Will that system run PA at all, since it's using two physical processors? And more importantly, since a lot of PA is processor-bound, will it run better?
  10. SXX

    SXX Post Master General

    Messages:
    6,896
    Likes Received:
    1,812
    No reason why it wouldn't work. Actually, I think I know one or two guys who tested it, but under Windows.

    Not likely. I have no idea what the actual overhead is, but I'm pretty sure communication between the two processors has high latency, and RAM access is also an issue, so using two sockets for a single game client wouldn't work well.

    Though that is an interesting question for the server side.
  11. websterx01

    websterx01 Post Master General

    Messages:
    1,682
    Likes Received:
    1,063
    The game isn't optimized to use that many cores, so the performance wouldn't be much different.
  12. squishypon3

    squishypon3 Post Master General

    Messages:
    7,971
    Likes Received:
    4,357
    The game barely utilizes a dual-core multi-threaded CPU atm. :p
  13. someonewhoisnobody

    someonewhoisnobody Well-Known Member

    Messages:
    657
    Likes Received:
    361
    Is it bad that I'm making an OpenGL engine for myself and I didn't understand a quarter of it?

    He he he. Looks like I have a lot of reading to do.
  14. totalannihilation

    totalannihilation Active Member

    Messages:
    215
    Likes Received:
    168
    I really liked this.
    Even though I didn't understand half of this tech talk, I got the point that developing in OpenGL is really tough, and doing such things with OpenGL is quite impressive.

    On the other hand, did I get it right that the game performs better with vendor "N"?
    I guess my next computer will be Nvidia xD
  15. exterminans

    exterminans Post Master General

    Messages:
    1,881
    Likes Received:
    986
    Not necessarily. What he said was that vendor "N"'s driver was more tolerant of broken shader syntax and other "you should not have done that" stuff, plus their support was more competent and responsive.

    Not a single word about raw performance, though. Plus, Nvidia apparently also had some nasty bugs.
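
    (Which is one more reason to read the compile and link logs on every vendor's driver, instead of trusting whichever one you happen to develop on. A minimal sketch, illustrative only; the helper name is made up:)

    #include <GL/glew.h>
    #include <cstdio>

    // A shader that "works" on one vendor's compiler may fail, or only warn, on a
    // stricter one, so always pull the info log even when compilation succeeds.
    GLuint compile_shader(GLenum stage, const char* source) {
        GLuint shader = glCreateShader(stage);
        glShaderSource(shader, 1, &source, nullptr);
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);

        char log[4096];
        GLsizei len = 0;
        glGetShaderInfoLog(shader, sizeof(log), &len, log);
        if (!ok) {
            std::fprintf(stderr, "shader compile failed:\n%.*s\n", (int)len, log);
            glDeleteShader(shader);
            return 0;
        }
        if (len > 0) {
            // Some drivers report portability warnings here even on success.
            std::fprintf(stderr, "shader compile log:\n%.*s\n", (int)len, log);
        }
        return shader;
    }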
  16. doud

    doud Well-Known Member

    Messages:
    922
    Likes Received:
    568
    Thanks!!! Jumping on it!
  17. japporo

    japporo Active Member

    Messages:
    224
    Likes Received:
    118
    I've said this in another post but former Valve developer Rich Geldreich's "The Truth on OpenGL Driver Quality" is worth a read and matches up with what I've heard elsewhere from credible sources. The comments are also enlightening.

    The thing that's a bit worrying about all these vendor shenanigans is whether games of today will continue to work into the far future. I can think of a fair number of 7-10 year old titles that have graphics problems or simply crash on up-to-date PCs.
    Last edited: September 3, 2014
    shootall and varrak like this.
  18. Quitch

    Quitch Post Master General

    Messages:
    5,885
    Likes Received:
    6,045
    I'd still go Nvidia though, because their dev support is so much stronger, which means that games like this one, and many others, tend to have far fewer issues on Nvidia hardware. AMD often gives better bang for the buck, but you're going to have to put up with more games having more quirks.
  19. exterminans

    exterminans Post Master General

    Messages:
    1,881
    Likes Received:
    986
    Dunno; as a dev, I would probably go for AMD, since they have superior cross-platform debugging tools and a less error-tolerant implementation, plus a second Nvidia card as an alibi to get access to their support, because that is virtually non-existent with AMD. Going purely Nvidia will probably result in unclean/unportable code, since the driver tolerates far too much.

    As a user, it doesn't matter much. Yes, a few applications will be "broken" on AMD cards (mostly because the devs only developed on Nvidia cards, screwed up, and the Nvidia driver was "generous" enough to silently ignore/fix the error), but in the end, all well-tested applications run equally well on both platforms.
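
    One cheap way to catch the "driver silently fixed it for you" class of problem early is GL 4.3 / KHR_debug output on a debug context. A minimal sketch, illustrative only (the function names are placeholders, not anything from PA):

    #include <GL/glew.h>
    #include <cstdio>

    // Ask the driver to report errors and performance warnings as they happen,
    // instead of letting a tolerant driver paper over them.
    static void GLAPIENTRY on_gl_message(GLenum source, GLenum type, GLuint id,
                                         GLenum severity, GLsizei length,
                                         const GLchar* message, const void* userParam) {
        std::fprintf(stderr, "[GL] type=0x%x severity=0x%x: %s\n", type, severity, message);
    }

    void enable_gl_debug_output() {
        glEnable(GL_DEBUG_OUTPUT);
        glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);   // report at the offending call site
        glDebugMessageCallback(on_gl_message, nullptr);
    }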
  20. Quitch

    Quitch Post Master General

    Messages:
    5,885
    Likes Received:
    6,045
    Sorry, I meant as an end user. As a dev you're going to go with all three, because doing otherwise is madness.

    But not all applications are well tested, and even if they are, discovering a bug is no guarantee of a fix prior to shipping.
