Multi-GPU compatibility to avoid SUPCOM problems

Discussion in 'Planetary Annihilation General Discussion' started by sovietsandwhich, April 8, 2013.

  1. sovietsandwhich

    sovietsandwhich New Member

    Messages:
    2
    Likes Received:
    0
    One thing that I and many others have always thought was a big hardware setback in Supreme Commander and Supreme Commander: Forged Alliance was the lack of support for multi-GPU machines via Scalable Link Interface (SLI).

    I, like many others, chose to buy two second-rate graphics cards instead of one first-rate card; in my case it was because of a sale on at the time. Used together, however, they actually give me more processing power than a single first-rate card at the same combined price.

    However, due to the lack of multi-GPU support in SupCom, I have to seriously downgrade my graphics settings to run the game at any reasonable FPS, since only half of my graphical processing power is available, and the game loses that amazing touch of beauty. This would not be a problem if the game supported multi-GPU, as I would then be able to use both of my cards.

    Though I admit I have no idea what adding multi-GPU support involves from a game developer's standpoint, I'm just voicing the opinion of many hundreds, if not thousands, of people who have run into these problems because of the lack of multi-GPU support.

    Therefore I urge any developers who happen to read this to consider implementing it in the game; I guarantee that people will be happy with more flexibility in terms of hardware setups.


    -By the way, the unit cannon on the moon totally sold it for me.
  2. bobucles

    bobucles Post Master General

    Messages:
    3,388
    Likes Received:
    558
    CPU. The slowdown in Supcom happened with the CPU.

    The dev goal is to scale up to 64 cores, with some unknown limit on graphics. How far that actually goes depends on how tough it is and how well they nail it.
  3. rhkcommander959

    rhkcommander959 New Member

    Messages:
    6
    Likes Received:
    0
    CPU and memory. But the memory was due to a memory leak... Sorian and FAForever seem to have fixed that problem, as has whatever build Steam is using, as far as I can tell. The Steam version is patched to a higher version than the non-Steam release ever reached, AFAIK.

    Long story short - GPG goofed when programming, causing excessive usage.
  4. bmb

    bmb Well-Known Member

    Messages:
    1,497
    Likes Received:
    219
    I hear the beta inevitably slowed to 5fps after 15 minutes?

    They've come a long way certainly, but yeah, supcom is just a terribly optimized game. Which doesn't help when it's already demanding enough without that overhead.
  5. sovietsandwhich

    sovietsandwhich New Member

    Messages:
    2
    Likes Received:
    0

    I don't think so, at least not for me. I'm running an i7-3770K overclocked to 4.05 GHz, and I make sure to monitor everything on my CPU and GPUs. The highest I have ever seen my CPU go was 61% usage, and the highest with SupCom I believe was around 45-55%.

    For me this is definitely a GPU processing issue: only one of my cards is working, it runs at full capacity basically the whole time, and it just can't handle the bigger fights at any reasonable level of zoom on higher graphics settings.

    So for me this is a problem with lack of SLI.
  6. bobucles

    bobucles Post Master General

    Messages:
    3,388
    Likes Received:
    558
    This post really belongs on Reddit. It's just too cute.

    I wish real computers worked that way, but they don't. SupCom uses two CPU cores plus change, and cannot use any more. The task manager is basically lying to you.
  7. Causeless

    Causeless Member

    Messages:
    241
    Likes Received:
    1
    That's still an extremely powerful CPU. I don't see why you need to talk down to him.
  8. drsinistar

    drsinistar Member

    Messages:
    218
    Likes Received:
    0
    The thing is, the problem is with how the game interacts with the CPU. You could have two i7 Extremes and still have this issue. Sovietsandwhich doesn't understand that every single computer is going to have this problem. I don't know what the OP is doing to make his framerate drop so badly.

    Just to put in my experience with the game: I have a Radeon HD 6670. It cost $70, and I always get playable framerates with the graphics settings maxed. However, I still encounter the slowdown, which for me happens around the 40-minute mark.
  9. neutrino

    neutrino low mass particle Uber Employee

    Messages:
    3,123
    Likes Received:
    2,687
    I think there is a lot of confusion here.

    Let me just say that, since the engine architecture of PA has basically zero in common with SupCom's, let's see what our unique challenges end up being.

    Multi-GPU has never been high up on our list because very few people have that kind of setup. It's also not a slam dunk in terms of performance unless you are actually GPU bound.

    Basically this is a complex area that's not really easy to discuss in a thread like this anyway.
  10. FlandersNed

    FlandersNed Member

    Messages:
    233
    Likes Received:
    8
    So, essentially, multi-GPU support is not a priority (and probably won't be implemented) unless enough people seem to use it?



    I would hope that it will have support for it; I bought two graphics cards for future-proofing, and I don't want to have to fall back on a single card that is a few years old now.

    Of course, if you can't do it then you can't do it.
    Last edited: April 9, 2013
  11. Polynomial

    Polynomial Moderator Alumni

    Messages:
    1,680
    Likes Received:
    53
  12. bobucles

    bobucles Post Master General

    Messages:
    3,388
    Likes Received:
    558
    There are a hundred posts like this every day where someone says "Well, I checked task manager and it's only at 10-50%, so it's not my CPU." The problem is at the CPU! The problem is that the game engine cannot utilize the full computing resources at its disposal. The problem is that multicore designs have been out for over 10 years now, and everything pointed to multicore as the future, yet only a scant handful of games can even begin to tap that potential.
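
    To put a number on the task manager point, here is a rough Python sketch by way of illustration (nothing to do with SupCom's actual code): one thread pegging one core on an eight-core machine reads as only about 12% "total CPU", even though that thread is the bottleneck.

    Code:
        # Hypothetical illustration: a single saturated thread on an N-core
        # machine shows up as roughly 100/N percent "total CPU" in a task manager.
        import multiprocessing
        import time

        def busy_loop(seconds):
            """Spin on one core for the given duration."""
            end = time.time() + seconds
            while time.time() < end:
                pass

        if __name__ == "__main__":
            cores = multiprocessing.cpu_count()
            print(f"{cores} logical cores: one fully loaded thread shows up as "
                  f"about {100 // cores}% total CPU.")
            busy_loop(10)  # watch the task manager while this runs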

    That's part of what makes PA so exciting. It's being built from the ground up to use modern day CPUs and even full scale servers. It's going to be one of the first games to do what should have been done half a dozen years ago, and it's not even coming from a mega rich AAA company. That's pretty damn impressive, and I wouldn't be surprised if PA becomes a smash hit on scaling design alone. They just need a bit more munchkin/autism in the math dept, cuz their game balance hasn't always been great. No pressure. :D
  13. BulletMagnet

    BulletMagnet Post Master General

    Messages:
    3,263
    Likes Received:
    591
  14. bobucles

    bobucles Post Master General

    Messages:
    3,388
    Likes Received:
    558
    Ah yes, the red tape law.

    Video games have many places where they CAN go parallel, because there are a great number of high-crunch activities that are easily isolated (more or less) from each other. The trouble is retraining an entire programming team to focus on those problems, instead of building another cawadooty from ancient architecture.
  15. numptyscrub

    numptyscrub Member

    Messages:
    325
    Likes Received:
    2
    Me Grimlock say, fetchecute them!

    On a slightly more serious note, that sounds like a complete physical design upheaval, with an associated incompatible programming paradigm. Has an alternative architecture even been posited?

    (I am actually curious BTW)
  16. BulletMagnet

    BulletMagnet Post Master General

    Messages:
    3,263
    Likes Received:
    591
    Alternatives exist, but they're currently only ever considered for industrial and signal-processing applications.

    This is getting off-topic, but it's a topic I love so whatevs'.

    Most people understand enough about the central processing unit to agree that it's a fixed piece of hardware that processes a pile of instructions describing what to do. The instructions are software, and what you want to do is determined by the programmer. Your iPhone can do pretty much anything you tell it to; all you need to do is give it the right instructions. It's very, very flexible, and flexibility can be a very desirable thing to have.

    Now imagine a wristwatch: a cheap and tacky digital one. It's got a chip in it too, but you can't tell it to do new things. It just counts numbers, and does that with amazing efficiency - how long do watch batteries last? A good long while. It's specialised, so what it does do, it does exceedingly well. How long would your iPhone's battery last if all it was doing was counting a number? Sometimes the opposite of flexibility is also a very desirable thing to have. But you can't really have your cake and eat it too.


    There's this thing called an FPGA (field-programmable gate array). It's also a giant silicon chip, like the ones in your computer and your phone. Anybody designing digital electronics for a living will probably work with these things on a weekly basis.

    You can give these things instructions too, but these instructions don't tell it what to do. They tell the FPGA what to become. Inside an FPGA is a metric f**kton of transistors, all in neat little lines. Connected to every transistor is a little switch that can be toggled on or off electronically. On the other side of those switches is a whole mess of wires. It's an urban sprawl of electronic highway. Every switch is connected to every other switch through this. The program that you load into an FPGA describes which switches need to be on, and which switches need to be off.

    Want USB on your FPGA? Simple. Make a set of switches that describe how a USB controller works. Want more USB? Simple. Use that description, and turn more switches on and off. Want a CPU in your FPGA? Work out what switches you need to set, and load it up. You now have a CPU. Want something to render tanks and explosions? Describe one. Want to make something to render tanks and explosions twice as fast? You can see where I'm going with this.
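
    If it helps, here's a loose software analogy (just an illustration, not how real FPGA toolchains work): think of each little block as a lookup table, and the "program" as nothing more than the bits you fill it with.

    Code:
        # Loose analogy only: a 2-input lookup table is just four configuration
        # bits, and "programming" the fabric means choosing those bits rather
        # than choosing instructions.
        def make_lut2(config_bits):
            """Return a 2-input logic function defined by 4 configuration bits."""
            def lut(a, b):
                return config_bits[(a << 1) | b]
            return lut

        # The same fabric becomes an AND gate or an XOR gate depending on the bits.
        and_gate = make_lut2([0, 0, 0, 1])
        xor_gate = make_lut2([0, 1, 1, 0])
        print(and_gate(1, 1), xor_gate(1, 1))  # prints: 1 0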


    This thing sounds fantastic! Why isn't it in everything?

    Well, they're expensive as hell. At least they have been, up until the last few years. First, a little bit of history.

    Because they're expensive, nobody expected them to be commonplace. Regular CPUs were becoming popular, and lots of research effort was put into those instead... especially into making software for them. Making software for your regular old computer requires a compiler, which converts what the programmer writes and describes into something the processor can execute. There's usually more than one way to solve any problem, and if you've got more than one way of doing something, it makes sense to choose the fastest. Compilers do a lot of that for you. This meant that software became efficient, which meant CPUs became more popular, which meant people competed to make better compilers.

    And so on, and so forth.

    CPUs and software are everywhere. To abandon those and switch to something new means abandoning almost everything that a multi-billion-dollar industry was built on. Hell, it's probably a multi-trillion-dollar industry.

    Anyway, FPGAs are expensive. Why?

    Because there are only three companies in the world that make good ones, and they only sell them to other big electronics companies that have millions to throw into R&D. There's little competition in the market, so the focus of innovation is on making FPGAs better, not necessarily cheaper. If they became cheap, they'd see a little more attention in the consumer market. And that's really what is needed to bring the price down.

    What's my prediction about the future? Something like the FPGA will be the future, but I don't know when. If you squint your eyes and look at it from a funny angle, you can see people realising that the standard CPU's days are numbered. Graphics processors were the first to step away from wristwatch-style fixed-but-efficient designs. They became glorified CPUs, then multicore CPUs. Now they're many-core CPUs, with that mess of wiring that FPGAs have.

    CPUs are picking up extra toys now, mainly by putting little GPUs in bed next to them and connecting a bunch of wires between the two. Most of Intel's Core i5s and i7s come with GPUs on-chip, and some AMD chips do this too. While these GPUs are probably too weak to render Crysis 3, they are very handy for OpenCL-style compute work. Give it enough time, and we'll want them to do more.
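
    If you're curious whether your own machine has one of those on-chip GPUs, a few lines like these will list it (assuming the pyopencl package and an OpenCL runtime are installed; any OpenCL enumeration tool shows the same thing):

    Code:
        # Minimal sketch: enumerate OpenCL devices. An on-chip GPU shows up
        # alongside the CPU as just another compute device you can queue work onto.
        import pyopencl as cl

        for platform in cl.get_platforms():
            for device in platform.get_devices():
                print(platform.name, "->", device.name)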

    TL;DR: I just wrote an 800-word essay about awesome.
  17. apocatequil

    apocatequil Member

    Messages:
    109
    Likes Received:
    9
    Huh, I wonder if the first step in that direction would be FPGA attachments to regular CPUs: you make a program, the program tells the CPU what to do, and the CPU processes that and turns it into a configuration for a cheap, unsophisticated FPGA... In fact, if you could get your hands on an unsophisticated FPGA, it's not out of the realm of possibility to design a CPU that utilizes it, and then modify a system to know when and how to use that CPU... You'd need Linux and loads of free time/motivation to do it though, and you'd probably be the only one who knew how to use it unless some big company bought it off you...

    But that's just a layman's "OOOOH, this could be cool", and regardless of how topic-derailing or incorrect I am, respect for knowing about this awesome aspect of computing technology.
  18. BulletMagnet

    BulletMagnet Post Master General

    Messages:
    3,263
    Likes Received:
    591
    Fun fact: you can buy PCI-Express boards that come with FPGAs on them.
  19. apocatequil

    apocatequil Member

    Messages:
    109
    Likes Received:
    9
    Then hell, there's probably already a Linux patch out there.
  20. drsinistar

    drsinistar Member

    Messages:
    218
    Likes Received:
    0
    Thanks for the post, I find stuff like this fascinating. Gonna eat everything on the subject now. :D
