1. racerxoffl

    racerxoffl New Member

    Messages:
    10
    Likes Received:
    0
My work PC has an iGPU (i7-3770 HD4000) and an NVS300 (2 monitors each) and the game performs very poorly. I am not saying it should be like a discrete graphics machine, but it does meet the "minimum" requirements. I mean, this is way faster than my laptop.

    Anyone else have a good iGPU experience?
  2. antillie

    antillie Member

    Messages:
    813
    Likes Received:
    7
    The Alpha isn't at all optimized for performance so the posted minimum specs don't really mean anything.

    However unless you have an AMD APU you really shouldn't expect integrated graphics to perform very well in games. Intel GPUs have always been terrible.
  3. racerxoffl

    racerxoffl New Member

    Messages:
    10
    Likes Received:
    0
HD4000 plays Portal 2 like a dedicated card would. I have 2 E350's with 6310's and they are useless for gaming as they are single channel MCH. I have never used an A8/A10, but I am open to anyone's integrated GPU feedback.

Well, I gave up, doubled my machine's power consumption, and stuck a 5770 where the NVS300 was, and it plays well now.
  4. mushroomars

    mushroomars Well-Known Member

    Messages:
    1,655
    Likes Received:
    319
    Portal 2 and pretty much all Source engine games are incredibly optimized. Remember that the Source Engine contains almost no real-time cinematics or physics, and has only really improved in shader quality, art quality, supported resolutions and post-processing quality over the years. I think Portal 2 only takes up like 20% of my CPU and GPU.

    You should never rely on iGPUs. My DEDICATED GPU can hardly handle PA, and it is still considered a "low-end" card as opposed to a "completely outdated, upgrade kthnxbai" card.
  5. antillie

    antillie Member

    Messages:
    813
    Likes Received:
    7
    Portal 2 is not a demanding game. I'm sorry but Intel graphics are a joke. Anyone trying to really game with them is just kidding themselves.

    An APU or a dedicated card are the only real choices.
  6. racerxoffl

    racerxoffl New Member

    Messages:
    10
    Likes Received:
    0
    Thanks for reminding me why I don't post on forums.

    Have a nice life trolls.
  7. bgolus

    bgolus Uber Alumni

    Messages:
    1,481
    Likes Received:
    2,299
We've had issues with Intel integrated graphics which we will likely need to address in the future. However, it is unlikely the game will be playable without a dedicated graphics card during the Alpha. On paper the Intel HD graphics in the Core i# CPUs have all of the features required to play the game, but in practice they are simply far less performant for gaming, and our engine is not yet fully optimized. While performance is a major concern during the Alpha, our primary concern is making sure the game is playable on the workstations we have in the office so development can continue, rather than supporting the lowest end systems out there.

    The Source engine used in Portal 2 has had nearly a decade of performance optimizations put into it. We've had about a month. :)

    The new Intel Haswell CPUs have significantly beefier integrated graphics than the previous ones, but we have not had a chance to test the game on those systems yet.
  8. racerxoffl

    racerxoffl New Member

    Messages:
    10
    Likes Received:
    0
    And thank you for reviving my hope in forums.

As I already noted, I almost tripled my power consumption by installing an old HD5770 I had lying around into my i7-3770 machine, and basically lost my iAMT support.

I doubt the trolls know what that is or why anyone would care about environmentally responsible computing. Know this: I build stacks of 22nm high density servers for work, and our newest chassis holds 12 nodes (~160GHz/384GB DDR3) in 3U, pulling 314W with all C3/EIST etc. disabled.

I understand the engine needs time. I am looking forward to a new Haswell laptop in the very near future, and I really do not want discrete graphics.
  9. bgolus

    bgolus Uber Alumni

    Messages:
    1,481
    Likes Received:
    2,299
Benchmark-wise, the upcoming Haswell CPUs with the Intel HD 4600 are about on par with the AMD APUs in terms of graphics performance, which are almost on par with the 5 year old NVidia 9800 GT, which is our unofficial low end target... which currently can run the game, but not very well.

The Intel HD 5000, Iris 5100, and Iris 5200 integrated graphics chipsets that are coming soon are supposed to be roughly 2x more performant. I'm hopeful that puts them in the league of useful gaming performance.
  10. Baleur

    Baleur Member

    Messages:
    122
    Likes Received:
    22
You shouldn't dismiss people's replies as "trolls" when they just tell you an honest opinion (or fact).
Building servers is very different from building gaming computers. Different hardware requirements.
Same for video processing and 3D rendering.

The truth of the matter is, without one of the industry leading GPUs you will have issues one way or another playing games. It will work fine with some games, and not with others. The reason they are industry leading (nVidia and ATI) is that the vast majority of game developers (including independent developers and single-man teams) base their entire engines on those two standards.

This leads to optimization and customization for those respective rendering pipelines and driver setups that you simply do not have if you're using an alternative such as integrated GPUs.
I'm sorry this is not the reply you want, but it's the truth of the matter when it comes to PC gaming.
  11. mushroomars

    mushroomars Well-Known Member

    Messages:
    1,655
    Likes Received:
    319
[image: a glazed donut]

This is a glazed donut. It is tasty.

However, unless someone tells you otherwise, you don't know you are eating a ring of processed, deep fried complex carbohydrates covered in molten sugar crystals. This thing is about as dangerous as eating a similarly dimensioned brick of lead.

    My point is, that you seem to be uninformed about how terrible iGPUs are. We aren't going to tell you your iGPU is great, because then you might be unhappy, which we don't want. You may also misinform others about iGPUs. Uber isn't going to cut corners until the game is a circle so it will run on your iGPU.

Servers primarily deal with raw mathematical calculations and networking. GPUs primarily deal with rendering. The difference is that one requires a butt-ton of SPEED (Math) and the other requires a butt-ton of POWER (Rendering). Unless you're mining bitcoins, a server only needs a GPU to display what little information humans need to read from it. Please correct me if I'm wrong about any of this information, as my understanding of both servers and graphics processing is minimal.

Back to the glazed donut: we don't sugarcoat things. That leads to misinterpretation, which leads to disappointment in the long run. Yes, that glazed donut looks tasty, but you won't be so happy when you die a painful, agonizing death after eating it.
  12. racerxoffl

    racerxoffl New Member

    Messages:
    10
    Likes Received:
    0
    Wait donuts are going to kill me!?!?

I was dismissing the initial posts because I asked for other people's success stories, not the poisonous lead I was fed. I would love to hear someone with an A10 say they can play the rough cut alpha engine. I appreciate hearing an unofficial baseline target GPU. Believe me when I say I know what it's like not to make the pipe dreams come true in software development.

I never said iGPUs are great, and the only AMD units I have are single channel MCHs I wish I hadn't bought in a Black Friday sale :) Although I have been tempted by the new A10s, I doubt I will pull that trigger before a new laptop.

My current mobile device was cutting edge and served my needs, letting me saturate 1Gb links with 64b iperfs. As you can see, I am more of an enterprise person who looks at machines in blocks. So when you are evaluating a purchase for 100 systems, you are almost never buying consumer gaming cards. Point being, I now need 10Gb traffic generation in mobile processing, and the commercially available gear isn't as portable, or in the price range I can sneak through like a new laptop. I am just hoping a non-Mac Thunderbolt machine is released soon. Then I can hopefully pass time lost to airport delays with some planet smashing entertainment.
  13. antillie

    antillie Member

    Messages:
    813
    Likes Received:
    7
    This is the issue with Intel graphics. They are so far behind in performance that they are next to useless for gaming. Even a flat 2x increase in performance (which I doubt that the 5000 series will really achieve) will not be enough to make them competitive.

The only reason that AMD's APUs are acceptable low end solutions is because AMD bought a discrete graphics card company (ATI) and is literally putting ATI GPUs on the same die as their CPUs. Intel does not have this massive advantage that lets AMD's on chip graphics keep pace with low end dedicated cards.

    Even if Intel puts some magic super GPU on their chips they will still be a low end solution at best since the normal DDR3 memory that all integrated GPUs have to use just cannot compete with the super fast timings and throughput of GDDR5 on a 384-512 bit memory bus.

    Nvidia and AMD/ATI also spend an awful lot of time optimizing their video card drivers for gaming performance. They have been doing it for over 15 years and they have gotten really really good at it. Intel, not so much.

    Intel is a CPU company. The only real reason that they currently have and have ever had integrated graphics in their CPUs is so that companies like Dell and HP don't have to put video cards into the bazillion office workstations that they sell to businesses every year. That way each workstation is $30-50 cheaper. This lets Intel sell gazillions of CPUs and rake in money that AMD could only dream of.

    These graphics processors exist to render people's emails and spreadsheets. While their drivers are designed to be stable, they are not optimized for 3D performance like AMD's and Nvidia's are. Designing high performance 3D hardware is expensive. Driver optimization is also expensive. Intel saves a crap ton of money by not doing these things and focusing on their core business of making and selling gazillions of really good CPUs. If Intel really cared about making gaming class GPUs they would have either purchased Nvidia or entered into a licensing agreement with them ages ago.

The only reason that Intel graphics support the shaders and stuff needed for DirectX and OpenGL is because they have to support them to qualify for the "Designed for Windows" logos you see on so many pre built PCs. Once again this goes back to Dell and HP not needing to put video cards into everyday business workstations. (And normal home PCs.) Look at how Microsoft's requirements for the Windows hardware logo certification program have changed over the years and then compare that to the features in Intel's GPUs over the same time period. MS puts an awful lot of pressure on OEMs to meet these hardware requirements, and Intel is happy to give them a cheap and easy way to meet them that doesn't require a video card.

I am not trying to troll here. I am trying to help people understand why Intel graphics are terrible for gaming. It's not because Intel doesn't know what they are doing; it's because Intel does know what they are doing. In fact, they know what they are doing so well that they are making huge amounts of money. (Almost 10x that of AMD.)
  14. SirChristoffee

    SirChristoffee Member

    Messages:
    45
    Likes Received:
    1
While your general idea is good, I just want to try to express the bold words in more depth. Here goes my potentially wrong, crude understanding:

    They are both math if you dig down to the bottom of it. The difference being how they do the math, and what tasks are generally given is based on this.

The CPU deals with mathematical tasks which generally run in a largely sequential manner, with some parallelism if the program is written for it (yay for PA) and the CPU has multiple cores. This is fine for basic arithmetic, which a lot of the central, simulation level code in a program consists of. Think of the metal income in PA. There is a set value of income; each second that value is added to your total stored metal. A true/false statement could be used on every addition to test if the stockpile is full:

    currentValue = oldValue + income;
    if (currentValue > maxStorage) { // This is your true/false statement
        currentValue = maxStorage;   // Do this if the statement above returns true
    }

    (Note: this doesn't deal with displaying it on the screen, just manipulating the stored values every cycle. This example would likely be in some looping structure, but I have no idea how Uber has chosen to implement this feature.)

The CPU is constantly running through a whole lot of code dealing with different things. Think move commands, build commands, attack, damage inflicted, damage received, has this unit got no health left, if so blow up...

    In this (really cool) video, more cores could be like the CPU robot having 4 heads, each of which start a quarter of the way around the circle.

The GPU deals with mathematical tasks that are doable with large amounts of parallelism; rendering is one of these. GPUs tend to have 100s to 1000s of cores (which individually are designed quite differently from CPU cores, but I want to stay away from that), allowing for a high amount of parallelism. Think CUDA cores in regards to Nvidia GPUs. This structure lends itself to matrix and vector operations, a large portion of graphical processing.

I haven't gotten to coding/playing with GPUs yet, so I'm not going to try to show some semi-pseudocode. But from what I can gather, in regards to graphics, the code the cores are executing doesn't vary much, largely dealing with many values that need to be changed at the same time (these values will be changed according to what outputs the CPU provides while running the underlying simulation). While a unit (or any object) moves from A to B, each of the vertices of each polygon being displayed to the user on that unit needs to be moved (via a vector operation) to new coordinates. Each of these vertices is dealt with by an individual core. If you had a CPU with 4 cores, only 4 could be changed at a time, while a GPU with 512 cores can do 512 at a time. Perhaps the values of these are stored in a matrix, but in this paragraph I am really stabbing in the dark. Go look it up on wiki or something if you want to know more :) .
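    To make the vertex idea above concrete, here's a minimal sketch in plain C (names are mine, and this is ordinary CPU code, not actual shader code) that translates every vertex of a model. The point is that each loop iteration is independent of the others, which is exactly the kind of work a GPU can spread across hundreds of cores, one vertex per core:

    ```c
    #include <stdio.h>

    typedef struct { float x, y, z; } Vertex;

    /* Move every vertex by (dx, dy, dz). Each iteration is independent,
       so a GPU could run one per core; a 4-core CPU does at most 4 at once. */
    void translate(Vertex *verts, int count, float dx, float dy, float dz) {
        for (int i = 0; i < count; i++) {
            verts[i].x += dx;
            verts[i].y += dy;
            verts[i].z += dz;
        }
    }

    int main(void) {
        Vertex tri[3] = { {0,0,0}, {1,0,0}, {0,1,0} };
        translate(tri, 3, 5.0f, 0.0f, 0.0f); /* move the triangle 5 units along x */
        printf("%.1f %.1f %.1f\n", tri[0].x, tri[1].x, tri[2].x);
        return 0;
    }
    ```

    A real vertex shader would apply a full 4x4 matrix per vertex rather than a plain offset, but the independence between vertices is the same.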

After getting the polygons in the right places, it's a matter of attributing textures (stored in the graphics card memory) to each polygon, adjusting the colours (on a pixel-by-pixel basis; more cores = more pixels at a time) according to night/day, and sending this info to your screen so it can display it to your face. And it does all the above dozens of times a second, hopefully, to give you your lovely 60FPS.
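    The per-pixel colour adjustment works the same way. A toy sketch (illustrative only, again CPU code standing in for what a fragment shader would do per pixel) that darkens an image for "night":

    ```c
    #include <stdio.h>

    /* Scale each pixel's brightness by `factor` (0.0 = black, 1.0 = unchanged).
       Like the vertex case, every pixel is independent of every other pixel,
       so more GPU cores means more pixels adjusted per pass. */
    void darken(unsigned char *pixels, int count, float factor) {
        for (int i = 0; i < count; i++) {
            pixels[i] = (unsigned char)(pixels[i] * factor);
        }
    }

    int main(void) {
        unsigned char row[4] = { 200, 100, 50, 0 };
        darken(row, 4, 0.5f); /* simulate dusk: halve the brightness */
        printf("%d %d %d %d\n", row[0], row[1], row[2], row[3]); /* 100 50 25 0 */
        return 0;
    }
    ```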

    Disclaimer: I have no idea what your (mushroomars) personal knowledge on this subject is, but while directed at you, this is more for people who read your comment wanting to know more about the fundamental difference. I also hope it is accurate, but not making promises. Please correct me if you can.
  15. neutrino

    neutrino low mass particle Uber Employee

    Messages:
    3,123
    Likes Received:
    2,687
    I really really want the game to be playable on these. Going to put some work into doing some more testing on them.

    Try turning everything to low and see what happens.
  16. neutrino

    neutrino low mass particle Uber Employee

    Messages:
    3,123
    Likes Received:
    2,687
    I also just dropped some keys on Intel so we can do some work with them on this.
  17. lnslunchbox

    lnslunchbox New Member

    Messages:
    12
    Likes Received:
    0
    That would be wonderful to see. A well polished engine (kinda like valve's source engine) would be great.

    And that is one tasty looking ring of processed, deep fried complex carbohydrates covered in molten sugar crystals.
  18. neutrino

    neutrino low mass particle Uber Employee

    Messages:
    3,123
    Likes Received:
    2,687
    It will never run as fast as source because it does a lot more complex stuff.
  19. lnslunchbox

    lnslunchbox New Member

    Messages:
    12
    Likes Received:
    0
    As an RTS game I agree. But there are so many games that are released that aren't polished enough. Some devs just seem to raise the requirement bar a tad higher just to avoid doing too much polishing.

    All of us fans approve of the great work you guys do. :D
  20. racerxoffl

    racerxoffl New Member

    Messages:
    10
    Likes Received:
    0
I have excess hardware if you want access to some of my HD4000 machines; I have quite a few of them...

Not that it's at all relevant to gaming, but what Intel has done with Quick Sync shows that they want to do cool things and defeat the notion that iGPUs can't play modern games. When you look where my industry is going, every watt counts.
