Lower FPS Standards
FPS being frames per second; not first person shooter.
In the past couple of years, I've started to notice that developers are releasing game trailers showing pretty poor performance. The games have great graphics and everything, but a lot of them have terrible framerates even when there's not much action going on. I can understand this when games are in beta, but those that are nearing release shouldn't be having issues with fluidity. Is it just me, or is there a lower standard for playable framerates these days? Often you see a trailer that shows 15-20 fps for a game that's getting released in a couple of weeks. That's barely even playable anymore. What's worse is that it'll probably run worse on your machine because of the lack of optimization.

It's not even a problem of how good your machine is, since it's mostly a console gaming issue, though the PC is affected too, as seen in the newly released Dawn of War 2 trailer. To give another example, I just saw a trailer for that Too Human game. Now I know the game looks like it has terrible gameplay, but the graphics are pretty decent. The thing is, when they first showed the trailer running on Xbox 360, it suffered from a horrible framerate, and the trailer I just saw showed no improvement. The only games I've seen recently that don't have this problem are Starcraft 2 and Diablo 3, but that's to be expected from Blizzard, since they take years to optimize their games. Then again, I heard that Too Human was in development for ~10 years.

Has anyone else noticed this, or has 20 fps become the standard for a playable framerate and no one really notices how bad it is?
On-Topic: CCP makes great promotional videos for all of the Eve stuff and is sure to do the same for other projects in the future. Of course, they also accomplish this by doctoring footage and presenting situations which never happen in-game. An Abaddon one-shotting a Titan, what was that shit? I think it's primarily a matter of effort.
Last edited by Bradylama; Jul 22, 2008 at 02:50 AM.
This is an issue that's been bothering me myself lately.
You have all these high definition super powered uber consoles with 1080p shining bloom trisplinear HDR graffix with debuffered bespecter face mapping and whatnot, and the framerate hardly goes over 30fps, nor is there any decent AA. Most of the time the framerate drops really badly when almost anything bigger than an idle animation happens, and it looks quite awkward. With all this horsepower and all these possibilities, you'd think a high, steady framerate would be pretty important, since so much is reliant on presentation.

Yeah ok, so Gran Turismo 5 Prologue apparently goes 60fps (except at the race start camera pan, where you can see it dropping). This is great, despite the game lacking any kind of sense of speed (unless you assess it solely on the speedometer).
The framerate produced by the computer is but one of many factors determining how smooth something appears, though. If you had a game running at a constant 120fps but your TV's refresh rate was 110Hz, it'd look flickery because they're out of sync. It also depends on how sharp the image is. As resolutions increase, things will look worse because the images are sharper, so it's easier for your eyes to tell the difference between each frame. The same game running at the same framerate will look far smoother on an SDTV than on an HD one as a result.
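The mismatch is easy to see with a little arithmetic. A rough sketch (illustrative numbers only, not tied to any real engine or TV; a display can only show the most recent finished frame at each refresh, so some frames get dropped at irregular intervals):

```python
def displayed_frames(game_fps, display_hz, seconds=1):
    """For each display refresh, record which game frame is on screen."""
    frame_time = 1.0 / game_fps
    refresh_time = 1.0 / display_hz
    shown = []
    for r in range(int(seconds * display_hz)):
        t = r * refresh_time
        # The most recent game frame finished by this refresh.
        shown.append(int(t / frame_time))
    return shown

shown = displayed_frames(120, 110)
# Frames that never appear on screen at all (they are skipped), which is
# what makes motion look uneven despite the nominally high framerate.
skipped = set(range(shown[-1] + 1)) - set(shown)
print(f"{len(skipped)} of {shown[-1] + 1} frames never displayed")
```

At 120fps on a 110Hz display, one frame in roughly every eleven gets silently dropped, and it's that irregular rhythm, not the raw frame count, that your eye picks up as flicker.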
That's probably why I never understand why people complain about framerates in games. The combination of my elderly eyesight and a fairly poor television means the frames blur together nicely for me and everything looks smooth. That's why I've always preferred console graphics to PC, actually. Games always look too clean on PC; the sharp edges generally ruin the look of a thing. So yes, the problem isn't necessarily with the developers, you just need to buy worse TV sets.
It's true, I lowered my Orange Box resolutions back down to 1024x768 because I felt it looked "better" with large objects in the distance (since they weren't actually large objects far away, but small objects much closer, meant to look like large objects in the distance). But this wasn't a framerate issue, just a regular resolution niggle.
The games look far better and smoother on my PC than the 360 version did on my SDTV, though (though that looked fine as well).
Is your computer monitor a progressive scan one? That would sync the framerates, which helps a lot. Also, I imagine your PC was chucking out a higher framerate than your Xbox anyway, seeing as how PAL 360s are capped at 60fps and generally run at 25 or 30 so as to ensure compatibility with the most TV sets (I think, someone tell me if I'm talking out of my arse).
The problem with consoles is that you only buy the console; you generally already own the TV. It's not like on PCs, where you buy a monitor to suit your graphics card. Console manufacturers and developers need to take into consideration the fact that most people don't own a 1080i HD TV set yet, and whilst I'm sure they'd love to optimise their games to run on that, it would alienate most of the market, which is commercial suicide. You could try sitting further away from the screen?
What? No, it looks fine now. The problem with Orange Box was how, at a higher resolution, the structures in the distance that are supposed to look really huge and far away (the Citadel, for instance) actually reveal themselves to be just small structures much closer. It worked well enough at lower resolutions when the games came out, but at higher resolutions it's more obvious, because the buildings are actually not physically that far away on the map (and therefore lack detailed textures, since you never see them up close, unlike say the character models) but are merely disguised to look like that.
I don't have issues with framerates with either the PC or 360 version (though the PC version obviously runs smoother, as the 360 version is probably capped at 30, and that's why I said it looked better). As for the compatibility thing, I'm having a hard time believing that. I think console games run around 30 fps because that's usually the limit they can reach for a steady framerate at 1080p. If games ran 576i (PAL) or 480i (NTSC) exclusively, I'm sure you'd see smoother games, but because a lot of developers have problems optimising for a smooth 60fps at the highest resolutions, it's easier for them to cap it at 30, be done with it, and focus on other stuff. For instance, Resistance didn't even do 1080p at all because the developers said they couldn't get good enough results, so they went with 720p. I'm assuming "good enough results" also referred to framerate performance.

It seems that with HD gaming, developers are less likely to ensure compatibility with old TVs and more concerned with achieving _something_ at the highest resolutions, since HD is required for all PS3 and 360 games. See Sega with their Sonic PS3 game. They admitted there was a bug in the game that caused cutscene videos to judder and glitch on SDTVs, and there was no fix for this. I know because we got a lot of calls about that from families (when Sonic was the only kid-friendly game for the PS3, and of course they played it on an old TV) who thought there was something wrong with their console.
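For what it's worth, a 30fps cap is usually just the engine padding every frame out to a fixed time budget. A minimal sketch (hypothetical code, not from any real engine; real consoles sync to the display's vblank rather than sleeping):

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33 ms per frame

def run_capped(num_frames, render):
    """Call render() once per frame, sleeping out the rest of each
    frame's time budget so the rate never exceeds TARGET_FPS."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # A fast frame gets padded instead of shown early, which is
            # what keeps the output rate rock-steady.
            time.sleep(FRAME_BUDGET - elapsed)
```

The appeal is the headroom: any frame can take up to ~33 ms without the player ever seeing the rate change, which is why a locked 30 often looks better than a framerate bouncing between 40 and 60.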
Last edited by map car man words telling me to do things; Jul 22, 2008 at 03:55 AM.
You reckon? See, to me that sounds counter-intuitive. I'd rather make a game look good for everyone than great for a handful of people and lousy for everyone else. I ain't a game developer though (and I don't have an HDTV). I wonder how long it'll be before the next Silent Hill/Turok LOL fog games start coming out to hide hardware deficiencies?
Some games do a pretty good job of keeping things looking smooth, though. GRID never judders at all, thanks in part to a healthy dose of motion blur when you're going sideways round corners. One wonders whether it's there to convey a sense of speed or to mask the framerate drop when the machine's shifting all of the very detailed background past you at once (as opposed to when you're driving in a straight line and can see mainly track). Whichever, it works.
Well usually the game does look good on SDTV. I feel if a game doesn't look "good" on a TV without the fancy resolutions, then it doesn't look that impressive to begin with. Upped resolution alone doesn't make a game look better for me.
I play the 360 and PS3 on my old 29" Sony SDTV, and while blu-rays at SD resolutions look like crap, and unlike the PC the consoles merely downscale the image instead of actually changing the resolution of the game, games still manage to look very good (Ace Combat 6, Metal Gear Solid 4, Mass Effect etc). It's with special cases like the Sonic cutscene issues (which I didn't know were physically possible) and Dead Rising's super tiny text that it's apparent developers are more concerned with the higher resolutions and expect the game to function just as well at lower ones. And usually it does.

What I think is counter-productive is how they can't get smooth enough framerates because of the very resolutions they are currently marketing their games and consoles with. Most of the time. There are many examples of extremely good looking and very smooth HD games too, but when you have trailers like the new Brothers in Arms game with a really choppy framerate, it can't look that impressive even to the average consumer.
Last edited by map car man words telling me to do things; Jul 22, 2008 at 04:15 AM.
You say that console games appear smoother, but I've experienced more slowdown in console games than in PC games. This is while playing on a wide range of HDTVs and on TVs I've had for about 8 years. Mass Effect (on 360), for example, has a lot of stuttering on both new and old TV sets. What bothers me is that reviews say this game has a smooth framerate most of the time. This isn't true at all, and I'm thinking they only said it because of how everyone is getting used to crappy framerates. It's definitely the developers who are at fault for this, since everyone seems to be going for great graphics while sacrificing performance.
Every time I play Odin Sphere I wonder if the slowdown was considered a feature and not a bug.
Just skip to the 1:00 mark.
Haha, oh wow. My first two or three days playing consisted of wandering into lowsec and being ever-so-surprised when a load of fuckers on the other end of the gate decided to wreck me.
Hmmm... did I link the wrong video? I meant to link the one about how exciting Eve is, with its endless options such as mining, dogfights, and learning skills.
I played the trial for Eve, but it was too slow for me. Plus the fact that you need at least 6 months of skill training to be able to pilot a battleship well really put me off. I really like the idea of Eve, it's just too much of a time sink for me to really get into.