

Measuring Input Latency In Console Games

Soulskill posted more than 5 years ago | from the button-mashing-efficacy dept.


The Digital Foundry blog has an article about measuring an important but often nebulous aspect of console gameplay: input lag. Using a video camera and a custom input monitor made by console modder Ben Heck, and after calibrating for display lag, they tested a variety of games to an accuracy of one video frame in order to determine the latency between pressing a button and seeing its effect on the screen. Quoting: "If a proven methodology can be put into place, games reviewers can better inform their readers, but more importantly developers can benefit in helping to eliminate unwanted lag from their code. ... It's fair to say that players today have become conditioned to what the truly hardcore PC gamers would consider to be almost unacceptably high levels of latency to the point where cloud gaming services such as OnLive and Gaikai rely heavily upon it. The average videogame runs at 30fps, and appears to have an average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms. Assuming that the most ultra-PC gaming set-up has a latency less than one third of that, this is good news for cloud gaming in that there's a good 80ms or so window for game video to be transmitted from server to client."
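As a rough sketch of the article's frame-counting method: count the video frames between the controller LED lighting and the on-screen response, then convert to milliseconds. This assumes a 60fps camera, and the helper name is mine:

```python
def latency_from_frames(frame_count, camera_fps=60):
    """Convert a counted number of camera frames into milliseconds.

    The result is only accurate to +/- one camera frame, which is
    exactly the accuracy the article claims for its methodology.
    """
    frame_ms = 1000.0 / camera_fps
    return frame_count * frame_ms

# A game that responds 8 camera frames after the button LED lights
# comes out around 133ms, the average the summary quotes for 30fps games.
```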


Transfers to PC Game Ports too... (4, Interesting)

TheSambassador (1134253) | more than 5 years ago | (#29332723)

Only in the ports that the PC gets from the consoles (or even ones that happen to be released on both systems) do I notice the horrible latency. It's awful in Oblivion, Fallout 3, Bioshock, and plenty of others. Part of it has to do with V-Sync, but turning that off doesn't eliminate all of it. I can't believe that 133ms is the norm. I've grown up a PC gamer, and that's definitely one of the top reasons I *hate* console FPS games.

Re:Transfers to PC Game Ports too... (2, Interesting)

Kral_Blbec (1201285) | more than 5 years ago | (#29332773)

Side note about Oblivion and Fallout 3: I think the delay is intentional, to make it feel like someone is actually moving and to make the game more RPG-like. They aren't supposed to be twitchfests. Many FPSes have the character turn, run, and switch weapons faster than is humanly possible.

Re:Transfers to PC Game Ports too... (0)

Anonymous Coward | more than 5 years ago | (#29332957)

John Carmack had a few things to say about input latency in his QuakeCon 2009 keynote. It seems one of the 'easiest' ways to take advantage of multiple cores/parallel processing (which seems more important for consoles, or the PS3 at least) is to work on two or more frames at a time, each in a different part of the game engine, so you avoid dependency issues and end up with a kind of pipelined architecture. That said, Fallout 3 on the PC was tons better after turning off V-Sync, and also mouse acceleration (which is only active with the HUD thingy, and was majorly annoying to work with).

Re:Transfers to PC Game Ports too... (0)

Anonymous Coward | more than 5 years ago | (#29333447)

This may be so but it still doesn't... feel right. I always end up double checking my render ahead and mouse settings only to find out I'm supposed to feel like I'm controlling a borderline dyspraxic soldier.

Re:Transfers to PC Game Ports too... (1)

Kral_Blbec (1201285) | more than 5 years ago | (#29333511)

Good intentions, but poor implementation. They slowed it down too much, IMO.

Re:Transfers to PC Game Ports too... (1)

Spatial (1235392) | more than 5 years ago | (#29333525)

You move even more unnaturally than in most games, so I doubt that. You instantly accelerate when you press movement buttons, you jump like you're on the moon, etc.

I think it's about as intentional as the stuttering [youtube.com].

Re:Transfers to PC Game Ports too... (1)

chonglibloodsport (1270740) | more than 5 years ago | (#29333705)

No, it is a flaw in the engine, one that is corrected by a mod called Oblivion Stutter Remover [bethsoft.com].

Re:Transfers to PC Game Ports too... (1)

bemymonkey (1244086) | more than 5 years ago | (#29333185)

Interesting, and I thought that was just my setup getting old (on both Bioshock and Fallout 3)... I'm used to 0-latency gaming (never got into consoles other than Mario Kart 64 and Mario Tennis), so that was a bit of a shock...

Re:Transfers to PC Game Ports too... (0)

Anonymous Coward | more than 5 years ago | (#29333409)

I agree totally.

I have noticed that a lot of new games in general have this problem, even if I am getting really good frames per second it feels like I am playing through treacle at times. Is this caused by deferred rendering?

Re:Transfers to PC Game Ports too... (1)

tenthkarma (1632363) | more than 5 years ago | (#29333477)

I grew up a PC gamer too, and that's why I'm utterly desensitized to input lag. In other words, slogging through Counter-Strike with a dial-up modem builds character. Maybe younger gamers growing up with the rise of online consoles will be even further conditioned to latency.

Re:Transfers to PC Game Ports too... (3, Insightful)

KulSeran (1432707) | more than 5 years ago | (#29333785)

1) Input is often sampled only once per frame. That is why Quake at 120fps feels more responsive: the time between you pressing a button and the game noticing the press is reduced.

2) Input and actions are often processed on a per-frame basis, so the fastest response you can get is a single frame. Console games tend to run at a target frame rate (30, 24, or 60fps) that determines how much visual flavor the game can have (60Hz leaves less time to draw and update stuff than 30Hz). So at 30Hz, the earliest you can hope to see an action is one frame after the game detects it. That amounts to 33-66ms, depending on how your press lines up with the frame processing (we are asynchronous to the technology, after all).

3) After a game renders a frame, it is usually buffered, so it has to swap and display the buffer. With V-Sync (consoles tend to V-Sync automatically), that means waiting up to 16ms for a 60Hz screen to refresh. But the game only attempts the swap once per frame, introducing an additional frame's worth of delay in the display.

This is where the 99ms minimum response for a 30fps game comes from (48ms for a 60fps game).

On top of that, because of threading and networking, there are often more buffers in the game that only swap once per frame. These can introduce additional frames' worth of delay as developers push the hardware to its limits.
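The three delays above can be sketched as a simple back-of-the-envelope calculation, assuming each stage (input sample, simulation, buffer swap) costs one full frame; the function name is mine:

```python
def minimum_pipeline_latency_ms(fps, stages=3):
    """Best-case latency when input sampling, simulation, and the
    buffer swap each consume one full frame, as described above."""
    frame_ms = 1000 // fps          # 33ms at 30fps, 16ms at 60fps
    return stages * frame_ms

# 30fps: 3 frames * 33ms = 99ms; 60fps: 3 frames * 16ms = 48ms
```

Extra buffers from threading or networking would simply raise `stages`, adding one frame time each.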

Re:Transfers to PC Game Ports too... (1)

beelsebob (529313) | more than 5 years ago | (#29333819)

Feature, not bug.

Humans do not move infinitely quickly. We cannot always carry out the actions we want instantly. In fact, I would rather have input lag with a pretty animation on screen showing *why* things are lagging than have everything magically happen instantly.

Re:Transfers to PC Game Ports too... (1)

KDR_11k (778916) | more than 5 years ago | (#29333903)

When it comes to things like aiming my arm introduces the realistic delay all by itself. Delaying it further just causes confusion because your physical motion is over while the ingame action keeps going for a bit.

Re:Transfers to PC Game Ports too... (1)

TheSambassador (1134253) | more than 5 years ago | (#29333961)

C'mon, that's silly.

If you're supposed to be controlling the character, then the "natural" lag should be all that I have to deal with. We don't need the character emulating input lag when my own real life body already takes care of it.

And if you were confused, I'm talking about mouse lag. If it takes a little bit to accelerate to a speed when moving, that's fine and to be expected. If it takes time to draw my sword I will, of course, accept that. However, if moving my mouse a little bit to the left has noticeable delay, why would that be a "feature?" The lag between me deciding to move my neck and it moving is *NOT* 133 milliseconds.

On an unrelated note (and somewhat pointless/rantlike), why couldn't they have been bothered to animate an up+strafe animation? It's incredible that holding W and D at the same time in Third Person Mode has the same animation as running forward, but you're magically sliding sideways. Obviously first person is better, but it's a silly lack of detail.

Re:Transfers to PC Game Ports too... (1)

beelsebob (529313) | more than 5 years ago | (#29334325)

If you're supposed to be controlling the character, then the "natural" lag should be all that I have to deal with. We don't need the character emulating input lag when my own real life body already takes care of it.
But your real life body *doesn't* take care of it. Your finger moves all of 3mm; that takes very little energy and very little time, at least compared to the energy and time required to, say, swing a sword all the way around your body and smack it into an enemy.

The lag between me deciding to move my neck and it moving is *NOT* 133 milliseconds.
No, it's not... the point is that this system measures the average time between the button being pressed and the respective action actually being carried out. The study doesn't cover whether the developers intended the lag to be there, only whether it was.

Re:Transfers to PC Game Ports too... (1)

TheSambassador (1134253) | more than 5 years ago | (#29334613)

I feel like you didn't read the entire post, and it seems like you're missing the point.

I clearly stated that I was talking about mouse lag. Your examples are clearly things that are expected... when I press a button to swing my sword, it should take time for my character to swing the sword.

However, there shouldn't be a delay (as little delay as hardware will allow) between the time you press the button and the time that your character STARTS to swing the sword.

"Realistic" is NOT the same as "laggy." We're trying to represent view rotation with a mouse... a movement of the mouse should equate to movement of the head. You're not supposed to be instructing the character, you're supposed to BE the character.

Re:Transfers to PC Game Ports too... (1)

beelsebob (529313) | more than 5 years ago | (#29334707)

Correct. My point is that this study isn't distinguishing between "realistic" and "laggy". My feeling with Oblivion was actually that it was fairly accurate about how long things should take; I actually felt the combat system was one of the more fluid I've ever seen.

Re:Transfers to PC Game Ports too... (1)

beelsebob (529313) | more than 5 years ago | (#29334343)

As a side note: yes, games lacking animations suck. Actually, in Oblivion, the completely static jump animation pisses me off more than the strafe non-animation. But that's a separate discussion.

Re:Transfers to PC Game Ports too... (1)

Hurricane78 (562437) | more than 5 years ago | (#29334169)

Have you tried GTA4? It's a nightmare. Missions are partially next to impossible because of it. And then of course there's a bug where the lag goes up to 2 seconds. *And now* add the stuttering of a crappy engine port on anything less than a quad-core CPU.

Re:Transfers to PC Game Ports too... (1)

Vu1turEMaN (1270774) | more than 5 years ago | (#29334525)

As far as any Half-Life 1 based games are concerned, input that registers server side would change drastically based on the client FPS.

See http://www.fortress-forever.com/fpsreport/ [fortress-forever.com] for a detailed analysis of what forcing fps_max in settings does. Scroll down to the very bottom for the tl;dr graph.

I used to force mine to 101 (like every noob recommends) before I read this, and there was a noticeable increase in speed when I lowered it to 50. So much so that it became impossible to shoot the autoshotty (its spread did absolutely no damage firing faster). However, with weapons like the MP5, it might let you get a shot or two off more than the other person shooting at you, giving you an advantage over an entire game... 2 fewer deaths, 2 more kills. In a tournament that could easily be the difference between winning and losing.

I wonder if this is true for other similar engines?

Re:Transfers to PC Game Ports too... (1)

ObsessiveMathsFreak (773371) | more than 5 years ago | (#29334675)

This study is completely bogus. Take a look at the Call of Duty video. They begin counting frames when the first of the trigger LEDs lights, but the third trigger doesn't light until their frame count reaches 3. The gun fires at the 7th frame. Is this a valid test? Are you telling me that Call of Duty will fire your gun the second the trigger button is even slightly depressed? Moreover, by their own admission, the study did not account for the delay induced by the monitor itself, which they haven't even bothered to measure. What's the real delay here? 7 frames? 5 frames? 3 frames?

I play both PC and console games, and never once in my life have I been affected by this supposedly "crippling" lag. This is coming from someone who has passed the "Final Exam" in Bionic Commando: Rearmed, twice, with one of these supposedly slow controllers. If you've got a better test of twitch, I'd like to see it. I notice latency problems on every PC game I have ever owned: they occur when the system starts crunching and I lack the $2000 necessary to speed things up. That's the only input lag I know of. Never, in either console or PC games, have I had a complaint about the latency between button press and resulting action in a smoothly running game. Ever.

As it stands, this study has absolutely no controls. Are there any games at all that suffer from noticeably less lag than the ones presented? I for one would be delighted, absolutely delighted, to see a comparable study carried out with PC games. The study would be best run by comparing input methods: PS/2 and USB keyboards, as well as USB controllers, and indeed Bluetooth controllers. I think it would be a very educational experience for all involved.

Reality check (4, Interesting)

girlintraining (1395911) | more than 5 years ago | (#29332731)

...average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms.

Consider that until very recently all displays had an inherent lag of about 70ms, and that new [LCD] technology has pushed that higher. But we're only considering half the equation: the average human response time to auditory or visual input is 160-220ms, and it increases as we age. We are also part of this system, and we're a helluva lot more lagged than our technology is.

I want an upgrade.

Re:Reality check (1)

Brian Gordon (987471) | more than 5 years ago | (#29332765)

The average human response time for auditory or visual input is 160--220ms

But that doesn't have anything to do with how much lag we can detect

Re:Reality check (1)

girlintraining (1395911) | more than 5 years ago | (#29332805)

But that doesn't have anything to do with how much lag we can detect

You're saying we can't measure the time from when a person receives an input until there's a neurological response?!

Re:Reality check (0)

DNS-and-BIND (461968) | more than 5 years ago | (#29332841)

Because we have no way of knowing if the person has detected the input until there is a neurological response, yes, I'm saying we can't. Please stop playing with big words that you don't understand.

Re:Reality check (1)

Spaham (634471) | more than 5 years ago | (#29332961)

Of course we can, by controlling how and when the input is emitted!
Or if we record the activity, we can tell the many steps the processing goes through
(at least electrically, that is).

Re:Reality check (1)

noidentity (188756) | more than 5 years ago | (#29332991)

Because we have no way of knowing if the person has detected the input until there is a neurological response, yes, I'm saying we can't.

The only meaningful test here is ABX [wikipedia.org]. Present the player with A, B, and X, where A is the system with less latency than B, and X is randomly either A or B. Run the test multiple times and see whether the player's identification of X differs significantly from pure chance (50%). The player doesn't have to be able to quote the latency difference, merely detect it, perhaps by its effect on his performance.
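A minimal sketch of scoring such an ABX run, assuming a one-sided binomial test against guessing (the function name is mine):

```python
from math import comb

def abx_p_value(correct, trials):
    """One-sided binomial p-value: the probability of identifying X
    correctly at least `correct` times out of `trials` by pure
    guessing (p = 0.5). A small value suggests the player really
    can detect the latency difference."""
    favorable = sum(comb(trials, k) for k in range(correct, trials + 1))
    return favorable / 2 ** trials

# 14 correct out of 16 trials gives p of roughly 0.002, which would be
# strong evidence the player can tell the two systems apart.
```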

Re:Reality check (1)

Jarjarthejedi (996957) | more than 5 years ago | (#29332847)

I think you guys are referring to two different "we"s. The "how much lag we can detect" we referred to how much on-screen lag players can detect while playing, while you seem to be referring to how much mental lag researchers have found in people's responses.

Re:Reality check (1)

girlintraining (1395911) | more than 5 years ago | (#29332955)

I think you guys are referring to two different "we"s. The "how much lag we can detect" we referred to how much on-screen lag players can detect while playing, while you seem to be referring to how much mental lag researchers have found in people's responses.

Close. I'm looking at the entire system, not just the technology side but also the human side. Granted, the computer and its peripherals are the easiest to modify by far, but looking at the entire loop (Computer-display-person-input-computer) is the only way to make informed choices about improving the quality of real-time applications (which is the ultimate goal of this research).

Re:Reality check (1)

MattRC (1571463) | more than 5 years ago | (#29333149)

Personally, I don't care about the human side of the input lag question. That's irrelevant to me because at present that line of questioning is unlikely to lead anywhere. I don't see any evidence for the argument that "looking at the entire loop ... is the only way to make informed choices about ...". As I see it, I can make an informed choice by looking at the display alone - choosing the right display will result in a clear, measurable improvement. It sounds like choosing the right platform will have a similar effect. I can't choose the right human ... or my age ... so those considerations seem useless. How does knowing my own input lag make any difference? Whether it's 150 ms or 1,500 ms, I can't change it, and everyone else in my age group is on the same playing field. Now, I'm not saying that it's not interesting to know. It just doesn't make a difference in my gaming experience if I know my own latency. It also doesn't make a difference to me whether the 20 ms I shave off of my experienced latency by a certain hardware choice constitutes 10% or 50% of the "total" latency involved.

Re:Reality check (1)

girlintraining (1395911) | more than 5 years ago | (#29333363)

Whether it's 150 ms or 1,500 ms, I can't change it, and everyone else in my age group is on the same playing field.

No, but if you want a game to appeal to a wider audience, maybe a game that isn't as latency-sensitive would be beneficial. This way, 30 year old gamers wouldn't be outgunned by 20 year old gamers on account of a 50ms reaction time difference.

Re:Reality check (4, Insightful)

sahonen (680948) | more than 5 years ago | (#29334527)

Actually, raw reaction time, which doesn't change much between 20 and 30, is not the primary element of skill at first-person shooters. I've looked at the raw reaction time (i.e., click your mouse when you see a light turn on) of many gamers, some who absolutely dominate me and some at or below my level, and there was no real correlation between reaction time and skill. From what I've gathered, skill at FPS games is more a function of experience and training than of raw reaction time.

The basic categories that set an elite gamer apart from an average or newbie gamer go something like this:

Predicting your opponent and being unpredictable yourself: Knowing where your opponent is going to be, and acting in a manner that your opponent can't predict. If you can put your crosshair where you know your enemy is going to be, and he can't do the same, you're going to win even if he has better raw reaction time than you. This is a function of experience with the game.

Decision making: Evaluating the importance of the various high-level goals in the game, deciding which ones to prioritize, and acting on that decision. Making better decisions, making them faster. Again, a function of experience with the game.

Aiming skill: If an enemy appears on your screen away from your crosshair, how quickly and accurately you can move your mouse to put the crosshair over him. This is a function of training, learning exactly how much mouse movement corresponds to how much movement on screen, and being able to precisely produce that movement with your hand. This is often confused for reaction time when watching people play, but really, the reaction time component is only in seeing the enemy and deciding to shoot him. The rest is muscle memory.

This is where input lag really hurts: it's very important that your field of view appears to correspond to your mouse movements with absolutely no lag. Console games don't suffer from this because aiming with a controller is far less precise than with a mouse, so the input lag "hides" behind the imprecision of the joystick. When the game comes to the PC, where people use mice, the lag between moving your mouse and your on-screen view changing becomes perceptible.
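The muscle-memory point above is why competitive players measure their aim in physical mouse travel per full turn. A sketch of that conversion, assuming the Quake-family convention of 0.022 degrees of yaw per mouse count (other engines use different constants, and the function name is mine):

```python
def cm_per_360(dpi, sensitivity, yaw_deg_per_count=0.022):
    """Physical mouse travel needed for a full 360-degree in-game turn.

    yaw_deg_per_count is the Quake-family default and is an assumption
    here; the point is that the hand motion, not the reaction time,
    is what gets trained into muscle memory.
    """
    counts_per_360 = 360.0 / (sensitivity * yaw_deg_per_count)
    inches = counts_per_360 / dpi
    return inches * 2.54
```

An 800dpi mouse at sensitivity 2.0 works out to roughly 26cm of desk travel per full turn; keeping that constant across games is what preserves the trained movement.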

Movement skill: The ability to manipulate your controls to allow you to travel faster. Not just finding the most efficient routes, but being able to use quirks in the game's movement code to give yourself more velocity. Another function of training, getting the control inputs just right can be difficult to master.

Teamwork: In team-based games, communication, chemistry, planning, and effective group decision making.

Re:Reality check (1)

Brian Gordon (987471) | more than 5 years ago | (#29332873)

I meant we as gamers

Re:Reality check (1, Insightful)

Anonymous Coward | more than 5 years ago | (#29332997)

There are two different lags.

1. Something happens on screen, small lag, you press a key.

2. You press a key, small lag, something happens on screen.

The latter is very detectable, while the former doesn't matter so much.

Re:Reality check (0)

Anonymous Coward | more than 5 years ago | (#29333283)

There are two different lags.

1. Something happens on screen, small lag, you press a key.

2. You press a key, small lag, something happens on screen.

The latter is very detectable, while the former doesn't matter so much.

The first one matters a whole lot. Unless you don't mind getting roflstomped and teabagged.

Re:Reality check (3, Interesting)

Mprx (82435) | more than 5 years ago | (#29332925)

The only inherent display latency of a CRT is the time taken for the beam to arrive at any particular part of the screen. In the worst case this is one frame, which at a reasonable refresh rate (100Hz+) will be only 10ms or less. A good LCD (there's only one on the market, the ViewSonic VX2268wm) updates in the same line by line fashion as a CRT, and will add only a few more milliseconds switching time latency.
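The worst-case figure above is just one refresh period; a trivial sketch of that arithmetic (the function name is mine):

```python
def worst_case_scan_latency_ms(refresh_hz):
    """Worst-case wait for the beam (or a line-by-line LCD update)
    to reach a given part of the screen: one full refresh period.
    The best case, at the top of the scan, is near zero."""
    return 1000.0 / refresh_hz

# 100Hz gives a 10ms worst case; a 60Hz display is closer to 16.7ms
```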

Of course you still have latency in the input/processing/rendering stages, but this doesn't have to be very high (increase the input sampling rate, avoid any interpolation, disable graphics buffering, etc.). The only reason most modern console games are unplayable is that reviewers all ignore latency, and low latency can be traded for the higher graphics detail that reviewers do pay attention to.

Perceived latency has nothing to do with reaction time.

Re:Reality check (0)

Anonymous Coward | more than 5 years ago | (#29333295)

While the ViewSonic has a true 120 Hz refresh rate, it's not a good LCD, as it has a TN panel with awful picture quality.

Re:Reality check (1)

Mprx (82435) | more than 5 years ago | (#29333335)

TN panels have acceptable image quality if they have gloss finish, no obvious light sources nearby to reflect off them, and are viewed from directly in front. This isn't too difficult to arrange for a non-portable monitor. If you disagree then there are currently zero "good" LCDs on the market, as every other LCD is too slow or has defective brightness control.

Re:Reality check (0)

Anonymous Coward | more than 5 years ago | (#29333385)

If you limit yourself to 120Hz LCDs, then sure, there isn't a "good" LCD available. If you go down to classic LCDs, some of which can happily be run at 75Hz, you even have IPS panels with 0ms input lag.

Unless you are looking at the ViewSonic from 5 meters away, it will exhibit serious vertical gamma shift, just like every other LCD.

Real reality check (1, Insightful)

Anonymous Coward | more than 5 years ago | (#29333929)

Ignoring the flamebait (only one good LCD? Really? I'm pretty happy with my Philips), I must say that as someone who has been a PC gamer most of his conscious life, I'm pretty impressed with what consoles had (and still have) to offer. When I was introduced to the Zeldas on the N64, there were so many things going through my head at once that I couldn't tell what the first impression was, but I'm pretty sure it wasn't "unplayable". Neither did I notice severe amounts of lag. Even if it's true that console games have more lag than PC games (and I think the article makes an unfair comparison by setting consoles, which are based on budget PCs nowadays, next to performance you can only theoretically get from a top-of-the-line PC no one can afford), that doesn't necessarily mean console games suck. There are more factors in the equation, not least the new, fun, often somewhat original games we haven't seen before: Katamari Damacy, Ridge Racer (actually arcade at first), Okami, LittleBigPlanet, the legendary console RPGs of the SNES era like the Tales and Seiken Densetsu series (among many others), and DDR, just to name a few. Rant about latency all day long, but that doesn't change the fact that they're rock-solid games and that PC game designers could learn something from them.

Multiple Mismatched Stimuli (2, Insightful)

Plekto (1018050) | more than 5 years ago | (#29333971)

Often the real problem players have isn't the latency itself, because our brains will accommodate almost any lag as long as it's uniform (witness the lack of visible "frames" in most movies, despite their running at a mere 24fps). What causes the problem is having more than one stream of stimuli running at different rates. This is most noticeable when audio and video fall out of sync.

With an LCD display, this is magnified greatly unless you are going directly from the computer or console to speakers with their own built-in amplifier (or headphones).

Case in point: I like to play Rock Band with my son. On a CRT, it was fine. On an LCD, I had to set the audio lag to 0ms, THEN set the video to sync with that. Adding delay to the audio as well as the video made it impossible to get a decent result; one of them has to be set to zero.

E.g.: audio lag is measured at 20ms, video lag at 35ms. The correct settings are 0 and 15, because the audio will always have that delay in it no matter what you do. Putting both at the recommended 20 and 35 yields a combined 55ms between your finger and the result. Though they are in sync, they feel "laggy": our brains are used to video running at about 60fps progressive / 30fps interlaced if we grew up in the era of CRT displays, so 55ms feels like dropping three frames instead of one. And that's just about at the threshold where the delay between what we hear and what we see starts to become annoying.
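The calibration rule above, sketched out (function name mine): subtract the smaller measured lag from both, so the faster path is pinned at zero and only the difference is compensated.

```python
def calibration_settings(audio_lag_ms, video_lag_ms):
    """Normalize measured lags so the smaller one is set to zero and
    only the difference between the two is compensated, rather than
    stacking both delays on top of the inherent latency."""
    base = min(audio_lag_ms, video_lag_ms)
    return audio_lag_ms - base, video_lag_ms - base

# Measured 20ms audio and 35ms video -> settings of 0 and 15
```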

Note: it's also why you need a $20 dedicated sound card. Games often hammer the CPU and onboard audio chipset when a big group of sounds comes in, and that causes momentary lag in DirectX games too (which almost all console ports are, exacerbating the issue).

Re:Reality check (0)

Anonymous Coward | more than 5 years ago | (#29334417)

The Samsung 2233rz is the other consumer 120Hz LCD. It's pretty good.

Mousing around the screen feels like being back on a good old CRT :)

(And to the color gamut complainers ITT: I've had LCDs with better color reproduction, but at 60Hz _with_ a good 40-60ms of input lag. I'd take a 120Hz greyscale LCD before going back to that.)

Re:Reality check (1)

noidentity (188756) | more than 5 years ago | (#29332949)

Considering that until very recently all displays had an inherent lag of about 70ms

CRTs have a lag of nearly zero. Perhaps ones with 3D comb filters have more. Back in the old days (NES, Atari), a video game could directly affect the current color at the electron beam, giving a lag of nearly zero. It's only gotten worse since. Same for controllers, where they either had a separate wire for each button (e.g. Atari), or had a simple shift register that could be read in under a millisecond.
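A toy model of that shift-register scheme, following the NES-style protocol of latching all button states at once and then clocking them out one bit at a time (class and button order here are an illustrative assumption):

```python
class ShiftRegisterPad:
    """Toy model of an NES-style controller: a parallel-in,
    serial-out shift register, latched once per read and then
    clocked 8 times, one bit per button."""
    BUTTONS = ["A", "B", "Select", "Start", "Up", "Down", "Left", "Right"]

    def __init__(self, pressed):
        self.pressed = set(pressed)
        self.bits = []

    def latch(self):
        # Parallel load: capture all 8 button states simultaneously.
        self.bits = [b in self.pressed for b in self.BUTTONS]

    def clock(self):
        # Serial shift: each clock pulse returns the next bit.
        return self.bits.pop(0) if self.bits else False

def read_pad(pad):
    """One full controller read: latch, then clock out 8 bits."""
    pad.latch()
    return [pad.clock() for _ in range(8)]
```

On real hardware the whole latch-and-clock sequence takes well under a millisecond, which is the point being made above: the controller itself contributes essentially nothing to the lag.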

Re:Reality check (0)

Anonymous Coward | more than 5 years ago | (#29334037)

CRT monitors perhaps have virtually no lag, but last-generation CRT TVs also introduced lag. All that fancy Fluid Motion processing (or whatever they call it) simply takes time. That's perhaps because those TVs are aimed more at movies than at gaming. On the other hand, more and more LCD panels have a special game mode that bypasses all the motion-smoothing chips.

Re:Reality check (1)

elashish14 (1302231) | more than 5 years ago | (#29333077)

Most large-screen LCD TVs do a lot of digital processing before you get to see the output. For most applications this is fine, but for important ones (like playing Melee*), it makes the TV unusable. In those cases you usually have to dig through the menus to find a game mode option and turn it on. Even that doesn't fix the whole problem; the best way is to go with a CRT.

*Yes, my priorities are a bit unconventional and possibly screwed up.

Re:Reality check (1)

jadin (65295) | more than 5 years ago | (#29333165)

The average human response time for auditory or visual input is 160--220ms.

So what you're saying is that from the initial stimulus to seeing my response happen takes 293-353ms? Compared to 160-220ms if our technology were "perfect"?

Seems pretty obvious why people want faster-responding technology.

[*]apologies if I'm misinterpreting the data
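That arithmetic is just the game's pipeline latency stacked on top of human reaction time; a trivial sketch (function name mine):

```python
def end_to_end_ms(game_lag_ms, human_reaction_ms):
    """Total time from an on-screen stimulus to seeing your own
    response rendered: human reaction time plus the game and
    display pipeline latency, since the two delays simply add."""
    return game_lag_ms + human_reaction_ms

# With the summary's 133ms game lag and a 160-220ms human reaction
# time, the loop comes out to 293-353ms, as computed above.
```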

Re:Reality check (1)

ahabswhale (1189519) | more than 5 years ago | (#29333209)

"The average human response time for auditory or visual input is 160--220ms."

Where did these numbers come from?

Re:Reality check (0)

Anonymous Coward | more than 5 years ago | (#29333279)

As a musician I can tell you that anything higher than 10-20ms is absolutely horrible when playing something like a software synthesizer through MIDI, for example. A latency of 160-220ms would drive musicians insane; you can definitely feel that the response happens too late.

Re:Reality check (0)

Anonymous Coward | more than 5 years ago | (#29333531)

(OK, this is more about noticing things than responding to things.)

Two words (OK, an acronym and a word): DLP projector.

Even in a static scene I see the colors flickering (the rainbow effect) because of the slow speed of the color wheel spinning in the projector (single-chip model, Spring 2005, DVD player via composite, yes composite, into the projector). It drove me nuts. Other people don't see it (even though we're the same age), but I do.

So while it isn't lag, I'm sure it's well under 160-220ms (a sixth to a quarter of a second or so). I think the figure I've heard (sorry, don't remember the source) is that visual changes faster than about 20 times a second (50ms) start being perceived as continuous rather than as a series of frames (for most people).

So I guess what I am saying is that while it may take us a while to respond, it doesn't mean that we can't notice it.

Re:Reality check (1)

ildon (413912) | more than 5 years ago | (#29333593)

Human brain lag and input lag don't overlap, they add on to each other. So it's kind of moot for this discussion.

Re:Reality check (1)

cpicon92 (1157705) | more than 5 years ago | (#29333615)

My LCD monitor (which is about 2 years old now) claims to have a response time of 8ms, is this very fast, or am I missing something?

Re:Reality check (1)

Tom (822) | more than 5 years ago | (#29333617)

Sorry, but human visual processing time does not figure into the equation. The brain compensates for that, which is why our experience of the world appears to be immediate despite the processing time required.

Lag is also something you can train for, unfortunately. If you are playing a lot of low-latency FPS games, you become more aware of it, because you're training your brain for fast reaction times. Like everything else in the human body and mind, how well you perform depends on how much you train/use it.

133ms seems like a lot to me, and I'm by no means a hardcore gamer. But I do notice lag in old online FPS (newer ones often do extrapolation) when it goes above ca. 50 ms.

It also depends a lot on the game, of course. In an FPS, lag really hurts. In most MMORPGs with their auto-combat, well you could probably play them from the moon and be ok.

Re:Reality check (2, Informative)

Hurricane78 (562437) | more than 5 years ago | (#29334091)

The average human response time for auditory or visual input is 160-220ms.

You know exactly that you're talking bullshit. The statement is true but irrelevant, because that is the response time only when the pipelining of predicted actions does not work. How else would we be able to perform any high-speed actions?

The brain *expects* a bang and a flash when we press the pistol trigger. If it's too late, this will show later, when the predictions and reality are compared again.

You see the monster, and pipeline a shot; some ms later, your hands press the trigger. Now you get the signal of higher pressure from your fingers, which goes into the input pipeline. But the bang and flash arrive much too late at that same pipeline. So when they later come out of it again, the discrepancy is still there, which messes with your ability to predict things.

Try playing a keyboard with 100ms of lag in between. At 200ms it is next to impossible.
Try it with an online shooter with an additional 200ms of ping. Good luck winning that match!

Re:Reality check (0)

Anonymous Coward | more than 5 years ago | (#29334253)

Considering that until very recently all displays had an inherent lag of about 70ms

You are completely wrong... CRT monitors with refresh rates of 100Hz and more were already very common ten years ago. Demos on the Amiga and Atari ST running at full framerate (50Hz or 60Hz depending on Europe/USA/etc.) were very common, and I can tell you that the difference between a graphical effect in these demos running at 60Hz versus 30Hz was more than obvious. 50Hz means the "inherent display lag" [sic] could not be more than 20ms. That was nearly 20 years ago.

Seriously dude, your post... WTF!?

Ten years ago I was a competitive FPS gamer using a "low poly" (low number of polygons) mod for Counter-Strike so that my Pentium III (?) could deliver about 100 fps. I'd then play on European servers that had a ping (round-trip) of 20ms.

There's no console today that can provide the responsiveness I had in 1999 for my Counter-Strike setup.

Cue the clueless "but the human eye can not see more than 30 images per second" in 5...4...3...2....1...

Besides that, you're also completely wrong on human response time: average conscious response time doesn't explain typing speed of 100wpm+ (which I happily reach despite my old age). Average conscious response time doesn't explain the speed at which you close your eyes when something comes too close to them.

I've seen the (back-then) best european Counter-Strike team play at a LAN. I was good and yet I had a hard time believing what I saw.

You're completely wrong to think that "up until recently a 70ms visual display lag" was the norm, and that seeing an enemy in an FPS, aiming, and shooting is a normal conscious process for experienced gamers...

And all the clueless +insightful mods...

DDR? (3, Interesting)

koinu (472851) | more than 5 years ago | (#29332735)

Can anyone comment on how lag affects gameplay in DDR? I'm still hesitant to buy an LCD TV and am staying with my CRT, because I'm not sure about it. When playing DDR I usually go by the music and the rhythm, so I really don't know exactly what would happen with an LCD TV.

I've seen people playing DDR with Samsung LCD TVs on Youtube. It seems it's working well.

Re:DDR? (5, Insightful)

bjorniac (836863) | more than 5 years ago | (#29332859)

DDR or any rhythm/timing based game will be perfectly fine with a fair amount of lag so long as the lag is consistent. The game isn't based much on reaction times, more hitting the pads at the right intervals. Once you get accustomed to the lag (which should happen naturally as you dance) the actual amount won't matter so much - you just have to move 160ms before the arrow hits the circle or whatever, something you will have been doing already, moving to land on the beat, rather than waiting for the beat and then moving. This differs from, say, a shooter like counter-strike, where you have to react as fast as possible to what is a non-rhythmic, supposedly non-predictable event (unless the opposing team comes out in synchronized swimming formation).

Inconsistency in lag would be a killer here, as it is everywhere, as it would be essentially adding a random component to your timing that you have no control over. But any time you do rhythmic work you're doing predictable lag compensation already - eg clapping on the beat requires you to start the motion before the beat happens rather than react to it.
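The parent's point can be made concrete with a tiny simulation (all numbers and names here are illustrative, not from any actual game): a player who has learned the average lag cancels it completely, but random jitter is the part no amount of practice can remove.

```python
import random

def timing_error(target_ms, lag_ms, jitter_ms, learned_offset_ms):
    # The player compensates for the *average* lag by moving early;
    # only the random jitter survives as a timing error.
    actual_lag = lag_ms + random.uniform(-jitter_ms, jitter_ms)
    press_time = target_ms - learned_offset_ms   # move before the beat
    return (press_time + actual_lag) - target_ms

random.seed(0)
# Consistent 160 ms lag, fully learned: error collapses to zero.
consistent = [timing_error(1000, 160, 0, 160) for _ in range(100)]
# Same average lag with +/-40 ms jitter: errors stay spread out.
jittery = [timing_error(1000, 160, 40, 160) for _ in range(100)]
```

With zero jitter every error is exactly 0; with jitter the spread remains no matter what constant offset the player learns.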

Re:DDR? (1)

koinu (472851) | more than 5 years ago | (#29333117)

I'm almost sure that when you have audio lag, the result would be pretty bad (at least you can correct some values in the main settings). I've heard people complain about songs being off-sync. And a second thing: there are people who can read about 1000 arrows a minute. That's 60ms between arrows!! I play at an average of about 250ms between arrows, and I pretty much suck at this game. When you compare a 160ms average lag against my 250ms (or less, since that's an average) between arrows, it might be a problem, in my opinion.

And there might be a limit where the brain cannot correct the lag anymore. Don't you think?

Re:DDR? (1)

MartinSchou (1360093) | more than 5 years ago | (#29333219)

Actually one of the most fun things I've tried with an FPS was writing a very simple program that would move the mouse five pixels in a random direction 20 times a second.

It starts out as insanely annoying, especially on the desktop, but after a few minutes in the game, you end up finding it a lot more challenging than normal. Coop becomes even more fun when you're running in formations because you might accidentally shoot your friends. Or miss them when you're actually trying to frag them in revenge.

Re:DDR? (1)

CompassIIDX (1522813) | more than 5 years ago | (#29333601)

DDR != any rhythm/timing based game. For example, early IIDX and Pop'n Music, pre-timing adjustment settings, are more or less unplayable on modern TVs. You're actually "building" the songs -- constructing them note by note -- as the basic framework plays on. Try doing that with terrible lag, it ain't pretty. In short, this:

"...any rhythm/timing based game will be perfectly fine with a fair amount of lag so long as the lag is consistent."

is completely false.

Re:DDR? (0)

Anonymous Coward | more than 5 years ago | (#29333627)

I don't know about the console versions, but I know that StepMania has options for adjusting A/V sync. That would eliminate this issue.

Re:DDR? (4, Informative)

Anpheus (908711) | more than 5 years ago | (#29333061)

One thing Rock Band has done, and presumably this came from somewhere else or has propagated to Guitar Hero and other rhythm games, is that you can set the video latency and audio latencies separately and finely tune the system so that it looks and sounds like you want it to be.

Rock Band 2's guitar controller actually has a tiny light sensitive component and a cheap microphone, so that you can auto-set your game. It's really very handy, and took only fifteen seconds or so. The result was that when a note crosses the "active line" of the game is when I should both strum it / hit it / sing it and hear the result.

Are you certain there is no way to do the same thing with DDR?
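The auto-calibration described above boils down to averaging the offset between when the console emits a flash/beep and when the controller's sensor detects it. A minimal sketch (the function and the sample numbers are made up for illustration):

```python
def estimate_latency_ms(emit_times, detect_times):
    # Average offset between when a calibration flash/beep was
    # emitted and when the sensor/microphone picked it up.
    offsets = [d - e for e, d in zip(emit_times, detect_times)]
    return sum(offsets) / len(offsets)

# Five hypothetical calibration flashes, each detected ~83 ms late:
emits   = [0, 500, 1000, 1500, 2000]
detects = [82, 584, 1083, 1581, 2085]
lag_ms  = estimate_latency_ms(emits, detects)   # -> 83.0
```

The game then shifts its note timing by that amount, which is why the whole process only takes a few seconds.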

Re:DDR? (1)

koinu (472851) | more than 5 years ago | (#29333167)

I haven't seen such a thing, but I can manually correct the audio delay in the main settings, as far as I remember. I don't need it though, because I have a simple CRT and an analog hi-fi system. There shouldn't be any big lag; at least I can play without problems.

Re:DDR? (0)

Anonymous Coward | more than 5 years ago | (#29333357)

Are you certain there is no way to do the same thing with DDR?

Recent builds of StepMania and OpenITG allow for video latency correction in addition to audio latency. There is one caveat, though: only audio latency is correctable in-game, either manually or automatically; video lag must be adjusted through manual .ini file editing (it's not hard, but knowing or finding the delay time can take some practice).

Hopefully in the future, the developers will add a means of calibrating video latency via easier means, like Rock Band did with their auto-adjusting light/sound sensors.

Re:DDR? (1)

Judinous (1093945) | more than 5 years ago | (#29333281)

I've spent a lot of time comparing how my rhythm games perform on various CRTs and LCDs, and I can tell you that the experience is orders of magnitude better on a CRT. However, if you're playing at low difficulties (1-10 steps) or low BPM (500 or so), then you are probably okay with an LCD. This range encompasses essentially all play that is done with your feet, so if you are physically dancing to your rhythm games, then by all means go for it. However, if you are playing rhythm games with your fingers on a keyboard or pad of some kind, you will find that it becomes completely unplayable on an LCD. The extra latency from the LCD combined with the very small amount of time that each arrow is on screen when playing on a high BPM (which is required for high-level play) makes it essentially impossible for a human player to even complete a difficult song.

Re:DDR? (3, Insightful)

mwvdlee (775178) | more than 5 years ago | (#29333349)

I'm sorry, perhaps I'm misunderstanding you, but in the world of music 500 BPM is far from "low". Most "danceable" music is generally somewhere between 120 and 130 BPM; drum and bass (which most people would consider quite fast) is about 170-180 BPM. Anything over 200 BPM is uncommon and usually for novelty's sake. Perhaps the measurement you're talking about is something other than beats per minute?

Re:DDR? (1)

Mprx (82435) | more than 5 years ago | (#29333391)

And a lot of people dance to drum and bass at half tempo. I used to dance at full speed, but I find it too tiring now. 500 BPM is fast enough that it's hard to distinguish individual beats.

Re:DDR? (1)

koinu (472851) | more than 5 years ago | (#29333421)

Yeah... 500 BPM is pretty high in my opinion, too. Perhaps he's talking about the highest rate during a song; there are songs where you go above that for a very short time. But on average it's difficult to keep a constant rate of 500 BPM. That doesn't mean it's impossible, of course.

See here (from ITG):
http://www.youtube.com/watch?v=CpTcN2zTqKY [youtube.com]

Re:DDR? (2, Informative)

CompassIIDX (1522813) | more than 5 years ago | (#29333675)

He shouldn't have referenced "BPM" because it's not really accurate, but by "500 BPM," he's talking about the rate at which the notes are falling down the screen. Many people take advantage of Hi-speed settings which allow you to increase this rate, thus decreasing the total number of notes your brain has to process at any given time. So a 125BPM song at Hi-speed 4 scrolls the notes at a rate of "500BPM." The actual beats per minute remain the same, though, obviously.
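The arithmetic here is trivial but worth writing down (a hypothetical helper, not from any actual game):

```python
def effective_scroll_bpm(song_bpm, hi_speed):
    # The audio tempo is unchanged; only the note spacing, and hence
    # the apparent scroll rate, is multiplied by the Hi-speed factor.
    return song_bpm * hi_speed

# The example above: a 125 BPM song at Hi-speed 4 scrolls at "500 BPM".
scroll = effective_scroll_bpm(125, 4)   # -> 500
```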

Re:DDR? (2, Informative)

Judinous (1093945) | more than 5 years ago | (#29333937)

Yes, by BPM I was referring to the DDR setting used to control the speed at which the notes flow past the screen. The reason that it must be turned up for higher difficulty songs has less to do with the number of notes on the screen at once, and more to do with the amount of separation between them. At low speeds, there are not enough vertical pixels separating the notes to distinguish the order that they are actually coming in, and whether they are simultaneous (jumps) or not. When played at "normal" speeds, the notes will even overlap each other making a solid "wall" that is nearly impossible to work out, even if you were to pause the game and dissect the screen at your leisure.

Re:DDR? (1)

koinu (472851) | more than 5 years ago | (#29334109)

Hmm... I see. What I was talking about: when you have 400 steps in a song and the song's length is 1:40, you get 240 steps per minute (on average, of course). That's pretty fast, in my opinion. But songs usually have faster passages (where the steps are very dense). Well... I don't need to tell you this, I think.

Further proof (0, Offtopic)

Anonymous Coward | more than 5 years ago | (#29332819)

that console players are vegetables. Be a man, buy a pc.

Point of Comparison (1)

algae (2196) | more than 5 years ago | (#29332931)

Just as a point of comparison, the typical latency you want in pro audio applications between when a guitarist plucks a string, and when they hear the note, is less than 15ms. This makes me think that the 80ms might be *acceptable*, but it's by no means ideal.

real test = ps3 vs xbox360 (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#29332967)

Why not measure the same games on both ps3 and xbox360 and compare the results?

I've played the full version of puzzle quest on a rarely used ps3 with wireless controllers at a friend's house on 1080p/60 Samsung TV connected via component cables, and I've played the demo version of puzzle quest on my heavily used xbox360 with wireless controllers connected by 3rd party USB charger on a 1080p/120 Toshiba TV connected with HDMI. Going back and forth between the two, I noticed substantially more lag at my friend's house. It was significant enough that I would actually freak out during the delay between pressing the button and seeing the response on the ps3, because I was used to playing at home on the 360. However, there are too many variables to conclude that the ps3 is laggier than the 360.

We need someone to measure the actual difference. TFA missed a huge opportunity.

Re:real test = ps3 vs xbox360 (0)

Anonymous Coward | more than 5 years ago | (#29333109)

It could just be the TV. Playing Wii Sports tennis at my friend's house is awful for me, because his TV introduces 100-200ms of video-enhancement processing delay. It also lags the sound to compensate. He's used to it, so he plays OK, but it takes me 4-5 rounds to re-train myself.

This is one reason recent TVs have a "gaming mode" that removes this delay by eliminating any post-processing.

Re:real test = ps3 vs xbox360 (0)

Anonymous Coward | more than 5 years ago | (#29333809)

From what I can tell, the Xbox is generally worse for lag. I own both systems, and COD4 definitely has more lag on the Xbox version.

What TV people have is a far bigger factor than the game. Most LCD panels are horrible and don't come close to even a budget plasma, and plasmas don't come close to CRTs in terms of lag.

Re:real test = ps3 vs xbox360 (0, Insightful)

Anonymous Coward | more than 5 years ago | (#29334223)

Reading TFA would have told you the reason why the PS3 wasn't tested at this time.


Musicians can detect very small amounts of latency (4, Informative)

silverspell (1556765) | more than 5 years ago | (#29332969)

It may be that console gamers have learned to expect around 100-150ms of input latency, perhaps thanks to visual cues that help to justify the latency on some level. (If I decide to jump, it takes a certain amount of time to react to my thought and make that happen; if I tell Mario to jump, maybe he takes about the same amount of time to react to the stimulus. It makes a certain kind of sense.)

But I assure you that musicians find that level of latency unacceptable. When you're playing a software synth live, performing with other musicians, even 75ms of latency is very noticeable and makes you feel like you're playing through molasses. Same thing with recording: if it takes longer than 25-30ms to hear my own sound coming back at me, I definitely notice it. Virtuosic music regularly calls for events spaced less than 50ms apart!

Anonymous Coward (0)

Anonymous Coward | more than 5 years ago | (#29333027)

One thing the article fails to mention is whether the weapons he's firing are supposed to be instant. There could be a "startup" time programmed into the weapons. Think of it like a fighting game: the slower, heavier attacks have a longer windup than the weaker attacks. If the weapon firing is supposed to be instant, then this was a very good test.

Re:Anonymous Coward (1)

Mprx (82435) | more than 5 years ago | (#29333173)

Startup time only makes sense if it's a choice. Having to decide on the responsive but weak attack and the stronger but laggier one potentially adds tactical depth and can make the game more interesting. If every attack has added latency in some misguided attempt at "realism" it's just bad game design.

Shooting Pause... (1)

Xin Jing (1587107) | more than 5 years ago | (#29333163)

Something I never understood in arcade games was the "shooting pause": a maximum number of button presses (usually on the fire button) yielded a maximum number of projectiles, after which additional presses failed to register. A noticeable break in on-screen projectiles was observed before further button presses registered again. I can't think of a single game I've ever played where the shooting pause increased dramatic tension, added to the atmosphere of enjoyment, or balanced the playability. I've always suspected the shooting pause must be a deliberate software-coded effect rather than a hardware limitation.

Re:Shooting Pause... (1)

Mprx (82435) | more than 5 years ago | (#29333195)

The point of limiting the number of player projectiles on screen is to provide a risk/reward mechanic by encouraging you to move closer to the enemies. You'll do more damage, but you'll have less time to react. Rewarding risky behavior is generally good design.

Re:Shooting Pause... (3, Informative)

Jeek Elemental (976426) | more than 5 years ago | (#29333307)

I think what you see is simply hitting the max number of in-flight bullets? Software-limited, yes, but probably based on what the hardware can handle.
If the game uses hardware sprites (quite possible), it may be limited by the total number of sprites on screen.

So when you hit this max number, you won't be able to fire any "new" bullets until an old one hits something or goes offscreen.

Re:Shooting Pause... (1)

Xin Jing (1587107) | more than 5 years ago | (#29333551)

Yes, Galaxian is a good example of a limited number of projectiles on screen at any one time. An example of a shooting pause was Centipede. In both you fire projectiles at different speeds, but both seem to have an AI exploit where an incoming opponent "knows" you are approaching your shooting pause and will attempt to collide with you. The Centipede case is extremely frustrating, since you are essentially on rapid-fire with random pauses. The Galaxian case is also strange because the player is locked at a shot limit that is also the shooting pause, which makes no sense when there aren't incoming opponents and you can't move along the Y axis. That's not to be confused with Galaga, which let you advance along the Y axis and had power-ups, if I'm not mistaken.

Atari 2600 has less latency (3, Informative)

Visoblast (15851) | more than 5 years ago | (#29333223)

On the old Atari 2600, the game has to be written around rendering fields (half frames) of video. On NTSC, that is 59.94 fields per second, or a little under 16.7ms. Input is usually read during vertical blanking between fields. That makes for not much more than 33.3ms latency in the worst case of input change just after vertical blanking.

Maybe new isn't really better.
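The field timing above works out as follows (a sketch of the arithmetic, with an assumed two-field worst case as the parent describes):

```python
FIELD_RATE_HZ = 59.94              # NTSC fields per second
FIELD_MS = 1000 / FIELD_RATE_HZ    # ~16.68 ms per field

def worst_case_latency_ms(pipeline_fields=2):
    # Input changing just after vertical blanking waits almost one
    # full field to be read, then one more field before the change
    # is rendered: about two fields total.
    return pipeline_fields * FIELD_MS

lag = worst_case_latency_ms()      # ~33.4 ms
```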

Re:Atari 2600 has less latency (0)

Anonymous Coward | more than 5 years ago | (#29333361)

Yeah but imagine how good the graphics would look if Custer's Revenge was re-released on 360!

Re:Atari 2600 has less latency (1)

Ant P. (974313) | more than 5 years ago | (#29333661)

New is better up to a point.

In the 16-bit era you could do processing in the hblank too, not just the vblank. Sega used it for better-looking water (palette swaps); the Amiga could do the same to get a huge number of colours on screen, or even run two different horizontal resolutions at the same time.

This transfers beyond games. (1)

Omnifarious (11933) | more than 5 years ago | (#29333381)

Kernel developers have complained that UI latency doesn't have very good measures under Linux. Now here's a methodology for measuring it. This could lead to kernels that are better optimized for the user experience, and provably so.

I don't think, though, for the Linux kernel or for a video game, that pure latency is exactly the right measure. The standard deviation of latency is an important measure too: a user should be able to reliably predict the latency. They may not do so consciously, but their cerebellum will.
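Reporting the spread alongside the mean is easy to do (a sketch; the sample numbers are invented):

```python
from statistics import mean, stdev

def summarize_latency(samples_ms):
    # Report average latency and its spread: a steady 130 ms can
    # feel better than a mean of 100 ms that swings wildly.
    return round(mean(samples_ms), 1), round(stdev(samples_ms), 1)

steady  = [130, 131, 129, 130, 130]   # predictable
jittery = [80, 160, 70, 150, 40]      # lower mean, huge spread
```

Here `summarize_latency(steady)` gives a spread under 1ms, while the jittery trace has a standard deviation over 50ms despite a lower mean.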

World at war PC (1)

ivesceneenough (1407533) | more than 5 years ago | (#29333401)

I don't know what caused it, but my usual Call of Duty: World at War server, which I used to ping 50 to, just jumped to about 120 on average. It really messed with my accuracy for a few rounds until I adjusted. I didn't think it would make a difference, but it definitely did.

Re:World at war PC (0)

Anonymous Coward | more than 5 years ago | (#29334083)

I call bullshit.

How can they miss this ? (3, Informative)

billcopc (196330) | more than 5 years ago | (#29333609)

OK, I'll be the first to concede that I am more sensitive (or attentive) to lag issues, being an audio/video hack myself, but how can 4+ frames of lag be ignored, or even tolerated, in any action game?

I already consider the 3-frame LCD lag unacceptable and utterly shameful. I mean, the data is there, put it up already! If the de-crapifying filters need that much lookahead to function, they need to be refactored to use look-behind, and if the copycat engineers can't fix it, at least give us an option to disable it per-port so we can play our games.

Now on the development side, as a so-so game dev myself, I can't think of any valid excuse for Killzone's 12 frames of lag. What the hell are they doing in the loop? Here's what a game loop is supposed to look like:

for (;;) {
    if (button_pushed(1) && ga_hasammo(ga_PEW_PEW))
        ga_plWeapon::spawn_bullet(); /* MOTHERFUCKING PEW PEW!!!1!! */
}


Notice the lack of "sleep(9000)" statements? So that's what, 20 usec worth of code? Take input, spawn bullet, play sound, and draw the goddamned frame already! If that takes you 200 msec to process, then your game is really running at 5 fps with a ton of interpolated frames in between, and you should probably go back to writing Joomla plugins.

Ten years ago, this shit would not have flown. We used to tweak the everloving crap out of our loops, and VSYNC was the norm, which made late frames painfully obvious. To deal with it, we used hard-timed loops, and every single piece of code had to obey the almighty strobe. You had 16 or 33ms to render your frame, and if that wasn't enough, well, you tightened your code. Today, now that even game consoles have gone multicore, there is no excuse. You could even have one thread acting as a clock watcher, monitoring the other tasks and telling them to hustle (e.g. degrade) if they're falling behind.

To prioritize anything else is to betray the game's purpose: to entertain via interactivity. If a game is going to sacrifice interactivity, I might as well go watch Mythbusters instead :P
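That clock-watcher idea can be sketched in a few lines (Python rather than C for brevity; the function name, budget, and detail levels are all illustrative):

```python
FRAME_BUDGET_MS = 1000 / 30    # ~33.3 ms per frame at 30 fps

def clock_watcher(recent_frame_ms, detail_level):
    # If the average of the last few frames blows the budget, tell
    # the renderer to drop a detail level instead of running late.
    if sum(recent_frame_ms) / len(recent_frame_ms) > FRAME_BUDGET_MS:
        return max(1, detail_level - 1)
    return detail_level

# Frames averaging ~41 ms: degrade. Frames averaging ~20 ms: hold.
d1 = clock_watcher([40, 41, 42, 40, 45], 5)   # -> 4
d2 = clock_watcher([20, 22, 19, 21, 20], 5)   # -> 5
```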

Re:How can they miss this ? (1, Informative)

Anonymous Coward | more than 5 years ago | (#29333899)

A lot of this comes from developers trying to exploit the concurrency possible in modern systems. So, at 30fps: input is sampled in the main thread (hopefully early in the frame, so up to 33ms before the simulation is done) -> the render thread runs behind the main thread (up to 33ms) -> the GPU runs behind the render thread (up to 33ms) -> CPU/SPU post-processing (up to 33ms) -> wait for the next vsync (if you're unlucky you miss it) -> plus any frame processing the TV does (god knows how many ms), and then your input may finally show up on screen. That's a deep pipeline!
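Adding up the worst case of each stage in that pipeline lands right around the 133ms figure from the summary (stage names here are illustrative, not from any particular engine):

```python
FRAME_MS = 1000 / 30   # ~33.3 ms per frame at 30 fps

# Worst-case wait at each stage of a deeply pipelined 30 fps engine:
stage_ms = {
    "simulation (input sampled at start of frame)": FRAME_MS,
    "render thread, one frame behind":              FRAME_MS,
    "GPU, one frame behind the render thread":      FRAME_MS,
    "post-processing / vsync wait":                 FRAME_MS,
}
engine_lag_ms = sum(stage_ms.values())   # ~133 ms before the TV's own delay
```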

Hmm.. (0)

Anonymous Coward | more than 5 years ago | (#29333669)

After my first couple experiences gaming on the XBox, I'm still convinced it has a latency of about 2 seconds. Which is sure to have been improved down from the beta of 3 seconds by the excellent testing/QA department over at Microsoft, which I am sure definitely exists.

The latency issue with the Wii. (2, Interesting)

bezenek (958723) | more than 5 years ago | (#29333839)

At the Hot Chips symposium last month, Rich Hilleman, Creative Director for Electronic Arts, commented on the 100ms delay inherent in the Wii remote (Wiimote). I assumed the issue was the delay involved in sensing the accelerometers, but this article shows that 100ms is no different from the other consoles.

I wonder what Rich Hilleman was really getting at? Maybe people are more sensitive to delays when they result from full-body movements rather than a button press.

This is interesting stuff, and it would be a good thing if some graduate student did a thesis on it. (Free Ph.D. here--no thinking required!)


Re:The latency issue with the Wii. (1)

Asterra (1087671) | more than 5 years ago | (#29334153)

I, for one, cannot wait until these guys tackle the Wii. Heck, I thought at first that was the whole point of the exercise. I've done my own tests, for what it's worth, though not with so idealized a setup. The Wiimote's motion control has tremendous input lag; I gauged it to be at least 120ms greater than the buttons on the same controller. The kicker? Wii MotionPlus did NOTHING to reduce this. And this is why games that rely on the motion component of the controller simply suffer for it; it's really only the games that minimize said control which remain playable. Natal and Sony's wand have essentially the same lag.

That resolution is too low! (1)

Hurricane78 (562437) | more than 5 years ago | (#29333979)

One video frame? With a normal camera? That's 1000/30 = 33.333... ms. From making music I know when you start to notice lag: some people notice it at around 10ms, and I get into trouble above 30ms. So you would need at least double that temporal resolution to get useful results.

Recommendation for LCD screen then? (1)

pete-wilko (628329) | more than 5 years ago | (#29334457)

Seeing lots of comments about LCD screens: I'm thinking of upgrading from my old 17" to something around a 21-24" LCD widescreen.

I'm a gamer (not completely hardcore though), so response time would be good, and I'm aware of the refresh issues with LCDs. I also do some photography, so good colour reproduction would be handy (after calibration etc.), but viewing angle is not so important.

Any ideas? I've looked around for reviews and found a few conflicting reports; suggestions much appreciated! Budget is low to mid (in dollars, guessing $180-$250).

Two so far on short list are either:

Viewsonic VX2260WM 22"

Asus 24" VW246H

Consistent latency (1)

192939495969798999 (58312) | more than 5 years ago | (#29334653)

The amount of latency is not really the issue as much as the consistency of the latency. There's nothing more frustrating than getting fragged because YOUR input was processed late, whether because too much was going on or for any other reason. I recall missing tons of jumps in Mega Man 2 because of this, so it's hardly a new problem.
