4K Monitors: Not Now, But Soon

Soulskill posted about 6 months ago | from the wait-for-16K dept.

An anonymous reader writes 4K monitor prices have fallen into the range where mainstream consumers are starting to consider them for work and for play. There are enough models that we can compare and contrast, and figure out which are the best of the ones available. But this report at The Wirecutter makes the case that absent a pressing need for 8.29 million pixels, you should just wait before buying one. They say, "The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we're used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you'll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you'll be locked to a maximum of 30 frames per second for your games—it's playable, but not that smooth. ... Most people don't own a system that's good enough for gaming on a 4K display—at least, not at highest-quality settings. You'll be better off if you just plan to surf the Web in 4K: Nvidia cards starting in the 600 series and AMD Radeon HD 6000 and 7000-series GPUs can handle 4K, as can systems built with integrated Intel HD 4000 graphics or AMD Trinity APUs. ... There's a light on the horizon. OS support will strengthen, connection types will be able to handle 4K displays sans digital tricks, and prices will drop as more 4K displays hit the market. By then, there will even be more digital content to play on a 4K display (if gaming or multitasking isn't your thing), and 4K monitors will even start to pull in fancier display technology like Nvidia's G-Sync for even smoother digital shootouts."
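
For the curious, the summary's HDMI ceiling is plain pixel-clock arithmetic. A rough check (assuming the standard CEA 4K timing of 4400x2250 total pixels, active plus blanking, and HDMI 1.4's 340 MHz TMDS clock limit):

    # Pixel clock needed for 3840x2160 at various refresh rates, vs. HDMI 1.4.
    H_TOTAL, V_TOTAL = 4400, 2250   # active 3840x2160 plus blanking intervals
    HDMI_14_MAX_CLOCK = 340e6       # Hz; the HDMI 1.4 TMDS ceiling

    for refresh in (24, 30, 60):
        clock = H_TOTAL * V_TOTAL * refresh
        verdict = "fits" if clock <= HDMI_14_MAX_CLOCK else "exceeds"
        print(f"3840x2160 @ {refresh} Hz needs {clock / 1e6:.0f} MHz "
              f"({verdict} HDMI 1.4)")

Running it shows 30 Hz landing at 297 MHz (under the limit) and 60 Hz at 594 MHz (far over it), which is exactly why HDMI 1.4a tops out at 4K30 and HDMI 2.0 or DisplayPort 1.2 is needed for 4K60.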

Get a TV (2, Informative)

TechyImmigrant (175943) | about 6 months ago | (#47258697)

Why pay $1000+ for a 4K monitor tomorrow when you can pay $500 for a TV today?

http://tiamat.tsotech.com/4k-i... [tsotech.com]

Re:Get a TV (2)

houstonbofh (602064) | about 6 months ago | (#47258711)

I have 2 clients with Seiki 4K TVs as monitors and it is fantastic for them. Another case of "This is not what I need, so no one needs it."

Re:Get a TV (5, Insightful)

TechyImmigrant (175943) | about 6 months ago | (#47258761)

Frame rate is for gamers. Programmers need pixels.

That's why TFA is missing the right angle.
4K is great for programming:
      1 - You can see more lines of code
      2 - it doesn't require silly refresh rates
4K for gaming is silly. It doesn't meet the basic requirements:
      1 - your card can't drive it
      2 - the framerate is low

Arguing that 4K is bad because it's no good for gamers is like arguing mobile phones are bad because you can't program on one effectively.
 

Bread, eggs, breaded eggs (2)

tepples (727027) | about 6 months ago | (#47258823)

Frame rate is for gamers. Programmers need pixels.

What do game programmers need?

Re:Bread, eggs, breaded eggs (5, Funny)

Cryacin (657549) | about 6 months ago | (#47258843)

What do game programmers need?

Sleep, generally.

Re:Bread, eggs, breaded eggs (-1)

Anonymous Coward | about 6 months ago | (#47259047)

A girl...

Posted through a proxy because these assholes are blocking my IP. Even the spammers get more respect.

HAHA.. Fuck off!

Re:Bread, eggs, breaded eggs (1)

NormalVisual (565491) | about 6 months ago | (#47259735)

What do game programmers need?

Multiple displays that work well for the task at hand.

Re:Bread, eggs, breaded eggs (1)

houstonbofh (602064) | about 6 months ago | (#47260233)

What do game programmers need? Multiple displays that work well for the task at hand.

A 39 inch 4kTV is the equivalent of 4 20 inch 1080p monitors together.
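
(That equivalence is easy to sanity-check: split a 16:9 panel into a 2x2 grid and each 1920x1080 quadrant has exactly half the diagonal, so 19.5 inches on a 39-inch panel; the "20 inch" is a slight round-up.)

    import math

    # Each quadrant of a 2x2 split has half the width and half the height,
    # hence half the diagonal of the full 16:9 panel.
    diag = 39.0
    w = diag * 16 / math.hypot(16, 9)
    h = diag * 9 / math.hypot(16, 9)
    print(f"1920x1080 quadrant: {math.hypot(w / 2, h / 2):.1f} in diagonal")  # 19.5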

Re:Bread, eggs, breaded eggs (0)

Anonymous Coward | about 6 months ago | (#47260269)

A 39 inch 4kTV is the equivalent of 4 20 inch 1080p monitors together.

Or four 5 inch Galaxy S4 smartphones.

Re:Bread, eggs, breaded eggs (1)

NormalVisual (565491) | about 6 months ago | (#47260345)

A 39 inch 4kTV is the equivalent of 4 20 inch 1080p monitors together.

But useless for when you need to run the program being debugged full-screen while watching what happens in a debugger, network sniffer, etc. at the same time. There really are times when you need multiple displays not just for the added screen area, but because each display is being used for something different.

Re:Get a TV (5, Insightful)

sexconker (1179573) | about 6 months ago | (#47258947)

Frame rate is for gamers. Programmers need pixels.

That's why TFA is missing the right angle.
4K is great for programming:

      1 - You can see more lines of code

      2 - it doesn't require silly refresh rates
4K for gaming is silly. It doesn't meet the basic requirements:

      1 - your card can't drive it

      2 - the framerate is low

Arguing that 4K is bad because it's no good for gamers is like arguing mobile phones are bad because you can't program on one effectively.

Are you kidding me? Staring at 30 Hz console output is maddening, and plenty of GPUs can handle 4K @ 60 fps for modern games. I'm sorry if you're trying to run Ubisoft's latest gimped turd, but that's an issue with the game, not a modern flagship GPU. Beyond that, plenty of monitors can handle 4K at 60 Hz. I have no idea why the fuck this shit got front-paged. HDMI 2.0. WELCOME TO THE PRESENT. DisplayPort 1.2. WELCOME TO THE YEAR 2010.

Re:Get a TV (2)

Twinbee (767046) | about 6 months ago | (#47259317)

Enjoy your mouse cursor and window frame moving at 30fps then, and the associated lag that will bring.

Instead we should be encouraging movement the other way - towards 120fps, which allows for much more lifelike, smoother motion. YouTube being stuck at 30fps is a thorn in the side of the whole online video sector.

Re:Get a TV (1)

aaronb1138 (2035478) | about 6 months ago | (#47259463)

For gaming (not text or web), if the refresh rate is high enough (30Hz is not), scaled resolutions look fine. We've hit high enough resolutions that certain scaling operations just look like anti-aliasing instead of blurring.

Scaling rightfully got a bad name when it was upscaling 800x600 content to a 1024x768 or 1280x1024 17" monitor. It looked blurry. Scaling 1920x1080 to 2560x1440 on a 27" monitor looks really good. On the gaming side, I'm more interested in whether these 4K TVs will take 1920x1080 or 2560x1440 at 60Hz and maintain that refresh rate (technically if the panel is 120Hz it should, but I have my doubts about their scaler). Doing productivity work at full resolution would mostly be fine at 30Hz, if occasionally annoying.
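
To put rough numbers on that intuition: both upscales are non-integer, but the physical footprint of one interpolated source pixel is much smaller on a dense modern panel, so the blur is far less visible. A sketch (the monitor sizes are the ones mentioned above):

    import math

    cases = [
        # (label, source width, target resolution, panel diagonal in inches)
        ("800x600 on a 17in 1024x768", 800, (1024, 768), 17.0),
        ("1920x1080 on a 27in 2560x1440", 1920, (2560, 1440), 27.0),
    ]
    for label, src_w, (dst_w, dst_h), diag in cases:
        scale = dst_w / src_w                    # non-integer in both cases
        ppi = math.hypot(dst_w, dst_h) / diag    # target pixel density
        src_px_mm = scale / ppi * 25.4           # one source pixel on the glass
        print(f"{label}: {scale:.2f}x scale, {ppi:.0f} PPI, "
              f"source pixel ~{src_px_mm:.2f} mm wide")

The old case comes out around 0.43 mm per source pixel versus about 0.31 mm for the new one, before even counting the greater viewing distance typical of a 27" monitor.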

Re:Get a TV (1)

mikael (484) | about 6 months ago | (#47259883)

And graphics programmers need both frame rate and pixels. 120Hz seems perfect, but once you try using 3D vision glasses, those LCD shutters bring back the flicker.

The problem with the higher resolutions is that application developers just seem to think they can then make their application main window even bigger so it still fills the entire screen. Then they have to use bigger fonts to maintain compatibility with past versions of the same application.

Re:Get a TV (2)

strack (1051390) | about 6 months ago | (#47259631)

I mean, seriously, Seiki needs to hurry up and release a 60Hz 4K version of its 38.5 inch display, preferably with DisplayPort. A 38.5 inch 4K 60Hz VA panel would blow the weak-ass 28 inch 4K TN panels everyone seems to be pushing today out of the water, especially if they keep their current price point. Ditch the TV tuner and smart TV crap, put in DisplayPort and adaptive sync, and watch it become the monitor for All The Computers In The World.

Re:Get a TV (1)

rcht148 (2872453) | about 6 months ago | (#47258739)

Makes me wonder: if Seiki can afford to sell a 65" 4K TV for $1059 (deal at Amazon right now: http://www.amazon.com/exec/obi... [amazon.com] ), why can't the other big-name brands (Sony/LG/Samsung) have 65" 4K TV sets at even double that price (say $2100)?

Re:Get a TV (0)

Anonymous Coward | about 6 months ago | (#47259081)

Let's see: TVs are designed to be watched from a distance, so the retina area is farther back. Monitors generally have better options to connect to a computer (it's changing, but slowly), I get more options to adjust my monitors than I've seen on a TV, I can get more custom ratios on a monitor (I have one that's pretty much square; it works awesome for documents, but less so for video), more monitors are designed for multi-panel display, etc. Are those good enough reasons for you? Or do I have to point out that TVs generally have a shorter duty cycle (they need to be off more of the time or they overheat), or that most TVs don't go to sleep if the image isn't changed, but instead allow the image to be burned into the screen permanently? I also get pissed because TVs have a stupid start noise and screen; my monitors don't have that.

Re:Get a TV (0)

Anonymous Coward | about 6 months ago | (#47259319)

So, given that monitors and TVs use the same LCDs and LED displays off the same production lines ... what's your point? Did you even have one? I'm pretty sure that my monitor only has DVI and VGA inputs, while the TV can eat just about everything but SCART.

Re:Get a TV (0)

Anonymous Coward | about 6 months ago | (#47260403)

That's a nice fairy tale, and it may be true for your no-name China crap, but just the gaps between pixels on a TV suck nuts, and most of their controller boards won't handle the native resolution of the panel.

Re: Get a TV (-1)

Anonymous Coward | about 6 months ago | (#47260029)

What the fuck have you been smoking? Your entire post is total utter nonsense. Fuck you for making me read your shit comment.

Oculus Rift (2, Insightful)

ZouPrime (460611) | about 6 months ago | (#47258715)

Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

Why spend a shitload of money on a new 4K screen and the video card necessary for an acceptable game experience when I'll be able to do VR at a fraction of the cost and with my existing hardware setup?

Obviously that's a gamer perspective - I'm sure plenty of people will find 4K useful for what they are doing.

Re:Oculus Rift (2, Interesting)

Anonymous Coward | about 6 months ago | (#47258805)

In its present iteration, the Oculus Rift might very well fit your current hardware, but the requirements for getting a decent number of pixels per view-angle in VR are brutal. Michael Abrash's post on the matter is very enlightening: http://blogs.valvesoftware.com/abrash/when-it-comes-to-resolution-its-all-relative/. In short, you'll most likely need ultra-responsive, insanely dense mini-displays, each boasting a 4K x 4K resolution per eye. This kind of resolution plus the latency requirements for VR will indeed demand a very powerful gaming rig.

Re:Oculus Rift (1)

Your.Master (1088569) | about 6 months ago | (#47258869)

It's not entirely clear that VR is going to displace PC gaming to that significant a degree.

As a fairly avid gamer, most games I play are not in the first person perspective and I don't want them to be. I don't like FPS, and that's a huge portion of all first-person games (though I do like the sort of FPS-stealth-subgenre that encompasses Hitman, Dishonoured, Deus Ex, etc., and I can see how VR would be an asset there).

Platformers, most RPGs (the Elder Scrolls series is a popular exception, but I have never liked them), strategy and/or tactics games, most adventure games, most puzzle games, most "unique" / "indie" games, etc. -- these things and others are generally not first-person, and VR almost implies a first person perspective.

Most of those things I listed (aside from platformers) are already more popular on the PC than on console competitors.

Re:Oculus Rift (0)

Anonymous Coward | about 6 months ago | (#47258903)

Platformers, most RPGs (the Elder Scrolls series is a popular exception, but I have never liked them), strategy and/or tactics games, most adventure games, most puzzle games, most "unique" / "indie" games, etc. -- these things and others are generally not first-person, and VR almost implies a first person perspective.

I disagree. I think 3D perspective -- think "god view" where you can view from any position in 3D space -- will be an even bigger deal than FPS. Of course it will require some creativity from game designers to make it more than just a gimmick, but really the same thing can be said for 3D FPS.

Re:Oculus Rift (1)

RedWizzard (192002) | about 6 months ago | (#47259897)

It's not entirely clear that VR is going to displace PC gaming to that significant of a degree.

As a fairly avid gamer, most games I play are not in the first person perspective and I don't want them to be. I don't like FPS, and that's a huge portion of all first-person games... and VR almost implies a first person perspective.

Only if you've got no imagination. What this iteration of VR brings is head tracking, and that allows massive virtual screens. I think the Rift and similar products are going to break into the non-gaming market as a cost-effective way of getting giant flat displays.

Re:Oculus Rift (1)

Osgeld (1900440) | about 6 months ago | (#47260409)

Head tracking has been around in toy-grade VR helmets like the Rift since the '90s ... those serial ports on them were not there for the sound.

Re:Oculus Rift (1)

vux984 (928602) | about 6 months ago | (#47259057)

Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

As a gamer I'm not really concerned about 4K either. I'm much more interested in better support for 3-view type setups. And 4K 3-view is just all the gamer problems of 4K, times 3 :)

Oculus... I'm not sold on it. I see it as niche at best. Very cool in that niche though.

I would like to see head tracking go mainstream though.

Re:Oculus Rift (2)

Osgeld (1900440) | about 6 months ago | (#47259187)

You have been able to do that for 2 decades, so the question is why haven't you.

I will give you a hint: there is a reason for that, and that reason is that strapping a thing to your face gets old really fucking quick.

Re:Oculus Rift (4, Informative)

Solandri (704621) | about 6 months ago | (#47259767)

Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

Why spend a shitload of money on a new 4K screen and the video card necessary for an acceptable game experience when I'll be able to do VR at a fraction of the cost and with my existing hardware setup?

You're making a fundamental error many people make when it comes to display resolution. What matters isn't resolution or pixels per inch. It's pixels per degree. Angular resolution, not linear resolution.

I've got a 1080p projector. When I project a 20 ft image onto a wall 10 ft away, the pixels are quite obvious and I wish I had a 4k projector. If I move back to 20 ft away from the wall, the image becomes acceptable again. It's the angle of view that matters, not the size or resolution. 20/20 vision is defined as the ability to distinguish a line pair with 1 arc-minute separation. So within one degree (60 arc-minutes) you'd need 120 pixels to fool 20/20 vision.

This is where the 300 dpi standard comes from. Viewed from 2 ft away, one inch covers just about 2.5 degrees, which is 150 arc-minutes, which can be fully resolved with 300 dots. So for a printout viewed from 2 ft away, you want about 300 dpi to match 20/20 vision. If it's not necessary to perfectly fool the eye, you can cut this requirement to about half.

In terms of the Oculus Rift, a 1080p screen is 2203 pixels diagonal, so this corresponds to 18.4 degrees to fool 20/20 vision, 39 degrees to be adequate. If you want your VR display to look decent while covering a substantially wider angle of view than 39 degrees, you will want better than 1080p resolution. I'm gonna go out on a limb, and predict that most people will want more than a 39 degree field of view in their VR headset.
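
The parent's arithmetic checks out; a quick verification (values differ from the comment only by rounding):

    import math

    # 20/20 vision resolves a 1 arc-minute line pair, i.e. 2 pixels per
    # arc-minute, so 120 pixels per degree fully fools the eye.
    px_per_deg = 2 * 60

    # The ~300 dpi print rule: 1 inch viewed from 24 inches subtends ~2.4 deg.
    inch_deg = math.degrees(2 * math.atan(0.5 / 24))
    print(f"{inch_deg * px_per_deg:.0f} dpi to match 20/20 at 2 ft")   # ~286

    diag_px = math.hypot(1920, 1080)                                   # ~2203
    print(f"{diag_px / px_per_deg:.1f} deg at full 20/20 quality")     # 18.4
    print(f"{diag_px / (px_per_deg / 2):.1f} deg at half quality")     # ~37; the parent's 39 is the same ballpark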

Re:Oculus Rift (0)

DarwinSurvivor (1752106) | about 6 months ago | (#47260235)

Yes, resolution DOES matter. A line of text requires a certain number of vertical pixels to be legible. Whether that line is an inch high or a quarter-inch high makes no difference. For people that need to see more at once, they absolutely do need more pixels. The image from a 1080p projector may look fine from across the room, but you can still only see a small amount of text at a time.

You are making the fundamental error of assuming people just want their displays to look nice instead of actually being able to see either fine detail or large quantities of information at the same time. Some of us DO need (or want very, very much) more pixels on our displays.

Re:Oculus Rift (1)

Anonymous Coward | about 6 months ago | (#47260419)

Spoken like a spoiled brat.

Go use a single-tasking 80-column system for a week, then come back and say 1080p is a small amount of text.

display port (5, Interesting)

rogoshen1 (2922505) | about 6 months ago | (#47258731)

DisplayPort doesn't have the same limitations that HDMI has at those resolutions, and is available now.

Nvidia 6xx and ATI 7xxx (not to mention Intel HD 4000) are not exactly brand new, and are available now.

If anything, this sounds like "HDMI is showing its age; use DisplayPort."

Re:display port (3, Interesting)

complete loony (663508) | about 6 months ago | (#47258915)

HDMI was showing its age the moment it was designed. All of the design and planning behind HD TVs was short-sighted, as if they never planned to replace it.

Re:display port (1)

InvalidError (771317) | about 6 months ago | (#47259093)

DisplayPort did not support 2160p60 out-of-the-box either; it needed v1.2 to get there.

HDMI can do 2160p60 too, just needs v2.0.

Re:display port (2, Informative)

Anonymous Coward | about 6 months ago | (#47259203)

Oh, you mean v1.2, which came out in 2009, and which virtually every DP-capable graphics card and monitor supports?

Re:display port (2, Insightful)

Anonymous Coward | about 6 months ago | (#47259615)

Building on that, is HDMI 2.0 even shipping yet?

2009 vs 2015, maybe?

Re:display port (1)

aliquis (678370) | about 6 months ago | (#47259157)

Yupp.

False claims for clueless idiots - old news and dupes.

Re:display port (1)

Anonymous Coward | about 6 months ago | (#47259159)

I have no clue where the problem lies (first guess would be something about the MST handshake or similar), but using a 60Hz 4K monitor (Dell UP3214Q) with a Radeon R9 280X can be a bit buggy. There are two common problems:

  • When waking up the monitor (especially after a long sleep), there's quite a high chance (I haven't measured it, but it tends to happen every other day or so) that it won't actually wake up, and I need to shut down the monitor, wait a few seconds and then power it up. This also causes windows and such to move around to the first half of the monitor.
  • Sometimes the resolution completely fucks up and I need to unplug/replug the monitor several times to get back to 4K.

Re:display port (2)

Jumunquo (2988827) | about 6 months ago | (#47259231)

Totally agree. Nvidia 6xx has been out for a long time, and a 660 costs like $150. Anyone who buys a 4k monitor for $1000+ is not going to think twice about getting a matching video card. For gamers, in all likelihood, they probably already have one. The article claims a hardware barrier that is simply not an issue.

The real issue here is the price point. 2560x1440 27" monitors have been around for a long time, but it wasn't until they dropped under $400 that gamers started chomping them up. When they get low enough in price, the graphics card cost can become an issue for non-gamers who are just using integrated graphics. There's also the issue of whether 2560x1440 at 27" is good enough, especially for gamers, because given the distance from keyboard to monitor, going bigger than 27" doesn't seem that great, and at 27", 2560x1440 pixels are already so small that most people can't find the dead pixels unless you fill the screen with white.

Re:display port (1)

Twinbee (767046) | about 6 months ago | (#47259339)

One major consideration in the exact model of my new gfx card (750 Ti) came down to whether it had DisplayPort. The EVGA version was one of the only ones to have it.

Re:display port (1)

aliquis (678370) | about 6 months ago | (#47259555)

http://www.prisjakt.nu/kategor... [prisjakt.nu]
3 GTX 750 with DVI.
11 GTX 750 without DVI.

That's super surprising to me.

209 GTX 7__ with DVI.
20 GTX 7__ without DVI.

Is it some special thing with the 750? I know it's the only Maxwell card out yet.

It would be more understandable if they missed DVI because that one is abandoned and won't be upgraded. Then again lots of devices have DVI so I guess we won't get rid of it all too quickly (disturbing to have both DVI and HDMI in the first place.)

Re:display port (1)

DarwinSurvivor (1752106) | about 6 months ago | (#47260241)

DVI != DisplayPort

Over 30yo+ you won't see the difference anyway. (1)

MindPrison (864299) | about 6 months ago | (#47258737)

I'm not a young person anymore, but I've been on the tech wagon since I was 8 years old. And I have to admit that I was one of those people touting the high-resolution thing and pushing it forward all the time (I even made a living in the graphics industry).

But there is such a thing as too much. After 720p... from over 2 meters away from the television set, despite having air-pilot-approved eyes, I honestly could not see the difference between a 50 inch 720p and a 50 inch 1080p - I could not!

I'd rather have a TV that can be seen perfectly from any angle, with a super-fast response time for my gaming needs (my current 47" LG TV sports a 4ms response time), but there is still room for improvement. And I'd love for these screens to be in the OLED department instead of the LED (aka TFT with LED backlight) we have now.

Re:Over 30yo+ you won't see the difference anyway. (-1, Troll)

dinfinity (2300094) | about 6 months ago | (#47259051)

We're talking about monitors. Go be old and irrelevant somewhere else.

Re:Over 30yo+ you won't see the difference anyway. (2)

Jumunquo (2988827) | about 6 months ago | (#47259447)

If you watched something with high resolution and a clean picture, like Disney's "Frozen," on a high-quality display, like a Samsung 55", then you should be able to tell the difference between 720p and 1080p easily. For many things, it is hard to tell the difference at a reasonable distance. Monitors are different in that you're usually much closer to one. At 24", 720p monitors look like crap compared to 1080p. 4K, however, seems like overkill at anything below 30".

For gaming, I'm totally with you. For computer gamers, what's really popular are the 27" 2560x1440 monitors that can be overclocked, ideally to 120Hz, and that do not have a scaler (which reduces response time; it also means the monitor can only be run at 2560x1440 and has a single dual-link DVI input). Many cheaper monitors will advertise sort-of-bogus or software-corrected response times that are not representative of real-world use, so it's important to read the reviews. For the more mainstream models, tftcentral is a very good resource. It's trickier if you import from Korea trying to get the magic 120Hz overclock.

Re:Over 30yo+ you won't see the difference anyway. (1)

DarwinSurvivor (1752106) | about 6 months ago | (#47260267)

I regularly use a 1080p monitor in the 24" range and I can tell you I would *definitely* like the resolution to be higher. I do a lot of text-based work, and I can see the letters start to get blocky as I reduce the text size, while I know for a fact I could easily read even smaller text printed on a decent laser printer.

Try it one day. Use a word processor to print "the quick brown fox jumped over the lazy dog" in steadily reduced font sizes down the page. Print that page and hold it next to the computer screen at a comfortable viewing distance, then find the smallest font size you can read on the printed version and the on-screen version. If you picked the same paper and monitor sizes (as measured by a real-life ruler) and the two come out the same, you may want to see an optometrist.

Re:Over 30yo+ you won't see the difference anyway. (1)

twistedcubic (577194) | about 6 months ago | (#47260347)

Everybody says this. It has been repeated hundreds of times on Slashdot. And it is just wrong. Fuzzy text looks fuzzy whether you're 2 inches away or 2 feet away. You might not be able to see individual pixels, but you can clearly see the resolution is not sufficiently high to allow clear and crisp font rendering. I'm over 40 and my eyesight is worse than most people's, but I sure as hell know that zooming out does not make a fuzzy picture look smooth.

Faggy GUI effects? (0, Flamebait)

Gothmolly (148874) | about 6 months ago | (#47258751)

I'm not worried about "choppy" OS animations or transitions. I want high res and decent refresh for gaming, and a nice, contrasty, good-black, clear giant work area for multiple windows.

Re:Faggy GUI effects? (2)

pspahn (1175617) | about 6 months ago | (#47259079)

... clear giant work area for multiple windows.

All this.

There are way too many applications I use that fail to do anything useful for multi-monitor setups. There are a few useful features, like being able to resize window panels to customize my view, but I want to be able to tear panels off and put them on a different monitor. To me, that is vastly more important than just increasing resolution.

I currently use two monitors. One in landscape and one in portrait and I use them exactly how you'd expect, documents on the portrait screen, video/games/etc on the landscape screen. If I use Photoshop, it's great because I can use the landscape screen for the image and the portrait screen can hold all of my panels ... nice and out of the way. Unfortunately, this is one of the few suites that supports these tear-off panels. I have yet to find an IDE/coding environment that makes me happy in this regard (while also making me happy in others). If I could stand to use Eclipse, I would ... I just absolutely loathe it.

Display Port (1, Interesting)

Anonymous Coward | about 6 months ago | (#47258753)

Why is there no mention of DisplayPort? Current 4K LCDs all accept it, and with the right GPU you can most certainly drive them at 60Hz, full resolution.

This is more about HDMI being a broken standard to me. I just don't like DisplayPort because it's sort of Apple's thing.

Re:Display Port (4, Informative)

sexconker (1179573) | about 6 months ago | (#47258965)

Why is there no mention of DisplayPort? Current 4K LCDs all accept it, and with the right GPU you can most certainly drive them at 60Hz, full resolution.

This is more about HDMI being a broken standard to me. I just don't like DisplayPort because it's sort of Apple's thing.

DisplayPort is AMD's thing, through VESA. It's not Apple's thing.

Re:Display Port (3, Interesting)

strstr (539330) | about 6 months ago | (#47259239)

DisplayPort is actually Intel's and Dell's thing. They invented it.

AMD and Apple picked it up because it's the only replacement for DVI, which is capped at 1920x1200 (1200p) at 60Hz on a single link. One display only. Requires dual-link for higher resolutions. Has a large, outdated connector.

DisplayPort supports up to 8K and 4K 3D or two 4K displays per connector at 60Hz. Or 4K at 120Hz which is what I want on my display. :P

You can drive multiple DisplayPort monitors by daisy chaining them together rather than using multiple ports, too.

http://en.wikipedia.org/wiki/D... [wikipedia.org]
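
The link budgets behind that are easy to estimate (a back-of-envelope sketch: 165 MHz is the commonly cited single-link DVI TMDS ceiling, 17.28 Gbit/s is DisplayPort 1.2's usable rate after 8b/10b coding, and the ~8% blanking overhead is an approximation):

    BPP = 24  # bits per pixel

    def needed_gbps(w, h, hz, blanking=1.08):
        # Raw video bandwidth including approximate blanking overhead.
        return w * h * hz * blanking * BPP / 1e9

    dvi_single = 165e6 * BPP / 1e9   # ~3.96 Gbit/s
    dp12 = 17.28                     # Gbit/s, DisplayPort 1.2 (4 lanes, HBR2)

    print(f"1920x1200@60 needs {needed_gbps(1920, 1200, 60):.1f} Gbit/s "
          f"vs {dvi_single:.1f} on single-link DVI")   # just fits
    print(f"3840x2160@60 needs {needed_gbps(3840, 2160, 60):.1f} Gbit/s "
          f"vs {dp12:.2f} on DisplayPort 1.2")         # fits with headroom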

Re:Display Port (1)

DarwinSurvivor (1752106) | about 6 months ago | (#47260281)

Why is there no mention of DisplayPort? Current 4K LCDs all accept it, and with the right GPU you can most certainly drive them at 60Hz, full resolution.

This is more about HDMI being a broken standard to me. I just don't like DisplayPort because it's sort of Apple's thing.

Nope [amazon.com]

What?! (5, Interesting)

RyanFenton (230700) | about 6 months ago | (#47258755)

I'm typing this on a monitor with 3840x2160 resolution, at 60Hz, right now. I posted about it weeks ago:

Clicky [slashdot.org]

It's like $600 when on sale, and it works superbly for coding and playing games. Skyrim and Saints Row 4 play fine on a GTX 660 at 4K resolution; you just disable any AA (not needed) but enable vsync (tearing is more visible at 4K, so just use that). Perhaps that's just me - but things seem fine at 4K res on a medium-cost graphics card.

A few generations of video cards, and everything will be > 60-FPS smooth again anyway (partially thanks to consoles again), so I don't really need to wait for a dynamic frame smoothing algorithm implementation to enjoy having a giant screen for coding now.

I don't see any reason why you'd want to wait - it's as cheap as two decent monitors, and if you're slightly near-sighted like me, it's just really great. See my previous post for a review link and an image of all the PC Ultima games on screen at once.

Ryan Fenton

Re:What?! (1)

MindPrison (864299) | about 6 months ago | (#47258773)

I agree, for coding - the more resolution, the better...no doubt about that.

Re:What?! (0)

Anonymous Coward | about 6 months ago | (#47259019)

> I'm typing this on a monitor with 3840x2160 resolution, at 60hz right now. I posted about it weeks ago:

Wish there was a 40" class version of that. High DPI is for graphics, but it is counter-productive for text. I wouldn't mind a 20% improvement in DPI, but after that you just end up making the fonts bigger, which is a waste.

Re:What?! (1)

Jumunquo (2988827) | about 6 months ago | (#47259517)

Personally, I'd rather get something like this for ~$350 on sale:
http://www.monoprice.com/Produ... [monoprice.com]
The pixels are already so small, people can't find the dead pixels, and it's AH-IPS, the highest quality panel, not TN, the cheapest.

Single-tile too (2)

Namarrgon (105036) | about 6 months ago | (#47259557)

The other nice thing about the Samsung UD590, apart from 4K @ 60Hz, is that it presents itself as a single 4K monitor, rather than two half-size monitors tiled next to each other. That can make a big difference to some uses, like running games at lower resolutions. The Asus PB287Q is another such single-tile 4K monitor.

custom resolutions, people! (0)

Anonymous Coward | about 6 months ago | (#47258775)

You don't need an Nvidia GTX 6xx series GPU or higher to run 4K. You don't even need a 4K monitor to output 4K.

In the GPU control panel, just set scaling to your GPU (instead of your monitor), create a custom resolution (up to 4K) and there ya go. Your desktop will run at this resolution, games will play at this resolution, etc.

The main thing holding back 4K right now is current-spec HDMI and that's about it. 30Hz is terrible. HDMI 1.4a can run 18xxp @ 59Hz and it ain't bad, but it isn't quite 4K either.

If you need more info on this, just google "downsampling". For graphics whores, it's a massive IQ (image quality) boost. Can you play Watch Dogs at 4K + max settings? Probably not, but a ton of Unreal Engine 3 games play at ~4K just fine on semi-beefy GPUs.
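
Downsampling here is just supersampling by another name: render more pixels than the panel has, then filter them down to native resolution. A toy illustration of why that reads as anti-aliasing, using Pillow as a stand-in for what the GPU scaler does in hardware:

    from PIL import Image, ImageDraw

    def render(scale):
        # Draw a diagonal line at `scale` times a 320x180 output resolution.
        img = Image.new("RGB", (320 * scale, 180 * scale), "white")
        d = ImageDraw.Draw(img)
        d.line((0, 0, 320 * scale, 180 * scale), fill="black", width=scale)
        return img

    render(1).save("native.png")  # hard, stair-stepped edge
    # Render at 4x, then filter down: the edge comes out smoothed, like MSAA.
    render(4).resize((320, 180), Image.LANCZOS).save("downsampled.png")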

Ack! I'll take muh 1080p monitor thank you. (2, Funny)

Anonymous Coward | about 6 months ago | (#47258779)

But all I really need is a LCD running 720p.
Truthfully all I really need is a super vga CRT.
In all honesty I could live with the warm glow of an ega screen.
Net net I miss a nice monochrome to get me through.
All things considered, teletype handles 99% of my day to day needs.
Actually, I feel like anything more than a single blinking indicator light is pretty decadent.

Re:Ack! I'll take muh 1080p monitor thank you. (1)

preaction (1526109) | about 6 months ago | (#47258791)

To be fair, a teletype would solve 80% of what I need, with a video-capable tablet providing the rest...

Re:Ack! I'll take muh 1080p monitor thank you. (1)

Bing Tsher E (943915) | about 6 months ago | (#47258901)

You don't need lower case, you read slower than 10 cps, and you've got a storeroom in back with shelves of rolls of yellow pulpy paper? And the recyclers call frequently?

Re:Ack! I'll take muh 1080p monitor thank you. (0)

sexconker (1179573) | about 6 months ago | (#47258979)

To be fair, a teletype would solve 80% of what I need, with a video-capable tablet providing the rest...

To be fair, a non-tele blow job would solve 98% of what I need.

I actually own one of these, it's incredible (1)

Anonymous Coward | about 6 months ago | (#47258781)

As my title says, I have a Seiki 4K 50" that I use as a monitor. My biggest mistake was getting the 50"; I think the 39" would have been better.

But I watch a lot of TV as well with the wife, so this is nice. The 30Hz refresh rate at full 4K means it's not a 4K gaming machine; however, you can drop it down to HDTV resolution and it hits 120Hz. But for development, or pictures, it's incredible. And if you can find some 4K video, the result is stunning. And Google Maps is incredible as well.

I still keep a second monitor around, but that's for 3D gaming. And I have an Oculus Rift as well. Overall these monitors really do some things very, very well. And for software development, the resolution and size mean you can see more code at once - which is always good.

Wait for G-Sync vs. FreeSync to finish (1)

gman003 (1693318) | about 6 months ago | (#47258783)

This seems to be a time when monitor features are growing fast. I'm personally going to stick with my 1440p screen until it stabilizes a bit.

The G-Sync/FreeSync battle is going to start. For gamers, this is going to be big. Right now, G-Sync only works with Nvidia cards, and FreeSync will probably only work with AMD cards. FreeSync is much better licensed, and I expect it will probably win eventually, but I tend to prefer Nvidia cards so I'm willing to wait until we get a clear winner.

Basically, my dream monitor right now would be:
under 28" diagonal
full AdobeRGB gamut or better, factory-calibrated (if significantly wider than AdobeRGB, needs 10-bit color support)
refresh rates up to at least 120Hz, variable using either Sync method as long as it works with any card I buy
resolution of 3840x2400 or higher (16:10 aspect ratio)
no need for multiple data links (as some current 2160p monitors require)
sub-millisecond input latency

I would naturally be willing to compromise on many of those points, but the way the market is going, I might not have to. And what I have right now is plenty good enough to last me until things become more future-proof.

Re:Wait for G-Sync vs. FreeSync to finish (1)

Twinbee (767046) | about 6 months ago | (#47259381)

Which is technically better out of G-Sync and FreeSync?

I agree with your ideal choice of monitor, btw! Apart from the size, which should be bigger but further away. That way your eyes would be more relaxed.

Re:Wait for G-Sync vs. FreeSync to finish (2)

gman003 (1693318) | about 6 months ago | (#47259857)

I live in a rather small apartment and would really like a triple monitor setup. So I prefer smaller hardware. I'm also nearsighted and usually take my glasses off when computing for a long period, so smaller, closer displays are actually more relaxing. But to each his own.

As far as which is technically better, I haven't seen any solid comparisons. G-Sync does use proprietary hardware in the display, which means it has the potential to do a lot more. FreeSync works with existing panels provided they support variable V_BLANK, which not many do yet, and none expose it to the GPU.

FreeSync has been incorporated into the DisplayPort standard (as "Adaptive-Sync", an option in DP1.2a and 1.3) but no displays have made it to market yet. G-Sync has the advantage of shipping, but unless it's either far superior in a technical manner, or Nvidia flat-out refuses to support Adaptive-Sync, I expect it to die sometime next year when the competition arrives.

BUY NOW because you have to be ready (1)

alen (225700) | about 6 months ago | (#47258793)

when the 4K content starts coming out
because you know, they will stop selling these soon and you will never be able to buy one to view all the 4K content coming out soon
or they will drop in price to the point where kids can afford them on their allowance, but you have to buy it NOW and Before this happens just to be the first one to watch 4K content

Re:BUY NOW because you have to be ready (1)

Twinbee (767046) | about 6 months ago | (#47259397)

I admire those first customers, because without them and the rich, the ball would never get rolling and we'd all be without it forever.

4k media? (1)

asmkm22 (1902712) | about 6 months ago | (#47258797)

It just seems like the options down the road for media that can store 4k are a bit limited. Streaming seems out of the question when we can't even get consistent 1080p streams out to people. Blu-ray would need some major overhauls unless people want to have 4k movies come on 10 to 20 discs, and something tells me people aren't going to rush out to embrace a new media format even if it did get that overhaul. I just can't help but think 4k tech will have to be targeted at niche industries like photo editing and maybe CAD-type stuff. I could also see a push towards the medical industry. But the average consumer? Not happening.

Re:4k media? (1)

Jumunquo (2988827) | about 6 months ago | (#47259595)

Yup. Despite Google showing it can be done, the cable companies only put in fiber wherever Google puts in fiber, just to screw with them. Most ISPs have a 200-300GB cap per month on data, so you'd better not watch more than, like, five 4K movies a month. Media, yea, I hear ya; many Blu-ray sets now come with both DVD and Blu-ray discs, and I bet it's because a lot of people don't have Blu-ray players but want their purchase to be future-proofed.

Re:4k media? (0)

Anonymous Coward | about 6 months ago | (#47260005)

I just can't help but think 4k tech will have to be targeted at niche industries like photo editing and maybe CAD type stuff. I could also see a push towards the medical industry. But the average consumer? Not happening.

The good thing is for us geeks: Windows can finally stop massively resizing our 9000+ megapixel pictures to fit our crummy 1-point-something megapixel screens. YUCK! Reminds me of how, until 5 years ago, all video resolutions were stuck at 640x480, which was born a decade earlier.
My acquaintances over fifty still have XP computers running 4:3 monitors on single-core machines. Viewing modern camera pictures and YouTube videos is heavy on those machines. My own monitor is 1600x900, and they will probably upgrade to that long before there's a replacement 10 years from now that can do 4K for their failing eyes.

The elephant in the room is digital photography. Granted, in mainstream homes I can never find people who know how to upload their cellphone photos to their PCs. That's the weak spot. At this rate phones will be on the 4K bandwagon long before people replace their expired MMX machines. Those were bought solely to bring each household to the internet back when 2-contract subsidies and sub-$1000 prices were a novelty. I wonder what will be the trigger for HD on the home desktop.

Re:4k media? (0)

Anonymous Coward | about 6 months ago | (#47260023)

s/2-contract subsidies/2-year-contract subsidies/
Oops

Who needs HDMI? (1)

Kjella (173770) | about 6 months ago | (#47258815)

I got a UHD @ 60Hz single-stream transport here in the Samsung U28D590D. There's not much video content yet except for a few porn sites, but for stills it's brilliant. Software support for increasing font size is mediocre in many apps, but they're usually functional, just ugly. I wish there was some way to just tell Windows to draw a window at 200% size instead. Gaming is cool, though my graphics card chokes on the resolution when things get heavy; I guess it needs an upgrade now that it's pushing 4x the pixels. Overall I'm happy; yes, I'm an early adopter, but the bleeding edge is more like a paper cut.

Panasonic with HDMI-2.0 (1)

TheSync (5291) | about 6 months ago | (#47258817)

I can confirm that the Panasonic TC-L65WT600 [panasonic.com] 65" 4K UHDTV can play 60 fps 4K over its HDMI 2.0 connector (yes, I actually have access to 4K/60p content and a 4K/60p video server). I have seen it for as low as $3500 on BestBuy.com.

Future is here for some displays on OSX (1)

rsborg (111459) | about 6 months ago | (#47258819)

4K displays @ 60Hz with Retina pixel doubling = fantastic coding display [1]
Of course, I don't have this at work - I have two separate 24" monitors, but I spend most of my time on my 15" Retina screen.

[1] http://support.apple.com/kb/ht... [apple.com]

Honey Boo boo (-1)

Anonymous Coward | about 6 months ago | (#47258833)

I don't know about you, but why the rush to see Honey Boo Boo in 4K resolution? Aren't we appalled enough already?

Seiki 4k for $500 (1)

Rinikusu (28164) | about 6 months ago | (#47258835)

I've been considering one of these bad boys for a while now. Cheap, and for what I intend to use it for (software dev and video editing, where the 30Hz refresh isn't a big deal), good enough. It's not something I'd use for gaming, at least at 4K, but hey... $500.

Re:Seiki 4k for $500 (1)

Anonymous Coward | about 6 months ago | (#47259039)

Yes, that was my thinking - and I haven't regretted it; the real estate is great, and if I wanted to game I could always use my old monitor.
The Seikis have size in their favor too - I'm not someone who gets the point of 4K in a 28" display; the 39" is ideal, I think.

Re:Seiki 4k for $500 (1)

marked23 (693822) | about 6 months ago | (#47260323)

39" is ok. I've had the 39" Seiki for a few weeks now. I still have a normal (1920x1200) second monitor. So I don't want to run the larger fonts in Windows. I tried, and didn't like it.

For my taste, I think 50" is probably the minimum size for a 4k monitor if you want to use the default font size in Windows.

Re:Seiki 4k for $500 (0)

Anonymous Coward | about 6 months ago | (#47259117)

They are $390 shipped nowadays. Been at that price point for a few weeks.

Re:Seiki 4k for $500 (2)

Squash (2258) | about 6 months ago | (#47259185)

Really, it's fine for anything this side of gaming. Even YouTube and local media play just fine. Very little out there has a framerate over that 30Hz mark. The only real downside is that you can only fit one of them on your desk at a time.

Re:Seiki 4k for $500 (1)

strack (1051390) | about 6 months ago | (#47259669)

Think of it as 4 monitors with no bezels splitting them up.

Re:Seiki 4k for $500 (0)

Anonymous Coward | about 6 months ago | (#47260327)

I currently have 3 27"s on my desk and could reasonably fit 3 30"s. I don't buy the "you can only fit one on your desk at a time" statement.

TV is only 30Hz (1)

Anonymous Coward | about 6 months ago | (#47258859)

> a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we're used to on TVs and monitors.

Uh, no. The vast majority of TV content is 30Hz or less. Films and basically anything that isn't "live" is 24Hz; the live stuff is 30Hz (60Hz interlaced, but that is still only 30 full frames per second on the wire). The only place where we regularly have actual 60Hz content is sports, and then it is only 720p anyway because of bandwidth limitations in the ATSC broadcast spec.

So, if what you care about is current TV/film content, one of these HDMI 1.4-spec displays is perfect. Maybe in the future it won't be. But 99.999% of the content available to you now will work just fine. Plus, actual 4K content of any refresh rate is as rare as hen's teeth. Netflix has a couple of series, and there are demo clips floating around (the Blender guys have some, like Tears of Steel).

4K is nice but... (1)

the eric conspiracy (20178) | about 6 months ago | (#47258879)

Having a full color gamut is important too. And a really good contrast ratio.

So I'm saving my pennies for an OLED 4K display. At 80". And none of that curved bullshit.

Re:4K is nice but... (1)

NemoinSpace (1118137) | about 6 months ago | (#47259003)

Yeah, curved! Man, who buys this stuff? 3D sucks too. And Beta.

Re:4K is nice but... (1)

Areyoukiddingme (1289470) | about 6 months ago | (#47260187)

Having a full color gamut is important too. And a really good contrast ratio.

Check out the reviews of the Asus PB287Q. Very nearly full color gamut. These ain't your daddy's TN panels.

Yeah OLED would be nice, but I'd be surprised if an UltraHD or 4K OLED display is affordable this decade.

I'd settle for (1)

rossdee (243626) | about 6 months ago | (#47258883)

A 30 inch monitor, 16:10 aspect ratio, and 2560x1600.

The only reason I would want a much higher resolution than that is to overcome the problem of scaling on digital displays; in the old days of analog monitors we could run different resolutions without it looking like shit.

I currently have a 28 inch 1920x1200 monitor, but they don't make those anymore.

Moderators suck cocks in hell! (-1)

Anonymous Coward | about 6 months ago | (#47258997)

You think you can keep me off of here? Think again assholes.

Great for Presentations (0)

Anonymous Coward | about 6 months ago | (#47259095)

We recently bought a 65 inch 4K monitor to replace an HD projector. One major downside with a projector is that you have to turn out the lights to see it; that is not a problem with a 4K monitor. The problem with older large-screen TVs was that the text on them was horrific, but that's not the case with the 4Ks; they are crystal clear. So as a presentation device, a 4K monitor is an awesome tool.

I admit though, I wouldn't game on one as they sit now, but they do have their uses.

Ow, the ignorance (5, Informative)

jtownatpunk.net (245670) | about 6 months ago | (#47259115)

Was that summary written by someone who's never used a 30Hz 4k display?

A 30Hz feed to an LCD panel is not like a 30Hz feed to a CRT. CRT phosphors need to be refreshed frequently or the image fades; that's why 30Hz was all flickery and crappy back in the 90s. But 30Hz to an LCD isn't like that. The image stays solid until it's changed. A 30Hz display on an LCD is rock solid and works fine for a workstation. I know. I've seen me do it. Right now. There are no "transition" issues, whatever that is supposed to mean. Nothing weird happens when I switch between applications. Multitasking works fine. I'm playing multiple HD videos without a hitch, the same way 30Hz 1080 programming from cable and satellite plays just fine on LCDs. Gaming's not great, but turn on vertical sync and it's not terrible. I'd rather be running at 60Hz, but I got my 4K panel for $400. It'll hold me over until displays and video cards with HDMI 2 are common.

Re:Ow, the ignorance (2)

strstr (539330) | about 6 months ago | (#47259775)

A 30Hz display works, but not like one with a high refresh rate. He's talking about motion looking like shit, bleeding together, or looking as if it's not smooth.

It's especially noticeable when you play games and use your mouse. With video it's not so noticeable, because video is 30fps - except HD at 60fps where motion is fast, like sports and action shots.

Moving a window around on your desktop will show noticeable jumpiness when done rapidly. Anything where motion happens fast will.
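
The jumpiness is simple frame-time arithmetic: the faster something moves, the bigger the gap it covers between refreshes. A quick sketch (the 2000 px/s flick speed is an assumed figure):

    SPEED_PX_S = 2000  # assumed: a brisk mouse flick across a 4K desktop
    for hz in (30, 60, 120):
        print(f"{hz:>3} Hz: {1000 / hz:4.1f} ms per frame, cursor jumps "
              f"{SPEED_PX_S / hz:.0f} px between frames")

At 30 Hz the cursor leaps about 67 px per frame versus 33 px at 60 Hz, which is exactly the visible difference being described.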

My computer can but no interest right now (1)

EmperorOfCanada (1332175) | about 6 months ago | (#47259209)

None of my tech friends are breathlessly awaiting 4K monitors. If I go to Staples to replace my monitors some day and see that the 4K one is $50 more than the regular one, then OK, I'll happily buy one. But if it is $200 more, then no, I'll wait.

I am not saying that 4K is a stupid idea, or that I hate 4K; if it turned out that one of my present monitors had a switch on the back that would switch it to 4K, I would be delighted. But when it comes to budgeting my money, there are a huge number of things that would make my workflow a whole lot better that I would rather spend my money on. 4K is nice, but just not needed. I think I speak for most people who aren't doing video editing.

But I suspect that for the next 3-5 years I am going to be reading various tech blogs breathlessly reviewing the latest 4K monitors as they drop lower and lower in price. But again, the spread between regular and 4K will have to be pretty small before I make the jump.

A 4K TV, on the other hand, would be pretty cool, and I think Netflix has some 4K-ready programming, so I would probably make that leap long before a monitor.

Re:My computer can but no interest right now (1)

Areyoukiddingme (1289470) | about 6 months ago | (#47260179)

A 4K TV on the other hand would be pretty cool and I think that Netflix has some programming 4K ready so I would probably make that leap long before a monitor.

You have that pretty backwards. UltraHD is immediately useful for a monitor, if you actually do work with a computer and aren't one of these people who think work can only be done in a maximized window. There's not much video in that resolution yet and at any distance it's not immediately obvious what resolution a TV is, but you can put all the text you want on screen at that resolution and you sit within arm's length of your monitor.

colour space (0)

Anonymous Coward | about 6 months ago | (#47259287)

Rather than resolution, I'm curious to see the expanded colour space that is defined by ITU Rec. 2020:

http://wolfcrow.com/blog/say-hello-to-rec-2020-the-color-space-of-the-future/
https://en.wikipedia.org/wiki/Rec._2020

Currently, HDTV (Rec. 709) covers about 35% of the CIE 1931 color space—which is basically the totality of what the eye can see. UHDTV/Rec. 2020 will cover about 75% of CIE 1931. A lot more shades of green will be visible on standards-compliant screens.
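
The jump is visible from the published primaries alone: the shoelace formula gives each gamut triangle's area in CIE 1931 xy. (This compares the two triangles to each other; the ~35%/~75% locus-coverage figures above additionally require the spectral locus boundary.)

    def tri_area(pts):
        # Shoelace formula for a triangle in CIE xy coordinates.
        (x1, y1), (x2, y2), (x3, y3) = pts
        return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

    rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
    rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
    print(f"Rec. 2020 triangle = {tri_area(rec2020) / tri_area(rec709):.2f}x "
          f"the area of Rec. 709's")   # ~1.89x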

Won't buy new until 8K monitors (1)

Anonymous Coward | about 6 months ago | (#47259943)

I've been running a 30" 2560x1600 for many years now. Why would I want to pay a bunch of money for only double the pixels and a screwy aspect ratio? No thanks. I'll wait for something more along the lines of 5120x3200.

multiple inputs for 4k? (1)

Touvan (868256) | about 6 months ago | (#47260039)

Computers can handle multiple monitors at 60Hz, so why not 4K with dual inputs? Is that feasible, and are there some models on the horizon that have multiple HDMI, dual-DVI, or dual DisplayPort (pre-Thunderbolt-2 DisplayPort - I don't know the version numbers)? It seems it could be possible.

Re:multiple inputs for 4k? (1)

Areyoukiddingme (1289470) | about 6 months ago | (#47260169)

Computers can handle multiple monitors at 60Hz, so why not 4K with dual inputs? Is that feasible, and are there some models on the horizon that have multiple HDMI, dual-DVI, or dual DisplayPort (pre-Thunderbolt-2 DisplayPort - I don't know the version numbers)?

The Asus PB287Q has two HDMI and one DisplayPort and supports dual simultaneous input from any two of them. They call it Picture-by-Picture mode. They put two HD displays side by side, with black bars above and below, from two different machines. It's slightly silly, since it's not exactly convenient to switch to that mode, but it's available. It will also do Picture-In-Picture mode, displaying one input across the full screen and the other in a window up in the corner, all rescaled in software transparently to the machines outputting the signals.

HDMI 2.0 is available on a lot of these monitors (1)

jzatopa (2743773) | about 6 months ago | (#47260139)

Not sure why this post doesn't even mention that HDMI 2.0 is already built into all but the lowest-end 4K monitors.