Scott, did you completely miss the crux of my argument? I said that the tools will catch up to the hardware. Mouse capture is a super easy keyboard shortcut away.
My whole point is that once the tools catch up, you'll be able to see the messages at any point around the game you want, without leaving your game, AND without having to look away at another monitor.
Apple has never given any shits about gaming on Macs. Don't expect this to change any time soon. Do you guys even have proper joystick support yet?
I specifically said that when the hardware is available for non-Apple computers, as in Windows PCs, tools for gaming will catch up. I also specifically said that there is no need or call for any gaming tools for this or any Mac.
I'm not talking about now. My original statement was about the future of computing.
Once these screens come to non-Apple PCs (or any machine other than this single one), I think the tools will catch up. There's probably just not a big market for gaming tools on the Mac yet!
http://accessories.us.dell.com/sna/productdetail.aspx?c=us&cs=04&l=en&sku=UP275K3
That monitor has an insane price. It supposedly manages a 60Hz refresh rate, which is impressive at that resolution. But the real problem is that you're going to have to make extra sure your video card can actually output 5K properly.
Even assuming all those things work, none of the PC software properly supports 5K. You see, OSX and iOS have all sorts of support for high-DPI displays built in from the ground up that nothing else has. This makes such screens unusable on non-Apple OSes.
Let's take the most basic application as an example: your web browser. Chrome wasn't created with such small pixels in mind. Chrome will tell the OS to draw a tab, that it should be X pixels in size, and that it should use such and such graphics. On a PC, the OS will do exactly as it is told. That tab will be impossibly small to see because the pixels are impossibly small. OSX knows that the program wasn't written with retina in mind, and it resizes everything in the GUI stack to be the appropriate size and look good.
The same goes for fonts and especially images. If a program tells Windows to draw a font or image at a particular size, it will do as it is told, and that font or image will be impossibly small on that Dell monitor. OSX knows how to draw a font properly no matter what, so the on-screen size of the font will always be the readable size you expect. But it also knows how many pixels it has, so it will use all of them to draw the font as smoothly and beautifully as possible. For an image, it will draw it at the visible size you expect, but if the image is high enough resolution, it will use every available pixel to display it at maximum fidelity.
This is all transparent to the user and largely even transparent to the application developer. It just works. Windows has no support for any of this whatsoever.
So what happens on Windows with such a screen is that most of your applications actually end up being completely useless. You can't click the buttons in Photoshop because they are too small, though there are some hacks to fix it (maybe). You can't read the text in the location bar of your browser. You have to zoom in like a blind person on everything. It's awful.
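Here's a rough back-of-the-envelope sketch in Python of the points-versus-pixels idea, if it helps. The scale factor, the ~218 ppi figure for a 27" 5K panel, and the 44-point button are all just illustrative assumptions on my part, not any real OS API:

```python
# Toy model of HiDPI scaling: the OS separates logical points from device pixels.
# All numbers are illustrative assumptions, not real API values.

SCALE = 2   # backing scale factor on a retina screen: 2 device pixels per point, per axis
PPI = 218   # approximate density of a 27" 5K panel (5120 px over ~23.5" of width)

def physical_inches(pixels, ppi=PPI):
    """Physical size on screen of a run of pixels at a given density."""
    return pixels / ppi

# DPI-aware OS (the OSX behaviour described above): the app asks for a
# 44-point button, and the OS rasterizes it at 44 * 2 = 88 device pixels.
print(physical_inches(44 * SCALE))  # ~0.40" -- same physical size as on a ~109 ppi screen

# DPI-unaware app with no OS scaling: it asks for 44 *pixels* and gets exactly 44 pixels.
print(physical_inches(44))          # ~0.20" -- half size; the "impossibly small" case
```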
As Scott has discussed already, scaling is the main issue for me.
Hardware for gaming at this resolution is insanely expensive, where it exists at all. Scaling down to 1920x1080 introduces some blurring.
The extra desktop room is very helpful. I used a 30" monitor (it was only 2560x1600) for 7 years, and I would often have 3 or 4 windows open and be able to use them all, e.g. a text editor, VLC and 2 Chrome windows. For gaming and movies I absolutely needed them full-screened, as the immersion was lost otherwise.
When I downgraded to 1080p, I was like WTF how do people use this (for multitasking).
They'll probably have to start coming down in price, or have a computer built in, to catch on, I guess.
Having a computer built in defeats the purpose of having a PC. Prices are coming down already, ranging from $400 up to the $2500 price point. The price differences come down to physical size (24" and up), colour gamut, and accuracy.
Cut out gaming from my use case and I would buy one or two of these.
Having a computer built in was a sly comment on the Dell 5K screen being exactly the same price as the basic iMac Retina.
Gaming aside (as that wasn't part of my original point), instead of having 3 or 4 windows open on a screen, with 5K you'll be able to have 7 or 8 open. Then "one or two of these" won't be needed, as just one will do the job. And not just do the job well, but once better window managers are developed, do the job better and more flexibly than two screens. I think it'll get to the point where having extra screens will be seen as very dated.
> Having a computer built in was a sly comment on the Dell 5K screen being exactly the same price as the basic iMac Retina.
> Gaming aside (as that wasn't part of my original point), instead of having 3 or 4 windows open on a screen, with 5K you'll be able to have 7 or 8 open. Then "one or two of these" won't be needed, as just one will do the job. And not just do the job well, but once better window managers are developed, do the job better and more flexibly than two screens. I think it'll get to the point where having extra screens will be seen as very dated.
If you're not gaming, then just get a Mac.
Also, now that I think about it, I still don't understand how a higher resolution screen means you can fit that many more windows.
Let's say, for example, you have a 24" monitor. On the left half you put a web browser and on the right half you put a terminal. Now you double the resolution. To display the same information you can just use 1/4 of the screen. It's like you have three more monitors!
But the screen is still 24". That browser and terminal are now 1/4 their original real size. How can you read them? Your browser used to be about 10" wide, but now it's about 5" wide. That's too small to see even if the text is really clear and crisp. To have it be the same size it originally was, it would have to be... 10" wide. And you're not fitting any more windows in there.
Sure, the smaller pixels do make it easier to read text at smaller sizes because of increased clarity. But it's still not a 4X increase. Maybe you can put one or two extra windows in there and make the font a few points smaller. Even with my incredible vision, unscaled apps on 4K and 5K monitors are too small to see.
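You can check the arithmetic yourself. A quick sketch in Python, assuming a 16:9 24" panel (the numbers are purely illustrative):

```python
import math

def panel_width_inches(diagonal_in, aspect_w=16, aspect_h=9):
    """Physical width of a panel given its diagonal and aspect ratio."""
    return diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)

width_in = panel_width_inches(24)  # ~20.9" of physical width on a 24" 16:9 panel

# A browser filling the left half of a 1920-wide desktop is 960 px wide.
browser_px = 960
print(round(browser_px / (1920 / width_in), 1))  # ~10.5" wide at 1920 horizontal
print(round(browser_px / (3840 / width_in), 1))  # ~5.2" wide at 3840 horizontal

# Same pixel count, double the resolution: the window's physical width halves,
# because the inches are fixed. That's exactly the 10" -> 5" shrink above.
```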
> Also, now that I think about it, I still don't understand how a higher resolution screen means you can fit that many more windows.
That's understandable. You really have to experiment with two 27 inch monitors side by side, one normal and one Retina.
It's as simple as being able to have the screen closer to your face. That's it. You don't need the screen a long way off to smooth out pixels and make text look good. Up close is fine! Just like you're fine holding a retina phone or iPad closer to your face than your monitor.
With the non-retina screen, all I could see was the mesh of pixels and the gaps between them. It was worse than my 15 inch MacBook Pro with Hi-Res screen (the old near-1080p screen, not the Retina), and was only tolerable at a distance.
It sounds obvious, and I guess it is, but it hadn't really dawned on me that this was an option with desktop computing until I experimented in the store. And it seems to be non-obvious enough that you didn't make the leap either.
And to be clear, it's not a question of unscaled apps. You need the ability to zoom and scroll and arrange wherever, not bounded by physical pixels.
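If you want rough numbers behind the viewing-distance point, here's a little sketch. The ~60 pixels-per-degree "can't resolve individual pixels" threshold and the distances are my assumptions, not measurements:

```python
import math

def pixels_per_degree(ppi, distance_in):
    """How many pixels fit in one degree of visual angle at a given viewing distance."""
    return ppi * distance_in * math.tan(math.radians(1))

for ppi in (109, 218):       # 27" at 2560x1440 vs 27" at 5120x2880
    for dist in (18, 28):    # inches from eye to screen
        print(f"{ppi} ppi at {dist}\": {pixels_per_degree(ppi, dist):.0f} px/deg")

# ~60 px/degree is a common ballpark for where individual pixels stop being visible.
# 109 ppi: ~53 at 28", dropping to ~34 at 18" -- the pixel mesh shows up close.
# 218 ppi: ~68 even at 18" -- which is why the retina screen holds up near your face.
```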
So is it a bad idea when you put your phone closer to your face than your screen? Or your tablet closer to your face than your screen?
Yes, if you are holding it there for extended periods, shining a light directly into your eyes. A desktop is even worse, since you are leaning forward. I guess you could alleviate that by bringing the screen forward to you instead of bringing your face forward to the screen.
Turn down the brightness and bring the screen forward on the desk.
Or just keep doing things the way you do them now, but enjoy text, images, and videos being sharper and more beautiful with all those extra pixels.
Even now without retina screens I rarely ever do anything windowed. Almost everything I do is full screen regardless of screen size. Full-screen Adobe apps, full-screen browser, full-screen terminals, full-screen text editors, full-screen games, full-screen videos. I guess sometimes I'll have a terminal and text editor each taking up exactly half the screen using the Win+Left and Win+Right shortcuts. When I want two things open side by side, I use multiple monitors. Text editor on one screen, browser on the other screen.
I can see how an OSX user would be very unlikely to behave this way since the maximize function has never been properly implemented, and only recently has full screen even been supported properly.
As a human being, even though I may switch between tasks frequently, I am only ever actually doing one thing at any particular moment. Every pixel on a screen that isn't dedicated to the application I'm currently using is a wasted pixel serving only to distract me from the task at hand. Even if they make 10K or 20K screens, I'm probably still going to full-screen everything and just enjoy how much easier it is to edit high-res photos and videos when Lightroom and Premiere are full screen. iOS really gets this right.
The only time I think that I do not use full screen is when there are multiple applications that interact with each other via click+drag type functions. For example if I'm organizing files, and I need to drag something from Explorer into an SFTP client.
In my experience with the slightly higher-resolution monitor, I had more real estate in that it was 30" instead of the 22" or 24" that was standard at the time.
I had my text editor open on a quarter of the screen, usually zoomed in, but could still read about 15 lines of code, or of whatever document I was writing.
In another quarter of the screen I could fit an HD TV show, a lecture or a YouTube video. On the 3rd and 4th quarters I had a chat window and a reference browser window. The browser windows had variable zoom levels that made the text readable.
I was only doing the one task of writing but had extra areas for background noise, research and social interaction.
On a 1080p monitor the video takes up the whole screen and has to be scaled, and I can't make my text editor small enough to both be readable and show many lines of code alongside other browser windows. I end up having to alt-tab between everything. This might seem like a minimal inconvenience, but there was a definite hit to my productivity when going from a higher-resolution screen to a lower-resolution one.
Having 2 screens is also fine and has its use cases, but I never need a text editor / word processor to fill the entire screen; same with chat or browsing.
For me, full screen is a must when I need to see something visually appealing: a photograph, a comic page, a video or a game.
So, I'm in my school's class of 2014 yearbook, despite having not graduated that year. It's kinda glorious, because everyone else was properly trimmed and dressed, and I showed up with dirty fuzzy hair, a full beard, and a Live Slow Die Whenever tee shirt. We've decided I'm in there as a recipient of the Boston Latin Benjamin Franklin Award.
The senior goal is a string of Bruce Springsteen references. The quote is a reference to my 10th grade English (both times) teacher, who introduced me to a colleague as "our residential 'Well, actually' student."
I woke up at 8am and started writing and playtesting and I'm only stopping now. And I'm only stopping because I realized I spent the last 45 minutes violently agreeing with somebody. My brain feels like pudding.
Some very important casting news came out of Hollywood today: Iko Uwais, Yayan Ruhian, and Cecep Arif Rahman have all been confirmed as cast in the upcoming Star Wars movie. Don't recognise them? You should: they're all actors from The Raid and The Raid 2. They played Rama and Mad Dog in The Raid, and the Assassin in the final fight of The Raid 2, respectively.
Silat kicks arse even long, long ago in a Galaxy far, far away - confirmed.
Also the rumors about Scarlett Johansson from three months ago have popped up again, but who gives a tuppenny fuck until it's officially confirmed.
> Some very important casting news came out of Hollywood today: Iko Uwais, Yayan Ruhian, and Cecep Arif Rahman have all been confirmed as cast in the upcoming Star Wars movie.
Oh no! Now I'm kind of excited for this movie. No! I want to keep my expectations really low lol
Abrams seems to be going out of his way to cast a lot of people of color for the new Star Wars, which is awesome in itself, but the possibility of silat-practicing Jedi? I think I need to go lie down.
> Abrams seems to be going out of his way to cast a lot of people of color for the new Star Wars, which is awesome in itself, but the possibility of silat-practicing Jedi? I think I need to go lie down.
I'm disappointed that you don't already think we're Jedi.
> Abrams seems to be going out of his way to cast a lot of people of color for the new Star Wars, which is awesome in itself, but the possibility of silat-practicing Jedi? I think I need to go lie down.
Jedi fighting styles take a lot of inspiration from real-world martial arts.
> As Scott has discussed already, scaling is the main issue for me.
> Hardware for gaming at this resolution is insanely expensive, where it exists at all. Scaling down to 1920x1080 introduces some blurring.
You're missing an absolutely crucial point here. The 5K Retina is 5120x2880, which means that you can simply run all of your games at 2560x1440. There won't be any blurring at all, because effectively all that needs to be done is to map each rendered pixel onto a 2x2 block of physical pixels.
Basically, if you have a 5120x2880 screen, it can function 100% perfectly as a 2560x1440 screen.
EDIT: Granted, this is under the assumption that either your monitor or video card is smart enough to recognise that there is a simple 1-to-4 pixel mapping it can do instead of applying a smoothing algorithm. However, I don't think this is too much to expect from expensive hardware...
EDIT2: If (for example) your video card can do 2560x1440 with 4x (2x2) FSAA, then it really should be able to do 5120x2880 without the FSAA, and the latter will be of overall higher quality, because you're better off showing all 4 pixels than downsampling them back into one.
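Here's a quick numpy sketch of why the 2:1 case is lossless while scaling from 1080p isn't (illustrative only; whether your GPU or monitor actually takes the nearest-neighbour path is exactly the open question from the EDIT):

```python
import numpy as np

# A 2560x1440 rendered frame (random pixels stand in for a game frame).
frame = np.random.randint(0, 256, size=(1440, 2560, 3), dtype=np.uint8)

# Integer 2x scaling: every source pixel becomes an exact 2x2 block of
# identical physical pixels -- no interpolation anywhere.
doubled = frame.repeat(2, axis=0).repeat(2, axis=1)  # 2880 x 5120
assert np.array_equal(doubled[::2, ::2], frame)      # perfectly reversible: zero blur

# Non-integer scaling: 1920 -> 5120 is a factor of 8/3, so source pixels
# can't land on whole blocks of destination pixels and the scaler has to
# interpolate -- that's the blurring at 1920x1080.
print(5120 / 2560)  # 2.0     -> clean block mapping
print(5120 / 1920)  # 2.666.. -> filtering is unavoidable
```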
Also, has anyone had any experience with an Adobe RGB workflow?