That's just for now. Real 4G can get close to 100Mbps. ISPs and wireless carriers are holding all of us back.
Get back to me when they have "Real 4G" in all the places I have used my laptop this year. For example: Kenya, Tanzania, Malawi (I think the third-poorest country in the world), etc.
I visited Kansas City this year too. Everyone was excited about Google Fiber... and 90% of those people were frustrated that it wasn't in their area. And in New York I couldn't get my phone to work at all for two days.
Seriously, no technology that relies on 4G internet connectivity to get video data from the device in my pocket to the device in my hand will ever work, let alone replace wires.
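For a sense of scale, here's a rough back-of-envelope sketch of that claim; the bitrate and throughput figures below are assumptions for illustration, not measurements.

```typescript
// Back-of-envelope check: can a "4G" link carry a live game-video stream?
// All figures are assumptions for illustration, not measurements.
const streamBitrateMbps = 8;    // assumed 720p60 H.264 game stream
const advertised4gMbps = 100;   // the theoretical LTE peak cited above
const congestedCellMbps = 5;    // an assumed real-world figure on a busy cell

// Ratio > 1 means the link can sustain the stream (ignoring latency and jitter).
const headroom = (linkMbps: number, neededMbps: number) => linkMbps / neededMbps;

console.log(headroom(advertised4gMbps, streamBitrateMbps));  // 12.5 on paper
console.log(headroom(congestedCellMbps, streamBitrateMbps)); // 0.625 in practice
```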
For anything other than a tablet, a touchscreen is laughable.
For you. I like the one on my phone and iPod. I hate the touch screens on airline seatbacks. I'd like a touch screen on my DSLR, but would never give up physical buttons to get it. I'd love the option to control video, images, and PDF files (things like pinching to zoom, etc.) on my laptop screen.
OK, if we're talking non-generic-computing devices, then yes. Touch screens are great for cameras and the like.
A phone is just a small tablet PC with built-in cell data access.
But touch screens on regular, non-convertible laptops? They're a recipe for broken screen hinges and probably not of interest to most users.
Touch screen monitors have been rejected by consumers every time they've been introduced. On desktops they cause gorilla arm and are mostly useless. On non-convertible laptops they're awkward to use, they stress the screen hinges, and the hinges need extra tension to stand up to touch. Even then they're awkward in most use cases.
Convertible laptops are also tablets, so touch is obligatory there and the point is moot.
You sound like someone who said phones didn't need touch screens and that a keypad was just fine. Until a laptop with a touch screen becomes popular enough for new software and uses to emerge, you can't say what people might eventually find interesting about it.
Pre-iPhone nobody knew they wanted to play Angry Birds and tap-to-expose a photo. Why can't I point at more than one thing on my laptop screen at once? I can do trackpad gestures with two, three and four fingers, but it only acts on a single pixel target.
I think every consumer trend is toward tablets without keyboards or touchpads, where the screen itself is the primary input device, so it's mostly moot at this point. Traditional laptops are becoming either desktop replacements for mobile workers or ultraportables for tech workers and other professionals with narrow needs.
There were other limitations in the pre-iPhone era, though: the phones just weren't powerful enough, and the touchscreens weren't precise enough. The use cases emerged once the technology enabled them.
Multi-point select is a drastically different interaction paradigm for most software and would require dramatic shifts in UI design. And for a keyboard-centric workflow (e.g., coding, writing), the physical context shift between keyboard, mouse, and screen costs far too much. It's also imprecise.
Take single-file audio editing/mastering as an example. Most of the work there involves high-precision interaction. The ideal interface is a physical interface such as a digital mixer board, coupled with a mouse and keyboard. But, as these are expensive and not commonly available, a mouse and keyboard alone is a 95% replacement.
But a touchscreen? Without drastic UI changes and large elements, it's simply not precise enough. I don't foresee the software being updated in the next several years for this. ;^)
Rym, have you ever used a (modern, Windows 8-era) non-tablet (by your definition) touchscreen laptop before?
Yes. I found the touchscreen itself to not be useful. I primarily interact with my laptop via keyboard, and secondarily, mouse.
Tablets and phones, I use the touchscreen (obviously), and it's the clear winner. But a laptop? Laptop implies that I'm sitting and doing real work: anything else and I don't even bother opening the laptop.
If I even open my laptop, it means I'm doing one of the following things:
1. Editing audio or video
2. Writing at length
3. Playing a full-on PC game like Civ5
4. Designing in InDesign or Balsamiq
Anything else? I use my phone or some other device.
I'll say this much: a touch screen on a laptop or desktop as an additional input method, used alongside a keyboard and mouse/trackpad, isn't necessarily a bad idea. Some operations are more intuitive with touch, and touch gestures for certain operations may well be beneficial. However, as Rym said, relying too heavily on a touch screen causes gorilla arm and stresses screen hinges. A touch screen on a desktop or laptop is more of a "nice to have" feature because of these issues; it can't fully replace a keyboard and mouse/trackpad for getting any sort of real work done.
For example, I can see using a touch screen on my desktop/laptop for program launching and maybe flipping through photos in a photo browser app or something. I could also see it for things such as pinch to zoom or whatever. However, these would be relatively infrequent operations. I would still prefer a mouse or touchpad for the vast majority of graphical work on my machine.
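In practice this "touch as a supplement" idea usually shows up as feature detection: check whether a coarse (finger) pointer is present and enlarge hit targets accordingly. A minimal browser-side sketch, assuming the usual DOM APIs; the "touch-friendly" class name is made up for illustration:

```typescript
// Detect whether any coarse (finger) pointer is available and opt into
// larger hit targets. "touch-friendly" is a hypothetical CSS class.
const coarsePointer = window.matchMedia("(any-pointer: coarse)");

function applyTouchAffordances(matches: boolean): void {
  document.body.classList.toggle("touch-friendly", matches);
}

applyTouchAffordances(coarsePointer.matches);
// Re-check if a touchscreen is plugged in or removed while the page is open.
coarsePointer.addEventListener("change", (e) => applyTouchAffordances(e.matches));
```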
But a touchscreen? Without drastic UI changes and large elements, it's simply not precise enough. I don't foresee the software being updated in the next several years for this. ;^)
Who said anything about current software being updated? I want new software for new uses of new technology. My "one pixel" complaint wasn't about precision, but about the number "one".
For example, if I want to select many icons or a block of text on my current laptop, I have to move a pointer to the start, somehow indicate that this is the start of the selection, then move to the end, and indicate again. On my girlfriend's iPad, in a note-taking app, she can select a bunch of text or images by drawing a circle around them. In another app she can just grab them, and the "start" and "end" of the selection just work.
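For what it's worth, the plumbing for this already exists in browsers: Pointer Events give every finger its own event stream, so an app could act on several targets at once. A minimal sketch, assuming a DOM environment; the "#canvas" element and the selection logic are placeholders:

```typescript
// Minimal sketch using the Pointer Events API: each finger gets its own
// pointerId, so an app can track several on-screen points at once.
// "#canvas" is a hypothetical element; selection logic is left as a stub.
const surface = document.querySelector<HTMLElement>("#canvas")!;
const activePointers = new Map<number, { x: number; y: number }>();

surface.addEventListener("pointerdown", (e: PointerEvent) => {
  activePointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
});

surface.addEventListener("pointermove", (e: PointerEvent) => {
  if (activePointers.has(e.pointerId)) {
    activePointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
    // With two or more entries here, a UI could start a lasso selection
    // or drag several icons at once instead of acting on a single point.
  }
});

surface.addEventListener("pointerup", (e: PointerEvent) => {
  activePointers.delete(e.pointerId);
});
```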
I can't believe you actually think restricting people to fewer modes of input is a good thing. This is why the original Macintosh didn't come with cursor keys: they thought a mouse would be enough. Then they added the cursor keys back. BOTH is always the way to go for input on machines like this; let the user decide which is best suited to the task.
Mike Krahulik posted a blog entry on the PA site about using the Wii U controller to play COD on the GamePad alone while juggling parenting duties; it seems relevant to this conversation. It convinced me that something like the Shield could be useful to a meaningful segment of the market.
Nvidia has never made gadgets as consumer electronics before. They've made chips for other people's gadgets, sure, but this is uncharted territory. Can they do it? I hope so, but it's going to be hard to sell it to people, especially at the price ranges speculated by some tech journalists.
Big +1 to this. I'm not optimistic about Nvidia getting it right, especially when some non-trivial number of Windows blue screens can be attributed to bugs in their (simple by comparison) existing graphics drivers.
The specific touchscreen ultrabooks Intel is talking about are powerful laptops where you can easily flip the screen around (or remove it altogether) and use it as a tablet running Windows 8. Think the Surface, but a true laptop + tablet hybrid. Put it this way:
Intel's new ultrabooks: Spoon + fork hybrid, full functionality and benefits of each. Microsoft Surface: Spork.