Most ski resorts require you to wear a leash with your snowboard, which wraps around your ankle and connects you to the board. This keeps the snowboard from getting away if it pops off. Of course, since most leashes are just made of nylon, it is certainly possible for them to break or slip off.
There are people out there who stuck with AOL as an email service when everyone who knew anything was going to Yahoo! Mail circa 2000. There are people who stuck with Yahoo! Mail when everyone who knew anything was going to Gmail circa 2005. One feature of these old services is that they increasingly filled your screen with flashing ads. Their users had a choice of either going insane or putting on blinders and ignoring everything but the middle of the screen. I even remember that RealPlayer, which wasn't a really intelligent choice of media player, tended to put a blinking icon on the taskbar just to nag you about paying for an upgrade. I also remember RealPlayer once streaming ads when I simply used it to play an audio CD.
One is tempted to associate intelligence with failure to clue in to the best services. At least, we can say that the person connected to tech-savvy crowds is going to hear about the better services that become available. But I do think there's also a political philosophy behind how you respond to criticism of corporations. I have one Capitalist (Capitalism-ism-ist!) friend who instinctively isn't going to jump on any bandwagon (Linux, Google, etc.) that involves calling any businessman's free action in the marketplace "evil" -- he bought a Zune and is on Yahoo! Mail.
I have one Capitalist (Capitalism-ism-ist!) friend who instinctively isn't going to jump on any bandwagon (Linux, Google, etc.) that involves calling any businessman's free action in the marketplace "evil"
Then he doesn't understand capitalism. If other people are doing it for free, the people who want to charge money to do it provide a service that effectively has no value. Not evil, just... capitalism.
That sentence of pterandon's was difficult to decipher. Like Rym, I would assume that its intended meaning was effectively the opposite of its literal meaning?
But he pays taxes while the government enforces laws against price collusion?
Could someone find an example of Linux calling Microsoft's business practices evil outside of the patent shenanigans? Specifically a Linux company; nut-job fanboys will say anything.
I also think this guy has a pretty malformed view of what capitalism entails.
Okay first of all, let me retract any criticism of my one friend described in my post.
Capitalism and free markets are good things. But there have been folks who thought that defending capitalism means refuting claims that industrial pollution can cause problems like ozone depletion or global warming. The science of climate change, per se, is no threat to the wisdom of libertarianism or free markets, but it can threaten certain militantly anti-communitarian moral philosophies. And there are different corporate responses to these problems. You have the US Chamber of Commerce and industry groups like the late Global Climate Coalition, who took a staunch anti-communitarian response to environmental problems (to the point of disinformation), and successful Fortune 500 capitalists who quit those industry groups specifically because of their brownwashing.
Folks with this anti-communitarian worldview aren't going to be swayed to Linux by a speech by RMS, for one thing.
This was really convenient, as I've recently moved out of a studio apartment into a one-bedroom deal. Having the big, loud beasts in the bedroom serving media out to a smaller, quieter HTPC in the main room is an ideal setup, and I might get some pointers from your machine to help me build my own. Thanks for sharing that wish list!
This was really convenient, as I've recently moved out of a studio apartment into a one-bedroom deal. Having the big, loud beasts in the bedroom serving media out to a smaller, quieter HTPC in the main room is an ideal setup, and I might get some pointers from your machine to help me build my own. Thanks for sharing that wish list!
If you aren't planning on playing 3D games on your HTPC, then don't build one. If all you want to do is watch videos that are on bedroom PCs, the Boxee Box will probably do really well. If not that, then any NVidia ION machine will do. This one is looking hot. Also, the Mac mini will do just fine, but will cost more. I would wait to see if the rumors of Mac mini with HDMI are true.
Alas, I am planning on playing games with this machine, including emulating PlayStation 2, GameCube, and Wii titles (all of which work best with a 64-bit, multi-core processor). Also, in lieu of buying a receiver, I have been using a Creative Soundblaster X-Fi Elite Pro for surround sound on my game consoles, so I'll need a box I can put a PCI card in for the main room anyway.
As for the ION, running 1080p H.264 video while barely taxing the CPU is very attractive, and I'm sure it can emulate all of the simpler consoles (PlayStation and earlier). Without being able to play something like Red Faction Guerrilla or Half-Life 2 at acceptable framerates, though, I can't do anything with it.
Alas, I am planning on playing games with this machine, including emulating PlayStation 2, GameCube, and Wii titles (all of which work best with a 64-bit, multi-core processor).
You might not get a quiet machine to do that.
As for the ION, running 1080p H.264 video while barely taxing the CPU is very attractive, and I'm sure it can emulate all of the simpler consoles (PlayStation and earlier). Without being able to play something like Red Faction Guerrilla or Half-Life 2 at acceptable framerates, though, I can't do anything with it.
Half-Life 2 came out when I was in 8th grade. It's still legendary for its rock-bottom system requirements. Just get an ION board, a low-profile last-gen video card, and perhaps a sound card (though motherboards all seem to have pretty tolerable HD audio built in these days) and you'll be good. Scott's machine could probably do HL2 no problem.
I have been using a Creative Soundblaster X-Fi Elite Pro for surround sound on my game consoles, so I'll need a box I can put a PCI card in for the main room anyway.
I thought about doing this with my X-Fi. The problem is that it is a full-sized PCI Express card. All of the nice HTPC cases are low profile, so you would need a low-profile sound card or an external one. Also, if you want to do such heavy gaming, including FPS games on the TV (why?), don't bother trying to get an HTPC. Just get a PC. What I got is pretty much the max I could get while staying quiet and small. For more horsepower I would have had to go either louder, bigger, or much more complex and expensive (water cooling).
Quieter is all it need be. I have a Core 2 Duo (E8500) with a stock fan I've been using to emulate the consoles mentioned at full framerates, and it has been quiet enough. The six case fans it uses to keep the hard drives cool, on the other hand, are not. I've already specced out the new machine anyway, and it looks like the mobo I'm going for is essentially the LGA 775 equivalent of the one in Scott's wish list. With the Soundblaster X-Fi and my existing last-gen video card, I'm all set!
Scott's HD 5570 can handle Left 4 Dead at 1920x1200 (44fps though, so it could be better), so 1080p HL2 wouldn't be a problem.
He's building a real machine, as am I. Both without IONs!
I really need to emphasize here: I named Half-Life 2 specifically because of a review of the original ION hardware's difficulty with that title. The ION 2 hardware probably screams through it, but I'm not as interested in Half-Life 2 as I am in games like Red Faction Guerrilla, Prototype, Street Fighter IV, or other console- (read: gamepad-) centric titles. A real proc with a real video card is necessary to get what I want out of games that recent, and that hardware will also perform the emulation tasks I require. Thanks for your suggestions, though!
As far as I can tell, the "not seeing words on the screen" is an example of a psychological phenomenon called inattentional blindness, and it's something that everyone does. If you want a simple demonstration just watch this video and follow the instructions. There are many more videos on the subject, and on the related subject of change blindness here.
I think that the best solution to the problem is to raise the base level of computer competence across the board. Only when people learn how computers work and know how to look for these messages will the problem go away completely.
how to look for these messages will the problem go away completely
They pop up in the middle of your screen with an audible, distinct alert sound. Audio-visual. The problem is not inattentional blindness (it's not a matter of not paying attention), it's a problem of shitty error messages (random error codes) and people not wanting to deal with whatever happened and getting annoyed by being interrupted in their tasks. Add to this the fact that sometimes people lose data when an error occurs and they're even more pissed. It's not inattentional blindness, it's people being stupid. They see the bear break dancing, doing the foxtrot, swing, tango, all while operating a barbecue without any ill effect, but they don't want to see him. So they look the other way.
The solution is indeed just teaching people how to work with a computer, though; I agree on that.
Thought I might share some nice low-profile components I've stumbled across on Newegg and weigh in on the whole HTPC components talk.
The best low-profile-ready video card I could find was this: http://www.newegg.ca/Product/Product.aspx?Item=N82E16814187098&Tpk=sparkle%20250 That's a pretty damn fast card, even better than the 5570. Keep in mind there are micro ATX boards with Crossfire/SLI capabilities, so you could stick in two compatible low-profile cards (this particular one doesn't support SLI) to make it a lot faster; the 5570 would probably be a good bet for that.
Anybody wanting a nice X-Fi-based card in a low-profile casing should check out this sound card: http://www.newegg.com/Product/Product.aspx?Item=N82E16829156010&Tpk=azuntech%20forte It's based on Creative's X-Fi Titanium line, using the same chipset but with better drivers that aren't provided by Creative themselves.
While that card is low profile ready, it does not include a low profile bracket. Therefore, how are you actually supposed to get it into a low profile system? Also, it requires the extra power connectors and that fan looks pretty loud. Probably not going to do so well in an HTPC where low power consumption and quiet are priorities.
As for that sound card, it's basically useless in my setup. It only has one TOSLink. My Creative X-Fi has two, one in and one out.
According to Wikipedia, the HD 5570 has a TDP of 43W, while the GTS 250 has a TDP of 145W. That means 3x as much power, and 3x as much heat to dissipate. The GT 240, HD 5670, and HD 5750 are more powerful than the HD 5570 without excessive power consumption, but I haven't seen low-profile versions of any of those cards.
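For anyone who wants to plug other cards into that comparison, here's a trivial back-of-the-envelope sketch. The only TDP figures in it are the two quoted above; anything you add is your own lookup.

```python
# Rough power/heat comparison using the TDP figures quoted above.
# TDP (thermal design power) is the heat the cooler has to dissipate at load,
# and is a decent proxy for worst-case power draw.

tdp_watts = {
    "Radeon HD 5570": 43,    # figure quoted above
    "GeForce GTS 250": 145,  # figure quoted above
}

baseline = "Radeon HD 5570"
for card, tdp in tdp_watts.items():
    ratio = tdp / tdp_watts[baseline]
    print(f"{card}: {tdp} W ({ratio:.1f}x the {baseline})")

# Output:
# Radeon HD 5570: 43 W (1.0x the Radeon HD 5570)
# GeForce GTS 250: 145 W (3.4x the Radeon HD 5570)
```

So "3x" is actually closer to 3.4x, which only strengthens the point about heat and noise in a small case.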
The problem of people not looking at and absorbing the relevant information in communication is a problem that dogs every level of communication, not only GUIs and other tech inputs. From a graphic design theory perspective, the problem is often reduced to the issue of audience response.
Cognitive psychologist Donald Norman talks about the emotional response that communication elicits falling into distinct categories:
Visceral - Gut reaction stuff. Cultural cues that are so ingrained in our psyche that when they get called up there's an instant and usually predictable reaction.

Behavioral - Responses that deal with regularly practiced rituals; Norman calls these the purview of "experts". The classic example of a behavioral response is how driving a car becomes increasingly subconscious over time. It doesn't mean that you're particularly good at driving, simply that as you repeatedly interact with the same stimuli in similar scenarios, you interact with them less and less consciously. This is the setup for the inattentional blindness that Karlito mentioned, and why Rym's example of the unnoticed motorcyclist is so pertinent. When a new stimulus interrupts this string of behavioral responses, it can often go ignored or avoided, because unlike all the other stimuli, it warrants reflection. It breaks the ritual.

Reflective - The response for the sudden error message, the unexpected motorcyclist, the stimuli that require contemplation and comprehension to work with. In communication theory, stimuli that require a reflective response can often be the most powerful, as they have a longer presence in the minds of their audience. The only problem is that when introduced carelessly, they can be just as easily ignored as engaged.
I think that for a lot of tech-savvy people, obviously including many of the people in this forum, the concept of error messages was introduced in an environment that required problem solving on a personal level. You were forced to engage with the reflection that the error message required, and doing so helped develop that interaction as part of your "computer ritual." The next time you hit an error, it was already part of the ritual. You knew what it was, and at least the first steps to deal with it. A lot of people don't have that experience. They were either never forced onto, or never sought out, the self-reliant path. There was always a friend, a co-worker, a customer service rep who could help them out. At the very least, the system allows them to ignore it. Troubleshooting forever remains an intersection with their regular ritual of computer use that they can simply ignore or push off onto someone else. That pushing off cements itself as part of their ritual, until it becomes a subconscious act like everything else they work through.
I think that the best solution to the problem is to raise the base level of computer competence across the board. Only when people learn how computers work and know how to look for these messages will the problem go away completely.
I think that's the easy solution, and to a lot of tech-savvy people, the "obvious" one. It's hard to perceive how one method of operating can be so ingrained when your own way of operating feels so natural and easily learned. I also think it's the lazy, and frankly, sloppy answer to the issue. It's essentially saying that a problem that a huge mass of consumers are having with interfaces and products is their problem, not the problem of the interfaces and products themselves. It's the antithesis of modern ergonomics, and in a lot of similar cases, it doesn't work. Just as encouraging literate children requires fostering an environment that rewards independent engagement with literature, I think the answer to this sort of interface illiteracy requires interfaces that foster a sense of engagement and self-reliance. I say the key is to establish self-troubleshooting as an integral part of the computer experience that's introduced early on. By all means make it an easy and guided experience, but make sure it's a necessary one. People are forced to engage, they're forced to learn and incorporate these behaviors into their routine, and the world is better for it.
I also think it's the lazy, and frankly, sloppy answer to the issue. It's essentially saying that a problem that a huge mass of consumers are having with interfaces and products is their problem, not the problem of the interfaces and products themselves. ... I say the key is to establish self-troubleshooting as an integral part of the computer experience that's introduced early on.
I say you are all underestimating the "Steve Jobs" effect, and the real solution is to make computers that are simple and easy enough to use, that don't break, and that don't display error messages. Or, to put it another way, the people who don't read will never read, and when someone makes something iPad-easy, the experience of computing will become so stress-free that they'll just sell a shitload.
simple and easy enough to use, that don't break, and that don't display error messages.
But see, Apple products do break, and they don't display error messages anyway. My iPhone (not jailbroken, for the record) has crashed numerous times with no error log for me to look at or anything.
You need to have some way to do diagnostic work; you can't build a perfect machine, because people will always find a way to make it crash. Either make error logs more comprehensive, or work on your GUI design in error reporting. The BSoD is probably the best example of "shit error reporting no one wanted to look at."
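For what it's worth, the pattern being argued for here isn't exotic. Below is a rough sketch (in Python, with made-up file names, message wording, and a hypothetical report_error helper) of the "comprehensive log plus readable message" idea: the full technical details go to a log for whoever has to diagnose the problem, and the user sees plain language and a pointer to that log instead of a bare error code.

```python
# Minimal sketch: log everything for the diagnostician, show plain language
# to the user. "app-error.log" and "settings.cfg" are hypothetical names.

import logging
import traceback

logging.basicConfig(
    filename="app-error.log",   # hypothetical log location
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s %(message)s",
)

def report_error(exc: Exception, user_action: str) -> str:
    """Log the full traceback, return a message a non-expert can act on."""
    logging.error(
        "Failure during '%s': %s\n%s",
        user_action, exc, traceback.format_exc(),
    )
    return (
        f"Sorry, something went wrong while {user_action}. "
        "Details were saved to app-error.log, which you can send to support."
    )

try:
    with open("settings.cfg") as f:   # hypothetical config file
        config = f.read()
except OSError as exc:
    print(report_error(exc, "loading your settings"))
```

The point isn't this particular code; it's that the error path is designed for two audiences at once, which is exactly what the BSoD never did.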
I say you are all underestimating the "Steve Jobs" effect, and the real solution is to make computers that are simple and easy enough to use, that don't break, and that don't display error messages. Or, to put it another way, the people who don't read will never read, and when someone makes something iPad-easy, the experience of computing will become so stress-free that they'll just sell a shitload.
Sure, that's why they don't need a Genius Bar at all, and there isn't a huge line for it. Oh wait.
Even if you were able to make the ideal Steve Jobs device, with no crashing or errors ever, it presents an ethical conundrum. While supposedly being easier for these non-reading users, Apple products have extremely limited functionality compared to the alternatives.
Here's a good analogy. Let's say Steve Jobs is an automatic transmission and Linux is a manual. Imagine if radios and air conditioning were only available in cars with manual transmissions. Is that right? Should people be forced to learn to drive stick just to have the comfort of air conditioning and music? The Linux people would put in an automatic if they could, but nobody has the design skills to figure it out. Apple could easily add the radio and air conditioning, but they refuse. When people try to open their cars and put it in themselves, Apple fights them.
There's currently a dichotomy where people are forced to choose between a computer with a good UI and a computer that can actually do all the things it is capable of doing. It's a true dichotomy in that it does currently exist in the world. It's a false dichotomy in that it doesn't have to exist.
Here's a good analogy. Let's say Steve Jobs is an automatic transmission and Linux is a manual. Imagine if radios and air conditioning were only available in cars with manual transmissions. Is that right? Should people be forced to learn to drive stick just to have the comfort of air conditioning and music? The Linux people would put in an automatic if they could, but nobody has the design skills to figure it out. Apple could easily add the radio and air conditioning, but they refuse. When people try to open their cars and put it in themselves, Apple fights them.
There's currently a dichotomy where people are forced to choose between a computer with a good UI and a computer that can actually do all the things it is capable of doing. It's a true dichotomy in that it does currently exist in the world. It's a false dichotomy in that it doesn't have to exist.
This is the best analogy I've heard in a great while, though, if I could, I would like to add to it.
Most cars in the world would be Windows cars, which offer the best of both worlds between Apple and Linux: a flappy-paddle sequential manual. Also, all Windows cars come with A/C, though some people might want to add a little extra charge to their Freon tank.
But see, Apple products do break, and they don't display error messages anyway. My iPhone (not jailbroken, for the record) has crashed numerous times with no error log for me to look at or anything.
If you look back at my original statement, I'm talking in the future tense about a hypothetical computing device that does not yet exist. I have an iPhone too, and it crashes sometimes. I'm not saying the iPhone is the device for non-readers, I just think that solution is going to be reached WAY before any general education of computer users makes them able to read and act on error messages and error logs.
I just think that solution is going to be reached WAY before any general education of computer users makes them able to read and act on error messages and error logs.
There's currently a dichotomy where people are forced to choose between a computer with a good UI and a computer that can actually do all the things it is capable of doing. It's a true dichotomy in that it does currently exist in the world. It's a false dichotomy in that it doesn't have to exist.
It is also a false dichotomy in that the vast majority of people don't want a computer that is capable of doing all the things THE COMPUTER is capable of doing. Instead they want a computer that is capable of doing all the things that THE USER is capable of doing.
Which means that, if the user isn't even capable of READING, a computer that is limited in its functionality is probably the best bet for sales. If people could read error messages and manually manage programs so multitasking doesn't bog the phone down completely, the iPhone could probably get by with true multitasking. But the iPhone developers knew that most people couldn't cope with such a phone, so multitasking wasn't an option.
Scott, you continually overestimate what the average user wants from a phone or computer, and the average user is where these companies make the most money.