I'm starting to think about purchasing a new laptop. I'm in the early stages, doing some research.
A couple of questions...
Is Core 2 Duo really worth it? As best I can tell, it only makes a difference if you're multi-tasking and/or using a program that is coded to take advantage of the dual cores. I'm assuming that games will be coded this way, but I don't intend to use this laptop for gaming.
From what I've read, it should only make about a 5% difference for more mundane tasks such as surfing the web, checking email, etc.
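To make that concrete, here's a minimal sketch in Python (illustrative numbers only, not a benchmark of any real laptop): a CPU-bound job only benefits from a second core if the work is actually split up. A program that does everything in one thread runs on one core no matter how many you have.

    import time
    from multiprocessing import Pool

    def busy(n):
        # CPU-bound work: sum of squares up to n
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        N = 5_000_000

        t0 = time.perf_counter()
        busy(N); busy(N)              # two chunks of work, one after the other
        serial = time.perf_counter() - t0

        t0 = time.perf_counter()
        with Pool(2) as pool:         # the same two chunks, one per core
            pool.map(busy, [N, N])
        parallel = time.perf_counter() - t0

        print("one core at a time: %.2fs" % serial)
        print("two cores at once:  %.2fs" % parallel)

On a dual-core machine the second run finishes in roughly half the time, while single-threaded tasks like browsing and email behave like the first run - which is where that ~5% figure comes from.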
My one real requirement is that I'd like the laptop to be good at playing video. I'd like something that integrates well with my television (remote control??), but I suspect that technology is either too expensive for what it's worth or isn't quite there yet.
I don't have an HDTV, nor do I anticipate getting one in the near future - so that's not a big concern.
I'd also like to cap the price at about $1000.
So... can I get something that is good with video for that price? Dell says so... but I don't know if that's marketing hype or not.
If it makes a difference, Linux is not an option for me.
Comments
DON'T!!!
Get the best you can afford, because you CANNOT upgrade later.
No matter what you do, the hardware for connecting to the TV will be the same. You will have some kind of cable, probably S-Video, going from the laptop to the TV. You will also have some audio cables, probably stereo mini to RCA, making the same trip. Depending on what laptop you get, you might need some adapters on the laptop end. There is currently no cheap and easy way to make this connection without wires.
The decision you have to make is whether you want to use any special TV-centric software. Your first option is to pay a premium for the Apple. It cannot be denied that Front Row, and the forthcoming Apple TV, will make this experience pretty damn good. It also gets you a working remote control. Your second choice is Windows Media Center Edition. I've never used it, but some people are big fans. If you have an Xbox 360, this is the choice for you hands down because of the integration. The third choice is MythTV on Linux, but you already ruled that out. MythTV actually works really well and provides a great user experience. However, even I wouldn't suggest using it on a laptop. You can also choose not to use any TV-centric software at all. Just use a wireless USB mouse and keyboard and off you go with a normal computer using the TV as a monitor.
The biggest consideration you have to make when buying a laptop is not power, but physical size. If you plan to leave this thing in one place all the time, you can probably get a laptop that is fatter, heavier, and cheaper. If you want to carry it with you everywhere, you should pay more for something smaller and lighter with longer battery life.
It really sounds to me like most laptops on the market will do what you want. The only other thing I can suggest is that you get at least a gig of RAM. I suspect you are going to use Windows, and the user experience these days is really crappy without lots of RAM. The only reason my work laptop is tolerable is because it has 2 Gigs of RAM.
I'm not a road warrior, and would rarely need the battery. I have a wireless network at home, and I like the portability of a laptop. I like to be able to use it on the sofa, in the kitchen, etc. I figured that an average sized laptop would be fine. That's what I currently have, and I've got no real complaints.
The one thing sticking in my craw is the premium that is paid versus a desktop. However, I'd have to pay this anyway if I were to get some hardware to connect my desktop to my TV - especially since they are in different rooms.
1) Any thoughts on AMD versus Intel? Does AMD make their own version of a Core 2 Duo?
2) Most laptops seem to use integrated graphics. Is it worth paying extra for an ATI "video card" (Dell offers an ATI MOBILITY™ RADEON® X1300 HyperMemory in 128mb and 256mb)?
3) Which is better... a 1.66 GHz Core 2 Duo or a 2.0 GHz single core?
4) I'm shooting for 2 gigs of RAM. Would it be worth giving up a Core 2 Duo in order to increase the RAM?
AMD does not yet have an equivalent to the Core 2 Duo, as far as I know.
I know that my IBM R52 came with an ATI card in it.
Personally, I'd go with the Core 2 Duo.
And if you can wait a little longer and get a Core 2 Duo AND 2 gigs of RAM, your applications such as Photoshop and games will run much better.
But honestly, I am NOT an expert; this is just what I would get if I had the money and a choice.
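A rough way to think about the dual-vs-single GHz question is Amdahl's law: the second core only speeds up the fraction of the work that can actually run in parallel. Here's a back-of-envelope sketch in Python, with assumed fractions rather than measurements:

    def effective_ghz(ghz, cores, parallel_fraction):
        # Amdahl's law: relative runtime = serial part + parallel part / cores
        runtime = (1 - parallel_fraction) + parallel_fraction / cores
        return ghz / runtime

    for p in (0.0, 0.5, 0.9):
        print("parallelizable=%.0f%%: 1.66GHz x2 ~ %.2f, 2.0GHz x1 ~ %.2f"
              % (p * 100, effective_ghz(1.66, 2, p), effective_ghz(2.0, 1, p)))

With nothing parallelizable, the 2.0 GHz single core wins (2.00 vs. 1.66); at 50% the dual core is already slightly ahead (2.21 vs. 2.00); at 90% it pulls well ahead (3.02 vs. 2.00). Which column applies depends entirely on the software you run.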
I'm talking about skimping as in saving a couple of bucks and doing without something that is truly better.
Examples:
Thinking the 8MB on board video card is "good enough".
Thinking you will never "need" more than 2 USB ports.
Thinking that a PCMCIA wi-fi card is a "better" alternative than the built-in one. Then you try to install Linux and find that the particular wi-fi card you got is the ONLY one from that brand that does not support Linux (even with ndiswrapper)!
Going for the discounted memory upgrade, only to find out that they do it by filling all of your RAM slots rather than giving you one big RAM chip.
Going for the slightly cheaper screen option, only to find out that viewing from anything other than 12 inches away and straight on gives you a horrible viewing-angle problem.
Not opting for the DVD/CD burner combo drive 'cause it cost $50 more, and then complaining a month later that you can't burn anything.
That's what I mean by skimping... No, I'm not a bitter owner of a 'skimped' laptop... nope, not me!
Other than that, both of our computers are equally obsolete. SATA arrived in 2003, and today neither of us has enough SATA ports, if any. We both need new computers to carry on with business as usual. We also both need new CPUs and video cards if we want to play the games coming out this year. Wireless USB and 802.11n are the next big things, and not even the most expensive computers today offer those features. No matter what you get, you will eventually need something different; that's how computers are. Spending more money will not stave off obsolescence. You just have to decide if you want to spend a small amount of money every 3-5 years or a large amount less often.
Buy a computer that meets and slightly exceeds your needs at present. As your needs expand, upgrade that machine when it is practical. When your needs exceed the possibilities of the hardware, repeat the process by buying a new machine. This is the reality you have to accept; no amount of money you can spend will stave off obsolescence. The key to not making a bad purchase is to put in the proper amount of research before buying. You wouldn't have had problems with a non-Linux PCMCIA wireless card if you had looked it up on Google before buying it. You would have known the cheaper LCD screen had a crappier viewing angle if you had looked that up as well. Evaluate your needs and make sure what you buy can satisfy all of them and more without costing a boatload of money.
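To put rough numbers on the small-often vs. big-rarely choice, here's a toy comparison in Python; the prices and lifespans are made up, so plug in your own:

    # Hypothetical prices and lifespans, purely illustrative.
    modest_price, modest_life = 800, 4       # dollars, years per machine
    highend_price, highend_life = 2500, 6

    horizon = 12                             # years of computing to cover
    print("modest:   $%d" % (modest_price * (horizon // modest_life)))    # $2400
    print("high-end: $%d" % (highend_price * (horizon // highend_life)))  # $5000

Under these made-up numbers, the cheap-and-often route covers the same twelve years for less than half the money, and you get a hardware refresh more frequently.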
If all you are going to do is browse the web and read email you do not need a powerhouse.
FYI: All of the PCs in my home are 6+ yrs old. We will be getting a new one in a month or two, just debating the Mac vs PC thing right now.
My desktop is about 4 years old, and I have no desire to replace it anytime soon. It runs just fine. And it was a pretty cheap unit when I bought it. Granted, it's just used for email, internet, word processing, etc.
It seems that processor speeds have flat-lined somewhat as well. That's what got me looking at the Core 2 Duo. Intel's website (admittedly biased) says that clock speeds are expected to flatten out, and that multi-core chips like the Core 2 Duo are the future as far as speeding up applications goes.
As for games, the only game I play is Flight Simulator. I just buy the edition prior to the current release, and it seems to run fine on whatever computer I buy. I won't be playing Flight Simulator on a laptop, though. I've never been impressed with a laptop's ability to do high-end games. If it's even possible, the price is just absurd. I've got a real-world pilot's license, and I'd much rather spend $4500 flying a real airplane than buying a beefed-up laptop so I can play one game.
I was surprised at your comment that 2 GB of RAM might not be necessary. I was always told that if you spend money on one thing, spend it on memory. Nobody has ever suggested to me that you could have too much.
There are a relatively small number of kinds of software where more power is almost always better: graphics and artificial intelligence are the two that come most immediately to mind. These are tasks where the programmers really are restricted in what they can do by the limits of the hardware. I can always make more detailed 3D models if the computer can crunch the numbers fast enough, for example.
But for the everyday tasks of most users (web browsing, playing media, word processing), the time when this sort of thing was an issue has long passed. Cutting-edge games are the exception (though I'd hardly claim they're the province of "everyday users"), because game developers (many of whom are graphics and AI programmers) will happily use whatever hardware they can get away with.
Whoever can crack this will make a fortune. Imagine dictating a letter, talking just as you would to a friend. No doubt it would sell well.
There are two hard problems standing between us and that: speech recognition and natural language processing. The first is the problem of taking sounds produced by any human voice and converting them into phonemes, syllables, or some other representation that can be combined into meaningful words. One major stumbling block is the sheer variety of sounds humans make; another is that the computer must somehow know of a word in order to recognize it. Given the vast variety of specialized terminology, this is nearly impossible. There are dictionaries of common English words that work fairly well, but they break down when you want to work with, say, organic chemistry nomenclature. "1,5,5,6-tetramethyl-1,3-cyclohexadiene" and its ilk make the baby speech processor program cry. At best, it'll think you said something very strange about bicycles.
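Here's a toy sketch of that vocabulary problem in Python (the phoneme strings are hypothetical, not the output of any real recognizer):

    # A dictionary matcher cannot return a word it was never given.
    LEXICON = {
        "HH AH L OW": "hello",
        "B AY S IH K AH L": "bicycle",
    }

    def recognize(phonemes):
        # At best it falls back to the closest thing it does know,
        # which is how chemistry turns into something about bicycles.
        return LEXICON.get(phonemes, "(no match; nearest guess: bicycle?)")

    print(recognize("HH AH L OW"))              # hello
    print(recognize("T EH T R AH M EH ..."))    # chemistry term: no match

Real systems do fuzzy matching against tens of thousands of entries, but the principle is the same: if the word isn't in the lexicon, the recognizer can only guess.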
Natural language processing is the problem of taking sentences composed by humans and extracting meaning. Consider all the common rules of written English grammar. Then consider the uncommon ones. Then consider all the ways those rules are commonly broken in written English. Then consider conversational spoken English, which breaks most of those rules as a matter of course. Humans have proven themselves quite capable of understanding sentence fragments, misspelling and mispronunciation, malapropisms, ambiguous statements, and all the rest (even when it annoys the hell out of us, we can usually understand at least part of the message), but computers need to have the rules laid out much more explicitly.
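To see how rigid those explicit rules are, here's a deliberately tiny Python example (one made-up grammar rule, nothing like a real NLP system):

    import re

    # One rigid rule: sentence = determiner, noun, regular past-tense verb.
    SENTENCE = re.compile(r"^(the|a) \w+ \w+ed$")

    for s in ["the dog barked", "dog barked", "barked, the dog did"]:
        print("%r: %s" % (s, "parsed" if SENTENCE.match(s) else "rejected"))

A human reads all three without blinking; the rule-follower accepts only the first. Scale that gap up to all of conversational English and you have the natural language problem.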
What it boils down to is that there are a lot of things that the human brain does so easily and automatically that we assume they must be simple. Early AI research fell victim to this problem a lot - if you read the relevant papers from back in the 50s and 60s, when this was getting started, they're full of assertions that things like speech recognition, natural language processing, visual object recognition, and the like are only ten or fifteen years away. The usual excuse was that the hardware just needed to be a little faster, or the algorithms just needed a little more refinement. It took forty years of failures with the relatively simple models in use then, as well as advances in neuro- and cognitive science, to realize the truth: these things only seem simple because we've been hard-wired to do them. In truth, they're incredibly hard, complex problems.