On Windows, I would argue that CCCP + MPC-HC is better. For example, it offers support for Vista's Enhanced Video Renderer (EVR) while MPlayer does not.
I've tried the CCCP before. It's a huge mess. A million checkboxes all over the place. A cruddy gui from 100 years ago. mplayer, on the other hand, is a box with a video in it.
It's silly for you to discuss the configuration options for CCCP when you obviously don't care about them. It works out of the box - you don't need to touch them at all. If you just run MPC-HC and press the "1" key you get the same plain box with better video in it; without the "1" key the default is a drop-down menu, play/pause/etc buttons and a time slider. Checkboxes don't appear unless you ask for them.
I tried installing Mplayer on a friend's PC, and the playback stuttered annoyingly. I did the same with CCCP + MPC-HC, no problems at all. Additionally, MPC-HC is much better for DVD playback.
A cruddy gui from 100 years ago.
The GUI may be from 100 years ago, but it's simple, straightforward, and doesn't require reading the man page to have any clue how it works.
CCCP has never worked for me. Three times in the past I have downloaded it, installed it with all default settings, then tried to play videos, and it has not been cooperative.
Also, if reading the man page is so hard, I guess I'm just smarter and/or less lazy than you? In the time you've spent complaining here you could have mastered it already.
Also, like I said, MPlayer has a ton of GUIs.
it has not been cooperative.
Uncooperative how? Your experience is not representative of the majority of people. By the way, Scott, you need to differentiate between the media player and CCCP; CCCP is a set of codecs that can be used on many players, even MPlayer. It just comes bundled with players by default. Additionally, when the default player with CCCP became MPC-HC rather than MPC there were a number of significant improvements.
So Ubuntu saw that I had a hard drive whose MBR wasn't pointing to GRUB, and decided it would be great and helpful to point that MBR at the GRUB install on my primary drive, the one with Ubuntu on it.
Thanks Ubuntu, I can't load Windows anymore. Good job.
So I installed Ubuntu 9.04 and tried to upgrade it to 9.10 just to see how it goes. In the time it takes to upgrade, I could have burned a 9.10 CD and done 5 or 6 clean installs of it.
Majority is smart. Here's the full story, in case anyone doesn't know.
For the longest time, Linux users hated NVidia and ATi. Neither company provided any Linux drivers whatsoever. It sucked balls.
Then eventually NVidia made a closed source driver. Linux users complained, and still do, that it is closed source. However, it is an awesome driver that totally works.
ATi more recently has made drivers that are almost completely open source. Of course, these drivers still suck ass. ATi has no problem open sourcing their drivers, because they don't have any secrets. They are shitty drivers. NVidia will not open source their drivers because then ATi will learn all of their magic secrets to making drivers that don't suck.
Also, 90% of the code in NVidia's drivers is the same between Linux, Mac, and Windows. If they open sourced them for Linux, they would also give away all their Windows and Mac driver secrets as well.
Intel has always open sourced their video drivers and published their hardware documentation, and they are very good drivers. The problem is that their video hardware sucks. It works perfectly well for 2d, but its 3d performance is nearly worthless.
Also, as far as dual-monitors go, NVidia's TwinView blows everyone else away on Linux.
I would like it if NVidia open sourced their drivers, because then everyone could have good video drivers, regardless of their hardware. But those driver-writing secrets are why NVidia has 2/3 of the market and ATi doesn't, so can you really blame them?
Thus, even though I prefer open source, having something that works well is more important to me. That is why I will continue to only buy NVidia video cards for the foreseeable future. Even in Windows, the drivers are just plain better.
What are you talking about? You should ONLY use mplayer. It's really simple to use.
Omnutia has been told this many times. He refuses to get himself a better video playback experience because it would mean he'd have to listen to my advice.
A million checkboxes all over the place.
You're doing it wrong. You install it, and you're done, no need to go into the settings unless you have shit hardware, or have better non-free codecs (like CoreAVC) that you wish to use instead of the ffmpeg default. It's like Mplayer, install and you're done.
Yeah, I don't understand. For a Windows computer it's so straightforward. How can it not work? You must be doing something wrong.
The same goes for, say, MPlayer, and Omnutia not being able to hit the space bar to toggle pause.
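For anyone still convinced MPlayer is hard: here is a minimal sketch of scripting it through its slave mode, which takes the same commands the space bar and arrow keys trigger (run "mplayer -input cmdlist" for the full command set). The video filename is just a placeholder.

    # Minimal sketch: driving MPlayer via slave mode. "video.mkv" is a
    # placeholder file; -slave makes MPlayer read commands from stdin.
    import subprocess

    player = subprocess.Popen(
        ["mplayer", "-slave", "-quiet", "video.mkv"],
        stdin=subprocess.PIPE,
    )

    def send(cmd):
        # Slave-mode commands are plain text, one per line.
        player.stdin.write((cmd + "\n").encode())
        player.stdin.flush()

    send("pause")      # toggles pause, exactly what the space bar does
    send("seek 30 0")  # seek 30 seconds forward (0 = relative seek)
    send("quit")       # exit cleanly
    player.wait()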
I don't get how ATI can suck so much on Linux and give me no issues on Windows. Apple also seems to make it work without a problem.
ATi sucks on Windows and OSX, you just don't realize it. Here is an example of the kind of suckage ATi has regardless of OS.
Back before we had a Mac mini we were trying to build a living room PC. We had an ATi card in there. We connected the S-Video output from the ATi card to the TV. Nothing came out. The only way to make anything come out was with a special driver in Windows. So we switched to an older shittier NVidia card. The S-Video started outputting the POST and the BIOS. It realized we had no VGA connected, and used S-Video as the primary output in all OSes. This is one of those small things almost nobody notices that you can use to tell who is really better.
Here are two more examples from other companies.
The iPod is better than all other mp3 players, why? You can pause a podcast, sync to iTunes, then it will play from where you left off. Such a small, almost unnoticeable feature, yet it makes all the difference in the world. It shows you who is really paying attention to doing shit right, and who is just doing the bare minimum effort to get the product out the door.
Back in the day AMD CPUs had a better price/performance ratio than Intel. You could get a faster AMD for less money than a slower Intel chip. That's why I just gave away a pile of socket A equipment. However, if you paid close attention at the time, you would have seen that AMD was going to lose in the long run.
There was a demo video of what would happen to a CPU if you took the heat sink off. They compared a Pentium 4 to an Athlon while running an FPS (I think Quake 3). When they took the heat sink off the AMD, the computer blue screened, froze, died, and then the magic smoke came out. When they took the heat sink off the Intel, the game slowed down to 0 fps and just stayed there. When they put the heat sink back on, the game went back up to full speed.
These are things that normal people never ever ever notice. Only people who do a ton of weird shit with computers and pay very close attention will ever see these kinds of things. But this is also how people, like myself and others, know which products are good and which are not as good. If you just use something casually, you won't notice a problem. Or you'll only notice when PEBKAC.
So yes, most people won't notice a problem with an ATi card on Windows or Mac. But believe me, under the hood it's got issues you can't see. NVidia, mostly because of its superior drivers, does not have many of these same issues.
Yes, AMD manages to compete with Intel despite having a fraction of its resources. And despite AMD being doomed in the long run, more computers today are being shipped with AMD CPUs than ever before.
Is AMD going to take a majority market share? If they ever do, it's not going to be for a while. But they don't do too badly for the size of their operation, and they don't make vastly inferior hardware.
If you are just a normal person observing surface-level information, it's easy to see how you can draw this conclusion. A consumer AMD CPU has comparable performance to an Intel at lower or similar prices. The thing is, you are not taking into account all sorts of things that are beyond your vision.
Ever wonder why almost no laptops have AMD? Because they are electricity hogs. They will kill your battery big time. Intel gets way more performance per watt. This is just one of the reasons that out of the top 500 supercomputers in the world only 43 use AMD. More supercomputers use PowerPC chips than use AMD. 393 of them use Intel.
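To make the per-watt point concrete, here is a toy calculation. The benchmark scores and TDP figures below are invented purely for illustration, not measurements of any real chip:

    # Toy performance-per-watt comparison. These scores and TDPs are
    # made up for illustration only -- they are not real numbers.
    chips = {
        "chip_a": {"score": 10000, "tdp_watts": 65},
        "chip_b": {"score": 11000, "tdp_watts": 95},
    }
    for name, c in chips.items():
        print(name, "perf/watt:", round(c["score"] / c["tdp_watts"], 1))
    # chip_b wins on raw speed, chip_a wins per watt -- and per watt is
    # what decides laptop battery life and a supercomputer's power bill.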
A CPU is an incredibly complicated device. There are so many things going on, I doubt there are more than a handful of people on earth, if any, who really fully understand every aspect of a modern CPU. If you're just a consumer looking at some benchmarks and maybe some prices, and all you do is play some games and browse the web, you won't notice a difference. But people who know more, know better.
It's the same as how a normal person who just drives to work every day won't even really notice if their car has a 4, 6, or even 8 cylinder engine. It makes a huge difference, but a normal consumer person probably won't even realize. Heck, many people can't even tell standard definition apart from HDTV.
At the present time, Intel CPUs and NVidia GPUs are just plain better than AMD/ATi. It's not about brand, it's about technology. Those companies just happen to be technologically superior at the moment, and it doesn't look like it's going to change anytime soon.
You have said several times now that Intel and NVidia are just better, and you have yet to say anything about why. I'm sorry, you said AMD's chips were power hogs, which was true... circa 5 years ago. I see plenty of laptops with AMD CPUs these days; maybe you just haven't been to a Best Buy in years.
So stop talking about supercomputers using Intel, because that only means Intel made better deals. Stop talking about normal people not knowing the difference, because we know the difference between a 4-banger and a V8. Stop throwing up smoke screens, and explain why Intel and NVidia are technologically superior.
At the present time, Intel CPUs and NVidia GPUs are just plain better than AMD/ATi. It's not about brand, it's about technology.
I absolutely disagree. NVidia is lagging behind in terms of feature spec right now, having not even bothered to bring their current line of DX10 cards up to speed with DX10.1. This, of course, impacts nobody but developers, but now it forces them to write two different code paths: one using R2VB and one using VTF. The only reason for this is stubbornness: the ATI-created R2VB "won" (got accepted into the 10.1 standard) and VTF didn't.
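To illustrate what "two different code paths" costs a developer, here is a schematic sketch. This is not real Direct3D code; the class names and capability keys are invented to show the shape of the problem: one feature, two vendor-specific implementations, chosen by a runtime capability check.

    # Schematic only -- not real Direct3D. Invented names, real problem:
    # the same vertex-displacement feature needs two implementations.
    class VtfPath:
        """Vertex texture fetch: the vertex shader samples a texture."""
        def displace(self, mesh):
            print("displacing", mesh, "via VTF")

    class R2vbPath:
        """Render-to-vertex-buffer: render the data, reuse it as vertices."""
        def displace(self, mesh):
            print("displacing", mesh, "via R2VB")

    def pick_displacement_path(caps):
        if caps.get("vertex_texture_fetch"):
            return VtfPath()
        if caps.get("render_to_vertex_buffer"):
            return R2vbPath()
        raise RuntimeError("no GPU vertex-displacement support")

    path = pick_displacement_path({"render_to_vertex_buffer": True})
    path.displace("terrain")  # same call, different machinery underneath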
On that note, NVidia's DirectX 11 offerings are apparently non-existent, and while I don't believe they faked the demo unit for their new board, they certainly don't have anywhere near as much to show as ATI, whose hardware tessellator is looking mighty fine right now. In fact, they've already shipped review units to Tom's Hardware, which means, as far as I can tell, it's a production-ready device.
Interestingly, SLI is undeniably better than Crossfire, but I don't seem to have to convince you of the technological superiority of SLI, knowing full well the wonders of it for yourself.
On processors: bringing up the Quake-3-sans-heat-sink test right now is like claiming that Linux sucks because you have to manually edit your xorg.conf file. No, it's 2009. That argument isn't even relevant anymore. AMD has made huge strides in terms of power management (Cool'n'Quiet was a huge thing while I was working there), even shipping a 65W version of their quad-core processor, right on par with Intel. Gone are the days of cooking an egg on your Athlon, Scott.
Finally, let's talk about virtualization technology. Despite using the same silicon, Intel has selectively disabled its VT on an odd smattering of its chips, probably in an attempt to segment the market artificially. Meanwhile, AMD keeps chugging along, enabling AMD-V on its full line of processors. Is this necessarily indicative of better technology? No. Is it indicative of the way the two companies view their users? Absolutely. I'd rather throw my lot in with the company that gives me more tech than less, especially considering XP Mode is right around the corner.
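Side note: you can actually check for this yourself from userspace. On Linux the kernel exposes the CPUID feature flags in /proc/cpuinfo, where "vmx" means Intel VT-x and "svm" means AMD-V. A quick Linux-only sketch (if the flag is missing, VT may also just be disabled in the BIOS):

    # Linux-only check for hardware virtualization via /proc/cpuinfo.
    # A missing flag can also mean VT was disabled in the BIOS.
    with open("/proc/cpuinfo") as f:
        flags_line = next((line for line in f if line.startswith("flags")), "")
    flag_set = set(flags_line.replace(":", " ").split())
    if "vmx" in flag_set:
        print("Intel VT-x present")
    elif "svm" in flag_set:
        print("AMD-V present")
    else:
        print("no hardware virtualization (or it is disabled in the BIOS)")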
To de(re?)rail the conversation back a bit, these are true and wise words:
mplayer, on the other hand, is a box with a video in it.
Additionally, though I have been a pulse hater for a long time, I have always thought it had great potential. I am running an RC of 9.10 on my laptop, and am very happy with pulse for local sound playback. Per-application volume controls? Fuck yes. When the final version comes out, I'll be experimenting with network-based playback again: having a single source which all the machines (and our 5 surround sound systems) can tie into. Hopefully, all the machines in the house will be able to easily do local sound playback, or just replicate this stream so the music will play throughout the house.
In any event, pulse sucks much less than it used to, as evidenced by the fact that I'm now using it. It doesn't even completely destroy my performance any more!
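For the curious, the two pulse features mentioned above map to fairly simple plumbing. A sketch assuming a reasonably recent pactl; the stream index and server address are placeholders you'd fill in from your own setup:

    # Sketch of the PulseAudio plumbing behind per-app volume and
    # network playback. Assumes pactl is installed and recent;
    # 192.168.1.10 and stream index 0 are placeholders.
    import subprocess

    def pactl(*args):
        subprocess.check_call(["pactl"] + list(args))

    # Per-application volume: each app's stream is a "sink input".
    pactl("list", "sink-inputs")                # find the stream's index
    pactl("set-sink-input-volume", "0", "50%")  # set stream #0 to half volume

    # Network playback: tunnel this machine's audio to a remote pulse server.
    pactl("load-module", "module-tunnel-sink", "server=192.168.1.10")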