
Building A Computer


Comments

  • edited September 2012
    It would be great if Lenovo made one of these that was also part of the official Lenovo dock. Then just let people stick a card right in there.

    Really, I think there is a huge opportunity for one of the niche laptop makers to do this.
    Post edited by Apreche on
  • edited September 2012
    Probably wouldn't be a full-length card, but whatever they put in those oversized laptops.

    Right now, prohibitive cost and the lack of a standard taking off are the problems. ExpressCard is kinda common but bulky, only Macs have Light Peak, and USB 3.0 may still have the latency issues USB 2.0 did (it was never designed for low latency at that kind of bandwidth).

    Here's the Mini PCIe version, which is a 5Gb x1 lane system, and a Thunderbolt version with a 5Gb x2 slot (rough bandwidth math at the bottom of this post).

    http://www.hwtools.net/Adapter/PE4L V2.1.html
    http://www.hwtools.net/Adapter/TH05.html

    This place sells WiMAX adapters.
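
    Rough bandwidth math for those adapters, my own back-of-the-envelope numbers assuming the 5Gb/s lanes behave like PCIe 2.0 (8b/10b encoding, so 80% of the raw rate is payload):

        # Back-of-the-envelope effective bandwidth for the adapters above,
        # assuming PCIe 2.0 lanes (5 GT/s raw, 8b/10b encoding = 80% payload).
        def effective_mb_per_s(raw_gbps_per_lane, lanes, payload_fraction=8 / 10):
            payload_bits = raw_gbps_per_lane * 1e9 * lanes * payload_fraction
            return payload_bits / 8 / 1e6  # bits -> bytes -> MB per second

        for name, lanes in [("PE4L (Mini PCIe, x1)", 1),
                            ("TH05 (Thunderbolt, x2)", 2),
                            ("desktop PCIe 2.0 x16 slot", 16)]:
            print(f"{name}: ~{effective_mb_per_s(5, lanes):.0f} MB/s")

    So even the Thunderbolt version gives a card roughly an eighth of what a desktop x16 slot would.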
    Post edited by Omnutia on
  • I'm not sure if this website will be kept up-to-date. But for now, when someone asks you for advice, you can say "This is a good starting point".
  • Rym
    edited September 2012
    Our computers are old enough to predate PCI Express 3.0... PCIe 2.0 x16.

    Might not be a problem, but I've been out of the loop on mobo hardware long enough that I actually have to research how much compatibility there is. (Many of you are too young to remember the slightly different versions of AGP and the tabs on video cards at one point). ;^)
    Post edited by Rym on
  • Nice. Found my answer.

    Q: Will PCIe 3.0 products be compatible with existing PCIe 1.x and PCIe 2.x products?

    A: PCI-SIG is proud of its long heritage of developing compatible architectures and its members have consistently produced compatible and interoperable products. In keeping with this tradition, the PCIe 3.0 architecture is fully compatible with prior generations of this technology, from software to clocking architecture to mechanical interfaces. That is to say PCIe 1.x and 2.x cards will seamlessly plug into PCIe 3.0-capable slots and operate at their highest performance levels. Similarly, all PCIe 3.0 cards will plug into PCIe 1.x- and PCIe 2.x-capable slots and operate at the highest performance levels supported by those configurations.
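
    If you're curious what your own box actually negotiated, here's a minimal sketch that reads the standard Linux sysfs attributes for PCI devices (current_link_speed and friends; needs a reasonably recent kernel, Linux only):

        # List negotiated vs. maximum PCIe link speed/width for every device
        # that exposes the link attributes via sysfs.
        from pathlib import Path

        for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
            try:
                cur_speed = (dev / "current_link_speed").read_text().strip()
                cur_width = (dev / "current_link_width").read_text().strip()
                max_speed = (dev / "max_link_speed").read_text().strip()
                max_width = (dev / "max_link_width").read_text().strip()
            except OSError:
                continue  # device doesn't report link info
            print(f"{dev.name}: running {cur_speed} x{cur_width} (max {max_speed} x{max_width})")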
  • Nice. Found my answer.

    Q: Will PCIe 3.0 products be compatible with existing PCIe 1.x and PCIe 2.x products?

    A: PCI-SIG is proud of its long heritage of developing compatible architectures and its members have consistently produced compatible and interoperable products. In keeping with this tradition, the PCIe 3.0 architecture is fully compatible with prior generations of this technology, from software to clocking architecture to mechanical interfaces. That is to say PCIe 1.x and 2.x cards will seamlessly plug into PCIe 3.0-capable slots and operate at their highest performance levels. Similarly, all PCIe 3.0 cards will plug into PCIe 1.x- and PCIe 2.x-capable slots and operate at the highest performance levels supported by those configurations.
    I could have told you that.
  • Yeah. I just had to be sure. I forgot that PCIe 3.0 didn't come out until like 2010. My upgrade cycle of video cards in the AGP days hit a roadblock due to the voltage thing. I remember some video cards coming out with either connector around the same time.
  • Also, remember AGP aperture settings? Those were heady days, when dedicated video cards with enhanced performance (or even the idea of buying a "video card" at all) really started taking off for many people.
  • edited September 2012
    I always kept the AGP aperture at the highest setting, but my machines had a lot of RAM at the time. I probably still have an AGP computer around somewhere...
    Post edited by Victor Frost on
  • Originally the entire idea of AGP was that you would make a special bus so the GPU on the video card could bypass the CPU to access system RAM. RAM in those days was expensive, so they didn't put multiple gigs of it on the video cards. I had a 32MB TNT2, and that was a big fucking deal.

    Nowadays that feature still exists, but is only really used by integrated GPUs in laptops and such since real GPUs have their own memory. Instead we just needed to make the bus itself faster so we could load all those textures from the disk into the video card without having to wait a year and a day. PCIe 3.0 is fast enough that almost nothing anyone here will do can fill the bus.
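
    To put rough numbers on that (my own figures, built from the published per-lane rates, so treat them as approximate):

        # Approximate x16 slot bandwidth per PCIe generation, and how long a
        # big texture load would take to cross the bus at each.
        GENS_GBPS_PER_LANE = {
            "PCIe 1.x": 2.5 * 8 / 10,     # 2.5 GT/s with 8b/10b encoding
            "PCIe 2.0": 5.0 * 8 / 10,     # 5.0 GT/s with 8b/10b encoding
            "PCIe 3.0": 8.0 * 128 / 130,  # 8.0 GT/s with 128b/130b encoding
        }
        TEXTURES_GB = 2.0  # a generous level's worth of assets, just for scale

        for gen, gbps in GENS_GBPS_PER_LANE.items():
            x16_gb_per_s = gbps * 16 / 8
            print(f"{gen} x16: ~{x16_gb_per_s:.1f} GB/s, "
                  f"{TEXTURES_GB:.0f} GB of textures in ~{TEXTURES_GB / x16_gb_per_s:.2f}s")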
  • PCIe 3.0 also changes the basic encoding scheme of the data on the bus (though backward compatibility to the old format appears to be a requirement). The formats themselves are CRAZY. ;^)
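    That encoding change is most of the jump, too: the raw rate only went from 5 GT/s to 8 GT/s, but the overhead dropped from 20% to about 1.5%, which is why the effective per-lane bandwidth roughly doubled. A tiny sketch of the overhead math:

        # Encoding overhead: PCIe 1.x/2.0 put 10 bits on the wire per 8 data bits,
        # PCIe 3.0 puts 130 bits on the wire per 128 data bits.
        for name, data_bits, wire_bits in [("8b/10b (PCIe 1.x/2.0)", 8, 10),
                                           ("128b/130b (PCIe 3.0)", 128, 130)]:
            overhead = 1 - data_bits / wire_bits
            print(f"{name}: {overhead:.1%} of the raw bit rate is overhead")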
  • I might be going halfsies with my parents on a new desktop for Christmas, so I'm pricing out a build to present to them right now. And so I need to ask: does anybody know where the price/performance sweet spot is for graphics cards nowadays? I haven't paid attention for a while and all the specs seem meaningless to me now.
  • The latest update was a tick, right? Is it worth waiting for the tock or should I go for it right now?
  • edited January 2013
    Is it the end of the world if I have to settle for a Western Digital Green as a boot drive? I'm nowhere near building a PC as yet, but I'm far closer than I was months ago.
    Post edited by Diagoras on
  • edited January 2013
    Western Digital makes pretty damn solid hard drives, in my experience. And yeah, SSDs are expensive - I wouldn't blame you, for one.
    Post edited by Linkigi(Link-ee-jee) on
  • So, I am building a PC from almost nothing, and I have pretty much the entire computer finalized for what I want, which is to play any PC game at top graphics and to be future-proof for at least 5 years. The prices are rounded, but the entire rig comes to about $1,470 (there's a quick tally after the list). Is there anything else I am missing from this, or any suggestions? The only things I do have for the computer right now are the Windows 7 CD, keyboard, and mousepad.

    Antec 300 Black Steel Mid-Tower Computer Case, plain with a little bit of... vanity. - $60

    ASRock P67 Intel Motherboard, the same motherboard as Scott's. - $140

    Intel Core i7 processor, 3.4GHz - $300

    EVGA NVidia GTX 660, literally one step below Scott's video card in the NVidia line. - $230

    Corsair 80 Plus Bronze, will getting a more efficient power supply really matter? - $90

    16 gigs (4 gigs x 4) of the cheapest RAM on NewEgg - $85

    Seagate Barracuda 2 TB Hard Drive - $110

    Dell UltraSharp 24" Widescreen LCD Monitor - $320

    Asus dvd burner - $20

    Encore 300Mbps Wireless Adapter. Yes, I do need it. - $13

    Extra stuff

    Razer Abyss Mouse - $40

    Gigabyte Speakers - $8

    Symba Microphone, better than nothing! - $11

    Kingston 32 gig SDHC Flash Card because I plan on getting a GoPro White - $25

    Flash Card Reader - $10

    Playstation 2 Controller Adapter to play emulated PS2 games. - $5

    6ft DVI-D cable - $10

    Rosewill HDMI cable 6ft because I don't have one - $7

    Belkin Surge Protector - $15

    Anti-Static Band - $5

    50 cable zip ties - $3
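
    Quick tally of the rounded prices above, as a sanity check (actual checkout totals will drift a bit from these):

        # Sum the rounded prices listed above.
        core = {"case": 60, "motherboard": 140, "cpu": 300, "gpu": 230, "psu": 90,
                "ram": 85, "hdd": 110, "monitor": 320, "dvd": 20, "wifi": 13}
        extras = {"mouse": 40, "speakers": 8, "mic": 11, "sd_card": 25, "reader": 10,
                  "ps2_adapter": 5, "dvi": 10, "hdmi": 7, "surge": 15,
                  "anti_static": 5, "zip_ties": 3}
        print("core build:", sum(core.values()))                         # 1368
        print("extras:    ", sum(extras.values()))                       # 139
        print("total:     ", sum(core.values()) + sum(extras.values()))  # 1507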
  • Careful, that ultrasharp isn't the ultrasharpest one. No HDMI, for one.
  • Careful, that ultrasharp isn't the ultrasharpest one. No HDMI, for one.
    Yeah, I thought the same thing when I picked it out on NewEgg, but the cheapest Ultrasharp monitor with HDMI is $160 more than the one that I chose and seemed to be the exact same in all other regards.
  • edited January 2013
    You can step down to a Core i5 instead of the i7; the performance difference is very slight. Frankly, for a large majority of games either one is overkill. You would be significantly better off saving $100 on the CPU and switching the graphics card over to a GTX 660 Ti or a GTX 670.

    You might want to consider an SSD, though even now they're a significant expense.

    Also, for Scott, I'll tell you that UltraSharp is actually 6-bit and not 8-bit - it uses temporal dithering. That said, it's still an IPS panel, so it has many of the other advantages of that technology, and on the whole it's a good monitor.
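
    For anyone wondering what temporal dithering actually does: a 6-bit panel only has 64 levels per channel, so it flickers between the two nearest levels over successive frames so the time-average lands on the 8-bit target. A toy illustration of the idea (not Dell's actual algorithm):

        # Toy temporal dithering (FRC): approximate an 8-bit value on a 6-bit
        # panel by alternating between the two nearest 6-bit levels over time.
        def frc_frames(target_8bit, n_frames=8):
            low = target_8bit // 4        # nearest 6-bit level at or below the target
            frac = (target_8bit % 4) / 4  # fraction of frames needing the level above
            frames, err = [], 0.0
            for _ in range(n_frames):
                err += frac
                if err >= 1:
                    frames.append(min(low + 1, 63))
                    err -= 1
                else:
                    frames.append(low)
            return frames

        frames = frc_frames(130)  # 8-bit 130 falls between 6-bit levels 32 and 33
        print(frames, "-> average in 8-bit terms:", sum(frames) / len(frames) * 4)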
    Post edited by lackofcheese on
  • Just checked Ultrasharp web site. Apparently there are new models, or is my memory bad? Anyway they now clearly identify the good ones with an icon that denotes "premier color".

    Also the good ones have stands that are more versatile and allow more adjustments. Because I bought two arms, I have two of the better stands lying around. If anyone has a dell monitor with a shit stand, please take these good ones out of my apartment.
  • Just checked Ultrasharp web site. Apparently there are new models, or is my memory bad? Anyway they now clearly identify the good ones with an icon that denotes "premier color".

    Also the good ones have stands that are more versatile and allow more adjustments. Because I bought two arms, I have two of the better stands lying around. If anyone has a dell monitor with a shit stand, please take these good ones out of my apartment.
    GIVE TO GEORGE!
  • Just checked Ultrasharp web site. Apparently there are new models, or is my memory bad? Anyway they now clearly identify the good ones with an icon that denotes "premier color".

    Also the good ones have stands that are more versatile and allow more adjustments. Because I bought two arms, I have two of the better stands lying around. If anyone has a dell monitor with a shit stand, please take these good ones out of my apartment.
    GIVE TO GEORGE!
    Please, come and take them.

  • Clockian, I think you can definitely afford to step down to a Core i5 processor and probably even step down a bit on the video card - my new rig is running a Geforce 650 Ti and seems to do fine.

    Also, you can probably save a mint on the RAM by getting 2x8 instead of 4x4, and looking for deals on the processor where it gets bundled with RAM - that'll knock about $10 off the price, and also give you free expansion slots if there ever comes a time when that isn't enough. As always, I'll recommend Corsair's RAM, because it's damn good.

    And check to see if your video card has an included DVI cable - $10 might be overpaying for one anyway.
  • edited January 2013
    Clockian, I think you can definitely afford to step down to a Core i5 processor and probably even step down a bit on the video card - my new rig is running a Geforce 650 Ti and seems to do fine.

    Also, you can probably save a mint on the RAM by getting 2x8 instead of 4x4, and looking for deals on the processor where it gets bundled with RAM - that'll knock about $10 off the price, and also give you free expansion slots if there ever comes a time when that isn't enough. As always, I'll recommend Corsair's RAM, because it's damn good.

    And check to see if your video card has an included DVI cable - $10 might be overpaying for one anyway.
    I have so many extra cables if people need them. At worst, you should never pay more than Monoprice cost for a cable.
    Post edited by Apreche on
  • edited January 2013
    Just checked Ultrasharp web site. Apparently there are new models, or is my memory bad? Anyway they now clearly identify the good ones with an icon that denotes "premier color".
    New-ER models, yeah. It's no surprise, dude. I remember listening to you do your UltraSharp spiel on the podcast - the same one you've been giving ever since - on the bus home from work when I still lived in Headingly, and that was late 2008 and early 2009. Can't expect the product line to stay static forever.

    Post edited by Churba on
  • Just checked Ultrasharp web site. Apparently there are new models, or is my memory bad? Anyway they now clearly identify the good ones with an icon that denotes "premier color".

    Also the good ones have stands that are more versatile and allow more adjustments. Because I bought two arms, I have two of the better stands lying around. If anyone has a dell monitor with a shit stand, please take these good ones out of my apartment.
    GIVE TO GEORGE!
    Please, come and take them.
    Next time I visit I will relieve you of them. Or Cat, Karl, and Natalie might be coming to DC for Katsucon; maybe give the stands to them to bring down.
  • edited January 2013
    Is HDMI necessary on a monitor?

    While I'm here, if anyone has any spare USB motherboard header cables, or the header cables for HDAudio output to the front of the case, and wouldn't mind mailing them to me in CA, I should be able to find something you might want.
    Post edited by Omnutia on
  • Is HDMI necessary on a monitor?

    While I'm here, if anyone has any spare USB motherboard header cables, or the header cables for HDAudio output to the front of the case, and wouldn't mind mailing them to me in CA, I should be able to find something you might want.
    Pretty sure I have some of those USB to back of case brackets. Usually the HDAudio to front of case cables are built into the case and aren't separate.

    Guys, I have built so many computers. I have so much of this shit, but I gladly gave a lot of it to Scojo.
  • Clockian, I think you can definitely afford to step down to a Core i5 processor and probably even step down a bit on the video card - my new rig is running a Geforce 650 Ti and seems to do fine.
    Since he said he wanted it to last 5 years, I'd stick with a stronger GPU - at least a GTX 660.

    However, it would be better to buy a 660 now and upgrade it in 3 years or so than it would be to stretch a 670 over 5 years.
  • Clockian, I think you can definitely afford to step down to a Core i5 processor and probably even step down a bit on the video card - my new rig is running a Geforce 650 Ti and seems to do fine.
    Since he said he wanted it to last 5 years, I'd stick with a stronger GPU - at least a GTX 660.

    However, it would be better to buy a 660 now and upgrade it in 3 years or so than it would be to stretch a 670 over 5 years.
    On Newegg it looks like 660s are $230, 670s are $400, and 680s are $470. Why not just go all out with the 680? I have one; it kicks ass. It might hurt the wallet, but once you get it you will not regret it. You probably won't have to upgrade it until 2000 forever.
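
    For what it's worth, rough numbers on both approaches over five years, using those Newegg prices and a flat-out guess that a similar mid-range card costs about the same in three years:

        # Rough 5-year GPU spend: one high-end card up front vs. a mid-range
        # card now plus a replacement mid-range card in ~3 years.
        gtx_660_now = 230
        gtx_680_now = 470
        future_midrange_guess = 230  # pure assumption: similar tier, similar price later

        print("GTX 680 up front:            $", gtx_680_now)
        print("GTX 660 now + card in 3 yrs: $", gtx_660_now + future_midrange_guess)

    Similar money either way; the difference is whether you get all the performance now or a newer architecture halfway through.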