
Forums - PC - PCI Express 3.0

It will probably be some time before devices can fully tap all that bandwidth, but it's good to know the potential is there.

First Bluetooth, then USB and now PCI Express. It's clearly the era of version 3.0, and given that the PCI Express specification has been humming along at 2.0 speeds for over two years now, we'd say an update was definitely due. Thankfully, the PCI-SIG has announced the availability of the PCIe Base 3.0 specification to its members today, and the highlights are certainly notable. There's a new 128b/130b encoding scheme and a data rate of 8 gigatransfers per second (GT/s), doubling the interconnect bandwidth over the PCIe 2.0 specification. And since we're sure you're fretting over it, we'll go ahead and affirm that it maintains backward compatibility with previous PCIe architectures. We're also told that based on this data rate expansion, "it is possible for products designed to the PCIe 3.0 architecture to achieve bandwidth near 1 gigabyte per second (GB/s) in one direction on a single-lane (x1) configuration and scale to an aggregate approaching 32 GB/s on a sixteen-lane (x16) configuration." A lot of technobabble, sure, but one thing is certain: your next graphics card is bound to murder your current one if paired with a PCIe 3.0 motherboard.

http://www.engadget.com/2010/11/19/pci-express-makes-the-3-0-leap-doubles-bandwidth-over-pcie-2-0/
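The article's figures can be sanity-checked from the raw numbers it quotes: 8 GT/s per lane with 128b/130b line coding. A minimal back-of-the-envelope sketch (using decimal gigabytes; real-world throughput will be lower once protocol overhead is counted):

```python
# Sanity check of the PCIe 3.0 bandwidth figures quoted in the article.
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding (2 overhead bits per 130).
GT_PER_S = 8e9          # transfers (raw bits on the wire) per second, per lane
ENCODING = 128 / 130    # payload fraction after 128b/130b line coding

def pcie3_bandwidth_gbps(lanes):
    """Approximate one-direction payload bandwidth in GB/s (decimal)."""
    bits_per_second = GT_PER_S * ENCODING * lanes
    return bits_per_second / 8 / 1e9   # bits -> bytes -> GB

print(f"x1:  {pcie3_bandwidth_gbps(1):.2f} GB/s per direction")    # ~0.98 GB/s
print(f"x16: {pcie3_bandwidth_gbps(16):.2f} GB/s per direction")   # ~15.75 GB/s
print(f"x16 aggregate: {2 * pcie3_bandwidth_gbps(16):.1f} GB/s")   # ~31.5 GB/s
```

That works out to just under 1 GB/s per lane and roughly 31.5 GB/s aggregate at x16, which matches the "near 1 GB/s" and "approaching 32 GB/s" wording above.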



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!


They should just use the graphics card to do cpu's job.



Galaki said:

They should just use the graphics card to do cpu's job.


Well, GPUs are actually slower than CPUs at many types of processing, as CPUs are heavily optimised for certain types of instructions, and for programs to run well on GPUs they would have to be rewritten to take advantage of GPUs rather than CPUs. Maybe in 5-10 years CPUs and GPUs will have converged into one type of chip, but by then we will probably be doing everything in the cloud anyway...




Galaki said:

They should just use the graphics card to do cpu's job.


GPUs aren't very good at serial or integer tasks; CPUs aren't very good at massively parallel floating point tasks. Making one chip that attempts to do both has never worked historically (Cell, Larrabee), and putting a GPU on the CPU die (Sandy Bridge, Llano) is probably the best solution.

PCIe 3.0 is, as the OP said, unnecessary until graphics cards saturate the PCIe 2.0 link, which won't happen for a while.



Galaki said:

They should just use the graphics card to do cpu's job.


Isn't that AMD Fusion?




This is more relevant to laptops, where you want a narrow interface that uses as little power as possible for space and battery reasons. At present the standard 16x PCI-E desktop interface is not being pushed to its limits. However, this might allow a 2x PCI-E interface on laptops to be effective.
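The narrow-link point can be put in numbers by comparing per-lane throughput across generations (PCIe 2.0 runs at 5 GT/s with 8b/10b encoding, PCIe 3.0 at 8 GT/s with 128b/130b). A rough sketch:

```python
# One-direction payload bandwidth by PCIe generation and lane count.
# PCIe 2.0: 5 GT/s with 8b/10b encoding; PCIe 3.0: 8 GT/s with 128b/130b.
SPECS = {
    "2.0": (5e9, 8 / 10),
    "3.0": (8e9, 128 / 130),
}

def bandwidth_gbps(gen, lanes):
    rate, encoding = SPECS[gen]
    return rate * encoding * lanes / 8 / 1e9  # bits -> bytes -> GB/s

# A narrow 2x link on PCIe 3.0 carries nearly as much as 4x on PCIe 2.0:
print(f"PCIe 2.0 2x: {bandwidth_gbps('2.0', 2):.2f} GB/s")  # 1.00 GB/s
print(f"PCIe 3.0 2x: {bandwidth_gbps('3.0', 2):.2f} GB/s")  # ~1.97 GB/s
print(f"PCIe 2.0 4x: {bandwidth_gbps('2.0', 4):.2f} GB/s")  # 2.00 GB/s
```

So a 2x PCIe 3.0 link gets close to the bandwidth of a 4x PCIe 2.0 link with half the lanes, which is exactly the trade-off that matters in a laptop.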



Tease.

Since PCI-E is being discussed, I figured I'd ask a question.

My mobo's PCI-E 2.0 does 10 GB/s per card in SLI mode (standard 16x), but when adding a 3rd card, two of the PCI-E ports go down to 8x speed. So it's 16x 8x 8x in tri-SLI...

The video card I have scales pretty well in tri-SLI according to reviews (20-30% performance boost). However, considering that the 2nd and 3rd PCI-E slots will only run at 8x... is it worth adding another graphics card?
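For context on the 8x question, the raw link numbers work out as follows (a rough sketch of PCIe 2.0 link bandwidth only; actual SLI scaling depends on the game and drivers, not just link width):

```python
# One-direction payload bandwidth of a PCIe 2.0 link: 5 GT/s, 8b/10b encoding.
def pcie2_bandwidth_gbps(lanes):
    return 5e9 * (8 / 10) * lanes / 8 / 1e9  # bits -> bytes -> GB/s

print(f"16x: {pcie2_bandwidth_gbps(16):.0f} GB/s per direction")  # 8 GB/s
print(f"8x:  {pcie2_bandwidth_gbps(8):.0f} GB/s per direction")   # 4 GB/s
```

The 8x slots still offer 4 GB/s each way, and reviews of the era generally found cards of this class lose only a few percent going from 16x to 8x on PCIe 2.0, so the link width alone shouldn't decide it.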



disolitude said:

Since PCI-E is being discussed, I figured I'd ask a question.

My mobo's PCI-E 2.0 does 10 GB/s per card in SLI mode (standard 16x), but when adding a 3rd card, two of the PCI-E ports go down to 8x speed. So it's 16x 8x 8x in tri-SLI...

The video card I have scales pretty well in tri-SLI according to reviews (20-30% performance boost). However, considering that the 2nd and 3rd PCI-E slots will only run at 8x... is it worth adding another graphics card?


What the F would you possibly need 3 cards for?

Two decent cards can max out any game in existence.



superchunk said:
disolitude said:

Since PCI-E is being discussed, I figured I'd ask a question.

My mobo's PCI-E 2.0 does 10 GB/s per card in SLI mode (standard 16x), but when adding a 3rd card, two of the PCI-E ports go down to 8x speed. So it's 16x 8x 8x in tri-SLI...

The video card I have scales pretty well in tri-SLI according to reviews (20-30% performance boost). However, considering that the 2nd and 3rd PCI-E slots will only run at 8x... is it worth adding another graphics card?


What the F would you possibly need 3 cards for?

Two decent cards can max out any game in existence.

3 screen gaming...

I don't need it yet, but I figure a year from now GTX470s will be 100-150 bucks used and 200 new. Might as well, if it will be beneficial. Crysis and Metro 2033 already run somewhat choppy when completely maxed out at 5760x1080 resolution...



disolitude said:
superchunk said:
disolitude said:

Since PCI-E is being discussed, I figured I'd ask a question.

My mobo's PCI-E 2.0 does 10 GB/s per card in SLI mode (standard 16x), but when adding a 3rd card, two of the PCI-E ports go down to 8x speed. So it's 16x 8x 8x in tri-SLI...

The video card I have scales pretty well in tri-SLI according to reviews (20-30% performance boost). However, considering that the 2nd and 3rd PCI-E slots will only run at 8x... is it worth adding another graphics card?


What the F would you possibly need 3 cards for?

Two decent cards can max out any game in existence.

3 screen gaming...

I don't need it yet, but I figure a year from now GTX470s will be 100-150 bucks used and 200 new. Might as well, if it will be beneficial. Crysis and Metro 2033 already run somewhat choppy when completely maxed out at 5760x1080 resolution...

ok, didn't think of that. Carry on.