Most computers have two main processors. The CPU (Central Processing Unit) does the bulk of the general number crunching, while the GPU (Graphics Processing Unit) does all the calculations necessary to put complex images on the screen. Traditionally, the GPU has been used only for graphical processing, determining the placement of pixels, vertices, lines, and other geometric constructs in 2D and 3D spaces. Thanks to the evolution of GPUs, they're increasingly finding work outside of pixel and polygon-crunching.

CPUs are primarily serial processors: even with multiple cores and threads, each core works through its calculations one after another, very rapidly. GPUs, on the other hand, have become massively parallel processors. While an Intel or AMD CPU might have 2, 4, or 8 cores, an Nvidia GeForce or ATI Radeon GPU can have hundreds, all working at the same time. The individual cores on a GPU perform relatively simple functions, but because they all work simultaneously, they can pull off some impressive mathematical feats.
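To make the serial-versus-parallel distinction concrete, here's a small sketch in Python (which stands in for real GPU code here; on actual hardware this would be a CUDA or OpenCL kernel, and the function names below are ours, not any real API's). The key idea: a GPU "kernel" is a simple function applied to every element of a dataset independently, so nothing stops all the elements from being processed at once.

```python
# A GPU "kernel" is just a simple function applied to every element
# of a dataset independently. (Illustrative sketch only -- real GPGPU
# code would be a CUDA or OpenCL kernel; Python stands in for clarity.)

def kernel(x):
    # On a GPU, each core would run this on one element simultaneously.
    return x * x + 1

data = list(range(8))

# CPU-style serial execution: one element after another.
serial_result = []
for x in data:
    serial_result.append(kernel(x))

# GPU-style data parallelism: conceptually, all elements at once.
# map() expresses that no element depends on any other.
parallel_result = list(map(kernel, data))

print(serial_result)    # [1, 2, 5, 10, 17, 26, 37, 50]
```

Both paths do identical work; the difference is that the second form has no ordering constraint, which is exactly what lets a GPU hand each element to its own core.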
The sheer processing power found in GPUs has not gone unnoticed by researchers, who have turned to GPGPU (general-purpose computing on GPUs) to produce a wide array of models and simulations that serial CPUs either can't handle or handle far less efficiently. While video cards were once used solely for getting graphics onto your computer screen, they're now computational multitools for scientists. Their parallel architecture makes them ideal for handling hundreds of similar computations at once, exactly the sort of work scientific simulations require. It also makes them remarkably well suited for brute-force security work: a parallel processor can test thousands of candidate passwords or keys per second, which is why GPUs have become the engine of choice for password crackers and decryption tools.
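Brute-force cracking parallelizes so well because every candidate password can be hashed and checked independently of all the others. Here's a toy Python sketch of the idea (the lowercase alphabet, the MD5 hash, and the example password are assumptions for illustration; real GPU crackers run one candidate per thread and test millions per second).

```python
import hashlib
from itertools import product

# Toy brute-force search: hash every candidate and compare against
# the target. Each candidate is independent of the others, so a GPU
# could assign one candidate per thread and test thousands at once.

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def crack(target_hash, length):
    for combo in product(ALPHABET, repeat=length):
        candidate = "".join(combo)
        if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None

# Hypothetical target: the MD5 hash of "gpu".
target = hashlib.md5(b"gpu").hexdigest()
print(crack(target, 3))  # prints "gpu"
```

The serial loop above is the bottleneck; since no iteration depends on a previous one, splitting the candidate space across hundreds of GPU cores gives a near-linear speedup.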

If you want to take advantage of GPGPU at home, GPUradar maintains a catalog of software that uses your GPU's parallel processing power. The site notes that GPGPU benefits far more than esoteric scientific modeling programs: several video converters, codecs, and players are accelerated by GPUs, as are numerous cryptography and security tools. Even every statistics student's best friend, Mathematica, can use parallel-processing GPUs to crunch numbers as of version 7.
Nvidia is actively promoting the use of its GPU architecture for general processing. The company calls its parallel processing architecture CUDA, and while it's a fundamental part of its GeForce GPUs, it's also marketed as a valuable tool for researchers. According to Nvidia, its CUDA architecture has been used by SeismicCity to interpret seismic data to find potential oil wells, the University of Illinois at Urbana-Champaign to model the satellite tobacco mosaic virus, Techniscan Medical Systems to process ultrasound data, and General Mills to simulate how cheese melts.
GPGPU.org keeps track of advances in and implementations of GPGPU technology. The site has cataloged GPU use in biology, physics, mathematics, and even computer security research. Dozens of research papers have been written on how GPGPU can improve modeling, including vastly speeding up searches for protein substructures, modeling the movement of particles under gravity, and rapidly scanning computers for viruses and other security threats.
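Particles moving under gravity are a classic GPGPU workload: each particle's acceleration depends only on the previous positions of all the others, so every particle can be updated in its own GPU thread. Here's a minimal serial Python sketch of one timestep of such an N-body simulation (the masses, positions, and unit constants are made up for illustration).

```python
import math

# One timestep of a naive N-body gravity simulation. The outer loop
# computes the force on a single particle -- work a GPU would hand to
# one thread per particle, since each particle's new state depends
# only on the (read-only) previous positions.

G = 1.0    # gravitational constant (arbitrary units for this sketch)
DT = 0.01  # timestep

def step(positions, velocities, masses):
    n = len(positions)
    new_pos, new_vel = [], []
    for i in range(n):            # <- this loop is parallel on a GPU
        ax = ay = 0.0
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - positions[i][0]
            dy = positions[j][1] - positions[i][1]
            dist = math.hypot(dx, dy)
            # Acceleration toward particle j: G * m_j / r^2.
            a = G * masses[j] / (dist * dist)
            ax += a * dx / dist
            ay += a * dy / dist
        vx = velocities[i][0] + ax * DT
        vy = velocities[i][1] + ay * DT
        new_vel.append((vx, vy))
        new_pos.append((positions[i][0] + vx * DT,
                        positions[i][1] + vy * DT))
    return new_pos, new_vel

# Two equal masses one unit apart attract each other symmetrically.
pos, vel = step([(0.0, 0.0), (1.0, 0.0)], [(0.0, 0.0)] * 2, [1.0, 1.0])
```

With thousands of particles this becomes an O(n²) computation per step, which is exactly why researchers offload it to hundreds of GPU cores instead of a handful of CPU cores.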
http://www.tested.com/news/gpgpu-explained-how-your-gpu-is-taking-over-cpu-tasks/1017/
@TheVoxelman on twitter










