[vox-tech] GPU calculations

Peter Jay Salzman p at dirac.org
Tue Jun 13 10:28:48 PDT 2006


On Tue 13 Jun 06, 10:20 AM, Richard Harke <rharke at earthlink.net> said:
> On Tue June 13 2006 08:44, Peter Jay Salzman wrote:
> > I recently read some papers where people performed an FFT on Nvidia
> > hardware.  The idea is that a GPU is capable of performing certain types of
> > operations very quickly, faster than a CPU.
> >
> > Has anyone looked into this?  I've seen one project port FFTW to be
> > a GPU-enhanced FFTW.
> >
> > Any idea on what it would take to write a "hello world" type program where
> > 1 + 1 is thrown onto a GPU and the result is returned to a local variable?
> >
> Why don't you look at   www.gpgpu.org
> GPGPU -> General Purpose computing on a GPU
> 
> Richard Harke
 
Thanks, Richard.  Good find!  I think you do numerical computing as well --
have you done any of this?  I've seen GPU implementations for solving sparse
and dense linear systems, which is essentially the workhorse step when you
solve partial differential equations with implicit discretization methods.
You do this in your own work, don't you?
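
(The textbook example I have in mind, nothing specific to your problems:
backward Euler on the 1-D heat equation u_t = u_{xx} gives

    \frac{u_i^{n+1} - u_i^n}{\Delta t}
        = \frac{u_{i+1}^{n+1} - 2 u_i^{n+1} + u_{i-1}^{n+1}}{\Delta x^2},

so every time step reduces to one sparse, tridiagonal solve

    \left( I - \frac{\Delta t}{\Delta x^2} A \right) u^{n+1} = u^n,

where A is the usual second-difference matrix.)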

In my own finance work, I'd be interested in the FFT, high-performance
sorting, and database work (which I just found at the link you mentioned!).
I've also been thinking about extending my dissertation work: solving the
Schrödinger-Newton PDE in 2 and 3 dimensions.

I'm not entirely sure why, but every paper I've read on the subject so far
uses NVidia hardware.  I need to do more reading to find out.
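
In the meantime, here's my rough guess at what the "1 + 1" hello world
looks like, pieced together from the GPGPU tutorials I've skimmed so far:
compile a trivial fragment shader, render a one-pixel quad into a
floating-point texture attached to a framebuffer object, then read the
pixel back into a local variable.  Completely untested, and it assumes
GLEW/GLUT and a card that supports FBOs and float textures:

/* gpgpu_hello.c -- "1 + 1" on the GPU via OpenGL + GLSL (untested sketch).
 * Build (roughly):  gcc gpgpu_hello.c -lglut -lGLEW -lGL
 */
#include <stdio.h>
#include <GL/glew.h>
#include <GL/glut.h>

static const char *frag_src =
    "uniform float a, b;\n"
    "void main(void) { gl_FragColor = vec4(a + b); }\n";

int main(int argc, char **argv)
{
    GLuint tex, fbo, shader, prog;
    float result[4];

    /* We need a GL context, even though nothing is drawn on screen. */
    glutInit(&argc, argv);
    glutCreateWindow("gpgpu hello");
    glewInit();

    /* A 1x1 floating-point texture to hold the answer. */
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
    glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA32F_ARB, 1, 1, 0,
                 GL_RGBA, GL_FLOAT, NULL);

    /* Attach it to a framebuffer object so the shader renders into it. */
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_RECTANGLE_ARB, tex, 0);
    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) !=
        GL_FRAMEBUFFER_COMPLETE_EXT)
        fprintf(stderr, "FBO incomplete -- no float-texture support?\n");

    /* The "computation" is a fragment shader that adds two uniforms. */
    shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &frag_src, NULL);
    glCompileShader(shader);
    prog = glCreateProgram();
    glAttachShader(prog, shader);
    glLinkProgram(prog);
    glUseProgram(prog);
    glUniform1f(glGetUniformLocation(prog, "a"), 1.0f);
    glUniform1f(glGetUniformLocation(prog, "b"), 1.0f);

    /* Draw a quad covering the 1x1 viewport; the shader runs once. */
    glViewport(0, 0, 1, 1);
    glBegin(GL_QUADS);
      glVertex2f(-1.0f, -1.0f);
      glVertex2f( 1.0f, -1.0f);
      glVertex2f( 1.0f,  1.0f);
      glVertex2f(-1.0f,  1.0f);
    glEnd();

    /* Read the pixel back into an ordinary local variable. */
    glReadPixels(0, 0, 1, 1, GL_RGBA, GL_FLOAT, result);
    printf("1 + 1 = %g\n", result[0]);
    return 0;
}

If I understand it right, the CPU side is all OpenGL plumbing; the only
actual arithmetic is the one line in the fragment shader.  Presumably a
real FFT or linear solver replaces the 1x1 texture with large 2-D textures
and many such rendering passes.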

Pete

