Posts by BRANKKO

1) Message boards : BOINC client : Incredible fast computing on GPU (Message 28729)
Posted 13 Nov 2009 by Profile BRANKKO
Post:
I know that my ATI 3850 has
Over 1 teraFLOPS of Compute Power – up to 640 stream processors (320 x 2) on a single card deliver the raw horsepower to attack the most demanding graphics applications

I hate to burst your bubble, but the HD3850 only has 320 stream processors and a peak of 427GFlops. Better check this wiki page.

Even if you have the 3850x2 (which has two GPUs and 2x320 stream processors) it'll do 855GFlops at max.


It seemed suspicious to me too, but BOINC Manager reports:
ATI GPU 0: ATI Radeon HD 3800 (RV670) (CAL version 1.4.427, 256MB, 442 GFLOPS peak)

And it's still pretty nice :)

Anyway, it just looks like two GPUs are better than two CPUs O_o
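
For reference, both the 427 GFLOPS figure from the wiki and the 442 GFLOPS that BOINC reports line up with the usual single-precision peak estimate for an RV670: stream processors × core clock × 2 FLOPs per clock (one multiply-add). A minimal sketch of that arithmetic, assuming a stock clock around 668 MHz for the HD 3850 and the slightly higher ~690 MHz implied by the BOINC report:

    #include <cstdio>

    // Rough single-precision peak estimate for an RV670-class GPU:
    // peak GFLOPS = stream processors * core clock (GHz) * 2 (multiply-add per clock).
    // The 668 MHz and 690 MHz clocks are assumptions for illustration.
    double peak_gflops(int stream_processors, double clock_ghz) {
        return stream_processors * clock_ghz * 2.0;
    }

    int main() {
        std::printf("HD 3850 @ 668 MHz: ~%.1f GFLOPS\n", peak_gflops(320, 0.668)); // ~427, the wiki figure
        std::printf("HD 3850 @ 690 MHz: ~%.1f GFLOPS\n", peak_gflops(320, 0.690)); // ~442, what BOINC reports
        return 0;
    }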
2) Message boards : BOINC Manager : My Wish List - part 3. (Message 28727)
Posted 13 Nov 2009 by Profile BRANKKO
Post:
I wouldn't change too much in the Manager though, as it's "only" a GUI. All those tricky things such as throttling are done by the client.

Thanks for the info. I thought it was all integrated, and I've just downloaded the source for the Manager, so I'm going to look under the hood of the core client.
3) Message boards : BOINC Manager : My Wish List - part 3. (Message 28721)
Posted 13 Nov 2009 by Profile BRANKKO
Post:
If it is possible to make the Manager run a project for X seconds and then wait/sleep for Y seconds, I would like to try to edit the source code and force it to work that way. It seems like a possible solution.
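
A minimal sketch of what such a run-X-seconds / sleep-Y-seconds loop could look like. The suspend_gpu_tasks() and resume_gpu_tasks() functions below are hypothetical stand-ins for whatever mechanism the core client actually uses to pause GPU work, not real BOINC calls:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    // Hypothetical hooks: placeholders for however the core client would
    // actually pause and resume GPU tasks; not part of the real BOINC API.
    void resume_gpu_tasks()  { std::puts("GPU tasks resumed"); }
    void suspend_gpu_tasks() { std::puts("GPU tasks suspended"); }

    // Simple duty-cycle throttle: compute for work_sec, then rest for idle_sec.
    void gpu_throttle_loop(int work_sec, int idle_sec, int cycles) {
        for (int i = 0; i < cycles; ++i) {
            resume_gpu_tasks();
            std::this_thread::sleep_for(std::chrono::seconds(work_sec));
            suspend_gpu_tasks();
            std::this_thread::sleep_for(std::chrono::seconds(idle_sec));
        }
    }

    int main() {
        // Example: 10 seconds of work, 5 seconds of rest, repeated 3 times.
        gpu_throttle_loop(10, 5, 3);
        return 0;
    }

As the earlier reply notes, this logic would belong in the core client rather than the Manager, since the Manager is only a GUI front end.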
4) Message boards : BOINC client : Incredible fast computing on GPU (Message 28713)
Posted 13 Nov 2009 by Profile BRANKKO
Post:
I've been running BOINC for years, and SETI@home classic since 2003.

A few days ago I started my first project that supports ATI GPU computation: Collatz Conjecture.

I use BOINC Manager 6.10.18 (just upgraded from 6.10.17) on Windows 7 x64 Ultimate.

Over the years of computing I've got about ~40,000 work done for SETI and ~35,000 for Climateprediction.net. But after only a few days of running Collatz Conjecture on my ATI GPU, I've got ~50,000 work done for it.

I know that my ATI 3850 has
Over 1 teraFLOPS of Compute Power – up to 640 stream processors (320 x 2) on a single card deliver the raw horsepower to attack the most demanding graphics applications

And the CPU (Intel DualCore D930 @ 2x3GHz) benchmarks say:
1323 MIPS (Whetstone) per CPU
3928 MIPS (Dhrystone) per CPU
but I can hardly believe that the GPU computed more work in a few days than the CPU did over a few years (?!)

Is this possible, or is it maybe some computation bug in Collatz Conjecture?
I've checked some random machines with CUDA support on the net and they also show that much work done per WU/job.
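
A rough back-of-the-envelope check, treating the Whetstone number as per-core floating-point throughput in MFLOPS and using the 442 GFLOPS peak BOINC reports for the GPU; the raw ratio alone suggests the jump in credit is plausible rather than a bug:

    #include <cstdio>

    int main() {
        // CPU: 1323 Whetstone MFLOPS per core, 2 cores (from the benchmark above).
        double cpu_gflops = 1323.0 * 2.0 / 1000.0;  // ~2.6 GFLOPS
        // GPU: peak reported by BOINC for this HD 3850.
        double gpu_gflops = 442.0;

        double ratio = gpu_gflops / cpu_gflops;     // ~167x
        std::printf("GPU/CPU peak ratio: ~%.0fx\n", ratio);
        // At ~167x, a few days of GPU work corresponds to roughly a year or
        // more of CPU work at peak, so the credit totals are not absurd.
        return 0;
    }

Peak numbers overstate what either device sustains in practice, but the gap is large enough that a GPU doing years' worth of CPU credit in a few days is believable.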
5) Message boards : BOINC Manager : My Wish List - part 3. (Message 28712)
Posted 13 Nov 2009 by Profile BRANKKO
Post:
Use at most ______ % GPU time

like
Use at most ______ % CPU time

I have a powerful graphics card but it's very noisy when it's running at full load all the time...

It could be something like: run 10 min and sleep 5 min or so... or just run at xx% GPU usage...
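
For what it's worth, the "run 10 min / sleep 5 min" idea and the "xx% GPU usage" idea are two ways of expressing the same duty cycle; a quick sketch of the mapping, using the numbers from the post:

    #include <cstdio>

    int main() {
        double run_min = 10.0, sleep_min = 5.0;
        // Duty cycle = run time / (run time + sleep time).
        double gpu_time_pct = 100.0 * run_min / (run_min + sleep_min);
        std::printf("run %.0f min, sleep %.0f min -> use at most %.1f%% GPU time\n",
                    run_min, sleep_min, gpu_time_pct);  // 66.7%
        return 0;
    }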
6) Message boards : Projects : SZTAKI Desktop Grid (Message 28711)
Posted 13 Nov 2009 by Profile BRANKKO
Post:
Is there any way to trick a project application in the BOINC Manager into running an x86 app in x64 mode? Is it (hypothetically) possible? If it is, that's an option that could be integrated into the BOINC Manager.
