Building an energy-efficient BOINC Computer / Cluster

Nicnac

Joined: 4 Jan 16
Posts: 3
Germany
Message 66602 - Posted: 4 Jan 2016, 20:31:19 UTC
Last modified: 4 Jan 2016, 21:02:41 UTC

Hey everyone,
This is my first post here, as I am not yet participating in any BOINC project, and I hope this is the right place to ask. I found an old thread on the topic, but I thought it would be worth discussing again, along with a few ideas of my own.

To start off, I have been thinking about building a dedicated machine for BOINC for a long time now, mainly for SETI, Rosetta and other CPU projects. I have a three-year-old gaming rig that still packs a punch, but running it 24/7 long-term is prohibitive for me due to electricity costs.

So the main concern for my project is to make it energy-efficient. My first idea was to have up to 10 Raspberry Pis (v2 Model B) running at all times. The main selling point here is the ultra-low energy consumption (~4 W each) and the relatively powerful quad-core CPU.

Another idea is a setup with maybe two high-end CPUs running at all times.
Again, the main goal of the project is to have a machine that runs for years and gives long-term service, not quick credits for a couple of months.

I am not going to do a GPU build, since that would draw too much power.

What are your thoughts on this? Is a Raspberry Pi setup viable?
Thanks in advance for your input!
ID: 66602
Les Bayliss
Help desk expert

Joined: 25 Nov 05
Posts: 1654
Australia
Message 66603 - Posted: 4 Jan 2016, 22:31:14 UTC - in response to Message 66602.  

Hello Nicnac

The main problem with your idea is that the apps written by projects are specific to the operating system and CPU architecture. Most are built for x86 CPUs, a "complex instruction set computing" (CISC) design, whereas the Raspberry Pi uses an ARM CPU, a "reduced instruction set computing" (RISC) design.
Each OS/CPU combination needs its own application build for each project.

You can see what each project supports on this page: Choosing BOINC projects. Android apps are in a similar boat, so perhaps you could buy lots of cheap Android-based phones/tablets and use them. There seem to be more projects supporting Android than Raspberry Pi.
ID: 66603
Agentb

Joined: 30 May 15
Posts: 265
United Kingdom
Message 66604 - Posted: 4 Jan 2016, 23:28:16 UTC - in response to Message 66602.  
Last modified: 4 Jan 2016, 23:29:37 UTC

Nicnac hi


To start off, I have been thinking about building a dedicated machine for BOINC for a long time now, mainly for SETI, Rosetta and other CPU projects. I have a three-year-old gaming rig that still packs a punch, but running it 24/7 long-term is prohibitive for me due to electricity costs.

Good to think about this, especially if electricity is expensive.

So the main concern for my project is to make it energy-efficient. My first idea was to have up to 10 Raspberry Pis (v2 Model B) running at all times. The main selling point here is the ultra-low energy consumption (~4 W each) and the relatively powerful quad-core CPU.

Another idea is a setup with maybe two high-end CPUs running at all times.
Again, the main goal of the project is to have a machine that runs for years and gives long-term service, not quick credits for a couple of months.

I am not going to do a GPU build, since that would draw too much power.

What are your thoughts on this? Is a Raspberry Pi setup viable?
Thanks in advance for your input!


GPU computing will normally be more energy-efficient than any CPU. However, you are quite limited in the applications you can run; see Les's post above to find them.

Some examples:
At Einstein@Home a Raspberry Pi 2 generates about 400 RAC/day for 4 watts, so 100 RAC/watt. Note these are not GPU apps.

A lower-power GPU such as the GTX 750 Ti adds 60 W to, say, a 140 W PC, and I would expect it to generate 23,000 RAC/day, so 23000/200 = 115 RAC/watt.

I run a large AMD HD 7990 which draws 480 W but generates 340,000 RAC daily, so about 708 RAC/watt.
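
For anyone wanting to re-run the comparison with their own numbers, here is the arithmetic as a minimal Python sketch (all RAC and wattage figures are the rough estimates above, not measured values):

    # RAC-per-watt for the three setups estimated above.
    setups = {
        "Raspberry Pi 2 (Einstein@Home)": (400, 4),      # (RAC/day, watts)
        "GTX 750 Ti in a 140 W PC":       (23000, 200),  # 140 W base + 60 W GPU
        "AMD HD 7990 rig":                (340000, 480),
    }
    for name, (rac, watts) in setups.items():
        print(f"{name}: {rac / watts:.0f} RAC/watt")     # 100, 115, 708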

Maybe start up a project on the gaming rig to get a feel for the projects and what works.

What CPU/GPU combination do you have? Perhaps don't run it 24/7? There are several ways to automate shutdown and start-up.
ID: 66604
noderaser

Joined: 2 Jan 14
Posts: 276
United States
Message 66605 - Posted: 5 Jan 2016, 2:42:46 UTC

Why not get started by running BOINC on the computer you already use, while you're using it?
My Detailed BOINC Stats
ID: 66605
Nicnac

Joined: 4 Jan 16
Posts: 3
Germany
Message 66610 - Posted: 5 Jan 2016, 15:04:06 UTC
Last modified: 5 Jan 2016, 15:06:37 UTC

Thanks for the quick replies!

@Les Bayliss Thanks for the info on the CPU architecture differences; I wasn't aware of that!

@Agentb Thanks for those numbers! The RAC/watt figures are very interesting, and together with Les Bayliss's info, a low-power CPU or GPU setup seems much better than any Raspberry Pi cluster.

To get started I could certainly use my gaming rig, and I might as well do that, but with a 900 W power supply it is extremely wasteful (I would pay ~2.2k € a year). Also, running a system only at certain times of day is not what I am looking for, since I want a system that can run continuously so the credits add up over time.
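
For the curious, the ~2.2k € assumes the full 900 W is drawn 24/7 (a worst case, since a PSU rating is a maximum, not a typical draw) at roughly 0.28 €/kWh; the tariff is an assumption, not a quoted rate:

    # Worst-case cost if the PSU's full 900 W were drawn around the clock.
    # The 0.28 EUR/kWh tariff is an assumption for illustration.
    watts = 900
    kwh_per_year = watts / 1000 * 24 * 365         # ~7884 kWh
    print(f"~{kwh_per_year * 0.28:.0f} EUR/year")  # ~2208 EUR, i.e. ~2.2k EUR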

@noderaser I am not currently using my big PC (it's not even at my house right now) but an older MacBook, which can hardly handle running BOINC in the background. Also, the main idea is to have a dedicated system for BOINC projects.

--

I calculated that the maximum wattage my build should have is 100-150 W. Could you recommend a BOINC-specialized build (possibly including a GPU) for that? Or is such a low-power setup not viable?

Thanks again!
ID: 66610
Richard Haselgrove
Volunteer tester
Help desk expert

Joined: 5 Oct 06
Posts: 5080
United Kingdom
Message 66615 - Posted: 5 Jan 2016, 18:10:14 UTC - in response to Message 66610.  

I calculated that the maximum wattage my build should have is 100-150 W. Could you recommend a BOINC-specialized build (possibly including a GPU) for that? Or is such a low-power setup not viable?

Thanks again!

I'm currently running a Haswell (fourth-generation) Intel i5-based computer, which draws under 100 watts at the wall socket (currently fluctuating between 90 and 95 W, varying with the project).

That's with all four cores and the integrated Intel HD 4600 GPU all crunching for BOINC. The power draw excludes the monitor, but you'd have that switched off most of the time anyway.

The machine is actually an off-the-shelf Dell OptiPlex 9020 with a mechanical hard drive; power consumption could be reduced further with SSD storage, though at added purchase cost.

The HD 4600 alone is earning over 10,000 RAC at Einstein - host 8864187. Hope that gives you an idea of what's possible.
ID: 66615
Nicnac

Joined: 4 Jan 16
Posts: 3
Germany
Message 66618 - Posted: 5 Jan 2016, 19:58:27 UTC

Thanks Richard, that helps a lot! I remember having an old PC like that sitting around somewhere, too!
ID: 66618
noderaser

Joined: 2 Jan 14
Posts: 276
United States
Message 66623 - Posted: 6 Jan 2016, 6:15:49 UTC - in response to Message 66610.  
Last modified: 6 Jan 2016, 6:16:01 UTC

@noderaser I am not currently using my big PC (it's not even at my house right now) but an older MacBook, which can hardly handle running BOINC in the background. Also, the main idea is to have a dedicated system for BOINC projects.

Oh come on now, I've got a G4 PowerBook that still makes contributions, though not as many as it used to--mostly because I don't have the time to work on the hobby it's earmarked for anymore.

You've got to start somewhere.
My Detailed BOINC Stats
ID: 66623
Bert

Joined: 24 Aug 15
Posts: 16
United States
Message 66626 - Posted: 6 Jan 2016, 9:28:28 UTC - in response to Message 66623.  

Downclocking your CPU just 20% will likely make a heck of a difference in power consumption. I've hooked up a watt meter (Kill A Watt) and looked at various processors. Some are quite extreme: going from 3.8 GHz to 3 GHz can approximately halve the power consumption. So if you set your desktop processor to a moderate speed, about 1/4 to 1/3 below the rated speed, your perf/watt will likely improve greatly.
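
The reason a modest downclock helps so much: dynamic power scales roughly with frequency times voltage squared, and since voltage is usually lowered along with frequency, power falls roughly with the cube of the clock. A minimal sketch of that rule of thumb (the exponent is an assumption; real chips vary):

    # Rule of thumb: P ~ f * V^2, and V roughly tracks f, so P ~ f^3.
    def relative_power(f_new, f_old, exponent=3.0):
        return (f_new / f_old) ** exponent

    print(relative_power(3.0, 3.8))       # ~0.49: roughly half the power
    print(relative_power(3.0, 3.8, 2.0))  # ~0.62 if voltage can't be lowered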

I tried to downclock a GPU card, but sadly the driver software (ATI Catalyst) only lets people overclock--underclocking is for some reason not allowed.
ID: 66626
Richie

Joined: 2 Jul 14
Posts: 186
Finland
Message 66627 - Posted: 6 Jan 2016, 13:40:14 UTC

You can also regulate power consumption by setting low maximum temperatures with TThrottle: http://efmer.com/b/?q=tthrottle

Here's a calculator for power consumption: http://outervision.com/power-supply-calculator
* That calculator might give questionable numbers for total consumption (I wouldn't take the total as absolute truth). But if you first select your CPU at stock speed and fill in the other boxes, you can compare how much the power consumption changes when you enter some "overclocking settings". That change might be interesting to look at.
ID: 66627
Coleslaw

Joined: 23 Feb 12
Posts: 198
United States
Message 66634 - Posted: 7 Jan 2016, 2:54:17 UTC - in response to Message 66604.  
Last modified: 7 Jan 2016, 3:20:09 UTC

Nicnac hi


To start off, I have been thinking about building a dedicated machine for BOINC for a long time now, mainly for SETI, Rosetta and other CPU projects. I have a three-year-old gaming rig that still packs a punch, but running it 24/7 long-term is prohibitive for me due to electricity costs.

Good to think about this, especially if electricity is expensive.

So the main concern for my project is to make it energy-efficient. My first idea was to have up to 10 Raspberry Pis (v2 Model B) running at all times. The main selling point here is the ultra-low energy consumption (~4 W each) and the relatively powerful quad-core CPU.

Another idea is a setup with maybe two high-end CPUs running at all times.
Again, the main goal of the project is to have a machine that runs for years and gives long-term service, not quick credits for a couple of months.

I am not going to do a GPU build, since that would draw too much power.

What are your thoughts on this? Is a Raspberry Pi setup viable?
Thanks in advance for your input!


GPU computing will normally be more energy-efficient than any CPU. However, you are quite limited in the applications you can run; see Les's post above to find them.

Some examples:
At Einstein@Home a Raspberry Pi 2 generates about 400 RAC/day for 4 watts, so 100 RAC/watt. Note these are not GPU apps.

A lower-power GPU such as the GTX 750 Ti adds 60 W to, say, a 140 W PC, and I would expect it to generate 23,000 RAC/day, so 23000/200 = 115 RAC/watt.

I run a large AMD HD 7990 which draws 480 W but generates 340,000 RAC daily, so about 708 RAC/watt.

Maybe start up a project on the gaming rig to get a feel for the projects and what works.

What CPU/GPU combination do you have? Perhaps don't run it 24/7? There are several ways to automate shutdown and start-up.


The big problem here, though, is that you have to look at all the projects you intend to support. They all have different scoring, so a Raspberry Pi 2 will score differently at Einstein than it would at Collatz. I would also recommend going to some projects and getting real numbers rather than guesstimates. Another thing is that TDP numbers don't really equal real power draw, so it would be best to ask in the forums of the various projects for what users are actually reporting.

I have several cell phones with various ARM chips. An LG Optimus Fuel has one ARMv7 rev 3 (v7l) processor (2 cores) @ 1.2 GHz. It will produce about half the points of a Genuine Intel(R) CPU T1400 @ 1.73 GHz (2 cores). The TDP of the T1400 is 27 watts, and if you are basing it on TDP, the rest of the system will cause it to draw even more. The cell phone pulls less than 4 watts, last I checked. So even two phones at 8 watts outperform that older laptop CPU. The numbers I compared a long time ago were at Enigma or WCG (can't remember now). However, I have not done comparisons with other chips or GPUs at other projects, as I will run the devices regardless. These phones were purchased for $10 each, brand new. Throw the up-front costs into your overall expense consideration and things look a little different. Raspberry Pis are a bit more costly than cheap phones, and so are most GPUs. It could take years for a GPU's efficiency to make up the difference over buying a bunch of ARM devices. The advantage of CPUs in general will be project selection.
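
As a rough points-per-watt sketch of that phone-versus-laptop comparison (the figures are the loose estimates above; "points" is an arbitrary daily number, only the ratios matter):

    # Phone vs. old laptop CPU, using the rough figures above.
    laptop_points, laptop_watts = 100.0, 27.0  # T1400 at its 27 W TDP (a floor)
    phone_points, phone_watts = 50.0, 4.0      # one phone: about half the points
    print(laptop_points / laptop_watts)        # ~3.7 points/watt
    print(phone_points / phone_watts)          # 12.5 points/watt
    # Two phones: the laptop's output at 8 W, under a third of its
    # best-case power draw.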
ID: 66634
Bert

Joined: 24 Aug 15
Posts: 16
United States
Message 66636 - Posted: 7 Jan 2016, 9:22:37 UTC - in response to Message 66634.  

You get the best efficiency at full load but a lower CPU clock rate.

Run your old laptop at 1 GHz rather than the full 1.73 GHz and the energy usage will drop dramatically. You can feel this in the exhaust heat, see it in the battery life, or measure it with a watt meter at the power outlet.

Furthermore, many numerical calculations (like weather modeling or large physics simulations) are limited by the speed of the system memory. You can run your CPU at 4 GHz and much of its time will be spent waiting for memory. SMT/hyperthreading works well to reclaim those wasted cycles, and so does downclocking the CPU to, say, 2 GHz: relative to the CPU, the memory will now seem twice as fast.
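
To see why, model a task as a fixed number of compute cycles plus a fixed memory-stall time: the compute part shrinks as the clock rises, the stall time doesn't, yet power climbs steeply with the clock. A toy sketch under those assumed conditions (cycle count, stall time and the cube-law power exponent are all illustrative):

    # Toy model: time = cycles/f + fixed memory stalls; power ~ f^3.
    def energy_per_task(f_ghz, cycles=4e9, stall_s=0.5, f_ref=4.0):
        time_s = cycles / (f_ghz * 1e9) + stall_s
        return (f_ghz / f_ref) ** 3 * time_s   # relative energy units

    for f in (4.0, 3.0, 2.0):
        print(f"{f} GHz -> {energy_per_task(f):.2f}")  # 1.50, 0.77, 0.31
    # Lower clocks finish each task on less energy, just more slowly.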
ID: 66636
Coleslaw

Joined: 23 Feb 12
Posts: 198
United States
Message 66641 - Posted: 7 Jan 2016, 17:39:21 UTC - in response to Message 66636.  

You get the best efficiency at full load but a lower CPU clock rate.

Run your old laptop at 1 GHz rather than the full 1.73 GHz and the energy usage will drop dramatically. You can feel this in the exhaust heat, see it in the battery life, or measure it with a watt meter at the power outlet.

Furthermore, many numerical calculations (like weather modeling or large physics simulations) are limited by the speed of the system memory. You can run your CPU at 4 GHz and much of its time will be spent waiting for memory. SMT/hyperthreading works well to reclaim those wasted cycles, and so does downclocking the CPU to, say, 2 GHz: relative to the CPU, the memory will now seem twice as fast.


These are all true. However, my example was meant to show that there are leaps-and-bounds differences that can easily be compared. My ARM example was less than 4 watts for the entire device. The laptop will draw much more, as you have to factor the HDD, RAM, display, etc. into that efficiency. The chip is rather old now, but then again, so is the ARM chip in question. Quad-core ARM devices are now dipping into the $10-$20 price range. So, if you compared against some of the modern Atom CPUs, you would probably get a better apples-to-apples comparison.

So, to return to your last statement: there certainly are things you can do to improve efficiency. The first step is to decide which projects you are supporting, so you know what hardware options you have in the first place...
ID: 66641
Richard Haselgrove
Volunteer tester
Help desk expert

Joined: 5 Oct 06
Posts: 5080
United Kingdom
Message 66822 - Posted: 13 Jan 2016, 18:38:02 UTC

It would be a good idea for anyone seeking energy efficiency to consider all those little peripheral extras as well.

I've just replaced a (probably 10-year-old, maybe more) 16-port 'Fast' (10/100) Ethernet switch with an equivalent Gigabit version. The ambient noise level in my crunchery dropped dramatically; the fan in that old switch was louder than anything else. And the new one, working at 10 times the speed, has a maximum stated power draw of 0.2 A, against 0.5 A for the old one.

I'll probably have a panic attack because of the silence, when I go downstairs tomorrow morning and forget what I've done!
ID: 66822
Coleslaw

Joined: 23 Feb 12
Posts: 198
United States
Message 66827 - Posted: 14 Jan 2016, 4:11:20 UTC - in response to Message 66822.  

It would be a good idea for anyone seeking energy efficiency to consider all those little peripheral extras as well.

I've just replaced a (probably 10-year-old, maybe more) 16-port 'Fast' (10/100) Ethernet switch with an equivalent Gigabit version. The ambient noise level in my crunchery dropped dramatically; the fan in that old switch was louder than anything else. And the new one, working at 10 times the speed, has a maximum stated power draw of 0.2 A, against 0.5 A for the old one.

I'll probably have a panic attack because of the silence, when I go downstairs tomorrow morning and forget what I've done!


The noise would be the biggest selling point for me, next to the gigabit speed bump. Upgrading all the peripherals probably wouldn't justify the up-front cost, as it would likely take many years for the energy savings to pay it off. Kinda like how you can pick up 2P boards with two AMD G34 chips (24 cores) and heatsinks for under $100 USD these days. Sure, you can pick up more energy-efficient Intel offerings, but that up-front cost saving can buy an awful lot of electricity. That is why I typically don't recommend going out and buying an Android ARM phARM. The up-front investment just takes too long to justify the long-term efficiency argument. So, I do it in small increments and use the devices in ways where they actually pay for themselves. Win-win. I think I'm up to somewhere around 50 ARM cores of various generations and speeds.
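
The payback arithmetic is easy to run for any upgrade: purchase price divided by yearly electricity savings. A minimal sketch using the switch swap above; the 12 V adapter voltage, $40 switch price and $0.15/kWh tariff are all assumptions for illustration, not numbers anyone reported:

    # Payback period = upgrade price / yearly electricity savings.
    volts, amps_saved = 12.0, 0.5 - 0.2           # assumed adapter voltage
    watts_saved = volts * amps_saved              # 3.6 W
    usd_per_year = watts_saved / 1000 * 24 * 365 * 0.15  # ~4.7 USD/year
    print(f"{40.0 / usd_per_year:.1f} years to pay back")  # ~8.5 years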
ID: 66827
Bert

Joined: 24 Aug 15
Posts: 16
United States
Message 67162 - Posted: 25 Jan 2016, 13:29:32 UTC - in response to Message 66602.  

A Raspberry Pi cluster would be hard to beat for those BOINC projects that do support ARM well (and assuming those projects do OK with the amounts of RAM and storage that the Pis have).

The biggest thing is that it's an investment that takes money and time. During the heating season, there is little to no such cost when using the laptops and computers already around the house.

I often run desktops in the coldest months of winter, mostly just for BOINC. My laptop is my main computer, but it also runs BOINC, and in the winter it's on all day (and some nights too).

For anyone on Linux with a quad-core or better: I use this program to run BOINC while I use the computer or am away. The newest version detects idle states and frees up more cores for BOINC when idle is detected. Restricting the clock rates to about 75% of the maximum or less greatly helps with power consumption (as well as that annoying fan noise). When the user is active, there's an asymmetric frequency setup so that single- and dual-threaded performance is optimized and BOINC tasks are kept from hogging memory bandwidth.

http://ee.freeshell.org/Slow/

The program can be modified for higher core counts (I have tested previous versions on hex- and octa-core machines) without too much difficulty; just edit a few lines, mostly or all in the #define section.

Try it out; feedback, bug reports, suggestions are appreciated.
ID: 67162

