Thread 'Android/ARM efficiencies (RasPi/Cellphone) vs CPU/GPU'

ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94046 - Posted: 6 Dec 2019, 5:52:53 UTC
Last modified: 6 Dec 2019, 5:56:03 UTC

Has any study been done on the efficiency of Android phones and single-board computers like the Raspberry Pi?

An old article says they're about 1/10th as fast as a powerful PC CPU, but also use less than 1/10th the power to run.
A newer article says they're hundreds of times slower than GPU projects.
I'm simply trying to find out whether it's feasible to replace a 4-core i5 CPU running at 3 GHz and 65 W with an 8-core ARM at 1.7-2 GHz and 2.5 W.
In the process I'm also trying to see whether a Raspberry Pi makes for efficient crunching.

The TDP of a CPU like that i5 is about 65 W.
I've read online somewhere that the Snapdragon 835 has 200-500 GFLOPS of processing power, but I suspect that figure is for CPU+GPU combined.
I know most cellphones aren't going to break any performance records, but considering that some can crunch on the CPU at below 2.5 W at the wall, does this make for an interesting project to revive old phones (and/or Raspberry Pis)?

I'm sure phones won't get anywhere near the throughput of GPUs.
But could an array of old Snapdragon phones potentially be more efficient than a Core i5/i7?
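
For scale, here's the energy side of that question as a minimal Python sketch; it's pure arithmetic on the wattages above and says nothing about the work actually done:

[code]
# Yearly energy draw of the two candidates above (arithmetic only).
HOURS_PER_YEAR = 24 * 365

for name, watts in [("4-core i5 @ 3 GHz", 65.0),
                    ("8-core ARM @ 1.7-2 GHz", 2.5)]:
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    print(f"{name}: {kwh_per_year:.0f} kWh/year")

# ~570 vs ~22 kWh/year: the ARM chip only needs more than 1/26th of the
# i5's throughput to come out ahead per watt.
[/code]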
ID: 94046
Dave
Help desk expert
Joined: 28 Jun 10
Posts: 2691
United Kingdom
Message 94047 - Posted: 6 Dec 2019, 8:22:43 UTC - in response to Message 94046.  

But could an array of old Snapdragon phones potentially be more efficient than a Core i5/i7?


Possibly, but it will also depend on which projects you crunch. Some have no applications at all for ARM hardware, Qualcomm or otherwise.
ID: 94047
Jord
Volunteer tester
Help desk expert
Joined: 29 Aug 05
Posts: 15560
Netherlands
Message 94048 - Posted: 6 Dec 2019, 8:41:05 UTC - in response to Message 94046.  

The 2.5 W an ARM device draws partly goes to topping off the battery; once the battery is fully charged, power usage will go down, even if the CPU is heavily used.
It'll then use up to 5 kWh per year. Those numbers are based on values from years ago, though, so YMMV.

Things to know:
1. an 8-core big.LITTLE CPU consists of 4 LITTLE cores and 4 big cores. The 4 big cores are used by Android only; BOINC cannot use them. If you set an 8-core CPU to use 8 cores in BOINC, you load 2 tasks per LITTLE core, slowing calculations down. Better to set BOINC to use 4 cores only. The same goes for all 6-core (use 3), 8-core (use 4), 10-core (use 5), etc. ARM CPUs.
2. Android 10 is at this moment not supported by the projects' science applications. All tasks will error out.
3. the GPU in Android devices isn't used at this moment. It is detected, but no project has applications for it, while the developers ponder whether it's even a good idea to use it, due to the extra heat it would generate.
4. consider heat. Running an Android device 24/7 will heat up its CPU and battery considerably. This can cause the battery to bloat and potentially explode. Do use active cooling to keep temperatures down. My devices all lie in the path of 120 mm/140 mm fans spinning at 5 V (half speed).
5. mind that you get 64-bit CPUs running 64-bit Android, or 32-bit CPUs running 32-bit Android; not 64-bit CPUs running 32-bit Android, because those lack the 32-bit compatibility libraries that all Unix/Linux/Android OSes require to run 32-bit instructions on a 64-bit CPU. Tasks run on a 64-bit CPU with 32-bit Android will fail. To check whether your Android is 32-bit or 64-bit, install an app like [url=https://play.google.com/store/apps/details?id=com.antutu.ABenchMark&hl=en]AnTuTu Benchmarks[/url] (tap My Device; next to Android it shows the version plus bitness) or AIDA64 (under Android it shows what kernel version you have; for 64-bit that's aarch64), or see the adb sketch below.
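
If you have adb set up, a sketch along these lines reads the same bitness information straight from the phone's system properties (assuming a single connected device; 'adb shell getprop' and the ro.product.cpu.* properties are standard Android, the Python wrapper itself is just illustrative):

[code]
# Query a connected phone's CPU ABIs over adb to see whether the
# installed Android build is 32- or 64-bit.
import subprocess

def getprop(prop: str) -> str:
    # 'adb shell getprop <name>' prints one Android system property.
    out = subprocess.run(["adb", "shell", "getprop", prop],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()

abilist = getprop("ro.product.cpu.abilist")  # e.g. "arm64-v8a,armeabi-v7a,armeabi"
primary = getprop("ro.product.cpu.abi")      # ABI the OS build actually uses
print("Supported ABIs:", abilist)
print("64-bit Android" if primary.startswith("arm64") else "32-bit Android")
[/code]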

Is it worth it to run ARM devices over an i5/i7? In my opinion, yes: for far less power usage per year, three devices run a Seti RAC of ~1,200.
As long as you keep an eye on them (mine regularly lose the wifi connection and need a reboot) and keep them cool enough, they'll just run work for projects as a nice addition to your RAC. You may have to spend some extra bucks to keep them cooled, but then they can survive hot summers as well.
ID: 94048
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94049 - Posted: 6 Dec 2019, 9:11:10 UTC
Last modified: 6 Dec 2019, 9:51:20 UTC

I have a few older phones.
One is a Snapdragon 820 cellphone, which already has a ballooned battery and a pushed-out screen. It's no good as a personal cellphone anymore, but it would probably still do for crunching.
A second phone was one of those $15 throwaway phones, using a quad-core 1.4 GHz Qualcomm Snapdragon MSM8917 CPU.
A third one uses an older Krait 300 series CPU: a dual-core running at 1.7 GHz. A bit of a slow dog, but it uses very little power (like you mention, 2.5 W while charging the battery, probably just 1-1.5 W tops crunching CPU only).

The latter two are slow phones, but perhaps they can still be useful.

In a hypothetical scenario where I had 20 phones like these (running at <65 W total), how would they compare to a GPU running at 65 W (e.g. a GTX 1650)?

Edit: After a little research, I found that the Raspberry Pis aren't particularly good candidates.
They draw quite a lot of power and aren't really efficient for this kind of workload,
probably due to the process they're built on (the Pi 3B on 40 nm using 6.4 W; the Pi 4 on 28 nm using 7.6 W under load).

Add to this the fact that both the Pi 3 and Pi 4 need active cooling, whereas a cell phone running at 2-3 W has a body large enough to spread the CPU heat passively.
They're just not there compared to older cellphones (e.g. Snapdragon 820 = 14 nm, SD 835 = 10 nm, using between 2 and 5 W at the wall).

The Krait 300 in this case would be much slower than the Pi 4: both are made on 28 nm, but the Krait 300 is only dual-core versus quad-core on the Pi.
However, two Krait 300 phones consume about as much as one Raspberry Pi 4 (thanks to LPDDR memory, versus regular DDR on the Pi 4).

Another con for the Pi seems to be that the SD card will see quite an amount of wear from BOINC projects,
which is not really a good idea. Most cellphones have (much faster, much more reliable) eMMC or internal flash storage.

This answers some of my questions.
The Pi rates at a lower efficiency than most (modern) cellphones.
The problem with the Snapdragon 835 and up is that they're mostly found in Android devices that have been upgraded to Android 10 (which is incompatible with BOINC at the moment).
BOINC runs fine on Android 9, but I see many more projects compatible with Android 4 through Android 7.
Phones running those versions are usually dual-core phones with low RAM...

What I'd really want to see is how a batch of 10 of these would do at crunching, compared to a full desktop PC...
ID: 94049
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94055 - Posted: 6 Dec 2019, 11:51:01 UTC

Another thing to consider is that GPUs are fed by CPUs,
and both need extra cooling (case/CPU fans), on top of PSU efficiency losses.
GPUs are often rated at their TDP, but a GPU that a task pushes to 80 W might actually consume more than 120 W at the wall.
Two to four GPUs are more efficient, but could reach up to 450 W at the wall, resulting in roughly 40% higher power consumption than power-reading software (GPU-Z, nvidia-smi, etc.) advertises.

Meanwhile, older cell phones use up to 2.5 W at the wall, and newer ones up to 5 W continuous (after the charger).

To compete with that 450 W setup, I'd need 100 phones at 4.5 W each. Their charging circuits probably get more efficient the larger the USB hub is.
With 4 cores per phone running at ~2 GHz, am I correct in assuming that they'd only be about as fast as a single GTX 1650 GPU (896 cores at ~1935-1950 MHz)?
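
As a crude sanity check on that assumption, here's aggregate core count times clock for both sides in a small Python sketch. This is not a real performance metric (a GPU shader does far less per clock than a CPU core, and floating-point throughput differs wildly per architecture), and the GTX 1650 board power is an assumed ballpark:

[code]
# Aggregate core-GHz of the hypothetical phone array vs a GTX 1650.
PHONES = 100
phone_cores, phone_ghz, phone_watts = 4, 2.0, 4.5
gpu_cores, gpu_ghz, gpu_watts = 896, 1.94, 75  # ~75 W board power assumed

phone_total = PHONES * phone_cores * phone_ghz  # 800 core-GHz at 450 W
gpu_total = gpu_cores * gpu_ghz                 # ~1738 core-GHz at 75 W

print(f"Phone array: {phone_total:.0f} core-GHz @ {PHONES * phone_watts:.0f} W")
print(f"GTX 1650:    {gpu_total:.0f} core-GHz @ {gpu_watts} W")
[/code]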
ID: 94055
Dave
Help desk expert
Joined: 28 Jun 10
Posts: 2691
United Kingdom
Message 94058 - Posted: 6 Dec 2019, 16:42:00 UTC

It also depends on where your power comes from. During the summer, the majority of our power in daylight hours comes from the roof. For my laptop, I could fairly easily add another panel, some battery storage, and a 12 V input PSU, and run it completely from that panel nine months of the year, only needing mains for the 1.5 months either side of the winter solstice.
ID: 94058
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94066 - Posted: 7 Dec 2019, 8:01:55 UTC - in response to Message 94058.  

Not sure. Yes, phones would use way less power. But would you run a phone 24/7 for a year on solar power, knowing that in 1 hour you could do more work with a GPU (even if you had to pay for that hour at 200 W, roughly $0.02 of electricity)?

I think that from a performance point of view, phones probably won't make much sense, unless you use the newer Snapdragon 8cx and 7c chipsets built on 7 nm and 8 nm processes (which currently still cost too much money to justify trying to save a few pennies on electricity)...
ID: 94066
boboviz
Help desk expert
Joined: 12 Feb 11
Posts: 419
Italy
Message 94070 - Posted: 7 Dec 2019, 10:37:17 UTC - in response to Message 94046.  

Has any study been done on the efficiency of Android phones and single-board computers like the Raspberry Pi?

With a Raspberry Pi, I think it's better to crunch BOINC on a Linux distro.
ID: 94070
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94071 - Posted: 7 Dec 2019, 11:19:29 UTC
Last modified: 7 Dec 2019, 11:20:32 UTC

Too bad Android phones don't support GPU crunching, because Snapdragon GPUs (as well as Kirin and Exynos ones) are about as powerful as, or more powerful than, Intel UHD 605 IGPs, and use considerably less power (2-4 W vs 10-15 W for the Intel IGPs); and the slower Intel IGPs are supported...
ID: 94071
Jord
Volunteer tester
Help desk expert
Joined: 29 Aug 05
Posts: 15560
Netherlands
Message 94072 - Posted: 7 Dec 2019, 11:52:55 UTC - in response to Message 94071.  

Too bad Android phones don't support GPU crunching
Misinformation. The phones do. The GPUs do. BOINC does. It's just that no project at the moment of writing has a GPU app for Android; that doesn't mean it's not supported. Go ask your favorite project whether they can make an OpenCL GPU application.

Sat Dec 07 12:51:39 GMT+01:00 2019||OpenCL: Mali-T830 0: Mali-T830 (driver version 1.2, device version OpenCL 1.2 v1.r28p0-01rel0.4d1a5d64f7660449be8b4b1d8d26b173, 3732MB, 3732MB available, 0 GFLOPS peak)

ID: 94072
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94075 - Posted: 7 Dec 2019, 22:04:37 UTC
Last modified: 7 Dec 2019, 22:49:30 UTC

I've done some preliminary tests with my first old cellphone, a Samsung Galaxy Mega 6.3.
It uses a Qualcomm Snapdragon 400 MSM8930 CPU, 2 cores at 1.7 GHz, built on a 28 nm process.
This phone is dog slow and consumes 2.8 W during use.
While crunching, for some reason the battery slowly depletes, yet the phone refuses to charge beyond 0.5 A. I've tried multiple chargers and cables; while some cheaper cables did cap at 0.5 A, this phone refused to draw more than 2.8 W at the wall even with more expensive cables.
This makes this older (2013) phone continuously run out of battery and stall the projects: unsuited for crunching!
After a day of Einstein it got around 400 credits, plus about 33% finished on 2 tasks (my RTX 2060 gets a roughly estimated ~200-300k credits per day, while my 20-thread 1.9 GHz Xeon gets an estimated ~300 credits per thread per day, though at 100 W at the wall (70 W TDP for the CPU)).
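
Putting those day-one numbers on a rough credits-per-watt footing (a Python sketch; credit rates differ wildly between projects and apps, and the RTX 2060 wall power is my assumption):

[code]
# Rough credits/day per watt from the figures quoted above.
hosts = {
    "Galaxy Mega (2 cores)": (400, 2.8),      # ~400 credits/day at 2.8 W
    "RTX 2060":              (250_000, 160),  # midpoint of 200-300k; ~160 W assumed
    "20-thread Xeon":        (20 * 300, 100), # ~300 credits/thread/day at 100 W
}
for name, (credits_per_day, watts) in hosts.items():
    print(f"{name:24s} {credits_per_day / watts:8.0f} credits/day per watt")
[/code]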

So while this phone is definitely not the best, it does beat CPU crunching at comparable lithography (the 22 nm Xeon), efficiency-wise.

I've also plugged in an old (2017) Chinese phone, a Doogee Mix, using a MediaTek Helio P25 MT6757T. It has an octa-core CPU (running 8 tasks simultaneously in BOINC), built on a 16 nm process.
This phone has been significantly faster, running at 4.3 W instead.
It's gathered about 400 Einstein points and 300 Moo Wrapper points, plus 8 tasks between 20 and 90% finished.
This second phone is very similar in performance to a Snapdragon 820 or 835.
The phone did get pretty hot, though, and needed some form of active cooling.
I found no better way than to just place it above my PC's PSU, where the fan continuously pulls fresh air past it; the temps were acceptable.
This one did not suffer from battery drain while crunching 8 threads at 100% CPU.

I would probably recommend phones with a Snapdragon 600/800-series chipset or greater, built on 16 nm or smaller, for crunching (especially if crunching on the IGP ever arrives; make sure the phone has sufficient cooling in that scenario).
Anything above 5 W on a phone is too much heat (unless you can run it without the battery, leave the case open, or actively cool it).

Another thought: the Snapdragon chipsets of 2017 and up, running 6 threads or more, are not only more efficient than Intel GMA or HD IGP graphics (6 to 12 execution units at 600-1050 MHz), they beat them in performance too, with last year's cellphones beating even Intel UHD IGPs.

The Intel Iris Plus graphics (48-64 cores at 900-1100 MHz; 500-800 GFLOPS) remain unbeaten, though, even by modern (2019) Snapdragon cellphones running CPU only.
ID: 94075
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94077 - Posted: 8 Dec 2019, 0:36:46 UTC

I've just tested an older-model Android stick PC (black, with the 2 yellow antennas), built around a 40 nm Rockchip SoC with Mali-400 graphics, and the device just overheated and reset.
This is your classic Android 2.3 / 4.0 / 4.1 device from before 2016.
Not recommended for BOINC.

I also just purchased a newer-model Chinese stick PC with an Amlogic S905Y2: quad-core, 1.5 GHz, 12 nm.
It should make for a much better cruncher than the Raspberry Pi 3B+, and while slower, it will probably be more energy-efficient than the Raspberry Pi 4.
Results on that will come in soon.
ID: 94077
Jord
Volunteer tester
Help desk expert
Joined: 29 Aug 05
Posts: 15560
Netherlands
Message 94098 - Posted: 8 Dec 2019, 15:40:27 UTC - in response to Message 94075.  

I've also plugged in an old (2017) Chinese phone, a Doogee Mix, using a MediaTek Helio P25 MT6757T. It has an octa-core CPU (running 8 tasks simultaneously in BOINC)
You aren't running 8 tasks simultaneously. You're running 4 tasks alternating with another 4, because BOINC can only use the slowest 4 cores of an 8-core CPU. The fastest cores are used only by Android itself, because of the power-hungry usage they require, not by any of the programs/apps running on Android. So you're loading 2 tasks per CPU core, slowing calculations down enormously.

See https://en.wikipedia.org/wiki/ARM_big.LITTLE for more information on these cores.
BOINC development is looking into whether they can use the big cores as well, but all BOINC development is slow.
ID: 94098
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94105 - Posted: 8 Dec 2019, 19:14:24 UTC - in response to Message 94098.  

I've also plugged in an old (2017) Chinese phone, a Doogee Mix, using a MediaTek Helio P25 MT6757T. It has an octa-core CPU (running 8 tasks simultaneously in BOINC)
You aren't running 8 tasks simultaneously. You're running 4 tasks alternating with another 4, because BOINC can only use the slowest 4 cores of an 8-core CPU. The fastest cores are used only by Android itself, because of the power-hungry usage they require, not by any of the programs/apps running on Android. So you're loading 2 tasks per CPU core, slowing calculations down enormously.

See https://en.wikipedia.org/wiki/ARM_big.LITTLE for more information on these cores.
BOINC development is looking into whether they can use the big cores as well, but all BOINC development is slow.

Is there any more info on this?
ID: 94105
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94111 - Posted: 8 Dec 2019, 22:46:05 UTC
Last modified: 8 Dec 2019, 22:49:42 UTC

OK, yes, indeed.

On the Doogee Mix, with its octa-core CPU, running 4 to 8 tasks caused the big cores' frequency to drop from 2.2 GHz to 1.2 GHz, and the low-power cores to go from 1.6 GHz to 900 MHz.

4 threads, however, showed 86% CPU utilization, and 5 showed 100%, with the cores clocking down even further.
I presume that keeping the phone extremely well cooled (like in a freezer) might allow it to run more than 4 tasks at a time,
but at normal temperatures the trade-off between 8 threads and 4 threads is large.
Anything over 5 tasks is definitely not recommended,
and 5 versus 4 tasks is up for debate.
Between 3 and 4 tasks, however, it made more sense to run 3 tasks at a time:
3 tasks ramped the high-power cores up to 2.2 GHz. CPU utilization fluctuated between 75 and 86%, but the gain of running 3 cores at nearly double the frequency beat running 4 tasks at the lower frequency.

This throttling also lowers the points score on these phones considerably.
Still, if 3 tasks can run at 2.2 GHz (and 7 W), these phones compete with modern PCs running 8 threads at 4 GHz at 80 W:
such a desktop CPU is around 5x faster while consuming 10x more power, and adding a 12-core iGPU (like Intel 8th-gen UHD, Iris, or higher) could really bridge the gap between the two!
ID: 94111
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94202 - Posted: 11 Dec 2019, 20:11:12 UTC
Last modified: 11 Dec 2019, 20:32:06 UTC

I just tested an LG Realm, an older 2014 model, equipped with a dual-core 1.2 GHz, 28 nm Snapdragon 400 chipset.
While very slow, it crunches at a mere 2.5 W (0.5 A at the wall; 5 W while charging). I'm wondering whether this device is competitive with CPU crunching, considering the low power usage.
Comparatively, a 4-core/8-thread CPU with a 3.6 GHz clock, a 35 W TDP, and a 14 nm design (drawing 50-65 W at the wall) will perform about 8x faster while consuming 20-26x more power.
So 28 nm cellphone crunching may perhaps be competitive with CPU crunching...

I also found a way to make my old Samsung Mega work:
by setting the CPU to 90% (time utilization; core usage stays set to 100%), the device is able to keep the battery charged, at the cost of 10% slower performance.
ID: 94202
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94204 - Posted: 11 Dec 2019, 20:37:37 UTC

I also just tested a Windows tablet, equipped with an Intel Atom x5-Z8300 CPU (4 cores, 1.4 GHz, 1.84 GHz turbo) and an Intel HD graphics IGP with 12 shaders at 500 MHz, running a full version of Windows 10.
The CPU seems to be on 14 nm, but there's no way the iGPU utilizes that same lithography.
Its speed should be about half that of an Intel Celeron G4900 using the UHD 610 (12 shaders at 1050 MHz), given proper cooling.

The ($100 Chinese) tablet immediately starts thermal throttling at 80°C with just the IGP running (which drops to 800 MHz), and occasionally blacks out or freezes.
This has to do with the GPU drawing too much power under load.
I disabled the GPU for CPU-only projects, but with 4 CPU threads the device would still thermal throttle.
Once set to 3 threads, it would crunch at 1.6 GHz.
I'm still waiting on final power figures; it currently draws 12.5 W at the wall.

While energy efficient, Windows/x86 CPUs aren't really as efficient as ARM.
ID: 94204
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94217 - Posted: 13 Dec 2019, 6:19:01 UTC

The Chinese tablet crunches between 6.5 and 7.5 W at the wall on only 3 threads (at 1.4 GHz).

I opened up the Android stick PC (the one with the yellow antennas) and noticed they had placed a rubber heat insulator on top of the CPU.
I removed it and thermal-taped the back and front of the chip to the plastic casing.
The stick now runs fine at 4 threads at 1 GHz (3.5 W, 99% CPU utilization). That rose to 4.1-4.2 W with an HDMI monitor and mouse plugged in.
This is very efficient, considering the Chinese Windows tablet above consumes twice the power while only just beating it in performance.

3 threads on the stick PC showed 86% CPU utilization and a clock speed between 1 GHz and 1.2 GHz, and 2 threads ran at 1.4 GHz; both of which are slower overall than 4 tasks at 1 GHz.
So I ran 4 threads. I guess the 'half the threads on Android' rule isn't always true; at least not for media-player devices.

The stick PC has no thermal sensors, so I think the 'thermal throttling' is hard-coded to the load.
In any case, it now runs fine from a USB 3.0 port on the PC without restarting itself every so often.
ID: 94217
ProDigit
Joined: 8 Nov 19
Posts: 718
United States
Message 94255 - Posted: 14 Dec 2019, 11:45:12 UTC
Last modified: 14 Dec 2019, 11:54:09 UTC

I've done some theoretical calculations. The numbers are approximate, but they should somewhat reflect performance in BOINC.

An approximate figure for the work done in BOINC is the number of CPU threads multiplied by the core frequency.
That gives a rough peak workload; divide it by watts to get efficiency. (There's a small sketch of this metric at the end of this post.)

For instance:
I have 2 Xeon E5-2650L v2 CPUs. Each has 10 cores / 20 threads and runs at a constant boost frequency of 1.8 GHz.
You could say each Xeon does 20 threads x 1.8 GHz = 36 GHz of work.
Each Xeon takes 70 W off the wall, and the whole system comes to 110 W.
Meaning the 2 systems perform 72 GHz at 220 W (a fictive number, but it lets us compare systems approximately).

If I take a Ryzen 9 3950X, running 32 threads at 4.5 GHz with a 125 W TDP (using 175 W at the wall),
we get 144 GHz of work done at 175 W: nearly double the work, for less power.

If I instead build an array of media players using the Amlogic S905 chipset, the highest performer in its price class, to match the power draw above:
each player consumes 3 W at the wall while crunching 4 threads at 1.8 GHz, thus doing 7.2 GHz of work at 3 W.
With 60 of these units, we'd get 432 GHz of work done at a mere 180 W.
We'd probably have to add 5-10 W of fans to cool the boards, which lands us at roughly the same power draw, and per watt we're still about 3x faster than a modern Ryzen!

If I now compare this to a generic $160 GPU with 1280 cores at 1.8 GHz consuming 125 W, fed by a CPU that uses 35 W, the system should draw about 200 W at the wall.
Using only 1 GPU, that system would perform 2304 GHz of work at 200 W.

The mid-range GPU, per watt, is:
- More than 5x faster than the media-player array;
- More than 15x faster than the Ryzen CPU;
- More than 30x faster than the dual-Xeon setup.

While being:
- Slightly cheaper than 2 Xeon systems;
- About 5x cheaper than a Ryzen 9;
- About 7.5x cheaper than 60 media players.
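
For anyone who wants to plug in their own hardware, here's that metric as a minimal Python sketch (work = threads x GHz, efficiency = work per wall watt; it ignores IPC, SIMD width, and memory, so it only orders systems very roughly):

[code]
# "Work" = threads x clock (GHz); efficiency = work / wall watts.
# Numbers are the ones from this post.
systems = {
    "2x Xeon E5-2650L v2":  (2 * 20, 1.8, 220),
    "Ryzen 9 (32 threads)": (32,     4.5, 175),
    "60x Amlogic S905":     (60 * 4, 1.8, 180),
    "Mid-range GPU":        (1280,   1.8, 200),  # shader cores stand in for threads
}
for name, (threads, ghz, watts) in systems.items():
    work = threads * ghz
    print(f"{name:22s} {work:7.1f} GHz-of-work  {work / watts:5.2f} per watt")
[/code]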
ID: 94255
