Linux only uses Nvidia GPU, not Intel IGP?

ProDigit

Joined: 8 Nov 19
Posts: 718
United States
Message 93820 - Posted: 22 Nov 2019, 3:51:23 UTC

Is there a way to be able to use both Intel's IGP (of a Celeron G4900 series CPU), and an Nvidia GTX/RTX GPU?
ID: 93820
Joseph Stateson
Volunteer tester
Joined: 27 Jun 08
Posts: 641
United States
Message 93823 - Posted: 22 Nov 2019, 5:25:36 UTC - in response to Message 93820.  
Last modified: 22 Nov 2019, 5:42:03 UTC

Is there a way to be able to use both Intel's IGP (of a Celeron G4900 series CPU), and an Nvidia GTX/RTX GPU?


The official release of Intel OpenCL for Linux is 18.1:
https://registrationcenter.intel.com/en/products/download/3599/

It is called 18.1, but it is only good for Ubuntu 16.
Here is what an Intel guru has to say about 18.04:
https://software.intel.com/en-us/forums/opencl/topic/797941


Out of curiosity, can you click on the download and see if the file actually ends in .tgz? Unaccountably, I got something else and had to rename it to .tgz.
OTOH you might not want to register for that download, as it is a PITA and it is no good anyway.

I got Intel OpenCL installed on my Ubuntu 18.04 box by using the non-official release on GitHub. My motherboard is an H110-BTC with an i7-6700, and it has (so far) a GTX 1060 and a P106-100 running the Linux special SETI app. Since that special app runs under the anonymous platform, it seems I cannot download the Intel app that SETI has available. That is a guess. I have another thought**.


The following works; it installs Intel OpenCL and is recognized by BOINC:

https://github.com/intel/compute-runtime/releases

I put in the 19.45.14764 package, the latest. All I had to do was copy and paste all of the wget lines at once, run sudo dpkg -i *.deb,
and reboot, and BOINC shows the following (a rough sketch of the procedure is below the log):
1			11/21/2019 10:33:51 PM	Starting BOINC client version 7.16.3 for x86_64-pc-linux-gnu	
2			11/21/2019 10:33:51 PM	log flags: file_xfer, sched_ops, task	
3			11/21/2019 10:33:51 PM	Libraries: libcurl/7.58.0 OpenSSL/1.1.1 zlib/1.2.11 libidn2/2.0.4 libpsl/0.19.1 (+libidn2/2.0.4) nghttp2/1.30.0 librtmp/2.3	
4			11/21/2019 10:33:51 PM	Data directory: /var/lib/boinc-client	
5			11/21/2019 10:33:57 PM	CUDA: NVIDIA GPU 0: P106-100 (driver version 440.26, CUDA version 10.2, compute capability 6.1, 4096MB, 3974MB available, 4374 GFLOPS peak)	
6			11/21/2019 10:33:57 PM	CUDA: NVIDIA GPU 1: GeForce GTX 1060 3GB (driver version 440.26, CUDA version 10.2, compute capability 6.1, 3019MB, 2945MB available, 3936 GFLOPS peak)	
7			11/21/2019 10:33:57 PM	OpenCL: NVIDIA GPU 0: P106-100 (driver version 440.26, device version OpenCL 1.2 CUDA, 6081MB, 3974MB available, 4374 GFLOPS peak)	
8			11/21/2019 10:33:57 PM	OpenCL: NVIDIA GPU 1: GeForce GTX 1060 3GB (driver version 440.26, device version OpenCL 1.2 CUDA, 3019MB, 2945MB available, 3936 GFLOPS peak)	
9			11/21/2019 10:33:57 PM	OpenCL: Intel GPU 0: Intel(R) Gen9 HD Graphics NEO (driver version 19.45.14764, device version OpenCL 2.1 NEO, 2908MB, 2908MB available, 100 GFLOPS peak)	
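For anyone wanting to repeat this, it boils down to downloading the .deb files listed on that release page and installing them together. A rough sketch (the exact intel-gmmlib / intel-igc file names change from release to release, so copy the wget lines from the 19.45.14764 release page itself rather than from here):

# make a working directory and fetch the release files
mkdir neo && cd neo
# paste the wget lines for intel-gmmlib, intel-igc-core, intel-igc-opencl
# and intel-opencl from the GitHub release page here
sudo dpkg -i *.deb   # install everything in one go
sudo reboot          # restart so BOINC re-detects the GPUs at startup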


Since I was unable to get any SETI Intel work units, I tried Einstein.
802	Einstein@Home	11/21/2019 11:14:19 PM	Requesting new tasks for Intel GPU	
803	Einstein@Home	11/21/2019 11:14:21 PM	Scheduler request completed: got 0 new tasks	


Didn't work either.

*** Looking at both SETI and Einstein, both projects claim I have a 9th-generation Intel HD Graphics GPU. In actuality the i7-6700 is only 6th generation and only 100 GFLOPS. That is roughly 40 times less than the Nvidia. Not worth trying to figure out why it is not working.

Both SETI and Einstein show the following:
CPU type: 
GenuineIntel Intel(R) Core(TM) i7-6700 CPU @ 3.40GHz [Family 6 Model 94 Stepping 3] 
Number of processors:  8 
Coprocessors:  [2] NVIDIA P106-100 (4095MB) driver: 440.26
INTEL Intel(R) Gen9 HD Graphics NEO (2908MB)  
Operating system:  Linux Ubuntu Ubuntu 18.04.3 LTS [5.0.0-36-generic|libc 2.27 (Ubuntu GLIBC 2.27-3ubuntu1)] 
BOINC client version:  7.16.3


You have a much newer CPU; maybe it is better than 100 GFLOPS, which is pathetic compared to an Nvidia or ATI.

If you get that OpenCL installed, please post the GFLOPS. I am curious about the latest Intel chips' potential.
ID: 93823
ProDigit

Joined: 8 Nov 19
Posts: 718
United States
Message 93857 - Posted: 22 Nov 2019, 21:41:20 UTC

Like you say, the GFLOPS are low.
However, it's in my portable PC, so I don't have to pay for the electricity.
It runs with an RTX 2060 and a Celeron CPU right now. The RTX 2060 is only loaded halfway.
The CPU is even slower.
From what I could read, the Celeron has 12 GPU cores running at 600-700 MHz,
and the CPU has 2 cores running at 3.1 GHz.
When GPU crunching, only 10% of a CPU core is utilized, leaving the remainder of the core for another project.
In this case, a lot more work can be done, even if the GPU isn't that efficient. The CPU isn't either.
ID: 93857
Joseph Stateson
Volunteer tester
Joined: 27 Jun 08
Posts: 641
United States
Message 93864 - Posted: 22 Nov 2019, 22:53:59 UTC - in response to Message 93857.  
Last modified: 22 Nov 2019, 23:30:09 UTC

Like you say, the GFLOPS are low.
However, it's in my portable PC, so I don't have to pay for the electricity.
It runs with an RTX 2060 and a Celeron CPU right now. The RTX 2060 is only loaded halfway.
The CPU is even slower.
From what I could read, the Celeron has 12 GPU cores running at 600-700 MHz,
and the CPU has 2 cores running at 3.1 GHz.
When GPU crunching, only 10% of a CPU core is utilized, leaving the remainder of the core for another project.
In this case, a lot more work can be done, even if the GPU isn't that efficient. The CPU isn't either.


The problem is not always the electricity. I was lucky to get Microsoft to replace my Surface Pro for free, even after the warranty period. I had run the Einstein Intel app on it for a while and stopped when I noticed the screen bulging out. The problem was that the charger had to be on to keep the app running, and that constant charging while the app ran overheated the battery. I will not be running any apps like that again.

ID: 93864
ProDigit

Joined: 8 Nov 19
Posts: 718
United States
Message 93867 - Posted: 23 Nov 2019, 6:50:32 UTC
Last modified: 23 Nov 2019, 7:47:14 UTC

So I switched over to Intel, and now have 2 screens showing in Linux.
lspci | grep VGA

Shows both graphics cards.
00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 610
01:00.0 VGA compatible controller: NVIDIA Corporation TU106 [GeForce RTX 2060 Rev. A] (rev a1)


I've installed the Intel OpenCL drivers, even the OpenGL drivers.
Boinc still doesn't see the Intel GPU.
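As a sanity check, the OpenCL runtime can be verified outside of BOINC with clinfo (available in the Ubuntu repos); if the Intel platform doesn't show up there, the problem is in the driver install rather than in BOINC:
sudo apt install clinfo
clinfo | grep -E 'Platform Name|Device Name'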

So I edited my apps_config.xml in the Einstein@Home folder, as well as the cc_config.xml in the main BOINC folder, as follows:
apps_config.xml
<app_config>
   <app>
      <name>Einstein@Home</name>
      <max_concurrent>2</max_concurrent>
      <gpu_versions>
          <gpu_usage>.25</gpu_usage>
      </gpu_versions>
    </app>
    <app_version>
       <app_name>Einstein@Home</app_name>
       <ngpus>2</ngpus>
   </app_version>
   <project_max_concurrent>3</project_max_concurrent>
</app_config>


cc_config.xml
<cc_config>
  <log_flags>
    <task>1</task>
    <file_xfer>1</file_xfer>
    <sched_ops>1</sched_ops>
  </log_flags>
  <options>
  <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>


This was an attempt to run 2 instances of Einstein on my Nvidia GPU, as well as to enable all GPUs for BOINC.

However, intel_gpu_top tells me the Intel GPU is still inactive, and BOINC doesn't seem to find it either.

I just checked my log files, and saw that Boinc DOES see my Intel IGP.
A log entry reads:
Sat 23 Nov 2019 01:43:53 AM EST |  | OpenCL: Intel GPU 0: Intel(R) Gen9 HD Graphics NEO (driver version 19.46.14807, device version OpenCL 2.1 NEO, 6277MB, 6277MB available, 101 GFLOPS peak) <<== This is probably at 350Mhz idle frequency
Sat 23 Nov 2019 02:15:57 AM EST | Moo! Wrapper | Not requesting tasks: don't need (CPU: job cache full; NVIDIA GPU: job cache full; Intel GPU: )

The rest of the log shows that no project has an Intel GPU job available, which is probably why it's not being engaged.


Regarding its GPU performance, since they're doing single precision:
The Celeron G4900 CPU is rated at ~100 GFLOPS of processing power.
The GPU is rated at ~211 GFLOPS of single precision under good cooling, according to this site.
Once the CPU gets hotter, the GPU's boost frequency gets dialed back, and performance drops.
At 65 watts, and with the stock Intel cooler, you can expect ~65°C tops under load.
With an aftermarket cooler, 55°C and full performance.
So working in tandem with the GPU, the G4900 could potentially do 300 GFLOPS.

This result also falls in line with my own rough calculation:
12 GPU cores at 1 GHz peak = 12 GHz.
The CPU is a dual core at 3.1 GHz peak, or 6.2 GHz.
Roughly half the FLOPS of its IGP.
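(For reference, the usual rule of thumb for Gen9 graphics is EUs × 16 FP32 ops per clock × clock speed; the UHD 610 has 12 EUs, so at its roughly 1.05 GHz boost that works out to 12 × 16 × 1.05 ≈ 200 GFLOPS, in the same ballpark as the ~211 GFLOPS figure above.)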

The RTX 2060 does 8 TFLOPS (8000 GFLOPS), or even 10 TFLOPS under continuous boost frequency.
However, most of the time less than 50% of that performance is tapped into when crunching.
ID: 93867
ProDigit

Joined: 8 Nov 19
Posts: 718
United States
Message 93868 - Posted: 23 Nov 2019, 7:46:15 UTC
Last modified: 23 Nov 2019, 7:48:46 UTC

WORKS!!!!
OK, so the Collatz Conjecture project had some jobs for my IGP, and now all of them are working successfully!

Procedure for Lubuntu:
1- Install Nvidia GPU, run from Nvidia GPU, install Lubuntu, install Nvidia drivers.
2- Once everything works fine, go into the BIOS and change the graphics solution from PEG to IGP. In my case, the IGP had a measly max of 64MB of VRAM.
3- Install Intel drivers (openCL, like mentioned above).
4- Lubuntu will boot on Intel IGP, but then switch over to desktop mode on Nvidia. Intel IGP display shows text boot lines.
5- I did a lot of things I think aren't really necessary.
If the command:
lspci | grep VGA

shows the IGP in Linux, and the drivers are installed, Boinc should be able to use it.
In my case it shows:
00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 610
01:00.0 VGA compatible controller: NVIDIA Corporation TU106 [GeForce RTX 2060 Rev. A] (rev a1)
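If BOINC was already running before the driver install, restarting the client and checking its startup messages is a quick way to confirm detection. A sketch, assuming the stock Ubuntu boinc-client package (boinccmd reads the GUI RPC password from gui_rpc_auth.cfg in the current directory, so run it from the data directory or pass --passwd):
sudo systemctl restart boinc-client
cd /var/lib/boinc-client
boinccmd --get_messages | grep -i 'OpenCL: Intel'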


intel_gpu_top shows:
intel-gpu-top - 1049/1049 MHz;    0% RC6; 10.44 Watts;       63 irqs/s

      IMC reads:     2543 MiB/s
     IMC writes:      535 MiB/s

          ENGINE      BUSY                                  MI_SEMA MI_WAIT
     Render/3D/0   99.63% |██████████████████████████████▉|      0%      0%
       Blitter/0    0.00% |                               |      0%      0%
         Video/0    0.00% |                               |      0%      0%
  VideoEnhance/0    0.00% |                               |      0%      0%




LM-Sensors show (sensors)
coretemp-isa-0000
Adapter: ISA adapter
Package id 0:  +67.0°C  (high = +80.0°C, crit = +100.0°C)
Core 0:        +66.0°C  (high = +80.0°C, crit = +100.0°C)
Core 1:        +67.0°C  (high = +80.0°C, crit = +100.0°C)

acpitz-acpi-0
Adapter: ACPI interface
temp1:        +27.8°C  (crit = +119.0°C)

pch_cannonlake-virtual-0
Adapter: Virtual device
temp1:        +53.0°C

The CPU temp with the stock intel cooler is keeping up (Fan speed at 80%, temps hovering between 66 and 70C, up from 60C and 25% fan speed on CPU only).
Every 2.0s: nvidia-smi                                                    Port: Sat Nov 23 02:47:38 2019

Sat Nov 23 02:47:38 2019
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 440.26       Driver Version: 440.26       CUDA Version: 10.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce RTX 2060    Off  | 00000000:01:00.0  On |                  N/A |
| 74%   71C    P2   170W / 170W |    178MiB /  5934MiB |     96%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0       777      G   /usr/lib/xorg/Xorg                            56MiB |
|    0      4054      C   ...1.40_x86_64-pc-linux-gnu__opencl_nvidia   109MiB |
+-----------------------------------------------------------------------------+



nvidia-smi is also working well with this setting;
it's finally showing 100% power usage (170 W out of the 170 W cap, as shown above).

Thanks guys!
I'm ecstatic!
ID: 93868
Jord
Volunteer tester
Help desk expert
Joined: 29 Aug 05
Posts: 15477
Netherlands
Message 93876 - Posted: 23 Nov 2019, 10:14:56 UTC - in response to Message 93867.  
Last modified: 23 Nov 2019, 10:17:32 UTC

So I edited my apps_config.xml in the Einstein@Home folder, as well as the cc_config.xml in the main BOINC folder, as follows:
apps_config.xml
<app_config>
   <app>
      <name>Einstein@Home</name>
      <max_concurrent>2</max_concurrent>
      <gpu_versions>
          <gpu_usage>.25</gpu_usage>
      </gpu_versions>
    </app>
    <app_version>
       <app_name>Einstein@Home</app_name>
       <ngpus>2</ngpus>
   </app_version>
   <project_max_concurrent>3</project_max_concurrent>
</app_config>
Please read the documentation on how to use the app_config.xml file: https://boinc.berkeley.edu/wiki/Client_configuration#Project-level_configuration. The name and app_name are NOT Einstein@Home, but the application name found in the client_state.xml file for those tasks.
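A quick way to list the exact application names the client knows about (assuming the stock Ubuntu data directory; sudo may be needed to read the file) is:
grep '<app_name>' /var/lib/boinc-client/client_state.xml | sort -u
One of those names goes into both <name> and <app_name>.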
cc_config.xml
<cc_config>
  <log_flags>
    <task>1</task>
    <file_xfer>1</file_xfer>
    <sched_ops>1</sched_ops>
  </log_flags>
  <options>
  <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>

use_all_gpus is only needed when you have two GPUs of the same brand but different models. It's not needed when you have two GPUs of different brands; those will be used automatically.
ID: 93876
ProDigit

Joined: 8 Nov 19
Posts: 718
United States
Message 94268 - Posted: 15 Dec 2019, 0:03:06 UTC
Last modified: 15 Dec 2019, 0:05:45 UTC

Is the Intel Atom N3650 supported for IGP?

I've followed the same guide to install the newest Intel OpenCL drivers, but it doesn't work on my laptop.
I've read elsewhere that people were able to make the Intel Atom CPU work...


I did:
wget https://github.com/intel/compute-runtime/releases/download/19.49.15055/intel-gmmlib_19.3.4_amd64.deb
wget https://github.com/intel/compute-runtime/releases/download/19.49.15055/intel-igc-core_1.0.3032_amd64.deb
wget https://github.com/intel/compute-runtime/releases/download/19.49.15055/intel-igc-opencl_1.0.3032_amd64.deb
wget https://github.com/intel/compute-runtime/releases/download/19.49.15055/intel-opencl_19.49.15055_amd64.deb
wget https://github.com/intel/compute-runtime/releases/download/19.49.15055/intel-ocloc_19.49.15055_amd64.deb

then
sudo dpkg -i *.deb
sudo add-apt-repository ppa:intel-opencl/intel-opencl
sudo apt-get update
sudo apt-get install intel-opencl-icd


The Intel tools work, but BOINC doesn't see the GPU.

From the log:
Sat 14 Dec 2019 06:58:45 PM EST |  | Data directory: /var/lib/boinc-client
Sat 14 Dec 2019 06:58:45 PM EST |  | No usable GPUs found
Sat 14 Dec 2019 06:58:46 PM EST |  | [libc detection] gathered: 2.30, Ubuntu GLIBC 2.30-0ubuntu2
Sat 14 Dec 2019 06:58:46 PM EST |  | Host name: hb-pc
Sat 14 Dec 2019 06:58:46 PM EST |  | Processor: 2 GenuineIntel Intel(R) Celeron(R) CPU N3060 @ 1.60GHz [Family 6 Model 76 Stepping 4]
Sat 14 Dec 2019 06:58:46 PM EST |  | Processor features: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology tsc_reliable nonstop_tsc cpuid aperfmperf tsc_known_freq pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 cx16 xtpr pdcm sse4_1 sse4_2 movbe popcnt tsc_deadline_timer aes rdrand lahf_lm 3dnowprefetch epb pti ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid tsc_adjust smep erms dtherm ida arat md_clear
Sat 14 Dec 2019 06:58:46 PM EST |  | OS: Linux Ubuntu: Ubuntu 19.10 [5.3.0-24-generic|libc 2.30 (Ubuntu GLIBC 2.30-0ubuntu2)]
Sat 14 Dec 2019 06:58:46 PM EST |  | Memory: 3.75 GB physical, 0 bytes virtual
Sat 14 Dec 2019 06:58:46 PM EST |  | Disk: 28.46 GB total, 4.63 GB free
Sat 14 Dec 2019 06:58:46 PM EST |  | Local time is UTC -5 hours
Sat 14 Dec 2019 06:58:46 PM EST |  | Config: GUI RPCs allowed from:
Sat 14 Dec 2019 06:58:46 PM EST |  | General prefs: from http://www.gpugrid.net/ (last modified 30-Nov-2019 04:49:00)
Sat 14 Dec 2019 06:58:46 PM EST |  | Host location: none
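For what it's worth, one quick thing to check on a machine like this is whether the Neo runtime actually registered an OpenCL ICD; if it didn't, BOINC has nothing to enumerate no matter what the driver does. A sketch, assuming the standard ocl-icd loader layout (the .icd file name can vary between releases):
ls /etc/OpenCL/vendors/
cat /etc/OpenCL/vendors/intel.icd
clinfo | grep 'Number of devices'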
ID: 94268
Les Bayliss
Help desk expert

Joined: 25 Nov 05
Posts: 1654
Australia
Message 94269 - Posted: 15 Dec 2019, 0:20:58 UTC - in response to Message 94268.  

In GPU computing there is this, not far from the top:
Intel GPUs: Currently Ivy Bridge and Haswell are the only Intel CPUs with an OpenCL capable Intel GPU, however future embedded GPUs may also support OpenCL. You will need to install Intel graphics drivers to enable OpenCL support. It's also needed to add a monitor or VGA dummy-plug before the Intel GPU is recognized.
ID: 94269
Joseph Stateson
Volunteer tester
Joined: 27 Jun 08
Posts: 641
United States
Message 94283 - Posted: 15 Dec 2019, 17:51:16 UTC - in response to Message 94268.  

Is the Intel Atom N3650 supported for IGP?



I don't think it is. Looking here:
https://en.wikipedia.org/wiki/List_of_Intel_Atom_microprocessors

There is no mention of the N3650 together with OpenCL or OpenGL.

However, I did see my old Bay Trail N2808 listed.

Three years ago I was experimenting with a Liva-X; I attached an ASIC miner to it and also installed the Intel OpenCL. The N2808 is listed as having OpenCL support in that wiki, but the newer (?) N3650 is not.

I stopped using OpenCL on the N2808, as the video driver with that library was not as good as the driver without it, plus it was overheating. Bitcoin Utopia was an interesting crunch while it lasted, but it totally screwed up my "credits" to the point where no other projects would even show up on a statistical graph because their numbers were so small. It should have been banned.
ID: 94283
ProDigit

Joined: 8 Nov 19
Posts: 718
United States
Message 94308 - Posted: 17 Dec 2019, 11:44:27 UTC - in response to Message 94283.  
Last modified: 17 Dec 2019, 11:54:42 UTC

Is the Intel Atom N3650 supported for IGP?



I don't think it is. Looking here:
https://en.wikipedia.org/wiki/List_of_Intel_Atom_microprocessors

There is no mention of the N3650 together with OpenCL or OpenGL.

However, I did see my old Bay Trail N2808 listed.

Three years ago I was experimenting with a Liva-X; I attached an ASIC miner to it and also installed the Intel OpenCL. The N2808 is listed as having OpenCL support in that wiki, but the newer (?) N3650 is not.

I stopped using OpenCL on the N2808, as the video driver with that library was not as good as the driver without it, plus it was overheating. Bitcoin Utopia was an interesting crunch while it lasted, but it totally screwed up my "credits" to the point where no other projects would even show up on a statistical graph because their numbers were so small. It should have been banned.

Thank you.

Yes, I ask because the N3650 does support OpenCL, unlike my older Core i5-7400's IGP.
intel_gpu_top works fine, but I guess, like you say, the N3650 is not supported by BOINC (which makes no sense).
The IGP directly drives my laptop's LCD screen, but I'll try plugging a dummy plug into the HDMI output next time. Though I would say that it's an unusual requirement, considering there is already an LCD display being fed by it.

My HP Stream laptop has excellent cooling, and the CPU never deviates from its max boost frequency when crunching.
That is very unusual, because most laptops throttle at full CPU load.
Even my N2600 runs hotter than the N3650, so I had hoped the N3650 could do GPU crunching too, considering how cool it runs.
ID: 94308
Dirk Broer

Joined: 19 Jun 10
Posts: 17
British Virgin Islands
Message 94436 - Posted: 21 Dec 2019, 15:05:00 UTC - in response to Message 94269.  

In GPU computing there is this, not far from the top:
Intel GPUs: Currently Ivy Bridge and Haswell are the only Intel CPUs with an OpenCL capable Intel GPU, however future embedded GPUs may also support OpenCL. You will need to install Intel graphics drivers to enable OpenCL support. It's also needed to add a monitor or VGA dummy-plug before the Intel GPU is recognized.


I am crunching SETI Beta using the Intel UHD 605 IGP of my Pentium J5005. You need to install the Intel® Graphics Compute Runtime for OpenCL™ in order to be able to crunch; it won't run with the standard Beignet driver.

ID: 94436
ProDigit

Joined: 8 Nov 19
Posts: 718
United States
Message 96195 - Posted: 29 Feb 2020, 11:23:50 UTC - in response to Message 94436.  

In GPU computing there is this, not far from the top:
Intel GPUs: Currently Ivy Bridge and Haswell are the only Intel CPUs with an OpenCL capable Intel GPU, however future embedded GPUs may also support OpenCL. You will need to install Intel graphics drivers to enable OpenCL support. It's also needed to add a monitor or VGA dummy-plug before the Intel GPU is recognized.


I am crunching SETI Beta using the Intel UHD 605 IGP of my Pentium J5005. You need to install the Intel® Graphics Compute Runtime for OpenCL™ in order to be able to crunch; it won't run with the standard Beignet driver.


No, there are very specific GPUs that work, and ones that don't.
Even though the N3650 is faster and newer than the Celeron G processor (it is one generation behind), the Celeron works fine and the Atom does not.

The Pentium N4000/N5000 series do work.

So it's a matter of compatibility at the lower end of the spectrum.
Not sure why Intel left out the Atom processors, as a lot of them are headed for the landfill right now, and they perform better than the few Celeron G processors they've released.
ID: 96195
ProDigit

Joined: 8 Nov 19
Posts: 718
United States
Message 96453 - Posted: 5 Mar 2020, 22:16:26 UTC
Last modified: 5 Mar 2020, 22:19:08 UTC

So I redid the Pentium Gold G5600 (a dual core with Hyper-Threading, so four threads).
Here's what made it work (not sure if I did some redundant work):

cd neo
wget https://github.com/intel/compute-runtime/releases/download/20.08.15750/intel-gmmlib_19.4.1_amd64.deb
wget https://github.com/intel/compute-runtime/releases/download/20.08.15750/intel-igc-core_1.0.3390_amd64.deb
wget https://github.com/intel/compute-runtime/releases/download/20.08.15750/intel-igc-opencl_1.0.3390_amd64.deb
wget https://github.com/intel/compute-runtime/releases/download/20.08.15750/intel-opencl_20.08.15750_amd64.deb
wget https://github.com/intel/compute-runtime/releases/download/20.08.15750/intel-ocloc_20.08.15750_amd64.deb
wget https://github.com/intel/compute-runtime/releases/download/20.08.15750/ww08.sum
sha256sum -c ww08.sum
sudo dpkg -i *.deb
sudo apt install intel-gpu-tools



I could have instead just done:
sudo apt install intel-opencl intel-gpu-tools
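Either way, a quick check that the runtime packages actually landed, and which driver version BOINC will report (assuming clinfo is installed):
dpkg -l | grep -E 'intel-(gmmlib|igc|opencl)'
clinfo | grep -i 'driver version'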

The Pentium Gold is crunching fine!
I love intel_gpu_top; it's like nvidia-smi for Intel!
                   render busy: 100%: ████████████████████                   render space: 1926/16384

                          task  percent busy
                            CS: 100%: ████████████████████    vert fetch: 0 (0/sec)
                           TSG:  98%: ███████████████████▋    prim fetch: 0 (0/sec)
                           VFE:  98%: ███████████████████▋ VS invocations: 0 (0/sec)
                           GAM:  65%: █████████████        GS invocations: 0 (0/sec)
                          GAFS:   2%: ▌                         GS prims: 0 (0/sec)
                           TDG:   1%: ▎                    CL invocations: 0 (0/sec)
                            SF:   0%:                           CL prims: 0 (0/sec)
                                                           PS invocations: 0 (0/sec)
                                                           PS depth pass: 0 (0/sec)
ID: 96453
