Linux GPU usage without X?

Message boards : Questions and problems : Linux GPU usage without X?
Wigyori

Joined: 21 Jul 10
Posts: 2
Hungary
Message 33880 - Posted: 21 Jul 2010, 11:51:48 UTC
Last modified: 21 Jul 2010, 11:52:04 UTC

Hi,

I have a couple servers around with some spare resources and available PCIe slots. However, since these are built for serving websites and so, I wouldn't want to install an X server onto them just to use the GPU.

Is it possible to run the BOINC client and GPU-using projects without an X server?

Thanks,
ZH
ID: 33880
Profile Gundolf Jahn

Joined: 20 Dec 07
Posts: 1069
Germany
Message 33883 - Posted: 21 Jul 2010, 14:43:26 UTC - in response to Message 33880.  

Try the command-line interface, boinccmd.
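[Editor's note: a minimal sketch of driving the client headlessly with a few standard boinccmd options. The project URL and account key below are placeholders, not real values, and the availability guard is only there so the sketch degrades gracefully on machines without BOINC installed.]

```shell
# Hypothetical headless workflow; URL and key are placeholders.
if command -v boinccmd >/dev/null 2>&1; then
    boinccmd --project_attach http://example.org/project/ YOUR_ACCOUNT_KEY
    boinccmd --get_state      # client state, including detected coprocessors
    boinccmd --get_tasks      # tasks currently in progress
else
    echo "boinccmd not found; install the BOINC client package first"
fi
```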

Regards,
Gundolf
Computers aren't everything in life. (Just a little joke.)
ID: 33883
Wigyori

Joined: 21 Jul 10
Posts: 2
Hungary
Message 33896 - Posted: 22 Jul 2010, 8:38:10 UTC - in response to Message 33883.  

Thanks Gundolf. I'm aware of boinccmd and already use it on these boxes. Let me rephrase the question.

Has anyone experimented with loading the nvidia kernel module without installing an X server, so that BOINC can detect the GPU and make use of it?

Thanks,
ZH
ID: 33896
whynot

Joined: 8 May 10
Posts: 89
Ukraine
Message 33931 - Posted: 24 Jul 2010, 12:11:02 UTC

Has anyone experimented with loading the nvidia kernel module without installing an X server, so that BOINC can detect the GPU and make use of it?


(Honestly, I can't provide any firsthand guidance, since I don't have any Nvidia chips on hand; this is just a suggestion.) If I understand your problem, let me rephrase it a bit. One would normally need X as a way to confirm that the driver is not just loaded but hooked up correctly (by observing effects such as unusual display resolutions and refresh rates becoming available). I would suggest just using BOINC itself for that check. Like this:


    Deploy Nvidia's driver (you should know how to do that);

    (probably) verify it is actually loaded (I would look in the /proc filesystem, though I don't know exactly what to look for);

    Make sure GPU usage isn't switched off in the preferences of a project that provides Linux/Nvidia applications (I may have the exact terminology wrong);

    Then run (or restart) 'boinc'.



If all these pieces fit, boinc will fetch work. If not, then something failed or was simply wrong; in that case, unload the Nvidia module (you don't need it otherwise) and try again in six months or so.
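[Editor's note: the "verify it is loaded" step above can be sketched as a read-only shell check. The /proc path and the device-node major/minor numbers below are the NVIDIA driver's conventional ones; the mknod commands are shown only as comments, since creating the nodes requires root and is normally done by X, which is absent here.]

```shell
#!/bin/sh
# Read-only check: is the NVIDIA kernel driver usable on a headless box?

if [ -d /proc/driver/nvidia ]; then
    echo "nvidia kernel module is loaded; driver version:"
    cat /proc/driver/nvidia/version
else
    echo "nvidia kernel module is NOT loaded; try (as root): modprobe nvidia"
fi

# CUDA applications also need the device nodes that X normally creates.
# Without X, they are typically created by hand (as root), e.g.:
#   mknod -m 666 /dev/nvidia0   c 195 0
#   mknod -m 666 /dev/nvidiactl c 195 255
for node in /dev/nvidia0 /dev/nvidiactl; do
    if [ -e "$node" ]; then
        echo "$node present"
    else
        echo "$node missing"
    fi
done
```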


I'm counting for science,
points just make me sick.
ID: 33931
Copyright © 2024 University of California.
Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation.