Is there anyone here who has installed Open OnDemand and all the associated pieces on a single server? We have a small server with two six-core processors, 384 GB of RAM, and four GeForce RTX 2080 GPUs. It's running CentOS 7.
I'm new to all of this. This server hasn't seen use yet, and I currently have it set up with JupyterHub, Singularity, and Slurm. Since this is the same node users log in to, I have concerns about them using ssh to run jobs directly and bypassing Slurm. I have JupyterHub configured to go through Slurm when it launches user notebooks. My department chair is worried about making sure jobs are queued up so that resources are allocated fairly. I'm also not certain about C/C++ programs that use CUDA and then create a visualization that needs to be launched in a window (like some of the CUDA samples do). I have found that applications like that cannot be launched via X forwarding since they use OpenGL.
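For reference, the JupyterHub-through-Slurm piece is the kind of thing the batchspawner project handles. A minimal sketch of what such a `jupyterhub_config.py` might look like (the spawner class is real, but the partition name and resource limits below are placeholder assumptions, not my actual settings):

```python
# jupyterhub_config.py -- sketch of launching user notebooks as Slurm
# jobs via batchspawner. Partition name and limits are illustrative.
c = get_config()  # noqa: F821 -- provided by JupyterHub at load time

c.JupyterHub.spawner_class = 'batchspawner.SlurmSpawner'

c.SlurmSpawner.req_partition = 'main'   # assumed partition name
c.SlurmSpawner.req_memory = '8G'        # per-notebook memory request
c.SlurmSpawner.req_runtime = '8:00:00'  # wall-time limit for a session
```

With something like this in place, every notebook session shows up as a normal Slurm job, so at least that path is subject to the scheduler's fair-share accounting.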
I ran across Open OnDemand in some of my searches and wondered if it might solve my problem of controlling users' access a bit more. I'd appreciate any info from anyone willing to discuss this with me. I could possibly put an Open OnDemand front end on a VM (I say possibly because it would require some cooperation from the IT folks), as long as the users could still store all their data on this single server, since that is where the major disk storage would be.