Research cluster housing

Integration of the institution's own hardware for scientific computing into the LUIS cluster system

Who can use this service

  • Students: No
  • Individual employees: No
  • Central facilities: Yes
  • Administrators: No

Service: Research cluster housing

With the service Research cluster housing, an institution's own hardware for scientific computing can be integrated into the existing LUIS cluster system (see also the service Scientific Computing). The LUIS provides the infrastructure for this and bears the operating costs for air conditioning and power supply. The prerequisite is that certain hardware and software requirements are met. In return, the institution receives exclusively reserved computing time on the hardware it has contributed, and it can additionally use the rest of the cluster system.

  • Scope of services

    The LUIS provides the necessary infrastructure for the institution's hardware. This consists essentially of air conditioning, power supply, UPS (uninterruptible power supply), racks, and various disk systems for data storage. In addition, the software infrastructure needed to administer the servers is provided. This includes authentication, operating systems, the batch system, and the kernel modules required to access the parallel file systems.

    The following services are offered as part of the service Research cluster housing:

    • Assistance in purchasing new hardware
    • Deployment of the hardware and installation of the operating system (CentOS)
    • Connection to the central storage, batch and software system
    • Installation of special software (if desired)
    • Telephone and email support
    • System monitoring
    • Reporting of performance indicators to the hardware owner
    • Note: costs for replacing defective hardware components are borne by the respective owner
  • Service-specific functional parameters
    • Maximum runtime (walltime) of a computing job: 200 hours
    • Initial exclusive reservation of computing time on your own hardware: Monday to Friday between 8 a.m. and 8 p.m.
    • Access to the login nodes of the cluster system: 24/7 (except maintenance times)
    • Maintenance times: two one-week maintenance windows per year for the entire cluster system (January and July), announced in advance
    • Direct SSH access to the research cluster nodes: No; access only via the batch system
    • Service calls to the hardware provider are preferably made via the LUIS
    • Storage quota on the central storage systems: as defined in the service Scientific Computing
    • Administrative access: No
    • User account requests via the service Scientific Computing
  • Hardware requirements
    • Compute nodes: 19" rack servers suitable for the infrastructure available in the LUIS cluster
    • Minimum total number of compute nodes: 4 (exceptions are possible by arrangement)
    • Compute nodes require:
      • InfiniBand card (at least QDR with a QSFP connector)
      • Gigabit Ethernet port, or a 10 GbE port if it can be clocked down to 1 GbE
      • At least IPMI 2.x (for managing the nodes) with enterprise features
    • The choice of disk size and configuration, CPUs, and amount of main memory is at the institution's discretion

    We are happy to support you with the hardware procurement. Before purchasing, please contact the Scientific Computing Group by email: cluster-help@luis.uni-hannover.de
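To illustrate the functional parameters above, a computing job on the contributed hardware would be submitted through the batch system rather than via direct SSH. The following is a minimal sketch assuming a SLURM-style batch system; the job name and partition name are hypothetical, and the actual submission procedure is described in the documentation of the service Scientific Computing:

```shell
#!/bin/bash
# Hypothetical SLURM-style job script (sketch only).
#SBATCH --job-name=housing-example
#SBATCH --nodes=1
#SBATCH --time=200:00:00          # must not exceed the 200-hour walltime limit
#SBATCH --partition=inst-reserved # hypothetical partition for the institute's reserved hardware

# The service-defined walltime limit, in hours, for reference:
MAX_WALLTIME_HOURS=200
echo "Requesting up to ${MAX_WALLTIME_HOURS} hours of walltime"
```

Such a script would typically be submitted with `sbatch`, since all work on the research cluster runs through the batch system.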

Consulting and Documentation

This service is tailored to the needs of the participating institutes. Please contact us if you have any questions. For information on how to use your newly added servers as part of the cluster system, please see the documentation section of the service Scientific Computing.

Benefits for institutions

  • No administration effort for the setup.
  • Dedicated access for members of the institution during specified time intervals.
  • Institutions can use the entire cluster system.
  • Reduced procurement costs: for the same financial resources, institutions get more computing power.

Benefits for the LUH

  • Idle periods of individual cluster computers are available to the user community as a computing resource.

Contact

Cluster-Team
Monday to Friday 8:00 a.m. – 5:00 p.m.