COMSOL

From the Calcul Québec Wiki

Note: this documentation has been tested on Colosse. Parts of these instructions may differ on other servers.


Description

COMSOL is engineering simulation software that facilitates modelling at every stage: geometry, meshing, physics, optimisation and visualisation.

COMSOL is commercial software. Each research group must provide its own licence to use it on the Calcul Québec servers. The software is usually installed in the group's folder, either by a Calcul Québec analyst or by a group member.

Details

SSH configuration and keys

COMSOL is a loosely integrated application: the user is responsible for starting the various processes on every node allocated by the batch script. A means of communication between the worker nodes is therefore required; in this case, SSH.

Since the user cannot enter any information during communication between nodes, it may be necessary to generate SSH keys. On some servers this is done automatically. To check whether this is the case, use the command

[name@server $] ssh localhost


The SSH connection should succeed without prompting you for anything. If it does not, you should generate SSH keys.
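If keys are needed, a passwordless key pair can be set up roughly as follows. This is a sketch: the key type and file paths are assumptions, and your server's documentation may prescribe a different procedure.

```shell
# Generate an SSH key pair with an empty passphrase, since batch jobs
# cannot answer interactive prompts (key path ~/.ssh/id_rsa assumed)
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Authorize the new public key for logins to this same account
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# The test above should now succeed without any prompt
ssh -o BatchMode=yes localhost hostname
```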


How to start COMSOL on multiple nodes

Model configuration

To allow COMSOL computations to be distributed over multiple nodes, you must first configure the "Study" section of the model to enable execution in "Batch" or "Cluster Computing" mode.

The options "Batch" and "Cluster Computing" are not visible by default; you must first make them visible.

  • In the "Model Builder" section, click the "Show" button. This is the fourth button from the left, immediately under the "Model Builder" tab.
    Show button in the "Model Builder" section.
  • Then check "Advanced Study Options".
    Advanced options in the "Study" section.
  • You can now access the "Cluster Computing" and "Batch" options by right-clicking the "Study" section of your model.
    "Cluster Computing" option in the "Study" menu.
  • Check "Distribute parametric sweep" in the "Cluster Computing" menu so that COMSOL distributes the simulation parameters.
    "Cluster Computing" option in the "Study" menu.
  • All that remains is to run your submission script on the compute server.

Submission file

Here is an example submission file for running COMSOL on 4 nodes with Moab, depending on the COMSOL version.

COMSOL 4.2

File : submit_comsol_moab.sh
#!/bin/bash
 
#####################################
# Mandatory options                 #
#####################################
 
 
#PBS -N give_a_name	     # Name of the job
#PBS -A aaa-111-aa 	     # Project assigned to the job
#PBS -l nodes=4:ppn=8        # Number of nodes and cores per node
#PBS -l walltime=15:00:00    # Maximum wallclock time of the job (15 hours here)
 
 
#####################################
# Optional settings                 #
#####################################
 
# Email address(es) to which the server running the job sends notifications
#PBS -M john@doe.ca
 
# Under which circumstances should mail be sent:
# "b" = when the job begins
# "e" = when the job ends
# "a" = when the job aborts
 
#PBS -m bea
 
# Run the job from the directory where it was submitted.
cd "${PBS_O_WORKDIR}"
 
module load /rap/aaa-111-aa/modulefiles/comsol/4.x
 
# Command to run
 
# Creating the hostfile
/clumeq/bin/moabhl2hl.py --format HP-MPI > hosts.txt
# Count the number of nodes available
NN=$(wc -l < hosts.txt)
 
# Initialize mpd
comsol -nn ${NN} mpd boot -f hosts.txt -mpirsh ssh
 
# Launch the COMSOL job
comsol -nn ${NN} batch -inputfile /home/USER/model.mph -outputfile /home/USER/results.mph -batchlog /home/USER/model.log -tmpdir /scratch/aaa-111-aa/
 
# Kill all instances of mpd once finished
comsol mpd allexit
 
# Delete the hostfile
rm hosts.txt


Note: this submission script uses the hosts.txt file generated by the moabhl2hl.py utility, which is available on Colosse. To generate this file on other servers, please contact us.

COMSOL 4.3a

File : submit_comsol_moab.sh
#!/bin/bash
 
#####################################
# Mandatory options                 #
#####################################
 
 
#PBS -N give_a_name	     # Name of the job
#PBS -A aaa-111-aa 	     # Project assigned to the job
#PBS -l nodes=4:ppn=8        # Number of nodes and cores per node
#PBS -l walltime=15:00:00    # Maximum wallclock time of the job (15 hours here)
 
 
#####################################
# Optional settings                 #
#####################################
 
# Email address(es) to which the server running the job sends notifications
#PBS -M john@doe.ca
 
# Under which circumstances should mail be sent:
# "b" = when the job begins
# "e" = when the job ends
# "a" = when the job aborts
 
#PBS -m bea
 
# Run the job from the directory where it was submitted.
cd "${PBS_O_WORKDIR}"
 
module load /rap/aaa-111-aa/modulefiles/comsol/4.x
 
# Command to run
# Launch the COMSOL job
comsol -clustersimple batch -inputfile /home/USER/model.mph -outputfile /home/USER/results.mph -batchlog /home/USER/model.log -tmpdir /scratch/aaa-111-aa/


Results (on Colosse)

To verify that COMSOL was correctly distributed on Colosse, several tests were performed. In particular, the same task was launched successively on Colosse with different numbers of cores. The simulation computes the acoustic response of an object, i.e. it solves for the pressure and velocity fields in response to an excitation at a given frequency. The excitation frequency varies from 0 to 7000 Hz in steps of 10 Hz, so the total simulation actually consists of 700 independent simulations. The simulation times were then extracted from COMSOL's log file. The blue curve in the graph below shows the results. The red line on the same graph shows that COMSOL correctly distributes the different frequencies over 80 cores, since the simulation time is roughly one tenth of the time with eight cores.

Evolution of the simulation time as a function of the number of cores (blue curve), compared with the expected result (red curve).