GenASiS Basics: Object-oriented utilitarian functionality for large-scale physics simulations

Detailed Description

Published in: Computer Physics Communications, Vol. 196, Issue C, pp. 506-534
Authors: Cardall, Christian Y.; Budiardja, Reuben D.
Format: Journal Article
Language: English
Published: Elsevier B.V., United States, 01.11.2015
ISSN: 0010-4655, 1879-2944
Description
Abstract: Aside from numerical algorithms and problem setup, large-scale physics simulations on distributed-memory supercomputers require more basic utilitarian functionality, such as physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. Here we describe and make available Fortran 2003 classes furnishing extensible object-oriented implementations of this sort of rudimentary functionality, along with individual 'unit test' programs and larger example problems demonstrating their use. These classes compose the Basics division of our developing astrophysics simulation code GenASiS (General Astrophysical Simulation System), but their fundamental nature makes them useful for physics simulations in many fields.

Program title: GenASiS
Catalogue identifier: AEXE_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEXE_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International
No. of lines in distributed program, including test data, etc.: 32863
No. of bytes in distributed program, including test data, etc.: 148873
Distribution format: tar.gz
Programming language: Fortran 2003 (tested with gfortran 4.9.2, Intel Fortran 15, NAG Fortran 5.3.1, Cray Compiler 8.2.5)
Computer: PC, cluster, supercomputer
Operating system: Linux, Unix
RAM: For the example problems, depends on user-specified problem size and number of processes. The fluid dynamics problems with 128³ cells on 8 processes use about 300 MB per process; the molecular dynamics problems with 6912 particles on 12 processes use about 20 MB per process.
Classification: 4.14, 6.5, 20
External routines: MPI [1] and Silo [2]
Nature of problem: By way of illustrating GenASiS Basics functionality, solve example fluid dynamics and molecular dynamics problems.
Solution method: For the fluid dynamics examples, finite-volume; for the molecular dynamics examples, leapfrog and velocity-Verlet integration.
Unusual features: The example problems named above are not ends in themselves, but serve to illustrate our object-oriented approach and the functionality available through GenASiS Basics. In addition to these more substantial examples, we provide individual unit test programs for each of the classes that make up GenASiS Basics.
Additional comments: A version of the GenASiS Basics source code is available from the CPC Program Library with this publication, and minor revisions will be maintained at http://astro.phys.utk.edu/activities:genasis.
Running time: For the example problems, depends on user-specified problem size and number of processes. The fluid dynamics problems with 128³ cells on 8 processes take about ten minutes of wall clock time on a Cray XC30; the molecular dynamics problems with 6912 particles for 10000 time steps on 12 processes take a little over an hour on a Cray XC30.
References:
[1] http://www.mcs.anl.gov/mpi/
[2] https://wci.llnl.gov/simulation/computer-codes/silo
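The molecular dynamics examples use leapfrog and velocity-Verlet integration. As a minimal sketch of the velocity-Verlet scheme only (this is not code from GenASiS, and all function and variable names here are our own illustrative choices), the following Python snippet applies it to a unit-mass harmonic oscillator, where the method's characteristic half-kick/drift/half-kick structure and near-conservation of energy are easy to see:

```python
import math

def acceleration(x):
    # Harmonic restoring force with unit mass and spring constant: a = -x.
    return -x

def velocity_verlet_step(x, v, dt):
    # Half-kick, drift, half-kick: second-order accurate and symplectic.
    v_half = v + 0.5 * dt * acceleration(x)
    x_new = x + dt * v_half
    v_new = v_half + 0.5 * dt * acceleration(x_new)
    return x_new, v_new

def integrate(x0, v0, dt, n_steps):
    x, v = x0, v0
    for _ in range(n_steps):
        x, v = velocity_verlet_step(x, v, dt)
    return x, v

if __name__ == "__main__":
    # Integrate from x=1, v=0 to t = 10; exact solution is x(t) = cos(t).
    x, v = integrate(x0=1.0, v0=0.0, dt=0.01, n_steps=1000)
    energy = 0.5 * (v * v + x * x)  # exact value 0.5, conserved to O(dt^2)
    print(f"x = {x:.4f}, v = {v:.4f}, E = {energy:.6f}")
```

Because velocity-Verlet is symplectic, the energy error stays bounded over long runs rather than drifting, which is one reason this family of integrators is standard in molecular dynamics.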
Funding: USDOE Office of Science (SC), Nuclear Physics (NP); contract AC05-00OR22725
DOI: 10.1016/j.cpc.2015.06.001