The coordinate dimensions and bounds have been moved to module
COORDINATES. In the rewritten subroutine initialize_coordinates() the
block dimensions and domain bounds are obtained from module PARAMETERS,
and then all other module variables are initialized.
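
Below is a minimal, self-contained sketch of this layout; apart from
initialize_coordinates() and the module names, every identifier
(ncells, xmin, xmax, xlen, dx) is an illustrative assumption rather
than the actual interface of the code:

  module parameters
     implicit none
     ! block dimension and domain bounds, as read from the parameter file
     integer, save :: ncells = 8
     real   , save :: xmin = 0.0, xmax = 1.0
  end module parameters

  module coordinates
     implicit none
     real, save :: xlen = 1.0, dx = 1.0   ! derived coordinate variables
  contains
     subroutine initialize_coordinates()
        use parameters, only : ncells, xmin, xmax
        implicit none
        ! obtain the block dimensions and domain bounds from PARAMETERS ...
        xlen = xmax - xmin
        ! ... then initialize all remaining module variables
        dx   = xlen / real(ncells)
     end subroutine initialize_coordinates
  end module coordinates

  program coordinates_demo
     use coordinates, only : initialize_coordinates, dx
     implicit none
     call initialize_coordinates()
     print *, 'cell size dx =', dx
  end program coordinates_demo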
- a new module COORDS handles the mesh variables that had to be
separated from the MESH module: they are used by the PROBLEM module,
which in turn is required by the MESH module, so keeping them in MESH
created a circular dependency; introducing the COORDS module removes
that cycle;
- add a new subroutine set_datablock_dims() to set the dimensions of
the allocatable arrays in data blocks; this subroutine is called from
init_mesh(), which is the right place to initialize the BLOCK module
(see the set_datablock_dims() sketch after this list);
- remove dependency of blocks.o on variables.o;
- boundaries.o depends on timers.o;
- the CONSERVATIVE flag determines whether the scheme must be fully
conservative; this matters only when the adaptive mesh is used; in
that case, instead of updating the variables on each block
independently, we first calculate the high order flux integrals, then
synchronize them between blocks at different refinement levels, and
finally perform a one-step update of each block using the corrected
fluxes (see the update-order sketch after this list); this might also
be a first step toward implementing Galerkin methods;
- a new file 'mesh.log' is created with the following columns: the
step, the time, the number of leaf blocks, the number of meta blocks,
the coverage efficiency (the number of leaf blocks divided by the
number of top level blocks covering the whole domain), the AMR
efficiency (which shows the advantage of using the adaptive mesh, with
boundaries taken into account), and the block distribution over levels
and processors;
- the AMR efficiency is the number of leaf blocks multiplied by the
number of cells in one block (boundaries included) and divided by the
number of cells at the effective resolution (boundaries included); if
this parameter is smaller than 1.0 we should expect faster
calculations thanks to the adaptive mesh, while if it is larger than
1.0 the adaptive mesh only slows the calculations down (see the
efficiency sketch after this list);
- the number of seeds and the seed values must be stored in the
restart file in order to restart a job properly and guarantee that the
random number generation continues unchanged after the restart (see
the restart-seed sketch after this list);
- add a new module INTEGRALS which handles initialization and
calculation of the conserved variables and energies;
- add make dependencies to makefile;
- call the initialization, storage and termination subroutines from the
driver;
- call subroutine evolve_forcing() before the update of all blocks;
this subroutine evolves the forcing source terms by an interval dt in
Fourier space; then, during the update, the forcing Fourier
coefficients are transformed to real space for each block separately;
- implement subroutine evolve_forcing(), which evolves the driving
components in Fourier space over one hydrodynamic timestep; the
integrated forcing Fourier components are stored in the module array
ftab; this complex array will be used to calculate the forcing in real
space for each block (see the forcing sketch after this list);
- add a new module FORCING to handle forcing source terms, e.g. for
turbulence driving;
- implement initial versions of init_forcing() and clear_forcing()
subroutines;
- add the compilation flag FORCING and use it during the compilation
process in the makefile;
- add a new module RANDOM which handles the initialization and
generation of different types of random number distributions;
- include the module in the compilation process;
- initialize the random generator from driver;
- implement the monotonicity preserving (MP) family of
reconstructions; the implementation covers the 5th, 7th and 9th order
spatial MP reconstructions;
- implement two new functions used by the MP reconstructions, minmod()
and median() (see the corresponding sketch after this list);
- pass the spatial increment to the reconstruction subroutine since
some interpolation methods require it; in addition, move obtaining the
spatial interval and its inverse to the subroutine update();
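
The sketch below shows one way set_datablock_dims() could look; the
argument list and module variables are assumptions, not the actual
interface. The idea is that init_mesh() calls it once, before any data
block is allocated, so all later allocations use consistent
dimensions:

  module blocks
     implicit none
     ! dimensions of the allocatable arrays in data blocks
     integer, save :: nx = 1, ny = 1, nz = 1, nv = 1
  contains
     subroutine set_datablock_dims(ni, nj, nk, nvars)
        implicit none
        integer, intent(in) :: ni, nj, nk, nvars
        ! store the dimensions used by subsequent array allocations
        nx = ni
        ny = nj
        nz = nk
        nv = nvars
     end subroutine set_datablock_dims
  end module blocks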
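
A schematic, runnable sketch of the update order enabled by the
CONSERVATIVE flag; the plain arrays, the flux "integral", and the
synchronization stub are placeholders meant only to show the three
phases (integrate fluxes, synchronize across levels, update), not the
actual AMR implementation:

  program conservative_update_sketch
     implicit none
     integer, parameter  :: nb = 4         ! number of blocks (placeholder)
     real, dimension(nb) :: u    = 1.0     ! conserved variable per block
     real, dimension(nb) :: flux = 0.0     ! integrated flux per block
     real, parameter     :: dt = 0.1
     integer :: ib

     ! phase 1: high order integration of fluxes on each block
     do ib = 1, nb
        flux(ib) = 0.5 * u(ib)             ! placeholder for the flux integral
     end do

     ! phase 2: synchronize fluxes between blocks at different levels
     call synchronize_fluxes(flux)

     ! phase 3: one-step update of each block using the corrected fluxes
     do ib = 1, nb
        u(ib) = u(ib) - dt * flux(ib)
     end do

     print *, 'updated u:', u

  contains

     subroutine synchronize_fluxes(f)
        implicit none
        real, intent(inout) :: f(:)
        ! stub: the real code would replace coarse fluxes with the sum of
        ! the corresponding fine-level fluxes at shared block faces
     end subroutine synchronize_fluxes

  end program conservative_update_sketch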
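
A small worked example of the AMR efficiency defined above; the
numbers, the two-dimensional setup, and the exact form of the
effective-resolution term are assumptions made only for illustration:

  program amr_efficiency_sketch
     implicit none
     integer, parameter :: nleafs = 200   ! number of leaf blocks
     integer, parameter :: nc     = 8     ! cells per block and direction
     integer, parameter :: ng     = 2     ! ghost (boundary) layers
     integer, parameter :: neff   = 256   ! effective resolution per direction
     real :: efficiency

     ! cells actually stored in the leaf blocks (boundaries included)
     ! divided by the cells of a uniform grid at the effective
     ! resolution (boundaries included)
     efficiency = real(nleafs * (nc + 2 * ng)**2) / real((neff + 2 * ng)**2)

     ! < 1.0: the adaptive mesh should speed the calculation up
     ! > 1.0: the adaptive mesh only adds overhead
     print *, 'AMR efficiency =', efficiency
  end program amr_efficiency_sketch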
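
The minimal illustration below uses the Fortran intrinsic generator
rather than the code's RANDOM module, but it shows why both the number
of seeds and the seed values have to go into the restart file: writing
them back restores the generator state, so the random sequence
continues unchanged after a restart:

  program restart_seed_sketch
     implicit none
     integer :: nseeds
     integer, allocatable :: seeds(:)
     real :: x

     call random_seed(size = nseeds)   ! number of seeds to store
     allocate(seeds(nseeds))
     call random_seed(get = seeds)     ! seed values to store in the restart file

     call random_number(x)             ! the generator advances during the run

     ! on restart: read nseeds and seeds(:) back from the file, then
     call random_seed(put = seeds)     ! restore the generator state
     call random_number(x)             ! the sequence continues as before

     print *, 'first random number after restoring the state:', x
  end program restart_seed_sketch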
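
The actual integration scheme used in evolve_forcing() is not
described here, so the sketch below is purely illustrative: it evolves
complex Fourier amplitudes over an interval dt with an
Ornstein-Uhlenbeck-like update (decay towards zero plus a random
kick), and the names ftab, tcor, and famp are assumptions:

  module forcing_sketch
     implicit none
     integer, parameter :: nmodes = 8
     ! integrated forcing Fourier components, later transformed to real
     ! space for each block separately
     complex, save :: ftab(nmodes) = (0.0, 0.0)
  contains
     subroutine evolve_forcing(dt)
        implicit none
        real, intent(in) :: dt
        real, parameter  :: tcor = 1.0   ! correlation time (assumed)
        real, parameter  :: famp = 0.01  ! driving amplitude (assumed)
        real    :: a(nmodes), b(nmodes)
        integer :: k

        call random_number(a)
        call random_number(b)

        do k = 1, nmodes
           ! decay of the existing component plus a random complex increment
           ftab(k) = ftab(k) * exp(-dt / tcor)                            &
                   + famp * sqrt(dt) * cmplx(2.0 * a(k) - 1.0, 2.0 * b(k) - 1.0)
        end do
     end subroutine evolve_forcing
  end module forcing_sketch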
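
Sketches of the two helper functions, following the standard
definitions used by the monotonicity preserving schemes of Suresh &
Huynh (1997): minmod() returns the argument of smaller magnitude when
both arguments have the same sign and zero otherwise, and median()
picks the middle of its three arguments:

  module mp_helpers
     implicit none
  contains
     real function minmod(a, b)
        implicit none
        real, intent(in) :: a, b
        minmod = 0.5 * (sign(1.0, a) + sign(1.0, b)) * min(abs(a), abs(b))
     end function minmod

     real function median(a, b, c)
        implicit none
        real, intent(in) :: a, b, c
        median = a + minmod(b - a, c - a)
     end function median
  end module mp_helpers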
VARIABLES
- create a new module 'variables' which stores references to the
variable indices; different objects related to variables will be
stored in this module (see the sketch below);
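
A minimal sketch of what the 'variables' module could provide; the
index names below (idn, imx, ...) are assumptions used only for
illustration:

  module variables
     implicit none
     ! references to variable indices in the array of conserved variables
     integer, parameter :: idn   = 1   ! density
     integer, parameter :: imx   = 2   ! x-momentum
     integer, parameter :: imy   = 3   ! y-momentum
     integer, parameter :: imz   = 4   ! z-momentum
     integer, parameter :: ien   = 5   ! total energy
     integer, parameter :: nvars = 5   ! total number of variables
  end module variables

A block array declared as u(nvars,nx,ny,nz) can then be addressed as
u(idn,i,j,k), so other modules refer to variables by name instead of a
hard-coded position.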
IO
- the subroutine write_data() is now a wrapper which calls the
subroutine writing data in the selected file format; the file format
is chosen at compilation time (see the wrapper sketch after this
list);
- the subroutine write_data_h5() is a new subroutine taking care of
the initialization and storage of data in the HDF5 format; depending
on the file type it calls subroutines to create the specific groups
and store the right data;
- new subroutines write_attributes_h5(), write_coordinates_h5(),
write_variables_h(), write_variables_full_h5(), write_metablocks_h5(),
write_datablocks_h5(), and a number of other supporting subroutines to
store all data in the proper format for job restart, visualization,
and debugging;
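
A sketch of the wrapper pattern, assuming a preprocessor macro (here
called HDF5) selects the format at compilation time; the fallback
routine and all subroutine bodies are placeholders:

  module io
     implicit none
  contains
     subroutine write_data()
        implicit none
  #ifdef HDF5
        call write_data_h5()    ! initialize and store data in the HDF5 format
  #else
        call write_data_bin()   ! placeholder for a non-HDF5 output path
  #endif /* HDF5 */
     end subroutine write_data

     subroutine write_data_h5()
        implicit none
        ! depending on the file type, call the subroutines creating the
        ! specific groups (attributes, coordinates, variables, blocks, ...)
     end subroutine write_data_h5

     subroutine write_data_bin()
        implicit none
        ! placeholder
     end subroutine write_data_bin
  end module io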
MAKE
- add an option to enable/disable HDF5 file compression (see the
sketch below);
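
A sketch of where such a switch could act in the HDF5 writer, assuming
a compile-time macro (here called COMPRESS); the HDF5 Fortran API
calls are standard, but the surrounding code and the macro name are
illustrative:

  program hdf5_compression_sketch
     use hdf5
     implicit none
     integer(hid_t) :: file_id, space_id, dset_id, plist_id
     integer(hsize_t), dimension(1) :: dims = (/ 1024 /)
     integer :: err

     call h5open_f(err)
     call h5fcreate_f('sketch.h5', H5F_ACC_TRUNC_F, file_id, err)
     call h5screate_simple_f(1, dims, space_id, err)

     ! dataset creation property list, optionally with chunking and
     ! the deflate (gzip) compression filter
     call h5pcreate_f(H5P_DATASET_CREATE_F, plist_id, err)
  #ifdef COMPRESS
     call h5pset_chunk_f(plist_id, 1, dims, err)
     call h5pset_deflate_f(plist_id, 6, err)   ! compression level 6 (assumed)
  #endif /* COMPRESS */

     call h5dcreate_f(file_id, 'data', H5T_NATIVE_REAL, space_id, dset_id, &
                      err, plist_id)

     call h5dclose_f(dset_id, err)
     call h5pclose_f(plist_id, err)
     call h5sclose_f(space_id, err)
     call h5fclose_f(file_id, err)
     call h5close_f(err)
  end program hdf5_compression_sketch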