Merge branch 'master' into reconnection
commit 150146871e

.gitlab-ci.yml (new file, 23 lines)
@@ -0,0 +1,23 @@
+image: debian
+
+stages:
+  - build
+
+build:
+  stage: build
+  before_script:
+    - apt-get -q update
+    - apt-get -q -y install gawk make gfortran libhdf5-dev libopenmpi-dev
+
+  script:
+    - cd ./build/
+    - cp -al make.default make.config
+    - cp -al ./hosts/default ./hosts/$HOSTNAME
+    - export HDF5DIR=/usr/lib/x86_64-linux-gnu/hdf5/serial
+    - make MPI=N NDIMS=2
+    - make clean
+    - make MPI=N NDIMS=3
+    - make clean
+    - make MPI=Y NDIMS=2
+    - make clean
+    - make MPI=Y NDIMS=3
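For reference, the same build matrix can be reproduced outside CI. A minimal Python sketch, assuming a Debian-like system with the packages listed above already installed and the working directory at the repository root (the HDF5 path is copied from the CI script and may differ on other systems):

```python
# Sketch of the CI build matrix run locally; paths mirror the script above.
import os
import shutil
import subprocess

os.chdir('build')
shutil.copy('make.default', 'make.config')
shutil.copy(os.path.join('hosts', 'default'),
            os.path.join('hosts', os.uname().nodename))
env = dict(os.environ, HDF5DIR='/usr/lib/x86_64-linux-gnu/hdf5/serial')

# Build every MPI/NDIMS combination, cleaning between builds.
for mpi in ('N', 'Y'):
    for ndims in ('2', '3'):
        subprocess.run(['make', 'MPI=' + mpi, 'NDIMS=' + ndims],
                       env=env, check=True)
        subprocess.run(['make', 'clean'], env=env, check=True)
```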

CHANGELOG.md (new file, 13 lines)
@@ -0,0 +1,13 @@
+# 2019-10-04 No version yet. ##
+----
+
+- support for rectangular adaptive domain in 2D and 3D;
+- support for hydrodynamical (HYDRO) and magnetohydrodynamical (MHD) equations, both in classical and relativistic formulations;
+- support for adiabatic (ADI) and isothermal (ISO) equations of state;
+- support for viscosity and resistivity source terms;
+- support for passive scalars;
+- time integration using Euler and 2nd order Runge-Kutta methods, or up to 4th order Strong Stability Preserving Runge-Kutta;
+- a number of spatial interpolation methods, from 2nd order TVD up to 9th order Monotonicity-Preserving;
+- HLL-family of approximate Riemann solvers (HLL, HLLC, and HLLD);
+- GLM scheme for the induction equation;
+- MPI parallelization;

README.md (63 changed lines)
@@ -1,12 +1,11 @@
---------------------------------------------------------------------------------
 # **The AMUN Code**
-## Copyright (C) 2008-2019 Grzegorz Kowal ##
+## Copyright (C) 2008-2019 Grzegorz Kowal
 --------------------------------------------------------------------------------

 AMUN is a parallel code to perform numerical simulations in fluid approximation
 on uniform or non-uniform (adaptive) meshes. The goal in developing this code is
 to create a solid framework for simulations with support for number of numerical
-methods which can be selected in an easy way through the parameter file. The
+methods which can be selected in an easy way through a parameter file. The
 following features are already implemented:

 * hydrodynamic and magnetohydrodynamic set of equations (HD and MHD),
@@ -18,11 +17,13 @@ following features are already implemented:
 * 2nd order TVD interpolation with number of limiters and higher order
   reconstructions,
 * Riemann solvers of Roe- and HLL-types (HLL, HLLC, and HLLD),
-* periodic and open boundary conditions,
+* standard boundary conditions: periodic, open, reflective, hydrostatic, etc.,
 * viscous and resistive source terms,
+* support for passive scalars (up to 100),
 * data stored in the HDF5 format,
 * MPI parallelization,
-* completely written in Fortran 2003.
+* completely written in Fortran 2003,
+* Python interface to read data.

 This program is free software: you can redistribute it and/or modify it under
 the terms of the GNU General Public License as published by the Free Software
@@ -46,12 +47,16 @@ Developers
 Requirements
 ============

-* Fortran 2003 compiler (tested compilers include
-  [GNU Fortran](http://gcc.gnu.org/fortran/) version 4.5 or newer,
-  [Intel Fortran](https://software.intel.com/en-us/fortran-compilers) compiler
-  version 9.0 or newer)
-* [HDF5 libraries](http://www.hdfgroup.org/HDF5/) version 1.8 or newer.
-* [OpenMPI](https://www.open-mpi.org/) version 1.8 or newer for parallel runs.
+* Fortran 2003 compiler, tested compilers include:
+    - [GNU Fortran](https://gcc.gnu.org/fortran/) version 4.5 or newer,
+    - [PGI Community Edition](https://www.pgroup.com/products/community.htm),
+      version 18.10 or newer,
+    - [Intel Fortran](https://software.intel.com/en-us/fortran-compilers)
+      compiler version 9.0 or newer.
+* [HDF5 libraries](https://www.hdfgroup.org/solutions/hdf5/), tested with
+  version 1.8 or newer.
+* [OpenMPI](https://www.open-mpi.org/) for parallel runs, tested with version
+  1.8 or newer.


 Environment Variables
@@ -65,11 +70,15 @@ the HDF5 libraries have been installed.

 Compilation
 ===========
-1. Clone the AMUN source code: `git clone https://bitbucket.org/amunteam/amun-code.git`,
-   or unpack the archive downloaded from page
+1. Clone the AMUN source code:
+   - from Bitbucket:
+     `git clone https://grzegorz_kowal@bitbucket.org/amunteam/amun-code.git`,
+   - from GitLab:
+     `git clone https://gitlab.com/gkowal/amun-code.git`
+   - or unpack the archive downloaded from page
    [Downloads](https://bitbucket.org/amunteam/amun-code/downloads/).
 2. Go to directory **build/hosts/** and copy file **default** to a new file named
-   exactly as your host name (name returned by command `hostname`).
+   exactly as your host name, i.e. `cp default $HOSTNAME`.
 3. Customize your compiler and compilation options in your new host file.
 4. Go up to directory **build/** and copy file **make.default** to **make.config**.
 5. Customize compilation time options in **make.config**.
@@ -80,16 +89,18 @@ Compilation
 Usage
 =====

-In order to run some test problems you can simply copy corresponding parameter
-from directory **problems/** to the location when you wish to run your test.
-Copy the executable file **amun.x** compiled earlier to the same directory. If
-you provide option _-i <parameter_file>_, the code will know that the parameters
-have to be read from file _<parameter_file>_. If you don't provide this option,
-the code will assume that the parameters are stored in file **params.in** in the
-same director.
+In order to run some test problems you can simply copy the problem parameter
+file from directory **problems/** to the location where you wish to run your
+test. Copy the executable file **amun.x** from the **build/** directory compiled
+earlier. If you provide option _-i <parameter_file>_, the code will know that
+parameters have to be read from file _<parameter_file>_. If you don't provide
+this option, the code assumes that the parameters are stored in file
+**params.in** in the same directory.

-In order to run serial version, type in your terminal: `amun.x -i params.in`.
+In order to run the serial version, just type in your terminal:
+`./amun.x -i ./params.in`.

-In order to run the parallel version (after compiling the code with MPI
-version), type in your terminal: `mpirun -n N ./amun.x -i params.in`, where N is
-the number of processors.
+In order to run the parallel version (after compiling the code with MPI support),
+type in your terminal:
+`mpirun -n N ./amun.x -i ./params.in`,
+where N is the number of processors to use.

bitbucket-pipelines.yml (new file, 25 lines)
@@ -0,0 +1,25 @@
+# This is a sample build configuration for Other.
+# Check our guides at https://confluence.atlassian.com/x/5Q4SMw for more examples.
+# Only use spaces to indent your .yml configuration.
+# -----
+# You can specify a custom docker image from Docker Hub as your build environment.
+image: atlassian/default-image:2
+
+pipelines:
+  default:
+    - step:
+        name: Build
+        script:
+          - apt-get -q update
+          - apt-get -q -y install gawk make gfortran libhdf5-dev libopenmpi-dev
+          - cd ./build
+          - cp -al make.default make.config
+          - cp -al ./hosts/default ./hosts/$HOSTNAME
+          - export HDF5DIR=/usr/lib/x86_64-linux-gnu/hdf5/serial
+          - make MPI=N NDIMS=2
+          - make clean
+          - make MPI=N NDIMS=3
+          - make clean
+          - make MPI=Y NDIMS=2
+          - make clean
+          - make MPI=Y NDIMS=3

python/amun.py (347 changed lines)
@@ -37,6 +37,7 @@ import numpy as np
 import os.path as op
 import sys

+
 def amun_compatible(fname):
     '''
     Subroutine checks if the HDF5 file is AMUN compatible.
@@ -47,39 +48,28 @@ def amun_compatible(fname):

     Return values:

-        ret - True or False;
+        True or False;

     Examples:

         comp = amun_compatible('p000010_00000.h5')

     '''
-    try:
-        f = h5.File(fname, 'r')
-
-        # check if the file is written in the AMUN format or at least contains
-        # necessary groups
-        #
-        ret = True
-        if 'code' in f.attrs:
-            if f.attrs['code'].astype(str) != "AMUN":
-                print("'%s' contains attribute 'code'", \
-                      " but it is not set to 'AMUN'!" % fname)
-                ret = False
-        elif not 'attributes' in f or \
-             not 'coordinates' in f or \
-             not 'variables' in f:
-            print("'%s' misses one of these groups: ", \
-                  "'attributes', 'coordinates' or 'variables'!" % fname)
-            ret = False
-
-        f.close()
-
-    except:
-        print("It seems '%s' is not an HDF5 file!" % fname)
-        ret = False
-
-    return ret
+    with h5.File(fname, 'r') as f:
+        if 'code' in f.attrs:
+            if f.attrs['code'].astype(str) == "AMUN":
+                return True
+            else:
+                print("'%s' contains attribute 'code'," % fname, \
+                      " but it is not 'AMUN'!")
+                return False
+        elif 'attributes' in f and 'coordinates' in f and \
+             'variables' in f:
+            return True
+        else:
+            print("'%s' misses one of these groups:" % fname, \
+                  "'attributes', 'coordinates' or 'variables'!")
+            return False


 def amun_attribute(fname, aname):
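A usage note on the rewritten check: because the open is no longer wrapped in try/except, a file that is not HDF5 at all now raises h5py's OSError instead of printing a message. A minimal sketch (the file name is the one from the docstring example; installation of the module as `amun` is assumed):

```python
# amun_compatible() returns True/False for HDF5 files; non-HDF5 input
# now surfaces as an exception from h5py rather than a printed message.
from amun import amun_compatible

try:
    if amun_compatible('p000010_00000.h5'):
        print('snapshot is AMUN compatible')
except OSError as err:
    print('not an HDF5 file:', err)
```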
@@ -93,7 +83,7 @@ def amun_attribute(fname, aname):

     Return values:

-        ret - the value of the attribute;
+        ret - the value of the attribute or None;

     Examples:

@@ -101,29 +91,19 @@ def amun_attribute(fname, aname):

     '''
     if not amun_compatible(fname):
-        return False
-
-    try:
-        f = h5.File(fname, 'r')
-        g = f['attributes']
-
-        if aname in g.attrs:
-            attr = g.attrs[aname]
-            if attr.dtype.type is np.string_:
-                ret = np.squeeze(attr).astype(str)
-            else:
-                ret = np.squeeze(attr)
-        else:
-            print("Attribute '%s' cannot be retrieved from '%s'!" % (aname, fname))
-            ret = False
-
-        f.close()
-
-    except:
-        print("Attribute '%s' cannot be retrieved from '%s'!" % (aname, fname))
-        ret = False
-
-    return ret
+        return None
+
+    with h5.File(fname, 'r') as f:
+        if aname in f['attributes'].attrs:
+            attr = f['attributes'].attrs[aname]
+            if attr.dtype.type is np.string_:
+                ret = np.squeeze(attr).astype(str)
+            else:
+                ret = np.squeeze(attr)
+            return ret
+        else:
+            print("Attribute '%s' cannot be found in '%s'!" % (aname, fname))
+            return None


 def amun_coordinate(fname, iname):
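For the attribute reader, a short sketch of the new None-returning contract (the file name is hypothetical; the attribute names are the ones queried by amun_dataset() further down):

```python
# Attributes come back as values, or None when missing.
from amun import amun_attribute

fname = 'p000010_00000.h5'                 # hypothetical snapshot name
ncells = amun_attribute(fname, 'ncells')
gamma = amun_attribute(fname, 'gamma')     # may be None, e.g. for isothermal runs
if gamma is None:
    print('no adiabatic index stored in', fname)
```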
@@ -137,7 +117,7 @@ def amun_coordinate(fname, iname):

     Return values:

-        ret - the values of the item;
+        ret - the value of the item or None;

     Examples:

@@ -145,29 +125,14 @@ def amun_coordinate(fname, iname):

     '''
     if not amun_compatible(fname):
-        return False
-
-    try:
-        f = h5.File(fname, 'r')
-        g = f['coordinates']
-
-        if iname in g:
-            item = g[iname]
-            if item.dtype.type is np.string_:
-                ret = np.squeeze(item).astype(str)
-            else:
-                ret = np.squeeze(item)
-        else:
-            print("Coordinate item '%s' cannot be retrieved from '%s'!" % (iname, fname))
-            ret = False
-
-        f.close()
-
-    except:
-        print("Coordinate item '%s' cannot be retrieved from '%s'!" % (iname, fname))
-        ret = False
-
-    return ret
+        return None
+
+    with h5.File(fname, 'r') as f:
+        if iname in f['coordinates']:
+            return np.array(f['coordinates'][iname])
+        else:
+            print("Coordinate item '%s' not found in group 'coordinate' of '%s'!" % (iname, fname))
+            return None


 def amun_dataset(fname, vname, shrink = 1, progress = False):
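Similarly for the coordinate reader, a minimal sketch (item name 'levels' is one of those used by amun_dataset(); the file name is hypothetical):

```python
# Coordinate items are returned as numpy arrays, or None when absent.
from amun import amun_coordinate

levels = amun_coordinate('p000010_00000.h5', 'levels')
if levels is not None:
    print('finest refinement level in this file:', int(levels.max()))
```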
@@ -192,138 +157,138 @@ def amun_dataset(fname, vname, shrink = 1, progress = False):

     '''
     if not amun_compatible(fname):
-        return False
+        return None

-    try:
-        dname = op.dirname(fname)
+    dname = op.dirname(fname)

     if progress:
         sys.stdout.write("Data file path:\n '%s'\n" % (dname))

     # get attributes necessary to reconstruct the domain
     #
     eqsys = amun_attribute(fname, 'eqsys')
     eos = amun_attribute(fname, 'eos')
     nr = amun_attribute(fname, 'isnap')
     nc = amun_attribute(fname, 'nprocs')
     nl = amun_attribute(fname, 'nleafs')
     if eos == 'adi':
         gm = amun_attribute(fname, 'gamma')

-    # prepare array to hold data
+    # get block dimensions and the maximum level
     #
     ndims = amun_attribute(fname, 'ndims')
     nn = amun_attribute(fname, 'ncells')
     bm = np.array([nn, nn, nn])
     if ndims == 2:
         bm[2] = 1
     ng = amun_attribute(fname, 'nghosts')
     ml = amun_attribute(fname, 'maxlev')
-    f = h5.File(fname, 'r')
-    if 'rdims' in f['attributes'].attrs:
-        rm = amun_attribute(fname, 'rdims')
-    elif 'bdims' in f['attributes'].attrs:
-        rm = amun_attribute(fname, 'bdims')
-    else:
-        rm = amun_attribute(fname, 'domain_base_dims')
-    f.close()

-    # build the list of supported variables
+    # get the base block dimensions
     #
-    variables = []
-    f = h5.File(fname, 'r')
+    rm = amun_attribute(fname, 'bdims')
+    if rm is None:
+        rm = amun_attribute(fname, 'domain_base_dims')
+    if rm is None:
+        rm = amun_attribute(fname, 'rdims')
+    if rm is None:
+        return None
+
+    # build the list of supported variables
+    #
+    variables = []
+    with h5.File(fname, 'r') as f:
         for var in f['variables'].keys():
             variables.append(var)
-    f.close()

     # add derived variables if possible
     #
     variables.append('level')
     if 'velx' in variables and 'vely' in variables and 'velz' in variables:
         variables.append('velo')
         variables.append('divv')
         variables.append('vort')
     if 'magx' in variables and 'magy' in variables and 'magz' in variables:
         variables.append('magn')
         variables.append('divb')
         variables.append('curr')
     if (eqsys == 'hd' or eqsys == 'mhd') and eos == 'adi' \
                                          and 'pres' in variables:
         variables.append('eint')
     if 'dens' in variables and 'pres' in variables:
         variables.append('temp')
     if (eqsys == 'hd' or eqsys == 'mhd') \
                      and 'dens' in variables \
                      and 'velx' in variables \
                      and 'vely' in variables \
                      and 'velz' in variables:
         variables.append('ekin')
     if (eqsys == 'mhd' or eqsys == 'srmhd') \
                       and 'magx' in variables \
                       and 'magy' in variables \
                       and 'magz' in variables:
         variables.append('emag')
     if eqsys == 'hd' and 'ekin' in variables and 'eint' in variables:
         variables.append('etot')
     if eqsys == 'mhd' and 'eint' in variables \
                       and 'ekin' in variables \
                       and 'emag' in variables:
         variables.append('etot')
     if (eqsys == 'srhd' or eqsys == 'srmhd') and 'velo' in variables:
         variables.append('lore')

     # check if the requested variable is in the variable list
     #
     if not vname in variables:
         print('The requested variable cannot be extracted from the file datasets!')
-        return False
+        return None

     # check if the shrink parameter is correct (block dimensions should be
     # divisible by the shrink factor)
     #
     shrink = max(1, int(shrink))
     if shrink > 1:
         if (nn % shrink) != 0:
             print('The block dimension should be divisible by the shrink factor!')
-            return False
+            return None
         sh = shrink
         while(sh > 2 and sh % 2 == 0):
             sh = int(sh / 2)
         if (sh % 2) != 0:
             print('The shrink factor should be a power of 2!')
-            return False
+            return None

     # determine the actual maximum level from the blocks
     #
-    ml = 0
+    levs = []
     for n in range(nc):
         fname = 'p%06d_%05d.h5' % (nr, n)
         lname = op.join(dname, fname)
         dblocks = amun_attribute(lname, 'dblocks')
         if dblocks > 0:
-            levels = amun_coordinate(lname, 'levels')
-            ml = max(ml, levels.max())
+            levs = np.append(levs, [amun_coordinate(lname, 'levels')])
+    ml = int(levs.max())

     # prepare dimensions of the output array and allocate it
     #
     dm = np.array(rm[0:ndims] * bm[0:ndims] * 2**(ml - 1) / shrink, \
                                                           dtype = np.int32)
     ret = np.zeros(dm[::-1])

     # iterate over all subdomain files
     #
     nb = 0
     for n in range(nc):
         fname = 'p%06d_%05d.h5' % (nr, n)
         lname = op.join(dname, fname)
         dblocks = amun_attribute(lname, 'dblocks')
         if dblocks > 0:
             levels = amun_coordinate(lname, 'levels')
             coords = amun_coordinate(lname, 'coords')
             dx = amun_coordinate(lname, 'dx')
             dy = amun_coordinate(lname, 'dy')
             dz = amun_coordinate(lname, 'dz')
-            f = h5.File(lname, 'r')
+            with h5.File(lname, 'r') as f:
                 g = f['variables']
                 if vname == 'level':
                     dataset = np.zeros(g[variables[0]].shape)
@@ -451,8 +416,6 @@ def amun_dataset(fname, vname, shrink = 1, progress = False):
                 else:
                     dataset = g[vname][:,:,:,:]

-            f.close()
-
             # rescale all blocks to the effective resolution
             #
             for l in range(dblocks):
@@ -479,13 +442,9 @@ def amun_dataset(fname, vname, shrink = 1, progress = False):
                                        % (vname, fname, nb, nl))
                 sys.stdout.flush()

     if (progress):
         sys.stdout.write('\n')
         sys.stdout.flush()

-    except:
-        print("Dataset '%s' cannot be retrieved from '%s'!" % (vname, fname))
-        ret = False
-
     return ret

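Putting the helpers together, a minimal read sketch against the updated interface (the file name and the variable 'dens' are assumptions; any dataset present in the snapshot, or one of the derived variables registered above, would work; `shrink` must divide the block size and be a power of 2):

```python
# Reassemble one variable on the uniform effective-resolution grid.
from amun import amun_dataset

dens = amun_dataset('p000010_00000.h5', 'dens', shrink=2, progress=True)
if dens is not None:
    print('reconstructed array shape:', dens.shape)
```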

python/setup.py (new file, 13 lines)
@@ -0,0 +1,13 @@
+from setuptools import setup
+
+setup(
+    name='amun',
+    description='Python Interface for AMUN snapshots',
+    version='0.1',
+    author='Grzegorz Kowal',
+    author_email='grzegorz@amuncode.org',
+    url='https://www.amuncode.org/',
+    license='GPLv3',
+    py_modules=['amun'],
+    install_requires=['h5py', 'numpy']
+)
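With this setup script the interface installs as a single module; a sketch, assuming `pip install .` has been run inside python/ (which also pulls in the declared h5py and numpy dependencies):

```python
# After installation the module is importable directly.
import amun

print([name for name in dir(amun) if name.startswith('amun_')])
```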