Merged
57 commits
7b16947
introduces mid level interface #37
BenjaminRodenberg Jun 10, 2019
3881299
Use higher order cubic interpolation for more accurate results.
BenjaminRodenberg Jul 23, 2019
90ecfab
allow access to different interpolation strategies
BenjaminRodenberg Jul 26, 2019
51ba5fd
Replacing all API functions to new python_bindings structure
IshaanDesai Dec 13, 2019
43d3dd5
Bringing branch to current develop state
IshaanDesai Dec 13, 2019
0581091
First design of new mid-level interface style
IshaanDesai Dec 19, 2019
c64d184
Adding more mid-level functions
IshaanDesai Dec 19, 2019
6861089
Removing all helper functions from main Adapter class
IshaanDesai Dec 19, 2019
f7e9bfd
Removing read and write fields and formatting new class AdapterCore
IshaanDesai Dec 27, 2019
19199a6
Design changes with the perspective of HT/fenics-fenics tutorial
IshaanDesai Dec 27, 2019
af415fa
Reformulating inheritance order and minor editing
IshaanDesai Dec 27, 2019
b515d2f
Changing function signatures
IshaanDesai Dec 28, 2019
c4d787c
Merge branch 'develop' into introduceMidLevelInterface
IshaanDesai Dec 28, 2019
7a1a0dc
Fixing indentation
IshaanDesai Dec 28, 2019
a67d211
Adding more interface functions and changing core function signatures
IshaanDesai Dec 28, 2019
810b891
Changing order of read and write functions in initialization
IshaanDesai Dec 30, 2019
fefd4ad
Resolving minor bugs and adding interpolation type function
IshaanDesai Jan 7, 2020
9f7e63a
Partially fixing mock tests and other minor bug fixes
IshaanDesai Jan 8, 2020
b4c6c39
Renaming and changing functions to improve control flow
IshaanDesai Jan 12, 2020
4f49e09
Changing from specific boundary condition handling to a generalised way
IshaanDesai Jan 20, 2020
635142c
Adapter Core now contains only helper functions and fixing one test
IshaanDesai Jan 22, 2020
71da182
Changing checkpointing handling and minor editing
IshaanDesai Jan 28, 2020
500a0c3
Updating to preCICE v2
IshaanDesai Mar 3, 2020
1ea4f62
Changing the way boundary conditions are updated
IshaanDesai Mar 4, 2020
cffad10
Adding initialize_data function and removing old boundary functions
IshaanDesai Mar 21, 2020
1abd5f4
Saving a copy of u function in solver state
IshaanDesai Apr 6, 2020
3425c84
Read Data and Write Data arrays are now local
IshaanDesai Apr 13, 2020
63f0aa1
fenics mesh and vertices changed to only local variables
IshaanDesai Apr 17, 2020
a3529ab
Fix checkpointing test
IshaanDesai Apr 25, 2020
cd264c1
Changing boundary conditions updating and fixing mock tests partially
IshaanDesai May 1, 2020
1b5ee0a
Editing documentation
IshaanDesai May 2, 2020
c896728
New point source functions to implement FSI cases
IshaanDesai May 5, 2020
4ce10c6
Minor debugging
IshaanDesai May 5, 2020
9c7d5fd
Merge branch 'develop' into introduceMidLevelInterface
IshaanDesai May 8, 2020
51b7d98
Moving mesh setup within initialize()
IshaanDesai May 9, 2020
d1167d6
Simplifying design by replacing variables with API calls
IshaanDesai May 11, 2020
6a5132c
Moving Expression classes into separate file
IshaanDesai May 14, 2020
eae1f21
Changing initialization so that precice.initializa_data() call is not…
IshaanDesai May 15, 2020
63bb1d7
Moving interpolation strategy selection into Adapter constructor
IshaanDesai May 15, 2020
b180bb5
Adding warning when no interpolation strategy is selected
IshaanDesai May 16, 2020
2e44c52
Removing read_function variable
IshaanDesai May 16, 2020
da4f727
Numpy docstring style documentation and removing redundant functions
IshaanDesai May 20, 2020
e5adc6f
Fixing read and write mock tests
IshaanDesai May 20, 2020
7d16eeb
Adding mock test for Expression handling and more documentation
IshaanDesai May 22, 2020
8907ea2
Changing read and write function names and minor documentation
IshaanDesai May 27, 2020
5db22f3
initialize_data function is now not mandatory
IshaanDesai May 27, 2020
5b2055e
Changing documentation and code clean up
IshaanDesai May 29, 2020
7874111
Changes in documentation and other minor stuff
IshaanDesai May 31, 2020
525a823
Adding safeguard for input write_function
IshaanDesai May 31, 2020
108d134
Merge branch 'develop' into introduceMidLevelInterface
IshaanDesai Jun 4, 2020
ddcf12d
Implementing last review things and adding documentation
IshaanDesai Jun 5, 2020
fe23d5d
Fix initialization of expression in test.
BenjaminRodenberg Jun 5, 2020
901672b
Determine read_function_type using function_space
IshaanDesai Jun 5, 2020
a7408a3
Further modifying Expression mock test: still not working
IshaanDesai Jun 8, 2020
5213f9d
Fixing Expression handling test
IshaanDesai Jun 9, 2020
e36ef05
Setting correct version of pyprecice
IshaanDesai Jun 9, 2020
aca0a6c
Disabling Windows and OSx tests as fenics dependency not satisfied
IshaanDesai Jun 10, 2020
43 changes: 22 additions & 21 deletions .travis.yml
@@ -22,18 +22,18 @@ matrix:
- conda create -n fenicsproject -c conda-forge fenics # install fenics via conda. See https://fenicsproject.org/download/
- conda activate fenicsproject

- os: windows
name: "Mock tests [Windows]"
# does not support python
language: bash
env: PY=py
python: "3.7"
before_install: choco install python --version 3.7.5
install:
- $PY -m pip install --upgrade pip
- $PY -m pip install scipy
- $PY -m pip install --upgrade setuptools
- $PY -m pip install wheel
#- os: windows
#name: "Mock tests [Windows]"
## does not support python
#language: bash
#env: PY=py
#python: "3.7"
#before_install: choco install python --version 3.7.5
#install:
#- $PY -m pip install --upgrade pip
#- $PY -m pip install scipy
#- $PY -m pip install --upgrade setuptools
#- $PY -m pip install wheel

- os: linux
name: "Systemtests"
@@ -44,15 +44,16 @@ matrix:
- curl -LO --retry 3 https://raw.githubusercontent.com/precice/systemtests/master/trigger_systemtests.py
- travis_wait 60 $PY trigger_systemtests.py --adapter fenics --wait

- os: osx
name: "Mock tests [OSx]"
# does not support python
language: generic
env: PY=python3
install:
- pip3 install scipy
#- os: osx
#name: "Mock tests [OSx]"
## does not support python
#language: generic
#env: PY=python3
#install:
#- pip3 install scipy

script:
- mkdir -p precice && echo -e "from setuptools import setup\nsetup(name='pyprecice', version='0.1.0')" > precice/setup.py
- mkdir -p precice && echo -e "from setuptools import setup\nsetup(name='pyprecice', version='2.0.2.1')" > precice/setup.py
- $PY -m pip install ./precice/
- if [ "$TRAVIS_OS_NAME" = "linux" ]; then $PY setup.py test ; else $PY setup.py test -s tests.test_fenicsadapter ; fi
#- if [ "$TRAVIS_OS_NAME" = "linux" ]; then $PY setup.py test ; else $PY setup.py test -s tests.test_fenicsadapter ; fi
- $PY setup.py test
5 changes: 5 additions & 0 deletions README.md
@@ -37,6 +37,11 @@ As a first test, try to import the adapter via `python3 -c "import fenicsadapter

You can run the other tests via `python3 setup.py test`.

Single tests can also be run. For example, the test `test_vector_write` in the file `test_write_read.py` can be run as follows:
```
python3 -m unittest tests.test_write_read.TestWriteandReadData.test_vector_write
```

## Use the adapter

Add ``from fenicsadapter import Adapter`` in your FEniCS code. Please refer to the examples in the [tutorials repository](https://github.com/precice/tutorials) for usage examples:
3 changes: 1 addition & 2 deletions fenicsadapter/__init__.py
@@ -1,2 +1 @@
from .fenicsadapter import Adapter, GeneralInterpolationExpression, ExactInterpolationExpression

from .fenicsadapter import Adapter
285 changes: 285 additions & 0 deletions fenicsadapter/adapter_core.py
@@ -0,0 +1,285 @@
"""
This module consists of helper functions used in the Adapter class. Names of the functions are self-explanatory.
"""

import dolfin
from dolfin import SubDomain, Point, PointSource
from fenics import FunctionSpace, VectorFunctionSpace, Function
import numpy as np
from enum import Enum
import logging

logger = logging.getLogger(__name__)
logger.setLevel(level=logging.INFO)


class FunctionType(Enum):
"""
Defines scalar- and vector-valued function.
Used in assertions to check if a FEniCS function is scalar or vector.
"""
SCALAR = 0 # scalar valued function
VECTOR = 1 # vector valued function


def determine_function_type(input_obj):
"""
Determines if the function is scalar- or vector-valued based on rank evaluation.

Parameters
----------
input_obj : FEniCS Function or FunctionSpace
The object whose scalar or vector nature is determined.

Returns
-------
tag : FunctionType
FunctionType.SCALAR if the input is scalar-valued, FunctionType.VECTOR if it is vector-valued.
"""
if type(input_obj) == FunctionSpace:  # scalar-valued functions have rank 0 in FEniCS
if input_obj.num_sub_spaces() == 0:
return FunctionType.SCALAR
elif input_obj.num_sub_spaces() == 2:
return FunctionType.VECTOR
elif type(input_obj) == Function:
if input_obj.value_rank() == 0:
return FunctionType.SCALAR
elif input_obj.value_rank() == 1:
return FunctionType.VECTOR
else:
raise Exception("Error determining type of given dolfin Function")
else:
raise Exception("Error determining type of given dolfin FunctionSpace")
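A minimal sketch of the rank check above, using a hypothetical stub in place of a real `dolfin.Function` (a real `Function` exposes `value_rank()` with the same call shape):

```python
from enum import Enum


class FunctionType(Enum):
    SCALAR = 0  # scalar-valued function
    VECTOR = 1  # vector-valued function


class StubFunction:
    """Hypothetical stand-in for dolfin.Function; only value_rank() is mimicked."""

    def __init__(self, rank):
        self._rank = rank

    def value_rank(self):
        return self._rank


def function_type_from_rank(func):
    # Same rank-based decision as determine_function_type for Function inputs
    if func.value_rank() == 0:
        return FunctionType.SCALAR
    elif func.value_rank() == 1:
        return FunctionType.VECTOR
    raise Exception("Error determining type of given dolfin Function")


print(function_type_from_rank(StubFunction(0)))  # FunctionType.SCALAR
print(function_type_from_rank(StubFunction(1)))  # FunctionType.VECTOR
```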


def filter_point_sources(point_sources, filter_out):
"""
Filter a dictionary of PointSources (point_sources) with respect to a given domain (filter_out). If a PointSource
is applied at a point inside the given domain (filter_out), it is removed from the dictionary.

Parameters
----------
point_sources : python dictionary
Dictionary containing coordinates and associated PointSources {(point_x, point_y): PointSource, ...}.
filter_out: FEniCS domain
Defines the domain where PointSources should be filtered out.

Returns
-------
filtered_point_sources : python dictionary
A dictionary with the filtered PointSources.
"""
filtered_point_sources = dict()

for point in point_sources.keys():
# Filter double boundary points to avoid instabilities and create PointSource
if filter_out.inside(point, 1):
print("Found a double-boundary point at {location}.".format(location=point))
else:
filtered_point_sources[point] = point_sources[point]

return filtered_point_sources
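The filtering logic can be exercised without dolfin; `StubDomain` below is a hypothetical stand-in whose `inside(point, on_boundary)` mirrors the call shape of `dolfin.SubDomain.inside`:

```python
class StubDomain:
    """Hypothetical stand-in for a dolfin SubDomain fixed at x == 0."""

    def inside(self, point, on_boundary):
        return point[0] == 0.0


def filter_point_sources(point_sources, filter_out):
    # Keep only PointSources applied outside the filter_out domain
    filtered = {}
    for point, source in point_sources.items():
        if filter_out.inside(point, True):
            print("Found a double-boundary point at {location}.".format(location=point))
        else:
            filtered[point] = source
    return filtered


sources = {(0.0, 0.0): "ps_a", (1.0, 0.5): "ps_b", (0.0, 1.0): "ps_c"}
kept = filter_point_sources(sources, StubDomain())
print(sorted(kept))  # [(1.0, 0.5)]
```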


def convert_fenics_to_precice(data, sample_points):
"""
Converts data of type dolfin.Function into Numpy array for all x and y coordinates on the boundary.

Parameters
----------
data : FEniCS function
A FEniCS function referring to a physical variable in the problem.
sample_points : array_like
The coordinates of the vertices in a numpy array [N x D] where
N = number of vertices and D = dimensions of geometry.

Returns
-------
array : array_like
Array of FEniCS function values at each point on the boundary.
"""
if type(data) is dolfin.Function:
x_all, y_all = sample_points[:, 0], sample_points[:, 1]
return np.array([data(x, y) for x, y in zip(x_all, y_all)])
else:
raise Exception("Cannot handle data type %s" % type(data))
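Stripped of the dolfin type check, the conversion is just point-wise sampling; any callable `f(x, y)` stands in for a `dolfin.Function`, which is likewise callable at coordinates (the field below is a made-up example):

```python
import numpy as np


def sample_at_points(data, sample_points):
    # Evaluate the callable at every (x, y) boundary coordinate, as in
    # convert_fenics_to_precice
    x_all, y_all = sample_points[:, 0], sample_points[:, 1]
    return np.array([data(x, y) for x, y in zip(x_all, y_all)])


f = lambda x, y: x + 10 * y  # hypothetical scalar field
pts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 2.0]])
values = sample_at_points(f, pts)
print(values)  # [ 0.  1. 21.]
```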


def get_coupling_boundary_vertices(mesh_fenics, coupling_subdomain, fenics_dimensions, dimensions):
"""
Extracts vertices from a given mesh which lie on the given coupling domain.

Parameters
----------
mesh_fenics : FEniCS Mesh
Mesh of complete domain.
coupling_subdomain : FEniCS Domain
Subdomain consisting of only the coupling interface region.
fenics_dimensions : int
Dimensions of FEniCS case setup.
dimensions : int
Dimensions of coupling case setup.

Returns
-------
fenics_vertices : numpy array
Array consisting of all vertices lying on the coupling interface.
coordinates : array_like
The coordinates of the vertices in a numpy array [N x D] where
N = number of vertices and D = dimensions of geometry.
n : int
Number of vertices on the coupling interface.
"""
n = 0
fenics_vertices = []
vertices_x = []
vertices_y = []
vertices_z = []

if not issubclass(type(coupling_subdomain), SubDomain):
raise Exception("No correct coupling interface defined! Given coupling domain is not of type dolfin SubDomain")

for v in dolfin.vertices(mesh_fenics):
if coupling_subdomain.inside(v.point(), True):
n += 1
fenics_vertices.append(v)
vertices_x.append(v.x(0))
if dimensions == 2:
vertices_y.append(v.x(1))
elif fenics_dimensions == 2 and dimensions == 3:
vertices_y.append(v.x(1))
vertices_z.append(0)
else:
raise Exception("Dimensions of coupling problem (dim={}) and FEniCS setup (dim={}) do not match!"
.format(dimensions, fenics_dimensions))

assert (n != 0), "No coupling boundary vertices detected"

if dimensions == 2:
return fenics_vertices, np.stack([vertices_x, vertices_y], axis=1)
elif dimensions == 3:
return fenics_vertices, np.stack([vertices_x, vertices_y, vertices_z], axis=1)
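The 2D extraction path can be sketched on plain coordinate tuples; a real run iterates `dolfin.vertices(mesh)`, and `on_interface` below is a hypothetical stand-in for `coupling_subdomain.inside`:

```python
import numpy as np


def get_boundary_coordinates(points, on_interface):
    # Collect coordinates of points lying on the coupling interface (2D case)
    xs, ys = [], []
    for p in points:
        if on_interface(p):
            xs.append(p[0])
            ys.append(p[1])
    assert xs, "No coupling boundary vertices detected"
    return np.stack([xs, ys], axis=1)  # [N x 2] array, as returned to preCICE


# 3x3 grid of points; the interface is the right edge x == 1.0
grid = [(x * 0.5, y * 0.5) for x in range(3) for y in range(3)]
coords = get_boundary_coordinates(grid, lambda p: p[0] == 1.0)
print(coords.shape)  # (3, 2)
```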


def are_connected_by_edge(v1, v2):
"""
Checks if vertices are connected by an edge.

Parameters
----------
v1 : dolfin.vertex
Vertex 1 of the edge
v2 : dolfin.vertex
Vertex 2 of the edge

Returns
-------
tag : bool
True if v1 and v2 are connected by an edge, False otherwise.
"""
for edge1 in dolfin.edges(v1):
for edge2 in dolfin.edges(v2):
if edge1.index() == edge2.index(): # Vertices are connected by edge
return True
return False


def get_coupling_boundary_edges(mesh_fenics, coupling_subdomain, id_mapping):
"""
Extracts edges of mesh which lie on the coupling boundary.

Parameters
----------
mesh_fenics : FEniCS Mesh
FEniCS mesh of the complete region.
coupling_subdomain : FEniCS Domain
FEniCS domain of the coupling interface region.
id_mapping : python dictionary
Dictionary mapping preCICE vertex IDs to FEniCS global vertex IDs.

Returns
-------
vertices1_ids : numpy array
Array of first vertex of each edge.
vertices2_ids : numpy array
Array of second vertex of each edge.
"""
vertices = dict()

for v1 in dolfin.vertices(mesh_fenics):
if coupling_subdomain.inside(v1.point(), True):
vertices[v1] = []

for v1 in vertices.keys():
for v2 in vertices.keys():
if are_connected_by_edge(v1, v2):
vertices[v1] = v2
vertices[v2] = v1

vertices1_ids = []
vertices2_ids = []

for v1, v2 in vertices.items():
if v1 is not v2:
vertices1_ids.append(id_mapping[v1.global_index()])
vertices2_ids.append(id_mapping[v2.global_index()])

vertices1_ids = np.array(vertices1_ids)
vertices2_ids = np.array(vertices2_ids)

return vertices1_ids, vertices2_ids


def get_forces_as_point_sources(fixed_boundary, function_space, coupling_mesh_vertices, data):
"""
Create two dicts of PointSources that can be applied to the assembled system. Applies filter_point_sources to
avoid forces being applied at points with an existing Dirichlet BC, since this would lead to an overdetermined
system that cannot be solved.

Parameters
----------
fixed_boundary : FEniCS domain
FEniCS domain consisting of a fixed boundary condition. For example in FSI cases usually the solid body is fixed
at one end.
function_space : FEniCS function space
Function space on which the finite element problem definition lives.
coupling_mesh_vertices : numpy.ndarray
The coordinates of the vertices on the coupling interface. Coordinates of vertices are stored in a
numpy array [N x D] where N = number of vertices and D = dimensions of geometry
data : PointSource
FEniCS PointSource data carrying forces

Returns
-------
x_forces : list
List of PointSources carrying the X component of the forces at each point on the coupling interface.
y_forces : list
List of PointSources carrying the Y component of the forces at each point on the coupling interface.

"""
# PointSources are scalar valued, therefore we need an individual scalar valued PointSource for each dimension in a vector-valued setting
# TODO: a vector valued PointSource would be more straightforward, but does not exist (as far as I know)
Collaborator: Move the todo somewhere else so it does not get lost/forgotten?

Contributor Author: I totally agree. I think we should create an issue for this todo.
x_forces = dict() # dict of PointSources for Forces in x direction
y_forces = dict() # dict of PointSources for Forces in y direction

# Check for shape of coupling_mesh_vertices and raise Assertion for 3D
n_vertices, dims = coupling_mesh_vertices.shape

assert dims == 2, "This Adapter can create Point Sources only from 2D data, 3D data was supplied"

vertices_x = coupling_mesh_vertices[:, 0]
vertices_y = coupling_mesh_vertices[:, 1]

for i in range(n_vertices):
px, py = vertices_x[i], vertices_y[i]
key = (px, py)
x_forces[key] = PointSource(function_space.sub(0), Point(px, py), data[i, 0])
y_forces[key] = PointSource(function_space.sub(1), Point(px, py), data[i, 1])

# Avoid application of PointSource and Dirichlet boundary condition at the same point by filtering
x_forces = filter_point_sources(x_forces, fixed_boundary)
y_forces = filter_point_sources(y_forces, fixed_boundary)

return x_forces.values(), y_forces.values() # don't return dictionary, but list of PointSources
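The split-and-filter step can be sketched without dolfin; the per-point scalar entries below stand in for `PointSource(function_space.sub(i), Point(px, py), value)` objects, and the fixed boundary is assumed (hypothetically) to be the edge x == 0:

```python
vertices = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]      # coupling interface points
forces = [(1.0, 2.0), (3.0, 4.0), (5.0, 6.0)]        # [N x 2] force data

# One scalar entry per dimension, mirroring the per-component PointSources
x_forces, y_forces = {}, {}
for (px, py), (fx, fy) in zip(vertices, forces):
    x_forces[(px, py)] = fx  # stands in for PointSource(V.sub(0), Point(px, py), fx)
    y_forces[(px, py)] = fy  # stands in for PointSource(V.sub(1), Point(px, py), fy)

# Filter out points on the fixed boundary (x == 0) to avoid clashing with
# Dirichlet BCs, as filter_point_sources does
x_forces = {p: f for p, f in x_forces.items() if p[0] != 0.0}
y_forces = {p: f for p, f in y_forces.items() if p[0] != 0.0}
print(sorted(x_forces.values()))  # [3.0, 5.0]
```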

31 changes: 0 additions & 31 deletions fenicsadapter/checkpointing.py

This file was deleted.
