
1 Introduction

The unique capabilities of mathematical modeling combined with the growing performance of supercomputers open new horizons in biomedical research. One of the directions of such studies is the development of a personalized, biologically relevant 3D computer model of the human heart. Such a model has many practical applications: the extraction of new knowledge about the processes in the heart, the study of drug effects, the generation of synthetic data for the development and tuning of novel diagnostic methods, and the construction of realistic training simulators for educational purposes. Numerical modeling of processes in the human heart using supercomputers is being actively pursued by many research groups [1,2,3,4,5,6,7,8,9,10,11,12]. Current research areas include methods for reconstructing a personalized 3D model of the heart and vessels from computed tomography, automated segmentation of CT images, and the creation of mathematical and computer electrophysiological and electromechanical models of the whole human heart and its basic elements. The development of methods for the efficient use of supercomputers in numerical modeling of processes in the heart is one of the central topics of current research.

In this paper, we present CardioModel – new software for numerical modeling of human heart electrophysiology on supercomputers. The main motivation for developing this software is the abundance of open scientific problems in two main areas: (i) personalization and increasing the relevance of heart models, and (ii) increasing the efficiency of computational resource usage. Our software is based on the numerical solution of the monodomain or bidomain equations [14, 15] that describe electrical activity in the heart. We discretize the equations in time and space. Time discretization is performed using an explicit scheme. Space discretization is done by representing the patient’s heart tissue domain with a three-dimensional tetrahedral grid. The structure of the grid is given by a finite-element Atlas model of the human heart [28]. The Atlas model is anatomically segmented by marking the affiliation of the grid nodes to specific anatomical segments (left and right ventricles, atria, etc.) [29]. The conduction system of the heart is represented as a graph and is connected with the grid. To personalize the Atlas model, we merge the vertices of the tetrahedral mesh with an array of the patient’s heart CT or MRI data. To increase the reliability of simulation results, we employ biologically relevant cell models available from the CellML repository, taking into account the segmentation of the grid into specific regions. Our software is implemented in C++ and has a flexible object-oriented architecture, which allows us to easily extend the code with new cell models and numerical schemes that account for additional physical effects. We employ MPI to parallelize the code on distributed memory. The ParMetis library [16] is used to distribute the grid over the nodes of a supercomputer. To solve systems of linear algebraic equations (SLAE), we use iterative methods from the PETSc library [17] or implementations of direct methods developed at Intel [18] and at UNN [19]. The project builds on the experience gained earlier in the development of the Virtual Heart software [20].

The rest of the paper is organized as follows. Section 2 provides an overview of related work. Section 3 describes the mathematical model. The numerical scheme is formulated in Sect. 4. The parallel algorithm is given in Sect. 5. Section 6 provides an implementation overview. In Sect. 7 we present selected results of numerical simulations and discuss performance data.

2 Related Work

In 1993 the Physiome Project (http://physiomeproject.org) was launched, aiming at “a quantitative description of the physiological dynamics and functional behavior of the intact organism” [1]. The CellML model repository [2] is one of the most important components of the project. It contains computer implementations of mathematical models of human and animal cells for use in numerical simulations.

The cardiac computational modeling software Alya Red CCM [3] was created at the Barcelona Supercomputing Center (BSC). The model underlying the code includes the electrophysiological and electromechanical activity of the heart, as well as blood flow through its cavities. The electrophysiological and mechanical components are simulated on the same grid, and the modeling of blood flow is coupled with the mechanics of the heart. The rabbit ventricular mesh developed at the University of Oxford [4] is used in simulations. The tool is part of Alya, the multiphysics simulation code [5] developed by BSC.

Essential results in electrophysiology and electromechanics modeling of the heart have been obtained in the Computational Cardiology Lab at Johns Hopkins University [6]. The laboratory performs whole-heart simulations. Much attention is paid to applications of the model: the reproduction of normal and pathological cardiac activity. Studies are underway to personalize the heart model [7].

IBM research centers, in conjunction with the Lawrence Livermore National Laboratory, carry out the Cardioid Cardiac Modeling project. The software developed by the team simulates the heart at several levels: the electrophysiology of the heart tissue, the electromechanics of the whole heart, and also the modeling of tissue fibers [8].

In the REO laboratory of INRIA, a group of researchers is working on modeling the electrophysiology of the heart using a model based on the description of surfaces [9]. The proposed approach allows reproducing typical heart pathologies. The group also deals with the inverse problem of electrophysiology (determining the parameters of the model from a given cardiogram), the numerical modeling of cardiograms, and the creation of simplified models of electrophysiology.

Remarkable results in the numerical simulation of the heart have been achieved by a team of researchers at the University of Oxford. The team develops the Chaste software aimed at solving computationally intensive problems in biology and physiology on supercomputers [13]. Part of the software, Cardiac Chaste, is capable of simulating electromechanical processes in the heart. The software is publicly available.

Research in this direction is also done in the institutes of the Russian Academy of Sciences (RAS) and in universities. At the Institute of Numerical Mathematics of the RAS and MIPT, studies are underway to model the human cardiovascular system [10, 11]. Research on modeling the left ventricle of the human heart is performed at the Institute of Mathematics and Mechanics of the Ural branch of the RAS. The team proposed a new model of the left ventricle that allows personalization according to MRI data. The corresponding simulation software was also developed [12].

In general, there is a considerable interest in developing a biologically relevant personalized computer model of the human heart. One of the steps in this direction requires the development of parallel algorithms and their implementations capable of carrying out extremely computationally intensive calculations on supercomputers.

3 Model

The bidomain model is one of the most common continuous mathematical models describing the electrical activity of the heart. The model was proposed by Schmidt in 1969 [14] and was first formalized by Tung in 1978 [15]. The model assumes that the heart tissue consists of two continuous domains separated by a cell membrane: the inner space of the cells of the cardiac tissue (the intracellular domain) and the space between the cells (the extracellular domain). The interaction between the domains is described by the flux through the cell membrane. The description of each domain includes the relationship between the electric field obtained from the potential, the current density, and the conductivity tensor.

Next, we give a formal description of the bidomain model. Let some domain \( \Omega \) be filled with cardiac tissue. The tissue is considered as a set of two continuous domains, intracellular and extracellular, interacting through the cell membrane. Let \( \varvec{u} \) be a set of variables describing the tissue cell, \( \varphi_{m} \) and \( \varphi_{e} \) the intra- and extracellular potentials, \( V = \varphi_{m} - \varphi_{e} \) the transmembrane voltage, \( \sigma_{m} \) and \( \sigma_{e} \) the intra- and extracellular conductivity tensors, \( I_{m}^{st} \) and \( I_{e}^{st} \) the stimulus currents inside and outside the cell per unit area, \( \chi \) the surface area to volume ratio, \( C_{m} \) the membrane capacitance per unit area, and \( I_{ion} \) the ion current per unit area. The functions \( I_{ion} \) and \( \frac{\partial \varvec{u}}{\partial t} \) are defined by the cell model. The bidomain model is then given by the following set of equations:

$$ \begin{aligned} \chi \left( C_{m} \frac{\partial V}{\partial t} + I_{ion}\left( \varvec{u}, V \right) \right) - \nabla \cdot \left( \sigma_{m} \nabla \left( V + \varphi_{e} \right) \right) &= I_{m}^{st} \\ \nabla \cdot \left( \left( \sigma_{m} + \sigma_{e} \right) \nabla \varphi_{e} + \sigma_{m} \nabla V \right) &= - I_{m}^{st} - I_{e}^{st} \\ \frac{\partial \varvec{u}}{\partial t} &= f\left( \varvec{u}, V \right) \\ \varvec{n} \cdot \left( \sigma_{m} \nabla \left( V + \varphi_{e} \right) \right) = 0, \quad \varvec{n} \cdot \left( \sigma_{e} \nabla \varphi_{e} \right) &= 0 \end{aligned} $$
(1)

4 Method

We employ the following numerical scheme to advance the solution of (1) in time. The bidomain equations are discretized in time and space. Time discretization is performed using an explicit scheme. Space discretization is done by covering the heart tissue domain with a three-dimensional tetrahedral mesh; the finite element method is applied. The numerical solution of the equations is performed using the operator splitting method [24], which allows solving the linear and nonlinear parts of system (1) independently. This approach is commonly used to simulate electrophysiological activity [25,26,27]. According to the operator splitting method, the first equation of system (1) can be represented as follows: \( \frac{\partial V}{\partial t} = \left( \varGamma_{1} + \varGamma_{2} \right)V \), where \( \varGamma_{1} \) and \( \varGamma_{2} \) are the nonlinear and linear differential operators, respectively. In the bidomain model, \( \varGamma_{1} V = \frac{ - I_{ion} }{ C_{m} } \) and \( \varGamma_{2} V = \frac{\nabla \cdot \left( \sigma_{m} \nabla \left( V + \varphi_{e} \right) \right) + I_{m}^{st} }{\chi C_{m} } \). Then the solution of system (1) is calculated as follows.

  1. Integrate the ODE system (2) over time with a step size \( \frac{\varDelta t}{2} \), using the values computed at time \( t \) as initial conditions:

    $$ \begin{cases} \frac{\partial V}{\partial t} = \varGamma_{1} V \\ \frac{\partial \varvec{u}}{\partial t} = f\left( \varvec{u}, V \right) \end{cases} $$
    (2)
  2. Integrate the PDE \( \frac{\partial V}{\partial t} = \varGamma_{2} V \) over time with a step size \( \varDelta t \), using the values computed during step 1 as initial conditions.

  3. Integrate the ODE system (2) over time with a step size \( \frac{\varDelta t}{2} \), using the values computed during step 2 as initial conditions.

We employ the fourth order Runge–Kutta method to solve the ODE system, the finite element method to solve the PDE, and the Euler method for time discretization. The SLAE systems arising from the application of the finite element method are solved by means of appropriate iterative methods. Additional information on the numerical integration of the bidomain equations can be found in the reviews [21,22,23].
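To make the scheme concrete, the following minimal C++ sketch shows one Strang splitting step with a fourth order Runge–Kutta integrator for the ODE part. The `CellState`, `Iion`, `f`, and `SolveDiffusionPde` names are hypothetical placeholders, not the actual CardioModel API:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical cell state: transmembrane voltage V plus the model variables u.
struct CellState {
  double V;
  std::vector<double> u;
};

// Right-hand sides supplied by the chosen CellML cell model (placeholders).
double Iion(const CellState& s);            // ion current I_ion(u, V)
std::vector<double> f(const CellState& s);  // du/dt = f(u, V)

// One fourth order Runge-Kutta step of the nonlinear ODE system (2):
//   dV/dt = -I_ion(u, V) / C_m,   du/dt = f(u, V).
void Rk4Step(CellState& s, double dt, double Cm) {
  auto deriv = [&](const CellState& c, CellState& d) {
    d.V = -Iion(c) / Cm;
    d.u = f(c);
  };
  auto shifted = [](const CellState& s0, double a, const CellState& d) {
    CellState r = s0;
    r.V += a * d.V;
    for (std::size_t i = 0; i < r.u.size(); ++i) r.u[i] += a * d.u[i];
    return r;
  };
  CellState k1 = s, k2 = s, k3 = s, k4 = s;
  deriv(s, k1);
  deriv(shifted(s, dt / 2, k1), k2);
  deriv(shifted(s, dt / 2, k2), k3);
  deriv(shifted(s, dt, k3), k4);
  s.V += dt / 6 * (k1.V + 2 * k2.V + 2 * k3.V + k4.V);
  for (std::size_t i = 0; i < s.u.size(); ++i)
    s.u[i] += dt / 6 * (k1.u[i] + 2 * k2.u[i] + 2 * k3.u[i] + k4.u[i]);
}

// Placeholder for the linear PDE part solved with FEM (step 2 of the scheme).
void SolveDiffusionPde(std::vector<CellState>& cells, double dt);

// One full splitting step: half ODE step, full PDE step, half ODE step.
void StrangStep(std::vector<CellState>& cells, double dt, double Cm) {
  for (auto& c : cells) Rk4Step(c, dt / 2, Cm);  // step 1
  SolveDiffusionPde(cells, dt);                  // step 2
  for (auto& c : cells) Rk4Step(c, dt / 2, Cm);  // step 3
}
```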

The described approach allows us to find the solution of the bidomain equations. However, in order to fine-tune the model behavior, one has to adjust multiple parameter values in a large number of equations. Therefore, we developed a special scheme to simplify the tuning. We represent the conduction system (CS) of the heart as a graph connecting the sinoatrial node, the atrioventricular node, the left and right branches of the His bundle, and the Purkinje fibers. The vertices of the graph correspond to regions of the heart, and the edges show the presence of conduction pathways up to the terminal nodes. Additional graph vertices can be added to improve the quality of the model. Each vertex of the graph is associated with a biological cell model from the CellML repository, the coupling coefficients with other nodes of the conduction system, and also the region of influence on other cells of the heart.

Each cell in the conduction system is an oscillator with its own natural frequency. Tuning these frequencies along with the coupling strengths (gap junction conductances) allows achieving a globally synchronous regime in the conduction system with the desired pulse propagation delays on different segments of the graph.

The dynamics of the conduction system is simulated in parallel with the main solver pipeline. The cells of the conduction system in the heart are electrically coupled with the myocardium through gap junctions. This gives a way to connect the dynamics of the conduction system with the dynamics of the excitable cells of cardiac tissue obtained in the main solver pipeline. In particular, at each time step of the simulation the membrane potentials of the conduction system cells are coupled with the membrane potentials of the tissue cells through diffusive coupling terms. Each node of the conduction system graph has a parameter R that describes the area of tissue this node corresponds to. Each tissue cell located inside the sphere of radius R around the conduction system node has an additional term in the equation for the rate of change of the membrane potential: \( D_{t - cs} \left( V_{cs} - V_{t} \right) \), where \( V_{cs} \) and \( V_{t} \) are the membrane potentials of the conduction system cell and the tissue cell, respectively. Including the conduction system in the heart model allows us to fit the model behavior to theoretical expectations and to simulate typical heart pathologies. The corresponding simulation results are given in Sect. 7 below.
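A minimal sketch of this coupling, assuming hypothetical `CsNode` and `TissueCell` structures with positions already mapped onto the mesh (not the actual CardioModel data layout):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Point { double x, y, z; };

// Hypothetical conduction system node: membrane potential V_cs, influence
// radius R, coupling coefficient D_t-cs, and position mapped onto the mesh.
struct CsNode { double Vcs, R, Dtcs; Point pos; };

// Hypothetical tissue cell (mesh vertex): membrane potential and position.
struct TissueCell { double Vt; Point pos; };

double Dist(const Point& a, const Point& b) {
  return std::sqrt((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y) +
                   (a.z - b.z) * (a.z - b.z));
}

// Accumulate the diffusive coupling current I_cs = D_t-cs * (V_cs - V_t) for
// every tissue cell inside the sphere of radius R around a conduction system
// node; the caller zeroes Ics at the start of each time step.
void AddCsCoupling(const std::vector<CsNode>& cs,
                   const std::vector<TissueCell>& cells,
                   std::vector<double>& Ics) {
  for (const CsNode& n : cs)
    for (std::size_t i = 0; i < cells.size(); ++i)
      if (Dist(n.pos, cells[i].pos) <= n.R)
        Ics[i] += n.Dtcs * (n.Vcs - cells[i].Vt);
}
```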

5 Parallel Algorithm

The parallel algorithm propagates the voltage \( V \) and the potential \( \varphi_{e} \) over time. The main numerical scheme is as follows:

  1. Data initialization step. We read parameters from the configuration file, initialize MPI, allocate memory, and read and initialize the mesh, the voltage \( V\left( 0 \right) \), the potential \( \varphi_{e} \left( 0 \right) \), the cell states \( \varvec{u} \), and the currents \( I_{e}^{st} \) and \( I_{m}^{st} \).

  2. Mesh partitioning step. The mesh is partitioned by means of the parallel ParMetis software to reduce MPI communications during the main computational loop.

  3. Matrix assembly step. We generate the FEM matrices in parallel: the mass matrix and the stiffness matrix.

  4. Pre-compute step. Each MPI process solves the ODE system (2) with the time step \( \frac{\varDelta t}{2} \) by the fourth order Runge–Kutta method for the heart cells located at the mesh vertices belonging to the process. The initial conditions are set to default values (rest states for the tissue models) or are loaded from a file.

  5. Main computational loop. During this step we propagate the voltage, the potential, and the cell states over time by means of the operator splitting method with a time step \( \varDelta t \). Every iteration of this loop is fully parallel and proceeds as follows (a skeleton of the loop is sketched after this list):

    a. The software calculates the diffusion term \( \nabla \cdot \left( \sigma_{m} \nabla V \right) \) for the voltage \( V\left( t \right) \) by the finite element method. This requires solving a sparse SLAE by an iterative or direct method. Both approaches make it possible to find the solution of the system in parallel on distributed memory. In most experiments we use iterative methods, with appropriate preconditioners if necessary.

    b. We compute the total current \( I_{sum} = I_{m}^{st} + I_{e}^{st} + \nabla \cdot \left( \sigma_{m} \nabla V \right) \) in parallel.

    c. We solve the Poisson equation \( \nabla \cdot \left( \left( \sigma_{m} + \sigma_{e} \right)\nabla \varphi_{e} \right) = - I_{sum} \) with respect to \( \varphi_{e} \left( t + \varDelta t \right) \) by the finite element method. We solve a sparse SLAE in parallel.

    d. The software computes the diffusion term \( \nabla \cdot \left( \sigma_{m} \nabla \varphi_{e} \right) \), where \( \varphi_{e} \) has been found at the previous substep. We employ the finite element method and solve a sparse SLAE in parallel.

    e. For each cell (mesh vertex) that is coupled with a conduction system node, we compute the additional current term \( I_{cs} = D_{t - cs} \left( V_{cs} - V_{t} \right) \).

    f. Each MPI process solves the ODE \( \frac{\partial V}{\partial t} = \frac{\nabla \cdot \left( \sigma_{m} \nabla \left( V + \varphi_{e} \right) \right) + I_{m}^{st} + I_{cs} }{\chi C_{m} } \) with respect to the voltage \( V \) by the Euler method with the time step \( \varDelta t \). The results of the computation at step a are used as initial conditions.

    g. Each MPI process solves the ODEs describing the dynamics of the conduction system by the fourth order Runge–Kutta method with the time step \( \varDelta t \).

    h. Each MPI process solves the ODE system (2) with the time step \( \varDelta t \) by the fourth order Runge–Kutta method for the heart cells located at the mesh vertices belonging to the process. The initial conditions are the results of calculating \( V \) at step f.

  6. Post-compute step. Each MPI process solves the ODE system (2) with the time step \( \frac{\varDelta t}{2} \) by the fourth order Runge–Kutta method for the heart cells located at the mesh vertices belonging to the process. The initial conditions are the results from the last step of the main computational loop.

  7. Finalization step. The results of the computations are stored, and the memory is released.
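A condensed C++ skeleton of the loop described above; every function name is an illustrative placeholder standing for the corresponding step, not the actual CardioModel API:

```cpp
// Forward declarations of the hypothetical step functions.
void Initialize();
void PartitionMesh();
void AssembleFemMatrices();
void SolveCellOdes(double dt);
void ComputeDiffusionV();
void ComputeTotalCurrent();
void SolvePoissonPhiE();
void ComputeDiffusionPhiE();
void ComputeCsCouplingTerms();
void EulerStepV(double dt);
void SolveCsOdes(double dt);
void Finalize();

void Run(int steps, double dt) {
  Initialize();                // step 1: config, MPI, mesh, V(0), phi_e(0), u
  PartitionMesh();             // step 2: ParMetis partitioning
  AssembleFemMatrices();       // step 3: mass and stiffness matrices
  SolveCellOdes(dt / 2);       // step 4: RK4 pre-compute half step
  for (int i = 0; i < steps; ++i) {
    ComputeDiffusionV();       // a: FEM solve for div(sigma_m grad V)
    ComputeTotalCurrent();     // b: I_sum
    SolvePoissonPhiE();        // c: FEM solve for phi_e
    ComputeDiffusionPhiE();    // d: FEM solve for div(sigma_m grad phi_e)
    ComputeCsCouplingTerms();  // e: I_cs for coupled vertices
    EulerStepV(dt);            // f: explicit update of V
    SolveCsOdes(dt);           // g: RK4 for the conduction system graph
    SolveCellOdes(dt);         // h: RK4 for the tissue cell models
  }
  SolveCellOdes(dt / 2);       // step 6: post-compute half step
  Finalize();                  // step 7: store results, release memory
}
```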

6 Implementation Overview

There are many approaches to heart activity simulation. Therefore, when designing the CardioModel software, we considered extensibility with new models and numerical schemes as one of the key requirements. In this regard, we used object-oriented methodology to design our application. The following main subsystems and their communication interfaces have been identified: Modeling of electrical activity of heart, Mathematical models, Configuration, Numerical schemes, Post-processing, and Graphical user interface. Thereby we can easily choose different cell models, try new SLAE solution methods and specific preconditioners, and add new modules for processing simulation results.
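As an illustration of this design, a cell model can be hidden behind an abstract interface; the following hypothetical C++ sketch uses our own names, not those of CardioModel:

```cpp
#include <vector>

// Hypothetical abstract interface: any CellML-derived cell model exposes its
// state size, the ion current I_ion(u, V), and the right-hand side f(u, V).
class ICellModel {
 public:
  virtual ~ICellModel() = default;
  virtual int StateSize() const = 0;
  virtual double Iion(const std::vector<double>& u, double V) const = 0;
  virtual void F(const std::vector<double>& u, double V,
                 std::vector<double>& dudt) const = 0;
};

// A numerical scheme works with any model through the interface, so swapping
// cell models does not touch the solver code.
void OdeRhs(const ICellModel& model, const std::vector<double>& u, double V,
            double Cm, double& dVdt, std::vector<double>& dudt) {
  dVdt = -model.Iion(u, V) / Cm;
  model.F(u, V, dudt);
}
```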

The software is implemented in the C++ programming language. Parallelization for distributed memory systems is performed using MPI. The ParMetis parallel library is used to distribute the mesh across the compute processes in an optimal manner. We employ distributed SLAE solvers and preconditioners from the PETSc library. The results of simulations are stored using the VTK library to ensure compatibility with the most common professional 3D visualization tools, for example, ParaView. We also implement storing the current calculation state and restarting from a previously saved checkpoint. The CMake software is used to provide the build infrastructure.
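For instance, writing a voltage snapshot for ParaView could look like the following sketch. It uses the standard VTK XML unstructured grid writer; the helper name and data layout are our assumptions:

```cpp
#include <string>
#include <vector>
#include <vtkDoubleArray.h>
#include <vtkPointData.h>
#include <vtkSmartPointer.h>
#include <vtkUnstructuredGrid.h>
#include <vtkXMLUnstructuredGridWriter.h>

// Hypothetical helper: attach the voltage field V to the tetrahedral mesh and
// store it as a .vtu file that ParaView can open directly.
void WriteVoltageSnapshot(vtkUnstructuredGrid* mesh,
                          const std::vector<double>& V, int step) {
  auto voltage = vtkSmartPointer<vtkDoubleArray>::New();
  voltage->SetName("V");
  for (double v : V) voltage->InsertNextValue(v);
  mesh->GetPointData()->AddArray(voltage);

  auto writer = vtkSmartPointer<vtkXMLUnstructuredGridWriter>::New();
  writer->SetFileName(("snapshot_" + std::to_string(step) + ".vtu").c_str());
  writer->SetInputData(mesh);
  writer->Write();
}
```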

7 Numerical Results

7.1 Application

The personalized three-dimensional tetrahedral finite element model of the human heart tissue is constructed by matching the vertices of a reference finite element Atlas model of the human heart with the patient’s heart tissue on a tomogram (Fig. 1).

Fig. 1. The data preparation cycle. (a) The Atlas model of the heart surface. (b) The unified tetrahedral mesh of the heart. (c) The segmented mesh. (d) 3D image data. (e) The heart segmented by machine learning methods. (f) The personalized model of the human heart.

The Atlas model is anatomically segmented by marking the affiliation of the mesh nodes to specific anatomical segments (left and right ventricles, atria, etc.). This markup is then transferred to the tomogram. The fitting is performed by the Coherent Point Drift algorithm, resulting in a FEM mesh of the heart which is a prerequisite for the simulation of electrical activity. The conduction system of the heart is represented as a graph and is connected with the mesh. The pre-segmented Atlas model is constructed from a three-dimensional surface model of the heart as follows. First, we employ the NetGen software to generate a tetrahedral grid of the heart bounded by its outer surface. Next, by means of smoothing we remove the cavities bounded by the inner surfaces of the chambers (atria and ventricles). Finally, we perform the segmentation of the finite-element mesh of the heart. At this step, the inner surfaces are expanded until they come into contact with elements of another compartment or with the outer surfaces. Our software performs fast validation of the vertices of the finite element mesh.
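Assuming the Coherent Point Drift registration yields a per-vertex displacement field, the final personalization step reduces to a sketch like the following (the structures and function are hypothetical):

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { double x, y, z; };

// Hypothetical final step of personalization: move every Atlas mesh vertex by
// the displacement recovered by Coherent Point Drift so the mesh matches the
// patient's heart geometry; the anatomical markup travels with the vertices.
void ApplyCpdDisplacements(std::vector<Vec3>& vertices,
                           const std::vector<Vec3>& displacement) {
  for (std::size_t i = 0; i < vertices.size(); ++i) {
    vertices[i].x += displacement[i].x;
    vertices[i].y += displacement[i].y;
    vertices[i].z += displacement[i].z;
  }
}
```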

Below we present some simulation results. First, we consider three regimes of the heart: the regular mode (Fig. 2a), atrial fibrillation due to a single spiral wave (Fig. 2b), and atrial fibrillation with spiral wave break-up and the formation of spiral chaos (Fig. 2c). The snapshots are taken at specific times corresponding to the initial, intermediate, and final parts of the simulation. The color scale corresponds to a change in the voltage V from −100 mV to 25 mV. In the bottom right inset one can see the wave front pattern computed with a threshold transform.

Fig. 2. Simulation results. (a) V in the regular mode. (b) Atrial fibrillation, a single spiral wave. (c) Atrial fibrillation, spiral chaos.

In the case of normal heart activation (Fig. 2a), one can see how the action potential originates in the sinoatrial node and propagates over the atria. After that, the excitation is delivered down to the ventricular apex through the His bundle and eventually travels up, covering the left and right ventricles with the help of the Purkinje fibers. The second set of results (Fig. 2b) demonstrates the simulation of atrial fibrillation, which is believed to be sustained by a spiral wave of the action potential that persists in the atrial medium and generates high-frequency pulses. The rate of excitation may reach 500 bpm due to the spiral wave curvature. At the same time, the AVN serves as a filter, allowing only a limited number of excitations to go down into the ventricles. Then (Fig. 2c) we demonstrate how the single spiral wave may become unstable and break up into a dynamic regime called spiral chaos, which is also considered to be a mechanism behind atrial fibrillation. One can observe how the increase in the wave width leads to the spiral instability and the consequent wave front break-up. Multiple spiral rotors appear in the atria. They are irregular and persistent and lead to incoherent high-frequency excitation.

Lastly (Fig. 3), we demonstrate the simulated electrical activity of the heart under infarction conditions. The area of infarction was introduced in the left ventricle of a heart that otherwise showed a normal activation pattern. The figure has three insets: the top left illustrates the 3D image of the electrical action potential with the colormap depicting the membrane voltage (mV). The snapshot is taken at the moment when the ventricles have finished depolarization. The area of infarction (and hence a conduction block) is clearly observed. The top right inset shows a snapshot of the dynamics of the conduction system at the moment the terminal pieces of the Purkinje fibers are excited. The time series in the inset below shows the electrical activity of significant cells of the conduction system: the sinoatrial node (SAN), the atrioventricular node (AVN), the terminal cells of the His bundle located at the apex of the heart (LH, RH), and the last (terminal) cells of the Purkinje fibers of both ventricles (LV, RV). The results are in good agreement with theoretical expectations.

Fig. 3. Simulation results. Electrical activity of the heart under infarction conditions.

7.2 Performance and Scalability

We assess the performance and scalability of the CardioModel software as follows. In the experiments we use a finite element mesh (Fig. 4) with 998 151 vertices and 5 140 810 tetrahedra. The left and right atria, the left and right ventricles, the fibrous tissue, and the ventricular septum are considered as separate compartments with relevant cell models. The conduction system is represented as a graph with 55 vertices mapped to the mesh and 54 edges.

Fig. 4. The segmented 1M mesh and the conduction system.

In the experiments we use 15 nodes of the Lobachevsky supercomputer. Every node is equipped with two Intel Sandy Bridge E5-2660 2.2 GHz CPUs (8 cores each) and 64 GB RAM, and runs CentOS 7.2, Intel Parallel Studio 2017, PETSc 3.8, and ParMetis 4.0.3.

First, we analyzed the performance of the code depending on the following factors: the distribution of the mesh between processes to reduce MPI communications; the SLAE solver selection; the SLAE preconditioner selection; and the proper selection of the starting point for the iterative methods. As expected, the use of ParMetis dramatically speeds up the calculation due to the reduction of MPI communications. The solution obtained at the previous iteration of the computational loop was chosen as the starting point for the iterative SLAE solvers. This straightforward strategy made it possible to significantly speed up the convergence of the methods. Large-scale computational experiments were carried out to select the most suitable method and preconditioner. As a result, the best results for the given accuracy were shown by the combination of the Flexible Generalized Minimal Residual method (FGMRES) with the GAMG preconditioner.
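A minimal sketch of this solver configuration using the PETSc API (the matrix A and vectors b, x are assumed to be already assembled and distributed; the tolerance matches Table 1):

```cpp
#include <petscksp.h>

// Configure the solver as selected in our experiments: FGMRES with the GAMG
// preconditioner, reusing the previous solution as the initial guess.
PetscErrorCode ConfigureAndSolve(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPFGMRES);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCGAMG);CHKERRQ(ierr);
  ierr = KSPSetTolerances(ksp, 1e-10, PETSC_DEFAULT, PETSC_DEFAULT,
                          PETSC_DEFAULT);CHKERRQ(ierr);
  // x holds the solution from the previous time step: use it as the start.
  ierr = KSPSetInitialGuessNonzero(ksp, PETSC_TRUE);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  return 0;
}
```

The same configuration can also be selected at run time with the PETSc command line options `-ksp_type fgmres -pc_type gamg`.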

Table 1 shows the run time of the algorithm depending on the number of MPI processes used. The first column shows the number of processes, and the second gives the run time of the ODE solver. The third and fourth columns give the time for solving the SLAE when calculating the diffusion and when solving the Poisson equation, respectively. The proportion of time spent in the other stages is negligible and is shown in the sixth column. Lastly, the total calculation time is presented.

Table 1. Performance results when propagating the bidomain equations. The number of time steps is set to 100. The PETSc tolerance parameter is equal to 1e–10. Time is given in seconds.

The results of the experiments show that the ODE solution stage scales well up to 120 processes. In contrast, the SLAE solution requires a significant amount of MPI communication, which leads to a decrease in the strong scaling efficiency of these stages. The overall strong scaling efficiency calculated from 8 to 64 processes is 96.2%. A further increase in the number of processes to 120 still speeds up the computations; the efficiency is 74.6% of the theoretical optimum.
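For reference, the strong scaling efficiency quoted above is computed in the standard way (our notation; \( T_{p} \) is the total run time on \( p \) processes):

$$ E_{p_{0} \to p} = \frac{p_{0} T_{p_{0}}}{p T_{p}}, \qquad \text{e.g.}\; E_{8 \to 64} = \frac{8 T_{8}}{64 T_{64}} = 0.962. $$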

8 Conclusion and Future Work

We presented the CardioModel software, intended for numerical modeling of human heart electrophysiology on supercomputers. The software uses a tetrahedral finite-element mesh of the heart as input data. The mesh is constructed from the patient’s computed tomography scan and is automatically segmented. Using this mesh, we integrate the bidomain equations describing the propagation of an electrical signal in cardiac tissue. The paper presents the results of numerical modeling of several regimes of the heart, including the development of pathologies. The software shows acceptable (75%) scaling efficiency when using 120 MPI ranks on a mesh of ~1 million vertices and ~5 million tetrahedra. Further developments include improving the personalization of the models and increasing the scalability of the software. We plan to develop custom load balancing schemes for use with different cell models.