Probabilistic analysis of long-term loss incorporating maximum entropy method and analytical higher-order moments

Quantifying the economic losses of civil infrastructures subjected to various hazards under a life-cycle context is of vital importance for risk assessment and management. In previous studies, the expected long-term loss has been widely applied as a standard decision criterion during life-cycle analysis. However, the expectation alone may not be informative enough to illustrate the uncertainties associated with the long-term loss. Therefore, the higher-order moments and the probability distribution should be investigated. In this paper, a probabilistic analysis framework is proposed to construct the probability density function and cumulative distribution function of long-term loss by assessing the analytical statistical moments. The stochastic renewal process is utilized to assess the long-term loss by considering uncertainties associated with the stochastic occurrence and frequency of the hazards. Based on the maximum entropy method, the proposed approach assesses the probability distribution of long-term loss far more efficiently than crude Monte Carlo simulation. The probability distribution can be essential information for the decision-making process of risk management. An illustrative example is investigated to show the probability density function of the long-term loss of civil infrastructure subjected to hurricane hazards. The good agreement between the results obtained by the proposed approach and Monte Carlo simulation verifies the accuracy and effectiveness of the proposed method.


Introduction
In recent decades, substantial financial and social losses caused by natural hazards have raised public awareness of risk management for civil infrastructures. Life-cycle performance and risk assessment of engineering structures have been key concerns of decision-makers seeking to mitigate the potential risk under hazards. Researchers have paid special attention to long-term loss estimation and risk management of civil infrastructure subjected to various natural hazards, such as earthquakes, hurricanes, and flooding [1][2][3]. In this context, the long-term loss refers to the cumulative financial cost of the civil infrastructure over its service life due to damage under hazards. The long-term loss assessment is also known as life-cycle risk assessment. Though the expected long-term loss has been widely utilized as a standard decision criterion, researchers have stated that the expectation is not fully informative and have moved away from expected losses to improve decision-making. For instance, Goda and Hong [4] proposed an optimal seismic design framework considering different risk attitudes of decision-makers based on stochastic dominance criteria. The cumulative prospect theory was incorporated to aid decision-making by considering probability distributions associated with hazard risk [5,6]. Furthermore, more studies have started to investigate the statistical parameters of the long-term loss by studying the standard deviation and higher-order moments. For instance, Pandey and Van der Weide [7] proposed an analytical renewal model to assess the life-cycle cost considering the expectation and standard deviation. Li et al. [8] showed the necessity of assessing higher-order moments of the long-term loss considering stationary and non-stationary hazard models; their study showed that the long-term loss under non-stationary hazards may have larger higher-order moments (i.e., skewness and kurtosis) compared with the stationary model.
Uncertainties associated with the stochastic occurrence and intensity of hazards are commonly modeled by stochastic processes. Stationary hazard arrivals can be modeled by stochastic processes with a stationary occurrence rate, e.g., the homogeneous Poisson process. Non-stationary hazards can be described by processes with time-varying or stochastic occurrence rates, e.g., the non-homogeneous Poisson process and the mixed Poisson process. In addition to statistical moments, the probability distribution of the long-term loss can also be significant information for the decision-making process. For instance, Pandey and Van der Weide [9] highlighted the importance of assessing the probability distribution of the seismic damage cost of engineering structures during the life-cycle. Their study shows that the full probability distribution provides a more realistic estimate of the potential damage cost. Wang and Zhang [10] proposed a probability-based loss estimation of structures subjected to tropical cyclone damage. Though these approaches successfully assessed the probability distribution, they may not be able to directly incorporate the statistical moments of long-term loss into the probabilistic assessment. Although Monte Carlo simulation (MCS) can be easily employed for estimating the distribution of long-term loss, it can be time-consuming under different scenarios since a large number of simulations are required for a single case. To address such a limitation, this study aims to deliver a probabilistic approach based on the maximum entropy method and statistical moments. The proposed approach provides substantial flexibility for the probabilistic assessment, as the statistical moments can be directly used to produce the probability density function (PDF) and cumulative distribution function (CDF) of long-term loss.
The maximum entropy method belongs to the family of so-called moment methods. Moment methods [11][12][13] are usually employed in reliability analysis to fit the underlying PDF of a random variable of interest by using a finite number of moments as constraints. Two main concerns when fitting an unknown PDF are the estimation of the statistical moments and the selection of an appropriate distribution model. For instance, the random variable of interest is usually the output of a complex system with various input variables, which makes statistical moments estimation intractable, especially for higher-order moments. To address this issue, dimension-reduction techniques are widely adopted to reduce the computational effort induced by the high-dimensional integration for higher-order moments, i.e., the so-called univariate dimension-reduction method [12] and the bivariate dimension-reduction method [13]. Besides dimension-reduction techniques, cubature rules [14,15] are also used for higher-order moments approximation. To further improve the efficiency and accuracy of moments estimation, adaptive dimension-reduction methods with delineation of cross terms [16] or sensitivity analysis [17,18] have been proposed. In this paper, the long-term loss of civil infrastructures is of great interest, and the statistical moments of long-term loss can be analytically obtained via the moment generating function (MGF) [8].
Once the statistical moments are obtained, the probability density function of a random variable can be recovered by a selected distribution model. Hence, another key issue is to select an appropriate distribution model. In recent years, various parametric probability distribution models have been proposed. A guideline for comparing different probability distribution models is summarized in reference [19], in which the merits and drawbacks of various approaches are summarized. The Pearson system [20] can be employed to construct the PDF of a random variable through a family of different probability distribution types with the first-four statistical moments as constraints. As described in [18], the Pearson system is adopted to construct the PDF of structural response due to its flexibility. Generally, the Pearson system can fit various types of PDFs. However, it may not produce stable results near the boundaries between different distribution types [21]. The Johnson system [22] contains three distribution types, i.e., the lognormal, bounded, and unbounded distributions, with a mathematical transformation function that maps these distributions into standard normal distributions. The Johnson system shows low calculation performance when determining unknown parameters from estimated statistical moments [23]. Moreover, both the Pearson system and the Johnson system may not be available for an unbounded bimodal PDF, even though the Johnson system provides a bounded and restricted bimodal PDF. Saddlepoint approximations [24] can construct the PDF of a random variable by using a few statistical moments to determine the cumulant generating function, but they may encounter numerical instability, and the latent PDF usually follows an exponential form, leading to inaccurate estimation for non-exponential PDF types [19]. The generalized lambda distribution is a flexible tool to fit different types of PDFs, but the accuracy may not be guaranteed in some regions of the skewness-kurtosis plane [25].
Recently, the shifted generalized lognormal distribution with the first-four statistical moments has been developed to fit a PDF within almost the entire range of skewness and kurtosis [26,27], although it requires complicated computation [28]. Some other probability distribution models, e.g., the Hermite model [29] and the cubic normal transformation [28,30], have also been adopted to fit unknown PDFs. Among the various moment approaches, the maximum entropy method (MEM), which originates from modern information theory, is considered the most unbiased one. The main idea of the MEM is to fit the underlying PDF by using the first-n statistical moments of a system output as constraints. The maximum entropy principle was developed by Jaynes [31] and can be used to select the most plausible PDF from a large number of candidates, since the process adds minimum spurious information. Shore and Johnson have proven that the MEM satisfies all the conditions of consistency while using only statistical moments as constraints [32]. Hence, the MEM does not suffer from the disadvantages of the other probability distribution models mentioned above, and it contains a whole family of generalized exponential distributions, e.g., the normal, lognormal, gamma, beta, and exponential distributions. Furthermore, the MEM is available for multi-modal distribution types [33]. In recent years, the MEM has attracted great attention in structural reliability analysis due to its conceptual elegance. Ref. [33] developed a two-step strategy to evaluate structural reliability efficiently, in which a normalized moment-based quadrature rule with a dimension-reduction technique is employed to estimate the first-four statistical moments of the system output, and the MEM is then applied to derive the PDF of the structural response. A novel hybrid dimension-reduction method, incorporating an improved MEM with the GOpoly algorithm [34] and the EBE method [35], was developed to improve the efficiency of structural analysis [17].
An improved maximum entropy method based on a nonlinear mapping and a sparse grid numerical integration technique was proposed to achieve a good balance between accuracy and efficiency for structural reliability analysis [36]. Generally, the statistical moments used in the MEM are integer moments, which means the orders of the moments are all integers. For practical applications to complex engineering problems, the conventional MEM with integer moments as constraints may encounter the difficulty that the accuracy of the integer moments cannot be guaranteed, especially for higher-order moments, e.g., skewness and kurtosis. To mitigate this difficulty, the fractional moments based MEM was introduced in structural reliability analysis [37], in which the orders of the moments take fractional values. Moreover, an improved dimension-reduction technique named the multiplicative dimension-reduction method was developed for fractional moments estimation. Xu et al. [38,39] then extended the fractional moments-based MEM to structural dynamic systems by incorporating the equivalent extreme value distribution. To further improve the accuracy, an adaptive scaled unscented transformation with the maximum entropy method was developed for efficient structural reliability analysis [40]. Recently, a transformed mixed-degree cubature rule and fractional moments based maximum entropy method was developed for reconstructing the unknown PDF of structural response [41]. Note that in structural reliability problems, the involved systems are quite complex and failure probability estimation draws attention to the long tail of the PDF of the system output, which requires an extremely accurate estimated PDF and significantly impedes the reliability analysis. Fortunately, the MEM can tackle this problem with a good trade-off between accuracy and efficiency. Hence, it can be concluded that the MEM is a powerful tool to derive the PDF of a system output.
However, to the best of the authors' knowledge, the maximum entropy approach has not yet been incorporated in loss assessment to identify the probability distribution of long-term loss. In this paper, a maximum entropy method incorporating higher-order analysis based on the stochastic renewal process is developed. The probability distribution of long-term loss can be accurately constructed by the MEM since the statistical moments can be analytically obtained via the MGF [8].
In this paper, a maximum entropy method incorporating the first-four analytical moments of long-term loss, i.e., the mean, standard deviation, skewness, and kurtosis, is proposed to derive the underlying PDF of the long-term loss of civil infrastructure, which can significantly mitigate the computational effort induced by Monte Carlo simulation when analyzing various hazard scenarios. The organization of this paper is summarized as follows. In the Higher-order analysis of long-term loss section, the stochastic model of hazards is briefly reviewed and the framework of higher-order analysis for the statistical moments of long-term loss is summarized. The maximum entropy method is presented in the Maximum entropy method for deriving the PDF of long-term loss section, and the application of the proposed framework to probabilistic long-term loss assessment is illustrated in the Illustrative example section to demonstrate the accuracy and efficiency of the proposed method.

Stochastic model of hazards
During the long-term loss assessment, it is of great importance to quantify the uncertainties associated with hazard frequency and intensity. Stochastic models are widely utilized to describe hazard arrivals and the uncertain magnitudes, and they are commonly calibrated to historical data. For instance, on the basis of historical records, the homogeneous Poisson process (HPP) can be employed to model the occurrence of natural hazards, such as earthquakes [42,43] and hurricanes [44,45]. In recent studies, researchers indicated that stationary stochastic models may not be able to capture the non-stationary and time-dependent characteristics of hazard frequency and magnitude under various scenarios. To this end, some non-stationary stochastic models have been proposed to quantify such variability. For instance, the renewal process has been used to model the stochastic arrivals of strong earthquakes over a long-term period [46]. The occurrence of hurricanes related to climate change can be represented as a non-homogeneous or mixed Poisson process [47]. In this section, a stochastic renewal process is adopted to model hazard occurrence. The renewal process has been widely utilized in recent studies [48][49][50], as it is a generalization of the HPP. A renewal process is a counting process whose inter-arrival times are independently and identically distributed (i.i.d.).
Considering that the service life of the engineering structure is (0, t_int], T_k represents the arrival time of the kth hazard event. Hence, the arrival times of hazards are a collection of k non-negative random variables, i.e., {T_1, T_2, ..., T_k}, which can be defined as:

T_k = S_1 + S_2 + ... + S_k,  k = 1, 2, ...  (1)

where {S_1, S_2, ..., S_k} are the inter-arrival times.
Based on the properties of renewal theory, the number of hazard events in the time interval (0, t_int] can be expressed as:

N(t_int) = max{k : T_k ≤ t_int},  E[N(t_int)] = Σ_{k=1}^{∞} F_k(t_int)  (2)

where F_k(·), the CDF of T_k, is a k-fold convolution of the CDF of the inter-arrival time S_k. The notation Φ(·) represents the renewal function, which satisfies an integral equation obtained by conditioning on the first arrival time t_1. Assuming that the cumulative distribution function (CDF) of the inter-arrival time is continuous, the expected number of hazard events can be represented by the following equation:

E[N(t_int)] = Φ(t_int) = E[ E[N(t_int) | T_1 = t_1] ]  (3)

If the first arrival time t_1 exceeds t_int, no hazard event occurs within the time period (0, t_int], which means E[N(t_int) | T_1 = t_1] = 0. Otherwise, the process renews at t_1 and the number of renewals gives:

E[N(t_int) | T_1 = t_1] = 1 + Φ(t_int − t_1),  t_1 ≤ t_int  (4)

Substituting Eq. (4) into Eq. (3), the expected number of hazard arrivals can be formulated as:

Φ(t_int) = ∫_0^{t_int} [1 + Φ(t_int − t_1)] dF(t_1)  (5)

When the inter-arrival time S_k follows the exponential distribution, the renewal process mentioned above becomes an HPP, which is also known as a Poisson renewal process. The HPP has a constant occurrence rate α, which represents the rate of hazard arrivals within a specific time unit. Hence, for the HPP, the expected number of hazard events can be calculated as:

E[N(t_int)] = α t_int  (6)

The probability of n hazard arrivals within the time interval follows a Poisson distribution, which can be expressed as:

P[N(t_int) = n] = (α t_int)^n e^{−α t_int} / n!  (7)

and the probability density function of the inter-arrival time S gives:

f_S(s) = α e^{−α s},  s ≥ 0  (8)
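As a quick numerical sanity check, the renewal description above can be simulated directly: drawing i.i.d. exponential inter-arrival times, accumulating them into arrival times, and counting the events falling in (0, t_int] should reproduce the expected number of Poisson arrivals. A minimal sketch in Python, assuming NumPy and borrowing the occurrence rate α = 0.245 and service life t_int = 75 from the later illustrative example:

```python
import numpy as np

rng = np.random.default_rng(0)

alpha, t_int = 0.245, 75.0      # assumed values: occurrence rate, service life
n_sim = 20_000

counts = np.empty(n_sim)
for i in range(n_sim):
    # Arrival times are cumulative sums of i.i.d. exponential inter-arrival times;
    # 50 draws is far more than the expected alpha * t_int (about 18.4 events).
    arrivals = np.cumsum(rng.exponential(scale=1.0 / alpha, size=50))
    counts[i] = np.count_nonzero(arrivals <= t_int)

print(counts.mean())            # close to alpha * t_int = 18.375
```

The sample mean of the event counts converges to α t_int, consistent with the HPP result above.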

The definition of long-term loss
The long-term loss can be computed as the cumulative damage cost due to hazards over the service life of the engineering system. In this section, the framework of the analytical analysis of long-term loss is introduced. The analytical assessment is based on the moment generating function method [8]. Herein, the definition of the long-term loss, the MGF, and the derivations of the statistical moments of long-term loss are included. Given a time interval (0, t_int], t_int can be defined as the service life of civil infrastructures. Based on the renewal model mentioned above, the arrival time and inter-arrival time of the kth event are denoted as T_k and S_k, respectively, with T_k = S_1 + S_2 + ... + S_k. The long-term loss L_lt can be expressed as:

L_lt = Σ_{k=1}^{N(t_int)} e^{−r T_k} L_k  (9)

where L_k represents the loss severity, i.e., the probability of failure given the hazard times the associated economic consequence induced by structural damage under the kth hazard event, and r denotes the monetary discount rate. Note that the inter-arrival time is assumed to be independent of the loss severity L_k. Hence, the long-term loss is the total discounted economic loss of civil infrastructures caused by hazards within the time domain (0, t_int], as shown in Fig. 1.
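The discounted sum defining L_lt can be sampled directly by walking through one realization of the renewal process. A minimal Monte Carlo sketch, assuming a homogeneous Poisson hazard model and exponentially distributed loss severity; the parameter values are assumptions borrowed from the later illustrative example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed parameter values, borrowed from the later illustrative example
alpha  = 0.245      # hazard occurrence rate (events/year)
t_int  = 75.0       # service life (years)
r      = 0.02       # monetary discount rate
mean_L = 1.283e6    # mean loss severity (USD), L ~ exponential

def sample_long_term_loss(rng):
    """One realization of L_lt = sum_k exp(-r * T_k) * L_k over (0, t_int]."""
    t, loss = 0.0, 0.0
    while True:
        t += rng.exponential(1.0 / alpha)   # next hazard arrival time T_k
        if t > t_int:
            return loss
        loss += np.exp(-r * t) * rng.exponential(mean_L)

samples = np.array([sample_long_term_loss(rng) for _ in range(20_000)])
print(samples.mean() / 1e6)   # discounted mean loss, million USD (about 12.2)
```

Each sample accumulates the discounted loss severities of all hazard events arriving before the end of the service life.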

Moment generating function
The analytical assessment of the statistical moments of long-term loss can be based on the moment generating function approach. For a random variable, the raw moments can be derived by taking derivatives of the MGF [51]. Hence, for a given random variable X, the MGF with β ∈ R can be defined as:

M_X(β) = E[e^{βX}]  (10)

Specifically, the first-two raw moments can be derived through taking the first and second derivatives of the MGF:

M'_X(β) = E[X e^{βX}],  M''_X(β) = E[X² e^{βX}]  (11)

Then, taking β = 0, the first-two raw moments give:

E[X] = M'_X(0),  E[X²] = M''_X(0)  (12)

Generally, the jth raw moment of the random variable can be calculated by taking the jth derivative of the MGF at zero, which can be formulated as:

E[X^j] = d^j M_X(β) / dβ^j |_{β=0}  (13)
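The derivative relations above are easy to verify numerically: the empirical MGF of a sample, differentiated at β = 0 by central differences, should recover the sample mean. A small sketch with an exponential variable of mean 2 (an arbitrary test case, not a quantity from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# X ~ exponential with mean 2 (arbitrary test variable, not from the paper)
x = rng.exponential(scale=2.0, size=1_000_000)

mgf = lambda b: np.mean(np.exp(b * x))   # empirical MGF M_X(b) = E[exp(b X)]
h = 1e-4
d1 = (mgf(h) - mgf(-h)) / (2 * h)        # central difference approximates M'_X(0)

print(d1)                                # close to the true mean, 2.0
```

The central difference cancels the even-order terms of the MGF's Taylor expansion, so the estimate matches E[X] up to O(h²) bias plus sampling noise.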

Analytical statistical moments based on moment generating function
On the basis of the MGF, the derivation of the statistical moments of long-term loss relies on finding its MGF M_{L_lt}(·). Herein, the derivation of the MGF of a Poisson renewal process is provided. Conditioning on N(t_int) = n, the MGF of long-term loss can be formulated based on the law of total expectation:

M_{L_lt}(β) = E[ E[ e^{β L_lt} | N(t_int) ] ]  (14)

where M_L(·) is the MGF of the loss severity L. The loss severity L is assumed to follow the exponential distribution with parameter θ, i.e., L ∼ EXP(θ) [52], so that M_L(β) = θ/(θ − β). Then, substituting M_L into Eq. (14), the MGF of long-term loss under the HPP can be reformulated as:

M_{L_lt}(β) = exp( α ∫_0^{t_int} [ M_L(β e^{−r t}) − 1 ] dt )  (15)

Then the mean (u_1) and standard deviation (u_2) of long-term loss can be easily obtained by taking the first and second derivatives of the MGF at β = 0:

u_1 = α E[L] (1 − e^{−r t_int}) / r  (16)

u_2 = ( α E[L²] (1 − e^{−2 r t_int}) / (2r) )^{1/2}  (17)

In this paper, the higher-order moments of long-term loss are also of great interest. Similarly, the skewness (u_3) and kurtosis (u_4) can be derived as [8]:

u_3 = α E[L³] (1 − e^{−3 r t_int}) / (3 r u_2³)  (18)

u_4 = 3 + α E[L⁴] (1 − e^{−4 r t_int}) / (4 r u_2⁴)  (19)
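Under the HPP with exponentially distributed loss severity, the first-four moments of the discounted cumulative loss reduce to closed-form compound Poisson (shot-noise) cumulant expressions in terms of the raw moments E[L^j]. A sketch of this computation, assuming loss severity with mean mu so that E[L^j] = j! mu^j; plugging in the illustrative example's parameters reproduces its quoted moments:

```python
from math import factorial, exp, sqrt

def loss_moments(alpha, mu, r, t_int):
    """Mean, std, skewness, kurtosis of the discounted cumulative loss under an
    HPP with rate alpha, discount rate r, and L ~ exponential with mean mu."""
    # j-th cumulant of the discounted compound Poisson sum:
    #   kappa_j = alpha * E[L^j] * (1 - exp(-j r t_int)) / (j r),  E[L^j] = j! mu^j
    kappa = [alpha * factorial(j) * mu**j * (1.0 - exp(-j * r * t_int)) / (j * r)
             for j in range(1, 5)]
    mean = kappa[0]
    std = sqrt(kappa[1])
    skew = kappa[2] / std**3
    kurt = 3.0 + kappa[3] / std**4
    return mean, std, skew, kurt

print(loss_moments(0.245, 1.283e6, 0.02, 75.0))
# approximately (12.21e6, 4.377e6, 0.610, 3.541), matching the example's values
```

The standardized skewness and kurtosis are the third and fourth cumulants normalized by the appropriate powers of the standard deviation, consistent with Eqs. (16) to (19).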

Maximum entropy method for deriving the PDF of long-term loss
Once the first-four central moments of long-term loss are obtained by the MGF, the PDF of long-term loss can be approximated accordingly by using the statistical moments and a given distribution model. In this paper, the maximum entropy method is specifically used, since it is regarded as the most unbiased approximation of an underlying PDF; that is, the MEM finds the most probable PDF among all candidate PDFs under the statistical moment constraints. Given a continuous random variable Y with PDF f_Y(y), the information-theoretical entropy of Y is defined as:

H[Y] = − ∫ f_Y(y) ln f_Y(y) dy  (20)

The estimation of the underlying PDF f_Y(y) by using a finite number of statistical moments of Y as constraints is a classical problem in statistics, and the MEM developed by Jaynes [31] is one of the most popular approaches to address it. On the basis of the principle of maximum entropy, the general constrained optimization formulation of the MEM reads:

maximize  H[Y] = − ∫ f_Y(y) ln f_Y(y) dy
subject to  ∫ y^i f_Y(y) dy = E[Y^i],  i = 0, 1, ..., k  (21)

where E[Y^i] is the ith raw moment of Y, and k represents the given number of statistical moments; herein, k = 4. For the long-term loss L_lt, the first-four analytical central moments, i.e., u_1, u_2, u_3, u_4, are obtained by the MGF mentioned above. Before dealing with the optimization problem in Eq. (21), a standardized transformation is first adopted to transform the random variable L_lt into a standard variable Y, which is given by:

Y = (L_lt − u_1) / u_2  (22)

Hence, the first-four raw moments of the standardized random variable Y are easily derived as:

E[Y] = 0,  E[Y²] = 1,  E[Y³] = u_3,  E[Y⁴] = u_4  (23)

Once the first-four raw moments of Y are obtained, the optimization problem in Eq. (21) can be solved by introducing the following Lagrangian function:

L(f_Y, λ) = − ∫ f_Y(y) ln f_Y(y) dy − Σ_{i=0}^{k} λ_i ( ∫ y^i f_Y(y) dy − E[Y^i] )  (24)

where λ = [λ_0, λ_1, ..., λ_k]^T is a vector collecting the Lagrangian multipliers. For the optimal solution, a key condition is given by:

∂L/∂f_Y = −ln f_Y(y) − 1 − Σ_{i=0}^{k} λ_i y^i = 0  (25)

then, absorbing the constant into λ_0, a closed form of f_Y(y) can be obtained by:

f_Y(y) = exp( −λ_0 − Σ_{i=1}^{k} λ_i y^i )  (26)

in which λ_0 is derived based on the normalization axiom in probability theory, ∫ f_Y(y) dy = 1, such that:

λ_0 = ln ∫ exp( − Σ_{i=1}^{k} λ_i y^i ) dy  (27)

Therefore, a closed form of f_Y(y) can be obtained by determining the Lagrangian multipliers λ.
To this end, an equivalent unconstrained optimization formulation based on the Kullback-Leibler (K-L) divergence is employed to solve for the unknown parameters. The K-L divergence measures the divergence between the true PDF f_Y(y) and its estimate f̂_Y(y), which is expressed as [37]:

D_KL(f_Y ‖ f̂_Y) = ∫ f_Y(y) ln [ f_Y(y) / f̂_Y(y) ] dy  (28)

Substituting Eq. (26) for f̂_Y(y), Eq. (28) can be rewritten as:

D_KL(f_Y ‖ f̂_Y) = −H[f_Y] + λ_0 + Σ_{i=1}^{k} λ_i E[Y^i]  (29)

in which H[f_Y] is the entropy of the true PDF. When the K-L divergence between the true PDF and its estimate is minimized, the estimated PDF is close to the real one. Hence, the minimization of the K-L divergence in Eq. (29) is an alternative to Eq. (21). Although the value of H[f_Y] is unknown, it is invariant and independent of the parameters λ. Thus, minimizing the K-L divergence in Eq. (29) is equivalent to minimizing the following objective function:

Γ(λ) = λ_0 + Σ_{i=1}^{k} λ_i E[Y^i] = ln ∫ exp( − Σ_{i=1}^{k} λ_i y^i ) dy + Σ_{i=1}^{k} λ_i E[Y^i]  (30)

It is noted that the objective function Γ(λ) is convex in λ. Therefore, a global minimum can be found by minimizing the objective function, with the key condition:

∂Γ(λ)/∂λ_i = E[Y^i] − ∫ y^i f̂_Y(y) dy = 0,  i = 1, 2, ..., k  (31)

It can be seen that Eq. (31) recovers exactly the moment constraints in Eq. (21). Therefore, the unknown parameters λ can be determined by the following unconstrained optimization formulation:

λ = arg min Γ(λ)  (32)

Once the Lagrangian multipliers are determined by Eq. (32), the PDF of long-term loss can be accordingly derived by:

f_{L_lt}(l) = (1/u_2) exp( −λ_0 − Σ_{i=1}^{k} λ_i ((l − u_1)/u_2)^i )  (33)

The procedure of the proposed method, incorporating the analytical statistical moments and the maximum entropy method, is shown in Fig. 2. The homogeneous Poisson process has been emphasized in this study. Furthermore, the proposed approach can be applied to non-stationary models, such as the non-Poisson renewal model and the non-homogeneous Poisson process. The derivation of the associated moment generating functions and the first-four moments would differ, while the maximum entropy estimation process can still be adopted. Future studies should incorporate non-stationary hazard models and investigate their effect on the probability distribution of the loss.
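The unconstrained dual formulation above translates almost directly into code: minimize Γ(λ) over λ_1, ..., λ_4 and recover λ_0 from normalization. A sketch assuming SciPy is available; the finite integration support [−8, 12] is an assumption introduced to keep the normalizing integral finite while the optimizer explores, and the target skewness and kurtosis are taken from the later illustrative example:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

# Target raw moments of the standardized loss Y = (L_lt - u1)/u2; the
# skewness and kurtosis are the illustrative example's quoted values.
m = np.array([0.0, 1.0, 0.6101, 3.5411])

def psi(y, lam):
    # Exponent of the MaxEnt density: sum_{i=1..4} lam_i * y^i
    return lam[0]*y + lam[1]*y**2 + lam[2]*y**3 + lam[3]*y**4

def log_z(lam):
    # ln Z(lam); the finite support [-8, 12] (an assumption, wide enough for a
    # standardized variable) keeps Z finite while the optimizer explores lam
    z, _ = quad(lambda y: np.exp(-np.clip(psi(y, lam), -700.0, 700.0)), -8.0, 12.0)
    return np.log(z)

def gamma_obj(lam):
    # Dual objective: Gamma(lam) = ln Z(lam) + sum_i lam_i * E[Y^i]
    return log_z(lam) + lam @ m

res = minimize(gamma_obj, x0=[0.0, 0.5, 0.0, 0.05], method='Nelder-Mead',
               options={'maxiter': 5000, 'xatol': 1e-9, 'fatol': 1e-12})
lam = res.x
lam0 = log_z(lam)                         # normalization multiplier lambda_0

pdf = lambda y: np.exp(-lam0 - psi(y, lam))
for i in range(1, 5):                     # the density reproduces the targets
    mi, _ = quad(lambda y, i=i: y**i * pdf(y), -8.0, 12.0)
    print(i, round(mi, 3))
```

Because Γ(λ) is convex, any local minimizer suffices; at the optimum the gradient condition of Eq. (31) forces the recovered density to reproduce the target moments.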

Illustrative example
The proposed probabilistic approach is applied to a coastal bridge to quantify the long-term loss under hurricane hazards. The investigated bridge is a multi-span simply supported girder bridge, and this type of bridge is vulnerable to deck unseating failure under hurricanes [53,54]. The stochastic Poisson process is employed to model the occurrence of hurricanes over the service life of the bridge. The Poisson process is a special renewal process with exponentially distributed inter-arrival times; therefore, the proposed renewal approach can be utilized. Given this information, the key inputs for the long-term loss assessment include the occurrence rate α, the service life t_int, the distribution parameters of the loss severity L, and the monetary discount rate r. By using the moment generating function approach, the first-four analytical central moments of the long-term loss can be computed. Then the PDF of long-term loss can be constructed effectively through the maximum entropy method with the first-four analytical central moments as constraints. The associated probability distribution can provide significant information for subsequent risk management and decision-making. In this example, the key parameter of the hurricane hazard is the occurrence rate α of the stochastic Poisson process. It is computed as 0.245 by counting the annual number of hurricane events in the investigated region [8]. The loss severity refers to the damage cost of the system given the occurrence of the hazard. The loss severity can be computed as the product of the failure probability given hazard occurrence and the consequence, e.g., the repair cost of the bridge superstructure due to deck unseating. The failure probability is commonly computed by vulnerability assessment. For illustrative purposes, the probability of failure of the bridge is assumed to be 0.1. The repair cost of the bridge is taken as 12.832 million USD [55]. Then the loss severity can be computed as 1.283 million USD.
Herein, the mean value of the loss severity is taken as 1.283 million USD and the loss severity is assumed to follow an exponential distribution. The monetary discount rate is 2% for a long time horizon. The service life of the bridge is 75 years. Given these parameters, the first-four central moments can be computed as 12.2099×10^6 USD, 4.3773×10^6 USD, 0.6101, and 3.5411, which are the essential inputs for MEM. The Lagrangian multipliers of MEM are estimated as λ=[0.3459, 0.5015, −0.1287, 0.0164]. Figure 3 shows the PDFs and CDFs in logarithmic scale of the probabilistic long-term loss using the proposed method and Monte Carlo simulation. It is seen that the PDF and CDF in logarithmic scale obtained by the proposed method accord well with those estimated by MCS with 10^6 samples. For parametric analysis, when the service life reaches 150 years, the associated PDF and CDF can be reconstructed as shown in Fig. 4, where the Lagrangian multipliers are estimated as λ=[0.3169, 0.5012, −0.1163, 0.0141]. Again, the results show the accuracy of the proposed method. Apart from the service life, the monetary discount rate can be an essential parameter for the loss assessment. The probabilistic long-term loss considering a monetary discount rate of 3% is also estimated. Figures 5 and 6 show the probability distributions of long-term loss with 75-year and 150-year service life, respectively, where the monetary discount rate is r=3% and the estimated Lagrangian multipliers include λ=[0.3968, 0.5142, −0.1502, 0.0192]. Compared with the cases with r=2%, the losses with a larger discount rate are more right-skewed. Decision-makers may need to pay special attention to the potential tail risks.
Moreover, the loss severity can be affected by the structural performance, as it is the product of the failure probability and the consequence. The computed results for the different cases are listed in Table 1, where the unit of the mean and standard deviation is 10^6 USD. Both the proposed method and MCS are performed on a computer with an Intel(R) Core(TM) i7-9750H CPU at 2.60 GHz and 16 GB RAM. As seen from Table 1, the estimated first-four central moments accord well with those obtained by MCS; the maximum relative error is lower than 1%, which illustrates the accuracy of the proposed method. Furthermore, the CPU time required by the proposed method is within a few seconds, while that of MCS is much larger. For instance, in case 2, MCS consumes more than 600 s while the proposed approach takes just 3.8 s. The proposed method shows significant efficiency, especially when taking various parameters into account for the probabilistic analysis of long-term loss. The results in Figs. 3, 4, 5, 6, 7 and 8 show that the proposed method incorporating analytical statistical moments and the maximum entropy method achieves high accuracy in predicting the probability distribution of long-term loss. Apart from the expected long-term loss, decision-makers may pay special attention to the upper tail of the long-term loss, as the upper tail is associated with extreme costs. Therefore, the accuracy of the upper tail can be essential. For the case shown in Fig. 3, different percentiles of long-term loss obtained by the proposed method and MCS are listed in Table 2. It can be concluded that the proposed method obtains results comparable with MCS, which also means that the proposed method can ensure the accuracy of the upper tail of the PDF. Besides, decision-makers can identify the characteristics of extreme losses based on the upper tail of the PDF of long-term loss.
For instance, as shown in Table 2, when the 95th percentile is of interest, decision-makers can conclude that they are 95% confident that the long-term loss of this bridge will not exceed 20.16 × 10^6 USD within the 75-year service life. Such statements may be regarded as metrics for risk-based decision-making. By employing the proposed approach, the PDF of long-term loss can be evaluated under various given parameters related to loss estimation, i.e., the monetary discount rate r, the time interval t_int, and the expected loss severity E[L]. Basic parameters are defined as r=2%, t_int=75 years, and E[L] = 1.283 × 10^6 USD, and for different cases only one parameter varies for comparison. Figure 9 shows the PDF of long-term loss under different given parameters, and the first-four central moments under the various parameters are listed in Table 3. It can be identified that the expectation of the long-term loss is sensitive to changes in the monetary discount rate, the expected loss severity, and the service life, as shown in Fig. 9(a) to (c). It can be seen in Fig. 9(b) that the standard deviation is less likely to be affected by the service life when the discount rate and loss severity remain unchanged. As seen from Table 3, the skewness and kurtosis are not affected by the change of loss severity under the same service life and discount rate.
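The percentile statements above can be cross-checked with a vectorized Monte Carlo run of the same hazard and loss model; with the 75-year, r = 2% parameters, the sampled 95th percentile should land near the 20.16 × 10^6 USD figure quoted from Table 2. A sketch under these assumptions (kmax = 60 is an assumed cap on the number of events per history, far above the expected 18.4):

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed parameter values from the 75-year, r = 2% case of the example
alpha, t_int, r, mu = 0.245, 75.0, 0.02, 1.283e6
n, kmax = 100_000, 60     # kmax caps events per history, >> alpha * t_int

# Row-wise cumulative sums of exponential inter-arrival times give arrival times
t = np.cumsum(rng.exponential(1.0 / alpha, size=(n, kmax)), axis=1)
L = rng.exponential(mu, size=(n, kmax))          # per-event loss severities
loss = np.where(t <= t_int, L * np.exp(-r * t), 0.0).sum(axis=1)

print(np.percentile(loss, 95) / 1e6)   # 95th-percentile loss, million USD
```

Events arriving after t_int are masked out, so truncating each history at kmax events introduces negligible error here.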

Concluding remarks
In this paper, a probabilistic analysis framework for the long-term loss of civil infrastructures under hazards is proposed. The probability distribution of the long-term loss can be effectively attained by incorporating the analytical statistical moments and the maximum entropy approach. Besides the mean and standard deviation of the long-term loss, the higher-order moments, i.e., skewness and kurtosis, can be assessed by using the MGF based on the stochastic renewal process. Once the first-four analytical statistical moments are obtained, the maximum entropy method can be used to derive the underlying probability density function of long-term loss. Compared with Monte Carlo simulation, the proposed method can significantly reduce the computational effort of long-term loss assessment. The illustrative example demonstrates the accuracy and efficiency of the proposed approach. The PDF and CDF obtained by the proposed method accord well with those estimated by Monte Carlo simulation. The proposed method is conditioned on several assumptions. For instance, the hazard arrivals are based on the homogeneous Poisson process, and the analytical derivations of the statistical moments assume that the loss severity follows the exponential distribution. Future studies are encouraged to explore non-stationary hazard models and different loss severity models; vulnerability assessment should also be considered.