Probabilistic analysis of long-term loss incorporating maximum entropy method and analytical higher-order moments
Journal of Infrastructure Preservation and Resilience, volume 3, Article number: 7 (2022)
Abstract
Quantifying economic losses of civil infrastructures subjected to various hazards under a life-cycle context is of vital importance for risk assessment and management. In previous studies, the expected long-term loss has been widely applied as a standard decision criterion during the life-cycle analysis. However, the expectation may not be informative enough to illustrate uncertainties associated with the long-term loss. Therefore, the higher-order moments and the probability distribution should be investigated. In this paper, a probabilistic analysis framework is proposed to construct the probability density function and cumulative distribution function of long-term loss by assessing the analytical statistical moments. The stochastic renewal process is utilized to assess the long-term loss by considering uncertainties associated with the stochastic occurrence and frequency of hazards. Based on the maximum entropy method, the proposed approach assesses the probability distribution of long-term loss far more efficiently than crude Monte Carlo simulation. The probability distribution can be essential information for the decision-making process of risk management. An illustrative example is investigated to show the probability density function of long-term loss of civil infrastructure subjected to hurricane hazards. The good agreement between results obtained by the proposed approach and Monte Carlo simulation verifies the accuracy and effectiveness of the proposed method.
Introduction
In recent decades, substantial financial and social losses caused by natural hazards have raised public awareness of risk management of civil infrastructures. Life-cycle performance and risk assessment of engineering structures have been key concerns of decision-makers seeking to mitigate the potential risk under hazards. Researchers have paid special attention to long-term loss estimation and risk management of civil infrastructure subjected to various natural hazards, such as earthquakes, hurricanes, and flooding [1,2,3]. In this context, the long-term loss refers to the cumulative financial cost of the civil infrastructure over the service life due to damage under hazards. The long-term loss assessment is also known as the life-cycle risk assessment. Though the expected long-term loss has been widely utilized as a standard decision criterion, researchers have stated that the expectation is not fully informative and have moved away from expected losses to improve decision-making. For instance, Goda and Hong [4] proposed an optimal seismic design framework considering different risk attitudes of decision-makers based on stochastic dominance criteria. The cumulative prospect theory was incorporated to aid decision-making by considering probability distributions associated with hazard risk [5, 6]. Furthermore, more studies have started to investigate the statistical parameters of the long-term loss by studying the standard deviation and higher-order moments. For instance, Pandey and Van der Weide [7] proposed an analytical renewal model to assess the life-cycle cost considering the expectation and standard deviation. Li et al. [8] showed the necessity of assessing higher-order moments of the long-term loss considering stationary and nonstationary hazard models. In their study, the long-term loss under nonstationary hazards may have larger higher-order moments (i.e., skewness and kurtosis) compared with the stationary model.
Uncertainties associated with the stochastic occurrence and intensity of hazards are commonly modeled by stochastic processes. Stationary hazard arrivals can be modeled by stochastic processes with a stationary occurrence rate, e.g., the homogeneous Poisson process. Nonstationary hazards can be described by models with time-varying or stochastic occurrence rates, e.g., the non-homogeneous Poisson process and the mixed Poisson process. In addition to statistical moments, the probability distribution of the long-term loss can also be significant information for the decision-making process. For instance, Pandey and Van der Weide [9] highlighted the importance of assessing the probability distribution of the seismic damage cost of engineering structures during the life-cycle, showing that the full probability distribution provides a more realistic estimate of the potential damage cost. Wang and Zhang [10] proposed a probability-based loss estimation of structures subjected to tropical cyclone damage. Though these approaches successfully assessed the probability distribution, they may not be able to directly incorporate the statistical moments of long-term loss into the probabilistic assessment. Although Monte Carlo simulation (MCS) can be easily employed for estimating the distribution of long-term loss, it can be time-consuming under different scenarios since a large number of simulations are required for a single case. To address this limitation, this study aims to deliver a probabilistic approach based on the maximum entropy method and statistical moments. The proposed approach provides substantial flexibility for the probabilistic assessment, as the statistical moments can be directly used to produce the probability density function (PDF) and cumulative distribution function (CDF) of long-term loss.
The maximum entropy method belongs to the so-called moment methods. Moment methods [11,12,13] are usually employed to fit the underlying PDF of a random variable of interest by using a finite number of moments as constraints in reliability analysis. The two main concerns in fitting an unknown PDF are the estimation of statistical moments and the selection of an appropriate distribution model. For instance, the random variable of interest is usually the output of a complex system with various input variables, which makes statistical moment estimation intractable, especially for higher-order moments. To address this issue, dimension-reduction techniques are widely adopted to reduce the computational effort induced by the high-dimensional integration required for higher-order moments, i.e., the so-called univariate dimension-reduction method [12] and the bivariate dimension-reduction method [13]. Besides dimension-reduction techniques, cubature rules [14, 15] are also used for higher-order moment approximation. To further improve the efficiency and accuracy of moment estimation, adaptive dimension-reduction methods with delineation of cross terms [16] or sensitivity analysis [17, 18] have been proposed. In this paper, the long-term loss of civil infrastructures is of great interest, and its statistical moments can be analytically obtained via the moment generating function (MGF) [8].
Once the statistical moments are obtained, the probability density function of a random variable can be recovered by a selected distribution model. Another key issue is thus to select an appropriate distribution model. In recent years, various parametric probability distribution models have been proposed. A guideline for comparing different probability distribution models is summarized in reference [19], in which the merits and drawbacks of various approaches are discussed. The Pearson system [20] can be employed to construct the PDF of a random variable through a family of different probability distribution types with the first-four statistical moments as constraints. As described in [18], the Pearson system is adopted to construct the PDF of structural response due to its flexibility. Generally, the Pearson system can fit various types of PDFs; however, it may not produce stable results near the boundaries between different distribution types [21]. The Johnson system [22] contains three distribution types, i.e., lognormal, bounded, and unbounded distributions, with a mathematical transformation function that maps these distributions into standard normal distributions. The Johnson system shows low computational performance when estimating unknown parameters from statistical moments [23]. Moreover, both the Pearson system and the Johnson system may not be available for an unbounded bimodal PDF, even though the Johnson system provides a bounded and restricted bimodal PDF. Saddlepoint approximations [24] can construct the PDF of a random variable by using a few statistical moments to determine the cumulant generating function, but they may encounter numerical instability, and the latent PDF usually follows an exponential form, leading to inaccurate estimation for non-exponential PDF types [19]. The generalized lambda distribution is a flexible tool to fit different types of PDFs, but the accuracy may not be guaranteed in some regions of the skewness-kurtosis plane [25].
Recently, the shifted generalized lognormal distribution with first-four statistical moments has been developed to fit a PDF within almost the entire range of skewness and kurtosis [26, 27], although it requires complicated computation [28]. Some other forms of probability distribution models, e.g., the Hermite model [29] and the cubic normal transformation [28, 30], have also been adopted to fit unknown PDFs.
Among various moment approaches, the maximum entropy method (MEM), which originates from modern information theory, is considered the most unbiased one. The main idea of MEM is to fit the underlying PDF by using the first-n statistical moments of a system output as constraints. The maximum entropy principle developed by Jaynes [31] can be used to select the most probable PDF from a large number of candidates, since the process adds minimum spurious information. Shore and Johnson have proven that the MEM satisfies all the conditions of consistency while using only statistical moments as constraints [32]. Hence, the MEM does not suffer the disadvantages of the other probability distribution models mentioned above, and it contains a whole family of generalized exponential distributions, e.g., the normal, lognormal, gamma, beta, and exponential distributions. Furthermore, MEM is available for multimodal distribution types [33]. In recent years, MEM has attracted great attention in structural reliability analysis due to its conceptual elegance. Ref. [33] developed a two-step strategy to evaluate structural reliability efficiently, in which a normalized moment-based quadrature rule with a dimension-reduction technique is employed to estimate the first-four statistical moments of the system output, and MEM is then applied to derive the PDF of the structural response. A novel hybrid dimension-reduction method incorporating an improved MEM with the GOpoly algorithm [34] and the EBE method [35] was developed to improve the efficiency of structural analysis [17]. An improved maximum entropy method based on a nonlinear mapping and a sparse grid numerical integration technique has been proposed to achieve a good balance between accuracy and efficiency for structural reliability analysis [36]. Generally, the statistical moments used in MEM are integer moments, which means the orders of the moments are all integers.
For practical applications to complex engineering problems, the conventional MEM with integer moments as constraints may encounter the difficulty that the accuracy of integer moments cannot be guaranteed, especially for higher-order moments, e.g., skewness and kurtosis. To mitigate this difficulty, the fractional moments based MEM, in which the orders of the moments take fractional values, has been introduced in structural reliability analysis [37]. Moreover, an improved dimension-reduction technique named the multiplicative dimension-reduction method was developed for fractional moment estimation. Xu et al. [38, 39] then extended the fractional moments-based MEM to structural dynamic systems by incorporating the equivalent extreme value distribution. To further improve the accuracy, an adaptive scaled unscented transformation with the maximum entropy method was developed for efficiently estimating structural reliability [40]. Recently, a transformed mixed-degree cubature rule and fractional moments based maximum entropy method was developed for reconstructing the unknown PDF of structural response [41]. Note that in structural reliability problems, the involved systems are quite complex and failure probability estimation draws attention to the long tail of the PDF of the system output, which requires an extremely accurate PDF estimate and significantly impedes the reliability analysis. Fortunately, MEM can tackle this problem with a good trade-off between accuracy and efficiency. Hence, it can be concluded that MEM is a powerful tool to derive the PDF of a system output. However, to the best of the authors' knowledge, the maximum entropy approach has not been incorporated in loss assessment to identify the probability distribution of long-term loss. In this paper, a maximum entropy method incorporating higher-order analysis based on the stochastic renewal process is developed.
The probability distribution of long-term loss can be accurately constructed by MEM since the statistical moments can be analytically obtained via the MGF [8].
In this paper, a maximum entropy method incorporating the first-four analytical moments, i.e., mean, standard deviation, skewness, and kurtosis, of long-term loss is proposed to derive the underlying PDF of the long-term loss of civil infrastructure, which can significantly mitigate the computational effort induced by Monte Carlo simulation when analyzing various hazard scenarios. The organization of this paper can be summarized as follows. In the "Higher-order analysis of long-term loss" section, the stochastic model of hazards is briefly reviewed and the framework of higher-order analysis for the statistical moments of long-term loss is summarized. The maximum entropy method is presented in the "Maximum entropy method for deriving the PDF of long-term loss" section, and the application of the proposed framework to assess probabilistic long-term loss is illustrated in the "Illustrative example" section to demonstrate the accuracy and efficiency of the proposed method.
Higher-order analysis of long-term loss
Stochastic model of hazards
During the long-term loss assessment, it is of great importance to quantify uncertainties associated with hazard frequency and intensity. Stochastic models are widely utilized to model hazard arrivals and the uncertain magnitudes. These stochastic models are commonly proposed based on historical data. For instance, on the basis of historical records, the homogeneous Poisson process (HPP) can be employed to model the occurrence of natural hazards, such as earthquakes [42, 43] and hurricanes [44, 45]. In recent studies, researchers indicated that stationary stochastic models may not be able to capture nonstationary and time-dependent characteristics associated with hazard frequency and magnitude under various scenarios. To this end, some nonstationary stochastic models have been proposed to quantify such variability. For instance, the renewal process has been used to model the stochastic arrivals of strong earthquakes over a long-term period [46]. The occurrence of hurricanes related to climate change can be represented as a non-homogeneous or mixed Poisson process [47]. In this section, a stochastic renewal process is adopted to model hazard occurrence. The renewal process has been widely utilized in recent studies [48,49,50], as it is a generalization of the HPP.
A renewal process is a counting process whose inter-arrival times are independently and identically distributed (i.i.d.). Consider the service life of the engineering structure to be (0, t_{int}], and let T_{k} represent the arrival time of the kth hazard event. Hence, the arrival times of hazards are a collection of k non-negative random variables, i.e., {T_{1}, T_{2},..., T_{k}}, which can be defined as:
where {S_{1}, S_{2},..., S_{k}} are the inter-arrival times.
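The display for Eq. (1) did not survive extraction. In standard renewal-process notation it presumably reads (a reconstruction from the surrounding definitions, not the authors' exact typesetting):

```latex
T_k = S_1 + S_2 + \cdots + S_k = \sum_{i=1}^{k} S_i , \qquad k = 1, 2, \ldots
```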
Based on the properties of renewal theory, the number of hazard events in the time interval (0, t_{int}] can be expressed as:
in which F_{S}^{k}(t_{int}) is the k-fold convolution of the distribution of the inter-arrival time S_{k}. The notation Φ(·) represents the renewal function, which satisfies an integral equation conditioning on the first arrival time t_{1}. Assuming that the cumulative distribution function (CDF) of the inter-arrival time is continuous, the expected number of hazard events can be represented by the following equation:
If the first arrival time t_{1} exceeds t_{int}, there will be no hazard events occurring within the time period (0, t_{int}], which also means that \(\mathbb {E}\left [ {N\left (t_{\text {int}} \right)} \right ] = 0\). Otherwise, the number of renewals gives:
Substituting Eq. (4) into Eq. (3), the expected number of hazard arrivals can be formulated as:
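The displays for Eqs. (2)–(5) are likewise missing from this copy. Based on the surrounding text, textbook renewal theory suggests the following reconstruction (the original notation may differ in detail):

```latex
N(t_{int}) = \max\{k : T_k \le t_{int}\}, \qquad
\mathbb{E}\left[N(t_{int})\right] = \Phi(t_{int}) = \sum_{k=1}^{\infty} F_S^{k}(t_{int}),
```

and, conditioning on the first arrival time \(T_1 = t_1 \le t_{int}\),

```latex
N(t_{int}) = 1 + N(t_{int} - t_1), \qquad
\Phi(t_{int}) = F_S(t_{int}) + \int_0^{t_{int}} \Phi(t_{int} - t_1)\, \mathrm{d}F_S(t_1).
```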
When the inter-arrival time S_{k} follows the exponential distribution, the renewal process mentioned above becomes an HPP, which is also known as a Poisson renewal process. The HPP has a constant occurrence rate α, representing the expected number of hazard arrivals per unit time. Hence, for the HPP, the expected number of hazard events can be calculated as:
The probability of n hazard arrivals within the time interval follows a Poisson distribution, which can be expressed as:
and the probability density function of the inter-arrival time S is:
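The displays for Eqs. (6)–(8) are missing here; for an HPP with occurrence rate α they are the standard expressions:

```latex
\mathbb{E}\left[N(t_{int})\right] = \alpha\, t_{int}, \qquad
P\left[N(t_{int}) = n\right] = \frac{(\alpha t_{int})^n}{n!}\, e^{-\alpha t_{int}}, \qquad
f_S(s) = \alpha\, e^{-\alpha s}, \quad s \ge 0 .
```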
The definition of long-term loss
The long-term loss can be computed as the cumulative damage cost due to hazards over the service life of the engineering system. In this section, the framework of the analytical analysis of long-term loss is introduced. The analytical assessment is based on the moment generating function method [8]. Herein, the definition of long-term loss, the MGF, and the derivations of the statistical moments of long-term loss are included. Given a time interval (0, t_{int}], t_{int} can be defined as the service life of civil infrastructures. Based on the renewal model mentioned above, the arrival time and inter-arrival time are denoted as T_{k} and S_{k} for the kth event, respectively, and T_{k}=S_{1}+S_{2}+···+S_{k}. The long-term loss L_{lt} can be expressed as:
where L_{k} represents the loss severity, i.e., the probability of failure given the hazard multiplied by the associated economic consequence induced by structural damage under the kth hazard event, and r denotes a monetary discount rate. Note that the inter-arrival time is assumed to be independent of the loss severity L_{k}. Hence, the long-term loss is the total economic loss of civil infrastructures caused by hazards within the time domain (0, t_{int}], as shown in Fig. 1.
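The defining display for the long-term loss is missing above; from the description it is presumably \(L_{lt} = \sum_{k=1}^{N(t_{int})} L_k\, e^{-r T_k}\). Under the stated assumptions (HPP arrivals, exponential loss severity independent of the arrival times), this definition can be sampled directly. The sketch below is our illustrative code, not code from the paper; the parameter values mirror the illustrative example later in the text:

```python
import numpy as np

def simulate_long_term_loss(alpha, mean_L, r, t_int, n_sim=50_000, seed=0):
    """Monte Carlo draws of L_lt = sum_k L_k * exp(-r * T_k) under a
    homogeneous Poisson process with i.i.d. exponential loss severities."""
    rng = np.random.default_rng(seed)
    losses = np.zeros(n_sim)
    n_events = rng.poisson(alpha * t_int, size=n_sim)
    for i, n in enumerate(n_events):
        if n == 0:
            continue
        # Given N(t_int) = n, HPP arrival times are uniform order statistics on (0, t_int]
        arrivals = np.sort(rng.uniform(0.0, t_int, size=n))
        severities = rng.exponential(mean_L, size=n)
        losses[i] = np.sum(severities * np.exp(-r * arrivals))
    return losses

# Parameters mirroring the illustrative example: alpha = 0.245 per year,
# E[L] = 1.283e6 USD, r = 2%, 75-year service life.
samples = simulate_long_term_loss(0.245, 1.283e6, 0.02, 75)
```

The sample mean and standard deviation of these draws agree with the analytical moments reported later in the paper (about 12.21×10^6 and 4.38×10^6 USD).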
Analytical statistical moments of long-term loss
Moment generating function
The analytical assessment of the statistical moments of long-term loss can be based on the moment generating function approach. For a random variable, its raw moments can be derived by taking derivatives of the MGF [51]. Hence, for a given random variable X, the MGF with \(\beta \in \mathbb {R}\) can be defined as:
Specifically, the first-two raw moments can be derived by taking the first and second derivatives of the MGF:
Then, setting β=0, the first raw moment is obtained:
Generally, the jth raw moment of the random variable can be calculated by taking the jth derivative of the MGF at zero, which can be formulated as:
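The displays for Eqs. (10)–(13) were lost in extraction; the standard definitions they describe are:

```latex
M_X(\beta) = \mathbb{E}\!\left[e^{\beta X}\right], \qquad
M_X'(\beta) = \mathbb{E}\!\left[X e^{\beta X}\right], \qquad
M_X''(\beta) = \mathbb{E}\!\left[X^2 e^{\beta X}\right],
```

so that, evaluating at \(\beta = 0\),

```latex
\mathbb{E}[X] = M_X'(0), \qquad
\mathbb{E}\!\left[X^j\right] = \left.\frac{\mathrm{d}^j M_X(\beta)}{\mathrm{d}\beta^j}\right|_{\beta=0}.
```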
Analytical statistical moments based on the moment generating function
On the basis of the MGF, the derivation of the statistical moments of long-term loss relies on finding its MGF \({M_{{L_{lt}}}}\left (\cdot \right)\). Herein, the derivation of the MGF for a Poisson renewal process is provided. Conditioning on N(t_{int})=n, the MGF of long-term loss can be formulated based on the law of total expectation:
where M_{L}(·) is the MGF of the loss severity L. The loss severity L is assumed to follow the exponential distribution with parameter θ, i.e., L∼EXP(θ) [52]. Substituting M_{L} into Eq. (14), the MGF of the long-term loss under the HPP can be reformulated as:
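The displays for Eqs. (14)–(15) are missing. For a compound Poisson (shot-noise) sum with discounting, the law of total expectation yields the classical filtered-Poisson result, which — with \(M_L(\beta) = \theta/(\theta - \beta)\) for \(L \sim \mathrm{EXP}(\theta)\) — admits a closed form. This reconstruction may differ in form from the authors' Eq. (15):

```latex
M_{L_{lt}}(\beta)
= \exp\!\left( \alpha \int_0^{t_{int}} \left[ M_L\!\left(\beta e^{-r t}\right) - 1 \right] \mathrm{d}t \right)
= \left( \frac{\theta - \beta\, e^{-r\, t_{int}}}{\theta - \beta} \right)^{\alpha / r}.
```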
Then the mean (u_{1}) and standard deviation (u_{2}) of the long-term loss can be easily obtained by taking the first and second derivatives of the MGF at β=0:
In this paper, the higher-order moments of long-term loss are also of great interest. Similarly, the skewness (u_{3}) and kurtosis (u_{4}) can be derived as [8]:
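Although the displays for these moment expressions are missing, differentiating the MGF above gives the cumulants \(\kappa_j = \alpha\, \mathbb{E}[L^j]\,(1 - e^{-j r t_{int}})/(j r)\), with \(\mathbb{E}[L^j] = j!\,\mathbb{E}[L]^j\) for exponential severity. The sketch below is our reconstruction (not the paper's code); it evaluates the first-four moments this way and reproduces the numbers reported in the illustrative example:

```python
import math

def long_term_loss_moments(alpha, mean_L, r, t_int):
    """First-four moments (mean, std, skewness, kurtosis) of the discounted
    long-term loss under a HPP with exponential loss severity.
    Shot-noise cumulants: kappa_j = alpha * E[L^j] * (1 - exp(-j*r*t)) / (j*r),
    with E[L^j] = j! * mean_L**j for the exponential distribution."""
    kappa = [alpha * math.factorial(j) * mean_L**j
             * (1.0 - math.exp(-j * r * t_int)) / (j * r)
             for j in range(1, 5)]
    mean = kappa[0]
    std = math.sqrt(kappa[1])
    skewness = kappa[2] / kappa[1] ** 1.5
    kurtosis = 3.0 + kappa[3] / kappa[1] ** 2
    return mean, std, skewness, kurtosis
```

With α = 0.245, E[L] = 1.283×10^6 USD, r = 2%, and t_{int} = 75 years, this returns approximately (12.21×10^6, 4.38×10^6, 0.610, 3.541), matching the values quoted in the illustrative example.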
Maximum entropy method for deriving the PDF of long-term loss
Once the first-four central moments of long-term loss are obtained by the MGF, the PDF of long-term loss can be approximated accordingly by using the statistical moments and a given distribution model. In this paper, the maximum entropy method is specifically used since it is regarded as the most unbiased approximation of an underlying PDF, meaning that the MEM finds the most probable PDF among all candidate PDFs under the statistical moment constraints.
Given a continuous random variable Y with PDF f_{Y}(y), the information-theoretic entropy of Y is defined as:
The estimation of the underlying PDF f_{Y}(y) by using a finite number of statistical moments of Y as constraints is a classical problem in statistics, and the MEM developed by Jaynes [31] is one of the most popular approaches to address it. On the basis of the principle of maximum entropy, the general constrained optimization formulation of the MEM reads:
where \(\mathbb {E}\left [ {{Y^{i}}} \right ]\) is the ith raw moment of Y, and k represents the given number of statistical moments; herein, k=4. For the long-term loss L_{lt}, the first-four analytical central moments u_{1}, u_{2}, u_{3}, u_{4} are obtained by the MGF mentioned above. Before dealing with the optimization problem in Eq. (21), a standardizing transformation is first adopted to transform the random variable L_{lt} into a standard variable Y, which is given by:
Hence, the first-four raw moments of the standardized random variable Y are easily derived as:
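The displays for Eqs. (21)–(23) are missing from this copy; in standard form they presumably read:

```latex
\max_{f_Y} \; \mathcal{H}(Y) = -\int f_Y(y) \ln f_Y(y)\, \mathrm{d}y
\quad \text{s.t.} \quad
\int f_Y(y)\, \mathrm{d}y = 1, \;\;
\int y^i f_Y(y)\, \mathrm{d}y = \mathbb{E}\!\left[Y^i\right], \; i = 1, \ldots, k,
```

with the standardization and the resulting raw moments

```latex
Y = \frac{L_{lt} - u_1}{u_2}, \qquad
\mathbb{E}[Y] = 0, \;\; \mathbb{E}\!\left[Y^2\right] = 1, \;\;
\mathbb{E}\!\left[Y^3\right] = u_3, \;\; \mathbb{E}\!\left[Y^4\right] = u_4 .
```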
Once the first-four raw moments of Y are obtained, the optimization problem in Eq. (21) can be solved by introducing the following Lagrangian function:
where λ=[λ_{0},λ_{1},...,λ_{k}]^{T} is a vector collecting the Lagrangian multipliers. For the optimal solution, a key condition is given by:
then a closed form of f_{Y}(y) can be obtained by:
in which λ_{0} is derived based on the normalization axiom in probability theory such that:
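The displays for Eqs. (24)–(27) are missing; up to sign conventions for the multipliers, the stationarity condition of the Lagrangian yields the familiar exponential-family form and its normalization constant:

```latex
f_Y(y) = \exp\!\left( -\lambda_0 - \sum_{i=1}^{k} \lambda_i y^i \right), \qquad
\lambda_0 = \ln \int \exp\!\left( -\sum_{i=1}^{k} \lambda_i y^i \right) \mathrm{d}y .
```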
Therefore, a closed form of f_{Y}(y) can be obtained by determining the Lagrangian multipliers λ. To this end, an equivalent unconstrained optimization formulation based on the Kullback-Leibler (KL) divergence is employed to solve for the unknown parameters. The KL divergence measures the divergence between the true PDF f_{Y}(y) and its estimator \({\hat {f}_{Y}}\left (y \right)\), and is expressed as [37]:
Substituting Eq. (26) for \({\hat {f}_{Y}}\left (y \right)\), Eq. (28) can be rewritten as:
in which \({\mathcal {H}}\left ({{f_{Y}}\left (y \right)} \right)\) is the entropy of the true PDF. Minimizing the KL divergence between the true PDF and its estimator drives the estimated PDF toward the real one. Hence, the minimization of the KL divergence in Eq. (29) is an alternative to Eq. (21). Although the value of \({\mathcal {H}}\left ({{f_{Y}}\left (y \right)} \right)\) is unknown, it is a constant independent of the parameters λ. Thus, minimizing the KL divergence in Eq. (29) is equivalent to minimizing the following objective function:
It is noted that the objective function Γ(λ) is convex with respect to λ. Therefore, a global minimum can be found by minimizing the objective function. The key condition is given as:
Note that Eq. (31) recovers exactly the moment constraints in Eq. (21). Therefore, the unknown parameters λ can be determined by using the following unconstrained optimization formulation:
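The displays for Eqs. (30)–(32) are missing; consistent with the text, the objective and the resulting unconstrained problem presumably read:

```latex
\Gamma(\boldsymbol{\lambda})
= \ln \int \exp\!\left( -\sum_{i=1}^{k} \lambda_i y^i \right) \mathrm{d}y
+ \sum_{i=1}^{k} \lambda_i\, \mathbb{E}\!\left[Y^i\right],
\qquad
\boldsymbol{\lambda}^{*} = \arg\min_{\boldsymbol{\lambda}} \Gamma(\boldsymbol{\lambda}),
```

whose stationarity conditions \(\partial \Gamma / \partial \lambda_i = 0\) recover the moment constraints.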
Once the Lagrangian multipliers are determined by Eq. (32), the PDF of long-term loss can accordingly be derived by:
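Once λ is found, the PDF of the long-term loss follows by undoing the standardization, presumably \(f_{L_{lt}}(l) = f_Y\!\big((l - u_1)/u_2\big)/u_2\). The sketch below is our illustrative code (assuming a Γ(λ) objective of the form described above and a truncated integration grid), not the authors' implementation. Fed the raw moments of a standard normal variable, the fit recovers the Gaussian, i.e., λ ≈ [0, 0.5, 0, 0]:

```python
import numpy as np
from scipy.optimize import minimize

def fit_maxent(raw_moments, y_min=-8.0, y_max=8.0, n_grid=4001):
    """Fit f_Y(y) = exp(-lam0 - sum_i lam_i * y**i), i = 1..4, by minimizing
    Gamma(lam) = ln Z(lam) + sum_i lam_i * E[Y^i] on a truncated grid."""
    m = np.asarray(raw_moments, dtype=float)   # [E[Y], E[Y^2], E[Y^3], E[Y^4]]
    y = np.linspace(y_min, y_max, n_grid)
    dy = y[1] - y[0]
    powers = np.vstack([y**i for i in range(1, 5)])

    def gamma(lam):
        expo = -(lam @ powers)
        shift = expo.max()                     # overflow guard
        log_z = shift + np.log(np.sum(np.exp(expo - shift)) * dy)
        return log_z + lam @ m

    res = minimize(gamma, x0=np.array([0.0, 0.4, 0.0, 0.01]),
                   method="Nelder-Mead",
                   options={"xatol": 1e-9, "fatol": 1e-12, "maxiter": 10000})
    lam = res.x
    lam0 = gamma(lam) - lam @ m                # equals ln Z at the optimum
    return lam0, lam

# Raw moments of a standard normal variable (skewness 0, kurtosis 3):
# the max-entropy fit should recover the Gaussian, lam ~ [0, 0.5, 0, 0]
# and lam0 ~ ln(sqrt(2*pi)).
lam0, lam = fit_maxent([0.0, 1.0, 0.0, 3.0])
```

Because Γ(λ) is convex, any descent method reaches the global minimum; Nelder-Mead is used here only to avoid hand-coding gradients.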
The procedure of the proposed method incorporating the analytical statistical moments and the maximum entropy method is shown in Fig. 2. The homogeneous Poisson process has been emphasized in this study. Furthermore, the proposed approach can be applied to nonstationary models, such as the non-Poisson renewal model and the non-homogeneous Poisson process. The derivation of the associated moment generating functions and the first-four moments could be different, while the maximum entropy estimation process can still be adopted. Future studies should incorporate nonstationary hazard models and investigate the effect on the probability distribution of the loss.
Illustrative example
The proposed probabilistic approach is applied to a coastal bridge to quantify the long-term loss subjected to hurricane hazards. The investigated bridge is a multi-span simply supported girder bridge, and this type of bridge is vulnerable to deck unseating failure under hurricanes [53, 54]. The stochastic Poisson process is employed to model the occurrence of hurricanes over the service life of the bridge. The Poisson process is a special renewal process with exponentially distributed inter-arrival times; therefore, the proposed renewal approach can be utilized. Given this information, the key inputs for the long-term loss assessment include the occurrence rate α, the service life t_{int}, the distribution parameters of the loss severity L, and the monetary discount rate r. By using the moment generating function approach, the first-four analytical central moments of the long-term loss can be computed. Then the PDF of long-term loss can be constructed effectively through the maximum entropy method with the first-four analytical central moments as constraints. The associated probability distribution can provide significant information for subsequent risk management and decision-making.
In this example, the key parameter of the hurricane hazard is the occurrence rate α of the stochastic Poisson process. It is computed as 0.245 by counting the annual number of hurricane events in the investigated region [8]. The loss severity refers to the damage cost of the system given the occurrence of the hazard. The loss severity can be computed as the product of the failure probability given hazard occurrence and the consequence, e.g., the repair cost of the bridge superstructure due to deck unseating. The failure probability is commonly computed through vulnerability assessment. For illustrative purposes, the probability of failure of the bridge is assumed to be 0.1. The repair cost of the bridge is taken as 12.832 million USD [55]. Then the loss severity can be computed as 1.283 million USD. Herein, the mean value of the loss severity is taken as 1.283 million USD and the loss severity is assumed to follow an exponential distribution. The monetary discount rate is 2% for a long time horizon. The service life of the bridge is 75 years. Given these parameters, the first-four central moments can be computed as 12.2099×10^{6} USD, 4.3773×10^{6} USD, 0.6101, and 3.5411, which are essential inputs for MEM. The Lagrangian multipliers of MEM are estimated as λ=[0.3459,0.5015,−0.1287,0.0164]. Figure 3 shows the PDFs and CDFs in logarithmic scale of the probabilistic long-term loss using the proposed method and Monte Carlo simulation. The PDF and CDF obtained by the proposed method accord well with those estimated by 10^{6} MCS runs. For parametric analysis, when the service life reaches 150 years, the associated PDF and CDF can be reconstructed as shown in Fig. 4, where the first-four central moments of long-term loss are 14.9343×10^{6} USD, 4.4849×10^{6} USD, 0.5735, and 3.4922. The Lagrangian multipliers for MEM are estimated as λ=[0.3169,0.5012,−0.1163,0.0141]. Again, the results show the accuracy of the proposed method.
Apart from the service life, the monetary discount rate can be an essential parameter for the loss assessment. The probabilistic long-term loss considering a monetary discount rate of r=3% is also estimated. Figures 5 and 6 show the probability distributions of long-term loss with 75-year and 150-year service lives, respectively. For the 75-year service life, the first-four central moments are 9.3735×10^{6} USD, 3.6461×10^{6} USD, 0.7108, and 3.7512. For the 150-year service life, the first-four analytical central moments are 10.3614×10^{6} USD, 3.3663×10^{6} USD, 0.7000, and 3.7349. The Lagrangian multipliers for the two cases are obtained as λ=[0.4030,0.5155,−0.1529,0.0197] and λ=[0.3968,0.5142,−0.1502,0.0192], respectively. Compared with the cases with r=2%, the losses with a larger discount rate are more right-skewed. Decision-makers may need to pay special attention to the potential tail risks.
Moreover, the loss severity can be affected by the structural performance, as it is the product of the failure probability and the repair cost. The structural performance can be reduced due to hazards or improved due to repair actions. To identify the impact of changes in loss severity on the probability distribution of the long-term loss, Figs. 7 and 8 show the PDFs and CDFs of the long-term loss considering different loss severities. For instance, Fig. 7 shows the case with doubled expected loss severity (\(\mathbb {E}\left [ L \right ] = 2 \times 1.283 \times {10^{6}}\) USD). The distribution of the loss severity remains exponential, and the service life and the monetary discount rate are still 75 years and 2%, respectively. By using the proposed approach, the first-four central moments are 24.4197×10^{6} USD, 8.7546×10^{6} USD, 0.6101, and 3.5411, and the Lagrangian multipliers for MEM are λ=[0.3459,0.5015,−0.1287,0.0164]. Figure 8 gives the scenario in which the expected loss severity is reduced by half (\(\mathbb {E}\left [ L \right ] = 0.5 \times 1.283 \times {10^{6}}\) USD). Likewise, the service life is 75 years and the monetary discount rate is 2%. The first-four central moments are 6.1049×10^{6} USD, 2.1886×10^{6} USD, 0.6101, and 3.5411, and the associated Lagrangian multipliers for MEM are λ=[0.3459,0.5015,−0.1287,0.0164]. Again, the results produced by the proposed method are in good agreement with those obtained by MCS.
Moreover, the comparison of the first-four central moments obtained by the proposed method and MCS (10^{6} runs) is provided in Table 1; the unit of the mean and standard deviation is 10^{6} USD. Both the proposed method and MCS are performed on a computer with an Intel(R) Core(TM) i7-9750H CPU at 2.60 GHz and 16 GB RAM. As seen from Table 1, the estimated first-four central moments accord well with those obtained by MCS; the maximum relative error is below 1%, which illustrates the accuracy of the proposed method. Furthermore, the CPU time required by the proposed method is within a few seconds, while that of MCS is much larger. For instance, in case 2, MCS consumes more than 600 s while the proposed approach takes only 3.8 s. The proposed method shows significant efficiency, especially when various parameters are taken into account in the probabilistic analysis of long-term loss. The results in Figs. 3, 4, 5, 6, 7 and 8 show that the proposed method incorporating analytical statistical moments and the maximum entropy method achieves high accuracy in predicting the probability distribution of long-term loss. Apart from the expected long-term loss, decision-makers may pay special attention to the upper tail of the long-term loss, as the upper tail is associated with extreme cost; therefore, the accuracy of the upper tail can be essential. For the case shown in Fig. 3, different percentiles of long-term loss obtained by the proposed method and MCS are listed in Table 2. It can be concluded that the proposed method obtains results comparable with MCS, which also means that the proposed method can ensure the accuracy of the upper tail of the PDF. Besides, decision-makers can identify the characteristics of extreme losses based on the upper tail of the PDF of long-term loss.
For instance, as shown in Table 2, when the 95th percentile is of interest, decision-makers can conclude that they are 95% confident that the long-term loss of this bridge will not exceed 20.16 × 10^{6} USD within the 75-year service life. Such statements may serve as metrics for risk-based decision-making.
By employing the proposed approach, the PDF of long-term loss can be evaluated under various given parameters related to loss estimation, i.e., the monetary discount rate r, the time interval t_{int}, and the expected loss severity \(\mathbb {E}\left [ L \right ]\). The basic parameters are defined as r = 2%, t_{int} = 75 years, and \(\mathbb {E}\left [ L \right ]=1.283\times 10^{6}\) USD, and for each case only one parameter is varied for comparison. Figure 9 shows the PDF of long-term loss under different given parameters, and the first four central moments under the various parameters are listed in Table 3. The expectation of the long-term loss is sensitive to changes in the monetary discount rate, the expected loss severity, and the service life, as shown in Fig. 9(a) to (c). As seen in Fig. 9(b), the standard deviation is barely affected by the service life when the discount rate and loss severity remain unchanged. As shown in Table 3, the skewness and kurtosis are not affected by changes in the loss severity under the same service life and discount rate.
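The qualitative trends in Table 3 can be cross-checked analytically. For the stationary model assumed here (HPP arrivals, exponential severity, discounting), Campbell's theorem gives the cumulants of the discounted loss as κ_n = λ E[L^n](1 − e^{−nrT})/(nr). The sketch below is consistent with, but independent of, the paper's MGF-based derivation, and the hazard rate λ is again a hypothetical placeholder:

```python
import math

def cumulants(lam, mu, T, r, n_max=4):
    """First n_max cumulants of sum_i L_i * exp(-r * t_i) for HPP arrivals
    (rate lam) and exponential severities (mean mu), via Campbell's theorem:
    kappa_n = lam * E[L^n] * (1 - exp(-n*r*T)) / (n*r), with E[L^n] = n! * mu^n."""
    return [lam * math.factorial(n) * mu**n * (1.0 - math.exp(-n * r * T)) / (n * r)
            for n in range(1, n_max + 1)]

def summary(lam, mu, T, r):
    """Mean, standard deviation, skewness, and kurtosis from the cumulants."""
    k1, k2, k3, k4 = cumulants(lam, mu, T, r)
    return {"mean": k1, "std": math.sqrt(k2),
            "skewness": k3 / k2**1.5, "kurtosis": k4 / k2**2 + 3.0}

base = summary(lam=0.5, mu=1.283e6, T=75.0, r=0.02)
doubled_severity = summary(lam=0.5, mu=2 * 1.283e6, T=75.0, r=0.02)
longer_life = summary(lam=0.5, mu=1.283e6, T=100.0, r=0.02)
```

Scaling the loss severity rescales the mean and standard deviation but leaves skewness and kurtosis untouched (they are scale-free in μ), and once e^{−2rT} is small the standard deviation saturates in T — matching the observations drawn from Fig. 9(b) and Table 3.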
Concluding remarks
In this paper, a probabilistic analysis framework for the long-term loss of civil infrastructures under hazards is proposed. The probability distribution of the long-term loss can be effectively attained by incorporating the analytical statistical moments and the maximum entropy approach. Besides the mean and the standard deviation of the long-term loss, its higher-order moments, i.e., skewness and kurtosis, can be assessed by using the MGF based on the stochastic renewal process. Once the first four analytical statistical moments are obtained, the maximum entropy method can be used to derive the underlying probability density function of the long-term loss. Compared with Monte Carlo simulation, the proposed method significantly reduces the computational effort of long-term loss assessment. The illustrative example demonstrates the accuracy and efficiency of the proposed approach: the PDF and CDF obtained by the proposed method accord well with those estimated by Monte Carlo simulation. The proposed method is conditioned on several assumptions. For instance, the hazard arrivals follow a homogeneous Poisson process, and the analytical derivations of the statistical moments assume that the loss severity follows an exponential distribution. Future studies are encouraged to explore nonstationary hazard models and different loss severity models; vulnerability assessment should also be considered.
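To make the final step of the pipeline concrete, the sketch below recovers a density from its first four raw moments with the maximum entropy method, i.e., by fitting p(x) = exp(−λ0 − Σ λ_i x^i) through the convex dual problem. This is a generic illustration rather than the paper's implementation; the test moments are those of a standardized exponential variable (0, 1, 2, 9), for which the recovered density should be close to a shifted exponential:

```python
import numpy as np
from scipy.optimize import minimize

def maxent_pdf(moments, grid):
    """Maximum entropy density p(x) = exp(-lam0 - sum_i lam_i * x**i) whose
    first raw moments match `moments`, found by minimising the convex dual
    potential Gamma(lam) = log Z(lam) + lam . moments."""
    m = np.asarray(moments, dtype=float)
    powers = np.vstack([grid ** i for i in range(1, m.size + 1)])  # shape (k, n)

    def trapz(y):  # version-proof trapezoidal rule on `grid`
        return float(np.sum((y[1:] + y[:-1]) * np.diff(grid)) / 2.0)

    def state(lam):
        expo = -(lam @ powers)
        c = expo.max()                                   # stabilise the exponential
        w = np.exp(expo - c)
        z = trapz(w)
        emom = np.array([trapz(w * p) for p in powers]) / z
        return c + np.log(z), w / z, emom                # log Z, pdf, achieved moments

    objective = lambda lam: state(lam)[0] + lam @ m
    gradient = lambda lam: m - state(lam)[2]             # dGamma/dlam_i = m_i - E[x^i]
    res = minimize(objective, np.zeros(m.size), jac=gradient,
                   method="BFGS", options={"maxiter": 1000})
    _, pdf, emom = state(res.x)
    return pdf, emom

# Hypothetical check: raw moments of a standardised exponential variable
grid = np.linspace(-1.0, 9.0, 801)
pdf, emom = maxent_pdf([0.0, 1.0, 2.0, 9.0], grid)
```

In the paper's setting, the four moments would instead come from the analytical MGF-based expressions, and the grid would span the plausible range of the long-term loss.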
Availability of data and materials
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Abbreviations
 HPP: homogeneous Poisson process
 MGF: moment generating function
References
Giouvanidis AI, Dong Y (2020) Seismic loss and resilience assessment of single-column rocking bridges. Bull Earthq Eng 18:4481–4513. https://doi.org/10.1007/s10518-020-00865-5
Zhu D, Li Y, Dong Y, Yuan P (2021) Long-term loss assessment of coastal bridges from hurricanes incorporating overturning failure mode. Adv Bridge Eng 2(1):1–15. https://doi.org/10.1186/s43251-020-00030-7
Cheng M, Frangopol DM (2021) Life-cycle optimization of structural systems based on cumulative prospect theory: Effects of the reference point and risk attitudes. Reliab Eng Syst Saf 218:108100. https://doi.org/10.1016/j.ress.2021.108100
Goda K, Hong H (2006) Optimal seismic design considering risk attitude, societal tolerable risk level, and life quality criterion. J Struct Eng 132(12):2027–2035. https://doi.org/10.1061/(ASCE)0733-9445(2006)132:12(2027)
Goda K, Hong H (2008) Application of cumulative prospect theory: Implied seismic design preference. Struct Saf 30(6):506–516. https://doi.org/10.1016/j.strusafe.2007.09.007
Cheng M, Frangopol DM (2022) Life-cycle optimization of structural systems based on cumulative prospect theory: Effects of the reference point and risk attitudes. Reliab Eng Syst Saf 218:108100. https://doi.org/10.1016/j.ress.2021.108100
Pandey MD, Van Der Weide J (2017) Stochastic renewal process models for estimation of damage cost over the life-cycle of a structure. Struct Saf 67:27–38. https://doi.org/10.1016/j.strusafe.2017.03.002
Li Y, Dong Y, Qian J (2020) Higher-order analysis of probabilistic long-term loss under nonstationary hazards. Reliab Eng Syst Saf 203:107092. https://doi.org/10.1016/j.ress.2020.107092
Pandey MD, van der Weide J (2018) Probability distribution of the seismic damage cost over the life cycle of structures. Struct Saf 72:74–83. https://doi.org/10.1016/j.strusafe.2017.12.007
Wang C, Zhang H (2018) Probability-based estimate of tropical cyclone damage: An explicit approach and application to Hong Kong, China. Eng Struct 167:471–480. https://doi.org/10.1016/j.engstruct.2018.04.064
Zhao YG, Ono T (2001) Moment methods for structural reliability. Struct Saf 23(1):47–75. https://doi.org/10.1016/S0167-4730(00)00027-8
Rahman S, Xu H (2004) A univariate dimension-reduction method for multidimensional integration in stochastic mechanics. Probabilistic Eng Mech 19(4):393–408. https://doi.org/10.1016/j.probengmech.2004.04.003
Xu H, Rahman S (2004) A generalized dimension-reduction method for multidimensional integration in stochastic mechanics. Int J Numer Methods Eng 61(12):1992–2019. https://doi.org/10.1002/nme.1135
Xu J, Kong F (2018) A cubature collocation based sparse polynomial chaos expansion for efficient structural reliability analysis. Struct Saf 74:24–31. https://doi.org/10.1016/j.strusafe.2018.04.001
Mysovskikh I (1980) The approximation of multiple integrals by using interpolatory cubature formulae. In: Quantitative Approximation. Elsevier, pp 217–243. https://doi.org/10.1016/B978-0-12-213650-4.50025-8
Liu R, Fan W, Wang Y, Ang AHS, Li Z (2019) Adaptive estimation for statistical moments of response based on the exact dimension reduction method in terms of vector. Mech Syst Signal Process 126:609–625. https://doi.org/10.1016/j.ymssp.2019.02.035
Chen Z, Zhou P, Liu Y, Ji P (2019) A novel approach to uncertainty analysis using methods of hybrid dimension reduction and improved maximum entropy. Struct Multidiscip Optim 60(5):1841–1866. https://doi.org/10.1007/s00158-019-02294-8
Xu J, Zhang Y, Dang C (2020) A novel hybrid cubature formula with Pearson system for efficient moment-based uncertainty propagation analysis. Mech Syst Signal Process 140:106661. https://doi.org/10.1016/j.ymssp.2020.106661
Xi Z, Hu C, Youn BD (2012) A comparative study of probability estimation methods for reliability analysis. Struct Multidiscip Optim 45(1):33–52. https://doi.org/10.1007/s00158-011-0656-5
Pearson K (1894) Contributions to the mathematical theory of evolution. Philos Trans R Soc Lond A 185:71–110
Youn BD, Xi Z, Wang P (2008) Eigenvector dimension reduction (EDR) method for sensitivity-free probability analysis. Struct Multidiscip Optim 37(1):13–28. https://doi.org/10.1007/s00158-007-0210-7
Johnson NL, Kotz S, Balakrishnan N (1995) Continuous Univariate Distributions, vol 2. Wiley, Hoboken
Li G, He W, Zeng Y (2019) An improved maximum entropy method via fractional moments with Laplace transform for reliability analysis. Struct Multidiscip Optim 59(4):1301–1320. https://doi.org/10.1007/s00158-018-2129-6
Huang B, Du X (2005) Uncertainty analysis by dimension reduction integration and saddlepoint approximations. J Mech Des 128(1):26–33. https://doi.org/10.1115/1.2118667
Corlu CG, Meterelliyoz M (2016) Estimating the parameters of the generalized lambda distribution: which method performs best? Commun Stat Simul Comput 45(7):2276–2296. https://doi.org/10.1080/03610918.2014.901355
Low YM (2013) A new distribution for fitting four moments and its applications to reliability analysis. Struct Saf 42:12–25. https://doi.org/10.1016/j.strusafe.2013.01.007
Xu J, Dang C (2019) A new bivariate dimension reduction method for efficient structural reliability analysis. Mech Syst Signal Process 115:281–300. https://doi.org/10.1016/j.ymssp.2018.05.046
Zhao YG, Zhang XY, Lu ZH (2018) Complete monotonic expression of the fourth-moment normal transformation for structural reliability. Comput Struct 196:186–199. https://doi.org/10.1016/j.compstruc.2017.11.006
Winterstein SR (1988) Nonlinear vibration models for extremes and fatigue. J Eng Mech 114(10):1772–1790. https://doi.org/10.1061/(ASCE)0733-9399(1988)114:10(1772)
Zhao YG, Zhang XY, Lu ZH (2018) A flexible distribution and its application in reliability engineering. Reliab Eng Syst Saf 176:1–12. https://doi.org/10.1016/j.ress.2018.03.026
Jaynes ET (1957) Information theory and statistical mechanics. Phys Rev 106(4):620. https://doi.org/10.1103/PhysRev.106.620
Shore J, Johnson R (1980) Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans Inf Theory 26(1):26–37. https://doi.org/10.1109/TIT.1980.1056144
Li G, Zhang K (2011) A combined reliability analysis approach with dimension reduction method and maximum entropy method. Struct Multidiscip Optim 43(1):121–134. https://doi.org/10.1007/s00158-010-0546-2
Rajan A, Kuang YC, Ooi MPL, Demidenko SN, Carstens H (2018) Moment-constrained maximum entropy method for expanded uncertainty evaluation. IEEE Access 6:4072–4082. https://doi.org/10.1109/ACCESS.2017.2787736
Hao W, Harlim J (2018) An equation-by-equation method for solving the multidimensional moment constrained maximum entropy problem. Commun Appl Math Comput Sci 13(2):189–214. https://doi.org/10.2140/camcos.2018.13.189
He W, Zeng Y, Li G (2019) A novel structural reliability analysis method via improved maximum entropy method based on nonlinear mapping and sparse grid numerical integration. Mech Syst Signal Process 133:106247. https://doi.org/10.1016/j.ymssp.2019.106247
Zhang X, Pandey MD (2013) Structural reliability analysis based on the concepts of entropy, fractional moment and dimensional reduction method. Struct Saf 43:28–40. https://doi.org/10.1016/j.strusafe.2013.03.001
Xu J (2016) A new method for reliability assessment of structural dynamic systems with random parameters. Struct Saf 60:130–143. https://doi.org/10.1016/j.strusafe.2016.02.005
Xu J, Kong F (2018) An adaptive cubature formula for efficient reliability assessment of nonlinear structural dynamic systems. Mech Syst Signal Process 104:449–464. https://doi.org/10.1016/j.ymssp.2017.10.039
Xu J, Kong F (2019) Adaptive scaled unscented transformation for highly efficient structural reliability analysis by maximum entropy method. Struct Saf 76:123–134. https://doi.org/10.1016/j.strusafe.2018.09.001
He S, Xu J, Zhang Y (2021) Reliability computation via a transformed mixed-degree cubature rule and maximum entropy. Appl Math Model. https://doi.org/10.1016/j.apm.2021.11.016
Kagan YY, Jackson DD (1991) Long-term earthquake clustering. Geophys J Int 104(1):117–133. https://doi.org/10.1111/j.1365-246X.1991.tb02498.x
Rackwitz R (2002) Optimization and risk acceptability based on the life quality index. Struct Saf 24(2–4):297–331. https://doi.org/10.1016/S0167-4730(02)00029-2
Elsner JB, Bossak BH (2001) Bayesian analysis of US hurricane climate. J Clim 14(23):4341–4350. https://doi.org/10.1175/1520-0442(2001)014<4341:BAOUSH>2.0.CO;2
Katz RW (2002) Stochastic modeling of hurricane damage. J Appl Meteorol 41(7):754–762. https://doi.org/10.1175/1520-0450(2002)041<0754:SMOHD>2.0.CO;2
Matthews MV, Ellsworth WL, Reasenberg PA (2002) A Brownian model for recurrent earthquakes. Bull Seismol Soc Am 92(6):2233–2250. https://doi.org/10.1785/0120010267
Ellingwood BR, Lee JY (2016) Managing risks to civil infrastructure due to natural hazards: communicating long-term risks due to climate change. In: Risk Analysis of Natural Hazards. Springer, pp 97–112. https://doi.org/10.1007/978-3-319-22126-7_7
Yang DY, Frangopol DM (2019) Life-cycle management of deteriorating civil infrastructure considering resilience to lifetime hazards: A general approach based on renewal-reward processes. Reliab Eng Syst Saf 183:197–212. https://doi.org/10.1016/j.ress.2018.11.016
Ramkrishnan R, Kolathayar S, Sitharam T (2021) Probabilistic seismic hazard analysis of North and Central Himalayas using regional ground motion prediction equations. Bull Eng Geol Environ 80(10):8137–8157. https://doi.org/10.1007/s10064-021-02434-9
Li Y, Dong Y, Frangopol DM, Gautam D (2020) Long-term resilience and loss assessment of highway bridges under multiple natural hazards. Struct Infrastruct Eng 16(4):626–641. https://doi.org/10.1080/15732479.2019.1699936
Ross SM (2014) Introduction to Probability Models. Academic Press, San Diego
Smith RL (2003) Statistics of extremes, with applications in environment, insurance, and finance. In: Extreme Values in Finance, Telecommunications, and the Environment. Chapman and Hall/CRC, New York, pp 20–97
Li Y, Dong Y (2019) Risk-informed hazard loss of bridges in a life-cycle context. In: ICASP13. https://doi.org/10.22725/ICASP13.120
Li Y, Dong Y, Zhu D, et al (2020) Copula-based vulnerability analysis of civil infrastructure subjected to hurricanes. Front Built Environ 6:571911. https://doi.org/10.3389/fbuil.2020.571911
Mondoro A, Frangopol DM, Soliman M (2017) Optimal risk-based management of coastal bridges vulnerable to hurricanes. J Infrastruct Syst 23(3):04016046. https://doi.org/10.1061/(ASCE)IS.1943-555X.0000346
Acknowledgments
The main work of this research was conducted at The Hong Kong Polytechnic University (PolyU). The authors gratefully acknowledge the support of PolyU and international collaborators.
Funding
This study has been supported by the Research Institute for Sustainable Urban Development, the Hong Kong Polytechnic University (PolyU1BBWM), the National Natural Science Foundation of China (Grant No. 52078448), and the Research Grants Council of the Hong Kong Special Administrative Region, China (No. PolyU 15219819 and PolyU 15221521).
Author information
Authors and Affiliations
Contributions
ZY built the framework of the proposed method and was a major contributor in writing this manuscript. LY analyzed and interpreted the data and wrote this manuscript. DY helped review and edit this manuscript. All authors read and approved the final manuscript.
Corresponding author
Ethics declarations
Consent for publication
Not applicable.
Competing interests
The authors declare that they have no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Zhang, Y., Li, Y. & Dong, Y. Probabilistic analysis of long-term loss incorporating maximum entropy method and analytical higher-order moments. J Infrastruct Preserv Resil 3, 7 (2022). https://doi.org/10.1186/s43065-022-00052-7
DOI: https://doi.org/10.1186/s43065-022-00052-7
Keywords
 Stochastic model of hazards
 Long-term loss
 Moment generating function
 Analytical higher-order moments
 Maximum entropy method