Highly inaccurate result in OpenTURNS reliability model when the limit state function uses only a subset of the RandomVector


I have a reliability model built in OpenTURNS with several limit state functions that take anywhere from two to eight random variables (RVs). My initial approach was to define a single RandomVector with all of the variables and use that RandomVector for every event calculation. For the two-variable limit state function, the results are sensible with Monte Carlo but completely inaccurate with FORM or SORM. However, when I run FORM or SORM with a RandomVector containing just the two RVs used by that limit state function, the results are correct.

The correct probability is about 0.000427, whereas both FORM and SORM with the eight-var model return values on the order of 1e-29. With the two-var model, FORM returns the correct value of 0.000427.

The components of the design point vectors are similar when using the two-variable or eight-variable RandomVectors:

  • Design point for eight-var model (see first and third elements): [-0.445716,0.0305458,3.30454,-0.119868,0.0317001,-0.0382662,-0.0233416,7.59606,7.5671]

  • Design point for two-var model: [-0.438289,3.30553]

Please see the reprex below. I'm using OpenTURNS 1.14 on Windows 10.

import openturns as ot

# Define the marginal distributions for all input variables
wt_dist = ot.Normal(0.156, 0.003666)
od_dist = ot.Normal(8.625, 0.0146625)
d_dist = ot.Normal(0.063, 0.0276486)
lg_dist = ot.Normal(2.36, 0.143478)
ys_dist = ot.Normal(57000, 2700)
ts_dist = ot.Normal(80565, 3868)
cv_dist = ot.TruncatedDistribution(ot.Normal(37, 5), 4)
mdlerr_dist = ot.Dirac(1)
press_dist = ot.Dirac(1140.3)

# Setup FORM optimizer
optimizer = ot.Cobyla()
eps = 1e-10
optimizer.setMaximumIterationNumber(5000)
optimizer.setMaximumAbsoluteError(eps)
optimizer.setMaximumRelativeError(eps)
optimizer.setMaximumResidualError(eps)
optimizer.setMaximumConstraintError(eps)

# === Full model ===
marginals = [
    wt_dist,
    od_dist,
    d_dist,
    lg_dist,
    ys_dist,
    ts_dist,
    cv_dist,
    mdlerr_dist,
    press_dist
    ]
n_vars = len(marginals)

# Define correlations between variables (using the normal copula)
cor_mat = ot.CorrelationMatrix(n_vars)
cor_mat[4, 5] = cor_mat[5, 4] = 0.98675
copula = ot.NormalCopula(cor_mat)
composed_dist = ot.ComposedDistribution(marginals, copula)
composed_dist.setName("Distributions")
composed_dist.setDescription(['WT', 'OD', 'D', 'L', 'YS', 'TS', 'CV', 'e', 'P'])
rv_vect = ot.RandomVector(composed_dist)  # vector of random variables

model = ot.SymbolicFunction(['WT', 'OD', 'D', 'L', 'YS', 'TS', 'CV', 'e', 'P'], ['WT-D'])
g = ot.CompositeRandomVector(model, rv_vect)
event = ot.ThresholdEvent(g, ot.Less(), 0.0)

# FORM test 1
algo = ot.FORM(optimizer, event, rv_vect.getMean())
algo.run()
result = algo.getResult()
prob_form1 = result.getEventProbability()
design_pt1 = result.getStandardSpaceDesignPoint()

# MC test 1
experiment = ot.MonteCarloExperiment()
algo = ot.ProbabilitySimulationAlgorithm(event, experiment)
algo.setMaximumCoefficientOfVariation(0.05)
algo.setMaximumOuterSampling(int(1e6))
algo.run()
result = algo.getResult()
prob_MC1 = result.getProbabilityEstimate()


# === Reduced model ===
marginals = [
    wt_dist,
    d_dist
    ]
n_vars = len(marginals)

# Define correlations between variables (using the normal copula)
cor_mat = ot.CorrelationMatrix(n_vars)
copula = ot.NormalCopula(cor_mat)
composed_dist = ot.ComposedDistribution(marginals, copula)
composed_dist.setName("Distributions")
composed_dist.setDescription(['WT', 'D'])
rv_vect = ot.RandomVector(composed_dist)  # vector of random variables

model = ot.SymbolicFunction(['WT', 'D'], ['WT-D'])
g = ot.CompositeRandomVector(model, rv_vect)
event = ot.ThresholdEvent(g, ot.Less(), 0.0)

# FORM test 2
algo = ot.FORM(optimizer, event, rv_vect.getMean())
algo.run()
result = algo.getResult()
prob_form2 = result.getEventProbability()
design_pt2 = result.getStandardSpaceDesignPoint()

# MC test 2
experiment = ot.MonteCarloExperiment()
algo = ot.ProbabilitySimulationAlgorithm(event, experiment)
algo.setMaximumCoefficientOfVariation(0.05)
algo.setMaximumOuterSampling(int(1e6))
algo.run()
result = algo.getResult()
prob_MC2 = result.getProbabilityEstimate()

print(prob_form1)
print(design_pt1)
print(prob_MC1)
print(prob_form2)
print(design_pt2)
print(prob_MC2)
Accepted answer:

You are using the Cobyla optimization algorithm, which is quite robust but demanding in terms of calls to the model. Each Cobyla iteration costs a number of model evaluations proportional to the dimension of the input random vector, because the algorithm builds and updates a linear approximation of the model from a simplex of n+1 points. With the full input vector, the algorithm stops because it hits the maximum number of evaluations allowed (100 by default) and you get the following warning:

WRN - Warning! The Cobyla algorithm failed to converge. The error message is Maximum number of function evaluations reached

1.4631933217717485e-29
[-0.445716,0.0305458,3.30454,-0.119868,0.0317001,-0.0382662,-0.0233416,7.59606,7.5671]
0.0004238055834266587
0.0004273278619031894
[-0.438289,3.30553]
0.0004415498399381834
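One way to confirm that the evaluation budget is what stops the search is to count the model calls consumed by the FORM run. The sketch below reuses the objects from the reprex (model, event, rv_vect and optimizer for the full model) and assumes the getEvaluationCallsNumber() counter on Function behaves in 1.14 as it does in current releases:

# Count how many model evaluations the FORM run consumes (full model)
calls_before = model.getEvaluationCallsNumber()
algo = ot.FORM(optimizer, event, rv_vect.getMean())
algo.run()
calls_after = model.getEvaluationCallsNumber()
print("Model evaluations used by FORM:", calls_after - calls_before)
# With the default Cobyla budget this should land close to the cap of 100,
# which is too few evaluations to reach the design point in this dimension.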

If you increase the bound on the number of evaluations, using:

optimizer.setMaximumEvaluationNumber(100000)

Then you get:

WRN - Warning! The Cobyla algorithm could not enforce the convergence criteria

0.0004273278619032821
[-0.438289,-3.05982e-08,3.30553,-2.76053e-08,-4.41471e-08,-4.71149e-08,-4.95428e-08,-5.77001e-09,1.00438e-07]
0.0004238055834266587
0.00042732786190326374
[-0.438289,3.30553]
0.0004415498399381834
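For completeness, here is how the optimizer block from the reprex might look with the larger budget added; apart from the new setMaximumEvaluationNumber line, everything is unchanged from the question:

# Optimizer setup with a larger evaluation budget
optimizer = ot.Cobyla()
eps = 1e-10
optimizer.setMaximumIterationNumber(5000)
optimizer.setMaximumEvaluationNumber(100000)  # default is 100, too small for the full model
optimizer.setMaximumAbsoluteError(eps)
optimizer.setMaximumRelativeError(eps)
optimizer.setMaximumResidualError(eps)
optimizer.setMaximumConstraintError(eps)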

This warning remains because a precision of 1e-10 on the solution can hardly be reached with Cobyla.
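If you want to see how far Cobyla actually got, the achieved errors can be read back from the FORM result. This is a sketch assuming getOptimizationResult() is available on the FORM result object in your version:

# Inspect the errors reached by the optimizer at the end of the FORM run;
# 'result' here is the FORMResult returned by algo.getResult() after FORM.
opt_result = result.getOptimizationResult()
print("Iterations       :", opt_result.getIterationNumber())
print("Absolute error   :", opt_result.getAbsoluteError())
print("Relative error   :", opt_result.getRelativeError())
print("Constraint error :", opt_result.getConstraintError())
# These typically stall above the requested 1e-10, which is what triggers
# the "could not enforce the convergence criteria" warning.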

Thanks for using OpenTURNS!