How to aggregate ranvars across echelons with a Gaussian copula

This guide shows how to aggregate store-level ranvars into a warehouse-level ranvar while preserving shared demand shocks. It compares three cases: independence, full dependence as a stress test, and a Gaussian copula in between.

Use this pattern when the downstream ranvars already include local starting conditions such as store stock, and the upstream echelon must inherit that information.

Step 1: Build the downstream marginals

Model one ranvar per downstream location after local stock has already been consumed. In the example below, Stores.UncoveredDemand is the quantity that must still be supplied by the upstream echelon.
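The key point is that max(Demand - StockOnHand, 0) is applied to a distribution, not to a number: all the probability mass at demand levels at or below the stock level collapses onto zero, and the rest of the distribution shifts left. A minimal Python sketch of this transform on an explicit pmf (toy values, not the negative-binomial marginals used below):

```python
def uncovered_pmf(pmf, stock):
    # Distribution of max(D - stock, 0), where pmf[k] = P(D = k):
    # the mass at demand levels <= stock collapses onto zero,
    # the tail shifts left by `stock` units.
    return [sum(pmf[: stock + 1])] + pmf[stock + 1 :]

# A store facing demand P(D = 0..3) = (0.1, 0.3, 0.4, 0.2) with 1 unit
# on hand: uncovered demand is (0.4, 0.4, 0.2) on 0..2.
print(uncovered_pmf([0.1, 0.3, 0.4, 0.2], 1))  # ~[0.4, 0.4, 0.2]
```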

Step 2: Aggregate with three dependence assumptions

Paste the script below into a new Envision project and run it.

table Stores[store] = with
  [| as store, as StockOnHand, as Mu, as Dispersion |]
  [| "Paris" , 3, 6, 2.0 |]
  [| "Lyon"  , 1, 4, 1.5 |]
  [| "Lille" , 0, 2, 1.2 |]

Stores.Demand = negativeBinomial(Stores.Mu, Stores.Dispersion)
Stores.UncoveredDemand = max(Stores.Demand - Stores.StockOnHand, 0)

// Baseline: independent downstream locations.
warehouseCov0 = sum(Stores.UncoveredDemand)

// Stress test: fully dependent locations, using a common quantile draw.
nbSamples = 1000
montecarlo nbSamples with
  u = random.uniform(0, 1)
  total = sum(quantile(Stores.UncoveredDemand, u))
  sample warehouseCov1 = ranvar(total)

// Core method: Gaussian copula between the store marginals.
rho = 0.55
epsilon = 0.0001 // `normal.quantile` expects p in ]0, 1[

montecarlo nbSamples with
  commonU = max(epsilon, min(1 - epsilon, random.uniform(0, 1)))
  globalShock = normal.quantile(0, 1, commonU)

  Stores.LocalNoise = random.normal(0 into Stores, 1)
  Stores.Latent = rho * globalShock + sqrt(1 - rho^2) * Stores.LocalNoise
  Stores.U = normal.cdf(0, 1, Stores.Latent)

  total = sum(quantile(Stores.UncoveredDemand, Stores.U))
  sample warehouseCopula = ranvar(total)

show table "Store marginals" with
  Stores.store
  Stores.StockOnHand
  mean(Stores.UncoveredDemand) as "Mean uncovered"
  variance(Stores.UncoveredDemand) as "Var uncovered"
  quantile(Stores.UncoveredDemand, 0.9) as "P90 uncovered"

show summary "Warehouse comparison" with
  mean(warehouseCov0) as "Mean cov=0" // ~8.26
  variance(warehouseCov0) as "Var cov=0" // ~18.33
  quantile(warehouseCov0, 0.9) as "P90 cov=0" // 14
  mean(warehouseCov1) as "Mean cov=1" // ~7.82
  variance(warehouseCov1) as "Var cov=1" // ~44.40
  quantile(warehouseCov1, 0.9) as "P90 cov=1" // 17
  mean(warehouseCopula) as "Mean copula" // ~8.19
  variance(warehouseCopula) as "Var copula" // ~27.64
  quantile(warehouseCopula, 0.9) as "P90 copula" // 15

Run the script and verify that the displayed figures match the values given in the comments, up to Monte Carlo noise: the three aggregates share roughly the same mean, while the variance and the P90 increase with the strength of the dependence.

The copula step relies on normal.quantile and normal.cdf to generate correlated uniforms directly, without going through a discretized zedfunc representation of the standard normal CDF.
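The same three-way comparison can be reproduced outside Envision. The sketch below is Python, not Envision, and uses small made-up discrete marginals rather than the negative binomials of the script; it shows the expected pattern: near-identical means, with the copula variance sitting between the independent and the fully dependent cases.

```python
import random
from statistics import NormalDist, mean, pvariance

nd = NormalDist()
random.seed(42)

# Toy per-store uncovered-demand pmfs (illustrative values only).
pmfs = [
    [0.2, 0.3, 0.3, 0.2],
    [0.4, 0.3, 0.2, 0.1],
    [0.5, 0.3, 0.1, 0.1],
]
cdfs = [[sum(pmf[: k + 1]) for k in range(len(pmf))] for pmf in pmfs]

def quantile(cdf, u):
    # Inverse CDF of a discrete marginal.
    for k, c in enumerate(cdf):
        if u <= c:
            return k
    return len(cdf) - 1

rho, n = 0.55, 20_000
tot_ind, tot_full, tot_cop = [], [], []
for _ in range(n):
    # Independence: one fresh uniform per store.
    tot_ind.append(sum(quantile(c, random.random()) for c in cdfs))
    # Full dependence: a single shared uniform for all stores.
    u = random.random()
    tot_full.append(sum(quantile(c, u) for c in cdfs))
    # Gaussian copula: shared latent shock plus local noise.
    g = random.gauss(0.0, 1.0)
    total = 0
    for c in cdfs:
        latent = rho * g + (1 - rho * rho) ** 0.5 * random.gauss(0.0, 1.0)
        total += quantile(c, nd.cdf(latent))
    tot_cop.append(total)

# Means agree up to sampling noise; variances are strictly ordered.
print(round(mean(tot_ind), 1), round(mean(tot_cop), 1), round(mean(tot_full), 1))
print(round(pvariance(tot_ind), 2), "<", round(pvariance(tot_cop), 2),
      "<", round(pvariance(tot_full), 2))
```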

Step 3: Recover rho from censored downstream time series

Use a second script when you want to estimate the dependence parameter from a panel of synchronized downstream histories. The script below generates 100 store time series with negative-binomial marginals and a Gaussian copula, then applies stock caps to create days where demand is only partially observable.

After generation, the calibration uses only the uncensored observations, i.e. the sales recorded on days where the stock cap did not bind. The estimated quantity is the average pair correlation of the latent Gaussian layer; the rho used in the copula construction is then recovered as its square root.

trueRho = 0.55
truePairCorr = trueRho^2

numStores = 100
epsilon = 0.0001

startDate = date(2024, 1, 1)
endDate = date(2026, 9, 26)
keep span date = [startDate .. endDate]

table Stores max 100 = extend.range(numStores)
Stores.Mu = 2 + Stores.N / 20
Stores.Dispersion = max(1, Stores.Mu * 2)
Stores.DemandRanvar = negativeBinomial(Stores.Mu, Stores.Dispersion)
Stores.Cdf = cdf(Stores.DemandRanvar)

Day.CommonU = random.uniform(epsilon into Day, 1 - epsilon)
Day.Global = normal.quantile(0, 1, Day.CommonU)

table StoreDays max 200k = cross(Stores, Day)
StoreDays.Local = random.normal(0 into StoreDays, 1)
StoreDays.Latent = trueRho * Day.Global + sqrt(1 - truePairCorr) * StoreDays.Local
StoreDays.U = normal.cdf(0, 1, StoreDays.Latent)
StoreDays.Demand = quantile(Stores.DemandRanvar, StoreDays.U)

// On capped days, only the sales are observed, not the full demand.
StoreDays.StockCap = quantile(Stores.DemandRanvar, random.uniform(0.3 into StoreDays, 0.8))
StoreDays.Sales = min(StoreDays.Demand, StoreDays.StockCap)
StoreDays.IsCensored = StoreDays.Sales >= StoreDays.StockCap
StoreDays.IsObserved = not StoreDays.IsCensored

StoreDays.F = valueAt(Stores.Cdf, StoreDays.Sales)
StoreDays.P = int(Stores.DemandRanvar, StoreDays.Sales, StoreDays.Sales)
StoreDays.Uhat = max(epsilon, min(1 - epsilon, StoreDays.F - 0.5 * StoreDays.P))
StoreDays.Zhat = if StoreDays.IsObserved then normal.quantile(0, 1, StoreDays.Uhat) else 0

Day.ObservedStoreCount = count(StoreDays.*) when StoreDays.IsObserved
Day.SumZ = sum(StoreDays.Zhat) when StoreDays.IsObserved
Day.SumZ2 = sum(StoreDays.Zhat^2) when StoreDays.IsObserved
Day.PairSignal = if Day.ObservedStoreCount < 2 then 0
                 else (Day.SumZ^2 - Day.SumZ2) /
                      (Day.ObservedStoreCount * (Day.ObservedStoreCount - 1))

pairCorr = avg(Day.PairSignal) when (Day.ObservedStoreCount >= 2)

rho = sqrt(pairCorr)

show summary "Recovered dependence" with
  trueRho as "True rho" // 0.55
  rho as "Learned rho" // ~0.60
  truePairCorr as "True pair corr" // 0.3025
  pairCorr as "Estimated pair corr" // ~0.3599
  avg(Day.ObservedStoreCount) as "Avg observed stores" // ~49.7

Run the script and verify that the learned rho lands close to the true value of 0.55. A residual gap of a few points, as shown in the comments, is expected: the censored days are discarded rather than modeled, which introduces a mild selection effect.
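The midpoint transform used for StoreDays.Uhat, i.e. F(x) − 0.5 · P(D = x), can also be checked in isolation: applied to samples of a discrete distribution, it yields values on [0, 1] with mean exactly 0.5. A Python sketch (toy geometric marginal, not the negative binomials above):

```python
import random

# Toy discrete marginal: geometric P(D = k) = (1 - q) * q^k, truncated.
q, kmax = 0.6, 80
pmf = [(1 - q) * q ** k for k in range(kmax + 1)]
cdf, acc = [], 0.0
for p in pmf:
    acc += p
    cdf.append(acc)

def midpoint_u(k):
    # F(k) - 0.5 * P(D = k): maps a discrete sample back to an
    # approximately uniform value on [0, 1], with mean exactly 0.5.
    return cdf[k] - 0.5 * pmf[k]

random.seed(7)
def draw():
    u = random.random()
    for k, c in enumerate(cdf):
        if u <= c:
            return k
    return kmax

us = [midpoint_u(draw()) for _ in range(20_000)]
print(round(sum(us) / len(us), 2))  # ~0.5
```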
