The starting point for many empirical Bayes tasks, such as inference or estimation, is to posit that the true prior \(G\) lies in a convex class of priors \(\mathcal{G}\). Such classes of priors are represented in this package through the abstract type,
ConvexPriorClass
Abstract type representing convex classes of probability distributions \(\mathcal{G}\).
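As a small illustrative sketch (assuming `Empirikos` and `Distributions` are loaded), each concrete class below is a subtype of `ConvexPriorClass`, and calling a class on a probability vector instantiates a concrete member of \(\mathcal{G}\) as a Distributions.jl distribution:

```julia
using Empirikos, Distributions

gcal = DiscretePriorClass([0.0, 0.5, 1.0])

# The concrete class is a subtype of the abstract type:
gcal isa Empirikos.ConvexPriorClass  # true

# Calling the class on simplex weights yields a member of the class,
# here a DiscreteNonParametric distribution:
g = gcal([0.2, 0.2, 0.6])
g isa DiscreteNonParametric  # true
```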
Currently, the following choices for \(\mathcal{G}\) are available:
DiscretePriorClass
DiscretePriorClass(support) <: Empirikos.ConvexPriorClass
Type representing the family of all discrete distributions supported on a subset of support, i.e., it represents all DiscreteNonParametric distributions with support = support and probs taking values on the probability simplex. Note that DiscretePriorClass(support)(probs) == DiscreteNonParametric(support, probs).
Examples
julia> gcal = DiscretePriorClass([0,0.5,1.0])
DiscretePriorClass | support = [0.0, 0.5, 1.0]
julia> gcal([0.2,0.2,0.6])
DiscreteNonParametric{Float64, Float64, Vector{Float64}, Vector{Float64}}(support=[0.0, 0.5, 1.0], p=[0.2, 0.2, 0.6])
MixturePriorClass
MixturePriorClass(components) <: Empirikos.ConvexPriorClass
Type representing the family of all mixture distributions with mixing components equal to components, i.e., it represents all MixtureModel distributions with components = components and probs taking values on the probability simplex. Note that MixturePriorClass(components)(probs) == MixtureModel(components, probs).
Examples
julia> gcal = MixturePriorClass([Normal(0,1), Normal(0,2)])
MixturePriorClass (K = 2)
Normal{Float64}(μ=0.0, σ=1.0)
Normal{Float64}(μ=0.0, σ=2.0)
julia> gcal([0.2,0.8])
MixtureModel{Normal{Float64}}(K = 2)
components[1] (prior = 0.2000): Normal{Float64}(μ=0.0, σ=1.0)
components[2] (prior = 0.8000): Normal{Float64}(μ=0.0, σ=2.0)
GaussianScaleMixtureClass
GaussianScaleMixtureClass(σs) <: Empirikos.ConvexPriorClass
Type representing the family of mixtures of Gaussians with mean 0 and standard deviations equal to σs. GaussianScaleMixtureClass(σs) represents the same class of distributions as MixturePriorClass(Normal.(0, σs)).
Examples
julia> gcal = GaussianScaleMixtureClass([1.0,2.0])
GaussianScaleMixtureClass | σs = [1.0, 2.0]
julia> gcal([0.2,0.8])
MixtureModel{Normal{Float64}}(K = 2)
components[1] (prior = 0.2000): Normal{Float64}(μ=0.0, σ=1.0)
components[2] (prior = 0.8000): Normal{Float64}(μ=0.0, σ=2.0)
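The stated equivalence between GaussianScaleMixtureClass and the corresponding MixturePriorClass can be checked directly; a minimal sketch, assuming `Empirikos` and `Distributions` are loaded:

```julia
using Empirikos, Distributions

σs = [1.0, 2.0]
probs = [0.2, 0.8]

# Instantiate the same prior from either class:
g1 = GaussianScaleMixtureClass(σs)(probs)
g2 = MixturePriorClass(Normal.(0, σs))(probs)

# Both are MixtureModel distributions with identical components and
# weights, so their densities agree at any evaluation point:
pdf(g1, 0.3) ≈ pdf(g2, 0.3)  # true
```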