
# API

## Module-wide re-exports

Turing.jl directly re-exports the entire public API of Distributions.jl. Please see its documentation for more details.

## Individual exports and re-exports

In this API documentation, for the sake of clarity, we have listed the module that actually defines each of the exported symbols. Note, however, that all of the following symbols are exported unqualified by Turing. That means, for example, you can just write

```julia
using Turing

@model function my_model() end

sample(my_model(), Prior(), 100)
```

instead of

```julia
DynamicPPL.@model function my_model() end

sample(my_model(), Turing.Inference.Prior(), 100)
```

even though `Prior()` is actually defined in the `Turing.Inference` module and `@model` in the DynamicPPL package.

## Modelling

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `@model` | `DynamicPPL.@model` | Define a probabilistic model |
| `@varname` | `AbstractPPL.@varname` | Generate a `VarName` from a Julia expression |
| `to_submodel` | `DynamicPPL.to_submodel` | Define a submodel |
| `prefix` | `DynamicPPL.prefix` | Prefix all variable names in a model with a given `VarName` |
| `LogDensityFunction` | `DynamicPPL.LogDensityFunction` | A struct containing all information about how to evaluate a model; mostly for advanced users |
| `@addlogprob!` | `DynamicPPL.@addlogprob!` | Add arbitrary log-probability terms during model evaluation |
| `setthreadsafe` | `DynamicPPL.setthreadsafe` | Mark a model as requiring threadsafe evaluation |
| `might_produce` | `Libtask.might_produce` | Mark a method signature as potentially calling `Libtask.produce` |
| `@might_produce` | `Libtask.@might_produce` | Mark a function name as potentially calling `Libtask.produce` |
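As an illustrative sketch (the model below is not from the source), several of these symbols can be combined when defining a model:

```julia
using Turing

# Illustrative model: infer the mean of Gaussian observations.
@model function gaussian_mean(x)
    μ ~ Normal(0, 1)             # prior on the mean
    for i in eachindex(x)
        x[i] ~ Normal(μ, 1)      # likelihood for each observation
    end
    # @addlogprob! injects an arbitrary extra log-probability term
    # (a no-op here, shown only for illustration):
    @addlogprob! 0.0
end

model = gaussian_mean([0.1, -0.3, 0.2])
```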

## Inference

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `sample` | `StatsBase.sample` | Sample from a model |
| `MCMCThreads` | `AbstractMCMC.MCMCThreads` | Run MCMC using multiple threads |
| `MCMCDistributed` | `AbstractMCMC.MCMCDistributed` | Run MCMC using multiple processes |
| `MCMCSerial` | `AbstractMCMC.MCMCSerial` | Run MCMC without parallelism |
| `loadstate` | `Turing.Inference.loadstate` | Load saved state from an MCMC chain |
| `VNChain` | n/a | Alias for `FlexiChain{VarName}` |
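Parallel sampling follows the AbstractMCMC convention of passing the parallelism type along with the number of chains; a sketch using a toy model:

```julia
using Turing

@model function demo()
    θ ~ Beta(2, 2)
end

# One chain:
chain = sample(demo(), Prior(), 1_000)

# Four chains, one per thread (start Julia with --threads=4):
chains = sample(demo(), Prior(), MCMCThreads(), 1_000, 4)
```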

## Samplers

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `Prior` | `Turing.Inference.Prior` | Sample from the prior distribution |
| `MH` | `Turing.Inference.MH` | Metropolis–Hastings |
| `Emcee` | `Turing.Inference.Emcee` | Affine-invariant ensemble sampler |
| `ESS` | `Turing.Inference.ESS` | Elliptical slice sampling |
| `Gibbs` | `Turing.Inference.Gibbs` | Gibbs sampling |
| `GibbsConditional` | `Turing.Inference.GibbsConditional` | Gibbs sampling with analytical conditional posterior distributions |
| `HMC` | `Turing.Inference.HMC` | Hamiltonian Monte Carlo |
| `SGLD` | `Turing.Inference.SGLD` | Stochastic gradient Langevin dynamics |
| `SGHMC` | `Turing.Inference.SGHMC` | Stochastic gradient Hamiltonian Monte Carlo |
| `PolynomialStepsize` | `Turing.Inference.PolynomialStepsize` | Returns a function which generates polynomially decaying step sizes |
| `HMCDA` | `Turing.Inference.HMCDA` | Hamiltonian Monte Carlo with dual averaging |
| `NUTS` | `Turing.Inference.NUTS` | No-U-Turn Sampler |
| `SMC` | `Turing.Inference.SMC` | Sequential Monte Carlo |
| `PG` | `Turing.Inference.PG` | Particle Gibbs |
| `CSMC` | `Turing.Inference.CSMC` | The same as `PG` |
| `RepeatSampler` | `Turing.Inference.RepeatSampler` | A sampler that runs multiple times on the same variable |
| `externalsampler` | `Turing.Inference.externalsampler` | Wrap an external sampler for use in Turing |
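Any of these sampler types can be passed as the second argument to `sample`; a minimal sketch with a toy coin-flip model:

```julia
using Turing

@model function coin(y)
    p ~ Beta(1, 1)
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)
    end
end

model = coin([1, 0, 1, 1])

chain_nuts = sample(model, NUTS(), 1_000)  # gradient-based
chain_mh   = sample(model, MH(), 1_000)    # gradient-free
```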

## Data structures

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `@vnt` | `DynamicPPL.@vnt` | Generate a `VarNamedTuple` |
| `VarNamedTuple` | `DynamicPPL.VarNamedTuple` | A mapping from `VarName`s to values |
| `OrderedDict` | `OrderedCollections.OrderedDict` | An ordered dictionary |

## DynamicPPL utilities

Please see the generated quantities and probability interface guides for more information.

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `returned` | `DynamicPPL.returned` | Calculate additional quantities defined in a model |
| `predict` | `StatsAPI.predict` | Generate samples from the posterior predictive distribution |
| `pointwise_loglikelihoods` | `DynamicPPL.pointwise_loglikelihoods` | Compute log likelihoods for each sample in a chain |
| `logprior` | `DynamicPPL.logprior` | Compute log prior probability |
| `logjoint` | `DynamicPPL.logjoint` | Compute log joint probability |
| `condition` | `AbstractPPL.condition` | Condition a model on data |
| `decondition` | `AbstractPPL.decondition` | Remove conditioning on data |
| `conditioned` | `DynamicPPL.conditioned` | Return the conditioned values of a model |
| `fix` | `DynamicPPL.fix` | Fix the value of a variable |
| `unfix` | `DynamicPPL.unfix` | Unfix the value of a variable |
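A brief sketch of the conditioning and log-density utilities (the model and values are illustrative):

```julia
using Turing

@model function m()
    x ~ Normal()
    y ~ Normal(x, 1)
end

# Treat `y` as observed data:
cm = condition(m(), (; y=0.5))
conditioned(cm)            # returns the conditioned values

# Evaluate log densities at given parameter values:
logprior(cm, (; x=0.0))
logjoint(cm, (; x=0.0))
```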

## Initialisation strategies

Turing.jl provides several strategies to initialise parameters for models.

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `InitFromPrior` | `DynamicPPL.InitFromPrior` | Obtain initial parameters from the prior distribution |
| `InitFromUniform` | `DynamicPPL.InitFromUniform` | Obtain initial parameters by sampling uniformly in linked space |
| `InitFromParams` | `DynamicPPL.InitFromParams` | Manually specify (possibly a subset of) initial parameters |
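A hedged sketch of how such a strategy is typically supplied to `sample`; the `initial_params` keyword here is an assumption based on recent DynamicPPL versions, so check the documentation of your installed version:

```julia
using Turing

@model function demo()
    θ ~ Beta(2, 2)
end

# Assumption: initialisation strategies are passed via `initial_params`.
chain = sample(demo(), NUTS(), 1_000; initial_params=InitFromPrior())
```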

## Variational inference

See the docs of AdvancedVI.jl for detailed usage and the variational inference tutorial for a basic walkthrough.

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `vi` | `Turing.vi` | Perform variational inference |
| `q_locationscale` | `Turing.Variational.q_locationscale` | Find a numerically non-degenerate initialization for a location-scale variational family |
| `q_meanfield_gaussian` | `Turing.Variational.q_meanfield_gaussian` | Find a numerically non-degenerate initialization for a mean-field Gaussian family |
| `q_fullrank_gaussian` | `Turing.Variational.q_fullrank_gaussian` | Find a numerically non-degenerate initialization for a full-rank Gaussian family |
| `KLMinRepGradDescent` | `AdvancedVI.KLMinRepGradDescent` | KL divergence minimization via stochastic gradient descent with the reparameterization gradient |
| `KLMinRepGradProxDescent` | `AdvancedVI.KLMinRepGradProxDescent` | KL divergence minimization via stochastic proximal gradient descent with the reparameterization gradient over location-scale variational families |
| `KLMinScoreGradDescent` | `AdvancedVI.KLMinScoreGradDescent` | KL divergence minimization via stochastic gradient descent with the score gradient |
| `KLMinWassFwdBwd` | `AdvancedVI.KLMinWassFwdBwd` | KL divergence minimization via Wasserstein proximal gradient descent |
| `KLMinNaturalGradDescent` | `AdvancedVI.KLMinNaturalGradDescent` | KL divergence minimization via natural gradient descent |
| `KLMinSqrtNaturalGradDescent` | `AdvancedVI.KLMinSqrtNaturalGradDescent` | KL divergence minimization via natural gradient descent in the square-root parameterization |
| `FisherMinBatchMatch` | `AdvancedVI.FisherMinBatchMatch` | Covariance-weighted Fisher divergence minimization via the batch-and-match algorithm |
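A minimal sketch of the workflow: build a non-degenerate initial family, then optimize it. The exact signature and return value of `vi` vary between versions, so the result is left unpacked here; consult the AdvancedVI.jl docs for your installed version.

```julia
using Turing

@model function m(x)
    μ ~ Normal()
    x ~ Normal(μ, 1)
end

model = m(1.5)

# Mean-field Gaussian initialization, then variational optimization:
q_init = q_meanfield_gaussian(model)
result = vi(model, q_init, 1_000)
```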

## Automatic differentiation types

These are used to specify the automatic differentiation backend to use. See the AD guide for more information.

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `AutoEnzyme` | `ADTypes.AutoEnzyme` | Enzyme.jl backend |
| `AutoForwardDiff` | `ADTypes.AutoForwardDiff` | ForwardDiff.jl backend |
| `AutoMooncake` | `ADTypes.AutoMooncake` | Mooncake.jl backend |
| `AutoReverseDiff` | `ADTypes.AutoReverseDiff` | ReverseDiff.jl backend |
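These types are typically passed to a gradient-based sampler via its `adtype` keyword; a sketch:

```julia
using Turing

@model function demo()
    θ ~ Normal()
end

# Use ForwardDiff to compute gradients for NUTS:
chain = sample(demo(), NUTS(; adtype=AutoForwardDiff()), 1_000)
```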

## Debugging

- `setprogress!`
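For example, progress logging during sampling can be toggled globally:

```julia
using Turing

setprogress!(false)  # disable the progress bar for subsequent sampling
```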

## Distributions

These distributions are defined in Turing.jl, but not in Distributions.jl.

- `Flat`
- `FlatPos`
- `BinomialLogit`
- `OrderedLogistic`
- `LogPoisson`
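A short sketch of how the improper flat priors are used inside a model (the model itself is illustrative):

```julia
using Turing

@model function improper_priors(x)
    μ ~ Flat()          # improper uniform prior over the whole real line
    σ ~ FlatPos(0.0)    # improper uniform prior on (0, ∞)
    for i in eachindex(x)
        x[i] ~ Normal(μ, σ)
    end
end
```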

## Tools to work with distributions

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `I` | `LinearAlgebra.I` | Identity matrix |
| `filldist` | `DynamicPPL.filldist` | Create a product distribution from a distribution and integers |
| `arraydist` | `DynamicPPL.arraydist` | Create a product distribution from an array of distributions |
| `NamedDist` | `DynamicPPL.NamedDist` | A distribution that carries the name of the variable |
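A sketch contrasting `filldist` and `arraydist` (the model is illustrative):

```julia
using Turing

@model function demo(y)
    # filldist: three iid Normal(0, 1) priors as a single multivariate distribution
    μ ~ filldist(Normal(0, 1), 3)
    # arraydist: a product of heterogeneous per-element distributions
    σ ~ arraydist([Exponential(1.0), Exponential(2.0), Exponential(3.0)])
    for i in 1:3
        y[i] ~ Normal(μ[i], σ[i])
    end
end
```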

## Point estimates

See the mode estimation tutorial for more information.

| Exported symbol | Documentation | Description |
|:--- |:--- |:--- |
| `maximum_a_posteriori` | `Turing.Optimisation.maximum_a_posteriori` | Find a maximum a posteriori (MAP) estimate for a model |
| `maximum_likelihood` | `Turing.Optimisation.maximum_likelihood` | Find a maximum likelihood estimate (MLE) for a model |
| `MAP` | `Turing.Optimisation.MAP` | Type to use with Optim.jl for MAP estimation |
| `MLE` | `Turing.Optimisation.MLE` | Type to use with Optim.jl for MLE estimation |
| `vector_names_and_params` | `Turing.Optimisation.vector_names_and_params` | Extract parameter names and values as vectors |
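For example (toy model; see the mode estimation tutorial for details):

```julia
using Turing

@model function m(x)
    μ ~ Normal(0, 1)
    for i in eachindex(x)
        x[i] ~ Normal(μ, 1)
    end
end

model = m([0.2, 0.4, -0.1])

mle_estimate = maximum_likelihood(model)
map_estimate = maximum_a_posteriori(model)
```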