Title: | Bayesian Estimation for Finite Mixture of Distributions |
Version: | 1.7 |
Description: | Provides statistical tools for Bayesian estimation of finite mixture distributions, mainly mixtures of Gamma, Normal, and t-distributions. The package is based on the Bayesian literature for finite mixtures of distributions, including Mohammadi et al. (2013) <doi:10.1007/s00180-012-0323-3> and Mohammadi and Salehi-Rad (2012) <doi:10.1080/03610918.2011.588358>. |
URL: | https://www.uva.nl/profile/a.mohammadi |
Depends: | R (≥ 3.0.0) |
Imports: | BDgraph |
License: | GPL-2 | GPL-3 [expanded from: GPL (≥ 2)] |
Repository: | CRAN |
NeedsCompilation: | yes |
Author: | Reza Mohammadi |
Maintainer: | Reza Mohammadi <a.mohammadi@uva.nl> |
Packaged: | 2021-05-11 09:14:57 UTC; reza |
Date/Publication: | 2021-05-11 11:42:19 UTC |
Bayesian Estimation for Finite Mixture of Distributions
Description
The R package bmixture provides statistical tools for Bayesian estimation in finite mixtures of distributions.
The package implements improvements from the Bayesian literature on finite mixture models, including Mohammadi and Salehi-Rad (2012) and Mohammadi et al. (2013).
In addition, the package contains several functions for simulation and visualization, as well as a real dataset taken from the literature.
How to cite this package
Whenever using this package, please cite as
Mohammadi R. (2019). bmixture: Bayesian Estimation for Finite Mixture of Distributions, R package version 1.7, https://CRAN.R-project.org/package=bmixture
Author(s)
Reza Mohammadi <a.mohammadi@uva.nl>
References
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi: 10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi: 10.1080/03610918.2011.588358
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of Statistics, 28(1):40-74, doi: 10.1214/aos/1016120364
Richardson, S. and Green, P. J. (1997) On Bayesian analysis of mixtures with an unknown number of components. Journal of the Royal Statistical Society: Series B, 59(4):731-792, doi: 10.1111/1467-9868.00095
Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711-732, doi: 10.1093/biomet/82.4.711
Cappe, O., Robert, C. P., and Ryden, T. (2003) Reversible jump, birth and death and more general continuous time Markov chain Monte Carlo samplers. Journal of the Royal Statistical Society: Series B, 65(3):679-700
Wade, S. and Ghahramani, Z. (2018) Bayesian Cluster Analysis: Point Estimation and Credible Balls (with Discussion). Bayesian Analysis, 13(2):559-626, doi: 10.1214/17-BA1073
Examples
## Not run:
require( bmixture )
data( galaxy )
# Running the BD-MCMC algorithm for the galaxy dataset
mcmc_sample = bmixnorm( data = galaxy )
summary( mcmc_sample )
plot( mcmc_sample )
print( mcmc_sample)
# simulating data from mixture of Normal with 3 components
n = 500
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
weight = c( 0.3, 0.5, 0.2 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
# plot for simulation data
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixnorm.obj = bmixnorm( data, k = 3, iter = 1000 )
summary( bmixnorm.obj )
## End(Not run)
Sampling algorithm for mixture of gamma distributions
Description
This function implements several sampling algorithms for Bayesian estimation of a finite mixture of Gamma distributions.
Usage
bmixgamma( data, k = "unknown", iter = 1000, burnin = iter / 2, lambda = 1,
mu = NULL, nu = NULL, kesi = NULL, tau = NULL, k.start = NULL,
alpha.start = NULL, beta.start = NULL, pi.start = NULL,
k.max = 30, trace = TRUE )
Arguments
data |
vector of data. |
k |
number of components of the mixture distribution. It can be a fixed integer or "unknown" (the default). |
iter |
number of iterations for the sampling algorithm. |
burnin |
number of burn-in iterations for the sampling algorithm. |
lambda |
For the case k = "unknown": parameter of the truncated Poisson prior on the number of components k (see Details). |
mu |
hyperparameter of the prior distribution on the alpha parameters of the mixture components. |
nu |
hyperparameter of the prior distribution on the alpha parameters of the mixture components. |
kesi |
hyperparameter of the prior distribution on the beta parameters of the mixture components. |
tau |
hyperparameter of the prior distribution on the beta parameters of the mixture components. |
k.start |
For the case k = "unknown": initial value for the number of components. |
alpha.start |
Initial values for the alpha parameters of the mixture components. |
beta.start |
Initial values for the beta parameters of the mixture components. |
pi.start |
Initial values for the component weights of the mixture. |
k.max |
For the case k = "unknown": maximum number of components, k_max. |
trace |
Logical: if TRUE (default), tracing information is printed. |
Details
Sampling from a finite mixture of Gamma distributions, with density:
Pr(x|k, \underline{\pi}, \underline{\alpha}, \underline{\beta}) = \sum_{i=1}^{k} \pi_{i} Gamma(x|\alpha_{i}, \beta_{i}),
where k is the number of components of the mixture distribution (by default assumed to be unknown) and
Gamma(x|\alpha_{i}, \beta_{i})=\frac{(\beta_{i})^{\alpha_{i}}}{\Gamma(\alpha_{i})} x^{\alpha_{i}-1} e^{-\beta_{i}x}.
The prior distributions are defined as follows:
P(K=k) \propto \frac{\lambda^k}{k!}, \ \ \ k=1,...,k_{max},
\pi_{i} | k \sim Dirichlet( 1,..., 1 ),
\alpha_{i} | k \sim Gamma(\nu, \upsilon),
\beta_i | k \sim Gamma(\eta, \tau),
For more details, see Mohammadi et al. (2013), doi: 10.1007/s00180-012-0323-3.
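To see how lambda and k.max shape this prior on the number of components, the truncated Poisson prior above can be tabulated directly in base R; this is a minimal sketch using the default values lambda = 1 and k.max = 30.
# Prior on the number of components: P(K = k) proportional to lambda^k / k!,
# truncated to k = 1, ..., k.max and renormalized.
lambda  = 1
k.max   = 30
k       = 1:k.max
prior_k = lambda^k / factorial( k )
prior_k = prior_k / sum( prior_k )
round( prior_k[ 1:5 ], 3 )  # most of the prior mass sits on small k when lambda = 1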
Value
An object with S3 class "bmixgamma" is returned, containing the following components:
all_k |
a vector containing the sampled number of mixture components at each iteration. It is needed for monitoring the convergence of the BD-MCMC algorithm. |
all_weights |
a vector which includes the waiting times for all iterations. It is needed for monitoring the convergence of the BD-MCMC algorithm. |
pi_sample |
a vector which includes the MCMC samples after burn-in from the component weight parameters (pi). |
alpha_sample |
a vector which includes the MCMC samples after burn-in from the alpha parameters. |
beta_sample |
a vector which includes the MCMC samples after burn-in from the beta parameters. |
data |
original data. |
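As a rough sketch of how these components can be used (assuming all_k and all_weights are equal-length vectors aligned by iteration), the posterior distribution of the number of components can be estimated by weighting each visited value of k by its waiting time, as in Cappe et al. (2003):
# bmixgamma.obj is assumed to be the output of bmixgamma(); see the Examples below.
k_visited = bmixgamma.obj$all_k
wait      = bmixgamma.obj$all_weights
post_k    = tapply( wait, k_visited, sum )  # total waiting time spent at each value of k
round( post_k / sum( post_k ), 3 )          # estimated posterior probabilities of k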
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
References
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi: 10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi: 10.1080/03610918.2011.588358
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of Statistics, 28(1):40-74, doi: 10.1214/aos/1016120364
Richardson, S. and Green, P. J. (1997) On Bayesian analysis of mixtures with an unknown number of components. Journal of the Royal Statistical Society: Series B, 59(4):731-792, doi: 10.1111/1467-9868.00095
Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711-732, doi: 10.1093/biomet/82.4.711
Cappe, O., Robert, C. P., and Ryden, T. (2003) Reversible jump, birth and death and more general continuous time Markov chain Monte Carlo samplers. Journal of the Royal Statistical Society: Series B, 65(3):679-700
Wade, S. and Ghahramani, Z. (2018) Bayesian Cluster Analysis: Point Estimation and Credible Balls (with Discussion). Bayesian Analysis, 13(2):559-626, doi: 10.1214/17-BA1073
See Also
Examples
## Not run:
set.seed( 70 )
# simulating data from mixture of gamma with two components
n = 1000 # number of observations
weight = c( 0.6, 0.4 )
alpha = c( 12 , 1 )
beta = c( 3 , 2 )
data = rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )
# plot for simulation data
hist( data, prob = TRUE, nclass = 50, col = "gray" )
x = seq( 0, 10, 0.05 )
truth = dmixgamma( x, weight, alpha, beta )
lines( x, truth, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixgamma.obj = bmixgamma( data, iter = 1000 )
summary( bmixgamma.obj )
plot( bmixgamma.obj )
## End(Not run)
Sampling algorithm for mixture of Normal distributions
Description
This function implements several sampling algorithms for Bayesian estimation of a finite mixture of Normal distributions.
Usage
bmixnorm( data, k = "unknown", iter = 1000, burnin = iter / 2, lambda = 1,
k.start = NULL, mu.start = NULL, sig.start = NULL, pi.start = NULL,
k.max = 30, trace = TRUE )
Arguments
data |
vector of data. |
k |
number of components of the mixture distribution. It can be a fixed integer or "unknown" (the default). |
iter |
number of iterations for the sampling algorithm. |
burnin |
number of burn-in iterations for the sampling algorithm. |
lambda |
For the case k = "unknown": parameter of the truncated Poisson prior on the number of components k (see Details). |
k.start |
For the case k = "unknown": initial value for the number of components. |
mu.start |
Initial values for the mu parameters of the mixture components. |
sig.start |
Initial values for the sigma parameters of the mixture components. |
pi.start |
Initial values for the component weights of the mixture. |
k.max |
For the case k = "unknown": maximum number of components, k_max. |
trace |
Logical: if TRUE (default), tracing information is printed. |
Details
Sampling from a finite mixture of Normal distributions, with density:
Pr(x|k, \underline{\pi}, \underline{\mu}, \underline{\sigma}) = \sum_{i=1}^{k} \pi_{i} N(x|\mu_{i}, \sigma^2_{i}),
where k is the number of components of the mixture distribution (by default assumed to be unknown).
The prior distributions are defined as follows:
P(K=k) \propto \frac{\lambda^k}{k!}, \ \ \ k=1,...,k_{max},
\pi_{i} | k \sim Dirichlet( 1,..., 1 ),
\mu_{i} | k \sim N( \epsilon, \kappa ),
\sigma_i | k \sim IG( g, h ),
where IG denotes an inverted-gamma distribution. For more details, see Stephens (2000), doi: 10.1214/aos/1016120364.
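As an illustration of this prior structure, the sketch below draws one parameter set from priors of this form and simulates data from the implied mixture, using the package's rdirichlet and rmixnorm. The hyperparameter values epsilon, kappa, g, and h are arbitrary choices for the sketch (not the values used internally by bmixnorm), and sigma_i is treated here as a variance.
require( bmixture )
set.seed( 1 )
k       = 3
epsilon = 20; kappa = 25   # hypothetical hyperparameters of the Normal prior on mu_i
g = 2; h = 1               # hypothetical hyperparameters of the inverted-gamma prior on sigma_i
pi_k  = c( rdirichlet( n = 1, alpha = rep( 1, k ) ) )
mu_k  = rnorm( k, mean = epsilon, sd = sqrt( kappa ) )
sig_k = 1 / rgamma( k, shape = g, rate = h )   # draw from an inverted-gamma distribution
data  = rmixnorm( n = 200, weight = pi_k, mean = mu_k, sd = sqrt( sig_k ) )
hist( data, prob = TRUE, nclass = 30, col = "gray" )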
Value
An object with S3 class "bmixnorm" is returned, containing the following components:
all_k |
a vector containing the sampled number of mixture components at each iteration. It is needed for monitoring the convergence of the BD-MCMC algorithm. |
all_weights |
a vector which includes the waiting times for all iterations. It is needed for monitoring the convergence of the BD-MCMC algorithm. |
pi_sample |
a vector which includes the MCMC samples after burn-in from the component weight parameters (pi). |
mu_sample |
a vector which includes the MCMC samples after burn-in from the mu parameters. |
sig_sample |
a vector which includes the MCMC samples after burn-in from the sigma parameters. |
data |
original data. |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
References
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of Statistics, 28(1):40-74, doi: 10.1214/aos/1016120364
Richardson, S. and Green, P. J. (1997) On Bayesian analysis of mixtures with an unknown number of components. Journal of the Royal Statistical Society: Series B, 59(4):731-792, doi: 10.1111/1467-9868.00095
Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711-732, doi: 10.1093/biomet/82.4.711
Cappe, O., Robert, C. P., and Ryden, T. (2003) Reversible jump, birth and death and more general continuous time Markov chain Monte Carlo samplers. Journal of the Royal Statistical Society: Series B, 65(3):679-700
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi: 10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi: 10.1080/03610918.2011.588358
Wade, S. and Ghahramani, Z. (2018) Bayesian Cluster Analysis: Point Estimation and Credible Balls (with Discussion). Bayesian Analysis, 13(2):559-626, doi: 10.1214/17-BA1073
See Also
Examples
## Not run:
data( galaxy )
set.seed( 70 )
# Running the BD-MCMC algorithm for the galaxy dataset
mcmc_sample = bmixnorm( data = galaxy )
summary( mcmc_sample )
plot( mcmc_sample )
print( mcmc_sample)
# simulating data from mixture of Normal with 3 components
n = 500
weight = c( 0.3, 0.5, 0.2 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
# plot for simulation data
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixnorm.obj = bmixnorm( data, k = 3, iter = 1000 )
summary( bmixnorm.obj )
## End(Not run)
Sampling algorithm for mixture of t-distributions
Description
This function implements several sampling algorithms for Bayesian estimation of a finite mixture of t-distributions.
Usage
bmixt( data, k = "unknown", iter = 1000, burnin = iter / 2, lambda = 1, df = 1,
k.start = NULL, mu.start = NULL, sig.start = NULL, pi.start = NULL,
k.max = 30, trace = TRUE )
Arguments
data |
vector of data. |
k |
number of components of the mixture distribution. It can be a fixed integer or "unknown" (the default). |
iter |
number of iterations for the sampling algorithm. |
burnin |
number of burn-in iterations for the sampling algorithm. |
lambda |
For the case k = "unknown": parameter of the truncated Poisson prior on the number of components k (see Details). |
df |
Degrees of freedom (> 0, possibly non-integer); df = Inf is allowed. |
k.start |
For the case k = "unknown": initial value for the number of components. |
mu.start |
Initial values for the mu parameters of the mixture components. |
sig.start |
Initial values for the sigma parameters of the mixture components. |
pi.start |
Initial values for the component weights of the mixture. |
k.max |
For the case k = "unknown": maximum number of components, k_max. |
trace |
Logical: if TRUE (default), tracing information is printed. |
Details
Sampling from a finite mixture of t-distributions, with density:
Pr(x|k, \underline{\pi}, \underline{\mu}, \underline{\sigma}) = \sum_{i=1}^{k} \pi_{i} t_p(x|\mu_{i}, \sigma^2_{i}),
where k is the number of components of the mixture distribution (by default assumed to be unknown).
The prior distributions are defined as follows:
P(K=k) \propto \frac{\lambda^k}{k!}, \ \ \ k=1,...,k_{max},
\pi_{i} | k \sim Dirichlet( 1,..., 1 ),
\mu_{i} | k \sim N( \epsilon, \kappa ),
\sigma_i | k \sim IG( g, h ),
where IG denotes an inverted-gamma distribution. For more details, see Stephens (2000), doi: 10.1214/aos/1016120364.
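The Examples below simulate data from a Normal mixture; as an alternative sketch, data can be drawn from a t mixture with rmixt and then fitted with bmixt using a matching df (the parameter values here are arbitrary).
require( bmixture )
set.seed( 10 )
weight = c( 0.4, 0.6 )
mean   = c( 0 , 8 )
sd     = c( 1 , 1 )
data   = rmixt( n = 1000, weight = weight, df = c( 4, 4 ), mean = mean, sd = sd )
bmixt.obj = bmixt( data, k = 2, df = 4, iter = 1000 )
summary( bmixt.obj )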
Value
An object with S3 class "bmixt" is returned, containing the following components:
all_k |
a vector containing the sampled number of mixture components at each iteration. It is needed for monitoring the convergence of the BD-MCMC algorithm. |
all_weights |
a vector which includes the waiting times for all iterations. It is needed for monitoring the convergence of the BD-MCMC algorithm. |
pi_sample |
a vector which includes the MCMC samples after burn-in from the component weight parameters (pi). |
mu_sample |
a vector which includes the MCMC samples after burn-in from the mu parameters. |
sig_sample |
a vector which includes the MCMC samples after burn-in from the sigma parameters. |
data |
original data. |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
References
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of Statistics, 28(1):40-74, doi: 10.1214/aos/1016120364
Richardson, S. and Green, P. J. (1997) On Bayesian analysis of mixtures with an unknown number of components. Journal of the Royal Statistical Society: Series B, 59(4):731-792, doi: 10.1111/1467-9868.00095
Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711-732, doi: 10.1093/biomet/82.4.711
Cappe, O., Robert, C. P., and Ryden, T. (2003) Reversible jump, birth and death and more general continuous time Markov chain Monte Carlo samplers. Journal of the Royal Statistical Society: Series B, 65(3):679-700
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi: 10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi: 10.1080/03610918.2011.588358
Wade, S. and Ghahramani, Z. (2018) Bayesian Cluster Analysis: Point Estimation and Credible Balls (with Discussion). Bayesian Analysis, 13(2):559-626, doi: 10.1214/17-BA1073
See Also
Examples
## Not run:
set.seed( 20 )
# simulating data from mixture of Normal with 3 components
n = 2000
weight = c( 0.3, 0.5, 0.2 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
# plot for simulation data
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixt.obj = bmixt( data, k = 3, iter = 5000 )
summary( bmixt.obj )
## End(Not run)
Internal bmixture functions and datasets
Description
Nothing to see here; move along.
Author(s)
Reza Mohammadi <a.mohammadi@uva.nl>
Galaxy data
Description
This dataset consists of 82 observations of the velocities (in 1000 km/second) of distant galaxies diverging from our own, sampled from six well-separated conic sections of the Corona Borealis. The dataset has been analyzed under a variety of mixture models; see, e.g., Stephens (2000).
Usage
data( galaxy )
Format
A data frame with 82 observations on the following variable.
speed
a numeric vector giving the speed of galaxies (in 1000 km/second).
References
Stephens, M. (2000) Bayesian analysis of mixture models with an unknown number of components-an alternative to reversible jump methods. Annals of Statistics, 28(1):40-74, doi: 10.1214/aos/1016120364
Examples
data( galaxy )
hist( galaxy, prob = TRUE, xlim = c( 0, 40 ), ylim = c( 0, 0.3 ), nclass = 20,
col = "gray", border = "white" )
lines( density( galaxy ), col = "black", lwd = 2 )
Mixture of Gamma distribution
Description
Random generation and density function for a finite mixture of Gamma distributions.
Usage
rmixgamma( n = 10, weight = 1, alpha = 1, beta = 1 )
dmixgamma( x, weight = 1, alpha = 1, beta = 1 )
Arguments
n |
number of observations. |
x |
vector of quantiles. |
weight |
vector of probability weights, with length equal to the number of mixture components (k). |
alpha |
vector of non-negative shape parameters of the Gamma components. |
beta |
vector of non-negative rate parameters of the Gamma components. |
Details
Sampling from a finite mixture of Gamma distributions, with density:
Pr(x|\underline{w}, \underline{\alpha}, \underline{\beta}) = \sum_{i=1}^{k} w_{i} Gamma(x|\alpha_{i}, \beta_{i}),
where
Gamma(x|\alpha_{i}, \beta_{i})=\frac{(\beta_{i})^{\alpha_{i}}}{\Gamma(\alpha_{i})} x^{\alpha_{i}-1} e^{-\beta_{i}x}.
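Since the component density above uses the rate parameterization of the Gamma distribution, dmixgamma should agree with a weighted sum of dgamma densities computed directly; a quick check against the stated formula (the agreement is an assumption about the implementation):
require( bmixture )
weight = c( 0.6, 0.4 )
alpha  = c( 12 , 1 )
beta   = c( 3 , 2 )
x0     = 2.5
manual = sum( weight * dgamma( x0, shape = alpha, rate = beta ) )
all.equal( manual, dmixgamma( x0, weight, alpha, beta ) )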
Value
For rmixgamma, generated data as a vector of length n; for dmixgamma, the corresponding density values.
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
References
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi: 10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi: 10.1080/03610918.2011.588358
See Also
Examples
## Not run:
n = 10000
weight = c( 0.6 , 0.3 , 0.1 )
alpha = c( 100 , 200 , 300 )
beta = c( 100/3, 200/4, 300/5 )
data = rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( 0, 10, 0.05 )
densmixgamma = dmixgamma( x, weight, alpha, beta )
lines( x, densmixgamma, lwd = 2 )
## End(Not run)
Mixture of Normal distribution
Description
Random generation and density function for a finite mixture of univariate Normal distributions.
Usage
rmixnorm( n = 10, weight = 1, mean = 0, sd = 1 )
dmixnorm( x, weight = 1, mean = 0, sd = 1 )
Arguments
n |
number of observations. |
x |
vector of quantiles. |
weight |
vector of probability weights, with length equal to the number of mixture components (k). |
mean |
vector of means. |
sd |
vector of standard deviations. |
Details
Sampling from a finite mixture of Normal distributions, with density:
Pr(x|\underline{w}, \underline{\mu}, \underline{\sigma}) = \sum_{i=1}^{k} w_{i} N(x|\mu_{i}, \sigma_{i}).
Value
For rmixnorm, generated data as a vector of length n; for dmixnorm, the corresponding density values.
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
References
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi: 10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi: 10.1080/03610918.2011.588358
See Also
Examples
## Not run:
n = 10000
weight = c( 0.3, 0.5, 0.2 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
## End(Not run)
Mixture of t-distribution
Description
Random generation and density function for a finite mixture of univariate t-distributions.
Usage
rmixt( n = 10, weight = 1, df = 1, mean = 0, sd = 1 )
dmixt( x, weight = 1, df = 1, mean = 0, sd = 1 )
Arguments
n |
number of observations. |
x |
vector of quantiles. |
weight |
vector of probability weights, with length equal to the number of mixture components (k). |
df |
vector of degrees of freedom (> 0, possibly non-integer); df = Inf is allowed. |
mean |
vector of means. |
sd |
vector of standard deviations. |
Details
Sampling from a finite mixture of t-distributions, with density:
Pr(x|\underline{w}, \underline{df}, \underline{\mu}, \underline{\sigma}) = \sum_{i=1}^{k} w_{i} t_{df}(x| \mu_{i}, \sigma_{i}),
where
t_{df}(x| \mu, \sigma) = \frac{ \Gamma( \frac{df+1}{2} ) }{ \Gamma( \frac{df}{2} ) \sqrt{\pi df} \sigma } \left( 1 + \frac{1}{df} \left( \frac{x-\mu}{\sigma} \right) ^2 \right) ^{- \frac{df+1}{2} }.
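The component density above is the location-scale t-distribution, which can also be written as t_df(x | mu, sigma) = dt((x - mu)/sigma, df) / sigma; the sketch below compares dmixt with this direct computation (assuming dmixt follows the stated formula):
require( bmixture )
weight = c( 0.3, 0.7 )
df     = c( 4 , 4 )
mean   = c( 0 , 5 )
sd     = c( 1 , 2 )
x0     = 1.3
manual = sum( weight * dt( ( x0 - mean ) / sd, df = df ) / sd )
all.equal( manual, dmixt( x0, weight, df, mean, sd ) )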
Value
For rmixt, generated data as a vector of length n; for dmixt, the corresponding density values.
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
References
Mohammadi, A., Salehi-Rad, M. R., and Wit, E. C. (2013) Using mixture of Gamma distributions for Bayesian analysis in an M/G/1 queue with optional second service. Computational Statistics, 28(2):683-700, doi: 10.1007/s00180-012-0323-3
Mohammadi, A., and Salehi-Rad, M. R. (2012) Bayesian inference and prediction in an M/G/1 with optional second service. Communications in Statistics-Simulation and Computation, 41(3):419-435, doi: 10.1080/03610918.2011.588358
See Also
Examples
## Not run:
n = 10000
weight = c( 0.3, 0.5, 0.2 )
df = c( 4 , 4 , 4 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixt( n = n, weight = weight, df = df, mean = mean, sd = sd )
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixt = dmixt( x, weight, df, mean, sd )
lines( x, densmixt, lwd = 2 )
## End(Not run)
Plot function for S3 class "bmixgamma"
Description
Visualizes the results for function bmixgamma.
Usage
## S3 method for class 'bmixgamma'
plot( x, ... )
Arguments
x |
An object of class "bmixgamma". |
... |
System reserved (no specific usage). |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
## Not run:
# simulating data from mixture of gamma with two components
n = 500 # number of observations
weight = c( 0.6, 0.4 )
alpha = c( 12 , 1 )
beta = c( 3 , 2 )
data <- rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )
# plot for simulation data
hist( data, prob = TRUE, nclass = 50, col = "gray" )
x = seq( 0, 10, 0.05 )
truth = dmixgamma( x, weight, alpha, beta )
lines( x, truth, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixgamma.obj <- bmixgamma( data )
plot( bmixgamma.obj )
## End(Not run)
Plot function for S3 class "bmixnorm"
Description
Visualizes the results for function bmixnorm.
Usage
## S3 method for class 'bmixnorm'
plot( x, ... )
Arguments
x |
An object of class "bmixnorm". |
... |
System reserved (no specific usage). |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
## Not run:
# simulating data from mixture of Normal with 3 components
n = 500
weight = c( 0.3, 0.5, 0.2 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
# plot for simulation data
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixnorm.obj = bmixnorm( data, k = 3 )
plot( bmixnorm.obj )
## End(Not run)
Plot function for S3 class "bmixt"
Description
Visualizes the results for function bmixt.
Usage
## S3 method for class 'bmixt'
plot( x, ... )
Arguments
x |
An object of class "bmixt". |
... |
System reserved (no specific usage). |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
## Not run:
# simulating data from mixture of Normal with 3 components
n = 500
weight = c( 0.3, 0.5, 0.2 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
# plot for simulation data
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixt.obj = bmixt( data, k = 3 )
plot( bmixt.obj )
## End(Not run)
Print function for S3 class "bmixgamma"
Description
Prints the information about the output of function bmixgamma.
Usage
## S3 method for class 'bmixgamma'
print( x, ... )
Arguments
x |
An object of class "bmixgamma". |
... |
System reserved (no specific usage). |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
## Not run:
# simulating data from mixture of gamma with two components
n = 500 # number of observations
weight = c( 0.6, 0.4 )
alpha = c( 12 , 1 )
beta = c( 3 , 2 )
data <- rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )
# plot for simulation data
hist( data, prob = TRUE, nclass = 50, col = "gray" )
x = seq( 0, 10, 0.05 )
truth = dmixgamma( x, weight, alpha, beta )
lines( x, truth, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixgamma.obj <- bmixgamma( data, iter = 500 )
print( bmixgamma.obj )
## End(Not run)
Print function for S3 class "bmixnorm"
Description
Prints the information about the output of function bmixnorm.
Usage
## S3 method for class 'bmixnorm'
print( x, ... )
Arguments
x |
An object of class "bmixnorm". |
... |
System reserved (no specific usage). |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
## Not run:
# simulating data from mixture of Normal with 3 components
n = 500
weight = c( 0.3, 0.5, 0.2 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
# plot for simulation data
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixnorm.obj = bmixnorm( data, k = 3, iter = 1000 )
print( bmixnorm.obj )
## End(Not run)
Print function for S3 class "bmixt"
Description
Prints the information about the output of function bmixt.
Usage
## S3 method for class 'bmixt'
print( x, ... )
Arguments
x |
An object of class "bmixt". |
... |
System reserved (no specific usage). |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
## Not run:
# simulating data from mixture of Normal with 3 components
n = 500
weight = c( 0.3, 0.5, 0.2 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
# plot for simulation data
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixt.obj = bmixt( data, k = 3, iter = 1000 )
print( bmixt.obj )
## End(Not run)
Random generation for the Dirichlet distribution
Description
Random generation from the Dirichlet distribution.
Usage
rdirichlet( n = 10, alpha = c( 1, 1 ) )
Arguments
n |
number of observations. |
alpha |
vector of shape parameters. |
Details
The Dirichlet distribution is the multidimensional generalization of the beta distribution.
Value
A matrix with n rows, each row containing a single Dirichlet random deviate.
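A quick sanity check of the output: each row lies on the simplex (sums to one), and the column means approach alpha / sum(alpha).
draws = rdirichlet( n = 5000, alpha = c( 2, 3, 5 ) )
range( rowSums( draws ) )        # each row sums to 1 (up to numerical error)
round( colMeans( draws ), 2 )    # close to 0.2, 0.3, 0.5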
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
draws = rdirichlet( n = 500, alpha = c( 1, 1, 1 ) )
boxplot( draws )
Summary function for S3 class "bmixgamma"
Description
Provides a summary of the results for function bmixgamma.
Usage
## S3 method for class 'bmixgamma'
summary( object, ... )
Arguments
object |
An object of class "bmixgamma". |
... |
System reserved (no specific usage). |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
## Not run:
# simulating data from mixture of gamma with two components
n = 500 # number of observations
weight = c( 0.6, 0.4 )
alpha = c( 12 , 1 )
beta = c( 3 , 2 )
data <- rmixgamma( n = n, weight = weight, alpha = alpha, beta = beta )
# plot for simulation data
hist( data, prob = TRUE, nclass = 50, col = "gray" )
x = seq( 0, 10, 0.05 )
truth = dmixgamma( x, weight, alpha, beta )
lines( x, truth, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixgamma.obj <- bmixgamma( data, iter = 500 )
summary( bmixgamma.obj )
## End(Not run)
Summary function for S3 class "bmixnorm"
Description
Provides a summary of the results for function bmixnorm.
Usage
## S3 method for class 'bmixnorm'
summary( object, ... )
Arguments
object |
An object of class "bmixnorm". |
... |
System reserved (no specific usage). |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
## Not run:
# simulating data from mixture of Normal with 3 components
n = 500
weight = c( 0.3, 0.5, 0.2 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
# plot for simulation data
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixnorm.obj = bmixnorm( data, k = 3, iter = 1000 )
summary( bmixnorm.obj )
## End(Not run)
Summary function for S3 class "bmixt"
Description
Provides a summary of the results for function bmixt.
Usage
## S3 method for class 'bmixt'
summary( object, ... )
Arguments
object |
An object of class "bmixt". |
... |
System reserved (no specific usage). |
Author(s)
Reza Mohammadi a.mohammadi@uva.nl
See Also
Examples
## Not run:
# simulating data from mixture of Normal with 3 components
n = 500
weight = c( 0.3, 0.5, 0.2 )
mean = c( 0 , 10 , 3 )
sd = c( 1 , 1 , 1 )
data = rmixnorm( n = n, weight = weight, mean = mean, sd = sd )
# plot for simulation data
hist( data, prob = TRUE, nclass = 30, col = "gray" )
x = seq( -20, 20, 0.05 )
densmixnorm = dmixnorm( x, weight, mean, sd )
lines( x, densmixnorm, lwd = 2 )
# Running the BD-MCMC algorithm for the above simulated data set
bmixt.obj = bmixt( data, k = 3, iter = 1000 )
summary( bmixt.obj )
## End(Not run)