Introduction
Gaussian linear modelling cannot address current signal processing demands. In
modern contexts, such as Independent Component Analysis (ICA), progress has been
made specifically by imposing non-Gaussian and/or non-linear assumptions. Hence,
standard Wiener and Kalman theories no longer enjoy their traditional hegemony in
the field, nor their former status as the standard computational engines for these problems. In their
place, diverse principles have been explored, leading to a consequent diversity in the
implied computational algorithms. The traditional on-line and data-intensive preoccupations
of signal processing continue to demand that these algorithms be tractable.
Increasingly, full probability modelling (the so-called Bayesian approach)—or
partial probability modelling using the likelihood function—is the pathway for design
of these algorithms. However, the results are often intractable, and so the area
of distributional approximation is of increasing relevance in signal processing. The
Expectation-Maximization (EM) algorithm and Laplace approximation, for example,
are standard approaches to handling difficult models, but these approximations
(certainty equivalence, and Gaussian, respectively) are often too drastic to handle
the high-dimensional, multi-modal and/or strongly correlated problems that are encountered.
Since the 1990s, stochastic simulation methods have come to dominate
Bayesian signal processing. Markov Chain Monte Carlo (MCMC) sampling, and related
methods, are appreciated for their ability to simulate possibly high-dimensional
distributions to arbitrary levels of accuracy. More recently, the particle filtering approach
has addressed on-line stochastic simulation. Nevertheless, the wider acceptability
of these methods—and, to some extent, Bayesian signal processing itself—
has been undermined by the large computational demands they typically make.
The Variational Bayes (VB) method of distributional approximation originates—
as does the MCMC method—in statistical physics, in the area known as Mean Field
Theory. Its method of approximation is easy to understand: conditional independence
is enforced as a functional constraint in the approximating distribution, and
the best such approximation is found by minimization of a Kullback-Leibler divergence
(KLD). The exact—but intractable—multivariate distribution is therefore factorized
into a product of tractable marginal distributions, the so-called VB-marginals.
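In standard mean-field notation (ours, for illustration; the book develops its own symbols), this construction can be summarized in two lines: the factorization constraint, and the coupled stationarity conditions characterizing the optimal factors.

```latex
% Mean-field factorization of the intractable posterior p(\theta \mid D):
q(\theta) = \prod_i q_i(\theta_i),
\qquad
q^{\ast} = \arg\min_{q} \mathrm{KL}\bigl(q \,\|\, p\bigr)
         = \arg\min_{q} \int q(\theta) \ln \frac{q(\theta)}{p(\theta \mid D)} \, \mathrm{d}\theta .
% Each optimal VB-marginal then satisfies
\ln q_i^{\ast}(\theta_i) = \mathrm{E}_{q_{j \neq i}}\bigl[ \ln p(\theta, D) \bigr] + \mathrm{const},
% a coupled set of implicit equations, solved by iteration (the IVB algorithm below).
```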
This straightforward proposal for approximating a distribution enjoys certain optimality properties. What is of more pragmatic concern to the signal processing community,
however, is that the VB-approximation conveniently addresses the following
key tasks:
1. The inference is focused (or, more formally, marginalized) onto selected subsets
of parameters of interest in the model: this one-shot (i.e. off-line) use of the VB
method can replace numerically intensive marginalization strategies based, for
example, on stochastic sampling.
2. Parameter inferences can be arranged to have an invariant functional form
when updated in the light of incoming data: this leads to feasible on-line
tracking algorithms involving the update of fixed- and finite-dimensional statistics.
In the language of the Bayesian, conjugacy can be achieved under the
VB-approximation. There is no reliance on propagating certainty equivalents,
stochastically-generated particles, etc.
Unusually for a modern Bayesian approach, then, no stochastic sampling is required
for the VB method. In its place, the shaping parameters of the VB-marginals are
found by iterating a set of implicit equations to convergence. This Iterative Variational
Bayes (IVB) algorithm enjoys a decisive advantage over the EM algorithm,
whose computational flow is similar: by design, the VB method yields distributions
in place of the point estimates emerging from the EM algorithm. Hence, in common
with all Bayesian approaches, the VB method provides, for example, measures of
uncertainty for any point estimates of interest, inferences of model order/rank, etc.
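To make the IVB flow-of-control concrete, the following is a minimal sketch for the textbook case of i.i.d. Gaussian data with unknown mean μ and precision τ under a Normal-Gamma prior, approximating the joint posterior by q(μ)q(τ). This is a standard example from the general VB literature, not an algorithm taken from this book; the function name, prior defaults and stopping rule are our own illustrative choices.

```python
import numpy as np

def ivb_normal_gamma(x, mu0=0.0, lam0=1.0, a0=1e-3, b0=1e-3,
                     tol=1e-10, max_iter=100):
    """Iterative VB for i.i.d. N(mu, 1/tau) data with a Normal-Gamma prior.

    The posterior p(mu, tau | x) is approximated by q(mu) q(tau), with
    q(mu) = N(mu_n, 1/lam_n) and q(tau) = Gamma(a_n, b_n).  The shaping
    parameters are coupled through E[tau], so they are found by iterating
    the implicit equations to convergence (the IVB algorithm).
    """
    n, xbar = len(x), float(np.mean(x))
    # mu_n and a_n follow directly from conjugacy; only lam_n and b_n are coupled.
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    a_n = a0 + (n + 1) / 2.0
    e_tau = a0 / b0                                   # initial guess for E[tau]
    for _ in range(max_iter):
        lam_n = (lam0 + n) * e_tau                    # update of q(mu)
        e_ss = np.sum((x - mu_n) ** 2) + n / lam_n    # E_q[sum_n (x_n - mu)^2]
        b_n = b0 + 0.5 * (e_ss + lam0 * ((mu_n - mu0) ** 2 + 1.0 / lam_n))
        e_tau_new = a_n / b_n                         # update of q(tau)
        converged = abs(e_tau_new - e_tau) < tol * e_tau
        e_tau = e_tau_new
        if converged:
            break
    return mu_n, lam_n, a_n, b_n

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=500)
mu_n, lam_n, a_n, b_n = ivb_normal_gamma(x)
print(f"E[mu] = {mu_n:.3f} +/- {lam_n ** -0.5:.3f}, E[tau] = {a_n / b_n:.1f}")
```

Note that the output is a full distribution over (μ, τ), carried by four finite-dimensional statistics of fixed functional form, which is exactly the conjugacy property claimed in point 2 above, and that uncertainty measures (e.g. the posterior standard deviation of μ) come for free, unlike in EM.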
The machine learning community has led the way in exploiting the VB method
in model-based inference, notably in inference for graphical models. It is timely,
however, to examine the VB method in the context of signal processing where, to
date, little work has been reported. In this book, at all times, we are concerned with
the way in which the VB method can lead to the design of tractable computational
schemes for tasks such as (i) dimensionality reduction, (ii) factor analysis for medical
imagery, (iii) on-line filtering of outliers and other non-Gaussian noise processes, (iv)
tracking of non-stationary processes, etc. Our aim in presenting these VB algorithms
is not just to reveal new flows-of-control for these problems, but—perhaps more
significantly—to understand the strengths and weaknesses of the VB-approximation
in model-based signal processing. In this way, we hope to dismantle the Bayesian
signal processing community's current psychology of dependence on stochastic
sampling methods. Without doubt, the ability to model complex problems to arbitrary
levels of accuracy will ensure that stochastic sampling methods—such as MCMC—
will remain the gold standard for distributional approximation. Notwithstanding
this, our purpose here is to show that the VB method of approximation can yield
highly effective Bayesian inference algorithms at low computational cost. In showing
this, we hope that Bayesian methods might become accessible to a much broader
constituency than has been achieved to date.