Derivation of the PHD filter based on direct Kullback-Leibler divergence minimisation

In this paper, we provide a novel derivation of the probability hypothesis density (PHD) filter without using probability generating functionals or functional derivatives. The PHD filter fits in the context of assumed density filtering and implicitly performs Kullback-Leibler divergence (KLD) minimisations after the prediction and update steps. The novelty of this paper is that the KLD minimisation is performed directly on the multitarget prediction and posterior densities.
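The KLD minimisation the abstract refers to can be sketched as follows. This is a standard formulation (the notation below is assumed for illustration, not taken from the paper itself): the divergence between two multitarget densities is written with a set integral, and restricting the approximating density to the Poisson family recovers the well-known result that the best Poisson fit matches the first-order moment, i.e. the PHD.

```latex
% KLD between multitarget densities p and q, using the set integral
% over finite sets X = \{x_1, \dots, x_n\}:
D_{\mathrm{KL}}(p \,\|\, q)
  = \int p(X) \log \frac{p(X)}{q(X)} \, \delta X .

% Restricting q to Poisson multitarget densities with intensity \lambda,
%   q(X) = e^{-\int \lambda(x)\,dx} \prod_{x \in X} \lambda(x),
% the minimising intensity equals the first-order moment (the PHD) of p:
%   \lambda^{*}(x) = D_{p}(x).
```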


Bibliographic Details
Main Authors: García-Fernández, Ángel, Vo, Ba-Ngu
Format: Conference Paper
Published: 2015
Online Access: http://hdl.handle.net/20.500.11937/43581
id curtin-20.500.11937-43581
institution Curtin University Malaysia (Curtin Institutional Repository)
doi 10.1109/ICCAIS.2015.7338663