Derivation of the PHD and CPHD Filters Based on Direct Kullback-Leibler Divergence Minimization

In this paper, we provide novel derivations of the probability hypothesis density (PHD) and cardinalised PHD (CPHD) filters without using probability generating functionals or functional derivatives. We show that both the PHD and CPHD filters fit in the context of assumed density filtering and implicitly perform Kullback-Leibler divergence (KLD) minimizations after the prediction and update steps. We perform the KLD minimizations directly on the multitarget prediction and posterior densities.
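For context, the KLD minimizations mentioned in the abstract can be sketched as follows. This is a minimal LaTeX sketch using standard random-finite-set notation, not an excerpt from the paper: p denotes a multitarget density, the integral with δX is the set integral, and D_p is the PHD (first-order moment density) of p.

% Sketch of the KLD between a multitarget density p and an approximation q
% (\delta X denotes the set integral over all finite sets):
\begin{align}
  D(p \,\|\, q) &= \int p(X) \log \frac{p(X)}{q(X)} \, \delta X
\end{align}
% PHD filter: q ranges over Poisson multitarget densities with intensity \lambda,
%   q(X) = e^{-\int \lambda(x)\,dx} \prod_{x \in X} \lambda(x),
% and the KLD-minimizing intensity is the first-order moment (PHD) of p:
\begin{align}
  \lambda^{*}(x) &= D_{p}(x) = \int p(\{x\} \cup X) \, \delta X
\end{align}
% CPHD filter: q ranges over i.i.d. cluster densities
%   q(\{x_1,\dots,x_n\}) = n! \, \rho(n) \prod_{i=1}^{n} s(x_i),
% and the minimizers are the cardinality distribution of p and the
% normalized PHD:
\begin{align}
  \rho^{*}(n) &= \frac{1}{n!} \int p(\{x_1,\dots,x_n\}) \, dx_1 \cdots dx_n,
  &
  s^{*}(x) &= \frac{D_{p}(x)}{\int D_{p}(x') \, dx'}
\end{align}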

Bibliographic Details
Main Authors: Garcia-Fernandez, Angel; Vo, Ba-Ngu
Format: Journal Article
Published: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2015
DOI: 10.1109/TSP.2015.2468677
Online Access: http://hdl.handle.net/20.500.11937/24733
Grant: http://purl.org/au-research/grants/arc/DP130104404