Derivation of the PHD filter based on direct Kullback-Leibler divergence minimisation

Bibliographic Details
Main Authors: García-Fernández, Ángel, Vo, Ba-Ngu
Format: Conference Paper
Published: 2015
Online Access: http://hdl.handle.net/20.500.11937/43581
Description
Summary: In this paper, we provide a novel derivation of the probability hypothesis density (PHD) filter without using probability generating functionals or functional derivatives. The PHD filter fits in the context of assumed density filtering and implicitly performs Kullback-Leibler divergence (KLD) minimisations after the prediction and update steps. The novelty of this paper is that the KLD minimisation is performed directly on the multitarget prediction and posterior densities.
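The key idea behind this style of derivation can be sketched as follows (a standard result from multitarget filtering, stated here for illustration rather than as the paper's exact development): among all Poisson point processes, the one closest in KLD to a given multitarget density matches its first-order moment, i.e. its PHD.

```latex
% Sketch: Poisson approximation by KLD minimisation.
% Let f be a multitarget density and q_v a Poisson point process
% with intensity v. The best Poisson approximation solves
%
%   v^* = \arg\min_{v} \, D_{\mathrm{KL}}\!\left( f \,\|\, q_v \right),
%
% where D_KL is the Kullback-Leibler divergence between multitarget
% densities (defined via set integrals). The minimiser is the
% first-order moment (intensity / PHD) of f:
%
%   v^*(x) = \int f\!\left( \{x\} \cup X \right) \, \delta X .
%
% Applying this projection after the multitarget prediction and
% update steps yields the PHD filter recursion, in the spirit of
% assumed density filtering.
```

In other words, the PHD filter can be read as repeatedly projecting the exact multitarget density back onto the family of Poisson processes, with KLD as the projection criterion.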