Derivation of the PHD filter based on direct Kullback-Leibler divergence minimisation
| Main Authors: | , |
|---|---|
| Format: | Conference Paper |
| Published: | 2015 |
| Online Access: | http://hdl.handle.net/20.500.11937/43581 |
| Summary: | In this paper, we provide a novel derivation of the probability hypothesis density (PHD) filter without using probability generating functionals or functional derivatives. The PHD filter fits in the context of assumed density filtering and implicitly performs Kullback-Leibler divergence (KLD) minimisations after the prediction and update steps. The novelty of this paper is that the KLD minimisation is performed directly on the multitarget prediction and posterior densities. |
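The summary's central claim is that the PHD filter's Poisson approximation arises from a KLD minimisation performed directly on the multitarget prediction and posterior densities. As a rough sketch of the standard result behind this claim (the notation $f$, $\lambda$, $\delta X$ and $D_f$ is assumed here rather than drawn from the record): the Poisson multitarget density closest in KLD to a given multitarget density $f$ has intensity equal to the first-order moment of $f$, which is exactly its PHD.

```latex
% A minimal sketch, with assumed notation (f, \lambda, \delta X, D_f), of the
% KLD-minimisation result the summary alludes to.
% A Poisson multitarget density with intensity function \lambda(x):
\[
  p_{\lambda}(X) = e^{-\int \lambda(x)\,\mathrm{d}x} \prod_{x \in X} \lambda(x).
\]
% Minimising the KLD from f over all Poisson densities (with \delta X
% denoting the set integral over finite sets X),
\[
  \lambda^{\star} = \operatorname*{arg\,min}_{\lambda}
    \int f(X) \log \frac{f(X)}{p_{\lambda}(X)} \,\delta X,
\]
% yields the intensity equal to the first-order moment (the PHD) of f:
\[
  \lambda^{\star}(x) = D_f(x) = \int f(\{x\} \cup X)\,\delta X.
\]
```

Applying this minimisation after the prediction and update steps, as the summary describes, recovers the PHD filter recursions without recourse to probability generating functionals or functional derivatives.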