Beyond the Kalman Filter: Particle Filters for Tracking Applications


Beyond the Kalman Filter: Particle filters for tracking applications
N. J. Gordon
Tracking and Sensor Fusion Group, Intelligence, Surveillance and Reconnaissance Division, Defence Science and Technology Organisation, PO Box 1500, Edinburgh, SA 5111, AUSTRALIA. [email protected]
N.J. Gordon : Lake Louise : October 2003 – p. 1/47

Contents

– General PF discussion
  – History
  – Review
– Tracking applications
  – Sonobuoy
  – TBD
  – DBZ



What is tracking?

– Use models of the real world to
  – estimate the past and present
  – predict the future
– Achieved by extracting underlying information from a sequence of noisy/uncertain observations
– Perform inference on-line
– Evaluate an evolving sequence of probability distributions



Recursive filter

System model: \(x_t = f_t(x_{t-1}, v_t) \;\leftrightarrow\; p(x_t \mid x_{t-1})\)

Measurement model: \(y_t = h_t(x_t, w_t) \;\leftrightarrow\; p(y_t \mid x_t)\)

Information available: \(y_{1:t} = (y_1, \ldots, y_t)\) and the prior \(p(x_0)\)

Want \(p(x_{0:t+i} \mid y_{1:t})\) and especially \(p(x_t \mid y_{1:t})\)



Recursive filter

Prediction
\[ p(x_t \mid y_{1:t-1}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, dx_{t-1} \]

Update
\[ p(x_t \mid y_{1:t}) = \frac{p(y_t \mid x_t)\, p(x_t \mid y_{1:t-1})}{p(y_t \mid y_{1:t-1})},
   \qquad
   p(y_t \mid y_{1:t-1}) = \int p(y_t \mid x_t)\, p(x_t \mid y_{1:t-1})\, dx_t \]

Alternatively ...
\[ p(x_{0:t} \mid y_{1:t}) = p(x_{0:t-1} \mid y_{1:t-1})\,
   \frac{p(y_t \mid x_t)\, p(x_t \mid x_{t-1})}{p(y_t \mid y_{1:t-1})},
   \qquad
   p(x_t \mid y_{1:t}) = \int p(x_{0:t} \mid y_{1:t})\, dx_{0:t-1} \]
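The prediction/update recursion can be exercised numerically. The following is a minimal sketch (not from the talk) evaluating one cycle on a discrete grid for a 1-D random-walk state with additive Gaussian noise; the grid, noise scales and measurement value are all illustrative choices.

```python
import numpy as np

def gauss(x, mu, sigma):
    # Gaussian density, used for the transition and likelihood terms
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

grid = np.linspace(-10, 10, 401)            # discretised state space
dx = grid[1] - grid[0]
prior = gauss(grid, 0.0, 2.0)               # p(x_{t-1} | y_{1:t-1})
prior /= prior.sum() * dx

# Prediction: integrate p(x_t | x_{t-1}) against the previous posterior
trans = gauss(grid[:, None], grid[None, :], 1.0)   # p(x_t | x_{t-1}) on the grid
predicted = trans @ prior * dx

# Update: multiply by the likelihood and normalise by the evidence
y_t = 3.0
likelihood = gauss(y_t, grid, 1.5)          # p(y_t | x_t), here y_t = x_t + noise
posterior = likelihood * predicted
evidence = posterior.sum() * dx             # p(y_t | y_{1:t-1})
posterior /= evidence

posterior_mean = np.sum(grid * posterior) * dx   # pulled from 0 towards y_t
```

This grid evaluation is exactly the kind of numerical approximation that becomes infeasible in high dimensions, which motivates the Monte Carlo approach below.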

Tracking : what are the problems?

– On-line processing
– Target manoeuvres
– Missing measurements
– Spurious measurements
– Multiple objects and/or sensors
– Finite sensor resolution
– Prior constraints
– Signature information

Nonlinear/non-Gaussian models



Analytic approximations

– EKF and variants : linearisation, Gaussian approximation, unimodal
– Score function moment approximation : Masreliez (75), West (81), Fahrmeir (92), Pericchi & Smith (92)
– Series based approximation to score functions : Wu & Cheng (92)



Numerical approximations

– Discrete grid: Pole & West (88)
– Piecewise pdf: Kitagawa (87), Kramer & Sorenson (88)
– Series expansion: Sorenson & Stubberud (68)
– Gaussian mixtures: Sorenson & Alspach (72), West (92)
– Unscented filter: Julier & Uhlmann (95)



Monte Carlo Approximations

– Sequential Importance Sampling (SIS): Handschin & Mayne, Automatica, 1969.
– Improved SIS: Zaritskii et al., Automation and Remote Control, 1975.
– Rao-Blackwellisation: Akashi & Kumamoto, Automatica, 1977.

⇒ Too computationally demanding 20-30 years ago



Sequential Monte Carlo (SMC)

– SMC methods lead to an estimate of the complete probability distribution
– Approximation centred on the pdf rather than compromising the state space model
– Known as particle filters, SIR filters, bootstrap filters, Monte Carlo filters, Condensation, etc.



Why random samples?

– nonlinear/non-Gaussian
– constraints
– whole pdf available: moments/quantiles, HPD intervals
– association hypotheses independent over time
– multiple models trivial
– re-parameterisation
– scalable (how big is \(\infty\)?)
– parallelisable



Comparison

Kalman filter:
– analytic solution
– restrictive assumptions
– deduce state from measurement
– KF “optimal”
– EKF “sub-optimal”

Particle filter:
– sequential MC solution
– based on simulation
– no modelling restrictions
– predicts measurements from states
– optimal (with \(\infty\) computational resources)



Book Advert

Sequential Monte Carlo methods in practice
Editors: Doucet, de Freitas, Gordon. Springer-Verlag (2001)

– Theoretical Foundations
– Efficiency Measures
– Applications : target tracking, missile guidance, image tracking, terrain referenced navigation, exchange rate prediction, portfolio allocation, in-situ ellipsometry, pollution monitoring, communications and audio engineering.



Useful Information

Books
– “Sequential Monte Carlo methods in practice”, Doucet, de Freitas, Gordon, Springer, 2001.
– “Monte Carlo strategies in scientific computing”, Liu, Springer, 2001.
– “Beyond the Kalman filter : Tracking applications of particle filters”, Ristic, Arulampalam, Gordon, Artech House, 2003?

Papers
– “On sequential Monte Carlo sampling methods for Bayesian filtering”, Statistics and Computing, Vol 10, No 3, pp. 197-208, 2000.
– IEEE Trans. Signal Processing special issue, February 2002.

Web site
– www.cs.ubc.ca/~nando/smc/index.html (includes software)



Particle Filter

– Represent uncertainty over \(x_{1:t}\) using a diversity of weighted particles \(\{x_{1:t}^i, w_t^i\}_{i=1}^N\)
– Ideally:
\[ x_{1:t}^i \sim p(x_{1:t} \mid y_{1:t}) = \frac{p(x_{1:t}, y_{1:t})}{p(y_{1:t})},
   \qquad \text{where } p(y_{1:t}) = \int p(x_{1:t}, y_{1:t})\, dx_{1:t} \]
– What if we can’t sample \(p(x_{1:t} \mid y_{1:t})\)?

Particle Filter - Importance Sampling

– Sample from a convenient proposal distribution \(q(x_{1:t} \mid y_{1:t})\)
– Use importance sampling to modify the weights
\[ \int p(x_{1:t} \mid y_{1:t})\, f(x_{1:t})\, dx_{1:t}
   = \int \frac{p(x_{1:t} \mid y_{1:t})}{q(x_{1:t} \mid y_{1:t})}\,
     q(x_{1:t} \mid y_{1:t})\, f(x_{1:t})\, dx_{1:t}
   \approx \sum_{i=1}^N w_t^i\, f(x_{1:t}^i) \]
where
\[ x_{1:t}^i \sim q(x_{1:t} \mid y_{1:t}), \qquad
   w_t^i \propto \frac{p(x_{1:t}^i \mid y_{1:t})}{q(x_{1:t}^i \mid y_{1:t})},
   \qquad \sum_{i=1}^N w_t^i = 1 \]

Particle Filter - Importance Sampling

– Pick a convenient proposal
– Define the un-normalised weight:
\[ \tilde w_t^i = \frac{p(x_{1:t}^i, y_{1:t})}{q(x_{1:t}^i \mid y_{1:t})} \]
– Can then calculate an approximation to \(p(y_{1:t})\)
\[ p(y_{1:t}) \approx \frac{1}{N} \sum_{i=1}^N \tilde w_t^i \]
– The normalised weight is
\[ w_t^i = \frac{\tilde w_t^i}{\sum_{j=1}^N \tilde w_t^j}
   \propto \frac{p(x_{1:t}^i \mid y_{1:t})}{q(x_{1:t}^i \mid y_{1:t})} \]
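As a concrete illustration (not from the talk), self-normalised importance sampling with these un-normalised weights can be checked on a toy Gaussian model where the posterior mean and evidence are known in closed form; the target, proposal and observed value below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
y = 1.0                                     # single observation

def npdf(v, mu, s):
    return np.exp(-0.5 * ((v - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

x = rng.normal(0.0, 2.0, N)                 # x^i ~ q(x | y) = N(0, 2^2)
# un-normalised weights: w~ = p(x, y) / q(x | y), with p(x, y) = N(x;0,1) N(y;x,1)
w_tilde = npdf(x, 0.0, 1.0) * npdf(y, x, 1.0) / npdf(x, 0.0, 2.0)
evidence = w_tilde.mean()                   # p(y) ≈ (1/N) Σ w~^i
w = w_tilde / w_tilde.sum()                 # normalised weights
post_mean = np.sum(w * x)                   # ≈ E[x | y], exactly 0.5 for this model
```

For this conjugate model the exact posterior is \(N(0.5, 0.5)\) and the exact evidence is \(N(1; 0, \sqrt{2}) \approx 0.2197\), so both estimates can be verified directly.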



Particle Filter - SIS

– To perform Sequential Importance Sampling (SIS), choose
\[ q(x_{1:t} \mid y_{1:t}) \triangleq
   \underbrace{q(x_{1:t-1} \mid y_{1:t-1})}_{\text{Keep existing path}}\;
   \underbrace{q(x_t \mid x_{t-1}, y_t)}_{\text{Extend path}} \]
– The un-normalised weight then takes the appealing form
\[ \tilde w_t^i = \frac{p(x_{1:t}^i, y_t \mid y_{1:t-1})}{q(x_{1:t}^i \mid y_{1:t})}
   = \frac{p(x_{1:t-1}^i \mid y_{1:t-1})}{q(x_{1:t-1}^i \mid y_{1:t-1})}\,
     \frac{p(x_t^i, y_t \mid x_{t-1}^i)}{q(x_t^i \mid x_{t-1}^i, y_t)}
   = w_{t-1}^i\,
     \underbrace{\frac{p(y_t \mid x_t^i)\, p(x_t^i \mid x_{t-1}^i)}
                      {q(x_t^i \mid x_{t-1}^i, y_t)}}_{\text{Incremental weight}} \]
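A single SIS step with the prior as proposal, so that the incremental weight reduces to the likelihood, might look like the sketch below; the scalar random-walk model and its noise scales are illustrative assumptions, not the talk's code.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5000
q_std, r_std = 1.0, 0.5                     # process / measurement noise std

particles = rng.normal(0.0, 2.0, N)         # x_{t-1}^i, equally weighted
weights = np.full(N, 1.0 / N)

y_t = 2.0
# Extend each path by sampling the proposal q = p(x_t | x_{t-1}^i)
particles = particles + rng.normal(0.0, q_std, N)
# With the prior proposal the incremental weight is just p(y_t | x_t^i)
incremental = np.exp(-0.5 * ((y_t - particles) / r_std) ** 2)
weights *= incremental
weights /= weights.sum()

ess = 1.0 / np.sum(weights ** 2)            # effective sample size: degeneracy check
post_mean = np.sum(weights * particles)
```

The effective sample size drops well below \(N\) after even one sharp likelihood update, which is the degeneracy problem addressed by resampling on the following slides.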

Illustration of SIS
(figure omitted)

Illustration of SIS - data conflict
(figure omitted)



SIS Problem

– Whatever the importance function, degeneracy is observed (Kong, Liu and Wong 1994).
– Introduce a selection scheme to discard/multiply particles \(x_{0:t}^i\) with respectively low/high importance weights
– Resampling maps the weighted random measure \((x_{0:t}^i, w_t^i)\) onto the equally weighted random measure \((x_{0:t}^i, N^{-1})\)
– The scheme generates \(N_i\) children such that \(\sum_{i=1}^N N_i = N\) and satisfies \(E(N_i) = N w_t^i\)
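One standard scheme satisfying \(E(N_i) = N w_t^i\) is systematic resampling; the sketch below is a common implementation of that idea, not code from the talk.

```python
import numpy as np

def systematic_resample(rng, weights):
    # One uniform draw, stratified across N equal slots of the weight CDF
    N = len(weights)
    positions = (rng.random() + np.arange(N)) / N
    return np.searchsorted(np.cumsum(weights), positions)

rng = np.random.default_rng(2)
x = np.array([0.0, 1.0, 2.0, 3.0])
w = np.array([0.1, 0.2, 0.6, 0.1])
idx = systematic_resample(rng, w)
resampled = x[idx]                          # equally weighted measure (x^i, 1/N)
```

Each particle receives either \(\lfloor N w^i \rfloor\) or \(\lceil N w^i \rceil\) children, so the scheme has lower variance than independent multinomial draws.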

Illustration of SIR
(figures omitted)



Ingredients for Particle filter

– Importance sampling function
  – Prior \(p(x_t \mid x_{t-1}^{(i)})\)
  – Optimal \(p(x_t \mid x_{t-1}^{(i)}, y_t)\)
  – UKF, linearised EKF, ...
– Redistribution scheme
  – Multinomial
  – Deterministic
  – Residual
  – Stratified
– Careful initialisation procedure (for efficiency)
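Assembled together, these ingredients give a minimal bootstrap filter: prior proposal, likelihood weighting, multinomial redistribution. The sketch below runs that combination on a toy scalar random-walk model; the model, its parameters and the Gaussian initialisation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 2000, 30
q_std, r_std = 0.5, 1.0

# Simulated truth and measurements: x_t = x_{t-1} + v_t, y_t = x_t + w_t
truth = np.cumsum(rng.normal(0.0, q_std, T))
ys = truth + rng.normal(0.0, r_std, T)

particles = rng.normal(0.0, 1.0, N)         # initialisation from an assumed p(x_0)
estimates = []
for y in ys:
    particles = particles + rng.normal(0.0, q_std, N)      # prior proposal
    w = np.exp(-0.5 * ((y - particles) / r_std) ** 2)      # ∝ p(y_t | x_t^i)
    w /= w.sum()
    estimates.append(np.sum(w * particles))                # posterior mean estimate
    particles = rng.choice(particles, size=N, p=w)         # multinomial redistribution

rmse = np.sqrt(np.mean((np.array(estimates) - truth) ** 2))
```

For this linear-Gaussian toy case the filter estimate tracks the truth noticeably better than the raw measurements, matching what a Kalman filter would achieve.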



Improvements to SIR

To alleviate degeneracy problems many other methods have been proposed
– Local linearisation (Doucet, 1998; Pitt & Shephard, 1999) using the EKF to estimate the importance distribution, or the UKF (Doucet et al, 1999)
– Rejection methods (Müller, 1991; Hürzeler & Künsch, 1998; Doucet, 1998; Pitt & Shephard, 1999)
– Auxiliary particle filters (Pitt & Shephard, 1999)
– Kernel smoothing (Gordon, 1993; Liu & West, 2000; Musso et al, 2000)
– MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al, 1997; Gilks & Berzuini, 1999; Andrieu et al, 1999)
– Bridging densities (Clapp & Godsill, 1999)



Auxiliary SIR - ASIR

– Introduced by Pitt and Shephard 1999.
– Use importance sampling function \(q(x_t, i \mid y_{1:t})\)
– Auxiliary variable \(i\) refers to the index of a particle at time \(t-1\)
– Importance distribution chosen to satisfy
\[ q(x_t, i \mid y_{1:t}) \propto p(y_t \mid \mu_t^i)\, p(x_t \mid x_{t-1}^i)\, w_{t-1}^i \]
– \(\mu_t^i\) is some characterisation of \(x_t\) given \(x_{t-1}^i\), e.g. \(\mu_t^i = E(x_t \mid x_{t-1}^i)\) or \(\mu_t^i \sim p(x_t \mid x_{t-1}^i)\)
– This gives
\[ w_t^j \propto w_{t-1}^{i^j}\,
   \frac{p(y_t \mid x_t^j)\, p(x_t^j \mid x_{t-1}^{i^j})}{q(x_t^j, i^j \mid y_{1:t})}
   = \frac{p(y_t \mid x_t^j)}{p(y_t \mid \mu_t^{i^j})} \]
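One ASIR step on a toy scalar model might look like the following sketch, with \(\mu_t^i = E(x_t \mid x_{t-1}^i)\); the random-walk model and all parameters are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 5000
q_std, r_std = 0.2, 0.5                     # small process noise favours ASIR
y_t = 2.0

def lik(y, x):
    return np.exp(-0.5 * ((y - x) / r_std) ** 2)   # ∝ p(y | x)

x_prev = rng.normal(0.0, 1.0, N)            # x_{t-1}^i
w_prev = np.full(N, 1.0 / N)

mu = x_prev                                 # μ_t^i = E(x_t | x_{t-1}^i) for a random walk
first = w_prev * lik(y_t, mu)               # first stage: ∝ p(y_t | μ_t^i) w_{t-1}^i
first /= first.sum()
i = rng.choice(N, size=N, p=first)          # sample the auxiliary indices

x_new = x_prev[i] + rng.normal(0.0, q_std, N)   # x_t^j ~ p(x_t | x_{t-1}^{i_j})
w = lik(y_t, x_new) / lik(y_t, mu[i])       # w_t^j ∝ p(y_t | x_t^j) / p(y_t | μ_t^{i_j})
w /= w.sum()
post_mean = np.sum(w * x_new)
```

Because the process noise is small here, \(\mu_t^i\) characterises \(p(x_t \mid x_{t-1}^i)\) well and the second-stage weights stay close to uniform.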



ASIR

– Naturally uses points at \(t-1\) which are “close” to the measurement \(y_t\)
– If the process noise is small then ASIR is less sensitive to outliers than SIR
– This is because the single point \(\mu_t^i\) characterises \(p(x_t \mid x_{t-1})\) well
– But if the process noise is large then ASIR can degrade performance
– Since a single point \(\mu_t^i\) does not characterise \(p(x_t \mid x_{t-1})\)



Regularised PF - RPF

– Resampling introduced to reduce degeneracy
– But it also reduces diversity
– RPF proposed as a solution
– Uses a continuous kernel based approximation
\[ \hat p(x_t \mid y_{1:t}) \approx \sum_{i=1}^N w_t^i\, K_h(x_t - x_t^i) \]

RPF
(figure omitted)



RPF

– Kernel \(K(\cdot)\) and bandwidth \(h\) chosen to minimise the MISE
\[ \mathrm{MISE}(\hat p) = E\left[ \int \{\hat p(x_t \mid y_{1:t}) - p(x_t \mid y_{1:t})\}^2\, dx_t \right] \]
– For equally weighted samples, the optimal choice is the Epanechnikov kernel
\[ K_{\mathrm{opt}} = \begin{cases}
     \dfrac{n_x + 2}{2 c_{n_x}}\,(1 - \|x\|^2) & \text{if } \|x\| < 1 \\
     0 & \text{otherwise}
   \end{cases} \]
where \(c_{n_x}\) is the volume of the unit hypersphere in \(\mathbb{R}^{n_x}\)
– The optimal bandwidth can be obtained as a function of the underlying pdf
– Assuming this is Gaussian with unit covariance matrix:
\[ h_{\mathrm{opt}} = \left[ 8\, c_{n_x}^{-1} (n_x + 4)\, (2\sqrt{\pi})^{n_x} \right]^{1/(n_x+4)} N^{-1/(n_x+4)} \]
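The optimal bandwidth is straightforward to evaluate; the sketch below computes \(c_{n_x}\) as the volume of the unit hypersphere via the Gamma function (this definition of \(c_{n_x}\) is the standard one, stated here as an assumption).

```python
from math import gamma, pi, sqrt

def h_opt(nx, N):
    # c_nx: volume of the unit hypersphere in nx dimensions
    c_nx = pi ** (nx / 2) / gamma(nx / 2 + 1)
    A = (8 / c_nx * (nx + 4) * (2 * sqrt(pi)) ** nx) ** (1 / (nx + 4))
    return A * N ** (-1 / (nx + 4))

h_small = h_opt(2, 1000)       # bandwidth for N = 1000 particles in 2-D
h_large = h_opt(2, 100_000)    # shrinks as N grows: regularisation vanishes
```

The \(N^{-1/(n_x+4)}\) factor means the kernel jitter disappears asymptotically, so the RPF converges to the same limit as the plain SIR filter.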



MCMC Moves

– RPF moves are blind
– Instead, introduce a Metropolis style acceptance step
– Resampled \(x_k^R\) and created \(x_{1:k}^R = \{x_k^R, x_{1:k-1}^R\}\)
– Then sampled from a proposal distribution \(x_k^P \sim q(\cdot \mid x_k^R)\) and created \(x_{1:k}^P = \{x_k^P, x_{1:k-1}^R\}\)
– Assume \(q(\cdot \mid \cdot)\) symmetric
\[ x_{1:k} = \begin{cases}
     x_{1:k}^P & \text{with probability } \alpha \\
     x_{1:k}^R & \text{otherwise}
   \end{cases} \]
\[ \alpha = \min\left(1,
     \frac{p(x_{1:k}^P \mid y_{1:k})\, q(x_k^R \mid x_k^P)}
          {p(x_{1:k}^R \mid y_{1:k})\, q(x_k^P \mid x_k^R)}\right)
   = \min\left(1,
     \frac{p(y_k \mid x_k^P)\, p(x_k^P \mid x_{k-1}^R)}
          {p(y_k \mid x_k^R)\, p(x_k^R \mid x_{k-1}^R)}\right) \]
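A vectorised Metropolis move of this kind, with a symmetric random-walk proposal so that the \(q\) terms cancel, might be sketched as follows on a toy scalar model; the model and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 5000
q_std, r_std, prop_std = 1.0, 0.5, 0.5
y_k = 2.0

x_prev = rng.normal(0.0, 2.0, N)            # x_{k-1}^R, held fixed by the move
x_R = x_prev + rng.normal(0.0, q_std, N)    # current particles x_k^R

def target(xk):
    # un-normalised p(y_k | x_k) p(x_k | x_{k-1}^R), per particle
    return (np.exp(-0.5 * ((y_k - xk) / r_std) ** 2)
            * np.exp(-0.5 * ((xk - x_prev) / q_std) ** 2))

x_P = x_R + rng.normal(0.0, prop_std, N)    # symmetric proposal: q terms cancel
alpha = np.minimum(1.0, target(x_P) / target(x_R))
accept = rng.random(N) < alpha
x_k = np.where(accept, x_P, x_R)            # accepted moves restore diversity
```

Because each particle moves against its own target \(p(y_k \mid x_k)\, p(x_k \mid x_{k-1}^R)\), the move leaves the filtering distribution invariant while breaking the ties created by resampling.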



Tracking dim targets

– Detection and tracking of low SNR targets - better not to threshold the sensor data!
– The concept: track-before-detect (TBD)
– Conventional TBD approaches:
  – Hough transform (Carlson et al, 1994)
  – dynamic programming (Barniv, 1990; Arnold et al, 1993)
  – maximum likelihood (Tonissen, 1994)
– Performance improved by 3-5 dB in comparison to the MHT (thresholded data).



Recursive Bayesian TBD

– Drawbacks of conventional TBD approaches: batch processing; prohibit or penalise deviations from straight line motion; require enormous computational resources.
– A recursive Bayesian TBD (Salmond, 2001), implemented as a particle filter:
  – no need to store/process multiple scans
  – target motion modelled by a stochastic dynamic equation
  – valid for non-Gaussian and structured background noise
  – the effects of the point spread function, finite resolution, and unknown and fluctuating SNR are accommodated
  – target presence and absence explicitly modelled
– Run Demo



Mathematical formulation

– Target state vector \(\mathbf{x}_k = [x_k \;\; \dot x_k \;\; y_k \;\; \dot y_k \;\; I_k]^T\)
– State dynamics: \(\mathbf{x}_{k+1} = f_k(\mathbf{x}_k, v_k)\)
– Target existence: a two state Markov chain, \(E_k \in \{0, 1\}\), with TPM
\[ \Pi = \begin{bmatrix} 1 - P_b & P_b \\ P_d & 1 - P_d \end{bmatrix} \]



Mathematical formulation (Cont’d)

– Sensor model: 2D map, an image with resolution cells of size \(\Delta_x \times \Delta_y\).
– At each resolution cell \((i, j)\) the measured intensity is
\[ z_k^{(i,j)} = \begin{cases}
     h_k^{(i,j)}(\mathbf{x}_k) + w_k^{(i,j)} & \text{if target present} \\
     w_k^{(i,j)} & \text{if target absent}
   \end{cases} \]
where
\[ h_k^{(i,j)}(\mathbf{x}_k) \approx \frac{\Delta_x \Delta_y I_k}{2\pi \Sigma^2}
   \exp\left( -\frac{(i\Delta_x - x_k)^2 + (j\Delta_y - y_k)^2}{2\Sigma^2} \right) \]
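The point-spread contribution \(h_k^{(i,j)}\) is easy to evaluate over the whole image; in the sketch below the grid spacing, blur \(\Sigma\), target position and intensity are arbitrary illustrative choices, not values from the talk.

```python
import numpy as np

def h_cell(i, j, xk, yk, Ik, dx=1.0, dy=1.0, sigma=0.7):
    # Target contribution to the intensity measured in resolution cell (i, j)
    return (dx * dy * Ik / (2 * np.pi * sigma ** 2)
            * np.exp(-((i * dx - xk) ** 2 + (j * dy - yk) ** 2)
                     / (2 * sigma ** 2)))

# A target at (5.2, 7.8) with intensity 20 spreads over a few cells of a 20x20 image
ii, jj = np.meshgrid(np.arange(20), np.arange(20), indexing="ij")
image = h_cell(ii, jj, 5.2, 7.8, Ik=20.0)
```

The cell energies sum to approximately \(I_k\) when the spread lies inside the image, which is why sub-cell target position and intensity remain observable without thresholding.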



Example

– 30 frames, target present in frames 7 to 22
– SNR = 6.7 dB (unknown)
– 20 × 20 cells
(figure omitted: frames 2, 7, 12, 17, 22, 27)

Particle filter output (6 states)
Figures suppressed to reduce file size.



Particle filter output (6 states)
(figure omitted: estimated probability of target existence vs frame number, and true vs estimated target path in the x-y plane with the start point marked)



PF based TBD - Performance

– Derived CRLBs (as a function of SNR)
– Compared PF-TBD to CRLBs
– Detection and tracking reliable at 5 dB (or higher)



Sonobuoy and Submarine

– Noisy bearing measurements from drifting sensors
– Uncertainty in sensor locations
– Sensor loss
– High proportion of spurious bearing measurements
– Run demo



Blind Doppler

– Blind Doppler zone used to filter out ground clutter + ground moving targets
– A simple EP measure against any CW or pulse Doppler radar
– Causes track loss
– Aided by on-board RWR or ESM



Tracking with hard constraints

– Prior information on sensor constraints
– Posterior pdf is truncated (non-Gaussian)
– Example:
  – 2-D tracking with CV model
  – \((r, \theta, \dot r)\) measurements
  – \(p_d < 1\)
  – EKF and Particle Filter with identical gating and track scoring



EKF
(figure omitted: (a) target trajectory in the x-y plane; (b) range-rate [km/h] vs time [s]; (c) track score vs time [s], showing track loss)



Particle Filter
(figure omitted: (a) target trajectory in the x-y plane; (b) range-rate [km/h] vs time [s]; (c) track score vs time [s], with the cloud of particles at t = 108 s shown)



Track continuity
(figure omitted: probability of track maintenance vs Doppler blind zone limit [km/h] for the PF and EKF, at update intervals T = 2 s and T = 4 s)



Problems hindering SMC

– Convergence results
  – Becoming available (Del Moral, Crisan, Chopin, Lyons, Doucet ...)
– Communication bandwidth
  – Large particle sets impossible to send
– Interoperability
  – Need to integrate with varied tracking algorithms
– Computation
  – Expensive, so look to minimise Monte Carlo
– Multi-target problems
  – Rao-Blackwellisation



Final comments

– Sequential Monte Carlo methods
  – “Optimal” filtering for nonlinear/non-Gaussian models
  – Flexible/Parallelisable
  – Not a black-box: efficiency depends on careful design.
– If the Kalman filter is appropriate for your application, use it
– A Kalman filter is a (one) particle filter
