
Occlusion-Aware Fragment-Based Tracking with Spatial-Temporal Consistency
Abstract
We present a robust tracking method that exploits a fragment-based appearance model while taking both temporal continuity and spatial consistency into consideration.
Existing tracking algorithms mainly focus on designing a robust appearance model due to its core role in the tracking problem.
Quite recently, Liu et al. exploit multiple patch-based correlation filters to obtain several response maps, and propose an adaptive weighting method to fuse these response maps for object tracking. This assumption is too coarse, however, as it does not take motion estimation into account; it therefore limits tracking performance, especially when large or complex motions occur.
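The adaptive fusion idea can be sketched as follows. This is a minimal illustration, not the exact scheme of Liu et al.: the weighting rule here (peak-to-sidelobe ratio, a common confidence measure for correlation-filter responses) is an assumption.

```python
import numpy as np

def psr(response):
    """Peak-to-sidelobe ratio: confidence of a single response map (assumed metric)."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    # Exclude an 11x11 window around the peak from the sidelobe statistics.
    mask = np.ones_like(response, dtype=bool)
    y0, y1 = max(py - 5, 0), min(py + 6, response.shape[0])
    x0, x1 = max(px - 5, 0), min(px + 6, response.shape[1])
    mask[y0:y1, x0:x1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)

def fuse_responses(responses):
    """Fuse per-patch response maps with weights proportional to their PSR."""
    weights = np.array([psr(r) for r in responses])
    weights = np.clip(weights, 0.0, None)
    weights /= weights.sum() + 1e-8
    fused = sum(w * r for w, r in zip(weights, responses))
    return fused, weights
```

The argmax of the fused map then serves as the position estimate; a low-confidence (heavily occluded) patch contributes little to the fused result.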
Elsewhere, temporally consistent segmentations of moving objects are achieved by clustering point trajectories that have different survival times.
Conclusion
This paper presents a novel tracking algorithm that estimates the position of the tracked object and its occlusion state in an iterative manner.
The temporal consistency of occlusion states across frames is also taken into consideration during optimization.
In addition, a simple yet effective training (and updating) strategy is also introduced to ensure the model coefficients are properly learned.
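The iterative alternation between position estimation and occlusion-state estimation can be sketched as below. This is a hypothetical illustration, not the paper's exact formulation: the fragment response maps, the occlusion threshold `occ_thresh`, and the temporal-smoothing coefficient `smooth` are all assumptions introduced for the example.

```python
import numpy as np

def iterate_position_and_occlusion(responses, prev_occluded,
                                   occ_thresh=0.3, smooth=0.5, n_iters=3):
    """Alternate between (i) estimating the target position from the
    fragments currently deemed visible and (ii) updating each fragment's
    occlusion state from its response at that position (illustrative sketch)."""
    occluded = np.asarray(prev_occluded, dtype=float).copy()
    pos = None
    for _ in range(n_iters):
        # (i) Position step: fuse responses, down-weighting occluded fragments.
        vis = 1.0 - occluded
        if vis.sum() == 0:
            vis = np.ones_like(vis)  # fall back if everything looks occluded
        fused = sum(v * r for v, r in zip(vis, responses)) / vis.sum()
        pos = np.unravel_index(fused.argmax(), fused.shape)
        # (ii) Occlusion step: a fragment with a weak response at the
        # estimated position is flagged as occluded; blending with the
        # running estimate (seeded from the previous frame) loosely enforces
        # temporal consistency of occlusion states.
        scores = np.array([r[pos] for r in responses])
        new_occ = (scores < occ_thresh).astype(float)
        occluded = smooth * occluded + (1.0 - smooth) * new_occ
    return pos, occluded > 0.5
```

In this sketch the two estimates reinforce each other: a better position estimate gives more reliable per-fragment scores, which in turn sharpen the occlusion flags used for the next fusion.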