
Smart video surveillance for airborne platforms

Published online by Cambridge University Press:  30 September 2008

Ali Sekmen*
Affiliation:
Department of Computer Science, Tennessee State University, Nashville, TN, USA.
Fenghui Yao
Affiliation:
Department of Computer Science, Tennessee State University, Nashville, TN, USA.
Mohan Malkani
Affiliation:
Department of Electrical and Computer Engineering, Tennessee State University, Nashville, TN, USA.
*Corresponding author. E-mail: [email protected]

Summary

This paper describes real-time computer vision algorithms for the detection, identification, and tracking of moving targets in video streams generated by a moving airborne platform. Moving platforms cause instabilities in image acquisition: disturbances and the ego-motion of the camera distort the apparent motion of the targets. When the camera is mounted on a moving observer, the entire scene (background and targets) appears to move, and the actual motion of the targets must be separated from the background motion. The motion of the airborne platform is modeled as an affine transformation whose parameters are estimated from corresponding feature sets in consecutive images. Once this motion is compensated, the platform can be treated as stationary and moving targets are detected accordingly. A number of tracking algorithms, including particle filters, mean-shift, and connected components, were implemented and compared. A cascaded boosted classifier with Haar wavelet feature extraction was developed for moving-target classification and integrated with a recognition system that uses joint feature-spatial distributions. The integrated smart video surveillance system has been successfully tested on the Vivid datasets provided by the Air Force Research Laboratory. The experimental results show that the system operates in real time and successfully detects, tracks, and identifies multiple targets in the presence of partial occlusion.
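The affine motion-compensation step described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name `estimate_affine` and the least-squares formulation are assumptions, standing in for any method that fits the six affine parameters to feature correspondences between consecutive frames.

```python
# Hypothetical sketch: least-squares estimation of the 2-D affine motion model
#   x' = a*x + b*y + tx,   y' = c*x + d*y + ty
# from corresponding feature points (src -> dst) in two consecutive frames.
# Once (a, b, tx, c, d, ty) is known, the background motion can be warped
# away, leaving independently moving targets as residual differences.

def estimate_affine(src, dst):
    """Return (a, b, tx, c, d, ty) fitted to point pairs (len >= 3).

    The x' and y' equations decouple, so we solve two independent
    3-unknown least-squares problems via their normal equations.
    """
    def solve3(M, v):
        # Gaussian elimination with partial pivoting on a 3x3 system M x = v.
        n = 3
        A = [row[:] + [v[i]] for i, row in enumerate(M)]  # augmented copy
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            for r in range(col + 1, n):
                f = A[r][col] / A[col][col]
                for c in range(col, n + 1):
                    A[r][c] -= f * A[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (A[r][n] - sum(A[r][c] * x[c]
                                  for c in range(r + 1, n))) / A[r][r]
        return x

    # Accumulate normal equations M p = v for p = (a, b, tx), rows (x, y, 1).
    M = [[0.0] * 3 for _ in range(3)]
    vx = [0.0] * 3
    vy = [0.0] * 3
    for (x, y), (xp, yp) in zip(src, dst):
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                M[i][j] += row[i] * row[j]
            vx[i] += row[i] * xp
            vy[i] += row[i] * yp
    a, b, tx = solve3(M, vx)
    c, d, ty = solve3(M, vy)
    return a, b, tx, c, d, ty
```

In practice the correspondences would come from a feature tracker such as pyramidal Lucas-Kanade, and a robust estimator (e.g. RANSAC over subsets of matches) would be layered on top so that feature points lying on moving targets do not bias the background motion estimate.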

Type: Article
Copyright: © Cambridge University Press 2008

