Title: A robust model generation technique for model-based video coding
Source: IEEE Transactions on Circuits and Systems for Video Technology, Nov. 2001, v. 11, no. 11, p. 1188-1192.
Abstract: In conventional model-based coding schemes, predefined static models are generally used. Because these models cannot adapt to new situations, they must be highly specific and cannot be generated from a single generic model, even when the objects they represent are very similar. In this letter, we present a model-generation technique that gradually builds a model and dynamically modifies it as new video frames are scanned. The proposed technique is robust to the object's orientation in the view and can be implemented efficiently with parallel processing. As a result, it makes model-based coding more attractive for practical, real-world applications.
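Note: the letter's algorithm is not reproduced in this record. As a rough, hypothetical illustration of the idea described in the abstract, i.e. gradually building a model and refining it with each new frame, the Python sketch below maintains a running estimate that is blended with every incoming, pre-aligned frame. The class name, the update_rate parameter, and the exponential-moving-average update rule are all assumptions made for illustration, not the authors' method; the per-pixel blend also hints at why such updates parallelize well.

import numpy as np

class IncrementalModel:
    """Hypothetical sketch: gradually build a model from successive frames.

    This is NOT the algorithm from the letter; it only illustrates the
    general idea of incremental model generation and refinement.
    """

    def __init__(self, update_rate=0.1):
        self.estimate = None            # running model, seeded by the first frame
        self.update_rate = update_rate  # weight given to each new frame

    def update(self, frame):
        """Blend a new, already-aligned frame into the running model."""
        frame = np.asarray(frame, dtype=np.float64)
        if self.estimate is None:
            self.estimate = frame.copy()  # first frame seeds the model
        else:
            # Exponential moving average: the existing model dominates,
            # each new frame nudges it toward the current appearance.
            # The blend is independent per pixel, so it parallelizes trivially.
            self.estimate = ((1.0 - self.update_rate) * self.estimate
                             + self.update_rate * frame)
        return self.estimate

# Example: refine the model frame by frame (stand-in random frames).
model = IncrementalModel(update_rate=0.2)
for frame in (np.random.rand(64, 64) for _ in range(10)):
    current = model.update(frame)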
Rights: © 2001 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.
Appears in Collections: EIE Journal/Magazine Articles
Files in This Item:
Model-Based Video Coding_01.pdf (274.08 kB, Adobe PDF)