Off-line automatic virtual director for lecture video

Abstract

This research proposes an automatic mechanism for refining lecture video by composing meaningful video clips captured from multiple cameras. To maximize the captured video information and produce a lecture video suited to learners, the video content is first analysed using both visual and audio information. Meaningful events are then detected by extracting the lecturer's and learners' behaviours according to in-class teaching and learning principles. An event-driven camera-switching strategy, based on a finite state machine, changes the camera view to a meaningful one, and the final lecture video is produced by composing all of the meaningful clips. Experimental results show that learners felt interested and comfortable while watching the resulting lecture video, and that they agreed the selected video clips were meaningful.
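The abstract does not spell out the finite state machine, so the following is only a minimal sketch of how an event-driven camera-switching FSM of this kind might be organized. The camera views, event names, and transition table below are illustrative assumptions, not the paper's actual state set.

```python
# Sketch of an event-driven camera-switching FSM (hypothetical states/events).
from dataclasses import dataclass
from enum import Enum, auto


class Camera(Enum):
    OVERVIEW = auto()   # wide shot of the whole classroom
    LECTURER = auto()   # close-up of the lecturer
    SLIDE = auto()      # projector / screen-capture view
    LEARNERS = auto()   # camera facing the audience


@dataclass
class Event:
    time: float         # seconds from the start of the lecture
    name: str           # detected behaviour, e.g. "slide_change"


# Transition table: (current camera, event name) -> next camera.
# Pairs not listed here keep the current camera (self-loop).
TRANSITIONS = {
    (Camera.OVERVIEW, "lecturer_speaks"): Camera.LECTURER,
    (Camera.LECTURER, "slide_change"): Camera.SLIDE,
    (Camera.SLIDE, "lecturer_gestures"): Camera.LECTURER,
    (Camera.LECTURER, "learner_question"): Camera.LEARNERS,
    (Camera.SLIDE, "learner_question"): Camera.LEARNERS,
    (Camera.LEARNERS, "lecturer_speaks"): Camera.LECTURER,
}


def plan_cuts(events, start=Camera.OVERVIEW):
    """Return (time, camera) cut points to be composed into the final video."""
    current = start
    cuts = [(0.0, current)]
    for ev in sorted(events, key=lambda e: e.time):
        nxt = TRANSITIONS.get((current, ev.name), current)
        if nxt is not current:      # emit a cut only when the view changes
            cuts.append((ev.time, nxt))
            current = nxt
    return cuts


if __name__ == "__main__":
    detected = [
        Event(5.0, "lecturer_speaks"),
        Event(42.0, "slide_change"),
        Event(120.0, "learner_question"),
        Event(150.0, "lecturer_speaks"),
    ]
    for t, cam in plan_cuts(detected):
        print(f"{t:7.1f}s -> {cam.name}")
```

Because the process is off-line, the cut list can be computed over the whole event sequence before any clips are assembled, which is what makes composing the final video from the selected segments straightforward.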

Citation (APA)

Huang, D. W., Lin, Y. T., & Lee, G. C. (2014). Off-line automatic virtual director for lecture video. In Lecture Notes in Electrical Engineering (Vol. 260, p. 1279). Springer Verlag. https://doi.org/10.1007/978-94-007-7262-5_144
