For this paper, in the context of the French research project Spectacle en Ligne(s), we have recorded the entire set of rehearsals of one theater and one opera production using state-of-the-art video equipment. The resulting raw video and audio tracks, as well as manually generated annotation data, were then preprocessed in order to localize actors and detect their dialogues. Based on these preprocessing steps, we have built a Web-based hypervideo application that allows for navigation through performance time and space using modern HTML5 Web technologies such as the emerging Web Components standard. We publish and consume the annotation data as Linked Data Fragments, a novel approach to making triple-based structured data available in a scalable way. Thanks to our application, researchers interested in the genetic analysis of live performances can better understand the different steps leading to a chef-d'œuvre. A demo of the application is available at http://spectacleenlignes.fr/hypervideo/.
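To make the Linked Data Fragments idea concrete, the sketch below shows how a client might retrieve annotation triples from a Triple Pattern Fragments interface over plain HTTP. This is a minimal illustration, not the paper's implementation: the fragment endpoint URL, the Open Annotation predicate, and the video IRI are assumptions introduced here for the example; only the subject/predicate/object query-parameter convention follows the usual Triple Pattern Fragments interface.

```typescript
// Minimal sketch: fetching one Triple Pattern Fragment over HTTP.
// Endpoint URL, vocabulary, and IRIs are illustrative assumptions,
// not details taken from the paper.

const FRAGMENT_ENDPOINT = 'http://spectacleenlignes.fr/ldf/rehearsals'; // hypothetical

interface TriplePattern {
  subject?: string;
  predicate?: string;
  object?: string;
}

/** Request the first page of triples matching a pattern; unconstrained
 *  positions are simply omitted from the query string. */
async function fetchFragment(pattern: TriplePattern): Promise<string> {
  const params = new URLSearchParams();
  if (pattern.subject) params.set('subject', pattern.subject);
  if (pattern.predicate) params.set('predicate', pattern.predicate);
  if (pattern.object) params.set('object', pattern.object);

  const response = await fetch(`${FRAGMENT_ENDPOINT}?${params}`, {
    headers: { accept: 'text/turtle' }, // fragments also carry hydra paging controls
  });
  if (!response.ok) {
    throw new Error(`Fragment request failed: ${response.status}`);
  }
  return response.text(); // Turtle serialization of the matching triples
}

// Example: all annotations targeting a given rehearsal video
// (oa:hasTarget is from the Open Annotation vocabulary; the video IRI is made up).
fetchFragment({
  predicate: 'http://www.w3.org/ns/oa#hasTarget',
  object: 'http://spectacleenlignes.fr/video/rehearsal-42',
}).then(turtle => console.log(turtle));
```

The appeal of this interface is that the server only ever answers single triple-pattern requests, which are cheap to serve and easy to cache, while more complex queries are evaluated client-side by combining fragments.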
Steiner, T., Ronfard, R., Champin, P.-A., Encelle, B., & Prié, Y. (2015). Curtains Up! Lights, Camera, Action! Documenting the Creation of Theater and Opera Productions with Linked Data and Web Technologies. In Lecture Notes in Computer Science (Vol. 9114, pp. 533–543). Springer. https://doi.org/10.1007/978-3-319-19890-3_34