Augmenting Film and Video Footage with Sensor Data [Conference Paper]

NESL Technical Report #: 2004-3-1

Authors:

Abstract: With the advent of tiny networked devices, Mark Weiser’s vision of a world embedded with invisible computers is coming of age. Due to their small size and relative ease of deployment, sensor networks have been utilized by zoologists, seismologists and military personnel. In this paper, we investigate the novel application of sensor networks to the film industry. In particular, we are interested in augmenting film and video footage with sensor data. Unobtrusive sensors are deployed on a film set or in a television studio and on performers. During the filming of a scene, sensor data such as light intensity, color temperature and location are collected and synchronized with each film or video frame. Later, editors, graphics artists and programmers can view this data in synchronization with film and video playback. For example, such data can help define a new level of seamless integration between computer graphics and real-world photography. A real-time version of our system would allow sensor data to trigger camera movement and cue special effects. In this paper, we discuss the design and implementation of the first part of our embedded film set environment, the augmented recording system. Augmented recording is a foundational component of the UCLA Hypermedia Studio’s research into the use of sensor networks in film and video production. In addition, we have evaluated our system in a television studio.

Local downloads:

Publication Forum: IEEE International Conference on Pervasive Computing and Communications (PerCom)

Page (Start): 3

Page (End): 12

Page (Count): 10

Date: 2004-03-14

Place: Orlando, FL

Public Document?: Yes

NESL Document?: Yes

Document category: Conference Paper

Projects:
