Overview
Many of the challenges faced today in the development of interactive storytelling systems have been addressed before by filmmakers. Over the years, cinema has evolved and established principles and rules to be followed during the creation of a film. However, unlike a movie, where every scene is carefully planned in advance, an interactive story cannot rely on such planning: its scenes emerge at runtime.
This project explores the use of autonomous agents inspired by cinematography to dramatize interactive narratives. The research investigates how filmmaking knowledge, particularly regarding camera work, lighting, and music, can be encoded into intelligent systems that automatically create cinematic presentations for procedurally generated stories whose sequence of events is unknown beforehand.

Research Challenges
- How can cinematographic principles be formalized and encoded into autonomous agent systems?
- What methods enable real-time selection of optimal camera shots for unknown story sequences?
- How can visual and audio elements be automatically manipulated to express narrative emotions?
- What machine learning techniques can capture cinematographic knowledge for automated decision-making?
- How can dramatization systems adapt appropriately to any type of scene without prior planning?
Approach
This research developed a dramatization architecture composed of autonomous agents inspired by filmmaking professionals. Each agent performs a specific cinematographic role, applying established film production principles to create engaging presentations of interactive narratives generated in real time.
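A rough sketch of this architecture is shown below in Python; the class names, fields, and the shared decision interface are assumptions for illustration and do not reproduce the original system's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    """Hypothetical description of a generated story event handed to the agents."""
    event: str                 # e.g. "confrontation", "dialogue"
    emotion: str               # dominant emotion, e.g. "fear", "joy", "tension"
    characters: list = field(default_factory=list)

class CinematographyAgent:
    """Role shared by the Virtual Cinematography Director, Director of
    Photography, and Music Director agents in this sketch."""
    def decide(self, scene: Scene) -> dict:
        raise NotImplementedError

def dramatize(scene: Scene, agents: list) -> dict:
    """Ask every agent for its contribution to the presentation of the scene."""
    return {type(agent).__name__: agent.decide(scene) for agent in agents}
```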
The Virtual Cinematography Director agent selects optimal camera shots for dramatization scenes in real time using Support Vector Machines (SVMs) trained with cinematographic knowledge. The system learns from datasets encoding filmmaking principles to make shot-selection decisions that present narrative content in interesting and coherent ways, adapting to the emotional context and dramatic requirements of each scene.
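A minimal sketch of this idea follows, using scikit-learn with an invented feature encoding and shot vocabulary (the project's actual dataset and features are not reproduced here): a trained SVM maps numeric scene attributes to a shot type at run time.

```python
from sklearn.svm import SVC

# Toy training data: [number of characters, emotion code, action intensity]
# labeled with a shot type drawn from common cinematographic practice.
X_train = [
    [1, 0, 0.2],   # lone character, calm scene
    [2, 1, 0.5],   # dialogue between two characters
    [4, 2, 0.9],   # group action, high tension
]
y_train = ["close_up", "over_the_shoulder", "wide_shot"]

shot_classifier = SVC(kernel="rbf")
shot_classifier.fit(X_train, y_train)

# At run time the agent encodes the incoming scene the same way
# and queries the trained model for a shot type.
incoming_scene = [[2, 2, 0.8]]
print(shot_classifier.predict(incoming_scene))
```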
The Director of Photography agent manipulates visual parameters, including lighting, shadows, color grading, and visual effects, to enhance emotional expression in scenes. By analyzing emotional information about characters and the environment, this agent applies cinematographic techniques such as adjusting light intensity and creating atmospheric effects to emphasize emotions such as fear, joy, or tension.
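The sketch below illustrates this kind of emotion-to-lighting mapping; the parameter names and values are placeholders, and in the actual system such decisions are learned from cinematography data rather than hard-coded.

```python
# Placeholder lighting presets keyed by the scene's dominant emotion.
LIGHTING_PRESETS = {
    "fear":    {"intensity": 0.3, "color": (0.4, 0.4, 0.6), "shadow_strength": 0.9},
    "joy":     {"intensity": 1.0, "color": (1.0, 0.95, 0.8), "shadow_strength": 0.2},
    "tension": {"intensity": 0.5, "color": (0.8, 0.5, 0.4), "shadow_strength": 0.7},
}

def photography_decision(emotion: str) -> dict:
    """Return lighting parameters for the renderer, falling back to neutral lighting."""
    neutral = {"intensity": 0.8, "color": (1.0, 1.0, 1.0), "shadow_strength": 0.5}
    return LIGHTING_PRESETS.get(emotion, neutral)

print(photography_decision("fear"))
```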
The Music Director agent controls audio aspects of the dramatization, selecting and manipulating musical elements to reinforce scene emotions and enhance viewer immersion. Both the Director of Photography and Music Director use SVMs trained with cinematography knowledge datasets, enabling them to create and manipulate audio-visual parameters that increase the emotional impact and immersion of the interactive storytelling experience.
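The sketch below shows the manipulation side of this role, mapping a scene's emotion and intensity to playback parameters; the cue names, ranges, and mapping are illustrative assumptions, with the actual selection driven by trained SVMs as described above.

```python
def music_decision(emotion: str, intensity: float) -> dict:
    """Pick a music cue and playback parameters for the current scene
    so the audio engine can crossfade to them."""
    cue = {"fear": "low_drone", "joy": "major_theme", "tension": "ostinato"}.get(emotion, "ambient")
    level = max(0.0, min(1.0, intensity))           # clamp to [0, 1]
    return {
        "cue": cue,
        "volume": 0.4 + 0.6 * level,                # louder as intensity rises
        "tempo_scale": 0.9 + 0.3 * level,           # faster playback for tenser scenes
    }

print(music_decision("tension", 0.8))
```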
Key Contributions
- Designed a dramatization architecture composed of autonomous agents inspired by filmmaking professionals, enabling automated cinematic presentation of procedurally generated interactive narratives.
- Developed an intelligent cinematography director that uses Support Vector Machines to select optimal camera shots in real time for unknown story sequences, presenting content in engaging and coherent ways.
- Created a Director of Photography agent that automatically manipulates lighting, shadows, and visual effects based on scene emotions, applying cinematographic techniques to enhance emotional expression and atmosphere.
- Developed a Music Director agent that selects and manipulates audio elements to reinforce scene emotions, using machine learning trained on filmmaking knowledge to increase viewer immersion in interactive stories.
Prototype Demonstrations
Watch demonstrations of the virtual cinematography agents in action.
Related Publications
- Edirlei Soares de Lima. Um Modelo De Dramatização Baseado Em Agentes Cinematográficos Autônomos Para Storytelling Interativo. M.Sc. Dissertation, Universidade Federal de Santa Maria (UFSM), Santa Maria, Brazil, 2010. [PDF]
- Edirlei Soares de Lima; Cesar T. Pozzer; Bruno Feijó; Angelo E. M. Ciarlini; Antonio L. Furtado. Director of Photography and Music Director for Interactive Storytelling. Proceedings of the IX Brazilian Symposium on Computer Games and Digital Entertainment (SBGames 2010), Florianópolis, Brazil, November 2010, pp. 129-137. ISBN: 978-0-7695-4359-8. [PDF] [DOI]
- Edirlei Soares de Lima; Cesar T. Pozzer; Marcos C. d'Ornellas; Angelo E. M. Ciarlini; Bruno Feijó; Antonio L. Furtado. Virtual Cinematography Director for Interactive Storytelling. Proceedings of the International Conference on Advances in Computer Entertainment Technology (ACE 2009), Athens, Greece, October 2009, pp. 263-270. ISBN: 978-1-60558-864-3. [PDF] [DOI]
- Edirlei Soares de Lima; Cesar T. Pozzer; Marcos C. d'Ornellas; Angelo E. M. Ciarlini; Bruno Feijó; Antonio L. Furtado. Support Vector Machines for Cinematography Real-Time Camera Control in Storytelling Environments. Proceedings of the VIII Brazilian Symposium on Computer Games and Digital Entertainment (SBGames 2009), Rio de Janeiro, Brazil, October 2009, pp. 44-51. ISBN: 978-0-7695-3963-8. [PDF] [DOI]