Overview
Drawing is a fundamental human skill that has served for thousands of years as a visual complement to written and oral storytelling. Interactive narratives add the possibility of agency to traditional stories, but most interactive storytelling systems rely on complex interfaces that create barriers between users and the narrative experience.
This project explores the use of augmented reality combined with sketch-based interaction to create a more natural and engaging interface for interactive storytelling. The research develops systems that allow users to interact with virtual characters simply by sketching objects on conventional paper, bridging the tactile familiarity of traditional drawing with the dynamic possibilities of computational narratives.

Research Challenges
- How can hand-drawn sketches be accurately recognized and converted into virtual objects in real time?
- What methods enable augmented reality visualization of interactive narratives over conventional paper?
- How can sketch-based interaction improve user engagement and sense of authorship compared to traditional interfaces?
- What approaches allow users to indirectly influence character decisions and even subvert storylines through drawings?
- How can this natural interaction method work with planning-based narrative generation systems?

Approach
This research combines augmented reality visualization with sketch-based interaction to create a mixed reality interactive storytelling system. The system dramatizes interactive narratives in augmented reality over a conventional sheet of paper, allowing users to freely interact with virtual characters by sketching objects using everyday paper and pencil.
The technical approach uses machine learning for sketch recognition, employing Support Vector Machine (SVM) classifiers trained to recognize user-drawn objects. When the camera captures a sketch on paper, the system identifies the drawing, converts it into a virtual 3D object, and integrates it into the augmented reality story world. Fiducial markers on the paper enable the system to compute camera position and render the virtual world correctly aligned with the physical environment.
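The recognition step can be illustrated with a minimal SVM classifier. This is a hedged sketch using scikit-learn, not the system's actual implementation: the feature vectors are synthetic stand-ins for real sketch descriptors (e.g., contour or stroke-histogram features), and the class labels are illustrative.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic feature vectors standing in for real sketch descriptors;
# the cluster means are arbitrary and chosen only to be separable.
rng = np.random.default_rng(0)
sword_feats = rng.normal(loc=[1.0, 0.0, 0.5], scale=0.05, size=(50, 3))
bottle_feats = rng.normal(loc=[0.0, 1.0, 0.2], scale=0.05, size=(50, 3))

X = np.vstack([sword_feats, bottle_feats])
y = ["sword"] * 50 + ["bottle"] * 50

# Train an SVM classifier, the kind of model the text describes
clf = SVC(kernel="rbf").fit(X, y)

# Classify the feature vector of a newly captured sketch
label = clf.predict([[0.95, 0.05, 0.5]])[0]  # "sword" for these synthetic data
```

In the real pipeline the input would come from the camera image after segmenting the drawing from the paper; the SVM then maps the extracted features to an object class, which selects the 3D model to insert into the AR scene.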
The system architecture includes an emotional and social model for characters, an action planner that responds to user interactions, and the sketch-based interface layer. This combination allows users to indirectly affect character decisions through their drawings. For example, sketching a sword might enable a character to defend themselves, while drawing a poison bottle could lead to darker story outcomes.
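The indirect influence of drawings on character decisions can be sketched as a decision rule over the set of recognized objects. The action names, moods, and object labels below are assumptions for illustration, not the system's actual vocabulary:

```python
def choose_action(character_mood: str, world_objects: set) -> str:
    """Pick a character action given the character's mood and the objects
    the user has sketched. Hypothetical rules illustrating how drawings
    indirectly steer decisions; not the real emotional/social model."""
    if "sword" in world_objects:
        return "defend"            # a sketched sword enables self-defense
    if "poison" in world_objects and character_mood == "vengeful":
        return "poison_rival"      # a darker outcome becomes reachable
    return "flee"                  # default when no sketched object helps

choose_action("fearful", {"sword"})      # "defend"
choose_action("vengeful", {"poison"})    # "poison_rival"
choose_action("fearful", set())          # "flee"
```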
Later work extended the approach to support generic hand-drawn sketches and planning-based interactive storytelling, enabling more flexible interaction where users can sketch various objects that influence the narrative generation process. User studies demonstrated that hand drawing as an interaction method significantly improves user satisfaction, experience, and system usability.
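How sketched objects can unlock entire story branches in a planning-based generator can be illustrated with a toy forward planner. This breadth-first search over sets of story facts is a simplified stand-in for the actual narrative planner; the action and fact names are invented for the example:

```python
from collections import deque

def plan(start, goal, actions):
    """Breadth-first search over frozensets of story facts.
    Each action is a tuple (name, preconditions, effects)."""
    frontier = deque([(frozenset(start), [])])
    seen = {frozenset(start)}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:
            return steps
        for name, pre, add in actions:
            nxt = state | add
            if pre <= state and nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, steps + [name]))
    return None  # goal unreachable with the current objects

# Hypothetical story actions: sketching a sword makes defense plannable
ACTIONS = [
    ("pick_up_sword", {"sword_sketched"}, {"has_sword"}),
    ("defend_castle", {"has_sword"}, {"villain_repelled"}),
]

plan({"sword_sketched"}, {"villain_repelled"}, ACTIONS)
# -> ["pick_up_sword", "defend_castle"]
plan(set(), {"villain_repelled"}, ACTIONS)
# -> None (without the sketch, the plot line cannot be generated)
```

The point of the example is the coupling: adding a recognized sketch adds a fact to the initial state, which can make previously unreachable plot goals plannable, and thereby lets users subvert or redirect the storyline.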
Example of user interaction: sketching objects on paper that are recognized and converted into virtual objects within the augmented reality narrative environment, allowing users to influence story progression and character actions.

Key Contributions
- Developed a novel interaction method using hand-drawn sketches on paper, employing SVM classifiers for real-time sketch recognition and conversion into virtual 3D objects within augmented reality narrative environments.
- Created systems that dramatize interactive narratives in augmented reality over conventional paper, using fiducial markers for camera tracking and rendering virtual characters and objects aligned with the physical environment.
- Integrated sketch-based interaction with planning-based narrative generation, allowing user drawings to influence plot development and character decisions within procedurally generated story structures.
- Demonstrated through user studies that sketch-based interaction significantly improves user satisfaction, engagement, sense of authorship, and overall system usability compared to traditional interactive storytelling interfaces.

Prototype Demonstration
Watch a demonstration of the augmented reality storytelling system with sketch-based interaction:

Related Publications
- Edirlei Soares de Lima; Felipe Gheno; Ana Viseu. Sketch-Based Interaction for Planning-Based Interactive Storytelling. Proceedings of the XIX Brazilian Symposium on Computer Games and Digital Entertainment (SBGames 2020), Recife, Brazil, pp. 348-356, 2020. [PDF] [DOI]
- Felipe Gheno; Edirlei Soares de Lima. História Viva: A Sketch-Based Interactive Storytelling System. Proceedings of the XX Brazilian Symposium on Computer Games and Digital Entertainment (SBGames 2021), Gramado, Brazil, 2021. [PDF] [DOI]
- Thiago Cler Franco; Edirlei Soares de Lima. Paper and Pencil Interactive Storytelling Based on Generic Hand-drawn Sketches. Proceedings of the XVI Brazilian Symposium on Computer Games and Digital Entertainment (SBGames 2017), Curitiba, Brazil, pp. 594-597, 2017. [PDF]
- Edirlei Soares de Lima; Bruno Feijó; Simone D.J. Barbosa; Antonio L. Furtado; Angelo E. M. Ciarlini; Cesar T. Pozzer. Draw Your Own Story: Paper and Pencil Interactive Storytelling. Entertainment Computing, vol. 5, pp. 33-41, 2014. [DOI] [PDF]
- Edirlei Soares de Lima; Bruno Feijó; Simone D.J. Barbosa; Antonio L. Furtado; Angelo E. M. Ciarlini; Cesar T. Pozzer. Draw Your Own Story: Paper and Pencil Interactive Storytelling. Proceedings of the 10th International Conference on Entertainment Computing (ICEC 2011), Vancouver, Canada, October 2011, pp. 1-12. ISBN: 978-3-642-24499-5. [PDF] [DOI]