Here you can learn about some of the projects I’m on. Click any project to expand for more details.

  • Gesture Visualization Project

    Understanding ideas from motion-captured gestures to provide meaningful visualizations

    We embody our ideas in our gestures, and this project aims to visualize those ideas from gestures and the speech that accompanies them. Imagine describing a story or idea while a supporting visualization portrays your descriptions for others to see and collaborate with you on. That is the vision this project aims to achieve, starting with the complex challenge of designing a system that translates gesture features into meaningful concepts for visualization.

    We’ve examined iconic gestures for physical features that convey consistent meaning by analyzing how people retell stories from the same stimulus (in our case, a short cartoon) [1]. We found that gestures conveying dimension contain the most consistent features, and developed a prototype system that visualizes the size of objects from a person’s gestures [2]. This work also yielded design implications for translating the different frames of reference a gesture can manifest [3]. Going forward, we are interested in extending this research to support designers in their work, and have explored how multimodality and embodiment can be used to support creativity [4].

    Read more on our lab’s website!

    1. Brown, S. A., Chu, S. L., Quek, F., Canaday, P., Li, Q., Loustau, T., … & Zhang, L. (2019, November). Towards a Gesture-Based Story Authoring System: Design Implications from Feature Analysis of Iconic Gestures During Storytelling. In International Conference on Interactive Digital Storytelling (pp. 364-373). Springer, Cham. (Nominee for Best Short Paper)
    2. Brown, S. A., Chu, S. L., & Rani, N. (2020, September). Externalizing Mental Images by Harnessing Size-Describing Gestures: Design Implications for a Visualization System. In International Conference on Advanced Visual Interfaces. (publication forthcoming)
    3. Brown, S. A., Chu, S. L., & Rani, N. (2020, April). Harnessing Gestures to Externalize Discourse Ideas for Common Ground: Design Implications from a Frame of Reference Analysis. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems Extended Abstracts (pp. 1-8).
    4. Brown, S. A., & Chu, S. L. (2020). In the Flow of Creative Practice: Multimodality and Embodiment for Creativity Support Tools. In Proceedings of the Workshop on Where Art Meets Technology: Integrating Tangible and Intelligent Tools in Creative Processes. CHI 2020. Honolulu, HI. ACM.


  • Story Creation Interface

    Using motion tracking to support children's storytelling and writing

    When children sit down to write, they often face two challenges at once: the technical aspects of writing, and an imagination overflowing with ideas. This NSF-funded project proposes that giving children an intermediary medium to express their ideas can better support them at the point of writing. Toward this end, we developed an interface that lets children create their stories through different means of motion tracking, turning their enactment into a cartoon they then use to support their writing. I was responsible for developing the core of this interface, which my colleagues have iterated upon as we investigate different modes of enactment.

    Read more on our lab’s website!

    1. Brown, S. A., Chu, S. L., & Loustau, T. (2019, November). Embodying Cognitive Processes in Storytelling Interfaces for Children. In International Conference on Interactive Digital Storytelling (pp. 357-363). Springer, Cham.
    2. Zarei, N., Quek, F., Chu, S. L., & Brown, S. A. (2020, November). Towards Design Strategies to Support Children’s Narrative Writing Through Enactment. In International Conference on Interactive Digital Storytelling. (publication forthcoming)
    3. Zarei, N., Chu, S. L., Quek, F., Rao, N. J., & Brown, S. A. (2020, April). Investigating the Effects of Self-Avatars and Story-Relevant Avatars on Children’s Creative Storytelling. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-11).

  • Therapeutic Interactive Digital Narratives

    Using the power of interactive stories as a means to support therapeutic practice

    This project is at the core of my dissertation work. It envisions therapy where, once the patient goes home after a session, they are greeted by an interactive storytelling experience tailored to their needs through artificial intelligence. Impactful narratives will be generated for the patient from input by both the patient and the therapist, replacing traditional take-home therapy assignments with a more engaging, intuitive experience. Therapy already harnesses the power of stories; this project aims to take that idea further by incorporating computing to support the process.

    So far, we’ve envisioned that such a system would need a way of responding to the patient’s emotions during the interactive storytelling experience, and have investigated diegetic and non-diegetic approaches to capturing emotion [1].

    1. Brown, S. A., Resch, C., Han, V., Surampudi, S. V., Karanam, P., & Chu, S. L. (2020, November). Capturing User Emotions in Interactive Stories: Comparing a Diegetic and a Non-Diegetic Approach to Self-Reporting Emotion. In International Conference on Interactive Digital Storytelling. (publication forthcoming)

  • Science Modeling through Physical Computing

    Teaching science through block-based programming and making!

    This is a fairly young project with no results to report yet. I’m helping to investigate ways in which block-based programming interfaces can support the learning of computational thinking and science when incorporated into maker-based activities. One of our initial investigations will explore how we can represent science concepts inside the block-based programming language itself, to further support science learning through these activities. I look forward to reporting more on this in the future!

    Read more on our lab’s website!