A data-driven global emotions visualizer combining the body language of the viewer with emotional language from a Twitter API; the piece distinguishes emotional posts and populates the visualization according to categorical strength, rendered in the form of Jungian archetypes (a sketch of this classification step appears below the credits).
FEELS, Interactive Mixed Media, 120” x 94”, 2021
Exhibited at the 2021 Rochester Fringe Festival & City Art Space, NY.
Sound credit: Simon Howard, Dillon Robinson
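For readers curious about the data side of FEELS, here is a minimal sketch of the kind of text-classification step described above. It is not the installation's actual code: the archetype lexicon, category names, and weighting are placeholder assumptions, and the posts are assumed to have already been pulled from the Twitter API.

```python
# Minimal sketch (not the installation's actual code): given a batch of post
# texts from the Twitter API, tally emotional keywords into archetype
# categories and return a normalized strength per category for visualization.
from collections import Counter

# Hypothetical lexicon: emotional keywords mapped to Jungian archetype buckets.
ARCHETYPE_LEXICON = {
    "hero":   {"brave", "fight", "overcome", "win"},
    "shadow": {"afraid", "rage", "hate", "dread"},
    "sage":   {"learn", "understand", "wonder", "why"},
    "jester": {"lol", "joke", "absurd", "play"},
}

def archetype_strengths(posts):
    """Return {archetype: 0..1 strength} for a batch of post texts."""
    counts = Counter()
    for text in posts:
        words = set(text.lower().split())
        for archetype, keywords in ARCHETYPE_LEXICON.items():
            counts[archetype] += len(words & keywords)
    total = sum(counts.values()) or 1
    return {a: counts[a] / total for a in ARCHETYPE_LEXICON}

# Example: feed the per-archetype strengths to the visual layer each cycle.
print(archetype_strengths(["I will fight and overcome", "lol what a joke"]))
```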
An interactive installation imposing a deliberate sequence of anxieties, ostracizing participants according to body temperature and questioning the phrase “new normal”.
Interactive Mixed Media, 2020.
Holo-Center AR-
Exploring and questioning the term "holography" in reference to the formula for luminance, in the context of modern-day transformations of the art form (the luminance relation in question is noted below).
HoloCenter Gallery, Kingston, NY, 2024
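For reference, the luminance formula alluded to above is presumably the relative-luminance relation used in imaging and display work (an assumption; the text does not specify which formulation is meant), which weights the linear RGB components by the eye's sensitivity to each primary:

Y = 0.2126 R + 0.7152 G + 0.0722 B (Rec. 709 / sRGB primaries)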
Aphid Voids-
A study in Virtual Reality, non-linear narrative, and the blending of tangible and non-tangible craft. 2020 to present (ongoing).
Exploring and questioning the term "holography" in reference to the formula for luminance, in the context of modern-day transformations of the art form.
Interactive Mixed Media (oil paint on wood, computer vision, audio-visual, AI-GPT, Unity game engine + holographic displays), 90” x 34” x 10”, 2024.
Exhibited at the Boiler Gallery in Brooklyn, NY, and as part of the official selection in the XR category of the Green Point Brooklyn Film Festival.
This work reads the spatial audio fluctuations of the room and outputs camera-based abstractions driven by computer vision. Oh Charles also comments on the accelerated progression of artificial intelligence, pairing the piece with a programmed, interactive AI avatar living within the evolutionary stepladder of humankind, knowingly trapped within the displays.
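A minimal sketch of the audio-to-visual loop described above, under assumed tooling (the sounddevice and OpenCV libraries); the blur here is only a stand-in for the piece's own abstractions, and the scaling constants are illustrative.

```python
# Minimal sketch (assumed stack, not the exhibited build): map the room's
# audio level to the intensity of a camera-based abstraction.
import numpy as np
import sounddevice as sd
import cv2

SAMPLE_RATE = 44100
BLOCK_SECONDS = 0.25

cap = cv2.VideoCapture(0)            # gallery-facing camera
while True:
    # Record a short block of room audio and measure its loudness (RMS).
    block = sd.rec(int(SAMPLE_RATE * BLOCK_SECONDS),
                   samplerate=SAMPLE_RATE, channels=1, blocking=True)
    rms = float(np.sqrt(np.mean(block ** 2)))

    ok, frame = cap.read()
    if not ok:
        break

    # Louder room -> stronger abstraction (here, a larger blur kernel).
    k = 1 + 2 * min(int(rms * 200), 30)   # odd kernel size, capped
    abstraction = cv2.GaussianBlur(frame, (k, k), 0)

    cv2.imshow("abstraction", abstraction)
    if cv2.waitKey(1) == 27:              # Esc to exit
        break
```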
3D Render, Autodesk Maya + Substance Painter, Exhibited in tandem.
3D Render, Autodesk Maya + Substance Painter, Exhibited in tandem.
Without red light.
“Arkhé” means origin, beginning, or source of action; it also describes the first principle from which others are derived. Bearing this in mind, the Arkhé project aims to help people identify, understand, and interact with the fundamental, intangible relationships that exist around us. We achieve this by simplifying experiences to their base components and reframing core interactions in order to evoke complex emotions. Our projects share this universal goal despite their many different forms, ranging from educational tools to art installations.
Co-led by Travis Stodter & Jake Adams
Co-created with students of New Media Interactive Development and Interactive Games and Media at the Rochester Institute of Technology, Golisano College of Computing and Information Sciences.
Visit the Arkhé Project page for more.
The Living Room installation aims to coalesce an experience of American nostalgia while manifesting a sense of growth and adaptation in the participant. Implemented using a variety of embedded sensors, computer vision, and functional programming, the goal of the ongoing Arkhé project is to experiment with tangible digital interactions and tap into atavistic knowledge construction.
An interactive exhibit that explores the complex relationship between technology and personal agency. Through a combination of computer vision and GPT-based AI, seekers of our AI oracle can have their fortunes read and interpreted into a variety of forms. The exhibit encourages visitors to consider the implications of rapidly advancing technology and the power they hold over their own future.
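A minimal sketch of how the vision layer and the GPT layer could be chained, under stated assumptions: the OpenAI Python client is used, the model name is a placeholder, and the "observation" string stands in for whatever the exhibit's computer-vision stage actually produces.

```python
# Minimal sketch (assumed tooling, not the exhibit's code): turn a simple
# computer-vision observation of the visitor into a prompt for a GPT model,
# which returns a short "fortune" for the exhibit to render.
from openai import OpenAI   # assumes the OpenAI Python client and an API key

client = OpenAI()

def read_fortune(observation: str) -> str:
    """observation: a short description produced by the vision layer,
    e.g. 'visitor stands still, arms crossed, red jacket'."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name
        messages=[
            {"role": "system",
             "content": "You are a cryptic oracle. Answer in two sentences."},
            {"role": "user",
             "content": f"Read a fortune for this visitor: {observation}"},
        ],
    )
    return response.choices[0].message.content

print(read_fortune("visitor stands still, arms crossed, red jacket"))
```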
The Composition Cards project is a mixed-reality tool that aims to help students perceive the underlying structure of their code and learn how to compose software from simple axioms and first principles. We use camera input to read and interpret physical index cards laid out end-to-end as if they were statements in a simple programming language, then “transpile” them into another language for execution, enhancing students' understanding of how functions interact.
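The transpilation step might look something like the sketch below. The card vocabulary, the mapping table, and the choice of Python as the target language are all hypothetical; the camera/recognition stage is assumed to have already produced an ordered list of card labels.

```python
# Minimal sketch (hypothetical card vocabulary, not the project's grammar):
# after the camera layer has recognized an ordered row of cards, "transpile"
# the card sequence into runnable target-language code.

# Each card label maps to one line of the target language.
CARD_TO_PYTHON = {
    "SET X TO 0": "x = 0",
    "ADD 1 TO X": "x += 1",
    "DOUBLE X":   "x *= 2",
    "PRINT X":    "print(x)",
}

def transpile(cards):
    """cards: ordered card labels read by the camera, left to right."""
    return "\n".join(CARD_TO_PYTHON[card] for card in cards)

program = transpile(["SET X TO 0", "ADD 1 TO X", "DOUBLE X", "PRINT X"])
exec(program)   # prints 2; the exhibit would display or execute this program
```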
Representations of holo-comics and holographic apps by Jake Adams, 2018 to present, created with the Looking Glass display and the Unity game engine.
Maldacena: MFR is the world's first digital holographic comic book. See more here:
Aphid Through the Looking Glass
holographic display application
A holographic display comic book (holo-comic), published in August 2023.
Violet goes into dream land.
The first holographic display comic book (holo-comic). Trailer courtesy of the Looking Glass Factory team.
Current full video of Maldacena: A Mirror For The Real.
Full length video of the holo-comic, with voice over. The video gives an idea of the scope and breadth of the work.
All things interactive.
- Oh Charles, Interactive Mixed Media, 2024
- FEELS, Interactive Mixed Media, 2021
- Phaser’s End, Interactive Mixed Media, 2020.
Oil on wood, four holographic displays, two audio visualizers, 102” x 42” x 14”, 2024. Installed as the official selection of the XR category, Green Point Brooklyn Film Festival 2024.
This work reads the spatial audio fluctuations of the room and outputs camera-based abstractions driven by computer vision. It comments on the accelerated progression of artificial intelligence, pairing the piece with a programmed, interactive AI avatar living within the evolutionary stepladder of humankind, knowingly trapped within the displays.
A global emotions visualizer reading emotional language from a Twitter API; it distinguishes emotional posts and populates the visualization according to categorical strength, in the form of Jungian archetypes.
An interactive installation imposing a deliberate sequence of anxieties and questioning the phrase “new normal”.
This panel addresses deaths of minors caused, directly or indirectly, by COVID-19; it is both a memorial and a statement. The panel utilizes the Looking Glass display and button input, along with Leap Motion.
Panel one full display, FMI89 breathing with input interaction.
A display on panel one in Phaser’s End
A full picture of panel one from the inside.
Close-up of a face as it is recorded into the visual synthesizer, ready for manipulation.
Panel Three includes a visual synthesizer, an Arduino-powered text scroll, and synthesizer input, all visualized in an infinity mirror.