C11 VISUAL ARTS

SECOND PERSON


Abstract

Second Person is an interactive installation that investigates displaced embodiment and digital selfhood through gaze-controlled avatars. Participants’ eye movements become both input and experiential medium, recursively shaping a digital body in real time. By destabilizing the boundary between intention, action, and identity, the system functions as a perceptual mirror: agency flows between human and machine. This research engages theories of embodied cognition, distributed agency, and digital identity to explore how mediated interaction reconfigures self-perception.

Keywords: embodied cognition, gaze interaction, recursive feedback, digital identity, avatar synthesis, computer vision, interactive installation

1. Motivation & Contribution

Digital interfaces often maintain clear separations between user and system. Second Person instead collapses these boundaries, turning the user’s gaze into both controller and subject. This generates a uniquely unstable sense of agency—where the user’s identity becomes entangled with a digital avatar.

Key Contributions:

  • A gaze-controlled avatar system that foregrounds experiential interaction over instrumental use.

  • Real-time recursive feedback architecture enabling emergent, ambiguous agency.

  • Accessible computer vision implementation using standard webcam hardware.

  • A cinematic video documentation pipeline for systematic embodiment research.

2. System Overview

Interaction Pipeline

  • Gaze input via webcam-based computer vision.

  • Facial landmark detection informs avatar pose and gaze alignment.

  • Recursive feedback allows the avatar’s behavior to influence user gaze.

  • Automated recording generates 4K documentation of embodiment sessions.

The avatar becomes a digital extension and distortion of the self, producing unstable, mirror-like presence.
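The recursive coupling described above can be sketched as a simple update loop. This is an illustrative sketch only, not the project's code: the names (`step`, `ALPHA`, `FEEDBACK`) and the specific smoothing scheme are assumptions chosen to show how an avatar can lag behind, and bend, the gaze that drives it.

```python
# Minimal sketch of a recursive gaze/avatar coupling (illustrative only).
# The avatar's gaze tracks the user's gaze with smoothing (lag), while the
# avatar's current state is fed back into the mapping of the next input,
# so user and avatar mutually influence each other.

ALPHA = 0.2      # smoothing factor: lower = more avatar "inertia"
FEEDBACK = 0.1   # how strongly the avatar's state bends the gaze mapping

def step(user_gaze, avatar_gaze):
    """One feedback iteration: returns the avatar's updated gaze (x, y)."""
    # Effective input: the user's gaze pulled toward the avatar's current
    # state -- the loop that makes agency ambiguous.
    fx = user_gaze[0] + FEEDBACK * (avatar_gaze[0] - user_gaze[0])
    fy = user_gaze[1] + FEEDBACK * (avatar_gaze[1] - user_gaze[1])
    # Exponential smoothing so the avatar lags and "interprets" the input.
    return (avatar_gaze[0] + ALPHA * (fx - avatar_gaze[0]),
            avatar_gaze[1] + ALPHA * (fy - avatar_gaze[1]))

avatar = (0.0, 0.0)
for _ in range(50):
    avatar = step((1.0, 0.5), avatar)  # hold the user's gaze fixed
print(avatar)  # converges toward the user's fixed gaze (1.0, 0.5)
```

Because the error shrinks by a constant factor each iteration, a steady user gaze eventually captures the avatar, but any movement re-opens the lag in which the avatar appears to act on its own.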

3. Technical Architecture

┌──────────────────────────────────────────┐
│ GAZE TRACKING                            │
│  - Webcam input                          │
│  - Custom Python computer vision         │
├──────────────────────────────────────────┤
│ AVATAR SYNTHESIS                         │
│  - Landmark detection (OpenCV)           │
│  - Stylized 3D rendering pipeline        │
│  - NVIDIA RTX GPU acceleration           │
├──────────────────────────────────────────┤
│ RECURSIVE FEEDBACK ENGINE                │
│  - Behavior loop between avatar & user   │
│  - Gaze influences response;             │
│    response alters gaze                  │
├──────────────────────────────────────────┤
│ DOCUMENTATION LAYER                      │
│  - Automated 4K video generation         │
│  - Research logging (gaze, latency)      │
└──────────────────────────────────────────┘
Core Components:

  • Webcam + OpenCV for gaze tracking.

  • Facial landmark detection algorithms.

  • Python rendering pipeline with GPU acceleration.

  • Recursive control architecture (feedback loop).

  • Automated video logging system.
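To make the gaze-tracking component concrete, here is a hedged sketch of how a gaze vector can be derived from 2D eye landmarks of the kind a detector such as OpenCV's produces. The function name and the landmark scheme are assumptions for illustration, not the project's actual pipeline.

```python
# Illustrative sketch (not the project's implementation): estimate a
# normalized gaze vector for one eye from 2D landmark coordinates, based on
# where the iris sits inside the eye contour.

def gaze_from_landmarks(eye_left, eye_right, eye_top, eye_bottom, iris):
    """All arguments are (x, y) pixel coordinates for one eye.

    Returns (gx, gy) in [-1, 1]: (0, 0) means looking straight ahead;
    positive gx is toward eye_right, positive gy toward eye_bottom.
    """
    cx = (eye_left[0] + eye_right[0]) / 2.0     # eye center, horizontal
    cy = (eye_top[1] + eye_bottom[1]) / 2.0     # eye center, vertical
    half_w = (eye_right[0] - eye_left[0]) / 2.0 or 1.0  # avoid divide-by-zero
    half_h = (eye_bottom[1] - eye_top[1]) / 2.0 or 1.0
    gx = max(-1.0, min(1.0, (iris[0] - cx) / half_w))   # clamp to [-1, 1]
    gy = max(-1.0, min(1.0, (iris[1] - cy) / half_h))
    return gx, gy

# Iris centered in the eye -> looking straight ahead.
print(gaze_from_landmarks((100, 50), (140, 50), (120, 40), (120, 60), (120, 50)))  # → (0.0, 0.0)
# Iris shifted toward the right corner.
print(gaze_from_landmarks((100, 50), (140, 50), (120, 40), (120, 60), (130, 50)))  # → (0.5, 0.0)
```

In practice the landmark coordinates would come from a per-frame face/eye detector, and the resulting vector would be smoothed over time before driving the avatar.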

4. Bill of Materials (BOM)

| Component  | Model / Library                 | Qty | Est. Cost | Purpose                            |
|------------|---------------------------------|-----|-----------|------------------------------------|
| Webcam     | Standard HD webcam              | 1   | $50       | Gaze capture                       |
| GPU        | NVIDIA RTX (or equivalent)      | 1   |           | Real-time rendering                |
| Computer   | Workstation / laptop            | 1   |           | Host system                        |
| Software   | Python, OpenCV, Open3D, PyTorch |     |           | Computer vision, rendering, ML     |
| Repository | GitHub                          |     |           | Reproducibility and open research  |

5. Research Context & Motivation

Theoretical Framework

  • Embodied Cognition (Varela et al. 1991) — perception and action co-constitute identity.

  • Distributed Agency (Clark 1997; Dourish 2001) — selfhood emerges through interactions with technological systems.

  • Virtual embodiment affects how users perceive their own agency.

Research Gaps:

  • Binary user–system separation limits investigation of fluid identity.

  • Visual bias in HCI neglects embodied and recursive modalities.

  • Few interfaces foreground ambiguous, shared agency.

Contribution:
Second Person provides a research platform to study how gaze-driven recursion can destabilize and reveal self-perception in digital space.

6. Future Work

  • Multimodal integration — expanding gaze interaction with gesture and voice inputs.

  • EEG integration to explore neural correlates of digital selfhood.

  • Predictive embodiment algorithms for anticipatory avatar response.

  • Multi-user expansion for distributed shared presence.

7. Ethics & Research Alignment

  • Minimal biometric data retained (gaze vectors only).

  • Open-source transparency for reproducibility.

  • Design aligned with embodied cognition and critical HCI frameworks.

  • Focus on identity, not surveillance.
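As a sketch of this retention policy, the following shows a session logger that writes only gaze vectors and loop latency, never frames. The field names and format are assumptions for illustration, not the project's actual logging schema.

```python
# Privacy-minimal session log (illustrative; field names are assumptions):
# only timestamps, gaze vectors, and latency are retained -- no video frames
# or other biometric data.
import csv
import io
import time

def log_sample(writer, gx, gy, latency_ms):
    """Append one anonymized sample: timestamp, gaze vector, loop latency."""
    writer.writerow([f"{time.time():.3f}", f"{gx:.4f}", f"{gy:.4f}",
                     f"{latency_ms:.1f}"])

buf = io.StringIO()  # stands in for a per-session log file
w = csv.writer(buf)
w.writerow(["t", "gaze_x", "gaze_y", "latency_ms"])
log_sample(w, 0.12, -0.30, 16.7)
print(buf.getvalue())
```

Keeping the log to derived vectors rather than imagery means the research data supports latency and gaze analysis without the installation ever becoming a surveillance archive.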

8. References

  • Varela, F. J., Thompson, E., & Rosch, E. (1991). The Embodied Mind: Cognitive Science and Human Experience. MIT Press.

  • Clark, A. (1997). Being There: Putting Brain, Body, and World Together Again. MIT Press.

  • Dourish, P. (2001). Where the Action Is: The Foundations of Embodied Interaction. MIT Press.

  • Pearce, J. M. (2012). Building research equipment with free, open-source hardware. Science, 337(6100), 1303–1304.

  • Nosek, B. A., et al. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425.

9. Citation

Project page: https://www.c11visualarts.com/altered-perception---second-person-2
GitHub repository: https://github.com/CJD-11/Second-Person



@misc{dziadzio2025_secondperson,
  title        = {Altered Perception—Second Person},
  author       = {Dziadzio, Corey},
  year         = {2025},
  howpublished = {Project page and GitHub repository},
  url          = {https://www.c11visualarts.com/altered-perception---second-person-2},
  note         = {GitHub: https://github.com/CJD-11/Second-Person}
}