C11 VISUAL ARTS
4D: An Experimental Camera



Framework for Physical Perception of Photography
Abstract
The 4D Experimental Camera is a physical-computational imaging system that uses a dynamic pin-array light-field surface to translate time, light, and memory into tactile spatial data. Instead of passively capturing images, the camera becomes a surface that physically indexes light in real time. A dense array of actuated pins encodes light intensity and spatial information as a dynamic topography, effectively creating an ephemeral, haptic image. This project explores how perception can be made tactile—blurring the boundaries between camera, sculpture, and interface.
Keywords: experimental imaging, tangible interfaces, haptics, light field, memory, embodied perception, 4D pin array
1. Motivation & Contribution
Traditional cameras flatten space and time into 2D images, detaching viewers from the embodied experience of light. This project proposes a camera that pushes back—literally—through a dynamic pin array that translates incoming light intensities into physical motion.
Key Contributions:
A physical light-field camera translating captured light directly into tactile topographies.
Integration of optics and mechanical actuation for a real-time haptic visualization of imagery.
A conceptual shift from passive image capture to embodied image experience.
Open hardware design suitable for experimental and artistic use.
2. System Overview
Conceptual Pipeline
Light capture: Analog lens system focuses incoming light onto a sensor.
Mapping: Custom algorithms convert pixel intensity to vertical displacement (a minimal sketch follows below).
Output: Actuators push pins to create a tactile relief map in real time.
The surface acts as a physical light field—a living photograph that can be touched.
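To make the mapping step concrete, the sketch below shows one plausible intensity-to-displacement function in Python: block-average a grayscale frame down to the pin grid, then scale each block's mean brightness to a pin travel. The grid size, 20 mm travel range, and function names are illustrative assumptions, not the project's confirmed implementation.

```python
import numpy as np

# Illustrative parameters -- the actual grid size and pin travel are
# project-specific assumptions, not confirmed values.
PIN_ROWS, PIN_COLS = 10, 10   # pin-array resolution
MAX_TRAVEL_MM = 20.0          # full pin extension

def frame_to_displacements(gray: np.ndarray) -> np.ndarray:
    """Map a grayscale frame (H x W, 0-255) to per-pin displacements in mm.

    Each pin receives the mean intensity of its image block, so bright
    regions of the scene push pins out farther than dark regions.
    """
    h, w = gray.shape
    bh, bw = h // PIN_ROWS, w // PIN_COLS
    # Crop so the frame divides evenly, then block-average per pin.
    blocks = gray[:bh * PIN_ROWS, :bw * PIN_COLS].reshape(
        PIN_ROWS, bh, PIN_COLS, bw).mean(axis=(1, 3))
    return (blocks / 255.0) * MAX_TRAVEL_MM

# Example: a synthetic gradient frame stands in for a camera capture.
frame = np.tile(np.linspace(0, 255, 640, dtype=np.uint8), (480, 1))
heights = frame_to_displacements(frame)
print(heights.shape, heights.min(), heights.max())
```

Block averaging makes each pin respond to the overall brightness of its image region rather than to a single noisy pixel, which suits a coarse tactile display.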
3. Technical Architecture
Key Components
Raspberry Pi 4 for image capture and processing
Pi Camera module for light capture
Servo-actuated pin array for real-time tactile output (driver sketch below)
Custom printed guide plate and carrier for precision alignment
Mapping algorithm converts grayscale intensity to pin displacement
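On the actuation side, here is a minimal sketch of commanding pin heights through one PCA9685 bank using Adafruit's adafruit-circuitpython-servokit library. The channel-to-pin layout, travel range, and helper name are assumptions for illustration; with 100 SG90s the real build would span several 16-channel boards.

```python
# A minimal sketch of commanding pin heights through one PCA9685 board
# via Adafruit's ServoKit (adafruit-circuitpython-servokit). The
# channel-to-pin layout and 20 mm travel are illustrative assumptions.
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)  # one PCA9685 at the default I2C address

def set_pin_height(channel: int, displacement_mm: float,
                   max_travel_mm: float = 20.0) -> None:
    """Convert a displacement in mm to an SG90 angle (0-180 degrees)."""
    fraction = min(max(displacement_mm / max_travel_mm, 0.0), 1.0)
    kit.servo[channel].angle = fraction * 180.0

# Example: sweep the first five pins to increasing heights.
for ch, mm in enumerate([0.0, 5.0, 10.0, 15.0, 20.0]):
    set_pin_height(ch, mm)
```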
4. Bill of Materials (BOM)
| Component | Model | Qty | Cost | Purpose |
|---|---|---|---|---|
| Microcomputer | Raspberry Pi 4 | 1 | $55 | Core processing unit |
| Camera | Pi Camera Module (v2 or HQ) | 1 | $30 | Light capture |
| Servo motors | SG90 micro servos | 100 | $100 | Pin actuation |
| 3D-printed frame | Custom | 1 | $20 | Pin carrier and guide |
| Lens | Olympus OM-system lens (manual) | 1 | — | Optical input |
| PWM driver & power | PCA9685 + 5 V supply | 2 | $40 | PWM control |
| Fasteners, wiring, misc. | — | — | $25 | Assembly |
| Total estimated cost | | | $270 | |
5. Research Context & Motivation
Theoretical Framework
Phenomenology of Perception (Merleau-Ponty) — seeing and touching are intertwined modes of understanding the world.
Embodied Perception (Varela et al.) — cognition arises through sensorimotor coupling.
Tangible Media (Ishii & Ullmer) — interfaces can extend sensory experience into the physical domain.
Research Gap: Conventional cameras prioritize representation over experience. This project re-centers the body and touch as primary perceptual modalities.
6. Future Work
Higher-resolution pin arrays and continuous actuators for fluid tactile imagery.
Adaptive mapping (e.g., depth encoding, motion parallax).
Integration with tactile memory systems for time-based replay.
Modular lens-mounting for different optical configurations.
7. Ethics & Accessibility
Pin displacement calibrated for safe touch.
No biometric or personal data stored.
Encourages multisensory engagement and accessible tactile exploration of visual content.
8. References
Varela, F. J., Thompson, E., & Rosch, E. (1991). The Embodied Mind: Cognitive Science and Human Experience. MIT Press.
Merleau-Ponty, M. (1962). Phenomenology of Perception. Routledge.
Ishii, H., & Ullmer, B. (1997). Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms. Proceedings of CHI '97. ACM.
Clark, A. (1997). Being There: Putting Brain, Body, and World Together Again. MIT Press.
9. Citation
Project page: https://www.c11visualarts.com/altered-perception---4d-an-experimental-camera
GitHub repository: https://github.com/CJD-11/4D-Experimental-Camera
EXECUTIVE SUMMARY
Project Vision
The 4D Entropy Pin Camera investigates how tactile pin arrays combined with temporal visualization can create tangible representations of abstract mathematical phenomena, enabling users to physically experience entropy decay, dimensional projections, and gravitational influences through embodied interaction.
Key Innovation
Instead of relying on traditional visualization, this system creates tactile representations that are encoded through dynamic physical pin movements and retrieved through direct haptic exploration. This approach grounds complex mathematical concepts in the body's natural sensorimotor capabilities.
Technical Achievement
Building upon the proven foundation of the Tactile Memory Recall system, this ongoing project:
Extends validated tactile architecture from 3-finger tracking to 64-pin spatial arrays
Implements real-time entropy calculation with mathematical precision for information field visualization
Integrates 4D stereographic projection algorithms mapping higher-dimensional spaces to physical pin heights (an illustrative sketch follows this list)
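The project does not spell out its projection math, so the following is a hedged sketch of one standard construction: sample the Clifford torus (a surface lying on the unit 3-sphere), rotate it through 4D over time, stereographically project to 3D, and use the projected z-coordinate as pin height on an 8 × 8 (64-pin) grid. All parameters and function names here are illustrative assumptions.

```python
import numpy as np

GRID = 8  # 8 x 8 = 64 pins

def clifford_torus(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Sample the Clifford torus, a surface on the unit 3-sphere in R^4."""
    return np.stack([np.cos(u), np.sin(u),
                     np.cos(v), np.sin(v)], axis=-1) / np.sqrt(2)

def rotate_xw(p: np.ndarray, theta: float) -> np.ndarray:
    """Rotate 4D points in the x-w plane; this 'turns' the surface through 4D."""
    x, y, z, w = p[..., 0], p[..., 1], p[..., 2], p[..., 3]
    xr = x * np.cos(theta) - w * np.sin(theta)
    wr = x * np.sin(theta) + w * np.cos(theta)
    return np.stack([xr, y, z, wr], axis=-1)

def stereographic(p: np.ndarray) -> np.ndarray:
    """Project points of S^3 into R^3 from the pole w = 1."""
    denom = np.maximum(1.0 - p[..., 3], 0.1)  # clamp near the projection pole
    return p[..., :3] / denom[..., None]

def pin_heights(theta: float, max_mm: float = 20.0) -> np.ndarray:
    """Map the grid's projected z-coordinates to pin displacements in mm."""
    u, v = np.meshgrid(np.linspace(0, 2 * np.pi, GRID),
                       np.linspace(0, 2 * np.pi, GRID))
    z = stereographic(rotate_xw(clifford_torus(u, v), theta))[..., 2]
    z = (z - z.min()) / (z.max() - z.min() + 1e-9)  # normalize to [0, 1]
    return z * max_mm

print(pin_heights(theta=0.5).round(1))
```

Rotating in the x-w plane is what makes the output distinctively four-dimensional: the projected surface appears to turn through itself in a way no rigid 3D motion can produce.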
PROJECT OVERVIEW
The Challenge
Current memory augmentation technologies predominantly target visual and auditory modalities, despite mounting evidence that physical interactions fundamentally shape mental representations and recall performance. This oversight represents a significant gap in our approach to cognitive enhancement, particularly given that spatial memory formation naturally integrates proprioceptive information during environmental exploration.
Solution
Tactile Memory Replay creates a wearable interface that:
Records spatial poses through multi-sensor finger orientation tracking
Stores memory anchors as persistent proprioceptive patterns
Triggers haptic feedback when users approximate recorded configurations
Enables embodied recall without relying on visual or auditory cues
Core Functionality
MAP Mode - Spatial Memory Encoding
User positions fingers (natural exploration) → applies force (contact detection) → system records pose (3D tracking) → memory stored (EEPROM persistence)
When a user applies pressure above the force threshold (150 units), the system begins recording finger orientations from three IMU sensors over a 3-second window, averaging the readings to create a stable spatial memory.
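A minimal sketch of that MAP-mode loop follows, with hypothetical read_force() and read_imu_angles() stand-ins for the real sensor drivers; the 150-unit threshold and 3-second window come from the description above, while the sampling rate is an assumption.

```python
import time
import numpy as np

FORCE_THRESHOLD = 150   # contact threshold from the text (arbitrary units)
RECORD_SECONDS = 3.0    # averaging window from the text
NUM_FINGERS = 3

def read_force() -> float:
    """Hypothetical force-sensor read; stands in for the real FSR driver."""
    return 200.0

def read_imu_angles() -> np.ndarray:
    """Hypothetical IMU read: (pitch, roll, yaw) in degrees per finger."""
    return np.random.uniform(-90, 90, size=(NUM_FINGERS, 3))

def record_pose() -> np.ndarray | None:
    """MAP mode: average finger orientations over a 3-second window."""
    if read_force() < FORCE_THRESHOLD:
        return None  # no deliberate contact, nothing to record
    samples = []
    start = time.monotonic()
    while time.monotonic() - start < RECORD_SECONDS:
        samples.append(read_imu_angles())
        time.sleep(0.05)  # ~20 Hz sampling; the rate is an assumption
    # Averaging yields a stable (3 fingers x 3 angles) pose;
    # EEPROM persistence of the returned pose is omitted here.
    return np.mean(samples, axis=0)

stored_pose = record_pose()
```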
REPLAY Mode - Embodied Retrieval
Finger movement (real-time IMU) → pose comparison (pattern matching) → distance calculation (3D Euclidean distance) → haptic activation (motor feedback)
The system continuously compares current finger poses against stored memories using 3D Euclidean distance. When the distance falls below the 300° tolerance, maximum-intensity haptic feedback activates across all three motors.
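A corresponding REPLAY-mode sketch, again with a hypothetical set_motor() stand-in for the real motor driver; the 300° Euclidean tolerance and three-motor burst follow the description above.

```python
import numpy as np

TOLERANCE_DEG = 300.0  # match radius from the text

def set_motor(index: int, intensity: float) -> None:
    """Hypothetical motor driver; the real system uses PWM vibration motors."""
    print(f"motor {index}: {intensity:.0%}")

def pose_distance(current: np.ndarray, stored: np.ndarray) -> float:
    """Euclidean distance across all finger-orientation components (degrees)."""
    return float(np.linalg.norm(current - stored))

def update_haptics(current: np.ndarray, stored: np.ndarray) -> bool:
    """REPLAY mode: fire all three motors at full intensity on a match."""
    if pose_distance(current, stored) < TOLERANCE_DEG:
        for motor in range(3):
            set_motor(motor, intensity=1.0)
        return True
    return False

# Example: a pose 10 degrees off on every axis is well inside tolerance.
stored = np.zeros((3, 3))
current = np.full((3, 3), 10.0)  # distance = 30 degrees -> match
update_haptics(current, stored)
```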