C11 VISUAL ARTS

TACTILE MEMORY RECALL

A Haptic Interface for Embodied Spatial Recall


Abstract

Tactile Memory Recall is a low-cost haptic interface that encodes and retrieves spatial memories through embodied finger poses and vibrotactile feedback. Rather than relying on visual or auditory cues, the system uses proprioception as the primary channel of recall. It records finger pose configurations using three IMUs, stores them in persistent memory, and triggers localized haptic feedback when users reproduce the stored poses. By grounding memory formation in sensorimotor experience, the system advances embodied cognition approaches to spatial memory augmentation and accessibility.

Keywords: haptics, embodied cognition, proprioception, tangible interfaces, wearable computing, tactile memory, memory augmentation


1. Motivation & Contribution

Existing memory augmentation tools privilege vision and sound, despite decades of evidence that touch and proprioception are crucial in forming and retrieving spatial memories (e.g., Varela et al. 1991; O’Keefe & Nadel 1978). This project proposes an alternative approach: memory is not shown but felt.

Key Contributions

  • Embodied memory encoding and retrieval via natural finger poses.

  • Low-cost, open-source hardware design ($137) that supports replication.

  • Real-time haptic feedback (<50 ms latency) mapped directly to finger pose similarity.

  • Evaluation framework for measuring tactile recall accuracy, interference, and retention.


2. System Overview

MAP Mode — Encoding

  1. User presses against an object (FSR force reading above a trigger threshold of 150 raw ADC counts).

  2. Finger pose orientation (thumb, index, middle) is recorded for 3 s.

  3. The system averages readings and stores them in EEPROM (5-slot capacity).
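A minimal sketch of this encoding step in Arduino-style C++ is below. The pin assignment, threshold constant, PoseMemory layout, and the readPitch/readRoll helpers are illustrative assumptions, not the project's actual firmware.

    // Sketch of MAP-mode encoding (assumed names and pins; not the actual firmware).
    #include <EEPROM.h>

    const int NUM_FINGERS = 3;            // thumb, index, middle
    const int FSR_PIN = A0;               // assumption: one FSR routed to A0
    const int FORCE_THRESHOLD = 150;      // raw 10-bit ADC counts

    struct PoseMemory {                   // one stored pose: pitch/roll per finger
      float pitch[NUM_FINGERS];
      float roll[NUM_FINGERS];
    };

    // Hypothetical fusion helpers: the real firmware derives pitch/roll from the
    // MPU6050 data (see the I2C sketch in Section 3); stubbed here for brevity.
    float readPitch(int finger) { (void)finger; return 0.0; }
    float readRoll(int finger)  { (void)finger; return 0.0; }

    void recordPose(int slot) {           // slot: 0..4 (five EEPROM slots)
      PoseMemory mem = {};
      unsigned long t0 = millis();
      int samples = 0;
      while (millis() - t0 < 3000UL) {    // 3 s capture window
        for (int f = 0; f < NUM_FINGERS; f++) {
          mem.pitch[f] += readPitch(f);
          mem.roll[f]  += readRoll(f);
        }
        samples++;
        delay(10);                        // ~100 Hz sampling
      }
      for (int f = 0; f < NUM_FINGERS; f++) {       // average the window
        mem.pitch[f] /= samples;
        mem.roll[f]  /= samples;
      }
      EEPROM.put(slot * sizeof(PoseMemory), mem);   // persist across power cycles
    }

    void mapStep(int slot) {              // called each loop while in MAP mode
      if (analogRead(FSR_PIN) > FORCE_THRESHOLD) recordPose(slot);
    }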

REPLAY Mode — Retrieval

  1. User moves fingers through space.

  2. Real-time pose data is compared with stored vectors using Euclidean distance.

  3. When distance < 300°, all vibration motors activate at maximum intensity, signaling a match.

Latency: <50 ms end-to-end at 100 Hz loop frequency.
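The matching step can be sketched the same way, reusing the PoseMemory struct from the encoding sketch above; the motor pins and slot count are again assumptions rather than the project's actual wiring.

    // Sketch of REPLAY-mode matching (assumed pins; reuses PoseMemory and
    // NUM_FINGERS from the encoding sketch). Fires all motors on any match.
    #include <EEPROM.h>
    #include <math.h>

    const int NUM_SLOTS = 5;
    const float MATCH_THRESHOLD = 300.0;      // degrees, summed over all six components
    const int MOTOR_PINS[3] = {5, 6, 7};      // assumption: PWM-capable pins on the Mega

    float poseDistance(const PoseMemory &a, const PoseMemory &b) {
      float sum = 0;
      for (int f = 0; f < NUM_FINGERS; f++) {
        sum += sq(a.pitch[f] - b.pitch[f]);   // sq() is the Arduino square macro
        sum += sq(a.roll[f]  - b.roll[f]);
      }
      return sqrt(sum);                       // Euclidean distance in pose space
    }

    void replayStep(const PoseMemory &live) {
      bool matched = false;
      for (int slot = 0; slot < NUM_SLOTS && !matched; slot++) {
        PoseMemory stored;
        EEPROM.get(slot * sizeof(PoseMemory), stored);
        matched = poseDistance(live, stored) < MATCH_THRESHOLD;
      }
      for (int m = 0; m < 3; m++)
        analogWrite(MOTOR_PINS[m], matched ? 255 : 0);   // full intensity on match
    }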


3. Technical Architecture





┌──────────────────────────────────────────────────────────────┐
│                    TACTILE RECALL SYSTEM                     │
├──────────────────────────────────────────────────────────────┤
│ INPUT:      3× MPU6050 IMUs + 3× FSR402                      │
│             TCA9548A I2C Multiplexer                         │
│ PROCESSING: Arduino Mega 2560, sensor fusion, EEPROM store   │
│ MATCHING:   Euclidean distance + threshold detection         │
│ OUTPUT:     3× Coin Motors (PWM) + serial interface          │
└──────────────────────────────────────────────────────────────┘



Hardware rationale

  • IMUs (MPU6050): 6-DoF sensing (3-axis accelerometer + 3-axis gyroscope) for per-finger pose estimation.

  • Multiplexer (TCA9548A): lets the three MPU6050s, which share the same I2C address, sit on one bus (channel-select sketch after this list).

  • FSRs: trigger intentional memory recording.

  • Coin motors: provide direct tactile feedback.
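The channel-select pattern mentioned above can be sketched with raw Wire calls and the MPU6050's standard register addresses; the channel numbering and the choice to bypass an IMU library are assumptions for illustration.

    // Reading three MPU6050s through a TCA9548A. Register addresses are the
    // standard ones; channel assignments (0-2) are assumed.
    #include <Wire.h>

    const uint8_t TCA_ADDR = 0x70;   // TCA9548A default address
    const uint8_t MPU_ADDR = 0x68;   // MPU6050 default address (same on every finger)

    void tcaSelect(uint8_t channel) {        // enable exactly one mux channel
      Wire.beginTransmission(TCA_ADDR);
      Wire.write(1 << channel);
      Wire.endTransmission();
    }

    void mpuWake(uint8_t channel) {          // clear the sleep bit on one IMU
      tcaSelect(channel);
      Wire.beginTransmission(MPU_ADDR);
      Wire.write(0x6B);                      // PWR_MGMT_1 register
      Wire.write(0);
      Wire.endTransmission();
    }

    void readAccel(uint8_t channel, int16_t &ax, int16_t &ay, int16_t &az) {
      tcaSelect(channel);
      Wire.beginTransmission(MPU_ADDR);
      Wire.write(0x3B);                      // ACCEL_XOUT_H register
      Wire.endTransmission(false);           // repeated start
      Wire.requestFrom(MPU_ADDR, (uint8_t)6);
      uint8_t b[6];
      for (int i = 0; i < 6; i++) b[i] = Wire.read();
      ax = (int16_t)((b[0] << 8) | b[1]);
      ay = (int16_t)((b[2] << 8) | b[3]);
      az = (int16_t)((b[4] << 8) | b[5]);
    }

    void setup() {
      Wire.begin();
      for (uint8_t ch = 0; ch < 3; ch++) mpuWake(ch);   // channels 0-2 assumed
    }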

Software

  • Finite state machine with modes IDLE, MAP, REPLAY, and CALIBRATE (skeleton after this list).

  • Serial commands for debugging and control.

  • Tolerance-based matching prioritizes perception over precision.
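A skeleton of that state machine and its serial control is below; the single-letter commands and baud rate are placeholders rather than the firmware's actual protocol.

    // Skeleton of the mode state machine and serial control (assumed commands).
    enum Mode { IDLE, MAP, REPLAY, CALIBRATE };
    Mode mode = IDLE;

    void setup() {
      Serial.begin(115200);                 // assumed baud rate
    }

    void handleSerial() {                   // one-letter debug/control commands
      if (!Serial.available()) return;
      switch (Serial.read()) {
        case 'm': mode = MAP;       Serial.println(F("MAP"));       break;
        case 'r': mode = REPLAY;    Serial.println(F("REPLAY"));    break;
        case 'c': mode = CALIBRATE; Serial.println(F("CALIBRATE")); break;
        case 'i': mode = IDLE;      Serial.println(F("IDLE"));      break;
      }
    }

    void loop() {
      handleSerial();
      switch (mode) {
        case MAP:       /* capture + store pose (encoding sketch, Section 2) */ break;
        case REPLAY:    /* compare + vibrate (matching sketch, Section 2)    */ break;
        case CALIBRATE: /* zero IMU offsets                                  */ break;
        case IDLE:      /* wait for commands                                 */ break;
      }
      delay(10);                            // ~100 Hz main loop
    }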


4. Bill of Materials (BOM)

Component              Model               Qty   Cost     Purpose
--------------------------------------------------------------------------------
Microcontroller        Arduino Mega 2560   1     $45      Main processing unit
I2C Multiplexer        TCA9548A            1     $8       Multi-IMU communication
IMU Sensors            MPU6050             3     $15      Pose tracking
Force Sensors          FSR 402             3     $36      Contact detection
Vibration Motors       10 mm coin motors   3     $12      Haptic feedback
MOSFETs + Resistors    2N7000 + 1k/10kΩ    6     $3.50    Motor control & biasing
Breadboard & Wires     -                   -     $17      Assembly
--------------------------------------------------------------------------------
Total                                            $137


5. Research Context & Motivation

Theoretical Framework

  • Enactive Cognition: knowledge arises through dynamic interaction (Varela et al., 1991).

  • Motor-Sensory Integration: movement shapes perception and memory.

  • Situated Learning: memory is tied to spatial context.

Spatial Memory Research

  • Cognitive maps (O’Keefe & Nadel 1978) integrate multisensory input.

  • Haptic exploration enhances retention (Lederman & Klatzky 1987, 2009).

Research Gaps

  • Visual/auditory bias in memory tools.

  • Lack of embodied, sensorimotor interfaces.

  • Accessibility limitations for visually impaired users.

Contribution: Tactile Memory Recall provides proprioceptive encoding, natural gesture-based retrieval, and an inclusive design that does not depend on visual channels.


6. Future Work

  • Wireless ESP32 integration for untethered wearability.

  • Miniaturized ring-form factor sensors.

  • Richer haptic modalities (directional, thermal, ultrasonic).

  • Adaptive ML-based thresholds and personalization.

  • Integration with assistive navigation and spatial learning applications.


7. Ethics & Accessibility

  • Haptic intensity capped to safe tactile levels.

  • No personal data stored; only numeric pose vectors.

  • Supports private, silent recall — valuable for visually impaired users.


8. References

  • Varela, F. J., Thompson, E., & Rosch, E. (1991). The Embodied Mind. MIT Press.

  • Clark, A. (1997). Being There. MIT Press.

  • O’Keefe, J., & Nadel, L. (1978). The Hippocampus as a Cognitive Map. Oxford University Press.

  • Lederman, S. J., & Klatzky, R. L. (1987, 2009). Haptic perception.

  • Dourish, P. (2001). Where the Action Is. MIT Press.

  • Brock, A. M. et al. (2015). Human–Computer Interaction, 30(2).

  • Yates, F. A. (1966). The Art of Memory.

  • Legge, E. L. et al. (2012). Acta Psychologica.

  • Pearce, J. M. (2012). Science.

  • Nosek, B. A. et al. (2015). Science.


9. Citation

Project page: https://www.c11visualarts.com/altered-perception---tactile-memory-recall
GitHub repository: https://github.com/CJD-11/Tacticle-Memory-Recal


@misc{dziadzio2025tactile,
title = {Altered Perception—Tactile Memory Recall},
author = {Dziadzio, Corey},
year = {2025},
howpublished = {Project page and GitHub repository},
url = {https://www.c11visualarts.com/altered-perception---tactile-memory-recall},
note = {GitHub: https://github.com/CJD-11/Tacticle-Memory-Recall}
}



