C11 VISUAL ARTS

TACTILE MEMORY RECALL

A Haptic Interface for Embodied Spatial Recall


TABLE OF CONTENTS

  1. Executive Summary

  2. Project Overview

  3. Research Context & Motivation

  4. Technical Innovation

  5. System Implementation

  6. Research Vision


EXECUTIVE SUMMARY

Project Vision

Tactile Memory Replay seeks to move beyond traditional visual and auditory aids to leverage the potential of proprioceptive and haptic channels. This project investigates how finger pose recognition combined with haptic feedback can create persistent spatial memories, enabling users to "feel" previously visited locations or moments in time through embodied interaction.

Key Innovation

Instead of abstract digital interfaces, the system creates haptic memory anchors in which spatial relationships are encoded through natural hand movements and retrieved through proprioceptive reproduction. This approach grounds memory formation in the body's natural sensorimotor coupling with the environment, fundamentally aligning with embodied cognition principles.

Technical Achievement

We designed and built a functional prototype system that:

  • Records finger poses using three MPU6050 IMU sensors with ±10° accuracy

  • Stores spatial memories persistently in EEPROM with 5-location capacity

  • Triggers haptic feedback when finger positions approximate recorded poses

  • Operates in real-time with <50ms response latency

  • Costs only $137 to build, making it accessible for research replication


PROJECT OVERVIEW

The Challenge

Current memory augmentation technologies predominantly target visual and auditory modalities, despite mounting evidence that physical interactions fundamentally shape mental representations and recall performance. This oversight represents a significant gap in our approach to cognitive enhancement, particularly given that spatial memory formation naturally integrates proprioceptive information during environmental exploration.


Solution

Tactile Memory Replay creates a wearable interface that:

  1. Records spatial poses through multi-sensor finger orientation tracking

  2. Stores memory anchors as persistent proprioceptive patterns

  3. Triggers haptic feedback when users approximate recorded configurations

  4. Enables embodied recall without relying on visual or auditory cues

Core Functionality

MAP Mode - Spatial Memory Encoding

User positions fingers → Apply force → System records pose → Memory stored

        ↓                    ↓              ↓                ↓

   Natural exploration   Contact detection   3D tracking    EEPROM persistence



When a user applies pressure above the force threshold (150 units), the system begins recording finger orientations from three IMU sensors over a 3-second window, averaging the readings to create a stable spatial memory.
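The averaging step above can be sketched in plain C++ (names and data layout are illustrative assumptions, not the project's actual code; on the device the samples would come from the three MPU6050s at 50 Hz over the 3-second window):

    #include <array>
    #include <cstddef>

    constexpr int kNumSensors = 3;   // thumb, index, middle
    constexpr int kAxes = 3;         // roll, pitch, yaw per sensor

    // One pose = 9 angles (3 sensors x 3 axes), all in degrees.
    using Pose = std::array<float, kNumSensors * kAxes>;

    // Average a buffer of sampled poses into one stable spatial memory.
    Pose averagePose(const Pose* samples, std::size_t count) {
        Pose avg{};                              // zero-initialized accumulator
        for (std::size_t i = 0; i < count; ++i)
            for (std::size_t k = 0; k < avg.size(); ++k)
                avg[k] += samples[i][k];
        for (float& v : avg)
            v /= static_cast<float>(count);
        return avg;
    }

At 50 Hz, a 3-second window yields roughly 150 samples per recording, which is what makes the stored pose robust against sensor noise and hand tremor.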

REPLAY Mode - Embodied Retrieval

Finger movement → Pose comparison → Distance calculation → Haptic activation

       ↓               ↓                  ↓                    ↓

   Real-time IMU    Pattern matching   Euclidean distance   Motor feedback



The system continuously compares the current finger pose against stored memories using 3D Euclidean distance. When the distance falls below the 300° tolerance, maximum-intensity haptic feedback activates across all three motors.
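The matching test can be sketched as follows. This is a hypothetical reconstruction: the document does not state whether the 300° tolerance applies per finger or to a combined distance, so the per-finger interpretation here is an assumption.

    #include <array>
    #include <cmath>

    constexpr float kToleranceDeg = 300.0f;  // tolerance from the document

    using Vec3 = std::array<float, 3>;  // roll, pitch, yaw for one finger

    // 3D Euclidean distance between two orientation triples, in degrees.
    float distance3D(const Vec3& a, const Vec3& b) {
        float dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    // A stored memory matches when every finger is within tolerance.
    bool poseMatches(const std::array<Vec3, 3>& current,
                     const std::array<Vec3, 3>& stored) {
        for (int i = 0; i < 3; ++i)
            if (distance3D(current[i], stored[i]) >= kToleranceDeg)
                return false;
        return true;  // all three fingers close enough -> trigger haptics
    }

The generous threshold trades spatial precision for reliable triggering, consistent with the design rationale discussed later in this document.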


RESEARCH CONTEXT & MOTIVATION


Theoretical Framework

This research builds on the foundational premise that cognition is deeply rooted in the body's interactions with the world (Varela, Thompson & Rosch, 1991). Key principles include:

Enactive Cognition: Knowledge emerges through dynamic sensorimotor interaction, not passive information processing.

Motor-Sensory Integration: Physical movement patterns fundamentally shape conceptual understanding and memory formation.

Situated Learning: Cognition is inherently tied to the physical and social contexts in which it occurs.

Spatial Memory Research

Decades of neuroscience research demonstrate that spatial memory formation integrates multiple sensory modalities:

Cognitive Maps (O'Keefe & Nadel, 1978): Neural representations of spatial relationships that can be enhanced through multi-modal input.

Haptic Exploration (Lederman & Klatzky, 2009): Systematic hand movements create robust spatial memories through tactile interaction.



Research Gap Analysis

Current Limitations

  1. Modal Bias: Existing memory aids predominantly target visual/auditory channels

  2. Abstract Interfaces: Digital systems lack grounding in natural sensorimotor experience

  3. Accessibility Barriers: Visual-centric approaches exclude users with visual impairments

Contribution

Tactile Memory Replay addresses these gaps by:

  • Leveraging proprioceptive channels for memory encoding and retrieval

  • Grounding digital memories in natural hand movements and spatial exploration

  • Providing non-visual feedback suitable for diverse accessibility needs


Potential Applications

Assistive Technology

Navigation aids for visually impaired individuals that provide silent, private spatial feedback without interfering with auditory environmental cues.

Educational Enhancement

Kinesthetic learning tools for spatial concepts in mathematics, geography, and science education through tactile "bookmarks" for concept navigation.

Memory Enhancement

Augmenting traditional memory palace techniques with tactile anchoring for improved recall performance in educational and therapeutic contexts.








TECHNICAL INNOVATION

System Architecture & Hardware Design

┌─────────────────────────────────────────────────────────────┐

│                    TACTILE MEMORY SYSTEM                   │

├─────────────────────────────────────────────────────────────┤

│  INPUT LAYER                                                │

│  ├── Multi-IMU Pose Detection (3× MPU6050)                 │

│  │   ├── Thumb Sensor (Channel 0)                          │

│  │   ├── Index Finger Sensor (Channel 1)                   │

│  │   └── Middle Finger Sensor (Channel 2)                  │

│  ├── Force Sensing Array (3× FSR402)                       │

│  │   ├── Contact Detection                                  │

│  │   └── Pressure Measurement                              │

│  └── I2C Multiplexer (TCA9548A)                           │

├─────────────────────────────────────────────────────────────┤

│  PROCESSING LAYER                                           │

│  ├── Arduino Mega 2560 (Main Controller)                   │

│  │   ├── 256KB Flash Memory                                │

│  │   ├── 8KB SRAM                                          │

│  │   └── 4KB EEPROM                                        │

│  ├── Sensor Fusion Algorithms                              │

│  │   ├── Noise Filtering                                   │

│  │   ├── Pose Estimation                                   │

│  │   └── Temporal Smoothing                                │

│  ├── Memory Management                                      │

│  │   ├── Pattern Storage                                   │

│  │   ├── Retrieval Algorithms                              │

│  │   └── EEPROM Persistence                                │

│  └── Pattern Matching Engine                               │

│      ├── Euclidean Distance Calculation                    │

│      ├── Threshold Detection                               │

│      └── Real-time Comparison                              │

├─────────────────────────────────────────────────────────────┤

│  OUTPUT LAYER                                               │

│  ├── Haptic Feedback Array (3× Motors)                     │

│  │   ├── PWM Intensity Control                             │

│  │   ├── Maximum Intensity Output                          │

│  │   └── Simultaneous Activation                           │

│  ├── Serial Communication Interface                        │

│  │   ├── Real-time Debugging                               │

│  │   ├── Configuration Commands                            │

│  │   └── Status Monitoring                                 │

│  └── User Command Interface                                │

└─────────────────────────────────────────────────────────────┘



Component Selection Rationale


MPU6050 IMU Sensors: 6-DOF sensors providing 3-axis gyroscope and accelerometer data with I2C communication, chosen for accuracy, availability, and Arduino library support.

TCA9548A I2C Multiplexer: Enables multiple MPU6050 sensors (which share the same I2C address) to operate simultaneously on a single bus.

Arduino Mega 2560: Provides sufficient I/O pins, processing power, and memory for real-time sensor processing and haptic control.

Force Sensitive Resistors: Simple analog pressure sensors that detect intentional object contact for triggering memory recording mode.

Vibration Motors: 10mm coin-style motors providing clear haptic feedback with PWM intensity control.

Software Architecture

Core System Design

The software implements a state machine with four primary modes:


    enum SystemMode {
        IDLE,        // Waiting for user commands
        MAP,         // Recording spatial memories
        REPLAY,      // Triggering haptic feedback
        CALIBRATE    // Real-time sensor monitoring
    };

For additional details on the algorithms and code used, please see the project's GitHub repository:

https://github.com/CJD-11/Tacticle-Memory-Recall
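A minimal sketch of how serial commands might drive the state machine above (the mode names come from the document; the parsing helper and its behavior for RESET are illustrative assumptions, not the repository's actual code):

    #include <string>

    enum SystemMode { IDLE, MAP, REPLAY, CALIBRATE };

    // Map a serial command string onto the next system mode.
    SystemMode parseCommand(const std::string& cmd, SystemMode current) {
        if (cmd == "MAP")       return MAP;        // record a new spatial memory
        if (cmd == "REPLAY")    return REPLAY;     // enable haptic matching loop
        if (cmd == "CALIBRATE") return CALIBRATE;  // live sensor monitoring
        if (cmd == "RESET")     return IDLE;       // clear memories, go idle
        return current;                            // unknown command: stay put
    }

On the Arduino, this dispatch would sit in loop(), reading commands from Serial and calling the handler for whichever mode is active.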


Design Decisions & Trade-offs


Generous Tolerance Threshold (300°)

The system uses a large tolerance zone to prioritize successful haptic triggering over precise pose matching. This design decision emerged from recognizing that human movement reproduction contains natural variability.

Maximum Intensity Feedback

Rather than graduated haptic intensity based on proximity, the system provides binary maximum-intensity feedback to ensure clear, unmistakable user perception when spatial memories are detected.

Force-Triggered Recording

Requiring physical contact during memory encoding ensures that spatial memories correspond to actual object interactions rather than arbitrary hand positions in space.

Three-Finger Tracking

Thumb, index, and middle finger tracking provides sufficient degrees of freedom to distinguish between distinct hand configurations while maintaining practical wearability.



SYSTEM IMPLEMENTATION


Bill of Materials


Component         Model               Quantity   Cost     Purpose
Microcontroller   Arduino Mega 2560   1          $45      Main processing unit
I2C Multiplexer   TCA9548A            1          $8       Multi-IMU communication
IMU Sensors       MPU6050             3          $15      Finger orientation tracking
Force Sensors     FSR 402             3          $36      Contact detection
Haptic Motors     10mm Coin Motors    3          $12      Tactile feedback
MOSFETs           2N7000              3          $1.50    Motor control
Resistors         1kΩ, 10kΩ           6          $2       Pull-ups and biasing
Breadboard        Half-size           1          $5       Circuit assembly
Jumper Wires      Various             30         $12      Connections

Total System Cost                                $137


Circuit Connections

Arduino Mega Pin Assignments:

├── I2C Communication

│   ├── SDA (Pin 20) → TCA9548A SDA

│   └── SCL (Pin 21) → TCA9548A SCL

├── Force Sensors (Analog)

│   ├── A3 → FSR #1 (Thumb)

│   ├── A6 → FSR #2 (Index)  

│   └── A7 → FSR #3 (Middle)

├── Haptic Motors (PWM)

│   ├── Pin 51 → Motor #1 (Thumb)

│   ├── Pin 44 → Motor #2 (Index)

│   └── Pin 46 → Motor #3 (Middle)

└── Power Distribution

    ├── 5V → Sensor power

    └── GND → Common ground



Software Implementation

For additional details, please see the project's GitHub repository:

https://github.com/CJD-11/Tacticle-Memory-Recall



Command Interface

The system provides a serial command interface for operation:

  • MAP: Enter mapping mode to record new spatial memory

  • REPLAY: Enter replay mode to trigger haptic feedback

  • CALIBRATE: Real-time sensor monitoring and debugging

  • STATUS: Display system information and stored memories

  • RESET: Clear all stored spatial memories

  • TESTMOTOR: Verify haptic motor functionality

  • HELP: Display available commands

System Operation

Memory Recording Process

  1. User types "MAP" command

  2. System waits for force threshold to be exceeded

  3. Upon contact detection, 3-second averaging begins

  4. Gyroscope readings collected at 50Hz

  5. Final averaged pose stored to EEPROM

  6. Memory assigned unique identifier
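The five-slot EEPROM persistence implied by the steps above can be sketched as follows. This is a hypothetical layout (flag byte plus nine float angles per slot); on the Arduino, EEPROM.put/EEPROM.get would replace the memcpy calls, and here a plain byte buffer stands in for the 4 KB EEPROM:

    #include <cstdint>
    #include <cstring>

    constexpr int kSlots = 5;          // 5-location capacity
    constexpr int kFloatsPerSlot = 9;  // 3 sensors x 3 axes
    constexpr int kSlotSize = 1 + kFloatsPerSlot * sizeof(float);

    uint8_t eeprom[kSlots * kSlotSize];  // 185 bytes of the 4096-byte EEPROM

    // Write one averaged pose into a slot and mark it valid.
    void storeMemory(int slot, const float pose[kFloatsPerSlot]) {
        uint8_t* base = eeprom + slot * kSlotSize;
        base[0] = 1;  // validity flag
        std::memcpy(base + 1, pose, kFloatsPerSlot * sizeof(float));
    }

    // Read a slot back; returns false if the slot was never written.
    bool loadMemory(int slot, float pose[kFloatsPerSlot]) {
        const uint8_t* base = eeprom + slot * kSlotSize;
        if (base[0] != 1) return false;  // empty slot
        std::memcpy(pose, base + 1, kFloatsPerSlot * sizeof(float));
        return true;
    }

Because each slot needs only 37 bytes, five memories use under 5% of the Mega's EEPROM, so the stated capacity is a design choice rather than a storage limit.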

Haptic Replay Process

  1. User types "REPLAY" command

  2. System continuously samples finger pose

  3. Current pose compared against all stored memories

  4. Distance calculated using 3D Euclidean metric

  5. If distance < 300°, maximum haptic feedback triggered

  6. Feedback continues while pose remains in tolerance zone

Calibration and Debugging

  1. User types "CALIBRATE" command

  2. Real-time display of all sensor values

  3. IMU status and connectivity monitoring

  4. Force sensor readings with visual bar graphs

  5. Automatic timeout after 30 seconds









TECHNICAL INNOVATION

System Architecture & Hardware Design

┌─────────────────────────────────────────────────────────────┐

│                    TACTILE MEMORY SYSTEM                   │

├─────────────────────────────────────────────────────────────┤

│  INPUT LAYER                                                │

│  ├── Multi-IMU Pose Detection (3× MPU6050)                 │

│  │   ├── Thumb Sensor (Channel 0)                          │

│  │   ├── Index Finger Sensor (Channel 1)                   │

│  │   └── Middle Finger Sensor (Channel 2)                  │

│  ├── Force Sensing Array (3× FSR402)                       │

│  │   ├── Contact Detection                                  │

│  │   └── Pressure Measurement                              │

│  └── I2C Multiplexer (TCA9548A)                           │

├─────────────────────────────────────────────────────────────┤

│  PROCESSING LAYER                                           │

│  ├── Arduino Mega 2560 (Main Controller)                   │

│  │   ├── 256KB Flash Memory                                │

│  │   ├── 8KB SRAM                                          │

│  │   └── 4KB EEPROM                                        │

│  ├── Sensor Fusion Algorithms                              │

│  │   ├── Noise Filtering                                   │

│  │   ├── Pose Estimation                                   │

│  │   └── Temporal Smoothing                                │

│  ├── Memory Management                                      │

│  │   ├── Pattern Storage                                   │

│  │   ├── Retrieval Algorithms                              │

│  │   └── EEPROM Persistence                                │

│  └── Pattern Matching Engine                               │

│      ├── Euclidean Distance Calculation                    │

│      ├── Threshold Detection                               │

│      └── Real-time Comparison                              │

├─────────────────────────────────────────────────────────────┤

│  OUTPUT LAYER                                               │

│  ├── Haptic Feedback Array (3× Motors)                     │

│  │   ├── PWM Intensity Control                             │

│  │   ├── Maximum Intensity Output                          │

│  │   └── Simultaneous Activation                           │

│  ├── Serial Communication Interface                        │

│  │   ├── Real-time Debugging                               │

│  │   ├── Configuration Commands                            │

│  │   └── Status Monitoring                                 │

│  └── User Command Interface                                │

└─────────────────────────────────────────────────────────────┘



Component Selection Rationale


MPU6050 IMU Sensors: 6-DOF sensors providing 3-axis gyroscope and accelerometer data with I2C communication, chosen for accuracy, availability, and Arduino library support.

TCA9548A I2C Multiplexer: Enables multiple MPU6050 sensors (which share the same I2C address) to operate simultaneously on a single bus.

Arduino Mega 2560: Provides sufficient I/O pins, processing power, and memory for real-time sensor processing and haptic control.

Force Sensitive Resistors: Simple analog pressure sensors that detect intentional object contact for triggering memory recording mode.

Vibration Motors: 10mm coin-style motors providing clear haptic feedback with PWM intensity control.

Software Architecture

Core System Design

The software implements a state machine with four primary modes:


    IDLE,        // Waiting for user commands

    MAP,         // Recording spatial memories  

    REPLAY,      // Triggering haptic feedback

    CALIBRATE    // Real-time sensor monitoring


For additional details on the algorithms and code used please see this GitHub Repository:

https://github.com/CJD-11/Tacticle-Memory-Recall


Design Decisions & Trade-offs


Generous Tolerance Threshold (300°)

The system uses a large tolerance zone to prioritize successful haptic triggering over precise pose matching. This design decision emerged from recognizing that human movement reproduction contains natural variability.

Maximum Intensity Feedback

TECHNICAL INNOVATION

System Architecture & Hardware Design

┌─────────────────────────────────────────────────────────────┐
│                    TACTILE MEMORY SYSTEM                    │
├─────────────────────────────────────────────────────────────┤
│  INPUT LAYER                                                │
│  ├── Multi-IMU Pose Detection (3× MPU6050)                  │
│  │   ├── Thumb Sensor (Channel 0)                           │
│  │   ├── Index Finger Sensor (Channel 1)                    │
│  │   └── Middle Finger Sensor (Channel 2)                   │
│  ├── Force Sensing Array (3× FSR402)                        │
│  │   ├── Contact Detection                                  │
│  │   └── Pressure Measurement                               │
│  └── I2C Multiplexer (TCA9548A)                             │
├─────────────────────────────────────────────────────────────┤
│  PROCESSING LAYER                                           │
│  ├── Arduino Mega 2560 (Main Controller)                    │
│  │   ├── 256KB Flash Memory                                 │
│  │   ├── 8KB SRAM                                           │
│  │   └── 4KB EEPROM                                         │
│  ├── Sensor Fusion Algorithms                               │
│  │   ├── Noise Filtering                                    │
│  │   ├── Pose Estimation                                    │
│  │   └── Temporal Smoothing                                 │
│  ├── Memory Management                                      │
│  │   ├── Pattern Storage                                    │
│  │   ├── Retrieval Algorithms                               │
│  │   └── EEPROM Persistence                                 │
│  └── Pattern Matching Engine                                │
│      ├── Euclidean Distance Calculation                     │
│      ├── Threshold Detection                                │
│      └── Real-time Comparison                               │
├─────────────────────────────────────────────────────────────┤
│  OUTPUT LAYER                                               │
│  ├── Haptic Feedback Array (3× Motors)                      │
│  │   ├── PWM Intensity Control                              │
│  │   ├── Maximum Intensity Output                           │
│  │   └── Simultaneous Activation                            │
│  ├── Serial Communication Interface                         │
│  │   ├── Real-time Debugging                                │
│  │   ├── Configuration Commands                             │
│  │   └── Status Monitoring                                  │
│  └── User Command Interface                                 │
└─────────────────────────────────────────────────────────────┘



Component Selection Rationale


MPU6050 IMU Sensors: 6-DOF sensors providing 3-axis gyroscope and accelerometer data with I2C communication, chosen for accuracy, availability, and Arduino library support.

TCA9548A I2C Multiplexer: Enables multiple MPU6050 sensors (which share the same I2C address) to operate simultaneously on a single bus.

Arduino Mega 2560: Provides sufficient I/O pins, processing power, and memory for real-time sensor processing and haptic control.

Force Sensitive Resistors: Simple analog pressure sensors that detect intentional object contact for triggering memory recording mode.

Vibration Motors: 10mm coin-style motors providing clear haptic feedback with PWM intensity control.
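Because all three MPU6050s share the same fixed I2C address (0x68), the TCA9548A is steered by writing a one-hot channel byte to its control register before each sensor read. A minimal sketch of that channel-select logic, written as plain C++ so the bit math is explicit (the Wire calls in the trailing comment are the usual Arduino idiom, not repository code):

```cpp
#include <cstdint>

// The TCA9548A routes the I2C bus through one of eight downstream
// channels; bit N of its single control register enables channel N.
constexpr std::uint8_t TCA_ADDR = 0x70;  // multiplexer's default I2C address

// One-hot mask selecting a downstream channel (0..7).
std::uint8_t tcaChannelMask(std::uint8_t channel) {
    return static_cast<std::uint8_t>(1u << channel);
}

// On the Arduino, the byte is written with the Wire library:
//   Wire.beginTransmission(TCA_ADDR);
//   Wire.write(tcaChannelMask(channel));
//   Wire.endTransmission();
```

Selecting a channel before each read is what lets three identically addressed IMUs coexist on one bus.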

Software Architecture

Core System Design

The software implements a state machine with four primary modes:


    enum SystemMode {      // enum name illustrative
        IDLE,        // Waiting for user commands
        MAP,         // Recording spatial memories
        REPLAY,      // Triggering haptic feedback
        CALIBRATE    // Real-time sensor monitoring
    };


For additional details on the algorithms and code, please see the GitHub repository:

https://github.com/CJD-11/Tacticle-Memory-Recall


Design Decisions & Trade-offs


Generous Tolerance Threshold (300°)

The system uses a large tolerance zone to prioritize successful haptic triggering over precise pose matching. This design decision emerged from recognizing that human movement reproduction contains natural variability.

Maximum Intensity Feedback

Rather than graduated haptic intensity based on proximity, the system provides binary maximum-intensity feedback to ensure clear, unmistakable user perception when spatial memories are detected.

Force-Triggered Recording

Requiring physical contact during memory encoding ensures that spatial memories correspond to actual object interactions rather than arbitrary hand positions in space.

Three-Finger Tracking

Tracking the thumb, index, and middle fingers provides sufficient degrees of freedom to distinguish distinct hand configurations while maintaining practical wearability.



SYSTEM IMPLEMENTATION


Bill of Materials


Component          Model               Qty   Cost     Purpose
Microcontroller    Arduino Mega 2560   1     $45      Main processing unit
I2C Multiplexer    TCA9548A            1     $8       Multi-IMU communication
IMU Sensors        MPU6050             3     $15      Finger orientation tracking
Force Sensors      FSR 402             3     $36      Contact detection
Haptic Motors      10mm Coin Motors    3     $12      Tactile feedback
MOSFETs            2N7000              3     $1.50    Motor control
Resistors          1kΩ, 10kΩ           6     $2       Pull-ups and biasing
Breadboard         Half-size           1     $5       Circuit assembly
Jumper Wires       Various             30    $12      Connections

Total System Cost: $137


Circuit Connections

Arduino Mega Pin Assignments:
├── I2C Communication
│   ├── SDA (Pin 20) → TCA9548A SDA
│   └── SCL (Pin 21) → TCA9548A SCL
├── Force Sensors (Analog)
│   ├── A3 → FSR #1 (Thumb)
│   ├── A6 → FSR #2 (Index)
│   └── A7 → FSR #3 (Middle)
├── Haptic Motors (PWM)
│   ├── Pin 51 → Motor #1 (Thumb)
│   ├── Pin 44 → Motor #2 (Index)
│   └── Pin 46 → Motor #3 (Middle)
└── Power Distribution
    ├── 5V → Sensor power
    └── GND → Common ground
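
The wiring above translates directly into a handful of pin constants. A sketch (array names are illustrative, not from the repository; order everywhere is thumb, index, middle):

```cpp
#include <cstdint>

// Pin constants mirroring the wiring diagram above.
constexpr std::uint8_t FSR_PINS[3]   = {3, 6, 7};    // analog inputs A3, A6, A7
                                                     // (use the A3/A6/A7 macros
                                                     // in an actual sketch)
constexpr std::uint8_t MOTOR_PINS[3] = {51, 44, 46}; // haptic motor outputs
// I2C (SDA = pin 20, SCL = pin 21 on the Mega) is managed by the Wire
// library and needs no explicit pin constants.
```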



Software Implementation

For additional details, please see the GitHub repository:

https://github.com/CJD-11/Tacticle-Memory-Recall



Command Interface

The system provides a serial command interface for operation:

  • MAP: Enter mapping mode to record new spatial memory

  • REPLAY: Enter replay mode to trigger haptic feedback

  • CALIBRATE: Real-time sensor monitoring and debugging

  • STATUS: Display system information and stored memories

  • RESET: Clear all stored spatial memories

  • TESTMOTOR: Verify haptic motor functionality

  • HELP: Display available commands
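
The command set above can be sketched as a pure dispatch function. This is an illustrative reading, not the repository's implementation: the device reads these tokens from the serial port, and the mode names follow the state machine described under Software Architecture. The assumption that one-shot commands (STATUS, RESET, TESTMOTOR, HELP) return the system to IDLE is mine.

```cpp
#include <string>

// Illustrative mapping from serial command tokens to system modes.
enum class Mode { IDLE, MAP, REPLAY, CALIBRATE };

Mode dispatchCommand(const std::string& cmd, Mode current) {
    if (cmd == "MAP")       return Mode::MAP;        // record a new memory
    if (cmd == "REPLAY")    return Mode::REPLAY;     // haptic playback
    if (cmd == "CALIBRATE") return Mode::CALIBRATE;  // sensor monitoring
    if (cmd == "STATUS" || cmd == "RESET" ||
        cmd == "TESTMOTOR" || cmd == "HELP")
        return Mode::IDLE;   // one-shot commands: act, then return to idle
    return current;          // unknown input: mode unchanged
}
```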

System Operation

Memory Recording Process

  1. User types "MAP" command

  2. System waits for force threshold to be exceeded

  3. Upon contact detection, 3-second averaging begins

  4. Gyroscope readings collected at 50Hz

  5. Final averaged pose stored to EEPROM

  6. Memory assigned unique identifier
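
Steps 3 through 5 can be sketched as a pure averaging function, under stated assumptions: 50 Hz for 3 seconds gives 150 samples, and a pose is taken to be nine values (three fingers × three gyro axes). Names are illustrative.

```cpp
#include <array>
#include <cstddef>

constexpr std::size_t NUM_AXES    = 9;    // 3 fingers x 3 gyro axes (assumed)
constexpr std::size_t NUM_SAMPLES = 150;  // 50 Hz x 3 s averaging window

using Pose = std::array<float, NUM_AXES>;

// Average the buffered gyroscope readings into one stored pose.
Pose averagePose(const std::array<Pose, NUM_SAMPLES>& samples) {
    Pose avg{};                            // zero-initialized accumulator
    for (const Pose& s : samples)
        for (std::size_t a = 0; a < NUM_AXES; ++a)
            avg[a] += s[a];
    for (float& v : avg)
        v /= static_cast<float>(NUM_SAMPLES);
    return avg;                            // written to EEPROM in step 5
}
```

Averaging over the contact window suppresses gyroscope noise and small hand tremors before the pose is committed to EEPROM.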

Haptic Replay Process

  1. User types "REPLAY" command

  2. System continuously samples finger pose

  3. Current pose compared against all stored memories

  4. Distance calculated using 3D Euclidean metric

  5. If distance < 300°, maximum haptic feedback triggered

  6. Feedback continues while pose remains in tolerance zone
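
Steps 4 and 5 can be sketched as follows. This is one illustrative reading of the "3D Euclidean metric" (a distance over a single finger's three orientation values); how the three fingers' distances are combined is not shown here, and the function names are mine.

```cpp
#include <array>
#include <cmath>
#include <cstddef>

using FingerPose = std::array<float, 3>;   // three gyro-derived values

constexpr float TOLERANCE_DEG = 300.0f;    // generous tolerance zone

// Euclidean distance between a live pose and a stored pose.
float poseDistance(const FingerPose& a, const FingerPose& b) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < 3; ++i) {
        const float d = a[i] - b[i];
        sum += d * d;                      // squared difference per axis
    }
    return std::sqrt(sum);
}

// Within tolerance -> trigger maximum-intensity haptic feedback.
bool poseMatches(const FingerPose& live, const FingerPose& stored) {
    return poseDistance(live, stored) < TOLERANCE_DEG;
}
```

The wide threshold reflects the design decision above: favoring reliable triggering over exact pose reproduction.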

Calibration and Debugging

  1. User types "CALIBRATE" command

  2. Real-time display of all sensor values

  3. IMU status and connectivity monitoring

  4. Force sensor readings with visual bar graphs

  5. Automatic timeout after 30 seconds
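
The bar graphs in step 4 can be sketched as a small formatting helper, assuming the Mega's 10-bit ADC range (0-1023); the function name and bar width are assumptions, not repository code.

```cpp
#include <string>

// Render an analog force reading as a fixed-width text bar graph,
// as printed to the serial console during calibration.
std::string forceBar(int reading, int width = 20) {
    if (reading < 0)    reading = 0;       // clamp to the ADC range
    if (reading > 1023) reading = 1023;
    const int filled = reading * width / 1023;  // proportional fill
    return std::string(filled, '#') + std::string(width - filled, '-');
}
```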









RESEARCH VISION

Immediate Research Opportunities

Empirical Validation Studies

The functional prototype enables systematic investigation of fundamental questions about embodied spatial memory:

Memory Formation Comparison: How does haptic encoding affect spatial memory formation compared to traditional visual methods? The system could support controlled studies comparing recall accuracy and retention across different encoding modalities.

Parameter Optimization: What haptic feedback parameters (intensity, duration, spatial patterns) maximize memory effectiveness? The adjustable system parameters enable systematic optimization studies.

Individual Differences: How do cognitive abilities affect performance with embodied memory systems? The platform could support correlation studies between spatial ability measures and system performance.

Application Development

Assistive Technology: Collaboration with organizations serving visually impaired populations to develop and validate navigation aids for real-world environments.

Educational Integration: Partnership with educational institutions to explore kinesthetic learning enhancement for spatial concepts in STEM curricula.

Therapeutic Applications: Investigation of the system's potential for cognitive rehabilitation following brain injury or age-related spatial memory decline.

Technical Evolution Pathway

Hardware Advancement

Wireless Integration: Developing ESP32-based sensor nodes for seamless wearable integration without a tethered connection to a base station.

Miniaturization: Exploring ring-form factor sensors and flexible electronics for improved wearability and social acceptability.

Enhanced Haptics: Investigating directional tactile feedback, temperature modulation, and ultrasonic haptic displays for richer spatial information encoding.

Software Innovation

Machine Learning Integration: Adaptive algorithms that personalize pose matching thresholds and optimize haptic feedback patterns for individual users.

Pattern Recognition Enhancement: Advanced gesture classification and pose prediction algorithms for more robust spatial memory detection.

Broader Research Integration

External Research Network

Accessibility Organizations: Community-based participatory research with target user populations.

Educational Institutions: Longitudinal studies of kinesthetic learning enhancement across diverse student populations.

Rehabilitation Centers: Clinical validation of therapeutic applications for cognitive rehabilitation.











