ChatDEAF – Multimodal Sign Language Dataset (ISL/TİD) & Latent Trigger Start: Insights Needed

Hello Hugging Face Community,

My name is Yasin Şimşek, and I’m a Deaf individual who recently shared the first open-access ISL (International Sign Language) and TİD (Turkish Sign Language) dataset on this platform.

I’m writing to ask for your feedback and insights on a multimodal approach that combines the following (a rough schema sketch follows the list):

– Handshape and gesture data
– Facial expressions
– Emotional rhythm
– Visual energy
– Cultural context
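
To make this more concrete, here is a rough sketch of how I imagine these modalities could map onto a `datasets.Features` schema. Every field name and label set below is a placeholder of my own, not the final ChatDEAF annotation scheme, so please treat it as a starting point for discussion.

```python
from datasets import Features, Value, ClassLabel, Sequence

# Placeholder schema: all field names and label sets are illustrative only.
features = Features({
    "video_path": Value("string"),            # path to the raw clip (or a dedicated video feature, if your `datasets` version provides one)
    "handshapes": Sequence(Value("string")),  # per-segment handshape labels
    "facial_expression": ClassLabel(
        names=["neutral", "raised_brows", "furrowed_brows", "mouthing", "other"]
    ),
    "emotional_rhythm": Value("string"),      # coded tempo/intensity of the signing
    "visual_energy": Value("float32"),        # e.g. a normalised motion-energy score for the clip
    "cultural_context": Value("string"),      # region, register, idiomatic notes
    "gloss": Value("string"),                 # ISL/TİD gloss transcription
    "translation": Value("string"),           # spoken-language translation
})
```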

This dataset is part of a larger project called ChatDEAF, which aims to build an AI system for real-time sign language interpretation and accessibility — not only for hearing people to understand Deaf users, but also for Deaf parents to support their hearing children in education.

My Main Questions to the Community:

  1. Multimodal Support:
    Can Hugging Face `datasets` fully support gesture + video + emotion-based labeling in a single dataset? (A small construction-and-upload sketch follows these questions.)

  2. Latent Trigger Theory:
    ChatDEAF started with a signal — not code. It was initiated by a silent visual cue.
    Is it possible for AI systems to detect and learn from non-verbal, subconscious triggers?

  3. Dataset Evaluation:
    Would any of you be interested in reviewing or improving the ISL/TİD dataset structure with me?
    I am open to collaboration and knowledge exchange.
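
To ground question 1, here is a small sketch of how a single annotated clip could be packaged and pushed to the Hub, reusing the placeholder `features` schema above. The row values and the repository id are invented examples, not real data from the dataset.

```python
from datasets import Dataset

# One toy row: every value is an invented placeholder, not a real annotation.
rows = {
    "video_path": ["clips/tid_0001.mp4"],
    "handshapes": [["flat-B", "index-1"]],
    "facial_expression": ["raised_brows"],
    "emotional_rhythm": ["slow, emphatic"],
    "visual_energy": [0.72],
    "cultural_context": ["TİD, Istanbul region"],
    "gloss": ["EV GİTMEK"],
    "translation": ["I am going home."],
}

ds = Dataset.from_dict(rows, features=features)  # `features` from the sketch above
# ds.push_to_hub("yasodeafs/chatdeaf-isl-tid")   # hypothetical repository id
```

If this layout looks workable, I would especially like advice on whether emotion and rhythm should be stored as time-aligned sequences rather than one label per clip.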


This is not just a dataset.
It is a signal.
It is a beginning.
And maybe, a new way to understand intelligence.

Thank you for your attention and support.
Let’s bring true accessibility into AI — together.

Best,
Yasin Şimşek (yasodeafs)
Founder of ChatDEAF
📨 yasodeafs@gmail.com
