Hello Hugging Face Community,
My name is Yasin Şimşek, and I’m a Deaf individual who recently shared the first open-access ISL (International Sign Language) and TİD (Turkish Sign Language) dataset on this platform.
I’m writing to ask for your feedback and insights on a multimodal approach that combines:
– Handshape and gesture data
– Facial expressions
– Emotional rhythm
– Visual energy
– Cultural context
This dataset is part of a larger project called ChatDEAF, which aims to build an AI system for real-time sign language interpretation and accessibility — not only for hearing people to understand Deaf users, but also for Deaf parents to support their hearing children in education.
My Main Questions to the Community:

1. Multimodal Support: Can Hugging Face datasets fully support gesture + video + emotion-based labeling systems? (See the schema sketch after this list.)

2. Latent Trigger Theory: ChatDEAF started with a signal, not code. It was initiated by a silent visual cue. Is it possible for AI systems to detect and learn from non-verbal, subconscious triggers?

3. Dataset Evaluation: Would any of you be interested in reviewing or improving the ISL/TİD dataset structure with me?
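For question 1, here is a minimal sketch of how a multimodal sign-language schema could be expressed with the `datasets` library. The field names, label set, and file path are hypothetical placeholders for illustration, not the actual ISL/TİD structure:

```python
from datasets import ClassLabel, Dataset, Features, Sequence, Value

# Hypothetical schema: video clips referenced by path, plus gesture,
# facial-expression, and emotion annotations per clip.
features = Features({
    "video_path": Value("string"),             # path or URL to the sign clip
    "gloss": Value("string"),                  # sign gloss / transcription
    "handshape": Value("string"),              # handshape annotation
    "emotion": ClassLabel(names=["neutral", "happy", "sad", "angry"]),
    "facial_markers": Sequence(Value("float32")),  # e.g. landmark intensities
})

# One illustrative example row (placeholder values).
data = {
    "video_path": ["clips/hello_tid.mp4"],
    "gloss": ["HELLO"],
    "handshape": ["B"],
    "emotion": [1],                            # index into the ClassLabel names
    "facial_markers": [[0.2, 0.7, 0.1]],
}

ds = Dataset.from_dict(data, features=features)
print(ds.features)
```

Storing video as a string path works with any `datasets` version; recent releases also offer richer media feature types (such as `Image`, and a `Video` type in newer versions) if you want decoded frames instead of paths.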
I am open to collaboration and knowledge exchange.
This is not just a dataset.
It is a signal.
It is a beginning.
And maybe, a new way to understand intelligence.
Thank you for your attention and support.
Let’s bring true accessibility into AI — together.
Best,
Yasin Şimşek (yasodeafs)
Founder of ChatDEAF
yasodeafs@gmail.com