Forward-Forward algorithm by Geoffrey Hinton

I would like to initiate a discussion on two recent publications: Geoffrey Hinton's proposal of an alternative to the traditional backpropagation algorithm, "The Forward-Forward Algorithm: Some Preliminary Investigations", and the paper by Alexander Ororbia and Ankur Mali, "The Predictive Forward-Forward Algorithm", which suggests incorporating a generative circuit into the original FF network.

I am interested in hearing the community's thoughts and insights on these papers. I am particularly interested in discussing the potential benefits of layer-level weight updates in the Forward-Forward algorithm, as they could allow a network to be trained layer by layer without requiring a huge amount of VRAM.
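To make the VRAM point concrete, here is a minimal PyTorch sketch (my own, not taken from either paper) of per-layer FF training: each layer has its own optimizer and is trained with the goodness objective (sum of squared activations, pushed above a threshold for positive data and below it for negative data), with activations detached between layers so no gradient ever crosses a layer boundary.

```python
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    """One Forward-Forward layer with its own local optimizer.

    Trained so that 'goodness' (sum of squared activations) is high
    for positive data and low for negative data; no gradients flow
    between layers.
    """
    def __init__(self, d_in, d_out, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.act = nn.ReLU()
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Length-normalize the input so only the *direction* of the
        # previous layer's activity is passed on, as in the paper.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return self.act(self.linear(x))

    def train_step(self, x_pos, x_neg):
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        # Push positive goodness above the threshold, negative below.
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,
            g_neg - self.threshold,
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detach so the next layer's update never backpropagates here.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

# Because each layer trains locally, only one layer's activations and
# gradients need to be resident at a time -- that is the VRAM saving.
layers = [FFLayer(784, 500), FFLayer(500, 500)]
x_pos = torch.randn(32, 784)   # placeholder positive batch
x_neg = torch.randn(32, 784)   # placeholder negative batch
for layer in layers:
    x_pos, x_neg = layer.train_step(x_pos, x_neg)
```

The dimensions, learning rate, and threshold here are placeholder choices; in a layer-by-layer training regime you could even train the first layer to convergence, cache its outputs, free it from GPU memory, and only then instantiate the second layer.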


Implementations found so far:
Tensorflow Implementation
PyTorch Implementation

More:
Another PyTorch Implementation
DRD2 activity prediction using the Forward-Forward Algorithm

Another one:
Tensorflow Implementation

I am attempting to build a mini-GPT version using the Forward-Forward idea.

I can't find much of anything using it in generative language models, nor any example of the NLP benchmark referenced in Hinton's paper.

If anyone has thoughts or repos implementing the Forward-Forward algorithm for that kind of task, it would be very helpful.

The best I've found so far are a few non-working repos:

nebuly-ai: nebullvm/apps/accelerate/forward_forward at 5fb48f6cda4d2ab756f20a91eea7b482f38ca50f · nebuly-ai/nebullvm · GitHub

and kyleliang919: GitHub - kyleliang919/forward_forward_gpt: Using the forward forward algorithm to train large language model
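For anyone who wants to experiment with FF on text while proper repos are scarce, one ingredient is just the positive/negative data construction. Here is a rough, hypothetical sketch (the function name and parameters are my own, loosely inspired by the character-level setup in Hinton's paper): a positive example is a real substring of the corpus, and the matching negative example is the same context with its final character swapped for a random wrong one, so a network can only separate them by using the context.

```python
import random
import string

def make_ff_text_pairs(corpus, context_len=10, n_pairs=4, seed=0):
    """Hypothetical helper: build (positive, negative) string pairs
    for Forward-Forward training on character-level text.

    Positive = a real substring of the corpus.
    Negative = the same substring with its last character replaced
    by a random incorrect lowercase letter.
    """
    rng = random.Random(seed)
    pairs = []
    for _ in range(n_pairs):
        start = rng.randrange(len(corpus) - context_len)
        pos = corpus[start:start + context_len]
        # Pick any lowercase letter that is NOT the true next character.
        wrong = rng.choice([c for c in string.ascii_lowercase if c != pos[-1]])
        neg = pos[:-1] + wrong
        pairs.append((pos, neg))
    return pairs

pairs = make_ff_text_pairs("the quick brown fox jumps over the lazy dog")
```

Hinton's paper also discusses generating harder negative data from the model's own predictions once it is partially trained; this corruption-based version is only the simplest possible starting point.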