Although JAX is faster than other frameworks thanks to JIT compilation, another strong point of JAX is its intuitive autograd.
This project will exploit that advantage of JAX autograd by reproducing Neural ODE and Neural SDE, which are well suited to it.
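As a taste of why JAX fits here, below is a minimal sketch (not the project's code; the vector field and loss are made up for illustration) of differentiating *through* an ODE solve with `jax.grad`, using the built-in `jax.experimental.ode.odeint`:

```python
import jax
import jax.numpy as jnp
from jax.experimental.ode import odeint


def dynamics(y, t, theta):
    # Toy parameterized vector field: dy/dt = -theta * y
    return -theta * y


def loss(theta):
    y0 = jnp.array([1.0])
    ts = jnp.linspace(0.0, 1.0, 10)
    # Integrate the ODE from t=0 to t=1
    ys = odeint(dynamics, y0, ts, theta)
    # Squared final state as a toy loss
    return jnp.sum(ys[-1] ** 2)


# The gradient w.r.t. theta flows through the whole ODE solve.
# Analytically y(1) = exp(-theta), so the loss is exp(-2*theta)
# and d/dtheta at theta=2 is -2*exp(-4) ≈ -0.0366.
g = jax.grad(loss)(2.0)
print(g)
```

This is exactly the kind of gradient computation a Neural ODE training loop needs, so having it work out of the box is a big part of the appeal.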
Please feel free to share any opinions about this project.
Some of you might think this project requires mathematical understanding and is harder than the others.
However, there are plenty of materials that can help you understand the mathematical background, such as
- the implicit-layers-tutorial at ICML'20
so do not hesitate!
Current team members:
Awesome! The intuitive autograd of JAX would be very useful when implementing Neural ODE or SDE.
I would like to contribute to the project.
This project seems very interesting. Count me in if there’s still space~!
This sounds like a very interesting project! I’m happy to finalize this project even though we probably cannot help you out very much since Transformers doesn’t support ODE or SDE.
@sw32-seo do you think enough code is available online to finish this project within 10 days?
@patrickvonplaten Yes, of course
Neural ODE and SDE papers above have Flax/JAX code that we can refer to.
So I think the project can be done within 10 days.
This project looks perfect for experiencing the strengths of the JAX framework. I would like to lend a hand on this project.
That’s great - very exciting! Think the Flax authors will like this as well
I am very tempted to work on this too because I love maths, though I had also suggested transGAN for art generation in the chat. Can I work on this too? In any case, I would love to get resources to understand this better.
@Grishma I’m glad to have you on the team!
Hi, I think this is a very interesting project. But I am afraid I do not know much about Neural ODEs. Hoping to learn about them and contribute to this project.
I am quite interested in working on this project. However, my imposter syndrome is kicking in and I am also not very familiar with Neural ODE and SDE. I would love to understand them better and contribute my best. So is it possible to join the team?
I’d like to join this team! Haven’t read the papers before, but math is .
ibraheemmoosa, weierstrass-walker, dom-miketa: all of you are welcome to the project!
Announcement for all members!
As the number of team members is larger than I expected, I'm going to refine our goal and find a way to split the work before the last talk on the 3rd of July.
Until then, please read the two papers I linked above, and feel free to ask questions in the Discord channel.
Sure, will read the papers, but I am not able to connect to the Discord channel for some reason.
I had to independently join Flax-HuggingFace-Community-Week and then the link just points to our (misnamed) channel.
Thanks for adding me!
Thanks for your help
Sorry, I copied the wrong link.