Replies: 1 comment
Hi - thanks for the interest! I'm not entirely sure what to recommend: we used to keep a good list of issues with the "contributions welcome" tag, but as JAX has gotten more popular, I think all the low-hanging fruit there has been picked. For the kinds of topics you describe (e.g. neural network primitives), the best approach may be to develop APIs within a companion project. Note that JAX is not a neural network library, and aside from a few convenience functions in jax.nn, neural-network building blocks generally live in companion libraries rather than in JAX itself.

Regarding the geglu activation function: we've typically been reluctant to merge contributions like this. The reason is that the space of activation functions is huge, and unless we draw the line somewhere, jax.nn would grow without bound.
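For readers unfamiliar with the function under discussion: geglu is a GELU-gated linear unit. The sketch below shows what such a helper might look like as a standalone utility, mirroring the split-and-gate convention of the existing jax.nn.glu; the function name and signature here are illustrative, not an actual jax.nn API.

```python
import jax
import jax.numpy as jnp

def geglu(x, axis=-1):
    """GELU-gated linear unit (illustrative sketch, not part of jax.nn).

    Splits ``x`` into two equal halves along ``axis`` and gates the
    first half with the GELU of the second, following the same
    convention as ``jax.nn.glu`` (which uses a sigmoid gate instead).
    """
    a, b = jnp.split(x, 2, axis=axis)
    return a * jax.nn.gelu(b)

# The split halves the size of the gated axis:
x = jnp.ones((2, 4))
print(geglu(x).shape)  # (2, 2)
```

Keeping a utility like this in your own project (or a companion library) is exactly the kind of approach suggested above.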
Hi everyone,
I hope this message finds everyone well. I recently submitted a small contribution adding the geglu activation to jax.nn, and I’d love to continue getting involved in the JAX project. I’m interested in working on ongoing or experimental areas - especially around numerical computing, neural network primitives, or performance improvements.
I’d appreciate any guidance on where to start, or any “first issues” that could help me understand the codebase better. I’m also open to helping with documentation or small fixes to learn the workflow. If any experienced contributors or maintainers could point me in the right direction, that would be great!
My goal is to build toward contributing at the feature level as I gain more experience. Thanks so much for maintaining this incredible project - I’d be really grateful for any direction or mentorship from the community!