From my point of view, there is still too much churn around automatic differentiation libraries in Julia.
There is Zygote.jl [1], which is what Flux.jl [0] uses, but it is largely in maintenance mode. Diffractor.jl [2] was hyped at some point but never really took off. And now people are hyping Enzyme.jl [3].
But as a user, it's not clear to me what I should actually do to make my code differentiate well with these libraries.
If you stick with PyTorch, JAX, or TensorFlow, everything seems to just work as far as AD is concerned.
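To make that concrete, here is a minimal sketch (assuming current Zygote.jl) of the kind of thing that trips you up: plain functional-style code differentiates fine, but in-place mutation, which is idiomatic in performance-oriented Julia, is not supported by Zygote.

```julia
using Zygote

# Functional style: differentiates fine.
f(x) = sum(x .^ 2)
gradient(f, [1.0, 2.0, 3.0])   # ([2.0, 4.0, 6.0],)

# Mutating style: semantically identical, but Zygote errors on it.
function g(x)
    y = zero(x)
    y .= x .^ 2                # in-place write; Zygote throws
    return sum(y)              # "Mutating arrays is not supported"
end
# gradient(g, [1.0, 2.0, 3.0])  # errors
```

So the same mathematical function is or isn't differentiable depending on how you wrote it, and the rules differ again for Enzyme.jl, which does handle mutation. That's the kind of thing you rarely have to think about in JAX or PyTorch.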
[0]: https://github.com/FluxML/Flux.jl

[1]: https://github.com/FluxML/Zygote.jl

[2]: https://github.com/JuliaDiff/Diffractor.jl

[3]: https://github.com/EnzymeAD/Enzyme.jl