Mostly Erlang, Julia, and Go, as their communities tend to re-factor dependencies and trend toward a homogeneous distributed ecosystem.
Python is the core issue rather than Numpy specifically, as Numpy was bodged on (via SWIG) to address real use-cases... much like how 30 years of bodged-on GPU mailbox structures biased people toward ridiculous solutions that necessitated ridiculous software paradigms.
I don't think any one person has the will and resources to resolve the core problem. Julia will likely also bind the same nonsense for a while due to compatibility needs, but users tend to re-factor nonsense out of the ecosystem with better native solutions over time.
"All software is terrible, but some of it is useful" =)
I am specifically looking for Numpy-like libraries and designs, not languages that have generally well-designed libraries. For example, Elixir has Nx.
It seems to me that, even given Python's constraints as a language, a nicer wrapper could have been developed in Numpy that still called out to bindings for the heavy work.
> functional compatibility layer for a fundamentally broken language paradigm
What did you mean by this? Is it mainly the fact that Python is a bit of an undesigned language in general, which calls into unmanaged languages like C/C++ and Fortran in its libraries?
"I am specifically looking for Numpy-like libraries and designs"
Why would one bottleneck a design with polyglot stacks even before a single line of code was implemented? This sounds like naive nonsense.
"What did you mean by this? ... Python"
Python was never designed to handle threads or parallelism properly, and has performance issues that Numpy tries to address through its wrapped C/C++ libraries.
Python has become 30 years of spiral development, and implodes into a new implementation every so often. Depending on the use-case it may prove appropriate, but never optimal. =)
There are many native packages and wrapper scaffold libraries like Numpy. It will depend on your problem domain, but Julia often transparently supports its broadcast operator on most core data-structure math ("using LinearAlgebra" and the like).
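For comparison, NumPy already gives you the same kind of transparent elementwise behavior over ordinary operators via broadcasting (a minimal sketch; the arrays and shapes here are just illustrative):

```python
import numpy as np

# Broadcasting: ordinary operators apply elementwise, and mismatched
# shapes are stretched automatically by NumPy's broadcasting rules.
row = np.array([1.0, 2.0, 3.0])    # shape (3,)
col = np.array([[10.0], [20.0]])   # shape (2, 1)

grid = col + row                   # broadcasts to shape (2, 3)
scaled = grid * 2                  # a scalar broadcasts over everything
```

Julia's `.` broadcast syntax makes the elementwise intent explicit at the call site, whereas NumPy infers it from the operand shapes.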
Yes, the paper in the original post has a few examples. Julia uses expressions built from the operators of regular arithmetic, which even plain Python has. So, just ordinary operators instead of np.matmul, scipy.linalg.expm, etc.
Matrix multiplication works with the @ operator, e.g. A @ B (or with the * operator if defined as np.matrix, but nobody uses that and probably nobody should). For matrix exponentials you need scipy.linalg.expm, but nothing in Python prevents it from being written as e**M if wanted (or even e^M, though you probably shouldn't). You can even implement it yourself in a few lines.
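As a sketch of that "few lines" claim (the class and variable names here are made up), Python's operator overloading lets `e ** M` dispatch to a matrix exponential via `__rpow__`. This uses a dependency-free truncated Taylor series purely for illustration; `scipy.linalg.expm` would be the production choice:

```python
import math
import numpy as np

class Mat:
    """Hypothetical thin wrapper: makes `e ** A` mean matrix exponential."""
    def __init__(self, a):
        self.a = np.asarray(a, dtype=float)

    def __matmul__(self, other):
        return Mat(self.a @ other.a)   # A @ B already works in plain NumPy

    def __rpow__(self, base):
        # exp(ln(base) * A) via a truncated Taylor series: sum of x^k / k!
        # (scipy.linalg.expm is the robust choice; this keeps the sketch
        # dependency-free and is fine for small, well-conditioned matrices)
        x = self.a * math.log(base)
        term = np.eye(len(x))
        total = term.copy()
        for k in range(1, 30):
            term = term @ x / k
            total += term
        return Mat(total)

A = Mat(np.diag([1.0, 2.0]))
E = math.e ** A   # diag([e, e**2]) for a diagonal matrix
```

The `__rpow__` hook is called because `float.__pow__` returns `NotImplemented` for a `Mat` right operand, so `e ** A` falls through to the wrapper.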
You don't seem to know much about what you're criticizing.
For expm? Matmul as @ is the numpy standard and widely used. If you have something that needs expm so often it has to be an operator, good luck working on a project where people can't understand a one-line class definition.
People often conflate popularity with good design. Have a great Monday =)