
I'm not completely sure that you aren't joking. Maybe for a mathematician the latter would be "very readable"?


Indeed it is (though slight improvements aren't excluded). I would love to be able to write things like MᵀA (without the HTML-markup rubbish, a flavour of the same problem) for two matrices directly as code (can Julia do that?). Instead, I've been forced to learn row and column precedence in OpenGL, DirectX, NumPy, and whatever else, paying attention to it in endless constructs.

Even after 25 years in the business, it's still a pain.
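For what it's worth, NumPy already gets close to that: `M.T @ A` reads almost like MᵀA on the page (and Julia's postfix adjoint lets you write `M' * A`). A minimal sketch, with made-up matrices just for illustration:

```python
import numpy as np

# Two small made-up matrices, purely to illustrate the notation.
M = np.array([[0.0, 1.0],
              [2.0, 3.0]])
A = np.array([[4.0, 5.0],
              [6.0, 7.0]])

# M.T @ A reads almost like the mathematical MᵀA,
# with no row/column-precedence bookkeeping by hand.
result = M.T @ A
print(result)
```

The `@` operator (matrix multiplication) has been in Python since 3.5, precisely so that chained linear-algebra expressions stop looking like nested function calls.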


What about everything that isn't yet represented by widely-adopted mathematical notation? I'd hate to have to learn which Greek letters correspond to which operations every time I dive into a new codebase. The advantage of English words is that everyone knows them.

Edit for details: `Σλ∀φ` is a chore to read for me. These letters only make sense if I've absorbed the author's notation beforehand, like in a math proof. I parse the snake case version instantly, because my brain is trained for that. And I understand it much more quickly, because it says exactly what the function does in the way I would represent it in memory.


We’re talking in a context of scientific software here.

> I'd hate to have to learn which Greek letters correspond to which operations every time I dive into a new codebase.

You have to do that anyway, because you’ll have to connect the code you’re working on with the scientific paper describing it. When the code uses variable names too far removed from the mathematical symbols, you have to take two steps: figure out how the words map to the symbols, and then figure out the symbols. This is especially difficult for a mathematician/scientist without a strong coding background: they’ll have much less friction reading code that matches the symbolic notation and Greek letters they’re used to.

> The advantage of English words is that everyone knows them.

Not when it comes to “math” words.

> `Σλ∀φ` is a chore to read for me.

Right, but that’s because you have a different background. For me, `Σλ∀φ` is much easier to read and understand. More importantly, symbolic notation is much denser, which makes it possible to parse long expressions that would be very hard to understand written out in words.

Again, this is for the very specific context of highly mathematical/scientific software that Julia excels at and is primarily used for. In a more general context (when the software isn’t a direct representation of a scientific paper), I’m 100% on board with good, descriptive variable names.
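To make the "matches the paper" point concrete: even Python 3 accepts Greek letters as identifiers, so a formula can mirror the paper's μ and σ directly. A minimal sketch with made-up values (the function name and numbers are illustrative, not from any real codebase):

```python
import math

# Greek letters are valid Python 3 identifiers, so code for a
# Gaussian density can use the same μ and σ as the paper.
μ = 2.0   # mean (made-up value)
σ = 0.5   # standard deviation (made-up value)

def gaussian(x, μ, σ):
    # Density of the normal distribution N(μ, σ²), written
    # close to its textbook form.
    return math.exp(-((x - μ) ** 2) / (2 * σ ** 2)) / (σ * math.sqrt(2 * math.pi))

print(gaussian(2.0, μ, σ))  # value at the peak, x = μ
```

Whether `gaussian(x, μ, σ)` or `gaussian(x, mean, std_dev)` reads faster depends entirely on which notation your eye is trained on, which is the whole disagreement in this thread.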


How about single-letter Chinese labels? Because that is my background and that is what I am comfortable using.


If that’s what’s most appropriate for the field you’re working in, sure, why not?

Of course it will limit who will be able to interact with your project, which can be a good thing or a bad thing. For math Unicode in numerical software, you may not even want someone without at least a minimal math background (enough to understand Greek letters and math symbols) working on the code. Likewise, a project that’s inherently Chinese, where it doesn’t make sense for the users/developers not to know Chinese, should feel absolutely free to use Chinese symbols in its source code.

On the other hand, if you do it gratuitously, you just unnecessarily limit collaboration. I’m ok with that, too, personally: it’s your own foot you’re shooting.


I also don't see Chinese symbols as fundamentally different from non-English ASCII. It's pretty common for physics bachelor students to still name their variables and write their comments in, e.g., German (my teaching background). I'll push them to kick that habit pretty quickly and switch to English, but this is no different from them writing their bachelor thesis in German and then switching to English for the master's and PhD: there's nothing really wrong with it, but you won't have any reach within the scientific community if your work is not in English.

There's really nothing wrong with beginners starting out in their own language. Why shouldn't a 14-year-old Chinese kid write their first programs using Chinese characters as identifiers? I'd much rather have a language support full Unicode the way Julia does than force everyone into ASCII.


As a mathematician by training (even with a now almost full career in professional software development), it's simply one of the handful of levels of abstraction I like to switch between. Much more concise than usual code. I've done all these things, from massaging bits down at machine-code level to designing systems at the other end. Math has its place there, and if the toolbox supports it, I'll consider using it.

I remember that one of the first things I did when learning C++ was to implement a matrix library, operator overloading and all (oh, how naive when it comes to getting it right!).
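The same exercise is a few lines in Python thanks to `__matmul__`, which is what makes `M.T @ A` possible for user-defined types too. This is only an illustrative toy (class and names are made up, and it does none of the error checking a real library needs):

```python
class Matrix:
    """Toy dense matrix, just enough to overload @ and transpose."""

    def __init__(self, rows):
        self.rows = rows

    def __matmul__(self, other):
        # Enables `A @ B`: dot each row of self with each column of other.
        cols = list(zip(*other.rows))
        return Matrix([[sum(a * b for a, b in zip(row, col))
                        for col in cols]
                       for row in self.rows])

    @property
    def T(self):
        # Enables `M.T`: transpose by unzipping the rows.
        return Matrix([list(r) for r in zip(*self.rows)])

M = Matrix([[1, 2], [3, 4]])
A = Matrix([[5, 6], [7, 8]])
print((M.T @ A).rows)
```

Getting this *right* (shape checks, numeric stability, performance) is exactly the part that is naive to underestimate, which is presumably the parenthetical's point.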


This is a valid question. Others have provided much better answers; the thing I want to add is that being able to express computation in software the same way as in research papers makes the code much easier to read and reason about. Mathematical notation was invented a long time ago to express computations concisely, and today you will see a lot of math built on top of those abstractions (since they’ve been around for so long).

So the ability to express computation in standard mathematical notation, rather than having to invent pseudo-symbols for it, makes code much easier to read for people who already have that training. And for people who don’t… it does require pre-reading to understand how the symbols are used, but you have the benefit of hundreds of years of math literature and community to look it up!


> `Σλ∀φ` is a chore to read for me. These letters only make sense if I've absorbed the author's notation beforehand, like in a math proof.

It's a hell of a lot better than reading `Sigma`, `lambda`, etc., and having to do the verbose English-to-character translation in your head while also trying to understand the mathematical form. At the end of the day, you do need to understand the math in order to understand mathematical code, and having as few translation steps as possible really helps.



