The glyphs are not a barrier to use or learning. Learning them is no more difficult than learning the keywords and syntax of any other language. The real difficulty lies in thinking in terms of arrays and array operations rather than conditionals and loops.
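To make the shift concrete, here is a minimal sketch of the two styles (my own illustration, using Python/NumPy as a stand-in for an array language; the data and variable names are invented for the example):

    import numpy as np

    prices = np.array([3.0, 5.0, 2.0, 8.0])

    # Loop-and-conditional thinking: visit one element at a time.
    total = 0.0
    for p in prices:
        if p > 2.5:
            total += p

    # Array thinking: one expression over the whole array at once.
    total_arr = prices[prices > 2.5].sum()

    assert total == total_arr

The second version is the shape of solution that array languages push you towards; the glyphs are just a compact spelling of such whole-array operations.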
Perhaps not to you, but they may be for a large number of people who are used to seeing words they recognize. Surely, being able to read a simple program and guess what it means before you have learned the language is a great comfort to people, and makes them feel that yes, they can learn this.
A bunch of incomprehensible glyphs might interest someone as a puzzle-solving challenge, but not everyone will want to solve a puzzle just to start learning a language.
> Perhaps not to you, but they may be for a large number of people who are used to seeing words they recognize. Surely, being able to read a simple program and guess what it means before you have learned the language is a great comfort to people, and makes them feel that yes, they can learn this.
This argument was already used in the past to justify the very verbose syntax of COBOL.
> Looks like we found the Goldilocks zone of programming languages in the years after the first version of COBOL.
Considering the current state of things, calling it the Goldilocks zone of programming languages is like calling the verbose style in which mathematical proofs were written down in the early modern period the Goldilocks zone of writing down mathematical proofs. :-(
Also, mathematicians are far better and far longer trained than the typical programmer.
And it is good both ways: some time ago there was the idea that everybody should learn a little programming (see, for example, the BBC programme in the UK which brought us the BBC Micro and, later, the ARM processor ;-) ), because nowadays almost everything is a computer (including my washing machine). That idea is not popular now; all the corporations try to build non-programmable walled gardens instead. But it was not always so.
COBOL (and SQL) were created with managers, not programmers, in mind. The goal was to let managers automate their business processes without calling in an IT consultant.
Spreadsheets come VERY close to this goal. I've seen A LOT of business and finance automation built in Excel by people who would punch you in the face if you told them that they are now programmers.
If we want to have very specialized, PhD-requiring languages, that is perfectly OK. But then don't try to sell them to spreadsheet users.
> But then don't try to sell them to spreadsheet users.
The problem is rather that "meeting people where they are" in programming language design means keeping these people stupid. Instead, you should market the programming language to spreadsheet users by explaining to them how the new kind of thinking that the language enables lets their intellectual potential explode.
The problem is that many people (including many Excel power users) don't want to "let their intellectual potential explode". A job is trading time for money, not a vehicle for raising self-esteem, nor even something interesting. You could try to play the "you will do more in less time" card, but it is often not true (unfortunately), and in the corporate world the person who does more in less time gets more tasks, not a raise.
My experience shows that programmers love their job and want to get better for the sake of being better far more often than other employees do (and it is considered the norm: think of "show your GitHub profile" during hiring; do you know of any case where an auto mechanic is required to show a custom-built car to get a job at a shop?). And even among programmers it is not a universal rule. I know colleagues (very competent ones, I should say) for whom it is simply a job, not a passion. They don't read bleeding-edge CS papers, they don't have pet projects, and they may not even own a computer at home: a PlayStation/Xbox and a smartphone are enough for them, because the computer is a tool of the trade, not a home necessity.
Programs are not mathematical proofs, and where they are (in systems like Coq), they may indeed be better off with mathematical notation.
Also, all these comparisons with mathematical notation leave out that mathematical notation is not linear and not one-size-fits-all in its typography, as our programs are, even in array languages.
I don't see how I can easily input those glyphs. Selecting them by clicking on a symbol, which seems to be an option in one screenshot, is really out of the question; that's just clumsy. Special key combinations are also something I really dislike: they're too slow. I disable dead keys, for example; I much prefer to be able to enter e.g. '~' directly instead of dead-keying it (as is apparently the "standard" for my national layout).
For me it's much faster and easier to enter a string, a keyword, than to move my hands (or, shudder, the mouse) in strange ways to insert some symbol. I don't need symbols; I need names and words, and those I can enter as fast as I can think.
> The glyphs are not a barrier to use or learning. Learning them is no more difficult than learning the keywords and syntax of any other language.
Of course they are! I can't type them on my keyboard without doing a Google search first.
It's a barrier according to every definition of the word. It's unnecessary friction. It's slower to type when learning. It's slower to read when learning.
Compare that to a language that isn't bragging about how smart you are for knowing it: all those barriers are removed.
In most editors you can type `fn_name to insert a symbol by name, and move on to `s-style single-letter shortcuts once you start using the functions enough. If you need to read, just hover over a symbol and it will give you its name; even without that, the surface area is small enough that you can get acquainted with 90% of the symbols in a week of regular use, compared to "more accessible" languages and all their hidden gotchas.
You mean like how you have to find out how to type accented characters (á or ö, for example)?
The glyphs are available as a keyboard layer which you can easily find and install, or simply enable on Linux.
Yes, there is an initial hurdle when you're learning, but the same applies to +-×÷, and to any language with a different alphabet/symbol set!
Once learnt, however, the ~50-odd symbols cover all the primitive functions, and you then have the speed, and the maths-like ability, to write and play with expressions.
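As a taste of that (my own sketch, using Python/NumPy to mirror the array-language idiom): the classic APL train (+/÷≢), "sum divided by tally", composes three primitive symbols into "average":

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])

    # sum (+/), divided (÷) by the element count (≢):
    avg = np.add.reduce(x) / x.size   # APL: (+/÷≢) x
    print(avg)  # 2.5

Three symbols, one reusable expression you can immediately play with.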
> Yes, there is an initial hurdle when you're learning, but the same applies to +-×÷, and to any language with a different alphabet/symbol set!
The difference is, with any other new programming language I already know those! IOW, it's not a barrier.
> Once learnt, however, the ~50-odd symbols cover all the primitive functions, and you then have the speed, and the maths-like ability, to write and play with expressions.
50? You'll forget them if you don't use them. Still a barrier, even once you're past learning.
Look, I get that there's a speed upgrade, but it's a trade-off, and 50 is way too many for something you'll easily forget through lack of use.
For example, why bother with 26 letters that compose into all the words of the English language? One could simply memorise the most frequent 30,000 words and assign a different rune to each one.
Obviously, we don't do that. The languages that do do that get outcompeted by the ones that don't.
The simple reality is that languages, both human and programming, have to match what the average human is most productive in. It doesn't matter that genius developers are more productive with esoteric runes; their experience is irrelevant because they aren't doing the grunt work that 99.9999% of programming entails.
People who need to type those characters usually have localized keyboards where this is obvious. Besides, those are not special characters but compositions of characters, so adding them to a keyboard is relatively cheap in terms of keys.
Since it's much easier to start learning a language that uses a familiar alphabet, not being able to easily represent the basic characters in your head is a huge barrier to overcome.
> not being able to easily represent the basic characters in your head is a huge barrier to overcome.
In the modern technical world, you are basically surrounded by mathematics; thus a lot of people (in particular nearly all programmers) have already seen many such symbols.
It doesn't matter. If it is pitched as an alternative to spreadsheets, being that much less straightforward is a big downside. And downsides are what actual people weigh when making their decisions.
Why not make it straightforward first, and then introduce the shortcuts once people understand it? Not nerdy enough?
Sometimes I have the feeling that a certain type of programmer doesn't even want others to understand what's going on; what they want is to bend others to their own will and conventions.
Also, I can say from my own experience that studying Vietnamese is much, much, much simpler (at the beginning, at least) than studying Chinese.
Chinese is, theoretically, a simpler language than Vietnamese in terms of grammar structures and the like. But Vietnamese uses the Latin alphabet (augmented with diacritics), and for a Western learner that is a huge help when starting out. Maybe at B2+ level it no longer matters, but when you are trying to study a language from zero it is a blessing (again, for a Western learner).
That’s just true. Keywords like “for”, “while”, “if”, “then”, “else” are helpful mnemonics for English speakers. And they are already well known by non-English-speaking programmers.
So the only audience for which this is not true is non-English speakers who want to learn an array language as their first programming language.
See the recent discussion about the (ir)relevance of literate programming in the modern world, where we have nice, readable identifiers in our code instead of BASIC's single-letter variables, and control structures instead of GOTO hell.
IMHO, it is relevant to the discussion about array languages.
What are you talking about? I don't even know how to _make_ most of those symbols; that isn't true of any of the dozens of programming languages I've used in various capacities. The people disagreeing with the original comment are so deep inside a bubble that I'm worried about them.