Hacker News | analog31's comments

I did something similar. I’m not a developer, but I use programming as a problem solving tool, and have written little apps for limited uses such as controlling a fixture in the factory — stuff that the devs won’t touch. My first language was BASIC on a mainframe, before I had access to a microcomputer.

I was getting sick of Visual Basic and Excel, and besides, my VB license was more than a decade old. So I went “language shopping” by trying out the same two tasks in a whole bunch of languages. And I also let myself be influenced by online discussions, blogs, etc. Between computers at work and at home, I tried out each language on both Windows and Linux. One of the tasks was computational and graphical, the other was controlling a widget connected to USB.

I ended up with Python, and have been loyal to it for 13+ years. Did I make the best choice? I can drum up a list of pro’s and con’s, but it would be based on hindsight.


> I ended up with Python, and have been loyal to it for 13+ years

As another long-time Pythonista: I feel like I would have ended up with it anyway, but I do kinda wish I'd done more of that kind of experimentation around that time.

Certainly I've made mental lists of things I'd change about the language. (Not a lot of overlap with the complaints I hear most often, actually.)


I'd love to see that list; I'm really interested in your perspective, given the time you've put in.

It was kind of a hodgepodge, since I gave myself a couple of rules. First, it had to be free. Second, it had to run on both Windows and Linux. Those were harder constraints in the 2010s.

To give a flavor, I tried Python, C (via GCC), JavaScript, and some higher-level tools like Maxima and Octave. So I was certainly not systematic in my search. And trying Python coincided with a really pleasant and comfortable vacation where I had some blocks of time to play with it in peace.

The devs at my workplace had just jumped onto C#, but it was exactly during the time when C# was a mountainous download, hard to install without a good network connection, and Windows-only. I didn't relish staying dependent on Microsoft. Building a "hello world" app also seemed laborious.

Some of those issues have become meaningless, but here we are. On the other hand the growth of the Python community and ecosystem are hard to dispute.

You can see that among Python, Maxima, and Octave, you've got a REPL and a notebook style interface. At a previous job, I was a heavy Mathematica user. But Python was definitely gaining momentum compared to those other tools.

If I were to issue a complaint about Python, it's that the language has sprawled to the point where it's hard to claim that it's easy to learn unless someone helps you get started with a subset of it.


I'm curious to see the whole list of languages you tried, and the result with each. I suspect it came down to finding a suitable library for each problem (USB, graphics) more than the language itself. But maybe ecosystem is what we need from a language.

This is going to be the year of refunds from the government.

This raises an interesting point. I've speculated that if someone has a hard time expressing themselves to other humans verbally or in writing, they're also going to have a hard time writing human-readable code. The two things are rooted in the same basic abilities. Writing documentation or comments in the code at least gives someone two slim chances at understanding them, instead of just one.

I have the opposite problem. Granted, I'm not a software developer, but only use code as a problem solving tool. But once again, adding comments to my code gives me two slim chances of understanding it later, instead of one.


> I've speculated that if someone has a hard time expressing themselves to other humans verbally or in writing

I don't think they actually have problems with expressing themselves. Code is just a language with a very formal grammar, and if you use that approach to structure your prose, it's understandable too. The struggle is more about mentally encoding non-technical domain knowledge, like office politics or emotions.


That's true. But people have had formal language for millennia, so why don't we use it?

Here's my hunch: formal specification is so inefficient that cynics suspect it of being a form of obstructionism, while pragmatic people realize they can solve a problem themselves faster than they can specify their requirements.


> But people have had formal language for millennia, so why don't we use it?

Unless you mean the mathematical notion of formal languages, we use formal language all the time. Every subject has its formal terms, contracts are all written in a formal way, and specifications use formal language. Anything that really matters, or is read by a large audience, is written in formal language.


I think there’s some of that, but it’s also probably a thing where people who make good tutors/mentors tend to write clearer code as well, and the Venn diagram for that is a bit complicated.

Concise code is going to be difficult if you can’t distill a concept. And that’s more than just verbal intelligence. Though I’m not sure how you’d manage it with low verbal intelligence.


Aside from it being sponsored research, I’m not surprised by the claim.

I blame the GUI. Sure, the GUI did great things for us, but I don’t think it evolved with sufficient attention paid to physical ergonomics. I get massive eye strain headaches when I use software that requires close hand-eye coordination and fine mouse work, such as CAD.

I can spot the CAD and data entry operators in a workplace because they’re wearing carpal tunnel braces.

Anything that's purely text based, such as programming, is massively more ergonomic. I can type while only minimally focusing on the screen, and can often close my eyes or look away. Between typing and keyboard shortcuts, I can minimize mouse use.


I see this frequently. People want data that are organized in some fashion, so they start with a spreadsheet.

The drawback is that spreadsheet cells are a terrible way to convey narrative information. I’ve seen detailed product requirements in spreadsheets with thousands of cells, that failed to capture what the team actually wanted to build, and were never read.


One simplistic way is to repeatedly add a small constant to a large integer, and generate the waveform from the most significant bits. A "cent," which is 1/100 of a semitone, is a frequency ratio of about 580 parts per million, so you can work out the precision needed for the constant. On a microcontroller, you can control the timing with a PWM, which runs independently of the processor and its timing foibles.

Proof is left as an exercise to the student. ;-)
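The "add a small constant to a large integer" trick above is the classic phase-accumulator (DDS) approach. Here's a minimal sketch in Python; all names and parameter choices (32-bit accumulator, 8-bit wavetable, 48 kHz rate) are mine, purely illustrative, not from the original comment:

```python
import math

SAMPLE_RATE = 48_000   # Hz; illustrative choice
ACC_BITS = 32          # width of the phase accumulator
TABLE_BITS = 8         # most significant bits index the wavetable
TABLE_SIZE = 1 << TABLE_BITS

# One cycle of a sine wave, scaled to signed 16-bit.
wavetable = [int(32767 * math.sin(2 * math.pi * i / TABLE_SIZE))
             for i in range(TABLE_SIZE)]

def tuning_word(freq_hz):
    """The 'small constant' added each sample: freq * 2^N / fs."""
    return round(freq_hz * (1 << ACC_BITS) / SAMPLE_RATE)

def generate(freq_hz, n_samples):
    """Accumulate phase; the top bits pick the wavetable entry."""
    acc = 0
    step = tuning_word(freq_hz)
    mask = (1 << ACC_BITS) - 1
    out = []
    for _ in range(n_samples):
        acc = (acc + step) & mask  # wraps like hardware integer overflow
        out.append(wavetable[acc >> (ACC_BITS - TABLE_BITS)])
    return out

# Frequency resolution is fs / 2^N, far finer than a cent (~580 ppm of
# the note frequency) anywhere in the audio band.
resolution_hz = SAMPLE_RATE / (1 << ACC_BITS)
```

With a 32-bit accumulator the step size resolves to roughly 10 microhertz, so tuning error from the constant itself is negligible; in practice the limiting factor is the clock source, as the replies below note.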


> On a microcontroller, you can control the timing with a PWM, which runs independently of the processor and its timing foibles.

That is not really true. You usually have a couple of clock sources on an MCU, but the clock propagates down the clock tree from the source, and most of the time the PWM shares its source clock with the CPU. If you tap the clock before the PLL you get less jitter, but the overall drift is the same. You can have distinct clock sources, but that requires specific hardware and a specific configuration.


Is it enough to have an audible effect? We're not talking cesium-clock levels of stability here. Now my curiosity is piqued; I have to figure out a way to measure this.

Indeed, and another factor is that a fingered note has a different tone quality.

Disclosure: String player.


And the thicker strings sound a bit different as well.

And the fingering for a given melody may just lay across the strings better one way than another.


Or Europeans before potatoes.

Or peppers. Hungary without paprika!

Not to mention India without the spicy peppers...

I heard turnips used to be all the rage.

Although it should be noted that modern turnip varieties are significantly more flavorful and sweet than pre-Columbian-exchange-era turnips. The old varieties were usually very bland, so it didn't take much for another crop to displace them.

This could be an age thing. I’m 62. I didn’t know there was such a thing as an em dash until I was nearly finishing grad school. My buddy had an Apple Mac and was up to date on typography, and told me about em dashes. I ignored him and have continued to use double hyphens — all the way up to this point where my iPad seems to convert them into em dashes.

At first glance, the thread title made me think this was going to be about the halting problem.
