The only language I have worked with that realizes that, in the real world, simple is _always_ a lie is Common Lisp. It is the only language that has actually embraced the fact that its designers/committee were not geniuses, and that provides the tools for dealing with the complexity of the system. When Unix tools fail, hope that you are on a system where it is possible to get the symbols and/or the source code, and even then good luck fixing things in situ. Most existing systems do not empower the consumer of code to do anything when it breaks. CL remains almost completely alone here, because the debugger is part of the standard. Most of my code is written in Python, and I can tell you for a fact that when Python code really fails, e.g. in a context with threading, you might as well burn the whole thing to the ground: it will take days to resolve the issue, so you roll back the code and start over. The fact that people accept this as a matter of course is pure insanity, or a sign that most programmers are in an abusive relationship with their runtime environment.
I disagree about Python. First, you can code a "manhole" into your program, which lets you evaluate arbitrary strings as Python code at runtime -- basically a shell available via a socket.
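To make that concrete, here is a minimal sketch of such a manhole using only the stdlib. The function name and protocol (one expression per line, repr back) are my own invention for illustration; real implementations (e.g. the `manhole` package on PyPI, or Twisted's manhole) run a full REPL and add authentication.

```python
import socket
import threading

def start_manhole(host="127.0.0.1", port=0):
    """Minimal debug "manhole": a daemon thread that accepts one
    connection at a time and evaluates each received line as a
    Python expression, sending back its repr.  Returns the bound
    (host, port).  This executes arbitrary code, so bind it to
    localhost only and never expose it on an untrusted network."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((host, port))
    server.listen(1)

    def serve():
        while True:
            conn, _ = server.accept()
            with conn:
                reader = conn.makefile("r")
                writer = conn.makefile("w")
                for line in reader:
                    try:
                        result = repr(eval(line, globals()))
                    except Exception as exc:
                        result = "error: %r" % (exc,)
                    writer.write(result + "\n")
                    writer.flush()

    threading.Thread(target=serve, daemon=True).start()
    return server.getsockname()
```

Connecting with `nc 127.0.0.1 <port>` then gives you a line-at-a-time eval loop inside the live process.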
Second, you don't even need that. Gdb with some tooling (see pyrasite) lets you attach to an arbitrary Python process and evaluate Python code in its context.
Sort of. I have a bad habit of forgetting Smalltalk in these kinds of conversations, possibly because Smalltalk images exist in their own happy little worlds. I love working in Smalltalk environments; it is always a mind-blowing experience. The system is completely homogeneous, accessible, introspectable, modifiable, etc. -- until you hit the hard boundary between the image and the host system, be it hardware or software. That boundary leaves quite a gap that Smalltalks tend to be unable to fill by themselves (mostly due to a relative lack of resources). I can imagine a Smalltalk system that could go all the way down to modern assembly, but unfortunately no such system exists today (that I'm aware of). If you can live inside the image, then yes, Smalltalk is possibly even better.
> when python code really fails, e.g. in a context with threading, you might as well burn the whole thing to the ground
This sounds weird. Why burn the whole thing to the ground? You've got frames from all threads available. Why would it take days to resolve the issue? Why do you think it's easier to resolve in Common Lisp?
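The "frames from all threads" claim is literal: the interpreter exposes every thread's current frame via `sys._current_frames()`. A small sketch (the helper name is mine):

```python
import sys
import threading
import traceback

def dump_all_threads(file=sys.stdout):
    """Print a stack trace for every live thread.  The interpreter
    keeps each thread's current frame reachable through
    sys._current_frames(), even for threads blocked on a lock."""
    names = {t.ident: t.name for t in threading.enumerate()}
    for ident, frame in sys._current_frames().items():
        print("--- thread %s (%d) ---" % (names.get(ident, "?"), ident),
              file=file)
        traceback.print_stack(frame, file=file)
```

The stdlib's `faulthandler.dump_traceback()` does much the same thing, and `faulthandler.register()` can wire the dump to a signal so you can poke a wedged process from outside.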
I think I need to unpack what I mean by 'really fails' to capture what I was trying to convey. I deal with Python programs running in a number of different environments, and there are some where literally all you have on hand are the libs you brought with you. Maybe that is an oversight on my part, but the reality is that in many cases this means that I am just going to restart the daemon and hope the problem goes away, I don't have the time to manually instrument the system to see what was going on. I shudder to imagine having to debug a failure from some insane pip freeze running on a Windows system with the runtime packaged along with it.
Worst case for CL means that at the very least I don't have to wonder if gdb is installed on the system. It provides a level of assurance and certainty that vastly simplifies the decision making around what to do when something goes wrong.
To be entirely fair, the introduction of breakpoint() in 3.7 has simplified my life immensely -- unless I run into a system still on 3.6. Oops! I use pudb with it, and the number of uncovered, insane, and broken edge cases when using it on random systems running in different contexts is one of the reasons I am starting no new projects in Python. When I want to debug a problem that occurred in a subprocess (because the GIL actually is a good thing), there is a certain perverse absurdity in watching your keyboard input go to a random stdin, so that you can't even C-d out of your situation. Should I ever be in this situation? Well, the analogy is trying to use a hammer to pound in nailgun nails and discovering that doing so opens a portal to the realm of eternal screaming -- a + b = pick your favorite extremely nonlinear, unexpected process that is most definitely not addition. You can do lots of amazing things in Python, but you do them at your own peril. (Disclosure: see some of my old posts for similar rants.)
> The only language that I have worked with that realizes that in the real world simple is _always_ a lie
This feels like an excuse to me. I've worked on a lot of rather simple web apps that all more or less do the same stuff. A few of them have managed to have delightfully simple codebases; most of them haven't. There's no reason that couldn't be true for all of them. You usually end up with at least some complexity, but small complexity trade-offs don't necessarily require you to undermine the simplicity of the entire system.
One of the traditional criticisms of Lisp, though, is that it lets programmers re-introduce a whole lot of accidental complexity in their Lisp code, and, worse, everyone introduces a completely different set of accidental complexities into their code.
As a programming language, Common Lisp is large enough and multi-paradigm enough to allow for elegant solutions to problems. It does require some experience with the language and some wisdom and discipline to know what pieces to select and how to best use them.
However, as with all large, multi-paradigm programming languages that have been around for a while (I'm looking at you, C++), programmers tend to carve out their own subsets of the language, which are not always as well understood by those who come after them, particularly as the language continues to evolve and grow.
There is also the problem where programmers try to be too clever and push the language to its limits or use too many language features when a simpler solution would do. All too often we are the creators of our own problems by over-thinking, over-designing, or misusing the tools at hand.
>> if the language is not powerful enough people will inevitably add preprocessors, code generators, etc... to do the things they want.
This is definitely true and it adds to the accidental complexity of the system, usually to save programmer time or implement layers of abstraction for convenience.
Both Common Lisp and C++ have incorporated preprocessors and code generators: Common Lisp through its macro system, and C++ through template metaprogramming and the preprocessor/macros. These features give the programmer metaprogramming powers, enable the creation of domain-specific languages, implement sophisticated generics, etc.
They are powerful language facilities that need to be used wisely and judiciously, or they can add enormous accidental complexity and make the system much more difficult to understand, troubleshoot, and maintain.
No language can prevent someone from exercising bad taste, but a language can prevent someone from exercising good taste.
Some languages attempt to discourage bad taste by limiting the power given to users. Common Lisp embraces the expression of good taste by giving users considerable power.
This is a valid criticism, but it's not lisp specific.
We know somewhat how to control accidental complexity in systems that are highly constrained and don't let you deal well with essential complexity. And we know somewhat how to give you powerful tools to deal with essential complexity.
We haven't figured out in general how to make systems powerful enough to deal with the range of essential complexity, but constrained enough that this sort of accidental complexity isn't common.
Of course a lot of real world systems don't do a great job of either.
Clojure is simpler than Common Lisp, but I still feel it has too many features that are too complicated, so I keep simplifying: I avoid some of the complex features and insist on writing systems with a pure pipeline structure.
As a result I get the simplest system, but the design process is a substantial project in itself. It is difficult to design a complex system into a simple and smooth pipeline system.
I think "incidental" (non-essential, secondary, happenstance) and "accidental" (non-intentional, happenstance) are more or less synonyms here, not contrasts. There is, uh, basically no significant difference between "incidental complexity" and "accidental complexity".
Care to elaborate on what corresponds to the incidental/accidental complexity in CL? I tried to understand these concepts, but it feels very subjective; whether or not something is accidental is in the eye of the beholder.
Most proponents of Common Lisp swear by Emacs/SLIME as the perfect IDE for it, but that may not be to everyone's taste. Are there alternatives that are just as good?
There was a better alternative: Lisp machines. But in today's world, there's nothing as good as Emacs/SLIME. You can get most of the way there by connecting another editor to Lisp with an extension that speaks SLIME's SWANK protocol (like SLIMV for Vim).
All those complicated lists? Forth is even further down the path to ultimate minimalism. Really only one data structure, a stack with binary values on it...
The fact that I can program in a very high-level language and still have realistic, working tools to inspect and control the machine code being generated, to choose the level or type of optimization, and to change all of that case by case over the lifetime of the application kind of blows my mind (FYI, I am using SBCL).