Cool! Requests has got to be my favorite Python library. There are so many times I have replaced ~80 lines of urllib2 ugliness with 2-5 lines of requests. Simple, Pythonic, pretty...
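To illustrate the kind of simplification I mean (the endpoint and credentials here are made up, just a sketch): building an authenticated GET with query parameters is a couple of lines in requests, versus wiring up handlers and openers in urllib2.

```python
import requests

# Build (but don't send) a request, to show the API shape:
# params and basic auth are handled declaratively.
req = requests.Request(
    "GET",
    "https://api.example.com/items",  # hypothetical endpoint
    params={"q": "spam"},
    auth=("user", "secret"),
)
prepared = req.prepare()
print(prepared.url)  # https://api.example.com/items?q=spam
```

In real code you'd just call `requests.get(url, params=..., auth=...)` and be done.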
I don't; this would require fundamental changes to the module (such as how it handles SSL certificate verification), and it would cause API improvements to stagnate.
It's painless to install, so I prefer it as a module I can just add to all of my routines.
I feel like stdlib is where cool Python modules go to die...
I've been thinking about your last line. The rest of this comment is musings.
I think argparse is a cool module, and definitely better than optparse or getopt for the types of things I do. Both optparse and argparse were external packages before becoming part of Python. Optparse is dead, but I don't think that argparse is dead.
Some modules have a longer cycle time than Python. For example, the decimal module (based on the 'General Decimal Arithmetic' standard) and the sqlite3 module (based on SQLite).
I think these modules are cool. I don't think either of them are dead. (But on the other hand, the *bdb modules and the underlying implementations are dead.)
Some modules spun off, developed for a while on their own, and merged back into Python. Two examples were 'turtle2' and 'unittest2'. I enjoyed using unittest2, and am glad that it's part of Python 2.7 and newer. I never used turtle, but it was used in some computer classes. The turtle2 author pointed out that some computers are so locked down that the teachers can't change anything, which is why a 'turtle' in the base install can be useful.
Elementtree was and is a standalone package. I use it for nearly all of my XML parsing needs. It has a C extension as an accelerator, which isn't always painless to install, especially if I want a pure-Python distribution.
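For anyone who hasn't used it, the pure-Python API is pleasant enough on its own (the C accelerator just makes it faster; in recent Pythons it's picked automatically). A tiny sketch:

```python
import xml.etree.ElementTree as ET

# Parse a small document and pull out element text.
root = ET.fromstring("<library><book id='1'>Dune</book></library>")
titles = [book.text for book in root.iter("book")]
print(titles)  # ['Dune']
```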
But on the other hand, minidom and the other XML-SIG packages are dead. Then again, I didn't think they were cool. ;)
I looked at what modules were added in 3.2 and 3.3: lzma, faulthandler, ipaddress, argparse, concurrent.futures, unittest.mock... and that's about it.
(Interestingly, PEP 3156 would likely replace concurrent.futures.)
So I think others agree that only the most stable of APIs should go into Python core. But I don't think that all cool Python modules which get into the core die.
I guess I haven't considered restrictions on the machines where the modules are required. So there's definitely a reason for things making their way in to stdlib in those situations. (I wish the virtualenv stuff was packaged as part of core Python, to be honest...but that's a discussion for another time.)
My concern is the frustration (which may not be the right term) of having to continually bring things out of core, implement improvements, and then feed them back into core, all the while being concerned with the impact your changes may have on the entire ecosystem.
All-in-all, maybe it's just moot. This could probably just be chalked up to the ebb-and-flow of software development as a whole.
I might try it in the future, but I don't think it's worthwhile to change my current system. I have internal type strings which look like this "RDKit minPath=3 maxPath=7" and command-line options like "--minPath=3 --maxPath=7".
I ended up making a registry of options which is used to validate the type strings and is used to populate argparse.
I also use custom decoders, like a decoder to enforce that a given value is >=32 and a power of two. A quick glimpse at the docopt docs doesn't show a way to support that, so the validation would need to be rewritten to happen after arg parsing has finished rather than during it.
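For comparison, here's roughly how that kind of decoder looks with argparse's `type=` hook (names are illustrative, not my actual code): validation happens during parsing, and failures produce a normal usage error.

```python
import argparse

def pow2_at_least_32(text):
    """argparse 'type' callable: accept only powers of two >= 32."""
    value = int(text)
    # value & (value - 1) is zero only for powers of two
    if value < 32 or value & (value - 1):
        raise argparse.ArgumentTypeError(
            "%r must be a power of two >= 32" % text)
    return value

parser = argparse.ArgumentParser()
parser.add_argument("--numBits", type=pow2_at_least_32, default=2048)

args = parser.parse_args(["--numBits=64"])
print(args.numBits)  # 64
```

With docopt you'd get the raw string back and have to run the same check yourself after parsing.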
I don't doubt that I could use docopt instead, but the amount of work to migrate doesn't seem worthwhile.
Even Guido van Rossum said "A package in the standard library has one foot in the grave".
Modules that get updated with things people need are cool. Stdlib modules can only be updated once per Python version, with great effort and extreme care not to break anything anywhere. So stdlib modules aren't cool.
This is probably the single thing I miss the most when not using Python. It's a really great example of writing a library that lets the user get right to the semantics they want while simplifying a bunch of mostly unimportant details.
This has prevented me from using the official Jira Python module. I literally just launched something hacky using urllib2 yesterday because of this... :(
I find it better that they stay out of the stdlib. Having to upgrade all of python just to get a new feature like CONNECT support would be unfortunate.
Cool! I did the `Session.prepare_request` change [1] and found the project maintainers extremely helpful and patient in discussing the best way to go about it as I stumbled around. It's a good codebase and an awesome library.
The underlying issue is with Python 2.x (which is feature-frozen and won't get SNI support), but apparently Requests can support SNI if you install PyOpenSSL, ndg-httpsclient, and pyasn1.
I first used requests earlier this year to handle some obscure calls to the Twitter API, and was absolutely astonished by how much work it saved me. Really amazing work.
Requests is one of the best Python modules ever written, and I use it (literally) every day. Good stuff, man!
<3