Hacker News

Sadly a problem with any wrapper function is that it nullifies this kind of information. Use functools.wraps.
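For the unfamiliar, a minimal sketch of what functools.wraps preserves (the decorator and function names here are just for illustration):

```python
import functools

def log_calls(func):
    @functools.wraps(func)  # copies __name__, __doc__, __module__, etc., and sets __wrapped__
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@log_calls
def greet(name: str) -> str:
    """Return a greeting."""
    return f"hello, {name}"

print(greet.__name__)  # greet  (would be "wrapper" without functools.wraps)
print(greet.__doc__)   # Return a greeting.
```

Because __wrapped__ is set, tools like inspect.signature() and IDE autocomplete can recover the original parameter list through the wrapper.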


PyCharm usually figures this out if it's not too complex. I often wrap session.request() with some defaults/overrides, and autocomplete usually shows me the base arguments as well.


My question is: can @wraps wrap more than one function?

Maybe in some use cases people need to merge two functions into one; I don't know if it can handle that situation.


I'm not sure what it means to "merge two functions into one", can you elaborate?

If you are referring to a type signature for a function that passes through its arguments to one of two inner functions, each of which has a different signature, such that the outer signature accepts the union of the two inner signatures, well ... you could achieve that with ParamSpecs or similar, but it would be pretty hard to read and quite indirect. Better, I'd say, to manually express appropriate typing.Union (|) annotations on the outer function, even if that is a little less DRY.


> I'm not sure what it means to "merge two functions into one", can you elaborate?

I'm not OP, but I see this pattern often enough:

    def foo(**kwargs):
        pass

    def bar(**kwargs):
        pass

    def wrapper(**kwargs):
        foo(**kwargs)
        bar(**kwargs)


Yup, this exactly.


What would you have wraps() do for two functions? Concatenate the docstrings and take the union of the annotations? Perhaps you want only the latter; that seems like it could easily be its own decorator: `@functools.annotate_intersection(f1, f2, …)`


That makes sense. If you have two functions with identical type signatures, then you should be able to invoke @functools.wraps (or its underlying functools.update_wrapper) such that the annotations are propagated correctly. In this case, that might be as simple as "functools.wraps(foo, assigned=('__annotations__',))".
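A minimal runnable sketch of that annotation-only copy (the function names here are illustrative, not taken from the parent code):

```python
import functools

def foo(x: int, y: str) -> None: ...

def combined(*args, **kwargs):
    pass

# Copy only the annotations from foo onto combined; skip __name__/__doc__/etc.
functools.update_wrapper(combined, foo, assigned=('__annotations__',), updated=())

print(combined.__annotations__)  # foo's annotations: x -> int, y -> str, return -> None
print(combined.__name__)         # still "combined"
```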

The rest of what "wraps" does isn't really applicable to what you're trying to do, though: propagating __doc__, __name__ or __module__ etc. isn't really meaningful when combining two functions with different names, as toxik points out.

Coming at it from the other side, you can use typing.ParamSpec to write a higher-order "wrapper()" that wraps two arbitrary functions with identical signatures, like this:

    S = typing.ParamSpec('S')
    R = typing.TypeVar('R')

    def wrapper(func1: typing.Callable[S, R], func2: typing.Callable[S, R]) -> typing.Callable[S, typing.Tuple[R, R]]:
        def inner(*args: S.args, **kwargs: S.kwargs) -> typing.Tuple[R, R]:
            return func1(*args, **kwargs), func2(*args, **kwargs)
        return inner

You can additionally use typing.Concatenate to indicate additive modifications to those signatures if the two inner signatures aren't the same.

However, what I thought you meant originally is the common problem of deduplicating large/complex function type signatures so you don't have to write them out multiple times or risk drift. In the parent post, consider what would happen if the signatures of "wrapper", "foo", and "bar" were all a) large and b) the same/very similar. Python's answers to that problem are much weaker:

1. Duplicate them and write a unit test that ensures that the __annotations__ property of the functions that should have the same signatures remain in sync (or sub/supersets of each other, or whatever you'd prefer). This addresses drift, but requires a testing system and doesn't save the duplicate code.
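Option 1 might look something like this (the function bodies and the exact sync criterion are illustrative; comparing inspect.signature also catches drifting defaults, not just annotations):

```python
import inspect

def foo(x: int, y: str = "a") -> None: ...
def bar(x: int, y: str = "a") -> None: ...
def wrapper(x: int, y: str = "a") -> None:
    foo(x, y)
    bar(x, y)

def test_signatures_in_sync():
    # Compare full signatures (parameter names, annotations, defaults).
    s = inspect.signature(foo)
    assert inspect.signature(bar) == s
    assert inspect.signature(wrapper) == s

test_signatures_in_sync()
```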

2. A hack: use object constructors for functions. Write the in-common parts of the signature in the constructor of a parent class, and then put the body of "foo" and "bar" in the __init__ methods of child classes whose instances are not used/are useless, taking advantage of the superclass relationship to indicate to typecheckers that the parameters are all shared. For example, many typecheckers will do the right thing when handed code like this:

    class _Super:
        def __init__(self, arg1: SomeType, arg2: SomeOtherType, ...):
            self.arg1, self.arg2 = arg1, arg2

    class foo(_Super):  # Bizarre casing is intentional; this is not meant to be used as an instance
        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            ...  # Business logic of the old "foo()" function goes here, using object fields self.arg1 etc. instead of named variables.

    class bar(_Super):  # Bizarre casing is intentional; this is not meant to be used as an instance
        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            ...  # Business logic of the old "bar()" function goes here, using object fields self.arg1 etc. instead of named variables.

    class wrapper(_Super):
        def __init__(self, *args, **kwargs):
            foo(*args, **kwargs)
            bar(*args, **kwargs)

This does solve the duplication problem, and you can use dataclasses with __post_init__ methods for your "foo"/"bar" business logic to smooth out some of the boilerplate and weirdness there, but it remains a very bizarre coding style which substantially trades away readability/familiarity (and performance, if this is on a very hot path) in return for type-checker-friendliness.

3. Use a databag object (ideally an immutable slotted class, dataclass, typing.NamedTuple, [c]attrs, or something of that sort) to encompass all the data that would previously go in your argument signature, so that "wrapper", "foo", and "bar" all end up taking a single such object as their sole argument and accessing fields of that argument to do their work. This is probably (maybe? Lots of Scotsmen in this area...) the most traditionally Pythonic of these options, but is still a far cry from the convenience of something like functools.wraps.



