> for a trivial example there is an algorithm for "determine if the input number is odd" which is O(1).
I would argue that this particular example cannot be considered a case of arbitrarily sized input: There is only a single meaningful bit of information, which is the single bit accessed.
However, if you must, we can expand to clarify that the input is meant to include arbitrary meaningful bits of information that require processing and cannot simply be ignored.
That's not how this works. What you're doing here is defining away the scope of the problem to make O(1) complexity analysis redundant. A bound of O(1) conventionally means that we can exploit some structural component of the input and the given problem to find a solution independent of the input size. If you redefine input size to the more narrow sense of inputs which don't have some sort of structural feature like that, then yes of course nothing is constant time. But then you're just shifting the difficulty of the problem around without much of a gain.
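To make that conventional sense concrete, here is a toy sketch (function name mine) of an O(1) operation that exploits a structural guarantee of the input, in this case that the list is already sorted:

```python
def max_of_sorted(xs):
    """Return the maximum of a non-empty ascending-sorted list.

    O(1): the sortedness guarantee puts the answer at a fixed
    position, so the cost is independent of len(xs).
    """
    return xs[-1]

assert max_of_sorted([1, 3, 9]) == 9
assert max_of_sorted([-5]) == -5
```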
I see how you may think that, but I would argue that I am not redefining anything, as things turn nonsensical without this assumption.
It is not an input if it does not affect your output (I am sure a better formal definition exists; I usually hear "meaningful", hence my use of that term). Without this restriction, any input could be arbitrarily padded, inflating input sizes and making an algorithm's complexity appear lower.
I find the even/odd example to be an exceptionally clear case of an algorithm that takes a fixed 1-bit input: Its implementation on a little-endian machine does not know the bounds of its input, only the location of the least significant byte (due to byte-addressing).
Without a defined bound, such an implementation must be assumed either to take a single byte as input (the byte read being an implementation detail, even though the algorithm only needs one bit), or to take the full system memory from the specified location regardless of contents (since the word "arbitrary" allows the input to not fit in registers).
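As a sketch of that point (the function name and byte-buffer representation are my own framing): an oddness check over a little-endian buffer only ever touches the first byte, no matter how large the buffer is, so the rest of the input never affects the result:

```python
def is_odd_le(buf: bytes) -> bool:
    """Check oddness of a little-endian integer of any length.

    Only buf[0] is ever read; the remaining bytes never affect the
    result, so they are not "meaningful" input in the sense above.
    """
    return (buf[0] & 1) == 1

# The same one-bit check, regardless of input size:
assert is_odd_le((7).to_bytes(1, "little"))
assert not is_odd_le((2**4096).to_bytes(513, "little"))
```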
However, I admit that I am now entering deeper theoretical waters, with definitional details beyond what I am normally comfortable with. Still, I do not find any other definition that lines up with practice.
> I see how you may think that, but I would argue that I am not redefining anything, as things turn nonsensical without this assumption.
The point I'm making is that this statement reduces to the claim that defining an algorithm's worst case time complexity as O(1) or constant time is nonsensical.
Do you disagree that adding two numbers is a constant time operation?
> Do you disagree that adding two numbers is a constant time operation?
It is constant time by the crypto-programming definition in both theory and practice, and O(1) only in theory (bigints are a pain).
I did not try to claim that O(1) is nonsensical. Rather, I claimed that certain O(1) variable-input algorithms are in fact simply fixed-input algorithms, because they do not actually consider their input.
I also find these "nonsensical" O(1) algorithms to be outliers.
> Do you disagree that adding two numbers is a constant time operation?
I do. A general algorithm for adding two numbers, at least one of which is represented in the conventional way as a series of digits in some base, has a lower bound of Ω(log n).
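A minimal sketch of why (digit-list representation assumed, least significant digit first): grade-school addition does work proportional to the digit count, which is O(log n) in the numeric value n, not O(1):

```python
def add_digits(a, b, base=10):
    """Grade-school addition on digit lists (least-significant digit first).

    The loop runs once per digit position, so the cost grows with
    the number of digits, i.e. logarithmically in the numeric value.
    """
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        da = a[i] if i < len(a) else 0
        db = b[i] if i < len(b) else 0
        carry, d = divmod(da + db + carry, base)
        out.append(d)
    if carry:
        out.append(carry)
    return out

# 472 + 69 = 541, digits written least-significant first:
assert add_digits([2, 7, 4], [9, 6]) == [1, 4, 5]
```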
I think it would be tough to find a suitable definition of "meaningful". When searching for an element in an ordered list, how many items are meaningful? One if it's present, zero otherwise? Always log(n)? Always n?
If "meaningful" is a non-trivial property, as difficult to determine as an algorithmic lower bound, then it's not terribly useful.
And it also leaves us with no way to talk about non-constant time algorithms for determining whether a number is odd. Can we say one runs in exponential time when we only talk about "meaningful" bits of input?
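The search example can be made concrete: a standard binary search inspects only about log2(n) positions on any single run, but which positions those are depends on the query, so no fixed subset of the input is "the meaningful part". A sketch (the instrumentation is mine):

```python
import math

def binary_search(xs, target):
    """Return (index, inspected) for target in sorted list xs.

    index is -1 if absent; inspected records which positions were
    actually compared, to show that only ~log2(n) items are read.
    """
    inspected = []
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        inspected.append(mid)
        if xs[mid] == target:
            return mid, inspected
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, inspected

xs = list(range(1024))
idx, seen = binary_search(xs, 700)
assert idx == 700
assert len(seen) <= math.ceil(math.log2(len(xs))) + 1
```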
> I think it would be tough to find a suitable definition of "meaningful".
I do not believe it is tough to find a suitable practical definition of meaningful: If a piece of information has no effect on the output of the algorithm, regardless of its value, then it is not meaningful.
Examples of meaningful values would be any value in a list being sorted. Non-meaningful values would be any bit other than the lowest bit when determining whether a number is even or odd.
I find it to be an incredibly trivial definition when considering algorithms.
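That definition can be sketched directly as a brute-force sensitivity check (function names mine; exponential in the input width, so illustration only): bit i is meaningful iff flipping it changes the output for at least one potential input:

```python
from itertools import product

def meaningful_bits(f, width):
    """Brute-force which bit positions are 'meaningful' for f.

    Bit i is meaningful iff flipping it changes f's output for at
    least one width-bit input, i.e. it is judged over all potential
    inputs, not one specific input.
    """
    meaningful = set()
    for bits in product((0, 1), repeat=width):
        out = f(list(bits))
        for i in range(width):
            flipped = list(bits)
            flipped[i] ^= 1
            if f(flipped) != out:
                meaningful.add(i)
    return meaningful

# bit 0 = least significant bit
is_odd = lambda bits: bits[0] == 1
equal_halves = lambda bits: bits[:2] == bits[2:]

assert meaningful_bits(is_odd, 4) == {0}                 # only the low bit
assert meaningful_bits(equal_halves, 4) == {0, 1, 2, 3}  # every bit
```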
> If "meaningful" is a non-trivial property, as difficult to determine as an algorithmic lower bound, then it's not terribly useful.
I find the case where the full input is not meaningful to be an outlier that does not at all affect how we normally deal with algorithms and bounds.
However, when the full input and the meaningful input do not match, one can start discussing what the input actually is. I would, for example, argue that an even/odd checker only ever takes a single bit as input. It may consume more due to architectural limitations around bit accesses on modern architectures, but that is not part of the core algorithm.
With this definition, different parts of the input might be "meaningful" for two algorithms that solve the same problem. This makes the size of the "meaningful" part of the input ill-suited for comparing efficiency.
(It also makes the statement "no program can run faster than linear time in the size of the meaningful input" true directly by definition, which makes it a pretty boring statement...)
For an equality operation, all bits are meaningful, as there are input configurations in which any bit can lead to a change in output.
However, I do see your point that whether a bit flip in the input leads to a change in output depends on the particular input. My counter is that it is not the specific input but the set of potential inputs that matters.