Yes. I know it's possible because this is how I work.
It is unfortunate that avoiding "common code" sometimes involves more effort than just accepting it, but in my experience, once you have invested the time to free yourself from the "common code", it is gone forever. I find this to be a very liberating feeling.
The idea of "common code" that the user or developer "must" accept is, I think, seen on many levels.
From CPUs that include dozens of instructions no program of mine will ever use,
to computers of all sizes that are pre-loaded with crapware I will never use,
to operating systems that come with numerous drivers, libraries and programs I will never use,
where the libraries themselves include dozens of functions never used,
to IDEs that would give me heaps of code I will never use,
to programs themselves, often loaded with features I will never use. I'm sure there is more but you get the idea.
Are there costs to "common code"?
For example, I have seen gratuitous features increase attack surface and make programs less secure. At the least these gratuitous features make the programs more complex.
One cost, I would argue, is time, which I would also argue is the world's most valuable asset. But then I also have to invest time to avoid "common code".
I also see the "common code" problem in documentation: manuals running to hundreds or thousands of pages where the needed information could be conveyed concisely in a few paragraphs.
What is behind this decision to always deliver "the kitchen sink"? Or maybe there is nothing behind it? As crypt1d says, it is amusing.
I recall many years ago various attempts at obtaining small command line utilities from Microsoft for working with Windows. In each case the utility was only a few KB. Yet obtaining the program always involved downloading a "kit" of several hundred MB or, in more recent years, several GB. Is this intentional? Mere oversight? What do you think?
Interestingly, the "common code" problem does not appear to have gained a foothold in the context of information retrieval. Even though users today have ample storage space and processing power to handle bulk data, in fact as much data as they will ever access in their lifetime, they are not usually presented with an option to download "the kitchen sink".
For example, I can download all of Wikipedia and load it into a database using my database software of choice, after which all my queries are local.
Or I can make each query across the open internet into a third party's database software of choice.
Obviously, the latter approach might be more desirable to the third party, as they can record all the queries. But the former "kitchen sink" approach is more appealing to me, since the querying is faster and more reliable.
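To make the local-query idea concrete, here is a minimal sketch in Python. It is not *the* way to do it, just one of many: a real Wikipedia dump is a multi-gigabyte XML file from dumps.wikimedia.org, and SQLite stands in for whatever database software one prefers. A tiny inline XML fragment substitutes for the dump so the idea is self-contained:

```python
# Sketch of "download once, query locally". The inline fragment below
# mimics the <page>/<title>/<text> shape of a MediaWiki dump; a real
# run would stream-parse the actual dump file instead.
import sqlite3
import xml.etree.ElementTree as ET

DUMP_FRAGMENT = """<mediawiki>
  <page><title>Cat</title><text>Small domesticated felid.</text></page>
  <page><title>Dog</title><text>Domesticated descendant of the wolf.</text></page>
</mediawiki>"""

conn = sqlite3.connect(":memory:")  # use a file path for a persistent local copy
conn.execute("CREATE TABLE pages (title TEXT PRIMARY KEY, body TEXT)")

for page in ET.fromstring(DUMP_FRAGMENT).iter("page"):
    conn.execute("INSERT INTO pages VALUES (?, ?)",
                 (page.findtext("title"), page.findtext("text")))

# From here on, every query stays on my machine; no third party sees it.
row = conn.execute("SELECT body FROM pages WHERE title = ?", ("Cat",)).fetchone()
print(row[0])
```

Once the one-time loading cost is paid, lookups are a local index hit rather than a round trip across the open internet, which is exactly the trade-off described above.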