
I wish I had the expertise or the time to give a better rebuttal to this, so instead I'll just point out a few things:

First, the 2.4 GHz band is not the best for wireless comms. There were many reasons for opening up that spectrum, and very few of them had to do with it being an optimal signal frequency.

Second, bandwidth. The 2.4 GHz band is only 82 MHz wide. The available white space spectrum is significantly wider.

Third, technology. 802.11b and g are based on technology that is now stale by more than a decade. Today's wireless broadband tech makes those protocols look like a 56k modem.

The combination of these factors allows for stunning possibilities. More efficient broadband technologies, better broadcast frequencies, more bandwidth. These allow for more users with higher data rates at farther distances from base stations. It also makes things like bridging access points a lot easier. There's little reason to believe that in 10 years whitespace broadband access points won't be utterly ubiquitous.

As for WiMAX, that's not open spectrum; consumers can't buy a WiMAX AP on Newegg. That makes it an entirely different beast.



In large cities there is less than 82 MHz of white space available AFAIK. Also, white space channels are 6 MHz wide while 802.11 channels are 20 MHz.


What does an 82 MHz band really mean? I understand that devices are confined to communicate within 2.4 GHz +/- 41 MHz (come to think of it, that's quite narrow... a margin of only about ±1.7%), but is the band related to actual bps bandwidth? If so, how? And why is the available band so narrow?


Throughput is directly related to bandwidth. Modern radios have a spectral efficiency of 5-15 bps/Hz (depending on range, noise, and multipath), meaning that a 6 MHz white space channel is worth 30-90 Mbps, a 20 MHz 802.11 channel can give 100-300 Mbps, and the entire 2.4 GHz band theoretically has 410-1230 Mbps of capacity.
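To make those numbers concrete, here's a quick sketch. The 5-15 bps/Hz range is the illustrative figure from above, not a spec value:

```python
# Back-of-the-envelope throughput: bandwidth (MHz) x spectral efficiency (bps/Hz)
# gives throughput in Mbps. The 5-15 bps/Hz range is illustrative, not a spec.

def throughput_mbps(bandwidth_mhz, bps_per_hz):
    """Throughput in Mbps = bandwidth in MHz * spectral efficiency in bps/Hz."""
    return bandwidth_mhz * bps_per_hz

channels = [
    ("6 MHz white space channel", 6),
    ("20 MHz 802.11 channel", 20),
    ("entire 82 MHz 2.4 GHz band", 82),
]
for name, bw in channels:
    lo, hi = throughput_mbps(bw, 5), throughput_mbps(bw, 15)
    print(f"{name}: {lo:.0f}-{hi:.0f} Mbps")
```

This reproduces the 30-90, 100-300, and 410-1230 Mbps figures quoted above.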


Very interesting. What is currently limiting the amount of bandwidth per Hz? IANAP, but why can't you just lock in a very specific frequency range, blip Morse code at a very high clock speed (say, 100 MHz), and get a huge bps/Hz?


The only way to transmit information over a wave of fixed frequency is to mess with that wave (if the wave is truly fixed, it carries no information). When you mess with the wave, you necessarily make its frequency fluctuate. So you have to send your signal within a range of frequencies.

Intuitively, the more information you want to transmit, the more you have to mess with the frequencies. Hence the bits-per-Hz thing.

(Disclaimer: I don't know much about the subject. Corrections, clarifications, and precisions are welcome.)
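That intuition is essentially the Shannon-Hartley theorem: capacity grows linearly with bandwidth but only logarithmically with signal-to-noise ratio. A quick sketch, where the 30 dB SNR is an assumed illustrative figure:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 6 MHz channel at an assumed 30 dB SNR (linear SNR = 1000):
c = shannon_capacity_bps(6e6, 1000)
print(f"{c / 1e6:.1f} Mbps")  # ~59.8 Mbps, i.e. roughly 10 bps/Hz
```

Note the trade: doubling bandwidth doubles capacity, while doubling SNR only adds one more bit per symbol. That's why the width of the band matters so much.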


That's not the only way to transmit information with radio waves. What's wrong with this: to send binary data, you either send the pure signal (a 1) or send no signal at all (a 0). As long as the clock cycles of the sender and receiver are the same (e.g. 100 MHz), it should work.

Given that this isn't the way it works, there must be some major flaw with it.


When you take the Fourier transform of a finite-duration signal, you notice that there's more than the fundamental frequency. Basically, when you cut a signal off, then resume it, over and over, that signal ceases to be of only one frequency.

That's why your method doesn't work.
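You can see this numerically: take a pure carrier, switch it on and off, and look at the FFT. A toy sketch with made-up numbers (1 kHz sample rate, 100 Hz carrier, keying every 0.1 s, all illustrative, not real radio parameters):

```python
import numpy as np

fs = 1000.0                        # sample rate in Hz (toy value)
t = np.arange(0, 1, 1 / fs)        # 1 second of samples
carrier = np.sin(2 * np.pi * 100 * t)   # "pure" 100 Hz carrier
bits = (t * 10).astype(int) % 2         # on/off pattern, toggling every 0.1 s
keyed = carrier * bits                  # on-off keying of the carrier

spectrum = np.abs(np.fft.rfft(keyed))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# The energy is no longer confined to 100 Hz: sidebands appear around
# the carrier at multiples of the keying rate, well beyond one bin.
occupied = freqs[spectrum > 0.05 * spectrum.max()]
print(occupied.min(), "-", occupied.max(), "Hz")
```

The un-keyed carrier would occupy a single 1 Hz bin at 100 Hz; the keyed one smears across a band around it. The faster you flip the carrier on and off, the wider that band gets, which is why the "blip at 100 MHz on one frequency" scheme can't escape the bandwidth cost.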



