functional bitrate sim

My wmediumd rewrite is a bit further along thanks to getting a few hours to hack on it this weekend. It can now accurately simulate throughput between a pair of radios using legacy rates. For example, if we set the SNR between two devices to 20 dB, then they can communicate at a nominal 54 Mbps rate, yielding about 26 Mbps achieved in iperf:

[  3]  0.0-10.0 sec  31.2 MBytes  26.1 Mbits/sec

At 15 dB, we can send at nominal rates between 24 and 36 Mbps, which yields:

[  3]  0.0-10.1 sec  21.0 MBytes  17.5 Mbits/sec
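
The net effect of the signal-level model is roughly a mapping from SNR to the highest usable legacy rate; something like the sketch below, with made-up thresholds rather than the values actually used in the code:

struct rate {
	int mbps;
	int min_snr_db;		/* illustrative guess, not from the code */
};

static const struct rate rates[] = {
	{  6,  2 }, {  9,  4 }, { 12,  6 }, { 18,  8 },
	{ 24, 11 }, { 36, 15 }, { 48, 18 }, { 54, 20 },
};

/* pick the highest legacy rate whose SNR requirement is met */
static int rate_for_snr(int snr_db)
{
	unsigned int i;
	int best = rates[0].mbps;

	for (i = 0; i < sizeof(rates) / sizeof(rates[0]); i++)
		if (snr_db >= rates[i].min_snr_db)
			best = rates[i].mbps;

	return best;
}

With these made-up thresholds, 20 dB selects 54 Mbps and 15 dB selects 36 Mbps; the occasional drop to 24 Mbps at 15 dB is presumably rate control reacting to losses near the threshold.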

Note that achieved throughput is quite a bit lower than the nominal rate, as in real life; if aggregation were implemented, the two would be closer.
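
To see roughly where the gap comes from, here's a back-of-the-envelope airtime calculation for a single 1500-byte frame exchange at 54 Mbps, using textbook 802.11a OFDM timing constants rather than anything taken from the simulator:

#include <stdio.h>

int main(void)
{
	double sym = 4.0;			/* OFDM symbol time, us */
	double preamble = 20.0;			/* PLCP preamble + SIGNAL, us */
	double sifs = 16.0, slot = 9.0;
	double difs = sifs + 2 * slot;		/* 34 us */
	double backoff = 15.0 / 2 * slot;	/* average over CWmin=15: 67.5 us */

	/* 1500-byte payload + ~28 bytes of MAC header/FCS, plus 16 service
	 * and 6 tail bits, at 216 bits per symbol for 54 Mbps; the integer
	 * division with +215 rounds up to whole symbols */
	double data = preamble + sym * ((1528 * 8 + 22 + 215) / 216);
	/* 14-byte ACK at the 24 Mbps control rate: 96 bits per symbol */
	double ack = preamble + sym * ((14 * 8 + 22 + 95) / 96);

	double total = difs + backoff + data + sifs + ack;

	printf("exchange: %.1f us -> %.1f Mbps MAC throughput\n",
	       total, 1500 * 8 / total);
	return 0;
}

That comes out to roughly 30 Mbps of MAC-layer goodput before TCP/IP headers and TCP ACKs take their cut, which is in the right ballpark for the 26 Mbps iperf number; aggregation would amortize most of that per-frame overhead.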

The basic architecture is pretty simple: frames are queued on a per-sender management or data queue depending on type, and delivery time is computed based on whether or not there is loss and the contention window parameters of the queues. A timerfd is used to schedule reporting of frame delivery back to the kernel at appropriate times. The delivery time does not take into account actual contention, although this could be done in principle by looking at all the queued frames for all stations.
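
For the curious, here's a heavily simplified sketch of that flow. The structure and function names are invented for illustration and don't match the actual code:

#include <stdlib.h>
#include <sys/timerfd.h>
#include <time.h>

/* invented names; not the actual wmediumd structures */
struct frame {
	struct frame *next;
	struct timespec deliver_at;	/* when to report delivery to the kernel */
	int len;			/* bytes on the air */
	int rate_mbps;
	int lost;			/* decided elsewhere from the signal level */
};

struct queue {
	struct frame *head, *tail;
	int cw_min, cw_max;		/* contention window bounds, in slots */
};

/* Delivery time = now + DIFS + a random backoff drawn from the queue's
 * contention window + the frame's airtime.  Contention from other
 * stations is not modeled; each queue is treated independently. */
static void set_delivery_time(struct queue *q, struct frame *f,
			      const struct timespec *now)
{
	int slot_us = 9, difs_us = 34;
	int backoff_us = (rand() % (q->cw_min + 1)) * slot_us;
	long airtime_us = (long)f->len * 8 / f->rate_mbps;	/* rough */
	long us = difs_us + backoff_us + airtime_us;

	f->deliver_at = *now;
	f->deliver_at.tv_nsec += us * 1000;
	if (f->deliver_at.tv_nsec >= 1000000000L) {
		f->deliver_at.tv_sec += f->deliver_at.tv_nsec / 1000000000L;
		f->deliver_at.tv_nsec %= 1000000000L;
	}
}

/* arm a one-shot absolute timerfd for the next pending delivery */
static int arm_delivery_timer(int tfd, const struct timespec *deadline)
{
	struct itimerspec its = { .it_value = *deadline };

	return timerfd_settime(tfd, TFD_TIMER_ABSTIME, &its, NULL);
}

The main loop would create the fd with timerfd_create(CLOCK_MONOTONIC, 0), poll it alongside the netlink socket, and on expiry report every frame whose deadline has passed back to the kernel.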

I haven’t really decided what to do about configuration. I stripped out the jamming and probability matrix configurations, as I feel that doing things on a signal-level basis is simpler. But at this point there’s no real way to specify signal levels either (other than hardcoding), and some scenarios probably want something dynamic (e.g. mobile stations).
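
If it does end up signal-level based, one direction (purely speculation at this point, nothing like this exists in the tree) would be a per-link SNR table that something external could update at runtime for the mobile-station case:

/* speculative internal representation of a signal-level config */
struct link_signal {
	int sta_a, sta_b;	/* station indices */
	double snr_db;		/* signal level for this link */
};

/* a static table covers fixed topologies; for mobile stations the entries
 * could be rewritten over time, e.g. from a script driving snr(t) */
static struct link_signal links[] = {
	{ 0, 1, 20.0 },
	{ 1, 2, 15.0 },
};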

Changes are in my wmediumd master branch. Unfortunately, I won’t have much time to work on this for the next two months, but patches for the many TODOs are welcome.