Re: Newbie Help - and the Holy Grail of CuSeeMe

Brian Godette (bgodette@idcomm.com)
Fri, 03 Apr 1998 11:11:25 -0700


At 09:37 AM 4/3/98 -0400, Scott Lacroix wrote:
>At 12:46 AM 4/3/98 -0600, Jason Williams wrote:
>>On Thu, 2 Apr 1998, Brian K. Dowtin wrote:
>>> Settings are kind of important in getting video
>>> Seems that for a while these were unpublished or at least
>>> any reasoning behind the settings was unpublished.
>>
>>Historically (when I got into CU in 1994), settings weren't published
>>because they really weren't that important. A default max send of 80kbps
>>and max receive of 500kbps worked great when I was on a T1 :)
>
> <nasty, biting sarcasm on>
> Uhm, things worked great for you on a T1 and thus, no-one else published
>the meanings of the setting? Did they all check with you or what?
> <nasty, biting sarcasm off>
> :)
>
>>> Min Send: 1 Max Send: 15
>>> Min Recv: 1 Max Recv: 28
>>
>>The Mins don't really do much for bandwidth at all...The only real reason
>>to set your mins that low is so no reflectors can complain and force you
>>out of the reflector for incorrect mins. (I try to convince reflector
>>operators that forcing people's mins to 1 doesn't save any bandwidth for
>>the reflector and only ticks the participants off).
>
> Not entirely true. If you want a high quality connection and you push your
>Min Send up then it will definitely affect bandwidth.
>
> Min Send: 15 Max Send: 28
>
>Will help guarantee that you send decent vid, but it's going to cause your
>outgoing bandwidth to be over 15kbps constantly. Min Recv, on the other
>hand, doesn't do a lot for ya, or your bandwidth...

I really have to object to this statement. Maybe you've actually made it so
that the 3.1 *client* pays attention to its min xmit value, but as far as
I've seen through controlled testing on a LAN and over modems, every client
version (Mac or PC) ignores this value for all purposes except one. That
one purpose only happens with the Cornell Mac client, and *ONLY* when the
vid is paused: the min xmit value is used as the max rate at which to send
tiles older than the refresh interval. And as far as I've seen, with all
the clients I've tested this with, the clients don't even use the min xmit
rate as a floor during rate adaption; they all happily drop to 10kbps
(which seems to be the common floor) under outbound packet loss (real or
faked :) in case you're wondering how I tested it).
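To make the observed behavior concrete, here's a minimal sketch in Python of rate adaption as it appears to behave in testing. The 10kbps hard floor and the halve-on-loss/creep-on-clear steps are my assumptions from watching the clients, not the actual CU-SeeMe code; the point is only that the configured min xmit never acts as the floor:

```python
def adapt_send_rate(current_kbps, min_xmit, max_xmit, outbound_loss):
    """Sketch of CU-SeeMe rate adaption as observed in testing.

    Note: min_xmit is deliberately ignored here, because every client
    tested fell right past it; the effective floor under outbound
    packet loss appears to be a hard-coded ~10 kbps.
    """
    HARD_FLOOR = 10  # kbps -- the common floor seen under packet loss

    if outbound_loss:
        # Back off sharply; the configured min is NOT honored as a floor.
        current_kbps = max(HARD_FLOOR, current_kbps // 2)
    else:
        # Creep back up toward the configured max when loss clears.
        current_kbps = min(max_xmit, current_kbps + 1)
    return current_kbps

# Under sustained loss, a client configured with Min 15 / Max 28
# still ends up at the ~10 kbps common floor, not at 15:
rate = 28
for _ in range(10):
    rate = adapt_send_rate(rate, min_xmit=15, max_xmit=28, outbound_loss=True)
print(rate)
```

Running this leaves the rate at 10, below the configured min of 15, which matches what the testing showed.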

In addition, the above settings you gave will have two direct effects:
1) That client will have fragmented video (if using B&W) from the
perspective of a modem user on the receive side, "slow" video (frame rates
below what the client is actually sending) using MJPEG, or smeared video
using SFM or H.263.
2) That client, if they're using a modem, is likely to have problems
staying connected to reflectors, or connecting to them in the first place
(max-min/max-max settings on the ref). And it is likely to cause problems
for point-to-point connections for the first couple of minutes while rate
adaption between the clients pulls a sanity check on those insane (for
modems) rates.
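The reflector-side effect in point 2 amounts to an admission check like the following. This is a hypothetical sketch; the real reflector code and its config keyword names may differ, but the idea is that a ref's max-min/max-max caps simply refuse (or bump) any client whose configured rates exceed them:

```python
def reflector_admits(client_min_send, client_max_send,
                     ref_max_min=15, ref_max_max=28):
    """Hypothetical sketch of a reflector's max-min/max-max caps:
    a client whose configured min or max send exceeds the reflector's
    limits is refused or disconnected."""
    if client_min_send > ref_max_min:
        return False  # client's Min Send exceeds the ref's max-min cap
    if client_max_send > ref_max_max:
        return False  # client's Max Send exceeds the ref's max-max cap
    return True

# Min 15 / Max 28 just squeaks past a ref capped at 15/28, but a ref
# configured for modem users (say max-min 10, max-max 20) refuses it:
print(reflector_admits(15, 28))
print(reflector_admits(15, 28, ref_max_min=10, ref_max_max=20))
```

So the same settings that "guarantee decent vid" on one ref can keep a modem user from connecting at all on another.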

Mind you, I'm only stating what happens out in the *real world* outside a
LAN environment, or where all clients happen to be on ISDN/ADSL/cable
modem/non-POTS connections.

>
>>> Again, I'm sure the guys here who've written the code
>>> can give you an explanation and specifics.
>>
>>I haven't written the code really..but there's my $.02 :)
>>
>
> I have... *G*

So have I.