I see a lot of YouTube videos about antennas and coax, and I laugh and laugh at some of the stupid-ass things these idiots put out. Now, with that said, I saw one the other day that made me scratch my head and say, "Hmmmm... this guy might be right." He was answering the age-old question of why radios want to see a 50 Ohm load. He said that a transmitted signal loses less energy when it goes out at 30 Ohms impedance, and a received signal loses less energy when it comes down a 75 Ohm line. So somebody decided to find a sweet spot in the middle, and they settled on 50 Ohms. Actually it is 52 Ohms, but who gives a crap about the 2 Ohm difference.

That made sense, and the guy was quite convincing. The only thing that bothers me about it is: why would the exact same frequency going out of or coming into a coax have a different loss factor? For the moment, I'm going to go with the notion that he is right and accept his findings. I've never been one to give a crap about someone using 52 Ohm coax or 75 Ohm coax.

Now, back in the old days, I remember a lot of Hams used a separate Receiver and Transmitter, and each one had its own coax. OK, some guys had an RF sniffer circuit relay box that would automatically switch a single coax cable back and forth, but that was for the rich guys or the ones with enough brain cells to build their own. Ideally, you could use 75 Ohm coax on the receiver and 52 Ohm coax on the transmitter.
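Just to sanity-check that sweet-spot story for myself, I ran the numbers. The textbook version of the trade-off (which may or may not be exactly what the video guy meant) is that for air-dielectric coax with a fixed outer diameter, attenuation is lowest when the characteristic impedance lands near 77 Ohms, and peak power handling is highest near 30 Ohms; 50-ish Ohms is just the compromise between those two optimums, and the direction the signal travels never enters into it. Here is a rough little Python sketch of that argument, using only the standard air-dielectric formulas (Z0 = 59.96 * ln(D/d), conductor loss proportional to (1/d + 1/D), breakdown-limited peak power), so take it as a back-of-the-envelope check and nothing more.

```python
# Back-of-the-envelope check of the classic air-dielectric coax trade-off:
# lowest conductor loss lands near ~77 Ohms, highest peak power handling
# near ~30 Ohms, so ~50 Ohms splits the difference. The outer diameter D
# is held fixed and the ratio x = D/d is swept.
import math

def z0_air(x):
    """Characteristic impedance of air-dielectric coax, x = D/d."""
    return 59.96 * math.log(x)

def relative_loss(x):
    """Conductor attenuation, up to a constant, for fixed outer diameter D."""
    return (1.0 + x) / math.log(x)

def relative_peak_power(x):
    """Peak power capacity, up to a constant, breakdown-voltage limited."""
    return math.log(x) / (x * x)

# Sweep the diameter ratio and find where loss is minimal / power is maximal.
xs = [1.05 + 0.001 * i for i in range(9000)]
x_min_loss = min(xs, key=relative_loss)
x_max_power = max(xs, key=relative_peak_power)

print(f"Lowest loss near Z0 = {z0_air(x_min_loss):.1f} Ohms (D/d = {x_min_loss:.2f})")
print(f"Highest peak power near Z0 = {z0_air(x_max_power):.1f} Ohms (D/d = {x_max_power:.2f})")
```

Run it and it spits out roughly 76.7 Ohms for lowest loss and 30 Ohms for highest peak power; split the difference and you land right in the 50 to 52 Ohm neighborhood we all ended up with.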