Hollywood is indeed Boys' Town, a leftist one at that, as we see unfolding with the Harvey Weinstein scandal.
Why are feminists not up in arms over this misogynistic practice, carried out primarily by women for men on children?
Many of us on the right remain underground for fear of retribution. With a Republican president and Congress, why does this continue?
This is from someone who grew up in the seventies and saw the residual harm of the sixties' "free love."