Not All Americans Are American
By Dave Daubenmire | July 1st, 2021

Are all Americans truly American? What does it mean to be American? Is it something that is determined by birth? Partially, I assume. But simply being born in America does not make one a real American. My friend Dr. John Diamond...