Interesting deconstruction of the U.S. government's role in preserving the American way of life.
According to a specific Enlightenment-era doctrine, yes, rights are inherent, either originating from God or following as the logical consequence of the "state of nature." In reality, governments bestow rights. If you believe otherwise, ask yourself who defines which rights you inherited. Is it God? Which god? The Judeo-Christian god? The Greek gods? Or is it the Enlightenment writers? If so, is it Locke or Rousseau? What about Hobbes? The problem with all of this talk about rights is that it's utterly meaningless without a political and legal framework to support it. You can go on and on about the rights of Central Africans to life, liberty, and the pursuit of happiness, but they still live in places where children are forcibly enslaved and used as soldiers. Now there's something to be said for standing up to government action in our time and saying, "Hey, your power derives from us, and you better remember that." But that's because our system of government is (theoretically) founded on that principle. In societies where that's not the case, the discussion should be about whether they ought to change the foundation of their government rather than about a nebulous concept like fundamental human rights.
"Now there's something to be said for standing up to government action in our time and saying, "Hey, your power derives from us, and you better remember that."" All good points, but you hit the nail on the head right there. Frankly, Americans should only be worrying about American government. The recently adopted (last 60 years) "world police" and "freedom whether you want it or not" policies isn't doing the public any good. Just securing oil and feeding the military industrial complex for the government and corporations.