Is it me?
Or, has America really changed?
Was it really the hippies doing all that dope that changed America?
Or, was it our schools teaching us propaganda?
Or, not having honest politicians?
Or, did we just get tired of God? And turn our back on God? And then he quit favoring us?
I have been abroad for some time. And I can immediately see a difference in culture. EVERYONE thinks going to America will make their lives better.
OK, almost everyone.
What I found IRONIC is that the ones who KNEW America was not a good choice have ALWAYS been men. Now, the sample size has been WAY too small to be certain: only three people so far really thought they did not want to go to America when other people had told them to. They felt the same way about Canada.
Why do they think America is great? They are looking for OPPORTUNITY.
America has THAT. We have more opportunity for politically connected business than any country in the world. OK, that was an exaggeration again.
Communist China, Mexico, and the other countries where bribery is standard practice all have the same amount of opportunity if you grease the wheels.
And America has some opportunity for the small businessman.
But, it comes at a high price: we have the highest taxes in the world. At least I think we do, and I have looked at the actual numbers several times.
But, I talked with some Mormon missionaries yesterday. And I made the comment that I wanted this country not to follow America’s moral decay.
They replied, “That is exactly why we are here.”
Baptists and Mormons are fundamentally opposed to each other. They do to Jesus what the Muslims do: they completely redefine him.
But, ironically, they see the same problem in America. And they are YOUNG.
We are in trouble. America is in trouble.
And without God’s leadership, I am not sure that we will ever recover.
Can we recover without God?