All About Christian Faith
The Christian faith has existed since the time Jesus Christ walked the earth. His followers have understood many things differently, but all true Christians agree that Jesus Christ is the revelation of God in the flesh. We all believe that God came to earth to bring us back into a relationship with Himself.