What Does California Mean?

California lies on the far western edge of the contiguous United States. The origin of the word "California" has long been disputed. The most commonly held view traces the name to the Spanish conquistadors, and it is said to mean "an earthly paradise".