What Does the Word California Mean?

The name California refers to an imaginary island paradise described in a Spanish romance novel written around 1510. Spanish conquistadors later gave the name to the region that became the state.