When was Earth Formed?

The Bible teaches that God formed the earth at the beginning of time. It also teaches that the earth was empty until God began creating things.