What Does the Word Nazi Mean?

The word Nazi is derived from the German word Nationalsozialismus, meaning National Socialism. It came to be the term used to refer to the dictatorial regime under the leadership of Adolf Hitler.