Data compression is the reduction of the amount of data by lowering the number of bits that have to be stored or transmitted. As a result, compressed information takes up considerably less disk space than the original, so much more content can fit in the same amount of space. There are many compression algorithms that work in different ways. With some of them, only redundant bits are removed, so when the information is uncompressed there is no loss of quality; this is known as lossless compression. Others discard bits that are deemed unneeded, and uncompressing the data later results in lower quality than the original; this is lossy compression. Compressing and uncompressing content takes a considerable amount of system resources, in particular CPU processing time, so any web hosting platform that uses compression in real time must have enough power to support the feature. A simple example of how information can be compressed is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the data records how many consecutive 1s or 0s there are instead of storing the entire sequence.
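To make the run-length idea above concrete, here is a minimal Python sketch; the rle_encode and rle_decode names and the 6x1 text format are illustrative choices for this example, not part of any real compressor.

    from itertools import groupby

    def rle_encode(bits: str) -> str:
        # Replace each run of identical bits with "<count>x<bit>", e.g. "111111" -> "6x1".
        return ",".join(f"{len(list(run))}x{bit}" for bit, run in groupby(bits))

    def rle_decode(encoded: str) -> str:
        # Expand every "<count>x<bit>" pair back into the original run of bits.
        return "".join(int(count) * bit
                       for count, bit in (pair.split("x") for pair in encoded.split(",")))

    print(rle_encode("111111000010"))  # prints: 6x1,4x0,1x1,1x0
    assert rle_decode(rle_encode("111111000010")) == "111111000010"  # lossless round trip

Because the round trip restores the input exactly, this is a very simple lossless scheme; it only saves space when the data contains long runs of repeated bits.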
Data Compression in Hosting
The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is considerably faster than most comparable algorithms, particularly at compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data faster than the data can be read from a hard drive, which improves the overall performance of the sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backups of all the content stored in the hosting accounts on our servers every day. Both your content and its backups require less space, and since ZFS and LZ4 both work very fast, generating the backups does not affect the performance of the web servers where your content is kept.
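ZFS applies LZ4 transparently at the file-system level, so no application code is involved, but the lossless round trip can be demonstrated with the third-party Python lz4 package (installed with pip install lz4); the sample HTML payload below is purely illustrative.

    import lz4.frame  # third-party binding to the LZ4 frame format

    # Repetitive, text-heavy data, roughly like typical web content.
    html = b"<html><body>" + b"<p>Hello, world.</p>" * 500 + b"</body></html>"

    compressed = lz4.frame.compress(html)
    restored = lz4.frame.decompress(compressed)

    print(f"original:   {len(html)} bytes")
    print(f"compressed: {len(compressed)} bytes")
    assert restored == html  # lossless: every bit of the original is recovered

The compressed copy comes out much smaller because the markup repeats, which is exactly the kind of redundancy LZ4 removes quickly in both directions.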