The term data compression refers to reducing the number of bits needed to store or transmit data. This can be done with or without loss of information: what gets removed during compression is either redundant data (lossless compression) or data deemed unnecessary (lossy compression). When the data is later uncompressed, in the first case it is restored exactly, whereas in the second case its quality is lower than the original. Different compression algorithms are more effective for different kinds of data. Compressing and uncompressing data usually takes considerable processing time, so the server performing the operation must have sufficient resources to handle the data quickly enough. One example of how information can be compressed is run-length encoding: instead of storing the actual 1s and 0s of the binary code, it records how many consecutive positions contain a 1 and how many contain a 0.
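As a purely illustrative sketch of the run-length idea described above (the function names and sample bit string are made up for this example), the following Python snippet turns a string of bits into (bit, count) pairs and back:

```python
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Run-length encode a string of 0s and 1s into (bit, count) pairs."""
    return [(bit, len(list(run))) for bit, run in groupby(bits)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Reverse the encoding by expanding each (bit, count) pair."""
    return "".join(bit * count for bit, count in pairs)

data = "0000000011111111110000"
encoded = rle_encode(data)
print(encoded)                      # [('0', 8), ('1', 10), ('0', 4)]
print(rle_decode(encoded) == data)  # True
```

Long runs of identical bits collapse into a couple of numbers, which is why this kind of scheme saves space on repetitive data.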
Data Compression in Cloud Web Hosting
The compression algorithm we employ on the cloud hosting platform where your new cloud web hosting account will be created is called LZ4, and it is applied by the state-of-the-art ZFS file system that powers the platform. The algorithm outperforms the ones used by other file systems because its compression ratio is higher and it processes data considerably faster. The speed advantage is most noticeable when content is uncompressed, as this happens faster than data can be read from a hard disk. As a result, LZ4 improves the performance of any Internet site hosted on a server that uses the algorithm. We take advantage of LZ4 in an additional way: its speed and compression ratio allow us to generate multiple daily backup copies of the entire content of all accounts and keep them for one month. Not only do these backups take up less space, but generating them does not slow the servers down the way it often does with other file systems.
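For illustration only, here is a rough Python sketch of an LZ4 compress/decompress round trip using the third-party lz4 package (not how the hosting platform invokes the algorithm; ZFS applies LZ4 transparently at the file-system level):

```python
import lz4.frame  # third-party package: pip install lz4

original = b"example site content " * 1000

# Compress and decompress in memory; ZFS performs a similar round trip
# transparently for the data blocks it writes and reads.
compressed = lz4.frame.compress(original)
restored = lz4.frame.decompress(compressed)

assert restored == original
print(f"{len(original)} bytes -> {len(compressed)} bytes")
```

Running this on repetitive data shows a large reduction in size, while the compression and decompression steps complete almost instantly, which is the property that makes LZ4 suitable for on-the-fly use in a file system.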