The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be lossless or lossy: in the first case only redundant data is removed, so the restored data is identical to the original, while in the second case some information is discarded permanently and the quality of the restored data is lower. Different compression algorithms are more effective for different kinds of information. Compressing and uncompressing data often takes considerable processing time, which means that the server performing the operation must have adequate resources to process your data quickly. A simple example of compression is run-length encoding: instead of storing the individual 1s and 0s of a binary sequence, you store how many consecutive positions contain a 1 and how many contain a 0.
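The run-length idea above can be sketched in a few lines of Python. This is a minimal illustration of the concept, not a production codec, and the function names are our own:

```python
def rle_encode(bits):
    """Run-length encode a string of '0'/'1' characters as (bit, count) pairs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([b, 1])       # start a new run
    return [(b, n) for b, n in runs]

def rle_decode(runs):
    """Rebuild the original bit string from (bit, count) pairs."""
    return "".join(b * n for b, n in runs)

encoded = rle_encode("0000001111100")
print(encoded)                        # [('0', 6), ('1', 5), ('0', 2)]
print(rle_decode(encoded))            # "0000001111100"
```

Because only the run lengths are stored, long stretches of identical bits shrink dramatically, which is why this scheme works well for highly repetitive data.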
Data Compression in Shared Hosting
The ZFS file system used on our cloud web hosting platform employs a compression algorithm called LZ4. LZ4 is considerably faster than most alternatives, particularly for compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than it can be read from a hard disk drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backup copies of all the content kept in the shared hosting accounts on our servers every day. Both your content and its backups require less space, and since both ZFS and LZ4 are very fast, generating the backups does not affect the performance of the web hosting servers where your content is kept.
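For readers curious how this looks at the file-system level, these are the standard ZFS commands for inspecting and enabling LZ4 compression on a dataset. The pool/dataset name `tank/www` is a placeholder, not one used on our platform:

```shell
# Inspect the current compression setting of a dataset
zfs get compression tank/www

# Enable LZ4 compression (applies to data written from this point on)
zfs set compression=lz4 tank/www

# See how much space compression is actually saving
zfs get compressratio tank/www
```

Compression in ZFS is transparent: applications read and write files normally, and the file system compresses and uncompresses blocks on the fly.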
Data Compression in Semi-dedicated Servers
The semi-dedicated server plans that we offer are built on a powerful cloud platform that runs on the ZFS file system. ZFS uses a compression algorithm called LZ4, which outperforms comparable algorithms in both speed and compression ratio when processing website content. This is especially true for decompression: LZ4 can uncompress data faster than uncompressed data can be read from a hard disk, so sites running on a platform with LZ4 load at a higher speed. We can take advantage of this feature despite the fact that it requires a good deal of CPU time, because our platform uses many powerful servers working together rather than placing accounts on a single machine, as many companies do. There is one more advantage of using LZ4: since it compresses data well and does so quickly, we can also make multiple daily backup copies of all accounts without affecting server performance, and we keep those backups for 30 days. That way, you will always be able to restore any content that you delete by mistake.