Data compression is the reduction of the number of bits that have to be stored or transmitted, so the compressed information takes up considerably less disk space than the original and more content fits in the same amount of space. Different compression algorithms work in different ways: with lossless ones, only redundant bits are removed, so once the information is uncompressed there is no loss of quality; lossy ones discard bits deemed unnecessary, so the data restored later is of lower quality than the original. Compressing and uncompressing content consumes a significant amount of system resources, CPU processing time in particular, so any hosting platform that uses compression in real time must have adequate processing power to support the feature. A simple example of how information can be compressed is to replace a sequence such as 111111 with 6x1, i.e. to "remember" how many consecutive 1s or 0s there are instead of storing the full sequence.
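
The 111111 -> 6x1 idea above is essentially run-length encoding. Purely as an illustration of that idea (it is not the algorithm our platform uses), a minimal Python sketch of run-length encoding over a string of bits could look like this:

def rle_encode(bits):
    """Collapse runs of identical characters into (count, character) pairs."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # advance j to the end of the current run of identical characters
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((j - i, bits[i]))
        i = j
    return runs

def rle_decode(runs):
    """Rebuild the original string from (count, character) pairs."""
    return "".join(char * count for count, char in runs)

print(rle_encode("1111110000"))            # [(6, '1'), (4, '0')]  i.e. "6x1, 4x0"
print(rle_decode([(6, "1"), (4, "0")]))    # "1111110000"

Because nothing is discarded, decoding always reproduces the original sequence exactly, which is what makes this kind of compression lossless.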

Data Compression in Shared Web Hosting

The ZFS file system that runs on our cloud web hosting platform employs a compression algorithm named LZ4. It is considerably faster than most alternative algorithms, particularly when compressing and uncompressing text-based data such as web content. LZ4 can even uncompress data faster than it can be read from a hard disk drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backups of all the content stored in the shared web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, generating the backups does not affect the performance of the web hosting servers where your content is stored.
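
On our platform the compression happens transparently at the file system level, so you never run it yourself. Purely as a rough illustration of the lossless round trip involved, and assuming the third-party lz4 Python package is available (installed separately with pip), compressing and restoring a piece of web content could be sketched like this:

import lz4.frame  # third-party package: pip install lz4

original = ("<html><body>" + "Sample web content. " * 500 + "</body></html>").encode("utf-8")

compressed = lz4.frame.compress(original)     # lossless compression
restored = lz4.frame.decompress(compressed)   # the exact original comes back

assert restored == original
print(len(original), "bytes ->", len(compressed), "bytes")

Highly repetitive text such as HTML markup typically shrinks dramatically, which is why compressing website content pays off for both live data and backups.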

Data Compression in Semi-dedicated Hosting

The ZFS file system that runs on the cloud platform where your semi-dedicated hosting account will be created uses a powerful compression algorithm called LZ4. It is one of the most efficient algorithms available for compressing and uncompressing website content, as it achieves a high compression ratio and uncompresses data much faster than the same data could be read from a hard drive in uncompressed form. As a result, LZ4 speeds up every site that runs on a platform where the algorithm is used. This performance requires a great deal of CPU processing time, which is provided by the multitude of clusters working together as part of our platform. In addition, LZ4 makes it possible for us to generate several backups of your content every day and keep them for a month, as they take up far less space than standard backups and are created much faster without loading the servers.
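
The compression ratio mentioned above is simply the size of the original data divided by the size of the compressed data. As an illustrative sketch only (again assuming the third-party lz4 Python package, not our platform's internal tooling), it could be measured like this:

import lz4.frame  # third-party package: pip install lz4

def compression_ratio(data):
    """Return original size divided by LZ4-compressed size."""
    return len(data) / len(lz4.frame.compress(data))

html = ("<p>Repetitive website markup compresses very well.</p>\n" * 1000).encode("utf-8")
print(f"Compression ratio: {compression_ratio(html):.1f}x")

The higher the ratio, the less space the compressed data and its backups occupy on the servers.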