Data compression is the process of encoding data using fewer bits than the original representation. Compressed content takes up substantially less disk space, so far more of it can be stored in the same amount of space. There are various compression algorithms that work in different ways. With some of them, only redundant bits are removed, so when the data is uncompressed there is no loss of quality; this is known as lossless compression. Others discard bits that are deemed less important, so uncompressing the data later yields lower quality than the original; this is known as lossy compression. Compressing and uncompressing content consumes significant system resources, particularly CPU time, so any hosting platform that applies compression in real time needs sufficient processing power to support the feature.

A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the whole sequence.
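The 111111 → 6x1 idea is known as run-length encoding. A minimal sketch in Python (the function names and the "6x1" text format are illustrative choices, not a standard):

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. "111111" -> "6x1"."""
    if not bits:
        return ""
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(f"{count}x{prev}")  # close the finished run
            count = 1
    runs.append(f"{count}x{bits[-1]}")  # close the final run
    return ",".join(runs)


def rle_decode(encoded: str) -> str:
    """Reverse the encoding, e.g. "3x1,3x0" -> "111000"."""
    if not encoded:
        return ""
    return "".join(
        int(count) * symbol
        for count, symbol in (run.split("x") for run in encoded.split(","))
    )


print(rle_encode("111111"))    # -> 6x1
print(rle_encode("1110001"))   # -> 3x1,3x0,1x1
```

Because decoding exactly reverses encoding, no information is discarded, which is what makes this a lossless scheme; it only saves space when the input contains long runs of repeated symbols.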