# Is there any way to compress 10GB of anything to 1MB with WinRAR?



## youself553 (Jun 1, 2019)

How do I compress 10GB of anything to 40MB with WinRAR?
My specs:
CPU: AMD Athlon II X2 250
iGPU: AMD Radeon HD 3200 IGP
RAM: 2 GB RAM
Disk Drive Type: HDD


----------



## Solaris17 (Jun 1, 2019)

generally not, you can play with the compression algorithm, but compression ratio % is based off of the type of data being compressed.

They do have ways to do it, but only with data that can be compressed in very high levels. We actually use it in networking to stop bots.









How to defend your website with ZIP bombs: "the good old methods still work today" (blog.haschek.at)
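The ZIP-bomb trick in that article relies on exactly this property: a file that is nothing but repeated bytes compresses to a tiny fraction of its size. As a quick sketch (sizes here are illustrative, using Python's standard `gzip` module rather than WinRAR):

```python
import gzip

# 10 MB of zero bytes: maximally repetitive, so it compresses extremely well
payload = b"\x00" * 10_000_000
compressed = gzip.compress(payload, compresslevel=9)

print(f"{len(payload):,} bytes -> {len(compressed):,} bytes "
      f"({len(compressed) / len(payload):.4%} of original)")
```

The compressed output lands in the kilobyte range, a ratio far beyond anything you'd see on real data, which is why it works as a trap for bots but not as a general compression strategy.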


----------



## Deleted member 24505 (Jun 1, 2019)

Solaris17 said:


> generally not, you can play with the compression algorithm, but compression ratio % is based off of the type of data being compressed.
> 
> They do have ways to do it, but only with data that can be compressed in very high levels. We actually use it in networking to stop bots.
> 
> ...



Thanks, that was an interesting little read.


----------



## blobster21 (Jun 1, 2019)

How would you use this script to protect a Nextcloud instance?

What would be the name of the PHP file, in order to catch a bot?


----------



## Bill_Bright (Jun 2, 2019)

youself553 said:


> How do I compress 10GB of anything to 40MB with WinRAR?





Solaris17 said:


> They do have ways to do it


I don't believe that is true in this scenario. 40,000,000 / 10,000,000,000 = 0.004, or 0.4%. It would be nearly impossible to compress anything that much. You are talking about a compression ratio of over 99%! The example given in Solaris17's link is not realistic at all (a file with nothing but 0s).

Perhaps, maybe, if in some very rare and extreme circumstance your original file did contain 10GB of just 0s (as indicated in Solaris17's link), or it was an image of a polar bear eating marshmallows in a snow storm, or a black bear eating blackberries in a deep cave on a moonless night, you might be able to compress it down that much. But even then, I doubt it. And if the file contained real data, I say no way. The fact is, most compression algorithms are unable to reduce a real data file even 50%, 33% if lucky.

What type of file are we talking about? If nothing but plain text, you can compress it quite a bit, especially if much of the text is repeated many times. But if there are any images, especially high-resolution images, your luck begins to run out, particularly if you are looking to achieve lossless compression. It should also be noted that many file types are already highly compressed, like JPEG or MP4 media files.

Your best bet may be to break your original file down into smaller parts, then compress the parts. But understand you cannot then merge them into another compressed file that is even smaller.
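The point about real data versus repetitive data is easy to demonstrate. A rough sketch (file sizes here are illustrative, using Python's `zlib` as a stand-in for WinRAR's DEFLATE-family compression): random bytes model already-compressed content like JPEG or MP4, while repeated text models a log file or plain-text document.

```python
import os
import zlib

# High-entropy data (stands in for JPEG/MP4/encrypted content): incompressible
random_data = os.urandom(1_000_000)

# Highly repetitive text (stands in for logs or plain text): compresses well
text_data = b"the quick brown fox jumps over the lazy dog " * 25_000

for name, data in [("random", random_data), ("text", text_data)]:
    c = zlib.compress(data, 9)
    print(f"{name}: {len(data):,} -> {len(c):,} bytes ({len(c) / len(data):.1%})")
```

The random sample stays at roughly 100% of its original size (compression can even add a few bytes of overhead), while the repetitive text shrinks to a small fraction, which is the gap between the OP's 0.4% target and what real data allows.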


----------



## Solaris17 (Jun 2, 2019)

Solaris17 said:


> generally not, you can play with the compression algorithm, but compression ratio % is based off of the type of data being compressed.





Bill_Bright said:


> I don't believe that is true in this scenario. 40,000,000 / 10,000,000,000 = 0.004, or 0.4%. It would be nearly impossible to compress anything that much. You are talking about a compression ratio of over 99%! The example given in Solaris17's link is not realistic at all (a file with nothing but 0s).
> 
> Perhaps, maybe, if in some very rare and extreme circumstance your original file did contain 10GB of just 0s (as indicated in Solaris17's link), or it was an image of a polar bear eating marshmallows in a snow storm, or a black bear eating blackberries in a deep cave on a moonless night, you might be able to compress it down that much. But even then, I doubt it. And if the file contained real data, I say no way. The fact is, most compression algorithms are unable to reduce a real data file even 50%, 33% if lucky.
> 
> ...



Please learn to read context bill, and maybe then not even context. My post was pretty clear already.


----------



## Bill_Bright (Jun 2, 2019)

Solaris17 said:


> Please learn to read context bill, and maybe then not even context. My post was pretty clear already.


Gee whiz. I already acknowledged correctly what your posts said. Please learn to read context yourself - including the OPs as well as mine. You are criticizing me for agreeing with you!


----------



## micropage7 (Jun 2, 2019)

Some data, like text or office documents, can be packaged into a small size, but if it's apps or video, good luck with that.
A highly compressed file also carries a higher risk of data corruption.
WinRAR? Although WinRAR is well known for compressing files, I guess it's not capable of that; please ask the NSA for the right app.

And why not create multiple RARs, splitting the big file into several volumes?
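Splitting doesn't make the data any smaller overall, but it does let you move a big archive across size-limited media. In WinRAR this is the "split to volumes" option in the archive dialog (the `-v<size>` switch on the `rar` command line); a rough equivalent with standard Unix tools, using illustrative sizes, looks like this:

```shell
# Build a 10 MB test file, compress it, then split the archive into ~3 MB parts
head -c 10000000 /dev/urandom > big.bin
gzip -9 -c big.bin > big.bin.gz
split -b 3000000 big.bin.gz part_

# Reassemble the parts and verify the round trip is lossless
cat part_* | gzip -d > restored.bin
cmp big.bin restored.bin && echo "round trip OK"
```

Note the parts concatenate back into the exact original archive; as Bill_Bright says above, compressing the already-compressed parts again gains essentially nothing.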


----------



## Nuke Dukem (Jun 2, 2019)

Solaris17 said:


> compression ratio % is based off of the type of data being compressed.



I can confirm. Story time: a few years ago our office PC was running laggy, so I poked around and found the cause. It turned out the access control software was constantly writing new data to some obscure log file that had grown to 1.56 GB, if memory serves me right. Deleting the log fixed the issue, but I made a 7zip archive of the file before that, just in case it was needed some day. The archive was mere megabytes in size, almost certainly under 50MB.


----------



## NdMk2o1o (Jun 2, 2019)

There are more advanced archiving tools than WinRAR/WinZip that use much higher compression standards and ratios, though as pointed out above it depends on what the data is. Generally speaking, most common data types cannot be compressed by this much, some hardly at all. How much compression you can achieve depends on the file type, and there is no silver bullet for this sort of thing; otherwise it would be common practice and we would all be doing it to store terabytes worth of data on a single 80GB drive. We're not.

Edit: I just had fun though  I opened Notepad and proceeded to continually type 0's into a blank doc. After a while I copied and kept pasting all the 0's I had typed before to increase the count faster, and did this again and again until I got a "ran out of memory" error...... then I saved the file. It was only 2GB, but I opened up WinRAR, added it to an archive with the best compression ratio chosen, and now I have 2GB worth of crap in a .txt file whose archive is only 129KB 

Edit 2: now the original file is too big to be opened by notepad   wordpad said hold my beer and is currently not responding whilst trying to open it we shall see


----------

