Age: 4817.15d   Health: 83%   Posters: 2   Posts: 5   Replies: 4   Files: 0

>>MadHax!DCHAGqU/Do  1jul2011(fr)21:22  No.1453  OP  P1
MackanZoor -- compression

The site detects whether files are compressed or not, so why not compress them if they aren't? You'd save disk space and bandwidth, and the files wouldn't experience any loss. It's not really necessary to preserve the files bit-for-bit as long as the content remains undamaged.

It's not hard, and in fact you already have portable C++ code to do it -- it's RAR'd in MadHax_destink100.swf. Link zlib, include src/swf.h, then do something like this:

#include "src/swf.h"  // the portable SWF class RAR'd inside MadHax_destink100.swf

swf target;
target.loadFromDisk(infile);      // read the original, uncompressed file
target.saveToDiskAsCWS(outfile);  // write it back with a zlib-compressed "CWS" body
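
For reference, swf.h itself stays RAR'd in that flash, but per the published SWF spec the whole FWS-to-CWS conversion boils down to: keep the first 8 header bytes, flip the signature byte to 'C' and zlib-compress everything after them. A rough self-contained sketch of that idea (the function name and the file handling are mine, not anything from swf.h):

#include <cstdio>
#include <vector>
#include <zlib.h>

// Rewrite an uncompressed "FWS" flash as a compressed "CWS" one.
bool compressSwf(const char* infile, const char* outfile) {
    FILE* in = std::fopen(infile, "rb");
    if (!in) return false;
    std::fseek(in, 0, SEEK_END);
    long size = std::ftell(in);
    std::fseek(in, 0, SEEK_SET);
    std::vector<unsigned char> buf(size);
    std::fread(buf.data(), 1, buf.size(), in);
    std::fclose(in);

    // Only uncompressed files ("FWS") need converting; "CWS" is already zlib'd.
    if (size < 8 || buf[0] != 'F' || buf[1] != 'W' || buf[2] != 'S') return false;

    // Compress everything after the 8-byte header.
    uLongf packedLen = compressBound(buf.size() - 8);
    std::vector<unsigned char> packed(packedLen);
    if (compress2(packed.data(), &packedLen, buf.data() + 8,
                  buf.size() - 8, Z_BEST_COMPRESSION) != Z_OK) return false;

    FILE* out = std::fopen(outfile, "wb");
    if (!out) return false;
    buf[0] = 'C';  // signature becomes "CWS"; the version byte and the file length
                   // field (which holds the uncompressed size, per the spec) stay as-is
    std::fwrite(buf.data(), 1, 8, out);
    std::fwrite(packed.data(), 1, packedLen, out);
    std::fclose(out);
    return true;
}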

A caveat I have to point out is that it doesn't seem to work that great on the batch of huge porn files that were recently uploaded. A 22-frame, 15.4 MiB file only shrinks to 15.2 MiB. I'm guessing it has to do with an inefficiency in how the images are stored (JPEG rather than 8-bit lossless). It'd be more involved to optimize this type of swf, but if you want, I'm willing to work with you to do that.

>>!///SWFAnts  #ADMIN#  4jul2011(mo)07:03  No.1457  SWF  P2R1
Thanks for the suggestion, always glad to hear thoughts about the site. The system I wrote that validates the uploaded flash files already has an option to auto-compress uncompressed files, but in the end I decided to turn that off.

I found two reasons for doing so. The first is simply that I don't want to "help the internet get two versions of a flash file". For example, if person A created an uncompressed flash file and uploaded it to 4chan's /f/ section, person B could download it and upload it to swfchan.org. If the system then compressed the flash file and person C downloaded it, there would now be both a compressed and an uncompressed version in existence. Not a super big deal, sure, but these two versions could be used to bypass the "duplicate file" detection that 4chan and many other sites have, since it only works on checksums (not to forget local indexing services on one's own computer). Granted, anyone can make these alternative versions of the same flash, but most people just don't know about it.

I was a bit foolish when I wrote the system on swfchan.com a few years ago: it also separates uncompressed and compressed flash files even when they are the same file. So if swfchan.org always compressed the uploaded files, a lot more dupes would show up. Of course swfchan.org itself is immune to this uncompressed/compressed issue, which means I could improve the algorithm on swfchan.com, but that will take a lot of time (merging the two flash files' "wiki" pages will probably be the biggest hassle) and I just don't have the motivation at the moment. I will fix the system on swfchan.com some day though, and when that happens there will be no more duplicate files (and the issue that it sometimes, rarely, mistakes two different files for the same file will also be gone). TL;DR: changing compression on uploaded files would create unnecessary duplicate files. :)
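
(To make the checksum point concrete: most dupe checks just hash the raw bytes, so the FWS and CWS versions of the very same flash register as two different files. A tiny sketch, using zlib's crc32 only as a stand-in for whatever digest a given board really uses, and with placeholder file names:)

#include <cstdio>
#include <zlib.h>

// CRC-32 of a whole file, read in small chunks.
static uLong fileCrc(const char* path) {
    FILE* f = std::fopen(path, "rb");
    if (!f) return 0;
    uLong crc = crc32(0L, Z_NULL, 0);
    unsigned char chunk[4096];
    size_t n;
    while ((n = std::fread(chunk, 1, sizeof(chunk), f)) > 0)
        crc = crc32(crc, chunk, (uInt)n);
    std::fclose(f);
    return crc;
}

int main() {
    // Same flash before and after recompression -- the checksums differ,
    // so a checksum-only dupe check treats them as unrelated files.
    std::printf("FWS: %08lx\nCWS: %08lx\n",
                fileCrc("original.swf"), fileCrc("recompressed.swf"));
}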

The second reason is a bit simpler: some flash files do not benefit from being compressed. That's probably why compression isn't always forced when making new flash files with the official tools these days. Simple embedded movies are only reduced by a few measly bytes (under the worst conditions the file may actually grow by a couple of bytes when compression is enabled), it's a lot harder (technically) to do stuff like seeking to a frame in a compressed flash file, it gets more bothersome for search engines to access the flash's meta data, the CPU has to work a bit harder, and so forth. In other words, sometimes the compression is not worth it.

Still, I would have liked it if compression were always required in flash, just to keep things consistent, and most of the time compression is only a good thing. Kind of a pity that Macromedia only introduced it in version 6. In a perfect Internet, Flash compression would always be on... AND its header would be a lot easier to read. They actually do not store the flash dimensions in easy-to-read pixels; instead they use a measurement called "twips", where 20 twips equal 1 pixel (without any scaling). But we're not done yet: these "twips" are hidden in RECT structures that are encoded in, hold on to your hats, bit fields! I kind of understand that they wanted to save some disk space on every single rectangle in a flash (to reduce the total file size by maybe a KiB or two?), but they could at least have spent an extra four bytes in the header to make the flash dimensions easy to read. Now you have to be almost a wizard-class hacker just to determine how big a flash is. Boy, makes my brain bubble every time I think about it, haha. And why the meta data is inside the compressed part of the flash is also a little mind-boggling.
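
To make the twips/RECT gymnastics concrete, here is a rough sketch (my own code, not anything from the site, and it assumes an uncompressed "FWS" file -- a "CWS" one would have to be inflated first) of digging the stage size out of the header. The four RECT fields are technically signed, but the stage rectangle never goes negative, so they are read as plain unsigned bits here:

#include <cstdint>
#include <cstdio>

// Read bits MSB-first, the way SWF packs its bit fields.
struct BitReader {
    const uint8_t* data;
    size_t bytePos = 0;
    int bitPos = 0;  // 0..7, counted from the most significant bit

    uint32_t readUB(int count) {
        uint32_t value = 0;
        for (int i = 0; i < count; ++i) {
            uint32_t bit = (data[bytePos] >> (7 - bitPos)) & 1;
            value = (value << 1) | bit;
            if (++bitPos == 8) { bitPos = 0; ++bytePos; }
        }
        return value;
    }
};

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s file.swf\n", argv[0]); return 1; }
    FILE* f = std::fopen(argv[1], "rb");
    if (!f) { std::perror("fopen"); return 1; }
    uint8_t header[64] = {0};
    std::fread(header, 1, sizeof(header), f);
    std::fclose(f);

    if (header[0] != 'F' || header[1] != 'W' || header[2] != 'S') {
        std::fprintf(stderr, "expected an uncompressed FWS file\n");
        return 1;
    }

    int version = header[3];
    uint32_t fileLength = header[4] | (header[5] << 8) | (header[6] << 16)
                        | ((uint32_t)header[7] << 24);  // little-endian UI32

    // The stage RECT starts at byte 8: 5 bits of Nbits, then Xmin/Xmax/Ymin/Ymax,
    // each Nbits wide, all in twips (20 twips = 1 pixel) -- and not byte-aligned.
    BitReader br{header + 8};
    int nbits = (int)br.readUB(5);
    uint32_t xmin = br.readUB(nbits), xmax = br.readUB(nbits);
    uint32_t ymin = br.readUB(nbits), ymax = br.readUB(nbits);

    std::printf("SWF version %d, %u bytes, stage %u x %u px\n",
                version, (unsigned)fileLength,
                (unsigned)((xmax - xmin) / 20), (unsigned)((ymax - ymin) / 20));
    return 0;
}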

>>MadHax!DCHAGqU/Do  4jul2011(mo)11:54  No.1460  OP  P3R2
Not to mention that said bit fields are not byte-aligned, so you end up with variables spanning half the bits of one byte and half of another. Yeah, not fun.

Kind of disappointed that you're skipping compression just to keep duplicates from arising. IMO it would be beneficial to the Internet if people started spreading the smaller versions and the larger ones dropped into obscurity. And if duplicates being uploaded to /f/ became a problem, it might give them an incentive to fix their system.

Still, how about optimization? There was a recent trend of very large uploads that were simple cel-shaded line drawings, the aforementioned worst case eating 15.4 MiB for 22 frames. It doesn't shrink under the standard compression because the images are stored as DEFINEBITSJPEG2 tags. However, if you look at the frames, they have long runs of the same color and probably fewer than 256 distinct colors in total -- perfect conditions for the PNG format. Grabbing a capture and converting it to PNG brings a frame down to 144 KiB, and 144 KiB * 22 frames comes to about 3.1 MiB, considerably better than 15.4 MiB without losing an iota of signal.
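
In case anyone wants to check where the weight actually sits in one of these files, here is a rough sketch (again assuming an uncompressed FWS file and nothing fancier than the tag layout in the public SWF spec) that walks the tag records and tallies how much of the file is DEFINEBITSJPEG2 data (tag code 21):

#include <cstdint>
#include <cstdio>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s file.swf\n", argv[0]); return 1; }
    FILE* f = std::fopen(argv[1], "rb");
    if (!f) { std::perror("fopen"); return 1; }
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    std::vector<uint8_t> buf(size);
    std::fread(buf.data(), 1, buf.size(), f);
    std::fclose(f);

    if (size < 12 || buf[0] != 'F' || buf[1] != 'W' || buf[2] != 'S') {
        std::fprintf(stderr, "expected an uncompressed FWS file\n");
        return 1;
    }

    // Skip the header: 8 fixed bytes, the bit-packed stage RECT (5 bits of Nbits
    // plus 4 fields of Nbits bits, padded to a byte), then frame rate and count.
    int nbits = buf[8] >> 3;
    size_t pos = 8 + (5 + 4 * nbits + 7) / 8 + 4;

    uint64_t jpegBytes = 0;
    int jpegTags = 0;
    while (pos + 2 <= buf.size()) {
        uint32_t codeAndLen = buf[pos] | (buf[pos + 1] << 8);  // RECORDHEADER (UI16)
        pos += 2;
        uint32_t tagCode = codeAndLen >> 6;
        uint32_t tagLen = codeAndLen & 0x3F;
        if (tagLen == 0x3F) {            // long form: a 32-bit length follows
            if (pos + 4 > buf.size()) break;
            tagLen = buf[pos] | (buf[pos + 1] << 8) | (buf[pos + 2] << 16)
                   | ((uint32_t)buf[pos + 3] << 24);
            pos += 4;
        }
        if (tagCode == 0) break;         // End tag
        if (tagCode == 21) {             // DefineBitsJPEG2
            jpegBytes += tagLen;
            ++jpegTags;
        }
        pos += tagLen;
    }
    std::printf("%d DefineBitsJPEG2 tags holding %.1f MiB of a %.1f MiB file\n",
                jpegTags, jpegBytes / (1024.0 * 1024.0), size / (1024.0 * 1024.0));
    return 0;
}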

It would create an alternate version of the file, granted, but how about making it an optional step when people upload files directly here -- as was the case for the files that inspired this thread?

>>!///SWFAnts  #ADMIN#  4jul2011(mo)19:42  No.1462  SWF  P4R3
I do understand your point and appreciate your suggestion, but that would be a lot of work for a result that I don't really want (different file versions). Just the amount of testing needed to make sure it works in 100% of all cases would be huge. Poorly optimized flash files are a bitch, but putting so much effort into saving a couple of MiB in the year 2011 would be a little silly, especially since the Internet will keep getting faster every year. Here in Sweden, people where I live already have the option of a 1 Gbps connection at home, and I expect that in two years it will be cheap enough for most to actually get it even though they don't really need it. Besides, it's not like even 10% of all files desperately need to become smaller anyway.

I also doubt that anything would change in the various sites' dupe checks, since most admins don't do much coding themselves and run imageboards that someone else has put together. Most admins will just think "why bother" and then go out and have a beer instead.

There was a third reason I didn't mention before... I just get the feeling that people would think it odd if the server modified the files they upload. They would look at the file size, go something like "hey, that's not the file I uploaded", then try to re-upload it and get annoyed over getting a dupe error. Well, I didn't mention it since it's more of a user-expectation thing than a real reason. It can be overcome through education, but then we have the problem of the many people who don't bother to read everything on every site they visit. Hehe.

>>MadHax!DCHAGqU/Do  4jul2011(mo)20:47  No.1463  OP  P5R4
I'd be willing to do the actual work of coding the filter, but alright, it ultimately comes down to the issue of duplicates.

It's worth noting that Internet speeds aren't that great everywhere. Here in McBurgerObeseFattyLand there are still large regions where the only available connection is dial-up (!), others where the only telecom doesn't give a damn and the connection spontaneously craps out for days at a time (I'm one of those lucky people), plus lackluster wifi hotspots, slow connections on portable devices, and usage caps (my parents are stuck with a meager 8 GB/month, and even my relatively decent 10 Mbps connection is capped at 100 GB/month).

Anyway, I'd like to thank you for running this site. I've found it to be very useful, particularly the logged /f/ threads. The info pages are also nice -- it's cool being able to study the ActionScript of neat effects without having to reverse engineer the entire file.




http://boards.swfchan.net/630/index.shtml
Created: 1/7 -2011 21:22:37 Last modified: 8/9 -2024 00:54:13