The Panopticon/100k size restriction


I propose to abolish a policy that is being regularly violated and creates additional workload for everyone.

The rule "you should try to keep your images to no more than [...] 100kb in size" is simply not followed. A quick count among the first 200 images in the Category:Big Finish Productions CD covers shows 73 images not complying with this rule. That's significantly more than 25%. At least one image is over 2M and multiple images are over 1M, i.e., more than 10 times the allowable size.

Of course, one of the admins can always go through these 73 images and scale them down. But that's a lot of work, and for what? Another solution would be to delete all those images and hope that they will eventually be re-uploaded by someone. This seems to be the current modus operandi for newly uploaded images (though not even all of these get deleted). But for old images this may not work that well. The downside would be that many pages would remain bare, possibly for a significant amount of time, as some of the images are from rather obscure ranges. At any rate, it would be a lot of work for many people, with unclear gain.

But since this policy is not followed anyway, the question one might ask is: is the justification for it still relevant? The current explanation is that 1) it's a waste of bandwidth; 2) it might adversely affect people with slow Internet. But in the age of streaming HD video and incessant video ads, these are simply not valid concerns anymore. We are not using dial-up. Even at a 10 Mb/s download speed (which I would not be able to survive on), a page with one 1 MB image will download in under a second, which is fine. The waste of bandwidth might be relevant for mobile data, but it would still be nothing compared to checking Facebook or posting on Instagram.
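
To put a number on that claim, here is a trivial back-of-the-envelope calculation; the 10 Mb/s figure is just the example speed used above.

```python
# Back-of-the-envelope: download time for an image at a given connection speed.
def download_seconds(size_megabytes: float, speed_megabits_per_s: float) -> float:
    # 1 byte = 8 bits, so convert MB to Mb before dividing by Mb/s.
    return size_megabytes * 8 / speed_megabits_per_s

print(download_seconds(0.1, 10))  # a 100 kB image: 0.08 s
print(download_seconds(1.0, 10))  # a 1 MB image:   0.8 s
```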

So the question is: is it worth changing all those old images and keeping admins occupied with policing and deleting new ones just because the Internet used to be slow? It's a bit like how TSA agents in American airports were at one point found to be spending 80% of their time searching for matchboxes. It distracts from important things rather than achieving anything.

And another aspect is usability. In a wiki setting, complex rules are less likely to be followed. Currently, there are two parameters that have to be satisfied: width and size. But the thing is: decreasing the width will decrease the file size, but not necessarily below 100 kB. So having both restrictions requires uploaders to perform two operations on an image instead of one (assuming the initial image is a high-resolution cover from Big Finish); see the sketch below. Moreover, some (simpler) image editors are incapable of performing the second operation at all. If the size restriction is removed, on the other hand, only one operation is required, and the resulting size will still not be outrageously bad, far from 1 MB or 2 MB.
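
To illustrate why the size cap is the harder of the two requirements, here is a sketch using the Pillow library. The file names, target width and quality values are illustrative assumptions of mine, not numbers taken from the policy.

```python
# Sketch of the two separate operations an uploader currently faces, using Pillow.
# File names and target values below are illustrative, not policy values.
from PIL import Image
import os

SOURCE = "cover_original.jpg"      # hypothetical high-resolution Big Finish cover
TARGET_WIDTH = 250                 # whatever width the image policy asks for
SIZE_LIMIT = 100 * 1024            # the 100 kB cap under discussion

img = Image.open(SOURCE)

# Operation 1: scale down to the required width (keeping the aspect ratio).
ratio = TARGET_WIDTH / img.width
img = img.resize((TARGET_WIDTH, round(img.height * ratio)))
img.save("cover_resized.jpg", quality=85)

# Operation 2: if the file is still over 100 kB, re-encode at lower quality
# until it fits -- the step that simpler editors cannot do at all.
quality = 85
while os.path.getsize("cover_resized.jpg") > SIZE_LIMIT and quality > 10:
    quality -= 5
    img.save("cover_resized.jpg", quality=quality)

print(os.path.getsize("cover_resized.jpg"), "bytes at quality", quality)
```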

Thus, it appears that removing the size restriction while making the width restriction clearer and more prominent in the policy is a win-win in terms of workload all around. Amorkuz 00:18, November 12, 2015 (UTC)