You probably got sent here from a very large wiki page.
Before you add to that page, please consider that a fair number of us cannot edit wiki pages beyond a certain size (seemingly 32k characters). Perhaps you could consider either a) refactoring the page so that it is smaller, and/or b) pointing discussion onto a new page.
Note for Wiki Gnomes: there's a script that shows the largest pages on Wiki: c2.com. You can use it to easily pick out targets for big refactoring work.
One could perhaps say that what Wiki really needs is for enormous pages to be refactored into more easily malleable chunks.... So do it, but be careful, very careful.
Unless, of course, there are better solutions. Anyone have one? -- Michael Hill
Sure. Try taking out the least useful 16k characters before you save. -- Ward Cunningham
Ya know, I just knew he was gonna say that. Ward, I know you've said this in a dozen places a dozen different ways, and I know you really mean it. But it's just so scary!!
32K of text can't all be talking about the same thing. Refactor. See Wiki Refactoring.
A method should be small enough to fit in one page - nay, in your head. A Wiki Page should be small enough to fit in 32K - nay, 16K. ;)
Yay, 16K should be enough for everyone!
I have a proposal for a new Wiki misfeature. When a page grows larger than some limit (say 16K), there is a chance that it gets "accidentally" lost. How unfortunate. The chance increases the larger the page gets. -- Stephan Houben
Another page is too big for me to edit. It is called Yagni And Logging. Yagni And Logging Has Got Too Big -- John Fletcher
May I suggest that when a page becomes too big, this fact should not immediately be turned into its very own Wiki Page? Yagni And Logging is being refactored, as the page itself plainly states - hopefully this will succeed and the page will become small enough. But the new Wiki Page will remain as a permanent legacy of that transient condition.
Yes true. That was my second go at getting attention to the problem. The first was to edit this page in the hope that its presence in the Recent Changes would be enough. It is one of the features of Wiki that such pages persist, but I suspect that they don't get much attention once they drop out of current interest. I am much more interested in not letting pages get so big in the first place. -- John Fletcher
I propose that there be a special code to indicate the end of a wiki page (automatically generated by the system). It could be something like this, for example:
\\END OF WIKI PAGE\\ All text being edited must go above this line. It would only show up in "Edit" mode, and when you try to Save, the perl script should look for it to verify that nothing was cut off. If it is not found, it can notify the user that their save did not happen because they cut off part of the wiki page; they can then press Back, refactor the page (leaving at least six backslashes on the last line), and try again if they want to save their changes.
This would make accidental loss impossible. The exact codes used are not important, that was just an example. -- Jeff Day
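A minimal sketch of the idea, in Python rather than the Perl the real Wiki script uses; the function names here are made up for illustration, not taken from the actual codebase:

  SENTINEL = r"\\END OF WIKI PAGE\\"  # the literal end-of-page marker

  def render_edit_box(page_text):
      # The marker is appended only for the edit form; it is never stored with the page.
      return page_text + "\n" + SENTINEL + "\n"

  def handle_save(submitted_text, save_page):
      # If the browser silently truncated the textarea, the marker is missing.
      if SENTINEL not in submitted_text:
          return "Save rejected: part of the page was cut off. Press Back, refactor, and try again."
      # Strip the marker (and anything after it) before storing the page.
      body = submitted_text.split(SENTINEL, 1)[0].rstrip("\n")
      save_page(body)  # hypothetical storage call
      return "Saved."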
Would be nice if Wiki told you how large the page was on the save confirmation screen. If the text is near or over 32K, it could display a bold red heading saying "This page is over 32K; some people will not be able to edit it."
-- Jeff Grigg
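A rough sketch of that size check on the save-confirmation screen; the 32K threshold comes from the discussion above, and the function name is invented:

  LIMIT = 32 * 1024  # characters; roughly where some browsers stop being able to edit

  def size_warning(page_text):
      size = len(page_text)
      message = "This page is %d characters long." % size
      if size >= LIMIT:
          message += " This page is over 32K; some people will not be able to edit it."
      return message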
I think a good size for a Wiki Page is 10K. I can read a page that size in one breath. A larger page makes my attention dwindle away.
An easy way to refactor a huge page is to just cut it up into pieces: PageNamePartOne, PageNamePartTwo, et cetera, and link the pieces at the bottom of each page. PageName itself should become the refactored page, which will somehow emerge while you are doing all this cutting and re-editing. -- Gerard Buisman
Splitting is often a good thing, as long as it's not indiscriminate. Dont Make Part Two, please.
The problem with long pages is not only technical, but mostly human - nobody really can grasp it all. Something that would help Wiki Refactoring is a Wysiwyg Wiki.
Some topics, such as Value Existence Proof, are indeed technically too big to edit. The server gives a "memory full" error message if you try to add about a paragraph more. I suspect it's using a Perl object with a maximum size, or perhaps Apache has been configured to limit request sizes to prevent DOS attacks. I'm not complaining, because large pages are not good from a usability standpoint anyhow. It's just a technical curiosity.
Discovery: the Wiki itself has some kind of size limitation. Somewhere between 257K and 267K, it becomes unable to save changes to a page. Changes In November is 257K (with MS-DOS CR-LF line termination), and I cannot add November 30th (Changes In November Thirtieth) to it.
The Wiki gives this error message:
  The WikiWiki Server Can not Process Your Request
  Out of memory. This information has been logged.
  We are sorry for any inconvenience.
Most Interesting.
-- Jeff Grigg
The limit was introduced around August 2004, but is evidently too low; don't count on Ward increasing it, though.
No doubt because of spam. Can you imagine the problems if spammers could add any length of page that they pleased?

No, it wasn't to prevent spam (which rarely exceeds 60k characters), but was related to a deletion/restoration war at that time.
Notes:
References to editing problems with Really Old Browsers removed, 6 Jul 11.
Thank You Moores Law (and related laws) for allowing us to ever expand our bloat ;-)
Contrast: Wiki Quicky
See original on c2.com