Abstract: A simple HTTP/2.0 addition would remove the GET size limitation.
Problem example: Say you have a search form, and instead of passing 100 characters to the report server, you need to pass 1,000 characters to GET a report.
Problem: If the state information passed from page to page is small (e.g. a simple web page request), then GET works fine. But when the state information is larger than will fit in a URI (i.e. more complex web requests), then either a session variable or POST must be used to pass it. Session variables time out, and POST is non-idempotent and thus not suited to merely passing state. Either way, the most basic mechanism of web statefulness for complex but idempotent page requests is currently broken.
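To make the size problem concrete, here is a minimal sketch of encoding form state into a GET query string. The endpoint, field names, and state size are illustrative; the 2,083-character cap is the historical Internet Explorer URL limit, used here as a stand-in for "implementations which cannot handle long URLs."

```python
# Sketch: large state does not fit in a GET URI.
# example.com/report and the field names are hypothetical.
from urllib.parse import urlencode

state = {"query": "x" * 3000, "sort": "date", "page": "3"}
url = "http://example.com/report?" + urlencode(state)

PRACTICAL_LIMIT = 2083  # historical Internet Explorer maximum URL length
print(len(url), len(url) <= PRACTICAL_LIMIT)
```

The same state sent as a POST body has no such limit, but then the browser treats the request as non-idempotent.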
This is what commonly causes the widely discussed IE “Webpage has expired” error when the browser Back button is used after a complex page request. It should be noted that IE is faithfully following the standard; it is the standard itself that is limiting.
Other browsers (Firefox, Chrome, Opera, and Safari) return the previously cached page rather than report the possible problem. That makes complex pages work, but it is not really the best thing either, because it puts the user at risk of double purchases. Furthermore, for pages to be browser-independent, they must be dumbed down to suit the IE behavior.
Proposed solution: Add a new idempotent method (similar to GET); for the moment let’s call it “STATE” to signify that you are simply passing some state data, not making a database update. Have it functionally work like POST in how it passes the variables, i.e. in the message body with no size constraint, but have it designated like GET as idempotent, so browsers can re-request it without showing expired warnings.
Usage example: <form action="MakeFlyer.php" method="state">
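On the server side, the proposal could be prototyped today: Python’s `BaseHTTPRequestHandler` dispatches any request method name to a matching `do_<METHOD>` handler, so an experimental server can accept the hypothetical STATE method and read its body exactly as it would a POST body. This is a sketch only; STATE is not a registered HTTP method, and the handler and field names are illustrative.

```python
# Sketch: an experimental server accepting the hypothetical STATE method.
from http.server import BaseHTTPRequestHandler
from urllib.parse import parse_qs

class StateHandler(BaseHTTPRequestHandler):
    def do_STATE(self):
        # Read the form-encoded state from the request body,
        # just as a POST handler would: no URI length limit applies.
        length = int(self.headers.get("Content-Length", 0))
        fields = parse_qs(self.rfile.read(length).decode())
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(f"received {len(fields)} fields".encode())
```

The browser-side half of the proposal (treating STATE as idempotent, so Back re-issues the request without a warning) is exactly the part that cannot be prototyped and needs standardization.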
A good problem reference: see the very bottom of this page on passing data between web pages, where it says: “If the form data set is large – say, hundreds of characters – then METHOD=”GET” may cause practical problems with implementations which cannot handle that long URLs. …”
Issues: This will not allow bookmarking the page request the way a GET request does; applications that need that capability can still use GET. It is not reasonable to simply increase the permitted URI length, as that has too many other implications.
Hope this makes sense to a few of you. It’s my first suggestion to the W3C after 17 years of web development.