I couldn't get through the wall of boring PDF text, but is this a scheme to checksum a URL, then send the code with the request, so the webserver can 404 if the user altered the URL?
God, the way this was formatted is painful on the eyes, but TL;DR: the paper gives a fairly detailed overview of a method for mitigating the manipulation of URLs to artificially influence search rankings, which plagues many online service providers, especially in retail and e-commerce. The core problem is that editable URLs get exploited by automated systems or "click farms" to inflate the prominence of certain search results, which skews the integrity of web interactions and analytics. The paper's proposed solution is to preserve the authenticity of user interactions and make request data verifiable, so if that's your thing, good luck trying to read it.
so ... a bearer token for a specific URL
Yes, along with incorporating into the checksum any other session data the server desires. Basically, DRM for links.
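For anyone who wants the gist without the PDF, here's a minimal sketch of that idea, assuming nothing from the paper beyond what's in this thread: HMAC the URL plus whatever session data the server cares about, ship the tag as a query parameter, and 404 anything that doesn't verify. The helper names and the "sig" parameter are made up for illustration, not taken from the paper.

    # Sketch only: HMAC the URL (plus optional session data) server-side,
    # append the tag as a query parameter, reject requests whose tag
    # doesn't recompute. Names here are invented for illustration.
    import hashlib
    import hmac
    from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

    SECRET_KEY = b"server-side secret, never sent to the client"

    def sign_url(url: str, session_id: str = "") -> str:
        """Return the URL with an HMAC tag over (URL + session data) appended."""
        tag = hmac.new(SECRET_KEY, (url + session_id).encode(), hashlib.sha256).hexdigest()
        parts = urlparse(url)
        query = parse_qsl(parts.query) + [("sig", tag)]
        return urlunparse(parts._replace(query=urlencode(query)))

    def verify_url(signed_url: str, session_id: str = "") -> bool:
        """Recompute the tag over the URL minus 'sig'; serve a 404 if it differs."""
        parts = urlparse(signed_url)
        query = parse_qsl(parts.query)
        received = dict(query).get("sig", "")
        bare = urlunparse(parts._replace(
            query=urlencode([kv for kv in query if kv[0] != "sig"])))
        expected = hmac.new(SECRET_KEY, (bare + session_id).encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(received, expected)

    # Any edit to the path or query invalidates the tag.
    link = sign_url("https://shop.example/search?q=shoes&page=2", session_id="abc123")
    assert verify_url(link, session_id="abc123")
    assert not verify_url(link.replace("page=2", "page=1"), session_id="abc123")

The tag is worthless without the server-side key, so tweaking page=2 to page=1 (or any other part of the link) just gets you a 404 instead of a different result.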