Cleaning is really applied to only a part of the link, namely URL.path, URL.search (and maybe URL.hash). Since some rules (e.g. whitelisting) apply per domain, it is useful to start from a properly parsed URL object.
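A minimal sketch of what that could look like (assumed names, not the extension's actual code): parse the link once, then apply cleaning to the individual components.

```js
// Sketch only: clean each component of a parsed URL separately.
function cleanLink(link, cleanPart) {
  const url = new URL(link);                      // properly parsed URL object

  url.pathname = cleanPart(url.pathname, url.hostname);
  url.search = cleanPart(url.search, url.hostname);
  url.hash = cleanPart(url.hash, url.hostname);   // maybe, as noted above

  return url.href;
}

// Trivial example cleaner: drop one illustrative tracking parameter.
const exampleCleaner = (part, _host) => part.replace(/[?&]utm_source=[^&]*/, '');

console.log(cleanLink('https://example.com/p?utm_source=x#frag', exampleCleaner));
// → https://example.com/p#frag
```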
Finding the link is only (partly) tricky in the injected script, since when checking the header we already have the full URI. The injected script is useful for visual feedback on cleaned links, but anything that fails there will be caught later on if request cleaning is enabled.
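Roughly, the request-cleaning fallback could look like this (assumed handler, illustrative rule only; the webRequest listener always sees the full URI):

```js
// Sketch of the request-cleaning fallback: redirect to the cleaned URL.
browser.webRequest.onBeforeRequest.addListener(
  details => {
    const url = new URL(details.url);
    url.searchParams.delete('utm_source');   // illustrative rule, not the real list
    const cleaned = url.href;
    return cleaned !== details.url ? { redirectUrl: cleaned } : {};
  },
  { urls: ['<all_urls>'] },
  ['blocking']
);
```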
Once that is done, it will be easy to allow per-domain rules, such as which parameters to clean (see discussion #20), or rules that are currently hardcoded (e.g. on google.com/search, don't clean if we find the URL in the parameter q=).
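A hypothetical shape for such per-domain rules (names and structure are only illustrative of the idea, not an agreed format):

```js
// Hypothetical per-domain rule table.
const perDomainRules = {
  'www.google.com': {
    // on /search, keep the link as-is if the target URL shows up in q=
    skipIfUrlInParam: ['q'],
  },
  'example.org': {
    removeParams: ['ref', 'fbclid'],   // parameters to clean on this domain
  },
};

function rulesFor(hostname) {
  return perDomainRules[hostname] || {};
}

console.log(rulesFor('www.google.com'));   // → { skipIfUrlInParam: ['q'] }
```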
URL objects make URLs canonical, so comparing them for "equality" sometimes fails wrongly and we add non-cleaned links to the history (e.g. a comma "," becomes %2C). This happened with standalone parameter cleaning in v3.1.2.
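One way this can show up, sketched with URLSearchParams (the exact code path in v3.1.2 may differ):

```js
// Rebuilding the query through URLSearchParams percent-encodes commas,
// so a byte-for-byte comparison reports a change although nothing was cleaned.
const link = 'https://example.com/?ids=1,2,3';

const url = new URL(link);
url.search = new URLSearchParams(url.search).toString();   // ids=1%2C2%2C3

console.log(url.href === link);   // false, yet no parameter was removed
```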
We should also clean paths/parameters in the URL fragment, detected when the fragment starts with #!/... or ?key=value, e.g. remove refid in https://m.facebook.com/home.php#!/photo.php?fbid=1234567890&refid=1234567890
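A rough sketch of how the #!/path?query form could be handled (hypothetical helper, and the parameter removed here is just the example above):

```js
// Treat a '#!/path?query' fragment as a nested URL and clean its query.
function cleanFragment(url) {
  const m = url.hash.match(/^#!?(\/[^?]*)\?(.*)$/);
  if (!m) return url;

  const params = new URLSearchParams(m[2]);
  params.delete('refid');                        // e.g. the Facebook case above
  url.hash = '#!' + m[1] + '?' + params.toString();
  return url;
}

const u = new URL('https://m.facebook.com/home.php#!/photo.php?fbid=1234567890&refid=1234567890');
console.log(cleanFragment(u).href);
// → https://m.facebook.com/home.php#!/photo.php?fbid=1234567890
```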
This will prevent erroneous cleaning of sections of a legitimate URL, as in the new test case (where aff0b550d3fe338b645a4deebdcb1b got removed). This is (likely) a temporary fix while waiting for #25, when we'll match parameters in a more robust way.
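For reference, a rough idea of what #25-style matching could look like, assuming it ends up removing whole key=value pairs rather than substrings inside the URL (parameter names here are purely illustrative):

```js
// Sketch only: exact parameter-name matching can't eat random sections of
// a legitimate URL the way substring matching can.
const TRACKING_PARAMS = ['utm_source', 'utm_medium', 'fbclid', 'refid'];

function stripTrackingParams(link) {
  const url = new URL(link);
  for (const key of TRACKING_PARAMS)
    url.searchParams.delete(key);        // removes exact parameter names only
  return url.href;
}

console.log(stripTrackingParams('https://example.com/report?fbclid=abc&id=7'));
// → https://example.com/report?id=7  (path and id= are left untouched)
```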