

My point isn’t really about the implementation per se (I’ve been aware of that limitation since at least 2011, when I read Link Prediction by De-anonymization: How We Won the Kaggle Social Network Challenge, so more than a decade ago) but rather that the “solution” Murena offers is not a mandatory service. If people want to use it, they can. I do not want to, so I do NOT have to. I’m not arguing that their solution is good or bad, only that it’s optional.
I understand the concern. I also imagine (I want to be optimistic here, maybe naively so) that most websites want some form of analytics, probably do not code it themselves, and instead of relying on aggregate data like a plain hit counter (maybe because crawlers and other bad agents do not respect robots.txt) went with something fancier. Maybe that fancier tool tries to mitigate automated traffic with fingerprint detectors.
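For context, robots.txt is purely advisory; a minimal, illustrative one (the paths are made up) looks like this, and nothing technically stops a bad agent from ignoring it:

```
# Purely advisory: well-behaved crawlers honor these rules,
# abusive ones simply ignore the file (paths here are illustrative).
User-agent: *
Disallow: /admin/
Crawl-delay: 10
```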
Well, one can understand that and still disagree with it. I suggest contacting the administrator of such a website with that concern, BUT in the meantime, until they actually act (which might be never), I suggest starting with self-defense and using dedicated tools, e.g. Firefox Enhanced Tracking Protection (you can use a non-Mozilla flavor of Firefox if you prefer) or, even more specifically, JShelter with its Fingerprint Detector.
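If it helps, here is a minimal sketch of the Firefox side of that self-defense, as a user.js dropped into the profile folder (the same switches are reachable under Settings > Privacy & Security; the pref names are the ones shipped in current Firefox, so double-check them on your version, and JShelter itself is configured from the extension, not here):

```js
// Sketch of a user.js enabling Firefox's built-in protections.
// Pref names assume a current Firefox release; verify in about:config.
user_pref("browser.contentblocking.category", "strict");  // Enhanced Tracking Protection: Strict
user_pref("privacy.trackingprotection.enabled", true);    // block known trackers in all windows
user_pref("privacy.resistFingerprinting", true);          // built-in fingerprinting resistance (may break some sites)
```

privacy.resistFingerprinting is the blunt instrument; JShelter’s Fingerprint Detector is, as far as I understand, more about detecting and reporting which scripts are actually fingerprinting you.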