• 0 Posts
  • 8 Comments
Joined 11 months ago
Cake day: March 22nd, 2025


  • yes, that’s why it’s called fingerprinting:

    it’s a mathematical function (a cryptographic hash) that takes the entire code as input and outputs a result that is, for all practical purposes, unique to that input.

    the result is just a short string of symbols (which really just represents a particular string of 1’s and 0’s).

    this string of characters is, as mentioned, unique to any given input.

    this string can then be compared to any other fingerprint, and if they match, you know it’s the same code.

    so in the case of signal, anybody can download the source, compile it, and verify that it matches the fingerprint of the compiled code on their own device (there’s a minimal sketch of that comparison below).

    that’s why it can’t be faked: you compare the already compiled code.

    if even a single byte of the code is out of place, it won’t produce the same string, and the comparison immediately flags it as a mismatch.

    it’s computationally infeasible to fake.
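
    here’s a minimal sketch of what that comparison looks like, assuming you already have two apk files on disk (the filenames are placeholders; signal’s real verification is a bit more involved than a raw file hash, iirc, because the store signature differs from a local build, but the principle is the same):

    ```python
    # minimal sketch: compare the fingerprint (SHA-256 hash) of two build artifacts.
    # filenames are placeholders; in practice you'd hash the apk you compiled from
    # source and the apk pulled from your own device.
    import hashlib

    def fingerprint(path: str) -> str:
        """return the SHA-256 hex digest of a file, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    own_build = fingerprint("signal-built-from-source.apk")
    device_build = fingerprint("signal-pulled-from-device.apk")

    # a single changed byte anywhere in either file produces a completely different digest
    print("match" if own_build == device_build else "MISMATCH")
    ```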


  • yeah, alright then:

    you are arguing from ignorance: you ask for evidence, then reject said evidence at the first paragraph instead of reading the entire thing, all because of a boilerplate disclaimer (which you of course don’t recognize as boilerplate).

    you read the executive summary, even though you asked for the methodology, which is explained in the studies linked under the sources of the article.

    you need to click through to the actual study to see the methodology.

    the link i provided is just a summary of multiple studies.

    the studies themselves don’t carry this disclaimer; it was added by factually, probably for legal reasons, not because the data is faulty.

    since you’re apparently too lazy to even click the links already pointing to the exact information you asked for, here’s the abstract of the NBER/Stanford paper (most relevant part at the end highlighted):

    This paper examines the impact of the UK’s decision to leave the European Union (Brexit) in 2016. Using almost a decade of data since the referendum, we combine simulations based on macro data with estimates derived from micro data collected through our Decision Maker Panel survey. These estimates suggest that by 2025, Brexit had reduced UK GDP by 6% to 8%, with the impact accumulating gradually over time. We estimate that investment was reduced by between 12% and 18%, employment by 3% to 4% and productivity by 3% to 4%. These large negative impacts reflect a combination of elevated uncertainty, reduced demand, diverted management time, and increased misallocation of resources from a protracted Brexit process. Comparing these with contemporary forecasts – providing a rare macro example to complement the burgeoning micro-literature of social science predictions – shows that these forecasts were accurate over a 5-year horizon, but they underestimated the impact over a decade

    from the CEPR/VoxEU article (already in plain language and easy to read):

    So, taking all this together, what’s the bottom line? First, the public is right. Brexit has damaged the UK economy. But, inevitably, the mechanisms and hence the impacts have been considerably more complex than economists could incorporate in macroeconomic or trade models, with their inevitably simplifying assumptions. To simplify hugely, however, it would be reasonable to say that the impact on trade overall has been broadly consistent with predictions so far, that on immigration much less negative (and perhaps even positive) and on investment somewhat worse.

    so, yes, brexit has been bad for the UK economy. definitely, without question.

    what IS still in question is how bad exactly it was.

    THAT’S where the uncertainty is.

    whether or not it was detrimental has been answered with abundant certainty: it was bad.



  • if we only ever look at past data, and never compare that data to alternative scenarios, then it gets really difficult to make better decisions in the future.

    in circumstances where the sample size is naturally limited to just 1, it is necessary to run simulations in order to gain insight into the outcome of any given event. there’s not really any other way to do it (there’s a toy sketch of the idea at the end of this comment).

    what you call “wank estimates” (very scientific, thank you) is a collection of well established research methodologies that have been used with great success in both predicting future outcomes and analyzing past outcomes.

    this is evidence. it provides rigorous, quantified estimates, in this case about brexit.

    this is factual evidence, not simply “wank estimates”.

    and the evidence suggests that the UK economy would be significantly better off without brexit.

    this is simply fact.

    that the UK economy did mostly fine on its own is not relevant, because that’s not the point.

    the point is that the economy would have been better off if the UK had remained.
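
    here’s a toy sketch of that counterfactual idea, with purely invented numbers (this is NOT the methodology of the linked studies, which combine macro simulations with firm-level survey data; it’s just the general shape of “compare observed data to a simulated no-brexit scenario”):

    ```python
    # toy illustration: fit a pre-event trend, extrapolate it as the "no-event"
    # scenario, and compare it with what was actually observed.
    # ALL numbers are invented for illustration only.
    from statistics import linear_regression  # python 3.10+

    years = list(range(2010, 2016))              # pre-referendum period
    gdp_index = [100, 102, 104, 106, 108, 110]   # hypothetical GDP index

    slope, intercept = linear_regression(years, gdp_index)

    observed_2025 = 124                              # hypothetical observed value
    counterfactual_2025 = slope * 2025 + intercept   # what the pre-event trend implies

    gap = (counterfactual_2025 - observed_2025) / counterfactual_2025
    print(f"estimated shortfall vs. the no-event scenario: {gap:.1%}")
    ```

    the real studies use far richer models and data, but the underlying logic is the same: you need a modelled “what if it hadn’t happened” baseline, because reality only gives you one run of the experiment.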


  • yeah, nah, brexit did have a major negative impact on the UK economy:

    Taken together, independent assessments paint a consistent picture: Brexit has reduced UK GDP (estimates commonly span roughly 2–8% to date, with central academic estimates clustering around 6–8% by 2025), slashed business investment (commonly estimated down 12–18% by mid‑2020s and in some scenarios far larger over decades), and trimmed productivity (roughly 3–4% in many studies and up to 4% in OBR scenarios), and they identify trade frictions, uncertainty and misallocation as core drivers—facts that point to policy levers on trade facilitation, investment incentives and productivity reforms if the UK seeks to narrow the gap with its peers [2] [4] [1] [3].

    damn near all economists agree on a negative, long-term impact on the UK economy. they just can’t be entirely sure exactly how bad it was, just that it was quite bad.

    single-digit percentage points don’t sound too bad, but when it’s whole percentage points of an entire economy, that works out to billions in lost economic activity (rough numbers at the end of this comment).

    the stats you provided do not show any comparison to a no-brexit/remain scenario, which is what should be compared.
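
    for scale, some rough back-of-the-envelope numbers, assuming UK GDP of roughly £2.5 trillion (a ballpark assumption; the exact figure doesn’t change the point):

    ```python
    # back-of-the-envelope: what "a few percentage points of GDP" means in pounds.
    # the GDP figure is a rough ballpark assumption, not an exact statistic.
    uk_gdp = 2.5e12  # ~£2.5 trillion, approximate

    for pct in (0.02, 0.04, 0.06, 0.08):
        print(f"{pct:.0%} of GDP ≈ £{uk_gdp * pct / 1e9:.0f} billion per year")
    ```

    so even the low end of the estimates works out to tens of billions of pounds in lost economic activity every year.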



  • afaik the client does collect a bunch of data, most (all, i think? but not 100% sure on that) of which is opt-in.

    they do need stuff like IPs for internet related features.

    telemetry-wise there’s the steam hardware survey, which is opt-in, and it asks every single time it attempts to collect your system’s hardware and OS information. this could technically be identifying information, but since it’s opt-in and entirely optional, it’s not a privacy violation. (plus it’s super useful for all involved: users, devs, and steam. it’s kind of a win-win, and straight up necessary info for devs to know which hardware they should optimize for)

    they might be putting it at the top because steam has native support for DRM?

    but that’s also weird, because DRM isn’t a privacy violation. it’s a shitty practice: it barely does anything, barely works, and keeps breaking or hobbling otherwise perfectly good games, but it has little to do with privacy. and the dev has to specifically opt in and integrate it as a feature…unless they’re thinking of 3rd-party DRM that can be waaay more intrusive, like Vanguard… THAT’S a privacy and security nightmare just waiting to blow up in people’s faces.

    otherwise…i haven’t really heard anything bad about steam privacy wise?

    doesn’t mean that there’s nothing to be concerned about, but i feel like there’d have been some news about it if there was…