Maybe, but Maybe Not: Unexpected Value, Unexpected Information, and Information-Weighted Probabilities
This paper introduces a simple probabilistic framework that blends expected and unexpected values with Shannon entropy and its "unexpected" complement to reveal deeper insights into decision-making under uncertainty. By considering Bernoulli random variables, I show how the interplay of probabilities, values, and information gives rise naturally to the optimal expectation.
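For a Bernoulli(p) variable, the "expected information" is just the binary Shannon entropy. The abstract's "unexpected" complement is not defined here, so the sketch below uses one plausible reading as an assumption: each outcome's surprisal weighted by the probability of the *other* outcome. The paper's actual definition may differ.

```python
import math

def shannon_entropy(p: float) -> float:
    """Expected information (bits) of a Bernoulli(p) outcome:
    H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def unexpected_information(p: float) -> float:
    """Hypothetical 'unexpected' complement (an assumption, not the
    paper's definition): surprisal of each outcome weighted by the
    probability of the opposite outcome."""
    if p in (0.0, 1.0):
        return 0.0
    return -((1 - p) * math.log2(p) + p * math.log2(1 - p))
```

Under this reading the two measures coincide at p = 0.5 (maximal uncertainty) and diverge as p moves toward 0 or 1, where the rare outcome's large surprisal dominates the "unexpected" measure.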
Quantifying the Plausible: Information Theory, COVID Origins, and Contested Historical Narratives
The early debate over COVID-19's origins exemplifies how institutions can coalesce prematurely around particular views, diminishing healthy uncertainty and risking significant policy missteps. In this paper, I develop a quantitative framework that merges legal standards of plausibility with modern information theory, showing how tools such as Shannon entropy and "unexpected information" can help maintain appropriate space for plausible alternatives.
Information Processing, Full-Information States, and the Tension Between Certainty and Surprise
This paper explores a novel perspective on how information processing shapes belief dynamics in binary-outcome settings. We examine two complementary measures of uncertainty, Shannon's classical entropy (expected information) and an alternative metric of unexpected information, and demonstrate how they jointly govern belief updates through time. The analysis provides analogies, from plumbing to CPU architectures, that illustrate how capacity constraints lead to bottlenecks, backlogs, and potential breakdowns in belief-updating systems.
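The bottleneck-and-backlog dynamic described above can be illustrated with a toy queueing sketch (an illustrative assumption, not the paper's model): incoming evidence arrives each period, but only a fixed processing capacity can be absorbed, so excess accumulates as a backlog.

```python
def simulate_backlog(arrivals_per_period, capacity):
    """Toy capacity-constrained processor: each period some amount of
    evidence arrives; at most `capacity` units are processed. Anything
    unprocessed carries over as backlog. Returns the backlog history."""
    backlog = 0.0
    history = []
    for arrivals in arrivals_per_period:
        backlog = max(0.0, backlog + arrivals - capacity)
        history.append(backlog)
    return history
```

When average arrivals exceed capacity the backlog grows without bound, the analogue of a belief-updating system that falls ever further behind the evidence; when capacity exceeds arrivals, the backlog stays at zero.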