Withstand the Fall of Time: Temporality, Market Selection, and Human Learning under Generative Models

Abstract

Much AI risk discourse centers on loss of control under AGI. We ask a nearer-term question: before those thresholds, does modern machine learning already create a structural risk for how societies produce knowledge and culture? We use temporality in an operational sense: the way understanding changes over time and the signals left by that process. Representation learning and large-scale autoregressive generation approximate the distribution of task outputs such as text, designs, and decisions while omitting the slow, path-dependent human learning that produced them; at scale, these pipelines function as general-purpose production technologies. We formalize the link from technical indistinguishability to market selection with a GAN-inspired economic model and a screening threshold: when the divergence between model-generated signals and temporal signals is small and verification remains costly, decision makers cease screening, prices track pooled quality, and temporality-intensive work exits the market. We call this value collapse. As training data increasingly mirror such environments, models absorb their own outputs and the risk of model collapse rises through loss of diversity and feedback. Alignment is, in our model, largely orthogonal: by improving usability and narrowing observable gaps, it can reduce incentives to screen and thereby intensify these selection pressures, especially where provenance remains costly to verify. This integration of machine learning and economic reasoning clarifies the conditions under which signals and rewards are reshaped.
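
To make the screening threshold concrete, here is a minimal sketch in notation of our own choosing (the symbols below are illustrative and not taken from the article itself): let $v_H$ and $v_L$ denote the value of temporality-intensive and model-generated output, $\lambda$ the share of temporality-intensive work in the pool, $\delta$ the divergence between the two signal distributions, and $c$ the cost of verifying provenance. A decision maker screens only when the expected gain from distinguishing the two exceeds that cost,

$$\delta\,(v_H - v_L) > c \quad\Longleftrightarrow\quad \delta > \frac{c}{v_H - v_L},$$

so once $\delta$ falls below $c/(v_H - v_L)$, screening stops and prices pool at $\bar v = \lambda v_H + (1-\lambda)\,v_L < v_H$. Under these assumptions, producers of temporality-intensive work cannot recover their higher production cost at the pooled price and exit, which is the value-collapse mechanism the abstract describes.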