You’ve been told that “sessions” and “users” are the foundations of digital truth. They’re not. They’re shadows on the wall—convenient abstractions that collapse messy human behavior into tidy rows. If you build strategy on those rows, you’re not managing reality; you’re managing a mirage. Time to stop worshiping counters and start understanding context.
Your Metrics Lie: Sessions Aren’t Real People
A session is a stopwatch and a guess. It begins when a system thinks you started engaging and ends when it thinks you stopped—often after an arbitrary timeout, a tab close, or a tracking hiccup. A person can be mid-conversation with your brand while “not in session,” and a bot can trigger dozens of “sessions” while no human is present. The metric is a convenience, not a witness.
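To see how arbitrary the stopwatch is, here is a minimal sessionization sketch: it groups timestamped events into "sessions" split at a 30-minute inactivity gap, the conventional default in many analytics tools. The event times and the timeout are illustrative assumptions, not any vendor's actual implementation.

```python
from datetime import datetime, timedelta

def sessionize(timestamps, timeout=timedelta(minutes=30)):
    """Group sorted event timestamps into 'sessions', splitting at inactivity gaps."""
    sessions, current = [], []
    for ts in sorted(timestamps):
        # A gap longer than the timeout starts a brand-new "session".
        if current and ts - current[-1] > timeout:
            sessions.append(current)
            current = []
        current.append(ts)
    if current:
        sessions.append(current)
    return sessions

# One person reading reviews, stepping away for coffee, and returning:
events = [datetime(2024, 5, 1, 9, 0),
          datetime(2024, 5, 1, 9, 10),
          datetime(2024, 5, 1, 9, 45),   # 35-minute gap crosses the cutoff
          datetime(2024, 5, 1, 9, 50)]
print(len(sessionize(events)))  # counted as 2 sessions, not 1 visit
```

Shift the timeout by five minutes and the "session count" changes while the human behavior does not.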
Modern browsing shreds the session illusion. People open five tabs, switch devices, prefetch pages, and return hours later via a push notification. Privacy features throttle cookies, reset identifiers, and block tracking pixels on a whim. What looks like three separate “sessions” could be one continuous train of thought, and what looks like one seamless journey might be stitched from unrelated visits.
Even when sessions are captured, they’re framed by platform biases. Single-page apps often register “engagement” without navigation; native apps might run background events that keep timers alive; server-side tagging masks gaps while client-side scripts fail under content blockers. You’re not measuring engagement itself—you’re measuring the survival of a tracking setup under variable weather.
Users Aren’t Users: Identity Is Messy by Design
A “user” in analytics is an identifier with activity attached, not a person with a pulse. Device IDs, cookies, hashed emails, and login states each imply identity, but none are identity. One person becomes six “users” across devices and browsers; one household becomes one “user” under a shared login; one bot becomes a “user” that never sleeps. The map keeps pretending to be the territory.
Privacy regimes sharpen the fragmentation. ITP (Intelligent Tracking Prevention), ATT (App Tracking Transparency), MPP (Mail Privacy Protection), consent prompts, and walled-garden policies ensure that identifiers decay, drift, or never materialize. Your graph of “users” is a rotating collage: some nodes vanish, some converge incorrectly, some duplicate indefinitely. Accuracy isn’t just low; it’s unknowable without ground truth you rarely possess.
Identity resolution—deterministic or probabilistic—offers stitches, not skin. Merge logic helps tie sessions together under a common ID, but it also creates false positives: coworkers sharing a laptop, family members sharing a tablet, test accounts mirroring real users. The paradox is simple: the more you try to force reality into a singular “user,” the more you risk hallucinating one.
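The over-merge failure mode is easy to reproduce. Below is a toy deterministic stitcher using union-find: any shared identifier merges two "users" into one. The identifier names are hypothetical, and this is a sketch of the merge logic in general, not any vendor's resolution product.

```python
class IdentityGraph:
    """Deterministic identity stitching: any shared identifier merges 'users'."""
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:          # walk to the cluster root,
            self.parent[x] = self.parent[self.parent[x]]  # halving the path
            x = self.parent[x]
        return x

    def link(self, a, b):
        self.parent[self.find(a)] = self.find(b)

g = IdentityGraph()
g.link("cookie:alice", "email:alice@example.com")
g.link("cookie:bob", "email:bob@example.com")
# A shared household laptop touches both accounts...
g.link("device:shared-laptop", "email:alice@example.com")
g.link("device:shared-laptop", "email:bob@example.com")
# ...and two real people collapse into one "user".
print(g.find("cookie:alice") == g.find("cookie:bob"))  # True
```

One shared device is enough to hallucinate a single person out of two, which is exactly the false-positive risk the paragraph above describes.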
Attribution Myths: One Session, Many Realities
Attribution models promise fairness but deliver stories. Last-touch flatters closers; first-touch flatters discoverers; linear splits the baby; time-decay romanticizes recency. Each is a narrative device dressed as math. You don’t find truth in a model—you choose a bias that matches your incentives.
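The "bias dressed as math" claim is concrete: the same conversion path yields wholly different credit depending on which rule you pick. A minimal sketch, with hypothetical channels and a 7-day half-life chosen arbitrarily for the time-decay case:

```python
def attribute(touches, model, half_life_days=7.0):
    """Split one conversion's credit across (channel, days_before_conversion) touches."""
    n = len(touches)
    if model == "last":
        weights = [0.0] * (n - 1) + [1.0]        # all credit to the closer
    elif model == "first":
        weights = [1.0] + [0.0] * (n - 1)        # all credit to the discoverer
    elif model == "linear":
        weights = [1.0 / n] * n                  # split the baby
    elif model == "time_decay":
        raw = [0.5 ** (age / half_life_days) for _, age in touches]
        weights = [w / sum(raw) for w in raw]    # recency wins
    return {ch: round(w, 3) for (ch, _), w in zip(touches, weights)}

path = [("paid_search", 14), ("email", 3), ("direct", 0)]
for model in ("last", "first", "linear", "time_decay"):
    print(model, attribute(path, model))
```

Same data, four verdicts. Nothing in the math tells you which story is true; you chose the story when you chose the model.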
A single session often contains multiple influences, many of which your tags never see. A friend’s recommendation, a TikTok scroll, a review site skimmed on a commute, an email viewed on lockscreen, a brand memory from last year—good luck fitting that into a UTM. Ad platforms claim credit inside their own walls, your analytics claims credit inside yours, and reconciliation turns into mythology.
Cross-channel paths are also context-dependent. The same ad that “converts” in one session might only work because of months of organic exposure; the same email that seems weak might be the nudge that rescues a cart abandoned due to payment friction. Attribution frameworks are less about assigning trophies and more about testing levers—what happens when you pull this and release that.
Measure Behavior, Not Buckets: Rethink Analytics
Stop treating sessions and users as units of reality. Start modeling behaviors and states: viewed a product, compared options, signaled intent, encountered friction, returned with confidence. Build canonical events with clear semantics across platforms, and track transitions—what precipitates a next step, what stalls progress, what reverses churn. Behavior is portable; buckets are brittle.
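In code, the shift from buckets to behavior can be as simple as counting transitions between canonical states instead of tallying sessions. The state names below are illustrative assumptions, not a prescribed taxonomy:

```python
from collections import Counter

def transition_counts(event_stream):
    """Count transitions between canonical behavioral states,
    ignoring which 'session' or 'user' bucket they fell into."""
    return Counter(zip(event_stream, event_stream[1:]))

journey = ["viewed_product", "compared_options", "signaled_intent",
           "hit_friction", "signaled_intent", "purchased"]
counts = transition_counts(journey)
# How often does friction get recovered from, rather than ending the journey?
print(counts[("hit_friction", "signaled_intent")])
```

The question "what follows friction?" survives cookie resets and device switches far better than "how many sessions did user 1234 have?"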
Adopt layered measurement. Pair event analytics with qualitative signals, post-purchase surveys, incrementality experiments, and controlled holdouts. Use server-side logs to counter client fragility; segment by context (device class, consent state, geography, acquisition type) rather than chasing perfect identity. When attribution must be used, treat it as a decision aid, not a verdict.
Redefine success around movement and value creation. Optimize for fewer dead ends, shorter time-to-first-value, healthier repeat loops, and stronger LTV, not just bigger session counts or “user” totals. The teams that win aren’t the ones with the prettiest dashboards of fictional people—they’re the ones who build systems that reliably change real human behavior.
The truth isn’t that sessions and users are useless; it’s that they’re props, not protagonists. Keep them on stage if you must—but let behavior, context, and experiments direct the play. When you stop mistaking your metrics for reality, your strategy stops chasing ghosts and starts changing outcomes.