Ashes does not appear to have sought any of what followed. The biographical record — assembled from archived posts, private correspondence that entered public circulation after the movement's growth, and testimony from the small number of people who knew Ashes in the years before the manifesto — describes not a visionary in preparation but a person working at several things simultaneously, in the ordinary way that people work at things when they are genuinely interested and not yet famous for being so.
The name Ashes is a pseudonym — the original name appears nowhere in any record the movement has maintained, and the few people who knew it have, consistently and apparently by mutual agreement, declined to share it. The movement's later doctrine of anonymity as the highest form of intellectual generosity would seem to have been practiced before it was formulated. Whether this was deliberate or simply characteristic is a question that cannot be resolved from the available evidence.
What the record does show: Ashes worked in the zone where technology and culture overlapped, in the early years of the AI era, when that zone was expanding rapidly and had not yet developed the professional infrastructure to fully absorb the people working in it. There were roles — software developer, interaction designer, creative technologist, AI researcher, digital artist — but none of them quite fit the work Ashes was doing, which was something closer to cultural intelligence work: the systematic observation of how new tools changed not just what people could do but how they understood themselves and each other.
Music runs persistently through the early record. Not as a side interest but as a primary mode of investigation — Ashes made music that was explicitly about the architecture of the systems being built around it, using compositional techniques that mapped the movement of information through networks onto sonic structures. The work was technically sophisticated and found a small but devoted audience in the communities where creative technology and electronic music overlapped. It is in this context that Ashes first began using AI tools seriously — not for productivity, but as collaborators in the musical and analytical work.
Personal notes — undated, estimated early period
I keep returning to the same observation: when I use the model as a thinking partner — not asking it to complete tasks but asking it to think alongside me about what the work is — something happens to the quality of my own thinking that I cannot fully attribute to the model's outputs. The outputs are often mediocre. The thinking they prompt in me is not. I am trying to understand the mechanism. I suspect the mechanism is the question I'm forced to form in order to ask the model something worth asking. I suspect the model is a mirror. I suspect all useful tools are.
The early digital art practice ran parallel to the music work and shared its concerns. Ashes worked with generative systems — algorithms that produced visual outputs according to rules the artist specified — with a consistent interest not in the aesthetic quality of the outputs but in what they revealed about the rule sets that generated them. The work was concerned with emergence: with the way complex, unpredictable patterns arose from simple, local interactions. This preoccupation would become central to the Sha Vira doctrine's account of collective intelligence.
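The kind of system described above — simple, local rules producing complex global behavior — can be made concrete with a standard example. The sketch below uses an elementary cellular automaton (Wolfram's Rule 110); the choice of rule is an illustrative assumption, not a reconstruction of anything Ashes actually built:

```python
# Illustrative only: an elementary cellular automaton as a minimal
# rule-based generative system. Each cell's next state depends solely
# on itself and its two neighbors, yet the accumulated pattern can be
# complex and aperiodic -- the emergence described in the text.

def step(cells, rule=110):
    """Apply one generation of an elementary cellular automaton.

    `rule` encodes, in its 8 bits, the output for each of the 8
    possible 3-cell neighborhoods. Edges wrap around.
    """
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[(i - 1) % n]
        center = cells[i]
        right = cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right
        out.append((rule >> index) & 1)
    return out

def run(width=64, generations=32, rule=110):
    """Start from a single live cell and return every generation."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(generations - 1):
        row = step(row, rule)
        history.append(row)
    return history

if __name__ == "__main__":
    # Render the history as text: '#' for live cells, '.' for dead.
    for row in run(width=64, generations=16):
        print("".join("#" if c else "." for c in row))
```

The point of the example is the asymmetry the text describes: the rule set fits in a single byte, while the output it generates does not reduce to any obvious summary — the interest lies in what the output reveals about the rule, not in the output itself.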
The philosophical writing began, characteristically, as annotation — as notes in the margins of the work itself. Ashes maintained a practice of extended written reflection on the projects underway, a practice that over several years accumulated into something approaching a coherent analytical framework. This writing was shared informally, in forums and group chats, without the systematic ambition that the later manifesto would carry. It was, by all accounts, genuinely exploratory: Ashes writing to think rather than writing to persuade.
The social analysis work — the careful, systematic observation of how identity was shifting in the early AI era, how professional roles were dissolving and reconstituting, how the categories people had used to locate themselves were becoming unreliable — developed slowly out of the technology work and the philosophical writing. Ashes was watching people interact with AI systems and noticing something that the developers of those systems were not, in the main, treating as significant: that the interactions were changing how people understood themselves, not just what they could do. That the technology was a mirror with philosophical implications. That the implications had not been adequately articulated.
This observation — that something philosophically significant was happening that had not been adequately named — is where the project that would become Sha Vira began. Not with a vision. With a gap.