Paper · March 9, 2026

Adaptive Computing Primitives: A Position Paper on Bio-Inspired Structures, Policies, and Coordination

B. D. Phillips

Tags: position-paper, bio-inspired, taxonomy, research-program, organic, metabolic, ecologic
DOI: 10.5281/zenodo.19056237

Abstract

Research on data structures and systems provides strong tools for analyzing static representations and operation costs, but it offers less consistent vocabulary for primitives that adapt to workload, maintain internal lifecycle state, and trade steady-state benefit against recurring maintenance overhead. Bio-inspired computing has explored related questions across several traditions, yet primitive-level adaptive structures still lack a shared framework for scoping claims, comparing tradeoffs, and evaluating biological analogs as engineering mechanisms rather than decorative metaphors.

This paper argues for a framework for adaptive computing primitives within the Mutuus research program. It distinguishes three classes: adaptive data structures, operational resource-management policies, and inter-system coordination patterns, termed Organics, Metabolics, and Ecologics. It introduces an organic/inorganic taxonomy, a four-dimensional complexity model that extends classical efficiency analysis with adaptiveness, resilience, and thermal cost, and an evaluation methodology for determining when biologically inspired mechanisms justify their complexity. Three implemented exemplars (Nacre Array, Diatom Bitmap, and Mycelial Cache) illustrate the framework. The contribution is conceptual and methodological: this paper defines the vocabulary, scope, and evaluation criteria, while companion papers provide the formal specifications, empirical measurements, and implementation details for individual primitives.


1. Introduction

Classical computer science gives data-structure design a precise language for time and space complexity. That language remains indispensable. It supports direct comparison among arrays, trees, hash tables, caches, and indexes, and it maps well to concrete implementation concerns such as contiguity, pointer chasing, and asymptotic cost.

However, a growing class of computing primitives does not fit cleanly into a purely static frame. Some structures maintain explicit lifecycle state, restructure in response to workload, pay recurring maintenance cost, or degrade toward simpler behavior when stressed. These properties often appear in papers and implementations, but they are usually treated as ad hoc details rather than as first-class dimensions of analysis.

The gap is especially visible in bio-inspired computing. Biology has motivated important work in neural models [14], evolutionary search [13], and swarm coordination [7, 15], yet the literature spans several levels of abstraction and does not provide a shared framework for primitive-level adaptive structures. As a result, biologically motivated mechanisms are often presented either as isolated techniques or as metaphors without a clear standard for evaluation.

This paper takes the position that adaptive primitives require a more explicit framework. The goal is not to replace classical complexity analysis, nor to argue that every structure should adapt. The goal is to define when lifecycle management is part of the design, how its cost should be measured, and how biological analogy can be used as a disciplined source of engineering hypotheses.

The framework addresses that problem across three levels of abstraction: adaptive data structures, operational resource-management policies, and inter-system coordination patterns. This paper defines the vocabulary and evaluation framework for those levels. The companion primitive papers provide the artifact-level evidence for individual structures.

This paper makes five contributions:

  1. It defines an organic/inorganic taxonomy for distinguishing static primitives from lifecycle-managed adaptive primitives.
  2. It introduces a three-class framework spanning adaptive data structures, operational resource-management policies, and inter-system coordination patterns, termed Organics, Metabolics, and Ecologics in the Mutuus program.
  3. It proposes a four-dimensional complexity model that extends efficiency analysis with adaptiveness, resilience, and thermal cost.
  4. It specifies an evaluation methodology for adaptive primitives and biologically inspired policies, including explicit gates for formal specification, implementation, benchmarking, and dissemination.
  5. It situates three implemented exemplars, Nacre Array, Diatom Bitmap, and Mycelial Cache, as concrete instances of the framework while leaving their formal proofs and measurements to companion papers.

2. Biology as a Reasoning Source

Biology has long informed computing. Neural networks draw from neuroscience [14], genetic algorithms from evolutionary dynamics [13], and ant colony optimization from collective behavior [7, 15]. These traditions demonstrate that biological inspiration can be computationally productive. They also show a recurring problem: the biological reference can become ornamental if it is not tied to a concrete mechanism, resource model, or evaluation criterion.

This paper adopts a narrower and more demanding use of biological analogy. A biological analog is useful when it helps identify a structural mechanism, lifecycle pattern, or resource tradeoff that can be specified, implemented, and tested. It is not useful when it merely renames an established technique or substitutes imagery for argument.

This position treats biology as a constrained source of hypotheses about adaptation, maintenance, resilience, and resource allocation. Evolution has explored these problems across many lineages and timescales [8, 9]. That does not make biological solutions automatically correct for computing, but it does make them worth examining when they converge on recurring organizational patterns.

The Mycelial Cache illustrates the intended use. The mycelial-network analog did not function as a naming exercise. Studying fungal network topology [6, 16] suggested that bridge-like nodes connecting otherwise separate clusters should be harder to remove than locally replaceable nodes. That observation led to the bridge-node eviction-resistance mechanism in the cache: entries whose removal would fragment the co-access mesh receive a score bonus proportional to their structural importance. Similar structural effects appear in vascular and neural systems [4, 19], which strengthens the claim that the analog reflects a reusable organizational pattern rather than an isolated metaphor. Section 9 returns to this and two other exemplars.
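The bridge-resistance idea described above can be sketched in a few lines. This is an illustrative approximation, not the Mycelial Cache implementation: the `Mesh` alias, the pairwise-neighbor bridging test, and the 0.5 bonus weight are all assumptions made for the sketch.

```rust
use std::collections::{HashMap, HashSet};

type Mesh = HashMap<u32, HashSet<u32>>;

fn link(mesh: &mut Mesh, a: u32, b: u32) {
    mesh.entry(a).or_default().insert(b);
    mesh.entry(b).or_default().insert(a);
}

// Fraction of a node's neighbor pairs that have no direct edge between them:
// 1.0 means the node is a pure bridge (removing it disconnects its neighbors),
// 0.0 means the node is locally redundant and safe to evict.
fn bridge_score(mesh: &Mesh, node: u32) -> f64 {
    let nbrs: Vec<u32> = match mesh.get(&node) {
        Some(s) if s.len() >= 2 => s.iter().copied().collect(),
        _ => return 0.0,
    };
    let (mut unlinked, mut pairs) = (0u32, 0u32);
    for i in 0..nbrs.len() {
        for j in (i + 1)..nbrs.len() {
            pairs += 1;
            if !mesh.get(&nbrs[i]).map_or(false, |s| s.contains(&nbrs[j])) {
                unlinked += 1;
            }
        }
    }
    f64::from(unlinked) / f64::from(pairs)
}

// Eviction score: lower scores are evicted first, so the bridge bonus raises
// the score of structurally important entries (the weight is a placeholder).
fn eviction_score(base: f64, mesh: &Mesh, node: u32) -> f64 {
    base + 0.5 * bridge_score(mesh, node)
}

fn main() {
    let mut mesh = Mesh::new();
    // Two clusters {1, 2} and {3, 4} joined only through node 0.
    for &(a, b) in &[(0, 1), (0, 2), (1, 2), (0, 3), (0, 4), (3, 4)] {
        link(&mut mesh, a, b);
    }
    // The bridge node resists eviction more than a cluster-internal node.
    assert!(eviction_score(1.0, &mesh, 0) > eviction_score(1.0, &mesh, 1));
}
```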


3. Related Work

Several research traditions address adaptive or self-organizing computation. This paper positions the Mutuus framework relative to each.

Organic Computing. The German Research Foundation funded a major Organic Computing initiative (2005-2011) investigating self-organizing technical systems [21]. That program focused primarily on multi-agent coordination and emergent system behavior at the infrastructure level. Mutuus differs in scope and mechanism: it targets individual data structures and their internal lifecycle, not system-level coordination. The coordination-pattern class overlaps with Organic Computing's concerns, but the adaptive-data-structure and operational-policy classes address a lower level of abstraction: the primitive itself as an adaptive entity. Mutuus also imposes a stricter evaluation methodology with explicit binary gates, whereas the Organic Computing initiative favored exploratory research.

Self-adjusting data structures. Sleator and Tarjan's splay trees [22] and move-to-front lists demonstrate that data structures can adapt to access patterns without external tuning. These are important precursors. However, self-adjusting structures adapt through a single mechanism (rotation or reordering on access) with a single objective (amortized access cost). The adaptive primitives studied here maintain richer internal state (thermal FSMs, co-access meshes, density registries) that evolves through a dedicated maintenance cycle (tick), enabling multiple adaptive behaviors simultaneously. The key distinction is lifecycle: splay trees adapt per operation; the Mutuus primitives adapt per tick across their internal state.

Cache-oblivious data structures. Frigo et al. [23] and Bender et al. [24] showed that data structures can achieve optimal cache performance without knowing the memory hierarchy. This is a form of passive adaptation: the structure's layout inherently works well across cache levels. The adaptive primitives studied here pursue active adaptation by observing workload behavior and restructuring accordingly. Cache-oblivious structures solve the "unknown hardware" problem; the Mutuus framework is aimed at the "unknown workload" problem. The approaches are complementary, not competing.

Learned indexes. Kraska et al. [25] proposed replacing B-trees and hash maps with learned models that predict data location from data distribution. This shares Diatom Bitmap's insight that structure should adapt to data, but the mechanism differs: learned indexes train ML models offline, while Diatom Bitmap uses online histogram valley analysis with tick-driven boundary drift. Learned indexes require retraining when data distribution shifts; Diatom boundaries drift continuously. The tradeoff is precision (learned indexes achieve tighter bounds) versus responsiveness (organics adapt without retraining).

Autonomic computing. Kephart and Chess [26] established the vocabulary of self-configuring, self-healing, self-optimizing, and self-protecting systems. Mutuus inherits this vocabulary but applies it at a different granularity. Autonomic computing targets system-level properties (e.g., a self-healing web service). Mutuus targets primitive-level properties (e.g., a self-healing data structure). The Metabolic Cost Principle (Section 8) provides an evaluation framework that autonomic computing lacks: a methodology for measuring whether the cost of self-management is justified by the benefit.


4. Organic and Inorganic Primitives

We distinguish between two categories of computing primitives:

Inorganic primitives have fixed internal organization, fixed tradeoffs across workloads, and no explicit lifecycle. A Vec at allocation time and a Vec after ten billion operations are governed by the same representation and the same classical complexity model.

Organic primitives maintain lifecycle state that evolves through workload exposure and maintenance. They may adapt to access patterns, develop compensatory structures, or transition through thermal states such as Hot → Warm → Cold → Compressed. Their behavior therefore requires dimensions of analysis beyond static efficiency.
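The thermal chain mentioned above can be made concrete as a small finite-state machine. The state names follow the paper; the field names, the `touch`/`tick` interface, and the 3-tick cooldown default are assumptions for this sketch.

```rust
// Hypothetical thermal FSM sketch: any access re-heats the element, and
// tick-driven maintenance steps it one state colder after enough idle ticks.
#[derive(Clone, Copy, Debug, PartialEq)]
enum Thermal { Hot, Warm, Cold, Compressed }

struct ThermalFsm {
    state: Thermal,
    idle_ticks: u32,     // consecutive maintenance ticks with no access
    cooldown_ticks: u32, // idle ticks required before stepping down
}

impl ThermalFsm {
    fn new() -> Self {
        Self { state: Thermal::Hot, idle_ticks: 0, cooldown_ticks: 3 }
    }

    // An access resets the element to Hot immediately.
    fn touch(&mut self) {
        self.state = Thermal::Hot;
        self.idle_ticks = 0;
    }

    // One maintenance cycle: cool down if the element has been idle enough.
    fn tick(&mut self) {
        self.idle_ticks += 1;
        if self.idle_ticks >= self.cooldown_ticks {
            self.idle_ticks = 0;
            self.state = match self.state {
                Thermal::Hot => Thermal::Warm,
                Thermal::Warm => Thermal::Cold,
                Thermal::Cold | Thermal::Compressed => Thermal::Compressed,
            };
        }
    }
}

fn main() {
    let mut fsm = ThermalFsm::new();
    for _ in 0..3 { fsm.tick(); }
    assert_eq!(fsm.state, Thermal::Warm);
    fsm.touch(); // a single access undoes the cooling
    assert_eq!(fsm.state, Thermal::Hot);
}
```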

This taxonomy is descriptive rather than prescriptive. Inorganic primitives remain appropriate, and often optimal, for many workloads. Small, short-lived arrays do not benefit from thermal management. A hash map used for constant-time lookup in a tight loop may not justify the overhead of co-access tracking. The value of the taxonomy is that it makes the design choice explicit.

4.1 When Organics Win

Organic primitives outperform their inorganic counterparts in scenarios characterized by:

  • Long-lived data. Data that persists across many access cycles benefits from adaptive structures that emerge through tick-driven maintenance.
  • Non-uniform access patterns. Workloads with hot spots, temporal locality, or co-access structure can be exploited by organic adaptation mechanisms.
  • Workload shifts. Systems where access patterns change over time benefit from structures that detect and respond to shifts, rather than structures that perform identically regardless of pattern.
  • Structural relationships. Data with inherent relationships (spatial clustering, temporal adjacency, co-access patterns) can be exploited by organic structures that learn and encode these relationships.
  • Graceful degradation requirements. Systems that must continue functioning under adversarial inputs or beyond design capacity benefit from organic resilience mechanisms.

4.2 When Inorganics Win

Inorganic primitives remain preferable when:

  • Data is short-lived. If the structure will be discarded before organic adaptation can develop, the metabolic cost is pure overhead.
  • Access patterns are uniform. If every element is equally likely to be accessed, adaptive structures have nothing to learn.
  • Predictability trumps adaptiveness. In real-time systems with hard latency bounds, the worst-case overhead of organic maintenance may be unacceptable.
  • Simplicity is the primary goal. Organic primitives have more parameters, more internal state, and more complex behavior. When simplicity matters most, inorganics are the right choice.

5. Three Classes of Adaptive Primitives

The framework spans three classes of adaptive primitives: adaptive data structures, operational resource-management policies, and inter-system coordination patterns. In the Mutuus program, these classes are termed Organics, Metabolics, and Ecologics, respectively. They address different levels of system organization, but they are intended to compose.

Three-tier Mutuus framework. Organics provide data structures (foundation); Metabolics govern operational strategies (resource management); Ecologics coordinate between systems (ecosystem). Higher tiers depend on lower tiers but each is independently useful.

5.1 Organics — Data Structures

Organics are the foundation. Each organic is a data structure that offers an alternative to a classical (inorganic) counterpart, adding lifecycle management, adaptive behavior, and self-tuning capabilities.

| Organic | Alternative To | Nature Analog | Key Innovation | Status |
| --- | --- | --- | --- | --- |
| Nacre Array | Vec / Dynamic Array | Mother-of-pearl (nacre) | Segmented storage with fracture planes enabling O(1) splitting | Phase 5 |
| Diatom Bitmap | Roaring Bitmap | Diatom frustule (silica shell) | Density-derived domain boundaries instead of fixed 2^16 partitions | Phase 5 |
| Mycelial Cache | LRU / LFU / ARC Cache | Mycelial networks (fungi) | Hebbian co-access mesh enabling topology-aware eviction | Phase 5 |
| Waggle Convergence | Weighted averages / Static ensembles | Honeybee waggle dance [5, 17] | Multi-source signal aggregation with adaptive trust and phase transitions | Phase 2 |

Each organic shares common traits defined by a core interface: OrganicElement (identity and lifecycle), Maintainable (tick-driven maintenance), Adaptive (workload-responsive tuning), Compressible (thermal-state-driven compression), and Fracturable (structural splitting). Not every organic implements every trait, but the shared vocabulary enables reasoning about organic behavior across different data structures.
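The trait vocabulary above might be rendered in Rust roughly as follows. The method names and signatures are illustrative assumptions, not the published Mutuus API; the `Counter` type is a toy organic invented for the sketch.

```rust
// Hypothetical rendering of the shared organic trait vocabulary.
trait Maintainable {
    fn tick(&mut self); // one tick-driven maintenance cycle
}

trait Adaptive {
    fn record_access(&mut self, key: u64); // feed workload observations in
}

trait Compressible {
    fn compress(&mut self) -> bool; // attempt thermal-state-driven compression
}

trait Fracturable: Sized {
    fn fracture(self, at: usize) -> (Self, Self); // structural splitting
}

// A toy organic implementing two of the traits; real organics pick the
// subset of traits that matches their behavior.
struct Counter { hits: u64, ticks: u64 }

impl Maintainable for Counter {
    fn tick(&mut self) { self.ticks += 1; }
}

impl Adaptive for Counter {
    fn record_access(&mut self, _key: u64) { self.hits += 1; }
}

fn main() {
    let mut c = Counter { hits: 0, ticks: 0 };
    c.record_access(42);
    c.tick();
    assert_eq!((c.hits, c.ticks), (1, 1));
}
```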

5.2 Metabolics — Operational Strategies

Metabolics formalize the operational strategies that production systems already implement in ad hoc fashion. A metabolic does not compete with a data structure; it governs how computational resources are allocated, conserved, and recovered.

Every production system has metabolic behavior, whether it recognizes it or not:

  • Kafka's log retention is an informal Dormancy strategy: old segments are "put to sleep" and eventually discarded.
  • Kubernetes horizontal pod autoscaling is an informal Thermoregulation strategy: scale up when hot, scale down when cold.
  • Circuit breakers are informal Fever Response strategies: temporarily change behavior when the system detects anomalous load.

Mutuus metabolics extract the biological pattern behind these ad hoc implementations, formalize it with measurable parameters (trigger condition, depth spectrum, wake latency, maintenance cost, recovery debt), and provide a reusable policy that applies across systems, not just the one system where it was first hand-coded.
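A dormancy policy carrying those measurable parameters could look roughly like this. The field names, the graduated depth tiers, and the multipliers are assumptions for the sketch, not a specified Mutuus metabolic.

```rust
// Hypothetical dormancy policy: the five parameters named in the text become
// explicit fields, and the depth spectrum is graduated rather than binary.
#[derive(Debug, PartialEq)]
enum Depth { Awake, Drowsy, Torpor, Hibernation }

#[allow(dead_code)]
struct DormancyPolicy {
    trigger_idle_secs: u64, // trigger condition: idle time before sleeping
    wake_latency_ms: u64,   // cost of coming back to full service
    maintenance_cost: f64,  // fraction of capacity spent while dormant
    recovery_debt: f64,     // deferred work accrued per dormant second
}

impl DormancyPolicy {
    // Deeper sleep after longer idleness; the 4x/16x steps are placeholders.
    fn depth_for(&self, idle_secs: u64) -> Depth {
        if idle_secs < self.trigger_idle_secs {
            Depth::Awake
        } else if idle_secs < self.trigger_idle_secs * 4 {
            Depth::Drowsy
        } else if idle_secs < self.trigger_idle_secs * 16 {
            Depth::Torpor
        } else {
            Depth::Hibernation
        }
    }
}

fn main() {
    let p = DormancyPolicy {
        trigger_idle_secs: 60,
        wake_latency_ms: 50,
        maintenance_cost: 0.01,
        recovery_debt: 0.001,
    };
    assert_eq!(p.depth_for(10), Depth::Awake);
    assert_eq!(p.depth_for(120), Depth::Drowsy);
    assert_eq!(p.depth_for(2000), Depth::Hibernation);
}
```

Note that the fail-safe direction discussed in Section 7.2 falls out of the shape of `depth_for`: if idle time is never observed, the policy stays at `Awake` rather than getting stuck asleep.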

5.3 Ecologics — Coordination Patterns

Ecologics govern relationships between systems: how they compete for resources, cooperate for mutual benefit, and co-evolve over time.

This tier is cataloged but evaluation is deferred. Inter-system coordination patterns require multiple Mutuus-powered systems running in production before they can be observed, measured, and validated empirically. Premature formalization would produce theoretical models without grounding.

Current catalog: Symbiosis (shared resource pools), Predator-Prey Dynamics (competitive resource allocation), Succession (temporal system evolution), Trophic Cascades (failure propagation), Niche Partitioning (workload specialization), and Migration Corridors (data mobility patterns).

5.4 The Unifying Principle

The three classes are analytically distinct but operationally coupled. Data structures interact with operational policies, and both eventually interact with system-level coordination. The framework separates these levels so that claims can be scoped precisely, but it keeps their dependencies visible. The intended result is not three unrelated taxonomies, but one layered account of adaptive computing primitives.


6. Four Dimensions of Complexity

Classical algorithm analysis operates primarily in a single dimension: efficiency. O(n log n) sort, O(1) hash lookup, O(log n) tree search. That dimension remains necessary, but it is insufficient for adaptive primitives whose behavior changes with workload and maintenance.

We propose four dimensions of complexity analysis:

Four dimensions of organic complexity analysis. Classical analysis covers only O(e); organic primitives require all four dimensions for complete characterization.

6.1 O(e) — Efficiency

Classical time and space complexity for every operation. This dimension is unchanged from traditional analysis. Every organic publishes its efficiency bounds alongside its inorganic counterpart, and any claim of superiority must be demonstrated through reproducible benchmarks.

Efficiency is the baseline. An organic that is dramatically slower at its core operations is not viable regardless of its adaptive capabilities.

6.2 O(a) — Adaptiveness

How well does the structure respond to changing workloads over time? This dimension measures the rate and quality of adaptation:

  • Convergence speed: How many access cycles until the organic's internal structures reflect the workload pattern?
  • Adaptation quality: How much does adaptation improve performance on the observed workload?
  • Catastrophic forgetting: When the workload shifts, does adaptation to the new pattern destroy knowledge of the old pattern?

Inorganics have O(a) = 0 by definition; they do not adapt. Organics must demonstrate measurable, beneficial adaptation that justifies their metabolic cost.

6.3 O(r) — Resilience

How does the structure behave when pushed beyond its design parameters? This dimension measures graceful degradation:

  • Adversarial inputs: What happens when the input distribution is deliberately adversarial?
  • Capacity overflow: What happens when the structure exceeds its intended capacity?
  • Maintenance failure: What happens when tick-driven maintenance is delayed or skipped?

The key question is whether the structure degrades smoothly (performance decreases linearly or logarithmically) or catastrophically (a cliff edge beyond which the structure fails completely). Organics should degrade smoothly, falling back to inorganic baseline behavior when their adaptive mechanisms are overwhelmed.

6.4 O(τ) — Thermal Cost

What is the overhead of self-management? Adaptation is not free. Every tick cycle, every thermal state transition, every Hebbian weight update consumes resources.

This dimension makes the metabolic cost visible and measurable:

  • Per-tick overhead: How much time and memory does a single maintenance tick consume?
  • Amortized cost: Over the structure's lifetime, what fraction of total computation is devoted to self-management?
  • Break-even point: At what usage level does the adaptive benefit exceed the metabolic cost?
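The break-even question above is simple arithmetic once tick cost and per-operation benefit are measured. The numbers below are hypothetical, chosen only to show the shape of the calculation.

```rust
// Break-even sketch: an organic pays a fixed cost per maintenance tick and
// saves a small amount on each operation once adapted. It breaks even when
// the operations served per tick interval exceed tick_cost / saving_per_op.
fn min_ops_per_tick(tick_cost_ns: f64, saving_per_op_ns: f64) -> f64 {
    tick_cost_ns / saving_per_op_ns
}

fn main() {
    // Hypothetical figures: a 3,000 ns tick and 5 ns saved per adapted op
    // mean the organic must serve at least 600 ops per tick interval before
    // the adaptive benefit exceeds the metabolic cost.
    assert_eq!(min_ops_per_tick(3000.0, 5.0), 600.0);
}
```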

The Metabolic Cost Principle (Section 8) argues that this dimension is the most important, and the most neglected.


7. Evaluation Methodology

The framework treats biological analogy as a hypothesis source, not as evidence. Every proposed primitive is evaluated through explicit gates covering specification, implementation, measurement, and dissemination. A candidate that fails a gate is revised or abandoned rather than advanced on rhetorical appeal.

7.1 Organic Evaluation: Seven-Phase Process

  1. Nature Analog Identification. Does the biology provide real computational insight, or is it just a naming exercise? Convergent analogs (the same strategy evolved independently in at least two different phyla) indicate that the strategy solves a genuine problem rather than being an evolutionary accident.

  2. Formal Specification. Data model, operations, complexity bounds, and invariants, precise enough to implement from. If the spec is ambiguous, the implementation will be arbitrary.

  3. Complexity Analysis. Theoretical complexity for all operations, compared directly against the inorganic counterpart. The organic must outperform on at least one operation class to justify its existence.

  4. Implementation. Working code in Rust with comprehensive test coverage. The implementation must compile, pass all tests, and match the specification.

  5. Benchmarking. Reproducible benchmarks against the inorganic counterpart across multiple workload types, data sizes, and access patterns. We publish wins, losses, regressions, and dead ends. Research that hides failures is marketing, not science.

  6. WASM + TypeScript Integration. WebAssembly bindings and TypeScript wrappers that make the organic accessible beyond the Rust ecosystem. The WASM overhead must be acceptable for browser-based use cases.

  7. Publication and artifacts. Documentation, papers, benchmark results, and interactive simulations. The work is not complete until others can understand, reproduce, and build on it.

7.2 Metabolic Evaluation: Five-Phase Process

  1. Dormancy/Energy Survey. Catalog biological strategies for the specific resource constraint. Require convergent analogs from at least two different phyla; if evolution arrived at this strategy independently multiple times, it is probably solving a real problem.

  2. Resource Model Extraction. For each analog, extract measurable parameters: trigger condition, depth spectrum (binary or graduated?), wake latency, maintenance cost, and recovery debt.

  3. Hardware/Infrastructure Filter. Can this be implemented with standard OS primitives, container orchestration, or application-level state? Any strategy requiring kernel modifications or custom hardware is eliminated.

  4. Policy Composition. Define the default behavior and tuning surface. Analyze failure modes: when this strategy fails, does the system degrade to "always awake" (safe) or "stuck asleep" (dangerous)? Metabolics must fail safe.

  5. System Validation. Map to real systems that already implement ad hoc versions of this strategy. Measure what the formalized metabolic simplifies and what resource savings it produces, compared directly against the ad hoc implementation.

7.3 Benchmark Philosophy

All benchmarks are conducted with Criterion.rs for statistical rigor, published with full methodology, and stored in version control for reproducibility. We maintain two levels of provenance:

  • Granular: Every Criterion report committed to git history. Every optimization attempt documented, including the ones that made things worse.
  • Narrative: Key milestones, decisions, and analysis documents that capture the story behind the numbers, for content creation and for other researchers evaluating our work.

We benchmark at multiple scales (1K, 10K, 100K elements), across multiple workload types (uniform, Zipfian, sequential, clustered, adversarial), and we report results honestly. When the organic loses, we say so. When the organic wins but with caveats, we explain the caveats.


8. The Metabolic Cost Principle

This section states the paper's central methodological claim:

Every organic has a metabolic cost. The right response is not to eliminate the metabolism but to let the organism grow compensatory structures through its lifecycle.

Conventional benchmarks often measure performance immediately after construction: allocate the structure, populate it, measure operations. This methodology is usually sufficient for inorganic structures because their behavior does not materially change with use. A Vec does not improve with age.

For organics, however, cold-start measurements can miss the cost-benefit relationship that emerges only after workload exposure and maintenance cycles. An adaptive primitive may incur lifecycle overhead before it has had an opportunity to realize compensatory structures or workload-sensitive behavior.

The Metabolic Cost Principle demands a different methodology:

  1. Construct the organic and its inorganic counterpart with identical data.
  2. Exercise both structures through representative workload patterns across a significant number of tick cycles (minimum 100).
  3. Then benchmark. Measure operations after the organic has had time to develop its adaptive structures.
  4. Report both. The t=0 benchmark (cold start, organic disadvantage expected) and the t=steady-state benchmark (adapted, where organic advantages should emerge).

This is lifecycle economics. The question is not simply "is the organic faster?" in an isolated cold-start microbenchmark, but "are the primitive's lifecycle economics favorable for the workload being studied?"
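The four steps above can be sketched as a minimal harness. `Organic` here is a stand-in trait invented for the sketch, and a real harness would use Criterion.rs rather than raw `Instant` timing; only the two-phase shape is the point.

```rust
use std::time::{Duration, Instant};

// Stand-in for an adaptive primitive: a workload operation plus a tick.
trait Organic {
    fn op(&mut self, key: u64);
    fn tick(&mut self);
}

fn timed<O: Organic>(s: &mut O, keys: &[u64]) -> Duration {
    let start = Instant::now();
    for &k in keys {
        s.op(k);
    }
    start.elapsed()
}

// Benchmark cold, warm up through >= 100 tick cycles of representative
// workload, benchmark again, and report BOTH measurements.
fn lifecycle_bench<O: Organic>(s: &mut O, keys: &[u64]) -> (Duration, Duration) {
    let cold = timed(s, keys); // t = 0: organic disadvantage expected here
    for _ in 0..100 {
        for &k in keys {
            s.op(k);
        }
        s.tick(); // maintenance cycle between workload rounds
    }
    let steady = timed(s, keys); // t = steady state: adapted behavior
    (cold, steady)
}

fn main() {
    // A do-nothing organic, just to exercise the harness shape.
    struct Noop;
    impl Organic for Noop {
        fn op(&mut self, _key: u64) {}
        fn tick(&mut self) {}
    }
    let (cold, steady) = lifecycle_bench(&mut Noop, &[1, 2, 3]);
    println!("cold: {:?}, steady: {:?}", cold, steady);
}
```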

8.1 Implications

The Metabolic Cost Principle has several consequences for how we evaluate and present organic primitives:

Metamorphosis. Below a cardinality threshold, an organic should behave as its inorganic equivalent. A Diatom Bitmap with 10 values should be a flat sorted array with no boundary registry, no thermal tracking, no metadata overhead. The organism metamorphoses into its full structure only when data volume justifies the metabolic investment. This eliminates the small-scale benchmark gap entirely.
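Metamorphosis can be expressed as an enum whose small form carries no adaptive machinery at all. The threshold value, variant names, and the chunk-based split are assumptions for this sketch, not the Diatom Bitmap design.

```rust
// Hypothetical metamorphosis sketch: below the threshold the structure IS a
// flat sorted vector; past it, the flat array splits into containers.
const METAMORPHOSIS_THRESHOLD: usize = 64;

enum Form {
    Larval(Vec<u32>),                     // flat sorted array, zero overhead
    Mature { containers: Vec<Vec<u32>> }, // full structure grows from here
}

impl Form {
    fn insert(&mut self, v: u32) {
        match self {
            Form::Larval(xs) => {
                let pos = xs.binary_search(&v).unwrap_or_else(|p| p);
                xs.insert(pos, v);
                if xs.len() > METAMORPHOSIS_THRESHOLD {
                    // Metamorphose: split the flat array into containers.
                    let taken = std::mem::take(xs);
                    *self = Form::Mature {
                        containers: taken.chunks(16).map(|c| c.to_vec()).collect(),
                    };
                }
            }
            Form::Mature { containers } => {
                // Simplified: a real structure would route by boundary.
                containers.last_mut().unwrap().push(v);
            }
        }
    }

    fn is_mature(&self) -> bool {
        matches!(self, Form::Mature { .. })
    }
}

fn main() {
    let mut f = Form::Larval(Vec::new());
    for v in 0..10 {
        f.insert(v);
    }
    assert!(!f.is_mature()); // small data: still a flat sorted array
    for v in 10..=64 {
        f.insert(v);
    }
    assert!(f.is_mature()); // volume now justifies the metabolic investment
}
```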

Compensatory structures. When benchmarks reveal a weakness (e.g., Diatom Bitmap's contains() is slower than Roaring due to boundary registry lookup), the response is not to bolt on an unrelated optimization but to ask whether the organic can grow a compensatory structure through its normal lifecycle. A thermally promoted companion index that emerges from tick cycles, and is strengthened by the same access patterns that reveal the weakness, fits this model.

Thermal inheritance. When set operations produce derived structures (e.g., bitmap_a & bitmap_b), the result should inherit thermal hints from its parents. A derived bitmap whose parents were both thermally adapted should be born partially mature, not start from thermal zero. In biology, offspring inherit immune memory; in computing, derived structures should inherit access pattern knowledge.


9. Three Exemplars

Three organic primitives have completed evaluation through at least Phase 5 (Benchmarking). Each has a dedicated paper with the full specification, benchmark methodology, limitations, and related work. The purpose of this position paper is not to reproduce those papers. It is to show what the framework produces in practice and why each exemplar merits further technical reading.

Nacre Array offers an alternative to Vec / dynamic array. Inspired by nacre (mother-of-pearl) [18], it stores data in segments separated by fracture planes, making structural split a first-class operation instead of an afterthought. Its win is structural mutation: split and mid-array edit behavior become far more interesting than in a fully contiguous array. Its loss is straightforward too: pure random access remains worse than Vec. See "Nacre Array: A Segmented Dynamic Array with Thermal State Management and Structural Fracture Planes" [27] for the quantitative tradeoffs.

Diatom Bitmap offers an alternative to Roaring Bitmap. Inspired by diatom frustules, it derives domain boundaries from the observed density of the data rather than inheriting fixed partitions from the conventional bitmap lineage. Its win is that aligned set-oriented work can benefit substantially when the learned boundaries match the data. Its loss is that point lookup becomes more explicit and can cost more than in fixed-boundary Roaring. See "Diatom Bitmap: Density-Derived Container Boundaries and Cooperative Thermoregulation" [28] for the formal specification and benchmark results.

Mycelial Cache offers an alternative to LRU/LFU/ARC [12] cache eviction. Inspired by fungal mycelial networks [6, 20], it maintains a co-access mesh with Hebbian edge strengthening [4], bridge-aware eviction scoring, and fever response under workload change. Its win is eviction quality on workloads with meaningful co-access structure. Its loss is higher metadata and per-access cost than simpler cache policies. See "Mycelial Cache: Topology-Aware Eviction Through Use-Dependent Structural Reinforcement" [29] for the current implementation, measurements, and limits.

9.1 Four-Dimensional Analysis: Diatom Bitmap

To demonstrate the four-dimensional framework from Section 6, we present the Diatom Bitmap's four-dimensional complexity profile. This analysis complements the classical complexity table in the dedicated paper.

O(e) Efficiency. Core operations: insert O(log c + log k) versus Roaring O(log c), where c is containers and k is container size. contains O(log c + log k) versus O(log c). Aligned set operations (AND, OR, XOR, ANDNOT) O(c), matching Roaring. cardinality O(1) versus Roaring O(c), cached at bitmap level. The boundary registry adds O(log c) to point operations; set operations are unaffected because they iterate containers, not the registry.

O(a) Adaptiveness. Boundary drift rate: up to max_drift_per_tick boundaries repositioned per tick cycle, converging on density-optimal placement within ~50-100 ticks for stable distributions. Container type promotion (Array → Bitmap → RunLength) responds to density changes within one tick. Thermal FSM transitions (Hot → Warm → Cold) require cooldown_ticks consecutive low-access ticks (default: 3). Catastrophic forgetting: workload shifts trigger boundary re-derivation from the current histogram, not incremental adjustment; old boundaries are replaced, not blended. This is appropriate for bitmaps (the data defines optimal boundaries) but means adaptation to the new distribution restarts from scratch.
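The boundary re-derivation mentioned above rests on locating valleys in a density histogram. A minimal version of that step might look as follows; the bucket granularity and the strict-local-minimum rule are assumptions made for the sketch.

```rust
// Illustrative valley detection: place container boundaries at buckets that
// are strict local minima of the density histogram, i.e. sparse gaps between
// dense regions. Not the Diatom Bitmap implementation.
fn valley_boundaries(hist: &[u32]) -> Vec<usize> {
    let mut boundaries = Vec::new();
    for i in 1..hist.len().saturating_sub(1) {
        if hist[i] < hist[i - 1] && hist[i] < hist[i + 1] {
            boundaries.push(i); // split the domain at this sparse bucket
        }
    }
    boundaries
}

fn main() {
    // Two dense regions separated by a sparse gap at bucket 3.
    let hist = [9, 8, 7, 1, 6, 9, 8];
    assert_eq!(valley_boundaries(&hist), vec![3]);
    // Uniform data has no valleys: the adversarial case from O(r) above,
    // where density-derived boundaries degrade to fixed-style behavior.
    assert!(valley_boundaries(&[5, 5, 5, 5]).is_empty());
}
```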

O(r) Resilience. Adversarial inputs: uniformly distributed data defeats density-derived boundaries (all containers equal size), producing Roaring-equivalent behavior with O(log c) overhead for boundary lookup. This is the worst case, not a failure. Maintenance skipped: boundaries freeze at last-computed positions; thermal states freeze at current state. The bitmap remains correct and queryable; it simply stops adapting. Cold compression stops occurring but existing compressed containers remain valid. Capacity overflow: container count grows linearly with data range; no cliff edge.

O(τ) Thermal Cost. Per-tick overhead: O(c) for thermal FSM transitions across all containers + O(c) for boundary drift evaluation + O(cold) for cold compression attempts. At 100 containers, measured tick cost is ~2-5μs. Amortized cost over 10K operations: < 0.1% of total computation at 100ms tick intervals. Break-even point: density-derived boundaries outperform fixed boundaries when data has at least two distinct density regions (clusters separated by sparse gaps), which describes the majority of real-world bitmap workloads (time-series IDs, user cohorts, geographic regions).


10. Scope and Non-Claims

This is not a rejection of classical computer science. The four dimensions of complexity extend O(n) analysis; they do not replace it. Efficiency remains the baseline dimension, and any organic that fails to meet efficiency requirements is not viable regardless of its adaptive capabilities.

This is not biologically faithful simulation. We do not claim that Nacre Array accurately models the biomineralization of aragonite tablets. The biological analog is a reasoning tool; it reveals structural insights that mechanical metaphors hide. The computational implementation is evaluated on its own merits, not on its fidelity to biology.

This is not "everything should be organic." The organic/inorganic taxonomy exists to make the choice visible. Many use cases are best served by simple, predictable, zero-overhead inorganic structures. An organic is justified when the workload characteristics match the organic's strengths: long-lived data, non-uniform access, structural relationships, or resilience requirements.

This is not performance marketing. The dedicated primitive papers publish losses alongside wins. This position paper argues that the process is rigorous enough to surface real tradeoffs rather than hide them. The goal is honest engineering, not competitive scorecards.


11. Research Agenda

The research agenda proceeds on three fronts:

Expanding the organic catalog. Three organics have completed Phase 5 evaluation (Nacre Array, Diatom Bitmap, Mycelial Cache). A fourth, Waggle Convergence (multi-source signal aggregation inspired by honeybee waggle dances), is in early evaluation. Additional candidates address classical structures not yet covered: B-tree indexes, LSM-tree stores, graph routers, priority queues. Each candidate enters the seven-phase evaluation process, and most will not survive. We expect to abandon more candidates than we publish, and consider that healthy.

Formalizing the metabolic tier. Eight metabolic strategies are cataloged, each mapping a biological energy-management behavior to a computational resource-management policy. The five-phase evaluation process filters for strategies that formalize what production systems already do informally, and that fail safely when they do fail. Metabolic evaluation requires real-world system mapping, not just theoretical analysis.

Developing evaluation methodology. The Metabolic Cost Principle demands steady-state benchmarking, lifecycle economics analysis, and honest reporting of both wins and losses. As more organics complete evaluation, we refine the methodology itself, improving how we measure adaptiveness, resilience, and thermal cost.

11.1 Open Questions

Several questions remain open and guide ongoing research:

  • Metamorphosis thresholds. At what cardinality should an organic transition from inorganic baseline behavior to its full organic form? Is this threshold static, or should it adapt?
  • Cross-organic interactions. When organics are composed (e.g., a Mycelial Cache storing Nacre Arrays), do their metabolic cycles interfere or cooperate? Can thermal state propagate across organic boundaries?
  • Steady-state benchmarking methodology. How many tick cycles constitute a fair steady-state evaluation? Does this vary by organic, by workload, or by both?
  • Formal verification. Can organic invariants be verified formally, given that internal state evolves through tick-driven maintenance? What proof techniques apply to adaptive systems?

12. Conclusion

This paper does not argue that biological systems provide ready-made answers for computing. Evolution contains dead ends as well as durable organizational patterns. The claim is narrower: when similar adaptive patterns recur across biological systems, they can motivate hypotheses about resource allocation, maintenance, resilience, and structural change in computing.

The framework advances those hypotheses at the level of computing primitives. Its contribution is not a universal replacement for classical data structures and systems design. It is a framework for describing and evaluating primitives whose behavior depends on lifecycle state, workload adaptation, and recurring maintenance cost.

The value of that framework will be determined by companion papers, implementations, and measurements, not by terminology alone. The practical research question is when explicit lifecycle management produces better primitives than fixed representations, and under what costs, limits, and workload assumptions. This paper defines the framework within which that question can be pursued.


References

  1. Chambi, S., Lemire, D., Kaser, O., & Godin, R. (2016). Better bitmap performance with Roaring bitmaps. Software: Practice and Experience, 46(5), 709-719.

  2. O'Neil, E. J., O'Neil, P. E., & Weikum, G. (1993). The LRU-K page replacement algorithm for database disk buffering. ACM SIGMOD Record, 22(2), 297-306.

  3. Einziger, G., Friedman, R., & Manes, B. (2017). TinyLFU: A highly efficient cache admission policy. ACM Transactions on Storage, 13(4), 1-31.

  4. Hebb, D. O. (1949). The Organization of Behavior: A Neuropsychological Theory. Wiley.

  5. Seeley, T. D. (2010). Honeybee Democracy. Princeton University Press.

  6. Fricker, M. D., Boddy, L., & Bebber, D. P. (2007). Network organisation of mycelial fungi. In R. J. Howard & N. A. R. Gow (Eds.), The Mycota, Vol. VIII (pp. 309-330). Springer.

  7. Bonabeau, E., Dorigo, M., & Theraulaz, G. (1999). Swarm Intelligence: From Natural to Artificial Systems. Oxford University Press.

  8. Ball, P. (1999). The Self-Made Tapestry: Pattern Formation in Nature. Oxford University Press.

  9. Wegner, P. (1997). Why interaction is more powerful than algorithms. Communications of the ACM, 40(5), 80-91.

  10. Kreps, J., Narkhede, N., & Rao, J. (2011). Kafka: a distributed messaging system for log processing. Proceedings of the NetDB Workshop, 1-7.

  11. Yang, F., Tschetter, E., Léauté, X., Ray, N., Merlino, G., & Ganguli, D. (2014). Druid: A real-time analytical data store. Proceedings of the 2014 ACM SIGMOD International Conference on Management of Data, 157-168.

  12. Megiddo, N., & Modha, D. S. (2003). ARC: A self-tuning, low overhead replacement cache. In Proceedings of the 2nd USENIX Conference on File and Storage Technologies (FAST '03) (pp. 115-130). USENIX Association.

  13. Holland, J. H. (1975). Adaptation in Natural and Artificial Systems. University of Michigan Press.

  14. McCulloch, W. S., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5(4), 115-133.

  15. Dorigo, M., & Stützle, T. (2004). Ant Colony Optimization. MIT Press.

  16. Simard, S. W., Perry, D. A., Jones, M. D., Myrold, D. D., Durall, D. M., & Molina, R. (1997). Net transfer of carbon between ectomycorrhizal tree species in the field. Nature, 388(6642), 579-582.

  17. von Frisch, K. (1967). The Dance Language and Orientation of Bees. Harvard University Press.

  18. Jackson, A. P., Vincent, J. F. V., & Turner, R. M. (1988). The mechanical design of nacre. Proceedings of the Royal Society of London. Series B, Biological Sciences, 234(1277), 415-440.

  19. Wolff, J. (1892). Das Gesetz der Transformation der Knochen. Verlag von August Hirschwald.

  20. Tero, A., Takagi, S., Saigusa, T., Ito, K., Bebber, D. P., Fricker, M. D., Yumiki, K., Kobayashi, R., & Nakagaki, T. (2010). Rules for biologically inspired adaptive network design. Science, 327(5964), 439-442.

  21. Müller-Schloer, C., Schmeck, H., & Ungerer, T. (Eds.). (2011). Organic Computing: A Paradigm Shift for Complex Systems. Birkhäuser Basel.

  22. Sleator, D. D., & Tarjan, R. E. (1985). Self-adjusting binary search trees. Journal of the ACM, 32(3), 652-686.

  23. Frigo, M., Leiserson, C. E., Prokop, H., & Ramachandran, S. (1999). Cache-oblivious algorithms. In Proceedings of the 40th Annual Symposium on Foundations of Computer Science (FOCS '99) (pp. 285-298). IEEE.

  24. Bender, M. A., Demaine, E. D., & Farach-Colton, M. (2000). Cache-oblivious B-trees. In Proceedings of the 41st Annual Symposium on Foundations of Computer Science (FOCS '00) (pp. 399-409). IEEE.

  25. Kraska, T., Beutel, A., Chi, E. H., Dean, J., & Polyzotis, N. (2018). The case for learned index structures. In Proceedings of the 2018 International Conference on Management of Data (SIGMOD '18) (pp. 489-504). ACM.

  26. Kephart, J. O., & Chess, D. M. (2003). The vision of autonomic computing. Computer, 36(1), 41-50.

  27. Phillips, B. D. (2026). Nacre Array: A segmented dynamic array with thermal state management and structural fracture planes. Mutuus Research.

  28. Phillips, B. D. (2026). Diatom Bitmap: Density-derived container boundaries and cooperative thermoregulation for compressed set membership. Mutuus Research.

  29. Phillips, B. D. (2026). Mycelial Cache: Topology-aware eviction through use-dependent structural reinforcement. Mutuus Research.


Citation

BibTeX

@article{phillips2026organic,
  title={Adaptive Computing Primitives: A Position Paper on Bio-Inspired Structures, Policies, and Coordination},
  author={B. D. Phillips},
  year={2026},
  journal={Mutuus Research},
  doi={10.5281/zenodo.19056237},
  url={https://mutuus.bytequilt.com/research/organic-data-structures-manifesto}
}

APA

Phillips, B. D. (2026). Adaptive Computing Primitives: A Position Paper on Bio-Inspired Structures, Policies, and Coordination. Mutuus Research. https://doi.org/10.5281/zenodo.19056237
