A field is not a collection. It is a system. This node names the structural mechanism through which a corpus integrates its heterogeneous components into a coherent whole: not by reducing them to a common denominator, but by establishing protocols for their interaction. In the Socioplastics architecture, this is the final operation of Core III. The nine disciplinary fields — linguistics, conceptual art, epistemology, systems theory, architecture, urbanism, media theory, morphogenesis, dynamics — do not merge. They interact. The integration layer is what makes this interaction possible. It specifies the protocols: how a concept from architecture speaks to a concept from systems theory, how a method from conceptual art validates a claim from epistemology, how a scale from urbanism transforms a model from dynamics. These protocols are not universal. They are field-specific. They emerge from the Socioplastics corpus itself, from the accumulated operations of seventeen years. The integration layer is the meta-protocol that governs all other protocols. Node 1510 places this concept at the closure of Core III because integration is the final operation of the disciplinary field layer. Without this concept, Core III is a list of nine fields. With it, Core III is a demonstration that nine fields can be integrated without being dissolved. The layer is the field's operating system. It is what allows the corpus to run.
Burrell, J. (2016) ‘How the machine “thinks”: Understanding opacity in machine learning algorithms’, Big Data & Society, 3(1), pp. 1–12. doi: 10.1177/2053951715622512.
Burrell’s analysis of machine-learning opacity provides a crucial theoretical bridge between technical design and social consequence by demonstrating that algorithmic inscrutability is not a single problem but a stratified epistemic condition. She distinguishes three forms of opacity: deliberate corporate or state secrecy, public and professional technical illiteracy, and a deeper opacity generated by the scale, dimensionality and mathematical optimisation of machine-learning systems themselves. This third form is the most philosophically consequential, because even transparent code may fail to yield humanly intelligible reasons once a model has learned statistical relations across vast data structures. Her examples are instructive: the neural-network diagram on page 6 visualises handwritten-digit recognition as layered connections between input, hidden and output nodes, while page 7 shows that the machine’s internal weighting patterns do not correspond to familiar human categories such as curves, bars or diagonals. The spam-filtering case further reveals the gap between semantic interpretation and statistical classification: a Nigerian scam email is not recognised through narrative genre, intention or deception, but through weighted lexical fragments such as “please”, “money” or “contact”. The case synthesis therefore unsettles simplistic calls for transparency: disclosure, auditing and computational literacy remain necessary, yet insufficient where machine reasoning resists translation into human explanation. Ultimately, Burrell reframes algorithmic accountability as an interdisciplinary obligation: legal scholars, social scientists, computer scientists, domain experts and affected publics must jointly evaluate not merely code, but the classificatory systems through which life chances are increasingly governed.
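Burrell's spam-filter example can be made concrete with a minimal sketch. This is my own illustration, not Burrell's code or any real filter: the tokens, weights, bias and threshold are all invented. It shows only the structural point she makes, that a linear classifier scores weighted lexical fragments and never engages with narrative genre, intention or deception.

```python
def tokenize(text):
    """Lowercase whitespace tokenisation; no semantics involved."""
    return text.lower().split()

# Hypothetical learned weights: positive values push toward "spam".
# A real model would learn thousands of such weights from data.
WEIGHTS = {"money": 2.0, "please": 0.8, "contact": 1.2, "meeting": -1.5}

def spam_score(text, weights=WEIGHTS, bias=-1.0):
    # Sum the weight of every known token; unknown tokens contribute 0.
    return bias + sum(weights.get(tok, 0.0) for tok in tokenize(text))

def is_spam(text, threshold=0.0):
    # Classification is a threshold on the weighted sum, nothing more.
    return spam_score(text) > threshold

print(is_spam("please contact us about your money"))
print(is_spam("agenda for the project meeting"))
```

The classifier "recognises" the scam message only because three of its fragments carry positive weight; the same mechanism, scaled to high-dimensional learned weights, is what produces the third form of opacity Burrell describes.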
Kahl, P. (2025) The Epistemic Architecture of Power: How Knowledge Control Sustains Authority in Social Structures. 2nd edn. London: Lex et Ratio Ltd. Available under CC BY-NC-ND 4.0.
Kahl’s Epistemic Architecture of Power advances a sophisticated account of authority by arguing that power is sustained not only through coercion, law, wealth or institutional hierarchy, but through the systematic organisation of what actors are permitted to know, interpret and contest. Its central proposition is that political and institutional domination operates through the capture of epistemic agency, whereby individuals or groups become dependent on authorised interpreters for the very frameworks through which reality is understood. The thesis identifies three modalities: representation, where delegates evolve into epistemic principals; alliance, where partners consolidate a mutually reinforcing interpretive order; and appeasement, where weaker actors internalise dominant frames to avoid exclusion or sanction. Across these modalities, Kahl isolates four mechanisms—delegated interpretation, narrative consolidation, information gatekeeping and epistemic socialisation—which progressively narrow contestability and normalise dependency. A case synthesis emerges in the comparison between democratic representation and corporate governance: in both, constituents or shareholders may formally retain rights of oversight, yet their evaluative capacity is weakened when representatives monopolise data, language and interpretive standards. The argument’s normative force lies in treating epistemic agency as a public good, thereby imposing fiduciary-epistemic duties on those who control interpretive infrastructures. Ultimately, Kahl reframes justice itself as inseparable from knowledge governance: institutions are legitimate only where they protect plural interpretation, preserve dissent, and prevent authority from becoming an oligarchy over reality.
BioticCoupling
A field is not only a cognitive system. It is also a living system. The **BioticCoupling** names the structural connection between epistemic infrastructure and biological process: the way a corpus breathes, metabolizes, grows, and decays. In the Socioplastics architecture, this concept operates at the deepest level of Core VI — Executive Mode — because it addresses the field's most fundamental condition: its existence as a form of life. The Protein Stratum (Books 17–19: MetadataSkin, DatasetFormation, MetabolicCondensation) already gestures toward this. The corpus is described in metabolic terms: it digests, absorbs, transforms, and excretes. The BioticCoupling extends this logic to the field's relationship with its environment. A living field does not merely store information; it exchanges matter and energy with its surroundings. It takes in new concepts (nutrition), processes them through its internal grammar (metabolism), and produces new theoretical forms (growth). It also sheds obsolete formulations (decay) and generates waste that must be managed (the archive layer). The BioticCoupling is not a metaphor borrowed from biology. It is a structural homology. The same organizational principles that govern living systems — autopoiesis, homeostasis, allostasis — govern the Socioplastics field. This is why the concept sits at Node 2998, deep in the Executive Mode layer. It is the final structural recognition that the field is not a machine to be operated but an organism to be maintained. The BioticCoupling ensures that Socioplastics will not be killed by its own success — by the weight of its 3,000 nodes, 30 Books, and 60 DOIs. It provides the conceptual framework for managing the field's own life cycle: growth, maturity, adaptation, and eventual transformation. Without this concept, the corpus risks becoming a monument. With it, the field remains alive.
Boullée, É.-L. (1953) Architecture, Essay on Art. Edited and annotated by H. Rosenau. Translated by S. de Vallée. London: Alec Tiranti.
Étienne-Louis Boullée’s Architecture, Essay on Art advances one of the most forceful theoretical claims of Enlightenment architecture: building is not merely the technical art of construction, but the poetic art of producing ideas through form. Boullée begins by challenging the reduction of architecture to Vitruvian utility and structural competence, insisting that the architect must study nature, sensation and the expressive power of volumes if architecture is to move the human mind. His argument depends on architectural character, the capacity of a building to declare its purpose, moral status and emotional charge through proportion, mass, light, shadow and disposition; a theatre should communicate pleasure, a palace dignity, a basilica majesty, a prison terror, and a monument civic grandeur. The essay’s most decisive conceptual case lies in Boullée’s theory of simple geometric bodies, especially the sphere, cube and pyramid, which he regards as uniquely capable of producing unity, clarity and sublimity because they are immediately intelligible to the eye and capable of overwhelming the imagination. His architectural method therefore transforms geometry into affect: vast surfaces, severe symmetry, controlled daylight, darkness and scale become instruments for awakening reverence, melancholy, joy or awe. The case of public monuments is especially revealing, because Boullée imagines architecture as a civic pedagogy, able to honour sovereigns, justice, nation and collective memory not through ornament alone but through an intensified correspondence between function and emotion. This does not make his architecture coldly abstract; on the contrary, his abstraction seeks maximum sensuous and moral force.
Boullée’s definitive contribution is thus sublime composition: architecture becomes an art of intellectual drama, in which elementary forms are enlarged beyond ordinary habit so that buildings speak directly to the passions, transforming space into a theatre of reason, nature and public imagination.
Klein, G. (2020) Pina Bausch’s Dance Theater: Company, Artistic Practices and Reception. Translated by E. Polzer. Bielefeld: transcript Verlag. doi: 10.14361/9783839450550.
Gabriele Klein’s Pina Bausch’s Dance Theater: Company, Artistic Practices and Reception offers a major reconceptualisation of Bausch’s œuvre by shifting attention away from the solitary mythology of “Pina” and towards the entire ecology of production through which Tanztheater Wuppertal generated, transmitted and renewed its art. Klein begins from the historical shock of Bausch’s 1970s stage language: dancers coughed, smoked, shouted, flirted, collapsed, crossed water, soil, carnations and stones, and transformed banal gestures, social habits, objects, animals, plants and emotions into dance, thereby dismantling the established borders between choreography, theatre, everyday life and performance. Yet the book refuses to repeat inherited critical myths; instead, it proposes a praxeology of translation as both theory and method, understanding each production as a continuous process of translation between speech and movement, body and writing, rehearsal and performance, performer and audience, cultural research and staged form, memory and renewal. The decisive case study is the Tanztheater Wuppertal itself, treated not merely as a company executing Bausch’s vision, but as a social and artistic formation whose dancers, designers, musicians, rehearsal practices, research trips, restagings and acts of passing on collectively shaped the works. Klein’s analysis of international coproductions further demonstrates that Bausch’s method anticipated later debates on artistic research, since the company investigated everyday rituals, gestures, music, customs and atmospheres across cities and cultures before transforming them into choreographic material. The result is not ethnographic illustration, but a dense theatrical practice in which the human condition is searched for through difference, repetition, affect and encounter.
Klein’s conclusion is that Bausch’s art survives through living transmission: in restagings, audience memories, critical texts, bodily inheritance and the ongoing translation of dance into discourse, where the work remains contemporary precisely because it is never simply preserved, but continually reactivated.
Caswell, M. (2021) Urgent Archives: Enacting Liberatory Memory Work. Abingdon and New York: Routledge.
Michelle Caswell’s Urgent Archives argues that community archives must move beyond representation and become instruments of liberatory memory work, actively disrupting white supremacy, hetero-patriarchy, colonialism, and other oppressive systems in the present. Her central claim is that archives should not merely recover minoritised histories for inclusion within dominant institutions; they should be activated for resistance, solidarity, and transformation. Grounded in critical archival studies and more than a decade of work with the South Asian American Digital Archive, Caswell contrasts mainstream archives, which often reproduce exclusion through claims of neutrality, with community archives, which openly embrace affect, activism, participation, and care. The Dhillonn home movies and Zain Alam’s remix Lavaan provide a powerful case study: footage of a South Asian American interracial family in 1950s Oklahoma becomes, through artistic activation, a meditation on racism, assimilation, post-9/11 anti-Sikh violence, and the cyclical temporality of oppression. This example shows that records do not simply represent the past; when activated, they can generate political feeling, collective recognition, and a call to action. Caswell therefore shifts archival value from possession to use, from preservation to mobilisation, from symbolic inclusion to structural change. In conclusion, Urgent Archives insists that archives are urgent because oppression is ongoing: memory work must not wait for a distant future, but must intervene now, enabling communities to imagine and enact more just worlds.
Schwartz, J.M. and Cook, T. (2002) ‘Archives, Records, and Power: The Making of Modern Memory’, Archival Science, 2, pp. 1–19.
Schwartz and Cook’s “Archives, Records, and Power” dismantles the professional myth that archives are neutral repositories of historical fact, arguing instead that archives are active sites of power where memory, identity, evidence, and social legitimacy are produced, organised, and contested. Their central proposition is that archives do not simply preserve the past; they help determine which pasts become visible, authoritative, and usable. Against the older positivist image of the archivist as impartial guardian of truth, the authors insist that every stage of archival work—record creation, appraisal, selection, description, preservation, access, and interpretation—involves consequential acts of mediation. The archive is therefore not a passive storehouse but a social construct, shaped by governments, institutions, corporations, families, and individuals whose interests determine what is recorded, retained, privileged, or erased. This argument is especially powerful in relation to marginalised groups, since women, racialised communities, queer people, the poor, the non-literate, and other subaltern subjects have often been excluded from official memory through archival silence. Yet Schwartz and Cook also recognise that archives may become tools of resistance when read against the grain or when alternative communities create their own documentary spaces. Their case synthesis shows that archival power lies precisely in this double capacity: archives can stabilise dominant narratives, but they can also expose their fractures. In conclusion, the authors demand a postmodern archival ethics grounded in transparency, accountability, plurality, and critical self-awareness; to deny archival power is not neutrality, but complicity with the status quo.
Gebru, T., Morgenstern, J., Vecchione, B., Vaughan, J.W., Wallach, H., Daumé III, H. and Crawford, K. (2018) ‘Datasheets for Datasets’, Proceedings of the 5th Workshop on Fairness, Accountability, and Transparency in Machine Learning.
LeCun, Y., Bengio, Y. and Hinton, G. (2015) ‘Deep learning’, Nature, 521, pp. 436–444.
LeCun, Bengio and Hinton’s “Deep Learning” presents deep learning as a transformative form of representation learning, in which computational systems discover hierarchical features directly from raw data rather than relying on hand-engineered descriptors. Their central argument is that deep neural networks achieve their power by composing multiple layers of non-linear transformations, each layer converting an input into increasingly abstract representations: pixels become edges, edges become motifs, motifs become object parts, and parts become recognisable objects. This architecture enables systems to solve the long-standing selectivity–invariance problem, remaining sensitive to meaningful differences while ignoring irrelevant variation such as lighting, position, accent, or background. The article’s technical core is backpropagation, the procedure that uses gradients to adjust millions of internal weights so that errors decrease across training examples. Its case studies show the method’s breadth: convolutional neural networks revolutionised image recognition after the 2012 ImageNet breakthrough, recurrent neural networks advanced speech and language processing, and distributed word representations allowed machines to map semantic similarities into vector space. The visual examples are especially revealing: the convolutional network trained on a Samoyed image illustrates the layered extraction of visual structure, while the image-captioning system shows how deep vision models and recurrent language models can be joined to translate visual scenes into sentences. In conclusion, the article frames deep learning not as a narrow algorithmic technique, but as a general computational paradigm whose success derives from learning complex representations at scale, thereby reshaping artificial intelligence across vision, speech, language, science, and industry.
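The backpropagation procedure the article describes can be shown at toy scale. The sketch below is my own illustration, not code from the article: a single hidden unit, one training example, and invented initial weights and learning rate. It demonstrates only the core mechanism, using gradients obtained by the chain rule to adjust internal weights so that the error decreases across iterations.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# One input, one hidden unit, one linear output; squared-error loss on
# a single (input, target) pair. All values are invented for illustration.
x, target = 1.0, 0.0
w1, w2, lr = 0.5, -0.3, 0.1

for _ in range(100):
    h = sigmoid(w1 * x)      # forward pass: hidden activation
    y = w2 * h               # forward pass: output
    err = y - target         # d(loss)/dy for loss = 0.5 * err**2
    # backward pass: the chain rule yields a gradient for each weight
    g_w2 = err * h
    g_w1 = err * w2 * h * (1 - h) * x
    # gradient descent: step each weight against its gradient
    w2 -= lr * g_w2
    w1 -= lr * g_w1

print(f"final |error| = {abs(w2 * sigmoid(w1 * x) - target):.4f}")
```

At the scale the article discusses, the same two-pass computation is repeated over millions of weights and examples; nothing changes structurally except dimensionality.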
Cowen, D. (2014) The Deadly Life of Logistics: Mapping Violence in Global Trade. Minneapolis: University of Minnesota Press.
Deborah Cowen’s The Deadly Life of Logistics argues that logistics is not a neutral technique for moving goods efficiently, but a political technology of circulation whose modern form binds global trade to military strategy, imperial power, labour discipline, and security governance. In the chapter “The Revolution in Logistics”, Cowen traces how logistics moved from the military art of supplying armies to the corporate science of managing production, distribution, storage, transport, and consumption as one integrated system. This transformation was not merely technical: it reconfigured economic space itself by replacing isolated cost reduction with total cost analysis, a systems-based method that calculated value across entire supply chains. The apparently simple diagram of “integrated distribution management” becomes, for Cowen, a historical symptom of a deeper revolution: production no longer ends at the factory gate, but at the point where the consumer uses the commodity. Her case study of containerisation is especially decisive. Developed through military supply needs and later standardised through war and trade, the shipping container enabled just-in-time production, reduced port labour, intensified intermodal transport, and helped globalise production while weakening organised workers. Deregulation further extended this logistical order by reorganising rail, trucking, shipping, and telecommunications around transnational flows rather than national infrastructures. In conclusion, Cowen shows that logistics produces the world it claims merely to manage: beneath the language of efficiency lies a violent spatial rationality that transforms territory, labour, sovereignty, and security into instruments for protecting the continuous movement of capital.
Hayles, N.K. (2017) Unthought: The Power of the Cognitive Nonconscious. Chicago: University of Chicago Press.
N. Katherine Hayles’s Unthought argues that cognition must be radically detached from the privileged domain of human consciousness and redefined as a broader process distributed across human, biological, technical, and material systems. Her central proposition is that much of what enables perception, decision, adaptation, and meaning-making occurs through the cognitive nonconscious: a layer of processing inaccessible to introspection yet indispensable to conscious thought. Rather than treating consciousness as the sovereign centre of cognition, Hayles presents it as only one level within a wider ecology that includes unconscious processes, bodily perception, technical devices, artificial agents, plants, animals, and networked media. This shift challenges anthropocentric assumptions by showing that cognition is not limited to rational reflection or linguistic abstraction; it also appears in pattern recognition, environmental responsiveness, feedback loops, and adaptive behaviour. Her case studies range from human neural processing to plant signalling and computational systems, demonstrating that cognition emerges whenever information is interpreted in context and used to guide action. Particularly important is her concept of cognitive assemblages, where humans and technical systems operate together, as when smartphones, sensors, algorithms, networks, and users form temporary but consequential units of distributed cognition. In such assemblages, agency no longer belongs exclusively to the human subject; it circulates through relations among bodies, machines, codes, and environments. In conclusion, Hayles compels the humanities to rethink thought itself: beneath deliberate awareness lies an immense field of unthought cognition that structures contemporary life, from biological survival to digital infrastructures and planetary technosystems.
Starosielski, N. (2015) The Undersea Network. Durham and London: Duke University Press.
Cifor, M. and Gilliland, A.J. (2015) ‘Affect and the archive, archives and their affects: an introduction to the special issue’, Archival Science, 16(1), pp. 1–6.
Cifor and Gilliland argue that archives cannot be understood as neutral repositories of evidence, since records, archival spaces, absences, and acts of retrieval generate powerful affective responses that shape memory, identity, justice, and scholarly interpretation. Their introduction situates archival studies within the broader affective turn, where emotions, feelings, and bodily responses are treated not as secondary disturbances to knowledge, but as legitimate objects of critical inquiry. This perspective challenges the profession’s inherited attachment to objectivity by asking how archives provoke sadness, trust, anger, trauma, hope, longing, or recognition in those who encounter them. The archive, in this sense, is not only a place where the past is stored; it is a charged field where personal and collective lives are reactivated. The special issue they introduce develops this claim through cases involving LGBTQ, feminist, human rights, post-genocide, diaspora, and institutional-care archives, demonstrating that records may restore continuity for displaced communities, intensify pain for survivors, or enable marginalised subjects to recognise themselves within history. Particularly significant is the attention given to absence: missing, destroyed, or unattainable records can produce their own emotional force, generating what the authors describe through concepts such as imagined records and impossible archival imaginaries. The case of genocide survivors and displaced Bosnian communities is especially revealing, since records become instruments of mourning, identity reconstruction, truth-finding, and social healing. Ultimately, Cifor and Gilliland insist that affect is not peripheral to archival practice but constitutive of it: to archive is to mediate between evidence and emotion, bureaucracy and embodiment, memory and justice.
Larkin, B. (2013) ‘The Politics and Poetics of Infrastructure’, Annual Review of Anthropology, 42, pp. 327–343.
For Brian Larkin, infrastructure is not merely a technical substrate for moving water, electricity, vehicles, data, or people; it is a material form of governance, imagination, and sensory experience. His central argument moves infrastructure away from the apparently neutral domain of engineering and into an anthropology of technopolitics and material poetics, showing that roads, pipes, satellites, metros, and electrical systems condense state rationalities, collective desires, and embodied ways of inhabiting modernity. A pipe, therefore, does not simply distribute water; it may also distribute citizenship, dependency, moral calculation, or exclusion, as in Mumbai and Soweto, where access to water becomes entangled with political patronage, urban belonging, and neoliberal discipline. Likewise, a road may promise progress even when it remains empty, while a housing project may function more effectively as an administrative document than as actual shelter. Larkin’s strength lies in showing that infrastructure operates doubly: as a technical system enabling circulation, and as an aesthetic sign addressing its publics through visibility, monumentality, failure, or desire. Against the claim that infrastructures become visible only when they break down, he demonstrates that many are deliberately hypervisible: emblems of state power, progress, sovereignty, or collective aspiration. In conclusion, to study infrastructure is to examine not only cables, bridges, pipes, and roads, but also budgets, affects, materials, imaginaries, and bodies; where there appears to be mere circulation, Larkin reveals a deeper grammar of modern power.
Rheinberger, H.-J. (2010) An Epistemology of the Concrete: Twentieth-Century Histories of Life. Durham and London: Duke University Press.
Hans-Jörg Rheinberger’s An Epistemology of the Concrete defines scientific knowledge as a historically situated experimental practice, produced through the reciprocal action of instruments, organisms, concepts, inscriptions and research communities. The foreword by Tim Lenoir situates Rheinberger within historical epistemology, a Franco-German tradition concerned with the concrete conditions through which scientific objects become thinkable, manipulable and conceptually productive. The central proposition is that science advances through experimental systems: material arrangements capable of generating unforeseen epistemic things at the frontier between knowledge and ignorance. This argument gains force through the prologue’s account of twentieth-century life sciences, where genetics and molecular biology emerge from dense assemblages of model organisms, apparatuses, laboratory protocols and interdisciplinary techniques. The decisive case study is the model organism. Rheinberger shows that organisms such as Drosophila, Ephestia, bacteria, viruses and tobacco mosaic virus become technical supports for general biological questions, selected for manipulability, accumulated knowledge and access to specific phenomena. Molecular biology further illustrates the thesis through ultracentrifugation, electron microscopy, chromatography, electrophoresis and liquid scintillation counting, whose instrumental configurations helped reshape the concepts of gene, information and biological specificity. Rheinberger’s epistemology therefore treats scientific objects as material-discursive hybrids, formed through recursion, trace, preparation and inscription. In conclusion, the concrete history of life science appears as an art of productive uncertainty, where experimental systems sustain controlled openness and allow concepts to acquire form through practice.
Alexander, C. (1979) The Timeless Way of Building. New York: Oxford University Press.
Benjamin, W. (1999) The Arcades Project. Translated by H. Eiland and K. McLaughlin. Cambridge, MA and London: Belknap Press of Harvard University Press.
Walter Benjamin’s The Arcades Project constructs a material archaeology of modernity, taking the Parisian arcade as the architectural, commercial and dreamlike emblem of the nineteenth century. The translators’ foreword presents the work as Benjamin’s vast inquiry into the “primal history” of that century, assembled through fragments, citations, images and convolutes rather than continuous exposition. Its central proposition is that capitalist modernity becomes legible through its residues: shopfronts, iron girders, glass roofs, panoramas, fashion, commodities, interiors, barricades and the wandering figure of the flâneur. The case study of the Paris arcades is decisive: in the 1935 exposé, Benjamin describes them as glass-roofed, marble-panelled corridors devoted to luxury commerce, where art enters the service of the merchant and the passage becomes a miniature city. The frontispiece of the Passage Jouffroy and the early arcade illustrations visually condense this thesis, showing urban space as both shelter and spectacle, street and interior, commodity theatre and collective dream. Through the concept of phantasmagoria, Benjamin shows how the new forms of iron construction, gas lighting, department stores and world exhibitions cloak capitalist relations in enchantment. Yet the method is critical as much as poetic: the dialectical image arrests historical fragments at the moment of recognisability, allowing the present to awaken from the dream of progress. In conclusion, Benjamin transforms Paris into an epistemic labyrinth, where modernity reveals itself through glittering surfaces, forgotten debris and the political charge of historical awakening.
Beer, S. (1989) The Viable System Model: Its Provenance, Development, Methodology and Pathology. Cwarel Isaf Institute.
Stafford Beer’s “The Viable System Model” presents viability as the capacity of any organism, organisation or polity to maintain independent identity within a changing environment. The paper reconstructs the VSM’s provenance across military psychology, neurocybernetics, operational research, industry and government, showing that Beer’s model emerged from a sustained search for invariances in adaptive systems rather than from loose biological analogy. Its central proposition is cybernetic: every viable system contains five necessary and sufficient subsystems, each contributing to production, coordination, control, intelligence and identity. The decisive case study is recursion. Beer argues that every viable system contains, and is contained within, another viable system; hence citizens compose communities, communities compose cities, cities compose states, and each level requires its own autonomy and metasystemic cohesion. The model’s methodological strength lies in topological mapping, where homomorphic and isomorphic relations disclose structural invariants across apparently different domains. Beer’s use of Ashby’s Law of Requisite Variety gives the argument operational precision: environmental complexity must be matched by regulatory complexity through attenuation, amplification and transduction. The pathological dimension sharpens the theory further, since organisational failure becomes diagnosable as malfunction within one or more subsystems, such as weak coordination, collapsed intelligence, confused identity or excessive centralisation. In conclusion, the VSM offers a rigorous architecture of adaptive governance, enabling managers and institutions to design autonomy without fragmentation, cohesion without domination, and systemic learning without surrendering identity.
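Ashby’s Law of Requisite Variety, which underwrites Beer’s argument, can be illustrated in its simplest counting form. The sketch below is my own construction, not Beer’s or Ashby’s notation: the outcome table (d + r) mod n and the particular state counts are invented. It shows the law’s core claim, that the variety of outcomes a regulator can achieve is bounded from below by the ratio of disturbance variety to regulatory variety.

```python
from itertools import combinations

def min_outcome_variety(n_disturb, moves):
    """Smallest number of distinct outcomes a regulator can force.

    Disturbances are 0..n_disturb-1; the regulator picks one move per
    disturbance; the (invented) outcome table is (d + r) mod n_disturb.
    """
    outcomes = range(n_disturb)
    for size in range(1, n_disturb + 1):
        # Can some target set of this size absorb every disturbance?
        for target in combinations(outcomes, size):
            if all(any((d + r) % n_disturb in target for r in moves)
                   for d in range(n_disturb)):
                return size
    return n_disturb

# Six disturbance states: outcome variety falls only as far as the
# regulator's own variety allows -- "only variety absorbs variety".
print(min_outcome_variety(6, [0, 1]))     # 2 regulatory moves
print(min_outcome_variety(6, [0, 2, 4]))  # 3 regulatory moves
```

With two moves the regulator can narrow six possible outcomes to three at best; with three moves, to two; only a regulator with the full six moves could hold the outcome to a single state. Attenuation and amplification, in Beer’s terms, are ways of shifting these two varieties toward balance.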
Lefebvre, H. (1991) The Production of Space. Translated by D. Nicholson-Smith. Oxford and Cambridge, MA: Blackwell.
Henri Lefebvre’s The Production of Space establishes space as an active social product, generated through relations of power, knowledge, labour and everyday practice. His opening argument dismantles the inherited view of space as a neutral geometrical container, showing instead that every society produces its own spatial order through institutions, representations, techniques and lived routines. The central proposition is therefore profoundly architectural and political: space organises social life while being organised by it. Lefebvre’s case study is modern capitalist space, which he describes as abstract space: a homogenising field produced through planning, property, state power, exchange value, technical expertise and the world market. This space operates through fragmentation and centralisation at once, separating dwelling, labour, circulation and leisure while subordinating them to measurable, governable and commodifiable order. Against this reduction, Lefebvre proposes a unitary theory capable of holding together physical, mental and social space: material environments, conceptual representations and lived experience must be analysed as one dynamic ensemble. His later horizon is differential space, a spatial possibility arising from use, embodiment, conflict, festival, memory and appropriation, where social life exceeds the abstract logic imposed by capital and the state. In conclusion, Lefebvre transforms spatial thought into a critique of modern power: to understand space is to understand how society is produced, disciplined and contested, and to imagine space differently is to open the political possibility of another collective life.
Goethe, J.W. von (2009) The Metamorphosis of Plants. Introduction and photography by G.L. Miller. Cambridge, MA and London: MIT Press.
Johann Wolfgang von Goethe’s The Metamorphosis of Plants establishes a morphological science of living form, grounded in exact observation, poetic intuition and the search for unity within botanical diversity. Gordon L. Miller’s introduction presents the work as Goethe’s attempt to integrate scientific and symbolic perception, allowing nature to be understood through both sensory accuracy and imaginative insight. The central proposition is the doctrine of metamorphosis: the plant’s visible organs—cotyledon, stem leaf, calyx, corolla, stamen, pistil, fruit and seed—are successive transformations of one formative organ, the archetypal leaf, the principle through which Goethe pursued his idea of the Urpflanze or archetypal plant. Goethe’s method proceeds through disciplined attention to transitional forms, especially those moments where one organ begins to assume the structure of another. The illustrated case study of the annual plant is decisive: Figure 1 separates pistil, stamens, corolla, calyx, stem leaves, cotyledons and roots, making visible the sequential order through which the plant ascends from seed to fruit, while the chrysanthemum images distinguish regular and irregular metamorphosis as two modes of revealing the same formative law. The palm leaves from Padua further clarify Goethe’s insight, showing successive differentiation within a single foliar series. His science therefore treats morphology as movement: form appears through polarity, expansion, contraction and intensification. In conclusion, Goethe offers a delicate empiricism in which seeing becomes participation, botanical knowledge becomes a disciplined art of perception, and the living plant becomes an intelligible drama of unity unfolding through difference.
Rheinberger, H.-J. (2010) An Epistemology of the Concrete: Twentieth-Century Histories of Life. Durham and London: Duke University Press.
Hans-Jörg Rheinberger’s An Epistemology of the Concrete presents scientific knowledge as a material-discursive practice generated through experimental systems, model organisms, instruments and historically situated concepts. The foreword by Tim Lenoir emphasises Rheinberger’s decisive contribution to historical epistemology: science advances through recursive configurations in which objects emerge from technical arrangements, instead of appearing as ready-made entities awaiting discovery. The prologue develops this proposition through the life sciences, especially genetics and molecular biology, where organisms, apparatuses and laboratory inscriptions become active participants in knowledge production. The cover’s moth imagery and the contents’ emphasis on Pisum, Eudorina, Ephestia and tobacco mosaic virus already stage the book’s central case study: the model organism as a living technical object, selected, cultivated and transformed so that general biological questions may become experimentally tractable. Rheinberger’s account of molecular biology is especially instructive: its emergence depended upon assemblages of ultracentrifugation, electron microscopy, chromatography, electrophoresis, liquid scintillation counting, viruses, bacteria and interdisciplinary cooperation, which together produced new concepts of gene, information and biological specificity. His notion of phenomenotechnique, inherited from Bachelard, gives the argument its philosophical force: phenomena are technically constituted through instruments that embody prior knowledge while opening unforeseen futures. In conclusion, Rheinberger offers an epistemology grounded in concrete practices, where science becomes a historical art of configuring uncertainty, sustaining productive vagueness and allowing epistemic things to acquire form through experimental life.
Maton, K. (2014) ‘Seeing knowledge and knowers: Social realism and Legitimation Code Theory’, in Knowledge and Knowers: Towards a Realist Sociology of Education. Abingdon: Routledge.
Karl Maton’s opening chapter in Knowledge and Knowers establishes Legitimation Code Theory as a conceptual architecture for making knowledge practices visible, analysable and sociologically consequential. His argument begins with the knowledge paradox: contemporary societies proclaim knowledge as the defining force of economic, political and cultural transformation, while social science frequently treats knowledge as homogeneous information, transferable tokens or subjective experience. Against this reduction, Maton advances social realism, a position that understands knowledge as socially produced and historically situated, while also possessing structures, powers and effects that shape learning, research and institutional life. The chapter’s decisive case study is educational research itself: Maton shows how the field has often privileged learning processes, identities and power relations while leaving the internal organisation of knowledge insufficiently theorised. LCT responds by offering an explanatory framework organised through dimensions such as Specialization, Semantics, Autonomy, Density and Temporality, each enabling researchers to identify the principles by which practices claim legitimacy. The chapter’s figure distinguishing social ontologies, explanatory frameworks and substantive research studies is especially important, since it positions LCT as a practical theory: a toolkit developed through empirical engagement, capable of refinement as data “speak back” to concepts. In conclusion, Maton reframes education as a field where knowledge and knowers must be analysed together, allowing curriculum, pedagogy and research to build cumulative, powerful and socially just forms of understanding.
Prigogine, I. (1980) From Being to Becoming: Time and Complexity in the Physical Sciences. San Francisco: W.H. Freeman.
Ilya Prigogine’s From Being to Becoming challenges the classical scientific imagination that privileges stability, determinism, equilibrium, and timeless laws. Against a worldview centred on being, Prigogine advances a theory of becoming, in which time, irreversibility, instability, and complexity are not secondary disturbances but fundamental features of physical reality. His work on non-equilibrium thermodynamics demonstrates that systems far from equilibrium may generate unexpected forms of organisation rather than simply collapsing into disorder. This is the significance of dissipative structures: they show that order can emerge through flux, exchange, turbulence, and energy dissipation. A whirlpool, a chemical reaction, a biological organism, or an ecological system may maintain structure precisely because it remains open to its environment. Prigogine therefore contests the idea that nature is best understood as a closed, predictable machine. Instead, he presents reality as a temporal process marked by bifurcations, probabilities, thresholds, and emergent possibilities. A specific case study is the chemical clock, where reactions produce rhythmic patterns under certain non-equilibrium conditions, revealing that matter can organise itself temporally. This has profound philosophical consequences: the future is not merely the mechanical unfolding of a pre-given past, but a field of potential transformations. Prigogine’s thought is especially useful for cultural theory, media studies, and systems thinking because it provides a scientific vocabulary for analysing change, contingency, and emergence. His conclusion is clear: to understand complexity, one must abandon the metaphysics of permanence and think reality as irreversible becoming.
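The chemical clock can be made concrete through the Brusselator, the reaction scheme Prigogine developed with René Lefever: when the parameter B exceeds 1 + A², the system abandons its equilibrium point and settles into sustained oscillation. The following is a minimal Euler-integration sketch with illustrative parameter values, not a reconstruction of any specific experiment:

```python
def brusselator(a=1.0, b=3.0, x=1.0, y=1.0, dt=0.001, steps=50_000):
    """Euler integration of the Brusselator rate equations:
         dx/dt = a + x^2 y - (b + 1) x
         dy/dt = b x - x^2 y
    For b > 1 + a^2 the fixed point (a, b/a) is unstable and the
    concentrations oscillate on a limit cycle -- a 'chemical clock'."""
    trace = []
    for _ in range(steps):
        dx = a + x * x * y - (b + 1.0) * x
        dy = b * x - x * x * y
        x += dx * dt
        y += dy * dt
        trace.append(x)
    return trace

xs = brusselator()
late = xs[len(xs) // 2:]   # discard the initial transient
print(f"x oscillates between {min(late):.2f} and {max(late):.2f}")
```

The philosophical point survives the simplification: order here is not imposed from outside but generated by the flux itself, and it persists only while the system remains far from equilibrium.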
Goffman, E. (1974) Frame Analysis: An Essay on the Organization of Experience. New York: Harper & Row.
Erving Goffman’s Frame Analysis offers a sophisticated account of how human beings organise experience through frames, that is, interpretive structures that allow individuals to answer the implicit question: “What is going on here?” For Goffman, reality is not simply encountered in a raw or self-evident form; it is socially arranged through conventions, cues, roles, settings, and expectations. A frame determines whether an action is understood as play, aggression, ritual, rehearsal, accident, irony, performance, or institutional procedure. The same gesture, for example, may signify violence in one context, sport in another, theatrical acting in another, and comic exaggeration in another. This makes framing central to social order, because interaction depends upon shared assumptions about the nature of the situation. Goffman also shows that frames are fragile: they can be transformed, misunderstood, manipulated, or deliberately broken. A joke may become an insult; a performance may be mistaken for sincerity; a political image may be reframed by media circulation. A specific case study might be a courtroom, where speech, clothing, spatial arrangement, and ritualised address frame participants as judge, defendant, witness, lawyer, or observer. Without that frame, the same utterances would not carry the same authority. Goffman’s theory is therefore invaluable for analysing media, art, politics, and everyday conduct, because it reveals the invisible grammar by which situations acquire meaning. His conclusion is not that reality is unreal, but that social reality is always mediated by organised interpretive procedures.
Derrida, J. (1995) ‘Archive Fever: A Freudian Impression’, Diacritics, 25(2), pp. 9–63.
Jacques Derrida’s “Archive Fever: A Freudian Impression” proposes that the archive is never a passive container of historical evidence, but a political and psychic apparatus through which memory is authorised, ordered, and also partially destroyed. The archive begins with the arkhe: both commencement and commandment, the place where things begin and the authority that determines how they may be interpreted. Consequently, every archive is governed by institutional power, since what is preserved, classified, omitted, or legitimised depends upon structures of law, ownership, access, and interpretation. Derrida’s key insight is that the archive is animated by a paradoxical desire: it seeks to conserve traces of the past, yet this very impulse is haunted by repetition, repression, and the death drive. The wish to archive everything emerges from anxiety before loss, but the archive can never overcome loss entirely, because selection and exclusion are conditions of its existence. A museum, state archive, university collection, or activist repository therefore does not simply recover history; it produces a specific version of history through its protocols of preservation. A useful case study is the counter-archive: political groups, feminist collectives, and marginalised communities often construct alternative archives because official institutions have failed to preserve their histories. Derrida’s argument thus transforms the archive into a dynamic, unstable field of struggle. The conclusion is decisive: the archive is not where memory rests, but where memory is continuously contested, institutionalised, and exposed to disappearance.
Kirschenbaum, M.G. (2008) Mechanisms: New Media and the Forensic Imagination. Cambridge, MA: MIT Press.
Matthew G. Kirschenbaum’s Mechanisms: New Media and the Forensic Imagination advances a decisive challenge to the seductive myth that digital texts are weightless, unstable, and immaterial. Rather than treating electronic writing as a purely screen-based phenomenon, Kirschenbaum insists that digital objects possess a complex material ontology, distributed across physical storage media, logical structures, and conceptual interfaces. His distinction between forensic materiality and formal materiality is especially important: the former concerns the singular traces left by inscription on media such as hard drives and disks, while the latter names the procedural organisation of data through software environments and computational systems. This argument transforms new media studies by relocating attention from the visible screen to the hidden mechanisms of storage, recovery, erasure, and transmission. For example, the Department of Defense’s concern with data remanence demonstrates that digital information can remain stubbornly persistent even after deletion, contradicting academic accounts that emphasise ephemerality. Kirschenbaum’s case studies, including Mystery House, Michael Joyce’s Afternoon, and William Gibson’s Agrippa, show that electronic texts are not abstract events but historically situated artefacts shaped by hardware, software, protocols, and social practices. The book therefore proposes a forensic imagination: a critical method attentive to traces, versions, inscriptions, and preservation. Its conclusion is clear: digital culture can only be understood when its apparent immateriality is re-read through the durable, fragile, and historically specific mechanisms that sustain it.
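Kirschenbaum’s distinction between formal and forensic materiality, and the data-remanence point in particular, can be sketched with a toy block device: “deleting” a file removes only its directory entry, while the underlying blocks retain their bytes until they happen to be overwritten. This is a hypothetical model for illustration, not the API of any real filesystem:

```python
class ToyDisk:
    """A toy block store illustrating data remanence: deletion is
    formal (the name disappears from the directory) rather than
    forensic (the inscribed bytes persist on the medium)."""

    def __init__(self, n_blocks: int = 8):
        self.blocks = [b""] * n_blocks   # physical storage
        self.directory = {}              # filename -> block index

    def write(self, name: str, data: bytes, block: int) -> None:
        self.blocks[block] = data
        self.directory[name] = block

    def delete(self, name: str) -> None:
        # Formal deletion: only the directory entry is removed.
        self.directory.pop(name)

    def recover(self, block: int) -> bytes:
        # Forensic reading bypasses the directory entirely.
        return self.blocks[block]

disk = ToyDisk()
disk.write("letter.txt", b"confidential", block=2)
disk.delete("letter.txt")
print("letter.txt" in disk.directory)   # the file is formally gone
print(disk.recover(2))                  # yet the inscription persists
```

The gap between the two read paths is precisely the gap the forensic imagination exploits: what the interface declares erased, the medium may still hold.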
Merton, R.K. (1973) The Sociology of Science: Theoretical and Empirical Investigations. Edited and introduced by N.W. Storer. Chicago: University of Chicago Press.
Robert K. Merton’s The Sociology of Science advances a foundational proposition: science is not merely a body of verified knowledge or a technical method, but a social institution sustained by normative commitments that make reliable inquiry possible. The pivotal essay, “The Normative Structure of Science”, identifies the ethos of modern science as a complex of institutional imperatives: universalism, communism, disinterestedness, and organised scepticism. These norms do not describe scientists as morally pure individuals; rather, they specify the social expectations through which scientific claims are tested, circulated, rewarded, and disciplined. Universalism requires that truth-claims be assessed independently of the race, nationality, class, religion, or personal status of their authors. Communism, in Merton’s specialised sense, means that scientific knowledge is a common inheritance rather than private property, even though recognition remains attached to discovery. Disinterestedness does not imply the absence of ambition, but the subordination of personal gain to public standards of verification. Organised scepticism obliges science to suspend deference before sacred, political, or economic authority, submitting claims to impersonal scrutiny. A specific case appears in Merton’s discussion of anti-intellectual hostility toward science under conditions of authoritarianism, religious orthodoxy, economic pressure, or racial nationalism: when science is subordinated to external power, its autonomy and credibility deteriorate. His conclusion is decisive: scientific knowledge depends upon a fragile moral architecture, and the defence of science requires protecting the institutions that permit criticism, openness, and collective verification.
Beer, S. (1989) ‘The viable system model: Its provenance, development, methodology and pathology’. Cwarel Isaf Institute.
Stafford Beer’s “The Viable System Model” advances a rigorous cybernetic proposition: any system capable of independent existence must possess a recursive structure of viability, allowing it to regulate complexity, preserve identity, and adapt within a changing environment. Beer’s central claim is not analogical but formal: brains, firms, states, cells, and social organisations may be compared because they instantiate invariant patterns of regulation, not because one merely resembles another metaphorically. The model develops from operational research, neurocybernetics, industry, government, and the large-scale Chilean application of 1971–73, culminating in the principle that every viable system contains, and is contained within, another viable system. Its decisive theoretical engine is Ashby’s Law of Requisite Variety: only variety can absorb variety; therefore, management cannot control complexity through simplification alone, but must attenuate excessive environmental variety, amplify regulatory capacity, and maintain channels and transducers adequate to the information they must carry. A specific case is System Five, the locus of identity and closure. Beer recalls Salvador Allende’s insistence that, in Chile, System Five was not the president but the people, thereby revealing the political difficulty of defining the self-awareness of a viable system. The model’s pathology is equally important: organisations fail when subsystems collapse, when adaptation loses identity, when coordination is absent, or when future intelligence is sacrificed to operational command. Beer’s conclusion is uncompromising: management is not hierarchy but cybernetic design for survival, autonomy, cohesion, and recursive intelligence.
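The recursion principle has a naturally nested structure: the same five-subsystem pattern repeats at every level, and each operational unit of one level is itself a complete viable system. The following schematic sketch uses hypothetical names and is an illustration of the nesting, not a formalisation of Beer’s diagrams:

```python
# Beer's five subsystems (Systems 1-5), here given descriptive labels.
SUBSYSTEMS = ("operations", "coordination", "control",
              "intelligence", "identity")

class ViableSystem:
    """Every level carries the same five-subsystem pattern, and its
    operational units (System 1) are themselves viable systems --
    Beer's recursion principle."""

    def __init__(self, name, contained=()):
        self.name = name
        self.subsystems = dict.fromkeys(SUBSYSTEMS)
        self.contained = list(contained)  # embedded viable systems

    def depth(self):
        """Levels of recursion below and including this system."""
        return 1 + max((c.depth() for c in self.contained), default=0)

# Citizens compose a community; the community composes a city.
citizens = [ViableSystem(f"citizen-{i}") for i in range(3)]
community = ViableSystem("community", contained=citizens)
city = ViableSystem("city", contained=[community])
print(city.depth())   # three levels of viable-system nesting
```

The structural point the sketch preserves is that no level is privileged: the city is diagnosable with exactly the same five-part vocabulary as the citizen, which is what licenses Beer’s claim that the comparison is formal rather than metaphorical.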