MachineMachine /stream - tagged with entropy https://machinemachine.net/stream/feed en-us http://blogs.law.harvard.edu/tech/rss LifePress therourke@gmail.com
<![CDATA[the "undifferentiated mass of organic sensation" origin]]> http://ask.metafilter.com/mefi/378843

In this text from 1966, Robert Smithson quotes Roland Barthes as saying the “undifferentiated mass of organic sensation.” But I can’t find the origin of the quote. A skewed translation? Or possibly just made up by Smithson? Any ideas about where it might come from would be appreciated.

]]>
Thu, 14 Mar 2024 13:38:54 -0700 http://ask.metafilter.com/mefi/378843
<![CDATA[The Thermodynamic Theory of Ecology | Quanta Magazine]]> https://www.quantamagazine.org/the-thermodynamic-theory-of-ecology-20140903/

The Western Ghats in India rise like a wall between the Arabian Sea and the heart of the subcontinent to the east.

]]>
Fri, 12 Jan 2018 03:11:16 -0800 https://www.quantamagazine.org/the-thermodynamic-theory-of-ecology-20140903/
<![CDATA[Controversial New Theory Suggests Life Wasn't a Fluke of Biology—It Was Physics | WIRED]]> https://www.wired.com/story/controversial-new-theory-suggests-life-wasnt-a-fluke-of-biologyit-was-physics/

The biophysicist Jeremy England made waves in 2013 with a new theory that cast the origin of life as an inevitable outcome of thermodynamics.

]]>
Sun, 06 Aug 2017 11:35:32 -0700 https://www.wired.com/story/controversial-new-theory-suggests-life-wasnt-a-fluke-of-biologyit-was-physics/
<![CDATA[A Grand New Theory of Life's Evolution on Earth - The Atlantic]]> https://www.theatlantic.com/science/archive/2017/05/a-grand-unified-theory-for-life-on-earth/525648/

A series of energy revolutions—some natural, some technological—built upon one another to give us our rich, diverse biosphere.

]]>
Sat, 20 May 2017 06:35:11 -0700 https://www.theatlantic.com/science/archive/2017/05/a-grand-unified-theory-for-life-on-earth/525648/
<![CDATA[Towards a statistical mechanics of consciousness: maximization of number of connections is associated with conscious awareness]]> https://arxiv.org/abs/1606.00821

Authors: R. Guevara Erra, D. M. Mateos, R. Wennberg, J.L. Perez Velazquez Abstract: It has been said that complexity lies between order and disorder. In the case of brain activity, and physiology in general, complexity issues are being considered with increased emphasis.

]]>
Sun, 23 Oct 2016 04:56:19 -0700 https://arxiv.org/abs/1606.00821
<![CDATA[Paul Beatriz Preciado: Feminism beyond humanism, ecology beyond the environment | Autonomies]]> http://autonomies.org/it/2015/07/paul-beatriz-preciado-feminism-beyond-humanism-ecology-beyond-the-environment/

During one of his “infinite interviews”, Hans-Ulrich Obrist asked me to pose an urgent question which artists and political movements must answer together.

]]>
Fri, 19 Feb 2016 14:57:22 -0800 http://autonomies.org/it/2015/07/paul-beatriz-preciado-feminism-beyond-humanism-ecology-beyond-the-environment/
<![CDATA[The Deeper I Stare Into the Internet, the More I See the World Going to Waste | Motherboard]]> http://motherboard.vice.com/read/the-deeper-i-stare-into-the-internet-the-more-shit-i-see-going-to-waste

We have a strange, sad way of leaving things behind. Not just regular old things—spent notebooks, obsolete consumer electronics, whatever—but big things that happen to be old.

]]>
Wed, 08 Oct 2014 01:54:38 -0700 http://motherboard.vice.com/read/the-deeper-i-stare-into-the-internet-the-more-shit-i-see-going-to-waste
<![CDATA[Evolution, Entropy, and Information]]> http://blogs.discovermagazine.com/cosmicvariance/2012/06/07/evolution-entropy-and-information/

There are really two points. The first is a bit of technical background you can ignore if you like, and skip to the next paragraph. It’s the idea of “relative entropy” and its equivalent “information” formulation. Information can be thought of as “minus the entropy,” or even better “the maximum entropy possible minus the actual entropy.” If you know that a system is in a low-entropy state, it’s in one of just a few possible microstates, so you know a lot about it. If it’s high-entropy, there are many states that look that way, so you don’t have much information about it. (Aside to experts: I’m kind of shamelessly mixing Boltzmann entropy and Gibbs entropy, but in this case it’s okay, and if you’re an expert you understand this anyway.) John explains that the information (and therefore also the entropy) of some probability distribution is always relative to some other probability distribution, even if we often hide that fact by taking the fiducial probability to be uniform (… in some va
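The formulation sketched above can be made concrete in a few lines of code. This is a minimal illustrative sketch (not from the post itself), assuming a toy system with four microstates: it computes Shannon entropy, the "information = maximum possible entropy minus actual entropy" quantity, and the relative entropy (Kullback-Leibler divergence) against a fiducial distribution, which coincides with that information when the fiducial distribution is uniform.

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i * log2(p_i), in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q): the entropy of p
    relative to a fiducial distribution q."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# A system with 4 microstates: low-entropy (peaked) vs high-entropy (uniform)
peaked = [0.97, 0.01, 0.01, 0.01]
uniform = [0.25, 0.25, 0.25, 0.25]

h_max = math.log2(4)              # maximum possible entropy: 2 bits
info = h_max - entropy(peaked)    # "information" = max entropy - actual entropy

print(entropy(uniform))   # 2.0 bits: many look-alike states, little information
print(info)               # close to 2 bits: a peaked state tells us a lot
print(relative_entropy(peaked, uniform))  # equals info when q is uniform
```

The last line makes the post's point explicit: the information in `peaked` is always measured relative to some other distribution, here hidden in the choice of a uniform fiducial `q`.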

]]>
Fri, 15 Jun 2012 05:19:00 -0700 http://blogs.discovermagazine.com/cosmicvariance/2012/06/07/evolution-entropy-and-information/
<![CDATA[Rigid Implementation vs Flexible Materiality]]> http://machinemachine.net/text/research/rigid-implementation-vs-flexible-materiality

Wow. It’s been a while since I updated my blog. I intend to get active again here soon, with regular updates on my research. For now, I thought it might be worth posting a text I’ve been mulling over for a while (!) Yesterday I came across this old TED presentation by Daniel Hillis, and it set off a bunch of bells tolling in my head. His book The Pattern on the Stone was one I leafed through a few months back whilst hunting for some analogies about (digital) materiality. The resulting brainstorm is what follows. (This blog post, from even longer ago, acts as a natural introduction: On (Text and) Exaptation)

In the 1960s and 70s Roland Barthes named “The Text” as a network of production and exchange. Whereas “the work” was concrete, final – analogous to a material – “the text” was more like a flow, a field or event – open ended. Perhaps even infinite. In ‘From Work to Text’, Barthes wrote:

The metaphor of the Text is that of the network… (Barthes 1979)

This semiotic approach to discourse, by initiating the move from print culture to “text” culture, also helped lay the ground for a contemporary politics of content-driven media. Skipping backwards through ‘From Work to Text’, we find this statement:

The text must not be understood as a computable object. It would be futile to attempt a material separation of works from texts.

I am struck here by Barthes’ use of the phrase “computable object”, as well as his attention to the “material”. N. Katherine Hayles, in her essay ‘Print Is Flat, Code Is Deep’ (Hayles 2004), teases out the statement for us:

‘computable’ here mean[s] to be limited, finite, bound, able to be reckoned. Written twenty years before the advent of the microcomputer, his essay stands in the ironic position of anticipating what it cannot anticipate. It calls for a movement away from works to texts, a movement so successful that the ubiquitous ‘text’ has all but driven out the media-specific term book.

Hayles notes that the “ubiquity” of Barthes’ term “Text” allowed – in its wake – an erasure of media-specific terms, such as “book”. In moving from the Work to the Text, we move not just between different politics of exchange and dissemination, we also move between different forms and materialities of mediation. (Manovich 2002) For Barthes the material work was computable, whereas the network of the text – its content – was not.

In 1936, the year that Alan Turing wrote his iconic paper ‘On Computable Numbers’, a German engineer by the name of Konrad Zuse began building the first digital computer. Like its industrial predecessors, Zuse’s computer was designed to function via a series of holes encoding its program. Born as much out of convenience as financial necessity, Zuse punched his programs directly into discarded reels of 35mm film-stock. Fused together by the technologies of weaving and cinema, Zuse’s computer announced the birth of an entirely new mode of textuality. The Z3, the world’s first working programmable, fully automatic computer, arrived in 1941. (Manovich 2002) A few years earlier a young graduate by the name of Claude Shannon had published one of the most important master’s theses in history. In it he demonstrated that any logical expression of Boolean algebra could be programmed into a series of binary switches. Today computers still function with a logic impossible to distinguish from that of their mid-20th century ancestors. What has changed is the material environment within which Boolean expressions are implemented. Shannon’s work first found itself manifest in the fragile rows of vacuum tubes that drove much of the technical innovation of the 40s and 50s. In time, the very same Boolean expressions were firing, domino-like, through millions of transistors etched onto the surface of silicon chips. If we were to query the young Shannon today, he might well gawp in amazement at the material advances computer technology has gone through. But if Shannon were to examine either your digital wrist-watch or the world’s most advanced supercomputer in detail, he would once again feel at home in the simple binary – on/off – switches lining those silicon highways. Here the difference between how computers are implemented and what computers are made of digs the first of many potholes along our journey.
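Shannon's demonstration that any Boolean expression can be realised as a chain of binary switches can be sketched in a few lines. The sketch below (my own illustration, not Shannon's notation) builds everything from a single primitive, NAND, which is the move that makes implementation independent of materiality: the primitive could equally be a relay, a vacuum tube, or an etched transistor.

```python
# Any Boolean expression can be reduced to one primitive switch: NAND.
# Whether that switch is a relay, a vacuum tube, or a transistor is a
# question of materiality; the logic below is identical in all three.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Adds two bits: returns (sum, carry)."""
    return xor(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))  # 1 + 1 gives sum 0, carry 1
```

Chain enough of these half-adders together and you have arithmetic; chain enough arithmetic and you have a computer, regardless of what the switches are made of.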
We live in an era not only practically driven by the computer, but an era increasingly determined by the metaphors computers have injected into our language. Let us not make the mistake of presupposing that brains (or perhaps minds) are “like” computers. Tempting though it is to reduce the baffling complexities of the human being to the functions of the silicon chip, the parallel processor or the Wide Area Network, this reduction occurs most usefully at the level of metaphor and metonym. Again the mantra must be repeated that computers function through the application of Boolean logic and binary switches, something that cannot be said about the human brain with any confidence a posteriori. Later I will explore the consequences of the processing paradigm for our understanding of ourselves, but for now, or at least for the next few paragraphs, computers are to be considered in terms of their rigid implementation and flexible materiality alone.

At the beginning of his popular science book, The Pattern on the Stone (Hillis 1999), W. Daniel Hillis narrates one of his many tales about the design and construction of a computer. Built from tinker-toys, the computer in question is functionally complex enough to “play” tic-tac-toe (noughts and crosses). The tinker-toy was chosen to indicate the apparent simplicity of computer design, but as Hillis argues himself, he may very well have used pipes and valves to create a hydraulic computer, driven by water pressure, or stripped the design back completely, using flowing sand, twigs and twine, or any other recipe of switches and connectors. The important point is that the tinker-toy tic-tac-toe computer functions perfectly well for the task it is designed for – perfectly well, that is, until the tinker-toy material begins to fail. This failure is what Chapter 1 of this thesis is about: why it happens, why its happening is a material phenomenon, and how the very idea of “failure” is suspect.
Tinker-toys fail because the mechanical operation of the tic-tac-toe computer puts strain on the strings of the mechanism, eventually stretching them beyond practical use. In a perfect world, devoid of entropic behaviour, the tinker-toy computer might very well function forever, its users setting O or X conditions, and the computer responding according to its program in perfect, logical order. The design of the machine, at the level of the program, is completely closed; finished; perfect. Only materially does the computer fail (or flail), noise leaking into the system until inevitable chaos ensues and the tinker-toys crumble back into jumbles of featureless matter. This apparent closure is important to note at this stage because in a computer as simple as the tic-tac-toe machine, every variable can be accounted for and thus programmed for. Were we to build a chess-playing computer from tinker-toys (pretending we could get our hands on the, no doubt, millions of tinker-toy sets we’d need), the closed condition of the computer may be less simple to qualify. Tinker-toys, hydraulic valves or whatever material you choose could be manipulated into any computer system you can imagine; even the most brain-numbingly complicated IBM supercomputer is technically possible to build from these fundamental materials. The reason we don’t do this, why we instead choose etched silicon as our material of choice for our supercomputers, exposes another aspect of computers we need to understand before their failure becomes a useful paradigm. A chess-playing computer is probably impossible to build from tinker-toys, not because its program would be too complicated, but because tinker-toys are too prone to entropy to create a valid material environment.
The program of any chess-playing application could, theoretically, be translated into a tinker-toy equivalent, but after the 1,000th string had stretched, with millions more to go, no energy would be left in the system to trigger the next switch along the chain. Computer inputs and outputs are always at the mercy of this kind of entropy, whether in tinker-toys or miniature silicon highways. Noise and dissipation are inevitable at any material scale one cares to examine. The second law of thermodynamics ensures this. Claude Shannon and his ilk knew this, even back when the most advanced computers they had at their command couldn’t yet play tic-tac-toe. They knew that they couldn’t rely on materiality to delimit noise, interference or distortion; that no matter how well constructed a computer is, no matter how incredible it was at materially stemming entropy (perhaps with stronger string connectors, or a built-in de-stretching mechanism), entropy nonetheless was inevitable. But what Shannon and other computer innovators such as Alan Turing also knew is that their saviour lay in how computers were implemented. Again, the split here is incredibly important to note:

Flexible materiality: how and of what a computer is constructed, e.g. tinker-toys, silicon

Rigid implementation: Boolean logic enacted through binary on/off switches (usually with some kind of input → storage → feedback/program function → output). Effectively, how a computer works

Boolean logic was not enough on its own. Computers, if they were to avoid entropy ruining their logical operations, needed to have built within them an error management protocol. This protocol is still in existence in EVERY computer in the world. Effectively it takes the form of a collection of parity bits delivered alongside each packet of data that computers, networks and software deal with. The bulk of the data contains the binary bits encoding the intended quarry, but the receiving element in the system also checks the main bits against the parity bits to determine whether any noise has crept into the system. What is crucial to note here is that the error-checking of computers happens at the level of their rigid implementation. It is also worth noting that for every eight 0s and 1s delivered by a computer system, at least one of those bits is an error-checking function. W. Daniel Hillis puts the stretched strings of his tinker-toy mechanism into clear distinction, and in doing so re-introduces an umbrella term set to dominate this chapter:

I constructed a later version of the Tinker Toy computer which fixed the problem, but I never forgot the lesson of the first machine: the implementation technology must produce perfect outputs from imperfect inputs, nipping small errors in the bud. This is the essence of digital technology, which restores signals to near perfection at every stage. It is the only way we know – at least, so far – for keeping a complicated system under control. (Hillis 1999, 18)

Bibliography

Barthes, Roland. 1979. ‘From Work to Text.’ In Textual Strategies: Perspectives in Poststructuralist Criticism, ed. Josue V. Harari, 73–81. Ithaca, NY: Cornell University Press.

Hayles, N. Katherine. 2004. ‘Print Is Flat, Code Is Deep: The Importance of Media-Specific Analysis.’ Poetics Today 25 (1) (March): 67–90. doi:10.1215/03335372-25-1-67.

Hillis, W. Daniel. 1999. The Pattern on the Stone: The Simple Ideas That Make Computers Work. 1st paperback ed. New York: Basic Books.

Manovich, Lev. 2002. The Language of New Media. 1st MIT Press pbk. ed. Cambridge, Mass.: MIT Press.

]]>
Thu, 07 Jun 2012 06:08:07 -0700 http://machinemachine.net/text/research/rigid-implementation-vs-flexible-materiality
<![CDATA[Sloppy MicroChips: Can a fair comparison be made between biological and silicon entropy?]]> http://ask.metafilter.com/mefi/217051

Was reading about microchips that are designed to allow a few mistakes (known as 'sloppy chips'), and pondering equivalent kinds of 'coding' errors and entropy in biological systems. Can a fair comparison be made between the two? OK, to set up my question I probably need to run through my (basic) understanding of biological vs silicon entropy...

In the transistor, error is a bad thing (in getting the required job done as efficiently and cheaply as possible), metered by parity bits that come as standard in every packet of data transmitted. But in biological systems error is not necessarily bad. Most copying errors are filtered out, but some propagate, and some of those might become beneficial to the organism (in thermodynamics sometimes known as "autonomy producing equivocations").
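The parity-bit mechanism mentioned above is simple enough to sketch. This is a toy illustration (my own, using even parity over a 7-bit chunk, one common convention) rather than the scheme of any particular chip or protocol:

```python
def add_parity(bits):
    """Append an even-parity bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(bits_with_parity):
    """True if the packet still has an even number of 1s,
    i.e. no single-bit error crept in during transmission."""
    return sum(bits_with_parity) % 2 == 0

packet = add_parity([1, 0, 1, 1, 0, 1, 0])  # 7 data bits + 1 parity bit
assert check_parity(packet)                 # clean transmission

packet[3] ^= 1                  # noise flips one bit in transit
assert not check_parity(packet)  # the receiver detects the error
```

Note the limits of the analogy: a single parity bit detects any odd number of flipped bits but is blind to an even number, which is why real systems layer stronger codes (checksums, Hamming codes) on top. Biology's "filtering" of copying errors is far leakier, and that leakiness is precisely what the question is probing.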

Relating to the article about 'sloppy chips', how do entropy and energy efficiency factor into this? For the silicon chip, efficiency leads to heat (a problem); for the string of DNA, efficiency leads to fewer mutations, and thus less change within populations, and thus, inevitably, less capacity for organisms to diversify and react to their environments - leading to no evolution, no change, no good. Slightly less efficiency is good for biology and, it seems, good for some kinds of calculations and computer processes.

What work has been done on these connections I draw between the biological and the silicon?

I'm worried that my analogy is limited, based as it is on a paradigm for living systems that too closely mirrors the digital systems we have built. Can DNA and binary parity bit transistors be understood on their own terms, without resorting to using the other as a metaphor to understanding?

Where do the boundaries lie in comparing the two?

]]>
Tue, 05 Jun 2012 10:05:10 -0700 http://ask.metafilter.com/mefi/217051
<![CDATA[Sloppy MicroChips: Oh, that’s near enough]]> http://www.economist.com/node/21556087

Letting microchips make a few mistakes here and there could make them much faster and more energy-efficient.

Managing the probability of errors and limiting where they occur can ensure that the errors do not cause any problems. The result of a mathematical calculation, for example, need not always be calculated precisely—an accuracy of two or three decimal places is often enough. Dr Palem offers the analogy of a person about to cross a big room. Rather than wasting time and energy calculating the shortest path, it’s better just to start walking in roughly the right direction.
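Dr Palem's trade-off can be caricatured in software (a toy sketch only; real sloppy chips relax precision in hardware, at the level of voltage and gate design). Here a hypothetical "sloppy" adder sometimes flips low-order result bits, while high-order bits stay exact, so a long chain of additions lands roughly in the right place:

```python
import random

random.seed(42)  # make the simulated noise reproducible

def sloppy_add(a, b, flip_prob=0.1, low_bits=2):
    """Toy model of a 'sloppy' adder: each of the low-order result
    bits may flip with some probability; high-order bits are exact."""
    result = a + b
    for bit in range(low_bits):
        if random.random() < flip_prob:
            result ^= (1 << bit)
    return result

exact = sum(range(1000))

sloppy = 0
for n in range(1000):
    sloppy = sloppy_add(sloppy, n)

print(exact, sloppy, abs(exact - sloppy) / exact)
# the relative error stays tiny: mistakes are confined to bits that barely matter
```

This is the "start walking in roughly the right direction" analogy in code: the answer is wrong in its least significant digits, and for many workloads that is a fair price for speed and energy savings.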

]]>
Tue, 05 Jun 2012 09:18:58 -0700 http://www.economist.com/node/21556087
<![CDATA[The Arrow of Time (Debategraph)]]> http://debategraph.org/Stream.aspx?nid=100641&iv=09&mac=100641-

The debate about the nature of time and its passage is a long and venerable one. The issues addressed by pre-Socratic philosophers such as Heraclitus and Parmenides about whether time 'flows' or not prefigure present-day philosophical arguments. In his talk to the Blackheath Philosophy Forum Huw Price chose as his starting point the views of cosmologist Sir Arthur Eddington - a prominent figure in the first half of the 20th century, but little known today. What made Eddington's view of time interesting is that he was prepared to part company with most physicists - who conceive time as it is revealed in the laws of physics - and give credence to our subjective perceptions about time, particularly our perception that time passes (or 'goes on' in his terms).

]]>
Fri, 11 May 2012 08:12:52 -0700 http://debategraph.org/Stream.aspx?nid=100641&iv=09&mac=100641-
<![CDATA[“The Shannon and Weaver Model”]]> http://www.thelateageofprint.org/2012/02/20/the-shannon-and-weaver-model/

The genius of Shannon’s original paper from 1948, and its subsequent popularization by Weaver, lies in many things, among them their having formulated a model of communication located on the threshold of these two understandings of theory. As a scientist Shannon surely felt accountable to the empirical world, and his work reflects that. Yet it also seems clear that Shannon and Weaver’s work has, over the last 60 years or so, taken on a life of its own, feeding back into the reality they first set about describing. Shannon and Weaver didn’t merely model the world; they ended up enlarging it, changing it, and making it over in the image of their research.

]]>
Mon, 20 Feb 2012 06:51:16 -0800 http://www.thelateageofprint.org/2012/02/20/the-shannon-and-weaver-model/
<![CDATA[Digital Decay (2001): by Bruce Sterling]]> http://variablemedia.net/pdf/Sterling.pdf

"Entropy requires no maintenance. Entropy has its own poetry."

]]>
Wed, 10 Aug 2011 09:59:32 -0700 http://variablemedia.net/pdf/Sterling.pdf
<![CDATA[Kipple and Things: How to Hoard and Why Not To Mean]]> http://machinemachine.net/portfolio/kipple-and-things

This paper (more of an essay, really) was originally delivered at the Birkbeck/London Consortium ‘Rubbish Symposium’, 30th July 2011. Living at the very limit of his means, Philip K. Dick, a two-bit, pulp sci-fi author, was having a hard time maintaining his livelihood. It was the 1950s and Dick was living with his second wife, Kleo, in a run-down apartment in Berkeley, California, surrounded by library books Dick later claimed they “could not afford to pay the fines on.” In 1956, Dick had a short story published in a brand new pulp magazine: Satellite Science Fiction. Entitled Pay for the Printer, the story contained a whole host of themes that would come to dominate his work. On an Earth gripped by nuclear winter, humankind has all but forgotten the skills of invention and craft. An alien, blob-like species known as the Biltong co-habit Earth with the humans. They have an innate ability to ‘print’ things, popping out copies of any object they are shown from their formless bellies. The humans are enslaved not simply because everything is replicated for them, but, in a twist Dick was to use again and again in his later works, as the Biltong grow old and tired, each copied object resembles the original less and less. Eventually everything emerges as an indistinct, black mush. The short story ends with the Biltong themselves decaying, leaving humankind on a planet full of collapsed houses, cars with no doors, and bottles of whiskey that taste like anti-freeze. In his 1968 novel Do Androids Dream of Electric Sheep? Dick gave a name to this crumbling, ceaseless disorder of objects: kipple. A vision of a pudding-like universe, in which obsolescent objects merge, featureless and identical, flooding every apartment complex from here to the pock-marked surface of Mars.
“No one can win against kipple,” Dick wrote: “It’s a universal principle operating throughout the universe; the entire universe is moving toward a final state of total, absolute kippleization.” In kipple, Dick captured the process of entropy, and put it to work to describe the contradictions of mass-production and utility. Saved from the wreckage of the nuclear apocalypse, a host of original items – lawn mowers, woollen sweaters, cups of coffee – are in short supply. Nothing ‘new’ has been made for centuries. The Biltong must produce copies from copies made of copies – each replica seeded with errors will eventually resemble kipple. Objects, things, are mortal; transient. The wrist-watch functions to mark the passing of time, until it finally runs down and becomes a memory of a wrist-watch: a skeleton, an icon, a piece of kipple. The butterfly emerges from its pupa in order to pass on its genes to another generation of caterpillar. Its demise – its kipple-isation – is programmed into its genetic code. An inevitable consequence of the cosmic lottery of biological inheritance. Both the wrist-watch and the butterfly have fulfilled their functions: I utilised the wrist-watch to mark time; the ‘genetic lottery’ utilised the butterfly to extend its lineage. Entropy is absolutely certain, and pure utility will always produce it. In his book Genesis, Michel Serres argues that objects are specific to the human lineage. Specific, not because of their utility, but because they indicate our drive to classify, categorise and order: “The object, for us, makes history slow.” Before things become kipple, they stand distinct from one another. Nature seems to us defined in a similar way: between a tiger and a zebra there appears a broad gap, indicated in the creatures’ inability to mate with one another; indicated by the claws of the tiger and the hooves of the zebra.
But this gap is an illusion, as Michel Foucault neatly points out in The Order of Things: “…all nature forms one great fabric in which beings resemble one another from one to the next…” The dividing lines indicating categories of difference are always unreal, removed as they are from the ‘great fabric’ of nature, and understood through human categories isolated in language. Humans themselves are constituted by this great fabric: our culture and language lie on the same fabric. Our apparent mastery over creation comes from one simple quirk of our being: the tendency we exhibit to categorise, to cleave through the fabric of creation. For Philip K. Dick, this act is what separates us from the alien Biltong. They can merely copy, a repeated play of resemblance that will always degrade to kipple. Humans, on the other hand, can do more than copy. They can take kipple and distinguish it from itself, endlessly, through categorisation and classification. Far from using things until they run down, humans build new relations, new meanings, carefully and slowly from the mush. New categories produce new things, produce newness. At least, that’s what Dick – a Platonic idealist – believed. At the end of Pay for the Printer, a disparate group camp in the kipple-ised, sagging pudding of a formless city. One of the settlers has with him a crude wooden cup he has apparently cleaved himself with an even cruder, hand-made knife: “You made this knife?” Fergesson asked, dazed. “I can’t believe it. Where do you start? You have to have tools to make this. It’s a paradox!” In his essay, The System of Collecting, Jean Baudrillard makes a case for the profound subjectivity produced in this apparent production of newness. 
Once things are divested of their function and placed into a collection, they: “…constitute themselves as a system, on the basis of which the subject seeks to piece together [their] world, [their] personal microcosm.” The use-value of objects gives way to the passion of systematization, of order, sequence and the projected perfection of the complete set. In the collection, function is replaced by exemplification. The limits of the collection dictate a paradigm of finality; of perfection. Each object – whether wrist-watch or butterfly – exists to define new orders. Once the blue butterfly is added to the collection it stands, alone, as an example of the class of blue butterflies to which the collection dictates it belongs. Placed alongside the yellow and green butterflies, the blue butterfly exists to constitute all three as a series. The entire series itself then becomes the example of all butterflies. A complete collection: a perfect catalogue. Perhaps, like Borges’ Library of Babel, or Plato’s ideal realm of forms, there exists a room somewhere with a catalogue of everything. An ocean of examples. Cosmic disorder re-constituted and classified as a finite catalogue, arranged for the grand cosmic collector’s singular pleasure. The problem with catalogues is that absolutely anything can be collected and arranged. The zebra and the tiger may sit side-by-side if the collector is particularly interested in collecting mammals, striped quadrupeds or – a particularly broad collection – things that smell funny. Too much classification, too many cleaves in the fabric of creation, and order once again dissolves into kipple. Disorder arises when too many conditions of order have been imposed. William H. 
Gass reminds us of the linguistic conjunction ‘AND’, an absolute necessity in the cleaving of kipple into things: “[W]e must think of chaos not as a helter-skelter of worn-out and broken or halfheartedly realised things, like a junkyard or potter’s midden, but as a fluid mishmash of thinglessness in every lack of direction as if a blender had run amok. ‘AND’ is that sunderer. It stands between. It divides light from darkness.” Collectors gather things about them in order to exert a mastery over the apparent disorder of creation. The collector attains true mastery over their microcosm. The narcissism of the individual extends to the precise limits of the catalogue he or she has arranged about them. Without AND, language would function as nothing but pudding, each clause, condition or acting verb leaking into its partner in an endless series. But the problem with AND, with classes, categories and order, is that they can be cleaved anywhere. Jorge Luis Borges exemplified this perfectly in a series of fictional lists he produced throughout his career. The most infamous of these lists, which Michel Foucault claimed influenced him to write The Order of Things, refers to a “certain Chinese encyclopaedia” in which animals are divided into:

belonging to the Emperor, embalmed, tame, sucking pigs, sirens, fabulous, stray dogs, included in the present classification, frenzied, innumerable, drawn with a very fine camelhair brush, et cetera, having just broken the water pitcher, that from a long way off look like flies…

In writing about his short story The Aleph, Borges also remarked: “My chief problem in writing the story lay in… setting down of a limited catalog of endless things. The task, as is evident, is impossible, for such a chaotic enumeration can only be simulated, and every apparently haphazard element has to be linked to its neighbour either by secret association or by contrast.” No class of things, no collection, no cleaving of kipple into nonkipple can escape the functions of either “association OR contrast…” The lists Borges compiled are worthy of note because they remind us of the binary contradiction classification always comes back to:

Firstly, that all collections are arbitrary; and Secondly, that a perfect collection of things is impossible, because in the final instance there is only pudding “…in every lack of direction…”

Human narcissism – our apparent mastery over kipple – is an illusion. Collect too many things together and you re-produce the conditions of chaos you tried so hard to avoid. When the act of collecting comes to take precedence over the microcosm of the collection, when the differentiation of things begins to break down, collectors cease being collectors and become hoarders. The hoard exemplifies chaos: the very thing the collector builds their catalogues in opposition to. To tease apart what distinguishes the hoarder from the collector, I’d like to introduce two new characters into this arbitrary list I have arranged about myself. Some of you may have heard of them; indeed, they are the brothers after whom the syndrome of compulsive hoarding is named.

The brothers Homer and Langley Collyer lived in a mansion at 2078 Fifth Avenue, Manhattan. Sons of wealthy parents – their father was a respected gynaecologist, their mother a renowned opera singer – the brothers both attended Columbia University, where Homer studied law and Langley engineering. In 1933 Homer suffered a stroke which left him blind and unable to work at his law firm. As Langley began to devote his time entirely to looking after his helpless brother, both men became locked inside the mansion their family’s wealth and prestige had delivered. Over the following decade or so Langley would leave the house only at night. Wandering the streets of Manhattan, collecting water and provisions to sustain his needy brother, Langley’s routines became obsessive, giving his life a meaning above and beyond the streets of Harlem that were fast becoming run-down and decrepit. But the clutter only went one way: into the house, and, as the interest from the New York newspaper media shows, the Collyer brothers and their crumbling mansion became something of a legend in a fast-changing city. On March 21st 1947 the New York Police Department received an anonymous tip-off that there was a dead body in the Collyer mansion. Attempting to gain entry, police smashed down the front door, only to be confronted with a solid wall of newspapers (which, Langley had claimed to reporters years earlier, his brother “would read once his eyesight was restored”). Finally, after climbing in through an upstairs window, a patrolman found the body of Homer – now 65 years old – slumped dead in his kippleised armchair. In the weeks that followed, police removed one hundred and thirty tons of rubbish from the house. Langley’s body was eventually discovered crushed and decomposing under an enormous mound of junk, lying only a few feet from where Homer had starved to death.
Crawling through the detritus to reach his ailing brother, Langley had triggered one of his own booby traps, set in place to catch any robbers who attempted to steal the brothers’ clutter. The list of objects pulled from the brothers’ house reads like a Borges original. From Wikipedia: Items removed from the house included baby carriages, a doll carriage, rusted bicycles, old food, potato peelers, a collection of guns, glass chandeliers, bowling balls, camera equipment, the folding top of a horse-drawn carriage, a sawhorse, three dressmaking dummies, painted portraits, pinup girl photos, plaster busts, Mrs. Collyer’s hope chests, rusty bed springs, a kerosene stove, a child’s chair, more than 25,000 books (including thousands about medicine and engineering and more than 2,500 on law), human organs pickled in jars, eight live cats, the chassis of an old Model T Ford, tapestries, hundreds of yards of unused silks and fabric, clocks, 14 pianos (both grand and upright), a clavichord, two organs, banjos, violins, bugles, accordions, a gramophone and records, and countless bundles of newspapers and magazines. Finally: There was also a great deal of rubbish.

A Time Magazine obituary from April 1947 said of the Collyer brothers: “They were shy men, and showed little inclination to brave the noisy world.” In a final ironic twist of kippleisation, the brothers themselves became mere examples within the system of clutter they had amassed. Langley especially had hoarded himself to death. His body, gnawed by rats, was hardly distinguishable from the kipple that fell on top of it. The noisy world had been replaced by the noise of the hoard: a collection so impossible to conceive, to cleave, to order, that it had dissolved once more into pure, featureless kipple. Many hoarders meet a fate similar to the Collyer brothers’: their clutter eventually wiping them out in one final collapse of systemic disorder. To finish, I want to return briefly to Philip K. Dick.
In the 1960s, fuelled by amphetamines and a debilitating paranoia, Dick wrote 24 novels and hundreds of short stories, the duds and the classics mashed together into an indistinguishable hoard. UBIK, published in 1969, tells of a world which is itself degrading. Objects regress to previous forms: 3D televisions turn into black-and-white tube-sets, then stuttering reel-to-reel projections; credit cards slowly change into handfuls of rusted coins, impressed with the faces of Presidents long since deceased. A character turns his back for a few minutes and his hover vehicle degrades into a bi-propeller airplane. The Three Stigmata of Palmer Eldritch, a stand-out novel from 1965, begins with this memo, “dictated by Leo Bulero immediately on his return from Mars”: “I mean, after all; you have to consider we’re only made out of dust. That’s admittedly not much to go on and we shouldn’t forget that. But even considering, I mean it’s a sort of bad beginning, we’re not doing too bad. So I personally have faith that even in this lousy situation we’re faced with we can make it. You get me?”

]]>
Sun, 31 Jul 2011 10:28:32 -0700 http://machinemachine.net/portfolio/kipple-and-things
<![CDATA[Genetic Future: How much data is a human genome? It depends how you store it.]]> http://www.genetic-future.com/2008/06/how-much-data-is-human-genome-it.html

The question is pretty simple: in the not-too-distant future you and I will have had our entire genomes sequenced (except perhaps those of you in California) - so how much hard drive space will our genomes take up?

Andrew calculates that a genome will take up about two CDs worth of data, but that's only if it's stored in one possible format (a text file storing one copy of each and every DNA letter in your sequence). There are other ways you might want to keep your genome depending on what your purpose is.

The executive summary

For those who don't want to read through the tedious details that follow, here's the take-home message: if you want to store the data in a raw format for later re-analysis, you're looking at between 2 and 30 terabytes (one terabyte = 1,000 gigabytes). A much more user-friendly format, though, would be as a file containing each and every DNA letter in your genome, which would take up around 1.5 gigabytes (small enough for three genomes to fit on a standard data
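Those back-of-envelope figures are easy to reproduce. A minimal sketch, assuming a haploid genome of roughly 3.1 billion bases and decimal units (the exact length, coverage, and per-base overhead of raw data vary by platform and are assumptions here, not figures from the post):

```python
# Rough storage estimates for a human genome sequence.
# Assumptions: ~3.1 billion bases (haploid); decimal units (1 GB = 10^9 bytes).
GB = 1e9
BASES = 3.1e9

# One ASCII byte per letter (A, C, G, T) in a plain text file.
plain_text_bytes = BASES * 1

# Packed encoding: only 4 symbols, so 2 bits per base suffice.
packed_bytes = BASES * 2 / 8

print(f"plain text:   {plain_text_bytes / GB:.1f} GB")  # 3.1 GB
print(f"2-bit packed: {packed_bytes / GB:.2f} GB")      # 0.78 GB
```

The post's ~1.5 GB figure sits between these two bounds; the much larger raw-format range (2 to 30 terabytes) comes from keeping redundant sequencing coverage, quality scores, and instrument output rather than just the final letters.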

]]>
Tue, 29 Jun 2010 08:33:00 -0700 http://www.genetic-future.com/2008/06/how-much-data-is-human-genome-it.html
<![CDATA[Uniformity and Variability: An Essay in the Philosophy of Matter]]> http://museum.doorsofperception.com/doors3/transcripts/Delanda.html

If the planet needs us to speed up information, and slow down matter, what does this mean for the complex relationship between information and nature? There is a growing awareness of the importance of studying the behaviour of matter in its full complexity. According to Manuel DeLanda, author of A Short History of Matter, this is partly the result of experimentation with non-homogeneous materials. DeLanda explores some of the philosophical issues raised by new developments in materials science, including the significance of the idea that many different material and energetic systems may have a common source of spontaneous order. The historical emergence of uniform, homogeneous, predictable materials like steel entailed great gains -- DeLanda focuses on some of what may have been lost in this process, for human beings, technology and the philosophy of matter.

]]>
Mon, 03 May 2010 09:35:00 -0700 http://museum.doorsofperception.com/doors3/transcripts/Delanda.html