[[!meta title="You Are Not a Gadget"]]
* Author: Jaron Lanier
## Concepts
* Technological lock-ins.
* Cybernetic totalists versus humanistic technologies.
* Circle of empathy.
* Computationalism.
* Value of personhood contrasted to "the hive".
* Neoteny and its contradictory qualities in culture.
* Cephalopods + Childhood = Humans + Virtual Reality.
* There's an underlying tension between the individual and the collective. Is creativity purely individual? He seems to view the polarization as an obligation to choose sides.
## Information Doesn’t Deserve to Be Free
“Information wants to be free.” So goes the saying. Stewart Brand, the founder
of the Whole Earth Catalog, seems to have said it first.
I say that information doesn’t deserve to be free.
Cybernetic totalists love to think of the stuff as if it were alive and had its
own ideas and ambitions. But what if information is inanimate? What if it’s
even less than inanimate, a mere artifact of human thought? What if only humans
are real, and information is not?
Of course, there is a technical use of the term “information” that refers to
something entirely real. This is the kind of information that’s related to
entropy. But that fundamental kind of information, which exists independently
of the culture of an observer, is not the same as the kind we can put in
computers, the kind that supposedly wants to be free.
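(My note, not the book's: the "technical" sense of information he gestures at is Shannon's entropy. A minimal sketch, with the example distributions invented by me:)

```python
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution, H = -sum(p * log2(p)).
    This is the observer-independent sense of 'information' the passage
    distinguishes from the cultural, decodable kind."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin flip
print(shannon_entropy([1.0]))        # 0.0 bits: a certainty carries nothing
```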
Information is alienated experience.
You can think of culturally decodable information as a potential form of
experience, very much as you can think of a brick resting on a ledge as storing
potential energy. When the brick is prodded to fall, the energy is revealed.
That is only possible because it was lifted into place at some point in the
past.
In the same way, stored information might cause experience to be revealed if it
is prodded in the right way. A file on a hard disk does indeed contain
information of the kind that objectively exists. The fact that the bits are
discernible instead of being scrambled into mush—the way heat scrambles
things—is what makes them bits.
But if the bits can potentially mean something to someone, they can only do so
if they are experienced. When that happens, a commonality of culture is enacted
between the storer and the retriever of the bits. Experience is the only
process that can de-alienate information.
Information of the kind that purportedly wants to be free is nothing but a
shadow of our own minds, and wants nothing on its own. It will not suffer if it
doesn’t get what it wants.
But if you want to make the transition from the old religion, where you hope
God will give you an afterlife, to the new religion, where you hope to become
immortal by getting uploaded into a computer, then you have to believe
information is real and alive. So for you, it will be important to redesign
human institutions like art, the economy, and the law to reinforce the
perception that information is alive. You demand that the rest of us live in
your new conception of a state religion. You need us to deify information to
reinforce your faith.
## The Apple Falls Again
It’s a mistake with a remarkable origin. Alan Turing articulated it, just
before his suicide.
Turing’s suicide is a touchy subject in computer science circles. There’s an
aversion to talking about it much, because we don’t want our founding father to
seem like a tabloid celebrity, and we don’t want his memory trivialized by the
sensational aspects of his death.
The legacy of Turing the mathematician rises above any possible sensationalism.
His contributions were supremely elegant and foundational. He gifted us with
wild leaps of invention, including much of the mathematical underpinnings of
digital computation. The highest award in computer science, our Nobel Prize, is
named in his honor.
Turing the cultural figure must be acknowledged, however. The first thing to
understand is that he was one of the great heroes of World War II. He was the
first “cracker,” a person who uses computers to defeat an enemy’s security
measures. He applied one of the first computers to break a Nazi secret code,
called Enigma, which Nazi mathematicians had believed was unbreakable. Enigma
was decoded by the Nazis in the field using a mechanical device about the size
of a cigar box. Turing reconceived it as a pattern of bits that could be
analyzed in a computer, and cracked it wide open. Who knows what world we would
be living in today if Turing had not succeeded?
The second thing to know about Turing is that he was gay at a time when it was
illegal to be gay. British authorities, thinking they were doing the most
compassionate thing, coerced him into a quack medical treatment that was
supposed to correct his homosexuality. It consisted, bizarrely, of massive
infusions of female hormones.
In order to understand how someone could have come up with that plan, you have
to remember that before computers came along, the steam engine was a preferred
metaphor for understanding human nature. All that sexual pressure was building
up and causing the machine to malfunction, so the opposite essence, the female
kind, ought to balance it out and reduce the pressure. This story should serve
as a cautionary tale. The common use of computers, as we understand them today,
as sources for models and metaphors of ourselves is probably about as reliable
as the use of the steam engine was back then.
Turing developed breasts and other female characteristics and became terribly
depressed. He committed suicide by lacing an apple with cyanide in his lab and
eating it. Shortly before his death, he presented the world with a spiritual
idea, which must be evaluated separately from his technical achievements. This
is the famous Turing test. It is extremely rare for a genuinely new spiritual
idea to appear, and it is yet another example of Turing’s genius that he came
up with one.
Turing presented his new offering in the form of a thought experiment, based on
a popular Victorian parlor game. A man and a woman hide, and a judge is asked
to determine which is which by relying only on the texts of notes passed back
and forth.
Turing replaced the woman with a computer. Can the judge tell which is the man?
If not, is the computer conscious? Intelligent? Does it deserve equal rights?
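(A toy sketch of the protocol's shape, mine rather than anything in the book; the questions, canned replies, and judging rule are all invented for illustration:)

```python
import random

def imitation_game(ask_human, ask_machine, judge_guess):
    """Sketch of the test's shape: a judge sees only two anonymous
    transcripts and must name which hidden party is the machine."""
    labels = ["A", "B"]
    random.shuffle(labels)                         # hide who is who
    hidden = {labels[0]: ask_human, labels[1]: ask_machine}
    transcript = {"A": [], "B": []}
    for question in ["Are you conscious?", "What is 7 x 8?", "Tell me a joke."]:
        for label, respond in hidden.items():
            transcript[label].append(respond(question))
    guess = judge_guess(transcript)                # judge names the machine
    return hidden[guess] is ask_machine            # True: illusion failed

print(imitation_game(
    ask_human=lambda q: "hmm, give me a second...",
    ask_machine=lambda q: "QUERY RECEIVED. PROCESSING.",
    judge_guess=lambda t: "A" if "PROCESSING" in " ".join(t["A"]) else "B",
))
```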
It’s impossible for us to know what role the torture Turing was enduring at the
time played in his formulation of the test. But it is undeniable that one of
the key figures in the defeat of fascism was destroyed, by our side, after the
war, because he was gay. No wonder his imagination pondered the rights of
strange creatures.
When Turing died, software was still in such an early state that no one knew
what a mess it would inevitably become as it grew. Turing imagined a pristine,
crystalline form of existence in the digital realm, and I can imagine it might
have been a comfort to imagine a form of life apart from the torments of the
body and the politics of sexuality. It’s notable that it is the woman who is
replaced by the computer, and that Turing’s suicide echoes Eve’s fall.
[...]
But the Turing test cuts both ways. You can’t tell if a machine has gotten
smarter or if you’ve just lowered your own standards of intelligence to such a
degree that the machine seems smart. If you can have a conversation with a
simulated person presented by an AI program, can you tell how far you’ve let
your sense of personhood degrade in order to make the illusion work for you?
People degrade themselves in order to make machines seem smart all the time.
Before the crash, bankers believed in supposedly intelligent algorithms that
could calculate credit risks before making bad loans. We ask teachers to teach
to standardized tests so a student will look good to an algorithm. We have
repeatedly demonstrated our species’ bottomless ability to lower our standards
to make information technology look good. Every instance of intelligence in a
machine is ambiguous.
[...]
Wikipedia, for instance, works on what I call the Oracle illusion, in which
knowledge of the human authorship of a text is suppressed in order to give the
text superhuman validity. Traditional holy books work in precisely the same way
and present many of the same problems.
[...]
Or it might turn out that a distinction will forever be based on principles we
cannot manipulate. This might involve types of computation that are unique to
the physical brain, maybe relying on forms of causation that depend on
remarkable and nonreplicable physical conditions. Or it might involve software
that could only be created by the long-term work of evolution, which cannot be
reverse-engineered or mucked with in any accessible way. Or it might even
involve the prospect, dreaded by some, of dualism, a reality for consciousness
as apart from mechanism.
## Wikified Biology
Dyson equates the beginnings of life on Earth with the Eden of Linux. Back when
life first took hold, genes flowed around freely; genetic sequences skipped
from organism to organism in much the way they may soon be able to on the
internet. In his article, Freeman derides the first organism that hoarded its
genes behind a protective membrane as “evil,” just like the nemesis of the
open-software movement, Bill Gates.
Once organisms became encapsulated, they isolated themselves into distinct
species, trading genes only with others of their kind. Freeman suggests that
the coming era of synthetic biology will be a return to Eden.
I suppose amateurs, robots, and an aggregation of amateurs and robots might
someday hack genes in the global garage and tweet DNA sequences around the
globe at light speed. Or there might be a slightly more sober process that
takes place between institutions like high schools and start-up companies.
However it happens, species boundaries will become defunct, and genes will fly
about, resulting in an orgy of creativity. Untraceable multitudes of new
biological organisms will appear as frequently as new videos do on YouTube
today.
One common response to suggestions that this might happen is fear. After all,
it might take only one doomsday virus produced in one garage to bring the
entire human story to a close. I will not focus directly on that concern, but,
instead, on whether the proposed style of openness would even bring about the
creation of innovative creatures.
The alternative to wide-open development is not necessarily evil. My guess is
that a poorly encapsulated communal gloop of organisms lost out to closely
guarded species on the primordial Earth for the same reason that the Linux
community didn’t come up with the iPhone: encapsulation serves a purpose.
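(The metaphor runs from software back to biology, so here is my own minimal sketch of it in its home domain; the class and names are invented:)

```python
# An open "gloop": shared state that any code anywhere may rewrite.
gloop = {"genome": "GATTACA"}
gloop["genome"] = "XXXXXXX"          # nothing stops this mutation

# An encapsulated organism: state behind a membrane, reachable only
# through a narrow interface. (Python privacy is just a convention,
# but the shape of the idea is the point.)
class Cell:
    def __init__(self, genome):
        self._genome = genome        # guarded, not globally exposed

    def express(self):
        return f"protein built from {self._genome}"

cell = Cell("GATTACA")
print(cell.express())                # interaction only via the interface
```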
[...]
Wikipedia has already been elevated into what might be a permanent niche. It
might become stuck as a fixture, like MIDI or the Google ad exchange services.
That makes it important to be aware of what you might be missing. Even in a
case in which there is an objective truth that is already known, such as a
mathematical proof, Wikipedia distracts from the potential for learning how to bring
it into the conversation in new ways. Individual voice—the opposite of
wikiness—might not matter to mathematical truth, but it is the core of
mathematical communication.
## The Culture of Computationalism
For lack of a better word, I call it computationalism. This term is usually
used more narrowly to describe a philosophy of mind, but I’ll extend it to
include something like a culture. A first pass at a summary of the underlying
philosophy is that the world can be understood as a computational process, with
people as subprocesses.
[...]
In a scientific role, I don’t recoil from the idea that the brain is a kind of
computer, but there is more than one way to use computation as a source of
models for human beings. I’ll discuss three common flavors of computationalism
and then describe a fourth flavor, the one that I prefer. Each flavor can be
distinguished by a different idea about what would be needed to make software
as we generally know it become more like a person.
One flavor is based on the idea that a sufficiently voluminous computation will
take on the qualities we associate with people—such as, perhaps, consciousness.
One might claim Moore’s law is inexorably leading to superbrains, superbeings,
and, perhaps, ultimately, some kind of global or even cosmic consciousness. If
this language sounds extreme, be aware that this is the sort of rhetoric you
can find in the world of Singularity enthusiasts and extropians.
[...]
A second flavor of computationalism holds that a computer program with specific
design features—usually related to self-representation and circular
references—is similar to a person. Some of the figures associated with this
approach are Daniel Dennett and Douglas Hofstadter, though each has his own
ideas about what the special features should be.
Hofstadter suggests that software that includes a “strange loop” bears a
resemblance to consciousness. In a strange loop, things are nested within
things in such a way that an inner thing is the same as an outer thing.
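(A tiny concrete illustration of mine, not Hofstadter's: a data structure whose inner thing is literally the same object as the outer thing:)

```python
# A list that contains itself: descend as many levels as you like and
# you arrive back at the thing you started from.
loop = []
loop.append(loop)

assert loop[0] is loop           # the inner thing IS the outer thing
assert loop[0][0][0] is loop     # at any depth
print(loop)                      # Python renders the cycle as [[...]]
```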
[...]
A third flavor of computationalism is found in web 2.0 circles. In this case,
any information structure that can be perceived by some real human to also be a
person is a person. This idea is essentially a revival of the Turing test. If
you can perceive the hive mind to be recommending music to you, for instance,
then the hive is effectively a person.
[...]
The approach to thinking about people computationally that I prefer, on those
occasions when such thinking seems appropriate to me, is what I’ll call
“realism.” The idea is that humans, considered as information systems, weren’t
designed yesterday, and are not the abstract playthings of some higher being,
such as a web 2.0 programmer in the sky or a cosmic Spore player. Instead, I
believe humans are the result of billions of years of implicit, evolutionary
study in the school of hard knocks. The cybernetic structure of a person has
been refined by a very large, very long, and very deep encounter with physical
reality.
### From Images to Odors
For twenty years or so I gave a lecture introducing the fundamentals of virtual
reality. I’d review the basics of vision and hearing as well as of touch and
taste. At the end, the questions would begin, and one of the first ones was
usually about smell: Will we have smells in virtual reality machines anytime
soon?
Maybe, but probably just a few. Odors are fundamentally different from images
or sounds. The latter can be broken down into primary components that are
relatively straightforward for computers—and the brain—to process. The visible
colors are merely words for different wavelengths of light. Every sound wave is
actually composed of numerous sine waves, each of which can be easily described
mathematically.
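(A quick sketch of the decomposition he means: a discrete Fourier transform recovering the sine components of an invented two-tone wave. Assumes NumPy; the frequencies are placeholders of mine:)

```python
import numpy as np

rate = 8000                                  # samples per second
t = np.arange(rate) / rate                   # one second of time points
wave = 0.6 * np.sin(2 * np.pi * 440 * t) + 0.4 * np.sin(2 * np.pi * 660 * t)

spectrum = np.fft.rfft(wave)                 # sound -> sine components
freqs = np.fft.rfftfreq(len(wave), d=1 / rate)

# The two strongest components recover the 440 Hz and 660 Hz sines.
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(peaks))                         # -> [440.0, 660.0]
```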
[...]
Odors are completely different, as is the brain’s method of sensing them. Deep
in the nasal passage, shrouded by a mucous membrane, sits a patch of tissue—the
olfactory epithelium—studded with neurons that detect chemicals. Each of these
neurons has cup-shaped proteins called olfactory receptors. When a particular
molecule happens to fall into a matching receptor, a neural signal is triggered
that is transmitted to the brain as an odor. A molecule too large to fit into
one of the receptors has no odor. The number of distinct odors is limited only
by the number of olfactory receptors capable of interacting with them. Linda
Buck of the Fred Hutchinson Cancer Research Center and Richard Axel of Columbia
University, winners of the 2004 Nobel Prize in Physiology or Medicine, have
found that the human nose contains about one thousand different types of
olfactory neurons, each type able to detect a particular set of chemicals.
This adds up to a profound difference in the underlying structure of the
senses—a difference that gives rise to compelling questions about the way we
think, and perhaps even about the origins of language. There is no way to
interpolate between two smell molecules. True, odors can be mixed together to
form millions of scents. But the world’s smells can’t be broken down into just
a few numbers on a gradient; there is no “smell pixel.” Think of it this way:
colors and sounds can be measured with rulers, but odors must be looked up in a
dictionary.
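(The rulers-versus-dictionary contrast, sketched as code; my illustration, with the molecules and labels as placeholders:)

```python
# Colors and sounds live on gradients you can measure and interpolate:
def midpoint_wavelength(nm_a, nm_b):
    return (nm_a + nm_b) / 2     # halfway between two colors is a color

print(midpoint_wavelength(480, 620))      # 550.0 nm: a greenish hue

# Odors behave like dictionary entries: a molecule either fits a
# receptor or it doesn't, and there is no "smell pixel" in between.
odors = {"limonene": "citrus", "vanillin": "vanilla"}
print(odors.get("limonene/vanillin midpoint", "no such interpolation"))
```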
[...]
To solve the problem of olfaction—that is, to make the complex world of smells
quickly identifiable—brains had to have evolved a specific type of neural
circuitry, Jim believes. That circuitry, he hypothesizes, formed the basis for
the cerebral cortex—the largest part of our brain, and perhaps the most
critical in shaping the way we think. For this reason, Jim has proposed that
the way we think is fundamentally based in the olfactory.
[...]
He often refers to the olfactory parts of the brain as the “Old Factory,” as
they are remarkably similar across species, which suggests that the structure
has ancient origins.
## Editing Is Sexy; Creativity Is Natural
These experiments in linguistic variety could also inspire a better
understanding of how language came about in the first place. One of Charles
Darwin’s most compelling evolutionary speculations was that music might have
preceded language. He was intrigued by the fact that many species use song for
sexual display and wondered if human vocalizations might have started out that
way too. It might follow, then, that vocalizations could have become varied and
complex only later, perhaps when song came to represent actions beyond mating
and such basics of survival.
[...]
Terry offered an unconventional solution to the mystery of Bengalese finch
musicality. What if there are certain traits, including song style, that
naturally tend to become less constrained from generation to generation but are
normally held in check by selection pressures? If the pressures go away,
variation should increase rapidly. Terry suggested that the finches developed a
wider song variety not because it provided an advantage but merely because in
captivity it became possible.
In the wild, songs probably had to be rigid in order for mates to find each
other. Birds born with a genetic predilection for musical innovation most
likely would have had trouble mating. Once finches experienced the luxury of
assured mating (provided they were visually attractive), their song variety
exploded.
Brian Ritchie and Simon Kirby of the University of Edinburgh worked with Terry
to simulate bird evolution in a computer model, and the idea worked well, at
least in a virtual world. Here is yet another example of how science becomes
more like storytelling as engineering becomes able to represent some of the
machinery of formerly subjective human activities.
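(A toy version of that kind of model, mine rather than the Ritchie/Kirby simulation, with all parameters invented: song style drifts by mutation each generation, and is either pulled back by a mating filter or left free to wander:)

```python
import random

def song_spread(generations=200, pop=100, selective=True, seed=1):
    """Standard deviation of song 'style' after generations of drift,
    with or without a mating filter favoring the canonical song."""
    rng = random.Random(seed)
    songs = [0.0] * pop
    for _ in range(generations):
        songs = [s + rng.gauss(0, 0.1) for s in songs]      # innovation
        if selective:
            survivors = [s for s in songs if abs(s) < 0.5]  # found mates
            songs = [rng.choice(survivors) for _ in range(pop)]
    mean = sum(songs) / pop
    return (sum((s - mean) ** 2 for s in songs) / pop) ** 0.5

print("wild (selection on):  ", round(song_spread(selective=True), 2))
print("caged (selection off):", round(song_spread(selective=False), 2))
```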
## Metaphors
One reason the metaphor of the sun fascinates me is that it bears on a conflict
that has been at the heart of information science since its inception: Can
meaning be described compactly and precisely, or is it something that can
emerge only in approximate form based on statistical associations between large
numbers of components?
Mathematical expressions are compact and precise, and most early computer
scientists assumed that at least part of language ought to display those
qualities too.
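(The two poles, sketched side by side; my illustration, with a toy invented corpus: an exact rule versus a crude co-occurrence statistic:)

```python
from collections import Counter
from math import sqrt

# Compact and precise: meaning captured by an exact rule.
def is_prime(n):
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# Approximate and emergent: meaning as statistical association across
# many components, here a crude co-occurrence similarity.
corpus = ["the sun gives light", "the lamp gives light", "the sun is a star"]

def context(word):
    return Counter(w for line in corpus if word in line.split()
                     for w in line.split() if w != word)

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm

print(is_prime(97))                             # exactly, provably True
print(cosine(context("sun"), context("lamp")))  # merely suggestively similar
```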
## Future Humors
Unfortunately, we don’t have access at this time to a single philosophy that
makes sense for all purposes, and we might never find one. Treating people as
nothing other than parts of nature is an uninspired basis for designing
technologies that embody human aspirations. The inverse error is just as
misguided: it’s a mistake to treat nature as a person. That is the error that
yields confusions like intelligent design.
[...]
Those who enter into the theater of computationalism are given all the mental
solace that is usually associated with traditional religions. These include
consolations for metaphysical yearnings, in the form of the race to climb to
ever more “meta” or higher-level states of digital representation, and even a
colorful eschatology, in the form of the Singularity. And, indeed, through the
Singularity a hope of an afterlife is available to the most fervent believers.
## My Brush with Bachelardian Neoteny in the Most Interesting Room in the World
But actually, because of homuncular flexibility, any part of reality might just
as well be a part of your body if you happen to hook up the software elements
so that your brain can control it easily. Maybe if you wiggle your toes, the
clouds in the sky will wiggle too. Then the clouds would start to feel like
part of your body. All the items of experience become more fungible than in the
physical world. And this leads to the revelatory experience.
## Final Words
For me, the prospect of an entirely different notion of communication is more
thrilling than a construction like the Singularity. Any gadget, even a big one
like the Singularity, gets boring after a while. But a deepening of meaning is
the most intense potential kind of adventure available to us.