[[!meta title="Who owns the future?"]]
* Author: Jaron Lanier
* Year: 2013
* Publisher: Simon & Schuster
## Index
* Star system versus the bell curve as network designs.
* Siren Servers: narcissism, hyperamplified risk aversion, and extreme information asymmetry.
* Siren Servers and Maxwell’s Demon.
* Disruptive innovation as the tedious scheme to shrink markets.
* Science isn't automatic.
* Nine dismal humors of futurism, and a hopeful one.
* Marx as one of the first technology writers (when discussing Luddites).
* Human obsolescence is avoidable.
* Keynes Considered as a Big Data Pioneer.
* Amazon's Mechanical Turk.
* Humanistic information economics.
* What is experience? If personal experience were missing from the universe, how would things be different?
* Gurus and New Age in Silicon Valley: Gurdjieff, Steve Jobs.
## Prelude
Instagram isn’t worth a billion dollars just because those thirteen employees
are extraordinary. Instead, its value comes from the millions of users who
contribute to the network without being paid for it. Networks need a great
number of people to participate in them to generate significant value. But when
they have them, only a small number of people get paid. That has the net effect
of centralizing wealth and limiting overall economic growth.
[...]
By “digital networking” I mean not only the Internet and the Web, but also
other networks operated by outfits like financial institutions and intelligence
agencies. In all these cases, we see the phenomenon of power and money becoming
concentrated around the people who operate the most central computers in a
network, undervaluing everyone else. That is the pattern we have come to
expect, but it is not the only way things can go.
## The Price of Heaven
Utopians presume the advent of abundance not because it will be affordable, but
because it will be free, provided we accept surveillance.
Starting back in the early 1980s, an initially tiny stratum of gifted
technologists conceived new interpretations of concepts like privacy, liberty,
and power. I was an early participant in the process and helped to formulate
many of the ideas I am criticizing in this book. What was once a tiny
subculture has blossomed into the dominant interpretation of computation and
software-mediated society.
One strain of what might be called “hacker culture” held that liberty means
absolute privacy through the use of cryptography. I remember the thrill of
using military-grade stealth just to argue about who should pay for a pizza at
MIT in 1983 or so.
On the other hand, some of my friends from that era, who consumed that pizza,
eventually became very rich building giant cross-referenced dossiers on masses
of people, which were put to use by financiers, advertisers, insurers, or other
concerns nurturing fantasies of operating the world by remote control.
It is typical of human nature to ignore hypocrisy. The greater a hypocrisy, the
more invisible it typically becomes, but we technical folk are inclined to seek
an airtight whole of ideas. Here is one such synthesis—of cryptography for
techies and massive spying on others—which I continue to hear fairly often:
Privacy for ordinary people can be forfeited in the near term because it will
become moot anyway.
Surveillance by the technical few on the less technical many can be tolerated
for now because of hopes for an endgame in which everything will become
transparent to everyone. Network entrepreneurs and cyber-activists alike seem
to imagine that today’s elite network servers in positions of information
supremacy will eventually become eternally benign, or just dissolve.
Bizarrely, the endgame utopias of even the most ardent high-tech libertarians
always seem to take socialist turns. The joys of life will be too cheap to
meter, we imagine. So abundance will go ambient.
This is what diverse cyber-enlightened business concerns and political groups
all share in common, from Facebook to WikiLeaks. Eventually, they imagine,
there will be no more secrets, no more barriers to access; all the world will
be opened up as if the planet were transformed into a crystal ball. In the
meantime, those true believers encrypt their servers even as they seek to
gather the rest of the world’s information and find the best way to leverage
it.
It is all too easy to forget that “free” inevitably means that someone else
will be deciding how you live.
## Just Blurt the Idea Out
So we begin with the simple question of how to design digital networks to
deliver more help than harm in aligning human intention to meet great
challenges. A starting point for an answer can be summarized: “Digital
information is really just people in disguise.”
### Aristotle frets
Aristotle directly addressed the role of people in a hypothetical high-tech
world: If every instrument could accomplish its own work, obeying or
anticipating the will of others, like the statues of Daedalus, or the tripods
of Hephaestus, which, says the poet, of their own accord entered the assembly
of the Gods; if, in like manner, the shuttle would weave and the plectrum touch
the lyre without a hand to guide them, chief workmen would not want servants,
nor masters slaves.1
At this ancient date, a number of possibilities were at least slightly visible
to Aristotle’s imagination. One was that the human condition was in part a
function of what machines could not do. Another was that it was possible to
imagine, at least hypothetically, that machines could do more. The synthesis
was also conceived: Better machines could free and elevate people, even slaves.
If we could show Aristotle the technology of our times, I wonder what he would
make of the problem of unemployment. Would he take Marx’s position that better
machines create an obligation (to be carried out by political bodies) to
provide care and dignity to people who no longer need to work? Or would
Aristotle say, “Kick the unneeded ones out of town. The polis is only for the
people who own the machines, or do what machines still cannot do.” Would he
stand by idly as Athens was eventually depopulated?
I’d like to think the best of Aristotle, and assume he would realize that both
choices are bogus; machine autonomy is nothing but theater. Information needn’t
be thought of as a freestanding thing, but rather as a human product. It is
entirely legitimate to understand that people are still needed and valuable
even when the loom can run without human muscle power. It is still running on
human thought.
[...]
Note: How prescient that Aristotle chose musical instruments and looms as his
examples for machines that might one day operate automatically! These two types
of machines did indeed turn out to be central to the prehistory of computation.
The Jacquard programmable loom helped inspire calculating engines, while music
theory and notation helped further the concept of abstract computation, as when
Mozart wrote algorithmic, nondeterministic music incorporating dice throws.
Both developments occurred around the turn of the 19th century.
[...]
Aristotle seems to want to escape the burden of accommodating lesser people.
His quote about self-operating lutes and looms could be interpreted as a
daydream that better technology will free us to some degree from having to deal
with one another.
It’s not as if everyone wanted to be closer to all of humanity when cities
first formed. Athens was a necessity first, and a luxury second. No one wants
to accommodate the diversity of strangers. People deal with each other
politically because the material advantages are compelling. We find relative
safety and sustenance in numbers. Agriculture and armies happened to work
better as those enterprises got bigger, and cities built walls.
But in Aristotle’s words you get a taste of what a nuisance it can be to
accommodate others. Something was lost with the advent of the polis, and we
still dream of getting it back.
[...]
The reward for a Roman general, upon retiring after years of combat, was a plot
of land he could farm for himself. To be left alone, to be able to live off the
land with the illusion of no polis to bug you, that was the dream. The American
West offered that dream again, and still loathes giving it up. Justice Louis
Brandeis famously defined privacy as the “right to be left alone.”
In every case, however, abundance without politics was an illusion that could
only be sustained in temporary bubbles, supported by armies. The ghosts of the
losers haunt every acre of easy abundance. The greatest beneficiaries of
civilization use all their power to create a temporary illusion of freedom from
politics. The rich live behind gates, not just to protect themselves, but to
pretend to not need anyone else, if only for a moment. In Aristotle’s quote, we
find the earliest glimmer of the hope that technological advancement could
replace territorial conquest as a way of implementing an insulating bubble
around a person.
[...]
People naturally seek the benefits of society, meaning the accommodation of
strangers, while avoiding direct vulnerabilities to specific others as much as
possible. This is a clichéd criticism of the online culture of the moment.
People have thousands of “friends” and yet stare at a little screen when in the
proximity of other people. As it was in Athens, so it is online.
## Money
Money might have begun as a mnemonic counter for assets you couldn’t keep under
direct observation, like wandering sheep. A stone per sheep, so the shepherd
would be confident all had been reunited after a day at pasture. In other
words, artifacts took on information storage duties.
[...]
Ancient money was information storage that represented events in the past. To
the ears of many a financier, at this early stage “money” had not been born
yet, only accounting. That kind of money can be called “past-oriented money.”
## Noise and luck
Consider the problem of noise, or what is known as luck in human affairs.
[...]
And yet the rewards of winning and losing are vastly different. While some
critics might have aesthetic or ethical objections to winner-take-all outcomes,
a mathematical problem with them is that noise is amplified. Therefore, if a
societal system depends too much on winner-take-all contests, then the acuity
of that system will suffer. It will become less reality-based.
When a bell curve distribution is appreciated as a bell curve instead of as a
winner-take-all distribution, then noise, luck, and conceptual ambiguity aren’t
amplified. It makes statistical sense to talk about average intelligence or
high intelligence, but not to identify the single most intelligent person.
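A minimal simulation can make the noise-amplification point concrete (this is my sketch, not the book's; the contestant count and the sizes of the skill and luck terms are arbitrary assumptions). When many contestants of nearly equal ability compete and outcomes include luck, the identity of the single winner churns from trial to trial, while the population average barely moves:

```python
# Minimal sketch (not from the book): why winner-take-all contests amplify noise.
# Assumptions: 1000 contestants of nearly equal skill, and a luck term larger
# than the skill differences. The sizes are arbitrary.
import random
import statistics

random.seed(0)

N_CONTESTANTS = 1000
N_TRIALS = 200

skill = [random.gauss(100, 5) for _ in range(N_CONTESTANTS)]      # true ability
winners, averages = [], []

for _ in range(N_TRIALS):
    performance = [s + random.gauss(0, 15) for s in skill]        # ability + luck
    winners.append(max(range(N_CONTESTANTS), key=lambda i: performance[i]))
    averages.append(statistics.mean(performance))

# The average is stable across trials; the single winner is mostly luck.
print("distinct winners across trials:", len(set(winners)))
print("std. dev. of the trial averages:", round(statistics.stdev(averages), 2))
```

Crowning exactly one winner mostly ranks the noise; statistics over the whole curve stay reality-based.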
## Letting Bell Curves Be Bell Curves
In a star system, the top players are rewarded tremendously, while almost
everyone else—facing in our era an ever-larger, more global body of competitive
peers—is driven toward poverty (because of competition or perhaps automation).
## Absolutism
Being an absolutist is a certain way to become a failed technologist.
Markets are an information technology. A technology is useless if it can’t be
tweaked. If market technology can’t be fully automatic and needs some
“buttons,” then there’s no use in trying to pretend otherwise. You don’t stay
attached to poorly performing quests for perfection. You fix bugs.
## The Taste of Politics
Despite my favorable regard for organized labor, for the purposes of this book
I have to focus somewhat on certain failings. The problems of interest to me
are not really with the labor movement, but with the nature of levees. What
might be called “upper-class levees,” like exclusive investment funds, have
been known to blur into Ponzi schemes or other criminal enterprises, and the
same pattern exists for levees at all levels.
Levees are more human than algorithmic, and that is not an entirely good thing.
Whether for the rich or the middle class, levees are inevitably a little
conspiratorial, and conspiracy naturally attracts corruption. Criminals easily
exploited certain classic middle-class levees; the mob famously infiltrated
unions and repurposed music royalties as a money-laundering scheme.
Levees are a rejection of unbridled algorithm and an insertion of human will
into the flow of capital. Inevitably, human oversight brings with it all the
flaws of humans. And yet despite their rough and troubled nature, antenimbosian
levees worked well enough to preserve middle classes despite the floods,
storms, twisters, and droughts of a world contoured by finance. Without our
system of levees, rising like a glimmering bell-curved mountain of rice
paddies, capitalism would probably have decayed into Marx’s “attractor
nightmare” in which markets decay into plutocracy.
## A First Pass at a Definition
A Siren Server, as I will refer to such a thing, is an elite computer,
or coordinated collection of computers, on a network. It is
characterized by narcissism, hyperamplified risk aversion, and extreme
information asymmetry. It is the winner of an all-or-nothing contest,
and it inflicts smaller all-or-nothing contests on those who interact
with it.
Siren Servers gather data from the network, often without having to pay
for it. The data is analyzed using the most powerful available
computers, run by the very best available technical people. The results
of the analysis are kept secret, but are used to manipulate the rest of
the world to advantage.
That plan will always eventually backfire, because the rest of the world
cannot indefinitely absorb the increased risk, cost, and waste dispersed
by a Siren Server. Homer sternly warned sailors to not succumb to the
call of the sirens, and yet was entirely complacent about Hephaestus’s
golden female robots. But Sirens might be even more dangerous in
inorganic form, because it is then that we are really most looking at
ourselves in disguise. It is not the siren who harms the sailor, but the
sailor’s inability to think straight. So it is with us and our machines.
Siren Servers are fated by their nature to sow illusions. They are
cousins to another seductive literary creature, star of the famous
thought experiment known as Maxwell’s Demon, after the great 19th
century physicist James Clerk Maxwell. The demon is an imaginary
creature that, if it could only exist, would be able to implement a
perpetual motion machine and perform other supernatural tricks.
Maxwell’s Demon might be stationed at a tiny door separating two
chambers filled with water or air. It would only allow hot molecules to
pass one way, and cold molecules to pass in the opposite direction.
After a while, one side would be hot and the other cold, and you could
let them mix again, rushing together so quickly that the stream could
run a generator. In that way, the tiny act of discriminating between hot
and cold would produce infinite energy, because you could repeat the
process forever.
The reason Maxwell’s Demon cannot exist is that it does take resources
to perform an act of discrimination. We imagine computation is free, but
it never is. The very act of choosing which particle is cold or hot
itself becomes an energy drain and a source of waste heat. The principle
is also known as “no free lunch.”
We do our best to implement Maxwell’s Demon whenever we manipulate
reality with our technologies, but we can never do so perfectly; we
certainly can’t get ahead of the game, which is known as entropy. All
the air conditioners in a city emit heat that makes the city hotter
overall. While you can implement what seems to be a Maxwell’s Demon if
you don’t look too far or too closely, in the big picture you always
lose more than you gain.
Every bit in a computer is a wannabe Maxwell’s Demon, separating the
state of “one” from the state of “zero” for a while, at a cost. A
computer on a network can also act like a wannabe demon if it tries to
sort data from networked people into one or the other side of some
imaginary door, while pretending there is no cost or risk involved. For
instance, a Siren Server might allow only those who would be cheap to
insure through a doorway (to become insured) in order to make a
supernaturally ideal, low-risk insurance company. Such a scheme would
let high-risk people pass one way, and low-risk ones pass the other way,
in order to implement a phony perpetual motion machine out of a human
society. However, the uninsured would not cease to exist; rather, they
would add to the cost of the whole system, which includes the
people who run the Siren Server. A short-term illusion of risk reduction
would actually lead to increased risk in the longer term.
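The insurance example can be sketched in a few lines (my illustration, not the book's; the claim-cost distribution and the sorting overhead are invented for the demonstration). The "demonic" insurer covers only the cheap half of the population and looks supernaturally safe, but the excluded costs do not vanish, and the act of sorting adds its own cost to the whole system:

```python
# Minimal sketch (my illustration, not the book's): a "demonic" insurer sorts
# people by expected claim cost and covers only the cheap half. Its own book
# looks supernaturally safe, but the excluded costs do not disappear, and the
# sorting itself has a cost. All numbers are invented for the demonstration.
import random

random.seed(1)

people = [random.expovariate(1 / 2000) for _ in range(10_000)]   # expected claim cost per person
sorting_overhead = 50 * len(people)                               # cost of running the "demon" (assumed)

threshold = sorted(people)[len(people) // 2]                      # admit only the cheaper half
insured = [c for c in people if c <= threshold]
excluded = [c for c in people if c > threshold]

print("insurer's book (looks great):        ", round(sum(insured)))
print("cost pushed off the books:           ", round(sum(excluded)))
print("total borne by society, with sorting:", round(sum(people) + sorting_overhead))
```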
## Candy
The primary business of digital networking has come to be the creation of
ultrasecret mega-dossiers about what others are doing, and using this
information to concentrate money and power. It doesn’t matter whether the
concentration is called a social network, an insurance company, a derivatives
fund, a search engine, or an online store. It’s all fundamentally the same.
Whatever the intent might have been, the result is a wielding of digital
technology against the future of the middle class.
[...]
We loved the crazy cheap easy mortgages, motivated by crazed overleveraging. We
love the free music, enabled by crazed copying. We love cheap online prices,
offered by what would have once seemed like national intelligence agencies.
These newer spy services do not struggle on behalf of our security, but instead
figure out just how little payment everyone in the chain can be made to accept.
We are not benefiting from the benevolence of some artificial intelligence
superbeing. We are exploiting each other off the books while those
concentrating our information remain on the books. We love our treats but will
eventually discover we are depleting our own value.
That’s how we can have economic troubles despite there being so much wealth in
the system, and during a period of increasing efficiencies. Great fortunes are
being made on shrinking the economy instead of growing it. It’s not a result of
some evil scheme, but a side effect of an idiotic elevation of the fantasy that
technology is getting smart and standing on its own, without people.
## From Autocollate to Autocollude
It seems as though online services are bringing bargains to everyone, and yet
wealth disparity is increasing while social mobility is decreasing. If everyone
were getting better options, wouldn’t everyone be doing better as well?
## From the Customer’s Point of View
Wal-Mart confronted the ordinary shopper with two interesting pieces of news.
One was that stuff they wanted to buy got cheaper, which of course was great.
This news was delivered first, and caused cheering.
But there was another piece of news that emerged more gradually. It has often
been claimed that Wal-Mart plays a role in the reduction of employment
prospects for the very people who tend to be its customers.1 Wal-Mart has
certainly made the world more efficient in a certain sense. It moved
manufacturing to any spot in the world that could accomplish it at the very
lowest cost; it rewarded vendors willing to cut corners to the maximum degree.
[...]
All Siren Servers deliver dual messages similar to the pair pioneered by
Wal-Mart. On the one hand, “Good news! Treats await! Information systems have
made the world more efficient for you.”
On the other hand, a little later: “It turns out you, your needs, and your
expectations are not maximally efficient from the lofty point of view of our
server. Therefore, we are reshaping the world so that in the long term, your
prospects are being reduced.”
The initial benefits don’t remotely balance the long-term degradations.
Initially you made some money day trading or getting an insanely easy loan, or
saved some money couch-surfing or by using coupons from an Internet site, but
then came the pink slip, the eviction notice, and the halving of your savings
when the market drooped. Or you loved getting music for free, but then realized
that you couldn’t pursue a music career yourself because there were hardly any
middle-class, secure jobs left in what was once the music industry. Maybe you
loved the supercheap prices at your favorite store, but then noticed that the
factory you might have worked for closed up for good.
## Financial Siren Servers
The schemes were remarkably similar to Silicon Valley designs. A few of them
took as input everything they possibly could scrape from the Internet as well
as other, proprietary networks. As in Google’s data centers, stupendous
correlative algorithms would crunch on the whole ’net’s data overnight, looking
for correlations. Maybe a sudden increase in comments about mosquito bites
would cause an automatic, instant investment in a company that sold lotions.
Actually, that’s an artificially sensible example. The real examples made no
sense to humans. But money was made, and fairly reliably.
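A toy version of this kind of correlation mining (my sketch, not any actual trading system; it assumes Python 3.10+ for statistics.correlation, and every "signal" here is pure noise) shows why the discovered patterns need not make sense to humans. Scan enough unrelated series against a price series and some will correlate noticeably by chance alone:

```python
# Minimal sketch (my illustration, not any real trading system): scan thousands
# of unrelated "signals" for correlation with a series of price moves. With
# enough candidates, some correlate noticeably by chance alone.
# Assumes Python 3.10+ for statistics.correlation; every series is pure noise.
import random
import statistics

random.seed(2)

DAYS = 250
price_moves = [random.gauss(0, 1) for _ in range(DAYS)]

best_signal, best_corr = None, 0.0
for signal_id in range(5000):                        # e.g. scraped chatter volumes
    signal = [random.gauss(0, 1) for _ in range(DAYS)]
    c = statistics.correlation(signal, price_moves)  # Pearson correlation
    if abs(c) > abs(best_corr):
        best_signal, best_corr = signal_id, c

print(f"signal #{best_signal} 'predicts' the market, r = {best_corr:.2f}")
```

Nothing in the loop requires the winning signal to mean anything; it only requires that nobody else has already traded it away.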
Note: It should be pointed out that if only one Siren Server is milking a
particular fluctuation in this way, a reasonable argument could be made that a
service is being performed, in that the fluctuation reveals inefficiency, and
the Siren is canceling it out. However, when many Sirens milk the same
fluctuation, they lock into a feedback system with each other and inadvertently
conspire to milk the rest of the world to no purpose.
[...]
What is absolutely essential to a financial Siren Server, however, is a
superior information position. If everyone else knew what you were doing, they
could securitize you. If anyone could buy stock in a mathematical “sure thing”
scheme, then the benefits of it would be copied like a shared music file, and
spread out until it was nullified. So, in today’s world your mortgage can be
securitized in someone else’s secretive bunker, but you can’t know about the
bunker and securitize it. If it weren’t for that differential, the new kind of
sure thing wouldn’t exist.
## If Life Gives You EULAs, Make Lemonade
The information economy that we are currently building doesn’t really embrace
capitalism, but rather a new form of feudalism.
## Your Lack of Privacy Is Someone Else’s Wealth
Occasionally the rich embrace a new token and drive up its value. The fine art
market is a great example. Expensive art is essentially a private form of
currency traded among the very rich. The better an artist is at making art that
can function this way, the more valuable the art will become. Andy Warhol is
often associated with this trick, though Pablo Picasso and others were
certainly playing the same game earlier. The art has to be stylistically
distinct and available in suitable small runs. It becomes a private form of
money, as instantly recognizable as a hundred-dollar bill.
A related trend of our times is that troves of dossiers on the private lives
and inner beings of ordinary people, collected over digital networks, are
packaged into a new private form of elite money. The actual data in these
troves need not be valid. In fact, it might be better that it is not valid, for
actual knowledge brings liabilities.
## The Nature of Our Confusion
Our core illusion is that we imagine big data as a substance, like a natural
resource waiting to be mined. We use terms like data-mining routinely to
reinforce that illusion. Indeed some data is like that. Scientific big data,
like data about galaxy formation, weather, or flu outbreaks, can be gathered
and mined, just like gold, provided you put in the hard work.
But big data about people is different. It doesn’t sit there; it plays against
you. It isn’t like a view through a microscope, but more like a view of a
chessboard.
## The Most Elite Naïveté
As technology advances, Siren Servers will be ever more the objects of the
struggle for wealth and power, because they are the only links in the chain
that will not be commoditized. If present trends continue, you’ll always be
able to seek information supremacy, just as old-fashioned barons could struggle
for supremacy over land or natural resources. A new energy cycle will someday
make oil much less central to geopolitics, but the information system that
manages that new kind of energy could easily become an impregnable castle. The
illusory golden vase becomes more and more valuable.
### Mapping out where the conversation can go
An endgame for civilization has been foreseen since Aristotle. As technology
reaches heights of efficiency, civilization will have to find a way to resolve
a peculiar puzzle: What should the role of “extra” humans be if not everyone is
still strictly needed? Do the extra people—the ones whose roles have
withered—starve? Or get easy lives? Who decides? How?
The same core questions, stated in a multitude of ways, have elicited only a
small number of answers, because only a few are possible.
What will people be when technology becomes much more advanced? With each
passing year our abilities to act on our ideas are increased by technological
progress. Ideas matter more and more. The ancient conversations about where
human purpose is headed continue today, with rising implications.
Suppose that machines eventually gain sufficient functionality that one will be
able to say that a lot of people have become extraneous. This might take place
in nursing, pharmaceuticals, transportation, manufacturing, or in any other
imaginable field of employment.
The right question to then ask isn’t really about what should be done with the
people who used to perform the tasks now colonized by machines. By the time one
gets to that question, a conceptual mistake has already been made.
Instead, it has to be pointed out that outside of the spell of bad philosophy
human obsolescence wouldn’t in fact happen. The data that drives “automation”
has to ultimately come from people, in the form of “big data.” Automation can
always be understood as elaborate puppetry.
The most crucial quality of our response to very high-functioning machines,
artificial intelligences and the like, is how we conceive of the things that
the machines can’t do, and whether those tasks are considered real jobs for
people or not. We used to imagine that elite engineers would be automation’s
only puppeteers. It turns out instead that big data coming from vast numbers of
people is needed to make machines appear to be “automated.” Do the puppeteers
still get paid once the whole audience has joined their ranks?
## The Technology of Ambient Cheating
Siren Servers do what comes naturally due to the very idea of computation.
Computation is the demarcation of a little part of the universe, called a
computer, which is engineered to be very well understood and controllable, so
that it closely approximates a deterministic, non-entropic process. But in
order for a computer to run, the surrounding parts of the universe must take on
the waste heat, the randomness. You can create a local shield against entropy,
but your neighbors will always pay for it.
Note: A rare experimental machine called a “reversible” computer never forgets,
so that any computation can be run backward as well as forward. Such devices
run cool! This is an example of how thermodynamics and computation interact.
Reversible computers don’t radiate as much heat; forgetting radiates
randomness, which is the same thing as heating up the neighborhood.
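The physics behind this note can be stated as a one-line calculation: Landauer's principle puts a floor of k_B · T · ln 2 on the heat dissipated per erased (forgotten) bit, which is the floor a reversible computer avoids by never erasing. A minimal sketch (the temperature and the gigabyte-per-second figure are arbitrary choices for scale):

```python
# Sketch of the physics behind the note: Landauer's principle says erasing
# (forgetting) one bit must dissipate at least k_B * T * ln(2) of heat.
# A reversible computer that never forgets is not bound by this per-bit floor.
# The temperature and data rate below are arbitrary choices for scale.
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0                   # room temperature, kelvin

e_per_bit = k_B * T * math.log(2)
print(f"minimum heat per erased bit at {T:.0f} K: {e_per_bit:.2e} J")

bits_per_second = 8e9       # erasing one gigabyte per second
print(f"theoretical floor for erasing 1 GB/s: {e_per_bit * bits_per_second:.2e} W")
```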
## The Insanity of the Local/Global Flip
A Siren Server can become so successful—sometimes in the blink of an eye—that
it optimizes its environment—changes it—instead of changing in order to adapt
to the environment. A successful Siren Server no longer acts only as a player
within a larger system. Instead it becomes a central planner. This makes it
stupid, like a central planner in a communist regime.
## The Conservation of Free Will
A story must have actors, not automatons. Different people become more or less
like automatons in our Sirenic era.
Sirenic entrepreneurs intuitively cast free will—so long as it is their own—as
an ever more magical, elite, and “meta” quality of personhood. The entrepreneur
hopes to “dent the universe”* or achieve some other heroic, Nietzschean
validation. Ordinary people, however, who will be attached to the nodes of the
network created by the hero, will become more effectively mechanical.
[...]
We’re setting up barriers between cases where we choose to give over some
judgment to cloud software, as if we were predictable machines, and those where
we elevate our judgments to pious, absolute standards.
Making choices of where to place the barrier between ego and algorithm is
unavoidable in the age of cloud software. Drawing the line between what we
forfeit to calculation and what we reserve for the heroics of free will is the
story of our time.
## Rewarding and Punishing Network Effects
To understand how Siren Servers work, it’s useful to divide network effects
into those that are “rewarding” and those that are “punishing.” Siren Servers
gain dominance through rewarding network effects, but keep dominance through
punishing network effects.
## The Closing Act
Competition becomes mostly about who can out-meta whom, and only secondarily
about specialization.
[...]
Individual Siren Servers can die and yet the Siren Server pattern perseveres,
and it is that pattern that is the real problem. The systematic decoupling of
risk from reward in the rising information economy is the problem, not any
particular server.
## The limits of emergence as an explanation
But the problem with freestanding concentrations of power is that you never
know who will inherit them. If social networking has the power to synchronize
great crowds to dethrone a pharaoh, why might it not also coordinate lynchings
or pogroms?
[...]
The core ideal of the Internet is that one trusts people, and that given an
opportunity, people will find their way to be reasonably decent. I happily
restate my loyalty to that ideal. It’s all we have.
But the demonstrated capability of Facebook to effortlessly engage in mass
social engineering proves that the Internet as it exists today is not a
purists’ emergent system, as is so often claimed, but largely a top-down,
directed one.
[...]
We pretend that an emergent meta-human being is appearing in the computing
clouds—an artificial intelligence—but actually it is humans, the operators of
Siren Servers, pulling the levers.
[...]
The nuts and bolts of artificial-intelligence research can often be more
usefully interpreted without the concept of AI at all. For example, in 2011,
IBM scientists unveiled a “question answering” machine that is designed to play
the TV quiz show Jeopardy. Suppose IBM had dispensed with the theatrics, and
declared it had done Google one better and come up with a new phrase-based
search engine. This framing of exactly the same technology would have gained
IBM’s team as much (deserved) recognition as the claim of an artificial
intelligence, but it would also have educated the public about how such a
technology might actually be used most effectively.
AI technologies typically operate on a variation of the process described
earlier that accomplishes translations between languages. While innovation in
algorithms is vital, it is just as vital to feed algorithms with “big data”
gathered from ordinary people. The supposedly artificially intelligent result
can be understood as a mash-up of what real people did before. People have
answered a lot of questions before, and a multitude of these answers are
gathered up by the algorithms and regurgitated by the program. This in no way
denigrates it or proposes it isn’t useful. It is not, however, supernatural.
The real people from whom the initial answers were gathered deserve to be paid
for each new answer given by the machine.
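A cartoon of the "mash-up of what real people did before" view (my illustration, not IBM's system; the corpus, the contributors, and the word-overlap scoring are all hypothetical) is a lookup over past human answers that also remembers whose work is being reused, which is exactly the bookkeeping a payment scheme would need:

```python
# Minimal sketch (my illustration, not IBM's system): "question answering" as a
# mash-up of answers real people already gave. It retrieves the past question
# with the largest word overlap and regurgitates that answer, while remembering
# whose earlier work it reused. The corpus and scoring are entirely hypothetical.
from collections import Counter

PAST_ANSWERS = [
    ("alice", "what is the capital of france", "Paris"),
    ("bob", "who wrote the iliad", "Homer"),
    ("carol", "what is the boiling point of water in celsius", "100"),
]

def answer(question: str):
    """Return (answer, contributor) for the best-overlapping past question."""
    q_words = Counter(question.lower().split())

    def overlap(entry):
        _author, past_q, _past_a = entry
        return sum((q_words & Counter(past_q.split())).values())

    author, _, past_a = max(PAST_ANSWERS, key=overlap)
    return past_a, author

reply, contributor = answer("who actually wrote the iliad")
print(f"answer: {reply}  (derived from {contributor}'s earlier contribution)")
```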
[...]
What all this comes down to is that the very idea of artificial intelligence
gives us the cover to avoid accountability by pretending that machines can take
on more and more human responsibility. This holds for things that we don’t even
think of as artificial intelligence, like the recommendations made by Netflix
and Pandora. Seeing movies and listening to music suggested to us by algorithms
is relatively harmless, I suppose. But I hope that once in a while the users of
those services resist the recommendations; our exposure to art shouldn’t be
hemmed in by an algorithm that we merely want to believe predicts our tastes
accurately. These algorithms do not represent emotion or meaning, only
statistics and correlations.
What makes this doubly confounding is that while Silicon Valley might sell
artificial intelligence to consumers, our industry certainly wouldn’t apply the
same automated techniques to some of its own work. Choosing design features in
a new smartphone, say, is considered too consequential a game. Engineers don’t
seem quite ready to believe in their smart algorithms enough to put them up
against Apple’s late chief executive, Steve Jobs, or some other person with a
real design sensibility.
But the rest of us, lulled by the concept of ever-more intelligent AIs, are
expected to trust algorithms to assess our aesthetic choices, the progress of a
student, the credit risk of a homeowner or an institution. In doing so, we only
end up misreading the capability of our machines and distorting our own
capabilities as human beings. We must instead take responsibility for every
task undertaken by a machine and double-check every conclusion offered by an
algorithm, just as we always look both ways when crossing an intersection, even
though the signal has been given to walk.
When we think of computers as inert, passive tools instead of people, we are
rewarded with a clearer, less ideological view of what is going on—with the
machines and with ourselves. So, why, aside from the theatrical appeal to
consumers and reporters, must engineering results so often be presented in
Frankensteinian light?
The answer is simply that computer scientists are human, and are as terrified
by the human condition as anyone else. We, the technical elite, seek some way
of thinking that gives us an answer to death, for instance. This helps explain
the allure of a place like the Singularity University. The influential Silicon
Valley institution preaches a story that goes like this: One day in the
not-so-distant future, the Internet will suddenly coalesce into a
superintelligent AI, infinitely smarter than any of us individually and all of
us combined; it will become alive in the blink of an eye, and take over the
world before humans even realize what’s happening.
Some think the newly sentient Internet would then choose to kill us; others
think it would be generous and digitize us the way Google is digitizing old
books, so that we can live forever as algorithms inside the global brain. Yes,
this sounds like many different science fiction movies. Yes, it sounds nutty
when stated so bluntly. But these are ideas with tremendous currency in Silicon
Valley; these are guiding principles, not just amusements, for many of the most
influential technologists.
It should go without saying that we can’t count on the appearance of a
soul-detecting sensor that will verify that a person’s consciousness has been
virtualized and immortalized. There is certainly no such sensor with us today
to confirm metaphysical ideas about people. All thoughts about consciousness,
souls, and the like are bound up equally in faith, which suggests something
remarkable: What we are seeing is a new religion, expressed through an
engineering culture.