
Conference noted::sf

Title: Arcana Caelestia
Notice: Directory listings are in topic 2
Moderator: NETRIX::thomas
Created: Thu Dec 08 1983
Last Modified: Fri Jun 06 1997
Last Successful Update: Fri Jun 06 1997
Number of topics: 1300
Total number of notes: 18728

802.0. "Random question #4: Downloading yourself" by RICKS::REDFORD (Let's rathole this off-line) Wed Jun 14 1989 23:35

    As you're sitting in your cubicle at Digital in the year 2020, 
    telepathically scanning NOTES files, your personnel manager drops
    by.  "Ken," he says, "you're a real credit to the department. 
    You have skills that we have been unable to find elsewhere.  We
    have a hiring freeze on anyway.  I'd like to make you a
    proposition.  We'd like to make a tape of your brain and run it
    on our new neural net simulation engine.  That way we can make
    use of your valuable abilities 24 hours a day.  The taping is
    painless, and won't disturb the present state of your neurons.
    We'll pay you five years' salary as a bonus.  What do you say?"
    
    Would you do it?
    
    /jlr
802.1. "I'd hold out for an annual "license" fee" by DINSCO::FUSCI (DEC has it (on backorder) NOW!) Thu Jun 15 1989 02:40
After you agreed to this, what would keep the company from terminating you? 
What value does the "meat" version of you add at this point?

Ray
802.2. "Pay for the privilege ???" by CURRNT::ALFORD (No problems, just opportunities...) Thu Jun 15 1989 07:40
	Only if I could retire on the commission I would *have* to be 
	getting :-)
	
	CJA
802.3. "You Load 16 Meg and What Do You Get?" by ATSE::WAJENBERG (and the Cthulhuettes) Thu Jun 15 1989 13:21
    Depending on my background information on this procedure, I might worry
    about the civil rights and working conditions of the on-line copy or 
    copies.
    
    Earl Wajenberg
802.4. "Need more compensation." by STAR::KOHLS (No comment.) Thu Jun 15 1989 14:35
    
    Sure, but the pay would have to be different.  Since they are using
    'me' for 24 hours a day, they should pay me, say, half salary for those
    24 hours.  Then they can terminate me, but I'm still making money as
    long as they use my neural network.
    
    							-SK
    
802.5. by WELSWS::FINNIS Fri Jun 16 1989 15:31
    
    
    Imagine the fun if you had a brain disorder..
    
    Could add a whole new concept to Computer Viruses..
    
    		Pete..
    
802.6. "Just call me LAT" by POLAR::LACAILLE (There's a madness to my method) Fri Jun 16 1989 16:54
    
    Overheard:
    
    "Poor George, I heard he accidently downloaded himself with terminal
    server software [wetware...as gibson would say] last night."
    
    Charlie :-}
802.7. "Don't Lose Your Backup Tapes..." by DRUMS::FEHSKENS Fri Jun 16 1989 18:49
    This discussion reminds me of "Overdrawn at the Memory Bank", by
    John Varley (I think).  For me, one of *the* great SF stories of all
    time.
                       
    len.
    
802.8. by ELRIC::MARSHALL (hunting the snark) Fri Jun 16 1989 21:20
    re .7:
    
    You are correct that Varley wrote "Overdrawn at the Memory Bank".
    
    One of my favorites also.
    
                                                   
                  /
                 (  ___
                  ) ///
                 /
    
802.9. by STRATA::RUDMAN (Reality, n. The nucleus of a vacuum.) Fri Jun 16 1989 21:29
    Random thoughts: Restricted access, unrestricted access, invasion 
    of privacy, one-sided learning curve, the copy's will to live, 
    career prospects, duplication, breach of contract/conflict of interest...
                                  
    I'm also reminded of Wallace West's THE MEMORY BANK.  Not heavy
    and on the juvenile side, but one I re-read from time to time.
                      
    Also Blish & Lowndes' THE DUPLICATED MAN.  If you can "dump" a brain's
    contents into a computer, you could also maybe upload it to another
    human.  I have the same aversion to that as human duplication in
    Knight's A FOR ANYTHING.
                            
    							Don
802.10. "Bad Memories extracted for free." by JETSAM::WILBUR Sat Jun 17 1989 15:41
    
    
    Not to beat a dead lobe, but this brings new meaning to a head crash.
    
802.11. by EVETPU::CANTOR (The answer is -- a daily double.) Sun Jun 18 1989 16:08
Re .10

One of my favorite "cookie" lines is

   "System read error, you lose your mind."

Dave C.
802.12. "Insanity or genius . a thin line" by WELSWS::FINNIS Mon Jun 19 1989 10:07
    
    Re .9
    
    	Did you see the film where they uploaded the memory of someone
    but forgot about the subconscious?  The recipient of the memory had
    been murdered, but it was thought to be an accident; I'm sure you
    can guess the rest..
    
    	About the original comment I put in.. In one of Asimov's robot
    stories an engineer is sent to meet a woman, whom he believes to
    be an excellent light artist.  On his way he is greeted by a droid
    that has a defect, and being an engineer he decides to fix the problem.
    	To cut a short story even shorter, the artist in question was
    the droid!  Yup, you guessed it: fixing it meant no more fantastic
    light art.  If only he'd studied the fault better....
    
    			Pete..
    
802.13. ".11 reminded me..." by ATSE::BLOCK (Remember what the doormouse said...) Mon Jun 19 1989 14:32
	...of one of my favorite Nancy Lebovitz buttons:

		I have not lost my mind!
		It's backed up on tape, somewhere!

	Sorry,
	Beverly

802.14. "Digital Has It Now!" by TROA02::SKEOCH (Wash, rinse, repeat.) Mon Jun 19 1989 14:53
    How does the original proposition differ from creating an 'expert'
    system, using your expertise?  (Assuming, of course, that the expert
    system was sufficiently useful to meet Digital's requirements).
    Some observations:
    
    	a) Digital could ask you to cooperate with the expert system
    builder as part of your job -- and so gain your expertise for free
    
    	b) in the original scenario, when they 'copy' your brain, it's
    only your expertise they want anyway
    
    
    
    
    For a fascinating examination of some of the philosophical aspects of 
    artificial intelligence and the human mind and soul, read 
    'The Mind's I', co-edited by the same guy who wrote 'Goedel, Escher,
    Bach'.  Unfortunately, I can't think of his name; too bad, he deserves
    to be remembered.
    
    
    Cheers,
    
    Ian S.
802.15. by BEING::POSTPISCHIL (Always mount a scratch monkey.) Mon Jun 19 1989 15:06
    Re .14:
    
    _Godel, Escher, Bach_ is by Douglas Hofstadter.  _Mind's I_ was edited
    by Dennett and Hofstadter.
    
    Q:	Why did Douglas Hofstadter cross the road?
    
    A:	To make this joke possible.
    
    Q:	Have you heard the latest Douglas Hofstadter joke?
    
    A:	You have now.
    
    
    				-- edp
802.16. "Neurons & Anti-neurons" by LEVERS::BATTERSBY Mon Jun 19 1989 17:11
    RE: .10  It would also bring a new meaning to "RBMS" (Remote
    Bridge Management Software), which would become "Remote Brain
    Management Software".  Imagine, you could be in the middle of a
    meeting when all of a sudden you go into "standby mode" and cease
    sending "hello" messages....
    
    -Bob
802.17. "You keepa you hands offa my brain!" by WECARE::BAILEY (Corporate Sleuth) Mon Jun 19 1989 17:47
    Back to the original question, I assume, since at about that time I'd
    be nearing retirement and would have developed/acquired telepathic
    skills that elude me today, that much else would have changed.
    
    However, I hear nothing to say that only the part of my neural net
    related to work is to be explored/exploited -- so I'd have to decline.
    No computer system needs to be burdened with the garbage left in
    MY brain from my early years -- and there are some aspects of my
    current years that I don't care to have duplicated for world
    consumption/access/or whatever, either!
    
    I guess it's just my policy to keep my neurons to themselves.
    
    But thanks for asking!
    
    And about that raise...
    
    Sherry
802.18. "Downloading into Hell, or Nirvana" by RICKS::REDFORD (Disbelief is the best revenge) Mon Jun 19 1989 23:02
    re: .14
    
    Good point!  How does this differ from 'teaching' an expert system
    what you have so painfully learned?  For that matter, 
    how does it differ from teaching another person?
    
    I would say that it's different because it could well be you 
    that's running on the simulator.  If the simulator works like 
    your brain and has all the memories of your brain, then it might 
    think it's you.  This copying of my consciousness is what would make
    me think twice (ahem) about the process. The copy of myself that 
    remained in my body would reap all the benefits of the sale, 
    while the copy in the simulator would be held in bondage dire to 
    Digital.  The simulator owner could do anything to the copy - pare
    away useless memories, force it to work continuously, destroy it 
    at will and reload to start afresh.
    
    How about if we turn the question around?  Suppose the copy in the 
    simulator lives a rich fantasy life, full of romance and 
    adventure.  Suppose it's allowed to keep the experience it gains, 
    and so to grow and mature.  It's not limited by these five dull 
    senses; it can smell with the range of a spectrograph, or see 
    with the resolution of a hawk, or hear the radio songs of the stars 
    directly.  Pain is nonexistent, and physical disease
    impossible.  Mental disease is more of a problem than ever, of 
    course.   Death can be overcome by backups even if the hardware is 
    destroyed.   However, only the simulator copy gets to live this 
    way; the flesh and blood version of you goes on as before.
    
    Would you pay Digital for the privilege?
    
    /jlr
802.19. by RUBY::BOYAJIAN (Protect! Serve! Run Away!) Mon Jun 19 1989 23:51
    re:.12
    
    Are you perhaps referring to the PBS film OVERDRAWN AT THE MEMORY
    BANK?  It was based on a short story by John Varley.  I thought the
    film was so-so (most folks I know who saw it thought it was trash),
    but the short story is excellent.
    
    --- jerry
802.20. by SUBURB::TUDORK (SKEADUGENGA) Thu Aug 03 1989 19:51
    Retain the copyright.  Until you go obsolete the royalties could
    be fantastic!  And you could bring out new, enhanced versions every
    few years.
802.21. "Copyprotect yourself !" by AMIGA2::MCGHIE (Thank Heaven for small Murphys !) Sun Aug 13 1989 02:54
    After reading through all of the previous replies, it occurred to me
    that the on-line version of 'you' could be subjected to all sorts of
    conditions which might well be very unpleasant. If the simulated 'you'
    died in the tests, you could be reset/reloaded and the tests run again.
    
    Brings to mind a short story I re-read recently (I think Scott
    Card's "Capitol") where they repeatedly killed the hero, who was being
    brain-taped at the time, and then fed the tape back into a clone.
    
    Also in the movie Brainstorm the hero discovered the government was
    planning to do some pretty nasty psychological conditioning using the
    brain tapes.
    
    As stated, the simulated you could be subjected to almost any
    conditions, GOOD or BAD!
    
    Would you let someone copy you !
    
    Mike
802.22. "Tritle" by ARTMIS::GOREI (Bar Sinister with Pedant Rampant) Wed Sep 06 1989 12:51
    
    	There seem to be a lot of parallels here with another of John Varley's
    stories; "The Ophiuchi Hotline". WRT royalties, suppose the copy
    decided *it* was the original and wanted royalties on *you*!
    
    		Ian G.
802.23. by OASS::MDILLSON (Generic Personal Name) Thu Sep 07 1989 13:51
    Check the June (I think) issue of F&SF for a story by Charles Sheffield
    called ?_The Copyright Problem_? or something like that.  Think
    it will raise a whole new set of problems here.
802.24. "Budrys Novel" by AUSTIN::MACNEAL (Big Mac) Fri Sep 22 1989 20:49
    Budrys wrote a story which somewhat incorporates this theme.  A strange
    labyrinth is discovered on the moon, in which one doesn't just get lost
    upon making a wrong turn; one is killed in a most gruesome manner. 
    An earth scientist devises a way to completely duplicate a person
    physically and mentally.  The duplicate is sent to the moon to explore
    the labyrinth.  There is one slight drawback, however.  There are
    strong emotional ties between the copy and the original such that the
    original can experience everything the copy experiences.  When the copy
    dies, the original goes insane.  They are able to find one man who can
    survive the ordeal of having his duplicate killed over and over again
    in the journey through the labyrinth.
802.25. "Excellent story BTW" by 43339::BAILEY (NOTES-W-NO_MORE_NOTES) Sun Sep 24 1989 16:48
Re .-1   -< Budrys Novel >-

And of course with the extra twist at the end of Rogue Moon
you have the _extra_ problem of "what do you do after you
get someone through the Maze complete?"... now you've got
an identical person on the Moon & on Earth.. same memories
..same ..whole thing

and the extra-extra problem that (as far as I can remember the
story) 

You go into chamber A... the machine 'breaks you down' into
something it can understand.. and creates a copy on the Moon
and in chamber B.. so _you_ are both copies.. but are you
exact copies... did something get changed?  was that a blue
car you had years ago (as you remember it was) or was it red?
...in short.. are you _really_ still the same man you were before?
802.26. "Does anyone remember this story?" by WOOK::LEE (Wook... Like 'Book' with a 'W') Mon Sep 25 1989 15:12
    I read a story in OMNI a few years back that was about a man who would
    get enormously fat from overeating and would go to a clinic to get a
    new body so he could overeat some more.  What really happened was that
    his mind would be transferred to a clone body and the original would be
    sent to work at a slave-labor farm along with all of the other previous
    incarnations of himself.  The new worker would be brutally overworked
    by a previous version of himself, now gaunt from the work on the farm. 
    The clone would eventually do the same thing and end up at the farm
    himself.  The story gave the worst case of the creeps that I'd had in a
    long time.  Truly appalling.
    
    Wook
802.27. "Made me lose weight..." by WHELIN::TASCHEREAU (Caught with my windows down.) Mon Sep 25 1989 16:23
    
    Oh, yes. I remember it. It really was a "different" kind of story.
    Quite a disgusting means of self-preservation.
    
    				-Steve
802.28. "demand certain god-given rights!" by GUESS::STOLOS Sun Mar 04 1990 21:38
    Yes to all the copy questions, as long as my software versions had
    certain inalienable rights!  Yes, we're talking an amendment to the
    constitution for rights of software entities!  Now this would be
    interesting... the right to be backed up over a set period of time,
    the right to existence on your own private node, the right to new,
    faster hardware, the right to travel over different networks,
    the right to spawn subprocesses....
    Yes, you opened up quite a can of worms here...
    pete
802.29. "You forgot a few" by SNDBOX::SMITH (Powdered endoskeleton) Tue Mar 06 1990 18:19
    The right to pay for all this hardware and software support.  The
    requirement to split your worldly possessions with any copies.  The
    inalienable right to be erased when you have "no visible means of
    support".
    
    Willie
802.30. "The wolf in the fold" by BIGUN::HOLLOWAY (Savage Tree Frogs on Speed) Wed Jul 15 1992 06:31
    re: the last few
    
    A lot of this has been posed in the past in both the actual T.V. series
    and associated books, magazines and articles...
    
    Does the Transporter really transport you, or just nuke you and create
    a copy?
    
    Also downline loading personalities is discussed in "Immortality Inc."
    by Sheckley - made into the film "Freejack".
802.31. "Cross-References" by CUPMK::WAJENBERG (Patience, and shuffle the cards.) Wed Jul 15 1992 13:32
    Re .30:
    
    "Does the Transporter really transport you, or just nuke you and create
     a copy?"
    
    You might want to check out the Star Trek conference (NOTED::STAR_TREK) 
    and topic 305 of the Philosophy conference (ATSE::PHILOSOPHY), which
    also contain discussions of this question.
    
    Earl Wajenberg
802.32. "Harrison and Minsky's THE TURING OPTION" by VERGA::KLAES (Life, the Universe, and Everything) Fri May 28 1993 21:37
Article: 726
From: marshall@jester.usask.ca (Marshall Gilliland)
Newsgroups: alt.books.reviews,usask.general
Subject: Two notes about novels
Date: 27 May 1993 20:40:16 GMT
Organization: University of Saskatchewan
 
For users of computers who enjoy novels about the machines, I can
recommend two recent books, one a mystery/thriller and the other a
novel mostly about some issues of artificial intelligence.
 
1.  David Pogue, HARD DRIVE.  New York: Diamond Books, 1993 (ISBN
1-55773-884-X, $5.99 CDN, $4.99 US) Paperback, 288 pages.  You'll catch
on to this novel quickly if you use a Macintosh, but if you don't the
terms will be familiar enough (a mouse is a mouse is a mouse, after
all) so you'll appreciate the actions of the main character, a
programmer who strives to save the world from a nasty computer virus.
It's quite a gripping page-turner even though you know He Will
Succeed.  The reviewer of the novel for the NEW YORK TIMES BOOK REVIEW
ended his brief report by saying that "Mr. Pogue, who is obviously an
expert, has the ability to make things clear to the nonprofessional
and still keep the story line bubbling.  A swell job."  And it is, I
agree.
 
2.  Harry Harrison and Marvin Minsky, THE TURING OPTION.  New York: Warner
Books, 1992 (ISBN 0-446-51565-5, price unknown--I read a library copy)
Hardcover, 422 pages.  This novel is set in 2023-4, and is about a man
suffering brain damage after being shot; he has programmable computer
chips implanted in his brain in an effort to recreate his memory.
This is worth reading, especially if you are interested in AI.  It's
not breathless prose (perhaps Minsky's contributions got in the way of
Harrison's usually entertaining prose style), but the uncertainty of
how it will end will keep you reading.  You don't wonder too long "who
did it?" because you get caught up in trying to answer the question of
"how human is the injured hero," the 24-year-old computer whiz.
 
Marshall Gilliland	English Department	U of Saskatchewan

802.33. "The Extropians" by VERGA::KLAES (Quo vadimus?) Thu Jul 29 1993 21:23
Article: 68256
Newsgroups: sci.space,sci.cryonics
From: whitaker@eternity.demon.co.uk (Russell Earl Whitaker)
Subject: GQ article on Extropians
Organization: Extropy Institute
Date: Tue, 27 Jul 1993 21:46:42 +0000
Sender: usenet@demon.co.uk
 
-----BEGIN PGP SIGNED MESSAGE-----
 
The following article is reprinted with the express permission
of the author and copyright holder David Gale, in London.
David can now be reached by email at 100117.1660@compuserve.com,
and would appreciate feedback on his work, which appeared in the
Conde Nast publication *GQ* (UK version) in June 1993.
 
The article is uploaded by interviewee Russell Whitaker, with
much amusement.  I can be reached most readily at
whitaker@eternity.demon.co.uk.
 
Note that for authenticity's sake, the piece (and this
preamble), has been wrapped and signed in PGP.  PGP ("Pretty
Good Privacy") is a public-key encryption and authentication
program available from bulletin boards and ftp archive sites all
over the world.  If you'd like a copy, ftp gate.demon.co.uk, and
look in /pub/pgp for the version matching the operating system
of your choice.
 
Permission is granted to reproduce this posting ad lib, with the
only proviso that you maintain the integrity of the data,
including the PGP wrapper.
 
Ad astra,

Russell

[text follows]
 
*GQ* (UK edition), pp 105-107, 160
June, 1993
Issue 48
 
"Meet the Extropians"
 
Death?  No fear.  David Gale logs on with the computer cult who
are downloading their souls for immortality
 
Russell E. Whitaker is outwardly unremarkable: a shortish
26-year-old American, with clean-shaven, symmetrical good looks.
He has big, bright eyes, clean black hair and exudes health and
efficiency.
 
The thing is, I know he wants to live forever.  And in order to
do so he's prepared to take one of the most extreme steps
imaginable: Whitaker intends to copy the entire contents of his
mind onto something like a computer's hard disk, creating his
electronic replica on a machine which, he feels, will deliver an
infinitely more stimulating life.
 
The Swiss Centre in Leicester Square is his chosen rendezvous
for our meeting.  Its spotless orderliness seems to echo
Whitaker's unusual ambition to live in machine-like sterility.
He is organised and tidy.  A pocket computer lies beside his
baby chicken lunch and throughout our conversation he regularly
flips the PC open to tap in memos and summon up addresses for me.
 
What is it that makes a man want to eliminate his body?  What's
so terribly wrong with the equipment that nature has given him?
Doesn't he enjoy eating baby chicken?  Clues may be found in the
fact that Whitaker is the communications editor of Extropy, the
magazine of the Los Angeles-based Extropian movement.  A group
of futurist techno-freaks scattered across the US with pockets
in Britain, the Extropians are grooming themselves for a science
fictional future that many would consider a form of suicide.
They would protest, as Whitaker does, that their goals are
precisely the opposite; they are dedicated to the extension of
life - beyond the bounds of the body and the gravitational coils
of planet Earth.  They see no reason why their fleshly vehicle
should frustrate their goals by dying on them.  The Extropians,
it soon becomes clear, are not impressed by human biology.
 
California, true to its stereotype, is home to a variety of
groups with an interest in life extension.  Those who merely
wish to live longer tend to be preoccupied with smart drugs,
biochemical nutrients and exercise.  But those bidding for
immortality are obliged to think carefully about the wear and
tear problem.  In this respect the cryonicists, believers in the
resurrection of the deep-frozen body, are the only group with a
radical view comparable to that of the Extropians.  The two
persuasions have membership overlap and Whitaker, who now lives
in London, has bought into a body-freeze facility managed by a
hotelier in Bournemouth.  Should he die unexpectedly in the UK,
Whitaker will be prepped, cooled and flown out to the West Coast
pronto for the big chill.
 
The Second Law of Thermodynamics states that all differentials
in energy level between bodies will eventually be levelled out.
Hot things will grow colder and cold things will get hotter,
until the universe becomes a homogeneous mix of molecules with
no concentrations of energy.  This is entropy: the inexorable
tendency of everything to move towards disorder and decay, the
heat death of the universe and a source of irritation for
serious immortalists.  To register their distaste for this
impertinence of theoretical physics, some Los Angeles scientists
and academics, mostly in their late twenties and early thirties,
coined the term "Extropy".  It signals their desire to reverse
the inevitable and is also the name of the biannual journal of
their non-profit corporation, the Extropy Institute.
 
/Extropy - The Journal of Transhumanist Thought/ has a cover
price of $4.50 - but a lifetime subscription at $200 could be a
bargain if things go well in the war against thermodynamics.
Whitaker estimates membership at around 100; the journal itself
has a print run of six or seven hundred and growing.  Every
issue iterates the basic Extropian Principles: 1) boundless
expansion 2) self-transformation 3) dynamic optimism and 4)
cooperative diversity.
 
The principles seem harmless enough, even a little dull, until
the Extropy reader grasps the full implications of
Transhumanism.  In keeping with a publication designed to
disseminate what can only be called new and challenging ideas,
the magazine is full of footnotes, glossaries and boxes defining
the Extropian aims and terminology.  We learn that transhumanism
is a philosophy of life that "seeks the continuation and
acceleration of the evolution of intelligent life beyond its
currently human form and human limitations by means of science
and technology, guided by progressive principles and values,
while rejecting dogma and religion".
 
The full Extropian package aspires to /post/humanism, defined as
"migration out of biology (deanimalisation) or into a completely
new biology".  There may still be a few technical details to
work out, but posthumanism presumes the possibility of total
mind transfer from man to machine.  This, at least, is the
confident view of Hans Moravec, director of the mobile robot
laboratory at Carnegie Mellon University in Pittsburgh.  In
1988, Moravec published /Mind Children/, a book in which he
advanced the notion of downloading the contents of the human
brain into a computer.  (Since the destination is assumed to be
superior to the source, some devotees call it uploading.)  /Mind
Children/ became a key Extropian text.
 
Moravec is a 45-year-old Austrian-born scientist with a
doctorate in artificial intelligence.  A highly respected
figure working at the sharp end of robotics, his most recent
project involved the design and construction of a robot that
crawls under factory workbenches and tidies up waste.  In
conversation Moravec, who has an owlish face and boyish
enthusiasm for his subject, spins exotic hypotheses that blend
science with science fiction.  For reasons of his own, he
chuckles like a gleeful chipmunk while doing so.
 
Given that all the activities and functions of the brain are
ultimately electrical, Moravec contends that its thoughts,
memories and abilities could be copied onto a data storage
medium: a hard disk, for the sake of argument.  Given that we
are the sum of the functions of the brain, then that disk will,
for all intents and purposes, become us.  My clone will exist in
silicon and it will think it is me.  This postbiological being
could be used to animate a robot which, in turn, could be used
in the exploration and colonisation of deep space.
 
Moravec has constructed graphs that relate the amount of
computational power that can be purchased for a dollar to the
passage of time.  The graph displays a steady gradient
indicating power has increased a thousand-fold every two decades
this century.  At this rate, what Moravec calls a "humanlike
computer" would be viable before 2010.  He believes that
mind-transfer technology itself will be in place in 50 years'
time.  For some, however, this is just not soon enough.
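
As a purely illustrative aside, that growth rate compounds as in the
minimal Python sketch below.  It uses only the thousand-fold-per-two-
decades figure quoted above; the horizons chosen are arbitrary and the
output is not a prediction.

    # Illustrative only: compounds the "thousand-fold every two decades"
    # compute-per-dollar rate quoted above.  Horizons are arbitrary.
    GROWTH_PER_20_YEARS = 1000.0
    ANNUAL_FACTOR = GROWTH_PER_20_YEARS ** (1.0 / 20.0)   # about 1.41x per year

    def multiplier(years):
        """Compute-per-dollar multiplier after `years` at the quoted rate."""
        return ANNUAL_FACTOR ** years

    if __name__ == "__main__":
        for years in (10, 20, 30, 50):
            print("%2d years -> about %.0fx compute per dollar"
                  % (years, multiplier(years)))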
 
On the line from the philosophy department of the University of
Southern California, Max More, the softly spoken editor of
Extropy, initially expresses caution.  "Before I make the jump I
want to make sure that everything that makes me what I am can be
duplicated."  But in a digital storage medium, changes should be
as easy to effect as erasing a file from a floppy.  Isn't it
going to be hard to resist the temptation to edit the quality of
life in the postbiological vehicle?  "I do think we're going to
be a little selective.  For instance, we're going to get rid of
pain.  It seems a fairly crude way of warning you about
problems, something that evolved because it was easy.  You
couldn't ignore it.  It's one design problem we can improve on."
 
Aspects of personality can be painful too - so maybe a nip here,
a tuck there?  "It's definitely going to happen: some people
will remove some of their traumatic memories.  Certainly,
editing personality seems like a major improvement.  Right now
we're born with these bodies and brains which are not really
under our control.  For instance, if you don't produce enough
dopamine then you go around in a state of depression all the
time; others have big mood swings or suffer anxiety.  All these
things have biochemical causes and if you upload you can
understand the processes and affect them.  There's the potential
of freeing ourselves from conditions we don't like and being
able to be in a state of energy all the time."
 
More is equally enthusiastic about one of the fundamental
Extropian convictions: that it is the destiny of man to colonise
outer space.  When Earth Man eventually takes up
extraterrestrial habitation, he will be leading a demanding,
action-packed life and he'll need something a little smarter
than the meaty old bipedal brain carrier to get around in.
He'll need a whole new body, with a central processor capable of
infinite extension.
 
But More has the answer to Space Man's problems.  He points out
that the downloaded mind, freed from pain and mood, will also be
free to choose its own mode of transport.  "I want to be able to
transfer my personality to different vehicles for different
purposes.  For walking around on this planet, basically the
human body is just fine.  But for a different planet or in outer
space you might want to download into a different vehicle."
 
He continues to extol the posthuman life that he hopes to access
before his system crashes.  But Max, won't you miss your body at
all?  "For the most part, we'll transfer sensations intact, then
we'll start to fiddle with things.  We're going to expand our
abilities - to be able to see in the infrared and the
ultraviolet, and pick up radio waves, useful things like that.
Or increase our ability to smell; it would be nice to smell as
well as a dog.  Increase our sense of touch - you could have a
very fine sense of feel if you wanted; right down to the
molecular level."
 
Now then: without an eye, a nose or a hand, how on Earth (so to
speak) will all this be done?  In the Extropy article entitled
"Persons, Programs and Uploading Consciousness", Extropian David
Ross attempts a step-by-step fictionalisation of the actual
mechanics of mind copying.  At the beginning of the procedure,
Ross' hero, Jason Macklin, is lying on a bed with a tube
connected to his neck through his carotid artery.  "For years he
has resisted the urgings of family and friends to get rid of his
natural body and upload his mind onto the Web, to become a
creature in Cyberspace like them."  As Jason frets about whether
he will really be Jason at the end of it all, microscopic
nanomachines are replacing the nerve cells of his brain and
sensory organs with emulators.  These transmit their input to
the artificial world inside the computer that stands beside
Jason's body.  "Gradually, each synapse in his brain is absorbed
into the program structure of the emulation program, its
functionality retained but its physical structure gone."  All
the sensory input that gave Jason a feeling of continuity is now
duplicated in Cyberspace.  The structures in the computer are
interacting among themselves, in direct synchronisation with
the way they perform in his body.
 
Jason is now a virtual being.  He does not need his organs.
Virtual experience takes up less space; thanks to virtual
tactility and virtual versions of all the sensual input to his
body, his experiences will be indistinguishable from those he
had back in Old Pinky.
 
The new "electronic body" ceases to be an integrated array of
impulses and sensations and becomes instead a resource centre.
Since posthuman virtual experience depends for its existence on
the language of programs, there arises the opportunity for a
novel form of hands-off lobotomy: turn off the program and the
experience is nulled.  Personality is endlessly permutated, its
aspects brought on line like options on a Magimix.
 
The Extropian credo, we recall, advocates the "evolution of
intelligent life beyond its currently human form".  What exactly
do they have in store for us?  The champions of uploading are
happy to point out the extraordinary perks that will be
available to the virtual being.  Want to fly like a bird?
Right: upload a bird brain and copy to own disk.  You get to
soar off cliffs and eat worms - virtually, of course.  Want to
trudge across the South Pole without losing any toes?  Copy Sir
Ranulph Fiennes' Antarctic memories (assuming he's in the
catalogue), edit toe pain and upload.  Once the technology has
been perfected, the uploadee will have access to the finest
thoughts, memories and experiences of the human race.  Multiple
copies of oneself can be readily produced and despatched to the
corners of the galaxy.  There they will absorb the plasticities
of space-time, witness the death of stars and commune with alien
intelligences.  They then return to Earth, or wherever you wish
to keep your master copy, and the robots slot the well-travelled
clones into your drive.  Presto!  You've travelled the galaxy
without ever leaving your box.
 
Far from being a flight from sexuality, uploading will enable
forms of congress beyond the dreams of hormonal abandon.  Freely
borrowing the sexual memories, techniques and proclivities of
the most fascinating entities in the catalogue, the lustful
disk-jock will construct new sexual identities, select genders
and initiate algorithmic flirting.  Liberated from the old rod
and tube-based genital structures, he/she may even devise whole
new virtual organs of delight, guaranteed to melt the most
resistant chip and drive its molecules into paroxysms of quantum
disorder.  Given the general indifference of women to the
Extropian project, however, the uploadees may simply be reduced
to rerunning the memories of the resident Don Juan and editing
out the bytes that do not arouse.
 
But just how fantastical are the yearnings of the Extropians?
In order to evaluate their dream properly, we should look at
current developments in science and technology that seem to
prefigure a form of transhumanism.  The white hot centre of
future research is, as always, the military.  Last year the
Pentagon published /Star 21: Strategic Technologies for the Army
of the Twenty-First Century/.  Among the areas commended for
research are some of particular interest to Extropians.  One is
the robotic exoskeleton, a form of which we saw in Aliens when
Sigourney climbed into a two-legged freight shifter and beat off
the slimehead with her massively empowered hydraulic gripper arms.
 
In the Aliens scenario, the operation of the limbs was carried
out manually.  What the Pentagon's engineers envision in the
longer term is the direct linking of bionic devices to the human
nervous system.  This biotechnology might emerge in two stages.
The ultimate goal would be brain-centred control, wherein the
soldier thinks of the action he wishes to make and the thought
itself animates the bionic extension.  Comparable experiments
being carried out by the US Air Force involve the development of
fighter pilots' helmets rigged to register deliberately induced
changes in the pilots' brainwaves.  The changes act as triggers
for activating gunfire, flight controls and so forth.
 
In Japan, the computer mega-corporation Fujitsu is painstakingly
measuring the brainwaves of subjects who are told to say the
word "ah" in their minds when they see a light of a certain
colour.  After ten hours and dozens of readings, researchers were
able to identify the brain waves peculiar to the silent "ah".
Their goal is computer control by thought-induced brain pattern.
At the Wadsworth Center in New York State, Department of Health
scientists have trained subjects to move a cursor up and down a
computer screen by altering the amplitude of their brain waves.
By thinking about weightlifting, subjects found they could move
the cursor upwards.  Thoughts about relaxing brought the cursor
down.  Eventually the cursor could be moved without the imaging
process.  Stopping the cursor at a particular point is currently
beyond the capacity of all subjects, but the way forward could
not be clearer - direct, hands-off brains-on techniques will
lead to the redundancy of the body/machine interface.  Bodies
will not be required; minds can do it on their own.
 
It's not only the military which is preoccupied with these
possibilities.  Thanks to the work of Stelarc, a Greek
Australian performer with distinctly Extropian proclivities,
neural robotics can currently be seen in art galleries.
 
Stelarc achieved notoriety in the Eighties with his
body-suspension performances, gruesome tableaux in which the
artist inserted over a dozen chrome steel butcher's hooks into
his flesh and hung himself on ropes from art gallery ceilings
and, on one occasion, from a crane over the streets of Tokyo.
At a conference in Brighton last year, Stelarc declared that
what he would really like is a photoelectric skin that would
take its energy from the sun, as plants do.  Were this the case,
he enthused, then the lungs and pulmonary system would be
redundant and could be removed.  The cavity thus formed could be
"packed with technology".
 
These days Stelarc works with a robotic third hand.  Clad in a
modest jockey brief, the stocky, balding performer bears the
device clamped onto his forearm.  Electrodes on his leg and
abdominal muscles channel signals that make the fully jointed
hand grip, grasp, release and rotate at the wrist.
 
The imagery of robots, electrode-covered brains and bionic
extensions has adorned popular culture for decades.  But it is
only with the explosive rise of the computer, and the sciences
that benefit from it, that the man-machine dream has started to
lift itself clear of the mythic haze.  Any number of respectable
futurologists believe that the achievements of researchers in,
for example, artificial intelligence, medical nanotechnology,
life extension and virtual reality have an inevitable point of
convergence: the creation of the cybernetic organism.
 
While Extropians would insist that the desire to replace the
body is "natural", it can be argued that the whole business has
a distinctly pathological side.  It's all too easy to construe
these machine dreams as simply another late-twentieth-century
displacement of the fear of death.  In the case of the
Extropians it seems, paradoxically, that this fear can be
morbid.  While the yearning for immortality can be seen as a
hubristic game of the-man-who-would-be-God, the flight from
flesh into mechanism looks more like an undignified retreat from
feelings.  These unpopular phenomena earn their name from the
fact that they register in the body just as much as in the mind.
If the body is gradually replaced with electronics and bionics,
then what's left to feel?  Thus disconnected, the mind can bathe
in the unsullied beauty of its own electrochemical efficiency, a
digital porpoise forever freed from the fear of predators.
 
In cyberspace no drive is safe.  "We can think of affecting some
of our basic biological drives," More states, "like our sex
drive, which for me, I think it's great.  I mean, I love sex,
but it's sometimes very inconvenient.  It's distracting.  It
would be nice if you could just switch it off occasionally."
 
Both More and my companion at the Swiss Centre, Russell
Whitaker, claim to have been proto-Extropians from an early age.
At the age of ten More started taking vitamin C tablets to
extend his life, while Whitaker in his youth was profoundly
influenced by the SF novels of Robert /Stranger in a Strange
Land/ Heinlein.  The latter is renowned for his espousal of the
values of the rugged individual and is widely regarded as one of
the most right-wing SF authors in print.
 
A closer reading of Extropy uncovers the political dimensions of
Extropian belief.  In article after article, More inveighs
against the interventionism of the welfare state, the
doom-mongering ecologists who would impede technological
development and the die-hard Marxist demagogues who yearn to
fetter the free market.
 
Whitaker is perfectly explicit about the Extropians' rightish
thing.  "Most Extropians start out with an interest in computers
and science fiction, but politically we are anarcho-capitalist.
 
"We tend to be libertarians, what some people would consider to
be of an extreme persuasion, but we consider ourselves to be
fairly reasonable."  He laughs, fairly reasonably, but is
interrupted by a beep from his pager.
 
The solution to all this lefty decadence is obvious: start an
Extropian colony.  Plans for Free Oceana are premised on getting
away from what Extropian Tom W Bell calls "the grasp of meddling
statists".  He suggests that the ideal would be an escape to
outer space but settles, more realistically, for the high seas.
Bell proposes a "Sociosphere" in which "we can test the limits
of real consent".  The tests would be carried out on "several
oil tankers [joined] together to make a huge floating island".
The benefits are plain to see: on Free Oceana "we could migrate
towards opportunities and away from threats as if we were
seafaring Gypsies".  Thus unfettered "we can take advantage of
our isolation to prepare ourselves for expansion into space".
 
A picture emerges.  Stripped of body, immortalised, plugged into
the Encyclopaedia Galactica, you're free to compete for virtual
space.  The nannyism of the state shall not enfeeble this
cyber-cowboy.  No one can hold him down because he is not
actually there.
 
We rejoin Jason Macklin, fictional posthuman seminarian, as he
is about to slip into that discorporated bliss.  Macklin is by
now downloaded onto a computer; everything he senses hereinafter
is virtual reality.  But then the two are no longer
distinguishable.  "After a while, a doctor comes into his room
and removes the tap into his neck.  She holds out her hand and
tells Jason to stand... [then] leads him over to a curtain at
the side of the room and draws it back... On a bed in the middle
of the room lies his body, still connected to its cable.  For a
moment he watches it breathe... The doctor hands him a switch
which he knows will turn off his old body.  He represses the
feeling that he is committing suicide and throws the switch.  In
the next room the body - he no longer thinks of it as himself -
releases its last breath and seems to relax... He feels less
emotion than he thought he would."
 
In the Swiss Centre the currently human bodies of the waiters
and waitresses, uploaded with plates, bump into each other in
the lunchtime bustle.  Whitaker is talking about the body that
is talking about his body.  His crisp delivery has an energetic
precision that seems fueled by the need to eliminate any form of
clutter.  "I want to be stronger.  I'd like an alternative store
of energy."
 
He warms to the theme.  "I'd like, say, just conservatively, a
titanium skull-case so if I fall down I don't crack my skull.
This is just a little first start."  He places his knife and
fork down neatly at the side of his plate.  He hasn't even
finished his baby chicken.
 
[End of article]
 
-----BEGIN PGP SIGNATURE-----
Version: 2.2
 
iQCVAgUBLEtCDoTj7/vxxWtPAQHy1gQAo2VnpwgVjjViQGJiMgyPc/w1d3EZnrH4
/lYd3QZgHtRvlC7JRXIWiHrcMpPpAtks112PEF9FVSN54rQVQipFQqSLfxTYdjEA
OJ5saygpnQkc8xSUiJhMm8TJSBia8s/EdTbbJIhty+j2mdxM8q3T7js6cUhxgdO0
NiZ2XLG7i0Y=
=YUVy
-----END PGP SIGNATURE-----
 
Russell Earl Whitaker                   whitaker@eternity.demon.co.uk
Communications Editor                                 AMiX: RWhitaker
EXTROPY: The Journal of Transhumanist Thought
Board member, Extropy Institute (ExI)
================ PGP 2.2 public key available =======================
--- Fight the Wiretap Chip!  Details?  Follow alt.privacy.clipper ---
 
802.34. "Using a mind/computer link to move objects" by VERGA::KLAES (Quo vadimus?) Fri Sep 03 1993 13:37
<><><><><><><><>  T h e   V O G O N   N e w s   S e r v i c e  <><><><><><><><>

 Edition : 2906               Friday  3-Sep-1993            Circulation :  6623 

        VNS COMPUTER NEWS .................................  184 Lines
        VNS TECHNOLOGY WATCH ..............................   86   "

  For information on how to subscribe to VNS, ordering backissues, contacting
  VNS staff members, etc, send a mail to EXPAT::EXPAT with a subject of HELP.

VNS COMPUTER NEWS:                            [Tracy Talcott, VNS Computer Desk]
==================                            [Littleton, MA, USA              ]

 Computers - Controlling computers via brain waves

	{The Boston Globe, 16-Aug-93, p. 25}

   The small group of scientists doing this research - in Japan and Austria as
 well as in the U.S. - are trying a variety of methods by which humans can make
 their wishes known to a computer.  In some systems, the human must learn to
 alter the voltage of a particular set of brain waves to give a command.  In
 others, the computer is programmed to recognize a particular pattern of brain
 activity as a thought command.  All of the "interfaces" rely on the monitoring
 of the brain's electrical signals by electroencephalography, or EEG, a
 technique that involves placing electrodes on the scalp.  The human brain
 produces a noisy cascade of electrical signals, and the scientists try to
 isolate one particular signal from that ruckus.  They can then use this
 isolated signal, in one manner or another, to act as a brain-to-computer
 messenger.  Perhaps the most promising - and yet the most difficult - approach
 is one being pursued by scientists in Austria and Japan.  They are seeking to
 identify signature brain patterns, or neural networks, the instant before the
 subject performs an action, such as moving a hand, or silently voicing a word.
 Such patterns could then be harnessed to serve as thought commands.  A Japanese
 researcher reportedly has identified a pattern that occurs the instant before
 a subject says the sound "ah" in his or her mind, but in the experiment the
 computer needed hours to make that recognition.  The big unknown in this field
 of research is Andrew Junker's work. In late spring, Junker, who has a PhD in
 engineering, sat on the deck of his 35-foot yacht, carving turns around buoys
 and maneuvering past other boats in a crowded Virgin Islands harbor.  Three
 electrodes attached to his forehead ferried a series of "brain-body" signals
 to an onboard computer, and the vessel heeded his mental commands.  No hand
 controls were necessary. Junker was steering with his thoughts.  He says he
 has made a breakthrough since he left the Air Force, and, as evidence, he can
 cite his boat-steering feat.  He is, in fact, doing something radically
 different from the others, but he won't be explicit on the details.  Junker
 does not try to isolate an individual signal from the complex web produced by
 the brain.  Instead, he places electrodes on his forehead and simply analyzes
 the whole messy "brain-body" signal present.  The electrical activity comes
 both from brain waves and, more likely, nearby facial muscle.  His
 breakthrough, he says, resulted from his invention of an algorithm for quickly
 analyzing this complex signal.  With this feedback, a user can learn to alter
 the complex signal in a way that makes it possible to send multiple commands
 to the computer, he said.  "We are taking the whole mess, or at least a lot
 (of the complex signal), and then letting the person make some sense out of
 it.  But they are not going to do it at the conscious level, but at the
 feeling level, and that is the important level," Junker said.  With his
 system, he added, people have been able to use thought commands (he calls them
 "brain fingers") to play video games like Pong and Mario Brothers, perform
 computer music, and steer a wheelchair - or a yacht.  He believes that people
 will be amazed when they discover the power of their own minds as they use
 this technology.  "It gives people another view of themselves," he said.
 "That is the transformative part."  [The 3/4 page article describes a variety
 of work by several researchers - TT].
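
 A minimal sketch of the amplitude-threshold idea described above, in
 Python.  This is not any of the researchers' actual code: the signal is
 synthetic, and the frequency band, baseline, and gain are invented
 placeholders chosen only to make the example runnable.

     import numpy as np

     def band_amplitude(samples, fs, lo, hi):
         """RMS amplitude of `samples` within the [lo, hi] Hz band."""
         spectrum = np.fft.rfft(samples)
         freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
         band = np.abs(spectrum[(freqs >= lo) & (freqs <= hi)])
         return float(np.sqrt(np.mean(band ** 2)))

     def cursor_step(samples, fs, baseline, gain=1.0):
         """Positive result nudges the cursor up, negative nudges it down."""
         amp = band_amplitude(samples, fs, lo=8.0, hi=12.0)  # hypothetical band
         return gain * (amp - baseline)

     if __name__ == "__main__":
         fs = 250.0                                   # samples per second
         t = np.arange(0.0, 1.0, 1.0 / fs)
         eeg = np.sin(2 * np.pi * 10.0 * t) + 0.3 * np.random.randn(t.size)
         print(cursor_step(eeg, fs, baseline=10.0))   # positive -> cursor up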

<><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><>
  For information on how to subscribe to VNS, ordering backissues, contacting
  VNS staff members, etc, send a mail to EXPAT::EXPAT with a subject of HELP.

    Permission to copy material from this VNS is granted (per DIGITAL PP&P)
    provided that the message header for the issue and credit lines for the
    VNS correspondent and original source are retained in the copy.

<><><><><><><><>   VNS Edition : 2906      Friday  3-Sep-1993   <><><><><><><><>

802.35. "Brain dump = blank screen!" by BAHTAT::EATON_N (I w'daft t'build castle in't swamp) Fri Sep 03 1993 16:25
    
    Mind/computer links - what really happened:
    
    "Hmmm, lucky we got the files back, couldn't have done if we'd had to
    FORMAT C:"
    
    "Hey, why's the disk light flashing?".
    
    Think I'll stick to keyboards, if that's OK by you!
    
    Nigel.
    
802.36. "Review of The Turing Option (SPOILERS)" by VERGA::KLAES (Quo vadimus?) Tue Sep 21 1993 15:02
Article: 362
From: chess@watson.ibm.com (David M. Chess)
Newsgroups: rec.arts.sf.reviews
Subject:  Review of Harry Harrison and Marvin Minsky's "The Turing Option"
Date: 17 Sep 93 01:50:44 GMT
 
Note : Some mild spoilers in the following, although the book isn't
   particularly a suspense novel, and IMHO knowing the outcome
   will not materially reduce the reader's enjoyment of the book.
   (I can't stick in c-L due to my environment, but the moderators
   certainly can if they want to.)
 
Executive summary : Marvin Minsky's "Society of Mind" is must reading
   for anyone with an interest in AI.  Harry Harrison, while not my
   favorite sf author, has done some good stuff, and is certainly
   respected in the field.  From the combination I expected "The
   Turing Option" to be a really well-written novel with interesting
   plotting, good science, and neat new ideas.  I was disappointed.
 
Setting : 2023 to 2026, North America.  Thirty years in the future, but
   it feels a lot like 1993.  There have been some significant advances
   in science, people carry gigabytes in their pockets and there's a
   little nanotech around, but basically people, nations, etc, are still
   the same.  Vinge's singularity is nowhere in sight.
 
Premise : As it is about to be demo'd for the first time, a new advanced
   AI system is stolen, and its inventor shot and left for dead.  The
   investigation of the crime makes no progress.  The inventor has had
   a bullet through the brain, severing critical connections between the
   various parts of his thinking gear.  Using state-of-the-art
   nanotech and brain science, and some technology developed by the
   patient himself, many of the connections are restored.  He ends up
   with his memories intact up to about the age of 14, and sets out
   to re-invent the AI that was stolen, and catch the bad guys.  He is
   hampered by the need for intense security to keep the bad guys from
   coming back and finishing him off.
 
Story : The story itself is reasonably well-done.  The pacing is fast
   enough, the plotline simple enough, and the underlying concepts
   interesting enough to get me from the start to the end.
 
Characterization : Weak to non-existent.  The premise has the potential
   for at least two major character-developments: Brian (the inventor)
   needs to go from almost-dead to 14-year-old-in-24-year-old-body to
   grownup, and the machine intelligences that he creates need to go
   from non-working prototype to human-level (or beyond) minds.  But
   the authors don't show us either of these things.  Brian goes from
   almost-dead, through a couple of dream-memories of his childhood,
   and then ZONK he's a supposedly-14-year-old who is in fact completely
   rational, has no apparent internal conflicts or confusions, is able
   to function completely as an adult, and doesn't change noticeably
   throughout the rest of the book.   The AIs go from not working,
   through one amusing almost-working demo, and then ZONK they're
   there, as intelligent flawless super-human-type machine intelligences
   that can learn a new language or a new skill in minutes, are
   politer than Brian, and call up phone-sex lines to practice their
   language skills and study human sexual culture.  Oh, well.
 
   The minor characters are also flat.  The Bad General is a cardboard
   cutout Bad General, the main bad guy who arranged for the original
   theft and almost-killing of Brian is barely seen at all, and has
   no plausible motivation when he is, and so on.  Good sf novels can
   of course get away with little or no characterization if the ideas
   or storytelling are neat enough.  Read on...
 
Storytelling : "The Society of Mind" is a marvelously-told book, made
   up of one-page nuggets of clearly-expressed stuff that link together
   and point to each other in compelling ways.  Harry Harrison's books
   generally have a certain touch of wry humor that gives them a
   distinctive flavor.  This book is neither of those things; I kept
   looking for an "as told to Biff Jones" somewhere on the copyright
   page.  It's done in the uninspired high-school-English-class prose
   of your average written-for-paperback hack novel.  Many important
   actions are completely undermotivated: Brian at one point decides
   that he doesn't *want* to get back all his disconnected memories
   and become his previous 24-year-old self, because of some notes he
   finds that his previous self wrote about "Zenome Therapy".  This
   seems like it could be a major plot element: Brian's attempt to
   re-invent his AI without at the same time awakening too much of the
   former self that's still in his brain somewhere, and falling into
   whatever "Zenome therapy" is again.  But that doesn't happen;
   "Zenome therapy" itself is mentioned exactly once more in the book,
   on the same page, and no conflict between the current and former
   Brians is ever brought in again; the issue of his missing ten
   years of memories vanishes about 150 pages in and never reappears
   in any significant way.  (With the exception of the bizarre last
   page of the book, in which Brian suddenly declares that the
   Bad Guys really won, and killed his humanity, and he's really
   just a Machine Intelligence himself, cry, whine, moan.  This is
   also completely unmotivated.)
 
   In another key scene, Brian, following up a clue that his AI
   found hidden within the programming of an AI recovered from the
   bad guys, walks into what from the reader's point of view has
   at least a 50% chance of being a deathtrap.  But, as he apparently
   knew all along (perhaps the authors told him), the message was
   planted by a good guy who was just working for the bad guys for
   awhile, and really has Brian's best interests at heart.
 
Editing : There are a few nitty oddnesses in the book that suggest
   hasty or scanty editing.  The (non)word "orientated" occurs at least
   a couple of times, as does reference to "a circuitry" in a context
   that clearly means "a subroutine".  There is also evidence of some
   uncareful shortening; we are shown a demo of an AI that doesn't
   work because of too much inhibition, but the following dialogue
   clearly suggests that there was also a demo of one that had not
   enough inhibition, but we missed seeing that somehow.  (It's
   possible that some of the undermotivated actions I moan about
   above are also due to overhasty editing-out of motivating or
   explanatory scenes.)
 
Science : The science in the book should have been a compelling current
   theory coupled to an experienced sf writer's ability to extrapolate.
   It wasn't.  The basic idea of mind as a quasi-hierarchy of agents
   that each do a simple job and are overseen by other agents, and so on,
   played a key role in the plot, as Brian's agents are re-connected in
   order to restore his mind.  But the concept struck me as *just* a
   relatively isolated plot element.  Except for one incident in his
   youth, the idea is never used to show Brian, or the AIs he creates,
   in any interesting lights.  The idea itself is not developed in any
   speculative ways; you'll get more fiction-like speculation in
   The Society of Mind than in this novel.
 
   There are also a painful number of science problems outside the
   main scientific thrust of the book.  At one point Brian discovers
   that he can access the memory banks of the computers that were
   implanted in his brain as part of his operation.  The surgeon tests
   this by uploading the contents of a scientific article into the
   CPUs in his head, and he can then "read" the article word-for-word.
   No mechanism is suggested by which this might work; it's the usual
   bad-sf assumption that all information-processors speak the same
   language.  I cannot myself imagine *any* mechanism by which the
   neurons in Brian's brain could have learned ASCII, and I would have
   appreciated at least some hand-waving towards the question.  At
    another point, an Expert Systems guru who has been hired to
   assist Brian decides that she can help solve the original crime by
   writing an Expert System to consider all the information, and
   suggest answers.  She does, and it provides great help in solving
   the case.  Gee, funny no one thought of doing that before!  Seems
   clear that if ES technology were at that level, it would be a
   routine part of criminal investigation (the book does not suggest
   that she has made some great breakthrough in ES in order to do it).
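
    (For the curious: a classic Expert System is, in outline, just a
    collection of if/then rules applied repeatedly to a set of known facts
    until nothing new can be concluded.  The following Python sketch is my
    own toy example; the facts and rules are invented and have nothing to
    do with the book's plot.)

        # Toy forward-chaining expert system: keep applying rules to the
        # known facts until no rule can add anything new.
        facts = {"suspect_had_lab_access", "prototype_missing"}

        # each rule is (set of premises, conclusion)
        rules = [
            ({"suspect_had_lab_access", "prototype_missing"},
             "suspect_could_have_taken_prototype"),
            ({"suspect_could_have_taken_prototype"},
             "worth_investigating_suspect"),
        ]

        changed = True
        while changed:
            changed = False
            for premises, conclusion in rules:
                if premises <= facts and conclusion not in facts:
                    facts.add(conclusion)   # the "suggested answers"
                    changed = True

        print(sorted(facts))   # the two original facts plus both conclusions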
 
   The last part of the book suffers from the Transporter Problem.
   Gene Roddenberry (I think it was) once commented that the writers
   on Star Trek had problems coming up with situations that the
   Transporter couldn't solve.  The AIs that are developed towards the
   end of this book have a similar effect: in any physical or
   intellectual activity, they are better and faster than humans.
   They can teach themselves languages and skills almost instantly,
   do many things at once, have micromanipulators that let them juggle
   individual molecules, can listen in on radio and telephone traffic
   apparently by magic (another bad-sf premise: all machines speak
   the same language), and so on.  The main Bad Guy is found at the
   end of the book because someone happens to see him walking down the
   street.  Why didn't the magic AIs just scan through all the world's
   photographic databases looking for his face, or whatever?  Every time
   the humans have some problem towards the end of the book, the
   obvious right thing to do would just be to ask an AI.  But they
   only do that when it fits the plot.
 
   This leads to my main tech-related frustration with the book.  Mankind
   has now developed intelligent systems that are faster, smarter, tougher,
   and more reliable than he is.  What will this lead to?  In the real
   world, I think it would obviously lead to an unimaginable shakeup of
   every facet of world culture.  There would be riots, religious
   denunciations, acts of sabotage and rebellion, the potential for
   massive (human) unemployment, the end of nations, breakdown of many
   cultural institutions, etc, etc.  Humanity would face a huge
   challenge in trying to come to an accommodation with the machine
    intelligences, without being wiped out, pushed aside as an
   irrelevant inferior species, or ending up in a disastrous series of
   wars to eliminate the new competitors.  I'd love to see a well-written
   novel addressing these things.  But in "The Turing Option", the only
   people who can think of any uses for the AIs are Brian, the AIs
   themselves (sorry, "MIs"; they prefer to be called "Machine
   Intelligences"), and the bad guys who stole the original AI.  And
   what are the uses they come up with?  The bad guys produce a
   product called Bug-Off, which is a robot with a dumbed-down AI
   that picks bugs off of plants.  Brian goes beyond this, pointing
   out that MIs will also be really good at planting and harvesting
   crops, and hey maybe even transporting them to market.  And
   he thinks they'll make really good household servants!  What
   intellectual daring.
 
   The bizarre final scene of the book suffers from the Transporter
   Problem acutely.  Without giving it away entirely, it's your
   typical "brave good guys walk in to arrest the bad guy, but
   it turns out he unfairly has a GUN, and a tense confrontation
   ensues" scene.  The problem is that one of the MIs is there.
   To be consistent with the MI abilities in the rest of the book,
    he should simply have picked up a stone with the manipulators
   in his left pseudo-foot and flung it at supersonic speed at the
   bad guy, knocking the gun from his hand and engraving "I am a
   Bad Person" on his forehead on the rebound.  Instead, the MI
   *grapples* with the bad guy to save Brian's life, and the gun
    goes off, and you get to guess who got shot, etc.  Shortly after
   that Brian begins whining about how the bad guys really won
   after all, for no apparent reason (see above).
 
Recommendation : I see I've waxed pretty negative here.  I don't think
   it's a great book, nor that it'll be remembered long (ironically, the
   back cover says that it "ranks with Michael Crichton's Jurassic Park";
   I tend to agree: I think they're both ephemeral).  I wouldn't recommend
   it to the general reader, or the very picky sf reader.  On the other hand,
   if you enjoy 400-page quick reads, and are interested in having a
   reasonably complete collection of current AI-related sf, it's probably
   worth the six bucks.
 
%A Harrison, Harry
%A Minsky, Marvin
%T The Turing Option
%I Warner Books; Questar Science Fiction
%C New York
%D October 1993 (copyright 1992)
%G ISBN 0-446-36496-7
%P 409 pp.
%O paperback, US$5.99
 
- -- -
David M. Chess                /  "In the long run, life depends less on
High Integrity Computing Lab  /    an abundant supply of energy than on
IBM Watson Research           /    a good signal-to-noise ratio." - Dyson

802.37Spinrad's Deus XJVERNE::KLAESBe Here NowTue Apr 05 1994 15:1768
Article: 547
From: aaron@amisk.cs.ualberta.ca (Humphrey Aaron V)
Newsgroups: rec.arts.sf.reviews
Subject: Prograde Reviews--Norman Spinrad: Deus X
Date: Sun, 27 Mar 1994 15:20:05 GMT
Organization: not specified
 
Norman Spinrad: Deus X
 
A Prograde Review by Aaron V. Humphrey  [some spoilers]
 
I haven't read a lot of Norman Spinrad.  The last thing I read by him, I
believe, was a novella called "World War Last" in Asimov's a few years
back, which was an incredibly heavy-handed satire on the Cold War and the
U.S.A.'s role in it.  I wasn't that impressed with it, but then I've
discovered a low tolerance in myself for heavy-handed satire.  The
Author's Notes at the back include these accolades:
 
   Spinrad's novel about Adolf Hitler, _The Iron Dream_, was banned in
   Germany for seven years, and _Bug Jack Barron_, his controversial
   novel about presidential politics and the power of television, was
   denounced on the floor of the British Parliament.
 
So, after that, _Deus X_ was a pleasant surprise.  It's not heavy-handed at
all.  It deals sensitively with the spiritual issue: does an electronic
replica of a personality (called a "successor entity") have a soul?
 
The Catholic Church of the time is very concerned with this question,
because its current anti-successor-entity position is losing it worshippers
in droves; the ecology is going to hell, and some people would rather live
in computer simulation than in the real world.  They are desirous of
finding out for sure.
 
Father Pierre De Leone is dying.  He is of the near-unshakeable opinion
that successor entities do _not_ have souls, and that they are therefore a
temptation of Satan.  So Pope Mary I considers him the best candidate to
discover the truth--by making a successor entity of him, and attempting to
convince it that it has a soul.
 
Marley Philippe is a net jockey called in when Father De Leone's simulacrum
disappears mysteriously.  It has somehow been spirited away to The Other
Side, where the more mysterious entities dwelling on the Net live...  And
he eventually ends up having to convince Father De Leone that he _does_
have a soul.
 
Spinrad isn't out to bash religion here.  He treats it with sensitivity,
not just pointing and laughing.  And he deals seriously with the issues he
raises.  I expected to have to slog through it; instead it took me only an
evening and a morning.  Short, fast-paced, and yet deeply
thoughtful.  A difficult combination to achieve, but Spinrad pulls it off.
 
It may yet show up on my Hugo ballot.
 
%A Spinrad, Norman
%T Deus X
%I Bantam Spectra
%C New York
%D January 1993
%G ISBN 0-553-29677-9
%P 177 pp.
%O Paperback, US$4.99, Can$3.99
 
--
--Alfvaen(Editor of Communique)
Current Album--Neil Diamond:Stones
Current Read--Robert Reed:The Remarkables
"curious george swung down the gorge/the ants took him apart"  --billbill