2010-07 - Korea IT Times
How to..<br />
How to Design<br />
a Safe Nano Robot or Virus<br />
A Moore's Law or similar ratio for ethics? Asimov's laws and the risks of nanotechnology as described<br />
by the Center for Responsible Nanotechnology<br />
Ethics issues of a new type have begun to spring up around science and<br />
technology due to a combination of factors. There are credibility issues<br />
that may hatch into trust issues, arising from the intentional and<br />
unintentional [and, in a democracy, expected] questioning of findings<br />
in environmental studies and science, as scientists in various fields<br />
and from varying epistemological backgrounds grapple with timeframes,<br />
conceptualisations and facts. But as consumers and scientists come to<br />
understand the capacity of science and technology to innovate, and that<br />
this capacity is increasing rapidly, we need great thinkers on the side<br />
of wisdom to steer us away from wanton disaster.<br />
[Image: Benjamin Franklin working at his desk]<br />
It is interesting and very noteworthy that modern society looks<br />
back for ethical guidance to thinkers such as Aristotle and Asimov.<br />
It is likely that human ethics, in terms of acceptable standards,<br />
have improved in some cases, but only Moore's Law-like calculations<br />
are available for the innovation of technologies, not for ethics,<br />
which would be a most worthy calculation.<br />
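For reference, the doubling behind a Moore's Law-style calculation can be put in plain numbers (an illustrative sketch, not from the article; the function name and two-year doubling period are assumptions):

```python
# Illustrative Moore's Law-style growth: a capacity that doubles
# every `period` years grows by a factor of 2 ** (years / period).
# Function name and default period are hypothetical illustrations.
def growth_factor(years: float, period: float = 2.0) -> float:
    return 2.0 ** (years / period)

# Over a decade with two-year doubling, capacity grows 32-fold.
print(growth_factor(10))  # 32.0
```

No comparable ratio exists for measuring progress in ethics, which is the article's point.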
Isaac Asimov prescribed three laws for the design of robots, laws that<br />
movies (and, consumers worry, some scientists too) have reversed<br />
to create plots intended to entertain for 120 minutes. Isaac<br />
Asimov's Three Laws of Robotics:<br />
1. A robot may not injure a human being or, through inaction,<br />
allow a human being to come to harm.<br />
2. A robot must obey orders given to it by human beings except<br />
where such orders would conflict with the First Law.<br />
3. A robot must protect its own existence as long as such protection<br />
does not conflict with either the First or Second Law.<br />
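The three laws above form a strict precedence: each law yields to the ones before it. That ordering can be sketched as a simple rule check (a toy illustration, not from the article; all names and fields are hypothetical):

```python
# Toy sketch of Asimov's Three Laws as an ordered rule check.
# A proposed action is vetoed by the highest-priority law it violates;
# a lower law applies only if every higher law is satisfied.
# All names and fields here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False       # would injure a human (Law 1)
    inaction_harm: bool = False     # inaction lets a human come to harm (Law 1)
    ordered_by_human: bool = False  # fulfils an order from a human (Law 2)
    endangers_robot: bool = False   # risks the robot's own existence (Law 3)

def permitted(action: Action) -> bool:
    # Law 1 outranks everything: never injure a human,
    # nor allow harm through inaction.
    if action.harms_human or action.inaction_harm:
        return False
    # Law 3 (self-preservation) yields to Law 2 (obedience):
    # a risky action is allowed only when a human ordered it.
    if action.endangers_robot and not action.ordered_by_human:
        return False
    return True
```

For example, an order to harm a human is refused (Law 1 outranks Law 2), while an order that merely endangers the robot is obeyed (Law 2 outranks Law 3).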
As scientists get closer to making items such as Minority<br />
Report's screens a reality, the voice of caution also seems to ask:<br />
what is a robot? Nanotechnology too has taken flight into the realms<br />
of fantasy, and so have consumers' fears about what nano is motivated<br />
by. As companies post pictures of ingestible RFID, the<br />
Center for Responsible Nanotechnology [CRN] lists the major<br />
pitfalls, as they correspond to each of Asimov's laws, as follows:<br />
1. A robot may not injure a human being or, through inaction,<br />
allow a human being to come to harm.<br />
• Nanotech weapons would be extremely powerful and could lead<br />
to a dangerously unstable arms race.<br />
• Criminals and terrorists could make effective use of the<br />
technology.<br />
• Extreme solutions and abusive regulations may be attempted<br />
[this is the surveillance concern].<br />
• Too little or too much regulation can result in unrestricted<br />
availability [Why worry about the human spies?<br />
... Beware the atomic spies].<br />
• Competing nanotech programs increase the danger<br />
[the new arms race... acknowledged].<br />
2. A robot must obey orders given to it by human beings except<br />
where such orders would conflict with the First Law.<br />
• Grey goo was an early concern of nanotechnology.<br />
• Too little or too much regulation can result in unrestricted<br />
availability [Why worry about the human spies?<br />
... Beware the atomic spies and remote activation of ingestibles].<br />
3. A robot must protect its own existence as long as such protection<br />
does not conflict with either the First or Second Law.<br />