General Computer Science 320201 GenCS I & II Lecture ... - Kwarc

2.2 Elementary Discrete Math

2.2.1 Mathematical Foundations: Natural Numbers

We have seen in the last section that we will use mathematical models for objects and data structures throughout Computer Science. As a consequence, we will need to learn some math before we can proceed. But we will study mathematics for another reason as well: it gives us the opportunity to practice rigorous reasoning about abstract objects, which is needed to understand the "science" part of Computer Science.

Note that the mathematics we will study in this course is probably different from the mathematics you already know; calculus and linear algebra are of relatively little use for modeling computations. Instead, we will learn a branch of mathematics called "discrete mathematics", which forms the foundation of computer science, and we will introduce it with an eye towards computation.

Let's start with the math!

Discrete Math for the moment

Kenneth H. Rosen, Discrete Mathematics and Its Applications, McGraw-Hill, 1990 [Ros90].

Harry R. Lewis and Christos H. Papadimitriou, Elements of the Theory of Computation, Prentice Hall, 1998 [LP98].

Paul R. Halmos, Naive Set Theory, Springer Verlag, 1974 [Hal74].

©: Michael Kohlhase

The roots of computer science are old, much older than one might expect. The very concept of computation is deeply linked with what makes mankind special. We are the only animal that manipulates abstract concepts and has come up with universal ways to form complex theories and to apply them to our environments. As humans are social animals, we not only form these theories in our own minds, but we have also found ways to communicate them to our fellow humans.

The most fundamental abstract theory that mankind shares is the use of numbers. This theory of numbers is detached from the real world in the sense that we can apply numbers to arbitrary objects, even unknown ones. Suppose you are stranded on a lonely island where you see a strange kind of fruit for the first time. Nevertheless, you can immediately count these fruits. Also, nothing prevents you from doing arithmetic with some fantasy objects in your mind. The question in the following sections will be: what are the principles that allow us to form and apply numbers in these general ways? To answer this question, we will try to find general ways to specify and manipulate arbitrary objects. Roughly speaking, this is what computation is all about.

Something very basic:

Numbers are symbolic representations of numeric quantities.

There are many ways to represent numbers (more on this later); let's take the simplest one (about 8,000 to 10,000 years old).
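The "simplest" representation alluded to here is presumably the unary (tally-mark) system, in which a natural number n is written as n identical strokes. A minimal sketch (the stroke character and the function names are illustrative choices, not taken from the text):

```python
# Unary ("tally mark") representation of natural numbers:
# the number n is the string of n identical strokes.

def to_unary(n: int) -> str:
    """Represent the natural number n as a tally of n strokes."""
    return "/" * n

def from_unary(tally: str) -> int:
    """Recover the numeric quantity by counting the strokes."""
    return len(tally)

def add(a: str, b: str) -> str:
    """Addition in unary is simply concatenation of the two tallies."""
    return a + b

print(to_unary(3))                                  # ///
print(from_unary(add(to_unary(2), to_unary(3))))    # 5
```

Note how addition needs no arithmetic machinery at all in this representation; the price is that the size of the representation grows linearly with the number, which is why positional systems eventually displaced tallies.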

