CS 255 Notes 1
-
Introduction
I got my BS and MS from UW-Madison. Ph.D from IIT in 1993. I've been
teaching since 1984. This is my first class at DePaul. I work at Lucent
Tech. It is ok to call me at work. If busy, I will call back. Email is
good. Call me at home before 11:00. At work or home after 0900. We screen
calls, so wait for the tone. Call me anything you want. I like interruptions.
It's easier to answer questions in the context you thought of them, rather
than later. Also, if you have a question, so do several other people.
I also appreciate suggestions on how to improve the course, especially
when there is time for any changes to benefit you.
-
Assignments
There will be several kinds of assignments. I will be assigning problems
from the book, to keep up your interest in reading the material. Also,
it gives me a way to see if you are following the topics or if I need to
spend more time on them.
When we get to the pseudo-machine programming part of the course, there
will be small programming assignments. Most assignments will be due the
week after they are assigned.
-
Tests and test-like things
There will be a midterm test and a final. The final won't be cumulative.
I want to have a few quizzes as well. You will be given at least a week's
warning on the quizzes.
-
Grades
The homework problems will be worth 20 points. The program projects
will be worth 50 points. The tests will also be worth 50 points. Quizzes
are worth 20 points.
I expect work to be turned in when it is due. I will give one day of
grace with no penalty. Each day after that loses 10%.
-
Missed classes
If something in real life causes you to miss a class, let me know as
soon as you can. The class notes will be available on the web.
-
Office hours
I will be here after class as needed most days. Other times are available
by appointment.
-
Overview
This class is going to cover a wide range of topics. We start with
very primitive parts of the machine and work our way up to moderately complex
structures. We will also cover some parts of computer architecture and
do a little programming in there as well. The title of this course
suggests the theme. One of the key jobs of software engineering is to figure
out how to represent the real world in the computer. A computer sees the
world in exactly two states, on or off. The real world is a little more
complicated. We have to figure out a way to translate the enormous complexity
of the real world into the limited and simple world of the computer. In
this class, we start to learn what our tools are and what they can do.
-
What is CS?
Computer science is actually a large collection of different topics.
CS proper is the study of algorithms. Software engineering is the practical
effort of turning the theories of CS into solutions to real problems.
It is the same relationship civil engineering has to physics, just not
as old or deep.
Algorithms are encoded knowledge. For example, the first person to
make chocolate chip cookies had to try a lot of combinations before they
got it right. By writing it down as a sequence of small, precise steps,
they store all that knowledge so that you can make the cookies without
all the study and work. Some of the intelligence needed to do the task
is removed, or rather, stored in the algorithm. Part of your studies in
CS will be learning algorithms others have developed.
A great deal of research has been done in discovering algorithms.
One of the results of this effort was the discovery by Gödel that some
problems have no solution, or that the solution cannot be computed in
reasonable time. Reasonable in this case is measured in fractions of the
lifetime of the universe.
-
History
Some machines, like the abacus, have helped in computation for a very
long time. The actual history of computing is much more recent. People
started designing computing machines in the 1600's. These were mostly special
purpose devices until Charles Babbage designed the difference engine and,
later, the analytical engine. The analytical engine would take instructions
on cards to determine what operations to perform. While it wasn't built due
to the limitations of the technology of the time, some programs were written
for it by Ada Lovelace. Also around this time, looms were built that used
punched cards to determine what to make. This allowed very detailed patterns
to be woven into the cloth. Punched cards were also used to help in the
computation of the 1890 census.
In the 40s and 50s, electro-mechanical and electronic computers began
to be built. We would recognize the overall architecture of these machines
today, although we wouldn't recognize most of the parts. Mercury delay
lines and tubes are pretty exotic things these days.
A non-mechanical "computer" was used for some of the atomic bomb calculations.
The mathematicians on the project broke the computations into a large number
of small steps that were then accumulated to produce the result. Rooms
full of people, mostly women, did the arithmetic on the small pieces and
accumulated the results. This is similar to the SIMD and MIMD parallel
computers of today.
-
Abstraction
We will see this term used a lot. In order to reduce the horrendous
complexity in large problems, we abstract the problem to higher and higher
levels. A single memory bit is actually a fairly complex thing. We talk
about windows as a thing to hide the huge number of things that go into
making it appear on the screen. Real life examples are cars and refrigerators.
-
Bits
A bit is a binary digit. It is either 1 or 0, on or off. It
doesn't matter what we call them, they are just the two states a bit can
be in. A bit is the smallest unit of information. They can be manipulated
using several Boolean (George Boole did a lot of research into mathematical
logic) operations. These include AND, OR and XOR.
-
AND
The AND operation works much like the and conjunction in English. If
A is true and B is true then A AND B is true. If either is false, then
A AND B is false. It is only true if both arguments are true.
-
OR
Similarly, OR works like or. A OR B is true if either A or B is true.
It is only false if both are false.
-
XOR (exclusive or)
XOR doesn't have an English conjunction to map to. It means either
A or B is true but not both.
-
NOT
This changes the value from true to false or from false to true.
There are some examples of these operations on page 17.
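The four operations are easy to try out for yourself. Here is a quick
sketch in Python (an illustration language, not the course's pseudo-machine),
using its bitwise operators on single bits:

```python
# Truth tables for the Boolean operations, with 1 = true and 0 = false.
print(" a  b  AND OR XOR")
for a in (0, 1):
    for b in (0, 1):
        print(a, b, a & b, a | b, a ^ b)

# NOT flips a single bit; on a 0/1 value, subtracting from 1 does the job.
print("NOT 0 =", 1 - 0)
print("NOT 1 =", 1 - 1)
```

Reading down the columns reproduces the definitions above: AND is 1 only
when both inputs are 1, OR is 0 only when both are 0, and XOR is 1 when
the inputs differ.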
-
Gates
A gate is a device that produces a Boolean operation on its inputs.
They can be combined to produce a wide variety of results. One example
is a flip-flop. See the picture on page 18. If the two inputs are
both 0, the output will be whatever the flip-flop was last set to. Sending
a 1 into one or the other of the inputs will change the stored value. This
is very important because we can now build devices that remember things.
These days, electronic versions of flip-flops are used in memory. In the
past, other kinds of memory were used. Examples include core memory, mercury
delay lines and cathode ray tubes.
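The set/reset behavior of the flip-flop can be sketched in ordinary code.
This Python class (again, an illustration, not a model of the actual gate
wiring on page 18) shows the input/output behavior described above:

```python
class FlipFlop:
    """Sketch of a flip-flop: two inputs set or reset one stored bit.
    With both inputs 0, the output is whatever was last stored."""

    def __init__(self):
        self.value = 0

    def step(self, set_in, reset_in):
        if set_in:
            self.value = 1      # a 1 on the set input stores a 1
        elif reset_in:
            self.value = 0      # a 1 on the reset input stores a 0
        return self.value       # both inputs 0: remember the last value

ff = FlipFlop()
print(ff.step(1, 0))   # set    -> 1
print(ff.step(0, 0))   # hold   -> still 1
print(ff.step(0, 1))   # reset  -> 0
print(ff.step(0, 0))   # hold   -> still 0
```

The two "hold" calls are the important part: the device produces its old
value with no help from the inputs, which is exactly what makes it memory.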
-
Hex notation
Long bit patterns are very difficult to remember, so a different notation
is used to represent them. This is another way to use abstraction to reduce
the complexity. Each hex digit (base 16) represents 4 bits. So
1010010011001000
can be represented as
A4C8
You can use the table on page 21 to decode one into the other.
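The conversion is mechanical: take the bits four at a time and replace
each group with its hex digit. A short Python sketch of the example above:

```python
bits = "1010010011001000"

# Split into 4-bit groups and convert each group to one hex digit.
groups = [bits[i:i + 4] for i in range(0, len(bits), 4)]
hex_digits = "".join(format(int(g, 2), "X") for g in groups)
print(groups)      # ['1010', '0100', '1100', '1000']
print(hex_digits)  # A4C8

# Going the other way recovers the original bit pattern.
print(format(int(hex_digits, 16), "016b"))  # 1010010011001000
```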
-
Memory
We now have a way to store bits and a way to represent long bit strings.
So this gives us a memory. Each memory location is called a cell and holds
a certain number of bits. The smallest cells are 8 bits, called a byte.
Several bytes are usually accumulated into words. In most current computers,
a word consists of 32 bits or 4 bytes. Older machines had word sizes of
36 and 60 bits. This was related to the fact that they stored characters
as 6 bits.
Each cell has a unique address. Imagine all the memory cells being
laid out in one contiguous row. Then the address of a cell is the same
as its sequence number, starting at 0. 4 megabytes of memory has addresses
from 0 to 4194303. Since we can address any of these cells just by using
the address, this kind of memory is called RAM (Random Access Memory).
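The top address comes from simple arithmetic: a megabyte is 2 to the 20th
bytes, and addresses start at 0, so the last address is one less than the
total count. In Python:

```python
megabyte = 2 ** 20            # 1,048,576 bytes
cells = 4 * megabyte          # 4 MB of byte-sized cells
print("addresses run from 0 to", cells - 1)   # 0 to 4194303
```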
Within a cell, there is an order and structure to the bits. The bits
are numbered starting from the right and going left. The left most bit
is called the most significant bit and the right most is called the least
significant bit.
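This numbering is what makes shifting useful: to look at bit i, shift the
value right i places and keep the last bit. A Python sketch on one byte:

```python
value = 0b10100100   # one byte; bit 0 is the rightmost bit

# Walk the bits from least significant (bit 0) to most significant (bit 7).
for i in range(8):
    print("bit", i, "=", (value >> i) & 1)

print("least significant bit =", value & 1)
print("most significant bit  =", (value >> 7) & 1)
```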