| TIME | MONDAY | TUESDAY | WEDNESDAY | THURSDAY | FRIDAY |
|---|---|---|---|---|---|
| 8:40-10:30 | | | | | |
| 10:40-12:30 | | CENG 471 (T) B301 | | | |
| 12:40-14:30 | | | | | CENG 471 (L) INT3 |
| 14:40-16:30 | | | | | |
| 16:40-18:30 | | | | | |
Instructor office: Computer Engineering Department, A318
TA office:
Watch this space for the latest updates. Last updated:
Erdem Bozoğlu: Advantages and Disadvantages of Parallel Computing
Yağmur Şengez: Deep Blue Chess-Playing Supercomputer
Burcu Taşkan: Parallel File Systems
Sena Erkaya: Parallel Programming Languages
Rabia Mallı: Grid Computing
Efe Çiftci: Google as an Example
Sebile Aydın: Parallel Password Cracking
Gökhan Demireğen: Linux Clustering
Mesut Ateser: Cluster Computing
Duygu Yilmaz: Fibonacci Numbers as an Example
Serdar Çetinkaya: Deadlock-Free Parallel Processing
Ali Yılmaz: Message Passing in Java
Ferdi Tekin: Motivation for Parallel Computing and a Comparison of Three Parallel Computing Systems
Parallel Summation
- general solution
- with point-to-point communications
- with collective communications
- performance analysis (speed-up, efficiency; Amdahl and Gustafson laws: optional)
Parallel matrix-vector multiplication
- general solution
- with point-to-point communications
- with collective communications
- performance analysis (speed-up, efficiency; Amdahl and Gustafson laws: optional)
Parallel matrix-matrix multiplication (optional)
- general solution
- with point-to-point communications
- with collective communications
- performance analysis (speed-up, efficiency; Amdahl and Gustafson laws: optional)
This course provides an introduction to parallel and distributed computing and practical experiences in writing parallel programs on a cluster of computers. You will learn about the following topics:
Parallel Computers
Message Passing Computing
Embarrassingly Parallel Computations
Partitioning and Divide-and-Conquer Strategies
Pipelined Computations
Synchronous Computations
Load Balancing
Programming with Shared Memory
The topics can be classified into two main parts:
Parallel computers: architectural types, shared memory, message passing, interconnection networks, potential for increased speed.
Basic techniques: embarrassingly parallel computations, partitioning and divide and conquer, pipelined computations, synchronous computations, load balancing, shared memory programming.
There is a single lecture group. You will be expected to complete significant programming assignments, as well as to run programs we supply and analyze their output. Since we will program in C in a UNIX environment, some experience using C on UNIX will be important. We will provide tutorials on basic C on UNIX during the first few class periods.
In lab sessions, we will concentrate on the message-passing approach to parallel computing, using the standard parallel programming environment MPI (Message Passing Interface). Thread-based programming will also be outlined, as will the distributed shared memory (DSM) approach, if time permits. Each student will complete a project based on parallel computing for the laboratory study.
In addition, each student will complete a project based on parallel computing (distributed computing, cluster computing) for the midterm exam.
Important announcements will be posted to the Announcements section of this web page above, so please check this page frequently. You are responsible for all such announcements, as well as announcements made in lecture.
Readings will be assigned in Parallel Programming: Techniques and Applications Using Networked Workstations and Parallel Computers, 2nd edition, by B. Wilkinson and M. Allen, Prentice Hall Inc., 2005, ISBN 0-13-140563-2.
Beowulf Cluster Computing with Linux, 2nd edition, edited by William Gropp, Ewing Lusk, Thomas Sterling, MIT Press, 2003, ISBN 0-262-69292-9.
Beowulf Cluster Computing with Windows, Thomas Sterling , MIT Press, 2001, ISBN 0-262-69275-9.
Using MPI , Portable Parallel Programming with the Message Passing Interface, William Gropp, Ewing Lusk and Anthony Skjellum, The MIT Press, 1999, ISBN 0-262-57132-3.
Using MPI-2, Advanced Features of the Message Passing Interface, William Gropp, Ewing Lusk, Rajeev Thakur, The MIT Press, 1999, ISBN 0-262-57133-1.
MPI: The Complete Reference (Vol. 1) - The MPI Core, Marc Snir, Steve Otto, Steven Huss-Lederman, David Walker and Jack Dongarra, The MIT Press, 1998, ISBN 0-262-69215-5.
MPI: The Complete Reference (Vol. 2) - The MPI-2 Extensions, William Gropp, Steven Huss-Lederman, Andrew Lumsdaine, Ewing Lusk, Bill Nitzberg, William Saphir and Marc Snir, The MIT Press, 1998, ISBN 0-262-57123-4.
In Search of Clusters: The ongoing battle in lowly parallel computing, Second Edition, by Gregory F. Pfister, Prentice Hall Publishing Company, 1998, ISBN: 0-13-899709-8.
How to Build a Beowulf – A Guide to the Implementation and Application of PC Clusters, by Thomas Sterling, John Salmon, Donald J. Becker and Daniel F. Savarese, MIT Press, 1999, ISBN 0-262-69218-X.
PVM: Parallel Virtual Machine, A Users' Guide and Tutorial for Network Parallel Computing, Al Geist, Adam Beguelin, Jack Dongarra, Weicheng Jiang, Robert Manchek and Vaidyalingam S. Sunderam, MIT Press, 1994, ISBN 0-262-57108-0.
These texts are recommended rather than required; they are useful as references and for alternative points of view.
Some materials are provided. Please let me know how useful you find them. Check this place for updates.
The following references are available online:
There will be a final exam: 40%
Term project as midterm exam: 25%
Term project as lab exam: 25%
Attendance is required and constitutes part of your course grade: 10%
Attendance is not compulsory, but you are responsible for everything said in class.
I encourage you to ask questions in class. Don't guess; ask a question!
You may discuss homework problems with classmates (although it is not to your advantage to do so).
You can use ideas from the literature (with proper citation).
You can use anything from the textbook/notes.
The code you submit must be written completely by you.
The following schedule is tentative; it may be updated later in the semester, so check back here frequently.
| Week | Dates | Topic | Lecture Notes |
|---|---|---|---|
| Lectures | | | |
| 1 | Sep 25, 2007 | First Meeting | |
| | Sep 28, 2007 | | |
| 2 | Oct 2, 2007 | | |
| | Oct 5, 2007 | | |
| 3 | Oct 9, 2007 | | |
| | Oct 12, 2007 | Ramadan Holiday October 11-14, 2007 | |
| 4 | Oct 16, 2007 | | |
| | Oct 19, 2007 | | |
| 5 | Oct 23, 2007 | | |
| | Oct 26, 2007 | | |
| 6 | Oct 30, 2007 | Programming Using the Message-Passing Paradigm II (National Holiday October 29, 2007) | |
| | Nov 2, 2007 | | |
| 7 | Nov 6, 2007 | | |
| | Nov 9, 2007 | | |
| 8 | Nov 13, 2007 | NA | NA |
| | Nov 16, 2007 | Laboratory Review | NA |
| 9 | Nov 20, 2007 | Laboratory Review | NA |
| | Nov 23, 2007 | | |
| 10 | Nov 26-30, 2007 | Midterm Week | |
| 11 | Dec 4, 2007 | | |
| | Dec 7, 2007 | | |
| 12 | Dec 11, 2007 | | |
| | Dec 14, 2007 | | |
| 13 | Dec 18, 2007 | Religious Holiday December 19-23, 2007 | |
| | Dec 21, 2007 | | |
| 14 | Dec 25, 2007 | | |
| | Dec 28, 2007 | | |
| Exams | | | |
| Final | | | |