Proposed Knowledge Units for Programming Languages
for Curriculum 2001

as formulated by the Programming Language KFG

[PL] Programming Languages

The Programming Language Knowledge Area Focus Group was formed less than two weeks before this report was due, so the report presented here is preliminary. We have had extensive discussion of the topics included here, but wish to present our ideas to a larger group of programming languages experts for feedback. We began with the knowledge units of CC '91, but made extensive changes to update the material and adjust its balance.

Contributing members of this knowledge area focus group are Kim Bruce (Williams College), Benjamin Goldberg (NYU), Chris Haynes (Indiana U.), Gary Leavens (Iowa State U.), and John Mitchell (Stanford U.).

The main changes of this report from CC '91 are

  1. Shifting logic programming from core to an intermediate or advanced topic (we have not yet included a KU for this, however, as we have only focused on those topics having core material).

  2. Significantly cutting back the material on automata, regular expressions, and context-free grammars to the material most needed for programming languages. We anticipate that some of the material omitted here will be picked up in the algorithms [AL] or foundations [FO] KU's.

  3. Increasing the attention to modules and information hiding.

  4. Including more detail in most knowledge units, especially those involving object-oriented languages.

  5. Rearranging and reordering KU's to increase their coherence.

The changes we have made resulted in a decrease of roughly 11 hours from the KU's specified in CC '91.

We have included mainly core and intermediate level topics in the KU's. A few advanced topics occur, but we made no attempt to be complete in these (for instance, we do not yet have the intermediate/advanced KU's necessary to support a compiler course). We did label the sections of each KU that belong in each category. The following is a short listing of the KU topics:

PL1: History and Overview of Programming Languages (1.5/.5)

PL2: Virtual Machines (1)

PL3: Formal Languages and Language Analysis (1.5/.5)

PL4: Language Translation Systems (1.5/1/1)

PL5: Types (5/0/1)

PL6: Control of execution (3/1)

PL7: Declarations and Modules (5)

PL8: Run-time Storage Management (3)

PL9: Programming Language Semantics (2/0/3)

PL10: Functional Programming Paradigms (5)

PL11: Object-Oriented Programming Paradigm (4/0/1)

PL12: Distributed and Parallel Programming Constructs (3/2/2)

The numbers after each KU title are the hours devoted to that KU. If there is only a single number, all topics are at the core level. If there are multiple numbers, the first represents core hours, the second intermediate hours, and the last (if present) advanced hours. We were inconsistent in listing intermediate and advanced topics (and in fact weeded out most advanced topics). Most of those remaining are there because at least one member of the committee felt they were candidates for moving to a stronger requirement.

We included justification for most of the KU's, but because of a lack of time, were not able to include them for all KU's. We will add these later. We also hope to have the benefit of comments from a wider group by the time we prepare the next version of this report.

The three processes of theory, abstraction, and design are well represented here. PL3 (formal languages) and PL9 (formal semantics) provide the strongest representation of theory in the curriculum, though aspects of theory also show up in discussions of type checking in PL5. Abstraction is very well represented in these KU's, as programming languages provide abstractions for controlling computations and representing information. PL2 (Virtual Machines), PL5 (Types), PL6 (Control of Execution), PL7 (Declarations and Modules), and sections of PL10 (Functional Paradigm) and PL11 (Object-Oriented Paradigm) all present a heavy dose of abstraction. Design shows up in the examination of trade-offs in choosing language constructs, as well as in PL4 (Language Translation Systems), PL8 (Run-time Storage Management), and PL12 (Distributed and Parallel Programming Constructs). Design also shows up in discussions of the various programming language paradigms.

The knowledge units presented here do not have any explicit requirements for mathematics and physical sciences, though we expect that discrete mathematics will be important for the prerequisites of many topics. We have not yet prepared model courses with these KU's. We have not yet completely fixed the lists of prerequisites and requisites, so many will be revised later.

As a last thought before presenting the programming language knowledge units: examination of the use of programming languages over the last 30 to 40 years has made it absolutely clear that trends in programming language usage change dramatically over time. Mainstream languages have moved from FORTRAN and COBOL in the '60s, to PL/I, Pascal, and C in the '70s (and beyond for the last two), to Smalltalk and C++ in the '80s (and beyond), to Java in the '90s, and that is ignoring large programming subcultures using languages like LISP/Scheme, APL, PROLOG, ML, Visual Basic, Perl, etc. Clearly no programmer can expect to use the same language, or even the same language family or paradigm, throughout her career. An understanding of the core concepts of programming languages and of the different programming language paradigms will better enable programmers to keep up with the language changes that will occur over their careers.

 

 

PL1: History and Overview of Programming Languages (1.5/.5)

A brief historical survey of major early developments in programming languages, beginning with the evolution of procedural high-level languages. An overview of contemporary programming paradigms and their related languages, including procedural, object-oriented, functional, logic, and parallel programming. Language families and trends.

Recurring Concepts: evolution, conceptual and formal models, complexity of large problems, efficiency, tradeoffs and consequences, levels of abstraction, security

Core Lecture Topics: (One and a half hours minimum)

  1. Early languages: FORTRAN, ALGOL, COBOL, LISP, BASIC.
  2. The evolution of procedural languages (the ALGOL 60, PL/I, ALGOL 68, Pascal, C, Euclid, Modula-2, Ada83, and Ada95 chain of development)
  3. Imperative, algorithmic
    1. Procedural, structured paradigm and languages (Pascal, C, Modula-2, and Ada83)
    2. Object-oriented paradigm and languages: (Simula, Smalltalk, C++, Java, Eiffel, Modula-3, Oberon, CLOS, Dylan)
  4. Mostly-functional, algorithmic, higher-order paradigm and higher-order languages with eager evaluation (Common Lisp, Scheme, ML, APL)
  5. Purely-functional, algorithmic paradigm: (Miranda, Haskell, Clean)
  6. Declarative (non-algorithmic) Languages: Logic programming languages (e.g., Prolog)

Intermediate Lecture Topics (half hour minimum)

  1. Parallel programming paradigms: (CSP, Occam, Turing Plus, SR, Emerald, Java threads, Linda, Ada, Orca)
  2. Scripting paradigm (UNIX Shell, Perl, Tcl, Python, Visual BASIC)

Justification for core sections

The core sections of this KU present the student with an historical perspective on the evolution of the important modern programming languages in the commercial and academic arenas. Of particular importance is the emphasis on identifying families of programming languages and their relationships. This helps the student identify substantial similarities among languages in a family, aiding the learning of new languages in a given paradigm (e.g., C++ and Java).

Prerequisites: PF

 

PL2: Virtual Machines (1)

Actual vs. virtual computers. The understanding of programming languages in terms of the corresponding virtual machines (regardless of the actual architecture on which they run). Language translation understood conceptually as an implementation on a virtual machine, followed by a sequence of translations to simpler core languages through a hierarchy of virtual computers.

Recurring Concepts: binding, conceptual and formal models, levels of abstraction.

Core Lecture Topics: (one hour minimum)

  1. What is a virtual machine? (examples: Java Virtual Machine, lambda calculus, etc.)
  2. Hierarchy of virtual machines presented to the user through the program, the translator, the operating system, etc.

Justification for core sections

This unit is important because it emphasizes to the student the architectural independence of good programming language design. Using a virtual machine, or a hierarchy of virtual machines, language features, and even the implementations of these features, can be compared and evaluated using a machine model that is substantially simpler than actual processors. This greatly facilitates understanding the dynamic behavior of a program and proving properties about that behavior. In certain cases, the use of a virtual machine, such as the Java Virtual Machine, facilitates implementations across a wide range of physical machines.

Prerequisites:

 

PL3: Formal Languages and Language Analysis (1.5/.5)

Application of regular expressions and context-free languages as formal descriptions of language syntax, and their use in programming language analysis.

Recurring Concepts: conceptual and formal models, levels of abstraction

Core Lecture Topics: (one and a half hours minimum)

  1. Overview of regular expressions, context-free grammars (and syntax diagrams), and their use in specifying and implementing programming languages.
  2. Context-free grammars and parse-trees, ambiguous grammars.

Intermediate Lecture Topics: (one half hour minimum)

  1. Applications of regular expressions in lexical analysis.
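
As a concrete illustration of the intermediate topic, a lexical analyzer can be built almost directly from a list of regular expressions, one per token class. The Python sketch below (the token classes and names are invented for illustration; any language with a regular-expression library would serve) combines the expressions into one pattern and reads off the token kind from the matching named group:

```python
import re

# One regular expression per token class for a toy expression language.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/()]"),
    ("SKIP",   r"\s+"),
]
# Combine them into a single alternation with named groups.
MASTER = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

def tokenize(text):
    """Yield (kind, lexeme) pairs, discarding whitespace."""
    for m in MASTER.finditer(text):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

print(list(tokenize("x + 42")))
# [('IDENT', 'x'), ('OP', '+'), ('NUMBER', '42')]
```

This is exactly the connection the justification below draws: the formal description (the regular expressions) doubles as the implementation of the lexer.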

Justification for core sections

The student should be made aware of how the syntax of a programming language is formally specified, thus enabling the programming language designer to communicate the language syntax to the users and language implementers. Being able to comprehend regular expressions and grammars, such as the Backus-Naur Form (BNF), is crucial for any programmer attempting to learn a new language. Furthermore, the students should understand that there exists a direct relationship between the formal specification of syntax and the structures created by the compiler during parsing. This provides the foundation for automatic generation of lexers and parsers in modern compilers.

Prerequisites: AL5

 

PL4: Language Translation Systems (1.5/1/1)

An overview of the language translation process, encompassing the range from compilers to interpreters. The focus is on the architecture of compilers.

Recurring Concepts: binding, conceptual and formal models, consistency and completeness, levels of abstraction, ordering in space, ordering in time, efficiency, tradeoffs and consequences

Core Lecture Topics: (one and a half hours minimum)

  1. Comparison of pure interpreters vs. compilers; operation and use
  2. Architecture of a compiler (lexical analysis phase, parsing, symbol table, code generation, optimization)

Intermediate Lecture Topics (one hour minimum)

  1. Parsing: concrete and abstract syntax, abstract syntax trees
  2. Code generation by tree walking
  3. Optimization techniques

Advanced Lecture Topics (one hour minimum)

  1. Application of regular expressions in table-driven lexical scanners
  2. Application of context-free grammars in table-driven and recursive-descent parsing
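
In recursive-descent parsing, each nonterminal of the grammar becomes one procedure. A minimal Python sketch for the invented toy grammar E -> T ('+' T)*, T -> NUMBER, working over a pre-tokenized list and building a nested-tuple expression tree (all names here are illustrative, not from any particular compiler tool):

```python
# Recursive-descent sketch: one function per nonterminal.
def parse_expr(tokens):
    """E -> T ('+' T)* ; returns (tree, remaining tokens)."""
    tree, rest = parse_term(tokens)
    while rest and rest[0] == "+":
        right, rest = parse_term(rest[1:])
        tree = ("+", tree, right)      # left-associative tree
    return tree, rest

def parse_term(tokens):
    """T -> NUMBER"""
    head = tokens[0]
    assert head.isdigit(), f"expected a number, got {head!r}"
    return int(head), tokens[1:]

tree, rest = parse_expr(["1", "+", "2", "+", "3"])
print(tree)   # ('+', ('+', 1, 2), 3)
```

The same structure scales to the open laboratory exercise below: add a function per nonterminal as the grammar grows.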

Justification for core sections

The core components of this unit present an overview of how languages are implemented. Although the precise algorithms used in compilers and interpreters can be considered more advanced topics, it is critical that the student understand how the various aspects of a language, such as the syntax, type system, etc., correspond to components of the implementation, such as the parser, type checker, etc. Without this view, the student will be unable to understand the extent to which decisions made by the language designer have an effect on the implementation of the language.

Suggested Laboratories:

  1. (open) Develop a simple parser (e.g., recursive descent) for arithmetic expressions that returns an expression tree.
  2. (closed) Use a compiler-generator tool to specify and run a finite state automaton that will accept a small part of the lexical grammar of some programming language.
  3. Design and exercise a table-driven parser for a simple context-free language.

Prerequisites: AL6, AR3, PL2, PL3

 

PL5: Types (5/0/1)

Models and descriptions of data. Elementary and structured data types. Type checking and type inference. Polymorphism. User-defined and abstract data types.

Recurring Concepts: binding, security, efficiency, tradeoffs and consequences, reuse, conceptual and formal models, levels of abstraction, ordering in space.

Core Lecture Topics: (five hours minimum)

  1. Data type as set of values with set of operations
    1. Elementary data types: Booleans, characters, integers, floating-point numbers.
    2. Structured data types
      1. product types: arrays, strings, records (structs in C/C++)
      2. coproduct types: unions, variant records
      3. algebraic types (as in ML, Miranda, Haskell)
      4. recursive types: representation by pointers or references
      5. arrow types: function and procedure types
      6. parameterized types
  2. Type checking
    1. Goals
      1. detect errors; preserve intended meaning of program operations
      2. representation independence
    2. Static vs. Dynamic type systems
      1. dynamic strong type checking in Smalltalk and Lisp/Scheme
      2. tradeoff between flexibility and catching errors early
    3. Basic type-checking (without polymorphism)
      1. explicit type checking with explicit type declarations
      2. type inference
  3. User-defined types
    1. type abbreviations (like typedef in C/C++, type in ML)
    2. ADT's and preview of encapsulation via modules
    3. type equality: structural type equivalence (as in ML) vs. name equivalence (as in Java)
  4. Parametric polymorphism (Generics)
    1. intuition and applications: polymorphic operations on lists, other data structures
    2. comparison of implicit (ML, Scheme) and explicit (Ada generics, C++ templates) polymorphism
    3. comparison with ad hoc polymorphism (typecase, instanceof in Java), static overloading (as in Ada83, Haskell), and dispatch in OO languages (and multiple dispatch in CLOS, Cecil)
  5. Subtype polymorphism
    1. structural subtyping rules for records, variants, functions, objects.
    2. type casts (downcast vs. upcast and safety)
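
Several of the structured types above can be illustrated even in a dynamically typed language. The Python sketch below (the names are invented for illustration) models an ML-style algebraic type as a coproduct of two record-like variants, uses a recursive type for trees, and lets isinstance play the role of a typecase over the variants:

```python
from dataclasses import dataclass
from typing import Union

# Two record-like variants (product types) of one algebraic type.
@dataclass
class Leaf:
    value: int

@dataclass
class Node:
    left: "Tree"     # recursive type: a Node refers to Trees
    right: "Tree"

Tree = Union[Leaf, Node]    # the coproduct (sum) of the variants

def total(t: Tree) -> int:
    # isinstance acts as an ad hoc typecase over the variants.
    if isinstance(t, Leaf):
        return t.value
    return total(t.left) + total(t.right)

print(total(Node(Leaf(1), Node(Leaf(2), Leaf(3)))))   # 6
```

In ML or Haskell the same type would be a single datatype declaration, and the typecase would be pattern matching checked statically for exhaustiveness.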

Advanced Lecture Topics (one hour minimum)

  1. Type-checking algorithms
    1. explicit algorithm & implicit algorithm w/ parametric polymorphism (Hindley/Milner, as in ML)
    2. algorithm for checking subtyping in an explicit language

Justification for core sections

Progress in type systems is at the core of many advances in programming languages. Types are an important source of abstraction in helping the programmer think about problems in a higher-level way. Advances in type-checking have made the use of stricter type systems a greater help without the cost in expressiveness of older systems. Topics of structural vs. name equivalence of types, parametric polymorphism, and subtype polymorphism are issues that programmers need to understand in working with modern programming languages.

Suggested Laboratories:

  1. (closed) Find a seeded type error in a dynamically-typed language.
  2. (open) Develop the same program (e.g., an interpreter) in a dynamically typed language (e.g., Scheme) and in a statically typed language (e.g., ML). Compare the development efforts and the resulting code.
  3. (open) Implement some generic collection ADT (e.g., set[T], bag[T], etc.) using a recursive representation (e.g., linked lists). Student gains experience with language support for structured types, recursive types, ADT's, specifications and polymorphism.

Prerequisites: PF2, PF3, PF5, PL2

 

PL6: Control of execution (3/1)

Flow of control associated with evaluating expressions and executing statements. User-defined expressions and statements.

Recurring Concepts: levels of abstraction, ordering in time, security, efficiency, tradeoffs and consequences.

Core Lecture Topics: (three hours minimum)

  1. Expressions, order of evaluation of sub-expressions
    1. Reasons for underspecifying evaluation order: reordering can improve efficiency, allow parallelism
    2. side effects and possible non-termination prevent reordering
    3. conditional expressions: some subexpressions are not evaluated; strictness
    4. functions as abstraction of expressions
  2. Statements
    1. assignment, sequencing (S1;S2), function/procedure calls, goto
    2. conditional and case/switch, loops (while-do, do-until), break and continue
    3. procedures as abstractions of statements
    4. iterators as abstraction of loop structure over data structures
  3. Exceptions and exception handling
    1. try-statements in Ada and Java
    2. termination model vs. resumption model
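
The idea of iterators as an abstraction of loop structure over data structures (topic 2.4) can be sketched with Python generators; the tree encoding below is an invented illustration. The point is that the client loops over elements without knowing the traversal logic:

```python
def fringe(tree):
    """Iterator over the leaves of a nested-tuple tree.
    A leaf is any non-tuple value; internal nodes are (left, right) pairs."""
    if isinstance(tree, tuple):
        yield from fringe(tree[0])
        yield from fringe(tree[1])
    else:
        yield tree

# The client writes an ordinary loop; the traversal is hidden in the iterator.
for leaf in fringe(((1, 2), 3)):
    print(leaf)        # prints 1, then 2, then 3
```

CLU iterators and Java's Iterator interface provide the same separation of the loop from the data structure.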

Intermediate Lecture Topics: (1 hour minimum)

  1. parallel composition (S1||S2)
  2. Functions delay evaluation
    1. closures: lambda in Scheme, fun in ML, blocks in Smalltalk
    2. lazy languages and user-defined control constructs (contrast Haskell and Smalltalk)

Justification for core sections

This KU covers basic components of programming languages, emphasizing subtle or difficult issues that may cause problems for programmers. For example, issues of underspecified evaluation order in expressions with side effects have caused serious difficulties in programs. Most of the standard expressions and statements will be covered only in passing; emphasis will be on the use of programmer-created abstractions (functions, procedures, and iterators), and especially on newer or less familiar constructs like iterators and exception handlers that are important parts of common modern languages.

Prerequisites: PL2

 

PL7: Declarations and Modules (5)

Methods of sharing and restricting access to information in programming languages.

Recurring Concepts: binding, complexity of large problems, levels of abstraction, ordering in space, security, reuse, and evolution.

Core Lecture Topics: (five hours minimum)

  1. Declarations
    1. binding and allocation: aliases, constants vs. variables
    2. visibility of declarations, static vs. dynamic scope
    3. lifetimes (impact of garbage collection and closures)
  2. Parameterization mechanisms
    1. parameter-passing: reference, copy (value, result, and value-result), name, sharing (as in Java) and correspondence to declaration forms.
    2. type parameterization: generics or templates (as in Ada, C++), implicit polymorphism (as in ML, Haskell, and Scheme) [overlap w/PL5.4 to be resolved]
  3. Mechanisms for sharing and restricting visibility of declarations (blocks in ALGOL-like languages, modules in Ada and ML, classes and subclasses in OO languages, packages in Java)
  4. Use of modules to enforce information hiding for data abstractions and code reuse in libraries
    1. information hiding and abstraction boundaries
    2. separation of interface and implementation; existential type binding
    3. aliasing and how it may violate information hiding
    4. separate compilation and linking (interface vs. implementation)
    5. information hiding vs. inheritance (protected in C++ and Java)
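
Parameter passing by sharing, as in Java (topic 2.1), can be demonstrated by a short experiment. Python also passes by sharing, so the sketch below (the function names are invented for illustration) shows the distinction students most often miss: rebinding the parameter name has no effect on the caller, while mutating the shared object does:

```python
def rebind(xs):
    xs = [0]          # rebinds the local name only; the caller is unaffected

def mutate(xs):
    xs.append(0)      # mutates the object both names share; the caller sees it

a = [1, 2]
rebind(a)
print(a)              # [1, 2]
mutate(a)
print(a)              # [1, 2, 0]
```

Under call-by-value of a copy (as with Pascal value parameters) the second call would also leave the caller's data unchanged; under call-by-reference, even an assignment to the parameter would be visible.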

Justification for core sections

Declaration and scoping issues need to be understood by programmers in order to know when names are visible and when the objects they refer to are accessible. Understanding parameter passing modes is essential in order to understand the difference between, for example, parameter passing in Java, Pascal/Ada, and C or C++. Languages like C and Java offer only one parameter passing mode, but other languages offer different or multiple modes. Programmers need to understand the differences between these to avoid confusion. Finally, with the increasing importance of libraries and the general concept of code reuse, a deeper understanding of the purpose of modules and of their current manifestations in programming languages is essential. In particular, information hiding as a way of forming abstraction barriers is key to enabling reuse.

Suggested Laboratories:

  1. (closed) Exercise the same program in languages with dynamic and static scoping, and/or with different parameter mechanisms. Explain the different effects.
  2. (open) Write a large program, in teams, that uses several modules.

Prerequisites: PL2, PL5

 

PL8: Run-time Storage Management (3)

Allocation, recovery, and reuse of storage during program execution.

Recurring Concepts: binding, levels of abstraction, ordering in space, reuse, security.

Core Lecture Topics: (three hours minimum)

  1. Static allocation (as in Fortran or C static)
  2. Stack-based allocation and its relationship with recursion
  3. Heap-based allocation
  4. Garbage collection - include benefits and problems of each technique
    1. Explicit allocation/deallocation (as in C, C++)
    2. Reference counting
    3. Overview of garbage collection algorithms (mark and sweep and/or copying)
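
Reference counting (topic 4.2) can be sketched in a few lines. The class below is a toy illustration, not a real collector, but it shows the retain/release discipline and the point at which storage is reclaimed:

```python
class RefCell:
    """Toy reference-counted cell: reclaim when the count reaches zero."""
    def __init__(self, payload):
        self.payload = payload
        self.count = 1              # the creator holds the first reference

    def retain(self):
        self.count += 1

    def release(self):
        """Drop one reference; return True if the cell was reclaimed."""
        self.count -= 1
        if self.count == 0:
            self.payload = None     # stands in for freeing the storage
            return True
        return False

cell = RefCell("data")
cell.retain()                       # a second reference is taken
print(cell.release())               # False: one reference still outstanding
print(cell.release())               # True: last reference dropped, reclaimed
```

The classic weakness, which motivates the mark-and-sweep and copying algorithms of topic 4.3, is that cells in a cycle never reach a count of zero and so are never reclaimed.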

Justification for core sections

These sections are critical for having the student understand the effect of language design on implementation, as well as the effect of programming techniques on efficiency and memory usage. With the advent of the first truly popular garbage collected language, Java, it is increasingly important for the student to understand the implementation issues and program correctness issues (explicit vs. automatic allocation/deallocation) involved in choosing a language based on a particular memory allocation/deallocation model.

Prerequisites: PF4, PF5, PL2

 

PL9: Programming Language Semantics (2/0/3)

Use of formal and informal models to describe programming language semantics.

Recurring concepts: conceptual and formal models, levels of abstraction.

Core Lecture Topics: (2 hours minimum)

  1. Informal semantics (e.g., the ALGOL 60 or Scheme reports)
  2. Formal semantics
    1. kinds of formal semantics: operational (natural, SOS), axiomatic (Hoare logics), denotational (Domains, functions)
    2. formal operational semantics of some small language (like PCF ...).
    3. Benefits of formal semantics
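
A formal operational semantics translates almost directly into an interpreter, which is one concrete way to convey its benefits. The sketch below gives a natural (big-step) semantics for a toy expression language far smaller than PCF; the tuple encoding of expressions is an invented illustration, and the evaluation rules appear as comments:

```python
# Expressions: an int literal n, or ("+", e1, e2), or ("*", e1, e2).
def eval_exp(e):
    """Big-step evaluation: e ⇓ v."""
    if isinstance(e, int):
        return e                     # rule:  n ⇓ n
    op, e1, e2 = e                   # rule:  e1 ⇓ v1    e2 ⇓ v2
    v1, v2 = eval_exp(e1), eval_exp(e2)
    return v1 + v2 if op == "+" else v1 * v2

print(eval_exp(("+", 1, ("*", 2, 3))))   # 7
```

Each clause of the function corresponds to one inference rule of the semantics, which is why the suggested laboratory below (implementing a given semantics) is feasible for students.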

Advanced Lecture Topics: (3 hours minimum)

  1. Denotational Semantics.
    1. Domains, fixed points
    2. Denotational semantics of some language features, comparison to their operational semantics
  2. Axiomatic Semantics
    1. Hoare triples
    2. Weakest pre-conditions

Justification for core sections

Students need to know how languages are defined so that they can read language reference manuals, and so that they can more quickly learn new languages. Students are also likely to design something resembling a programming language eventually (such as a class library or user interface extension mechanism), and thus should know how they can do this carefully. Operational semantics is the easiest formalism to teach in this area, and also applies most easily to concurrent programming.

Suggested Laboratories:

  1. (open) Given a semantics of a simple language (e.g., PCF), write an interpreter which implements the semantics.

Prerequisites: PL2, PL3, PL6

 

PL10: Functional Programming Paradigms (5)

The functional programming paradigm. Advantages and disadvantages. Recursion over recursive data structures. Functions as data. Amortized complexity.

Recurring Concepts: conceptual and formal models, levels of abstraction, trade-offs and consequences, efficiency, reuse, security

Core Lecture Topics: (5 hours minimum)

  1. Overview and motivation
    1. problems with reasoning about assignment in the presence of aliasing vs. referential transparency when there is no mutation
    2. need for copying when there is mutation vs. sharing when there is none
  2. Recursion over lists, natural numbers, trees, and generalization to other recursively-defined data (using some language like Scheme, ML, Miranda, or Haskell)
  3. Pragmatics: debugging by divide and conquer; persistence of data structures (the old version is still available when the new version is produced).
  4. Amortized efficiency for functional data structures (e.g., amortized queues) and comparison to imperative data structures
  5. Closures, and uses of functions as data (e.g., infinite sets, streams)
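
Topic 5 (closures and streams) can be sketched by representing an infinite stream as a pair of a head and a closure that delays the tail, in the style of Scheme streams. The Python encoding below is an invented illustration:

```python
def integers_from(n):
    """An infinite stream as (head, thunk); the closure delays the tail."""
    return (n, lambda: integers_from(n + 1))

def take(stream, k):
    """Force the first k elements of a stream into a list."""
    out = []
    while k > 0:
        head, tail = stream
        out.append(head)
        stream = tail()       # calling the closure produces the next pair
        k -= 1
    return out

print(take(integers_from(1), 5))   # [1, 2, 3, 4, 5]
```

In a lazy language like Haskell no explicit thunk is needed, since every expression is delayed by default; the closure here makes the delaying step visible.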

Justification for core sections

Functional programming is the main competing alternative to imperative programming. It is important both as a technique that is useful for doing certain kinds of work (e.g., prototyping language designs), and as a connection to other parts of computing. For example, functional programming techniques are used in program specification, theorem proving, and are directly related to mathematics. Moreover, functional programming is important because it teaches students new ways to think about programming, and gives them ideas on how to combine and abstract program parts that are very difficult to see in other paradigms. Recursion is a fundamental technique for functional programming, since it corresponds to recursively described data. The pragmatic features are important for students to understand how to write programs in a functional style (which is important for learning how to specify programs, for example). Knowing something about efficiency in a functional setting prevents students from using these techniques inappropriately, ties the subject to data structure and algorithm analysis, and helps students see the fundamental tradeoffs. The use of higher-order functions is the definition of this style, and important for abstraction and reuse.

Suggested Laboratories:

  1. (open) Write a denotational or operational-style interpreter for a small programming language. Students may also modify such an interpreter to experiment with choices in parameter-passing and scoping.
  2. (open) See the programming exercises in Abelson and Sussman's book...

Prerequisites: PF4, PF5, PL2

 

PL11: Object-Oriented Programming Paradigm (4/0/1)

The Object-Oriented (OO) paradigm. Advantages and disadvantages. Types, classes and objects; subtyping and inheritance. Type checking.

Recurring Concepts: conceptual and formal models, levels of abstraction, trade-offs and consequences, reuse, security

Core Lecture Topics: (4 hours minimum)

  1. Overview and motivation
    1. problems with change in stepwise refinement method
    2. difficulty of code reuse
    3. evolution of programs and the need to reflect incremental changes in the program structure
    4. terms: ADT, type, class, object (instance), method, object's instance protocol (method interface), self/this, super
  2. Mechanisms for defining classes and instances in some OO language (e.g., Smalltalk, Java, C++, Eiffel) and for defining interfaces in Java.
  3. Object creation and initialization.
  4. Inheritance and dynamic dispatch
    1. single vs. multiple dispatch (as in CLOS, Dylan)
    2. dynamic dispatch of methods, method overriding, and method inheritance (examples of method ping-ponging)
  5. Sketch of run-time representation of objects and method tables, how it enables method dispatch.
  6. Distinction between subtyping and inheritance
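
Dynamic dispatch and method "ping-ponging" (topic 4.2), where a superclass method dispatches back into a subclass through self, can be shown in a few lines. The classes below are an invented Python illustration:

```python
class Shape:
    def describe(self):
        # self.area() is dispatched dynamically: for a Square receiver,
        # control "ping-pongs" from the superclass back to the subclass.
        return f"area={self.area()}"

    def area(self):
        return 0

class Square(Shape):
    def __init__(self, side):
        self.side = side

    def area(self):                 # overrides Shape.area
        return self.side * self.side

print(Square(3).describe())         # area=9
```

The inherited describe method never changes, yet its behavior differs per subclass, which is exactly what the run-time method table of topic 5 must implement.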

Advanced Lecture Topics: (1 hour minimum)

  1. Advanced OO type problems
    1. Need for sophisticated parametric polymorphism (e.g., F-bounded or equivalent)
    2. Problems with binary methods

Justification for core sections

Object-oriented programming and object-oriented languages represent the mainstream of programming and of programming language design. They also raise many interesting and confusing issues, which need to be discussed in relation to programming languages. An understanding of the basic terms and semantics of such languages is thus fundamental for practical programming, maintenance, and further language design. Methods and inheritance, which define the object-oriented paradigm, are the key aspects of this understanding.

Suggested Laboratories:

  1. (open) Write an interpreter for a small programming language, or a Turing machine, in an OO language, using the interpreter or visitor pattern.

Prerequisites: PF6, PL2, PL5.

 

PL12: Distributed and Parallel Programming Constructs (3/2/2)

Description of alternative realizations of parallel and distributed programming constructs.

Recurring Concepts: conceptual and formal models, consistency and completeness, efficiency, levels of abstraction.

Core Lecture Topics: (3 hours minimum)

  1. Overview and motivation
    1. Massive (exponential) computational cost of important problems
    2. data-parallel model vs. explicit tasking models of programming
    3. parallel vs. distributed computing (differences in granularity of parallelism and fault tolerance, physical distribution)
  2. Communication primitives for tasking models with explicit communication (distributed programming)
    1. message passing without linked replies (e.g., CSP, Occam, MPI, PVM)
    2. remote procedure calls (e.g., Argus, SR)
  3. Communication primitives for tasking models with shared memory
    1. Semaphores and conditional critical regions
    2. events (publish/subscribe)
    3. threads and monitors (e.g., Ada, Java)
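
Threads and monitors (topic 3) can be sketched as a class whose methods all synchronize on a common lock and wait on a condition, in the style of Java's synchronized methods with wait/notifyAll. The bounded buffer below is an invented Python illustration:

```python
import threading
from collections import deque

class BoundedBuffer:
    """Monitor-style bounded buffer: all methods synchronize on one condition."""
    def __init__(self, capacity):
        self.items = deque()
        self.capacity = capacity
        self.cond = threading.Condition()   # lock + wait set, as in a monitor

    def put(self, x):
        with self.cond:                     # enter the monitor
            while len(self.items) >= self.capacity:
                self.cond.wait()            # block until there is room
            self.items.append(x)
            self.cond.notify_all()          # wake waiting consumers

    def get(self):
        with self.cond:
            while not self.items:
                self.cond.wait()            # block until an item arrives
            x = self.items.popleft()
            self.cond.notify_all()          # wake waiting producers
            return x

buf = BoundedBuffer(1)
results = []
consumer = threading.Thread(target=lambda: results.append(buf.get()))
consumer.start()        # blocks inside get() until the producer runs
buf.put(42)
consumer.join()
print(results)          # [42]
```

The while-loop around each wait is the standard monitor idiom: a woken thread must recheck its condition before proceeding.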

Intermediate Lecture Topics: (2 hours minimum)

  1. Programming primitives for data-parallel models (vector, data-parallel, and SIMD machines)
    1. parallel machine architectures
    2. language extensions (plural data, compiler directives, Fortran 90, HPF, C*)
    3. new languages (e.g., ZPL, Data parallel C, NESL, Parlation)
  2. Comparison of language features for parallel and distributed programming.

Advanced Lecture Topics: (two hours minimum)

  1. Optimistic concurrency control (for tasking models with shared memory) vs. locking and transactions
  2. Coordination languages (e.g., Linda)
  3. Asynchronous remote procedure calls (pipes)
  4. Other approaches
    1. functional languages (e.g., Sisal, Erlang)
    2. nondeterministic languages (e.g., Unity, Parlog)

Justification for core sections

Parallel and distributed programming are becoming quite common; for example, Java includes locks and constructs that allow one to program monitors, and the Java Jini and RMI mechanisms allow one to do RPC. Most programming for window systems involves threads. Even businesses are using distributed programming extensively in CORBA and client-server contexts. Thus it is crucial that students see and understand the basic mechanisms found in such contexts: remote procedure calls (RPC) and monitors.

Suggested Laboratories:

  1. (open) Develop a parallel program in Java.
  2. (open) Write and measure some scientific program in HPF or some other data-parallel language.

Prerequisites: PF7, PL6