Contents
Original PL Prelim Syllabus
Jeremy Condit's mirror of PL Prelim Syllabus
Bor-Yuh Chang's Programming Language Prelim Study Schedule
CS164 - Compilers and Programming Languages
CS263 - Design and Analysis of Programming Languages
CS264 - Implementation of Programming Languages (Spring 1995)
CS264 - Implementation of Programming Languages (Spring 2002)
ML
Prolog
Scheme
Introduction to Combinators and λ-Calculus [5].
Chapter 1. λ-Calculus. P.1-19
Chapter 2. Combinatory Logic. P.20-31
Chapter 3. The Power of λ and Combinators. P.32-46 (excluding Boehm's theorem)
Chapter 13. Typed Terms. P.159-167
The Formal Semantics of Programming Languages - an Introduction [3].
Chapter 2. Introduction to Operational Semantics. P.11-26
Chapter 5. The Denotational Semantics of IMP. P.55-76
Chapter 6. The Axiomatic Semantics of IMP. P.77-98
Chapter 8. Introduction to Domain Theory. P.119-140
The Science of Programming [8].
Chapter 7. The Predicate Transformer wp. P.108-113
Chapter 8. The Commands skip, abort and Composition. P.114-116
Chapter 9. The Assignment Command. P.117-130
Chapter 10. The Alternative Command. P.131-137
Chapter 11. The Iterative Command. P.138-148
Chapter 12. Procedure Call. P.149-162
Compilers: Principles, Techniques, and Tools [4].
Chapter 6. Type Checking. P.343-388
Types and Programming Languages [1].
Chapter 8. Typed Arithmetic Expressions. P.91-98
Chapter 9. Simply Typed Lambda-Calculus. P.99-111
Abstraction and Specification in Program Development [6]
Chapter 4. Data Abstraction. P.56-98
Chapter 10. Writing Formal Specifications. P.187-218
Essentials of Programming Languages [2]
Chapter 7. Continuation-Passing Interpreters. P.241-300
Chapter 8. Continuation-Passing Style. P.301-344
Note 1: The above chapter list contains only the chapters most relevant to the prelim; reading other chapters in the same books may also be very helpful.
Note 2: If you are another Spring 2005 PL prelim taker waiting for the same books from the UCB libraries, please contact me directly.
Practical issues:
Hints on Programming Language Design
[102] (2004.10.19)
Lisp: Good News, Bad News, and How to Win Big [103] (2004.10.20)
Functional languages:
Object-oriented languages:
SELF: The Power of Simplicity
[25] (2004.10.23)
Engineering a Programming Language: The Type and Class System of Sather
[99] (2004.10.23)
Pizza into Java: Translating Theory into Practice
[67] (2004.10.24)
Logic programming
Algorithm = Logic + Control
[35] (2004.10.25)
Algol family languages
Revised Report on the Algorithmic Language ALGOL 60 [40] (2004.10.25)
Parallel languages
Parallel Programming in Split-C
[71] (2004.10.26)
Programming Parallel Algorithms
[19] (2004.10.26)
Jade: A High-Level, Machine-Independent Language for Parallel Programming
[24] (2004.10.27)
An Introduction to Programming with Threads
[101] (2004.10.28)
Data Abstraction
Abstraction mechanisms in CLU
[39] (2004.10.30)
Type Inference
A Theory of Type Polymorphism in Programming
[37] (2005.01.08)
Subtyping
A Behavioral Notion of Subtyping
[22] (2004.10.31)
Subtyping Recursive Types
[23] (NOT FINISHED) (2004.11.02)
Parsing
Practical LR Error Recovery
[81] (2004.11.04)
A Practical Method for LR and LL Syntactic Error Diagnosis and Recovery
[33] (2004.11.05)
Attribute grammars
Automated code selection
Optimal Code Generation for Expression Trees: An Application of BURS Theory
[78] (2004.11.07)
BURS Automata Generation
[20] (2004.11.08)
BURG -- Fast Optimal Instruction Selection and Tree Parsing
[74] (2004.11.08)
One-Pass, Optimal Tree Parsing - With Or Without Trees
[69] (2004.11.09)
Hard-coding Bottom-up Code Generation Tables to Save Time and Space
[26] (2004.11.10)
Garbage collection
Uniprocessor Garbage Collection Techniques
[72] (2004.11.15)
Garbage Collection in an Uncooperative Environment
[29] (2004.11.16)
Garbage Collection can be Faster than Stack Allocation
[32] (2004.11.17)
Simple Generational Garbage Collection and Stack Allocation
[28] (2004.11.19)
Numerics
Compiler Support for Floating-point Computation
[30] (2004.11.24)
Runtime representations
Continuation-passing, Closure-passing Style
[76] (2004.11.20)
A New Implementation Technique for Applicative Languages
[34] (2004.11.20)
Abstract interpretation
Call graphs for higher-order languages
Control Flow Analysis in Scheme
[77] (2004.11.24)
Object-oriented Type Inference
[75] (2004.11.26)
Type-based analysis
Global Tagging Optimization by Type Inference
[73] (2004.11.26)
Barrier Inference
[65] (2004.11.28)
Dynamic compilation
Dynamic vs. Static Optimization Techniques for Object-Oriented Languages
[18] (2004.11.30)
Annotation-Directed Run-Time Specialization in C
[66] (2004.12.01)
Fast, Effective Dynamic Compilation
[68] (2004.12.02)
Parallelization and Vectorization
Automatic Translation of FORTRAN Programs to Vector Form
[31] (2004.12.04)
Compiler Transformations for High-Performance Computing
[21] (2004.12.08)
There are two questions on the prelim, half an hour each. The examiners are Prof. Ras Bodik and Prof. Sue Graham. The questions are open-ended: the sub-questions you are asked within each question depend heavily on your previous answers.
-
SSA (Ras Bodik).
What is "SSA" short for?
Define SSA. (You could give some examples and then derive a more formal answer.)
-
Since I gave an example like this:
v = ...; u1 = v; while (...) { u2 = v; v = ...; u3 = v; }
I was asked to perform the SSA transformation on v. What's the growth of program size after transforming it into SSA (linear/quadratic/exponential)? In terms of a single variable? In terms of all the existing variables?
What is SSA good for? It can make some analyses more powerful and others faster. Give an example of an analysis made faster by using SSA.
How to achieve minimal SSA?
Suppose now you don't care about performance. Give an SSA transformation algorithm.
Since I gave a brute-force algorithm (right, the least efficient one), I was asked to improve it slightly (the dominance frontier algorithm).
Size of the definition-use chains of variables after they are transformed into SSA.
28 minutes passed.
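The brute-force renaming idea can be sketched for straight-line code as follows (my own illustration, not part of the exam; the function `to_ssa` and its statement encoding are made up for this sketch). Each definition of a variable gets a fresh subscript and each use refers to the latest one; φ-function placement at join points, which the dominance-frontier algorithm computes, is omitted here.

```python
def to_ssa(stmts):
    """Rename straight-line code into SSA form.

    stmts: list of (target, [operand, ...]) assignments.
    Returns the same statements with each name replaced by name_k,
    where k counts the definitions of that variable seen so far.
    """
    version = {}  # current version number per variable
    out = []
    for target, operands in stmts:
        # Uses refer to the most recent definition (version 0 = undefined/input).
        ops = [f"{v}_{version.get(v, 0)}" for v in operands]
        # Each definition creates a fresh version of the target.
        version[target] = version.get(target, 0) + 1
        out.append((f"{target}_{version[target]}", ops))
    return out

prog = [("v", []), ("u", ["v"]), ("v", ["u"]), ("w", ["v", "u"])]
print(to_ssa(prog))
# → [('v_1', []), ('u_1', ['v_1']), ('v_2', ['u_1']), ('w_1', ['v_2', 'u_1'])]
```

For a single variable this is linear: renaming adds at most one new name per definition. The brute-force approach, which places a φ for every variable at every join point, can grow the program quadratically overall (variables × join points); minimal SSA avoids exactly this.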
-
Macro (Sue Graham).
Give a definition for macros.
Are macros good or bad? Give examples.
Since I carelessly mentioned that macros introduce no overhead, I had to clarify this statement and compare macro expansion with method calls. :(
In which stage are macros handled?
Someone says macros are not natural in a language because they are expanded by a preprocessor that does not belong to the compiler. If we don't use a preprocessor, how can macros be implemented with language constructs? You're free to invent any constructs you find appropriate.
Since I proposed method inlining, I had to compare the power of inlining and macros. :(
In what way do bindings in programming languages differ from bindings in macros?
How to implement multiple return values and iterators without macros?
It's common to define macros for different compilation targets. How to use PL constructs to achieve the same effect?
Ran 15 minutes over time. :)