Course 2015-2016

Object Oriented Programming

Software Architecture
Manuel Mazzara and Bertrand Meyer, Spring 2015


This course explores issues and fundamental techniques for producing and deploying successful software systems, including large ones.

The course includes three components:
- Software architecture in the strict sense: designing modular structures for reliability, extendibility and reusability. Software architecture addresses the overall organization of software systems and the techniques that make the development of large systems possible. Not all programming techniques that work on small programs can "scale up"; this course explores some that do.
- Non-programming, non-design aspects of software engineering, such as lifecycle models, metrics, project management, quality assurance (including testing and other verification methods), software tools.
- A strong practical content in the form of a collaborative project that applies the techniques discussed.


Understanding the issues that emerge when software systems scale up in size, and learning techniques to manage complexity and deploy successful software.

Students will learn how:

  • to be effective software engineers
  • to manage complexity in large software systems
  • to design, implement, verify and maintain efficient systems
  • to work in a development team

Assessment mechanism

  • 50% based on the project
  • 25% on a mid-term exam at week 9
  • 25% on the final exam in June (date to be fixed)

Reference Material

  • Bertrand Meyer. Object-Oriented Software Construction (2nd edition). Prentice Hall (1997)
  • Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley (1995)
  • Lecture slides will be provided

Lecture slides (Monday): Lab slides (practice)

Week 1 Practice plan and course project

Week 2 Introduction

Week 3 Requirements engineering

Week 4

Week 5

Week 6 Design by contract

Week 7

Week 8

Mid-term exam

Week 10

Week 11 (A)

Week 11 (B)

Week 12 Testing examples

Week 13

Week 14


Theory of Computation
Manuel Mazzara, Fall 2014

Material to appear here; for IU students it is now available on Dropbox.


This course investigates the fundamentals behind how compilers work. Although the act of compilation appears deceptively simple to most modern developers, great minds and deep results lie behind the major achievements that made it possible. It all starts with the Epimenides paradox (about 600 BC), which highlights the problem of self-reference in logic, and brings us to the short time window between WWI and WWII when, in 1936, Alan Turing proved that a general procedure to decide algorithm termination simply does not exist. Another major milestone was reached by Noam Chomsky in 1956 with his description of a hierarchy of grammars. Within this long historical timeframe we can place most of the bricks with which we build modern compilers. The course will be a historical tour through the lives of some of the greatest minds who ever lived.
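Turing's 1936 result mentioned above can be sketched as a proof by contradiction: if a total halting-decision procedure existed, one could build a program that contradicts it on its own input. The following Python sketch illustrates the diagonalization argument; the `halts` function is a hypothetical oracle introduced only for the sake of the argument, not real code that decides termination.

```python
# Sketch of Turing's diagonalization argument (1936).
# Suppose, for contradiction, that a total procedure halts(f, x) existed,
# returning True if and only if calling f(x) eventually terminates.

def halts(f, x):
    # Hypothetical oracle: by Turing's theorem, no total implementation
    # of this function can exist, so any concrete body would be wrong
    # on some input.
    raise NotImplementedError("halting is undecidable in general")

def paradox(f):
    # Diagonalization: do the opposite of what the oracle predicts
    # about f applied to itself.
    if halts(f, f):
        while True:      # predicted to halt -> loop forever
            pass
    return "halted"      # predicted to loop -> halt immediately

# Now ask: does paradox(paradox) halt?
#  - If halts(paradox, paradox) returned True, paradox(paradox) loops forever.
#  - If it returned False, paradox(paradox) halts immediately.
# Either answer contradicts the oracle, so halts cannot exist.
```

The contradiction does not depend on any property of Python; the same construction goes through for any programming language powerful enough to invoke the oracle on its own source.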


A good software developer who is ignorant of how a compiler works is no better than a good pilot asked to fix the engine, and will definitely not be able to provide more than average solutions to the problems he or she is employed to solve. As automotive engineering teaches us, races are won only by the right synergy of a good driving style and good mechanics. Most importantly, the limits of computation cannot be ignored, just as we know precisely how acceleration, forces and friction prevent us from racing at unlimited speed.

Students will learn:

  • How compilers work behind the scenes
  • Some of the history of computing, its theory, and its major personalities
  • The limits of computation, i.e. what computers cannot do
  • What tractability of a problem means

Assessment mechanism

Students will be assigned papers or chapters from the J. Hromkovic book and will be asked to present a seminar. This will account for 1/3 of the evaluation. There will then be a mid-term exam and a final exam accounting for the remaining 2/3 (exam material attached separately).

Reference Material:

  • J. E. Hopcroft and J. D. Ullman. Introduction to Automata Theory, Languages, and Computation. Addison-Wesley (1979)
  • M. Davis, R. Sigal and E. J. Weyuker. Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science (2nd edition). Academic Press (1994)
  • J. Hromkovic. Algorithmic Adventures: From Knowledge to Magic. Springer (2009)
  • Lecture slides will be provided

Distributed and Outsourced Software Engineering
Manuel Mazzara, Fall 2014

Material to appear here; for IU students it is now available on Dropbox.


This course explores the offshoring phenomenon from a technical software engineering perspective, providing a set of guidelines for making outsourced projects succeed, through both management approaches (in particular the CMMI) and technical solutions in the areas of requirements, specification, design, documentation and quality control. The presentation is based on experience of outsourcing at ABB and other companies. The participants will take part in a case study exploring techniques for making an offshored project succeed (or recover from problems). This course provides students with a clear view of the offshore software development phenomenon, enabling them to participate successfully in projects outsourced partially or totally, and also helping them define their own career strategies in the context of outsourcing's continued growth.


Good software engineers play a crucial role in the software industry. However, many of them do not have the proper background to be effective in a team, especially when the team is geographically distributed.

Students will learn how:

  • to be effective requirements engineers
  • to design efficient object oriented systems
  • to develop quality object oriented code
  • to work in a geographically distributed team
  • to understand the need for and the method of outsourcing
  • to grasp the basics of agile development

Assessment mechanism

Assessment is entirely based on the project to be submitted, in coordination with the other universities participating, before Christmas 2014. The project will be evaluated according to:

  • Correct implementation of requirements
  • Architectural design
  • Functionality
  • Usability
  • Performance
  • Effectiveness of teamwork (remote and local)

Reference Material:

  • Bertrand Meyer. Object-Oriented Software Construction (2nd edition). Prentice Hall (1997)
  • Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley (1995)
  • Lecture slides will be provided

Dependability and performance evaluation of computer systems
Salvatore Distefano, Fall 2014

Material to appear here; for IU students it is now available on Dropbox.


The course covers the basics of current data center architectures, ranging from the analysis of single components to the global infrastructure.


To analyze a modern enterprise data center, focusing on the technologies and on the main components, such as computing, memory, storage, and network systems.

Students will gain competence in:

  • Capacity planning of an IT infrastructure
  • Scalability
  • Performance evaluation
  • Dependability

Assessment mechanism

Students will be given assignments which will account for 30% of the evaluation. There will then be a mid-term exam accounting for 30% and a final exam accounting for 40% of the evaluation.


Reference Material:

  • Kishor S. Trivedi. Probability and Statistics with Reliability, Queuing, and Computer Science Applications. John Wiley and Sons (2001). ISBN 0-471-33341-7

  • Ananth Grama, George Karypis, Vipin Kumar, Anshul Gupta. Introduction to Parallel Computing (2nd edition). Addison-Wesley (2003). ISBN 0-201-64865-2
  • Edward D. Lazowska, John Zahorjan, G. Scott Graham, Kenneth C. Sevcik. Quantitative System Performance: Computer System Analysis Using Queueing Network Models. Prentice-Hall (1984)