Friday, December 18, 2009

Lessons Learned: Notes on teaching Fall 2009

The past semester was spent teaching two classes. The first was the introduction to operations research that is typically taught to juniors as their introduction to the more mathematical side of industrial engineering/operations research. It is a required class, so it gets the entire junior class at once. The other was a new course in Homeland Security modeling, a concept course for graduate students. We are working on an emergency response certificate program, and one of the requirements will be a quantitative modeling course.

I had the advantage of the syllabi from the last two people who had taught the course. The main issue is retention of material over time after the course ends. With that in mind, the course coordinator (a faculty member given 'ownership' of the class over the long term) and I made the choice to focus on modeling as opposed to algorithms, with the goal that students learn modeling skills they can apply even if the actual methods are forgotten.

As a preface, I think the reality is that the top students would do well pretty much regardless of how (in)competent the professor is. For them, the best I can do is make the case that something is worth spending time on. But as proud as we may be of our top students, and as little as we can probably do about the bottom, the middle is where we as professors show our worth. So in these notes I try to resist the temptation of thinking only about the top of the class.

1. Focus on modeling and sensitivity. The overt choice was made to focus the course on modeling and sensitivity as opposed to methods. Given that these are engineering students, it is accepted truth that methods are easier to teach than modeling. The issue is that these students have spent 12+ years in math and science classes learning how to follow procedures, so modeling is something different. And there are plenty of people who work in the field who never really learned how to do it.

One result is that the class was more fun to teach. Because the focus was on the modeling, the concepts could be introduced with examples and the models could be built up from understanding the physical situation. For some of the models, after going through the example I could discuss the historical situation that led to the model. For one quiz, I used a paragraph from a New York Times article to provide the problem the students had to model.

Response seemed reasonably positive. In particular, there was gradual recognition of what they were learning as various students started clicking as the semester went on ("I've started to think in sets!"). Others were somewhat resistant, as they were much more comfortable following algorithms (e.g., simplex, Dijkstra's, MST). There was a general resistance to visualizing the problem through the use of diagrams. In the end, the real test is whether they have developed modeling skills by next year when they do senior projects. (While they have LP, queueing, simulation, etc., senior projects tend to be process improvement projects.)
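To give a sense of what "thinking in sets" looks like in practice, here is a minimal sketch of a set-indexed transportation model. This is a hypothetical illustration written with the PuLP library in Python, not one of the actual course assignments (the course used GLPK and the Excel Solver); all set names and data are made up.

```python
import pulp

# Hypothetical data: sets of plants and markets, with supplies, demands, and unit costs.
plants = ["P1", "P2"]
markets = ["M1", "M2", "M3"]
supply = {"P1": 60, "P2": 50}
demand = {"M1": 30, "M2": 40, "M3": 30}
cost = {("P1", "M1"): 4, ("P1", "M2"): 6, ("P1", "M3"): 9,
        ("P2", "M1"): 5, ("P2", "M2"): 3, ("P2", "M3"): 7}

# Decision variables indexed over the set of (plant, market) pairs.
x = pulp.LpVariable.dicts("ship", [(i, j) for i in plants for j in markets], lowBound=0)

model = pulp.LpProblem("transportation", pulp.LpMinimize)

# Objective: total shipping cost, written as a sum over the index sets.
model += pulp.lpSum(cost[i, j] * x[i, j] for i in plants for j in markets)

# One supply constraint per plant and one demand constraint per market --
# the constraints are stated over sets, not written out element by element.
for i in plants:
    model += pulp.lpSum(x[i, j] for j in markets) <= supply[i], f"supply_{i}"
for j in markets:
    model += pulp.lpSum(x[i, j] for i in plants) >= demand[j], f"demand_{j}"

model.solve()
print(pulp.LpStatus[model.status], pulp.value(model.objective))
```

The point of the exercise is that the same dozen lines describe a problem with 2 plants or 200; the data changes, the model does not.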


2. Software. In my preclass survey of goals, more than half of the students mentioned something about using software. The textbook uses LINDO (matrix generator) and Excel Solver. I had them learn Excel Solver and GLPK. I don't think GLPK was any harder than LINDO. In particular, I think software was less important than I expected. Other than the middle portion of the course that focused on sensitivity and duality, there was not much use of the software to actually solve LPs. There was considerably more time spent on interpretation of output. I don't know if the students actually got skilled at using the software. We went through a few rounds of instruction: in-class examples, live demonstrations of translating a formulation into a model, a YouTube video (by a business school professor demonstrating the Excel Solver), and a grad student presentation on GLPK.
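As a rough sketch of what "interpretation of output" involves, here is where the sensitivity information lives after solving a small LP, again in Python with PuLP rather than the GLPK or Excel Solver reports the students actually read; the toy model and its numbers are invented for illustration.

```python
import pulp

# A tiny hypothetical LP, just to show where the sensitivity numbers live.
model = pulp.LpProblem("toy", pulp.LpMaximize)
x = pulp.LpVariable("x", lowBound=0)
y = pulp.LpVariable("y", lowBound=0)
model += 3 * x + 2 * y                 # objective: profit
model += x + y <= 4, "labor"           # resource that turns out to be binding
model += x + 3 * y <= 6, "material"    # resource with slack at the optimum
model.solve()

for name, con in model.constraints.items():
    # con.pi is the shadow price (dual value); con.slack is the leftover resource.
    print(f"{name}: shadow price = {con.pi}, slack = {con.slack}")

for var in model.variables():
    # var.dj is the reduced cost of the variable at the optimum.
    print(f"{var.name}: value = {var.varValue}, reduced cost = {var.dj}")
```

The class questions were about reading exactly these quantities: which constraints bind, what an extra unit of a resource is worth, and why a variable stays at zero.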

We also found two bugs. The Excel Solver had a tendency to return solutions that violated a constraint. The issue is that the default setting for the tolerance was positive (>0) and was less than the rounding in the standard display. So the Excel Solver violated constraints, even on small problems (where finding a feasible solution should not have been too much work). GLPK had a problem with bounds analysis in the Windows version of the software. It turned out that a fix to this problem had recently been found, with the patch developed by a senior in the Pittsburgh IE department (i.e., someone who took this class a year ago).
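To make the tolerance issue concrete, here is a generic illustration (not a reconstruction of the Excel Solver's internals) of how a feasibility check with a positive tolerance can accept a point that actually violates a constraint, while modest display rounding hides the violation; the numbers are made up.

```python
# A solver that accepts any point whose constraint violation is at most
# `tolerance` will report this point as feasible, even though the
# constraint is strictly violated.
def is_feasible(lhs, rhs, tolerance):
    """Accept the constraint lhs <= rhs if it is violated by at most tolerance."""
    return lhs <= rhs + tolerance

lhs, rhs = 100.004, 100.0     # hypothetical constraint: lhs <= 100, violated by 0.004
tolerance = 0.005             # a positive default tolerance

print(is_feasible(lhs, rhs, tolerance))   # True: accepted as "feasible"
print(f"{lhs:.2f} <= {rhs:.2f}")          # displayed as 100.00 <= 100.00
```

The lesson the students took away: always check the constraints against the reported solution yourself rather than trusting the solver's feasibility flag.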

3. Class management. It was a 58-person class, so very large. A large portion of the course, mostly the overview of the different types of models, was taught semi-Socratically. While this was fun as an instructor, the issue with the Socratic method is that you go at the speed of the fastest students, which I soon realized meant I was losing a big chunk of the class, even with a lot of repetition built in.

4. Team teaching. The Homeland Security course was team taught by me and the head of the Center that is developing the certificate. There was a problem with communication. While the topics were agreed upon, we seemed to have somewhat different ideas about the use and purpose of models. This was made worse by the lack of a communication plan between us, so when questions came up, they were not resolved. In addition, he had his students in the class do a project of a very different character than the rest of the class (or the stated purpose of the course), which made grading and advising problematic. Before doing something like this again, I would have to have a more formal discussion on goals and purpose, as well as a plan for ongoing adjustments.
