Saturday, May 28, 2011

How Do You Know How Your Program Is Doing?

I had a great time last week attending the Teaching Summit put on by the LeBow College of Business at Drexel University. Dr. Thomas Hindelang was a wonderful host, and I was so pleased to have the opportunity to present the opening keynote address. It is hard to boil a 45-minute speech down to a sentence or two, but my primary theme was the disconnect between what colleges and universities claim to accomplish and what the new book Academically Adrift: Limited Learning on College Campuses (Arum and Roksa) says we are actually accomplishing. I then described six steps that I felt we (as faculty) needed to consider if we were going to fulfill the promises being made to our students.

I might write more about those thoughts later, but that speech was not the purpose of this post. After giving my opening remarks, I attended a number of paper presentations at the conference. One of my favorites was "Feedback from Alumni and Employers Guiding Assessment of Business Curricula" by Ellen Kraft and Diane Holtzman of Richard Stockton College of New Jersey.

I was interested in this paper because it addressed one of my pet peeves about colleges. Over the past 40 years, I cannot tell you how many meetings I have attended where academic programs were discussed and changes suggested with absolutely no statistical backup. No evidence was given that a change was needed, and no one could show that the proposal in question might actually provide improvement. It seems that we are often asked to modify programs based on nothing more than our own intuition about what might work better.

Professors Kraft and Holtzman had helped direct a survey at their school in which alumni, as well as the employers of those former students, were asked to identify the characteristics that employees need for success. Because I didn't take great notes, I won't try to explain their methodology. I was just impressed that the folks at that college wanted data to help in assessing the past and planning for the future.

All educational programs must evolve over time. That process, though, should be guided (I believe) by careful data analysis. Too often, changes are made that seem to have no basis other than who can argue his or her position best (or who can last the longest in the debate).

Since I believe data gathering is an essential prerequisite for change, I wanted to mention two relatively painless ways that information can be generated for better program decision-making.

(1) – I am a member of the Accounting Department here at the Robins School of Business at the University of Richmond. Last summer, we decided to survey all Accounting students who graduated between May 2006 and May 2009. We didn’t include the 2010 graduates because they had not yet started work. We didn’t go back earlier than 2006 because the program had changed enough over the intervening years that we were not sure the feedback would still be relevant.

We had reached the point where we were tired of making decisions based on what we would hear from former students (almost randomly) at alumni receptions or from an occasional email. The basic questions were obvious: What were we doing well and, more importantly, where did we need to make improvements to better prepare our students for careers after graduation? All faculty members like to believe they are adding value and providing their students with an appropriate education. But how does any teacher really know?

The Alumni Affairs office here on campus was able to provide us with email addresses for well over half of those students who graduated in our designated time period. The Office of Institutional Effectiveness helped create the survey, send it out, and accumulate the returned data.

We asked a number of general questions including:
--How many jobs have you held since graduation?
--In what field are you currently working?
--Have you taken the CPA Exam and, if so, how well have you done?
--Have you attempted any other certification exams such as the CMA Exam?
--Have you attended graduate school?

We then took the former students through a number of different subject areas (income tax, auditing, government accounting, financial accounting, and the like) and asked them to assess whether they felt their education was adequate (a) for their employment and (b) in comparison to students from other schools with whom they worked. We also asked the same questions for three skills: written communications, oral communications, and IT skills.

We wanted to pinpoint areas where changes might need to be made.

Our response rate was quite good, and the information has proven to be helpful as we move forward and try to guide the logical evolution of our program. Although the overall results were excellent, we did uncover several smaller areas where a number of individuals indicated that they had been at a disadvantage educationally. Obviously, we have worked on those areas during the past nine months to rectify the problems.

(2) – Approximately six years ago, our department – once again – wanted to gather information to aid our internal discussions and decisions. Many of our Accounting students continue to live here in the Richmond area after graduation. We invited 16 former accounting majors who had been out of school for one to three years to return to campus for a one-hour focus group. Because of the number, we split them into two groups that met on separate days. We asked a facilitator to talk with these individuals about their job experiences and how well prepared they felt based on the education they had received in the Richmond program. We provided specific questions for the facilitator to discuss with each group. The sessions were audiotaped and transcribed, and the facilitator wrote up his assessment of the information that was gathered.

No serious political campaign would ever make decisions without focus group studies. No large business would ever start a major advertising campaign without them. I am surprised that universities do not make better use of this technique to help assess the quality of the education their students actually obtain. Once again, as with the more recent email survey, we were pleased with what we heard, although the former students did point out areas where they felt they had been at a disadvantage in their jobs.

Perhaps one reason that books like Academically Adrift get written is that universities do not do a better job of talking with their graduates to identify both the strengths and the weaknesses of the academic programs being offered. Professors Kraft and Holtzman seem to have done a great job of providing useful information for their school.

When was the last time your program made a serious attempt to generate data on the quality of the education your students are receiving? Whether through a survey or a focus group, such information is not that difficult to obtain, and it can help guide future decisions about the modifications your academic programs need.
