Peggy Lowry (recently retired from Oregon State) and I (retired from Colorado State) published an article about this topic way back in 1992: "The Need to Evaluate Research Support Offices in Institutions of Higher Education," Journal of the Society of Research Administrators, Vol. XXIII, No. 4 (Spring issue).  Peggy published at least one more article on the topic, with Sharon Davis, in maybe 1995, I think with NCURA.  This is to say there isn't a great body of literature about the topic.


As you put together your evaluation process, it might be worthwhile to ask some questions.  Those that come to mind include:


  1. What all are we doing?  Newsletters, active electronic information (e.g., forced e-mails to lists or individual faculty about opportunities), training, FAQs, databases, one-on-one contacts (e.g., meeting with new faculty), editing, coaching, topical seminars, collaborations with administration or post-award…
  2. What are the quantities involved?  Staffing hours by professional level, numbers of contacts, numbers of programs tracked, numbers of recipients of information or opportunities advertised, numbers of training events and attendees (and from how many departments), dollar value of programs promoted vs. dollar value of proposals submitted vs. dollar value of awards, number of fields or agencies advertised.
  3. Has anything changed?  If earlier assessment results are available, how do current results compare with former ones?  If no earlier assessment results are available, has anything changed noticeably?  New faculty, new departments, new emphases, new staff, new effort in some direction, new funding, obvious increases, changes in pace or demand?
  4. How well are we doing what we do?  Qualitative assessments by faculty, pre-award staff, and administrators involved in proposal submissions at all levels about what works well and what has not, including both process and product.  Ongoing evaluations of training efforts can contribute.  Some information here might be useful in focusing efforts more precisely: what seems to be unproductive, unnecessary, or labor-intensive?
  5. Attribution and predictions.  Can you tell what contributes to what?  Will new systems on campus require new services or procedures?  Will projected research emphases alter pre-award directions?  Staffing changes?


It is worth mentioning that numbers don't account for everything.  Sometimes just FINDING a funding opportunity for a faculty member in a field without much funding available, or the FIRST award for a new faculty member, albeit only $20K, is a stunning success.  Find a way of recording these, as well as the $8.7M awards.


Celia Walker

xxxxxx@Comcast.net 
