Evaluating training events is a tricky topic. There are several evaluation models out there, and plenty of articles delving ever deeper into the question of what exactly we should try to measure.
The least useful way to measure whether learning has taken place is to ask mainly about the basics: the event arrangements, logistics, catering, invitations, and how people felt about attending the class (including the materials and the trainer's delivery). This kind of feedback merely helps you improve how you set up future classes, how you invite people, and which trainers are better suited to running a particular course.
To delve deeper, you may want to consider questions such as the following:
Asking for written comments can give you the reasons behind the scores, which you might otherwise miss when looking at the ratings alone. If you find the ratings hard to interpret, you can always go back and ask for more information (if you are able to).
Get their impressions immediately. Expecting people to follow an online link and complete a training class evaluation electronically after the event is often unrealistic: people may have to travel back to their offices, or may get sucked into day-to-day activities pretty fast after the class. By the time they see your online link, they may have forgotten many of the impressions you were hoping to capture. Consider using paper copies of the evaluation and collecting the responses before people leave the location.
Illegible comments. It is a good idea to give each submission a quick once-over while the person is still around, in case their handwriting is hard to read after the event. I have seen training support personnel spend quite a bit of time trying to puzzle out a comment written down in haste at the end of a session.
Focus on the trends. When consolidating responses, the most important things to look for are usually the extremes: the highest and the lowest scores. Then check the comments for anything that helps explain those extreme scores. The overall trends give you a quick check on whether participants found anything completely out of the ordinary, or whether the class went as expected.
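As a minimal sketch of that consolidation step (the data shape, question names, ratings, and comments below are all hypothetical), averaging each question across responses and pulling out the extremes could look like this:

```python
# Hypothetical data shape: each response maps evaluation questions to a
# 1-5 rating, plus an optional free-text comment.
from statistics import mean

responses = [
    {"ratings": {"content": 5, "trainer": 4, "logistics": 2},
     "comment": "Room was too small."},
    {"ratings": {"content": 4, "trainer": 5, "logistics": 1},
     "comment": "Great examples from the trainer."},
    {"ratings": {"content": 5, "trainer": 4, "logistics": 2},
     "comment": ""},
]

# Average each question across all responses.
averages = {
    question: mean(r["ratings"][question] for r in responses)
    for question in responses[0]["ratings"]
}

# The extremes are the first thing to look at.
highest = max(averages, key=averages.get)
lowest = min(averages, key=averages.get)
print(f"Highest-scoring area: {highest} ({averages[highest]:.1f})")
print(f"Lowest-scoring area:  {lowest} ({averages[lowest]:.1f})")

# Then review the written comments, which may explain the extremes.
for r in responses:
    if r["comment"]:
        print("-", r["comment"])
```

With the sample data above, logistics stands out as the weak area, and the "room was too small" comment immediately suggests why.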
Cultural impact. The way people approach scoring on an evaluation sheet can vary dramatically between companies and geographical regions. In some, it may be considered impolite to be overly critical, while in others a critical eye may be seen as a sign of intelligence. Bear that in mind when you interpret scores from a non-homogeneous group of participants.