
The content on these pages is part of the CNI archives; some links may no longer be functional.

ASSESSING THE ACADEMIC NETWORKED ENVIRONMENT

Networked Environment Training Assessment

Brown University CIS Training

115 Waterman Street, Box 1885

Providence, RI 02912-1885

Phone: 401-863-7299 • Fax: 401-863-7329

 

Table of Contents

IN-CLASS EVALUATIONS
    Extensiveness
    Efficiency
    Effectiveness
    Service Quality
    Impact
    Usefulness

INSTRUCTOR EVALUATION
    Extensiveness
    Efficiency
    Effectiveness
    Service Quality
    Impact
    Usefulness

POST-COURSE EVALUATION
    Extensiveness
    Efficiency
    Effectiveness
    Service Quality
    Impact
    Usefulness

SIGN-IN SHEETS
    Extensiveness
    Efficiency
    Effectiveness
    Service Quality
    Impact
    Usefulness

COURSE REGISTRATION
    Extensiveness
    Efficiency
    Effectiveness
    Service Quality
    Impact
    Usefulness

 

Chapter 1

In-class Evaluations

In-class Evaluations provide real-time feedback on topic value and instructor style.


Too often, people doing computer training have extreme opinions of in-class evaluation sheets. Some trainers consider them nearly worthless, often referring to them as “Smiley Sheets” and feeling that they only serve to measure how personable the instructor was. Other trainers err on the side of putting too much value on in-class evaluations, using them as the sole or primary assessment tool. Either attitude, carried to the extreme, loses sight of both the value and the limitations of an in-class evaluation.

Extensiveness: D+

The in-class evaluations are a poor measurement of the extensiveness of the training offerings. These evaluations are optional, and usually unsigned. Fewer than 20% of the attendees normally fill out the critiques, but use varies depending on how much the instructor mentions the form.
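
To make that return rate concrete, here is a minimal sketch of the arithmetic in Python. The report itself contains no code, and the numbers below are hypothetical:

    def response_rate(critiques_returned, attendees):
        """Percent of attendees who filled out the in-class critique."""
        if attendees == 0:
            return 0.0
        return 100.0 * critiques_returned / attendees

    # Example: 3 critiques returned from a class of 18 attendees,
    # consistent with the "fewer than 20%" figure noted above.
    print(f"{response_rate(3, 18):.1f}%")   # 16.7%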

Efficiency: F-

In-class critiques have no value in calculating the efficiency of the training program.

Effectiveness: B+

In-class critiques are often used to try to measure the effectiveness of a training program. Users attending training classes (especially undergraduate students) are not shy about expressing their displeasure if a class fails to meet their expectations. This can happen when a course description is poorly worded or unclear: attendees arrive expecting training in one area and receive training in a different format or on a different topic. An instructor with poor teaching skills will also be flagged in the in-class critique; subject matter expertise is important, but the instructor must also be able to explain the material to new users.

Service Quality: A+

Service Quality is the area where in-class critiques shine. Instructor talent is the aspect most often commented on. Positive comments are usually filled out in class and left for the instructor to read, e.g. “Best computer class I have ever taken!” Attendees less happy with the class will often take the critique sheet with them when they leave. The form can be completed and dropped into the campus mail system; this saves the user the cost of a stamp and increases the likelihood of the critique being returned.

Users may comment on the usefulness of the material, but often it is too early to tell. They will comment on how wonderful the application sounds (especially after a demo-only class), but they want to go back and try it at their own computers before they can state how much they have learned. Comments about facilities (room temperature), registration, and confirmation are included in this category as well.

Impact: C+

The ability to measure the impact of training varies a great deal between demo classes and hands-on classes. The demos expose users to new software, upgraded applications, and tips and tricks for using the software. A demo attendee sees and hears things that sound useful, but it is often difficult to measure the impact until they have had a chance to try the material. Hands-on training classes provide in-class opportunities to apply the material covered, so in-class critiques can and do include some indication of the real-world usefulness (or the lack thereof) of the material covered.

Usefulness: B-

Without a period of actually applying the material, users are often unclear as to how useful the material will be to them outside of the classroom. Negative comments on the in-class critique form will most likely be accurate; the user will normally know when the material has no value to them. Positive or undecided users need time outside of class to try to apply the material in the real world.

 

Chapter 2

Instructor Evaluation

Instructor Evaluations provide real-time feedback on the instructor’s opinion of topic value and various course support issues.


The instructor for a class can provide valuable, firsthand information on course content, unexpected questions regarding content, difficulties experienced by the attendees with the registration process, and a host of other issues. These are NOT critiques of the instructor; rather, they are the instructor’s opinions of how the class went and how it could be improved.

Extensiveness: B

The instructor evaluations provide an opportunity for the instructor to comment on who he or she has observed attending the training session. Demos, in particular, have no pre-registration requirement, so the instructor’s notes as to who attended are often very helpful indications of the extensiveness of the training offerings.

Efficiency: F-

Instructor critiques have only a limited value in calculating the efficiency of the training program. Oftentimes, a large percentage of the people attending the training may arrive late or leave early, and often neglect to sign the Attendance Sheet. The instructor’s critique form can indicate an estimate of the number or percentage of the attendees that did not sign in.
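
As a small illustration, the instructor’s rough headcount can be combined with the sign-in count to estimate that undercount. This is a hedged sketch with hypothetical numbers, not a description of the actual critique form:

    def attendance_estimate(signed_in, instructor_headcount):
        """Estimate how many attendees never signed the Attendance Sheet."""
        unsigned = max(instructor_headcount - signed_in, 0)
        pct = 100.0 * unsigned / instructor_headcount if instructor_headcount else 0.0
        return unsigned, round(pct, 1)

    # Example: 14 signatures, but the instructor counted about 20 people.
    print(attendance_estimate(14, 20))   # (6, 30.0)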

Effectiveness: B+

Instructor critiques can indicate where material in the lesson plan was ineffective with a particular audience, or with a subset of the classroom population. The examples used in class can often be tailored to the needs of the attendees, and the wrong examples can hurt the class’s effectiveness. The instructor, noting this on his or her critique, can give valuable feedback to the curriculum developer for the course.

Service Quality: B-

Service Quality is harder to measure with instructor critiques. Instructors may not always know how well a class went, but they usually know when a class goes poorly. Since the instructor critiques may be seen by the instructor’s supervisor, and feedback is provided each semester from the Training Coordinator to all instructor supervisors, most instructors may be reluctant to “place themselves on report”.

Impact: D+

The instructor can rarely measure the impact of the material being taught. If the instructor has a class with limited attendance, it is possible to survey the attendees to see how they hope to use the material, but the critique is not designed to collect this information. This form of feedback is almost always passed by word of mouth to the Training Group.

Usefulness: B-

Only in cases where the instructor is given verbal feedback by someone unwilling to fill out an in-class critique would the instructor have secondhand information on the usefulness of the material.

 

Chapter 3

Post-Course Evaluation

Post-course evaluations provide real-world feedback on the usefulness of the course content.


The newest form of evaluation used by CIS Training, the post-course evaluation is a form sent out via electronic mail two to four weeks after a hands-on training class to measure the trainee’s evaluation of the course after he or she has had an opportunity to apply the material learned. Limitations include the fact that not all attendees use email, and not everyone will bother to reply to the survey.

Extensiveness: F

The post-course evaluations depend on data collected during course registration. They add no new information as to the extent of use of the training services.

Efficiency: F+

Post-course critiques have only a limited value in calculating the efficiency of the training program. Post-course evaluations can show when people are attending training that they are not using in their jobs (e.g., staff learning how to design web pages when their jobs do not require those skills).

Effectiveness: A+

Post-course critiques can indicate whether the material being taught is in fact of value to people back at their own computers. By looking at the Usefulness ratings from the different audiences (staff, faculty, undergraduate students, and graduate students), CIS Training can determine how effective the training programs are in meeting the needs of the University.
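
A minimal sketch of that breakout, assuming each returned survey records the respondent’s status and a numeric usefulness rating (the form’s actual scale is not specified here; 1-5 is an illustrative choice):

    from collections import defaultdict

    # Hypothetical survey returns: (status, usefulness rating on 1-5).
    returns = [
        ("staff", 5), ("staff", 4), ("faculty", 3),
        ("undergraduate", 2), ("undergraduate", 4), ("graduate", 4),
    ]

    by_status = defaultdict(list)
    for status, rating in returns:
        by_status[status].append(rating)

    # Average usefulness per audience shows which communities are served.
    for status, ratings in sorted(by_status.items()):
        print(f"{status:14s} {sum(ratings) / len(ratings):.1f}")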

Service Quality: B-

Service Quality can also be measured with post-course critiques. Surveying for Service Quality is not the primary goal of the post-course survey form, and it has intentionally been minimized on the form. However, the difference in the design format makes some people feel that they are personally being asked their opinion, so comments are sometimes received from attendees who did not choose to fill out an in-class critique.

Impact: A+

The post-course survey was created to measure impact and usefulness. Training classes are attended and knowledge leaves the room, but what benefit is that knowledge to the people attending training? This is what the post-course survey is designed to measure. One area that is not presently covered, and that we would like to add, is a supervisor’s section of the post-course survey to collect the supervisor’s opinion of the training received. This might be introduced slowly, using it only for Departmental Training classes at first.

Usefulness: A+

Post-course surveys measure usefulness. ’Nuff said.

 

Chapter 4

Sign-in Sheets

Sign-ins provide information on who attends the training classes.


The sign-in sheet asks for the attendee’s name, department (as applicable), and status (staff, faculty, undergraduate, or graduate). It provides a count of the number of people attending class, and a rough breakout as to which of the major audiences they are from.
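
Those two figures, a headcount and an audience breakout, amount to a simple tally. A small sketch with hypothetical sign-in rows:

    from collections import Counter

    # Hypothetical sign-in rows: (name, department, status).
    sign_ins = [
        ("A. Lee", "History", "faculty"),
        ("B. Cruz", "", "undergraduate"),
        ("C. Park", "CIS", "staff"),
        ("D. Shah", "Biology", "graduate"),
        ("E. Wong", "", "undergraduate"),
    ]

    print("attendance:", len(sign_ins))
    print("breakout:", dict(Counter(status for _, _, status in sign_ins)))
    # attendance: 5
    # breakout: {'faculty': 1, 'undergraduate': 2, 'staff': 1, 'graduate': 1}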

Extensiveness: B+

The sign-in sheets collect data on who is attending classes, and which community they are from. This is particularly useful for demos, which require no pre-registration.

Efficiency: C-

Sign-ins have limited value in calculating the efficiency of the training program. They provide us a count of the number of people attending classes.

Effectiveness: F

Sign-ins provide no information on effectiveness.

Service Quality: F

Sign-ins provide no information on the quality of the service provided.

Impact: F

Sign-ins provide no information on the impact of the training provided.

Usefulness: F

Sign-ins provide no information on the usefulness of the training classes.

 

Chapter 5

Course Registration

Course Registration provides information on who wishes to attend the training classes, and why.


The registration process is used only for the hands-on training classes, but it collects a lot of valuable information. We learn which classes are most popular; which portion of the campus community is interested in the training (the status, as we call it); the time of day, day of week, and time of year that are most popular; the platform (operating system) of choice; and the version(s) of software being used.
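
A minimal sketch of the kinds of tallies this data supports; the field names mirror the list above, and the records are hypothetical:

    from collections import Counter

    registrations = [
        {"course": "Intro to Email", "status": "staff",
         "platform": "Windows 3.1", "weekday": "Tue"},
        {"course": "Intro to Email", "status": "undergraduate",
         "platform": "Mac OS", "weekday": "Tue"},
        {"course": "Web Page Design", "status": "graduate",
         "platform": "Mac OS", "weekday": "Thu"},
    ]

    # One tally per field answers "which course / status / platform /
    # time slot is most popular?"
    for field in ("course", "status", "platform", "weekday"):
        print(field, Counter(r[field] for r in registrations).most_common())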

Extensiveness: A+

The registration information collected provides data on who is attending classes, and which community they are from.

Efficiency: B+

We try to schedule enough classes to meet the demand, without having too many. The registration process allows us to see where the demand is, and to schedule overflow classes as needed.
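
A rough sketch of that overflow decision, under an assumed policy that another section is worth scheduling when the waitlist exceeds half a room’s capacity (the actual threshold used by CIS Training is not stated in this document):

    def needs_overflow(registered, capacity):
        """True when the waitlist is large enough to justify another section."""
        waitlist = max(registered - capacity, 0)
        return waitlist > capacity / 2

    print(needs_overflow(registered=27, capacity=15))   # True  (12 waitlisted)
    print(needs_overflow(registered=17, capacity=15))   # False (only 2 waitlisted)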

Effectiveness: B

The speed with which classes fill up is a fairly good indicator of which classes are popular, but it doesn’t tell us what we should be offering that we are not, or how people are using the training after class.

Service Quality: C+

Registration allows us to track who is taking training, either at the individual level (repeat customers) or at the status level (e.g., how many grad students took the Windows 3.1 modem class).

Impact: C-

The registration process is not designed to collect this type of information, but it is noted when it is mentioned, and passed on to the Training Coordinator.

Usefulness: C+

The registration process guides users to the right class for their needs, so it helps improve the usefulness more than it measures it.

 

 



Last updated:  Saturday, July 13th, 2013