WO2002097683A1 - Evaluation system and method - Google Patents

Evaluation system and method

Info

Publication number
WO2002097683A1
Authority
WO
WIPO (PCT)
Prior art keywords
evaluation
respondents
groups
computer system
survey
Prior art date
Application number
PCT/AU2002/000722
Other languages
French (fr)
Inventor
Mark Londsdale Hetherington
Original Assignee
Evalu8 Pty Limited
Priority date
Filing date
Publication date
Application filed by Evalu8 Pty Limited filed Critical Evalu8 Pty Limited
Publication of WO2002097683A1 publication Critical patent/WO2002097683A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/02 - Marketing; Price estimation or determination; Fundraising

Definitions

  • the present invention relates to a computer based system and method for carrying out an evaluation, and in particular for performance measurement and evaluation within a group based organisational structure.
  • Performance within an organisation or between organisations is often measured by means of performance reviews or the like.
  • a less than optimal structure often present in performance reviews can lead to a number of undesirable behaviours amongst an organisation's staff. For example, in organisations where an individual's immediate superior has a large influence on a performance review outcome, staff behaviour tends to become of an ingratiating nature towards their immediate superior with little regard for peers and team behaviour.
  • performance evaluation can often be a tedious process and can consequently be implemented very poorly. Further, if diverse inputs are solicited from many different sources to form the basis of a performance evaluation then they may be poorly integrated together. Often, the individuals or structures being reviewed may not be well aware of an organisation's values and the performance evaluation may not properly measure the staff adherence to such values.
  • method of providing an evaluation on a computer system comprising the steps of: (a) selecting a plurality of evaluation respondents; (b) selecting a plurality of groups into which the evaluation respondents are divided; (c) creating at least one interactive evaluation session for the groups of respondents, the evaluation session comprising a series of questions; (d) storing the respondents, groups and evaluation session on the computer system, (e) providing each of said respondents with the opportunity to interactively answer the questions in the evaluation session, (f) storing the answers on the computer system and (g) generating a series of reports based on the answers, wherein different types of reports can be generated by selecting different groups as a basis of comparison.
  • the computer system includes a relational database, and at least some of the evaluation respondents are divided into more than one group, with the relational database being set up to enable the different groups to be selected as a basis of comparison.
  • the groups include primary groups and non-primary or secondary groups, with the groups being selected such that a respondent may belong to only one primary group and to one or more secondary groups.
  • the evaluation session conveniently comprises a number of substantially independent surveys each having a series of questions for answer by a respondent.
  • the method is operated in an internet browser/internet server environment.
  • the environment includes a host server having an associated relational database, at least one facilitator browser computer and a plurality of respondent computers, with the facilitator being involved in the initial selecting and creating steps.
  • Step (e) typically includes the steps of launching the evaluation process by including a universal resource locator (URL) of the address of a corresponding evaluation or survey to be carried out by the respondent.
  • the invention extends to a computer system for producing an interactive evaluation comprising respondent and group organisation means for inputting respondent and group details into said computer system of the respondents and groups participating in the evaluation, evaluation creation means for creating, for each respondent in a group of respondents, an interactive evaluation including a series of questions, notification means for notifying the respondents of said group of the availability of the corresponding interactive evaluations, evaluation execution means for enabling respondents to interact with a corresponding evaluation, storage means for storing the evaluation result in respect of each respondent, and reporting means for collating and reporting said evaluation results.
  • the system preferably includes monitoring means for monitoring the current status of the interactive evaluation.
  • said monitoring means includes a second notification means for sending a reminder to respondents who have not completed their evaluation.
  • the system conveniently includes registration means for initially registering an organisation's details within said computer system.
  • the respondent and group organisation means comprises a set up graphical user interface (GUI) for enabling a facilitator to enter and arrange names into primary and secondary groups.
  • the evaluation creating means advantageously comprises at least one evaluation GUI including a survey template option providing different survey types to choose from, and a question bank including questions associated with at least some of the survey types.
  • the survey type is preferably chosen from a group including a self review, a colleague review, an opinion poll, and confidential and non-confidential reviews.
  • the evaluation execution means typically includes a plurality of GUI's carrying respondent-specific survey questions.
  • the computer system advantageously includes closure means for enabling a facilitator to close an evaluation by force closure or deletion.
  • the reporting means may include select means for selecting a reports option available to an identified subject, the reports option being respondent-and group-based.
  • the reports option preferably includes a ranking option, whereby the identified subject can select which subject-accessible groups or respondents can be ranked by which other groups or respondents.
  • the reporting means may be arranged to process textual answers, and includes an edit facility to enable a user to moderate or bowdlerise the textual answers prior to saving them.
  • the invention extends to a method of providing an evaluation on a computer system, the method comprising the steps of providing an interface for facilitating the initiation of an interactive evaluation session, providing a respondent and group entry interface for enabling the entry of groups of evaluation respondents, providing an evaluation creation interface for enabling the compilation of at least one targeted survey for the respondents, providing an evaluation launching interface for launching the survey to the respondents, providing each of the respondents with the opportunity of interactively answering questions in the survey, receiving and storing on the computer system details of the evaluation session, respondents, groups, surveys and answers, and providing a report selection interface for enabling the selection of a series of reports on the survey.
  • the group and respondent details on the evaluation session are stored on a relational database in such a way that allows for the selection of different combinations of groups at the reporting stage.
  • a method may further include the step of allowing the entry of different groups of evaluation respondents as primary groups and secondary groups, with respondents being limited to membership of only one primary group, but to more than one secondary group.
  • the group and respondent details are stored in tables which include a group table containing details of the groups being evaluated, and a membership table including details of the primary group membership in respect of a given evaluation, and all secondary groups in respect of the evaluation.
  • a further aspect of the invention provides a computer system for enabling a computer- based evaluation, the system comprising initiation means for generating an interface for facilitating the initiation of the interactive evaluation session, respondent and group entry interface generation means for enabling the entry of groups of evaluation respondents, evaluation creation interface generation means for enabling the compilation of at least one targeted survey for the respondents, evaluation launching generation interface means for launching the survey to the respondents, enabling means for providing each of the respondents with the opportunity of interactively answering questions in the survey, storage means for storing on the computer system details of the evaluation session, respondents, groups, surveys and answers, and report selection interface means for enabling the selection of a series of reports on the survey.
  • the computer system may include server means incorporating storage means in the form of a relational database.
  • a computer program product comprising a computer readable medium having program code means which, when said program is loaded, makes the computer execute procedures in accordance with the method as set out above.
  • the invention extends to a server hosting a web page database wherein the graphical user interfaces described above are in the form of web pages, and in combination define a web page database.
  • Figure 1 illustrates schematically an Internet type environment within which the invention operates, including the architecture of the evaluation system
  • Figure 2 is a flow chart of the registration process of the invention
  • Figure 3 is a flow chart of the organise process of the invention.
  • Figure 4 is a flow chart of the create process of the invention
  • Figure 5 is a flow chart of the edit process of the invention
  • Figure 6 is a flow chart of the open or launch process of the invention.
  • Figure 7 is a flow chart of the manage or progress process of the invention.
  • Figure 8 is a flow chart of the close process of the invention.
  • Figure 9 is a flow chart of a report process of the invention.
  • Figure 10 is a flow chart of the evaluate or respond process of the invention.
  • Figure 11 shows a screen shot of a main manage evaluations page
  • Figure 12 shows a screen shot of part of a subject details set up page
  • Figure 13 shows a screen shot of a spreadsheet loading page
  • Figure 14 shows a screen shot of a group display page
  • Figure 15 shows a screen shot of a create evaluation page
  • Figure 16 shows a screen shot of a blank evaluation page
  • Figure 17 shows a survey template window
  • Figure 18 shows a screen shot of a first sample evaluation or survey page
  • Figure 19 shows a screen shot of a second sample evaluation or survey page
  • Figure 20 shows a screen shot of a survey report by group page
  • Figure 21 shows a screen shot of a survey or evaluation launch page
  • Figure 22 shows a screen shot of a launch confirmation page
  • Figure 23 shows a screen shot of a respondent email page
  • Figure 24 shows a screen shot of a first employee opinion survey page
  • Figure 25 shows a screen shot of a first colleague review page
  • Figure 26 shows a screen shot of a second colleague review page
  • Figure 27 shows a screen shot of a third colleague review page
  • Figure 28 shows a screen shot of a fourth colleague review page
  • Figure 29 shows a screen shot of a fifth colleague review page
  • Figure 30 shows a screen shot of a self estimate page
  • Figure 31 shows a screen shot of a colleague review ranking page
  • Figure 32 shows a screen shot of a first progress review page
  • Figure 33 shows a screen shot of a second progress review page
  • Figure 34 shows a screen shot of a closed evaluation or survey page
  • Figure 35 shows a screen shot of a report reviewing page
  • Figure 36 shows a screen shot of a report listing page
  • Figure 37 shows a screen shot of a typical confidential report
  • Figure 38 shows a screen shot of a typical non-confidential report
  • Figure 39 shows a table representative of a ranking report
  • Figures 40A and 40B show a schematic diagram of part of the relational database of the evaluation system of the invention.
  • an interactive performance evaluation system operational over an Internet browser type environment.
  • In Figure 1 there is illustrated schematically a standard browser environment wherein a facilitating client computer and browser 1 interacts over the Internet 2 with a server application 3 running on another host computer.
  • Other client computers which are linked both to the facilitating client computer 1 and the server application 3 include respondent computers and report receiver computers 4 and 5 respectively.
  • the server application can operate in accordance with standard common gateway interface (CGI) techniques, and incorporates a relational database 6 which is populated by the facilitator and respondent computers.
  • the application is provided for serving web pages over the Internet so as to allow an organisation to perform comprehensive evaluations of their staff performance, team performance, organisation performance and staff opinion as required or on as many issues as required. Further, the application can be used for evaluating anything by anyone, inasmuch as non-human objects (for example films or detergents) can be the subjects of a survey and anyone at all can be a respondent of a survey.
  • the functions/interfaces provided by web pages served by the server 3 can include the following (described in more detail below):
  • Organise - For the organisation, review entered details, upload corporate information such as a logo, enter the staff (and other people or objects that they might require to be respondents or subjects of surveys), designate supervisors, assign other administrators/facilitators, enter entities or groups, relate groups to each other, relate groups to staff, and designate which staff can see which results for which groups;
  • FIG. 2 illustrates an example flow chart 10 of the steps involved in the registration process.
  • This flow chart shows how a representative or facilitator of an organisation can enter a web site address 11 to provide for initial registration.
  • the representative is asked to enter minimal organisational information 12 including naming a facilitator and their email address.
  • An initial password is then sent 13 to this facilitator, to at least allow the email address to be authenticated.
  • the facilitator is also sent an email containing a special URL.
  • the facilitator is requested to enter the password contained in the email; once this is done the facilitator enters his or her own secret password 15 (twice to prevent typing errors).
  • After passing through a "Welcome" page 16, the facilitator is then logged in and can proceed to the next step of Organise, or to another facilitator function 17.
  • FIG. 3 illustrates an example flow chart of the steps 20 involved in creating an organisation profile within the system.
  • the flow chart shows four different selectable functions 21 to 24 that the facilitator has at their disposal to set up the organisation ready for evaluations to be created.
  • the first function that the facilitator is likely to use is Edit Organisation Details 22. This allows them to review existing details, add country (mandatory for GST purposes), set system language preference and upload the organisation logo.
  • the second function that can be used is to upload a staff file 23. This can be done by first producing a spreadsheet containing the fields as predefined by the system. This can be done by a variety of methods as required and may be created by extraction of information from existing databases run by the organisation. If no such data exists, the facilitator can enter the details with the Edit Staff function.
  • One key field in the upload is the primary group.
  • a primary group is defined as the one group (often department or group) to which a person belongs. As the upload process encounters new groups, they are created in the system database.
  • the facilitator is able to alter entries using the Edit Groups functionality 24. This allows the facilitator to add secondary groups and relate staff to them. It also allows the facilitator to relate groups to each other and designate one to be the son of another which is used in report generation at a later date.
  • the Edit Staff function 21 allows the facilitator to review what has been uploaded, add new staff and relate staff to secondary groups. At this time the facilitator can also designate which staff can view reports on groups and which staff can be co-facilitators. Create:
  • Figure 4 illustrates the exemplary steps 30 involved in a facilitator creating an evaluation initially. After entering 31 the name, description and target dates for launch and close, the facilitator can decide 32 whether to specify what surveys constitute the evaluation 33 or derive them from a previous evaluation chosen from a list 34.
  • Figure 5 illustrates the exemplary steps 40 involved in editing an evaluation.
  • a created evaluation is chosen 41, and surveys can then be named, renamed, added or removed 42.
  • For each survey 43 the list of respondents, report permissions, questions and multiple-choice answers are specified and added 44. If the survey requires a set of subjects (people or objects) to be evaluated, these are specified here also.
  • Figure 6 illustrates a flow chart of how a facilitator opens or launches an evaluation 50.
  • the facilitator instructs the evaluation system to launch it 51.
  • the evaluation system needs to run some checks 52 to ensure that the staff list is valid, the group relationships are consistent (no loops) and that all details of the evaluation are consistent.
  • the evaluation system asks the facilitator to provide the text of the opening email 53 to be sent to all respondents before formally opening the evaluation, confirming the list of respondents 54 and sending the email to all respondents.
  • the evaluation system asks the facilitator 55 which of the new questions that the facilitator has created can be placed in the question bank. This question bank is open to everybody and enhances the value of the service.
  • the facilitator may optionally be asked if they want the organisation name to be placed by the example. This is advantageous for two reasons: first, facilitators of the organisation can quickly recognise and reuse their questions and secondly, the organisation gets advertising and value from their donation to the bank.
  • Within each email sent 57 is a unique URL that describes the facilitator's name, organisation and evaluation so that the log-in will only require the entry of a password.
  • Figure 7 illustrates a flow chart 60 of the steps in how a facilitator manages the progress of an evaluation.
  • a 'View Progress' selection on an evaluation in progress is made 61.
  • a report 62 is then generated showing which respondents have not started, started but not finished and finished their part in the evaluation.
  • the facilitator is given the opportunity of sending reminder emails 64 to either those who have not even started or those that have started but not finished.
  • the facilitator is also given the opportunity to "unfinish" a session 63. This may be necessary because a respondent may have prematurely finished a session and might want to go back and add more feedback. For those respondents who have not even started 65 or who are still busy 66 with evaluations, reminder emails 67 are sent.
  • Figure 8 illustrates a flow chart 70 of the steps involved in a facilitator closing an evaluation. It may be the case, despite a series of reminder emails, that when a facilitator comes to close an evaluation, there will be unfinished evaluation sessions.
  • the facilitator selects the close option 71, and is then given a summary of the unfinished sessions 72 and is given the choice on each of them to either automatically finish the sessions or delete them together with all the associated answers and rankings.
  • On closure 74, all people who can review reports on the evaluation are subsequently notified by email 75 giving them another unique URL to aid quick log-in; these are the supervisors of staff affected and those people designated as being able to review reports on specific groups.
  • Figure 9 illustrates a flow chart 80 of the steps involved in report production.
  • the flow chart 80 shows how a facilitator can produce reports after an evaluation has been closed 81. These reports can be survey specific.
  • a list of available reports is produced 82. The list will vary depending on the report permissions set up during the create phase. If the person is the nominated reviewer for a report then a link will appear for the option to review the report. Every report that has textual answers in it must be reviewed. The reviewer of each report is specified by the facilitator during the create phase. The user simply selects the required report 83, enters any required parameters, receives the report on their screen 84 and prints it 85 via the browser functionality if required.
  • Figure 10 illustrates a flow chart 110 of the steps involved in the main process of respondents performing an evaluation. Although the steps form only a small part of the system, such steps constitute the main point of interaction of most users of the system.
  • After logging in via the special URL sent from the open evaluation process 111, an evaluator or respondent will be presented with a welcome page and a list of modules or surveys for which he or she is required to be an evaluator or respondent 112. This will vary from respondent to respondent.
  • the respondent answers questions, ranks people or objects, assesses performance against objectives 114, 115 or whatever is required by the survey or page before pressing NEXT to select a different page from the navigation bar 115.
  • the respondent is taken into the Finish survey 117 where he or she is asked questions on the evaluation or survey itself (specified, if required, by the evaluation or survey creator). In this survey or page a general check is made to ensure that the respondent has performed all mandatory actions 118 before allowing them to finish and commit their results to the database 119 and form part of the evaluation data.
  • the client or facilitator is informed that a certain level of Java needs to be installed as a plug-in to the browser. If required, a Java download and install will be automatically initiated, this being part of a standard process provided by Sun Microsystems.
  • the initial page providing entry into the organised or managed process of the invention is shown at 120 in Figure 11.
  • a menu bar 121 allows the user to set up a staff/customer (i.e. subject) list or to create an evaluation. All draft evaluations, evaluations in progress and closed evaluations are summarised on this page at 122, 123, and 124.
  • this page will have fewer instructions and will show all the subject's evaluations in their relevant "states".
  • the first task that the facilitator needs to perform is to set up a people database by selecting the set up staff/customer list option.
  • the so-called Staff/Customer details page 125 of Figure 12 is presented, with the subject details 126 at this stage including only those of the facilitator.
  • the facilitator then needs to enter more staff or other subjects either by typing them into the matrix provided or by preparing and loading a spreadsheet by selecting the option under "Tools".
  • the screen shot of Figure 13 shows a spreadsheet loading page 127 for enabling bulk entry of staff/customer or other details. This is achieved by downloading and saving the sample Excel spreadsheet, populating it with staff details, clicking on the browse button to select the spreadsheet from the system, and clicking on the upload staff/customer spreadsheet button 128.
  • Figure 15 shows a screen shot of a create evaluation page 133.
  • the name of the new evaluation is entered into text box 134, and the style of evaluation is then chosen either by clicking on the "blank" link 135 or on the name of a previous evaluation.
  • the blank evaluation is generally created in the first instance, and Figure 16 shows a screen shot of a blank evaluation page 136 with the evaluation name, description, target launch date and target end date having to be filled in respective text boxes 137, 138, 139 and 140.
  • An add survey link 141 is also provided for enabling the facilitator to add different types of surveys to the evaluation, in this case broadly categorised into confidential, non-confidential and colleague review. Once a survey is added, the focus turns to the actual survey.
  • An alternative way of adding a survey is shown in Figure 17.
  • a survey template window 142 provides various survey options, some of which are set out in the window 142. The selected survey is simply dragged and dropped into the evaluation, and appears in a tree structure 143 in the left panel of Figure 18, which shows part of a sample survey or evaluation 144. Questions can be modified and inserted either by entering new ones or dragging existing ones in from a question bank window 145 including categorised questions obtained from the tools menu.
  • Figure 19 shows a colleague review screen shot 146 in more detail.
  • a colleague review has a number of selectable features. First, the facilitator can choose whether the questions are asked all at once in respect of each subject, as is shown at 147. Alternatively, or in addition, respondents can answer one question at a time on a list of subjects 148. A ranking option 149 is also provided, enabling the facilitator to choose that the respondents rank the subjects in order of value so that a unique value ranking can be created.
  • the colleague review also allows respondents to choose which people to answer questions on. This constitutes a subset of the subjects selected from the staff/customer list.
  • Figure 20 shows a page incorporating a survey report by group window 151 providing the facilitator with option blocks 152 for enabling the facilitator to specify exactly who will receive the reports. Depending on the type of report, different report permissions are provided.
  • the facilitator can check that the evaluation is ready for launching.
  • a check window (not shown) may be provided for ensuring that further information is incorporated. For example, respondents may be added to the self assessment survey, and report permissions may be required for the "survey report by person" in the self assessment survey.
  • Part of a survey launch page is shown at 153 in Figure 21.
  • the facilitator can select at 154 whether any questions authored can be used in the question bank, and the subject and text of the email inviting people to participate in the evaluation are entered in the text blocks 155 and 156.
  • the confirmation screen 160 of Figure 22 is displayed on launching the evaluation, and includes full details of the respondents who have been sent an email inviting them to participate.
  • each respondent automatically receives an email 161 asking the respondent to select a link and set their password.
  • Figure 24 provides a first employee opinion survey page 162, which is the first page of the first survey.
  • a left-hand panel 163 describes the contents of the evaluation, and includes links to the colleague review, person, value, ranking and self-assessment surveys. The status of each survey is depicted by a traffic light icon 164.
  • a first colleague review page is shown at 165 in Figure 25, with the left-hand panel now including a list of colleagues to be reviewed or evaluated 166.
  • Figure 26 shows at 167 a second colleague review page incorporating a comparative list of colleagues dealt with on an issue by issue basis.
  • a third colleague review page 168 provides a series of text boxes 169 in which questions on various colleagues can be answered on an issue by issue basis.
  • Figure 28 shows a fourth colleague review page 170 including a series of score boxes 171 for enabling a list of selected colleagues to be scored out of 10.
  • a fifth colleague review page 172 includes tick-in-the-box questions about colleagues which can be dealt with on an issue by issue basis.
  • Figure 30 shows a self assessment page 173 providing a series of text boxes 174 for text- based responses.
  • a colleague review ranking page 175 allows respondents to rank one another in value order, providing a series of text boxes 176 of differing levels, and in this case ranked from high to low. This constitutes a significant feature of the invention, in that by collecting individual ranks of subjects by respondents, the system is later able to produce a ranking of any group or set of groups as ranked by any other group or set of groups.
  • the respondent ultimately proceeds to a Finish page where the survey responses are submitted. This is analogous to signing a form. Once this has been done, the respondent cannot return to answer further questions without requesting that the session be "unfinished" by the facilitator.
  • the evaluation acquires an "in progress" status.
  • the facilitator is able to view the progress of the evaluation by selecting the relevant option from the relevant drop down list 178.
  • a full survey summary is provided, giving the statuses of all the respondents. The facilitator is then able to review the progress of the responses and can choose to delete or to "unfinish" sessions 179A, or to send out reminders to respondents who have not yet finished.
  • a confirmation report showing which people have been sent emails about reports for the evaluation is then provided, and an email informing the respondents that the evaluation is closed is then sent to all respondents, who are also invited to review reports.
  • the facilitator can then see that the evaluation has been closed, and it can be selected from the drop-down menu 178 for report viewing, as is shown in Figure 35.
  • the user is then presented with a list of the reports that he or she is entitled to view for the evaluation in one of two ways. If the user is a facilitator, the above drop-down menu selection may be made. Alternatively, the user may receive an email inviting him or her to review a report, after which the link in the email is clicked on and the user is then able to sign in. As is clear from the report listing page 181 of Figure 36, the user can then view a selected report or can review the textual answers of a report if such user is a designated viewer for the report.
  • the way in which the database is designed allows a complex organisational structure to be captured through the facilitator interface, and for the structure subsequently to be analysed on numerous different group-based levels.
  • Ranking reports represent a more complex version of the report.
  • a user is given rights to view ranking reports of one or a set of different groups.
  • When the ranking report option is selected, the user is presented with a first list of the groups for which they are allowed to view the results (which could be only one). These are known as the subject groups.
  • the second list is of all the groups that have members who have ranked or voted on any of the members of the subject groups. These are known as respondent groups.
  • The user is enabled to get a ranking report on a combination of subject groups as voted on by any combination of respondent groups.
  • a typical ranking report is shown at 184 in Figure 39, and includes name and group details, together with a relative score column 185 and details of the number of respondents 186 in each score.
  • the system is typically set up not to include subjects who received fewer than a critical number (say five) of ranks or votes. It can be seen that any subject can be given a ranking position within any group of which they are a member, based on the rankings (votes) cast by any combination of groups. These positions can be reported in an individual's Person Reports. The most common report is to view the whole organisation as ranked by the whole organisation. It is also possible to drill down and to view the results of any subset of the organisation based on the votes cast by any other subset (or the same subset).
  • the system will also allow "objects" to be ranked by Respondents.
  • Objects are very similar to subjects; they can be arranged in group structures and reported on in the same way.
  • An object can be anything that is comparable with another set of objects. Typical examples are: films, shows, detergents, sports teams, political parties, and individual performances. This allows Respondents to be requested to answer questions on and rank not only people but also any set of objects that they want compared. Films, for instance, would be categorised in some way, say: drama, thriller, comedy, action. They may also be categorised using secondary groups by, say, censorship level, actors, or indeed any other factor. The Respondents can then be grouped by area with subgroups (e.g.
  • Subjects and objects can accordingly be ranked with the ability to group hierarchically in both primary groups and secondary groups, with both being combined in reports of any kind.
  • In Figures 40A and 40B, part of a relational database 190 is shown, illustrating the key tables that implement the system of the invention, together with the relationships between the tables.
  • Each table consists of several fields which are labelled within the table boundaries. Key icons indicate the primary key and the lines show the relationships between the various tables.
  • the tables include a person table 191, an answer table 192, a survey question (called ModuleQuestion) table 193, a session table 194, a ranking table 195, an "evaluation group" (called EvaluationEntity) table 196 containing the various groups being evaluated or surveyed, and an "evaluation group member" (called EvaluationEntityMember) table 197 containing the membership of a given group in respect of the given evaluation.
  • An "evaluation group member (all)" table 198 (called EvaluationEntityMemberAll) contains a membership for a given group and all its sub-groups in respect of the given evaluation or survey.
  • an "answer detail" table 199 provides details of all the answer choices in a multiple choice question, whilst the "answer" table 192 contains the actual answers, each of which refers to an answer choice or a textual answer, or contains a number.
  • a question table 200 includes various question types, a survey table 201 includes the various different surveys or modules and an evaluation table 202 includes details of the evaluation itself such as name, description and start and end dates.
  • The relationship between the various tables will be apparent to one of normal skill in the art on the basis of what is included in Figures 40A and 40B.
  • Foreign keys in tables relate to primary keys in other tables, and the relationships can be derived from the names.
  • tblAnswer 192 has three foreign keys relating to primary keys of other tables: anPEID to peID in tblPerson 191, anSEID to seID in tblSession 194 and anMQID to mqID in tblModuleQuestion 193.
  • the answer table actually has two other foreign key links: aTEID relates to the text table where textual answers are held and a USID relates to the user session table which holds details of the user's session. It can be seen that this design allows a facilitator to collect and analyse the opinions (both answers to questions and rankings) of any set of groups on any other set of groups; a simplified sketch of such a schema and ranking query is given below.
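The table relationships and the ranking behaviour described in the bullets above can be illustrated with a small sketch. The table and column names below (person, eval_group, group_member, ranking, pe_id and so on) are simplified assumptions standing in for the patent's tblPerson, tblEvaluationEntity, tblEvaluationEntityMember and tblRanking tables; the query shows one way a report ranking the members of chosen subject groups, using only the votes cast by chosen respondent groups, might be produced, including the described cut-off of roughly five votes per subject.

```python
# Hypothetical, simplified schema: stands in for tblPerson, tblEvaluationEntity,
# tblEvaluationEntityMember and tblRanking described above; not the patent's actual design.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE person       (pe_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE eval_group   (ee_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE group_member (ee_id INTEGER REFERENCES eval_group(ee_id),
                           pe_id INTEGER REFERENCES person(pe_id),
                           is_primary INTEGER);            -- 1 = the person's one primary group
CREATE TABLE ranking      (subject_pe_id    INTEGER REFERENCES person(pe_id),
                           respondent_pe_id INTEGER REFERENCES person(pe_id),
                           score            REAL);
""")

def ranking_report(subject_group_ids, respondent_group_ids, min_votes=5):
    """Rank members of the subject groups using only votes cast by members of the
    respondent groups, dropping subjects with fewer than min_votes votes."""
    s_marks = ",".join("?" * len(subject_group_ids))
    r_marks = ",".join("?" * len(respondent_group_ids))
    sql = f"""
        SELECT p.name,
               AVG(r.score) AS relative_score,
               COUNT(*)     AS votes
        FROM ranking r
        JOIN person p ON p.pe_id = r.subject_pe_id
        WHERE EXISTS (SELECT 1 FROM group_member sm
                      WHERE sm.pe_id = r.subject_pe_id AND sm.ee_id IN ({s_marks}))
          AND EXISTS (SELECT 1 FROM group_member rm
                      WHERE rm.pe_id = r.respondent_pe_id AND rm.ee_id IN ({r_marks}))
        GROUP BY p.pe_id, p.name
        HAVING COUNT(*) >= ?
        ORDER BY relative_score DESC
    """
    params = [*subject_group_ids, *respondent_group_ids, min_votes]
    return conn.execute(sql, params).fetchall()
```

Because group membership is held in its own table rather than on the person record, any combination of subject groups can be reported on as ranked by any combination of respondent groups, which is the behaviour the bullets above describe.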

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A method for providing an interactive performance evaluation on a computer system, the method comprising the steps of: (a) creating an evaluation session for a group of users, the evaluation session including an interactive series of questions about people or organisation groups associated with each of the users; (b) providing each of the users with the opportunity to interactively answer their corresponding evaluation session, and storing the answers on the computer system for later analysis; and (c) collating the answers into a series of reports.

Description

Evaluation System and Method
Field of the invention
The present invention relates to a computer based system and method for carrying out an evaluation, and in particular for performance measurement and evaluation within a group based organisational structure.
Background of the invention
Performance within an organisation or between organisations is often measured by means of performance reviews or the like. A less than optimal structure often present in performance reviews can lead to a number of undesirable behaviours amongst an organisation's staff. For example, in organisations where an individual's immediate superior has a large influence on a performance review outcome, staff behaviour tends to become of an ingratiating nature towards their immediate superior with little regard for peers and team behaviour. Also, performance evaluation can often be a tedious process and can consequently be implemented very poorly. Further, if diverse inputs are solicited from many different sources to form the basis of a performance evaluation then they may be poorly integrated together. Often, the individuals or structures being reviewed may not be well aware of an organisation's values and the performance evaluation may not properly measure the staff adherence to such values.
Summary of the invention
In accordance with a first aspect of the invention there is provided a method of providing an evaluation on a computer system, the method comprising the steps of: (a) selecting a plurality of evaluation respondents; (b) selecting a plurality of groups into which the evaluation respondents are divided; (c) creating at least one interactive evaluation session for the groups of respondents, the evaluation session comprising a series of questions; (d) storing the respondents, groups and evaluation session on the computer system, (e) providing each of said respondents with the opportunity to interactively answer the questions in the evaluation session, (f) storing the answers on the computer system and (g) generating a series of reports based on the answers, wherein different types of reports can be generated by selecting different groups as a basis of comparison. Preferably the computer system includes a relational database, and at least some of the evaluation respondents are divided into more than one group, with the relational database being set up to enable the different groups to be selected as a basis of comparison.
Conveniently, the groups include primary groups and non-primary or secondary groups, with the groups being selected such that a respondent may belong to only one primary group and to one or more secondary groups.
The evaluation session conveniently comprises a number of substantially independent surveys each having a series of questions for answer by a respondent.
Typically, the method is operated in an internet browser/internet server environment. Preferably, the environment includes a host server having an associated relational database, at least one facilitator browser computer and a plurality of respondent computers, with the facilitator being involved in the initial selecting and creating steps.
Advantageously, the series of questions are arranged to require a response whereby respondents rank subjects or objects, for enabling the generation of ranking reports. Step (e) typically includes the steps of launching the evaluation process by including a universal resource locator (URL) of the address of a corresponding evaluation or survey to be carried out by the respondent.
The invention extends to a computer system for producing an interactive evaluation comprising respondent and group organisation means for inputting respondent and group details into said computer system of the respondents and groups participating in the evaluation, evaluation creation means for creating, for each respondent in a group of respondents, an interactive evaluation including a series of questions, notification means for notifying the respondents of said group of the availability of the corresponding interactive evaluations, evaluation execution means for enabling respondents to interact with a corresponding evaluation, storage means for storing the evaluation result in respect of each respondent, and reporting means for collating and reporting said evaluation results.
The system preferably includes monitoring means for monitoring the current status of the interactive evaluation. Typically, said monitoring means includes a second notification means for sending a reminder to respondents who have not completed their evaluation.
The system conveniently includes registration means for initially registering an organisation's details within said computer system. Preferably, the respondent and group organisation means comprises a set up graphical user interface (GUI) for enabling a facilitator to enter and arrange names into primary and secondary groups.
The evaluation creating means advantageously comprises at least one evaluation GUI including a survey template option providing different survey types to choose from, and a question bank including questions associated with at least some of the survey types.
The survey type is preferably chosen from a group including a self review, a colleague review, an opinion poll, and confidential and non-confidential reviews.
The evaluation execution means typically includes a plurality of GUI's carrying respondent-specific survey questions. The computer system advantageously includes closure means for enabling a facilitator to close an evaluation by force closure or deletion.
The reporting means may include select means for selecting a reports option available to an identified subject, the reports option being respondent-and group-based.
The reports option preferably includes a ranking option, whereby the identified subject can select which subject-accessible groups or respondents can be ranked by which other groups or respondents.
The reporting means may be arranged to process textual answers, and includes an edit facility to enable a user to moderate or bowdlerise the textual answers prior to saving them.
The invention extends to a method of providing an evaluation on a computer system, the method comprising the steps of providing an interface for facilitating the initiation of an interactive evaluation session, providing a respondent and group entry interface for enabling the entry of groups of evaluation respondents, providing an evaluation creation interface for enabling the compilation of at least one targeted survey for the respondents, providing an evaluation launching interface for launching the survey to the respondents, providing each of the respondents with the opportunity of interactively answering questions in the survey, receiving and storing on the computer system details of the evaluation session, respondents, groups, surveys and answers, and providing a report selection interface for enabling the selection of a series of reports on the survey.
Preferably, the group and respondent details on the evaluation session are stored on a relational database in such a way that allows for the selection of different combinations of groups at the reporting stage.
A method may further include the step of allowing the entry of different groups of evaluation respondents as primary groups and secondary groups, with respondents being limited to membership of only one primary group, but to more than one secondary group.
Conveniently, the group and respondent details are stored in tables which include a group table containing details of the groups being evaluated, and a membership table including details of the primary group membership in respect of a given evaluation, and all secondary groups in respect of the evaluation.
A further aspect of the invention provides a computer system for enabling a computer- based evaluation, the system comprising initiation means for generating an interface for facilitating the initiation of the interactive evaluation session, respondent and group entry interface generation means for enabling the entry of groups of evaluation respondents, evaluation creation interface generation means for enabling the compilation of at least one targeted survey for the respondents, evaluation launching generation interface means for launching the survey to the respondents, enabling means for providing each of the respondents with the opportunity of interactively answering questions in the survey, storage means for storing on the computer system details of the evaluation session, respondents, groups, surveys and answers, and report selection interface means for enabling the selection of a series of reports on the survey.
The computer system may include server means incorporating storage means in the form of a relational database. According to a still further aspect of the invention there is provided a computer program product comprising a computer readable medium having program code means which, when said program is loaded, makes the computer execute procedures in accordance with the method as set out above. The invention extends to a server hosting a web page database wherein the graphical user interfaces described above are in the form of web pages, and in combination define a web page database.
These and further features of the invention will be made apparent from the description of a preferred embodiment thereof given below by way of example. In the description, reference is made to the accompanying drawings, but the specific features shown in the drawings should not be construed as limiting the invention.
Brief description of the drawings
Preferred and other embodiments of the present invention will now be described with reference to the accompanying drawings in which: Figure 1 illustrates schematically an Internet type environment within which the invention operates, including the architecture of the evaluation system;
Figure 2 is a flow chart of the registration process of the invention;
Figure 3 is a flow chart of the organise process of the invention;
Figure 4 is a flow chart of the create process of the invention; Figure 5 is a flow chart of the edit process of the invention;
Figure 6 is a flow chart of the open or launch process of the invention;
Figure 7 is a flow chart of the manage or progress process of the invention;
Figure 8 is a flow chart of the close process of the invention;
Figure 9 is a flow chart of a report process of the invention; Figure 10 is a flow chart of the evaluate or respond process of the invention;
Figure 11 shows a screen shot of a main manage evaluations page; Figure 12 shows a screen shot of part of a subject details set up page;
Figure 13 shows a screen shot of a spreadsheet loading page;
Figure 14 shows a screen shot of a group display page;
Figure 15 shows a screen shot of a create evaluation page; Figure 16 shows a screen shot of a blank evaluation page;
Figure 17 shows a survey template window;
Figure 18 shows a screen shot of a first sample evaluation or survey page;
Figure 19 shows a screen shot of a second sample evaluation or survey page;
Figure 20 shows a screen shot of a survey report by group page; Figure 21 shows a screen shot of a survey or evaluation launch page;
Figure 22 shows a screen shot of a launch confirmation page;
Figure 23 shows a screen shot of a respondent email page;
Figure 24 shows a screen shot of a first employee opinion survey page;
Figure 25 shows a screen shot of a first colleague review page; Figure 26 shows a screen shot of a second colleague review page;
Figure 27 shows a screen shot of a third colleague review page;
Figure 28 shows a screen shot of a fourth colleague review page;
Figure 29 shows a screen shot of a fifth colleague review page;
Figure 30 shows a screen shot of a self estimate page; Figure 31 shows a screen shot of a colleague review ranking page;
Figure 32 shows a screen shot of a first progress review page;
Figure 33 shows a screen shot of a second progress review page;
Figure 34 shows a screen shot of a closed evaluation or survey page;
Figure 35 shows a screen shot of a report reviewing page; Figure 36 shows a screen shot of a report listing page;
Figure 37 shows a screen shot of a typical confidential report;
Figure 38 shows a screen shot of a typical non-confidential report;
Figure 39 shows a table representative of a ranking report; and Figures 40A and 40B show a schematic diagram of part of the relational database of the evaluation system of the invention.
Description of preferred and other embodiments
In the preferred embodiment, there is provided an interactive performance evaluation system operational over an Internet browser type environment. Turning initially to Figure 1, there is illustrated schematically a standard browser environment wherein a facilitating client computer and browser 1 interacts over the Internet 2 with a server application 3 running on another host computer. Other client computers which are linked both to the facilitating client computer 1 and the server application 3 include respondent computers and report receiver computers 4 and 5 respectively. The server application can operate in accordance with standard common gateway interface (CGI) techniques, and incorporates a relational database 6 which is populated by the facilitator and respondent computers.
The application is provided for serving web pages over the Internet so as to allow an organisation to perform comprehensive evaluations of their staff performance, team performance, organisation performance and staff opinion as required or on as many issues as required. Further, the application can be used for evaluating anything by anyone, inasmuch as non-human objects (for example films or detergents) can be the subjects of a survey and anyone at all can be a respondent of a survey. Specifically the functions/interfaces provided by web pages served by the server 3 can include the following (described in more detail below):
1. Register - Organisations to initially register with the system and designate an administrator or facilitator. Facilitators act on behalf of the organisation to:
2. Organise - For the organisation, review entered details, upload corporate information such as a logo, enter the staff (and other people or objects that they might require to be respondents or subjects of surveys), designate supervisors, assign other administrators/facilitators, enter entities or groups, relate groups to each other, relate groups to staff, and designate which staff can see which results for which groups;
3. Create - Create evaluations either from a template, a previous evaluation or from a designated list of module or survey types selected by the facilitator;
4. Edit - Edit evaluation details, survey lists, and for each evaluation: designate the evaluators or respondents, designate who or what are to be evaluated, assign questions and assign multiple choice answers to questions;
5. Launch - Launch or Open an evaluation and send out invitations to all evaluators or respondents;
6. Progress - Generate reports on the status of an evaluation and send out reminders to delinquents; re-open individual sessions on request;
7. Close - Close an evaluation and make reports on the results of the evaluation available;
8. Report - Generate reports for organisation-wide surveys and for staff, if necessary;
9. Respond - Respondents or Evaluators to complete an evaluation or survey when invited; they are presented with only the surveys for which they have been designated as an evaluator;
Register: Figure 2 illustrates an example flow chart 10 of the steps involved in the registration process. This flow chart shows how a representative or facilitator of an organisation can enter a web site address 11 to provide for initial registration. The representative is asked to enter minimal organisational information 12 including naming a facilitator and their email address. An initial password is then sent 13 to this facilitator, to at least allow the email address to be authenticated. The facilitator is also sent an email containing a special URL. On clicking this URL 14, the facilitator is requested to enter the password contained in the email; once this is done the facilitator enters his or her own secret password 15 (twice to prevent typing errors). After passing through a "Welcome" page 16, the facilitator is then logged in and can proceed to the next step of Organise, or to another facilitator function 17.
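As a rough illustration of the registration hand-shake just described (initial password emailed, special URL, facilitator then choosing a secret password), the sketch below shows one way the server side might generate the token and email text. The names BASE_URL, register_organisation and the in-memory store are assumptions made for the example, not part of the patent.

```python
# Hedged sketch of the registration step: an initial password (13) and a special
# URL are generated for the named facilitator; the facilitator later replaces the
# initial password with his or her own secret password (15). Names are illustrative.
import hashlib
import secrets

BASE_URL = "https://evaluation.example.com"   # placeholder host, not from the patent

def register_organisation(org_name, facilitator_email, store):
    initial_password = secrets.token_urlsafe(8)     # emailed to the facilitator
    login_token = secrets.token_urlsafe(24)         # embedded in the special URL
    store[login_token] = {
        "organisation": org_name,
        "facilitator": facilitator_email,
        # only a hash of the initial password is kept on the server
        "initial_password_hash": hashlib.sha256(initial_password.encode()).hexdigest(),
        "own_password_hash": None,                  # set once the facilitator logs in
    }
    special_url = f"{BASE_URL}/register?token={login_token}"
    email_body = (f"Your initial password is {initial_password}.\n"
                  f"Open {special_url} and enter it to continue registration.")
    return special_url, email_body

registry = {}
url, body = register_organisation("Acme Pty Ltd", "facilitator@acme.example", registry)
```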
Organise:
Figure 3 illustrates an example flow chart of the steps 20 involved in creating an organisation profile within the system. The flow chart shows four different selectable functions 21 to 24 that the facilitator has at their disposal to set up the organisation ready for evaluations to be created.
The first function that the facilitator is likely to use is Edit Organisation Details 22. This allows them to review existing details, add country (mandatory for GST purposes), set system language preference and upload the organisation logo.
The second function that can be used is to upload a staff file 23. This can be done by first producing a spreadsheet containing the fields as predefined by the system. This can be done by a variety of methods as required and may be created by extraction of information from existing databases run by the organisation. If no such data exists, the facilitator can enter the details with the Edit Staff function. One key field in the upload is the primary group. A primary group is defined as the one group (often department or group) to which a person belongs. As the upload process encounters new groups, they are created in the system database.
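A minimal sketch of the staff upload is given below. It assumes the spreadsheet has been exported to CSV with hypothetical name, email and primary_group columns; the real system reads a predefined Excel template, so the field names here are illustrative only.

```python
# Hypothetical staff-file upload: as rows are read, any primary group not yet
# seen is created on the fly and the person is related to it, mirroring the
# behaviour described above. Column names are assumptions.
import csv
import io

def upload_staff(csv_text, people, groups):
    for row in csv.DictReader(io.StringIO(csv_text)):
        group_name = row["primary_group"].strip()
        group_id = groups.setdefault(group_name, len(groups) + 1)   # create if new
        people.append({"name": row["name"],
                       "email": row["email"],
                       "primary_group_id": group_id})

people, groups = [], {}
upload_staff("name,email,primary_group\n"
             "Ann Lee,ann@acme.example,Sales\n"
             "Bob Roy,bob@acme.example,Sales\n"
             "Cy Dee,cy@acme.example,Finance\n",
             people, groups)
# groups == {"Sales": 1, "Finance": 2}; each person carries a primary_group_id
```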
The facilitator is able to alter entries using the Edit Groups functionality 24. This allows the facilitator to add secondary groups and relate staff to them. It also allows the facilitator to relate groups to each other and designate one to be the son of another which is used in report generation at a later date.
The Edit Staff function 21 allows the facilitator to review what has been uploaded, add new staff and relate staff to secondary groups. At this time the facilitator can also designate which staff can view reports on groups and which staff can be co-facilitators. Create:
Figure 4 illustrates the exemplary steps 30 involved in a facilitator creating an evaluation initially. After entering 31 the name, description and target dates for launch and close, the facilitator can decide 32 whether to specify what surveys constitute the evaluation 33 or derive them from a previous evaluation chosen from a list 34.
Edit:
Figure 5 illustrates the exemplary steps 40 involved in editing an evaluation. A created evaluation is chosen 41, and surveys can then be named, renamed, added or removed 42. For each survey 43, the list of respondents, report permissions, questions and multiple-choice answers are specified and added 44. If the survey requires a set of subjects (people or objects) to be evaluated, these are specified here also.
Launch:
Figure 6 illustrates a flow chart of how a facilitator opens or launches an evaluation 50.
Once the facilitator is satisfied with the creation and editing of an evaluation and the start date has arrived, the facilitator instructs the evaluation system to launch it 51. The evaluation system needs to run some checks 52 to ensure that the staff list is valid, that the group relationships are consistent (no loops) and that all details of the evaluation are consistent. When this is so, the evaluation system asks the facilitator to provide the text of the opening email 53 to be sent to all respondents before formally opening the evaluation, confirming the list of respondents 54 and sending the email to all respondents. Prior to this event taking place, the evaluation system asks the facilitator 55 which of the new questions that the facilitator has created can be placed in the question bank. This question bank is open to everybody and enhances the value of the service. If the facilitator agrees 56 to donate them, the facilitator may optionally be asked if they want the organisation name to be placed beside the donated questions. This is advantageous for two reasons: first, facilitators of the organisation can quickly recognise and reuse their questions and, second, the organisation gets advertising and value from their donation to the bank.
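The "no loops" check on group relationships could, for instance, be performed by walking up the parent chain from each group and failing if any group is revisited; the sketch below assumes a simple parent-of map and is not the system's actual validation code.

```java
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Illustrative check that the group hierarchy contains no loops: walk up the
// parent chain from every group and fail if a group is revisited on the way.
public class GroupHierarchyCheck {

    public static boolean hasLoop(Map<String, String> parentOf) {
        for (String start : parentOf.keySet()) {
            Set<String> seen = new HashSet<>();
            String current = start;
            while (current != null) {
                if (!seen.add(current)) {
                    return true;                 // revisited a group: the chain loops
                }
                current = parentOf.get(current); // null once a top-level group is reached
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Map<String, String> ok = Map.of("Sydney", "NSW", "Mosman", "Sydney");
        Map<String, String> bad = Map.of("A", "B", "B", "A");
        System.out.println(hasLoop(ok));  // false
        System.out.println(hasLoop(bad)); // true
    }
}
```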
Within each email sent 57 is a unique URL that describes the facilitator's name, organisation and evaluation so that the log-in will only require the entry of a password.
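One plausible way of producing such a URL, sketched below, is to embed an opaque random token that the server maps back to the recipient, organisation and evaluation, so that only a password remains to be entered at log-in; the class, host name and token scheme are assumptions of this illustration, not the disclosed mechanism.

```java
import java.security.SecureRandom;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: each invitation email carries a URL containing a random
// token; the server maps the token back to the person, organisation and
// evaluation, so the recipient only has to supply a password at log-in.
public class InvitationLinks {

    record Invitation(String personEmail, String organisation, String evaluation) {}

    private static final SecureRandom RANDOM = new SecureRandom();
    private final Map<String, Invitation> tokens = new HashMap<>();

    public String createLoginUrl(Invitation invitation) {
        byte[] raw = new byte[16];
        RANDOM.nextBytes(raw);
        StringBuilder token = new StringBuilder();
        for (byte b : raw) {
            token.append(String.format("%02x", b)); // hex-encode the random token
        }
        tokens.put(token.toString(), invitation);
        return "https://evaluate.example.com/login?token=" + token; // example host only
    }

    public Invitation resolve(String token) {
        return tokens.get(token);
    }

    public static void main(String[] args) {
        InvitationLinks links = new InvitationLinks();
        String url = links.createLoginUrl(new Invitation("ann@example.com", "Acme", "Mid-year review"));
        System.out.println(url);
    }
}
```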
Progress:
Figure 7 illustrates a flow chart 60 of the steps in how a facilitator manages the progress of an evaluation. A 'View Progress' selection on an evaluation in progress is made 61. A report 62 is then generated showing which respondents have not started, started but not finished and finished their part in the evaluation. The facilitator is given the opportunity of sending reminder emails 64 to either those who have not even started or those that have started but not finished. The facilitator is also given the opportunity to "unfinish" a session 63. This may be necessary because a respondent may have prematurely finished a session and might want to go back and add more feedback. For those respondents who have not even started 65 or who are still busy 66 with evaluations, reminder emails 67 are sent.
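As a simple illustration, the three progress categories could be derived from each respondent's session record roughly as follows; the session fields used here are assumptions of the sketch rather than the system's actual data model.

```java
import java.util.ArrayList;
import java.util.EnumMap;
import java.util.List;
import java.util.Map;

// Sketch of grouping respondents into "not started", "started" and "finished"
// for the progress report, and identifying those who should receive reminders.
public class ProgressReport {

    enum Status { NOT_STARTED, STARTED, FINISHED }

    record Session(String respondent, int answersSaved, boolean committed) {}

    static Status statusOf(Session s) {
        if (s.committed()) return Status.FINISHED;
        return s.answersSaved() > 0 ? Status.STARTED : Status.NOT_STARTED;
    }

    public static void main(String[] args) {
        List<Session> sessions = List.of(
                new Session("ann@example.com", 12, true),
                new Session("bob@example.com", 3, false),
                new Session("cam@example.com", 0, false));

        Map<Status, List<Session>> byStatus = new EnumMap<>(Status.class);
        for (Session s : sessions) {
            byStatus.computeIfAbsent(statusOf(s), k -> new ArrayList<>()).add(s);
        }
        byStatus.forEach((status, list) -> System.out.println(status + ": " + list.size()));
        // Reminder emails would go to everyone not in the FINISHED bucket.
    }
}
```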
Close:
Figure 8 illustrates a flow chart 70 of the steps involved in a facilitator closing an evaluation. It may be the case, despite a series of reminder emails, that when a facilitator comes to close an evaluation, there will be unfinished evaluation sessions. The facilitator selects the close option 71 and is then given a summary of the unfinished sessions 72, with the choice, for each of them, either to automatically finish the session or to delete it together with all the associated answers and rankings. On closure 74, all people who can review reports on the evaluation are subsequently notified by email 75, giving them another unique URL to aid quick log-in; these are the supervisors of staff affected and those people designated as being able to review reports on specific groups.
Report:
Figure 9 illustrates a flow chart 80 of the steps involved in report production. The flow chart 80 shows how a facilitator can produce reports after an evaluation has been closed 81. These reports can be survey specific. A list of available reports is produced 82. The list will vary depending on the report permissions set up during the create phase. If the person is the nominated reviewer for a report then a link will appear for the option to review the report. Every report that has textual answers in it must be reviewed. The reviewer of each report is specified by the facilitator during the create phase. The user simply selects the required report 83, enters any required parameters, receives the report on their screen 84 and prints it 85 via the browser functionality if required.
Evaluate:
Figure 10 illustrates a flow chart 110 of the steps involved in the main process of respondents performing an evaluation. Although these steps form only a small part of the system, they constitute the main point of interaction for most users of the system. After logging in via the special URL sent from the open evaluation process 111, an evaluator or respondent will be presented with a welcome page and a list of modules or surveys for which he or she is required to be an evaluator or respondent 112. This will vary from respondent to respondent. On pressing NEXT from the welcome page 113, the respondent answers questions, ranks people or objects, assesses performance against objectives 114, 115, or does whatever else is required by the survey or page, before pressing NEXT to select a different page from the navigation bar 115. This occurs after the system has saved all answers and ranks 116. Once all pages have been processed, the respondent is taken into the Finish survey 117, where he or she is asked questions on the evaluation or survey itself (specified, if required, by the evaluation or survey creator). In this survey or page a general check is made to ensure that the respondent has performed all mandatory actions 118 before allowing them to finish and commit their results to the database 119 and form part of the evaluation data.
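The mandatory-actions check run at the Finish stage might, for example, amount to verifying that every mandatory question has a recorded answer before the session is committed; the question and answer representations below are assumptions made for this sketch.

```java
import java.util.List;
import java.util.Map;

// Sketch of the check run on the Finish page: a session may only be committed
// once every mandatory question has a non-empty answer recorded against it.
public class FinishCheck {

    record Question(String id, boolean mandatory) {}

    static List<String> missingMandatory(List<Question> questions, Map<String, String> answers) {
        return questions.stream()
                .filter(Question::mandatory)
                .map(Question::id)
                .filter(id -> answers.getOrDefault(id, "").isBlank())
                .toList();
    }

    public static void main(String[] args) {
        List<Question> questions = List.of(new Question("q1", true), new Question("q2", false));
        Map<String, String> answers = Map.of("q2", "Good teamwork");
        List<String> missing = missingMandatory(questions, answers);
        System.out.println(missing.isEmpty() ? "Session may be committed" : "Outstanding: " + missing);
    }
}
```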
The various processes described with reference to the flow charts will now be discussed in more detail with reference to a series of screen shots, or portions thereof, embodying graphical user interfaces (GUI's).
After the final step of the registration procedure, in which the welcome page is displayed at 16, the client or facilitator is informed that a certain level of Java needs to be installed as a plug-in to the browser. If required, a Java download and install will be automatically initiated, this being part of a standard process provided by Sun Microsystems.
The initial page providing entry into the organised or managed process of the invention is shown at 120 in Figure 11. A menu bar 121 allows the user to set up a staff/customer (ie subject) list or to create an evaluation. All draft evaluations, evaluations in progress and closed evaluations are summarised on this page at 122, 123 and 124. Once the facilitator has added staff or other subjects and created evaluations, this page will have fewer instructions and will show all the subject's evaluations in their relevant "states". Referring now to Figure 12, the first task that the facilitator needs to perform is to set up a people database by selecting the set up staff/customer list option. The so-called Staff/Customer details page 125 of Figure 12 is presented, with the subject details 126 at this stage including only those of the facilitator. The facilitator then needs to enter more staff or other subjects, either by typing them into the matrix provided or by preparing and loading a spreadsheet by selecting the option under "Tools". The screen shot of Figure 13 shows a spreadsheet loading page 127 for enabling bulk entry of staff/customer or other details. This is achieved by downloading and saving the sample Excel spreadsheet, populating it with staff details, clicking on the browse button to select the spreadsheet from the system, and clicking on the upload staff/customer spreadsheet button 128.
Referring now to Figure 14, once the subjects have been loaded, they are organised into groups by selecting the organisation or "arrange names into groups" tab 130 and dragging and dropping groups and names into primary groups 131 and secondary groups 132. Each person can only be a member of one primary group 131, but can be a member of any number of secondary groups 132. This forms the basis for powerful reporting capacity, as it ultimately allows the feedback of any one group on another group to be reported.
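This membership rule (exactly one primary group, any number of secondary groups) could be modelled along the following lines; the class and method names are purely illustrative.

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Illustrative model of the membership rule: setting a new primary group
// replaces the old one, while secondary group memberships simply accumulate.
public class Membership {

    private String primaryGroup;                                       // exactly one
    private final Set<String> secondaryGroups = new LinkedHashSet<>(); // any number

    public void setPrimaryGroup(String group) {
        this.primaryGroup = group;  // a person belongs to one primary group only
    }

    public void addSecondaryGroup(String group) {
        secondaryGroups.add(group);
    }

    public static void main(String[] args) {
        Membership ann = new Membership();
        ann.setPrimaryGroup("Sales");
        ann.setPrimaryGroup("Marketing");   // replaces Sales as the primary group
        ann.addSecondaryGroup("Social Club");
        ann.addSecondaryGroup("Fire Wardens");
        System.out.println(ann.primaryGroup + " / " + ann.secondaryGroups);
    }
}
```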
Once the primary and secondary groups of the organisation have been defined and populated with subjects, the facilitator may choose to edit the organisation details or even to upload the organisation logo, which could then be displayed on each screen shot.
The creation process will now be described with reference to Figure 15, which shows a screen shot of a create evaluation page 133. The name of the new evaluation is entered into text box 134, and the style of evaluation is then chosen either by clicking on the "blank" link 135 or on the name of a previous evaluation. The blank evaluation is generally created in the first instance, and Figure 16 shows a screen shot of a blank evaluation page 136 with the evaluation name, description, target launch date and target end date having to be filled into respective text boxes 137, 138, 139 and 140. An add survey link 141 is also provided for enabling the facilitator to add different types of surveys to the evaluation, in this case broadly categorised into confidential, non-confidential and colleague review. Once a survey is added, the focus turns to the actual survey. An alternative way of adding a survey is shown in Figure 17. A survey template window 142 provides various survey options, some of which are set out in the window 142. The selected survey is simply dragged and dropped into the evaluation, and appears in a tree structure 143 in the left panel of Figure 18, which shows part of a sample survey or evaluation 144. Questions can be modified and inserted either by entering new ones or by dragging existing ones in from a question bank window 145 of categorised questions obtained from the tools menu. If the survey is based on people as subjects, the facilitator also needs to add the names and other relevant details of the subjects who need to be assessed by the questions, by clicking on a subjects tab.
Figure 19 shows a colleague review screen shot 146 in more detail. A colleague review has a number of selectable features. First, the facilitator can choose whether the questions are asked all at once in respect of each subject, as is shown at 147. Alternatively, or in addition, respondents can answer one question at a time on a list of subjects 148. A ranking option 149 is also provided, enabling the facilitator to choose that the respondents rank the subjects in order of value so that a unique value ranking can be created. The colleague review also allows respondents to choose which people to answer questions on. This constitutes a subset of the subjects selected from the staff/customer list.
Once each survey has had its details completed, the various questions have been added and the respondents (and subjects, if required) have been added, the facilitator then needs to specify who needs to receive the reports.
Figure 20 shows a page incorporating a survey report by group window 151 providing option blocks 152 for enabling the facilitator to specify exactly who will receive the reports. Depending on the type of report, different report permissions are provided. Once the report selection process has been completed, the facilitator can check that the evaluation is ready for launching. A check window (not shown) may be provided for ensuring that further information is incorporated. For example, respondents may be added to the self assessment survey, and report permissions may be required for the "survey report by person" in the self assessment survey. Part of a survey launch page is shown at 153 in Figure 21. Prior to launching the evaluation, the facilitator can select at 154 whether any questions authored can be used in the question bank, and the subject and text of the email inviting people to participate in the evaluation are incorporated in respect of the text blocks 155 and 156. The confirmation screen 160 of Figure 22 is displayed on launching the evaluation, and includes full details of the respondents who have been sent an email inviting them to participate.
As is clear from Figure 23, each respondent automatically receives an email 161 asking the respondent to select a link and set their password. Figure 24 provides a first employee opinion survey page 162, which is the first page of the first survey. A left-hand panel 163 describes the contents of the evaluation, and includes links to colleague review, person, value, ranking and self-assessment. The status of each survey is depicted by a traffic light icon 164.
A first colleague review page is shown at 165 in Figure 25, with the left-hand panel now including a list of colleagues to be reviewed or evaluated 166.
Figure 26 shows at 167 a second colleague review page incorporating a comparative list of colleagues dealt with on an issue by issue basis. In Figure 27, a third colleague review page 168 provides a series of text boxes 169 in which questions on various colleagues can be answered on an issue by issue basis. Figure 28 shows a fourth colleague review page 170 including a series of score boxes 171 for enabling a list of selected colleagues to be scored out of 10.
In Figure 29, a fifth colleague review page 172 includes tick in the box questions about colleagues which can be dealt with on an issue by issue basis.
Figure 30 shows a self assessment page 173 providing a series of text boxes 174 for text-based responses. In Figure 31, a colleague review ranking page 175 allows respondents to rank one another in value order, providing a series of text boxes 176 of differing levels, in this case ranked from high to low. This constitutes a significant feature of the invention, in that by collecting individual ranks of subjects by respondents, the system is later able to produce a ranking of any group or set of groups as ranked by any other group or set of groups. The respondent ultimately proceeds to a Finish page where the responses are submitted. This is analogous to signing a form. Once this has been done, the respondent cannot return to answer further questions without requesting that the session be "unfinished" by the facilitator. Once the evaluation is launched, it acquires an "in progress" status. In the first review page 177 of Figure 32, the facilitator is able to view the progress of the evaluation by selecting the relevant option from the relevant drop down list 178. In the second review page 179 of Figure 33, a full survey summary is provided, showing the statuses of all the respondents. The facilitator is then able to review the progress of the responses and can choose to delete or to "unfinish" sessions 179A, or to send out reminders to respondents who have not yet finished.
Once the facilitator decides to close the evaluation, this can be done provided there are no started but not finished sessions. Part of a closed evaluation page is indicated at 180 in Figure 34. The facilitator then authors an email that is sent to all those people who have been designated to receive reports.
After closing, a confirmation report showing which people have been sent emails about reports for the evaluation is then provided, and an email informing the respondents that the evaluation is closed is then sent to all respondents, who are also invited to review reports.
On returning to the manage evaluations page from the closed evaluation page, the facilitator can then see that the evaluation has been closed, and it can be selected from the drop-down menu 178 for report viewing, as is shown in Figure 35. The user is then presented with a list of the reports that he or she is entitled to view for the evaluation in one of two ways. If the user is a facilitator, the above drop-down menu selection may be made. Alternatively, the user may receive an email inviting him or her to review a report, after which the link in the email is clicked on and the user is then able to sign in. As is clear from the report listing page 181 of Figure 36, the user can then view a selected report or can review the textual answers of a report if such user is a designated viewer for the report. If the user selects to review the textual answers of a report, the user is then presented with a page of all of the answers entered, and is given the ability to edit and summarise them. This is considered necessary in the case of a confidential report, on the basis that inflammatory or counter-productive comments may possibly have been made, with the result that the company will need to protect itself and its employees. Once the answers are reviewed, the resulting text is made available in the report. If the report is reviewed by someone before the text answers have been reviewed, the answers are not available, and the report viewer is advised which review they are awaiting. On selection of a confidential report, an authorised user is presented with it immediately; part of a confidential report page 182 is shown in Figure 37. This page shows the beginning of a confidential report, which means that individual answers are protected. Figure 38 shows a page 183 of part of a non-confidential report, including individual answers underneath each group entry.
A summary of all of the answers given by group and sub-group of both primary and secondary groups is presented. This is an extremely powerful analytical tool, and it can be seen how secondary groups allow the facilitator to obtain different views on different subjects by allowing different cross-sections of the organisation to be formed. It is possible to generate many different types of reports once the database structure is in place.
The way in which the database is designed allows a complex organisational structure to be captured through the facilitator interface, and the structure subsequently to be analysed on numerous different group-based levels.
Ranking reports represent a more complex version of the report. A user is given rights to view ranking reports of one or a set of different groups. When the ranking report option is selected, the user is presented with a first list of the groups for which they are allowed to view the results (which could be only one). These are known as the subject groups. The second list is of all the groups that have members who have ranked or voted on any of the members of the subject groups. These are known as respondent groups. The user is enabled to get a ranking report on a combination of subject groups as voted on by any combination of respondent groups. A typical ranking report is shown at 184 in Figure 39, and includes name and group details, together with a relative score column 185 and details of the number of respondents 186 contributing to each score. In order to exclude inconsistent or freak results, the system is typically set up not to include subjects who have received fewer than a critical number of ranks or votes - say five. It can be seen that any subject can be given a ranking position within any group of which they are a member, based on the rankings (votes) cast by any combination of groups. These positions can be reported in an individual's Person Reports. The most common report is to view the whole organisation as ranked by the whole organisation. It is also possible to drill down and to view the results of any subset of the organisation based on the votes cast by any other subset (or the same subset).
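By way of illustration only, a relative score of the kind shown in the report could be computed by averaging the ranks or votes each subject received and excluding subjects below the critical number of responses; the averaging formula and class names below are assumptions of this sketch, not the system's actual scoring method.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Rough sketch of a ranking report: average the scores each subject received
// from the selected respondent groups and drop subjects with too few votes.
public class RankingReport {

    record Vote(String subject, int score) {}
    record Row(String subject, double relativeScore, int respondents) {}

    static List<Row> report(List<Vote> votes, int minimumVotes) {
        Map<String, List<Integer>> bySubject = new HashMap<>();
        for (Vote v : votes) {
            bySubject.computeIfAbsent(v.subject(), k -> new ArrayList<>()).add(v.score());
        }
        List<Row> rows = new ArrayList<>();
        for (Map.Entry<String, List<Integer>> e : bySubject.entrySet()) {
            if (e.getValue().size() < minimumVotes) continue;  // exclude freak results
            double avg = e.getValue().stream().mapToInt(Integer::intValue).average().orElse(0);
            rows.add(new Row(e.getKey(), avg, e.getValue().size()));
        }
        rows.sort(Comparator.comparingDouble(Row::relativeScore).reversed());
        return rows;
    }

    public static void main(String[] args) {
        List<Vote> votes = List.of(new Vote("Ann", 9), new Vote("Ann", 8), new Vote("Bob", 10));
        report(votes, 2).forEach(System.out::println);  // Bob is excluded (only one vote)
    }
}
```

Restricting the report to votes cast by a chosen combination of respondent groups would, under this sketch, simply mean filtering the input list of votes before the report is built.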
The system will also allow "objects" to be ranked by Respondents. Objects are very similar to subjects; they can be arranged in group structures and reported on in the same way. An object can be anything that is comparable with another set of objects. Typical examples are: films, shows, detergents, sports teams, political parties, and individual performances. This allows Respondents to be requested to answer questions on and rank not only people but also any set of objects that they want compared. Films, for instance, would be categorised in some way, say: drama, thriller, comedy, action. They may also be categorised using secondary groups by, say, censorship level, actors, or indeed any other factor. The Respondents can then be grouped by area with subgroups (e.g. NSW, Sydney, Mosman) and then be categorised further by cross-section using secondary groups such as sex, age bracket and the like. The Respondents are then asked questions about all the films and asked to rank them. Reports can then be generated on any subset of the films as assessed by any subset of the Respondents.
Subjects and objects can accordingly be ranked with the ability to group hierarchically in both primary groups and secondary groups, with both being combined in reports of any kind.
Referring now to Figures 40A and 40B, part of a relational database 190 is shown illustrating the key tables that implement the system of the invention, together with the relationships between the tables. Each table consists of several fields, which are labelled within the table boundaries. Key icons indicate the primary key and the lines show the relationships between the various tables. The tables include a person table 191, an answer table 192, a survey question (called ModuleQuestion) table 193, a session table 194, a ranking table 195, an "evaluation group" (called EvaluationEntity) table 196 containing the various groups being evaluated or surveyed, and an "evaluation group member" (called EvaluationEntityMember) table 197 containing a membership for a given group in respect of the given evaluation.
An "evaluation of all group members" (called EvaluationEntityMemberAll) table 198 contains a membership for a given group and all its sub-groups in respect of the given evaluation or survey. In Figure 40B, an "answer detail" table 199 provides details of all the answer choices in a multiple choice question, whilst the "answer" table 192 contains the actual answers, each of which either refers to an answer choice, refers to a textual answer, or contains a number. A question table 200 includes various question types, a survey table 201 includes the various different surveys or modules, and an evaluation table 202 includes details of the evaluation itself such as name, description and start and end dates.
The relationships between the various tables will be apparent to one of ordinary skill in the art on the basis of what is included in Figures 40A and 40B. Foreign keys in tables relate to primary keys in other tables, and the relationships can be derived from the names. For example, tblAnswer 192 has three foreign keys relating to primary keys of other tables: anPEID to peID in tblPerson 191, anSEID to seID in tblSession 194 and anMQID to mqID in tblModuleQuestion 193. The answer table actually has two other foreign key links: anTEID relates to the text table where textual answers are held, and anUSID relates to the user session table which holds details of the user's session. It can be seen that this design allows a facilitator to collect and analyse the opinions (both answers to questions and rankings) of any set of groups on any other set of groups.
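The foreign-key structure described above lends itself to a conventional SQL schema. The fragment below is a heavily abridged sketch created through plain JDBC; the column types, the omission of most fields, and the use of an in-memory H2 database (whose driver would need to be on the classpath) are assumptions of the illustration rather than details of the actual system.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

// Abridged sketch of a few of the key tables and their foreign-key links,
// created through plain JDBC. The H2 in-memory URL is an assumption of this
// example; any SQL database with a suitable driver could be used instead.
public class SchemaSketch {

    public static void main(String[] args) throws SQLException {
        try (Connection c = DriverManager.getConnection("jdbc:h2:mem:evaluation");
             Statement s = c.createStatement()) {
            s.execute("CREATE TABLE tblPerson (peID INT PRIMARY KEY, peName VARCHAR(100))");
            s.execute("CREATE TABLE tblSession (seID INT PRIMARY KEY)");
            s.execute("CREATE TABLE tblModuleQuestion (mqID INT PRIMARY KEY, mqText VARCHAR(400))");
            s.execute("CREATE TABLE tblAnswer (" +
                      " anID INT PRIMARY KEY," +
                      " anPEID INT REFERENCES tblPerson(peID)," +
                      " anSEID INT REFERENCES tblSession(seID)," +
                      " anMQID INT REFERENCES tblModuleQuestion(mqID))");
            System.out.println("Schema created");
        }
    }
}
```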
It will be understood that the invention disclosed and defined herein extends to all alternative combinations of two or more of the individual features mentioned or evident from the text or drawings. All of these different combinations constitute various alternative aspects of the invention.
The foregoing describes embodiments of the present invention, and modifications obvious to those skilled in the art can be made thereto without departing from the scope of the present invention.

Claims
1. A method of providing an evaluation on a computer system, the method comprising the steps of:
(a) selecting a plurality of evaluation respondents;
(b) selecting a plurality of groups into which the evaluation respondents are divided;
(c) creating at least one interactive evaluation session for the groups of respondents, the evaluation session comprising a series of questions;
(d) storing the respondents, groups and evaluation session on the computer system;
(e) providing each of said respondents with the opportunity to interactively answer the questions in the evaluation session;
(f) storing the answers on the computer system; and
(g) generating a series of reports based on the answers, wherein different types of reports can be generated by selecting different groups as a basis of comparison.
2. A method according to claim 1 in which the computer system includes a relational database, and at least some of the evaluation respondents are divided into more than one group, with the relational database being set up to enable the different groups to be selected as a basis of comparison.
3. A method according to claim 2 in which the groups include primary groups and non-primary or secondary groups, with the groups being selected such that a respondent may belong to only one primary group and to one or more secondary groups.
4. A method according to any one of the preceding claims in which the evaluation session comprises a number of substantially independent surveys each having a series of questions for answer by a respondent.
5. A method according to any one of the preceding claims which is operated in an internet browser/internet server environment.
6. A method according to claim 5 wherein the environment includes a host server having an associated relational database, at least one facilitator browser computer and a plurality of respondent computers, with the facilitator being involved in the initial selecting and creating steps.
7. A method according to claim 4 wherein the series of questions are arranged to require a response whereby respondents rank subjects or objects, for enabling the generation of ranking reports.
8. A method according to claim 5 wherein step (e) includes the step of launching the evaluation process by including a universal resource locator (URL) of the address of a corresponding evaluation or survey to be carried out by the respondent.
9. A computer system for producing an interactive evaluation comprising: respondent and group organisation means for inputting respondent and group details into said computer system of the respondents and groups participating in the evaluation; evaluation creation means for creating, for each respondent in a group of respondents, an interactive evaluation including a series of questions; notification means for notifying the respondents of said group of the availability of the corresponding interactive evaluations; evaluation execution means for enabling respondents to interact with a corresponding evaluation; storage means for storing the evaluation result in respect of each respondent, and reporting means for collating and reporting said evaluation results.
10. A system as claimed in claim 9 further comprising: monitoring means for monitoring the current status of the interactive evaluation.
11. A system as claimed in claim 10 wherein said monitoring means includes a second notification means for sending a reminder to respondents who have not completed their evaluation.
12. A system as claimed in claim 10 wherein the storage means comprises a relational database, and is arranged further to store details of the respondents, the groups, and the evaluations.
13. A system as claimed in any one of claims 10 to 12 which includes registration means for initially registering an organisation's details with said computer system.
14. A computer system according to any one of claims 9 to 13 wherein the respondent and group organisation means comprises a set up graphical user interface (GUI) for enabling a facilitator to enter and arrange names into primary and secondary groups.
15. A computer system according to any one of claims 9 to 14 wherein the evaluation creating means comprises at least one evaluation GUI including a survey template option providing different survey types to choose from, and a question bank including questions associated with at least some of the survey types.
16. A computer system according to claim 15 in which the survey type is chosen from a group including a self review, a colleague review, an opinion poll, and confidential and non-confidential reviews.
17. A computer system according to any one of claims 9 to 16 in which the evaluation execution means includes a plurality of GUI's carrying respondent-specific survey questions.
18. A computer system according to any one of the preceding claims 9 to 17 which includes closure means for enabling a facilitator to close an evaluation by force closure or deletion.
19. A computer system according to any one of claims 9 to 18 wherein the reporting means includes select means for selecting a reports option available to an identified subject, the reports option being respondent- and group-based.
20. A computer system according to claim 19 wherein the reports option includes a ranking option, whereby the identified subject can select which subject-accessible groups or respondents can be ranked by which other groups or respondents.
21. A computer system according to either one of claims 19 or 20 wherein the reporting means is arranged to process textual answers, and includes an edit facility to enable a user to moderate or bowdlerise the textual answers prior to saving them.
22. A method of providing an evaluation on a computer system, the method comprising the steps of: providing an interface for facilitating the initiation of an interactive evaluation session; providing a respondent and group entry interface for enabling the entry of groups of evaluation respondents; providing an evaluation creation interface for enabling the compilation of at least one targeted survey for the respondents; providing an evaluation launching interface for launching the survey to the respondents; providing each of the respondents with the opportunity of interactively answering questions in the survey; receiving and storing on the computer system details of the evaluation session, respondents, groups, surveys and answers; and providing a report selection interface for enabling the selection of a series of reports on the survey.
23. A method according to claim 22 wherein the group and respondent details on the evaluation session are stored on a relational database in such a way that allows for the selection of different combinations of groups at the reporting stage.
24. A method according to either one of the preceding claims 22 or 23 which includes the step of allowing the entry of different groups of evaluation respondents as primary groups and secondary groups, with respondents being limited to membership of only one primary group, but to more than one secondary group.
25. A method according to claim 24 wherein the group and respondent details are stored in tables which include a group table containing details of the groups being evaluated, and a membership table including details of the primary group membership in respect of a given evaluation, and all secondary groups in respect of the evaluation.
26. A computer system for enabling a computer-based evaluation, the system comprising: initiation means for generating an interface for facilitating the initiation of the interactive evaluation session; respondent and group entry interface generation means for enabling the entry of groups of evaluation respondents; evaluation creation interface generation means for enabling the compilation of at least one targeted survey for the respondents; evaluation launching generation interface means for launching the survey to the respondents; enabling means for providing each of the respondents with the opportunity of interactively answering questions in the survey; storage means for storing on the computer system details of the evaluation session, respondents, groups, surveys and answers, and report selection interface means for enabling the selection of a series of reports on the survey.
27. A computer system according to claim 26 which includes server means incorporating storage means in the form of a relational database.
28. A computer program product comprising a computer readable medium having program code means which, when said program is loaded, causes the computer to execute procedures in accordance with the method of any one of claims 22 to 25.
29. A computer system according to any one of claims 14 to 17 wherein the graphical user interfaces are in the form of web pages, and in combination define a web page database.
30. A server hosting a web page database as claimed in claim 29.
Dated this 3rd day of June 2002
Evalu8 Pty Limited by its attorneys Freehills Carter Smith Beadle
PCT/AU2002/000722 2001-06-01 2002-06-03 Evaluation system and method WO2002097683A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AUPR5416 2001-06-01
AUPR5416A AUPR541601A0 (en) 2001-06-01 2001-06-01 Evaluation system and method

Publications (1)

Publication Number Publication Date
WO2002097683A1 true WO2002097683A1 (en) 2002-12-05

Family

ID=3829394

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2002/000722 WO2002097683A1 (en) 2001-06-01 2002-06-03 Evaluation system and method

Country Status (2)

Country Link
AU (1) AUPR541601A0 (en)
WO (1) WO2002097683A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005076182A1 (en) * 2004-02-10 2005-08-18 Nils Zettervall Method and device for recording of data

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5909669A (en) * 1996-04-01 1999-06-01 Electronic Data Systems Corporation System and method for generating a knowledge worker productivity assessment
WO1999045489A1 (en) * 1998-03-02 1999-09-10 Nfo Worldwide, Inc. Method and apparatus for automating the conduct of surveys over a network system
WO2000041110A1 (en) * 1998-12-30 2000-07-13 Alessandro Alex D Survey system to quantify various criteria relating to the operation of an organization
WO2000055792A2 (en) * 1999-03-18 2000-09-21 Kpt Corporation Performance review and job description system
WO2001013296A1 (en) * 1999-08-13 2001-02-22 Miesionczek Robert J System for personnel performance management

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"360 Degree performance feedback software, multi-soucr ratings, multi-rater feedback", N.E. FRIED AND ASSOCIATES, Retrieved from the Internet <URL:http://www.nefried.com/360/> *
"Bringing the power of the internet to 360 Degree feedback", 360-DEGREEFEEDBACK.COM, Retrieved from the Internet <URL:http://www.360-degreefeedback.com/> *
"The cuetel employee attitude survey", CUETEL PTY LTD., Retrieved from the Internet <URL:http://www.cuetel.com.au/Cuetel%20Employee%20Attitude%20Survey.pdf> *
"The leader in online survey software", WEBSURVEYOR, Retrieved from the Internet <URL:http://www.websurveyor.com/home_onlinesurveys.asp> *
FRIED N. ELIZABETH: "360 Degree software vendor shootout: comparing features with needs", SOCIETY FOR HUMAN RESOURCE MANAGEMENT, December 1998 (1998-12-01), Retrieved from the Internet <URL:http://www.shrm.org/hrmagazine/articles/1298fried.htm> *

Also Published As

Publication number Publication date
AUPR541601A0 (en) 2001-06-28

Similar Documents

Publication Publication Date Title
US9116950B2 (en) Method and apparatus for internet-based human network brokering
US8224767B2 (en) Rapid knowledge transfer among workers
US6988239B2 (en) Methods and apparatus for preparation and administration of training courses
US6871197B1 (en) Method and mechanism for a web based knowledge management tool
US8719173B2 (en) Collaborative portal system for business launch centers and other environments
US7403989B2 (en) Facilitating improved workflow
US7548930B2 (en) Platform for management of internet based public communications and public comment
US7788372B2 (en) Advisory systems and methods
US20060253478A1 (en) Client centric document preparation interface
US20030187932A1 (en) Network project development system and method
US20060085480A1 (en) Human resource sourcing exchange
US7295991B1 (en) Employment sourcing system
US20130018812A1 (en) System for Regulation of Continuing Education Requirements
US20030190593A1 (en) Systems and methods for the automated generation of individual transition plans
US7035838B2 (en) Methods and systems for organizing information stored within a computer network-based system
US20040002892A1 (en) Portal for global portfolio management system method &amp; apparatus
WO2002097683A1 (en) Evaluation system and method
US20030182286A1 (en) System and method to save, secure and access records of discussion
WO2008032029A2 (en) Advisory systems and methods
EP1242949A1 (en) Legal information distribution system and method
Yau System development of FYP portal (Registration module)
Yalamanchi et al. Project Group Assignment System

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP