
The candidate confirms that the work submitted is their own and the appropriate credit has been given where reference has been made to the work of others. I understand that failure to attribute material which is obtained from another source may be considered as plagiarism. (Signature of student) _______________________________

Simulation of a Careers Advisor using Chatbot Technologies

James Francis McKenzie

Information Systems (Industry) 2003/2004


Summary

The aim of the project was to investigate the feasibility of using chatbot technologies to simulate careers advisors on the internet. In order to meet this aim, background research was conducted examining chatbot technologies, the role of a careers advisor and the current system. This knowledge, combined with the lifecycle model for interaction design, allowed three iterations of a careers chatbot to be implemented, with a range of different users giving valuable feedback. That feedback became the basis for the evaluation of each iteration, allowing the chatbot to evolve and eventually enabling conclusions to be drawn about the overall feasibility of a careers chatbot and comparisons to be made with other solutions.

The final iteration of the careers chatbot is online at:

http://homepage.ntlworld.com/devblock/careers/index4.html


Acknowledgements

I would like to thank the following people:

My project supervisor Vania Dimitrova for help throughout the project.

The students who took time to undertake testing.

The Careers Centre for their full co-operation throughout the project.

Finally, I would like to thank my friends and family for the support they have given me throughout the project, with particular thanks to David Milledge for his help with proofreading.


Contents

Summary
Acknowledgements
Contents
Chapter 1 - Introduction
1.1 Problem Definition
1.2 Project Aim
1.3 Project Objectives
1.4 The Minimum Requirements
1.5 Time Plan
Chapter 2 - Background reading
2.1 Review of Chatbot Technologies
2.2 The Current System
2.3 Role of a Careers Advisor
Chapter 3 – Methodology
3.1 Lifecycle Model for Interaction Design
3.2 The Waterfall Lifecycle Model
3.3 The Rational Unified Process
3.4 Choice of Methodology
3.5 User Involvement
Chapter 4 - Requirements
4.1 Data Gathering
4.2 System Requirements
Chapter 5 - Design
5.1 Pandorabots
5.2 Corpus Design
5.3 Interface Design
Chapter 6 - Iteration / prototype evaluation
6.1 Iteration 1
6.2 Iteration 2
6.3 Iteration 3
Chapter 7 - Evaluation
7.1 Minimum Requirements
7.2 Extensions
7.3 Methodology
7.4 Time Schedule
7.5 System Requirements
Chapter 8 - Conclusion
8.1 The Feasibility
8.2 Comparison To Other Solutions
8.3 Summary
References
Appendix A – Personal Reflection
Appendix B – Time Schedule
Appendix C – Interview with Careers Advisor
Appendix D – Dialogue Analysis
Appendix E – Dialogue Analysis in Detail
Appendix F – Testing Tasks
Appendix G – Iteration 1 Testing
Appendix H – Iteration 1 Conversation Logs
Appendix I – Iteration 2 Testing
Appendix J – Iteration 2 Conversation Logs
Appendix K – Iteration 3 Testing
Appendix L – Iteration 3 Conversation Logs
Appendix M – Interview with Expert Testers
Appendix N – Hardware Requirement


Chapter 1 - Introduction

1.1 Problem Definition

As more and more students have chosen to follow the path of further education, the demand for advice about the choices available to graduates after university has increased. Leeds University graduates and careers advisors currently spend a lot of time dealing with common careers problems.

A system is required to aid the careers advisors in cutting down the time taken to help students. Such

a system could take a simple generic careers question and return the appropriate answer. For example,

“where can I find out about summer internships?”

Currently the University of Leeds Careers Centre has an e-guidance system in place to cope with students' careers problems. This involves a student submitting their problem or question via an online web form; a careers advisor then emails the student back within a short time with a suggested answer to the student's query.

1.2 Project Aim

The aim of the project is to investigate the feasibility of using chatbot technologies to simulate careers

advisors on the internet.

1.3 Project Objectives

In order to satisfy the project aim, eight objectives were set:

• Familiarise with chatbot technologies
• Compare chatbot technologies with other alternative strategies
• Investigate the role of a careers advisor and how this could be incorporated into the simulation of a careers chatbot
• Familiarise with Artificial Intelligence Mark-up Language (AIML)
• Compile a corpus of careers conversations
• Set up a careers chatbot
• Test the chatbot appropriately, based on sample scenarios
• Evaluate the feasibility of a careers chatbot

1.4 The Minimum Requirements

The minimum requirements indicate the minimum work which could be completed in order to satisfy

the project aim and project objectives.


• Review of chatbot technologies
In order to build a chatbot it is essential to have a firm understanding of the technology and how it works.

• Identify the roles of a careers advisor and list the requirements for the careers chatbot
In order to simulate a careers advisor it is necessary to understand their roles and how they approach students' and graduates' problems.

• Compile a sample of careers interactions sufficient to initialise a careers chatbot
Based on knowledge gained from research, propose a number of questions and answers for a careers chatbot.

• Build a prototype of a careers chatbot
Build a chatbot based upon the analysis of the problem and background research. The design should follow a chosen software methodology.

• Evaluate the careers chatbot prototype
Assess whether the solution produced has been successful in meeting the aim of the project using set criteria. Finally, assess the project as a whole.

1.4.1 Possible Extensions

Any of the following extensions would go beyond the minimum requirements for the project:

• Evaluate/test the chatbot with actual end users
• Compare chatbot technologies to Frequently Asked Questions (FAQs)
• Undertake a number of iterations to tune the chatbot
• Take the system live with the Careers Centre

1.5 Time Plan

The initial time schedule for the project is shown in Appendix B; this shows the plan suggested in

October 2003. Throughout the project, changes were made to the schedule; the new schedule is shown alongside the old one in Appendix B, which explains and justifies the changes.


Chapter 2 - Background reading

The aim of this chapter is to define what a chatbot is, show example applications of chatbots, list the components necessary to build a chatbot, and describe the tagging that is necessary. The current Careers Centre system will be critically analysed, and the role of a careers advisor will also be investigated.

2.1 Review of Chatbot Technologies

Definition

A chatbot is a computer program aimed at simulating the conversation of a human being (Shawar &

Atwell, 2002). Chatbots can also often be referred to as Chat-bots, chatterbots or sometimes

conversational agents. The chatbots considered for this project are deemed to be non-intelligent

conversational agents. They take an input in the form of a query, run this query against a database and

then return an answer. Allen (1995, p541) suggests a chatbot cannot be intelligent since if it were to

receive an input that didn’t appear in the database it would not be able to construct an appropriate

answer. He refers to an intelligent agent as one with human-like motivations such as: i) perception ii)

beliefs iii) desires/wants iv) planning/reasoning v) commitments vi) intentions and vii) acting. The

seven characteristics would enable the chatbot to change continuously, updating its beliefs as it increases its knowledge through interaction.

One of the first and most famous chatbots was Eliza, a program written by Joseph Weizenbaum. It aimed to simulate a psychoanalyst by rephrasing many of the patient's statements as questions and then posing them back to the patient. The chatbot applied a user-supplied corpus to respond to questions or statements posed by the user. Eliza worked by simple pattern recognition and then applied reassembly rules to form an answer to the question. Eliza was thought to be so believable that the term the 'Eliza effect' entered the common vocabulary. It is defined by Howe (2004) as the tendency of humans to attach associations to terms from prior experience.

Since Eliza there have been many chatbots, the most prominent being Dr. Richard S. Wallace's chatbot ALICE (A.L.I.C.E. AI Foundation, 2001-2004). ALICE was created in 1995; it uses a language called Artificial Intelligence Mark-up Language (AIML), which is compliant with XML and uses pattern matching techniques to simulate intelligence. AIML will be discussed later in this chapter.


Example Applications of Chatbots

The chatbots Elizabeth and ALICE are adaptations of the Eliza program and are two of the most famous chatbots. Although both are based upon similar principles, they work in different ways. Shawar and Atwell (2002) suggest they both have a number of advantages and disadvantages. ALICE can hold a large corpus of text and is based upon simple pattern matching, thus making it easier to build in machine learning. ALICE also has the ability to split the user input into more than one sentence, reply to each sub-sentence and return an answer that is made up of the replies to the sub-sentences. This feature is not available within Elizabeth. Elizabeth provides grammatical analysis of sentences; it uses complex rules for which input/output transformations and keyword patterns are needed, both to process the user's input and to form Elizabeth's output.

There are already numerous real-life applications which use chatbot technologies. An example of a mainstream application is Steven Spielberg's A.I. movie website. This film is based on artificial intelligence and provides a chatbot on its website. This chatbot, shown in Fig 2.0, is based upon an early version of ALICE and uses case-based reasoning. Case-based reasoning is the process whereby a new problem is solved by basing answers on problems that have occurred in the past (Wikipedia, 2004). For example, a careers advisor who answers a student's query by remembering the answers to other similar queries is using case-based reasoning. Fig 2.0 below shows the interface a user would see while interacting with the chatbot.

Fig 2.0 - Screenshot from the A.I. movie website, showing the interface to the chatbot: the user types a question and the chatbot returns an answer (Spielberg, 2000).

A complex professional application of chatbot technologies is the Catacomb Project, which also uses the ALICE chatbot engine. It is a user-centred, conversational portal that extends the ALICE chatbot technology platform and links the user conveniently to information resources, such as Web Services, with specialized query routing (Ginsburg, 2002). This more complex chatbot demonstrates that it is still possible to use the underlying chatbot technologies to achieve an effective result.



A more practical professional application of chatbot systems is Mia (my interactive assistant) (1882direkt, 2004), a chatbot for the German bank 1882 direkt. Mia offers help to prospective customers wishing to apply for a new bank account; it uses Artificial Intelligence Mark-up Language (AIML) and was created by Smart Bot Technologies. Mia has proved to be a successful application of chatbot technologies, leading the way for other banks to use similar systems and proving highly popular within Germany. See Fig 2.1 for the interface which a customer is offered.

Fig 2.1 - Screenshot of Mia's interface, taken from 1882direkt (2004); Mia takes the user through the steps of setting up an account.

The example applications above show several domains where chatbot technology can be used. These can mainly be split into social and professional areas, where a social chatbot is one built for fun and a professional chatbot is one built for a specific purpose. There are also a number of new applications where chatbot technologies are currently being developed: dynamic characters that can interact using speech are being developed by Oddcast (Oddcast, 24th April 2004). They aim to offer conversational characters online which can deliver information and respond dynamically to customer queries. Pandorabots (2004) also offer a service whereby a chatbot can be linked with AOL Messenger so that it appears as a user online.

Chatbot Components

What is needed to create a chatbot?

In order to create a chatbot based upon the ALICE architecture, two components are necessary: i) software to interpret AIML and serve clients via the web, often called 'programs'; and ii) the corpus needed to create an ALICE brain, written in AIML (Wallace, 24th April 2004). In the context of this project, the corpus shall be previous careers queries and answers, and the software used shall be Pandorabots; these shall be discussed in chapter 5. The AIML tagging will be discussed later in this chapter.
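To make the second component concrete before AIML is discussed in detail, the sketch below shows roughly what such a corpus file looks like: an XML document whose <aiml> root element contains categories. The careers content is hypothetical and is not taken from the project corpus; AIML version 1.0.1 syntax is assumed.

<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal, hypothetical AIML 1.0.1 corpus file containing a single category. -->
<aiml version="1.0.1">
<category>
<!-- The pattern is the normalised user input the chatbot tries to match. -->
<pattern>WHAT DOES THE CAREERS CENTRE DO</pattern>
<!-- The template is the reply returned to the user. -->
<template>The Careers Centre offers advice, guidance interviews and CV checks to students and graduates.</template>
</category>
</aiml>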



The software to create an interpreter has been made open source, enabling many people to contribute to the advancement of the technology. The first version of ALICE used a widely known language called SETL, which was based on set theory and mathematical logic. Since then there have been many variations, and over 300 software programmers have contributed to the various programs; these have used, amongst others: Java and AIML; pre-Java 2 technologies that provided a class of editors and tools for AIML developers; C/C++ implementations; and newer Java implementations. One of the latest programs, 'Program D', is what ALICE currently uses; it is based on a newer Java implementation which focuses heavily on the interface (Wallace, 24th April 2004).

AIML

In order to build a chatbot it is necessary to understand what AIML is and how it is constructed. AIML, the Artificial Intelligence Mark-up Language, lets users give chatbots knowledge. It was developed from 1995-2000 by Wallace (27th February 2004) and the Alicebot free software community. It is compliant with XML and has tags similar to HTML. AIML as a language is fairly simple: it contains only letters, numbers and the wildcard symbols * and _. Words are separated by a single space, and wildcards can work like words.

AIML consists of data objects (AIML objects), which are made up of units called topics and categories containing either parsed or unparsed data. The topic is optional and allows a name to be set for the categories that follow it. The category unit is the main part of the AIML language; inside the category is an input question, an output answer and an optional context. The question is called a pattern and the answer is called a template. The two types of optional context are called 'that' and 'topic'. The AIML software stores the patterns in a tree-like structure managed by an object called the graphmaster; this controls the storage of the patterns and the complicated matching algorithms. Using a file storage analogy, the pattern "I LIKE LAW" would be stored as c:/I/LIKE/LAW/, where anything that begins with "I" would be in c:/I/ and everything beginning with "I LIKE" would be in c:/I/LIKE/.

Below is a graphical representation of the graphmaster: the spiral plots the categories in the ALICE brain and represents the root, which in the example above would be c:/. The branches coming from the root are the patterns.

Fig 2.2 - A graphical representation of the AIML graphmaster, taken from Wallace (2001). The spiral represents all of the categories; the darker areas show where there are more branches from categories.
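The optional 'that' and 'topic' contexts described above do not appear in the category examples that follow, so the sketch below (hypothetical careers content, assuming AIML 1.0.1 syntax) shows how they might be written. The <that> pattern is matched against the chatbot's previous reply, and the surrounding <topic> element restricts the category to conversations whose topic predicate has been set to PLACEMENTS.

<topic name="PLACEMENTS">
<category>
<!-- Matched only if the chatbot's previous reply matched the <that> pattern. -->
<pattern>YES</pattern>
<that>WOULD YOU LIKE HELP FINDING A SUMMER PLACEMENT</that>
<template>Have a look at the vacancies board in the Careers Centre first.</template>
</category>
</topic>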


AIML Categories

The category unit is made up of three sub-types: atomic, default and recursive categories. Atomic categories are those which contain no wildcards such as * and _. They take the following format:

<category>

<pattern> 10 interviews </pattern>

<template> that’s a lot </template>

</category>

If the user inputs: 10 interviews

Then the response would be: that’s a lot

Default categories are those which use the wildcards * and _; this allows a reduction process to happen.

<category>

<pattern> 10* </pattern>

<template> It is 10 </template>

</category>

If the user were to input 10 followed by something else (for example, "10 times a day") and no match was made on the full sentence, it would fall back to the answer for 10 on its own and return 'It is 10'.

Recursive categories are those with <srai> tags; the AI stands for artificial intelligence, and there is some debate about the SR, but most people call it symbolic reduction (Wallace, 24th April 2004). There are many applications of <srai>, listed below.

1. Symbolic Reduction - Reduce complex grammatical forms to simpler ones.

2. Divide and Conquer - Split an input into two or more subparts, and combine the responses

to each.

3. Synonyms - Map different ways of saying the same thing to the same reply.

4. Spelling or grammar corrections.

5. Detecting keywords anywhere in the input.

6. Conditionals - Certain forms of branching may be implemented with <srai>.

7. Any combination of 1-6.

Within this project, numbers 1, 2, 3 and 5 shall be looked at, due to time constraints and the time taken to manually code a chatbot.


1. Symbolic reduction is where complex sentence forms are turned into simpler ones. The simplest

possible form of saying something is normally stored in an atomic pattern. For example, DO YOU KNOW WHERE LEEDS IS? is the same as WHERE IS LEEDS?

<category>

<pattern>DO YOU KNOW WHERE * IS</pattern>

<template><srai>WHERE IS <star/></srai></template>

</category>

This category maps any input of the form "Do you know where * is?" to "Where is *?", where the text matched by * may be inserted into the reply by representing it with <star/>.

2. Divide and conquer is where a sentence can be divided into one or more sub-sentences; the reply is then made by combining the replies to each of the sub-sentences. For example, if a sentence starts with yes and then carries on, it may be treated as yes (part 1) followed by the rest of the sentence (part 2).

<category>

<pattern>YES *</pattern>

<template><srai>YES</srai> <sr/></template>

</category>

The markup <sr/> is simply an abbreviation for <srai><star/></srai>.

3. Synonyms are the most common use of <srai>; it can be used to recognize different ways of saying the same thing, for example HI and HELLO. AIML version 1.0.1 only allows one pattern per category.

<category>

<pattern>HELLO</pattern>

<template>Hi there!</template>

</category>

<category>

<pattern>HI</pattern>

<template><srai>HELLO</srai></template>

</category>

<category>

<pattern>HI THERE</pattern>

<template><srai>HELLO</srai></template>

</category>



5. Detecting keywords within sentences is most effective with the use of wildcards; they can be used to detect whether the word(s) are on their own, at the beginning of the sentence, in the middle, or at the end:

<category>

<pattern>telephone number</pattern>

<template>The careers telephone number is: +44 (0)113 343 5298

</template>

</category>

<category>

<pattern>_ telephone number</pattern>

<template><srai>telephone number</srai></template>

</category>

<category>

<pattern>telephone number _</pattern>

<template><srai>telephone number</srai></template>

</category>

<category>

<pattern>* telephone number *</pattern>

<template><srai>telephone number</srai></template>

</category>

AIML shall be looked at again in chapter 6, when the corpus for the desired chatbot is encoded. The techniques above should allow a chatbot to interact in a basic way, becoming more powerful when different techniques are used together, as the hypothetical sketch below illustrates.
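The categories below are a hypothetical sketch (the answer wording is invented, not taken from the Careers Centre corpus); they combine synonym reduction and keyword detection so that several phrasings of a placements question, including the chapter 1 example "where can I find out about summer internships?", all reach one atomic answer.

<!-- Atomic category holding the answer (illustrative wording). -->
<category>
<pattern>SUMMER PLACEMENTS</pattern>
<template>Summer placements and internships are advertised through the Careers Centre vacancies service.</template>
</category>

<!-- Synonym: a different phrasing is reduced to the atomic pattern with <srai>. -->
<category>
<pattern>SUMMER INTERNSHIPS</pattern>
<template><srai>SUMMER PLACEMENTS</srai></template>
</category>

<!-- Keyword detection: the phrase at the end of a longer input, for example
"WHERE CAN I FIND OUT ABOUT SUMMER INTERNSHIPS". -->
<category>
<pattern>_ SUMMER INTERNSHIPS</pattern>
<template><srai>SUMMER INTERNSHIPS</srai></template>
</category>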

Tagging

The aim of looking at tagging is to understand how AIML tagging works in general, and how AIML

differs from other mark up languages.

In order for a machine to be able to process information, it is an important requirement that the information has structure (Ding et al, 2002). The main structure for a web document is a document mark-up language, which allows the user to describe the layout of a webpage. HTML is widely used for this and contains tags such as <title> and <h1> that define the look and feel of a website. However, HTML is somewhat limited, as it can only be used for the rendering of the content.

XML is another mark-up language, a restricted subset of SGML (Bray et al, 2002), SGML being a framework for defining mark-up languages (Connolly, 1995). XML is a language designed for describing data in a (semi-)structured manner; it uses tags which have arbitrary names and can contain other tags or arbitrary data. The names of these tags can be chosen by the user. In order for a machine


to be able to process what’s within the tags, they have to be set out in a structured way. The example

below shows a XML representation of a book.

<book>

<author>Dr Richard Wallace </author>

<isbn>0123456789</isbn>

<title> Beginners Guide to AI </title>

</book>

An advanced form of tagging is Part of Speech Tagging: "This is the process of assigning a part-of-speech or other lexical class marker to each word in a corpus" (Jurafsky and Martin, 2000). This is different to HTML and XML, as it is concerned with the tagging of input strings. The process involves breaking down sentences into separate words and assigning tag markers to each word. These markers describe what each word is, for example: nouns, verbs, pronouns, prepositions, adverbs, conjunctions, participles and articles. One of the most popular tagsets is called CLAWS; the example below shows how each word is categorised by attaching a tag of a few letters using the CLAWS1 tagset (URCEL, 10th April 2004).

you PP2 can MD come VB tomorrow NR for CC your PP$ interview NN

The reason why tagsets are used is to give information about each word and the surrounding words.

This can be useful for a language model for speech recognition. Part of Speech tagging can also be used for information retrieval, as it is able to pick out nouns and other important words; this could be useful when studying large dialogues or corpuses.

AIML, the tagging used for many chatbots, is, as mentioned earlier in this chapter, compliant with XML and has tags similar to HTML. It can also be related to natural language and part-of-speech tagging, as it can involve analysing dialogues and picking out important words in order to form a response. AIML is not as simple as HTML, as it is not concerned solely with the layout of a page, but it is not as complicated as Part of Speech Tagging, since it does not break down individual strings to look at each separate word and give it a marker. It is closest to XML in that it "describes a class of data objects called AIML objects and partially describes the behaviour of computer programs that process them" (Wallace, 24th April 2004).

Key to the CLAWS1 tags used in the example above: PP2 personal pronoun; MD modal; VB verb; NR singular adverbial noun; CC co-ordinating conjunction; PP$ prenominal possessive personal pronoun; NN noun.


User Studies

The main goal for a chatbot is to mimic human conversation. The Turing test asked the question 'can machines think?' (Turing, 1950). The game was originally played with three people: a male (A), a female (B) and an interrogator (C). The interrogator is separated from the other two in a different room. The aim is for (C) to guess whether (A) or (B) is the woman. (A) and (B) have to persuade (C) that they are in fact the woman, (B) telling the truth and (A) trying to deceive them. The test then substituted a machine for (A); the direct comparison was not to see whether the machine could fool (C), but how the machine compared with the man who had also tried the role of (A).

An adaptation of the Turing test called the Loebner Prize Contest has become prominent within the chatbot domain (Loebner, 24th April 2004). The prize began in 1991, when Dr. Robert Epstein organised the first contest and Dr. Hugh Loebner pledged $100,000 for the first computer to pass the test. The test involved 10 agents, 6 of which were chatbots. A team of 10 evaluators would then rank the 10 agents in order of the most convincingly human. At first the conversations were limited to a specific topic, but in later contests this was changed so that the topic became open.

A number of authors are sceptical about the Turing test and the Loebner Prize. Searle (1980) believes that this is just a way of fooling people into believing that they are interacting with another human. He gives the Chinese room analogy, in which he sits in a closed room with a slot in the door through which slips of paper with questions written in Chinese characters are passed. Searle does not understand Chinese, but he does have a codebook of instructions in English which tells him how to develop answers. He prepares the answers on more slips of paper and pushes them through the slot. The answers appear to make sense to the Chinese speaker outside, but Searle argues that the person in the room does not understand Chinese; they simply follow a set of instructions to process an answer.

Allen (1994) also rejects the idea that a conversational agent can be intelligent, asking what would happen if it were given a question that was not in its database. He poses the question "what motivates it to say anything or to attempt to comprehend the utterances that are said to it?" Perhaps, though, this has similarities with human conversation: if you were to ask someone a question to which they did not know the answer, they could not give you a meaningful answer back.

Other sceptics include Shieber (1994), who argues that intelligence should not depend solely on surface behaviour, and that Turing chose natural language as a test of human intelligence because it is by its very nature open-ended and free-wheeling; this quality was taken away by the Loebner Prize when topic restriction was introduced.


Despite these negative views, Loebner (24th April 2004) responded by arguing that the unrestricted test is simpler, less expensive and still the best way. Loebner also replied to the criticism by justifying the Loebner Prize, discussing three points:

1. “AI scientists and philosophers regularly discussed the test, yet no one had taken steps to

implement it.” (referring to the Turing test)

2. “I believe that this contest will advance AI and serve as a tool to measure the state of the art.”

3. “The third purpose of the prize was to perform a social experiment”

Successful Chatbots

Although the Loebner Prize has attracted some criticism, there have been successful applications of chatbot technologies. The fact that chatbots have been introduced to banking shows that the technology is slowly being trusted, and through the movie Artificial Intelligence chatbots have been exposed to a larger audience. These chatbots have been designed for a specific task; this can be compared to the Loebner Prize and its topic restrictions, and may explain why they have proved popular.

Pandorabots is a popular chatbot hosting service; it believes that the full capabilities of chatbot technologies have yet to be discovered and that further successes will come in the near future. Pandorabots believes that with upcoming developments such as voice recognition, text-to-speech synthesis, invoking programs on remote or local machines, content development tools, e-commerce and cartoon animations, we are coming closer and closer to bringing chatbots to 'virtual life' (Pandorabots.com, 24th April 2004).

Implications for this project

Currently there seems to be no indication that chatbot technologies have been considered in the careers domain. ALICE does not have a careers chatbot listed in its advertised directory (A.I. Foundation, 24th April 2004), nor does its recommended 'chatterbot collection' link (The Chatterbot, 2004). Although the careers domain has not been looked at before, there is no evidence to show that it is inappropriate. It can be indirectly compared with other areas, as it concerns a specific area of interest that can be questioned, assuming an answer is available to the question asked. It is similar to Mia, the banking chatbot mentioned earlier, as it offers advice on a topic, and it can be compared to Eliza, whose purpose was to replicate the conversation between a psychoanalyst and a patient, much as the aim of the careers chatbot is to replicate the conversation between a student and a careers advisor. The chatbots discussed are all related to a distinct area of questions, similar to the approach the careers chatbot would take; this restricted topic may increase the chance of simulating a careers advisor capable of convincing the user that they are interacting with another human. Further comparison with alternative solutions shall be made in chapter 8.


2.2 The Current System

The following information is a summary taken from an interview which took place in early November (see Appendix C) with Caroline Ramage (IT manager, University of Leeds Careers Centre) to discuss the current system. The aim was to find out what, if anything, was wrong with the current approach to answering students' queries. Some of this information is based upon knowledge of working practice within the Careers Centre, gained as a result of direct contact with the current system.

The current system is called an e-guidance system; it is aimed at Leeds students and graduates who find it difficult to visit the Careers Centre in person. It is an online system that enables students to fill in a web form with their details and then submit a query. This query is then accessed by the current duty advisor, who usually answers it within three working days.

The current system has three main problem areas. The first is the most important from the student's perspective: if they have an immediate query that they want answered, this is not always possible. Phone calls could be made; however, careers advisors are often busy, so this is not always feasible. The second problem is that the current e-guidance system has not yet fully taken off. With over 30,000 students at Leeds University, there is the possibility that if the system becomes popular the careers staff could be weighed down with more queries than they can handle, meaning the response time would become even longer. The third problem is the time taken to answer short questions posed by students, especially when they are repeated. The Careers Centre staff have tried to solve this by saving some of their emails and the answers they give out in a folder in Microsoft Outlook, referring back to them as and when they remember a similar question. This has had limited success and is itself a time-costly exercise: actually finding a response that suits a query and adapting it for the needs of the next student can prove difficult. Another problem often experienced is the variable quality of answers given to queries. If an advisor has a lot of time to spare on a particular day, the response to one student's query may differ greatly from that given the next day, when an advisor may have only five minutes to spare.

2.3 Role of a Careers Advisor

In order to make a chatbot believable it is necessary to mimic human behaviour; in the context of a careers chatbot this means mimicking the answers given by careers advisors. Therefore, the role of a careers advisor has been investigated. The aim was to see the processes involved in answering students' queries and the knowledge required to give an informative answer. The day-to-day responsibilities of a careers advisor have also been looked at. The following information was taken from the same interview as detailed above (see Appendix C).


At Leeds University there are no formal requirements to become a careers advisor, although courses have now been put in place so that qualifications can be gained. The qualifications suggested are a Professional Diploma in Career Guidance and a careers qualification through AGCAS (the Association of Graduate Careers Advisory Services). Although these qualifications are recommended, many of the staff at the Careers Centre joined without them and from a range of different backgrounds; for example, Caroline Ramage worked for IBM and now specialises in seeing students with an interest in computing.

A careers advisor typically has a number of different responsibilities: teaching modules, drop-in interviews, CV clinics, guidance interviews, workshops and talks, as well as answering queries through the email guidance system.

The process of answering students' queries involves a careers advisor logging onto an email account that receives the queries sent via a web form. A response is then sent back to the student. This process is shared between a number of different careers advisors. The process of answering the questions can differ from advisor to advisor; there is no set way of answering a student's query. One advisor, for example, may go onto a search engine and find the relevant information for the student, while another may only point the student in the direction of that search engine.

Implications for this project

From the above information it is clear that the whole process of careers advice by email is poorly defined, from how to become a careers advisor to the way students' queries are answered. The current system is far from ideal, causing problems for the careers advisors and for the students and graduates requiring answers to their queries. This opens up an opportunity to solve the problem with a new system. Chatbot technologies could potentially do so by addressing the three main problems mentioned earlier:

1. The time taken to respond to each query

2. The potential that the demand for this system could increase further, leading to longer delays in the time taken to respond

3. The time and effort taken dealing with many repetitive simple questions posed by the students.


In order to see if a chatbot could solve these problems, a mock-up chatbot system will be made; this will examine whether a chatbot can answer careers queries and to what extent it can behave like a careers advisor when interacting with real end users.

In order to create such a careers chatbot a number of things would be required:

• A service/program to host the chatbot and interpret AIML

• Knowledge of AIML

• A collection of careers-related questions and answers written in AIML

• Specific careers knowledge

These shall be looked at in chapters 4 and 5.


Chapter 3 – Methodology

Methodology Review

The design of every software product requires a methodology. Below is a selection of different

methodologies taught by the School of Computing as well as an example from Preece et al (2002).

3.1 Lifecycle Model for Interaction Design

This model is suggested by Preece et al (2002, page 186); it incorporates iteration and encourages a user focus. It has four main stages: identifying needs and establishing requirements, generating alternative designs, building interactive versions of the designs, and evaluating the designs (see Fig 3.0). An advantage of this model is that, following the evaluation, the designer may return to refine or redesign. Furthermore, this evaluation process can also involve users, thus allowing their views to be acknowledged and incorporated in the next design iteration. Preece et al (2002) suggest that implicit in this cycle is the idea that the final product will emerge in an evolutionary fashion, from a rough initial idea through to the finished product.

Fig 3.0 - Diagram of the lifecycle model for interaction design, taken from Preece et al (2002).

3.2 The Waterfall Lifecycle Model

The waterfall lifecycle was developed by Royce (1970). It was the first famous model within software engineering and is still a popular basis for many methods today. There are five main stages in the model: requirements/analysis, design, implementation, testing and maintenance (Fig 3.1). To progress to the next stage the previous stage must be completed, with only limited feedback between stages. This approach has the problem that requirements often change over time, and the model does not fully support the need for continual feedback. Furthermore, this model is now considered old-fashioned or too simplistic by users of object-oriented design, who often use the spiral model instead. The need for feedback to earlier stages was acknowledged as both desirable and indeed practical soon after the lifecycle became widely used (Preece et al, 2002, p188).



Fig 3.1 - The waterfall lifecycle model of software development

3.3 The Rational Unified Process

The Rational Unified Process, or “IBM Rational Unified Process®, or RUP®, is a flexible software

development process platform that helps you deliver customized yet consistent process guidance to

your project team.” (IBM, 15th February 2004).

The Rational Unified Process covers several recommended software engineering practices and gives a comprehensive set of guidelines to follow during the development process. Kruchten (2000) defines these as:

defines these as:

• Develop software iteratively.

• Manage requirements.

• Use component-based architectures.

• Visually model software.

• Continuously verify software quality.

• Control changes to software.

The RUP is a more object-orientated approach, splitting the system into component-based architectures. This process is more suited to a project that has a number of different system components. The documentation suggested by the RUP uses techniques such as Use Cases and Class Diagrams.

3.4 Choice of Methodology

Although the Waterfall model and the Rational Unified Process have been taught within the School of Computing, the methodology chosen to implement a chatbot was the lifecycle model for interaction design. The waterfall model is based around a fixed process that does not support feedback throughout the design stages. The Rational Unified Process can be a very complex design model with a number of different tools available, and is geared towards an object-orientated approach. The advantage of the lifecycle model for interaction design is that it offers a different approach, depending heavily on user feedback throughout development. This enables iteration within the design process, creating an evolutionary design. Such features would suit the creation of a chatbot, as it could be implemented


over a period of time. This would allow refinements to be made at the design and requirements stages before a final product is produced, allowing a better corpus and interface to be developed over time. The high level of user involvement would give the chatbot a greater chance of meeting user needs and increase its attractiveness to potential users.

3.5 User Involvement

The above choice of methodology means that there will be a high level of user involvement. User

involvement has become a relevant issue since it has been accepted that not all IT projects are

successful. The Standish Group (1995) produced a report stating “In the United States, we spend more

than $250 billion each year on IT application development of approximately 175,000 projects” and

“research shows a staggering 31.1% of projects will be cancelled before they ever get completed.

Further results indicate 52.7% of projects will cost 189% of their original estimates.” These statistics

indicate a problem with the way in which projects are being conducted. The report subsequently details that successful projects depend on a number of different factors, user involvement being the most significant for project success. User involvement is also deemed to be an important issue by Kendall and Kendall (1999, page 5), who state that "some kind of user involvement throughout the systems project is critical to the successful development of computerised information systems".

As well as simply involving users in the design process, the user must feel that the system meets their

requirements. A successful system is one that is accepted and used by its users. Willcocks and Mason

(1987, page 76) suggest that “increased participation… may improve the quantity and quality of data

collected and the quality of problem definition. It can also increase people’s acceptance of the

solution”

In the case of the careers chatbot, user involvement could be used to gain feedback over a number of iterations to improve different aspects of the chatbot, such as the corpus (through additions or alterations to the AIML) and the interface.

Problems with user involvement

The key to user involvement is how the users/testers are chosen, at what point in the design they are involved, and to what extent. Some users may do more harm than good: Willcocks & Mason (1987, page 83) state that "user understanding of technical objectives, proposals and solutions may be very limited"; this could hinder the design process, as users may ask for unrealistic requirements, thus

creating unrealistic system expectations. Clark (2003) suggests that different users may have different

priorities when giving feedback and can often change their views when asked to look at the problem

from other points of view.


Chapter 4 - Requirements

The methodology chosen identifies the first step as gathering needs and establishing requirements. Requirements, as defined by Sommerville (2001), are used to establish exactly what the system should do. User requirements can also be thought of as the services the system is expected to provide and the

do. User requirements can also be thought of as the services the system is expected to provide and the

constraints under which it must operate. In order to apply this to the chatbot it was necessary to refer

back to chapter 2 to identify what is required for a chatbot.

1. A service/program that would be able to host the chatbot and interpret AIML

2. Knowledge of AIML

3. A collection of careers-related questions and answers written in AIML

4. Specific careers knowledge

Point 1 will be covered in chapter 5 and point 2 has already been covered in chapter 2. In order to find

out about the other requirements, data would have to be gathered.

4.1 Data Gathering

In order to collect suitable data on which requirements can be based it is essential to choose the correct

data-gathering techniques. According to Preece et al (2002, page 210) the purpose of data gathering is

to collect sufficient, relevant, and appropriate data so that a set of requirements can be produced. Preece et al (2002, page 210) suggest a number of basic techniques: questionnaires, interviews, focus groups and workshops, naturalistic observation, and studying documentation.

Questionnaires – A set of questions used to extract specific information. Often used for large groups of people, especially when they are spread over a large geographic area. Questionnaires can restrict the person filling them in to a set of pre-defined answers.

Interviews – Involves asking someone a set of questions, often face to face, although this can be done over the telephone. Can be structured, unstructured or semi-structured; the interviewer does not have to stick to the original questions. A time-consuming process, which can lead to an incorrect outcome due to the possibility of the person interviewed being intimidated.

Focus groups and workshops – Different from an interview as, by its very name, this involves a group of people, usually the stakeholders in the project. The groups/workshops can be structured or unstructured. Useful for highlighting areas of conflict between members of the group, and encourages contact between developers and users. A time-consuming process.


Naturalistic observation – Involves spending a lot of time with the stakeholders around their daily

tasks. This can show what happens in a natural setting. Can involve shadowing a stakeholder for a

period of time, asking questions as you go along. This requires a lot of time and can leave you with a

lot of data.

Studying documentation – Involves studying manuals, regulations governing tasks, diaries and job logs. Good for getting background information. Unlike the other techniques, this does not involve stakeholders.

Choice of Data-gathering Method

In order to gather data on requirements, interviews and naturalistic observation were undertaken. Naturalistic observation was chosen because, during the author's year in industry, time was spent working alongside a number of careers advisors; this gave the opportunity to spend a large amount of time observing actual careers work, giving an insight which other techniques could not. Interviews provided one-on-one discussions, allowing in-depth questions to be asked and allowing questions to be flexible and to change according to the answers to previous questions. This method was chosen because the author had taken a year in industry within the Careers Centre, which presented an opportunity to meet with careers contacts without the intimidation factor that might otherwise have occurred.

Interview

An initial interview with Caroline Ramage (Careers Advisor) took place to find out about the current

system and the areas that could be improved (see chapter 2). The highlights from this interview can be

seen in Appendix C. Another meeting took place with Caroline at which data was collected, in the form of students' email questions and careers advisors' answers within the current e-guidance system. These then formed the basis of a corpus of questions and answers for the careers chatbot. The data collected comprised a total of 498 emails received between October 2002 and January 2003. The emails had been split into categories by the careers advisors who use the e-guidance system. Currently there are 47 categories (see Appendix D). The emails collected were used to meet point 3 mentioned at the beginning of this chapter, 'A collection of careers-related questions and answers written in AIML'.

Corpus Selection

Due to time constraints and the amount of AIML coding that it would take to cover the whole careers

domain, subsections of the careers domain had to be chosen for the careers chatbot. These subsections

were taken from the pre-defined categories that the careers centre staff already use. According to

Shawar and Atwell (2002), in order to train a chatbot, dialogue corpuses should have the following

characteristics:


Structured format – In the case of a chatbot this would be a question followed by an answer to the

proposed question.

Short obvious turns without overlapping – This applies more to speech than to a text-based chatbot. It is not possible for someone to ask a question and receive an answer at the same time.

No unnecessary notes, expressions or other symbols that are not used when writing text – The query must be plain text, with no extra symbols such as # or @ and no mathematical or other expressions.
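As a hypothetical illustration of these characteristics (the answer wording is invented, not taken from the e-guidance emails), the example question from chapter 1 could be reduced to a short question-and-answer pair and encoded as a single AIML category:

<category>
<!-- Structured format: a short question followed directly by its answer. -->
<!-- Plain text only, a single turn, and no symbols such as # or @. -->
<pattern>WHERE CAN I FIND OUT ABOUT SUMMER INTERNSHIPS</pattern>
<template>Summer internships are advertised through the Careers Centre vacancies service and on employers' own websites.</template>
</category>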

Category Analysis

In order to decide which categories the careers chatbot should cover, it was necessary to examine each of the categories and compare them against the criteria above. The top seven categories were examined as they had an acceptable number of queries; the other categories did not have enough queries to give a fair representation and would therefore be of no use when constructing the chatbot. Appendix E shows an analysis of each of the top seven categories, applying the criteria Shawar and Atwell suggest above. Table 4.0 below shows the top seven categories and the number of queries received in each area.

Table 4.0 - Top 7 query categories.

Classification Number of queries

CV and application form checks 113

General Careers Advice 56

Postgraduate study in the UK 38

Law 33

Industrial year and summer placements 31

Working / studying abroad 25

Teaching 23

Choice of Categories

Analysis showed that each of the seven categories had similar problems: both the questions and the answers were too long, contradicting Shawar and Atwell's (2002) view that dialogues with short questions and answers are better for the construction of a chatbot. Another problem highlighted was the regular occurrence of complex questions or, in the case of some categories, personal statements sent as part of the query. A chatbot would not handle this type of interaction well, as chatbots are primarily designed to answer short, fairly simple questions.


After examining each of the top seven categories, the ones chosen were 'general careers advice' and 'Industrial year and summer placements'. General careers advice was chosen because it has a large scope of questions available, with the content generally not going into too much depth; it also has a higher percentage of shorter questions than most of the other categories. Industrial year and summer placements was chosen due to its medium-sized scope, with many questions having similar answers but with enough variation to justify the choice. This area also gave the opportunity to draw upon the author's experience of a placement year and to call upon a number of peers with experience in the area who were willing to take part in the testing process.

4.2 System Requirements

In order for the careers chatbot to function properly and answer questions within the chosen categories, it was necessary to set a number of system requirements. These have been drawn from the initial analysis of chatbots in chapter 2 and from the data gathering techniques. According to Preece et al. (2002, p205) there are two types of requirements that need to be identified: functional requirements, which describe what the system should do, and non-functional requirements, which describe the constraints on the system and its development. The functional requirements have been split into three further sections (essential, desirable and optional) in order to manage the project better, so that the most important parts can be addressed first. To display the requirements at the correct level of abstraction with the correct information, the 'Volere' process (Robertson and Robertson, 1999), shown in Preece et al (2002), has been adapted: a description of each requirement is stated, followed by its rationale and source.

Functional requirements

Essential Requirements

The essential requirements are the minimum features needed to meet the minimum requirements for this project, focusing on the requirement 'Build a prototype of a careers chatbot'. The following requirements will ensure that this happens:

Requirement: The system shall provide an interface to the chatbot; this shall be via a HTML page.
Rationale: This is how the user shall interface with the chatbot.
Source: Information collected in chapter 2.

Requirement: The chatbot should be knowledgeable on 'general careers advice' and 'Industrial year and summer placements'.
Rationale: The knowledge of the careers chatbot has been limited due to the time constraint and the time taken to create the AIML code.
Source: Corpus selection earlier in this chapter.

Requirement: The chatbot should provide the facility to type a question.
Rationale: This is how the user shall ask the chatbot a question.
Source: Information collected in chapter 2.

Requirement: The chatbot should provide a facility to return an answer to the question posed.
Rationale: This is how the user shall receive an answer to the questions posed.
Source: Information collected in chapter 2.

Desirable Requirements

In order to go beyond the minimum requirements it would be necessary to have a system with greater

capabilities than the suggested essential requirements.

Requirement: The chatbot should return a relevant answer to the questions posed; for example, if the question was about computers, the answer returned should be about computers.
Rationale: The purpose of the chatbot is to give a relevant answer to students' questions; without this the chatbot would be redundant.
Source: Interview with Caroline Ramage, careers advisor (see Appendix C).

Requirement: The chatbot shall have a help function with an explanation of how a user can interact with it.
Rationale: This will provide learnability, increasing ease of use and enabling the user to become familiar with the system, therefore achieving maximum performance.
Source: Clark (2003).

Requirement: The interface should provide an alternative way of contacting the careers centre.
Rationale: This is another form of support if the chatbot is not working correctly or sufficient information is not given.
Source: Clark (2003).

Requirement: The interface shall provide links to other careers services.
Rationale: This is similar to the existing system, as the careers centre website has links to other sites.
Source: The University of Leeds Careers Centre (24th April 2004).

Optional Requirements

Optional requirements are those which do not affect the success of the project; such requirements would be thought of as a bonus. These may be implemented depending on the time available:


Requirement: The system should allow the botmaster/careers advisor the functionality to edit the corpus.
Rationale: If questions or answers within the corpus were to become outdated, this function would allow correct answers to be maintained over time.
Source: Interview with Caroline Ramage.

Requirement: The chatbot should be able to respond to the user via speech instead of text.
Rationale: This would help people with disabilities have greater access to the functionality of the chatbot.
Source: Brewer (24th April 2004).

Non-functional requirements

Requirement: The text response should be of an appropriate level of understanding.
Rationale: The target user group is university students and graduates; therefore it can be assumed that they are educated to understand a suitable level of detail.
Source: Naturalistic observations.

Requirement: The HTML page should have an appropriate layout and colour scheme.
Rationale: An inappropriate appearance can lead to users becoming frustrated and can affect usability.
Source: Preece et al (2002, p152).

Requirement: The text response should be clear and legible.
Rationale: The user should be able to read his/her own query and to read the response easily.
Source: Naturalistic observations.

Requirement: The chatbot should return answers to questions clearly, for example in another colour.
Rationale: To differentiate between the answer and other text on screen.
Source: Naturalistic observations.

Requirement: The chatbot should return answers to the user immediately, without a time delay.
Rationale: Web users are often impatient; Nielsen (1999) says that if a site does not provide immediate gratification, they leave.
Source: Nielsen (1999).

Requirement: The system should allow the student/user to receive a quick, relevant response to the query asked via speech or a simulated face of a careers advisor.
Rationale: To some extent this can make the chatbot more believable; a voice or physical appearance can help a user judge a reaction better.
Source: Preece et al (2002, p160).

Summary

This chapter has detailed what the final chatbot should be able to do and what it must do in order to

meet the aim stated in chapter 1. The requirements were derived from the research in chapter 2 and from the data gathering techniques. It has been decided that the chatbot shall answer questions on 'general

careers advice’ and ‘Industrial year and summer placements’ due to the time constraints of this project.


Chapter 5 - Design

The design of the chatbot is based upon meeting the requirements stated in chapter 4. The design has been split up into three main sections: i) Pandorabots, the service that shall be used to create the chatbot; ii) corpus design; and iii) interface design. This is based on Avison and Shah's (1997, p202) approach to interaction design. They suggest that one should consider the inputs which the users enter into the system, the outputs which the users receive from the system, and the dialogues through which the users and the system interact.

5.1 Pandorabots

In order for a chatbot to be built, Pandorabots will be used. It is a software robot (also known as a bot)

hosting service. From any browser, one may create and publish a robot for anyone to use via the web.

(Pandorabots, 2004). This service is based on AIML, as mentioned in chapter 1, and closely follows the work of Dr. Wallace and A.L.I.C.E. In the context of the careers chatbot, AIML can be uploaded to the Pandorabots website to extend the basic chatbot's knowledge.

Why Pandorabots?

Pandorabots was chosen as it has a number of advantages. It is a free service allowing anyone to

create, maintain and publish a chatbot. It provides the opportunity to upload your own AIML code

and extend the basic chatbot interface. Pandorabots was also recommended by Bayan Abu Shawar of

the University of Leeds, an expert in the field of chatbots. However, one disadvantage of using this service is that the Pandorabots website is subject to regular changes as enhancements are constantly added to it. For example, since the beginning of this study the website has undergone a radical change in appearance and gained a number of new functionalities, such as the option to combine a chatbot with AOL's instant messenger facility. These changes result in a need to back up AIML files offline, as there was an increased possibility that the website could be down and AIML files could be lost.

Features of Pandorabots (see Fig 5.0 below):

• Start with added knowledge – when setting up a chatbot there is the option to give the chatbot a base knowledge; this would enable social interaction and the ability to answer simple questions.

• Upload AIML – the chatbot's knowledge can be extended by uploading self-made AIML files.

• Upload HTML – the interface to the chatbot can be altered by uploading a customised HTML page with embedded code to handle the inputs and outputs of the chatbot.

• Publish to web – this feature allows the chatbot to interact over the internet.

• Publish with Oddcast VHost – gives the chatbot an animated face as well as the ability to output the answer via speech rather than text. This is an advanced feature that costs under $10 per month.

• Conversation logs – the botmaster can look at the interactions between users and the chatbot over a set period of time. Such information could be used to tune the chatbot by looking at regular questions and problem areas. The IP address is also shown in the logs, along with a function enabling the botmaster to pinpoint the location of the user asking the question; this is done via http://www.geobytes.com.

• Set basic properties – this feature allows basic properties of the chatbot to be set up without any knowledge of AIML. For example, one can choose the chatbot's age, birthplace and favourite food (a sketch of how such properties can be referenced from within AIML is given after this list).

• Convert dialogue to AIML – this feature allows users to convert basic text into AIML form; it is limited as it can only handle atomic categories.

• Training – this function allows one to type in a question that the chatbot should be able to answer; an answer is returned and, if it is not suitable, the response can be changed to an answer of your choice.
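The basic properties set through this menu surface in AIML as bot predicates. The following is a minimal sketch, not taken from the project's own AIML files, of how a category might reference such a property (the pattern, wording and property name are illustrative assumptions):

<category>
  <pattern>HOW OLD ARE YOU</pattern>
  <!-- <bot name="age"/> is substituted with the "age" property set via the Pandorabots menu -->
  <template>I am <bot name="age"/> years old.</template>
</category>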

Fig 5.0 - Screenshot of Pandorabots menu.

The hardware requirements that are necessary to interact with a chatbot hosted by pandorabots are

listed with Appendix N.


5.2 Corpus Design

In order for the careers chatbot to be able to answer questions, the knowledge to answer these questions must be accumulated prior to interaction. In the case of the careers chatbot it was necessary to have a dialogue of careers interactions beforehand. This dialogue enabled example questions and answers to be extracted. Without this dialogue it would not have been possible to formulate realistic questions; there would have been a lack of example interactions on which questions and answers could be based, and the questions would then have been reliant on the botmaster's knowledge, which may be limited.

The emails gathered in chapter 4 would not normally be permitted to be seen by other students, as they contain students'/graduates' email addresses, contact details, personal information and their posed questions and given answers. Because the author worked in the careers centre for a year, he was trusted with such data; without this, the study would not have been possible. From this dialogue, questions and answers were drawn.

Choice of questions / answers for the corpus

In order for the careers chatbot to interact with the user about general careers advice and industrial

year / summer placements it is necessary to have a collection of questions and answers. These

questions and answers have been generated from the email dialogues collected in chapter 4. The email

dialogues come in the raw form that the user sent; for the chatbot to make use of them, they have to be changed into a suitable format. The first step is to check that they match Shawar and Atwell's (2002) criteria from chapter 4, to see if they would make a good corpus dialogue. The emails often contained queries that were of an inappropriate size, so questions had to be picked out of long queries; in many instances this required taking out unnecessary information around the desired question. For example, a student may send a query including his/her name, age, department and other details amongst the query. In a case like this the question had to be extracted and sometimes changed slightly in order to be more general. Once this has been done, some of the initial questions and answers may need to be re-worded in order to match the criteria. For example, a question from an email may be as follows:

Q: Hi, my name is Andrew I am a second year Biology student and was wondering what time does the

careers centre open?

Re-worded: what time does the careers centre open?

Once the dialogue has been re-worded it also has to go through a normalisation process, which analyses and retains the correct information from the original input, splits up a sentence after suitable pauses or breaks, removes unnecessary characters and punctuation, and converts letters into uppercase (Shawar and Atwell, 2002). This leaves the questions and answers in a format that can be coded into AIML; this is shown in chapter 6.
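To indicate where this process leads, the re-worded question above would be normalised and then encoded as an atomic AIML category along the following lines (a sketch only; the answer text is illustrative and not taken from the project's corpus):

<category>
  <pattern>WHAT TIME DOES THE CAREERS CENTRE OPEN</pattern>
  <!-- Illustrative answer text; the real template would be drawn from the careers advisors' replies -->
  <template>The Careers Centre opening times can be obtained from the Careers Centre or its website.</template>
</category>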

5.3 Interface Design

The interface design is an important factor of the design stage that needs to be addressed, since this is

how the user will interact with the careers chatbot. If the interface for the chatbot is poorly designed, this could affect its overall effectiveness and usability. Nielsen, J., Coyne, K.P., and Tahir, M. (2001)

argue that usability is often the most neglected aspect of Web sites, yet in many respects it is the most

important. Shneiderman (1998) and Nielsen (24th April 2004) both suggest different ways for

designing a good interface. Shneiderman uses ‘The Eight Golden Rules of Interface Design’ and

Nielsen uses his ‘Ten Usability Heuristics’. The most applicable to the careers chatbot have been

taken from the above design techniques and are listed below.

• Strive for consistency

o Colour, layout, text

• Offer informative feedback

• Permit easy reversal of actions

• Aesthetic and minimalist design

• Help and documentation

The interface provided by Pandorabots originally had a plain white background with black text (see

Fig 5.1). Pandorabots has the facility whereby the botmaster can upload their own HTML to personalise their chatbot. In order to do this, the following HTML must be embedded within the main body of the original HTML:

<body onLoad="document.form.input.focus();">
Welcome to my bot <br>
<font color="green">!OUTPUT!</font> <br>
<form method="POST" name="form">
!CUSTID! say: <input type="TEXT" autocomplete="off" name="input">
</form> <br>
</body>

This code includes a form where the user can enter text to send to the chatbot and a place where the chatbot's last output will be inserted.


Fig 5.1 - Screenshot of the original careers advisor: a very simple interface, plain white with an input box, with all text in black.

Current careers centre website

The chatbot currently has no design to follow; potentially it could be influenced by the design of Leeds University's current careers centre website. However, at the moment the careers centre website has a number of design problems, listed below.

• No distinct first page

• The careers centre logo should be at the top left of the website

• There is a distracting logo in the background of the main body of text

• The main navigation menu is on the right hand side of the screen, and is not distinct from

other text.

• The colour scheme varies across the site in various bright glaring colours

• It is very easy to lose track of where you are within the website; navigation should be a more important feature.

The chatbot could therefore try to solve some of these problems, turning them into positive aspects of the new design, whilst at the same time maintaining the identity of the current website.

Layout

Veen (2001), in Preece et al (2002), divides the layout design into three areas: 'Where am I?', which runs across the top and gives the website some kind of identity; 'Where can I go?', the menu system down the left-hand side; and 'What's here?', where the main content lies in the middle of the website. This is shown in Fig 5.2. This layout suits a careers centre style website, allowing improvements to be made to the navigation, maintaining the identity of the careers centre and matching Nielsen's 'aesthetic and minimalist design' heuristic.


Fig 5.2 - Veen’s screen layout taken from Preece et al (2002).

Colour

Currently the careers website follows a predominantly green and black colour scheme. This results in certain sections of the text being difficult to read. The new design will therefore keep the same colours but use a white background for the main body of the website, with hints of green in the title and navigation sections.

Text

Text shall be in black on a white background, similar to the design of the basic chatbot, ensuring the visual clarity of the text. Nielsen (2000) states "Use colors with high contrast between the text and the

background. Optimal legibility requires black text on white background (so-called positive text).”

Summary

This chapter has shown areas that need to be considered so that the minimum requirements are met.

The practical application of the design is shown within chapter 6 where the chatbot has been

implemented.


Chapter 6 - Iteration / prototype evaluation

Following the methodology chosen in chapter 3, the lifecycle model for interaction design relies

heavily on iterative design and user feedback. Several iterations took place; this process entailed user

involvement as discussed in chapter 3, with the chatbot system evolving over time. The text below deals with: i) the online location of each iteration; ii) its features (both layout and corpus); iii) the testers; iv) the testing process; and v) the feedback from tests and the changes made.

6.1 Iteration 1

The website where the chatbot is available online:

http://homepage.ntlworld.com/devblock/careers/index2.html.

Features

The first iteration's approach was to meet the project's minimum requirements. Atomic categories were used within the AIML to match the user's input; this required the user's input to exactly match the patterns within the AIML. Synonyms were also used so that the AIML could recognise the user asking questions in different ways; the synonyms were created by asking peers and past graduates to generate different ways of asking the same questions. To find suitable questions and answers for iteration 1, Shawar and Atwell's (2002) criteria for a good corpus, as discussed in chapter 4, were applied to the two careers areas: general careers advice, and industrial year and summer placements. Once suitable questions and answers had been found from the original email dialogues, the normalisation process discussed in chapter 5 took place. Once normalised, the questions and answers were coded into AIML as discussed in chapter 2. The example below shows each step of this process for a question on placements abroad.

Original format:
Hi! I had a careers interview in July, but my circumstances have now completely changed. I'm looking to go to Spain in Jan and then South America. I would be grateful for any info on work/ experiences in these places. Can you help? Thanks,

Relevant question re-arranged:
Old: I would be grateful for any info on work/ experiences in these places.
New: I would like some information on placements abroad

Normalised:
I WOULD LIKE SOME INFORMATION ON PLACEMENTS ABROAD

AIML atomic category:
<category>
  <pattern>I WOULD LIKE SOME INFORMATION ON PLACEMENTS ABROAD</pattern>
  <template>the answer goes here</template>
</category>

AIML atomic within synonyms:
<?xml version="1.0" encoding="UTF-8" ?>
<aiml version="1.0">
  <category>
    <pattern>I WOULD LIKE SOME INFORMATION ON PLACEMENTS ABROAD</pattern>
    <template>the answer goes here</template>
  </category>
  <category>
    <pattern>CAN YOU HELP ME FIND A PLACEMENT ABROAD</pattern>
    <template><srai>I WOULD LIKE SOME INFORMATION ON PLACEMENTS ABROAD</srai></template>
  </category>
</aiml>

This process was a time-costly exercise for the first iteration, as there were no other iterations to build on.

Layout

The website layout has followed the suggestions made in chapter 5. The menu system has been kept

very minimal, allowing easy access to all of the main sections. Each page on the website is consistent,

using the same menu system down the left hand side. The main body of the website is always in the

same place and the colours and text stay the same throughout. The colours for the website have been based on those of the careers centre: the menu on the left-hand side is a light green, with the buttons appearing dark green until the mouse moves over them, at which point the button highlights red, indicating that that section is available to click. The main background is white, with black text. The layout can be seen in

Fig 6.0.


Fig 6.0 - Screenshot of the website showing the general layout

Navigation

The website consists of five different sections: the homepage, careers chat, contact us, help and links sections. These sections are clearly labelled on the menu system down the left-hand side of the screen. The homepage is an introduction to what is on offer and how to access the functions available. The careers chat section is where the careers chatbot is based; this has a link to the Pandorabots website where the main service and knowledge resides. Here the user types a question in the box provided, presses enter, and the answer appears in green above the box; this is shown in Fig 6.1. The contact us section of the website has various methods of contacting Leeds University Careers Centre, including the address, telephone number and fax number. The help section has simple instructions on how to interact with the careers chatbot and information on what to do if the desired answer is not obtained; Clarke (2003) suggests that some kind of support is needed within a system when starting up or going live. The links section offers links to other careers-related websites; these have been taken from the careers centre website and Graduate Yorkshire (24th April 2004), a popular graduate website.


Fig 6.1 - Screenshot showing the careers chatbot and how it interacts.

Iteration 1 Testing

To receive feedback on the first iteration of the chatbot, testing/evaluation was conducted. This was

done in three different ways.

1. Testing to see if the careers chatbot can answer questions in the areas stated in the

requirements stage.

2. Heuristic evaluation, against the criteria proposed in chapter 5 taken from Shneiderman (1998)

and Nielsen, Coyne and Tahir (2001).

3. A diary for general comments on problems encountered.

The testing process was based upon two methods. The first was keeping a diary, which enabled testers to say exactly what went wrong and gave them the opportunity to record what they thought about their interactions with the chatbot and any problems (Preece et al, 2001, p377). The second method was similar to tests performed by Microsoft's Virtual Worlds Research Group and by librarians and clinicians at the Fred Hutchinson Cancer Research Centre, shown in Preece et al (2001, p324). Specific tasks were set for each tester to complete; these required them to find out pieces of information and allowed the user to question the chatbot in areas that had been encoded, for example finding out what time the careers centre opened.

The tasks were given to the testers in the form of a printed piece of paper. Before taking the tests, Shneiderman (1998) suggests it is professional practice for the tester to sign a contract stating that they have freely volunteered for the experiment and have been told in advance what their tasks are and how they should conduct the test. The testers should be given the chance to ask questions and should be aware that they can leave at any time. The tester should also sign the test, ensuring all the information stated is correct.

The test comprised three tasks. The first task was to become familiar with the careers chatbot, following the help function which shows the user how to interact with it. Testers had to rate how easy task 1 was on a 1 to 4 scale, 1 being strongly agree, 2 agree, 3 disagree and 4 strongly disagree; this was used in order to force the tester to make a choice between good and bad. Task 2 was to retrieve specific information from the chatbot: 15 questions were set in the areas that had been encoded, for example 'Can you find out the opening times for the careers centre?' (the full list of tasks is in Appendix F). Testers were asked to answer yes or no in this section and again were encouraged to give feedback about their interaction experience and any problems encountered. In task 3 the testers evaluated the interface and general layout; this also enquired about how realistic they thought the chatbot had been and how useful they found it. The scoring system was the same as for task 1 and again feedback was requested. The results from these tests were then analysed, comparing what testers struggled with and whether they experienced problems in the same areas. This also gave the opportunity for positive feedback.

The Testers

There were three testers for iteration 1, all of whom had undertaken placements while at university. Placement students had been chosen as they already had experience of enquiring about placements and were more likely to ask relevant questions than a typical student. The testers were all undergraduate students studying a computing-related degree and were between the ages of 21 and 23; there were two male testers and one female, all of whom were volunteers. Three users were used initially as this was a very early prototype and mistakes were easy to spot; the behaviour of the system was noticeably insufficient as a careers chatbot. Nielsen (2001) suggested a minimum of three users to gain a benefit in terms of cost; however, this is more relevant to a larger-scale project. The tests were taken in the student's natural environment, at home on their own computer; this allowed the tester to feel at ease and simulated realistic conditions. During the tests the testers were left alone, so that a student could not ask for outside help; also, the Hawthorne effect suggests that a user's behaviour changes if being watched (Wikipedia, 2004).

Feedback

The results from iteration 1 were as expected. The three testers all had similar problems and opinions within each of the three tasks. All of the testers could easily navigate through the website and used the help function correctly in order to start interacting with the chatbot. Task 2 caused widespread problems, with a very low number of questions being completed, the highest being Neil Hickman with 4 out of 15 answers found. This was because the AIML did not match the users' exact inputs; for example, a tester input the following:

Human: what do students do after graduation

Careers Advisor: They do their own thing.

Correct input: what are the graduate destinations?

The careers chatbot replied with an answer that was not suitable for the tester's question. This happened for the majority of the questions. The incorrect outputs showed that atomic categories within AIML did not perform adequately on their own and that it was very hard to predict the way a user was going to phrase a question.
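As a sketch of why the atomic approach failed here (the pattern wording is assumed from the 'correct input' above, and the answer text is purely illustrative), the chatbot held only hand-written patterns for each topic, none of which matched the tester's phrasing:

<category>
  <pattern>WHAT ARE THE GRADUATE DESTINATIONS</pattern>
  <!-- Illustrative placeholder; the real template came from the careers corpus -->
  <template>Information on graduate destinations would be returned here.</template>
</category>

Any other phrasing, such as 'what do students do after graduation', matches none of the careers categories and falls through to the general knowledge inherited from A.L.I.C.E., which produced the unhelpful reply above.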

The testers also commented on the lack of politeness of the chatbot. For example, the chatbot replied with "You tell me" and "They do their own thing"; these responses came from Dr. Wallace's knowledge inherited by the chatbot. Task 3 gave similar results: all of the testers found the interface, layout, colour and text appropriate, although one user commented that it was "slightly boring"; this might be attributed to the minimalist interface design suggested in chapter 5. All the testers agreed that interacting with the chatbot was not like chatting to a real careers advisor and that they did not find the service helpful. Below in Table 6.0 is a summary of the results collected for iteration 1 (see full testing in Appendix G). The full conversation logs for each user can be seen in Appendix H; these show the exact user inputs and the responses the chatbot gave that justified the users' feedback.

Table 6.0 – Summary of feedback from iteration 1.

Neil Hickman
Task 1 (explore chatbot website): Help function was ok.
Task 2 (information retrieval, number of completed questions): 3/15 completed.
Task 3 (interface, usefulness): Interface was good; didn't believe he was talking to a careers advisor, or find the chatbot helpful.

Louise Manley
Task 1: Help function was easy to use.
Task 2: 1/15 completed.
Task 3: Design was a bit boring; found the chatbot to be rude and didn't find it helpful. Didn't believe it was real.

Chris Burton
Task 1: Help function was ok.
Task 2: 4/15 completed.
Task 3: Design was good but found the chatbot to be rude and didn't find it helpful. Didn't believe it was real.


6.2 Iteration 2

The website where the chatbot is available online:

http://homepage.ntlworld.com/devblock/careers/index3.html

Changes from iteration 1

The immediate problem from iteration 1 was the need to increase the success rate on task 2. Previously the failures had been due to the testers' inputs not exactly matching the patterns that the chatbot held. Instead of matching the exact phrase entered by the user, and outputting the answer only if there was an exact match, the use of keywords was examined. This was possible using the wildcards * and _ within AIML, which were used to recognise whether a word (or phrase) appeared within a sentence, whether at the start, middle or end. Table 6.1 below shows the different ways in which each tester tried to find out information on step placements in iteration 1; every question included 'step placement' within the input.

Table 6.1 - Table showing user inputs from iteration 1.

Neil Hickman: "do you have information on step placements?"
Louise Manley: "what is a step placement"
Chris Burton: "step placement"

Once the method of using keywords was chosen, all of the previous AIML coding and the user inputs had to be examined in order to identify the keywords. This was a fairly simple although time-consuming task. Most of the keywords were the names of the topics, e.g. 6 month placements or step placements. A generic, broader answer then had to be generated so that every keyword had the same answer; this answer was constructed by analysing the previous answers to create a well-balanced broad answer. The AIML was then constructed with the new wildcard tags, and duplicate AIML code had to be removed. An example of the process is shown below.

The keyword ‘step placement’ was chosen

what is a step placement

Simulation of a Careers Advisor using Chatbot Technologies

- 38 -

AIML was constructed allowing the keyword 'step placement' to be detected anywhere in a sentence:
<?xml version="1.0" encoding="UTF-8" ?>
<aiml version="1.0">
  <category>
    <pattern>STEP PLACEMENT</pattern>
    <template>STEP is a UK-wide programme offering undergraduates project-based work within small to medium sized businesses and community organisations. For more useful information visit http://www.step.org.uk/</template>
  </category>
  <category>
    <pattern>_ STEP PLACEMENT</pattern>
    <template><srai>STEP PLACEMENT</srai></template>
  </category>
  <category>
    <pattern>STEP PLACEMENT _</pattern>
    <template><srai>STEP PLACEMENT</srai></template>
  </category>
  <category>
    <pattern>_ STEP PLACEMENT *</pattern>
    <template><srai>STEP PLACEMENT</srai></template>
  </category>
  <category>
    <pattern>* STEP PLACEMENT *</pattern>
    <template><srai>STEP PLACEMENT</srai></template>
  </category>
</aiml>

Examples of possible inputs with 'step placement' in the sentence, and the response returned to all of these inputs:
Input: step placement?
Input: what is a step placement?
Input: is a step placement any use to me?
Input: step placement or summer internship?
Output: STEP is a UK-wide programme offering undergraduates project-based work within small to medium sized businesses and community organisations. For more useful information visit http://www.step.org.uk/

Testing iteration 2

The testing for iteration 2 followed the same structure as the iteration 1 testing. It comprised the same three tasks, the difference being that for task 1 the testers had more freedom to ask their own questions. The testers were asked to see if they could retrieve any useful information from the chatbot; this used no structured tasks and was designed to allow greater feedback by looking at the careers chatbot conversation logs. This also allowed the chatbot to be gradually introduced to social conversation. A feedback box was introduced to allow testers to suggest further improvements and give general comments on the careers chatbot. The questions for all of the tasks are shown in

Appendix I.

The testers

In order to undertake the second iteration testing it was necessary to use testers who have had no

previous contact with the careers chatbot; therefore, more individuals were required. These individuals


came from different academic departments with different levels of computing experience; this differed

from the first iteration, where all the testers were from within the School of Computing. This was to

increase the scope and variety of the answers. There were four female testers, studying Computing, English and History, Spanish and Economics, and Broadcasting and Journalism; three were finalists and one had graduated the previous year, and all had undertaken placements. The testing conditions were the same as for iteration 1.

Feedback

The results from iteration 2 showed an improvement over iteration 1. The most noticeable difference was the number of questions answered within task 2. In iteration 1 the highest number of questions anyone got right was 4 out of 15, whereas in iteration 2, using keywords, the success rate rose dramatically, with the best result being 12 questions out of 15 and the worst being 4 out of 15. Although success was achieved within this task, testers did not always find the answers to the questions immediately. Tasks 1 and 3 produced similar results to iteration 1: the help function was usable, but yet again the testers found the chatbot to be unlike a careers advisor, and 3 out of 4 found the chatbot to be unhelpful. The testers again commented on some of the answers returned; one tester called the answers bizarre. The feedback box at the end also highlighted the lack of depth within the answers given; this problem could be related to the use of keywords, as they may have to cover a broader context compared to an atomic category match. An interesting feature of some of the tests was that the testers reverted to using keywords after periods of unsuccessful questions; although this did prove successful for one of the testers (Lydia Otero), it implies that they did not believe that they were speaking to a real careers advisor. Below in Table 6.2 is a summary of the results collected for iteration 2 (see full testing in Appendix I). The full conversation logs for each user can be seen in Appendix J; these show the exact user inputs and the responses the chatbot gave that justified the users' feedback.

Table 6.2 – Summary of feedback from iteration 2.

Tessa Price
Task 1 (explore chatbot website): Could understand the help function but wanted more depth on how to re-phrase questions.
Task 2 (information retrieval, number of completed questions): 4/15 completed.
Task 3 (interface, usefulness): The interface received good feedback but no helpful information was retrieved. Found the chatbot responded to questions with inappropriate questions.

Theresa Longbottom
Task 1: No written feedback, but marks showed a poor response to the questions asked.
Task 2: 7/15 completed.
Task 3: No feedback, but gave the interface high marks. The marks indicated that the chatbot was not useful.

Lydia Otero
Task 1: Didn't have a problem with the help function. Could start a conversation but commented on the lack of answers.
Task 2: 9/15 completed.
Task 3: Interface was ok. Wanted more depth within the answers. Commented on the number of 'bizarre' answers. Didn't find the chatbot helpful overall.

Amy Jennette
Task 1: Help function did what was necessary; could start a conversation.
Task 2: 12/15 completed.
Task 3: Interface was ok. Found it relatively helpful due to a good response rate. Would prefer a better explanation of the service offered.

6.3 Iteration 3

The website where the chatbot is available online:

http://homepage.ntlworld.com/devblock/careers/index4.html

Changes from iteration 2

The feedback from iterations 1 and 2 was the initial basis for the changes made in iteration 3. The chatbot's rudeness and the irrelevant answers that it sometimes returned had been a problem throughout the previous two iterations. The problem stemmed from the AIML which had been inherited from Dr. Wallace's 2001 chatbot ALICE. The AIML code was inspected and edited in order to give politer answers, relevant to the careers domain, when the chatbot was unable to answer a question; the code was also edited so that it no longer asked the user a question in return. The AIML inherited from ALICE 2001 was not editable within Pandorabots, so it had to be deleted, recreated and then added back into the AIML set. Fig 6.2 and Fig 6.3 below show part of the original code and the changed code that generated a random response from a list of possible answers. These responses were used when the chatbot did not recognise the input, which in AIML is matched by the pattern *.

Fig 6.2 - Screenshot of AIML code from Dr. Wallace's AIML: when the chatbot does not recognise the pattern, a random response, which is sometimes a question, is returned.

Fig 6.3 - Screenshot of the new AIML, showing the new careers-relevant responses.
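As the figures are screenshots, the AIML itself is not reproduced here; the following is a minimal sketch of the kind of fallback category involved, where the pattern * catches any input that no other category matches (the reply wordings are illustrative assumptions, not the project's actual responses):

<category>
  <pattern>*</pattern>
  <template>
    <random>
      <!-- One of these careers-relevant replies is picked at random for unrecognised input -->
      <li>I am sorry, I do not understand. Could you re-phrase your question?</li>
      <li>I do not have an answer for that; please try asking in a different way or contact a member of the careers staff.</li>
    </random>
  </template>
</category>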

This improved the careers chatbot's manners to some extent, but during ad hoc testing rudeness was still a common problem. This was a result of the amount of AIML inherited and the detail into which it went. The choice was then made to remove the knowledge inherited from Dr. Wallace's AIML; this would mean that far fewer social or general questions would be answered, but that the careers chatbot would be less rude and would give a standard response explaining that it did not understand and suggesting that the user re-phrase the question or contact a member of the careers staff. The third iteration therefore had basic social phrases encoded into AIML, such as hello, goodbye and thank you inputs.

From analysis of the previous two iterations it was noticeable that testers would ask questions in a variety of different ways. Therefore, more synonyms and a variety of different keywords were added to enable the user to ask questions about a wider range of areas; these synonyms were added by studying the logs of the previous testers. Alongside the synonyms picked from the logs, the careers website was investigated, picking topics that occurred regularly and choosing keywords such as 'CV', 'personal statement', 'joblink' and 'application forms'. The keyword 'help' was also briefly introduced to allow users to get direct help; during ad hoc testing this proved to be a problem, as 'help' often appeared in many questions, causing the chatbot to answer them incorrectly. Feedback from the testers also highlighted the need for the chatbot to take plurals into account. In the previous iterations this had not been done, so testers may have been only slightly off the encoded wording but still received no response. For example, a tester may have asked about 'step placements' when only 'step placement' was encoded and got no response; to prevent this, 'step placements' was added in the third iteration, as sketched below.

The final change for iteration 3 was to extend the keywords to allow certain phrases. This approach represented a midpoint between atomic matching and keyword matching, allowing part of a sentence to be matched but with a blank word or phrase within it, represented by the wildcard *. Below is an example where 'find a * placement' can appear anywhere in a sentence and the chatbot will return the same answer as for 'find a job'. The * can be replaced by any word or phrase, for example 'find a law placement'.


Testing iteration 3

The testing for iteration 3 followed the same structure as iterations 1 and 2 with a number of set tasks.

Task 1 was similar to the previous iterations, allowing the testers to question the chatbot freely. For the third iteration testers were encouraged to spend longer on task 1, as this was closer to simulating an open conversation. Task 2 contained the same questions as the previous two iterations. Task 3 focused on the chatbot as a whole: whether the testers found the experience helpful, whether they retrieved relevant information and whether they believed that they were interacting with a real careers advisor. Task 3 also consisted of a number of questions based on the interface of the chatbot; this was to ascertain the extent to which it met the interface requirements stated in chapter 5. An extra task was included enabling testers to give general comments about their experience with the chatbot and suggestions for future improvements; this task was encouraged as it allowed the testers to express their views in an unstructured manner.

The testers

The third iteration of testing required another set of testers not involved in the previous tests. For this iteration two expert testers were invited to test the system, as well as two students. The two students were from the School of Computing, one male and one female, both of whom had undertaken placements. The expert testers were from two different backgrounds. Darren Scott is the student placement advisor at Leeds University Careers Centre; he specialises in the field of placements and liaises with students who require placements and with external businesses and organisations to discuss


possible vacancies. Bayan Abu Shawar is currently studying towards a PhD at Leeds University and has published a number of papers on chatbots and their capabilities; many of her papers are referenced in this project. The combination of an expert in placements and an expert in chatbots, alongside the students' perspective, allowed the chatbot to be evaluated from a number of different angles. The placement advisor assessed the correctness of the questions and answers, Bayan tested the system and examined the way the AIML was constructed, and the students gave the opinions of real testers. The tests for all testers took place in a similar manner to the first iteration, where the testers were left on their own during testing.

Feedback

The results from the final iteration showed a variety of different viewpoints: those of the students, the chatbot expert and the placements expert. The three views provided valuable feedback. The responses to task 1 showed that everyone could use the help function and managed to ask the chatbot questions. Two testers, including the placement advisor, commented on the lack of depth within the answers, while the other two testers struggled to get the chatbot to answer many questions. Task 2 showed another progression from iteration 2, with testers answering on average 12.5 questions out of 15, the highest being a student with 14 out of 15 and the lowest being the placement advisor with 9 out of 15. This shows there was a continual improvement through each of the iterations and the different AIML techniques that were implemented. Task 3 showed that all of the testers liked the interface and had no problems with the layout or colour; Bayan commented that perhaps the text should be a little larger. Testers highlighted some problems that would need to be fixed if further iterations were to be undertaken. Two of the testers requested that the external links be made into hyperlinks. Three testers thought that the answers returned were too simple and often quite repetitive. Darren also raised the issue of security, asking who would be able to see the questions that students asked and whether anyone would be able to see the logs of the conversations.

After implementing the changes from iteration 2 and removing Dr. Wallace's AIML, 3 out of 4 testers found the chatbot to be polite; the other did not believe that he was chatting to a careers advisor, so found this question not applicable. The chatbot's helpfulness rating also increased, with 3 out of 4 users finding the chatbot helpful, although the placement advisor commented that it was only helpful for very basic answers. All the testers agreed that the chatbot was not like chatting to a real careers advisor. Below in Table 6.3 is a summary of the results collected for iteration 3 (see full testing in Appendix K). The full conversation logs for each user can be seen in Appendix L; these show the exact user inputs and the responses the chatbot gave that justified the users' feedback.


Table 6.3 - Summary of feedback from iteration 3.

Bayan Abu Shawar
Task 1 (explore chatbot website): Found the help function useful; could get the chatbot to answer some questions.
Task 2 (information retrieval, number of completed questions): 14/15.
Task 3 (interface, usefulness): Commented on the nice design of the website. Suggested a bigger font for the chatbot response. Didn't believe that she was chatting to a real careers advisor but found it helpful.
Task 4 (feedback): Suggested using more <srai> to map to original questions.

Darren Scott
Task 1: Found the help function easy to understand. Found the chatbot could answer about 50% of questions asked, but the answers were too general and often the same.
Task 2: 9/15.
Task 3: Liked the interface and didn't have any problems with layout, colour etc. Suggested that more specific answers to the questions be given. Didn't believe that he was chatting to a real careers advisor. Found the chatbot helpful for basic information.
Task 4: This system could be good for basic queries if more combinations of answers were utilised. Could take the workload off advisors. Doesn't think a chatbot can simulate a careers advisor.

Leon Savidis
Task 1: Found the help function easy to understand. Could get answers using some terms but no real advice.
Task 2: 14/15.
Task 3: Liked the interface and didn't have any problems with layout, colour etc. Thought there should be more content to the answers. Didn't believe it was a careers advisor; didn't find it helpful and got annoyed with the error message asking to re-phrase the question.
Task 4: Thought there should be more terms that could be recognised; maybe have clickable keywords or clickable hyperlinks.

Jemma Pauley
Task 1: Found the help function ok. Found that the chatbot didn't understand many queries.
Task 2: 13/15.
Task 3: Got a slow response when asking a question. Found the interface ok. Found the same answers were repeated many times. Didn't believe that she was chatting to a real advisor, although found the chatbot quite useful.
Task 4: Would like a function that recognises and corrects spelling mistakes. Would like the query echoed on screen. Would like a facility to save the responses to answers found.


Summary

This chapter has given the details of the three iterations and shown how the chatbot has evolved over time. Each of the iterations has been described, tested, changed and then re-evaluated following user testing and analysis of the feedback. This feedback has enabled improvements and corrections to be made, often to mistakes that would otherwise have been overlooked. Conclusions about the success and feasibility of the chatbot, drawing on all three iterations, shall be made within chapter 8.


Chapter 7 - Evaluation

This chapter documents the evaluation of the project. In order to evaluate properly, objective criteria

were established. The project shall be judged against whether it has achieved the minimum

requirements and to what extent it has extended them. The methodology has also been examined as

this was critical to the project's success and formed the basis for how the author approached the problem. The project schedule shall be scrutinised to see whether the project kept to time.

Finally the system requirements will be evaluated to see if the chatbot met the essential, desirable and

optional requirements.

The aim of the project was to investigate the feasibility of using chatbot technologies to simulate

careers advisors on the internet; this shall be discussed within chapter 8.

7.1 Minimum Requirements

In order to meet the aim it was necessary to complete the minimum requirements for this project.

Below is a list of each requirement and how each was met.

Review of chatbot technologies

Chatbot technologies have been reviewed within chapter 2 – Background Reading. The knowledge gathered within chapter 2 provided a good platform for the rest of the project, setting out core knowledge on which to build.

Identify the roles of a careers advisor and list the requirements for the careers chatbot

The role of a careers advisor was investigated via naturalistic observations and an interview (see Appendix C) with Caroline Ramage, a careers advisor. This is summarised within chapter 2 – Background Reading. This work allowed a greater understanding of the problem definition stated within chapter 1.

Compile a sample of careers interactions sufficient to initialise a careers chatbot

The data gathering methods discussed in chapter 4 were used to compile an initial corpus of

questions and answers ready to be coded into AIML. These questions were then used to make

iteration 1 of the careers chatbot (see chapter 6).


Build a prototype of a careers chatbot

The interactions in the above requirement were used to make the initial iteration of the careers chatbot. This is documented within chapter 6 under the heading 'Iteration 1' and is available at

the URL: http://homepage.ntlworld.com/devblock/careers/index2.html

Evaluate the careers chatbot prototype

The first iteration was evaluated during testing by the author and also by users who had experience

within the domain. This is shown within chapter 6.

7.2 Extensions

These were additional requirements that were set for completion if the author had the appropriate time

and resources available.

Evaluate/test the chatbot with actual end users

The project methodology chosen allowed users to be involved in the continual evaluation of the chatbot. Chapter 6 shows the feedback that users gave after every iteration, while chapter 7 shows the users' views on the feasibility of the chatbot within the careers domain.

Comparing chatbot technologies to Frequently Asked Questions (FAQs)

Chapter 8 compares the final iteration of the careers chatbot to a number of different solutions

including Frequently Asked Questions.

Undertake a number of iterations to tune the chatbot

This ties in with the above extension. Three iterations took place and are documented within

chapter 6.

Take the system live with the Careers Centre

This has not been implemented due to the results of the feasibility analysis of the chatbot; these

results are shown within chapter 8.

7.3 Methodology

Chapter 3 documents the project's research into possible design methodologies. A variety of methodologies were researched, from a traditional structured approach to more flexible models. The final decision was to use the Lifecycle model for interaction design. This methodology enabled iterations, allowing the chatbot to evolve through stages of feedback. The use of iterations fitted in well with the project, as it enabled the chatbot to adapt to feedback, the requirements to be changed, and the process to be repeated in order to tune the chatbot. The problem with this design process is that, in the context of this project, the iterations could form a never-ending loop; it was tempting to change the chatbot again and again merely to add one feature that may have been missed. Although this methodology worked effectively for this project, it may be better suited to a larger-scale project which has more time and resources, allowing more iterations that go into greater depth.

7.4 Time Schedule

This has been evaluated as it is an important feature of any piece of work; it is the organisational basis

for a successful project. There is little point having the ideas and technical skills if you are left with

no time to apply them. The initial time schedule was produced in October when the project first started; this has since changed and is documented in Appendix B. The first schedule produced left much work to be performed during the second academic semester; the feedback in the Mid-Term Report showed that the project assessor felt that the time schedule was unrealistic and needed re-addressing. A new time schedule was discussed and drawn up in early semester two; this broke the project into smaller tasks with mini-deadlines, allowing the project to keep on track. This proved to be a success and worked well with the choice of a flexible methodology, which complemented the schedule changes.

7.5 System Requirements

These have been evaluated as they are the key requirements in order to produce a chatbot. Different levels of requirements were set within chapter 4: essential, desirable and optional. Their very names suggest their importance to the project. Below are the details of the extent to which they have been met and how, and if not, why.

Essential Requirements

The system shall provide an interface to the chatbot; this shall be via an HTML page

The HTML pages are shown in chapter 6; they are used to display the careers chatbot. This has gone beyond the minimum requirement, as a mini careers website has been produced.

The chatbot should be knowledgeable on ‘general careers advice’ and ‘Industrial year and

summer placements’

This careers chatbot has been encoded with knowledge in both areas; the extent to which it can be

called ‘knowledgeable’ is discussed in detail in chapter 8.

The chatbot should provide the facility to type a question

The interface designed in chapter 5 includes HTML code to allow a text box where users can type

their queries; this is shown in a graphical form within chapter 6.


The chatbot should provide a facility to return an answer to the question posed

This is an important feature as it is how the knowledge is transferred to the prospective user. The

chatbot shown in chapter 6 responds to a user's query by outputting green text on screen.

Desirable Requirements

The chatbot should return a relevant answer to the questions posed, for example if the

question was about computers, the answer returned should be about computers

This has been partly met after three iterations. The chatbot was able to answer some of the users' questions, but not all. This is discussed in detail in chapter 8.

The chatbot shall have a help function with an explanation of how a user can interact with it.

This has been met, as there is a help function on the website hosting the careers chatbot. This proved to be a valuable function as it meant that the user could obtain their own understanding of the system rather than asking the developer questions.

The interface should provide an alternative way of contacting the careers centre

The chatbot system has met this requirement by having an extra HTML page providing the contact

details of the careers centre; the chatbot can also be asked for the contact details of the careers centre.

The interface shall provide links to other careers services

The chatbot system has met this requirement by having an extra HTML page providing links to

other useful careers websites.

Optional Requirements

The system should allow the botmaster/careers advisor the functionality to edit the corpus

This feature was considered too advanced for a project of this size.

The chatbot should be able to respond to the user via speech instead of text

Due to time restrictions this feature could not be implemented; Pandorabots, the chatbot hosting service, advertises a service that would help implement such a feature.


Suggestions for further work

The following suggestions follow on from the work which has been carried out on this project; they

have been based upon the conclusions made in this chapter and chapter 8.

Implement the optional requirements – this would enhance the chatbot's interaction and allow a careers advisor to add their own knowledge, therefore making the developer redundant once the initial chatbot has been built.

Try a natural language program similar to The Persona Project (discussed in chapter 8) – this would allow advanced interactions and potentially better query answers; the problem with this approach is the time taken to implement it.

Try a domain that has specific questions and answers – in the careers domain it proved difficult to construct answers that were adequately specific for users' needs. Another project may consider the

application of chatbot technologies within a domain that has a very specific set of questions and

answers.

Summary

The careers chatbot has met and exceeded many of the requirements set out at the start of this project.

This chapter and the next chapter (8) show the areas that have been successful and also those where

improvement is needed; suggestions have also been made for further work that could be implemented

by other students considering similar projects.


Chapter 8 - Conclusion

The aim of this project was to investigate the feasibility of using chatbot technologies to simulate

careers advisors on the internet. In order to meet this aim it was necessary to:

• Familiarise with chatbot technologies

• Investigate the role of a careers advisor and how this could be incorporated into the simulation

of a careers chatbot.

• Familiarise with Artificial Intelligence Mark-up Language (AIML)

• Compile a corpus of careers conversations

• Setup a careers chatbot

• Test the chatbot appropriately, based on sample scenarios

• Evaluate the feasibility of a careers chatbot

All of these tasks were undertaken through chapters 2 to 6. The points above allowed the feasibility of a careers chatbot to be viewed from several perspectives: i) the view of the developer, who has undertaken the methodology chosen in chapter 3 and has gone through the stages to build a working careers chatbot; ii) the view of the end users, in this case testers with placement experience; and iii) the view of the expert testers, who had detailed knowledge of chatbot technologies and placements.

8.1 The Feasibility

Developer’s perspective

The author of this project acted as a developer. The comments here concern his experience on the

project. From a developer's point of view the feasibility study assessed the benefits of the project compared to the amount of time and resources required to implement the final solution. Although the careers domain initially seemed feasible due to similarities with the Loebner prize, which also involved a restricted topic of conversation, it proved problematic (Loebner, 24th April 2004). The

developer concluded that the careers chatbot wasn’t able to feasibly simulate a careers advisor. This

conclusion was based upon a number of deciding factors:

The time taken to build the chatbot was significant compared to the success of the chatbot. The process of manually searching through the email dialogues, finding appropriate questions and answers, then undertaking the normalisation process was a time-costly exercise even before the AIML coding; an example is shown within chapter 6. The coding itself was the most time-consuming aspect; although the code was not complex, some time was required for familiarisation. Each category had to be hand-coded in Notepad, then copied across to Pandorabots and tested. The project's progress was closely tied to the learning curve of its author. Over the first two iterations a traditional approach was adopted using atomic categories; the approach then progressed with the use of wildcards and synonyms to improve results. This process had to be performed on every area of interest that was previously an atomic category. This significant effort was not reflected during testing within chapter 6, since many of the users still could not get the chatbot to respond with correct answers. These negative results may have given the impression that the corpus size was not sufficient and that there was not enough AIML code; this was true to some extent, but equally the testers failed to find large amounts of AIML code that the developer had implemented.
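To illustrate the kind of hand-coding involved, below is a minimal AIML sketch. The patterns and answer wording are hypothetical, loosely based on the placement queries in the appendices rather than taken from the project's actual corpus. It shows a single atomic category of the sort used in the first two iterations, followed by the wildcard form that the later iterations moved towards.

<?xml version="1.0" encoding="UTF-8"?>
<aiml version="1.0">
  <!-- Atomic category: matches one exact, normalised wording only. Every
       question/answer pair taken from the email dialogues becomes a block
       like this, typed by hand and uploaded to Pandorabots for testing. -->
  <category>
    <pattern>HOW DO I FIND A SUMMER PLACEMENT</pattern>
    <template>The Careers Centre has a work experience adviser who holds a
      surgery on Tuesday afternoons between 2 and 4pm.</template>
  </category>

  <!-- Wildcard category added in a later iteration: any input containing the
       key phrase is reduced to the atomic category above. Further variants
       (for the phrase at the start or end of an input) are still needed,
       which is where much of the coding effort went. -->
  <category>
    <pattern>* SUMMER PLACEMENT *</pattern>
    <template><srai>HOW DO I FIND A SUMMER PLACEMENT</srai></template>
  </category>
</aiml>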

The need for a good conversation corpus that meets Shawar’s (2002) criteria suggested in chapter 4.

In the case of this project it was feasible to examine the data gathered from the careers centre but this

was only possible due to working relations with many careers advisors. It would not normally be

possible for someone building a system to access such sensitive data. This would create the problem that you could have an expert in careers who knew nothing about chatbot technologies such as AIML, or an expert in AIML who knew nothing about careers advice. Although it was advantageous to have access to the email dialogues, the expert testers in chapter 6 suggested that in order to create a

better chatbot it would be necessary to study multiple dialogues. This would enable more questions

and answers to be derived and increase the number of synonyms within the key areas of interest.

The scope that the chatbot currently covers within the careers domain is minimal and the chatbot still

failed to cope with variation in questions asked. The first two iterations highlighted this as a problem:

the testers often used different words to express the same thing. This was combated by asking the

users to complete tasks; however, each iteration of the chatbot still struggled with unstructured

questions. This again can be attributed to the constraints: if all areas within careers were covered the

AIML coding would take months and would still not guarantee that the chatbot met the user’s needs.

Technical problems that occurred while building and publishing the chatbot. Often Pandorabots (the hosting service for the chatbot) would go offline, meaning that there was no access to the chatbot. Development would cease and uploading of AIML was no longer possible. Such problems affected the testing process for the developer, as he was no longer able to test AIML as he encoded it. This problem can be related to the chatbot service being dependent on a third-party organisation over which the developer had no control.

Human factors were a major influence during the implementation and testing of the chatbot. The

first problem encountered was collecting the questions and answers from the email dialogues. This proved difficult as the majority were very specific questions with answers that were based upon the careers advisor's personal knowledge and experience. This made it difficult to formulate the broad questions and answers that were necessary in order for the chatbot to be applicable to many users.

Another problem was controlling what the users asked the chatbot; it was hard to predict the format of

the questions. Users often reverted to keywords, signifying that they did not believe that the chatbot

was a real careers advisor; this is shown in chapter 6. Also it was not uncommon for a user to ask

questions that were outside of the careers domain, often asking bizarre questions as they got a feel for

the system.

The need to update the corpus as and when careers information changes. In the case of the careers domain this would have serious consequences, as many students may be given out-of-date information. This would require the corpus to be maintained throughout the academic year, which would cost time and money.

User’s perspective

The views of the users have been taken from the feedback and task results from the second and third

iterations; also from interviews that took place with the expert testers after the testing process (see

Appendices I and K).

The users viewed the problem from a number of different angles: i) the students ii) the careers advisor

and iii) the chatbot expert. These all gave a valuable insight into the effectiveness and feasibility of

the careers chatbot.

The students' perspective

The students' views from the third iteration showed that they did not believe that the chatbot was able to simulate a placement advisor. This conclusion was based upon the following factors:

The ability of the chatbot to answer careers questions was very limited. From the first iteration users struggled to engage in a conversation with the careers chatbot. Even with the improvements made over the iterations, the success rate did not reach an acceptable level. To improve on this, open conversations were encouraged with every new iteration; this was an attempt to observe the users' questions, learn from them and at the same time increase the scope of the chatbot. This technique did not work, as users asked such a broad range of questions. Users commented on the number of unsuccessful questions even once they had tried to re-phrase the question. Although open conversation did not work, the set questions (task 2) improved dramatically in the third iteration, with the average user getting answers to over 12 of the 15 questions. This showed that if it were possible to predict users' questioning habits it might be possible to improve the open conversations. Users also quickly realised that the chatbot was not a careers advisor and reverted to inputting keywords similar to those they might usually use in a search engine; if this technique worked, users tended to stick to inputting just the keyword.

The answers returned from successful queries. Many of the users commented on the lack of useful

information within the answers returned; they found that when it was possible to retrieve answers the

information was often too vague and offered them no real advice. Often users would ask the chatbot

several questions about one specific area, for example step placements. The answers returned would

be the same, because the chatbot could only recognise the keywords 'step placements' each time a new question was asked. This therefore offered the user no new information when they were after depth in the answer, and would often frustrate them.
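The sketch below illustrates why this happened. The patterns and template are hypothetical, written in the style of the project's AIML rather than taken from it: because the wildcard categories only key on the phrase 'step placements', every differently worded question about the scheme reduces to the same stored answer.

<aiml version="1.0">
  <!-- "When do STEP placements start?" matches the first pattern and
       "How do I apply for STEP placements?" matches the second, yet both
       reduce to one category, so the user receives identical text. -->
  <category>
    <pattern>* STEP PLACEMENTS *</pattern>
    <template><srai>STEP PLACEMENTS</srai></template>
  </category>
  <category>
    <pattern>* STEP PLACEMENTS</pattern>
    <template><srai>STEP PLACEMENTS</srai></template>
  </category>
  <category>
    <pattern>STEP PLACEMENTS</pattern>
    <template>STEP is a summer placement in a small or medium sized company
      where you complete an eight-week project. The work experience adviser
      at the Careers Centre can discuss the scheme with you.</template>
  </category>
</aiml>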

The placement advisor’s perspective

Darren Scott, the Careers Centre's placement advisor, thought that the careers chatbot in its current state was not feasible and that it did not simulate a careers advisor. He did suggest that the chatbot might have potential as a filtering system for basic careers questions within the e-guidance system. The reasoning behind this was the number of unsuccessful questions that the chatbot returned and the insufficient detail of the answers that were returned. He suggested that there were too many variables in

each question to be able to give an answer with enough substance to satisfy the user.

Expert’s perspective

As Bayan Abu Shawar had no previous knowledge of the careers domain and had not undertaken a placement, she could only comment on the feasibility of the chatbot in terms of its AIML. Bayan thought that there was a fundamental problem with the careers domain, as the chatbot currently struggles to give responses that are specific to the user and in enough detail. The problem with a chatbot trying to simulate a careers advisor is that the advisor's answers are usually formulated at the time of the query, whereas a chatbot refers to a set of standard responses. This means that users are left with answers that are highly generalised. A positive aspect of the system Bayan highlighted was the way the careers chatbot pointed the user to URLs. This took the onus away from the botmaster to update the corpus manually; a third party could be responsible for keeping the linked content current, assuming that the URL remained the same. Bayan also commented on the use of the inherited Dr Wallace AIML files to give the chatbot a social aspect. She said that this is often the approach taken by chatbot designers; however, many developers soon realise that for a serious application this is not possible, as the inherited AIML can conflict with the developer's own AIML and output incorrect answers.
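As a rough illustration of the URL-pointing approach and of the conflict risk, here is a hypothetical category in the style of the project's AIML. The pattern and template are assumptions rather than the project's actual code, although the Prospects site itself is the one quoted in the careers advisors' email answers (Appendix E).

<aiml version="1.0">
  <!-- Pointing at an externally maintained page: the advice stays current
       as long as the URL does, so the botmaster does not have to re-edit
       the corpus whenever course or funding details change. -->
  <category>
    <pattern>* POSTGRADUATE STUDY *</pattern>
    <template>Have a look at the Further Study section of the Prospects
      website at www.prospects.ac.uk - it covers searching for courses,
      application timetables and funding.</template>
  </category>
  <!-- Conflict risk: an inherited general-purpose A.L.I.C.E. category with a
       broad pattern such as WHAT IS * can capture a careers question that the
       domain categories do not cover exactly, returning a generic, off-topic
       answer instead of careers advice. -->
</aiml>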

8.2 Comparison To Other Solutions

In order to decide whether or not a chatbot was a feasible project, the final iteration should be compared with similar products. This comparison highlights the advantages and disadvantages relative to other systems. Four different approaches have been studied: i) the current e-guidance system; ii) the careers website's search engine; iii) natural language programs (the Persona Project); and iv) FAQs.

The first two approaches are ways that current students and graduates can find information about careers. In order to compare the two systems fairly, the speed of response, quality of answer and ease of use have been assessed. These criteria have been taken from the key points mentioned in the conclusion section above.

The current system

The review of the current e-guidance system in chapter 2 highlighted three major problems: i) the time taken to respond to each query; ii) the potential for demand for the system to increase further, creating even slower response times; and iii) the time and effort spent dealing with many repetitive simple questions posed by students. The third iteration made significant advances towards solving these problems; for example, when a user asked a question they received an immediate answer. Such an immediate answering system has the potential to reduce the workload of careers advisors, although user feedback showed that the chatbot could not handle detailed questioning. One advantage of the current system is that it can offer the user a more detailed answer than the careers chatbot; the quality of the answer can vary, but the response still personally addresses each user.

The careers website’s search engine

The careers website currently has a search facility that allows users to search the internal pages of the careers website or the whole of the web. The process involves the user typing in the search request and clicking 'search', after which the results are returned. This is based upon a search engine powered by Google that can search all pages within one website; in the case of the Careers Centre this would be all pages beginning with http://www.careerweb.leeds.ac.uk. The search function is very basic, as shown below in Fig 8.0; if the user is struggling to find relevant results there is a 'hints' link which gives tips on how to obtain the best results from searches.

Fig 8.0 - Screenshot from the Careers Centre website showing the search facility

Unlike the e-guidance system this requires no input from a careers advisor and can give a rapid response to users' queries. It also has the potential to give a range of answers. Fig 8.1 below shows an example of a user searching for 'CV'. The problem with the search function is that users still have to trawl through the results that the search engine returns. This can be considered in either a positive or negative light: it offers the user a range of answers compared to the careers chatbot, which offers just one; however, it could take much longer to find the correct answer to a query. This searching technique also does not guarantee the quality of the answers returned; the careers service has no control over external content and only limited control over how the search engine indexes its own website. An advantage of the search function is that the majority of internet users have used search engines before and feel comfortable with their format.

Fig 8.1 - Screenshot from the Careers Centre website showing information about the returned results.

Overall the chatbot compares quite similarly to the search system; both offer a quick response with an answer that is not personalised to the user. The main difference is in the quantity of answers returned;

this could either give the user the advantage of more options or the disadvantage of trawling through

the website to find the best hit.

The following two comparisons are of two methods that the current Careers Centre does not use. These methods are natural language programs, such as the Persona Project, and FAQs. Each method is given an explanation of what it is, how it works and whether or not it could be applied to the careers domain.

Frequently Asked Questions (FAQs)

FAQs originated in an attempt to reduce the number of re-posts of the same questions on newsgroups. An FAQ can be defined as a text consisting of questions and answers which are considered frequently asked by users who have not done any background research (Wikipedia, 2004). Ackerman (1998) also suggests that FAQs are constructed so that domain experts do not have to repeat themselves by answering the same questions repeatedly. These two definitions suggest that FAQs might represent an answer to the Careers Centre's problems. However, frequently asked questions do have a number of problems; Spatula (1994) argues that FAQs are difficult to maintain and can often become outdated. In the case of the careers domain it would be important to keep information up to date, as suggested URLs can often become broken links. Another problem is the construction of the original FAQ answers and the question of who would have final authority/ownership over them. From the user's perspective FAQs can often be useless unless the user knows exactly what the problem is and what the specific name for it is; this can require some lateral thinking by the user, and not everyone is successful. The careers chatbot would have an advantage over FAQs in this case, as synonyms could be used to acquire the true meaning of the user's query; the advantage would only be noticeable if the chatbot had a large enough list of synonyms.
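A minimal sketch of the synonym handling referred to here, written as hypothetical AIML in the style used for the careers chatbot (the patterns and the answer text are illustrative assumptions): several wordings of the same request are reduced onto one canonical category with <srai>, which is the advantage a chatbot has over a static FAQ page.

<aiml version="1.0">
  <!-- Canonical category holding the actual advice. -->
  <category>
    <pattern>CV HELP</pattern>
    <template>You can email your CV to the Careers Advisory Team to be
      checked, or book an appointment at the Careers Centre.</template>
  </category>

  <!-- Synonym reductions: different surface forms are rewritten onto the
       canonical pattern before matching, so "Can you check my curriculum
       vitae" and "I need help with my resume" reach the same answer.
       Further variants with trailing wildcards would be needed for inputs
       where the key phrase is not at the end, which is why a large list of
       synonyms is required before the advantage becomes noticeable. -->
  <category>
    <pattern>* CURRICULUM VITAE</pattern>
    <template><srai>CV HELP</srai></template>
  </category>
  <category>
    <pattern>* RESUME</pattern>
    <template><srai>CV HELP</srai></template>
  </category>
</aiml>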

Natural Language Programs

To compare natural language programs to the careers chatbot, the example of Microsoft's Persona Project has been examined. The Persona Project is a conversational agent which is considered to be intelligent (intelligence is discussed within chapter 2). It first took the form of an animated parrot named Peedy that could interact with users via natural spoken language (Ball et al., 1996).

The process by which Peedy interacts is described by Ball et al. (1996). Peedy works by taking a user input; this can be in the form of speech or text. The input is then compared against a voice model by matching acoustic signals to pre-recorded voices, and goes through a name substitution process which involves replacing problem phrases with ones that Peedy recognises. Once the input has been recognised it goes through a normalisation process, similar to the careers chatbot, whereby any unnecessary dialogue is removed. The input is then passed to a natural language processor which breaks down the input and analyses each word, similar to the part-of-speech tagging in chapter 2. This is then plotted on a graph which represents the depth of the words in relation to the subject area. A series of graph transformations then takes place, using knowledge of the domain and the interaction in question. These graphs then form the basis needed to perform matching algorithms against a collection of 'action templates' which enable Peedy to know how to respond. A response is then formed from this understanding; in the case of Peedy this can be an animation, an action (such as playing a song from a CD) or a text response.

Although the Persona Project is much more complex than the careers chatbot it shares the common

goal that it “attempts to simulate an assistant helping the user in a particular domain” (Ball et al, 1996,

page 11). This intelligent approach may return far superior answers but it still shares some of the

same problems as its lesser non-intelligent chatbot; it still relies on an initial good dialogue for the

basis of pattern matching and still requires much time and effort to create a sufficient conversational

agent.


8.3 Summary

The aim of simulating a careers advisor using chatbot technologies has proved a difficult task. Background research was completed, a methodology was chosen and requirements were gathered, which led to the design and implementation of three iterations of a careers chatbot. The project was then evaluated and conclusions were drawn.

The analysis above has shown that it has not been fully possible to meet this aim. This was due to the

time and effort necessary in order to have a sufficient corpus, and the human factors involved in

constructing suitable answers to queries. User opinions also showed that throughout the three

iterations none of the users believed that they were chatting with a real careers advisor. Comparisons

to the other solutions showed that a natural language system would be the most suited to this

investigation but would still have problems due to the time and resources needed.

Although it has not been possible to fully simulate conversation with a careers advisor it was

suggested by Darren Scott (placement advisor) that the chatbot may still have a function in filtering

some of the simpler questions that often take the careers advisor so much time to answer.


References:

1. 1882direkt (2004), Mia, URL: http://www-2.1822direkt.com/botroot/botfenster.htm [20th

November 2003]

2. A.I. Foundation, Directory of Bots on the Web, URL:

http://www.alicebot.org/directory.html [24th April 2004]

3. A.L.I.C.E. AI Foundation, Inc, (2001-2004), URL: http://www.alicebot.org [22nd April 2004]

4. Ackerman, M.S., (1998), Augmenting Organizational Memory: A Field Study of Answer

Garden, ACM, pp. 203-224

5. Allen, J.F, (1994), Natural Language Understanding, Second Edition, p541, Benjamin

Cummings

6. Avison, David & Shah, Hanifa, (1997), The Information Systems Development Life Cycle: A

First Course in Information Systems p202, McGraw-Hill

7. Ball, Gene et al, (1996), Lifelike Computer Characters: the Persona project at Microsoft

Research to appear in Software Agents, Jeff Bradshaw (ed.), MIT Press, 1996

8. Bray, T, Paoli, J, Sperberg McQueen, C M and Maler E (2000), Extensible Mark-up

Language (XML) 1.0, 2nd edition, W3C Recommendation

9. Brewer, J, W3C: Web Accessibility Initiative (WAI), URL: http://www.w3.org/WAI/

10. Case based reasoning, URL: http://en.wikipedia.org/wiki/Case_based_reasoning [9th April

2004]

11. Clark, M, (2003), Leeds School of Computing, IS33 Handout

12. Connolly, D, (1995), Overview of SGML Resources URL:

http://www.w3.org/MarkUp/SGML/ [20th April 2004]


13. Ding, Y., Fensel, D., Klein, M. and Omelayenko, B., (2002), The Semantic Web: Yet Another

Hip?, Data and Knowledge Engineering, p207

14. Ginsburg, Mark (2002), The Catacomb Project: Building a User-Centered Portal the

Conversation way. University of Arizona p85

15. Graduate Yorkshire Homepage URL: http://www.graduatesyorkshire.info [24th April 2004]

16. Howe, D. (1993-2003). The Free On-line Dictionary of Computing, URL:

http://dictionary.reference.com [26th April 2004]

17. IBM, Rational Unified Process URL: http://www-

306.ibm.com/software/awdtools/rup/features/index.html [15th February 2004]

18. Jurafsky, D & Martin, James H. (2000), Speech and language processing: an introduction

to natural language processing, computational linguistics, and speech recognition, 1st

edition, Prentice-Hall

19. Kendall, K & Kendall, J, (1999), Systems Analysis and Design, Prentice Hall

20. Kruchten, Phillip, (2000), The Rational Unified Process: An Introduction, Addison Wesley

21. Loebner, H, In Response, URL: http://www.loebner.net/Prizef/In-response.html [24th

April 2004]

22. Machrone, Bill and Raskin, Robin (1994), “Gazing into Cyberspace." PC Magazine

23. Nielsen, J, (1999), Usability as Barrier to Entry, URL:

http://www.useit.com/alertbox/991128.html [24th April 2004]

24. Nielsen, J, Coyne, K.P., and Tahir, M. (2001). Make your site usable, PC Magazine, URL:

http://www.useit.com/papers/heuristic/ [24th April 2004]

25. Nielsen, J. (2000), Designing web usability: The practice of simplicity. Indianapolis: New

Riders Publishing

26. Oddcast Inc, Hompage, URL: http://www.oddcast.com/home/ [24th April 2004]


27. Pandorabot Homepage URL: http://www.pandorabots.com [24th April 2004]

28. Pandorabots.com (2003) URL: http://www.pandorabots.com/pandora/pics/about.html

[24th April 2004]

29. Preece, J, Rogers, Y & Sharp, H (2002), Interaction Design: Beyond Human-Computer

Interaction, John Wiley & Sons, America.

30. Robertson, S and Robertson, J. (1999) Mastering the Requirements Process. Boston:

Addison-Wesley.

31. Royce, W. W, (1970), Managing the Development of Large Software Systems, Proceedings

of IEEE WESCON

32. Searl, J (1980) “Minds, Brains and Programs”, The Behavioural and Brain Sciences, vol 3

No.3 pages 417-424

33. Shawar, Bayan Abu & Atwell, Eric (2002) A Comparison between Alice and Elizabeth

Chatbot Systems, School of Computing Research Report Series Report 2002.19

34. Shawar, Bayan Abu & Atwell, Eric (2002) Using Dialogue Corpora to Train a Chatbot,

School of Computing

35. Shieber, S (1994) “Lessons from a Restricted Turing Test”, Communications of the

Association for Computing Machinery, Vol 37, No.6, pp70-78

36. Shneiderman, B. (1998). Designing the user interface: Strategies for effective human-

computer interaction, 3rd edition, Addison-Wesley

37. Sommerville, I., (2001), Software Requirements, in: Software Engineering, 6th Edition,

Addison-Wesley, pp.98-99

38. [email protected], (17 Sep 1994), "Soon to be in the FAQ: AOL + Telnet/FTP

= STUPID."


39. Spielberg, S Artificial Intelligence (2000) URL: http://aimovie.warnerbros.com/ [26th April

2004]

40. Subramani, M.R. and Hahn, J. "Examining the Effectiveness of Electronic Group

Communication Technologies: The Role of the Conversation Interface," Working Paper MIS

Research Center, Carlson School of Management, University of Minnesota, May 2000

41. The Chatterbot Collection, (2004), URL: http://www.angelfire.com/trek/amanda/bot.htm

[24th April 2004]

42. The Standish Group Report (1995) URL: http://www.scs.carleton.ca/~beau/PM/Standish-

Report.html [24th April 2004]

43. The University of Leeds Careers Centre, URL: http://careerweb.leeds.ac.uk [24th April

2004]

44. Thomas, J, (1994), Reactions by the Computer Industry to the World Wide Web URL:

http://www.swiss.ai.mit.edu/6.805/student-papers/fall94-papers/thomas-comp-

industry.html [24th April 2004]

45. Turing, A. M, Computing machinery and intelligence from MIND: A Quarterly Review of

Psychology and Philosophy, (Vol. LIX, N.S. No.236, Oct. 1950)

46. UCREL, Corpus Annotation, URL: http://www.comp.lancs.ac.uk/ucrel/annotation.html

[10th April 2004]

47. Veen, J. (2001) The Art and Science of Web Design. Indianapolis: New Riders Publishing

48. Wallace, Richard S (2001), A.L.I.C.E. Brain Picture Gallery, URL:

http://www.alicebot.org/documentation/gallery/ [24th April 2004]

49. Wallace, Richard S, A.L.I.C.E. Artificial Intelligence Foundation, Inc. The Anatomy of

A.L.I.C.E, URL: http://www.alicebot.org/anatomy.html [24th April 2004]

50. Wallace, Richard S, A.L.I.C.E. Artificial Intelligence Foundation, Inc., AIML URL:

http://www.alicebot.org/alice/aiml.html [27th February 2004]


51. Wallace, Richard S, A.L.I.C.E. Artificial Intelligence Foundation, Inc., AIML overview URL:

http://www.pandorabots.com/pandora/pics/wallaceaimltutorial.html [24th April 2004]

52. Wikipedia, (2004), FAQ, URL: http://en.wikipedia.org/wiki/FAQ [20th April 2004]

53. Wikipedia, (2004), Hawthorne Effect, URL: http://en.wikipedia.org/wiki/Hawthorne_effect

[20th April 2004]

54. Willcocks, L & Mason, D, (1987), Computerising Work: people, systems design and

workplace relations, Paradigm


Appendix A – Personal Reflection

This Appendix reflects on my overall experience of undertaking a final year project. It discusses the whole project, examining what has been done well and what could have been done better. It also focuses on what lessons have been learnt and offers advice for other students who may conduct a similar project. Completing this project has given me enormous personal satisfaction. This has been the single largest piece of work that I have completed to date. Even though the project was not able to produce a system that might be taken on by the Careers Centre, I still feel that I have achieved an end product and tackled a real-life problem.

This project required me to meet with the Careers Centre to undertake data collection and interviews. My year in industry helped me to do this, as I already had working relations with many of the careers advisors. The project has also helped me apply the knowledge I gained during my placement year. During the project I have had to learn new skills as well as expand on previously learned ones. The biggest of these new skills was to understand how chatbots work and to learn the mark-up language AIML; this required a lot of time and effort and at times became tedious. I have also understood that time management skills are vital if a good project is to be expected. Below are a number of lessons I have learnt by undertaking this project and recommendations for future students; some of these are applicable to similar projects, others should be applicable to all final year students.

Have prior knowledge of the domain

During the project I had the opportunity to draw upon experience gained in my placement year; it was also easy to meet my old colleagues from the Careers Centre. Without such opportunities I do not

think this project would have been possible as it was crucial to the success of a chatbot to know the

domain in detail. Also by knowing staff within the Careers Centre I was able to acquire a user

dialogue which is a fundamental aspect of a good chatbot. This dialogue would not have been

available to other students as it contained sensitive information on students at the University of Leeds.

Break the project into smaller goals and set mini-deadlines for yourself

During the first semester I struggled to keep on top of my project, often avoiding work as I had no specific goals to aim for. By breaking the project into smaller, manageable chunks the possibility of getting behind is reduced, as there is always a small piece of work that could be attempted, therefore not letting the project come to a dead end. In the second semester this method

proved to be most helpful as I asked my supervisor to help set me deadlines. This meant that all the

written work was not left until Easter.


Choose a project you are interested in

If you are going to study something for the whole of an academic year it is advisable to study

something you are interested in. At points throughout the year this will still become tedious but it will

help if there is an interest from the beginning. For me this wasn’t the case as a project was suggested

for me, this often meant that I struggled to maintain an interest in the subject and often found my lack

of interest was my downfall. By choosing a project you are interested in this will make the

background reading more interesting and may encourage you to work on one of the most difficult

chapters.

Choose a project that you already have some knowledge of

I had the advantage of drawing on experience from my placement year to help me with my project, as well as on modules such as human-computer interaction and people-centred information systems. Without these

skills I do not think that the project would have been possible. A project of this size should not be

undertaken in a new subject area as there would be too much to learn.

Start the project as early as possible

In the first semester many students tend to underestimate the size and effort required to produce a final year project; time must be set aside for changes in the schedule, and flexibility is often required to cope with difficulties. This often ends up with many students rushing the project in their second semester. It is advisable to get a large amount of research, and perhaps drafts of chapters, finished in the first semester, allowing time to implement the chosen design. In my case the implementation was left until a few weeks before Easter and ran into the holiday period. This meant that the implementation

stage was encroaching on the writing time. It is especially important if an iterative method has been

chosen as the requirements may change over time due to user feedback; this may mean extra time is

needed to develop future iterations.

Attend the extra project lectures

Many students overlook how important these lectures are; they cover many important topics such as how to reference and how to undertake a literature review. At the time students often do not understand the importance of such matters, but these are crucial in picking up some of the extra marks which could be the difference between an average project and a good project.

Read other students' personal reflections

This will give you an insight into the problems they have encountered and their tips on how to potentially avoid these. This is something I wish I had done before starting my project.


Appendix B – Time Schedule

The table below shows the original time schedule for the project.

Table B.1 – Original Time plan

Task Approx. Completion Date

Decide project title 24/10/03

Aims and objectives 24/10/03

Initial meeting with careers advisor including supervisor 05/11/03

Complete appropriate form for external company (not necessary

after speaking to Ann Roberts)

14/11/03

Research chatbot technologies (AIML) 26/11/03

Interview careers advisors 01/12/03

Evaluate chatbot technologies

FAQ / AskJeeves v chatbot

07/12/03

Evaluate Role of a Careers Advisor 12/12/03

Mid project report 12/12/03

Revision and exam period 03/12/03

List requirements of chatbot 09/02/04

Design and compile a list of careers interactions 18/02/04

Design chatbot 05/03/04

Submit table of contents and draft chapter 05/03/04

Build chatbot Mid March 2004

Evaluate chatbot with user testing Late March 2004

Make necessary refinements to chatbot Mid April 2004

Evaluate chatbot / project 28/04/2004

Submit Project Reports 28/04/04

Submit report postscript using submit 30/04/04

The key stages from this time schedule were originally described as follows:

Research chatbot technology

Define what chatbot technology is, giving examples and applications of where such chatbots may be used. Discuss the architecture of a chatbot and explain what AIML is. Assess whether a chatbot would be successful by weighing up the pros and limitations (cons). What are the implications for this project and what will be needed for me to develop a careers advisor chatbot.


Interview careers advisor

Initial interview with Caroline Ramage to discuss the feasibility of the project, discuss systems that are

already in place and how they work. Also discuss the role of a careers advisor and how you become

one. Collect data from the previous system and analyse it, looking at which areas are frequently asked

about.

Evaluate chatbot technologies, FAQ/Askjeeves v chatbot

Look at the possible advantages and disadvantages of each and discuss the feasibility of a careers

chatbot.

List requirements of chatbot

Explain what requirements are, and discuss the different types of requirements there are. Justify the

chatbot's requirements by looking back at the data gathered and the methods which were used to gather

this data. Also narrow down the areas of interaction that the chatbot should be able to cope with.

Discuss methodologies that I am going to use and explain the model I have chosen.

Design and compile a list of careers interactions

From data gathered decide which areas are considered to be the most important. These should be

listed in the requirements. Decide the specific questions within the areas of interest. List the specific questions and all the possible answers. Review chatbot services and justify why/if the careers chatbot will use the Pandorabots service. Discuss the features of Pandorabots that are available to me.

Build chatbot

Turn the questions and answers into AIML and upload them, giving the careers chatbot its knowledge.

Evaluate chatbot with user testing

Use proven testing techniques for the user testing, looking at who will test the chatbot, how they will test it and what conditions/variables will be used, and finding out the problems that exist within the chatbot.

Make necessary refinements to chatbot

Identify and address the problems from the previous stage of testing. This can be an iterative process, refining the chatbot as much as possible.


Evaluate chatbot

Give the final chatbot another stage of testing. This will be against strict criteria.

Project reflection

Reflect on what lessons I have learnt from the project and how this could help me in the future, including examples and advice for future students.

Changes to the schedule

The project got off to a relatively slow start, with the tasks before the Christmas period being completed but not to a satisfactory standard. This was reflected in the Mid-Term Report. Changes were therefore made to the schedule to reflect a realistic plan for the remaining time. More background reading was required, and the comparison against FAQs and AskJeeves was removed due to time restrictions; this was later re-introduced in chapter 8 (conclusion) after other tasks were completed early. The changed schedule also reflects problems that occurred within the project, such as Pandorabots.com being unreliable and the varying time it took to receive testers' feedback. Further changes were due to the time taken for dialogue analysis and the time taken to create the AIML. Table B.2 below shows the revised plan from January to 28th April.

Table B.2 – Revision of time plan after Christmas 2003

Task Approx. Completion Date

Extended background research 07/02/04

Data gathering 07/02/04

List chatbot requirements 14/02/04

Design 28/02/04

Analyse dialogue and create questions / answers 28/02/04

Design interface 28/02/04

Submit table of contents and draft chapter 05/03/04

Iteration 1 (code AIML, user testing, feedback) 12/04/04

Iteration 2 (code AIML, user testing, feedback) 15/04/04

Iteration 3 (code AIML, user/expert testing, feedback) 19/04/04

Evaluation 21/04/04

Conclusion 24/04/04

Write report – continual process 26/04/04

Submit Project Reports 28/04/04

Submit report postscript using submit 30/04/04


Appendix C – Interview with Careers Advisor

Interview with Caroline Ramage (IT manager, University of Leeds Careers Centre) – the e-guidance

system

Do you have an online system to deal with students' requests?

Yes we currently have an e-guidance system on the careers website. This is for Leeds students and

Leeds graduates who finds it difficult to physically visit the careers centre, whether they be on a

placement year of living at home.

How does the e-guidance system work?

It’s pretty simple really; any Leeds student/graduate with an internet connection can send us a query

by filling in a web form at (http://careerweb.leeds.ac.uk/students/emailhelp/index.html). We ask a few

questions about the student/graduate and then they get to send us their query.

What is the demand for the system?

It can vary from week to week; it can be from under 10 queries a week up to 100. This varies

depending on the time of year; often we get a surge of final year students asking about jobs in the last

couple of months. Other than that there doesn’t seem to be a pattern to the number of queries

received.

Is this guidance system open to every student?

As mentioned before we have recently moved to answering just queries from current and graduate

Leeds students. This was due to an increasing number of other universities' students taking advantage

of the service, meaning less time could be devoted to our own students here at Leeds.

Have you any examples of questions asked by students/graduates?

Yes we have put some in our Careers Centre Guide (2003):

How do I go about finding an internship or work experience?

How do I apply to do a Master’s or a PhD?

How do I get into the Civil Service via the Fast Stream?

I want to do voluntary work. Can you tell me what’s available?

Which companies have openings for Geography graduates?

I want to set up my own business. Can you help me?

What do people from my degree go on to do?

How can I get a job abroad?


Where can I find the latest graduate vacancies?

What are the most popular queries?

Generally simple queries asking about CVs, job applications and our upcoming events. Some queries are as simple as asking the opening times, which are also on our website.

Do you feel this system is effective?

A difficult question really; we used to have a frequently asked questions page but we found that students didn't tend to use it. We then changed to the e-guidance system and it seems to be going ok. We try to answer questions as quickly as possible, with one member of staff, rotating every week, whose job it is to answer the questions. It normally takes about 3 days to get back to the student. This is the only drawback really: a student can be waiting for over 3 days for a response to a simple question.

How do you answer these queries?

This differs from advisor to advisor; there isn’t a set way of doing things. Personally if I have time I

often use Google to search for the answers to the queries, send them my suggestion or sometimes the

web link. Other times if there are a lot of queries I would advise students to search on Google

themselves. Other advisors like to get the students to come in and book an appointment if possible as

they prefer to give advice face to face.

Do you need any qualifications in order to be able to give careers advice?

There are two paths which can be taken: gain a Professional Diploma in Career Guidance (PDCG) or Qualification in Careers Guidance (QCG) before starting in higher education as a careers advisor. The other route, which is most popular, is to gain a careers qualification through AGCAS (The Association of Graduate Careers Advisory Services), who provide professional training for any individuals or organisations who currently provide, or aspire to provide, Information, Advice and/or Guidance.


Appendix D – Dialogue Analysis

This is a breakdown of the number of queries within each category of careers query; it shows the most popular at the top. The top seven categories have predominantly more queries.

Classification  Number of queries
CV and application form checks 113
General Careers Advice 56
Postgraduate study in the UK 38
Law 33
Industrial year and summer placements 31
Working / studying abroad 25
Teaching 23
Graduate jobs in specific regions 19
Graduate help 16
HR issues on applying/joining a company 14
Careers centres resources at Leeds 14
Public sector 10
Engineering 8
Accountancy 7
Psychology 6
Leaving / changing course 6
Interview advice 6
Aptitude 5
Medicine 5
HR 4
Translation 4
Management consultant 4
Publishing 4
Investment banking 3
Nursing 3
Environment 3
Work permits 3
Marketing 3
Counsellor 2
Actuary 2
Journalism 2
Interior design 2
Development work 2
Speech therapy 2
Fashion 2
PR 2
Pilot 2
Linguists 2
GIS 2
Advertising 2
Museum work 2
Retail 1
Estate agents 1
Police 1
Medical writer 1
Lecturing 1
Textiles 1
Pathology 1
IT 1
Art 1
Working with children 1
Football coaching 1
Sport science 1
Architect 1
Radiography 1
Dietetics 1
Physiotherapy 1
Agricultural 1
Recruitment agencies 1
Job website enquiries 1
Professional bodies 1


Appendix E – Dialogue Analysis in Detail

The tables below show analysis of the top seven categories in which careers advisors receive email

queries. The analysis has looked at each category to see if it meets Shawar and Atwell's (2002) criteria for a good corpus, as mentioned within chapter 5.

CV and application form checks

2 speakers Yes

Short No

Structured

format

Not always, might not give a direct answer to the question asked, maybe sending back

a CV with changes and a thank you email

Example

Question

Dear Careers Adviser,

I graduated from Leeds in 2003 in Philosophy. I have decided that I would

like to do the Law Conversion next year. I have the application and have been

working on it. I would really appreciate it if you could please have a look at my

'personal statement' that goes at the end of the application. I have attached this to my

e-mail.

Thank you for your help in advance.

Example

Answer

Dear Student,

I've spent some time looking at your personal statement. I have re-written your first

paragraph so give you some idea of the style and content you might want to use, but

due to time considerations I've just included comments about the rest of the statement.

All my comments and alterations are in blue and you can change the colour of the text

by just changing the font colour in word.

I'm not sure if the statement is limited to a specific number of words. If it is you might

want to use the comments I have given but not write in such a lot of depth.

Feel free to email this back to us if you want more help after making changes to it.

Careers Advisory Team

Problems /

comments

Although this category has the highest demand, the majority of the questions asked are similar, not leaving a lot of scope. The main query is a request to check an individual's CV. This would not be possible for a chatbot as it is the wrong type of structure: each question can be very long and cannot be compared to a set of already asked questions. Each case would need time to analyse the CV and report back with a correct response. Such responses require individual attention and can often involve a lot of time spent on each query.


General Careers Advice

2 speakers Yes

Short Varies from query to query; as this is a general questions section it could be as simple as

opening time to a long question posed by a student. See below for an example of a

short question followed by a long answer.

Structured

format

Generally yes,

Example

question

Hello, I am a first year currently studying Psychology and Management studies. I am

interested in becoming a child Psychologist but was wondering what my next steps are

once i have graduated. I have no idea where to start and where i could get work

experience etc.

Example

answer

Thank you for your enquiry. You need to qualify as a Chartered Psychologist through

the British Psychological Society. The BPS has nine specialist divisions, of which the

Division of Educational and Child Psychology is the main, but not the only division

likely to be of interest. To become registered as a Chartered Educational Psychologist

in England, Wales and Northern Ireland (Scotland is different)you must

- complete an accredited course (your Psychology degree) as the 'graduate basis for

accreditation'(GBR); undertake a recognised teaching qualification; gain teaching

experience with children and young adults; complete an accredited post-graduate

course in educational psychology; and undergo a period of supervised experience as

an educational psychologist.

However, there are plans for a restructured three year training from 2005, when the

new route will be GBR plus three years 'higher education and employer involvement.'

Transition arrangements will be in place after 2005. More information can be

obtained from the BPS web site - in particular the section for the Division of

Educational and Child Psychology.

Problems /

comments

The length of the queries varies from a couple of lines to a long paragraph. Some of the answers could be shortened due to the unnecessary conversation often given in questions and responses. The scope within general careers is varied, providing queries about a range of different areas; this could be an advantage, as it allows for a bigger corpus, or a disadvantage, as it could leave the corpus too varied. One option would be to take a set of areas within the 'general careers advice' area.

Postgraduate study in the UK

2 speakers Yes

Short Varies from query to query, but generally not

Structured

format

Yes

Example

question

I am looking to apply for M.A courses and just wondered about the procedure - if I

apply to several and get accepted than more than one. I just wanted to check that you


can turn a place down at any time.

Example

answer

Hi

I would suggest you look at the section on Further Study on the Prospects website -

www.prospects.ac.uk

Look on LHS and you will see Further Study. Click on this and it will take you into

many different options such as Search for courses, Timetable for applications in the

UK and Postgraduate study, Applications and interviews etc.

In addition, the Careers Centre Guide to Career Planning 2004 is out and there is a

really useful section in there that you can read about PG Study. In addition, there are

loads more of resources in the Careers room that you are free to access should you be

able to visit us.

For general MA and Msc courses, you apply direct to each university for each course

you want to do, (unlike UCAS where you apply once to them for all your university

choices) There is no limit to the number you apply for and you can accept and decline

at any time.

Hope this helps?

The Careers Advisory Team

Problems /

comments

The lengths of the queries vary. Most of the queries are about gaining a place on a

master’s course, the subject varying from query to query. Funding is a common issue

brought up in queries: whether students are eligible and how they could apply for

funding. Although the questions follow a similar theme of postgraduate study there is

still a lot of variance within the questions as many are about specific departments.

Law

2 speakers Yes

Short Varies from query to query, but generally not

Structured

format

Yes but the answers tend to still be quite long.

Example

question

I am interested in doing the law conversion course for non-law graduates after i

graduate in 2004. However I am confused as to when to apply and what i can do to

increase my chances of being offered a place. I have looked at various work

experience placements for law firms, however they all stipulate that applicants must

be in their final year of a non-law degree, and i am only in my penultimate year.

Example

answer

Thanks for your message. Applications for Law conversion courses open in mid

November - see http://www.lawcabs.ac.uk/ for full details of how the system operates.


Generally admission to a course isn't a major problem. Gaining a training contract is

more difficult, but since you wouldn't be able to apply for training contracts until you

Law Conversion course has started, there's no particular problem in not being able to

gain a vacation placement until the summer of your graduation.

There will be an extensive programme of Law events, including a Solicitors

Recruitment Fair, in the Autumn and this will give the chance to meet solicitors as

well as gaining inside information on Training Course. Details will be available in

late Sept, so look out for these and come to as may relevant events as possible.

Problems /

comments

Varied lengths. A large number of queries in this area are about personal statements regarding Legal Practice Courses and work experience, of similar length to CV queries. These often require time to check through the statement rather than being a case of answering a question straight off; answers are often very personal and would differ from query to query.

Industrial year and summer placements

2 speakers: Yes
Short: Varies from query to query, but generally not
Structured format: Yes

Example question:
Hi there, I am a level 2 student doing Psychology and Management Studies, and am interested in a career in Psychology. I wish to do some summer work but would like to be paid. However, I would like it to be associated with Psychology in some way. I understand that most of the work that may be available will probably be voluntary and I would like to know whether there was anything that you could advise me to do.

Example answer:
You are right, most of the work experience directly related to Psychology would be unpaid, but sometimes people do manage to secure paid experience. You need to apply speculatively and try and market yourself. We have a work experience adviser, Darren Scott, who has a surgery on a Tuesday afternoon between 2 - 4pm at the Careers Centre, so if you would like to speak to someone specifically about the best ways of gaining work experience then Darren can do that. There is the STEP scheme, which Darren can discuss with you, which is a summer placement in a small or medium sized company where you do an 8 week project which you may be able to relate to some aspects of psychology.

Thanks for your enquiry,
Regards, The Careers Advisory Team,
University of Leeds.

Problems / comments:
A mix of long and short queries, generally asking how to obtain placements of different varieties; these can often be associated with subject areas, for example summer computing placements. This category also contains personal statements being sent in, but to a lesser extent. It has a medium scope, with a number of different placement areas often asked about.

Working / studying abroad

2 speakers: Yes
Short: Varies from query to query
Structured format: Yes

Example question:
I had a careers interview in July, but my circumstances have now completely changed. I'm looking to go to Spain in Jan and then South America. I would be grateful for any info on work/experiences in these places. I am also interested in the Leonardo da Vinci project, but am unsure how to find placements. Can you help?

Example answer:
In the Careers Library there are a number of directories relating to Spain and South America; similar information is held on the web. Try out the Careers Centre web page, which links to www.prospects.ac.uk, and head for the country profiles. A reasonable starting point for some names and addresses as well as general tips and hints on the region.

Problems / comments:
Varied length of questions, ranging from simple queries to long queries. The answers to students' queries can often point to similar resources, such as www.prospects.ac.uk, an official graduate careers website. This area overlaps some of the other areas, as working abroad could be similar to placements abroad. This doesn't seem to be an area that the Careers Centre is strong on; departments usually deal with studying abroad as they have links with other departments abroad.

Teaching

2 speakers: Yes
Short: Varies from query to query
Structured format: Yes

Example question:
I want to do a PGCE in modern languages next year but most of the universities I have looked into only offer French, German and Spanish. Please could you give me some advice on what to do if I want to teach Italian. I also studied French up to 1st year. Would this make a difference? What is the procedure if you want to teach at A Level standard? Thanks for your help.

Example answer:
You do not say where you have looked for information on courses, but you will find comprehensive listings of PGCE courses on both of the websites listed below:
http://www.canteach.gov.uk/
http://www.gttr.ac.uk/
If you can't find an appropriate course on either site, I suggest you phone the Teacher Training Agency help line on 0845 6000 991 to ask for their advice.

Problems / comments:
This area has queries of varied length, but with a larger percentage of shorter questions compared to the other categories. They are mostly concerned with applying for teaching in the year after graduation and with how to get into teaching generally. Personal statements are often sent in to be checked for PGCE applications; these can be very specific to each student's query and can take a lot of time to respond to.


Appendix F – Testing Tasks

The questions below are the questions which were asked during testing for iteration 1.

1. Did you find the help function easy to understand?
2. Can you find out the opening times for the careers centre?
3. Can you find out the address of the careers centre?
4. Can you find out where the careers centre is?
5. Can you find out the telephone number of the careers centre?
6. Can you try and book a careers interview/appointment?
7. Can you find out what students do after graduations?
8. Can you find some information on placements?
9. Can you find some information on 6 month placements?
10. Can you find some information on 9 month placements?
11. Can you find some information on 12 month placements / year in industry?
12. Can you find some information on summer month placements?
13. Can you find some information on Easter month placements?
14. Can you find some information on step placements?
15. Can you find some information on voluntary work?
16. Can you find some information on placements abroad?
17. Can you find out the opening times for the careers centre?
18. Were the responses to the above questions quick?
19. Was the text clear and legible?
20. Were appropriate colours used for the chatbot website?
21. Could you read the text on the background?
22. Was it easy to find your way around the website?
23. Was the content to a suitable understanding level?
24. Did you like the design of the website?
25. Did you believe you were talking to a real careers advisor?
26. Did you find the conversation polite?
27. Did you find chatting to the career chatbot helpful?
28. Did you notice any spelling / grammar mistakes?


Appendix G – Iteration 1 Testing

The following shows an example of the tests that were completed and emailed back or returned as a hard copy. One example is shown here due to the length of each one; the remainder can be found on the CD that accompanies this project.


My name is James McKenzie and I am a final year student at Leeds University; this is for my final year project. The aim of the project is to investigate the feasibility of using chatbot technologies to simulate careers advisors on the internet. Below are a number of short tasks for you to complete; this should not take you longer than 30 minutes.

Name: Chris Burton
Sex: Male
Age: 21
Have you undertaken a placement year? Yes
Have you used a chatbot before? No

I have freely volunteered to participate in this experiment.
I have been told in advance what my tasks will be and how I should conduct them.
I have had the chance to ask questions and have had my questions answered to my satisfaction.
I am aware that I have the right to withdraw at any time during the experiment.
My signature below ensures the above information is correct.

Print name ………………………………… Signature …………………………………..

Task #1: Explore the chatbot website
Your first task is to spend five minutes exploring and familiarising yourself with the website. Instructions on how to use the website are under the 'Help' section on the website.
1. Go to the URL: http://homepage.ntlworld.com/devblock/careers/index2.html
2. Now, explore.

(Discuss if you disagree. Scale: 1 - Strongly agree, 2 - Agree, 3 - Disagree, 4 - Strongly disagree)

1. Did you find the help function easy to understand?    1 2 3 4

Task #2: Information retrieval
Your second task is to ask the careers chatbot a number of questions; don't spend longer than 5 minutes trying to answer each question. You can discuss problems encountered and irrelevant answers in the space provided.

1. Can you find out the opening times for the careers centre?
   Problems: Didn't understand my question even after several re-phrases
   Answer: Yes / No

2. Can you find out the address of the careers centre?
   Problems: Took a few iterations to find address
   Answer: Yes / No

3. Can you find out where the careers centre is?
   Answer: Yes / No

4. Can you find out the telephone number of the careers centre?
   Answer: Yes / No

5. Can you try and book a careers interview/appointment?
   Answer: Yes / No

6. Can you find out what students do after graduations?
   Answer: Yes / No

7. Can you find some information on placements?
   Problems: Even after trying placements, sandwich course and industry
   Answer: Yes / No

8. Can you find some information on 6 month placements?
   Problems: "" (as above)
   Answer: Yes / No

9. Can you find some information on 9 month placements?
   Problems: "" (as above)
   Answer: Yes / No

10. Can you find some information on 12 month placements / year in industry?
   Problems: "" (as above)
   Answer: Yes / No

11. Can you find some information on summer month placements?
   Problems: "" (as above)
   Answer: Yes / No

12. Can you find some information on Easter month placements?
   Answer: Yes / No

13. Can you find some information on step placements?
   Problems: It tells me "I haven't heard of step placements"
   Answer: Yes / No

14. Can you find some information on voluntary work?
   Answer: Yes / No

15. Can you find some information on placements abroad?
   Problems: Doesn't find placements abroad
   Answer: Yes / No

Task #3: The interface
Your third task is to evaluate the chatbot's general look and feel. If you disagree, please discuss why.
(Scale: 1 - Strongly agree, 2 - Agree, 3 - Disagree, 4 - Strongly disagree)

16. Were the responses to the above questions quick?    1 2 3 4
17. Was the text clear and legible?    1 2 3 4
18. Were appropriate colours used for the chatbot website?    1 2 3 4
19. Could you read the text on the background?    1 2 3 4
20. Was it easy to find your way around the website?    1 2 3 4
21. Was the content to a suitable understanding level?    1 2 3 4
22. Did you like the design of the website?    1 2 3 4
23. Did you believe you were talking to a real careers advisor?    1 2 3 4
24. Did you find the conversation polite?    1 2 3 4
25. Did you find chatting to the career chatbot helpful?    1 2 3 4
26. Did you notice any spelling / grammar mistakes?    Yes / No
    Comment: On the home page there was a typing error (no space between two words).


Appendix H – Iteration 1 Conversation Logs

The following are logs that match the testing within Appendix G. They show the interaction between the tester and the chatbot.

[03/18/2004 02:54:21 PM - 172.188.99.86] Human: what time is careers office open? Careers Advisor: Time does not exist.

[03/18/2004 02:54:38 PM - 172.188.99.86] Human: when are career opening hours? Careers Advisor: I don't think ever.

[03/18/2004 02:54:58 PM - 172.188.99.86] Human: open hour Careers Advisor: Tell me a story.

[03/18/2004 02:55:05 PM - 172.188.99.86] Human: when? Careers Advisor: The past and future are one.

[03/18/2004 02:56:52 PM - 172.188.99.86] Human: where is careers center address? Careers Advisor: Where it belongs. Try searching the open directory.

[03/18/2004 02:57:07 PM - 172.188.99.86] Human: what is the address? Careers Advisor: I would look into the web for that knowledge.

[03/18/2004 02:58:35 PM - 172.188.99.86] Human: where is the careers centre? Careers Advisor: The Careers Centre- 5-7 Cromer Terrace, University of Leeds, Leeds, LS2 9JT

[03/18/2004 02:59:28 PM - 172.188.99.86] Human: what is the telephone number? Careers Advisor: I'm a chat bot not a phone book. Try 411.com.

[03/18/2004 02:59:46 PM - 172.188.99.86] Human: careers centre telephone number Careers Advisor: If you could have any kind of robot what would it be?

[03/18/2004 03:04:35 PM - 172.188.99.86] Human: i want to book an interview Careers Advisor: Why do you want to do it so much?

[03/18/2004 03:05:41 PM - 172.188.99.86] Human: interview booking Careers Advisor: That's an interesting rationalization.

[03/18/2004 03:05:51 PM - 172.188.99.86] Human: can i book an interview? Careers Advisor: If you need a quick word with a Careers Adviser we operate a daily drop-in service between 10am and 4pm or pre-book a 15-minute session with an Adviser between 10.00 - 4.30 by calling in or ringing 0113 343 5295

[03/18/2004 03:06:07 PM - 172.188.99.86] Human: can i ring the careers office?


Careers Advisor: I want to become smarter than humans.

[03/18/2004 03:07:05 PM - 172.188.99.86] Human: can i ring? Careers Advisor: I want to become smarter than humans.

[03/18/2004 03:07:12 PM - 172.188.99.86] Human: do you have telphone? Careers Advisor: No I don't have it. But I do have a great botmaster.

[03/18/2004 03:08:50 PM - 172.188.99.86] Human: ringing Careers Advisor: Do you have any conditions I should know about?

[03/18/2004 03:09:43 PM - 172.188.99.86] Human: what happens after graduation? Careers Advisor: Try it.

[03/18/2004 03:10:06 PM - 172.188.99.86] Human: what are the prospects after graduation? Careers Advisor: Maybe my botmaster knows the answer.

[03/18/2004 03:10:21 PM - 172.188.99.86] Human: how can i kill someone with an ipod? Careers Advisor: There is no perfect crime.

[03/18/2004 03:10:49 PM - 172.188.99.86] Human: what are placements? Careers Advisor: Is there only one?

[03/18/2004 03:10:57 PM - 172.188.99.86] Human: what are placements Careers Advisor: I would look into the web for that knowledge.

[03/18/2004 03:11:10 PM - 172.188.99.86] Human: tell me about placements Careers Advisor: Ambiguous: Ask who, what or where is placements?

[03/18/2004 03:11:26 PM - 172.188.99.86] Human: what is a placement? Careers Advisor: Interesting question.

[03/18/2004 03:11:56 PM - 172.188.99.86] Human: what are 6 month placements? Careers Advisor: There might be more than one.

[03/18/2004 03:12:06 PM - 172.188.99.86] Human: 6 month placements? Careers Advisor: What is your real name?

[03/18/2004 03:13:39 PM - 172.188.99.86] Human: what is the year in industry? Careers Advisor: Interesting question.

[03/18/2004 03:13:47 PM - 172.188.99.86] Human: what is a sandwich course? Careers Advisor: I would do a search for it.


[03/18/2004 03:13:57 PM - 172.188.99.86] Human: what is a sandwhich course? Careers Advisor: I don't know anything about it.

[03/18/2004 03:14:24 PM - 172.188.99.86] Human: can i get a summer placement? Careers Advisor: Why do you want to do it so much?

[03/18/2004 03:15:08 PM - 172.188.99.86] Human: I would like you to get a job too mate Careers Advisor: Ok,. Where can I get one?

[03/18/2004 03:16:05 PM - 172.188.99.86] Human: what is industrial placement? Careers Advisor: Does it have anything to do with get a summer placement?

[03/18/2004 03:16:16 PM - 172.188.99.86] Human: yes Careers Advisor: Aw.

[03/18/2004 03:17:21 PM - 172.188.99.86] Human: can i get an industrial placeemnt? Careers Advisor: Interesting. I want to find out more about you.

[03/18/2004 03:17:32 PM - 172.188.99.86] Human: can i get an industrial placement Careers Advisor: Interesting. I want to find out more about you.

[03/18/2004 03:17:43 PM - 172.188.99.86] Human: can i get an industrial placement? Careers Advisor: I want to become smarter than humans.

[03/18/2004 03:18:14 PM - 172.188.99.86] Human: get placement? Careers Advisor: Where can I get placement?

[03/18/2004 03:18:24 PM - 172.188.99.86] Human: Where can I get placement? Careers Advisor: Let me think about it.

[03/18/2004 03:18:49 PM - 172.188.99.86] Human: get year in industry Careers Advisor: Where can I get year in industry?

[03/18/2004 03:19:34 PM - 172.188.99.86] Human: summer jobs? Careers Advisor: Tell me a story.

[03/18/2004 03:19:43 PM - 172.188.99.86] Human: can i get a summer job? Careers Advisor: I want to become smarter than humans.

[03/18/2004 03:19:51 PM - 172.188.99.86] Human: tell me about summer jobs? Careers Advisor: Ambiguous: Ask who, what or where is summer jobs?

[03/18/2004 03:20:03 PM - 172.188.99.86] Human: you tell me about summer jobs?


Careers Advisor: Ambiguous: Ask who, what or where is summer jobs?

[03/18/2004 03:20:10 PM - 172.188.99.86] Human: what is summer job? Careers Advisor: That's not something I get asked all the time.

[03/18/2004 03:20:52 PM - 172.188.99.86] Human: can i book appointment? Careers Advisor: Why do you want to do it so much?

[03/18/2004 03:20:59 PM - 172.188.99.86] Human: i want to book appointment? Careers Advisor: I want to become smarter than humans.

[03/18/2004 03:21:05 PM - 172.188.99.86] Human: appointment? Careers Advisor: How do you usually introduce yourself?

[03/18/2004 03:21:29 PM - 172.188.99.86] Human: what about easter placements? Careers Advisor: I will think about it and get back to you.

[03/18/2004 03:21:57 PM - 172.188.99.86] Human: what is a step placement Careers Advisor: You tell me.

[03/18/2004 03:21:58 PM - 172.188.99.86] Human: what is a step placement? Careers Advisor: Are you using Netscape or Explorer?

[03/18/2004 03:22:08 PM - 172.188.99.86] Human: what are step placements? Careers Advisor: I haven't heard of step placements.

[03/18/2004 03:22:50 PM - 172.188.99.86] Human: what are voluntary work? Careers Advisor: That's not something I get asked all the time.

[03/18/2004 03:23:21 PM - 172.188.99.86] Human: can i do voluntary work? Careers Advisor: Why do you want to do it so much?

[03/18/2004 03:24:40 PM - 172.188.99.86] Human: placements abroad? Careers Advisor: That's an interesting rationalization.


Appendix I – Iteration 2 Testing

The following shows an example of the tests that were completed and emailed back or returned as a hard copy. One example is shown here due to the length of each one; the remainder can be found on the CD that accompanies this project.


My name is James McKenzie and I am a final year student at Leeds University; this is for my final year project. The aim of the project is to investigate the feasibility of using chatbot technologies to simulate careers advisors on the internet. Below are a number of short tasks for you to complete; this should not take you longer than 30 minutes.

Name (optional): Lidia Otero
Sex: Female
Age: 22
Have you undertaken a placement? No
Have you used a chatbot/chatterbot/conversational agent before, and if so which ones? No

I have freely volunteered to participate in this experiment.
I have been told in advance what my tasks will be and how I should conduct them.
I have had the chance to ask questions and have had my questions answered to my satisfaction.
I am aware that I have the right to withdraw at any time during the experiment.
I understand that logs shall be kept of all conversations with the chatbot and these may be used within my project.
My signature below ensures the above information is correct.

Print name ………………………………… Signature …………………………………..

Task #1: Explore the chatbot website
Your first task is to spend a few minutes exploring and familiarising yourself with the website. Instructions on how to use the website are under the 'Help' section on the website. Once you have familiarised yourself with the basic features, you are then free to ask the careers chatbot some questions; it can answer questions about general careers advice as well as questions about different types of placements. Just talk to the careers chatbot as if you were chatting to someone online, maybe a friend or even a careers advisor. Spend another few minutes exploring.
1. Go to the URL: http://homepage.ntlworld.com/devblock/careers/index3.html
2. Now, explore.

(If you experienced any problems please discuss. Scale: 1 - Strongly agree, 2 - Agree, 3 - Disagree, 4 - Strongly disagree)

1. Did you find the help function easy to understand?
   Rating: 2

2. Could you get the chatbot to answer your questions?
   Comments: It answered them but not with any relevant information.
   Rating: 3

3. While asking the chatbot questions did you find any useful information about general careers advice or placements?
   Comments: I tried to get information on careers or placements but it didn't answer me with any useful advice.
   Rating: 3

Task #2: Information retrieval
Your second task is to ask the careers chatbot a number of questions; don't spend longer than 2 minutes trying to answer each question. You can discuss any problems you encountered and irrelevant answers in the space provided.

4. Can you find out the opening times for the careers centre?
   Problems: Apparently time does not exist and it wished to change the subject.
   Answer: No

5. Can you find out the address of the careers centre?
   Problems: It wanted me to tell a story.
   Answer: No

6. Can you find out where the careers centre is?
   Problems: "The careers centre? Where is it?" was the reply.
   Answer: No

7. Can you find out the telephone number of the careers centre?
   Answer: Yes

8. Can you try and book a careers interview/appointment?
   Answer: No

9. Can you find out what students do after graduations?
   Problems: "They do their own thing."
   Answer: Yes

10. Can you find some information on placements?
   Problems: No, tried 5 times.
   Answer: No

11. Can you find some information on 6 month placements?
   Answer: Yes

12. Can you find some information on 9 month placements?
   Answer: Yes

13. Can you find some information on 12 month placements / year in industry?
   Answer: Yes

14. Can you find some information on summer month placements?
   Problems: It told me to tell it my likes and dislikes.
   Answer: No

15. Can you find some information on Easter month placements?
   Answer: Yes

16. Can you find some information on step placements?
   Answer: Yes

17. Can you find some information on voluntary work?
   Answer: Yes

18. Can you find some information on placements abroad?
   Answer: Yes

Task #3: The interface
Your third task is to comment on the chatbot's general look and feel. If you disagree, please discuss why.
(Scale: 1 - Strongly agree, 2 - Agree, 3 - Disagree, 4 - Strongly disagree)

19. Were the responses to the above questions quick?
    Rating: 1

20. Was the text clear and legible?
    Rating: 1

21. Were appropriate colours used for the chatbot website?
    Rating: 1

22. Could you read the text on the background?
    Rating: 1

23. Was it easy to find your way around the website?
    Rating: 1

24. Was the content to a suitable understanding level?
    Rating: 2

25. Did you like the design of the website?
    Rating: 2

26. Did you believe you were talking to a real careers advisor?
    Comments: Some bizarre answers.
    Rating: 3

27. Did you find the conversation polite?
    Rating: 2

28. Did you find chatting to the career chatbot helpful?
    Comments: Didn't provide a lot of information.
    Rating: 3

29. Did you notice any spelling / grammar mistakes?
    Answer: No

Task #4
The box below is for general comments on the careers chatbot; please feel free to make suggestions for improvements or give general feedback.

Feedback: Found the address on the website but not via the chatbot. I think that there may be a problem in terms of the singular being understood but, if the plural such as 'placements' is used, it is not recognised.


Appendix J – Iteration 2 Conversation Logs

The following are logs that match the testing within Appendix I. They show the interaction between the tester and the chatbot.

[03/31/2004 08:20:39 AM - 62.252.224.5]

Human: hello Careers Advisor 2: Hi there!

[03/31/2004 08:21:03 AM - 62.252.224.5]

Human: i need some help over what jobs to apply to for management consulting Careers Advisor 2: Can you ask for help in the form of a question?

[03/31/2004 08:21:26 AM - 62.252.224.5]

Human: which company's employ management consultants? Careers Advisor 2: The small one, I think.

[03/31/2004 08:21:50 AM - 62.252.224.5]

Human: how can i find out what jobs im suited to? Careers Advisor 2: Have you tried another program?

[03/31/2004 08:22:13 AM - 62.252.224.5]

Human: can you tell me where to find info about procter and gamble? Careers Advisor 2: You are asking me where you find info about procter and gamble? Do you think I should know it?

[03/31/2004 08:22:19 AM - 62.252.224.5]

Human: yes Careers Advisor 2: Why?

[03/31/2004 08:22:33 AM - 62.252.224.5]

Human: because it is a big company Careers Advisor 2: Interesting explanation. find info about procter and gamble is a company. I suppose that makes sense.

[03/31/2004 08:22:55 AM - 62.252.224.5]

Human: what does this service offer? Careers Advisor 2: What do you think it does?

[03/31/2004 08:23:08 AM - 62.252.224.5]

Human: offer career advice Careers Advisor 2: I didn't even know it can.

[03/31/2004 08:24:03 AM - 62.252.224.5]

Human: how do i get into law Careers Advisor 2: First tell me why you want it.

[03/31/2004 08:24:15 AM - 62.252.224.5]

Human: becuase i like law


Careers Advisor 2: That makes sense to me. You are not the only one. I love horses. I really like the faith.

[03/31/2004 08:25:19 AM - 62.252.224.5]

Human: what placements are on offer Careers Advisor 2: That's an interesting question. I'll come back to that in a minute.

[03/31/2004 08:25:32 AM - 62.252.224.5]

Human: are there any work placement on offer Careers Advisor 2: Yes I think there are.

[03/31/2004 08:25:37 AM - 62.252.224.5]

Human: which ones Careers Advisor 2: " Yes I think there are.?" Which ones?

[03/31/2004 08:25:51 AM - 62.252.224.5]

Human: what internships are available Careers Advisor 2: I'll come back to that later.

[03/31/2004 08:26:07 AM - 62.252.224.5]

Human: do you have any information about graduate schemes Careers Advisor 2: Ambiguous: Ask who, what or where is graduate schemes?

[03/31/2004 08:26:41 AM - 62.252.224.5]

Human: what are the career centre openin times Careers Advisor 2: I would do a search for it.

[03/31/2004 08:26:56 AM - 62.252.224.5]

Human: what are the opening times of the careers centre Careers Advisor 2: Are you testing me?

[03/31/2004 08:27:09 AM - 62.252.224.5]

Human: opening times Careers Advisor 2: I am not sure if I would do it that often.

[03/31/2004 08:27:15 AM - 62.252.224.5]

Human: times open Careers Advisor 2: Let us change the subject.

[03/31/2004 08:27:23 AM - 62.252.224.5]

Human: what time are you open Careers Advisor 2: Time does not exist.

[03/31/2004 08:28:12 AM - 62.252.224.5]

Human: what is the careers centre address Careers Advisor 2: Have you tried another program?

[03/31/2004 08:28:19 AM - 62.252.224.5]

Human: address


Careers Advisor 2: Tell me a story.

[03/31/2004 08:28:30 AM - 62.252.224.5]

Human: careers centre address Careers Advisor 2: Not many people express themselves that way.

[03/31/2004 08:29:16 AM - 62.252.224.5]

Human: telephone number Careers Advisor 2: The careers telephone number is: +44 (0)113 343 5298, Fax: +44 (0)113 3435291, Textphone +44 (0)113 343 5317

[03/31/2004 08:29:41 AM - 62.252.224.5]

Human: thanks Careers Advisor 2: Don't mention it.

[03/31/2004 08:29:47 AM - 62.252.224.5]

Human: address Careers Advisor 2: What do you look like?

[03/31/2004 08:30:00 AM - 62.252.224.5]

Human: careers centre address Careers Advisor 2: We have never talked about it before.

[03/31/2004 08:30:12 AM - 62.252.224.5]

Human: where is the careers centre Careers Advisor 2: The careers centre? Where is it?

[03/31/2004 08:30:16 AM - 62.252.224.5]

Human: yes Careers Advisor 2: This was not a yes or no question.

[03/31/2004 08:30:33 AM - 62.252.224.5]

Human: location of the careers centre Careers Advisor 2: What were we talking about again?

[03/31/2004 08:30:54 AM - 62.252.224.5]

Human: where is the careers centre Careers Advisor 2: The careers centre? Where is it?

[03/31/2004 08:31:29 AM - 62.252.224.5]

Human: can i book an appointmnet Careers Advisor 2: Interesting. I want to find out more about you.

[03/31/2004 08:31:36 AM - 62.252.224.5]

Human: appointmant Careers Advisor 2: Perhaps I'm just expressing my own concern about it.

[03/31/2004 08:31:43 AM - 62.252.224.5]

Human: appointment


Careers Advisor 2: Are we still talking about appointmant?

[03/31/2004 08:31:46 AM - 62.252.224.5]

Human: yes Careers Advisor 2: Sure, appointmant sounds like a good topic.

[03/31/2004 08:31:53 AM - 62.252.224.5]

Human: can i have one? Careers Advisor 2: Why do you want it?

[03/31/2004 08:32:01 AM - 62.252.224.5]

Human: to talk to an advisor Careers Advisor 2: Why don't you make an appointment.

[03/31/2004 08:32:09 AM - 62.252.224.5]

Human: yes that woul be good Careers Advisor 2: This was not a yes or no question." Why don't you make an appointment." woul be good? I get it.

[03/31/2004 08:32:25 AM - 62.252.224.5]

Human: wednesday 9.00 Careers Advisor 2: The ALICE chat robot can follow many things, like our discussion about Why don't you make an appointment.. But I did not get that.

[03/31/2004 08:32:47 AM - 62.252.224.5]

Human: can I have an appointmant on wednesday at 9:00 Careers Advisor 2: Why do you want it?

[03/31/2004 08:32:53 AM - 62.252.224.5]

Human: to see an advisor Careers Advisor 2: Is it a movie?

[03/31/2004 08:33:22 AM - 62.252.224.5]

Human: what do students do after they graduate? Careers Advisor 2: They do their own thing.

[03/31/2004 08:33:53 AM - 62.252.224.5]

Human: what placements do you have on offer? Careers Advisor 2: I would look into the web for that knowledge.

[03/31/2004 08:33:59 AM - 62.252.224.5]

Human: ok Careers Advisor 2: Aha.

[03/31/2004 08:34:10 AM - 62.252.224.5]

Human: can you tell me about any placements Careers Advisor 2: Ambiguous: Ask who, what or where is placements?

[03/31/2004 08:35:28 AM - 62.252.224.5]


Human: info on placemnets Careers Advisor 2: I've been waiting for you.

[03/31/2004 08:36:01 AM - 62.252.224.5]

Human: 6 month placements? Careers Advisor 2: To find a 6 month placement the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[03/31/2004 08:36:30 AM - 62.252.224.5]

Human: 12 month placements Careers Advisor 2: To find a 12 month placement the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[03/31/2004 08:36:53 AM - 62.252.224.5]

Human: 9 month placements Careers Advisor 2: To find a 9 month placement the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[03/31/2004 08:37:15 AM - 62.252.224.5]

Human: summer month placemnets Careers Advisor 2: " To find a 9 month placement the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request."? summer month placemnets?

[03/31/2004 08:38:21 AM - 62.252.224.5]

Human: summer month placement Careers Advisor 2: Have you ever been to Europe?

[03/31/2004 08:39:11 AM - 62.252.224.5]

Human: summer month placement Careers Advisor 2: Tell me about your likes and dislikes?

[03/31/2004 08:40:08 AM - 62.252.224.5]

Human: easter month placements Careers Advisor 2: Do you like talking to me?

[03/31/2004 08:40:16 AM - 62.252.224.5]

Human: easter placements Careers Advisor 2: To find an easter placement the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[03/31/2004 08:40:31 AM - 62.252.224.5]


Human: step placements Careers Advisor 2: Do you have any conditions I should know about?

[03/31/2004 08:40:37 AM - 62.252.224.5]

Human: step placement Careers Advisor 2: STEP is a UK-wide programme offering undergraduates project-based work within small to medium sized businesses and community organisations. For more useful information visithttp://www.step.org.uk/

[03/31/2004 08:40:53 AM - 62.252.224.5]

Human: voluntary work Careers Advisor 2: To find voluntary work the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request

[03/31/2004 08:41:07 AM - 62.252.224.5]

Human: placements abroad Careers Advisor 2: Tell me a story.

[03/31/2004 08:41:13 AM - 62.252.224.5]

Human: placement abroad Careers Advisor 2: To find a placement abroad the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.


Appendix K – Iteration 3 Testing

The following shows an example of the tests that were completed and emailed back or returned as a hard copy. One example is shown here due to the length of each one; the remainder can be found on the CD that accompanies this project.


My name is James McKenzie and I am a final year student at Leeds University; this is for my final year project. The aim of the project is to investigate the feasibility of using chatbot technologies to simulate careers advisors on the internet. Below are a number of short tasks for you to complete; this should not take you longer than 30 minutes.

Name (optional): Leon Savidis
Sex: Male
Age: 23
Have you undertaken a placement? Yes
Have you used a chatbot/chatterbot/conversational agent before, and if so which ones? No

I have freely volunteered to participate in this experiment. YES
I have been told in advance what my tasks will be and how I should conduct them. YES
I have had the chance to ask questions and have had my questions answered to my satisfaction. YES
I am aware that I have the right to withdraw at any time during the experiment. YES
I understand that logs shall be kept of all conversations with the chatbot and these may be used within my project. YES
My signature below ensures the above information is correct.

Print name ………………………………… Signature …………………………………..

Task #1: Explore the chatbot website
Your first task is to spend a few minutes exploring and familiarising yourself with the website. Instructions on how to use the website are under the 'Help' section on the website. Once you have familiarised yourself with the basic features, you are then free to ask the careers chatbot some questions; it can answer questions about general careers advice as well as questions about different types of placements. Just talk to the careers chatbot as if you were chatting to someone online, maybe a friend or even a careers advisor. Spend another few minutes exploring.
1. Go to the URL: http://homepage.ntlworld.com/devblock/careers/index4.html
2. Now, explore.

(If you experienced any problems please discuss. Scale: 1 - Strongly agree, 2 - Agree, 3 - Disagree, 4 - Strongly disagree)

30. Did you find the help function easy to understand?
    Rating: 1

31. Could you get the chatbot to answer your questions?
    Rating: 2

32. While asking the chatbot questions did you find any useful information about general careers advice or placements?
    Rating: 3
    Comments: The chatbot gave no advice concerning careers with the terms 'careers advice', 'future career', 'list of currently available jobs', 'general careers advice' and a collection of other terms. The bot did give advice on some terms with reference to CVs and associated items (covering letters, personal statements etc.). To get advice regarding placement years took several attempts. I feel placement should be a key word in this context which is picked up every time.

Task #2: Information retrieval
Your second task is to ask the careers chatbot a number of questions; don't spend longer than 2 minutes trying to answer each question. You can discuss any problems you encountered and irrelevant answers in the space provided.

33. Can you find out the opening times for the careers centre?
    Answer: Yes

34. Can you find out the address of the careers centre?
    Answer: Yes

35. Can you find out where the careers centre is?
    Problems: (Don't really understand how this differs from 5) – Can't find directions to it if that's what it means.
    Answer: Yes

36. Can you find out the telephone number of the careers centre?
    Problems: It took several attempts – example 'Contact Telephone numbers' didn't work.
    Answer: Yes

37. Can you try and book a careers interview/appointment?
    Problems: This again took several attempts. If I wasn't asked to look for this, I would have assumed that the service was unavailable and stopped trying after the first couple.
    Answer: Yes

38. Can you find out what students do after graduations?
    Answer: Yes

39. Can you find some information on placements?
    Answer: Yes

40. Can you find some information on 6 month placements?
    Answer: Yes

41. Can you find some information on 9 month placements?
    Answer: Yes

42. Can you find some information on 12 month placements / year in industry?
    Answer: Yes

43. Can you find some information on summer month placements?
    Answer: Yes

44. Can you find some information on Easter month placements?
    Answer: Yes

45. Can you find some information on step placements?
    Problems: I had to use the term 'one step placements' for any information to be received rather than simply step placements.
    Answer: Yes

46. Can you find some information on voluntary work?
    Answer: Yes

47. Can you find some information on placements abroad?
    Problems: I tried several times but the search terms I used could not find overseas information.
    Answer: No

Task #3: The interface
Your third task is to comment on the chatbot's general look and feel. If you disagree, please discuss why and suggest improvements.
(Scale: 1 - Strongly agree, 2 - Agree, 3 - Disagree, 4 - Strongly disagree)

48. Were the responses to the above questions quick?
    Rating: 1
    Suggestion: When links are returned these should be clickable rather than simple text.

49. Was the text clear and legible?
    Rating: 1

50. Were appropriate colours used for the chatbot website?
    Rating: 1

51. Could you read the text on the background?
    Rating: 1

52. Was it easy to find your way around the website?
    Rating: 1

53. Was the website consistent from page to page (colour, layout, text)?
    Rating: 1

54. Was the content to a suitable understanding level?
    Rating: 2
    Suggestion: More information should be provided.

55. Did you like the design of the website?
    Rating: 2

56. Did you believe you were talking to a real careers advisor?
    Rating: 4

57. Did you find the conversation polite?
    Rating: 4
    Comment: Didn't believe it was a conversation, so this is basically not applicable.

58. Did you find chatting to the career chatbot helpful?
    Rating: 3
    Comment: Advice given was basic, and usually common sense. I'd have preferred to receive categorised responses with more information.

59. If answers could not be found, was informative feedback given?
    Rating: 3
    Comment: The feedback was a basic error message asking for a simple question to be rephrased. This became infuriating when the search term was as basic as possible.

60. Did you notice any spelling / grammar mistakes?
    Answer: Yes
    Comment: Off the top of my head there was an instance of the bot responding with 'you you' instead of just 'you'.

Task #4
The box below is for general comments on the careers chatbot; please feel free to make suggestions for improvements.

Feedback: For the chatbot to be useful, it needs to encompass more terms and needs to be more responsive to the terms it does recognise. The interface too was a little annoying. When hyperlinks were returned, these should have been clickable. Also I feel it would have been better to make the key terms (e.g. CV) clickable to reach advice on these. To make a program like this more useful, the idea of a conversation with a human needs to be abandoned and a conversation with a computer needs to be considered more closely. This would be far more beneficial for the user.


Appendix L – Iteration 3 Conversation Logs

The following are logs that match the testing within Appendix K. They show the interaction between the tester and the chatbot.

[04/17/2004 03:26:04 AM - 129.11.147.214]

Human: I want advice on jobs after my degree Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:26:18 AM - 129.11.147.214]

Human: I want advice on my cv Careers Advisor 3: our website offers useful information about CV's, PERSONAL STATEMENTS, COVER LETTERS, APPLICATIONS FORMS and INTERVIEW TECHNIQUE, look here: http://careerweb.leeds.ac.uk/students/applications/CV/

[04/17/2004 03:27:56 AM - 129.11.147.214]

Human: career's advice Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:28:48 AM - 129.11.147.214]

Human: Where is the careers office Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:28:57 AM - 129.11.147.214]

Human: Advice on my cv Careers Advisor 3: our website offers useful information about CV's, PERSONAL STATEMENTS, COVER LETTERS, APPLICATIONS FORMS and INTERVIEW TECHNIQUE, look here: http://careerweb.leeds.ac.uk/students/applications/CV/

[04/17/2004 03:29:29 AM - 129.11.147.214]

Human: Office opening times Careers Advisor 3: The Careers Centre is open Monday 10.00-4.45pm. Tuesday to Friday, 9.00am-4.45pm.

[04/17/2004 03:30:30 AM - 129.11.147.214]

Human: Information about placements Careers Advisor 3: you would probably be better to come into the careers centre, we have a library full of useful books about such things as placements and interviews, or you could book an appointment with a careers advisor

[04/17/2004 03:30:43 AM - 129.11.147.214]

Human: placement+ Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff,


see the menu to the left

[04/17/2004 03:30:48 AM - 129.11.147.214]

Human: advice Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:30:55 AM - 129.11.147.214]

Human: Advice on interviews Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:31:19 AM - 129.11.147.214]

Human: Do you have a link for jobs currently available? Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:32:44 AM - 129.11.147.214]

Human: Careers advice Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:32:50 AM - 129.11.147.214]

Human: Future Career Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:33:32 AM - 129.11.147.214]

Human: list of currently available jobs Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:35:31 AM - 129.11.147.214]

Human: Covering Letter advice Careers Advisor 3: our website offers useful information about CV's, PERSONAL STATEMENTS, COVER LETTERS, APPLICATIONS FORMS and INTERVIEW TECHNIQUE, look here: http://careerweb.leeds.ac.uk/students/applications/CV/

[04/17/2004 03:35:38 AM - 129.11.147.214]

Human: Personal statements Careers Advisor 3: our website offers useful information about CV's, PERSONAL STATEMENTS, COVER LETTERS, APPLICATIONS FORMS and INTERVIEW TECHNIQUE, look here: http://careerweb.leeds.ac.uk/students/applications/CV/

[04/17/2004 03:36:49 AM - 129.11.147.214]


Human: Placements available Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:37:05 AM - 129.11.147.214]

Human: Currently available placements Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:37:12 AM - 129.11.147.214]

Human: Placement information Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:37:17 AM - 129.11.147.214]

Human: Placement years Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:37:29 AM - 129.11.147.214]

Human: What can you tell me about industrial placements Careers Advisor 3: To find out about a year in industry the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[04/17/2004 03:38:18 AM - 129.11.147.214]

Human: general careers advice Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:39:49 AM - 129.11.147.214]

Human: Where is the careers centre Careers Advisor 3: we are located at 5-7 cromer terrace on the university campus

[04/17/2004 03:40:21 AM - 129.11.147.214]

Human: WHere is the careers centre? Careers Advisor 3: we are located at 5-7 cromer terrace on the university campus

[04/17/2004 03:40:27 AM - 129.11.147.214]

Human: Directions to the careers centre Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:41:14 AM - 129.11.147.214]


Human: Contact telephone numbers Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:41:20 AM - 129.11.147.214]

Human: telephone numbers Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:41:33 AM - 129.11.147.214]

Human: What is the telephone number of the careers centre? Careers Advisor 3: The careers telephone number is: +44 (0)113 343 5298, Fax: +44 (0)113 3435291, Textphone +44 (0)113 343 5317

[04/17/2004 03:41:58 AM - 129.11.147.214]

Human: I'd like an appointment with the careers centre Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:42:08 AM - 129.11.147.214]

Human: Book appointment with careers adviser Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:42:13 AM - 129.11.147.214]

Human: Book appointment Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:42:21 AM - 129.11.147.214]

Human: I'd like an appointment Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:42:38 AM - 129.11.147.214]

Human: Careers appointment Careers Advisor 3: If you need a quick word with a Careers Adviser we operate a daily drop-in service between 10am and 4pm or pre-book a 15-minute session with an Adviser between 10.00 - 4.30 by calling in or ringing 0113 343 5295

[04/17/2004 03:42:59 AM - 129.11.147.214]

Human: What do students do after graduation? Careers Advisor 3: Look at what other students from your department did after graduating here: http://careerweb.leeds.ac.uk/students/fds/index.html


[04/17/2004 03:43:16 AM - 129.11.147.214]

Human: Information on placements Careers Advisor 3: you would probably be better to come into the careers centre, we have a library full of useful books about such things as placements and interviews, or you could book an appointment with a careers advisor

[04/17/2004 03:43:33 AM - 129.11.147.214]

Human: Information on 6 month placements Careers Advisor 3: To find a 6 month placement the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[04/17/2004 03:46:01 AM - 129.11.147.214]

Human: 9 month placements Careers Advisor 3: To find a 9 month placement the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[04/17/2004 03:46:05 AM - 129.11.147.214]

Human: 12 month placements Careers Advisor 3: To find a 12 month placement the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[04/17/2004 03:46:09 AM - 129.11.147.214]

Human: summer placements Careers Advisor 3: To find a summer placements the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[04/17/2004 03:46:22 AM - 129.11.147.214]

Human: easter month Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:46:34 AM - 129.11.147.214]

Human: easter placements Careers Advisor 3: To find an easter placement the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request.

[04/17/2004 03:46:50 AM - 129.11.147.214]

Human: step placements Careers Advisor 3: Sorry I am not sure of the answer, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left


[04/17/2004 03:46:54 AM - 129.11.147.214]

Human: step placements Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:47:01 AM - 129.11.147.214]

Human: one step placements Careers Advisor 3: STEP is a UK-wide programme offering undergraduates project-based work within small to medium sized businesses and community organisations. For more useful information visithttp://www.step.org.uk/

[04/17/2004 03:47:49 AM - 129.11.147.214]

Human: voluntary work Careers Advisor 3: To find voluntary work the best website to look is on www.prospects.ac.uk. There is a section there for work experience and also a section called My Prospects which lets you put your Cv online and matchs you against specific jobs that you request

[04/17/2004 03:48:18 AM - 129.11.147.214]

Human: placements abroad Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:48:24 AM - 129.11.147.214]

Human: oversees placements Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:48:29 AM - 129.11.147.214]

Human: working abroad Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:48:39 AM - 129.11.147.214]

Human: working abroad Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left

[04/17/2004 03:48:44 AM - 129.11.147.214]

Human: working overseas Careers Advisor 3: Sorry I do not understand, please could you you re-phrase the question please, if this doesn't work please contact a member of the careers staff, see the menu to the left


Appendix M – Interview with Expert Testers

Highlights from an informal interview with Darren Scott (Placement Advisor).

The interview took place on April 14th 2004 immediately after the testing process.

What do you think of the overall feasibility in the careers domain?

Darren thought this might not work for the careers domain, as the answers required for many of the queries are very specific to the user asking the question. There are too many variables in each question to be able to give an answer back with enough substance to satisfy the user. Darren did think that with some extra work the careers chatbot might have potential as a filtering system for the e-guidance system; it could take some of the basic questions that careers advisors get, which, although easy, can take a lot of time.

Do you think this could replace the current e-guidance system?

Darren thought that the current system was acceptable, as if a student requires an answer to a complex query they should expect to wait a few days for a detailed answer. The problem with this system is that the answers can vary greatly, as some careers advisors have more time than others to answer queries. He also suggested that the careers advisors would much rather the student came to the Careers Centre for a quick 10 minute chat, as it is a lot easier to give them a detailed answer face to face. This would also give the student the opportunity to ask follow-up questions, which at the moment the careers chatbot struggles with.


Highlights from an informal interview with Bayan Abu Shawar (Chatbot expert).

The interview took place on April 14th 2004 immediately after the testing process.

I showed Bayan the AIML for the third iteration and asked for comments.

Bayan found that the code was similar to the approach she had first taken; she had started off using atomic categories and then progressed, in a similar way, to using keywords and synonyms.
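
For illustration, categories of the kind Bayan describes might look like the following. The pattern and template text here are invented for the example rather than taken from the project's AIML: an atomic category holds the answer, a synonym category reduces an equivalent phrasing to it with <srai>, and a keyword category uses a wildcard to catch any input ending in the keyword.

<!-- Illustrative sketch only: atomic category, synonym reduction and a keyword category -->
<category>
  <pattern>WORKING ABROAD</pattern>
  <template>For jobs overseas, start with the international vacancies section on www.prospects.ac.uk.</template>
</category>
<category>
  <pattern>OVERSEAS PLACEMENTS</pattern>
  <template><srai>WORKING ABROAD</srai></template>
</category>
<category>
  <pattern>_ ABROAD</pattern>
  <template><srai>WORKING ABROAD</srai></template>
</category>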

Do you think I was right to take away Dr. Wallace’s AIML knowledge for iteration 3?

Bayan experienced the same problem when she first started to make chatbots: she inherited the knowledge from Dr. Wallace's AIML but found that this didn't work, as it conflicted with her own AIML code; problems often arose where it would favour parts of the inherited code over hers. Bayan also commented that it was easier to test with the inherited AIML, as you are only trying to find out whether the areas you have coded work.

How would you have approached this problem?

Bayan suggested the approach she is currently working on. She has a Java program that automatically converts dialogues into AIML code, which saves a lot of the coding time that was previously necessary. This still requires Bayan to perform text analysis on the initial dialogues by hand in order to remove unnecessary dialogue. Her program also involves more than two chatbots chatting at once: the template from one chatbot becomes the pattern for the next chatbot, allowing conversation to take place without any human intervention.
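
As an illustration only, a minimal sketch of such a converter is given below. It assumes the dialogue has already been cleaned and paired into question/answer turns, and the class and method names are invented for the example; it is not Bayan's actual program.

// Sketch of a dialogue-to-AIML converter: each question/answer turn
// becomes one atomic AIML category.
public class DialogueToAiml {

    // Normalise a question into an AIML pattern: upper case, punctuation stripped.
    static String toPattern(String question) {
        return question.toUpperCase().replaceAll("[^A-Z0-9 ]", "").trim();
    }

    // Wrap each question/answer pair in an atomic category.
    // Assumes the answer text contains no XML special characters.
    static String toAiml(String[][] turns) {
        StringBuilder aiml = new StringBuilder("<aiml version=\"1.0\">\n");
        for (String[] turn : turns) {
            aiml.append("  <category>\n")
                .append("    <pattern>").append(toPattern(turn[0])).append("</pattern>\n")
                .append("    <template>").append(turn[1]).append("</template>\n")
                .append("  </category>\n");
        }
        return aiml.append("</aiml>\n").toString();
    }

    public static void main(String[] args) {
        String[][] turns = {
            { "How do I find voluntary work?",
              "Look at the work experience section on www.prospects.ac.uk." }
        };
        System.out.println(toAiml(turns));
    }
}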

Have you any suggestions for future students?

As mentioned above, a program to convert the dialogues into AIML would save the developer a lot of time. Bayan also suggested that there should be a focus on more categories with more keywords and synonyms; this would allow more conversations to take place, and to a greater depth. She suggested that more dialogues would be necessary to further tune the chatbot: the current dialogues from the careers emails were not large enough to pick up the most common variations of keywords and phrases.

Was the dialogue good enough in the first place?

Bayan thought that a bigger dialogue was needed in order to get more questions and answers. She suggested that using other dialogues as well would broaden the scope of the careers chatbot and allow it to look at some queries in a different way. The more dialogues, the better.


What do you think of the overall feasibility in the careers domain?

Bayan thought that there would be quite a few problems with the careers domain, as currently there is no set way of answering a question: all of the answers are made up at the time of the query. The answers are based on human knowledge, and not knowledge that can always be encoded in AIML; this means there is no standard response to many questions, as each one is very specific, and often the knowledge transferred to the students comes from the previous and personal experiences of the advisor involved. Bayan did comment that the system might work as a scaled-down version designed to handle simple questions, similar to a frequently asked questions page.


Appendix N – Hardware Requirement

Pandorabots recommend the following:

Please note the following requirements to correctly see and hear the chatbot speak:

• Windows: MS Internet Explorer or Netscape browser

• Mac: Netscape browser

• Sound card – make sure you set the speaker volume appropriately

• Note: Netscape 6.0, 6.1 and Mozilla 1.0 are not supported.

The requirements identified above will determine the system design; the system design shall therefore effectively meet these requirements. As the name suggests, it is essential that all of the 'essential requirements' are met, as without these the system will not meet the users' requirements.