1. Introduction
2. Designing Chatbots for Pro Bono Services
2.1. The Right Bot for the Right Job
The main driving question when starting to build a chatbot is: «What is my bot being hired to do?»4 This question naturally extends to how chatbots are programmed to resolve a person’s queries in the best way, and how they should be designed to make people feel comfortable interacting with them. One must first clearly define the objectives that the chatbot should achieve, and then design the chatbot to achieve them, as the sample objectives in Table 1 illustrate (a short code sketch of the means-test row follows the table).
Objective | How does a chatbot help?
To direct web visitors to the right agencies or legislation | The chatbot provides users with hyperlinks to the requested agencies or legislation (e.g. Foreign Ministry, Ministry of Manpower)
To determine whether an applicant meets the means test for free legal representation | The chatbot asks users for a specific set of information in a Y/N format and, depending on the responses, identifies whether the means test is met
To guide people who have received a criminal charge | The user describes the situation to the chatbot, and the chatbot asks for further information, such as the offence and whether the user has sought legal advice
To schedule legal clinic timings with accepted applicants | The chatbot provides a list of available dates and times, and the user confirms
Table 1: Sample objective table for a pro bono service provider
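As a concrete illustration of the means-test row in Table 1, the sketch below walks an applicant through a fixed set of Y/N questions and re-prompts whenever a reply cannot be understood. The two questions, the field names and the `ask` callback are assumptions made for the sketch, not the criteria or interface of any actual clinic.

```python
# Minimal sketch of the Y/N means-test flow from Table 1.
# The questions and thresholds below are illustrative assumptions.
MEANS_TEST_QUESTIONS = [
    ("income_ok", "Is your annual income $50,000 or less? (Y/N)"),
    ("assets_ok", "Are your savings and assets below the clinic's limit? (Y/N)"),
]

def parse_yes_no(reply: str):
    """Map a reply onto True/False; return None if it cannot be understood."""
    reply = reply.strip().lower()
    if reply in {"y", "yes"}:
        return True
    if reply in {"n", "no"}:
        return False
    return None  # unknown input: re-prompt rather than guess

def run_means_test(ask) -> bool:
    """`ask(prompt)` sends a question to the user and returns the reply text."""
    answers = {}
    for field, question in MEANS_TEST_QUESTIONS:
        answer = None
        while answer is None:            # keep asking until we get a clear Y/N
            answer = parse_yes_no(ask(question))
        answers[field] = answer
    return all(answers.values())         # pass only if every criterion is met
```

On the command line this could be exercised as run_means_test(input); in a messaging channel, `ask` would instead send the question through the platform’s API and wait for the reply.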
2.2.1. Reducing Errors and Interruptions
The conversation should be designed to meet the user’s needs. This usually means keeping the conversation short and succinct to reduce errors and interruptions.7 A longer conversation flow gives users more opportunities to enter typographical errors or responses that the chatbot cannot process,8 unless the chatbot has been programmed to handle common typographical errors.
There are two elements in the conversation that enable the chatbot to achieve its objective swiftly. The first element is the nature of the questions. To illustrate this, three sample conversations, A, B and C, are shown in Table 2. In Conversation A, the chatbot’s question gives the user room to ask anything. The chatbot can be programmed to process the query if it picks out the word «appointment» and the date «26 October 2018». Unless programmed to do so, the chatbot cannot process the query if «appointment» is misspelt or a synonym is used, e.g. «meeting» or «consultation». In Conversations B and C, the question directs the user to provide only a specific date, which the chatbot can process immediately, bypassing the need for the user to type «appointment» at all. Conversations B and C use leading questions to direct users towards a response the chatbot can process; a minimal code sketch of this contrast follows Table 2.
Conversation A
Bot: Hi there! My name is Peter. How can I help you make an appointment with our legal clinic?

Conversation B
Bot: Hi there! My name is Peter. Please let me know which day you would like to visit us.

Conversation C
Bot: Hi there! My name is Peter. Please let me know which day you would like to visit us.
User: 26 October 2018
Bot: Please let me know what time suits you: [clickable time options]
(User responds by clicking)
Bot: Your appointment has been fixed on 26 October 2018 at 5pm. See you soon!
Table 2: Conversation flow comparisons for an AppointmentBot
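The sketch below contrasts the two approaches in Table 2: in Conversation A the chatbot must extract both the intent and the date from free text, and fails on typos or unfamiliar phrasing, while in Conversations B and C the leading question and clickable options leave nothing to misinterpret. The synonym list, the assumption that the date comes as the last three words of the message, and the «26 October 2018» date format are all illustrative assumptions.

```python
from datetime import date, datetime

# Conversation A: the bot must pull the intent and the date out of free text.
APPOINTMENT_WORDS = {"appointment", "meeting", "consultation"}

def handle_free_text(message: str) -> date | None:
    """Succeeds only if a known keyword and a parsable date both appear."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    if not words & APPOINTMENT_WORDS:
        return None                      # unknown intent -> error or interruption
    try:
        day_text = " ".join(message.split()[-3:])   # e.g. "26 October 2018"
        return datetime.strptime(day_text, "%d %B %Y").date()
    except ValueError:
        return None                      # typo or unexpected format -> error

def handle_button_reply(offered_slots: list[str], chosen_index: int) -> str:
    """Conversations B/C: the user clicks one of the offered slots, so nothing
    can be misspelt and no free text needs to be parsed."""
    return offered_slots[chosen_index]
```

handle_free_text("I would like an appointment on 26 October 2018") returns a date, but a misspelling such as «apointment» falls through to None and the bot must re-prompt; handle_button_reply(["3pm", "4pm", "5pm"], 2) cannot fail in that way, which is exactly why the leading-question design reduces errors and interruptions.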
2.2.2. Mirroring Man
Legal clinics can provide a backstory for the chatbot and develop a human quality in its responses. For instance, the chatbot in Table 2 is named «Peter». «Peter» could be a 19-year-old who has just finished high school and is waiting to start college, and who responds in an upbeat tone. Or «Peter» could be a 60-year-old grandfather who responds in a reassuring tone. A good example is the chatbot «Poncho»10. Poncho is essentially a weather bot, but its developers have built a backstory for it, giving it a personality that allows it to deliver weather updates in a fun way. The question, therefore, is what backstory is appropriate for the chatbot, given the objectives the developer intends to achieve.
2.3. Designing the Chatbot for a Legal Clinic
The chatbot should generally fulfil one objective at any given moment. The clinic should identify the objective it wants the chatbot to fulfil. Building a chatbot that simultaneously schedules appointments, screens applicants against the means test, and provides legal advice can result in none of these objectives being achieved, because the multitude of possible user inputs leaves the field open for more errors.
3.1. Provision of Legal Advice
3.2. Gathering and Processing Data
3.3. Complex Tasks Related to Natural Language Processing
3.3.1. Dialects or Variations of the First Language
There is a likelihood that those seeking legal aid do not speak a language in its «official» form, but a dialect. A chatbot trained to understand only the official language may not be able to process a dialect, e.g. German as compared to Bavarian (Bayerische Sprache), or English as compared to Singlish.13 Nouns, adjectives, and syntax can differ greatly between an «official» language and a dialect. If a dialect is highly prevalent in the region, the developer may consider training the chatbot to recognise words from that dialect. However, this means more effort may be needed to train the chatbot to process and respond to such inputs. Linguists and native speakers of the dialect may have to work with developers to achieve the right outcome.
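One lightweight starting point is a normalisation layer that rewrites known dialect vocabulary into the official-language terms the chatbot was trained on, before intent matching takes place. The sketch below assumes a hand-curated dictionary; the example entries are purely illustrative and would in practice have to come from the linguists and native speakers mentioned above.

```python
# Hand-curated table mapping dialect vocabulary onto the "official" terms the
# chatbot's language model was trained on. The entries below (Singlish and
# Bavarian examples) are illustrative assumptions.
DIALECT_TO_OFFICIAL = {
    "makan": "eat",        # Singlish (from Malay)
    "kena": "got",         # Singlish, e.g. "kena summons" -> "got summons"
    "servus": "hello",     # Bavarian greeting
}

def normalise(message: str) -> str:
    """Rewrite known dialect tokens so the downstream intent matcher sees
    vocabulary it was trained on; unknown words pass through unchanged."""
    tokens = message.lower().split()
    return " ".join(DIALECT_TO_OFFICIAL.get(token, token) for token in tokens)
```

normalise("I kena summons") becomes «i got summons», which a chatbot trained on standard English has a better chance of processing; anything beyond single-word substitutions, such as different syntax, would still require the heavier training effort described above.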
3.3.2. Different Languages
3.4. «Can I Speak to an Operator, Please?»
Legal advice, when sought, should be provided accurately. If it is clear that the chatbot is unlikely to provide proper assistance, whether due to insufficient data or because substantial discretion is involved (e.g. a specific element of a means test: the rule is that applicants must have an annual income of $50’000 or less, but the clinic has accepted exceptions where the applicant earns $50’500), the chatbot should prompt the user to contact a human-operated helpline. The simple rationale is that wrong advice from the chatbot can have dire consequences given the nature of the enquiries: a good fraction of these enquirers are unlikely to be able to afford a paid professional opinion, and in some cases there is an urgency for legal advice to be dispensed, such as when the applicant is facing a criminal charge. In such cases, developers should prompt the user to contact a specific human-manned hotline whenever the chatbot cannot resolve the query.
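A minimal sketch of this hand-over rule follows: clear passes and clear fails are handled by the chatbot, while the discretionary band around the income limit, and any low-confidence answer, are routed to a human. The width of the band, the confidence threshold and the hotline placeholder are assumptions; only the $50,000 income limit comes from the example in the text.

```python
# Sketch of the hand-over rule described above. The $1,000 discretionary band,
# the 0.8 confidence threshold and the hotline placeholder are illustrative.
INCOME_LIMIT = 50_000
DISCRETION_BAND = 1_000        # covers borderline cases such as the $50,500 applicant
CONFIDENCE_THRESHOLD = 0.8
HOTLINE_PROMPT = "Please call our helpline at <clinic hotline> to speak to a volunteer."

def means_test_reply(annual_income: float) -> str:
    if annual_income <= INCOME_LIMIT:
        return "Based on your income, you appear to meet the means test."
    if annual_income <= INCOME_LIMIT + DISCRETION_BAND:
        # Within the range where the clinic has exercised discretion before:
        # a human, not the bot, should make the call.
        return HOTLINE_PROMPT
    return ("Based on your income, you do not appear to meet the means test. "
            + HOTLINE_PROMPT)

def answer_or_escalate(answer: str, confidence: float) -> str:
    """Give the bot's answer only when it is confident; otherwise escalate."""
    return answer if confidence >= CONFIDENCE_THRESHOLD else HOTLINE_PROMPT
```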
4.1. Scalability of Chatbots
4.2. Developing Greater Interactivity?
4.3. Shifting of Resources
4.4. Cost-Benefit Analysis
5. Conclusion
With increasing awareness of technology and the growing presence of chatbots in our daily lives, people are becoming comfortable interacting with and «trusting» chatbots to guide them. It would not be surprising to see pro bono service providers eventually deploying chatbots to assist enquirers or applicants. However, any such chatbot should be built to be trustworthy and understanding, and to create that kind of atmosphere for its users. Volunteers and staff can chip in to create this new «colleague» of theirs. It will be exciting to see how chatbots and volunteers work side by side in legal clinics in the future.
6. References
Accenture Interactive, Chatbots in Customer Service, Accenture. https://www.accenture.com/t00010101T000000__w__/br-pt/_acnmedia/PDF-45/Accenture-Chatbots-CustomerService.pdf.
Matthew Black, Live from F8 – «Group bots» with Messenger Chat Extensions, Chatbots Magazine. https://chatbotsmagazine.com/live-from-f8-group-bots-with-messenger-chat-extensions-641a3d66b367, 2017.
ChatbotConf 2017, Oratio. https://orat.io/chatbotconf.
Nitin Donde, How data analytics will help us understand chatbots, Venture Beat. https://venturebeat.com/2017/05/29/how-data-analytics-will-help-us-understand-chatbots/, 2017.
Nadine Freischlad, Indonesian speaking chatbots are here! Soon everyone can use them, TechInAsia. https://www.techinasia.com/kata-chatbot-platform-for-indonesian, 2016.
Samuel Gibbs, Chatbot lawyer overturns 160,000 parking tickets in London and New York, The Guardian. https://www.theguardian.com/technology/2016/jun/28/chatbot-ai-lawyer-donotpay-parking-tickets-london-new-york, 2016.
Lauren Golembiewski, Conversational UX: How to build robust, usable bots, ChatbotConf 2017: Guest Speaker.
Lawbot, University of Cambridge Judge Business School. https://www.jbs.cam.ac.uk/faculty-research/centres/socialinnovation/cambridge-social-ventures/our-graduates/lawbot/.
Joanna Goodman, Legal technology: the rise of the chatbots, The Law Society Gazette. https://www.lawgazette.co.uk/features/legal-technology-the-rise-of-the-chatbots/5060310.article, 2017.
Ilker Koksal, 5 small changes that drastically improve chatbot conversations, Venture Beat. https://venturebeat.com/2017/09/16/5-small-changes-that-drastically-improve-chatbot-conversations/, 2017.
Alfie Packham, Cambridge students build a «lawbot» to advise sexual assault victims, The Guardian. https://www.theguardian.com/education/2016/nov/09/cambridge-students-build-a-lawbot-to-advise-sexual-assault-victims, 2016.
John Pavlus, The Next Phase of UX: Designing Chatbot Personalities, Co.Design. https://www.fastcodesign.com/3054934/the-next-phase-of-ux-designing-chatbot-personalities, 2016.
Poncho, https://poncho.is/.
Refugee Law Clinic Cologne, http://lawcliniccologne.com/.
Csenger Szabo, 8 Things I Learned at ChatbotConf 2017, Chatbots Magazine. https://chatbotsmagazine.com/8-things-ive-learnt-at-chatbotconf-2017-22cf9318828e, 2017.
Visabot, http://visabot.co.
Helen Zeng, Bots with jobs: Designing conversational UI for the workplace, ChatbotConf 2017: Guest Speaker.
- 1 Gibbs 2016.
- 2 http://visabot.co (all websites last accessed on 28 October 2017).
- 3 https://www.jbs.cam.ac.uk/faculty-research/centres/social-innovation/cambridge-social-ventures/our-graduates/lawbot/.
- 4 Golembiewski 2017.
- 5 Koksal 2017.
- 6 Donde 2017.
- 7 Zeng 2017.
- 8 Ibid.
- 9 Pavlus 2016.
- 10 https://poncho.is/.
- 11 This is a sticky issue in the regulation of legal technology. A person requires a license to give legal advice, whether for a fee or pro bono. However, it is still debatable whether legal advice given by an AI would render the software developer liable for unauthorised legal practice.
- 12 Each country has its own data protection rules. The European Union has established the General Data Protection Regulation (GDPR), which it will enforce from 25 May 2018; all member states of the European Union are expected to comply with these rules.
- 13 Freischlad 2016.
- 14 For instance, the Refugee Law Clinic in Cologne provides translations in French and Arabic, despite German being the «official» language of the country.
- 15 Black 2017.