2013 | 2 | no. 2 | 146--158
Article title

Chatbots for Customer Service on Hotels' Websites

Content
Title variants
Publication languages
EN
Abstracts
EN
In this article we present an analysis of implementations of a chatbot, a program that simulates an intelligent conversation with webpage visitors, dedicated to hotels and guesthouses (hotel chatbot, HC for short). We obtained unique data from five webpages with various configurations, containing a total of 17,413 user statements in 4,165 conversations. The informative function of the HC was confirmed for more than 56% of the conversations. Moreover, 63% of users prefer to interact with the HC if it suggests at least one clickable option as an alternative to typing. The results indicate that implementing speech synthesis increases the percentage of users who decide to start a conversation with the HC, and it may have a positive impact on the percentage of users who book rooms online. (original abstract)
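The abstract's finding that users prefer an HC which offers at least one clickable option alongside free-text input can be illustrated with a minimal sketch of such a dialogue turn. This is not the authors' Denise system; every intent, keyword, and option below is hypothetical and chosen only to show the pattern of pairing a textual answer with clickable choices.

```python
# Illustrative sketch (hypothetical rules, not the article's actual system):
# a hotel chatbot reply that pairs a text answer with clickable
# quick-reply options, the interaction style the article reports
# users prefer over typing alone.
from dataclasses import dataclass, field

@dataclass
class HCReply:
    text: str                                     # displayed/spoken answer
    options: list = field(default_factory=list)   # clickable choices

# Hypothetical keyword-based intent table for a hotel website.
INTENTS = {
    "price": HCReply("A double room costs 80 EUR per night.",
                     ["Book a room", "See other rooms"]),
    "breakfast": HCReply("Breakfast is served from 7 to 10 a.m.",
                         ["See the menu", "Ask another question"]),
}

# Even the fallback offers clickable options, so the user is never
# forced to type.
FALLBACK = HCReply("I am not sure I understood. What would you like to know?",
                   ["Room prices", "Breakfast hours"])

def respond(user_text: str) -> HCReply:
    lowered = user_text.lower()
    for keyword, reply in INTENTS.items():
        if keyword in lowered:
            return reply
    return FALLBACK

reply = respond("What is the price of a double room?")
print(reply.text)       # the HC's textual answer
print(reply.options)    # at least one clickable alternative to typing
```

In a real deployment the keyword lookup would be replaced by proper natural-language processing, but the shape of the reply (answer text plus clickable options) is the point being illustrated.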
Year
Volume
2
Issue
Pages
146--158
Physical description
Contributors
  • University of Warsaw, Poland
author
  • Denise Systems sp. z o.o.
Bibliography
  • [1] Abu Shawar B., Atwell E. (2007) Chatbots: Are they Really Useful?, LDV-Forum Journal for Computational Linguistics and Language Technology, 22 (1), pp. 29-49.
  • [2] Abu Shawar B., Atwell E. (2004) Evaluation of Chatbot Information System, in Proceedings of the Eighth Maghrebian Conference on Software Engineering and Artificial Intelligence.
  • [3] Banach A., Dąbkowski J. (2010), Duży ruch to nie wszystko. Żeby gość nas polecał, Hotelarz, 11(574) November, pp. 36-39.
  • [4] Bartneck C., Rosalia C., Menges R., Deckers I. (2005) Robot Abuse - A Limitation of the Media Equation, in Proceedings of the INTERACT 2005 workshop Abuse: The darker side of Human-Computer Interaction, Rome, Italy.
  • [5] Brahnam S. (2006) Gendered bods and bot abuse, in Proceedings of the CHI 2006 workshop on Misuse and abuse of interactive technologies, Montreal, Quebec, Canada.
  • [6] Brahnam S. (2005) Strategies for handling customer abuse of ECAs, in Proceedings of the INTERACT 2005 workshop Abuse: The darker side of Human-Computer Interaction, Rome, Italy, pp. 62-67.
  • [7] Bogdanovych A., Simoff S., Sierra C., Berger H. (2005) Implicit training of virtual shopping assistants in 3D electronic institutions, in Proceedings of the IADIS International Conference: e-Commerce 2005, IADIS Press, Portugal, pp. 50-57.
  • [8] Chai J., Budzikowska M., Horvath V., Nicolov N., Kambhatla N., Zadrozny W. (2001) Natural Language Sales Assistant - A Web-Based Dialog System for Online Sales, in Proceedings of the 13th Innovative Applications of Artificial Intelligence Conference, IAAI'01, Seattle, WA, pp. 19-26.
  • [9] De Angeli A. (2006) On Verbal Abuse Towards Chatterbots, in Proceedings of the CHI 2006 workshop on Misuse and Abuse of Interactive Technologies, Montreal, Quebec, Canada.
  • [10] De Angeli A., Brahnam S. (2008) I hate you! Disinhibition with virtual partners, Interacting with Computers, 20(3), pp. 302-310.
  • [11] De Angeli A., Carpenter R. (2005) Stupid computer! Abuse and social identities, in Proceedings of the INTERACT 2005 workshop Abuse: The darker side of Human-Computer Interaction, Rome, Italy.
  • [12] De Angeli A., Johnson G. I., Coventry L. (2001) The unfriendly user: exploring social reactions to chatterbots, in Proceedings of the International Conference on Affective Human Factor Design, London, pp. 467-474.
  • [13] Hughes L. (2006) The Eliza Effect: Conversational Agents and Cognition, available at: http://www.laurahughes.com/art/elizaeffect.pdf (accessed 27 September 2011).
  • [14] Jessa Sz. (2004) Czy chatterboty nas rozumieją?, Software 2.0 Extra, 10/2004, pp. 16-20, available at: http://sdjournal.org/magazine/1251-sztuczna-inteligencja.
  • [15] Jessa Sz., Jędruch W. (2010) Przetwarzanie wyrażeń języka naturalnego w wyrażenia logiczne - system Denise, in Przedsięwzięcia i usługi informacyjne. Praca zbiorowa Katedry Architektury Systemów Komputerowych KASKBOOK, (Ontologie w opisie scenariuszy usług), Gdańsk, pp. 75-87.
  • [16] Kuligowska K. (2010) Koszty i korzyści implementacji wirtualnych asystentów w przedsiębiorstwach oraz ich znaczenie dla rozwoju gospodarki elektronicznej, rozprawa doktorska, Wydział Nauk Ekonomicznych Uniwersytetu Warszawskiego, Warszawa.
  • [17] Kuligowska K., Lasek M. (2011) Virtual assistants support customer relations and business processes, The 10th International Conference on Information Management, Gdańsk.
  • [18] Kopp S., Gesellensetter L., Krämer N., Wachsmuth I. (2004) A conversational agent as museum guide - design and evaluation of a real-world application, in Proceedings of Intelligent Virtual Agents (IVA 2005), Berlin, Germany, Volume 3661, pp. 329-343.
  • [19] Loebner Prize, (2011), available at: http://www.loebner.net/Prizef/loebner-prize.html (accessed 27 September 2011).
  • [20] Mewes D., Heloir A. (2009) The Uncanny Valley, available at: http://embots.dfki.de/doc/seminar_ss09/writeup%20uncanny%20valley.pdf (accessed 27 September 2011).
  • [21] Nass C., Steuer J., Tauber E. (1994) Computers are social actors. Human Factors in Computing Systems, in CHI '94 Conference Proceedings, New York, pp. 72-78.
  • [22] Pfeiffer T., Liguda C., Wachsmuth I., Stein S. (2011) Living with a Virtual Agent: Seven Years with an Embodied Conversational Agent at the Heinz Nixdorf MuseumsForum, in: S. Barbieri, K. Scott, & L. Ciolfi (eds.), Proceedings of the Re-Thinking Technology in Museums 2011 - Emerging Experiences. Limerick: think creative & the University of Limerick, pp. 121-131.
  • [23] Reeves B. (2004) The Benefits of Interactive Online Characters, available at: http://www.sitepal.com/pdf/casestudy/Stanford_University_avatar_case_study.pdf (accessed 27 September 2011).
  • [24] Robinson S., Traum D., Ittycheriah M., Henderer J. (2008) What would you ask a conversational agent? observations of human-agent dialogues in a museum setting, in Proceedings of the 5th International Conference on Language Resources and Evaluation.
  • [25] Saarine L. (2001) Chatterbots: Crash Test Dummies of Communication. Master's Thesis, UIAH Helsinki, available at: http://mlab.uiah.fi/~lsaarine/bots/ (accessed 27 September 2011).
  • [26] Shieber S. (1993) Lessons from a Restricted Turing Test, available at: http://www.eecs.harvard.edu/shieber/Biblio/Papers/loebner-rev-html/loebner-revhtml.html (accessed 27 September 2011).
  • [27] Wallis P. (2005) Believable conversational agents: introducing the intention map, available at: http://nlp.shef.ac.uk/dqa/wallis05-3.pdf (accessed 27 Sep. 2011).
  • [28] Wallis P. (2005 b) Robust normative systems: what happens when a normative system fails?, in Proceedings of the INTERACT 2005 workshop Abuse: The darker side of Human-Computer Interaction, Rome, Italy.
  • [29] Weizenbaum J. (1976) Computer power and human reason: from judgment to calculation, W. H. Freeman & Co., NY, USA.
  • [30] Weizenbaum J. (1966) ELIZA - A computer program for the study of natural language communication between man and machine, Communications of the ACM, 9(1), pp. 36-45.
Document type
Bibliography
Identifiers
YADDA identifier
bwmeta1.element.ekon-element-000171285181
