Date of Award
8-2023
Document Type
Dissertation
Degree Name
Doctor of Philosophy (PhD)
Department
Human Centered Computing
Committee Chair/Advisor
Bart Knijnenburg
Committee Member
Nathan McNeese
Committee Member
Guo Freeman
Committee Member
Marten Risius
Abstract
Adaptive experiences have been an active area of research in the past few decades, accompanied by advances in technologies such as machine learning and artificial intelligence. Current research on adaptive experiences spans personalization algorithms, explainability, user engagement, and privacy and security, and the interest and resources devoted to these areas continue to grow. Yet even though research on adaptive experiences is dynamic and rapidly evolving, achieving a high level of user engagement in adaptive experiences remains a challenge. This dissertation aims to uncover ways to engage users in adaptive experiences by incorporating interactivity and explanation, through four studies.
Study I takes the first step toward linking explanation and interactivity in machine learning systems to facilitate users' engagement with the underlying machine learning model, using the Tic-Tac-Toe game as a use case. The results show that explainable machine learning (XML) systems (and arguably XAI systems in general) indeed benefit from mechanisms that allow users to interact with the system's internal decision rules.
Studies II, III, and IV focus specifically on adaptive experiences in recommender systems, exploring the role of interactivity and explanation in keeping the user "in the loop," mitigating the "filter bubble" problem, and helping users self-actualize by supporting them in exploring and understanding their unique tastes.
Study II investigates the effect of recommendation source (a human expert vs. an AI algorithm) and justification method (needs-based vs. interest-based justification) on professional development recommendations in a scenario-based study. The results show an interaction effect between these two system aspects: users who are told that the recommendations are based on their interests have a better experience when the recommendations are presented as originating from an AI algorithm, while users who are told that the recommendations are based on their needs have a better experience when the recommendations are presented as originating from a human expert. This finding implies that, in the novel movie recommender system proposed in Study IV, presenting the movie recommendations as originating from an algorithm rather than from a human expert should provide a better user experience, given that movie preferences (visualized in Study IV via the movies' emotion feature) are usually based on users' interests.
Study III explores the effects of four novel alternative recommendation lists on participants' perceptions of recommendations and their satisfaction with the system. These four alternative lists (RSSA features), which have the potential to go beyond the traditional top-N recommendations, provide transparency at a different level: they reveal how much else the system has learned about users beyond the traditional top-N recommendations. They also enable users to interact with the alternative lists by rating the initial recommendations, thereby correcting or confirming the system's estimates underlying those lists.
The subjective evaluation and behavioral analysis demonstrate that the proposed RSSA features had a significant effect on the user experience. Surprisingly, two of the four RSSA features (the "controversial" and "hate" features) performed worse than the traditional top-N recommendations on the measured subjective dependent variables, while the other two (the "hipster" and "no clue" items) performed equally well or even slightly better than the traditional top-N (though this difference is not statistically significant). Moreover, the results indicate that individual differences, such as the need for novelty and domain knowledge, play a significant role in users' perception of and interaction with the system.
Study IV combines diversification, visualization, and interactivity, aiming to encourage users to engage more deeply with the system. The results show that introducing emotion as an item feature in recommender systems does help with personalization and individual taste exploration. These benefits are greatly amplified by mechanisms that diversify recommendations by emotional signature, visualize recommendations along that signature, and allow users to directly interact with the system by tweaking their tastes, which further contributes to both user experience and self-actualization.
This work has practical implications for designing adaptive experiences.
Explanation solutions in adaptive experiences do not always lead to a positive user experience; the outcome depends heavily on the application domain and the context (as studied in all four studies). It is therefore essential to investigate a specific explanation solution carefully, in combination with other design elements, across different fields. Introducing control by allowing direct (rather than indirect) interactivity in adaptive systems, and providing feedback on users' input by integrating that input into the algorithms, creates a more engaging and interactive user experience (as studied in Studies I and IV). Cumulatively, appropriate direct interaction with the system, together with deliberate and thoughtful explanation designs (including visualization designs that fully consider the application environment) that arouse user reflection or resonance, can promote both user experience and user self-actualization.
Recommended Citation
Guo, Lijie, "Understanding the Role of Interactivity and Explanation in Adaptive Experiences" (2023). All Dissertations. 3443.
https://open.clemson.edu/all_dissertations/3443
Author ORCID Identifier
https://orcid.org/0000-0002-2465-3319
Included in
Data Science Commons, Graphics and Human Computer Interfaces Commons, Other Computer Engineering Commons, Other Computer Sciences Commons