Anomaly Detection in Online User Review
EasyChair Preprint 12924 • 31 pages • Date: April 6, 2024

Abstract

This paper presents a pipeline to detect and explain anomalous reviews
in online platforms. The pipeline comprises three modules and detects reviews
that do not provide value to users because they are either worthless or
maliciously composed. Each classification is accompanied by a normality score
and an explanation that justifies the decision. The
pipeline’s ability to solve the anomaly detection task was evaluated using
different datasets created from a large Amazon database. Additionally, a
study with 241 participants comparing three explainability techniques was
conducted to assess the explainability module. The study measured the impact of
the explanations on the respondents' ability to reproduce the classification
model's decisions and on the explanations' perceived usefulness. This work can be useful
to automate tasks in online review platforms, such as those for electronic
commerce, and offers inspiration for addressing similar problems in the field
of anomaly detection in textual data. We also consider valuable the human
evaluation of different explainability techniques in a realistic and uncommon
scenario, the detection of anomalous reviews, as well as the reflection on
whether a task as humanly subjective as this one can be explained at all.

Keyphrases: explainability, text reviews, anomaly detection, transformers
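
For readers who want a concrete picture of such a pipeline, the sketch below strings together the three stages described in the abstract: a representation module, a normality-scoring module, and an explanation module. It is a minimal illustration under assumed components, not the paper's implementation: the TF-IDF encoder, the IsolationForest detector, and the term-based "explanation" are placeholder choices standing in for the transformer-based model and the explainability techniques the paper evaluates, and the sample reviews are invented.

```python
# Minimal three-module sketch (illustrative only; all modelling choices are
# assumptions, not the method described in the preprint).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import IsolationForest
import numpy as np

reviews = [
    "Great phone, battery lasts two days and the camera is sharp.",
    "Works as expected, fast delivery.",
    "BUY NOW!!! visit my-site.example for FREE coupons!!!",  # hypothetical low-value review
]

# Module 1: turn each review into a feature vector (a TF-IDF bag of words
# stands in here for a transformer encoder).
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(reviews).toarray()

# Module 2: unsupervised detector that yields a label and a normality score
# (a higher decision_function value means a more "normal" review).
detector = IsolationForest(random_state=0).fit(X)
scores = detector.decision_function(X)
labels = detector.predict(X)  # -1 = anomalous, +1 = normal

# Module 3: a crude feature-level explanation - the terms that weigh most in
# each review's vector (a placeholder for SHAP/LIME-style attributions).
terms = np.array(vectorizer.get_feature_names_out())
for text, score, label, row in zip(reviews, scores, labels, X):
    top_terms = terms[np.argsort(row)[-3:][::-1]]
    verdict = "anomalous" if label == -1 else "normal"
    print(f"{verdict:9s} score={score:+.3f} top_terms={list(top_terms)} | {text[:45]}")
```

Swapping the encoder for a transformer model or the explainer for an attribution technique such as SHAP or LIME would preserve the same three-module structure of classification, normality score, and justification.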