ACIS 2024 Proceedings

Abstract

Unstructured information artefacts such as social media posts, news articles, and company annual reports are heavily used by knowledge workers in their daily tasks. Such artefacts may stem from a variety of sources of questionable, or at least unknown, quality. This presents unique challenges for knowledge workers in effectively and efficiently evaluating information quality relative to the task at hand. Metadata holds the potential to enhance this process; however, there is limited research investigating how users interact with metadata to assess quality. In this paper, we investigate how users engage with metadata during tasks that require them to use such information artefacts. We use a lab experiment, combining eye-tracking and retrospective think-aloud methods, to explore user behaviour and cognitive processes during data quality evaluation when interacting with metadata associated with large PDF documents. The results of our study contribute to the understanding of the relationship between metadata usage and data quality evaluation, providing insights to guide the design of data quality assessment tools.
