Doerfel, S.; Zoller, D.; Singer, P.; Niebler, T.; Hotho, A. & Strohmaier, M.: Evaluating Assumptions about Social Tagging - A Study of User Behavior in BibSonomy. In: Seidl, T.; Hassani, M. & Beecks, C. (Eds.):
Proceedings of the 16th LWA Workshops: KDML, IR and FGWM, Aachen, Germany, September 8-10, 2014. CEUR-WS.org, 2014, pp. 18-19
[Full text]
[BibTeX]
Doerfel, S.; Zoller, D.; Singer, P.; Niebler, T.; Hotho, A. & Strohmaier, M.:
Of course we share! Testing Assumptions about Social Tagging Systems. 2014
[Full text] [Abstract]
[BibTeX]
Social tagging systems have established themselves as an important part of today's web and have attracted the interest of the research community in a variety of investigations. The overall vision of the community is that, simply through interactions with the system, i.e., through tagging and sharing resources, users contribute to building useful semantic structures as well as resource indexes using uncontrolled vocabulary, not least because of the easy-to-use mechanics. Consequently, a variety of assumptions about social tagging systems have emerged, yet testing them has been difficult due to the absence of suitable data. In this work we thoroughly investigate three such assumptions - e.g., is a tagging system really social? - by examining live log data gathered from the real-world public social tagging system BibSonomy. Our empirical results indicate that while some of these assumptions hold to a certain extent, others need to be reconsidered and viewed in a critical light. Our observations have implications for the design of future search and other algorithms to better reflect actual user behavior.
Haustein, S. & Siebenlist, T.: Applying social bookmarking data to evaluate journal usage. In:
Journal of Informetrics 5 (2011), no. 3, pp. 446-457
[Full text]
[Abstract]
[BibTeX]
Web 2.0 technologies are finding their way into academia: specialized social bookmarking services allow researchers to store and share scientific literature online. By bookmarking and tagging articles, academic prosumers generate new information about resources, i.e., usage statistics and content descriptions of scientific journals. Given the lack of global download statistics, the authors propose the application of social bookmarking data to journal evaluation. For a set of 45 physics journals, all 13,608 bookmarks from CiteULike, Connotea and BibSonomy to documents published between 2004 and 2008 were analyzed. This article explores bookmarking data in STM and examines to what extent it can be used to describe the perception of periodicals by their readership. Four basic indicators are defined, which analyze different aspects of usage: Usage Ratio, Usage Diffusion, Article Usage Intensity and Journal Usage Intensity. Tags are analyzed to describe a reader-specific view on journal content.
Heckner, M.; Heilemann, M. & Wolff, C.: Personal Information Management vs. Resource Sharing: Towards a Model of Information Behaviour in Social Tagging Systems. In:
Int'l AAAI Conference on Weblogs and Social Media (ICWSM). San Jose, CA, USA: 2009
[BibTeX]
Noy, N. F.; Chugh, A. & Alani, H.: The CKC Challenge: Exploring Tools for Collaborative Knowledge Construction. In:
IEEE Intelligent Systems 23 (2008), no. 1, pp. 64-68
[Full text]
[Abstract]
[BibTeX]
The great success of Web 2.0 is mainly fuelled by an infrastructure that allows web users to create, share, tag, and connect content and knowledge easily. Tools for developing structured knowledge in this manner have started to appear as well. However, there are few, if any, user studies aimed at understanding what users expect from such tools, what works and what doesn't. We organized the Collaborative Knowledge Construction (CKC) Challenge to assess the state of the art of tools that support collaborative processes for the creation of various forms of structured knowledge. The goal of the Challenge was to get users to try out different tools and to learn what users expect from them: features that users need, features that they like or dislike. The Challenge task was to construct structured knowledge for a portal that would provide information about research. The Challenge design contained several incentives for users to participate. Forty-nine users registered for the Challenge; thirty-three of them participated actively by using the tools. We collected extensive feedback from the users, in which they discussed their thoughts on all the tools they tried. In this paper, we present the results of the Challenge, discuss the features that users expect from tools for collaborative knowledge construction, the features on which Challenge participants disagreed, and the lessons that we learned.