IR H/M Course
The Information Need (IN)
• Searching is motivated by a problematic situation
• The gap between what the user knows and what they want to know is the information need – the Anomalous State of Knowledge (ASK)
• The IN is not static, and develops during the search session as the user learns from interaction
• The transformation of a user’s information need into a query is known as the query formulation process
– One of the most challenging activities in information seeking: amplified if the information need is vague or the user’s knowledge about the collection is poor
Information Need is Dynamic
• It evolves as we interact with a system
• We learn from our search interaction
• Our understanding may improve over a search session
• Our information need changes
[Figure: the information need evolves over the session: IN → IN’ → IN’’ → IN’’’]
Information Need Transformation
[Figure, by Gheorghe Muresan (2012): the real information need, implicit in the mind of the user, is transformed by Perception into a perceived information need, by Expression into a representation in a human language, and by Formalisation into a query in the system language. Surrounding stages: problem definition, source selection, problem articulation, examination of results, extraction of information, integration with the overall task. The information need itself can change.]
Role of User Interfaces in IR
Effective interaction is crucial for a successful experience
Role of Search User Interfaces
• Effective search interfaces should allow users to:
– Recognise what they really need to complete a task
– Express their needs (queries) easily and accurately
– Understand the structure of a search corpus
– Judge the relevance of retrieved documents easily and accurately
Search User Interfaces should allow the users to operate search and manage a search task as efficiently as possible
Supporting Query Input
• The role of search interfaces is to allow users to express their information need as easily and as completely as possible
– Keywords
– Query by Example
– Behavioural Signals
Simple Query Box
Simple. Easy. Everyone can use it. ASK-friendly.
But … (Side Effect)
75% of Web queries are fewer than 3 words. Jansen and Spink (2006)
• Longer Box: People submit longer queries when a search box is longer (Belkin et al. 2003)
• More Boxes: Context box that asks what you know about the search topic (Kelly et al. 2005)
Balance between additional effort and performance gain is a challenge
Query Auto-completion / Suggestion. Google Inc. (2008)
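At its simplest, query auto-completion can be sketched as prefix matching against a log of past queries, ranked by frequency. A minimal illustration (the query log and function name are hypothetical, not a description of Google's actual system):

```python
from collections import Counter

def autocomplete(prefix, query_log, k=3):
    """Suggest up to k past queries that extend `prefix`, most frequent first."""
    counts = Counter(q for q in query_log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

# Hypothetical query log for illustration.
log = ["ir applications", "ir applications", "ir evaluation", "infrared camera"]
print(autocomplete("ir ", log))  # ['ir applications', 'ir evaluation']
```

Production systems additionally personalise and diversify suggestions, but the core remains prefix lookup over logged queries.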
Other Ideas
Ostensive Interaction
Campbell and van Rijsbergen (1996) Urban et al. (2006)
Showing or pointing to an example of what you want, as opposed to verbally describing it
Other Examples of Natural User Interfaces
• Voice input: A user speaks to a device to express their information need (e.g. Siri, Google Voice Search)
• Eye input: A user’s eye movement is tracked and used as an indication of interest (like a mouse cursor)
Garkavijs et al. (2012)
• Facial expression: A user’s facial expression is recorded and analysed to gauge their satisfaction (e.g. identify their favourite part in a video clip)
Joho et al. (2009)
What is Eye-tracking?
• Device to detect and record where and what people look at
• Multiple applications: reading, usability, visual search, in both physical and virtual contexts
[Figure: eye-tracking device; view of subject’s pupil on monitor, used for calibration]
Granka et al. (2004)
Why use Eye-tracking for Information Retrieval?
• Understand how searchers browse online search results
• Suggest ideas for enhanced interface design
• More accurate interpretation of implicit feedback (e.g. clickthrough data)
• More targeted metrics for evaluating retrieval performance
Granka et al. (2004)
Sample Eye-tracking Output
Granka et al. (2004)
Eye-tracking Heatmap over Many Users
https://www.branded3.com/blog/seo-and-eye-tracking/
Eye-tracking of Google Instant
http://www.mediative.com/eye-tracking-google-through-the-years/
Supporting Results Examination
• The search interfaces should allow users to obtain relevant information directly, or to select documents that lead to relevant information
– Explicit & Implicit Relevance Feedback
– Results in Context
– Faceted Search
– Diversification and Aggregation
Relevance Feedback in Operation
Explicit feedback. White et al. (2004)
Problems with (Explicit) Relevance Feedback (1)
• Relevance Feedback systems are recall-dependent
– A small number of retrieved relevant docs can adversely affect the derivation of a new query
• Exploration-Exploitation trade-off
– The balance between users visiting documents to assess relevance because they want to and because they have to
• Visiting documents to assess relevance is a tedious, cumbersome and time-consuming process
Relevance Feedback is a cognitively overloading activity
Problems with Relevance Feedback (2)
• Two problems that are somewhat linked:
– Treats relevance as a binary notion
– Does not handle multi-topic or partially relevant documents
• Relevance is an abstract, intuitive concept that cannot be adequately expressed with ‘yes/no’
Query expansion formulae rely on collapsing complex relevance assessment scales into a binary notion
Recall that the user’s information need is vague and document representation is uncertain
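The classic query expansion formula that collapses feedback into a binary split is Rocchio's. A minimal sketch, with term-weight dicts standing in for document vectors; the alpha/beta/gamma values are the conventional defaults, not parameters fixed by these slides:

```python
def rocchio(query_vec, relevant, non_relevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio query expansion: move the query vector towards the centroid of
    documents judged relevant and away from the non-relevant centroid.
    Vectors are dicts mapping term -> weight."""
    new_q = {}
    for term, w in query_vec.items():
        new_q[term] = new_q.get(term, 0.0) + alpha * w
    for doc in relevant:                      # centroid of relevant docs
        for term, w in doc.items():
            new_q[term] = new_q.get(term, 0.0) + beta * w / len(relevant)
    for doc in non_relevant:                  # centroid of non-relevant docs
        for term, w in doc.items():
            new_q[term] = new_q.get(term, 0.0) - gamma * w / len(non_relevant)
    # Negative weights are usually clipped to zero.
    return {t: w for t, w in new_q.items() if w > 0}
```

Note that each document contributes equally within its set: graded or partial relevance judgments have nowhere to go in this formulation, which is precisely the limitation discussed above.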
Implicit Relevance Feedback
• Implicit Relevance Feedback grew out of the reluctance of users to mark documents as relevant by clicking checkboxes next to document titles
– Some searches are precision-oriented
– Explicit feedback is tedious
• Can we gather feedback without requiring the user to do anything?
• Idea: gather feedback from observed user behaviour
– Attempting to determine what is relevant based on user interaction with a search system and returned documents
Observable Behavior (Behavior Category × Minimum Scope)
• Examine: View, Listen
• Retain: Bookmark, Save, Purchase, Delete
• Reference: Copy/paste, Quote, Forward, Reply, Link, Cite
• Annotate: Rate, Publish
Implicit Relevance Feedback in Search Engines
• Users no longer have to click checkboxes and browse to each document to assess it
• Typically, Web search engines log users’ interaction
– Clickthrough activity; doc viewing time; scrollbar activity; mouse clicks; search session activity; etc.
• Combine and use (mine) these to predict the user’s intention and information need
• Implicit Relevance Feedback is less accurate than explicit RF
– Confounding variables like presentation bias
– Like much in Computing Science it is a trade-off!
– But more useful than pseudo-relevance feedback (aka Blind RF), which contains no evidence of user judgments
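A toy scoring rule illustrates how logged signals such as clicks, dwell time and bookmarks might be mined into a single implicit relevance estimate. All weights, field names and the dwell-time threshold below are illustrative assumptions, not values from the literature:

```python
def implicit_relevance_score(event, min_dwell=30.0):
    """Combine logged behavioural signals into one relevance estimate.
    `event` is a dict of signals from a hypothetical interaction log."""
    score = 0.0
    if event.get("clicked"):
        score += 1.0          # clickthrough: weak positive evidence
    if event.get("dwell_seconds", 0) >= min_dwell:
        score += 1.0          # long viewing time suggests interest
    if event.get("bookmarked"):
        score += 2.0          # retention is stronger evidence than examination
    if event.get("quick_back"):
        score -= 1.0          # immediate return to results: negative evidence
    return score

print(implicit_relevance_score({"clicked": True, "dwell_seconds": 45}))  # 2.0
```

Real systems learn such weights from data (e.g. via learning to rank) and must correct for the presentation bias mentioned above, since top-ranked results attract clicks regardless of relevance.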
Traditional vs. Session-based Retrieval
[Figure: Traditional (1 query): Query = “IR applications”. IR can mean either information retrieval or infrared, so the retrieval system returns a mix from the document collection: D1 (infrared), D2 (infrared), D3 (retrieval), D4 (infrared), D5 (retrieval). Session-based: Query = “IR applications”; previous query = “retrieval systems”; term frequency in viewed docs: infrared: 0, retrieval: 5. The retrieval system now returns D3 (retrieval) … D5 (retrieval).]
Sriram et al. (2004)
Uses more contextual information. Gives more accurate results.
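The session-based example can be sketched as a simple re-ranker that boosts documents containing terms frequent in the documents viewed earlier in the session. The data layout and the 0.1 mixing weight are illustrative assumptions:

```python
def session_rerank(results, session_term_freq):
    """Re-rank results using term frequencies observed in documents the
    user viewed earlier in the session.
    `results`: list of (doc_id, base_score, terms) tuples."""
    def boosted(item):
        doc_id, base, terms = item
        context = sum(session_term_freq.get(t, 0) for t in terms)
        return base + 0.1 * context   # illustrative mixing weight
    return sorted(results, key=boosted, reverse=True)

# "IR applications" after "retrieval systems": viewed docs mention
# "retrieval" 5 times and "infrared" 0 times, so D3 outranks D1.
results = [("D1", 1.0, {"infrared"}), ("D3", 0.9, {"retrieval"})]
ctx = {"infrared": 0, "retrieval": 5}
print([d for d, _, _ in session_rerank(results, ctx)])  # ['D3', 'D1']
```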
TileBars – Hearst (1995)
[Figure: three term sets; a large rectangle indicates a document; click on a tile to see the contents of the document]
Term frequency and distribution information is important for determining relevance.
Results in context: show the user the relationship between the words in the query and the documents retrieved.
http://people.ischool.berkeley.edu/~hearst/research/tilebars.html
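The TileBar visualisation reduces to a small computation: one row per term set, one tile per document segment, where each tile records how many of the set's terms occur in that segment. A sketch, assuming documents arrive pre-segmented into sets of terms:

```python
def tilebar(doc_segments, term_sets):
    """Compute a TileBar grid for one document.
    `doc_segments`: ordered list of term sets, one per text segment.
    Returns one row per query term set; darker tiles = higher counts."""
    grid = []
    for terms in term_sets:
        row = [sum(1 for t in terms if t in seg) for seg in doc_segments]
        grid.append(row)
    return grid

segments = [{"network", "protocol"}, {"network"}, {"cache"}]
print(tilebar(segments, [{"network"}, {"cache", "protocol"}]))
# [[1, 1, 0], [1, 0, 1]]
```

Rendering the grid as shaded tiles next to each result conveys both term frequency and term distribution at a glance, which is the point the slide makes.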
Diversity in Web Search
Aggregated Search Results
Structured Data
Emphasis on mixing different media types. Lalmas et al. (2010), Santos et al. (2015)
Queries are often underspecified
Aggregated Search General Architecture
[Figure: a search interface / portal / broker dispatches the query to multiple source servers (verticals) and merges their results]
Diaz et al. (2010)
Present the user with a summary of search results from one or more resources
Faceted Search Interface
http://flamenco.berkeley.edu/demos.html
Browsing-oriented interface supporting both query formulation and results examination. Tunkelang (2009)
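Faceted search can be sketched as two operations: counting facet values over the current result set (to render the navigation panel) and filtering by a chosen value (to narrow the results). The document fields below are hypothetical:

```python
from collections import Counter

def facet_counts(docs, facet):
    """Count how many result documents carry each value of a facet field."""
    return Counter(d[facet] for d in docs if facet in d)

def apply_facet(docs, facet, value):
    """Narrow the result set to documents matching the chosen facet value."""
    return [d for d in docs if d.get(facet) == value]

docs = [{"title": "A", "media": "image"},
        {"title": "B", "media": "image"},
        {"title": "C", "media": "video"}]
print(facet_counts(docs, "media"))                                # counts per media type
print([d["title"] for d in apply_facet(docs, "media", "image")])  # ['A', 'B']
```

Because counts are recomputed after every selection, the interface supports the query formulation and results examination loop that Tunkelang describes.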
Faceted Search in Action
Faceted Navigation
Saracevic’s Relevance Model
• Algorithmic (System) Relevance: A search engine says this document is relevant
• Topical Relevance: This document is about the topic of the information need (query)
• Cognitive Relevance: This document has information that is new to me or has a suitable level of difficulty to me
• Situational Relevance: This document is suitable for my task (e.g. entertainment, report writing)
• Affective (Motivational) Relevance: I like the design or writing style of this document
Saracevic (1997)
Summary
• Information Retrieval is intrinsically an iterative process
• Effective search interfaces allow users to operate search and manage a search task as efficiently as possible
• A user-oriented evaluation of the system is necessary to gauge the overall performance of an IR system in helping users complete their search tasks