COMP6714: Information Retrieval & Web Search

Introduction to Information Retrieval

Lecture 9: Probabilistic Model & Language Model

Recap of the last lecture
§  Improving search results
    §  Especially for high recall. E.g., searching for aircraft so it matches with plane; thermodynamic with heat
§  Options for improving results…
    §  Global methods
        §  Query expansion
            §  Thesauri
            §  Automatic thesaurus generation
        §  Global indirect relevance feedback
    §  Local methods
        §  Relevance feedback
        §  Pseudo relevance feedback

Probabilistic relevance feedback
§  Rather than reweighting in a vector space…
§  If the user has told us some relevant and some irrelevant documents, then we can proceed to build a probabilistic classifier, such as a Naive Bayes model:
    §  P(tk|R) = |Drk| / |Dr|
    §  P(tk|NR) = |Dnrk| / |Dnr|
    §  tk is a term; Dr is the set of known relevant documents; Drk is the subset that contain tk; Dnr is the set of known irrelevant documents; Dnrk is the subset that contain tk.
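As a minimal sketch, the two estimates above can be computed directly from judged documents. The document sets and terms below are illustrative toy data, not from the slides:

```python
# Naive Bayes term-probability estimates from relevance feedback:
# P(tk|R) = |Drk| / |Dr| and P(tk|NR) = |Dnrk| / |Dnr|.

def nb_term_probs(term, Dr, Dnr):
    """Dr / Dnr: known relevant / irrelevant documents, each a set of terms."""
    p_t_r = sum(1 for d in Dr if term in d) / len(Dr)
    p_t_nr = sum(1 for d in Dnr if term in d) / len(Dnr)
    return p_t_r, p_t_nr

Dr = [{"plane", "engine"}, {"plane", "wing"}]   # judged relevant (toy data)
Dnr = [{"car", "engine"}, {"boat", "sail"}]     # judged irrelevant (toy data)
print(nb_term_probs("plane", Dr, Dnr))   # (1.0, 0.0)
print(nb_term_probs("engine", Dr, Dnr))  # (0.5, 0.5)
```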

Why probabilities in IR?

[Diagram: the user's Information Need is expressed as a Query Representation; Documents are indexed as Document Representations. How to match? The understanding of the user need is uncertain, and whether a document has relevant content is an uncertain guess.]

In traditional IR systems, matching between each document and query is attempted in a semantically imprecise space of index terms.

Probabilities provide a principled foundation for uncertain reasoning. Can we use probabilities to quantify our uncertainties?

Probabilistic IR topics
§  Classical probabilistic retrieval model
    §  Probability ranking principle, etc.
§  (Naïve) Bayesian text categorization
§  Bayesian networks for text retrieval
§  Language model approach to IR
    §  An important emphasis in recent work
§  Probabilistic methods are one of the oldest but also one of the currently hottest topics in IR.
    §  Traditionally: neat ideas, but they've never won on performance. It may be different now.

The document ranking problem
§  We have a collection of documents
§  User issues a query
§  A list of documents needs to be returned
§  Ranking method is the core of an IR system:
    §  In what order do we present documents to the user?
    §  We want the "best" document to be first, second best second, etc.
§  Idea: Rank by probability of relevance of the document w.r.t. the information need
    §  P(relevant|documenti, query)

Recall a few probability basics
§  For events a and b:
§  Bayes' Rule:

    p(a, b) = p(a ∩ b) = p(a|b)p(b) = p(b|a)p(a)

    p(a|b) = p(b|a)p(a) / p(b) = p(b|a)p(a) / Σx∈{a,ā} p(b|x)p(x)

    [p(a|b) is the posterior; p(a) is the prior]

§  Odds:

    O(a) = p(a) / p(ā) = p(a) / (1 – p(a))

The Probability Ranking Principle

"If a reference retrieval system's response to each request is a ranking of the documents in the collection in order of decreasing probability of relevance to the user who submitted the request, where the probabilities are estimated as accurately as possible on the basis of whatever data have been made available to the system for this purpose, the overall effectiveness of the system to its users will be the best that is obtainable on the basis of those data."

§  [1960s/1970s] S. Robertson, W.S. Cooper, M.E. Maron; van Rijsbergen (1979:113); Manning & Schütze (1999:538)

Probability Ranking Principle

Let x be a document in the collection.
Let R represent relevance of a document w.r.t. a given (fixed) query and let NR represent non-relevance.

    p(R|x) = p(x|R)p(R) / p(x)
    p(NR|x) = p(x|NR)p(NR) / p(x)

p(R), p(NR) – prior probability of retrieving a (non-)relevant document
p(x|R), p(x|NR) – probability that if a relevant (non-relevant) document is retrieved, it is x.

Need to find p(R|x) – the probability that a document x is relevant.

    p(R|x) + p(NR|x) = 1

[Notation: R = {0,1} vs. R/NR]

Probability Ranking Principle (PRP)
§  Simple case: no selection costs or other utility concerns that would differentially weight errors
§  Bayes' Optimal Decision Rule
    §  x is relevant iff p(R|x) > p(NR|x)
§  PRP in action: Rank all documents by p(R|x)
§  Theorem:
    §  Using the PRP is optimal, in that it minimizes the loss (Bayes risk) under 1/0 loss
    §  Provable if all probabilities are correct, etc. [e.g., Ripley 1996]

Probability Ranking Principle
§  More complex case: retrieval costs.
    §  Let d be a document
    §  C – cost of retrieval of a relevant document
    §  C′ – cost of retrieval of a non-relevant document
§  Probability Ranking Principle: if

    C · p(R|d) + C′ · (1 – p(R|d)) ≤ C · p(R|d′) + C′ · (1 – p(R|d′))

  for all d′ not yet retrieved, then d is the next document to be retrieved
§  We won't further consider loss/utility from now on

Probability Ranking Principle
§  How do we compute all those probabilities?
    §  We do not know the exact probabilities, so we have to use estimates
    §  Binary Independence Retrieval (BIR) – which we discuss later today – is the simplest model
§  Questionable assumptions
    §  "Relevance" of each document is independent of the relevance of other documents.
        §  Really, it's bad to keep on returning duplicates
    §  Boolean model of relevance
        §  That one has a single-step information need
            §  Seeing a range of results might let the user refine the query

Probabilistic Retrieval Strategy
§  Estimate how terms contribute to relevance
    §  How do things like tf, df, and document length influence your judgments about document relevance?
    §  One answer is the Okapi formulae (S. Robertson)
§  Combine to find the document relevance probability
§  Order documents by decreasing probability

Probabilistic Ranking

Basic concept:

"For a given query, if we know some documents that are relevant, terms that occur in those documents should be given greater weighting in searching for other relevant documents.

By making assumptions about the distribution of terms and applying Bayes' Theorem, it is possible to derive weights theoretically."

Van Rijsbergen

Binary Independence Model
§  Traditionally used in conjunction with the PRP
§  "Binary" = Boolean: documents are represented as binary incidence vectors of terms (cf. lecture 1):
    §  x = (x1, …, xn)
    §  xi = 1 iff term i is present in document x.
§  "Independence": terms occur in documents independently
    §  Different documents can be modeled as the same vector
§  Bernoulli Naive Bayes model (cf. text categorization!)

Binary Independence Model
§  Queries: binary term incidence vectors
§  Given query q,
    §  for each document d we need to compute p(R|q,d).
    §  replace with computing p(R|q,x), where x is the binary term incidence vector representing d. Interested only in ranking.
§  Will use odds and Bayes' Rule:

    O(R|q,x) = p(R|q,x) / p(NR|q,x)
             = [p(R|q) p(x|R,q) / p(x|q)] / [p(NR|q) p(x|NR,q) / p(x|q)]
             = [p(R|q) / p(NR|q)] · [p(x|R,q) / p(x|NR,q)]

Binary Independence Model

•  Using the independence assumption:

    p(x|R,q) / p(x|NR,q) = ∏i=1..n [p(xi|R,q) / p(xi|NR,q)]

•  So:

    O(R|q,d) = O(R|q) · ∏i=1..n [p(xi|R,q) / p(xi|NR,q)]

[O(R|q) is constant for a given query; the product needs estimation.]

Binary Independence Model

    O(R|q,d) = O(R|q) · ∏i=1..n [p(xi|R,q) / p(xi|NR,q)]

•  Since xi is either 0 or 1:

    O(R|q,d) = O(R|q) · ∏xi=1 [p(xi=1|R,q) / p(xi=1|NR,q)] · ∏xi=0 [p(xi=0|R,q) / p(xi=0|NR,q)]

•  Let pi = p(xi=1|R,q) and ri = p(xi=1|NR,q)
•  Assume, for all terms not occurring in the query (qi=0), pi = ri
   [This can be changed, e.g., in relevance feedback.]

Then…

Binary Independence Model

    O(R|q,x) = O(R|q) · ∏xi=1,qi=1 (pi / ri) · ∏xi=0,qi=1 [(1 – pi) / (1 – ri)]
               [all matching terms · non-matching query terms]

             = O(R|q) · ∏xi=qi=1 [pi(1 – ri) / (ri(1 – pi))] · ∏qi=1 [(1 – pi) / (1 – ri)]
               [all matching terms · all query terms]

Binary Independence Model

    O(R|q,x) = O(R|q) · ∏xi=qi=1 [pi(1 – ri) / (ri(1 – pi))] · ∏qi=1 [(1 – pi) / (1 – ri)]

[The last product is constant for each query; the first product is the only quantity to be estimated for ranking.]

•  Retrieval Status Value:

    RSV = log ∏xi=qi=1 [pi(1 – ri) / (ri(1 – pi))] = Σxi=qi=1 log [pi(1 – ri) / (ri(1 – pi))]
        = Σxi=qi=1 (log odds(pi) – log odds(ri)) = Σxi=qi=1 (logit(pi) – logit(ri))

Binary Independence Model

•  All boils down to computing RSV:

    RSV = log ∏xi=qi=1 [pi(1 – ri) / (ri(1 – pi))] = Σxi=qi=1 log [pi(1 – ri) / (ri(1 – pi))]

    RSV = Σxi=qi=1 ci;   ci = log [pi(1 – ri) / (ri(1 – pi))] = logit(pi) – logit(ri)

So, how do we compute the ci's from our data?

Binary Independence Model
•  Estimating RSV coefficients.
•  For each term i look at this table of document counts:

    Documents   Relevant    Non-Relevant       Total
    Xi = 1      s           n – s              n
    Xi = 0      S – s       N – n – S + s      N – n
    Total       S           N – S              N

•  Estimates:

    pi ≈ s / S
    ri ≈ (n – s) / (N – S)
    ci ≈ K(N, n, S, s) = log [s / (S – s)] / [(n – s) / (N – n – S + s)]

However, these estimates could be 0.

Add ½ Smoothing
•  Add ½ to each of the center four cells.

    Documents   Relevant    Non-Relevant           Total
    Xi = 1      s + ½       n – s + ½              n + 1
    Xi = 0      S – s + ½   N – n – S + s + ½      N – n + 1
    Total       S + 1       N – S + 1              N + 2

    ci ≈ K(N, n, S, s) = log [(s + ½) / (S – s + ½)] / [(n – s + ½) / (N – n – S + s + ½)]
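The smoothed coefficient is a one-liner; a minimal sketch (the function name is mine, the formula is the slide's):

```python
import math

def c_i(N, n, S, s):
    """Add-1/2 smoothed term weight: N docs in total, n containing term i,
    S relevant docs, s relevant docs containing term i."""
    return math.log(((s + 0.5) / (S - s + 0.5)) /
                    ((n - s + 0.5) / (N - n - S + s + 0.5)))

# With N=5, n=3, S=3, s=2 (term x1 in the example on the next slides):
print(c_i(5, 3, 3, 2))  # log(5/3) ≈ 0.51
```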

Example /1

§  Query = {x1, x2}
§  O(R=1|D3, q)

    Doc   Judgment   x1   x2   x3
    D1    R          1    1    1
    D2    R          0    0    1
    D3    R          1    0    0
    D4    NR         1    0    1
    D5    NR         0    1    1

Example /2

§  Estimate pi and ri

    Doc   Judgment   x1   x2   x3
    D1    R          1    1    1
    D2    R          0    0    1
    D3    R          1    0    0
    D4    NR         1    0    1
    D5    NR         0    1    1
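The unsmoothed estimates for this table (pi = s/S and ri = (n – s)/(N – S)) can be reproduced directly:

```python
docs = {  # doc -> (judgment, x1, x2, x3), copied from the table above
    "D1": ("R", 1, 1, 1), "D2": ("R", 0, 0, 1), "D3": ("R", 1, 0, 0),
    "D4": ("NR", 1, 0, 1), "D5": ("NR", 0, 1, 1),
}
rel = [v for v in docs.values() if v[0] == "R"]
nonrel = [v for v in docs.values() if v[0] == "NR"]

def p_and_r(i):
    """Unsmoothed estimates for term x_i (i = 1, 2, 3)."""
    p = sum(d[i] for d in rel) / len(rel)        # p_i = s / S
    r = sum(d[i] for d in nonrel) / len(nonrel)  # r_i = (n - s) / (N - S)
    return p, r

print(p_and_r(1))  # x1: (2/3, 1/2)
print(p_and_r(2))  # x2: (1/3, 1/2)
```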

Estimation – key challenge
§  If non-relevant documents are approximated by the whole collection, then ri (prob. of occurrence in non-relevant documents for the query) is n/N and
    §  log (1 – ri)/ri = log (N – n)/n ≈ log N/n = IDF!
§  pi (probability of occurrence in relevant documents) can be estimated in various ways:
    §  from relevant documents, if we know some
        §  Relevance weighting can be used in a feedback loop
    §  constant (Croft and Harper combination match – 0.5) – then we just get idf weighting of terms
    §  proportional to the probability of occurrence in the collection
        §  more accurately, to the log of this (Greiff, SIGIR 1998)

In fact, tf-idf can be deemed as the cross-entropy.

Iteratively estimating pi (optional)
1.  Assume that pi is constant over all xi in the query
    §  pi = 0.5 (even odds) for any given doc
2.  Determine a guess of the relevant document set:
    §  V is a fixed-size set of the highest-ranked documents on this model (note: now a bit like tf.idf!)
3.  We need to improve our guesses for pi and ri, so
    §  Use the distribution of xi in the docs in V. Let Vi be the set of documents containing xi
        §  pi = |Vi| / |V|
    §  Assume that if not retrieved then not relevant
        §  ri = (ni – |Vi|) / (N – |V|)
4.  Go to 2 until convergence, then return the ranking
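The loop above can be sketched as follows. The toy collection, the size of V, the add-½ smoothing inside the loop (to keep the log defined), and the convergence test are my illustrative choices, not prescribed by the slide:

```python
import math

docs = {"D1": {"a", "b"}, "D2": {"a"}, "D3": {"b", "c"}, "D4": {"c"}}
query = ["a", "b"]
N = len(docs)
n = {t: sum(1 for d in docs.values() if t in d) for t in query}
V_SIZE = 2

p = {t: 0.5 for t in query}       # step 1: even odds
r = {t: n[t] / N for t in query}  # start r_i from collection statistics
for _ in range(20):
    def rsv(d):                   # rank by the sum of c_i over matching terms
        return sum(math.log(p[t] * (1 - r[t]) / (r[t] * (1 - p[t])))
                   for t in query if t in docs[d])
    V = sorted(docs, key=rsv, reverse=True)[:V_SIZE]    # step 2: guessed relevant set
    Vi = {t: sum(1 for d in V if t in docs[d]) for t in query}
    new_p = {t: (Vi[t] + 0.5) / (len(V) + 1) for t in query}              # step 3
    new_r = {t: (n[t] - Vi[t] + 0.5) / (N - len(V) + 1) for t in query}
    if new_p == p and new_r == r:                       # step 4: iterate to convergence
        break
    p, r = new_p, new_r
print(p)
```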

Probabilistic Relevance Feedback (optional)
1.  Guess a preliminary probabilistic description of R and use it to retrieve a first set of documents V, as above.
2.  Interact with the user to refine the description: learn some definite members of R and NR
3.  Re-estimate pi and ri on the basis of these
    §  Or can combine the new information with the original guess (use a Bayesian prior):

        pi(2) = (|Vi| + κ·pi(1)) / (|V| + κ)        [κ is the prior weight]

4.  Repeat, thus generating a succession of approximations to R.

PRP and BIR
§  Getting reasonable approximations of probabilities is possible.
§  Requires restrictive assumptions:
    §  term independence
    §  terms not in the query don't affect the outcome
    §  boolean representation of documents/queries/relevance
    §  document relevance values are independent
§  Some of these assumptions can be removed
    §  Problem: either requires partial relevance information or can only derive somewhat inferior term weights

Okapi BM25
§  Heuristically extend the BIR to include information about term frequencies, document length, etc.
§  Typically,

    RSVd = Σt∈q [ log(N/dft) · ((k1 + 1)·tft,d) / (k1·((1 – b) + b·(Ld/Lave)) + tft,d) · ((k3 + 1)·tft,q) / (k3 + tft,q) ]

    §  log(N/dft): idf
    §  second factor: normalized term frequency (doc)
    §  third factor: normalized term frequency (query); caps the contribution of tf
    §  k1, k3 ∈ [1.2, 2.0], b = 0.75
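The formula transcribes directly into code. A minimal sketch with tokenized toy documents (parameter defaults follow the slide's ranges):

```python
import math

def bm25(query, doc, docs, k1=1.2, k3=1.5, b=0.75):
    """Score one doc: sum over query terms of idf * doc-tf part * query-tf part."""
    N = len(docs)
    L_ave = sum(len(d) for d in docs) / N
    score = 0.0
    for t in set(query):
        df = sum(1 for d in docs if t in d)
        if df == 0:
            continue
        tf_d, tf_q = doc.count(t), query.count(t)
        idf = math.log(N / df)
        doc_part = (k1 + 1) * tf_d / (k1 * ((1 - b) + b * len(doc) / L_ave) + tf_d)
        query_part = (k3 + 1) * tf_q / (k3 + tf_q)
        score += idf * doc_part * query_part
    return score

d1 = ["xerox", "reports", "a", "profit"]
d2 = ["lucent", "reports", "a", "loss"]
print(bm25(["loss"], d2, [d1, d2]) > bm25(["loss"], d1, [d1, d2]))  # True
```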

Good and Bad News
§  Standard Vector Space Model
    §  Empirical for the most part; success measured by results
    §  Few properties provable
§  Probabilistic Model Advantages
    §  Based on a firm theoretical foundation
    §  Theoretically justified optimal ranking scheme
§  Disadvantages
    §  Making the initial guess to get V
    §  Binary word-in-doc weights (not using term frequencies)
    §  Independence of terms (can be alleviated)
    §  Amount of computation
    §  Has never worked convincingly better in practice

Resources
S. E. Robertson and K. Spärck Jones. 1976. Relevance Weighting of Search Terms. Journal of the American Society for Information Sciences 27(3): 129–146.
C. J. van Rijsbergen. 1979. Information Retrieval. 2nd ed. London: Butterworths, chapter 6. [Most details of the math] http://www.dcs.gla.ac.uk/Keith/Preface.html
N. Fuhr. 1992. Probabilistic Models in Information Retrieval. The Computer Journal 35(3): 243–255. [Easiest read, with BNs]
F. Crestani, M. Lalmas, C. J. van Rijsbergen, and I. Campbell. 1998. Is This Document Relevant? … Probably: A Survey of Probabilistic Models in Information Retrieval. ACM Computing Surveys 30(4): 528–552. http://www.acm.org/pubs/citations/journals/surveys/1998-30-4/p528-crestani/ [Adds very little material that isn't in van Rijsbergen or Fuhr]

Resources
H.R. Turtle and W.B. Croft. 1990. Inference Networks for Document Retrieval. Proc. ACM SIGIR: 1-24.
E. Charniak. 1991. Bayesian nets without tears. AI Magazine 12(4): 50-63. http://www.aaai.org/Library/Magazine/Vol12/12-04/vol12-04.html
D. Heckerman. 1995. A Tutorial on Learning with Bayesian Networks. Microsoft Technical Report MSR-TR-95-06. http://www.research.microsoft.com/~heckerman/
N. Fuhr. 2000. Probabilistic Datalog: Implementing Logical Information Retrieval for Advanced Applications. Journal of the American Society for Information Science 51(2): 95–110.
R. K. Belew. 2001. Finding Out About: A Cognitive Perspective on Search Engine Technology and the WWW. Cambridge UP.

MIR 2.5.4, 2.8

LANGUAGE MODEL

Today
§  The Language Model Approach to IR
    §  Basic query generation model
    §  Alternative models

Standard Probabilistic IR

[Diagram: an Information Need is expressed as a query, which is matched against documents d1, d2, …, dn from the document collection, scoring P(R|Q,d).]

IR based on Language Model (LM)

[Diagram: an Information Need is expressed as a query; each document di in the collection induces a model Mdi, and documents are scored by the generation probability P(Q|Md).]

§  A common search heuristic is to use words that you expect to find in matching documents as your query – why, I saw Sergey Brin advocating that strategy on late-night TV one night in my hotel room, so it must be good!
§  The LM approach directly exploits that idea!
    §  See later slides for a more formal justification

Formal Language (Model)
§  Traditional generative model: generates strings
    §  Finite state machines or regular grammars, etc.
§  Example ("I wish" machine):

    I wish
    I wish I wish
    I wish I wish I wish
    I wish I wish I wish I wish
    …

Stochastic Language Models
§  Model the probability of generating strings in the language (commonly all strings over the alphabet ∑)

Model M:
    the 0.2; a 0.1; man 0.01; woman 0.01; said 0.03; likes 0.02; …

s:  the    man    likes   the    woman
    0.2    0.01   0.02    0.2    0.01    (multiply)

P(s | M) = 0.00000008

Stochastic Language Models
§  Model the probability of generating any string

Model M1:
    the 0.2; class 0.01; sayst 0.0001; pleaseth 0.0001; yon 0.0001; maiden 0.0005; woman 0.01
Model M2:
    the 0.2; class 0.0001; sayst 0.03; pleaseth 0.02; yon 0.1; maiden 0.01; woman 0.0001

s:   maiden   class    pleaseth   yon      the
M1:  0.0005   0.01     0.0001     0.0001   0.2
M2:  0.01     0.0001   0.02       0.1      0.2

P(s|M2) > P(s|M1)

Stochastic Language Models
§  A statistical model M for generating text
    §  Probability distribution over strings in a given language

[Diagram: by the chain rule, P(w1 w2 w3 w4 | M) = P(w1 | M) · P(w2 | M, w1) · P(w3 | M, w1 w2) · P(w4 | M, w1 w2 w3), with the words shown as boxes.]

Unigram and higher-order models
§  Chain rule: P(w1 w2 w3 w4) = P(w1) P(w2|w1) P(w3|w1 w2) P(w4|w1 w2 w3)
§  Unigram Language Models: P(w1) P(w2) P(w3) P(w4). Easy. Effective!
§  Bigram (generally, n-gram) Language Models: P(w1) P(w2|w1) P(w3|w2) P(w4|w3)
§  Other Language Models
    §  Grammar-based models (PCFGs), etc.
        §  Probably not the first thing to try in IR

Using Language Models in IR
§  Treat each document as the basis for a model (e.g., unigram sufficient statistics)
§  Rank document d based on P(d | q)
§  P(d | q) = P(q | d) × P(d) / P(q)
    §  P(q) is the same for all documents, so ignore it
    §  P(d) [the prior] is often treated as the same for all d
        §  But we could use criteria like authority, length, genre
    §  P(q | d) is the probability of q given d's model
§  Very general formal approach

The fundamental problem of LMs
§  Usually we don't know the model M
    §  But we have a sample of text representative of that model
§  Estimate a language model from a sample
§  Then compute the observation probability

[Diagram: P(query | M(doc)) – estimate the model M from the doc, then compute the probability of observing the query.]

Language Models for IR
§  Language Modeling Approaches
    §  Attempt to model the query generation process
    §  Documents are ranked by the probability that a query would be observed as a random sample from the respective document model
        §  Multinomial approach

Retrieval based on probabilistic LM
§  Treat the generation of queries as a random process.
§  Approach
    §  Infer a language model for each document.
    §  Estimate the probability of generating the query according to each of these models.
    §  Rank the documents according to these probabilities.
    §  Usually a unigram estimate of words is used
        §  Some work on bigrams, paralleling van Rijsbergen

Retrieval based on probabilistic LM
§  Intuition
    §  Users …
        §  Have a reasonable idea of terms that are likely to occur in documents of interest.
        §  They will choose query terms that distinguish these documents from others in the collection.
    §  Collection statistics …
        §  Are integral parts of the language model.
        §  Are not used heuristically as in many other approaches.
        §  In theory. In practice, there's usually some wiggle room for empirically set parameters

Query generation probability (1)
§  Ranking formula:

    p(Q, d) = p(d) p(Q|d) ≈ p(d) p(Q|Md)

§  The probability of producing the query given the language model of document d using MLE is:

    p̂(Q|Md) = ∏t∈Q p̂ml(t|Md) = ∏t∈Q tf(t,d) / dld

    Md: language model of document d
    tf(t,d): raw tf of term t in document d
    dld: total number of tokens in document d

Unigram assumption: given a particular language model, the query terms occur independently.
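A minimal MLE sketch (using the Xerox document from the later example slide), which also exposes the zero-probability problem discussed next:

```python
def p_query_mle(query, doc):
    """p(Q|M_d) = product over t in Q of tf(t,d) / dl_d."""
    p = 1.0
    for t in query:
        p *= doc.count(t) / len(doc)
    return p

d1 = "xerox reports a profit but revenue is down".split()
print(p_query_mle(["revenue", "down"], d1))  # (1/8)·(1/8) = 0.015625
print(p_query_mle(["revenue", "up"], d1))    # 0.0 – the zero-probability problem
```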

Insufficient data
§  Zero probability: p(t|Md) = 0
    §  May not wish to assign a probability of zero to a document that is missing one or more of the query terms [gives conjunction semantics]
§  General approach
    §  A non-occurring term is possible, but no more likely than would be expected by chance in the collection.
    §  If tf(t,d) = 0,

        p(t|Md) = cft / cs

    cft: raw count of term t in the collection
    cs: raw collection size (total number of tokens in the collection)

Insufficient data
§  Zero probabilities spell disaster
    §  We need to smooth probabilities
        §  Discount nonzero probabilities
        §  Give some probability mass to unseen things
§  There's a wide space of approaches to smoothing probability distributions to deal with this problem, such as adding 1, ½ or ε to counts, Dirichlet priors, discounting, and interpolation
    §  [See FSNLP ch. 6 or CS224N if you want more]
§  A simple idea that works well in practice is to use a mixture between the document multinomial and the collection multinomial distribution

Mixture model
§  Jelinek-Mercer method
    §  P(w|d) = λPmle(w|Md) + (1 – λ)Pmle(w|Mc)
    §  Mixes the probability from the document with the general collection frequency of the word.
§  Correctly setting λ is very important
    §  A high value of lambda makes the search "conjunctive-like" – suitable for short queries
    §  A low value is more suitable for long queries
§  Can tune λ to optimize performance
    §  Perhaps make it dependent on document size (cf. Dirichlet prior or Witten-Bell smoothing)

Basic mixture model summary
§  General formulation of the LM for IR

    p(Q, d) = p(d) ∏t∈Q ((1 – λ)p(t) + λ p(t|Md))

    [(1 – λ)p(t): collection/background language model; λ p(t|Md): individual-document model]

§  The user has a document in mind, and generates the query from this document.
§  The equation represents the probability that the document that the user had in mind was in fact this one.

Relationship to idf

[Slide annotations on the smoothed query-likelihood derivation: fqi,D = 0 marks a query word that does not occur in the doc; the score adds contributions from i with fqi,D > 0; one part becomes a constant given Q and C; the remaining part is proportional to tf and inversely proportional to cf. Note: here (i.e., in [CMS09]) λ is multiplied with the background model.]

Example
§  Document collection (2 documents)
    §  d1: Xerox reports a profit but revenue is down
    §  d2: Lucent narrows quarter loss but revenue decreases further
§  Model: MLE unigram from documents; λ = ½
§  Query: revenue down
    §  P(Q|d1) = [(1/8 + 2/16)/2] × [(1/8 + 1/16)/2] = 1/8 × 3/32 = 3/256
    §  P(Q|d2) = [(1/8 + 2/16)/2] × [(0 + 1/16)/2] = 1/8 × 1/32 = 1/256
§  Ranking: d1 > d2
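The slide's numbers can be verified with a small Jelinek-Mercer implementation (λ multiplies the document model here, matching the example's arithmetic):

```python
def p_query_jm(query, doc, collection, lam=0.5):
    """P(Q|d) = prod over t in Q of [ lam*Pmle(t|Md) + (1-lam)*Pmle(t|Mc) ]."""
    cs = sum(len(d) for d in collection)  # total tokens in the collection
    p = 1.0
    for t in query:
        p_doc = doc.count(t) / len(doc)
        p_col = sum(d.count(t) for d in collection) / cs
        p *= lam * p_doc + (1 - lam) * p_col
    return p

d1 = "xerox reports a profit but revenue is down".split()
d2 = "lucent narrows quarter loss but revenue decreases further".split()
print(p_query_jm(["revenue", "down"], d1, [d1, d2]))  # 3/256 = 0.01171875
print(p_query_jm(["revenue", "down"], d2, [d1, d2]))  # 1/256 = 0.00390625
```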

Ponte and Croft Experiments
§  Data
    §  TREC topics 202-250 on TREC disks 2 and 3
        §  Natural language queries consisting of one sentence each
    §  TREC topics 51-100 on TREC disk 3 using the concept fields
        §  Lists of good terms

Number: 054

Domain: International Economics

Topic: Satellite Launch Contracts

Description: …

Concept(s):
1.  Contract, agreement
2.  Launch vehicle, rocket, payload, satellite
3.  Launch services, …

Precision/recall results 202-250

[Figure: precision/recall results for topics 202-250.]

Precision/recall results 51-100

[Figure: precision/recall results for topics 51-100.]

Language models: pro & con
§  Novel way of looking at the problem of text retrieval based on probabilistic language modeling
    §  Conceptually simple and explanatory
    §  Formal mathematical model
    §  Natural use of collection statistics, not heuristics (almost…)
§  LMs provide effective retrieval and can be improved to the extent that the following conditions can be met
    §  Our language models are accurate representations of the data.
    §  Users have some sense of term distribution.*
        §  *Or we get more sophisticated with a translation model

Comparison With Vector Space
§  There's some relation to traditional tf.idf models:
    §  (unscaled) term frequency is directly in the model
    §  the probabilities do length normalization of term frequencies
    §  the effect of doing a mixture with overall collection frequencies is a little like idf: terms rare in the general collection but common in some documents will have a greater influence on the ranking

Comparison With Vector Space
§  Similar in some ways
    §  Term weights based on frequency
    §  Terms often used as if they were independent
    §  Inverse document/collection frequency used
    §  Some form of length normalization useful
§  Different in others
    §  Based on probability rather than similarity
        §  Intuitions are probabilistic rather than geometric
    §  Details of use of document length and term, document, and collection frequency differ

Resources
J.M. Ponte and W.B. Croft. 1998. A language modelling approach to information retrieval. In SIGIR 21.
D. Hiemstra. 1998. A linguistically motivated probabilistic model of information retrieval. ECDL 2, pp. 569–584.
A. Berger and J. Lafferty. 1999. Information retrieval as statistical translation. SIGIR 22, pp. 222–229.
D.R.H. Miller, T. Leek, and R.M. Schwartz. 1999. A hidden Markov model information retrieval system. SIGIR 22, pp. 214–221.
[Several relevant newer papers at SIGIR 23–25, 2000–2002.]
Workshop on Language Modeling and Information Retrieval, CMU 2001. http://la.lti.cs.cmu.edu/callan/Workshops/lmir01/
The Lemur Toolkit for Language Modeling and Information Retrieval. http://www-2.cs.cmu.edu/~lemur/ . CMU/UMass LM and IR system in C(++), currently actively developed.
href="mailto:powcoder@163.com">powcoder@163.com</a></strong></li> </ul> <ul> <li><strong>请加微信或QQ发要求</strong></li> <li><strong>Contact me through WeChat</strong></li> </ul> </div></aside><aside id="categories-2" class="widget widget_categories"><h2 class="widget-title">Categories</h2><nav aria-label="Categories"> <ul> <li class="cat-item cat-item-245"><a href="https://powcoder.com/category/machine-learning/">机器学习代写代考 machine learning</a> </li> <li class="cat-item cat-item-242"><a href="https://powcoder.com/category/database-db-sql/">数据库代写代考 DB Database SQL</a> </li> <li class="cat-item cat-item-244"><a href="https://powcoder.com/category/data-structure-algorithm/">数据结构算法代写代考 data structure algorithm</a> </li> <li class="cat-item cat-item-239"><a href="https://powcoder.com/category/%e4%ba%ba%e5%b7%a5%e6%99%ba%e8%83%bd-ai-artificial-intelligence/">人工智能 AI Artificial Intelligence</a> </li> <li class="cat-item cat-item-247"><a href="https://powcoder.com/category/compiler/">编译器原理 Compiler</a> </li> <li class="cat-item cat-item-254"><a href="https://powcoder.com/category/network-socket/">计算机网络 套接字编程 computer network socket programming</a> </li> <li class="cat-item cat-item-240"><a href="https://powcoder.com/category/hadoop-map-reduce-spark-hbase/">大数据 Hadoop Map Reduce Spark HBase</a> </li> <li class="cat-item cat-item-241"><a href="https://powcoder.com/category/%e6%93%8d%e4%bd%9c%e7%b3%bb%e7%bb%9fosoperating-system/">操作系统OS代写代考 (Operating System)</a> </li> <li class="cat-item cat-item-250"><a href="https://powcoder.com/category/computer-architecture/">计算机体系结构代写代考 Computer Architecture</a> </li> <li class="cat-item cat-item-251"><a href="https://powcoder.com/category/computer-graphics-opengl-webgl/">计算机图形学 Computer Graphics opengl webgl</a> </li> <li class="cat-item cat-item-249"><a href="https://powcoder.com/category/nlp/">自然语言处理 NLP natural language processing</a> </li> <li class="cat-item cat-item-383"><a 
href="https://powcoder.com/category/%e5%b9%b6%e8%a1%8c%e8%ae%a1%e7%ae%97/">并行计算</a> </li> <li class="cat-item cat-item-253"><a href="https://powcoder.com/category/computation-theory/">计算理论 Theory of Computation</a> </li> <li class="cat-item cat-item-252"><a href="https://powcoder.com/category/computer-security/">计算机安全密码学computer security cryptography</a> </li> <li class="cat-item cat-item-246"><a href="https://powcoder.com/category/sys-programming/">系统编程 System programming</a> </li> <li class="cat-item cat-item-367"><a href="https://powcoder.com/category/%e6%95%b0%e5%80%bc%e7%a7%91%e5%ad%a6%e8%ae%a1%e7%ae%97/">数值科学计算</a> </li> <li class="cat-item cat-item-255"><a href="https://powcoder.com/category/%e8%ae%a1%e7%ae%97%e6%9c%ba%e8%a7%86%e8%a7%89compute-vision/">计算机视觉代写代考(Compute Vision)</a> </li> <li class="cat-item cat-item-248"><a href="https://powcoder.com/category/web/">网页应用 Web Application</a> </li> <li class="cat-item cat-item-401"><a href="https://powcoder.com/category/%e5%88%86%e5%b8%83%e5%bc%8f%e7%b3%bb%e7%bb%9f/">分布式系统</a> </li> <li class="cat-item cat-item-640"><a href="https://powcoder.com/category/%e7%ac%94%e8%af%95%e9%9d%a2%e8%af%95/">笔试面试</a> </li> <li class="cat-item cat-item-403"><a href="https://powcoder.com/category/%e5%87%bd%e6%95%b0%e5%bc%8f%e7%bc%96%e7%a8%8b/">函数式编程</a> </li> <li class="cat-item cat-item-243"><a href="https://powcoder.com/category/%e6%95%b0%e6%8d%ae%e6%8c%96%e6%8e%98-data-mining/">数据挖掘 Data Mining</a> </li> <li class="cat-item cat-item-364"><a href="https://powcoder.com/category/%e7%a6%bb%e6%95%a3%e6%95%b0%e5%ad%a6/">离散数学代写代考 (Discrete mathematics)</a> </li> <li class="cat-item cat-item-384"><a href="https://powcoder.com/category/%e8%bd%af%e4%bb%b6%e5%b7%a5%e7%a8%8b/">软件工程</a> </li> <li class="cat-item cat-item-551"><a href="https://powcoder.com/category/%e7%bc%96%e7%a8%8b%e8%af%ad%e8%a8%80-programming-language/">编程语言 Programming Language</a> </li> <li class="cat-item cat-item-594"><a 
href="https://powcoder.com/category/%e7%bb%9f%e8%ae%a1%e4%bb%a3%e5%86%99%e4%bb%a3%e8%80%83/">统计代写代考</a> </li> <li class="cat-item cat-item-574"><a href="https://powcoder.com/category/%e8%bf%90%e7%ad%b9%e5%ad%a6-operation-research/">运筹学 Operation Research</a> </li> </ul> </nav></aside><aside id="tag_cloud-3" class="widget widget_tag_cloud"><h2 class="widget-title">Tag</h2><nav aria-label="Tag"><div class="tagcloud"><a href="https://powcoder.com/tag/algorithm/" class="tag-cloud-link tag-link-469 tag-link-position-1" style="font-size: 18px;" aria-label="Algorithm算法代写代考 (15,142 items)">Algorithm算法代写代考</a><a href="https://powcoder.com/tag/java/" class="tag-cloud-link tag-link-298 tag-link-position-2" style="font-size: 16.91156462585px;" aria-label="Java代写代考 (7,269 items)">Java代写代考</a><a href="https://powcoder.com/tag/database/" class="tag-cloud-link tag-link-414 tag-link-position-3" style="font-size: 16.503401360544px;" aria-label="database (5,442 items)">database</a><a href="https://powcoder.com/tag/data-structure/" class="tag-cloud-link tag-link-501 tag-link-position-4" style="font-size: 16.401360544218px;" aria-label="data structure (5,184 items)">data structure</a><a href="https://powcoder.com/tag/python/" class="tag-cloud-link tag-link-331 tag-link-position-5" style="font-size: 16.299319727891px;" aria-label="Python代写代考 (4,806 items)">Python代写代考</a><a href="https://powcoder.com/tag/compiler/" class="tag-cloud-link tag-link-472 tag-link-position-6" style="font-size: 16.027210884354px;" aria-label="compiler (3,999 items)">compiler</a><a href="https://powcoder.com/tag/scheme/" class="tag-cloud-link tag-link-338 tag-link-position-7" style="font-size: 15.823129251701px;" aria-label="Scheme代写代考 (3,502 items)">Scheme代写代考</a><a href="https://powcoder.com/tag/c-4/" class="tag-cloud-link tag-link-499 tag-link-position-8" style="font-size: 15.823129251701px;" aria-label="C语言代写 (3,489 items)">C语言代写</a><a href="https://powcoder.com/tag/ai/" class="tag-cloud-link tag-link-369 
tag-link-position-9" style="font-size: 15.142857142857px;" aria-label="AI代写 (2,211 items)">AI代写</a><a href="https://powcoder.com/tag/c-3/" class="tag-cloud-link tag-link-491 tag-link-position-10" style="font-size: 14.700680272109px;" aria-label="c++代写 (1,633 items)">c++代写</a><a href="https://powcoder.com/tag/sql/" class="tag-cloud-link tag-link-395 tag-link-position-11" style="font-size: 14.530612244898px;" aria-label="SQL代写代考 (1,457 items)">SQL代写代考</a><a href="https://powcoder.com/tag/haskell/" class="tag-cloud-link tag-link-291 tag-link-position-12" style="font-size: 14.530612244898px;" aria-label="Haskell代写代考 (1,453 items)">Haskell代写代考</a><a href="https://powcoder.com/tag/javascript/" class="tag-cloud-link tag-link-299 tag-link-position-13" style="font-size: 14.462585034014px;" aria-label="javascript (1,395 items)">javascript</a><a href="https://powcoder.com/tag/concurrency/" class="tag-cloud-link tag-link-503 tag-link-position-14" style="font-size: 14.428571428571px;" aria-label="concurrency (1,355 items)">concurrency</a><a href="https://powcoder.com/tag/matlab/" class="tag-cloud-link tag-link-309 tag-link-position-15" style="font-size: 14.360544217687px;" aria-label="matlab代写代考 (1,281 items)">matlab代写代考</a><a href="https://powcoder.com/tag/finance/" class="tag-cloud-link tag-link-282 tag-link-position-16" style="font-size: 14.292517006803px;" aria-label="finance (1,221 items)">finance</a><a href="https://powcoder.com/tag/interpreter/" class="tag-cloud-link tag-link-297 tag-link-position-17" style="font-size: 14.190476190476px;" aria-label="interpreter (1,144 items)">interpreter</a><a href="https://powcoder.com/tag/mips/" class="tag-cloud-link tag-link-313 tag-link-position-18" style="font-size: 14.156462585034px;" aria-label="MIPS汇编代写代考 (1,134 items)">MIPS汇编代写代考</a><a href="https://powcoder.com/tag/data-mining/" class="tag-cloud-link tag-link-271 tag-link-position-19" style="font-size: 13.986394557823px;" aria-label="data mining (990 items)">data mining</a><a 
href="https://powcoder.com/tag/decision-tree/" class="tag-cloud-link tag-link-273 tag-link-position-20" style="font-size: 13.952380952381px;" aria-label="decision tree (982 items)">decision tree</a><a href="https://powcoder.com/tag/deep-learning/" class="tag-cloud-link tag-link-274 tag-link-position-21" style="font-size: 13.952380952381px;" aria-label="deep learning深度学习代写代考 (980 items)">deep learning深度学习代写代考</a><a href="https://powcoder.com/tag/prolog/" class="tag-cloud-link tag-link-329 tag-link-position-22" style="font-size: 13.918367346939px;" aria-label="Prolog代写代考 (957 items)">Prolog代写代考</a><a href="https://powcoder.com/tag/file-system/" class="tag-cloud-link tag-link-281 tag-link-position-23" style="font-size: 13.850340136054px;" aria-label="file system (902 items)">file system</a><a href="https://powcoder.com/tag/c/" class="tag-cloud-link tag-link-265 tag-link-position-24" style="font-size: 13.578231292517px;" aria-label="c++代做 (764 items)">c++代做</a><a href="https://powcoder.com/tag/computer-architecture/" class="tag-cloud-link tag-link-507 tag-link-position-25" style="font-size: 13.47619047619px;" aria-label="computer architecture (712 items)">computer architecture</a><a href="https://powcoder.com/tag/er/" class="tag-cloud-link tag-link-433 tag-link-position-26" style="font-size: 13.47619047619px;" aria-label="ER (711 items)">ER</a><a href="https://powcoder.com/tag/gui/" class="tag-cloud-link tag-link-290 tag-link-position-27" style="font-size: 13.47619047619px;" aria-label="gui (711 items)">gui</a><a href="https://powcoder.com/tag/gpu/" class="tag-cloud-link tag-link-396 tag-link-position-28" style="font-size: 13.272108843537px;" aria-label="GPU (620 items)">GPU</a><a href="https://powcoder.com/tag/data-science/" class="tag-cloud-link tag-link-272 tag-link-position-29" style="font-size: 13.272108843537px;" aria-label="data science (615 items)">data science</a><a href="https://powcoder.com/tag/x86%e6%b1%87%e7%bc%96/" class="tag-cloud-link tag-link-514 
tag-link-position-30" style="font-size: 13.238095238095px;" aria-label="x86汇编代写代考 (606 items)">x86汇编代写代考</a><a href="https://powcoder.com/tag/case-study/" class="tag-cloud-link tag-link-468 tag-link-position-31" style="font-size: 13.204081632653px;" aria-label="case study (586 items)">case study</a><a href="https://powcoder.com/tag/distributed-system/" class="tag-cloud-link tag-link-277 tag-link-position-32" style="font-size: 13.170068027211px;" aria-label="distributed system (576 items)">distributed system</a><a href="https://powcoder.com/tag/android/" class="tag-cloud-link tag-link-256 tag-link-position-33" style="font-size: 13.034013605442px;" aria-label="android (526 items)">android</a><a href="https://powcoder.com/tag/kernel/" class="tag-cloud-link tag-link-470 tag-link-position-34" style="font-size: 13.034013605442px;" aria-label="kernel (520 items)">kernel</a><a href="https://powcoder.com/tag/arm/" class="tag-cloud-link tag-link-483 tag-link-position-35" style="font-size: 13px;" aria-label="ARM汇编代写代考 (514 items)">ARM汇编代写代考</a></div> </nav></aside><aside id="block-4" class="widget widget_block"> <div class="wp-block-group is-layout-flow wp-block-group-is-layout-flow"><div class="wp-block-group__inner-container"><ul class="wp-block-latest-posts__list wp-block-latest-posts"><li><a class="wp-block-latest-posts__post-title" href="https://powcoder.com/2024/09/04/cs%e4%bb%a3%e8%80%83-vt6005cem-security-2/">CS代考 VT6005CEM Security</a></li> <li><a class="wp-block-latest-posts__post-title" href="https://powcoder.com/2024/09/04/cs%e4%bb%a3%e8%80%83-vt6005cem-security/">CS代考 VT6005CEM Security</a></li> <li><a class="wp-block-latest-posts__post-title" href="https://powcoder.com/2024/08/29/cs%e4%bb%a3%e8%80%83-dynamic-programming-homework/">CS代考 Dynamic Programming Homework</a></li> <li><a class="wp-block-latest-posts__post-title" href="https://powcoder.com/2024/08/29/cs%e4%bb%a3%e8%80%83-date-06-07-2024-13-07-2024-19-07-2024-22-07-2024-23-07-2024-24-07-2024/">CS代考 DATE 
06/07/2024 13/07/2024 19/07/2024 22/07/2024 23/07/2024 24/07/2024</a></li> <li><a class="wp-block-latest-posts__post-title" href="https://powcoder.com/2024/08/03/cs%e4%bb%a3%e8%80%83-nasdaq-100-became-over-sp-500-perform-the-backtesting-of-analytical-var/">CS代考 NASDAQ-100 became over S&P 500. Perform the backtesting of Analytical VaR (</a></li> </ul></div></div> </aside> </div><!-- .sidebar-main --> </div><!-- #secondary --> </div> <!-- ast-container --> </div><!-- #content --> <footer class="site-footer" id="colophon" itemtype="https://schema.org/WPFooter" itemscope="itemscope" itemid="#colophon"> <div class="site-below-footer-wrap ast-builder-grid-row-container site-footer-focus-item ast-builder-grid-row-full ast-builder-grid-row-tablet-full ast-builder-grid-row-mobile-full ast-footer-row-stack ast-footer-row-tablet-stack ast-footer-row-mobile-stack" data-section="section-below-footer-builder"> <div class="ast-builder-grid-row-container-inner"> <div class="ast-builder-footer-grid-columns site-below-footer-inner-wrap ast-builder-grid-row"> <div class="site-footer-below-section-1 site-footer-section site-footer-section-1"> <div class="ast-builder-layout-element ast-flex site-footer-focus-item ast-footer-copyright" data-section="section-footer-builder"> <div class="ast-footer-copyright"><p>Copyright © 2024 PowCoder代写 | Powered by <a href="https://wpastra.com/" rel="nofollow noopener" target="_blank">Astra WordPress Theme</a></p> </div> </div> </div> </div> </div> </div> </footer><!-- #colophon --> </div><!-- #page --> <link rel="stylesheet" href="https://powcoder.com/wp-content/cache/minify/12163.css" media="all" /> <script id="astra-theme-js-js-extra"> var astra = {"break_point":"921","isRtl":"","is_scroll_to_id":"","is_scroll_to_top":"","is_header_footer_builder_active":"1","responsive_cart_click":"flyout"}; </script> <script src="https://powcoder.com/wp-content/cache/minify/75800.js"></script> <script src="https://stats.wp.com/e-202436.js" id="jetpack-stats-js" 
data-wp-strategy="defer"></script> <script id="jetpack-stats-js-after"> _stq = window._stq || []; _stq.push([ "view", JSON.parse("{\"v\":\"ext\",\"blog\":\"132118579\",\"post\":\"35570\",\"tz\":\"8\",\"srv\":\"powcoder.com\",\"j\":\"1:13.8\"}") ]); _stq.push([ "clickTrackerInit", "132118579", "35570" ]); </script> <script> /(trident|msie)/i.test(navigator.userAgent)&&document.getElementById&&window.addEventListener&&window.addEventListener("hashchange",function(){var t,e=location.hash.substring(1);/^[A-z0-9_-]+$/.test(e)&&(t=document.getElementById(e))&&(/^(?:a|select|input|button|textarea)$/i.test(t.tagName)||(t.tabIndex=-1),t.focus())},!1); </script> </body> </html> <!-- Performance optimized by W3 Total Cache. Learn more: https://www.boldgrid.com/w3-total-cache/ Object Caching 258/349 objects using Disk Page Caching using Disk: Enhanced Content Delivery Network via N/A Minified using Disk Served from: powcoder.com @ 2024-09-07 03:05:00 by W3 Total Cache -->