Quantitative marketing - Coggle Diagram
Quantitative marketing
Recommendations
Content based
Content based Nearest Neighbors
Similarity metric
Dice coefficient
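A minimal sketch of the Dice coefficient on two items' feature sets (names are illustrative):

```python
def dice_similarity(a, b):
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|) for two feature sets."""
    a, b = set(a), set(b)
    if not a and not b:
        return 0.0  # convention: two empty feature sets share nothing
    return 2 * len(a & b) / (len(a) + len(b))

# Two movies sharing one of their two tags each
print(dice_similarity({"sci-fi", "action"}, {"action", "drama"}))  # → 0.5
```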
Pros
Works when you don't have many users
More info
The more sophisticated the features, the better it works
Extensions
minimum similarity threshold
similarity weighted predictions
Cons
Recommendations may not be very surprising
Requires a lot of prior knowledge
Not useful for new users
Doesn't capture qualitative information
Not using data from other users
Naive Bayes
Pros
Not very sensitive to independence assumption
Use all useful data not just neighbors
Cons
Assumes independence
Cold start
Requires knowledge of the items
Doesn't capture qualitative features
Doesn't use user behavior
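A sketch of a Bernoulli Naive Bayes like/dislike scorer over binary item features (the toy vocabulary and rated items below are illustrative assumptions):

```python
from math import log

def nb_scores(features, vocab, liked, disliked, smoothing=1.0):
    """Log-score 'like' vs 'dislike' for an item described by a feature set.
    liked/disliked hold the feature sets of previously rated items."""
    def class_score(docs, n_total):
        # Laplace-smoothed log prior for the class
        s = log((len(docs) + smoothing) / (n_total + 2 * smoothing))
        for f in vocab:
            # smoothed probability that feature f appears in this class
            p = (sum(f in d for d in docs) + smoothing) / (len(docs) + 2 * smoothing)
            s += log(p) if f in features else log(1.0 - p)
        return s
    n = len(liked) + len(disliked)
    return class_score(liked, n), class_score(disliked, n)

vocab = {"sci-fi", "action", "romance"}
liked = [{"sci-fi"}, {"sci-fi", "action"}]
disliked = [{"romance"}]
like, dislike = nb_scores({"sci-fi"}, vocab, liked, disliked)
```

A new sci-fi item scores higher for "like" here because sci-fi appears only among liked items.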
Collaborative filtering
User based collaborative filtering
Similarity metric
Pearson's correlation
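A minimal sketch of Pearson's correlation between two users, computed only over co-rated items:

```python
from math import sqrt

def pearson(ratings_u, ratings_v):
    """Pearson correlation over the items both users have rated."""
    common = list(set(ratings_u) & set(ratings_v))
    if len(common) < 2:
        return 0.0  # not enough overlap to measure agreement
    xs = [ratings_u[i] for i in common]
    ys = [ratings_v[i] for i in common]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den if den else 0.0
```

Ratings are dicts of item → score; two users with perfectly aligned tastes score 1.0, opposed tastes −1.0.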
Pros
We don't need item data
Captures users' taste
Cons
We need user ratings
High computation with many users
Extensions
Do not recommend if rating too low
Item based collaborative filtering
Pros
Good when number of items is much lower than number of users
Similarity measure
Cons
High computation time with many items
Extensions
variance weighing
significance weighting
Matrix factorization
Stochastic gradient descent
Pros
Efficient calculation times
Improves reach in sparse matrices
Increased quality because it uses all ratings
Cons
Doesn't pick up local similarity
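A minimal sketch of matrix factorization trained with stochastic gradient descent, updating only on observed ratings so sparse matrices are fine (hyperparameters are illustrative):

```python
import random

def mf_sgd(ratings, n_users, n_items, k=2, lr=0.05, reg=0.02, epochs=1000, seed=0):
    """Learn user factors P and item factors Q so that r_ui ≈ P[u] · Q[i]."""
    rng = random.Random(seed)
    P = [[rng.uniform(0.05, 0.3) for _ in range(k)] for _ in range(n_users)]
    Q = [[rng.uniform(0.05, 0.3) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - sum(P[u][f] * Q[i][f] for f in range(k))
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (err * qi - reg * pu)  # gradient step with L2 penalty
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

# (user, item, rating) triples; unrated cells are simply never visited
ratings = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 1, 2)]
P, Q = mf_sgd(ratings, n_users=2, n_items=2)
```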
Hybrid
Use recommendations from both types of algorithms
Weighted average
Switching
Pick the next algorithm using a multi-armed bandit
Content based collaborative filtering
Long tail
Evaluation metric
Error for continuous
Precision/Recall for binary
Top N
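A sketch of precision and recall over a top-N recommendation list (example items are illustrative):

```python
def precision_recall_at_n(recommended, relevant, n):
    """Treat the top-N recommendations as the 'positive' predictions."""
    hits = sum(1 for item in recommended[:n] if item in relevant)
    precision = hits / n
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# 2 of the top-3 recommendations are relevant; 2 of the 3 relevant items surfaced
p, r = precision_recall_at_n(["a", "b", "c", "d"], {"a", "c", "e"}, n=3)
```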
Online advertising
Funnel
Impression
Click
Conversion
Reinforcement learning
Bayesian update
Uniform distribution
A bad prior: treats all values as equally likely
Beta distribution
Posterior distribution is also a beta
Very flexible
Natural representation for CTR
Pros
You can use observations as you get them
Cons
The choice of prior matters
Good prior
Initial prediction close to actual
Fast convergence
Bad prior
Bias in the initial prediction
Slow convergence
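Because the Beta is conjugate to Bernoulli click data, the posterior update is just counting; a minimal sketch:

```python
def update_beta(alpha, beta, clicks, impressions):
    """Beta(α, β) prior + click data → Beta(α + clicks, β + non-clicks)."""
    return alpha + clicks, beta + impressions - clicks

def beta_mean(alpha, beta):
    """Posterior point estimate of the CTR."""
    return alpha / (alpha + beta)

# Start from a uniform Beta(1, 1) prior, observe 20 clicks in 100 impressions
a, b = update_beta(1, 1, clicks=20, impressions=100)
print(a, b, round(beta_mean(a, b), 3))  # → 21 81 0.206
```

Observations can be folded in as they arrive: each batch's posterior is the next batch's prior.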
Multi armed bandit
AB Testing
Pros
Simple
Cons
A lot of spending in exploration
You stop learning
If things change, you won't know
If your conclusion is wrong, you won't know
epsilon greedy
Pros
Don't get stuck
We use information as it becomes available
Cons
Can take long before converging
Keeps exploring even once we are sure, which is a waste
Performance depends on epsilon
We only use the mean and not the rest of the distribution
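A simulated sketch of epsilon-greedy ad selection (the true CTRs, epsilon, and horizon are illustrative assumptions):

```python
import random

def run_epsilon_greedy(true_ctrs, eps=0.1, steps=5000, seed=0):
    """With probability eps explore a random ad, else exploit the best empirical CTR."""
    rng = random.Random(seed)
    n = len(true_ctrs)
    counts = [0] * n
    means = [0.0] * n
    for _ in range(steps):
        if rng.random() < eps:
            arm = rng.randrange(n)                       # explore
        else:
            arm = max(range(n), key=lambda a: means[a])  # exploit
        reward = 1.0 if rng.random() < true_ctrs[arm] else 0.0
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]  # running mean
    return counts, means

counts, means = run_epsilon_greedy([0.02, 0.2])
```

Note only the empirical means drive the choice; the uncertainty around them is ignored, which is the con listed above.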
UCB algorithm
Pros
We use info as it becomes available
We don't get stuck
Balances exploring and exploiting
Cons
Depends on choice of alpha
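A sketch of UCB selection, where alpha scales the confidence bonus given to under-explored ads (CTRs and horizon are illustrative):

```python
import math
import random

def run_ucb(true_ctrs, alpha=2.0, steps=5000, seed=0):
    """Pick the ad with the highest empirical mean plus confidence bonus."""
    rng = random.Random(seed)
    n = len(true_ctrs)
    counts = [0] * n
    means = [0.0] * n
    for t in range(1, steps + 1):
        if t <= n:
            arm = t - 1  # try every ad once first
        else:
            arm = max(range(n), key=lambda a:
                      means[a] + math.sqrt(alpha * math.log(t) / counts[a]))
        reward = 1.0 if rng.random() < true_ctrs[arm] else 0.0
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]
    return counts, means

counts, means = run_ucb([0.02, 0.2])
```

The bonus shrinks as an ad is shown more, so exploration concentrates where uncertainty remains; larger alpha means more exploration.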
Thompson sampling
Pros
Use the full distribution
Better balance between explore and exploit
No parameters to tune
Cons
Depends on prior distribution
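A sketch of Thompson sampling with Beta posteriors per ad (uniform priors, illustrative CTRs):

```python
import random

def run_thompson(true_ctrs, steps=5000, seed=0):
    """Sample a CTR from each ad's Beta posterior; show the ad whose sample is highest."""
    rng = random.Random(seed)
    n = len(true_ctrs)
    posteriors = [[1.0, 1.0] for _ in range(n)]  # Beta(1, 1) uniform priors
    counts = [0] * n
    for _ in range(steps):
        samples = [rng.betavariate(a, b) for a, b in posteriors]
        arm = max(range(n), key=lambda i: samples[i])
        if rng.random() < true_ctrs[arm]:
            posteriors[arm][0] += 1  # click: alpha += 1
        else:
            posteriors[arm][1] += 1  # no click: beta += 1
        counts[arm] += 1
    return counts, posteriors

counts, posteriors = run_thompson([0.02, 0.2])
```

Sampling from the full posterior, rather than using only its mean, is what gives the automatic explore/exploit balance with no tuning parameter.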
CTR prediction
Naive Bayes
Pros
Fast to train
Fast to calculate
Not sensitive to irrelevant attributes
Cons
Assumes conditional independence
SGD logistic regression
Pros
Efficient updating
Cons
Error metric
Log loss
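A sketch of the online SGD update for logistic regression and the log-loss metric used to evaluate it (learning rate and examples are illustrative):

```python
from math import exp, log

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def sgd_step(w, x, y, lr=0.1):
    """One efficient online update of logistic-regression weights on a single example."""
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    return [wi + lr * (y - p) * xi for wi, xi in zip(w, x)]

def log_loss(y_true, p_pred, eps=1e-15):
    """Mean negative log likelihood, clipping p away from 0 and 1."""
    total = 0.0
    for y, p in zip(y_true, p_pred):
        p = min(max(p, eps), 1 - eps)
        total -= y * log(p) + (1 - y) * log(1 - p)
    return total / len(y_true)

# A coin-flip predictor scores ln 2 ≈ 0.693 regardless of the labels
print(round(log_loss([1, 0], [0.5, 0.5]), 3))  # → 0.693
```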
Conversion attribution
Marketing Mix modeling
Regression
Explanatory variables
Multicollinearity
Lagged effects
Ad stock
You lose information about the individual lags
Lagged variables
Multicollinearity
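A minimal sketch of the geometric adstock transform, which collapses lagged effects into one decaying carry-over variable (decay rate is illustrative):

```python
def adstock(spend, decay=0.5):
    """Geometric adstock: each period carries over `decay` of last period's stock."""
    out, carry = [], 0.0
    for x in spend:
        carry = x + decay * carry
        out.append(carry)
    return out

# A single burst of spend keeps working at a decaying rate
print(adstock([100, 0, 0]))  # → [100.0, 50.0, 25.0]
```

The adstocked series then replaces the raw spend column in the regression, sidestepping the multicollinearity of many lagged variables at the cost of fixing one decay shape.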
Diminishing returns
Marginal contribution
Sample mean
Cons
Unreliable if N too small
Logistic regression
Cons
Doesn't consider the order
Ignores more than one occurrence