Intro to AI
Neural Networks
Basically, if the final output is wrong, the network goes back and adjusts the weights so that the correct answer is produced next time.
Linear inseparability - functions whose outcomes can't be separated by a single straight line, e.g. XOR. A single TLU can't compute these; at least 2 neurons arranged in more than one layer are needed. Such networks are called multi-layer networks.
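One way to see this: a two-layer network with hand-picked weights can compute XOR, which no single TLU can. The weights and thresholds below are one illustrative choice (not from the notes), built from the fact that XOR = OR and not AND.

```python
# A hand-wired two-layer network for XOR, which a single TLU cannot
# compute because XOR is not linearly separable. The weights and
# thresholds below are one choice that works, not the only one.

def tlu(inputs, weights, threshold):
    """Threshold logic unit: fires (1) if the weighted sum exceeds the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > threshold else 0

def xor(x1, x2):
    h_or = tlu((x1, x2), (1, 1), 0.5)        # hidden unit 1: OR
    h_and = tlu((x1, x2), (1, 1), 1.5)       # hidden unit 2: AND
    return tlu((h_or, h_and), (1, -1), 0.5)  # output: OR and not AND

table = [xor(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]]
```

The hidden layer turns the linearly inseparable problem into a linearly separable one for the output unit.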
Sometimes an extra input is added so that the threshold h can be learned as well: x0 = -1 with w0 = h, which turns the threshold into just another weight.
Perceptron learning algorithm - Randomise the weights, run the TLU on the training examples, and alter the weights after each mistake; repeat until every example is classified correctly.
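A minimal sketch of that loop, trained on AND as an illustrative dataset. It uses the x0 = -1 bias trick from above so the threshold h is learned as w0; weights start at zero here (rather than random, as the notes say) purely for reproducibility, and the learning rate is an assumption.

```python
# Perceptron learning rule for a TLU, with the bias input x0 = -1
# so the threshold h is learned as w0. Trained on AND.

def train_perceptron(examples, n_inputs, rate=0.1, epochs=100):
    weights = [0.0] * (n_inputs + 1)   # w0 holds the threshold h
    for _ in range(epochs):
        errors = 0
        for inputs, target in examples:
            x = [-1] + list(inputs)    # prepend the fixed bias input
            output = 1 if sum(w * xi for w, xi in zip(weights, x)) > 0 else 0
            if output != target:       # wrong answer: alter the weights
                errors += 1
                for i in range(len(weights)):
                    weights[i] += rate * (target - output) * x[i]
        if errors == 0:                # every example now classified correctly
            break
    return weights

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train_perceptron(AND, n_inputs=2)
preds = [1 if sum(wi * xi for wi, xi in zip(w, [-1] + list(i))) > 0 else 0
         for i, _ in AND]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop terminates with all examples correct; on XOR it would loop forever.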
Probabilistic AI
Marginals - In a joint distribution, the probability of a single event happening, so Pr(A) = Pr(A^B) + Pr(A^¬B). Used to check whether the events are independent.
Joint distribution - The probability of 2 (or more) events happening together, Pr(A^B). If the events are independent, Pr(A^B) = Pr(A) x Pr(B); otherwise the conditional form is needed.
Conditional events - When 2 events affect each other. Eg. Weather, Cavity. Pr(A|B) is used instead of Pr(A), so Pr(A^B) = Pr(A|B) x Pr(B).
Independent events - When two events don't affect each other. Eg. Coin throws, Dice rolls.
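The definitions above can be checked mechanically from a joint distribution table. A small sketch (the probabilities are invented for illustration, chosen so A and B come out independent):

```python
# Marginals, independence check, and conditioning from a joint
# distribution over two binary events A and B. Numbers are made up.

joint = {
    (True, True): 0.12, (True, False): 0.18,    # Pr(A^B), Pr(A^-B)
    (False, True): 0.28, (False, False): 0.42,  # Pr(-A^B), Pr(-A^-B)
}

# Marginal: Pr(A) = Pr(A^B) + Pr(A^-B); similarly for Pr(B)
pr_a = joint[(True, True)] + joint[(True, False)]
pr_b = joint[(True, True)] + joint[(False, True)]

# Independent iff Pr(A^B) = Pr(A) x Pr(B)
independent = abs(joint[(True, True)] - pr_a * pr_b) < 1e-9

# Conditional: Pr(A|B) = Pr(A^B) / Pr(B)
pr_a_given_b = joint[(True, True)] / pr_b
```

Here Pr(A) = 0.30 and Pr(B) = 0.40, and since 0.30 x 0.40 = 0.12 = Pr(A^B), the events are independent; accordingly Pr(A|B) equals Pr(A).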
Decision Trees
Entropy - A measure of how mixed the outcomes in a set are. It's best to split on a field whose branches each contain only one outcome (entropy = 0).
Entropy-based trees - At each node, split on the field with the lowest resulting entropy. This greedy choice doesn't guarantee the best overall tree, as it only looks at the current node.
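A sketch of that greedy choice, using the standard formula H = -Σ p·log2(p) and the weighted average entropy of a split. The toy dataset and field names ("outlook", "windy") are invented; "outlook" is built to predict the label perfectly.

```python
# Entropy-based greedy attribute selection for one decision-tree node.
from collections import Counter
from math import log2

def entropy(labels):
    """H = -sum p * log2(p) over label proportions; 0 for a pure set."""
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def split_entropy(rows, field):
    """Weighted average entropy of the subsets produced by splitting on field."""
    groups = {}
    for row in rows:
        groups.setdefault(row[field], []).append(row["label"])
    return sum(len(g) / len(rows) * entropy(g) for g in groups.values())

# Toy data: "outlook" perfectly predicts the label, "windy" tells us nothing.
rows = [
    {"outlook": "sunny", "windy": True, "label": "yes"},
    {"outlook": "sunny", "windy": False, "label": "yes"},
    {"outlook": "rainy", "windy": True, "label": "no"},
    {"outlook": "rainy", "windy": False, "label": "no"},
]
best = min(["outlook", "windy"], key=lambda f: split_entropy(rows, f))
```

Splitting on "outlook" gives entropy 0 (both branches pure), while "windy" leaves entropy 1, so the greedy rule picks "outlook".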
Bayes
Eg. Pr(b|j,m) = α ∑(E) ∑(A) Pr(b) Pr(E) Pr(A|b,E) Pr(j|A) Pr(m|A), where α is the normalising constant.
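That sum can be evaluated directly by enumeration. The sketch below assumes the classic burglary/earthquake/alarm network and its textbook conditional probability values (they are not given in the notes); j and m are observed true.

```python
# Inference by enumeration for Pr(b|j,m) on the classic alarm network.
# CPT values assumed from the standard textbook example.

P_B, P_E = 0.001, 0.002                      # priors Pr(b), Pr(E)
P_A = {(True, True): 0.95, (True, False): 0.94,   # Pr(A | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}              # Pr(j | A)
P_M = {True: 0.70, False: 0.01}              # Pr(m | A)

def unnormalised(b):
    """Sum out E and A for a fixed value of B, with j and m observed."""
    total = 0.0
    for e in (True, False):
        for a in (True, False):
            total += ((P_B if b else 1 - P_B)
                      * (P_E if e else 1 - P_E)
                      * (P_A[(b, e)] if a else 1 - P_A[(b, e)])
                      * P_J[a] * P_M[a])
    return total

# Applying alpha means normalising over both values of B
p_burglary = unnormalised(True) / (unnormalised(True) + unnormalised(False))
```

With these numbers the posterior comes out at roughly 0.284: even with both John and Mary calling, a burglary is still unlikely because its prior is so small.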
Machine Learning
Unsupervised - No labelled answers are given; the algorithm finds structure in the data on its own, e.g. by grouping similar examples into clusters.
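A common clustering example is k-means. This is a minimal 1-D sketch (the data, starting centres, and k = 2 are invented for illustration):

```python
# 1-D k-means: alternate assigning points to their nearest centre
# and moving each centre to the mean of its assigned points.

def kmeans_1d(points, centres, iters=10):
    clusters = [[] for _ in centres]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centre
        clusters = [[] for _ in centres]
        for p in points:
            nearest = min(range(len(centres)), key=lambda i: abs(p - centres[i]))
            clusters[nearest].append(p)
        # Update step: move each centre to the mean of its cluster
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return centres, clusters

data = [1.0, 1.2, 0.8, 8.0, 8.2, 7.8]
centres, clusters = kmeans_1d(data, centres=[0.0, 5.0])
```

No labels are ever supplied: the two groups around 1 and 8 emerge purely from the distances between the points.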
Regression
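The Regression heading above has no detail in the export. As a minimal sketch of the idea (fitting a line through continuous outputs), here is simple least-squares linear regression; the formula is the standard closed form and the data points are invented:

```python
# Simple linear regression: fit y = m*x + c by least squares.

def fit_line(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Standard closed form: m = cov(x, y) / var(x), c from the means
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    c = mean_y - m * mean_x
    return m, c

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # lies exactly on y = 2x + 1
m, c = fit_line(xs, ys)
```

Regression predicts a continuous value from the fitted line, in contrast to classification below, which assigns a discrete class.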
Classification - In contrast to regression, the output is a discrete class rather than a continuous value. A line (the decision boundary) separates the classes, so one class lies on one side and the other class on the other.