## Deep Learning

**Deep Learning How I Did It: Merck 1st place interview**

http://blog.kaggle.com/2012/11/01/deep-learning-how-i-did-it-merck-1st-place-interview/

**Geoffrey E. Hinton Homepage**

http://www.cs.toronto.edu/~hinton/

**Brains, Sex, and Machine Learning**

http://www.youtube.com/watch?v=DleXA5ADG78

## Stanford, Coursera and Udacity Classes

**Technology Entrepreneurship Course**

http://eesley.blogspot.com/

Remember: intersection is ‘and’, which corresponds to multiplication in probability — moving down a branch of a probability tree is ‘and’.

Remember: union is ‘or’, which corresponds to addition in probability.

INSERT multiplication and addition rules
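A quick sketch of the multiplication and addition rules, using two independent fair coin flips as an assumed example (exact arithmetic via `fractions`):

```python
from fractions import Fraction

# 'And' (intersection) multiplies along the branches of a probability tree.
# For two independent fair coin flips:
p_heads = Fraction(1, 2)
p_two_heads = p_heads * p_heads              # P(H1 and H2) = 1/2 * 1/2 = 1/4

# 'Or' (union) adds — minus any overlap (inclusion-exclusion):
p_either = p_heads + p_heads - p_two_heads   # P(H1 or H2) = 1/2 + 1/2 - 1/4 = 3/4
print(p_two_heads, p_either)
```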

**Links**

**AIqus Wiki**

http://www.aiqus.com/wiki/Main

**AI Class Index**

https://github.com/lorenzo-stoakes/stanford-ai/blob/master/index.md

**NLP Class**

**Mining of Massive Datasets**

http://i.stanford.edu/~ullman/mmds.html

**Calculus**

A **derivative** (differential calculus) measures instantaneous change — for example, finding the speed of a car when you only know its position. The slope of the tangent line at a point on a curve corresponds to the derivative. We take the derivative of the position function — a process of **subtraction and division** — to get the velocity function, which gives the instantaneous speed at any point.

An **integral** (integral calculus) describes the accumulation of an infinite number of tiny pieces that add up to a whole — for example, the distance a car has traveled when only its speed is known. The area under a curve corresponds to the integral, a process of **multiplication and addition**.

Remember that the derivative and the integral are opposite processes: each undoes the work of the other. More important, functions are connected to each other in valuable ways: velocity is the derivative of position, and acceleration (the rate of change of velocity) is the derivative of velocity. We integrate acceleration over time to get the velocity function, and we integrate velocity over time to get the position function. These connections let us make inferences based on what we do know, to figure out what we don’t know.
**Acceleration** is a vector quantity: the rate at which an object changes its velocity. An object is accelerating if its velocity is changing. **Speed** is a scalar quantity: how fast an object is moving — the rate at which it covers distance.

**fundamental theorem of calculus**

- The first part of the theorem, sometimes called the first fundamental theorem of calculus, shows that an indefinite integral can be reversed by differentiation. The first part is also important because it guarantees the existence of antiderivatives for continuous functions.
- The second part, sometimes called the second fundamental theorem of calculus, allows one to compute the definite integral of a function using any one of its infinitely many antiderivatives. This part of the theorem has invaluable practical applications, because it markedly simplifies the computation of definite integrals.
- The first published statement and proof of a restricted version of the fundamental theorem was by James Gregory (1638–1675). Isaac Barrow (1630–1677) proved a more generalized version of the theorem, while Barrow’s student Isaac Newton (1643–1727) completed the development of the surrounding mathematical theory. Gottfried Leibniz (1646–1716) systematized the knowledge into a calculus for infinitesimal quantities.
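The derivative/integral relationship above can be checked numerically — a sketch assuming position s(t) = t², so the true velocity is 2t:

```python
# Numerically check that differentiation and integration undo each other,
# using position s(t) = t**2 (so velocity v(t) = 2t).
dt = 1e-4

def position(t):
    return t * t

# Derivative of position ~ velocity: a process of subtraction and division.
def velocity(t):
    return (position(t + dt) - position(t)) / dt

# Integral of velocity from 0 to T ~ position: multiplication and addition.
def integrate_velocity(T):
    n = int(T / dt)
    return sum(velocity(i * dt) * dt for i in range(n))

print(velocity(3.0))            # close to 6.0
print(integrate_velocity(3.0))  # close to position(3.0) == 9.0
```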

UTC – [Charlotte is -4 hours from UTC](http://www.time.gov/timezone.cgi?UTC/s/0/java)

**For sets A and B**

- union = distinct elements from A, from B, or from both
- difference (complement) = elements in A that are not in B — everything in the sample space that is not in the event; e.g. if A = (numbers > 0) then ~A = (numbers ≤ 0)
- intersection = shared elements — elements that appear in both A and B
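These operations map directly onto Python’s built-in `set` type (the example sets A and B are made up):

```python
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

print(A | B)   # union: {1, 2, 3, 4, 5, 6} — in A, in B, or in both
print(A - B)   # difference: {1, 2} — in A but not in B
print(A & B)   # intersection: {3, 4} — in both A and B
```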

**Probability**

- independent events — the outcome of one event has no relationship to the outcome of another
- probability (statistical definition) — probability tells us how often something is likely to occur when an experiment is repeated. Probability is concerned with the outcomes of trials. (1) The probability of an event is always between 0 and 1. (2) The probability of the sample space is always 1. (3) The probabilities of an event and its complement always sum to 1, which follows from (1) and (2).
- sample space — the set of all elementary outcomes of a trial
- mutually exclusive — the events have no elements (points) in common
- permutation — an arrangement of the elements of a set, where order matters: (a, b, c) is a different permutation than (a, c, b). Calculate the number of permutations of a set of distinct elements (no repeats) using factorials (n!). The number of permutations of subsets of size k drawn from a set of size n is: nPk = n!/(n−k)!
- combination — like a permutation, except that the order of the elements is not significant: (a, b, c) is the same combination as (b, a, c); for this reason there is only one combination of the set {a, b, c}. nCk = nPk/k!
- In technical terms, the set of outcomes from rolling one or more dice has a **discrete uniform distribution**, because the possible outcomes can be enumerated and each outcome is equally likely. The results of two or more dice thrown at once (or multiple throws of the same die) are assumed to be independent, so the probability of each combination of numbers is calculated by multiplying the probabilities of the separate results.
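The nPk and nCk formulas above can be cross-checked against brute-force enumeration — a sketch using a 4-element example set:

```python
from itertools import combinations, permutations
from math import factorial

def nPk(n, k):
    """Permutations of size k from n distinct elements: n!/(n-k)!"""
    return factorial(n) // factorial(n - k)

def nCk(n, k):
    """Combinations of size k from n distinct elements: nPk / k!"""
    return nPk(n, k) // factorial(k)

# Cross-check the formulas by enumerating subsets of a 4-element set:
items = ['a', 'b', 'c', 'd']
assert nPk(4, 2) == len(list(permutations(items, 2)))   # 12, order matters
assert nCk(4, 2) == len(list(combinations(items, 2)))   # 6, order ignored
print(nPk(4, 2), nCk(4, 2))
```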

**Conditional Probability**

- P(E|F) is read as the probability E given F — F is known as the condition.
- Two events E and F are independent if the following relationship holds: P(E|F) = P(E)

- To calculate the probability **of any** of several events occurring (the union of several events), add the probabilities of the individual events, subtracting any overlap:

- Union of mutually exclusive events: P(E ∪ F) = P(E) + P(F)
- Union of non-mutually-exclusive events: P(E ∪ F) = P(E) + P(F) − P(E ∩ F)

- To calculate the probability **of all** of several events occurring (the intersection of several events), multiply the probabilities of the individual events (assuming the events are independent).

**Bayes Formula**

- Use this formula when you know P(B|A) but want P(A|B): P(A|B) = P(B|A) · P(A) / P(B)
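A sketch of Bayes’ formula in action, using a disease-testing example with hypothetical rates (the numbers below are illustrative assumptions, not data from these notes):

```python
from fractions import Fraction

# Assumed rates: P(pos|disease) = 0.99, P(pos|healthy) = 0.05, P(disease) = 0.01
p_pos_given_d = Fraction(99, 100)
p_pos_given_h = Fraction(5, 100)
p_d = Fraction(1, 100)

# Total probability of a positive test (law of total probability):
p_pos = p_pos_given_d * p_d + p_pos_given_h * (1 - p_d)

# Bayes: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(p_d_given_pos, float(p_d_given_pos))  # 1/6, about 0.167
```

Even with a 99%-sensitive test, the low prior drags the posterior down to 1/6 — which is exactly the situation where knowing P(pos|disease) but wanting P(disease|pos) matters.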

**relational algebra operators**

- select (Sigma) pick (select) rows — Sigma operator, condition(s) on expression
- project (Pi) pick (select) columns — Pi operator, condition(s) on expression
- cross-product (X) combine two relations — results in relation A times relation B number of rows
- natural join (bow-tie) cross-product that enforces equality on all attributes with same name, drops duplicates
- theta join (bow-tie, subscript Theta) is a natural join with condition(s) — what DB people call a Join
- union (U) = distinct elements in both sets A and B
- difference (-) = the difference of A and B is the relation containing all tuples that are in A but not in B; A − B gives elements in A that are not in B, and B − A gives elements in B that are not in A
- intersection (inverted U or &) = shared elements – elements that are in both sets A and B — adds no expressive power, can be expressed as (A – (A – B)); also intersection can be expressed as (A natural join (bow-tie) B)
- rename (Rho) applies new relation name and new attributes to an existing relation, or just new relation name or just new attribute names — needed because joins on relations must have matching column names
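A toy sketch of select, project, and natural join over relations represented as lists of dicts — the `emp`/`dept` relations and attribute names are invented for illustration:

```python
# Toy relations as lists of dicts (hypothetical data).
emp = [{'name': 'ann', 'dept': 1}, {'name': 'bob', 'dept': 2}]
dept = [{'dept': 1, 'dname': 'sales'}, {'dept': 2, 'dname': 'eng'}]

def select(rel, pred):      # sigma: pick rows matching a condition
    return [row for row in rel if pred(row)]

def project(rel, cols):     # pi: pick columns
    return [{c: row[c] for c in cols} for row in rel]

def natural_join(r1, r2):   # bow-tie: enforce equality on shared attribute names
    shared = set(r1[0]) & set(r2[0])
    return [{**a, **b} for a in r1 for b in r2
            if all(a[c] == b[c] for c in shared)]

# "Names of employees in the eng department":
result = project(select(natural_join(emp, dept),
                        lambda row: row['dname'] == 'eng'),
                 ['name'])
print(result)  # [{'name': 'bob'}]
```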

**Stanford AI Class Circle. If you want to get added, leave a comment below. I also have a circle for ML and DB** https://plus.google.com/100129275726588145876/posts/KnPjU8oQM2z

**Overview of AIMA Lisp Code** http://aima.cs.berkeley.edu/lisp/doc/overview.html

**Lisp User Guide** http://aima.cs.berkeley.edu/lisp/doc/user.html

**AI Twits** https://twitter.com/#!/aiclass

**DB Twits** https://twitter.com/#!/dbclass

**ML Twits** https://twitter.com/#!/ml_class/

**Terms**

**The formal definition of inverse proportion**:

- Two quantities, A and B, are in inverse proportion if, by whatever factor A changes, B changes by the multiplicative inverse (reciprocal) of that factor. E.g. starting from A = 2 and B = 4: if A becomes 2 × **3**, then B becomes 4 × **1/3** (1/3 is the reciprocal of 3), so the product A × B stays constant.
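Equivalently, inverse proportion means the product A × B never changes. A quick check of the 2-and-4 example, using exact arithmetic:

```python
from fractions import Fraction

# Inverse proportion: if A scales by some factor, B scales by its reciprocal,
# so the product A * B stays constant.
A, B = Fraction(2), Fraction(4)
k = A * B            # the constant product: 8

factor = Fraction(3)
A2 = A * factor      # A scaled by 3 -> 6
B2 = B / factor      # B scaled by the reciprocal 1/3 -> 4/3
print(A2, B2, A2 * B2)  # 6, 4/3, 8
```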

**Closure:**

- In mathematics, a set is said to be closed under some operation if performance of that operation on members of the set always produces a unique member of the same set. For example, the real numbers are closed under subtraction, but the natural numbers are not: 3 and 8 are both natural numbers, but the result of 3 − 8 is not. Similarly, a set is said to be closed under a collection of operations if it is closed under each of the operations individually. see https://secure.wikimedia.org/wikipedia/en/wiki/Closure_%28mathematics%29
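The subtraction example can be checked in one line (using a finite `range` as a stand-in for the naturals):

```python
# The natural numbers are not closed under subtraction:
# 3 and 8 are both natural, but 3 - 8 is not.
naturals = range(0, 1000)   # a finite stand-in for the naturals
a, b = 3, 8
diff = a - b                # -5
print(diff in naturals)     # False — subtraction leaves the set
```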

## Artificial Intelligence

**Solve for X conference**

**Markov Models**

**FANN – Fast Artificial Neural Network Library**

http://leenissen.dk/fann/wp/

**dlib C++ library**

http://dlib.net/

**Artificial Intelligence: A Modern Approach**

http://aima.cs.berkeley.edu/

**Vowpal Wabbit (Fast Learning)**

http://hunch.net/~vw/

## Predator: A Visual Tracker

**Predator: A Visual Tracker that Learns from its Errors**

http://www.youtube.com/watch?v=1GhNXHCQGsM

## MIT150 Symposia: Computation and the Transformation of Practically Everything

**Computation and the Transformation of Practically Everything**

http://www.cra.org/ccc/mitvids.php

**Akamai Chief Scientist Talks Theory**

http://www.cccblog.org/2011/06/10/akamai-chief-scientist-talks-theory/

## Watson

**Apache UIMA Resources on the Web**

http://uima.apache.org/external-resources.html

**Grady Booch talks Watson**

https://www.ibm.com/developerworks/mydeveloperworks/blogs/video-portal/entry/grady_booch_talks_watson_software_archeology_and_the_important_role_of_developers?lang=en_us

## Machine Learning

**Mahout**

http://mahout.apache.org/

**Machine Learning and Hadoop by Josh Wills**

http://www.youtube.com/watch?v=5p06Xg5REj0&feature=youtube_gdata

**PyCon 2012 ML and Python Talks**

http://aimotion.blogspot.com/2012/03/some-data-and-machine-learning-talks.html

**Some ML Packages in Python**

- mloss.org — http://mloss.org/software/
- shogun toolbox — http://www.shogun-toolbox.org/
- nltk
- mlpy
- pyml
- pybrain
- mdp-toolkit
- scikit-learn — look at this — http://scikit-learn.org/stable/
- NumPy — array processing
- SciPy — algorithms built on top of NumPy
- ml-class.org for tutorials
- mrjob (Yelp) for Hadoop
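None of these packages is needed to see the basic idea; here is a minimal pure-Python 1-nearest-neighbor classifier, a sketch of the kind of model such libraries provide ready-made (the training points and labels are invented):

```python
import math

def predict(train, query):
    """1-NN: return the label of the training point closest to query.

    train: list of (feature_tuple, label); query: feature tuple.
    """
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    _, label = min(train, key=lambda t: dist(t[0], query))
    return label

train = [((0, 0), 'red'), ((0, 1), 'red'), ((5, 5), 'blue'), ((6, 5), 'blue')]
print(predict(train, (5, 6)))  # 'blue' — nearest point is (5, 5)
```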

**Bay Area Vision Meeting: Unsupervised Feature Learning and Deep Learning**

http://www.youtube.com/watch?v=ZmNOAtZIgIk

**The Future of Robotics and Artificial Intelligence (Andrew Ng, Stanford University, STAN 2011)**

http://www.youtube.com/watch?v=AY4ajbu_G3k

**a machine learning definition:**

“A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.” — Mitchell, Tom M. (1997). Machine Learning, McGraw Hill. ISBN 0-07-042807-7, p.2.

**ML Videos**

http://videolectures.net/Top/Computer_Science/Machine_Learning/

**Guide to Getting Started in Machine Learning**

http://abeautifulwww.com/2009/10/11/guide-to-getting-started-in-machine-learning/

**Stanford – Artificial Intelligence | Machine Learning**

http://see.stanford.edu/see/lecturelist.aspx?coll=348ca38a-3a6d-4052-937d-cb017338d7b1

**Welcome to the UC Irvine Machine Learning Repository!**

http://archive.ics.uci.edu/ml/

**MLOSS**

http://www.mloss.org/software/

**CRAN Task View: Machine Learning & Statistical Learning**

http://cran.r-project.org/web/views/MachineLearning.html

**MIT – 18.06 Linear Algebra**

http://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/index.htm