Hands-On Artificial Intelligence for Beginners

Element-wise operations

In element-wise operations, position matters. Values that correspond positionally are combined to create a new value. 

To add or subtract matrices or vectors, we combine the entries in corresponding positions:

(A + B)ij = Aij + Bij
(A - B)ij = Aij - Bij

And in Python:

import numpy as np

vector_one = np.array([[1,2],[3,4]])
vector_two = np.array([[5,6],[7,8]])
vector_one + vector_two
## You should see:
array([[ 6,  8],
       [10, 12]])
vector_one - vector_two
## You should see:
array([[-4, -4],
       [-4, -4]])

There are two forms of multiplication that we may perform with vectors: the dot product and the Hadamard product.

The dot product is a special case of multiplication, and is rooted in larger theories of geometry that are used across the physical and computational sciences. It is a special case of a more general mathematical principle known as an inner product. When we take the dot product of two vectors, the output is a scalar:

a · b = a1b1 + a2b2 + ... + anbn
Dot products are a workhorse in machine learning. Think about a basic operation: let's say we're doing a simple classification problem where we want to know if an image contains a cat or a dog. If we did this with a neural network, it would look as follows:

y = f(x; w, b)

Here, y is our classification, cat or dog. We determine y by utilizing a network represented by f, where the input is x, while w and b represent a weight and a bias factor (don't worry, we'll explain this in more detail in the coming chapter!). Our x and w are both matrices, and we need to output a scalar y, which represents either cat or dog. We can only do this by taking the dot product of w and x.
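As a minimal sketch of this formula, assuming a sigmoid activation for f and made-up values for the input, weights, and bias (the book introduces the real details in the next chapter):

```python
import numpy as np

x = np.array([0.5, 0.2, 0.1])   # input features (hypothetical values)
w = np.array([0.4, 0.3, 0.9])   # weights (hypothetical values)
b = 0.1                          # bias (hypothetical value)

def sigmoid(z):
    # Squashes any real number into the range (0, 1)
    return 1 / (1 + np.exp(-z))

# The dot product of w and x collapses the vectors into a single scalar,
# which the activation f then maps to a score we can threshold.
y = sigmoid(np.dot(w, x) + b)
print(y)  # a single score between 0 and 1
```

The key point is that np.dot(w, x) is what turns two vectors into the one scalar the classifier needs.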

Relating back to our example, if this function were presented with an unknown image, taking the dot product will tell us how similar in direction the new vector is to the cat vector (a) or the dog vector (b) by the measure of the angle (θ) between them:

cos(θ) = (a · b) / (|a| |b|)

If the vector is closer to the direction of the cat vector (a), we'll classify the image as containing a cat. If it's closer to the dog vector (b), we'll classify it as containing a dog. In deep learning, a more complex version of this scenario is performed over and over; it's the core of how ANNs work. 
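This direction-comparison idea can be sketched in a few lines of NumPy. The cat and dog reference vectors and the unknown image vector below are hypothetical stand-ins, not values from the book:

```python
import numpy as np

cat = np.array([1.0, 0.2])       # hypothetical "cat" reference vector (a)
dog = np.array([0.1, 1.0])       # hypothetical "dog" reference vector (b)
unknown = np.array([0.9, 0.3])   # hypothetical vector for a new image

def cosine(u, v):
    # cos(theta) = (u · v) / (|u| |v|); closer to 1 means a smaller angle
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Classify by whichever reference vector the unknown is closer to in direction
label = "cat" if cosine(unknown, cat) > cosine(unknown, dog) else "dog"
print(label)  # prints "cat"
```

Here the dot product does the geometric work: a larger cosine means a smaller angle between the unknown vector and that class's reference vector.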

In Python, we can take the dot product of two vectors by using NumPy's built-in np.dot() function:

## Dot Product
vector_one = np.array([1,2,3])
vector_two = np.array([2,3,4])
np.dot(vector_one,vector_two) ## This should give us 20

The Hadamard product, on the other hand, outputs a vector:

a ∘ b = (a1b1, a2b2, ..., anbn)

The Hadamard product is element-wise, meaning that each number in the new vector is the product of the numbers in the corresponding positions of the original vectors. In Python, we can easily perform this operation with the * operator:

vector_one = np.array([1,2,3])
vector_two = np.array([2,3,4])
vector_one * vector_two
## You should see:
array([ 2, 6, 12])

Now that we've scratched the surface of basic matrix operations, let's take a look at how probability theory can aid us in the artificial intelligence field.