Deep Learning
from Scratch

Deep Neural Networks

Course starts soon…


Quiz


We will now start with a quiz based on the first week's material.

You have 6 minutes to answer the quiz.

The quiz link:
Quiz Link
It will be posted in Mattermost and in the Zoom chat.

The biological neuron


Source: Wikimedia Commons

Article about the differences and similarities

Questions from last week

Linear activation function in the output layer

Estimating probability distributions
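As a quick recap of both points, here is a minimal sketch (the scores are made up for illustration): a linear output stays unbounded, which suits regression, while sigmoid and softmax turn raw scores into values that can be read as probabilities.


import numpy as np

def sigmoid(z):
    # squashes any real value into (0, 1), so the output can be read as a probability
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, 0.5, 3.0])             # raw scores from the last linear layer

linear_out = z                              # linear activation: unbounded, suited to regression targets
prob_out = sigmoid(z)                       # sigmoid: each value in (0, 1), for a binary decision

# softmax turns the whole score vector into one distribution that sums to 1
softmax_out = np.exp(z) / np.sum(np.exp(z))
print(linear_out, prob_out, softmax_out)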

Projects

Requirements

  • A public repository
  • A notebook with working code and explanation/comments - use the template folder
  • A small video/screen recording/presentation.

Project Template Folder


Please use the project template

Bakery Sales Prediction

Dataset available
PRO: The dataset is well structured, we know the source, it is Kiel-based, it fits the course well, and it is still a real challenge.
CONS: Not the most exciting project ever.

Fake News

Dataset available on Kaggle
PRO: A well-structured and easy-to-understand dataset on a current and controversial topic.
CONS: Very US-centric, and the source of the news is unclear.

Credit Card Fraud Detection

Dataset on Kaggle
PRO: A clear, real-world task.
CONS: It is not certain that you can get access to this kind of data outside the dataset.

Climate Change

Dataset on Earth Surface Temperature
Dataset on Carbon Dioxide
PRO: A very important and current problem. Large datasets and a widely discussed field.
CONS: Complex. Data collection methods changed over time, many factors are involved, and the tasks may not be easy.

Image Classification

Natural Scene or Bird Species classification
PRO: The dataset is structured similarly to what we have seen in the course, and the task is straightforward.
CONS: It may be less exciting and more similar to the course examples.

Image Classification "Classic"

The idea here is to use two classic datasets to play with ML (a short loading sketch follows the list).

  1. MNIST: Website - Github repo
  2. CIFAR: Website
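
Both datasets ship with Keras, so no manual download is needed (a minimal sketch, assuming TensorFlow is installed as in the project section below):


import tensorflow as tf

# MNIST: handwritten digits, grayscale
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
print(x_train.shape)   # (60000, 28, 28)

# CIFAR-10: small colour images in 10 classes
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
print(x_train.shape)   # (50000, 32, 32, 3)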

Project from you

Please explain your project and your team again.
We will give the other participants time to think about it.

Open Questions (15 mins)


  1. What is the difference between parameters and hyperparameters? Can you give some examples? (A small sketch follows this list.)
  2. Is there a hyperparameter that is more important than the others?
  3. How would you choose the parameters if you do not know exactly what they do?
  4. Once we start stacking layers in a neural network, how do we know what each layer is doing?
  5. Is it really important to know?
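
For question 1, a minimal illustration (the names mirror the from-scratch functions shown later in these slides; the values are placeholders):


import numpy as np

# Hyperparameters: chosen by us before training, never updated by gradient descent
layers_dims = [12288, 7, 1]     # architecture: input size and layer sizes
learning_rate = 0.009           # step size for the parameter updates
num_iterations = 3000           # how long we train

# Parameters: initialized once, then learned from the data
np.random.seed(1)
W1 = np.random.randn(layers_dims[1], layers_dims[0]) * 0.01
b1 = np.zeros((layers_dims[1], 1))
# ... during training every parameter is updated, e.g. W1 = W1 - learning_rate * dW1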

DISCUSSION AND ANSWERS


Exercises (15-20 mins)

We go through the programming assignments that were planned for this week.

  1. What is np.random.seed(1)?
  2. Why are we using a for loop after talking about vectorization? How would you vectorize the implementation? (A small sketch follows this list.)
  3. How can a neural network perform worse on training data and better on test data? How important is the accuracy on the training data? Do we need it?
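
A small sketch touching on the first two questions (a toy layer, not the assignment code; the shapes are made up):


import numpy as np

# Question 1: np.random.seed(1) fixes the starting state of the random number
# generator, so the "random" weight initialization is reproducible across runs.
np.random.seed(1)
W = np.random.randn(4, 3) * 0.01   # same matrix every time the script runs
b = np.zeros((4, 1))
X = np.random.randn(3, 5)          # 5 examples with 3 features each

# Question 2: loop version vs. vectorized version of the same linear step
Z_loop = np.zeros((4, 5))
for i in range(X.shape[1]):                      # one example at a time
    Z_loop[:, i] = (W @ X[:, i:i+1] + b).ravel()

Z_vec = W @ X + b                                # all examples at once, no Python loop

print(np.allclose(Z_loop, Z_vec))                # True: identical result, much faster at scale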

For the Project: TensorFlow

Until now we implemented everything ourselves:


def initialize_parameters(n_x, n_h, n_y):
    # create the initial W1, b1, W2, b2 for the network
    ...
def linear_activation_forward(A_prev, W, b, activation):
    # one forward step: linear part followed by the chosen activation
    ...
def compute_cost(AL, Y):
    # cost between the predictions AL and the labels Y
    ...
def linear_activation_backward(dA, cache, activation):
    # one backward step: gradients through the activation and the linear part
    ...
def update_parameters(parameters, grads, learning_rate):
    # gradient descent update of every W and b
    ...
def L_layer_model(X, Y, layers_dims, learning_rate=0.009,
                  num_iterations=3000, print_cost=False):
    # the full training loop tying all the pieces together
    ...
def predict(X, Y, parameters):
    # forward pass plus accuracy on (X, Y)
    ...
          

TensorFlow allows us to use less code!


import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(64, 64, 3)),   # 64x64 RGB image -> vector of 12288 values
    tf.keras.layers.Dense(7, activation='relu'),         # hidden layer with 7 units
    tf.keras.layers.Dense(1, activation='sigmoid')       # output: probability for binary classification
])

# the model must be compiled (optimizer, loss, metrics) before fit/evaluate can run
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])

model.fit(train_images, train_labels, epochs=10)

test_loss, test_acc = model.evaluate(test_images, test_labels)
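
Once trained, the Keras model also covers what the from-scratch predict(X, Y, parameters) helper did (a small sketch; test_images is assumed to be defined as above):

probs = model.predict(test_images)        # probabilities from the sigmoid output
preds = (probs > 0.5).astype(int)         # threshold at 0.5 to get class labels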
          

A lot of tutorials online

An example of classification

For the next week

  • Finish the first week of the Improving Deep Neural Networks course
  • Do the Programming Assignments: Initialization, Regularization, and Gradient Checking (a small gradient-checking sketch follows this list)
  • Create or join a group for the project (2-4 people)
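
If you want a preview of the gradient-checking idea before the assignment, here is a toy one-parameter sketch (not the assignment's implementation):


def J(theta):
    # toy cost function: J(theta) = theta ** 2
    return theta ** 2

def dJ(theta):
    # analytic gradient we want to verify (dJ/dtheta = 2 * theta)
    return 2 * theta

theta = 3.0
eps = 1e-7

# centered finite-difference approximation of the derivative
grad_approx = (J(theta + eps) - J(theta - eps)) / (2 * eps)
grad = dJ(theta)

# the relative difference should be tiny (around 1e-7 or less) if the gradient is correct
rel_diff = abs(grad - grad_approx) / max(abs(grad), abs(grad_approx))
print(grad, grad_approx, rel_diff)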