Transfer Learning with TensorFlow

Michael Sheinfeild
3 min read · Nov 22, 2019


Today I will use the power of a pre-trained network to train on a new dataset of flowers.

We will use two networks from TensorFlow Hub: MobileNet V2 and Inception V3.

Imports

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import tensorflow_hub as hub
import tensorflow_datasets as tfds
from tensorflow.keras import layers

Download the Dataset

splits = tfds.Split.ALL.subsplit(weighted=(70, 30))

(training_set, validation_set), dataset_info = tfds.load(
    'tf_flowers', with_info=True, as_supervised=True, split=splits)

We split the data into a training set and a validation set: 70% for training and 30% for validation.
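Note that tfds.Split.ALL.subsplit was removed in later versions of tensorflow_datasets. On a recent version, an equivalent 70/30 split can be expressed with the slicing API; a minimal sketch, assuming tensorflow_datasets 3.0 or newer:

# Equivalent split on newer TFDS versions, where Split.subsplit no longer exists.
(training_set, validation_set), dataset_info = tfds.load(
    'tf_flowers',
    split=['train[:70%]', 'train[70%:]'],
    with_info=True,
    as_supervised=True)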

Dataset Info

num_examples = dataset_info.splits['train'].num_examples
num_classes = dataset_info.features['label'].num_classes

num_training_examples = num_examples * 0.7
num_validation_examples = num_examples * 0.3
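The exact print calls are not shown in the article; the output below was presumably produced by something like this sketch:

print('Number of Classes: {}'.format(num_classes))
print('Number of Training Images: {}'.format(num_training_examples))
print('Number of Validation Images: {}'.format(num_validation_examples))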

Number of Classes: 5
Number of Training Images: 2569.0
Number of Validation Images: 1101.0

Fix the Image Size

We need to resize each image to the network's expected input size and scale the pixel values to the [0, 1] range.

IMAGE_RES = 224

def format_image(image, label):
    # Resize to the network's input resolution and scale pixels to [0, 1].
    image = tf.image.resize(image, (IMAGE_RES, IMAGE_RES)) / 255.0
    return image, label

BATCH_SIZE = 32

train_batches = training_set.shuffle(num_examples // 4).map(format_image).batch(BATCH_SIZE).prefetch(1)
validation_batches = validation_set.map(format_image).batch(BATCH_SIZE).prefetch(1)

Transfer Learning with Hub

URL = "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4"

feature_extractor = hub.KerasLayer(URL, input_shape=(IMAGE_RES, IMAGE_RES, 3))
feature_extractor.trainable = False

The feature extractor is frozen so that only the last layer will be trained; we also set the number of output classes to 5.

model = tf.keras.Sequential([
    feature_extractor,
    layers.Dense(num_classes, activation='softmax')
])

model.summary()

Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
keras_layer (KerasLayer) (None, 1280) 2257984
_________________________________________________________________
dense (Dense) (None, 5) 6405
=================================================================
Total params: 2,264,389
Trainable params: 6,405
Non-trainable params: 2,257,984
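The trainable parameter count is easy to verify by hand: the Dense head maps MobileNet's 1280-dimensional feature vector to 5 classes, so 1280 * 5 weights + 5 biases = 6,405. A quick sanity check, using only the model defined above:

# Count trainable parameters manually; should print 6405 (1280 * 5 + 5).
trainable_params = sum(tf.keras.backend.count_params(v)
                       for v in model.trainable_variables)
print(trainable_params)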

Train

model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy'])

EPOCHS = 6

history = model.fit(train_batches,
                    epochs=EPOCHS,
                    validation_data=validation_batches)

Epoch 1/6
81/81 [==============================] - 19s 240ms/step - loss: 0.8028 - accuracy: 0.7189 - val_loss: 0.0000e+00 - val_accuracy: 0.0000e+00
Epoch 2/6
81/81 [==============================] - 10s 123ms/step - loss: 0.3951 - accuracy: 0.8745 - val_loss: 0.3670 - val_accuracy: 0.8750
Epoch 3/6
81/81 [==============================] - 10s 124ms/step - loss: 0.2939 - accuracy: 0.9062 - val_loss: 0.3406 - val_accuracy: 0.8880
Epoch 4/6
81/81 [==============================] - 10s 123ms/step - loss: 0.2430 - accuracy: 0.9255 - val_loss: 0.3364 - val_accuracy: 0.8704
Epoch 5/6
81/81 [==============================] - 10s 123ms/step - loss: 0.2141 - accuracy: 0.9351 - val_loss: 0.3204 - val_accuracy: 0.8963
Epoch 6/6
81/81 [==============================] - 10s 123ms/step - loss: 0.1795 - accuracy: 0.9533 - val_loss: 0.3084 - val_accuracy: 0.8963
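For a single final number on the validation set, we could also evaluate the trained model directly; this call is not in the original article, but it is standard Keras:

val_loss, val_accuracy = model.evaluate(validation_batches)
print('Validation accuracy: {:.2%}'.format(val_accuracy))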

Plot the Results

acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']

epochs_range = range(EPOCHS)

plt.figure(figsize=(8, 8))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')

plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()

Validation accuracy is high right from the start, since the feature-extractor layers were already trained on a different, much larger dataset (ImageNet).

See Predictions

class_names = np.array(dataset_info.features['label'].names)

array(['dandelion', 'daisy', 'tulips', 'sunflowers', 'roses'],
      dtype='<U10')

Batch

image_batch, label_batch = next(iter(train_batches.take(1)))
image_batch = image_batch.numpy()
label_batch = label_batch.numpy()

predicted_batch = model.predict(image_batch)
predicted_batch = tf.squeeze(predicted_batch).numpy()

predicted_ids = np.argmax(predicted_batch, axis=-1)
predicted_class_names = class_names[predicted_ids]
label_batch = class_names[label_batch]
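The print call is not shown in the article; the class names below were presumably produced by something like:

print(predicted_class_names)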

['sunflowers' 'tulips' 'roses' 'daisy' 'daisy' 'dandelion' 'dandelion'
'sunflowers' 'dandelion' 'tulips' 'daisy' 'daisy' 'roses' 'tulips'
'roses' 'sunflowers' 'roses' 'daisy' 'tulips' 'dandelion' 'roses'
'tulips' 'roses' 'sunflowers' 'dandelion' 'tulips' 'roses' 'roses'
'dandelion' 'dandelion' 'roses' 'dandelion']

Plot the Predictions

plt.figure(figsize=(10, 9))
for n in range(30):
    plt.subplot(6, 5, n + 1)
    plt.subplots_adjust(hspace=0.3)
    plt.imshow(image_batch[n])
    # Blue title means a correct prediction, red means incorrect.
    color = "blue" if predicted_class_names[n] == label_batch[n] else "red"
    plt.title(predicted_class_names[n].title(), color=color)
    plt.axis('off')
_ = plt.suptitle("Model predictions (blue: correct, red: incorrect)")

Every image in the batch was predicted correctly!

Inception

To switch to Inception V3, just change two parameters:

IMAGE_RES = 299
URL = "https://tfhub.dev/google/tf2-preview/inception_v3/feature_vector/4"
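Everything else stays the same. A minimal sketch of the rewired model (note that train_batches and validation_batches must be rebuilt after changing IMAGE_RES, since format_image reads it):

# Inception V3 expects 299x299 inputs, so the input pipeline must be
# rebuilt with the new IMAGE_RES before training this model.
feature_extractor = hub.KerasLayer(URL,
                                   input_shape=(IMAGE_RES, IMAGE_RES, 3),
                                   trainable=False)

model = tf.keras.Sequential([
    feature_extractor,
    layers.Dense(num_classes, activation='softmax')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])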

Summary

We have seen how easy it is to classify a new dataset using transfer learning in Keras.

Just another flower!
