Functions for use with TensorFlow 2.x
References
The functions below are based on the following sources; please visit the linked references for further detail:
- Chapter 14 – Deep Computer Vision Using Convolutional Neural Networks, Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, 2nd Edition, by Aurélien Géron: GitHub link to notebook
  - In particular, the functions `random_crop` and `central_crop` are used
- TensorFlow Documentation by the TensorFlow Authors
Image Augmentation
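As a rough illustration of the two crop augmentations named in the references, here is a minimal NumPy sketch (the referenced notebook implements them with `tf.image` ops; the parameter names and the 90% crop fraction below are assumptions):

```python
import numpy as np

def central_crop(image):
    # Crop the largest centered square from an H x W x C array.
    h, w = image.shape[:2]
    min_dim = min(h, w)
    top = (h - min_dim) // 2
    left = (w - min_dim) // 2
    return image[top:top + min_dim, left:left + min_dim]

def random_crop(image, frac=0.9, rng=None):
    # Crop a randomly positioned square whose side is `frac` of the
    # shorter edge (hypothetical fraction; adjust to taste).
    if rng is None:
        rng = np.random.default_rng()
    h, w = image.shape[:2]
    side = int(min(h, w) * frac)
    top = rng.integers(0, h - side + 1)
    left = rng.integers(0, w - side + 1)
    return image[top:top + side, left:left + side]
```

Random crops are typically used at training time (so the model sees varied views), while the deterministic central crop is used for validation and test.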
Dataset reading and processing
Example: Read images from an ImageNet-style folder
Sample:
- data
  - train
    - class1
    - class2
  - valid
    - class1
    - class2
# datapath = Path('path/to/data')
# train_aug = [rcrop, rflip]
# valid_aug = [crop]
# aug = (train_aug, valid_aug)
# train_ds = read_img_dataset(str(datapath/'train'/'*/*'), shuffle_size=1024, img_size=224, batch_size=32, n_parallel=4, augments=aug, mode='train')
# valid_ds = read_img_dataset(str(datapath/'valid'/'*/*'), img_size=224, batch_size=32, n_parallel=4, augments=aug, mode='valid')
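The body of `read_img_dataset` is not shown here. A minimal `tf.data` sketch of such a reader might look like the following — the directory-based labelling and the parameter handling are assumptions, and augmentation/mode handling is omitted:

```python
import tensorflow as tf

def preprocess(file_path, img_size=224):
    # Hypothetical: infer the label from the parent directory name
    # (ImageNet-style layout: .../class_name/file.jpg).
    label = tf.strings.split(file_path, '/')[-2]
    image = tf.io.decode_jpeg(tf.io.read_file(file_path), channels=3)
    image = tf.image.resize(image, [img_size, img_size])
    return image, label

def read_img_dataset(pattern, shuffle_size=None, img_size=224,
                     batch_size=32, n_parallel=tf.data.AUTOTUNE):
    # Sketch of a tf.data pipeline: list files, decode and resize in
    # parallel, then batch and prefetch.
    ds = tf.data.Dataset.list_files(pattern, shuffle=shuffle_size is not None)
    if shuffle_size:
        ds = ds.shuffle(shuffle_size)
    ds = ds.map(lambda p: preprocess(p, img_size),
                num_parallel_calls=n_parallel)
    return ds.batch(batch_size).prefetch(tf.data.AUTOTUNE)
```

Shuffling is only applied for training; validation and test readers keep file order deterministic.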
Example: Read images from a folder (or list), then shuffle-split into train and validation
# datapath = Path('path/to/data')
# all_files = get_all_files(datapath, recurse=True) # read from folder, can be list
# train_filepaths, tmp_filepaths = train_test_split(all_files, valid_pct=0.3, seed=42)
# valid_filepaths, test_filepaths = train_test_split(tmp_filepaths, valid_pct=0.5, seed=42)
# train_ds = read_img_dataset([str(x) for x in train_filepaths], shuffle_size=1024, img_size=IMG_SIZE, batch_size=BATCH_SIZE, n_parallel=4, augments=aug, mode='train')
# valid_ds = read_img_dataset([str(x) for x in valid_filepaths], img_size=IMG_SIZE, batch_size=BATCH_SIZE, n_parallel=4, augments=aug, mode='valid')
# test_ds = read_img_dataset([str(x) for x in test_filepaths], img_size=IMG_SIZE, batch_size=BATCH_SIZE, n_parallel=4, augments=aug, mode='test')
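The split helper used above can be sketched in plain Python. This is a hypothetical reimplementation assuming `valid_pct` is the fraction held out and `seed` makes the shuffle reproducible:

```python
import random

def train_test_split(files, valid_pct=0.3, seed=None):
    # Shuffle a copy of the file list and cut it at (1 - valid_pct).
    # With a fixed seed, the same split is produced on every call.
    files = list(files)
    random.Random(seed).shuffle(files)
    cut = int(len(files) * (1 - valid_pct))
    return files[:cut], files[cut:]
```

Applying it twice, as in the example above, first carves out 30% of the files, then halves that 30% into validation and test sets (15% each).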
Visualizations
- Plot history from model.fit using tf.keras
# plot_history(history)
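A minimal matplotlib sketch of such a `plot_history` helper (the exact signature is an assumption; it accepts either the `History` object returned by `model.fit` or its `.history` dict):

```python
import matplotlib.pyplot as plt

def plot_history(history):
    # Plot every metric recorded during training; history.history maps
    # metric name (e.g. 'loss', 'val_accuracy') to a list of per-epoch values.
    hist = history.history if hasattr(history, 'history') else history
    fig, ax = plt.subplots()
    for name, values in hist.items():
        ax.plot(range(1, len(values) + 1), values, label=name)
    ax.set_xlabel('epoch')
    ax.legend()
    return fig
```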