

  • TensorFlow
  • API
  • TensorFlow v2.11.0
  • Python

tf.keras.preprocessing.image.ImageDataGenerator

View source on GitHub

Generates batches of tensor image data with real-time data augmentation.

View aliases

Compat aliases for migration: see the Migration guide for more details.

Example of using .flow(x, y):

from tensorflow.keras import utils
from tensorflow.keras.datasets import cifar10
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# model, num_classes and epochs are assumed to be defined elsewhere
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
y_train = utils.to_categorical(y_train, num_classes)
y_test = utils.to_categorical(y_test, num_classes)
datagen = ImageDataGenerator(
    featurewise_center=True,
    featurewise_std_normalization=True,
    rotation_range=20,
    width_shift_range=0.2,
    height_shift_range=0.2,
    horizontal_flip=True,
    validation_split=0.2)
# compute quantities required for featurewise normalization
# (std, mean, and principal components if ZCA whitening is applied)
datagen.fit(x_train)
# fits the model on batches with real-time data augmentation:
model.fit(datagen.flow(x_train, y_train, batch_size=32,
         subset='training'),
         validation_data=datagen.flow(x_train, y_train,
         batch_size=8, subset='validation'),
         steps_per_epoch=len(x_train) / 32, epochs=epochs)
# here's a more "manual" example
for e in range(epochs):
    print('Epoch', e)
    batches = 0
    for x_batch, y_batch in datagen.flow(x_train, y_train, batch_size=32):
        model.fit(x_batch, y_batch)
        batches += 1
        if batches >= len(x_train) / 32:
            # we need to break the loop by hand because
            # the generator loops indefinitely
            break

tf.keras.preprocessing.image.ImageDataGenerator(
    featurewise_center=False,
    samplewise_center=False,
    featurewise_std_normalization=False,
    samplewise_std_normalization=False,
    zca_whitening=False,
    zca_epsilon=1e-06,
    rotation_range=0,
    width_shift_range=0.0,
    height_shift_range=0.0,
    brightness_range=None,
    shear_range=0.0,
    zoom_range=0.0,
    channel_shift_range=0.0,
    fill_mode='nearest',
    cval=0.0,
    horizontal_flip=False,
    vertical_flip=False,
    rescale=None,
    preprocessing_function=None,
    data_format=None,
    validation_split=0.0,
    interpolation_order=1,
    dtype=None
)

Used in the notebooks

Used in the tutorials
  • tf.data: Build TensorFlow input pipelines
Deprecated: tf.keras.preprocessing.image.ImageDataGenerator is not recommended for new code. Prefer loading images with tf.keras.utils.image_dataset_from_directory and transforming the output tf.data.Dataset with preprocessing layers. For more information, see the tutorials for loading images and augmenting images, as well as the preprocessing layer guide.
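As a minimal sketch of that migration path (my own mapping of the augmentation options onto preprocessing layers, assuming TF >= 2.9; random arrays stand in for tf.keras.utils.image_dataset_from_directory so the snippet is self-contained):

```python
import numpy as np
import tensorflow as tf

# Preprocessing layers that roughly mirror common ImageDataGenerator options.
augment = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1. / 255),          # ~ rescale=1./255
    tf.keras.layers.RandomFlip("horizontal"),     # ~ horizontal_flip=True
    tf.keras.layers.RandomRotation(20 / 360),     # ~ rotation_range=20 (degrees)
    tf.keras.layers.RandomTranslation(0.2, 0.2),  # ~ height/width_shift_range=0.2
])

# In real code this dataset would come from image_dataset_from_directory(...).
images = (np.random.rand(8, 150, 150, 3) * 255).astype("float32")
ds = tf.data.Dataset.from_tensor_slices(images).batch(4)
ds = ds.map(lambda x: augment(x, training=True),
            num_parallel_calls=tf.data.AUTOTUNE)
batch = next(iter(ds))  # augmented batch of shape (4, 150, 150, 3)
```

The resulting pipeline is an ordinary tf.data.Dataset, so it composes with .shuffle, .prefetch, and Model.fit directly.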

The data will be looped over (in batches).

Example of using .flow_from_directory(directory):

train_datagen = ImageDataGenerator(
        rescale=1./255,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True)
test_datagen = ImageDataGenerator(rescale=1./255)
train_generator = train_datagen.flow_from_directory(
        'data/train',
        target_size=(150, 150),
        batch_size=32,
        class_mode='binary')
validation_generator = test_datagen.flow_from_directory(
        'data/validation',
        target_size=(150, 150),
        batch_size=32,
        class_mode='binary')
model.fit(
        train_generator,
        steps_per_epoch=2000,
        epochs=50,
        validation_data=validation_generator,
        validation_steps=800)

width_shift_range accepts three forms:
  • float: fraction of total width if < 1, or pixels if >= 1
  • 1-D array-like: random elements from the array
  • int: integer number of pixels from the interval (-width_shift_range, +width_shift_range)
With width_shift_range=2, possible values are the integers [-1, 0, +1], the same as with width_shift_range=[-1, 0, +1], while with width_shift_range=1.0, possible values are floats in the interval [-1.0, +1.0).

height_shift_range accepts the same three forms:
  • float: fraction of total height if < 1, or pixels if >= 1
  • 1-D array-like: random elements from the array
  • int: integer number of pixels from the interval (-height_shift_range, +height_shift_range)
With height_shift_range=2, possible values are the integers [-1, 0, +1], the same as with height_shift_range=[-1, 0, +1], while with height_shift_range=1.0, possible values are floats in the interval [-1.0, +1.0).
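To make the three accepted forms concrete, here is a small numpy sketch of how one shift could be drawn under the documented semantics (sample_shift is a hypothetical helper written for this illustration, not the Keras internal):

```python
import numpy as np

def sample_shift(shift_range, size, rng):
    """Draw one shift per the documented width/height_shift_range forms.

    Hypothetical helper for illustration -- not the actual Keras internals.
    """
    if not np.isscalar(shift_range):                  # 1-D array-like: pick an element
        return float(rng.choice(np.asarray(shift_range)))
    if isinstance(shift_range, (int, np.integer)):    # int: whole pixels strictly inside (-r, +r)
        return float(rng.integers(-(shift_range - 1), shift_range))
    if abs(shift_range) < 1:                          # float < 1: fraction of total size
        return rng.uniform(-shift_range * size, shift_range * size)
    return rng.uniform(-shift_range, shift_range)     # float >= 1: pixels

rng = np.random.default_rng(0)
pixel_shift = sample_shift(2, 100, rng)         # one of -1, 0, +1
fractional  = sample_shift(0.2, 100, rng)       # uniform in [-20, 20] pixels
from_list   = sample_shift([-5, 0, 5], 100, rng)  # one of -5, 0, 5
```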
Args

featurewise_center: Boolean. Set input mean to 0 over the dataset, feature-wise.
samplewise_center: Boolean. Set each sample mean to 0.
featurewise_std_normalization: Boolean. Divide inputs by the std of the dataset, feature-wise.
samplewise_std_normalization: Boolean. Divide each input by its std.
zca_epsilon: Epsilon for ZCA whitening. Default is 1e-6.
zca_whitening: Boolean. Apply ZCA whitening.
rotation_range: Int. Degree range for random rotations.
width_shift_range: Float, 1-D array-like or int (see the shift-range details above).
height_shift_range: Float, 1-D array-like or int (see the shift-range details above).
brightness_range: Tuple or list of two floats. Range for picking a brightness shift value from.
shear_range: Float. Shear intensity (shear angle in counter-clockwise direction, in degrees).
zoom_range: Float or [lower, upper]. Range for random zoom. If a float, [lower, upper] = [1 - zoom_range, 1 + zoom_range].
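For instance, the float shorthand expands like this (a minimal numpy sketch; that the x and y zoom factors are drawn independently from the interval is an assumption of this illustration):

```python
import numpy as np

zoom_range = 0.2                               # float shorthand from the docs
lower, upper = 1 - zoom_range, 1 + zoom_range  # -> [0.8, 1.2]

rng = np.random.default_rng(0)
zx, zy = rng.uniform(lower, upper, size=2)     # per-axis zoom factors
```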
channel_shift_range: Float. Range for random channel shifts.
fill_mode: One of {"constant", "nearest", "reflect" or "wrap"}. Default is 'nearest'. Points outside the boundaries of the input are filled according to the given mode:
  • 'constant': kkkkkkkk|abcd|kkkkkkkk (cval=k)
  • 'nearest': aaaaaaaa|abcd|dddddddd
  • 'reflect': abcddcba|abcd|dcbaabcd
  • 'wrap': abcdabcd|abcd|abcdabcd
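The four fill behaviors can be previewed in one dimension with numpy.pad; note the mode-name mapping here is an assumption of this sketch ('nearest' behaves like numpy's 'edge', 'reflect' like numpy's 'symmetric'):

```python
import numpy as np

row = np.array([1, 2, 3, 4])  # the "abcd" row from the fill_mode table
pad = 2

constant = np.pad(row, pad, mode='constant', constant_values=0)  # cval=0
nearest  = np.pad(row, pad, mode='edge')       # repeat the border pixel
reflect  = np.pad(row, pad, mode='symmetric')  # mirror including the border
wrap     = np.pad(row, pad, mode='wrap')       # tile the row periodically

print(constant)  # [0 0 1 2 3 4 0 0]
print(nearest)   # [1 1 1 2 3 4 4 4]
print(reflect)   # [2 1 1 2 3 4 4 3]
print(wrap)      # [3 4 1 2 3 4 1 2]
```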
    cval: Float or Int. Value used for points outside the boundaries when fill_mode = "constant".
    horizontal_flip: Boolean. Randomly flip inputs horizontally.
    vertical_flip: Boolean. Randomly flip inputs vertically.
    rescale: rescaling factor. Defaults to None. If None or 0, no rescaling is applied; otherwise the data is multiplied by the value provided (after applying all other transformations).
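For instance (an illustrative sketch, not from the original docs), rescale=1./255 maps 8-bit pixel values into the [0, 1] range; the operation applied to every pixel is just a multiplication:

```python
import numpy as np

pixels = np.array([0, 128, 255], dtype="uint8")
# what rescale=1./255 would apply to each pixel, after the other transforms
scaled = pixels.astype("float32") * (1.0 / 255)
print(scaled.min(), scaled.max())  # values now lie in [0, 1]
```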
    preprocessing_function: function that will be applied on each input. The function will run after the image is resized and augmented. The function should take one argument: one image (a NumPy tensor with rank 3), and should output a NumPy tensor with the same shape.
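A minimal sketch of such a function (hypothetical, for illustration only): it receives a rank-3 NumPy tensor and must return one of the same shape, here adding Gaussian noise:

```python
import numpy as np

def add_gaussian_noise(image):
    # image: rank-3 NumPy tensor (height, width, channels);
    # the returned tensor must keep exactly this shape
    noise = np.random.normal(loc=0.0, scale=5.0, size=image.shape)
    return image + noise

# it would be plugged in as (hypothetical usage):
# datagen = ImageDataGenerator(preprocessing_function=add_gaussian_noise)
img = np.zeros((150, 150, 3), dtype="float32")
print(add_gaussian_noise(img).shape)  # same shape as the input
```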
    data_format: Image data format, either "channels_first" or "channels_last". "channels_last" mode means that the images should have shape (samples, height, width, channels), while "channels_first" mode means that the images should have shape (samples, channels, height, width). It defaults to the image_data_format value found in your Keras config file at ~/.keras/keras.json. If you never set it, then it will be "channels_last".
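The two layouts differ only in where the channel axis sits; converting between them (an illustrative pure-NumPy sketch, not part of the original docs) is a single axis move:

```python
import numpy as np

# channels_last: (samples, height, width, channels)
batch_last = np.zeros((32, 150, 150, 3))
# channels_first: (samples, channels, height, width)
batch_first = np.moveaxis(batch_last, -1, 1)
print(batch_first.shape)  # (32, 3, 150, 150)
```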
    validation_split: Float. Fraction of images reserved for validation (strictly between 0 and 1).
    dtype: Dtype to use for generated arrays.

    Raises

    • ValueError: If the value of the argument data_format is other than "channels_last" or "channels_first".
    • ValueError: If the value of the argument validation_split is > 1 or < 0.

    Examples

    Example of using .flow(x, y):

    (x_train, y_train), (x_test, y_test) = cifar10.load_data()
    y_train = utils.to_categorical(y_train, num_classes)
    y_test = utils.to_categorical(y_test, num_classes)
    datagen = ImageDataGenerator(
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2)
    # compute quantities required for featurewise normalization
    # (std, mean, and principal components if ZCA whitening is applied)
    datagen.fit(x_train)
    # fits the model on batches with real-time data augmentation:
    model.fit(datagen.flow(x_train, y_train, batch_size=32,
             subset='training'),
             validation_data=datagen.flow(x_train, y_train,
             batch_size=8, subset='validation'),
             steps_per_epoch=len(x_train) / 32, epochs=epochs)
    # here's a more "manual" example
    for e in range(epochs):
        print('Epoch', e)
        batches = 0
        for x_batch, y_batch in datagen.flow(x_train, y_train, batch_size=32):
            model.fit(x_batch, y_batch)
            batches += 1
            if batches >= len(x_train) / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break

    Example of using .flow_from_directory(directory):

    train_datagen = ImageDataGenerator(
            rescale=1./255,
            shear_range=0.2,
            zoom_range=0.2,
            horizontal_flip=True)
    test_datagen = ImageDataGenerator(rescale=1./255)
    train_generator = train_datagen.flow_from_directory(
            'data/train',
            target_size=(150, 150),
            batch_size=32,
            class_mode='binary')
    validation_generator = test_datagen.flow_from_directory(
            'data/validation',
            target_size=(150, 150),
            batch_size=32,
            class_mode='binary')
    model.fit(
            train_generator,
            steps_per_epoch=2000,
            epochs=50,
            validation_data=validation_generator,
            validation_steps=800)

    Example of transforming images and masks together

    tf.keras.preprocessing.image.ImageDataGenerator(
        featurewise_center=False,
        samplewise_center=False,
        featurewise_std_normalization=False,
        samplewise_std_normalization=False,
        zca_whitening=False,
        zca_epsilon=1e-06,
        rotation_range=0,
        width_shift_range=0.0,
        height_shift_range=0.0,
        brightness_range=None,
        shear_range=0.0,
        zoom_range=0.0,
        channel_shift_range=0.0,
        fill_mode='nearest',
        cval=0.0,
        horizontal_flip=False,
        vertical_flip=False,
        rescale=None,
        preprocessing_function=None,
        data_format=None,
        validation_split=0.0,
        interpolation_order=1,
        dtype=None
    )
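    A minimal sketch of the image-and-mask pattern the heading above refers to, based on the usual Keras recipe of sharing one seed between two generators. The arrays `images` and `masks` below are illustrative placeholders, not part of this page's API:

    ```python
    import numpy as np
    from tensorflow.keras.preprocessing.image import ImageDataGenerator

    # Shared augmentation settings for the images and their masks.
    data_gen_args = dict(rotation_range=90.0,
                         width_shift_range=0.1,
                         height_shift_range=0.1,
                         zoom_range=0.2)
    image_datagen = ImageDataGenerator(**data_gen_args)
    mask_datagen = ImageDataGenerator(**data_gen_args)

    # Placeholder data: 8 RGB images and their one-channel masks.
    images = np.random.rand(8, 32, 32, 3)
    masks = np.random.rand(8, 32, 32, 1)

    # Passing the same seed to both flows makes the generators draw
    # identical random transforms and shuffling orders, so each
    # augmented image stays aligned with its augmented mask.
    seed = 1
    image_flow = image_datagen.flow(images, batch_size=4, seed=seed)
    mask_flow = mask_datagen.flow(masks, batch_size=4, seed=seed)

    x_batch = next(image_flow)
    y_batch = next(mask_flow)
    ```

    The paired batches can then be fed to a segmentation model, e.g. via zip(image_flow, mask_flow).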
    

    Methods


    View source

    apply_transform(
        x, transform_parameters
    )

    Applies a transformation to an image according to the given parameters.

    Args

    x: 3D tensor, single image.
    transform_parameters: Dictionary with string - parameter pairs describing the transformation. Currently, the following parameters from the dictionary are used:

    • 'theta': Float. Rotation angle in degrees.
    • 'tx': Float. Shift in the x direction.
    • 'ty': Float. Shift in the y direction.
    • 'shear': Float. Shear angle in degrees.
    • 'zx': Float. Zoom in the x direction.
    • 'zy': Float. Zoom in the y direction.
    • 'flip_horizontal': Boolean. Horizontal flip.
    • 'flip_vertical': Boolean. Vertical flip.
    • 'channel_shift_intensity': Float. Channel shift intensity.
    • 'brightness': Float. Brightness shift intensity.

    Returns

    A transformed version of the input (same shape).
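    As a quick sketch of calling this method directly (the random image below is a placeholder for real data):

    ```python
    import numpy as np
    from tensorflow.keras.preprocessing.image import ImageDataGenerator

    datagen = ImageDataGenerator()
    x = np.random.rand(64, 64, 3)  # a single image: (height, width, channels)

    # Rotate by 30 degrees and flip horizontally in one call.
    out = datagen.apply_transform(x, {'theta': 30.0, 'flip_horizontal': True})
    ```

    The returned array has the same shape as the input.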


    View source

    fit(
        x, augment=False, rounds=1, seed=None
    )

    Fits the data generator to some sample data.

    This computes the internal data statistics related to the data-dependent transformations, based on an array of sample data.

    Only required if featurewise_center or featurewise_std_normalization or zca_whitening are set to True.

    When rescale is set to a value, rescaling is applied to the sample data before computing the internal data statistics.

    Args

    x: Sample data. Should have rank 4. In case of grayscale data, the channels axis should have value 1; in case of RGB data, it should have value 3; and in case of RGBA data, it should have value 4.
    augment: Boolean (default: False). Whether to fit on randomly augmented samples.
    rounds: Int (default: 1). If using data augmentation (augment=True), this is how many augmentation passes over the data to use.
seed: Int (default: None). Random seed.
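What fit() estimates under featurewise_center and featurewise_std_normalization can be sketched with plain NumPy (a minimal illustration, not the Keras implementation; ZCA whitening would additionally require the principal components):

```python
import numpy as np

# Stand-in dataset: (samples, height, width, channels)
x = np.random.default_rng(0).random((16, 4, 4, 3))

# fit() estimates one mean and std per channel over the whole dataset;
# the generator then standardizes every image with these shared values.
mean = x.mean(axis=(0, 1, 2))
std = x.std(axis=(0, 1, 2))
x_norm = (x - mean) / (std + 1e-6)

# After normalization each channel is (approximately) zero-mean, unit-std.
print(x_norm.mean(axis=(0, 1, 2)), x_norm.std(axis=(0, 1, 2)))
```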

flow

View source

flow(
    x,
    y=None,
    batch_size=32,
    shuffle=True,
    sample_weight=None,
    seed=None,
    save_to_dir=None,
    save_prefix='',
    save_format='png',
    ignore_class_split=False,
    subset=None
)

Takes data and label arrays, and generates batches of augmented data.

Args:
x: Input data. Numpy array of rank 4 or a tuple. If a tuple, the first element should contain the images and the second element another numpy array or a list of numpy arrays that gets passed to the output without any modifications. Can be used to feed the model miscellaneous data along with the images. In case of grayscale data, the channels axis of the image array should have value 1, in case of RGB data, it should have value 3, and in case of RGBA data, it should have value 4.
y: Labels.
batch_size: Int (default: 32).
shuffle: Boolean (default: True).
sample_weight: Sample weights.
seed: Int (default: None).
save_to_dir: None or str (default: None). This allows you to optionally specify a directory to which to save the augmented pictures being generated (useful for visualizing what you are doing).
save_prefix: Str (default: ''). Prefix to use for filenames of saved pictures (only relevant if save_to_dir is set).
save_format: one of "png", "jpeg", "bmp", "pdf", "ppm", "gif", "tif", "jpg" (only relevant if save_to_dir is set). Default: "png".
ignore_class_split: Boolean (default: False), ignore difference in number of classes in labels across train and validation split (useful for non-classification tasks).
subset: Subset of data ("training" or "validation") if validation_split is set in ImageDataGenerator.

Returns:
An Iterator yielding tuples of (x, y) where x is a numpy array of image data (in the case of a single image input) or a list of numpy arrays (in the case of additional inputs) and
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    68 là một mảng có nhiều nhãn tương ứng. Nếu 'sample_weight' không phải là Không có, các bộ dữ liệu được tạo ra có dạng
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    09. Nếu
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    68 là Không, chỉ có mảng numpy
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    34 được trả về. Tăng
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    94Nếu Giá trị của đối số,
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    00 không phải là "đào tạo" hoặc "xác thực"

    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    14
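The "manual" loop in the example above only terminates because it counts batches by hand; flow yields batches indefinitely. The pattern can be sketched with any infinite generator (the make_batches helper below is illustrative, not part of the Keras API):

```python
def make_batches(n_samples, batch_size):
    """Yield (start, end) index pairs forever, cycling through the data,
    the way datagen.flow cycles through image batches."""
    while True:
        for start in range(0, n_samples, batch_size):
            yield start, min(start + batch_size, n_samples)

n_samples, batch_size = 100, 32
batches = 0
for lo, hi in make_batches(n_samples, batch_size):
    batches += 1
    # Without this check the loop would never terminate,
    # because the generator loops indefinitely.
    if batches >= n_samples / batch_size:
        break

print(batches)  # 4 batches cover 100 samples at batch_size 32
```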

    flow_from_dataframe

    View source

    Takes a dataframe and a path to a directory and generates batches.

    The generated batches contain augmented/normalized data.

    A simple tutorial can be found here.
    • if class_mode is "categorical" (the default value) it must include the y_col column with the class/es of each image. Values in the column can be string/list/tuple if a single class, or list/tuple if multiple classes.
    • if class_mode is "binary" or "sparse" it must include the given y_col column with class values as strings.
    • if class_mode is "raw" or "multi_output" it should contain the columns specified in y_col.
    • if class_mode is "input" or None no extra column is needed.
  • [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    19. Mảng 1D numpy của nhãn nhị phân,
  • [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    16. Mảng 2D gọn gàng của các nhãn được mã hóa một chiều. Hỗ trợ đầu ra đa nhãn
  • [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    27. hình ảnh giống hệt với hình ảnh đầu vào [chủ yếu được sử dụng để hoạt động với bộ mã hóa tự động],
  • [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    24. danh sách với các giá trị của các cột khác nhau,
  • [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    23. mảng có nhiều giá trị trong [các] cột
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    17,
  • [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    20. Mảng numpy 1D gồm các nhãn số nguyên,
  • [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    28, không có mục tiêu nào được trả về [trình tạo sẽ chỉ tạo ra các lô dữ liệu hình ảnh, rất hữu ích khi sử dụng trong
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    37]
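As a quick sanity check on the target shapes listed above, here is a small numpy sketch (numpy only, no TensorFlow required; three classes and four samples are arbitrary choices) of what "sparse", "binary", and "categorical" targets look like:

```python
import numpy as np

num_classes = 3
sparse = np.array([0, 2, 1, 2])        # "sparse": 1D array of integer labels
binary = np.array([0, 1, 1, 0])        # "binary": 1D array of 0/1 labels
one_hot = np.eye(num_classes)[sparse]  # "categorical": 2D one-hot encoded labels

print(sparse.shape)   # (4,)
print(binary.shape)   # (4,)
print(one_hot.shape)  # (4, 3)
print(one_hot[1])     # [0. 0. 1.] -- sample 1 belongs to class 2
```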
  • Args
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    38Pandas dataframe chứa filepath liên quan đến
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    39 [hoặc đường dẫn tuyệt đối nếu
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    39 là Không có] của hình ảnh trong cột chuỗi. Nó nên bao gồm các cột khác tùy thuộc vào
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    15.
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    39string, đường dẫn đến thư mục để đọc ảnh từ. Nếu
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    28, dữ liệu trong cột
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    44 phải là đường dẫn tuyệt đối. Chuỗi
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    44, cột trong
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    38 chứa tên tệp [hoặc đường dẫn tuyệt đối nếu
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    39 là
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    28].
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    y_col: string or list, column/s in the dataframe that has the target data.
    
    weight_col: string, column in the dataframe that contains the sample weights. Default: None.
    
    target_size: tuple of integers (height, width), default: (256, 256). The dimensions to which all images found will be resized.
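    Every image is resized to target_size before batching. As a toy illustration of what such a resize does, here is a nearest-neighbour sketch on a 2-D list (illustrative only; the real loader relies on an image library's interpolation):

    ```python
    def resize_nearest(img, target_h, target_w):
        # img is a 2-D list (grayscale); pick the nearest source pixel per cell.
        src_h, src_w = len(img), len(img[0])
        return [[img[r * src_h // target_h][c * src_w // target_w]
                 for c in range(target_w)]
                for r in range(target_h)]

    img = [[1, 2], [3, 4]]
    print(resize_nearest(img, 4, 4))  # each source pixel expands to a 2x2 block
    ```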
    
    color_mode: one of "grayscale", "rgb", "rgba". Default: "rgb". Whether the images will be converted to have 1 or 3 color channels.
    
    classes: optional list of classes (e.g. ['dogs', 'cats']). Default: None. If not provided, the list of classes will be automatically inferred from the y_col (which will map to the label indices, and will be alphanumeric). The dictionary containing the mapping from class names to class indices can be obtained via the attribute class_indices.
    
    class_mode: one of "binary", "categorical", "input", "multi_output", "raw", "sparse" or None. Default: "categorical". Mode for yielding the targets.
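    For intuition, "sparse" targets are plain integer class indices while "categorical" targets are their one-hot encodings. A minimal sketch (illustrative only; Keras uses its own utilities for this):

    ```python
    def to_one_hot(labels, num_classes):
        # "categorical" mode yields one-hot rows like these.
        return [[1.0 if i == label else 0.0 for i in range(num_classes)]
                for label in labels]

    labels = [0, 2, 1]               # "sparse" mode: integer class indices
    one_hot = to_one_hot(labels, 3)  # "categorical" mode: one-hot vectors
    print(one_hot[1])                # -> [0.0, 0.0, 1.0]
    ```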
    
    batch_size: size of the batches of data (default: 32).
    
    shuffle: whether to shuffle the data (default: True).
    seed: optional random seed for shuffling and transformations.
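    Fixing seed makes the shuffle order reproducible across runs. A minimal stdlib sketch of the idea (an illustration, not the Keras internals):

    ```python
    import random

    def shuffled_indices(n, seed):
        # The same seed produces the same permutation on every call.
        idx = list(range(n))
        random.Random(seed).shuffle(idx)
        return idx

    # Two generators built with the same seed would walk samples in the
    # same order; a different seed gives a different (but fixed) order.
    print(shuffled_indices(10, seed=42) == shuffled_indices(10, seed=42))
    ```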
    
    save_to_dir: None or str (default: None). This allows you to optionally specify a directory to which to save the augmented pictures being generated (useful for visualizing what you are doing).
    
    save_prefix: str. Prefix to use for filenames of saved pictures (only relevant if save_to_dir is set).
    
    save_format: one of "png", "jpeg", "bmp", "pdf", "ppm", "gif", "tif", "jpg" (only relevant if save_to_dir is set). Default: "png".
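    The files written to save_to_dir combine save_prefix and save_format into a name. The exact pattern is an implementation detail of the library; the helper below is a hypothetical reconstruction for illustration only:

    ```python
    import random

    def augmented_filename(prefix, index, fmt, rng=random.Random(0)):
        # Hypothetical sketch: prefix + batch index + random tag + extension.
        return "{}_{}_{}.{}".format(prefix, index, rng.randint(0, 10**7), fmt)

    print(augmented_filename("aug", 3, "png"))  # e.g. "aug_3_<tag>.png"
    ```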
    
    subset: subset of data ("training" or "validation") if validation_split is set in ImageDataGenerator.
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    76Phương pháp nội suy được sử dụng để lấy mẫu lại hình ảnh nếu kích thước mục tiêu khác với kích thước của hình ảnh được tải. Các phương pháp được hỗ trợ là
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    77,
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    78 và
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    79. Nếu PIL phiên bản 1. 1. 3 hoặc mới hơn được cài đặt,
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    80 cũng được hỗ trợ. Nếu PIL phiên bản 3. 4. 0 hoặc mới hơn được cài đặt,
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    81 và
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    82 cũng được hỗ trợ. Theo mặc định,
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    77 được sử dụng.
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    84Boolean, có xác thực tên tệp hình ảnh trong
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    44 hay không. Nếu
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    86, hình ảnh không hợp lệ sẽ bị bỏ qua. Vô hiệu hóa tùy chọn này có thể dẫn đến tăng tốc khi thực hiện chức năng này. Mặc định là
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    86.
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    88 lập luận kế thừa để đưa ra cảnh báo không dùng nữa. ReturnsA
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    89 mang lại các bộ dữ liệu của
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    06 trong đó
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    34 là một mảng khó hiểu chứa một loạt hình ảnh có hình dạng
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    92 và
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    68 là một mảng khó hiểu của các nhãn tương ứng

    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    94
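The shape contract of those yielded tuples can be illustrated with plain NumPy (a sketch, not the iterator implementation; `batch_size`, `target_size`, and `channels` are illustrative values):

```python
import numpy as np

batch_size, target_size, channels = 32, (256, 256), 3

# x: a batch of images with shape (batch_size, *target_size, channels)
x = np.zeros((batch_size, *target_size, channels), dtype=np.float32)
# y: one label per image in the batch, aligned with x along axis 0
y = np.zeros((batch_size,), dtype=np.int64)

print(x.shape)  # (32, 256, 256, 3)
print(len(x) == len(y))  # the iterator always keeps these aligned
```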

    flow_from_directory

    View source

    Takes the path to a directory and generates batches of augmented data.
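The expected on-disk layout is one subdirectory per class under the root directory. A standard-library sketch of that layout, and of how the class list is inferred from the sorted subdirectory names when `classes` is not given (the directory and class names here are made up for illustration):

```python
import os
import tempfile

# Build a miniature dataset tree: root/<class_name>/<image files>.
root = tempfile.mkdtemp()
for class_name in ("cats", "dogs"):
    os.makedirs(os.path.join(root, class_name))
    for i in range(3):
        # Empty placeholder files standing in for real PNG/JPG images.
        open(os.path.join(root, class_name, f"img_{i}.png"), "w").close()

# flow_from_directory infers the class list from the subdirectory
# names (sorted alphanumerically) when `classes` is not given.
inferred_classes = sorted(
    d for d in os.listdir(root)
    if os.path.isdir(os.path.join(root, d))
)
print(inferred_classes)  # ['cats', 'dogs']
```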

    • "phân loại" sẽ là nhãn được mã hóa một chiều 2D,
    • "nhị phân" sẽ là nhãn nhị phân 1D, "thưa thớt" sẽ là nhãn số nguyên 1D,
    • "đầu vào" sẽ là hình ảnh giống với hình ảnh đầu vào [chủ yếu được sử dụng để hoạt động với bộ mã hóa tự động]
    • Nếu Không, không có nhãn nào được trả về [trình tạo sẽ chỉ tạo ra các lô dữ liệu hình ảnh, rất hữu ích khi sử dụng với
      [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
      y_train = utils.to_categorical[y_train, num_classes]
      y_test = utils.to_categorical[y_test, num_classes]
      datagen = ImageDataGenerator[
          featurewise_center=True,
          featurewise_std_normalization=True,
          rotation_range=20,
          width_shift_range=0.2,
          height_shift_range=0.2,
          horizontal_flip=True,
          validation_split=0.2]
      # compute quantities required for featurewise normalization
      # [std, mean, and principal components if ZCA whitening is applied]
      datagen.fit[x_train]
      # fits the model on batches with real-time data augmentation:
      model.fit[datagen.flow[x_train, y_train, batch_size=32,
               subset='training'],
               validation_data=datagen.flow[x_train, y_train,
               batch_size=8, subset='validation'],
               steps_per_epoch=len[x_train] / 32, epochs=epochs]
      # here's a more "manual" example
      for e in range[epochs]:
          print['Epoch', e]
          batches = 0
          for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
              model.fit[x_batch, y_batch]
              batches += 1
              if batches >= len[x_train] / 32:
                  # we need to break the loop by hand because
                  # the generator loops indefinitely
                  break
      
      95]. Xin lưu ý rằng trong trường hợp class_mode Không có, dữ liệu vẫn cần nằm trong thư mục con của
      [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
      y_train = utils.to_categorical[y_train, num_classes]
      y_test = utils.to_categorical[y_test, num_classes]
      datagen = ImageDataGenerator[
          featurewise_center=True,
          featurewise_std_normalization=True,
          rotation_range=20,
          width_shift_range=0.2,
          height_shift_range=0.2,
          horizontal_flip=True,
          validation_split=0.2]
      # compute quantities required for featurewise normalization
      # [std, mean, and principal components if ZCA whitening is applied]
      datagen.fit[x_train]
      # fits the model on batches with real-time data augmentation:
      model.fit[datagen.flow[x_train, y_train, batch_size=32,
               subset='training'],
               validation_data=datagen.flow[x_train, y_train,
               batch_size=8, subset='validation'],
               steps_per_epoch=len[x_train] / 32, epochs=epochs]
      # here's a more "manual" example
      for e in range[epochs]:
          print['Epoch', e]
          batches = 0
          for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
              model.fit[x_batch, y_batch]
              batches += 1
              if batches >= len[x_train] / 32:
                  # we need to break the loop by hand because
                  # the generator loops indefinitely
                  break
      
      39 để nó hoạt động chính xác
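To make the label formats above concrete, here is a small NumPy sketch (illustrative only, not the Keras implementation) contrasting the 1D integer labels of class_mode="sparse" with the 2D one-hot labels of class_mode="categorical":

```python
import numpy as np

def to_one_hot(labels, num_classes):
    """Turn 1D integer labels ("sparse") into 2D one-hot rows ("categorical")."""
    one_hot = np.zeros((len(labels), num_classes), dtype="float32")
    one_hot[np.arange(len(labels)), labels] = 1.0
    return one_hot

sparse_labels = np.array([0, 2, 1])                # shape (3,): 1D integers
categorical_labels = to_one_hot(sparse_labels, 3)  # shape (3, 3): 2D one-hot
```

Each row of the 2D array has a single 1.0 in the column of the sample's class, which is exactly the format keras.utils.to_categorical produces in the examples above.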
    Args:
        directory: string, path to the target directory. It should contain one subdirectory per class. Any PNG, JPG, BMP, PPM or TIF images inside each of the subdirectories in the directory tree will be included in the generator. See this script for more details.
        target_size: tuple of integers (height, width), defaults to (256, 256). The dimensions to which all images found will be resized.
        color_mode: one of "grayscale", "rgb", "rgba". Default: "rgb". Whether the images will be converted to have 1, 3, or 4 channels.
        classes: optional list of class subdirectories (e.g. ['dogs', 'cats']). Default: None. If not provided, the list of classes will be automatically inferred from the subdirectory names/structure under directory, where each subdirectory will be treated as a different class (and the order of the classes, which will map to the label indices, will be alphanumeric). The dictionary containing the mapping from class names to class indices can be obtained via the class_indices attribute.
        class_mode: one of "categorical", "binary", "sparse", "input", or None. Default: "categorical". Determines the type of label arrays that are returned (see the list above).
        batch_size: size of the batches of data (default: 32).
        shuffle: whether to shuffle the data (default: True). If set to False, sorts the data in alphanumeric order.
        seed: optional random seed for shuffling and transformations.
        save_to_dir: None or str (default: None). This allows you to optionally specify a directory to which to save the augmented pictures being generated (useful for visualizing what you are doing).
        save_prefix: str. Prefix to use for filenames of saved pictures (only relevant if save_to_dir is set).
        save_format: one of "png", "jpeg", "bmp", "pdf", "ppm", "gif", "tif", "jpg" (only relevant if save_to_dir is set). Default: "png".

    Example of using flow_from_directory:
    train_datagen = ImageDataGenerator(
            rescale=1./255,
            shear_range=0.2,
            zoom_range=0.2,
            horizontal_flip=True)
    test_datagen = ImageDataGenerator(rescale=1./255)
    train_generator = train_datagen.flow_from_directory(
            'data/train',
            target_size=(150, 150),
            batch_size=32,
            class_mode='binary')
    validation_generator = test_datagen.flow_from_directory(
            'data/validation',
            target_size=(150, 150),
            batch_size=32,
            class_mode='binary')
    model.fit(
            train_generator,
            steps_per_epoch=2000,
            epochs=50,
            validation_data=validation_generator,
            validation_steps=800)
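The alphanumeric class inference described under the classes argument can be mimicked with the standard library alone. This sketch builds a toy directory tree (hypothetical class names) and reproduces the class_indices mapping rather than calling Keras:

```python
import os
import tempfile

# Build a toy directory tree: one subdirectory per class.
root = tempfile.mkdtemp()
for name in ["dogs", "cats", "birds"]:
    os.makedirs(os.path.join(root, name))

# flow_from_directory sorts subdirectory names alphanumerically and
# maps each one to a label index; class_indices exposes that mapping.
classes = sorted(
    d for d in os.listdir(root)
    if os.path.isdir(os.path.join(root, d))
)
class_indices = {name: i for i, name in enumerate(classes)}
print(class_indices)  # {'birds': 0, 'cats': 1, 'dogs': 2}
```

Because the ordering is alphanumeric rather than creation order, renaming a class subdirectory can silently change which label index it maps to; inspect class_indices on the returned iterator if the mapping matters downstream.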
    
    15Liệu có nên theo các liên kết tượng trưng bên trong các thư mục con của lớp hay không [mặc định. Sai].
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    00Tập dữ liệu con [
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    01 hoặc
    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    interpolation: Interpolation method used to resample the image if the
    target size is different from that of the loaded image. Supported
    methods are "nearest", "bilinear", and "bicubic". If PIL version 1.1.3
    or newer is installed, "lanczos" is also supported. If PIL version
    3.4.0 or newer is installed, "box" and "hamming" are also supported.
    By default, "nearest" is used.
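    The interpolation names above map onto Pillow resampling filters. A small sketch of that mapping (the 64×48 synthetic input and 150×150 target size are arbitrary, and this assumes a Pillow version new enough for all six filters):

```python
from PIL import Image

# A tiny synthetic image stands in for a loaded file.
img = Image.new("RGB", (64, 48), color=(200, 30, 30))

filters = {
    "nearest": Image.NEAREST,
    "bilinear": Image.BILINEAR,
    "bicubic": Image.BICUBIC,
    "lanczos": Image.LANCZOS,   # PIL >= 1.1.3
    "box": Image.BOX,           # PIL >= 3.4.0
    "hamming": Image.HAMMING,   # PIL >= 3.4.0
}
# Resample the same image once per filter.
out = {name: img.resize((150, 150), resample=f)
       for name, f in filters.items()}
```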
    train_datagen = ImageDataGenerator(
            rescale=1./255,
            shear_range=0.2,
            zoom_range=0.2,
            horizontal_flip=True)
    test_datagen = ImageDataGenerator(rescale=1./255)
    train_generator = train_datagen.flow_from_directory(
            'data/train',
            target_size=(150, 150),
            batch_size=32,
            class_mode='binary')
    validation_generator = test_datagen.flow_from_directory(
            'data/validation',
            target_size=(150, 150),
            batch_size=32,
            class_mode='binary')
    model.fit(
            train_generator,
            steps_per_epoch=2000,
            epochs=50,
            validation_data=validation_generator,
            validation_steps=800)
    keep_aspect_ratio: Boolean, whether to resize images to a target size
    without aspect ratio distortion. The image is cropped in the center
    with target aspect ratio before resizing.

    Returns

    A DirectoryIterator yielding tuples of (x, y) where x is a numpy
    array containing a batch of images with shape
    (batch_size, *target_size, channels) and y is a numpy array of
    corresponding labels.

    get_random_transform

    View source

    get_random_transform(
        img_shape, seed=None
    )

    Generates random parameters for a transformation.

    Args

    img_shape: Tuple of integers. Shape of the image that is transformed.
    seed: Random seed.

    Returns

    A dictionary containing randomly chosen parameters describing the
    transformation.
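    As a rough illustration of the contract described above, here is a hypothetical simplification (not the Keras implementation, and covering only a few of the real parameters) that draws such a parameter dictionary:

```python
import random

def get_random_transform_sketch(img_shape, rotation_range=20,
                                width_shift_range=0.2, seed=None):
    # Draw the parameters once; in Keras the resulting dict is later
    # consumed by apply_transform to warp a single image.
    rng = random.Random(seed)
    w = img_shape[1]
    return {
        "theta": rng.uniform(-rotation_range, rotation_range),
        "tx": rng.uniform(-width_shift_range, width_shift_range) * w,
        "flip_horizontal": rng.random() < 0.5,
    }

params = get_random_transform_sketch((150, 150, 3), seed=42)
```

    Passing the same seed reproduces the same dictionary, which is what makes it possible to apply an identical transformation to paired inputs such as an image and its segmentation mask.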

    random_transform

    View source

    random_transform(
        x, seed=None
    )

    Applies a random transformation to an image.

    Args

    x: 3D tensor, single image.
    seed: Random seed.

    Returns

    A randomly transformed version of the input (same shape).

    Xem nguồn

    [x_train, y_train], [x_test, y_test] = cifar10.load_data[]
    y_train = utils.to_categorical[y_train, num_classes]
    y_test = utils.to_categorical[y_test, num_classes]
    datagen = ImageDataGenerator[
        featurewise_center=True,
        featurewise_std_normalization=True,
        rotation_range=20,
        width_shift_range=0.2,
        height_shift_range=0.2,
        horizontal_flip=True,
        validation_split=0.2]
    # compute quantities required for featurewise normalization
    # [std, mean, and principal components if ZCA whitening is applied]
    datagen.fit[x_train]
    # fits the model on batches with real-time data augmentation:
    model.fit[datagen.flow[x_train, y_train, batch_size=32,
             subset='training'],
             validation_data=datagen.flow[x_train, y_train,
             batch_size=8, subset='validation'],
             steps_per_epoch=len[x_train] / 32, epochs=epochs]
    # here's a more "manual" example
    for e in range[epochs]:
        print['Epoch', e]
        batches = 0
        for x_batch, y_batch in datagen.flow[x_train, y_train, batch_size=32]:
            model.fit[x_batch, y_batch]
            batches += 1
            if batches >= len[x_train] / 32:
                # we need to break the loop by hand because
                # the generator loops indefinitely
                break
    
    1

    Áp dụng cấu hình chuẩn hóa tại chỗ cho một lô đầu vào

    x is changed in-place because this function is mainly used internally to standardize images and feed them to your network. Creating a copy of x instead would have a significant performance cost. If you want to apply this method without changing the input in-place, call it on a copy first.
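    Concretely, a copy-then-standardize call can be sketched as follows; the random batch and the generator settings here are illustrative, not from the docs:

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

datagen = ImageDataGenerator(featurewise_center=True,
                             featurewise_std_normalization=True)
x = np.random.rand(10, 32, 32, 3)    # hypothetical batch of images
datagen.fit(x)                       # compute featurewise mean and std

standardized_x = np.copy(x)          # copy first, so x itself stays untouched
datagen.standardize(standardized_x)  # normalizes the copy in place
```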

    How do I convert an image to JPG in Python?

    Algorithm:
    Import the Image module from PIL and import the os module.
    Open the image to be converted with Image.open().
    Display the size of the image before conversion with os.path.getsize().
    Convert the image with Image.convert().
    Save the converted image with Image.save().
    Display the size of the image after conversion with os.path.getsize().
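    The steps above can be sketched with Pillow; the file names are hypothetical, and a small PPM is generated first so the snippet is self-contained:

```python
from PIL import Image
import os

# Generate a small sample PPM so the conversion has an input (names are illustrative)
src, dst = "sample.ppm", "sample.jpg"
Image.new("RGB", (64, 64), color=(255, 0, 0)).save(src)

print("Size before:", os.path.getsize(src), "bytes")
img = Image.open(src)         # read the PPM
img.convert("RGB").save(dst)  # JPEG needs RGB; format is inferred from the extension
print("Size after:", os.path.getsize(dst), "bytes")
```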

    What is PPM in Python?

    The image format we will use is called PPM, short for Portable Pixmap Format.
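    As a sketch of what the format looks like, a plain-text (P3) PPM can be written with nothing but the standard library; the 2x2 image and the file name are made up for illustration:

```python
# Write a tiny 2x2 plain-text (P3) PPM by hand
width, height, maxval = 2, 2, 255
pixels = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]

with open("tiny.ppm", "w") as f:
    f.write(f"P3\n{width} {height}\n{maxval}\n")  # header: magic number, size, max value
    for r, g, b in pixels:
        f.write(f"{r} {g} {b}\n")                 # one RGB triple per line
```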
