Lecture 05 - Filtering, pooling, and convolution

MachineLearningCourse.Lecture05 - Module
Lecture05

Filtering, pooling, and convolution.

Available Functions

  • filter_image(): Demonstrate image filtering with a given image and kernel
  • stock_data_moving_average(): Moving average of stock data
  • demo(): CNN demo for MNIST handwritten digit recognition

Usage

using ImageView
using MachineLearningCourse
image = Lecture05.filter_image()
imshow(image)

using MachineLearningCourse
Lecture05.stock_data_moving_average()

using MachineLearningCourse
Lecture05.demo()
MachineLearningCourse.Lecture05.LeNet5 - Type
LeNet5(input_dims)

LeNet-5 CNN structure with convolutional layers.

Arguments

  • input_dims::Vector{Int}: Input dimensions [height, width, channels]
  • ϕ: Activation function for hidden layers

Fields

  • input_dims::Vector{Int}: Input dimensions specification
  • model::Flux.Chain: The underlying Flux model
  • layer_dimensions::Vector: Layer dimension information

The model follows the classic LeNet-5 architecture: two convolution/pooling stages followed by fully connected layers.

Example

# Create network for 32×32 RGB images
network = LeNet5([32, 32, 3])
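For orientation, the underlying Flux.Chain might look like the following sketch for 28×28×1 inputs. The layer sizes and the relu activation are assumptions based on the standard LeNet-5 layout, not necessarily what LeNet5() builds:

```julia
using Flux

# LeNet-5 style chain for 28×28 grayscale inputs (illustrative only)
model = Chain(
    Conv((5, 5), 1 => 6, relu),   # 28×28×1 → 24×24×6
    MaxPool((2, 2)),              # → 12×12×6
    Conv((5, 5), 6 => 16, relu),  # → 8×8×16
    MaxPool((2, 2)),              # → 4×4×16
    Flux.flatten,                 # → 256 features per sample
    Dense(256 => 120, relu),
    Dense(120 => 84, relu),
    Dense(84 => 10),              # raw logits for 10 digit classes
)
```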
MachineLearningCourse.Lecture05.accuracy - Method
accuracy(network::LeNet5, X, Y_onehot)

Calculate accuracy of LeNet-5 CNN on test data.

Arguments

  • network::LeNet5: Trained LeNet-5 network
  • X::Array{Float32, 4}: Input data (height × width × channels × samples)
  • Y_onehot::Matrix{Float32}: One-hot encoded target labels (classes × samples)

Returns

  • Float32: Accuracy as a fraction between 0 and 1
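The computation amounts to comparing, per column, the index of the largest score against the one-hot target. A minimal sketch with illustrative data (the module's accuracy takes the network and raw inputs instead of precomputed scores):

```julia
using Statistics

Ŷ = Float32[0.9 0.1; 0.1 0.8; 0.0 0.1]  # scores, 3 classes × 2 samples
Y = Float32[1 0; 0 0; 0 1]              # one-hot targets

predicted = [argmax(col) for col in eachcol(Ŷ)]
actual    = [argmax(col) for col in eachcol(Y)]
acc = mean(predicted .== actual)        # 0.5 for this toy data
```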
MachineLearningCourse.Lecture05.apply_filter - Method
apply_filter(image, kernel)

Apply a filter kernel to an image using a sliding window.

Arguments

  • image: Input image (color or grayscale)
  • kernel::AbstractMatrix: Filter kernel to apply

Returns

  • Gray: Filtered grayscale image normalized to [0,1]

Notes

Color images are automatically converted to grayscale before filtering.

Example

using Images
img = load("image.jpg")
filtered = apply_filter(img, kernels[:edge_enhance])
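The `kernels` dictionary referenced above presumably holds standard 3×3 filter matrices. A sketch of what such entries commonly look like (the exact keys and values in this module may differ; `:edge_enhance` here is the common identity-plus-Laplacian sharpening kernel):

```julia
# Standard 3×3 convolution kernels (illustrative values)
kernels = Dict(
    :laplacian    => [0 -1 0; -1 4 -1; 0 -1 0],   # edge detection
    :sobel_x      => [-1 0 1; -2 0 2; -1 0 1],    # horizontal gradient
    :edge_enhance => [0 -1 0; -1 5 -1; 0 -1 0],   # identity + Laplacian
    :box_blur     => fill(1 / 9, 3, 3),           # uniform smoothing
)
```

Edge-detecting kernels sum to zero so that flat image regions map to zero response, while smoothing kernels sum to one to preserve average brightness.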
MachineLearningCourse.Lecture05.demo - Method
demo(; seed=42, train_size=5000, test_size=1000, η=0.001, epochs=50, batch_size=128, verbose=true)

Train and evaluate LeNet-5 CNN on MNIST dataset with grayscale images.

Arguments

  • seed: Random seed for reproducibility (default: 42)
  • train_size: Number of training samples (default: 5000)
  • test_size: Number of test samples (default: 1000)
  • η: Learning rate for Adam optimizer (default: 0.001)
  • epochs: Number of training epochs (default: 50)
  • batch_size: Mini-batch size for SGD training (default: 128)
  • verbose: Print training progress (default: true)

Returns

  • Tuple{LeNet5, Vector{Float32}}: (model, losses)
    • model: Trained LeNet-5 CNN model for grayscale images
    • losses: Training loss values per epoch

Notes

This demonstration trains a LeNet-5 CNN on grayscale handwritten digit images:

  • Input dimensions: 28×28×1 (grayscale)
  • First conv layer processes 1 input channel
  • Dataset: MNIST handwritten digits
  • Uses Float32 tensors for optimal Flux.jl performance

Example

# Basic usage
model, losses = Lecture05.demo()

# Custom parameters
model, losses = Lecture05.demo(epochs=100, batch_size=64)
MachineLearningCourse.Lecture05.evaluate - Method
evaluate(network::LeNet5, X_test, Y_test, num_classes)

Comprehensive evaluation of LeNet-5 CNN with confusion matrix and per-class metrics.

Arguments

  • network::LeNet5: Trained LeNet-5 network
  • X_test::Array{Float32, 4}: Test input data (height × width × channels × samples)
  • Y_test::Matrix{Float32}: Test target labels (one-hot encoded, classes × samples)
  • num_classes::Int: Number of classes

Returns

  • NamedTuple: (accuracy=Float32, predictions=Vector, truelabels=Vector, confusionmatrix=Matrix{Int})
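The confusion matrix is built by counting (true class, predicted class) pairs. A minimal sketch, assuming rows index the true class and columns the predicted class (the orientation used by evaluate may differ):

```julia
# Count how often each true class t was predicted as class p
function confusion_matrix(truelabels, predictions, num_classes)
    cm = zeros(Int, num_classes, num_classes)
    for (t, p) in zip(truelabels, predictions)
        cm[t, p] += 1
    end
    return cm
end

cm = confusion_matrix([1, 2, 2, 3], [1, 2, 3, 3], 3)
# cm[2, 3] == 1 records the one sample of class 2 predicted as class 3
```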
MachineLearningCourse.Lecture05.fetch_stock_data - Function
fetch_stock_data(symbol="AAPL", period="1y")

Fetch stock price data from Yahoo Finance API.

Arguments

  • symbol::String: Stock symbol (default: "AAPL")
  • period::String: Time period for data (default: "1y"). Valid periods: "1d", "5d", "1mo", "3mo", "6mo", "1y", "2y", "5y", "10y", "ytd", "max"

Returns

  • Tuple{Vector{DateTime}, Vector{Float64}}: (dates, closing_prices), or (nothing, nothing) on error

Example

dates, prices = fetch_stock_data("MSFT", "6mo")
MachineLearningCourse.Lecture05.filter_image - Function
filter_image(image_path=joinpath(@__DIR__, "containers_CC0.jpg"), kernel=kernels[:laplacian])

Demonstrate image filtering with a given image and kernel.

Arguments

  • image_path::String: Path to image file (default: containers_CC0.jpg in the module's source directory)
  • kernel::AbstractMatrix: Filter kernel to apply (default: Laplacian edge detection)

Returns

  • Gray: Filtered grayscale image

Notes

Color images are automatically converted to grayscale before filtering.

Example

# Use default image and Laplacian filter
filtered = filter_image()

# Use custom image and Sobel filter
filtered = filter_image("my_image.png", kernels[:sobel_x])

# Display with ImageView
using ImageView
imshow(filtered)
MachineLearningCourse.Lecture05.load_companies - Method
load_companies()

Load Dow 30 company data from CSV file.

Returns

  • DataFrame: Company data with Symbol and Name columns

Example

companies = load_companies()
println(companies.Symbol[1])  # First company symbol
MachineLearningCourse.Lecture05.plot_stock_with_moving_average - Method
plot_stock_with_moving_average(symbol, window_size, period)

Plot stock price with moving average overlay.

Arguments

  • symbol::String: Stock symbol (e.g., "AAPL", "MSFT")
  • window_size::Int: Moving average window size in days
  • period::String: Time period for data (e.g., "1y", "6mo", "3mo")

Returns

  • Plots.Plot: Plot object that can be displayed or saved with savefig()

Example

# Create plot
p = plot_stock_with_moving_average("AAPL", 20, "1y")

# Save plot
savefig(p, "apple_stock.png")
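The moving-average overlay itself is a sliding mean over the closing prices. A minimal sketch with illustrative data (variable names are not from the module):

```julia
using Statistics

# For each day from w onward, average the previous w closing prices
prices = [10.0, 11.0, 12.0, 11.0, 13.0, 14.0]
w = 3
ma = [mean(prices[i-w+1:i]) for i in w:length(prices)]
# ma aligns with prices[w:end], so it is plotted against dates[w:end]
```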
MachineLearningCourse.Lecture05.predict - Method
predict(network::LeNet5, X)

Make predictions using trained LeNet-5 CNN.

Arguments

  • network::LeNet5: Trained LeNet-5 network
  • X::Array{Float32, 4}: Input data (height × width × channels × samples)

Returns

  • Predictions from the network (classes × samples)
MachineLearningCourse.Lecture05.sliding_filter - Method
sliding_filter(data, kernel; operation=sum, stride=1)

Generic sliding filter for N-dimensional data.

Arguments

  • data::AbstractArray: Input array (1D, 2D, 3D, etc.)
  • kernel::AbstractArray: Filter kernel (same dimensionality as data)
  • operation::Function: Function to apply (sum, mean, maximum, etc.) (default: sum)
  • stride::Int: Step size for sliding window (default: 1)

Returns

  • AbstractArray: Filtered output array, smaller than the input along each dimension (by the kernel extent and stride)

Example

# 1D sliding average of adjacent pairs
data = [1, 2, 3, 4, 5]
kernel = [0.5, 0.5]
result = sliding_filter(data, kernel; operation=sum)
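For intuition, here is a minimal 1D sketch of what this computes: each window is multiplied elementwise by the kernel, then reduced with operation. The real method is N-dimensional; this illustrative version handles vectors only and may differ from the module's implementation:

```julia
# 1D-only sketch of a sliding filter: elementwise multiply, then reduce
function sliding_filter_1d(data, kernel; operation=sum, stride=1)
    k = length(kernel)
    [operation(data[i:i+k-1] .* kernel) for i in 1:stride:(length(data) - k + 1)]
end

sliding_filter_1d([1, 2, 3, 4, 5], [0.5, 0.5])  # pairwise means: [1.5, 2.5, 3.5, 4.5]
```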
MachineLearningCourse.Lecture05.stock_data_moving_average - Method
stock_data_moving_average()

Interactive stock data visualization with arrow-key menus.

Provides an interactive interface to:

  1. Select a company from Dow 30 companies
  2. Choose time period (1mo, 3mo, 6mo, 1y, 2y)
  3. Set moving average window (5, 10, 20, 30, 50 days)

Returns

  • Plots.Plot: Stock price plot with moving average overlay

Usage

Navigate menus with arrow keys and Enter to select.

Example

# Interactive demo
stock_data_moving_average()
MachineLearningCourse.Lecture05.train! - Method
train!(network::LeNet5, X_train, Y_train, η, epochs; batch_size=128, verbose=true)

Train a LeNet-5 CNN using mini-batch SGD.

Arguments

  • network::LeNet5: LeNet-5 network to train
  • X_train::Array{Float32, 4}: Training input data (height × width × channels × samples)
  • Y_train::Matrix{Float32}: Training target labels (one-hot encoded, classes × samples)
  • η::Float32: Adam optimizer learning rate
  • epochs::Int: Number of training epochs
  • batch_size::Int: Mini-batch size (default: 128)
  • verbose::Bool: Print training progress (default: true)

Returns

  • Vector{Float32}: Training losses per epoch
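A mini-batch training loop of this shape can be sketched as follows with Flux, assuming the network's Flux model is trained with logit cross-entropy via the Adam optimizer; the loss function and state handling inside train! may differ:

```julia
using Flux

# Illustrative mini-batch loop: shuffle into batches, take a gradient step
# per batch, and record the mean loss per epoch
function train_sketch!(model, X, Y, η, epochs; batch_size=128)
    opt_state = Flux.setup(Adam(η), model)
    loader = Flux.DataLoader((X, Y); batchsize=batch_size, shuffle=true)
    losses = Float32[]
    for epoch in 1:epochs
        epoch_loss = 0.0f0
        for (x, y) in loader
            loss, grads = Flux.withgradient(m -> Flux.logitcrossentropy(m(x), y), model)
            Flux.update!(opt_state, model, grads[1])
            epoch_loss += loss
        end
        push!(losses, epoch_loss / length(loader))
    end
    return losses
end
```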