Image Processing examples used for teaching within the Department of Computer Science at Durham University (UK) by Dr. Amir Atapour-Abarghouei.
The material is presented as part of the "Image Processing" lecture series at Durham University.
All material here has been tested with OpenCV 4.5 and Python 3.9.
- You may download each file as needed.
- You can also download the entire repository as follows:
git clone https://github.com/atapour/ip-python-opencv
cd ip-python-opencv
In this repository, you can find:
- .py files - Python code for the examples
- You can simply run each Python file by running:
python <example file name>.py
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input, the input converted to greyscale, the greyscale divided by two, and the absolute difference between consecutive frames.
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input, the bitwise NOT of the converted greyscale, the bitwise AND of the greyscale and a binary circular mask, and the XOR of two consecutive frames.
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input converted to greyscale and the logarithmic transform of the image. The parameters of the transform can be set using track bars.
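A minimal sketch of the logarithmic transform (the `sigma` parameter name is illustrative; the script exposes its parameters through track bars):

```python
import numpy as np

def log_transform(img, sigma=1.0):
    # scale constant chosen so an input of 255 maps back to 255
    c = 255.0 / np.log(1.0 + sigma * 255.0)
    out = c * np.log(1.0 + sigma * img.astype(np.float64))
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```

The log transform expands dark tones and compresses bright ones, which is why it is useful for viewing images with a large dynamic range.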
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input converted to greyscale and the exponential transform of the image. The parameters of the transform can be set using track bars.
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input and the power law transform of the image [gamma correction]. The parameters of the transform can be set using track bars.
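A minimal sketch of power law (gamma) correction via a lookup table (the script itself may apply the formula differently):

```python
import numpy as np

def gamma_correct(img, gamma):
    # power law s = 255 * (r / 255) ** gamma, applied through a lookup table
    table = np.round(255.0 * (np.arange(256) / 255.0) ** gamma).astype(np.uint8)
    return table[img]
```

A lookup table makes the per-frame cost independent of the transform itself: each pixel is just an index into the 256-entry table. Gamma below 1 brightens mid-tones; gamma above 1 darkens them.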
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input, Gaussian noise added to the input image, the mean filter applied to the image and the median filter applied to the image. The neighbourhood size of the filters can be set using the track bar.
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input, Salt and Pepper noise added to the input image, the mean filter applied to the image and the median filter applied to the image. The neighbourhood size of the filters can be set using the track bar.
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input, Gaussian smoothing applied to the input image, the Laplacian of the image and the blurred image edge sharpened using the Laplacian.
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input, the Mean filter, the Gaussian filter and the Bilateral Filter applied to the image. The neighbourhood size of the mean and the Gaussian filters as well as the standard deviation of the Gaussian and the Bilateral Filters can be set using the track bar.
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input with Salt and Pepper noise added to it, the Mean filter, the Gaussian filter and the Non-Local Mean Filter applied to the noisy image so the noise can be removed. The neighbourhood size of the mean and the Gaussian filters as well as the standard deviation of the Gaussian and the strength of the Non-Local Means Filters can be set using the track bar.
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input converted to greyscale, the histogram of the input, the output with its contrast stretched and the histogram of the contrast-stretched output.
Running this script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input converted to greyscale, the histogram of the input, the output with its histogram equalised and the histogram of the equalised output.
Running this script will perform contrast limited adaptive histogram equalisation (CLAHE). The script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input converted to greyscale, the histogram of the input, the CLAHE-equalised output and the histogram of the CLAHE-equalised output. Parameters of CLAHE equalisation can be set using track bars.
Running this script will apply the Fourier Transform to an image and display the Fourier magnitude spectrum. The script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input converted to greyscale, along with the Fourier magnitude spectrum of the image.
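The magnitude-spectrum computation can be sketched with NumPy's FFT (the script may use `cv2.dft` instead; the cosine test pattern is purely illustrative):

```python
import numpy as np

# a horizontal cosine pattern tiled down the image
x = np.arange(64)
row = 128.0 + 64.0 * np.cos(2.0 * np.pi * x / 8.0)
img = np.tile(row, (64, 1)).astype(np.float32)

f = np.fft.fftshift(np.fft.fft2(img))       # shift the zero frequency to the centre
magnitude = 20.0 * np.log(1.0 + np.abs(f))  # log scale for display
```

The log scaling matters because the DC component dwarfs every other coefficient; without it, the displayed spectrum would be a single bright dot.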
Running this script will apply the Fourier Transform to an image and perform band pass filtering. The script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input converted to greyscale, along with the mask to be applied to the Fourier magnitude spectrum of the image, the filtered Fourier spectrum and the final filtered image brought back to the spatial domain.
Running this script will apply the Fourier Transform to an image and perform both high and low pass filtering. The script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original RGB input, the input converted to greyscale, the high pass filter applied to the Fourier spectrum, the low pass filter applied to the Fourier spectrum, the final high pass filtered image brought back to the spatial domain and the final low pass filtered image brought back to the spatial domain. The radius of the filters can be set using a track bar.
Running this script will apply the Fourier Transform to an image and perform both Butterworth high and low pass filtering. The script will receive input images from a camera or a video (pass the path to the video as an argument) and display the original input converted to greyscale and the Fourier spectrum, along with the Butterworth high pass filter applied to the Fourier spectrum, the Butterworth low pass filter applied to the Fourier spectrum, the final high pass filtered image brought back to the spatial domain and the final low pass filtered image brought back to the spatial domain. The radius and order of the Butterworth filters can be set using track bars.
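The Butterworth transfer function itself is simple enough to sketch directly (radius and order here are arbitrary; in the script they come from the track bars):

```python
import numpy as np

def butterworth_lowpass(shape, radius, order):
    # H(u, v) = 1 / (1 + (D / D0)^(2n)) with D the distance from the spectrum centre
    rows, cols = shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)
    return 1.0 / (1.0 + (D / radius) ** (2 * order))

lp = butterworth_lowpass((64, 64), radius=10, order=2)
hp = 1.0 - lp   # the high pass filter is the complement of the low pass
```

Unlike an ideal (sharp cut-off) filter, the Butterworth response rolls off smoothly, which suppresses the ringing artefacts an ideal filter produces in the spatial domain; higher orders approach the ideal cut-off.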
Running this script will apply template matching to images received from a camera (or a video). The user is asked to draw a box on the image using the mouse; this box selects a template. This template (the patch to be matched) will be displayed as part of the output. Correlation-based template matching will then be performed, and a box will be drawn around the region of the image that best matches the template. This essentially means the algorithm will try to track the selected region.
Running this script will visualise the different colour channels in the RGB, HSV and CIELAB colour spaces. The script will separate the channels and display all three channels of all three colour spaces in a grid. Pressing the key 'c' will toggle colour mapping onto some of the colour channels.
Running this script will perform colour object tracking on images received from a camera (or a video). The user is asked to draw a box on the image using the mouse; this box selects a patch, which will be displayed as part of the output. Object tracking will then be performed, and a box will be drawn around the region of the image that best matches the patch. This essentially means the algorithm will track the selected region. The tracking is based on the Mean Shift algorithm.
Running this script will demonstrate the artefacts introduced by compression techniques. This is done for both JPG and PNG compression, with the quality of the compression controlled via track bars. The script will display the original input, the compressed results and the absolute difference between the original input and the compressed output, so the amount of compression noise is visible. The noise can be amplified for better viewing using the amplification parameter set through the track bar.
Running this script will read video from a camera and save it to disk. Most parameters are hard-coded and need to be changed in the code itself.
All code is provided "as is" to aid learning and understanding of topics within the "Image Processing" course.
Please raise an issue in this repository if you find any bugs. Even better, submit a pull request with a fix or an improvement.