Electronic Theses and Dissertations

Date

2023

Document Type

Thesis

Degree Name

Master of Science

Department

Electrical & Computer Engineering

Committee Chair

Madhusudhanan Balasubramanian

Committee Member

Deepak Venugopal

Committee Member

Eddie Jacobs

Abstract

Optical flow estimation is a computer vision problem that aims to estimate the apparent 2D motion (flow velocities) of image intensities between two or more consecutive frames in an image sequence. Optical flow information is useful for quantifying dense motion fields in numerous applications such as autonomous driving, object tracking in traffic control systems, video frame interpolation, video compression, and structural biomarker development for medical diagnosis. Recent state-of-the-art learning methods for optical flow estimation are two-frame methods in which optical flow is estimated sequentially for each image pair in an image sequence. In this work, we introduce a learning-based spatio-temporal transformer for multi-frame optical flow estimation (SSTM). SSTM is a multi-frame optical flow estimation algorithm that can learn and estimate non-linear motion dynamics in a scene from multiple sequential images of the scene. Compared to two-frame methods, SSTM can provide improved optical flow estimates in regions with object occlusions and near boundaries where objects may enter or leave the scene (out-of-boundary regions). Our method utilizes 3D Convolutional Gated Recurrent Networks (3D-ConvGRUs) and space-time attention modules to learn the recurrent space-time dynamics of input scenes and provide generalized optical flow estimates. When trained using the same training datasets, our method outperforms both existing multi-frame optical flow estimation algorithms and recent state-of-the-art two-frame methods on the Sintel benchmark (based on a computer-animated movie) and the KITTI 2015 driving benchmark.
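The 3D-ConvGRU mentioned in the abstract applies standard GRU gating (update gate, reset gate, candidate state) to 3D feature volumes as frames arrive. The sketch below illustrates only that recurrence, not the thesis's actual network: the convolutions are simplified to 1x1x1 per-voxel scalar weights, and all weight values are made-up placeholders, so this is a minimal illustration under those assumptions rather than the SSTM implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def convgru3d_step(h, x, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU-style update on a 3D feature volume.

    A real 3D-ConvGRU replaces the scalar multiplications below with
    learned 3D convolutions; here they are 1x1x1 for brevity.
    """
    z = sigmoid(Wz * x + Uz * h)               # update gate: how much new state
    r = sigmoid(Wr * x + Ur * h)               # reset gate: how much old state to use
    h_tilde = np.tanh(Wh * x + Uh * (r * h))   # candidate hidden state
    return (1.0 - z) * h + z * h_tilde         # gated blend of old and candidate

# Roll the cell over a short frame sequence of (channels, H, W) volumes.
rng = np.random.default_rng(0)
frames = rng.standard_normal((4, 2, 8, 8))     # 4 time steps, hypothetical features
h = np.zeros((2, 8, 8))                        # hidden state starts at zero
for x in frames:
    h = convgru3d_step(h, x, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5)
print(h.shape)
```

Because the hidden state starts at zero and each step blends it with a tanh-bounded candidate, the state stays in (-1, 1) regardless of sequence length, which is the usual motivation for gated recurrence over plain accumulation.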

Comments

Data is provided by the student

Library Comment

Dissertation or thesis originally submitted to ProQuest.

Notes

Open Access
