Defense Notices


All students and faculty are welcome to attend the final defense of EECS graduate students completing their M.S. or Ph.D. degrees. Defense notices for M.S./Ph.D. presentations for this year and several previous years are listed below in reverse chronological order.

Students who are nearing the completion of their M.S./Ph.D. research should schedule their final defenses through the EECS graduate office at least THREE WEEKS PRIOR to their presentation date so that there is time to complete the degree requirements check and post the presentation announcement online.

Upcoming Defense Notices

Manu Chaudhary

Utilizing Quantum Computing for Solving Multidimensional Partial Differential Equations

When & Where:


Eaton Hall, Room 2001B

Committee Members:

Esam El-Araby, Chair
Perry Alexander
Tamzidul Hoque
Prasad Kulkarni
Tyrone Duncan

Abstract

Quantum computing has the potential to revolutionize computational problem-solving by leveraging the quantum mechanical phenomena of superposition and entanglement, which allow large amounts of information to be processed simultaneously. This capability is significant for the numerical solution of complex and/or multidimensional partial differential equations (PDEs), which are fundamental to modeling various physical phenomena. Many quantum techniques are currently available for solving PDEs, mainly based on variational quantum circuits. However, existing quantum PDE solvers, particularly those based on variational quantum eigensolver (VQE) techniques, suffer from several limitations: low accuracy, high execution times, and poor scalability on quantum simulators as well as on noisy intermediate-scale quantum (NISQ) devices, especially for multidimensional PDEs.

In this work, we propose an efficient and scalable algorithm for solving multidimensional PDEs. We present two variants of our algorithm: the first leverages the finite-difference method (FDM), classical-to-quantum (C2Q) encoding, and numerical instantiation, while the second employs FDM, C2Q, and column-by-column decomposition (CCD). Both variants are designed to enhance accuracy and scalability while reducing execution times. We have validated and evaluated the proposed concepts on a number of case studies, including the multidimensional Poisson equation, the multidimensional heat equation, the Black-Scholes equation, and the Navier-Stokes equation for computational fluid dynamics (CFD), achieving promising results. Our results demonstrate higher accuracy, higher scalability, and faster execution times compared to VQE-based solvers on noise-free and noisy quantum simulators from IBM. Additionally, we validated our approach on hardware emulators and actual quantum hardware, employing noise mitigation techniques. This work establishes a practical and effective approach for solving PDEs with quantum computing for engineering and scientific applications.
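
The abstract does not give implementation details, but the classical finite-difference step both variants build on is standard. Below is a minimal, illustrative Python sketch (not the authors' code) of discretizing the 1D Poisson equation into a linear system; in the proposed approach, the resulting problem would then be mapped to a quantum circuit via C2Q encoding.

```python
import numpy as np

def poisson_fdm_1d(f, n):
    """Solve -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0,
    using n interior grid points."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    # Second-order central differences give a tridiagonal system A u = h^2 f.
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    return x, np.linalg.solve(A, h**2 * f(x))

# Toy check: -u'' = pi^2 sin(pi x) has exact solution u = sin(pi x).
x, u = poisson_fdm_1d(lambda x: np.pi**2 * np.sin(np.pi * x), 127)
print(np.max(np.abs(u - np.sin(np.pi * x))))  # small discretization error
```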


Past Defense Notices


AQSA PATEL

Interpretation of SIRAL Waveforms using Ultra-wideband Radar Altimeter Data

When & Where:


317 Nichols Hall

Committee Members:

Carl Leuschen, Chair
Swapan Chakrabarti
Prasad Gogineni
John Paden
David Braaten

Abstract

The surface elevation of ice sheets and sea ice is currently measured using both satellite and airborne radar altimeters. These measurements are used to generate mass balance estimates for ice sheets and thickness estimates for sea ice. However, because the altimeter signal penetrates into the snow, there is ambiguity between the surface tracking point and the actual surface location, which produces errors in the surface elevation measurement. Until now, no comprehensive study has addressed how the penetration of the Ku-band signal affects the shape of the return signal over various snow zones and sea ice. It is therefore important to study the effects of sub-surface scattering and seasonal variations in snowpack properties on the return waveform in order to correctly interpret satellite radar altimeter data. To address this problem, an ultra-wide-bandwidth Ku-band radar altimeter was developed at the Center for Remote Sensing of Ice Sheets (CReSIS). The CReSIS Ku-band Altimeter (CKA) operates over the frequency range of 12 to 18 GHz, providing very fine resolution to resolve the sub-surface features of the snow. The CKA is designed to encompass the frequency band of SIRAL, the satellite radar altimeter on board CryoSat-2, which operates from 13.4 to 13.75 GHz. The data from the CKA can be used to simulate SIRAL data, and the simulated SIRAL waveforms can help us understand the effects of signal penetration and sub-surface scattering on the low-bandwidth satellite altimeter. The extensive CKA data collected as part of the Operation IceBridge (OIB) campaign can be used to interpret SIRAL data over surfaces with varying snow conditions. The goal of this research is to use modeling and data inter-comparisons between the CKA and satellite measurements to investigate the effects of signal penetration into snow and of geophysical snow conditions on the retrieval of surface elevation from satellite radar altimeters such as SIRAL. Based on the results of this investigation, the plan is to improve the tracking algorithms used by SIRAL so that they track the actual surface location more accurately.
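
As a hedged sketch of the simulation idea (the dissertation's actual processing may differ, and the function and argument names here are hypothetical), emulating SIRAL's narrower band from wideband CKA data amounts to retaining only the 13.4-13.75 GHz portion of each range line's spectrum before forming the waveform:

```python
import numpy as np

def simulate_siral_band(range_line_spectrum, freqs_ghz,
                        f_lo=13.4, f_hi=13.75):
    """Zero the wideband (12-18 GHz) spectrum outside the SIRAL band and
    return the resulting coarser-resolution range waveform."""
    keep = (freqs_ghz >= f_lo) & (freqs_ghz <= f_hi)
    narrow = np.where(keep, range_line_spectrum, 0.0)
    return np.fft.ifft(narrow)
```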


MEEYOUNG PARK

HealthTrust: Assessing the Trustworthiness of Healthcare Information on the Internet

When & Where:


250 Nichols Hall

Committee Members:

Bo Luo, Chair
Xue-Wen Chen
Arvin Agah
Luke Huan
Michael Wang

Abstract

As is well recognized, healthcare information is growing exponentially and is increasingly available to the public. Frequent users such as medical professionals and patients depend heavily on web sources to obtain appropriate information promptly. However, the trustworthiness of web information is hard to assess, given how quickly content accumulates on the Internet. Most search engines provide pages relevant to given keywords, but the results may contain unreliable or biased information.

In this dissertation, I propose a new system named HealthTrust, which automatically assesses the trustworthiness of healthcare information on the Internet. In the first phase, a new ranking algorithm for structure-based analysis is adopted. The basic hypothesis is that trustworthy pages are more likely to link to other trustworthy pages; in this way, the trust of an original set of positive and negative seeds propagates over the Web graph. In the second phase, the content consistency between general healthcare-related webpages and trusted sites is evaluated using information retrieval techniques, and sentence modeling is employed to generate a content-based ranking for each page. Finally, an iterative algorithm is developed to integrate the two components.
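
A minimal sketch of the structure-based phase as described, in the spirit of seed-propagation algorithms such as TrustRank (the damping factor, iteration count, and normalization below are illustrative assumptions, not HealthTrust's actual parameters):

```python
import numpy as np

def propagate_trust(adj, seed_scores, damping=0.85, iters=50):
    """adj[i, j] = 1 if page i links to page j; seed scores in [-1, 1]
    (positive seeds, negative seeds, zero for unlabeled pages)."""
    adj = np.asarray(adj, dtype=float)
    s = np.asarray(seed_scores, dtype=float)
    out_deg = adj.sum(axis=1, keepdims=True)
    # Row-normalize link weights; dangling pages distribute nothing.
    W = np.divide(adj, out_deg, out=np.zeros_like(adj), where=out_deg > 0)
    t = s.copy()
    for _ in range(iters):
        # Each page inherits trust from the pages that link to it,
        # blended with its own fixed seed label.
        t = (1 - damping) * s + damping * (W.T @ t)
    return t
```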


STEVE PENNINGTON

Spectrum Coverage Estimation Using Large Scale Measurements

When & Where:


246 Nichols Hall

Committee Members:

Joe Evans, Chair
Arvin Agah
Victor Frost
Gary Minden
Ronald Aust

Abstract

Existing RF path loss models are useful for prediction but do not necessarily exploit additional knowledge, such as differing terrain features. This research will examine the relationship between terrain type (determined from public GIS data sets) and empirical path loss model parameters through the use of a large-scale data collection platform. A large-scale measurement campaign will be undertaken to sample the UHF DTV bands in a variety of environments using a set of portable software-defined radio sensors. Machine learning and geostatistical algorithms will then be used to learn path loss parameters for generalized terrain types.
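
As a concrete but hypothetical illustration of the kind of fit involved (the dissertation's actual models and features may differ), the sketch below estimates the parameters of the standard log-distance path loss model, PL(d) = PL0 + 10 n log10(d/d0), separately for each terrain class:

```python
import numpy as np

def fit_path_loss(d, pl, d0=1.0):
    """Least-squares fit of intercept PL0 (dB) and exponent n from
    distances d (m) and measured path losses pl (dB)."""
    X = np.column_stack([np.ones_like(d), 10 * np.log10(d / d0)])
    (pl0, n), *_ = np.linalg.lstsq(X, pl, rcond=None)
    return pl0, n

def fit_per_terrain(d, pl, terrain):
    """Fit one (PL0, n) pair per terrain label (e.g., from GIS data)."""
    return {t: fit_path_loss(d[terrain == t], pl[terrain == t])
            for t in np.unique(terrain)}

# Toy usage with synthetic measurements from two terrain classes.
rng = np.random.default_rng(0)
d = rng.uniform(10, 1000, 200)
terrain = rng.choice(["urban", "rural"], 200)
n_true = np.where(terrain == "urban", 3.5, 2.1)
pl = 40 + 10 * n_true * np.log10(d) + rng.normal(0, 2, 200)
print(fit_per_terrain(d, pl, terrain))  # exponents near 3.5 and 2.1
```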


MARK CALNON

Assistive Robotics for the Elderly: Encouraging Trust and Acceptance Using Affective Body Language

When & Where:


2001B Eaton Hall

Committee Members:

Arvin Agah, Chair
Frank Brown
Jerzy Grzymala-Busse
Bo Luo
Richard Branham

Abstract

Assistive robotics for the elderly has become a significant area of research, driven primarily by a rapidly aging global population. Between 2011 and 2050, the number of people aged 60 and over is expected to climb from 893 million to 2.4 billion. In addition to a rapidly aging population, a growing shortage of caretakers has placed additional urgency on the search for alternative solutions.

Despite the potential benefits of assistive robotics, one significant hurdle remains: designing robots that the elderly are willing to use. Not only must assistive robots be effective at monitoring and caring for the elderly, but they must also be acceptable to a wide range of elderly individuals. While a variety of factors can influence the acceptability of a robot, past research has focused primarily on the robot's physical embodiment.

Social robotics, however, uses human-robot interactions to study the many social factors that can influence the acceptability of a robot, including affective behaviors, i.e., behaviors that simulate personality and emotion. While the majority of research on affective behaviors has focused on facial expressions, such methods require sophisticated anthropomorphic representations, which the elderly do not generally prefer.

In addition to facial expressions, however, body language can also be an effective communicator of emotions and personality. This research will demonstrate the effectiveness of using non-verbal behaviors to simulate a variety of personalities and emotions on the Aldebaran Nao, as well as the impact these behaviors have on the elderly's trust and acceptance of the assistive robot. By adapting an assistive robot's personality and emotions both to the task it is performing and to individual elderly users, this research will ultimately enable assistive robots to perform as more effective caretakers for the elderly.


DAVID HARVIE

Targeted Scrum: Software Development Inspired by Mission Command

When & Where:


2001B Eaton Hall

Committee Members:

Arvin Agah, Chair
Bo Luo
Jim Miller
Hossein Saiedian
Prajna Dhar

Abstract

Software development has been and continues to be a difficult enterprise. As early as the 1960s, computer experts recognized the need to develop working, reliable software within budget and on time. One of the major obstacles to successful software development is the inevitability of changing requirements. Agile software development methods, such as Extreme Programming and Scrum, emerged in the 1990s as responses to constant change. However, agile methods have their own set of weaknesses. Two specific weaknesses of Scrum are a lack of initial planning and a lack of an overall architecture.
Military operations are another field that must deal with constantly changing requirements in complex environments. In response to this inescapable change, the modern military has primarily employed mission command to direct operations. Mission command is the philosophy whereby a commander gives subordinates his or her intent and desired end state for an operation, and the subordinates then have appropriate flexibility in how they operate to achieve that intent and end state. This research effort seeks to use inspiration from mission command to improve certain aspects of agile software development, namely Scrum, in terms of both the process and the product.
Specifically, this research addresses the lack of initial planning in Scrum by adding a Product Design Meeting at the onset of the process. It also addresses the lack of an overall architecture using two artifacts derived from mission command: the product's end state and the lines of effort (LOEs) necessary to achieve that end state. The addition of the Product Design Meeting, product end state, and LOEs adds more formalism to an agile method; however, we hypothesize that the benefits of these techniques will offset any perceived loss of agility.


SAHANA RAGHUNANDAN

Analysis of Angle of Arrival Estimation Algorithms for Basal Ice Sheet Tomography

When & Where:


317 Nichols Hall

Committee Members:

John Paden, Chair
Shannon Blunt
Carl Leuschen


Abstract

One of the key requirements for deriving more realistic ice sheet models is a good set of basal measurements that enable accurate estimation of bed roughness and conditions. For this purpose, 3D tomography of the ice bed has been successfully implemented with the help of angle-of-arrival (AoA) estimation algorithms such as multiple signal classification (MUSIC) and maximum likelihood estimation (MLE) techniques. These methods have enabled fine resolution in the cross-track dimension using synthetic aperture radar (SAR) images obtained from single-pass multichannel data. This project analyzes and compares the results obtained from the spectral MUSIC algorithm, an alternating projection (AP) based MLE technique, and a relatively recent approach called the reiterative superresolution (RISR) algorithm. While the MUSIC algorithm is computationally more attractive than MLE, the latter is known to perform better in low signal-to-noise ratio regimes. The RISR algorithm takes a completely different approach, using a recursive implementation of the minimum mean square error (MMSE) estimation technique instead of the sample covariance matrix (SCM) that is central to subspace-based algorithms; this makes it more robust in scenarios with very low sample support. The SAR-focused datasets provide a good case study for exploring the performance of the three techniques as applied to ice sheet bed elevation estimation.
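
Of the three techniques, spectral MUSIC is the most compact to state. The sketch below is a generic, illustrative implementation for a uniform linear array with half-wavelength element spacing; the array geometry, source count, and scan grid are assumptions for illustration, not the project's radar configuration.

```python
import numpy as np

def music_spectrum(snapshots, n_sources, scan_deg):
    """snapshots: (n_elements, n_snapshots) complex array data;
    returns the MUSIC pseudospectrum over the scan angles (degrees)."""
    n_el, n_snap = snapshots.shape
    R = snapshots @ snapshots.conj().T / n_snap      # sample covariance (SCM)
    _, vecs = np.linalg.eigh(R)                      # eigenvalues ascending
    En = vecs[:, : n_el - n_sources]                 # noise subspace
    m = np.arange(n_el)
    spec = np.empty(len(scan_deg))
    for k, th in enumerate(np.deg2rad(scan_deg)):
        a = np.exp(1j * np.pi * m * np.sin(th))      # ULA steering vector
        spec[k] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2
    return spec                                      # peaks at arrival angles
```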


EHSAN HOSSEINI

Synchronization Techniques for Burst-Mode Continuous Phase Modulation

When & Where:


250 Nichols Hall

Committee Members:

Erik Perrins, Chair
Shannon Blunt
Lingjia Liu
Dave Petr
Tyrone Duncan

Abstract

Synchronization is a critical operation in digital communication systems: it establishes and maintains an operational link between the transmitter and the receiver. As digital modulation and coding schemes continue to advance, the synchronization task becomes more and more challenging, since new standards require high-throughput operation at low signal-to-noise ratios (SNRs). In this work, we address feedforward synchronization of continuous phase modulation (CPM) using data-aided (DA) methods, which are best suited for burst-mode communications. In our transmission model, a known training sequence is appended to the beginning of each burst, which is then affected by additive white Gaussian noise (AWGN) and unknown frequency, phase, and timing offsets.

Based on our transmission model, we derive the optimum training sequence for DA synchronization of CPM signals using the Cramer-Rao bound (CRB), which is a lower bound on the estimation error variance. It is shown that the proposed sequence minimizes the CRB for all three synchronization parameters and can be applied to the entire CPM family. We take advantage of the structure of the optimized training sequence to derive a maximum likelihood joint timing and carrier recovery algorithm. Moreover, a frame synchronization algorithm is proposed; hence, a complete synchronization scheme is presented in this work.
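
For reference, the CRB takes the following generic form (our notation, not the dissertation's): for an unbiased estimate of the parameter vector θ = [ν, φ, τ] (frequency, phase, and timing offsets) from the received burst r,

```latex
\operatorname{var}\!\left(\hat{\theta}_i\right) \;\ge\; \left[\mathbf{I}^{-1}(\boldsymbol{\theta})\right]_{ii},
\qquad
\left[\mathbf{I}(\boldsymbol{\theta})\right]_{ij}
  = \mathbb{E}\!\left[\frac{\partial \ln p(\mathbf{r};\boldsymbol{\theta})}{\partial \theta_i}\,
                      \frac{\partial \ln p(\mathbf{r};\boldsymbol{\theta})}{\partial \theta_j}\right]
```

Choosing the training sequence so that the diagonal entries of the inverse Fisher information matrix are minimized simultaneously for ν, φ, and τ is the design criterion the abstract describes.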

The proposed training sequence and synchronization algorithm are extended to shaped-offset quadrature phase-shift keying (SOQPSK) modulation, which is being considered for next-generation aeronautical telemetry systems. Here, it is shown that the optimized training sequence outperforms the one defined in the draft telemetry standard as far as estimation error variances are concerned. The overall bit error rate results suggest that a shorter optimized training sequence can be utilized such that the SNR loss is less than 0.5 dB relative to an ideal synchronization scenario.


MARTIN KUEHNHAUSEN

A Framework for Knowledge Derivation Incorporating Trust and Quality of Data

When & Where:


246 Nichols Hall

Committee Members:

Victor Frost, Chair
Luke Huan
Bo Luo
Gary Minden
Tyrone Duncan

Abstract

Today, across all major industries, gaining insight from data is seen as an essential part of business. However, while data gathering is becoming inexpensive and relatively easy, analyzing the data and ultimately deriving knowledge from it is increasingly difficult. In many cases the problem is one of too much data, such that important insights are hard to find. Often the issue is not a lack of data but whether the knowledge derived from it is trustworthy, which means distinguishing "good" from "bad" insights based on factors such as context and reputation. Still, modeling trust and quality of data is complex because of the varied conditions and relationships in heterogeneous environments.

The new TrustKnowOne framework and architecture developed in this dissertation addresses these issues by fully incorporating trust and quality of data, with all their aspects, into the knowledge derivation process. It is based on Berlin, an abstract graph model we developed that can be used to model various approaches to trustworthiness and relationship assessment, as well as decision-making processes. In particular, processing, assessment, and evaluation approaches are implemented as graph expressions that are evaluated on graph components modeling the data.

We have implemented and applied our framework to three complex scenarios using real data from public data repositories. As part of their evaluation, we highlight how our approach exhibits both the formalization and the flexibility necessary to model each of these realistic scenarios. The implementation and evaluation of these scenarios confirm the advantages of the TrustKnowOne framework over current approaches.
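
The abstract describes Berlin only abstractly, so the following is a loose, hypothetical sketch of the general idea of evaluating an assessment expression over graph components modeling the data; all names and structures here are invented for illustration and are not the framework's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A piece of data in the graph, with a local quality score."""
    name: str
    quality: float                                # e.g., in [0, 1]
    sources: list = field(default_factory=list)   # upstream Node objects

def trustworthiness(node: Node) -> float:
    """One example expression: a node is only as trustworthy as its own
    quality, discounted by its least trustworthy upstream source."""
    if not node.sources:
        return node.quality
    return node.quality * min(trustworthiness(s) for s in node.sources)

raw = Node("sensor-feed", 0.9)
cleaned = Node("cleaned-data", 0.95, [raw])
report = Node("derived-report", 1.0, [cleaned])
print(trustworthiness(report))  # 1.0 * 0.95 * 0.9 = 0.855
```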


YUANLIANG MENG

Building an Intelligent Knowledgebase of Brachiopod Paleontology

When & Where:


246 Nichols Hall

Committee Members:

Luke Huan, Chair
Brian Potetz
Bo Luo


Abstract

Science advances not only because of new discoveries but also because of revolutionary ideas drawn from accumulated data. The quality of studies in paleontology, in particular, depends on the accessibility of fossil data. This research builds an intelligent system based on brachiopod fossil images and their descriptions published in the Treatise on Invertebrate Paleontology. The project is still under development, and some significant achievements are discussed here.
This thesis has two major parts. The first part describes the digitization, organization, and integration of information extracted from the Treatise. The Treatise is in PDF format, and converting its large volumes into a structured, easily accessible digital library is non-trivial. Three important topics are discussed: (1) how to extract data entries from the text and save them in a structured manner; (2) how to crop individual specimen images from figures automatically and associate each image with its text entries; and (3) how to build a search engine that supports both keyword search and natural language search. The search engine already has a web interface, and many useful tasks can be performed with ease.
Verbal descriptions are second-hand information about fossil images and thus have limitations. The second part of the thesis develops an algorithm to compare fossil images directly, without referring to textual information. Once similarities between fossil images are calculated, the results can be used for image search, fossil classification, and so on. The algorithm is based on deformable templates and utilizes expectation propagation to find the optimal deformation. Specifically, I superimpose a "warp" on each image. Each node of the warp encapsulates a vector of local texture features, and comparing two images involves two steps: (1) deform the warp to the optimal configuration, so that the energy function is minimized; and (2) based on the optimal configuration, compute the distance between the two images. Experimental results confirm that the method is reasonable and robust.
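
As a rough illustration of the kind of energy being minimized (the actual work uses expectation propagation to find the optimal deformation, not the brute-force evaluation below, and the feature and penalty choices here are assumptions):

```python
import numpy as np

def match_energy(feats_a, feats_b, disp, alpha=1.0):
    """feats_*: (H, W, D) local texture features at each warp node;
    disp: (H, W, 2) integer node displacements. Lower energy = better match."""
    H, W, _ = feats_a.shape
    data_term = smooth_term = 0.0
    for i in range(H):
        for j in range(W):
            di, dj = disp[i, j]
            ii = int(np.clip(i + di, 0, H - 1))
            jj = int(np.clip(j + dj, 0, W - 1))
            # Feature mismatch between corresponding warp nodes.
            data_term += float(np.sum((feats_a[i, j] - feats_b[ii, jj]) ** 2))
            # Penalize deformation away from the identity mapping.
            smooth_term += float(di**2 + dj**2)
    return data_term + alpha * smooth_term
```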


WILLIAM DINKEL

Instrumentation and Evaluation of Distributed Computations

When & Where:


246 Nichols Hall

Committee Members:

Victor Frost, Chair
Arvin Agah
Prasad Kulkarni


Abstract

Distributed computations are a very important aspect of modern computing, especially given the rise of distributed systems used for applications such as web search, massively multiplayer online games, financial trading, and cloud computing. When running these computations across several physical machines, it becomes much more difficult to determine exactly what is occurring on each system at a specific point in time, because each server has an independent clock, making event timestamps inherently inaccurate across machine boundaries. Another difficulty in evaluating distributed experiments is the coordination required to launch daemons, executables, and logging across all machines, followed by the necessary gathering of all related output data. The goal of this research is to overcome these obstacles and construct a single, global timeline of events from all servers.
We employ high-resolution clock synchronization to bring all servers' clocks to within microseconds of each other, as measured by a modified version of the Network Time Protocol implementation. Kernel and user-level events with wall-clock timestamps are then logged during basic network socket experiments. These data are collected from each server, merged into a single dataset, sorted by timestamp, and plotted on a timeline. The entire experiment, from setup to teardown to data collection, is coordinated from a single server. The timeline visualizations provide a narrative not only of how packets flow between servers, but also of how kernel interrupt handlers and other events shape an experiment's execution.
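
Once the clocks agree to within microseconds, the merge step is conceptually simple. A minimal sketch (the field names and log format below are hypothetical, not the instrumentation's actual output):

```python
import heapq

def global_timeline(*server_logs):
    """Each log: a time-sorted list of (timestamp_us, server, event) tuples.
    Returns one globally ordered stream across all servers."""
    return list(heapq.merge(*server_logs, key=lambda e: e[0]))

# Toy usage: two per-server logs merged into one timeline.
log_a = [(10, "a", "syscall:send"), (42, "a", "irq:net_rx")]
log_b = [(12, "b", "syscall:recv"), (40, "b", "sched:wakeup")]
for t_us, host, event in global_timeline(log_a, log_b):
    print(f"{t_us:>4} us  {host}  {event}")
```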