Defense Notices


All students and faculty are welcome to attend the final defense of EECS graduate students completing their M.S. or Ph.D. degrees. Defense notices for M.S./Ph.D. presentations for this year and several previous years are listed below in reverse chronological order.

Students who are nearing the completion of their M.S./Ph.D. research should schedule their final defenses through the EECS graduate office at least THREE WEEKS PRIOR to their presentation date so that there is time to complete the degree requirements check and post the presentation announcement online.

Upcoming Defense Notices

Jennifer Quirk

Aspects of Doppler-Tolerant Radar Waveforms

When & Where:


Nichols Hall, Room 246 (Executive Conference Room)

Committee Members:

Shannon Blunt, Chair
Patrick McCormick
Charles Mohr
James Stiles
Zsolt Talata

Abstract

The Doppler tolerance of a waveform refers to its behavior when subjected to a fast-time Doppler shift imposed by scattering that involves nonnegligible radial velocity. While previous efforts have established decision-based criteria that lead to a binary judgment of Doppler tolerant or intolerant, it is also useful to establish a measure of the degree of Doppler tolerance. The purpose in doing so is to establish a consistent standard, thereby permitting assessment across different parameterizations, as well as introducing a Doppler “quasi-tolerant” trade-space that can ultimately inform automated/cognitive waveform design in increasingly complex and dynamic radio frequency (RF) environments. 

Separately, the application of slow-time coding (STC) to the Doppler-tolerant linear FM (LFM) waveform has been examined for disambiguation of multiple range ambiguities. However, using STC with non-adaptive Doppler processing often results in high Doppler “cross-ambiguity” sidelobes that can hinder range disambiguation despite the degree of separability imparted by STC. To enhance this separability, a gradient-based optimization of STC sequences is developed, and a “multi-range” (MR) modification to the reiterative super-resolution (RISR) approach that accounts for the distinct range interval structures from STC is examined. The efficacy of these approaches is demonstrated using open-air measurements. 

The proposed work to appear in the final dissertation focuses on the connection between Doppler tolerance and STC. The first proposal includes the development of a gradient-based optimization procedure to generate Doppler quasi-tolerant random FM (RFM) waveforms. Other proposals consider limitations of STC, particularly when processed with MR-RISR. The final proposal introduces an “intrapulse” modification of the STC/LFM structure to achieve enhanced suppression of range-folded scattering in certain delay/Doppler regions while retaining a degree of Doppler tolerance.
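As background for the Doppler-tolerance discussion above, a minimal numerical sketch (with hypothetical parameters, not the dissertation's actual waveforms) illustrates the classic LFM behavior: a fast-time Doppler shift leaves the matched-filter mainlobe largely intact but displaces it in delay (range-Doppler coupling of roughly fd/k seconds for chirp rate k):

```python
import numpy as np

def lfm(bw, tau, fs):
    """Baseband linear FM (LFM) up-chirp of bandwidth bw and duration tau."""
    t = np.arange(int(round(tau * fs))) / fs
    return np.exp(1j * np.pi * (bw / tau) * (t - tau / 2) ** 2)

def doppler_shift(x, fd, fs):
    """Apply a fast-time Doppler shift of fd Hz across the pulse."""
    t = np.arange(len(x)) / fs
    return x * np.exp(2j * np.pi * fd * t)

fs, bw, tau = 200e6, 50e6, 20e-6      # hypothetical sample rate, bandwidth, pulse width
s = lfm(bw, tau, fs)
mf = np.conj(s[::-1])                 # matched filter

clean = np.abs(np.convolve(s, mf))
shifted = np.abs(np.convolve(doppler_shift(s, 100e3, fs), mf))

# Doppler tolerance: the mainlobe survives the shift with only a small loss
# and a delay offset (range-Doppler coupling) of about fd/(bw/tau) seconds.
loss_db = 20 * np.log10(shifted.max() / clean.max())
offset_samples = np.argmax(shifted) - np.argmax(clean)
```

For these numbers the offset is about 8 samples (fd divided by the chirp rate, times fs), while the peak loss is a small fraction of a dB, which is the behavior that degree-of-tolerance measures seek to quantify.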


Past Defense Notices

Dates

ROHIT YADAV

Automatic Text Summarization of Email Corpus Using Importance of Sentences

When & Where:


2001B Eaton Hall

Committee Members:

Jerzy Grzymala-Busse, Chair
Prasad Kulkarni
Bo Luo


Abstract

With the advent of the Internet, the amount of data added online has been increasing at an enormous rate. Though search engines use information retrieval (IR) techniques to facilitate search requests from users, the results are not always effective, and their relevance to a search query may be low. The user may have to go through several web pages before reaching the one he/she needs. This problem of information overload can be addressed using automatic text summarization. Summarization is the process of obtaining an abridged version of a document so that the user can gain a quick understanding of it. A new technique to produce a summary of an original text is investigated in this project. 
Email threads from the World Wide Web Consortium (W3C) corpus are used in this system. Our system is based on the identification and extraction of important sentences from the input document. Apart from common IR features such as term frequency and inverse document frequency, features such as TF-IDF weighting, subject words, sentence position, and thematic words have also been implemented. The model consists of four stages. A pre-processing step first converts unstructured text (content without a predefined organization) into a structured form (data residing in fixed fields within a record or file). In the first stage, each sentence is partitioned into a list of tokens and stop words are removed. The second stage extracts the important key phrases in the text by implementing a new algorithm that ranks candidate words. The system uses the extracted keywords/key phrases to select important sentences: each sentence is ranked based on features such as the presence of keywords/key phrases, the similarity between the sentence and the title, and several others. The third stage extracts the sentences with the highest rank. The fourth stage is the filtering stage, in which sentences from email threads are ranked according to these features and summaries are generated. This system can be considered a framework for unsupervised learning in the field of text summarization. 
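The sentence-ranking pipeline described above can be illustrated with a small sketch; the stop list, feature weights, and scoring details here are simplified stand-ins for the project's actual implementation:

```python
import math
import re
from collections import Counter

# Tiny illustrative stop list; a real system would use a full stop-word set.
STOP = {"the", "a", "an", "is", "are", "of", "to", "in", "and", "that"}

def sentences(text):
    """Split text into sentences at terminal punctuation."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def tokens(sent):
    """Lowercase, tokenize, and remove stop words."""
    return [w for w in re.findall(r"[a-z']+", sent.lower()) if w not in STOP]

def summarize(text, k=2):
    """Score each sentence by a TF-IDF-style weight plus a position bonus,
    then return the top-k sentences in their original order."""
    sents = sentences(text)
    docs = [tokens(s) for s in sents]
    df = Counter(w for d in docs for w in set(d))   # document frequency
    n = len(docs)
    def score(i, d):
        tfidf = sum(d.count(w) * math.log((n + 1) / (df[w] + 1)) for w in set(d))
        position = 1.0 / (i + 1)                    # earlier sentences weighted higher
        return tfidf + position
    ranked = sorted(range(n), key=lambda i: score(i, docs[i]), reverse=True)
    return " ".join(sents[i] for i in sorted(ranked[:k]))

text = ("Summarization shortens documents. The weather is nice. "
        "Automatic summarization extracts important sentences from documents.")
summary = summarize(text, k=2)
```

On this toy input the off-topic middle sentence is dropped, which is the intended effect of the importance scoring.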


ARJUN MUTHALAGU

Flight Search Application

When & Where:


250 Nichols Hall

Committee Members:

Prasad Kulkarni, Chair
Andy Gill
Jerzy Grzymala-Busse


Abstract

The “Flight-search” application is an AngularJS application implemented with a client-side architecture. The application displays flight results from different airline companies based on the input parameters. It also provides custom filtering conditions and custom pagination, which a user can interact with to filter the results and limit how many are displayed in the browser. The application uses the QPX Express API to pull data for flight searches.
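The client-side filtering and pagination the application performs can be sketched as follows; the field names and sample data are hypothetical, and the real app implements this in AngularJS against QPX Express results:

```python
# Hypothetical flight records; the real data comes from the QPX Express API.
FLIGHTS = [
    {"airline": "AA", "price": 320, "stops": 0},
    {"airline": "UA", "price": 280, "stops": 1},
    {"airline": "AA", "price": 450, "stops": 2},
    {"airline": "DL", "price": 300, "stops": 0},
]

def filter_flights(flights, max_price=None, airline=None):
    """Apply the user's custom filtering conditions, skipping unset filters."""
    out = flights
    if max_price is not None:
        out = [f for f in out if f["price"] <= max_price]
    if airline is not None:
        out = [f for f in out if f["airline"] == airline]
    return out

def paginate(items, page, per_page):
    """Return one page of results (1-indexed), limiting what the browser shows."""
    start = (page - 1) * per_page
    return items[start:start + per_page]

cheap = filter_flights(FLIGHTS, max_price=320)
page1 = paginate(cheap, page=1, per_page=2)
```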


SATYA KUNDETI

A Comparison of Two Decision Tree Generating Algorithms, C4.5 and CART, Based on Numerical Data

When & Where:


2001B Eaton Hall

Committee Members:

Jerzy Grzymala-Busse, Chair
Luke Huan
Bo Luo


Abstract

In data mining, classification of data is a challenging task. One of the most popular techniques for classifying data is decision tree induction. In this project, two decision tree generating algorithms, CART and C4.5, using their original implementations, are compared on different numerical data sets taken from the University of California, Irvine (UCI) repository. The comparative analysis of these two implementations is carried out in terms of accuracy and decision tree complexity. Results from experiments show that there is no statistically significant difference (5% level of significance, two-tailed test) between C4.5 and CART in terms of accuracy. On the other hand, the decision trees generated by C4.5 and CART differ significantly in terms of their complexity. 
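The kind of paired two-tailed significance test used in such a comparison can be sketched as follows; the accuracy values are made up for illustration, and the critical value shown is the standard two-tailed 5% threshold of the t-distribution with 9 degrees of freedom:

```python
import math

def paired_t(xs, ys):
    """Paired t statistic for accuracies of two classifiers on the same folds."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical ten-fold accuracies for C4.5 and CART on one data set.
c45  = [0.91, 0.88, 0.90, 0.93, 0.89, 0.92, 0.90, 0.91, 0.88, 0.90]
cart = [0.90, 0.89, 0.91, 0.92, 0.88, 0.91, 0.90, 0.90, 0.89, 0.91]

t = paired_t(c45, cart)
T_CRIT = 2.262          # two-tailed 5% critical value, 9 degrees of freedom
significant = abs(t) > T_CRIT
```

For these illustrative numbers the test does not reject the null hypothesis, mirroring the abstract's finding of no significant accuracy difference.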



NAGA ANUSHA BOMMIDI

The Comparison of Performance and Complexity of Rule Sets Induced from Incomplete Data

When & Where:


317 Nichols Hall

Committee Members:

Jerzy Grzymala-Busse, Chair
Andy Gill
Prasad Kulkarni


Abstract

The main focus of this project is to identify the best interpretation of missing attribute values in terms of the performance and complexity of rule sets. This report summarizes an experimental comparison of the performance and complexity of rule sets induced from incomplete data sets with three interpretations of missing attribute values: lost values, attribute-concept values, and “do not care” conditions. Furthermore, it details experiments conducted using the MLEM2 rule induction system on 176 data sets, using three kinds of probabilistic approximations: lower, middle, and upper. Performance was evaluated using the error rate computed by ten-fold cross-validation, and the complexity of rule sets was evaluated based on the size of the rule sets and the number of conditions in the rule sets. The results showed that lost values were better in terms of performance in 10 out of 24 combinations. In addition, attribute-concept values were better in 5 out of 24 combinations, and “do not care” conditions were better in 1 combination in terms of the complexity of rule sets. Furthermore, there was not a single combination of data set and type of approximation for which both the performance and the complexity of rule sets were better for one interpretation of missing attribute values compared to the other two.


BLAKE BRYANT

Hacking SIEMs to Catch Hackers: Decreasing the Mean Time to Respond to Security Incidents with a Novel Threat Ontology in SIEM Software

When & Where:


2012 BEST

Committee Members:

Hossein Saiedian, Chair
Bo Luo
Gary Minden


Abstract

Information security is plagued by increasingly sophisticated and persistent threats to communication networks. The development of new threat tools or vulnerability exploits often outpaces advancements in network security detection systems. As a result, detection systems often compensate by over-reporting partial detections of routine network activity to security analysts for further review. Such alarms seldom contain adequate forensic data for analysts to accurately validate alerts to other stakeholders without lengthy investigations. Consequently, security analysts often ignore the vast majority of network security alarms provided by sensors, resulting in security breaches that might otherwise have been prevented. 

Security Information and Event Management (SIEM) software has recently been introduced in an effort to enable data correlation across multiple sensors, with the intent of producing fewer security alerts of little forensic value and more alerts that accurately reflect malicious actions. However, the normalization frameworks found in current SIEM systems do not accurately depict modern threat activities. As a result, recent network security research has introduced the concept of a "kill chain" model designed to represent threat activities based upon patterns of action, known indicators, and methodical intrusion phases. Many researchers hypothesized that such a model would realize the desired goals of SIEM software. 

The focus of this thesis is the implementation of a "kill chain" framework within SIEM software. A novel kill chain model was developed and implemented within a commercial SIEM system through modifications to the existing SIEM database. These modifications resulted in a new log ontology capable of normalizing security sensor data in accordance with modern threat research. New SIEM correlation rules developed using the novel log ontology were compared against existing vendor-recommended correlation rules based on the default model. The novel log ontology produced promising results indicating improved detection rates, more descriptive security alarms, and fewer false positive alarms. These improvements were assessed to provide improved visibility and more efficient investigation processes to security analysts, ultimately reducing the mean time required to detect and escalate security incidents. 
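The idea of normalizing sensor events into kill-chain phases and correlating across them can be sketched as follows; the category-to-phase mapping and the correlation rule are illustrative inventions, not the thesis's actual SIEM ontology or database modifications:

```python
# Hypothetical mapping of sensor event categories to kill-chain phases.
KILL_CHAIN = {
    "port_scan": "reconnaissance",
    "phishing_email": "delivery",
    "exploit_attempt": "exploitation",
    "malware_install": "installation",
    "c2_beacon": "command_and_control",
    "data_exfil": "actions_on_objectives",
}

def normalize(event):
    """Tag a raw sensor event with its kill-chain phase (None if unmapped)."""
    return {**event, "phase": KILL_CHAIN.get(event.get("category"))}

def escalate(events, min_phases=3):
    """Correlation-rule sketch: alert on source hosts whose events span
    several distinct kill-chain phases, suggesting a methodical intrusion."""
    by_host = {}
    for e in map(normalize, events):
        if e["phase"]:
            by_host.setdefault(e["src"], set()).add(e["phase"])
    return sorted(h for h, phases in by_host.items() if len(phases) >= min_phases)

events = [
    {"src": "10.0.0.5", "category": "port_scan"},
    {"src": "10.0.0.5", "category": "exploit_attempt"},
    {"src": "10.0.0.5", "category": "c2_beacon"},
    {"src": "10.0.0.9", "category": "port_scan"},
]
escalated = escalate(events)
```

Only the host progressing through multiple phases is escalated, which is the kind of higher-fidelity, lower-noise alerting the ontology aims for.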


SHAUN CHUA

Implementation of a Multichannel Radar Waveform Generator System Controller

When & Where:


317 Nichols Hall

Committee Members:

Carl Leuschen, Chair
Chris Allen
Fernando Rodriguez-Morales


Abstract

Waveform generation is crucial to radar system operation. There is a recent need for an 8-channel transmitter with high-bandwidth chirp signals (100 MHz – 600 MHz). As such, a waveform generator (WFG) hardware module is required for this purpose. The WFG houses 4 Direct Digital Synthesizers (DDS) and an ALTERA Cyclone V FPGA that acts as their controller. The DDS of choice is the AD9915, because its digital-to-analog converter can be clocked at a maximum rate of 2.5 GHz, allowing plenty of room to produce the high-bandwidth, high-frequency chirp signals desired, and because it supports synchronization between multiple AD9915s. 

The brains behind the DDS operations are the FPGA and the radar software developed in NI LabVIEW. These two aspects of the digital system grant the WFG highly configurable waveform capabilities. The configurable inputs that can be controlled by the user include: the number of waveforms in a playlist, start and stop frequency (bandwidth of the chirp signal), zero-pi mode, and waveform amplitude and phase control. 

The FPGA acts as a DDS controller that directly configures and controls the DDS operations, while also managing and synchronizing the operations of all DDS channels. This project largely details the development of this controller, named the Multichannel Waveform Generator (MWFG) Controller, and the necessary modifications and development in the NI LabVIEW software so that the two complement each other.
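As an illustration of the DDS configuration described above, the sketch below computes frequency tuning words for a stepped 100–600 MHz sweep, assuming the AD9915's 32-bit tuning word and 2.5 GHz clock; it is a simplified stand-in for the FPGA/LabVIEW implementation, which drives the device's sweep hardware directly:

```python
SYSCLK = 2.5e9      # AD9915 DAC clock rate (Hz), per the abstract
ACC_BITS = 32       # frequency tuning word width (assumed 32-bit accumulator)

def ftw(freq_hz):
    """Frequency tuning word for a desired output frequency."""
    return round(freq_hz / SYSCLK * 2**ACC_BITS)

def chirp_playlist(f_start, f_stop, n_steps):
    """Stepped-frequency approximation of a chirp as a list of tuning words,
    mirroring the user-configurable start/stop-frequency inputs above."""
    step = (f_stop - f_start) / (n_steps - 1)
    return [ftw(f_start + i * step) for i in range(n_steps)]

# Five-point sweep across the 100-600 MHz band of interest.
words = chirp_playlist(100e6, 600e6, 5)
```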


DEEPIKA KOTA

Automatic Color Detection of Colored Wires In Electric Cables

When & Where:


2001B Eaton Hall

Committee Members:

Jim Stiles, Chair
Ron Hui
James Rowland


Abstract

An automatic color detection system checks the sequence of colored wires in electric cables that are ready to be crimped together. The system inspects flat connectors that differ in the type and number of wires. This is managed automatically with a self-learning system, without requiring any manual input from the user to load new data into the machine. The system is coupled to a connector crimping machine; once it learns an actual sample of the cable order, it automatically inspects each cable assembled by the machine. The automatic detection is based on three methodologies: 1) a self-learning system; 2) an algorithm for wire segmentation to extract colors from the captured images; and 3) an algorithm for color recognition to cope with wires of different illuminations and insulations. The main advantage of this system is that, when cables are produced in large batches, it provides a high level of accuracy and prevents false negatives in order to guarantee defect-free production.
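The color-recognition step can be sketched as a nearest-reference-color classifier; the reference colors and pixel values below are hypothetical, and the real system learns its references automatically from a sample cable rather than from a hard-coded table:

```python
# Hypothetical reference colors (RGB) learned from a known-good sample cable.
REFERENCE = {
    "red": (200, 40, 40),
    "green": (40, 170, 60),
    "blue": (40, 60, 190),
    "yellow": (210, 200, 50),
}

def classify(rgb):
    """Assign a measured wire color to the nearest reference color
    by squared Euclidean distance in RGB space."""
    return min(REFERENCE,
               key=lambda name: sum((a - b) ** 2
                                    for a, b in zip(rgb, REFERENCE[name])))

def check_sequence(measured, expected):
    """Compare the detected color order of a cable with the learned order."""
    return [classify(rgb) for rgb in measured] == expected
```

A production system would also normalize for illumination (e.g., in HSV space), which is the role of the color-recognition algorithm in the abstract.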


MOHAMMED ZIAUDDIN

Open Source Python Widget Application to Synchronize Bibliographical References Between Two BibTeX Repositories

When & Where:


246 Nichols Hall

Committee Members:

Andy Gill, Chair
Perry Alexander
Prasad Kulkarni


Abstract

BibTeX is a tool to edit and manage bibliographical references in a document. Researchers face a common problem: they have one copy of their bibliographical reference database for a specific project and a master bibliographical database file that holds all their bibliographical references. Syncing these two files is an arduous task, as one has to search for and modify each reference record individually. Most of the available BibTeX tools either help maintain BibTeX bibliographies in different file formats or search for references in web databases, but none of them provide a way to synchronize the fields of the same reference record across two different BibTeX database files. 
The intention of this project is to create an application that helps academicians keep their bibliographical references in two different databases in sync. We have created a Python widget application that employs the Tkinter library for the GUI and an unQLite database for data storage. The application is integrated with GitHub, allowing users to modify BibTeX files hosted there. 
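The core synchronization step can be sketched as a field-by-field merge keyed on citation keys; this sketch assumes entries are already parsed into dictionaries and omits the Tkinter GUI, unQLite storage, and GitHub integration:

```python
def sync(master, project, prefer="project"):
    """Merge a project bibliography into the master one, field by field.
    Entries are {citation_key: {field: value}} dictionaries. On conflicting
    field values, keep the side named by `prefer`. Inputs are not mutated."""
    merged = {key: dict(fields) for key, fields in master.items()}
    for key, fields in project.items():
        entry = merged.setdefault(key, {})
        for field, value in fields.items():
            if field not in entry or prefer == "project":
                entry[field] = value
    return merged

# Example: the project file corrects a year and adds a new reference.
master = {"knuth1984": {"title": "Literate Programming", "year": "1983"}}
project = {"knuth1984": {"year": "1984"},
           "lamport1994": {"title": "LaTeX: A Document Preparation System"}}
merged = sync(master, project)
```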


HARISH ROHINI

Using Intel Pintools to Analyze Memory Access Patterns

When & Where:


246 Nichols Hall

Committee Members:

Prasad Kulkarni, Chair
Andy Gill
Heechul Yun


Abstract

Analysis of large benchmark programs can be very difficult because their memory state changes on every run, and with billions of instructions, simulating a whole program can be extremely slow. The solution is to simulate only selected regions that are the most representative parts of a program, so that analysis and optimization can focus on the regions that account for most of the program's execution. To accomplish this, we use Intel's Pin, a binary instrumentation framework that performs program analysis at run time; SimPoint, to find the most representative regions of a program; and PinPlay, for reproducible analysis of the program. This project uses these frameworks to simulate and analyze programs, providing statistics about memory allocations, memory reference traces, and allocated memory usage across the most representative regions of the program, as well as cache simulations of those regions.
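A toy version of the cache simulation performed over the representative regions is sketched below; the address trace here is synthetic, whereas the real traces come from Pin instrumentation of the benchmark:

```python
def simulate_cache(trace, cache_bytes=32768, line_bytes=64):
    """Direct-mapped cache simulation over a recorded address trace,
    returning (hits, misses). Sizes are illustrative defaults."""
    n_lines = cache_bytes // line_bytes
    tags = [None] * n_lines
    hits = misses = 0
    for addr in trace:
        line = addr // line_bytes           # cache line containing the address
        idx = line % n_lines                # direct-mapped set index
        if tags[idx] == line:
            hits += 1
        else:
            tags[idx] = line                # fill (or evict and fill) the line
            misses += 1
    return hits, misses

# Synthetic trace: a 4-byte-stride walk over one 4 KiB page, done twice.
# The first pass misses once per 64-byte line; the second pass hits entirely.
trace = [0x1000 + 4 * i for i in range(1024)] * 2
hits, misses = simulate_cache(trace)
```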


GOVIND VEDALA

Iterative SSBI Compensation in Optical OFDM Systems and the Impact of SOA Nonlinearities MS Project Defense (EE)

When & Where:


246 Nichols Hall

Committee Members:

Ron Hui, Chair
Chris Allen
Erik Perrins


Abstract

Multicarrier modulation using orthogonal frequency division multiplexing (OFDM) is a strong candidate for next-generation long-haul optical transmission systems, offering a high degree of spectral efficiency and easing the compensation of linear impairments, such as chromatic dispersion and polarization mode dispersion, at the receiver. Optical OFDM comes in two flavors, coherent optical OFDM (CO-OFDM) and direct-detection optical OFDM (DD-OFDM), each with its own pros and cons. CO-OFDM is highly robust to fiber impairments and relaxes the bandwidth requirements on electronic components, but it requires narrow-linewidth lasers, optical hybrids, and local oscillators. DD-OFDM, on the other hand, has relaxed laser linewidth requirements and a low-complexity receiver, making it an attractive multicarrier system. However, DD-OFDM suffers from signal-signal beat interference (SSBI), caused by mixing among the subcarriers in the photodetector, which deteriorates system performance. Previously, a guard band between the optical carrier and the data sideband was used to mitigate SSBI. In this project, we experimentally demonstrate linearly field-modulated virtual single-sideband OFDM (VSSB-OFDM) transmission with direct detection and digitally compensate for the SSBI using an iterative SSBI compensation algorithm. 
Semiconductor optical amplifiers (SOAs), with their small footprint, ultra-high gain bandwidth, and ease of integration, are attracting the attention of optical telecommunication engineers for use in high-speed transmission systems as inline amplifiers. However, SOA gain-saturation-induced nonlinearities cause pulse distortion and induce nonlinear crosstalk effects, such as cross-gain modulation, especially in wavelength division multiplexed systems. In this project, we also evaluate the performance of iterative SSBI compensation in an optical OFDM system in the presence of these SOA-induced nonlinearities.
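The iterative SSBI compensation idea can be sketched with a simplified real-valued square-law model (the actual system uses complex VSSB-OFDM fields with direct detection): each iteration re-estimates the signal from the linear carrier-signal beat term after subtracting the reconstructed signal-signal beat term:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simplified model: photodetection of carrier C plus small signal s gives
# (C + s)^2 = C^2 + 2*C*s + s^2, where the s^2 term plays the role of SSBI.
n = 1024
s = 0.1 * rng.standard_normal(n)   # hypothetical small OFDM-like signal
C = 1.0                            # optical carrier amplitude
detected = (C + s) ** 2

def recover(detected, C, iterations=4):
    """Iterative SSBI compensation: estimate s from the linear beat term,
    reconstruct the s^2 interference from the estimate, subtract, repeat."""
    est = np.zeros_like(detected)
    for _ in range(iterations):
        est = (detected - C**2 - est**2) / (2 * C)
    return est

est = recover(detected, C)
err0 = np.abs((detected - C**2) / (2 * C) - s).max()   # no SSBI compensation
err = np.abs(est - s).max()                            # after iterations
```

For small signal-to-carrier ratios the residual error shrinks on each pass, illustrating why the iterative subtraction converges.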