Defense Notices


All students and faculty are welcome to attend the final defense of EECS graduate students completing their M.S. or Ph.D. degrees. Defense notices for M.S./Ph.D. presentations for this year and several previous years are listed below in reverse chronological order.

Students who are nearing the completion of their M.S./Ph.D. research should schedule their final defense through the EECS graduate office at least THREE WEEKS PRIOR to their presentation date so that there is time to complete the degree requirements check and post the presentation announcement online.

Upcoming Defense Notices

Md Mashfiq Rizvee

Hierarchical Probabilistic Architectures for Scalable Biometric and Electronic Authentication in Secure Surveillance Ecosystems

When & Where:


Eaton Hall, Room 2001B

Committee Members:

Sumaiya Shomaji, Chair
Tamzidul Hoque
David Johnson
Hongyang Sun
Alexandra Kondyli

Abstract

Secure and scalable authentication has become a primary requirement in modern digital ecosystems, where both human biometrics and electronic identities must be verified under noise, large population growth, and resource constraints. Existing approaches often struggle to simultaneously provide storage efficiency, dynamic updates, and strong authentication reliability. The proposed work advances a unified probabilistic framework based on Hierarchical Bloom Filter (HBF) architectures to address these limitations across the biometric and hardware domains. The first contribution establishes the Dynamic Hierarchical Bloom Filter (DHBF) as a noise-tolerant and dynamically updatable authentication structure for large-scale biometrics. Unlike static Bloom-based systems that require reconstruction upon updates, DHBF supports enrollment, querying, insertion, and deletion without structural rebuilds. Experimental evaluation on 30,000 facial biometric templates demonstrates 100% enrollment and query accuracy, including robust acceptance of noisy biometric inputs while maintaining correct rejection of non-enrolled identities. These results validate that hierarchical probabilistic encoding can preserve both scalability and authentication reliability in practical deployments. Building on this foundation, Bio-BloomChain integrates DHBF into a blockchain-based smart contract framework to provide tamper-evident, privacy-preserving biometric lifecycle management. The system stores only hashed, non-invertible commitments on-chain while maintaining probabilistic verification logic within the contract layer. Large-scale evaluation again reports 100% enrollment, insertion, query, and deletion accuracy across 30,000 templates, thereby addressing the longstanding difficulty of authenticating noisy data on a blockchain.
Moreover, the deployment analysis shows that execution on Polygon zkEVM reduces operational costs by several orders of magnitude compared to Ethereum, bringing enrollment and deletion costs below $0.001 per operation and demonstrating the feasibility of scalable blockchain biometric authentication in practice. Finally, the hierarchical probabilistic paradigm is extended to electronic hardware authentication through the Persistent Hierarchical Bloom Filter (PHBF). Applied to electronic fingerprints derived from physical unclonable functions (PUFs), PHBF demonstrates robust authentication under environmental variations such as temperature-induced noise. Experimental results show zero-error operation at the selected decision threshold, substantial system-level improvements, over 10^5 times faster query processing, and significantly reduced storage requirements compared to large-scale tracking approaches.
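The abstract does not describe DHBF's internal layout, but the primitive underlying all of these structures is the standard Bloom filter. A minimal sketch of its insert/query operations (class name, sizes, and hash scheme are illustrative, not taken from the dissertation):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: the probabilistic set-membership
    primitive that hierarchical variants such as DHBF build on."""

    def __init__(self, num_bits=1024, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = [False] * num_bits

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 digests.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def insert(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def query(self, item):
        # True may be a false positive; False is always correct.
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.insert("template-00123")
print(bf.query("template-00123"))  # True
print(bf.query("not-enrolled"))    # almost certainly False
```

Hierarchical variants stack such filters so that a query descends only into subtrees whose filters match, and supporting deletion typically requires counting filters rather than plain bit arrays; those extensions are what the DHBF contribution addresses.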


Fatima Al-Shaikhli

Optical Measurements Leveraging Coherent Fiber Optics Transceivers

When & Where:


Nichols Hall, Room 246 (Executive Conference Room)

Committee Members:

Rongqing Hui, Chair
Shannon Blunt
Shima Fardad
Alessandro Salandrino
Judy Wu

Abstract

Recent advancements in optical technology are invaluable in a variety of fields, extending far beyond high-speed communications. These innovations enable optical sensing, which plays a critical role across diverse applications, from medical diagnostics to infrastructure monitoring and automotive systems. This research focuses on leveraging commercially available coherent optical transceivers to develop novel measurement techniques that extract detailed information about optical fiber characteristics as well as target information. Through this approach, we aim to enable accurate and fast assessments of fiber performance and integrity, while exploring the potential for utilizing existing optical communication networks to enhance fiber characterization capabilities. This goal is investigated through three distinct projects: (1) fiber type characterization based on the intensity-modulated electrostriction response; (2) a coherent Light Detection and Ranging (LiDAR) system for target range and velocity detection using different waveform designs, including experimental validation of frequency-modulated continuous-wave (FMCW) implementations and theoretical analysis of orthogonal frequency division multiplexing (OFDM) based approaches; and (3) birefringence measurements using a coherent Polarization-sensitive Optical Frequency Domain Reflectometer (P-OFDR) system.

Electrostriction in an optical fiber arises from the interaction between the forward-propagating optical signal and acoustic standing waves resonating radially between the center of the core and the circumference of the cladding. The electrostriction response depends on fiber parameters, especially the mode field radius. We demonstrated a novel technique for identifying fiber types through measurement of the intensity-modulation-induced electrostriction response. Because the spectral envelope of the electrostriction-induced propagation loss is anti-symmetric, the signal-to-noise ratio can be significantly increased by subtracting the measured spectrum from its complex conjugate. We show that if the field distribution of the fiber propagation mode is Gaussian, the envelope of the electrostriction-induced loss spectrum closely follows a Maxwellian distribution whose shape can be specified by a single parameter determined by the mode field radius.
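The one-parameter Maxwellian family mentioned above is easy to make concrete. A sketch evaluating the normalized Maxwellian shape (the mapping from mode field radius to the scale parameter `a` is not given in the abstract and is left abstract here):

```python
import math

def maxwellian(x, a):
    """Maxwellian (Maxwell-Boltzmann speed) distribution with scale a;
    this single shape parameter plays the role the abstract assigns to
    the mode-field-radius-dependent parameter."""
    return math.sqrt(2 / math.pi) * (x ** 2 / a ** 3) * math.exp(-x ** 2 / (2 * a ** 2))

# The envelope peaks at x = sqrt(2) * a, so locating the spectral peak
# pins down the parameter (and hence helps identify the fiber type).
a = 1.5
xs = [i * 0.01 for i in range(1, 1000)]
peak_x = max(xs, key=lambda x: maxwellian(x, a))
print(round(peak_x, 2))  # ~ sqrt(2) * 1.5 ~ 2.12
```

Setting the derivative of x^2 exp(-x^2 / 2a^2) to zero gives the peak location analytically, which is why a single fitted parameter suffices to describe the envelope.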

We also present a self-homodyne FMCW LiDAR system based on a coherent receiver. By using the same linearly chirped waveform for both the LiDAR signal and the local oscillator, the self-homodyne coherent receiver performs frequency de-chirping directly in the photodiodes, significantly simplifying signal processing. As a result, the required receiver bandwidth is much lower than the chirping bandwidth of the signal. Simultaneous multi-target range and velocity detection is demonstrated experimentally. Furthermore, we explore the use of commercially available coherent transceivers for joint communication and sensing using OFDM waveforms.
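The bandwidth-reduction argument can be made concrete: for a linear chirp of bandwidth B and duration T, de-chirping maps a static target at range R to a beat frequency equal to the chirp slope times the round-trip delay. A sketch with hypothetical parameters (not the actual system values from this work):

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_beat_frequency(target_range_m, chirp_bandwidth_hz, chirp_duration_s):
    """Beat frequency after de-chirping: f_b = (B / T) * (2R / c),
    i.e. the chirp slope times the round-trip delay."""
    slope = chirp_bandwidth_hz / chirp_duration_s
    round_trip_delay = 2.0 * target_range_m / C
    return slope * round_trip_delay

# Hypothetical parameters: 4 GHz chirp over 100 us, target at 30 m.
fb = fmcw_beat_frequency(30.0, 4e9, 100e-6)
print(f"{fb / 1e6:.2f} MHz")  # 8.01 MHz
```

The beat frequency of a few MHz, versus a chirp bandwidth of several GHz, illustrates why the receiver bandwidth can be far smaller than the signal bandwidth; a moving target adds a Doppler shift on top of this range-proportional term.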

In addition, we demonstrate a P-OFDR system utilizing a digital coherent optical transceiver to generate a linear frequency chirp via carrier-suppressed single-sideband modulation. This method ensures linearity of the chirp and phase continuity of the optical carrier. The coherent homodyne receiver, incorporating both polarization and phase diversity, recovers the state of polarization (SOP) of the backscattered optical signal along the fiber by mixing it with an identically chirped local oscillator. With a spatial resolution of approximately 5 mm, a 26 GHz chirping bandwidth, and a 200 µs measurement time, this system enables precise birefringence measurements. By launching three mutually orthogonal SOPs of the optical signal, we measure relative birefringence vectors along the fiber.


Past Defense Notices


Brandon Ravenscroft

Spectral Cohabitation and Interference Mitigation via Physical Radar Emissions

When & Where:


Nichols Hall, Room 129, Ron Evans Apollo Auditorium

Committee Members:

Shannon Blunt, Chair
Chris Allen
Erik Perrins
James Stiles
Chris Depcik

Abstract

Auctioning of frequency bands to support growing demand for high-bandwidth 5G communications is driving research into spectral cohabitation strategies for next-generation radar systems. The loss of radio frequency (RF) spectrum once designated for radar operation is forcing radar systems either to coexist in these bands without causing mutual interference or to move to other portions of the spectrum, the latter being the less desirable choice. Two methods of spectral cohabitation are presented in this work, each taking advantage of recent developments in non-repeating, random FM (RFM) waveforms. RFM waveforms are designed via one of many different optimization procedures to have favorable radar waveform properties while readily incorporating agile spectrum notches. The first method uses these spectral notches to avoid narrowband RF interference (RFI) from other spectrum users residing in the same band as the radar system, allowing both to operate while minimizing mutual interference. The second method uses spectral notches, along with an optimization procedure, to embed a communications signal into a dual-function radar/communications (DFRC) emission, allowing one waveform to serve both functions simultaneously. Results of simulation and open-air experimentation with physically realized, spectrally notched, and DFRC emissions demonstrate the efficacy of these two methods of spectral cohabitation.
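As a rough illustration of spectral notching (not the waveform-optimization procedures used in this work), one can zero the frequency bins of a random-FM waveform that overlap another user's band. Everything below, including the notch location, is hypothetical; the dissertation's optimization additionally restores the constant-modulus and autocorrelation properties that this naive masking destroys:

```python
import cmath, math, random

def random_fm_waveform(n):
    # Constant-modulus waveform with a random phase progression.
    phase, samples = 0.0, []
    for _ in range(n):
        phase += random.uniform(-math.pi / 4, math.pi / 4)
        samples.append(cmath.exp(1j * phase))
    return samples

def dft(x, inverse=False):
    # Direct O(n^2) discrete Fourier transform, for illustration only.
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * math.pi * i * k / n)
               for k in range(n)) for i in range(n)]
    return [v / n for v in out] if inverse else out

n = 128
x = random_fm_waveform(n)
spectrum = dft(x)
notch = range(30, 40)          # bins occupied by another spectrum user
for b in notch:
    spectrum[b] = 0.0          # carve the notch
notched = dft(spectrum, inverse=True)

# Verify: re-transforming the notched waveform shows empty notch bins.
residual = max(abs(dft(notched)[b]) for b in notch)
print(residual < 1e-8)  # True
```

The notched time-domain waveform is no longer constant-modulus, which is why practical designs re-optimize the waveform rather than simply masking the spectrum.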


Divya Harshitha Challa

Crop Prediction Based on Soil Classification using Machine Learning with Classifier Ensembling

When & Where:


Zoom Meeting, please email jgrisafe@ku.edu for defense link.

Committee Members:

Prasad Kulkarni, Chair
David Johnson
Hongyang Sun


Abstract

Agriculture is the backbone of many national economies and an emerging field of research. There are many different types of soil, and each type has different characteristics for crops. Different methods and models are used daily in this domain to increase yields. Crop growth is largely governed by the soil's macronutrient and micronutrient content, together with climatic conditions such as rainfall, humidity, temperature, and the soil's pH. Without this information, farmers often cannot select the crops best suited to their environmental and soil conditions, and manual prediction of suitable crops has frequently failed. In this work, we use machine learning techniques to recommend crops based on soil classification (soil series). A comparative analysis of several popular classification algorithms, including K-Nearest Neighbors (KNN), Random Forest (RF), Decision Tree (DT), Support Vector Machines (SVM), Gaussian Naive Bayes (GNB), Gradient Boosting (GB), Extreme Gradient Boosting (XGBoost), and a Voting Ensemble classifier, is carried out to recommend the cultivable crop(s) most suitable for a particular piece of land based on the characteristics of the soil and environment. To achieve our goal, we collected and preprocessed a large dataset of crop yield and environmental data from multiple sources. Our results show that the voting ensemble classifier outperforms the other classifiers, achieving a prediction accuracy of 94.67%. Feature importance analysis reveals that weather conditions such as temperature and rainfall, together with fertilizer usage, are the most critical factors in predicting crop yield.
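The hard-voting ensemble scheme evaluated above can be sketched in a few lines. The toy "classifiers" below are hypothetical stand-ins for trained KNN/RF/SVM models, not the study's actual models or features:

```python
from collections import Counter

class VotingEnsemble:
    """Hard-voting ensemble: each fitted classifier casts one vote and
    the majority label wins (ties break toward the first-seen vote)."""

    def __init__(self, classifiers):
        self.classifiers = classifiers

    def predict(self, sample):
        votes = [clf(sample) for clf in self.classifiers]
        return Counter(votes).most_common(1)[0][0]

# Hypothetical stand-ins for trained models: each maps a
# (rainfall_mm, temperature_c) sample to a recommended crop.
knn = lambda s: "rice" if s[0] > 200 else "wheat"
rf  = lambda s: "rice" if s[0] > 180 and s[1] > 20 else "wheat"
svm = lambda s: "wheat"

ensemble = VotingEnsemble([knn, rf, svm])
print(ensemble.predict((220, 25)))  # "rice": two of three votes
print(ensemble.predict((100, 15)))  # "wheat": unanimous
```

Voting tends to outperform its individual members when their errors are only weakly correlated, which is consistent with the ensemble's top accuracy reported here.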


Oluwanisola Ibikunle

Deep Learning Algorithms for Radar Echogram Layer Tracking

When & Where:


Nichols Hall, Room 317 (Richard K. Moore Conference Room)

Committee Members:

Shannon Blunt, Chair
John Paden (Co-Chair)
Carl Leuschen
Jilu Li
James Stiles

Abstract

The accelerated melting of ice sheets in the polar regions of the world, specifically in Greenland and Antarctica, due to contemporary climate warming is contributing to global sea level rise. To understand and quantify this phenomenon, airborne radars have been deployed to create echogram images that map snow accumulation patterns in these regions. Using advanced radar systems developed by the Center for Remote Sensing and Integrated Systems (CReSIS), a significant amount (1.5 petabytes) of climate data has been collected. However, the process of extracting ice phenomenology information, such as accumulation rate, from the data is limited. This is because the radar echograms require tracking of the internal layers, a task that is still largely manual and time-consuming. Therefore, there is a need for automated tracking.

Machine learning and deep learning algorithms are well-suited to this problem given their near-human performance on optical images. Moreover, the significant overlap between classical radar signal processing and machine learning techniques suggests that fusing concepts from both fields can lead to optimized solutions. However, supervised deep learning algorithms face a circular problem: they require large amounts of labeled training data, which do not currently exist for this task.

In this work, we propose custom algorithms, including supervised, semi-supervised, and self-supervised approaches, to deal with the limited annotated data problem to achieve accurate tracking of radiostratigraphic layers in echograms. Firstly, we propose an iterative multi-class classification algorithm, called “Row Block,” which sequentially tracks internal layers from the top to the bottom of an echogram given the surface location. We aim to use the trained iterative model in an active learning paradigm to progressively increase the labeled dataset. We also investigate various deep learning semantic segmentation algorithms by casting the echogram layer tracking problem as a binary and multiclass classification problem. These require post-processing to create the desired vector-layer annotations, hence, we propose a custom connected-component algorithm as a post-processing routine. Additionally, we propose end-to-end algorithms that avoid the post-processing to directly create annotations as vectors. Furthermore, we propose semi-supervised algorithms using weakly-labeled annotations and unsupervised algorithms that can learn the latent distribution of echogram snow layers while reconstructing echogram images from a sparse embedding representation.
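The top-to-bottom tracking idea behind the "Row Block" model can be caricatured with a much simpler greedy search. The sketch below is illustrative only (function name, windowing, and the toy echogram are assumptions, not the trained classifier): starting from a known position, each successive column follows the brightest pixel within a small vertical window of the previous pick.

```python
def track_layer(echogram, start_row, window=1):
    """Greedy layer tracker: in each successive column, follow the
    brightest pixel within +/-window rows of the previous pick.
    A simplified stand-in for a learned tracker, for illustration."""
    rows = len(echogram)
    path = [start_row]
    for col in range(1, len(echogram[0])):
        prev = path[-1]
        candidates = range(max(0, prev - window), min(rows, prev + window + 1))
        path.append(max(candidates, key=lambda r: echogram[r][col]))
    return path

# Toy 5x6 echogram with one bright internal layer drifting downward.
E = [[0, 0, 0, 0, 0, 0],
     [9, 8, 0, 0, 0, 0],
     [0, 0, 9, 7, 0, 0],
     [0, 0, 0, 0, 8, 9],
     [0, 0, 0, 0, 0, 0]]
print(track_layer(E, start_row=1))  # [1, 1, 2, 2, 3, 3]
```

Greedy tracking of this kind fails where layers fade, cross, or split, which is precisely why the learned, iterative, and end-to-end formulations proposed in this work are needed.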

A concurrent objective of this work is to provide the deep learning and science community with a large fully-annotated dataset. To achieve this, we propose synchronizing radar data with outputs from a regional climate model to provide a dataset with overlapping measurements that can enhance the performance of the trained models.


Jonathan Rogers

Faster than Thought Error Detection Using Machine Learning to Detect Errors in Brain Computer Interfaces

When & Where:


Eaton Hall, Room 2001B

Committee Members:

Suzanne Shontz, Chair
Adam Rouse
Cuncong Zhong


Abstract

This thesis applies machine learning to data from invasive brain-computer interfaces (BCIs) in rhesus macaques to predict their state of movement during center-out tasks. Our research team breaks movements down into discrete states and analyzes the data using Linear Discriminant Analysis (LDA). We find that a simplified model that ignores the biological systems underpinning it can still detect the discrete state changes with a high degree of accuracy. Furthermore, when we account for those underlying systems, our model achieves high accuracy at speeds that ought to be imperceptible to the primate brain.


Abigail Davidow

Exploring the Gap Between Privacy and Utility in Automated Decision-Making

When & Where:


Eaton Hall, Room 2001B

Committee Members:

Drew Davidson, Chair
Fengjun Li
Alexandra Kondyli


Abstract

The rapid rise of automated decision-making systems has left a gap in researchers’ understanding of how developers and consumers balance concerns about the privacy and accuracy of such systems against their utility. With the goal of covering a broad spectrum of concerns from various angles, we conducted two experiments on the perceived benefits and detriments of interacting with automated decision-making systems, which we refer to as the Patch Wave study and the Automated Driving study. This work approaches the study of automated decision-making from different perspectives to help address the gap in empirical data on consumer and developer concerns. In the Patch Wave study, we focus on developers’ interactions with automated pull requests that patch widespread vulnerabilities on GitHub. The Automated Driving study explores older adults’ perceptions of data privacy in highly automated vehicles. We find quantitative and qualitative differences in the way our target populations view automated decision-making compared to human decision-making. In this work, we detail our methodology for these studies, our experimental results, and recommendations for addressing consumer and developer concerns.


Bhuneshwari Sharma Joshi

Applying ML Models for the Analysis of Bankruptcy Prediction

When & Where:


Zoom Meeting, please email jgrisafe@ku.edu for defense link.

Committee Members:

Prasad Kulkarni, Chair
Drew Davidson
David Johnson


Abstract

Bankruptcy prediction helps evaluate the financial condition of a company; it benefits not only policymakers but also investors and other stakeholders, who can take steps to avoid or reduce the after-effects of bankruptcy. It not only supports better decision-making but also provides insight into reducing losses. Major reasons for business failure include poor allocation of resources, excessive debt, and insufficient capital; reliable prediction additionally offers input to policymakers, guidance for business managers, identification of sector-wide problems, and signals to investors. These factors can make a business unsustainable, and failure rates tend to fluctuate with the state of the economy. Corporate bankruptcy prediction has high economic importance because it affects many stakeholders, and it has been extensively studied in economics, accounting, banking, and decision sciences over the past two decades. Many traditional approaches were based on hypothesis testing and statistical analysis. Our research therefore develops an approach that estimates the probability of corporate bankruptcy by evaluating different machine learning models, such as Random Forest, XGBoost, and logistic regression, and choosing the one with the highest accuracy. Because the dataset was not well prepared and contained missing values, various data mining and preprocessing techniques were used for data preparation. The models predict whether an organization will go bankrupt within the next 30, 90, or 180 days, using financial ratios as input features. The XGBoost-based model performs exceptionally well, with 98-99% accuracy.


Laurynas Lialys

Engineering Laser Beams for Particle Trapping, Lattice Formation and Microscopy

When & Where:


Nichols Hall, Room 246 (Executive Conference Room)

Committee Members:

Shima Fardad, Chair
Morteza Hashemi
Rongqing Hui
Alessandro Salandrino
Xinmai Yang

Abstract

Having control over the position of nano- and micro-sized objects inside a suspension is crucial in many applications, such as sorting and delivery of particles, studying cells and microorganisms, spectroscopic imaging techniques, and building microscopic lattices and artificial structures. This control can be achieved by judiciously engineering optical forces and light-matter interactions inside colloidal suspensions that result in optical trapping. However, current techniques require high-NA (numerical aperture) optics to confine and transport particles in 3D, which in turn leads to several disadvantages, such as alignment complications, lower trap stability, and undesirable thermal effects. Hence, we study novel optical trapping methods, such as asymmetric counter-propagating beams, in which we engineer the optical forces to overcome the aforementioned limitations. This system is significantly easier to align, as it uses much lower-NA optics, which makes the manipulation system very flexible. The new approach allows the trapping and transportation of objects of different shapes, ranging in size from hundreds of nanometers to hundreds of micrometers, by exploiting asymmetric optical fields with higher stability. In addition, this technique allows for significantly longer particle trapping lengths, up to a few millimeters. As a result, we can apply this method to trapping much larger particles and microorganisms that have never been trapped optically before. The larger trapping lengths also enable the creation of 3D lattices of microscopic particles and other artificial structures, an important application of optical trapping.

This system can be used to create a fully reconfigurable medium by optically controlling the position of selected nano- and micro-sized dielectric and metallic particles to mimic a certain medium. This “table-top” emulation can significantly simplify our studies of wave-propagation phenomena on transmitted signals in the real world. 

Furthermore, an important application of an optical tweezer system is that it can be combined with a variety of spectroscopy and microscopy techniques to extract valuable, time-sensitive information from trapped entities. In this research, I plan to integrate several spectroscopy techniques into the proposed trapping method in order to achieve higher-resolution images, especially for biomaterials such as microorganisms.  


Michael Cooley

Machine Learning for Naval Discharge Review

When & Where:


Eaton Hall, Room 1

Committee Members:

Prasad Kulkarni, Chair
David Johnson (Co-Chair)
Jerzy Grzymala-Busse


Abstract

This research project aims to predict the outcome of Naval Discharge Review Board decisions based on factors in the application, using machine learning techniques. The study explores three popular machine learning algorithms: MLP, AdaBoost, and KNN, with KNN providing the best results. Training is validated through hyperparameter optimization and k-fold cross-validation.

Additionally, the study investigates the ability of ChatGPT's API to classify data that could not be classified manually. Over 8,000 samples were classified by ChatGPT's API, and an MLP model was trained using the same hyperparameters found to be optimal for the 3,000-sample manually labeled set. The model was then tested on the manual sample. The results show that the model trained on ChatGPT-labeled data performed equivalently, suggesting that ChatGPT's API is a promising tool for labeling in this domain.


Vasudha Yenuganti

RNA Structure Annotation Based on Base Pairs Using ML Based Classifiers

When & Where:


Eaton Hall, Room 2001B

Committee Members:

Cuncong Zhong, Chair
David Johnson
Prasad Kulkarni


Abstract

RNA molecules play a crucial role in the regulation of gene expression and other cellular processes. Understanding the three-dimensional structure of RNA is essential for predicting its function and interactions with other molecules. One key feature of RNA structure is the presence of base pairs, in which nucleotides, i.e., adenine (A), guanine (G), cytosine (C), and uracil (U), form hydrogen bonds with each other. The limited availability of high-quality RNA structural data, combined with atomic coordinate errors in low-resolution structures, presents significant challenges for extracting important geometric characteristics from RNA's complex three-dimensional structure, particularly in terms of base interactions.

In this study, we propose an approach for annotating base-pairing interactions in low-resolution RNA structures using machine learning (ML) classifiers, leveraging the more precise structural information available in high-resolution homologs. We first use the DSSR tool to extract annotations of high-resolution RNA structures along with the distances between atoms of interacting base pairs. These distances serve as features, and 12 standard annotations serve as labels for our ML models. We then apply different ML classifiers, including support vector machines, neural networks, and random forests, to predict RNA annotations. We evaluate the performance of these classifiers on a benchmark dataset and report their precision, recall, and F1-scores. Low-resolution RNA structures are then annotated based on sequence similarity with high-resolution structures and the corresponding predicted annotations.
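The distance-features-to-annotation-labels setup described above can be illustrated with a minimal nearest-neighbor classifier. The feature vectors and label names below are hypothetical examples for illustration; the study's actual classifiers are SVMs, neural networks, and random forests trained on DSSR-derived data:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify a distance-feature vector by majority label among its
    k nearest training vectors (Euclidean distance)."""
    neighbors = sorted(train, key=lambda ex: math.dist(ex[0], query))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

# Hypothetical training set: (inter-atomic distance features in angstroms,
# base-pair annotation label); real labels would be DSSR's 12 classes.
train = [
    ((2.9, 2.8, 10.4), "cWW"),   # Watson-Crick-like geometry
    ((3.0, 2.9, 10.6), "cWW"),
    ((3.4, 5.1, 9.2),  "tHS"),   # a non-canonical pair geometry
    ((3.5, 5.0, 9.1),  "tHS"),
]
print(knn_predict(train, (3.0, 2.85, 10.5)))  # "cWW"
```

Because each annotation class occupies a distinct region of distance space, even simple geometric classifiers separate well-resolved pairs; the harder cases targeted by this work are the distorted geometries found in low-resolution structures.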

For future aspects, the presented approach can also help to explore the plausible base pair interactions to identify conserved motifs in low-resolution structures. The detected interactions along with annotations can aid in the study of RNA tertiary structures, which can lead to a better understanding of their functions in the cell.


Venkata Nadha Reddy Karasani

Implementing Web Presence For The History Of Black Writing

When & Where:


LEEP2, Room 1415

Committee Members:

Drew Davidson, Chair
Perry Alexander
Hossein Saiedian


Abstract

The Black Literature Network Project is a comprehensive initiative to disseminate knowledge of Black literature to students, academics, and the general public. It encompasses four distinct portals, each featuring content created and curated by scholars in the field: the Novel Generator Machine, the Literary Data Gallery, Multithreaded Literary Briefs, and the Remarkable Receptions Podcast Series. My most significant contribution to this project was creating a standalone website for the Current Archives and Collections Index that offers an easily searchable index of Black-themed collections. Additionally, I was solely responsible for the complete development of the novel generator tool, an application that provides customized book recommendations based on user preferences. As part of the History of Black Writing (HBW) Program, I had the opportunity to customize an open-source annotation tool called Hypothesis, allowing end users to annotate all websites related to the Black Literature Network Project. The Black Book Interactive Project (BBIP) collaborates with institutions and groups nationwide to promote access to Black-authored texts and digital publishing. Through BBIP, we plan to increase Black literature's visibility in digital humanities research.