In 2023, the Agrusa Competition awarded a total of $10,000 in prize money.
This year's competitors included 11 teams made up of 24 students who were advised by 10 faculty members.
| Place | Title |
| --- | --- |
| 1st | Cyclops: A Nanomaterial-based, Battery-Free Intraocular Pressure (IOP) Monitoring System inside Contact Lens (Liyao Li) |
| 2nd | Sherloc: Secure and Holistic Control-Flow Violation Detection on Embedded Systems |
| 3rd | Validating IoT and Wearable Devices with Rate-Based Session Types (Grant Iraci and Cheng-En Chuang) |
| Honorable Mention | UltraFeel: A Portable and High Definition Haptic Reality Platform Using a Low-cost Ultrasonic Array (Alexander Gherardi) |
| Honorable Mention | PRIMAL (Naresh Kumar Devulapally) |
Below are descriptions of all the student projects that competed for the prize money.
Given the widespread adoption and significant expansion of AI-generated content (AIGC), distinguishing between authentic and synthesized content has become a growing challenge for viewers. To combat the potential malicious use of AIGC for misinformation and disinformation, we are developing a learning platform named DeepfakePedia. The platform aims to help teenage viewers gain a basic knowledge of generative AI, become aware of its potential threats to information integrity, and learn how to identify and distinguish between real and fake media. The DeepfakePedia learning platform will contain:
The SHERLOC project has developed a proof-of-concept system for secure and comprehensive control-flow violation detection (CFVD) in embedded systems. SHERLOC monitors control-flow transfers within and between privileged and unprivileged components to secure microcontroller-based systems. The prototype leverages hardware tracing features available on microcontrollers to record control-flow transfers, and it combines an interrupt- and scheduling-aware algorithm with backward-edge and forward-edge policies to detect violations across function calls, returns, indirect jumps, and exceptions/interrupts. To evaluate SHERLOC's effectiveness, a series of experiments were conducted with various programs and systems, such as BEEBS programs, Blinky bare-metal systems, and FreeRTOS. The results show that SHERLOC is both effective and efficient at detecting control-flow violations in embedded systems.
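To make the detection idea concrete, the sketch below (written in Python purely for exposition) checks a recorded trace of control-flow transfers against a forward-edge allowlist and a backward-edge shadow stack. The trace format, addresses, and 4-byte call size are illustrative assumptions, not SHERLOC's actual data structures or policies.

```python
# Illustrative only: offline control-flow violation detection over a recorded
# trace of transfers, using a forward-edge allowlist and a backward-edge
# shadow stack.
CALL, RET, IJMP = "call", "ret", "ijmp"

def check_trace(trace, forward_edges):
    """trace: list of (kind, src, dst); forward_edges: {src: set of valid dsts}."""
    shadow_stack = []
    for kind, src, dst in trace:
        if kind in (CALL, IJMP):
            # Forward-edge policy: the target must be an allowed destination.
            if dst not in forward_edges.get(src, set()):
                return f"forward-edge violation at {src:#x} -> {dst:#x}"
            if kind == CALL:
                shadow_stack.append(src + 4)      # expected return address
        elif kind == RET:
            # Backward-edge policy: returns must match the shadow stack.
            if not shadow_stack or dst != shadow_stack.pop():
                return f"backward-edge violation: return to {dst:#x}"
    return "no violation detected"

# A call from 0x100 to 0x200, then a return hijacked to 0xdead.
trace = [(CALL, 0x100, 0x200), (RET, 0x20c, 0xdead)]
print(check_trace(trace, {0x100: {0x200}}))       # -> backward-edge violation
```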
Our project demonstrates the effectiveness of generative AI in producing lifelike fingerprints. Although GANs and conditional diffusion models, especially diffusion models, are state-of-the-art for image generation, they have seen little use in biometric data generation, which motivated this project; we continue to investigate how these models can provide finer-grained control over synthetic fingerprint characteristics. Our proof-of-concept system uses StyleGAN-generated synthetic fingerprints for training. Within this system, we implemented the SimCLR framework to train a neural network that converts fingerprints into high-dimensional embeddings, the foundation of our fingerprint authentication process. Authentication relies on a cosine distance metric for robust identity verification. Our approach not only addresses biometrics challenges, including the cost and privacy concerns associated with collecting and recognizing real fingerprint data, but also explores advances in unsupervised learning and generative AI.
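As a rough illustration of the verification step, here is a minimal cosine-similarity check over embeddings. The random vectors, dimensionality, and threshold are placeholders standing in for the trained SimCLR encoder's outputs.

```python
# Illustrative only: embedding-based fingerprint verification with cosine
# similarity. The encoder is replaced by random vectors for this sketch.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe_embedding, enrolled_embedding, threshold=0.8):
    """Accept the claimed identity if the embeddings are close enough."""
    return cosine_similarity(probe_embedding, enrolled_embedding) >= threshold

# Random 128-d embeddings standing in for encoder outputs.
rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)
probe = enrolled + 0.05 * rng.normal(size=128)   # same finger, slight noise
impostor = rng.normal(size=128)
print(verify(probe, enrolled), verify(impostor, enrolled))   # True False
```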
We introduce Cyclops, a low-cost, battery-free contact lens-based eye pressure sensing system that supports long-range data reading of up to 1 m. This design eliminates the need for a bulky data reader in close proximity to the user. We achieve long-range communication by integrating backscatter communication with the pressure sensor, all embedded within a compact contact lens. Compared with existing works, our system significantly improves the communication distance. Our design features a three-layer, sandwich-like antenna structure that includes two metallic antenna layers with a central sensing layer in between. The sensing layer shows significant variations in its intrinsic properties in response to pressure changes. As the sensing layer is integrated into the antenna assembly, it influences the antenna's overall impedance. Through careful optimization of the structure of the two metallic layers, we align the antenna's impedance with that of the backscatter chip, enabling long-range communication.
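For intuition about why aligning the antenna's impedance with the chip's matters, the sketch below computes the power-wave reflection coefficient for hypothetical impedance values; a conjugate-matched chip reflects almost nothing, so more power reaches the backscatter chip. The numbers are illustrative, not Cyclops' measurements.

```python
# Illustrative only: impedance matching between antenna and backscatter chip.
def reflection_coefficient(z_antenna, z_chip):
    """Power-wave reflection coefficient between antenna (source) and chip (load)."""
    return (z_chip - z_antenna.conjugate()) / (z_chip + z_antenna)

z_antenna = complex(50, 20)                 # hypothetical antenna impedance (ohms)
for z_chip in [complex(50, -20), complex(20, -100)]:
    gamma = reflection_coefficient(z_antenna, z_chip)
    delivered = 1 - abs(gamma) ** 2         # fraction of power delivered to the chip
    print(z_chip, round(abs(gamma), 3), round(delivered, 3))
```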
In any organization, information scattered across multiple documents and websites can be daunting for users to navigate. Our goal is to create an interactive platform that can answer users' questions from such diverse sources of information. In this project, we have built Ubuddy, an interactive social university robot using the Pepper platform. The system activates on a wake word and delivers precise answers to UB-specific queries by employing a retrieval-based language model pipeline that integrates ChatGPT and Llama 2. It provides real-time information on university events and programs, sourced directly from official university channels, ensuring accuracy and relevance. The robot's display tablet not only shows directions on a map but also answers questions about university leadership, and the robot entertains users with dances and gestures. Additionally, it can detect emotions and identify individuals through facial and audio cues, which enhances personalized engagement. We have demonstrated our system, which combines web scraping and robotics to serve the university community, at multiple forums.
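A minimal retrieval-then-answer sketch is shown below. The tiny corpus, bag-of-words scoring, and answer_with_llm stub are placeholders only; Ubuddy's actual pipeline uses ChatGPT/Llama 2 over web-scraped university sources.

```python
# Illustrative only: retrieve the most relevant document, then ask an LLM to
# answer using that context.
def score(query, document):
    q, d = set(query.lower().split()), set(document.lower().split())
    return len(q & d) / (len(q) or 1)

def retrieve(query, corpus, k=1):
    return sorted(corpus, key=lambda doc: score(query, doc), reverse=True)[:k]

def answer_with_llm(query, context):
    # Placeholder for a call to a hosted or local LLM with the retrieved context.
    return f"[LLM answer to '{query}' grounded in: {context[0][:60]}...]"

corpus = [
    "The CSE department office is located in Davis Hall.",
    "The distinguished speaker series takes place on Fridays at 3pm.",
]
query = "Where is the CSE department office?"
print(answer_with_llm(query, retrieve(query, corpus)))
```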
UltraFeel is a portable, high-definition, and low-cost solution for integrating haptic feedback into mixed reality, or more concisely, haptic reality for the hands. The hardware platform consists of a 16-by-16 phased array of 40 kHz ultrasonic emitters, an I/O-intensive FPGA controller, an ESP32-based networking solution, and an AR/VR device (i.e., Quest 2). The 256 ultrasonic emitters work collaboratively to generate high-resolution convergence points. The amplitude modulation rate of 200 Hz allows the ultrasonic airflow to deliver high-definition haptic feedback to users. The prototype, an 11.3 in² board, includes 544 components in total: 256 ultrasonic emitters, 160 capacitors/resistors, 128 MOSFET drivers, and 32 shift registers. The project efforts entailed:
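As an aside on the beam-focusing principle behind convergence points, the sketch below computes per-emitter phase delays so that all 256 wavefronts arrive in phase at a chosen focal point. The grid pitch, focal point, and speed of sound are assumed values, not UltraFeel's calibration.

```python
# Illustrative only: phase delays that focus an n-by-n ultrasonic array.
import math

SPEED_OF_SOUND = 343.0        # m/s in air
FREQ = 40_000.0               # 40 kHz carrier
WAVELENGTH = SPEED_OF_SOUND / FREQ

def emitter_phases(n=16, pitch=0.01, focus=(0.075, 0.075, 0.15)):
    """Return per-emitter phase offsets (radians) for an n-by-n array."""
    fx, fy, fz = focus
    phases = []
    for i in range(n):
        for j in range(n):
            x, y = i * pitch, j * pitch
            dist = math.sqrt((x - fx) ** 2 + (y - fy) ** 2 + fz ** 2)
            # Negative path delay so all wavefronts arrive in phase at the focus.
            phases.append((-2 * math.pi * dist / WAVELENGTH) % (2 * math.pi))
    return phases

phases = emitter_phases()
print(len(phases), round(phases[0], 3), round(phases[128], 3))
```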
Our project addresses privacy concerns within biometrics by implementing an end-to-end vascular feature recognition system designed with a strong emphasis on security and confidentiality. Our solution combines the power of the ResNet50 architecture with biohashing to securely capture and manage individuals' biometric data. It relies on near-infrared light scanning to acquire images, which are then processed to generate a unique digital representation of the user's vein pattern, known as a biohash.
Furthermore, our system incorporates an advanced matching algorithm that compares the user's biohash with the securely stored biohash to establish identity verification. The core model achieves an accuracy rate of 95.58%, and by utilizing biohashing we attain an Equal Error Rate (EER) of 0, a 100% reduction in error that underscores how effectively biohashing protects biometric data and privacy.
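For readers unfamiliar with biohashing, here is a minimal sketch of the general idea: project a feature vector with a token-seeded random matrix and binarize it, then match templates by Hamming distance. The dimensions, seeding, and thresholding shown are illustrative and not the project's exact scheme.

```python
# Illustrative only: biohashing by token-seeded random projection + binarization.
import numpy as np

def biohash(features, user_token, n_bits=64):
    rng = np.random.default_rng(user_token)          # token-seeded projection
    projection = rng.normal(size=(n_bits, features.shape[0]))
    return (projection @ features > 0).astype(np.uint8)

def hamming_distance(a, b):
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(1)
enrolled_features = rng.normal(size=2048)             # stand-in for a CNN embedding
probe_features = enrolled_features + 0.1 * rng.normal(size=2048)
enrolled_hash = biohash(enrolled_features, user_token=42)
probe_hash = biohash(probe_features, user_token=42)
print(hamming_distance(enrolled_hash, probe_hash))    # small for a genuine match
```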
The paper related to this project has been accepted and will be published in IEEE Xplore. It was recently presented at the 2023 IEEE Western New York Image and Signal Processing Workshop:
Chris Humphry, Sunil Rufus, and Nalini Ratha. “Secure Vascular Biometric Recognition”. 2023 IEEE Western New York Image and Signal Processing Workshop (WNYISPW). IEEE, 2023.
Privacy issues in current AI systems raise serious ethical and legal concerns, compromising individual rights and societal well-being as identified in the US President’s recent executive order on AI. Protecting privacy in critical applications like biometrics and medical data is crucial for responsible AI development, trust, and security.
Fully Homomorphic Encryption (FHE) is the pinnacle of privacy and security, enabling computations on encrypted data. First, we developed an end-to-end privacy-preserving Convolutional Neural Network (CNN) for detecting sleep apnea from encrypted ECG signals using FHE, achieving an accuracy of 97.6%. This work establishes a fresh benchmark for secure and reliable sleep apnea detection.
Next, we enhanced the privacy of face embeddings for prominent applications like face recognition through FHE and multivariate polynomial transformations, establishing an innovative dual-layer security protocol. This work pioneers advanced biometric protection using FHE, enabling encrypted face recognition and setting new standards in biometrics, privacy, and security.
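As a toy illustration of computing on encrypted data, the sketch below uses the open-source TenSEAL library (an assumption made for illustration; not necessarily the tooling used in these projects) to evaluate a single linear layer on a CKKS-encrypted vector.

```python
# Illustrative only: a single encrypted linear layer with TenSEAL's CKKS scheme.
# The feature vector, weights, and parameters are toy values, not the project's
# CNN for ECG signals or its face-embedding protocol.
import tenseal as ts

# Client side: create an FHE context and encrypt the input features.
context = ts.context(ts.SCHEME_TYPE.CKKS,
                     poly_modulus_degree=8192,
                     coeff_mod_bit_sizes=[60, 40, 40, 60])
context.global_scale = 2 ** 40
context.generate_galois_keys()

features = [0.2, -1.1, 0.7, 0.05]                 # stand-in for an ECG/face feature vector
encrypted = ts.ckks_vector(context, features)

# Server side: compute a linear score directly on the ciphertext.
weights = [0.5, -0.25, 1.0, 0.1]
bias = 0.3
encrypted_score = encrypted.dot(weights) + bias   # no decryption needed here

# Client side: only the key holder can read the result.
print(encrypted_score.decrypt())
```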
Our project has developed a novel framework, HATEGUARD, that addresses the challenge of detecting and mitigating new waves of online hate. HATEGUARD incorporates a chain-of-thought (CoT) reasoning approach, equipping large language models (LLMs) with the reasoning capabilities needed to determine whether new content exhibits hateful characteristics. It also includes prompt-based zero-shot classification, which streamlines the updating process: responding to a new wave only requires revising the detection policy. Our framework takes a first step toward practically moderating new waves of online hate by harnessing the potency of LLMs.
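The sketch below illustrates the general shape of prompt-based, zero-shot moderation with a chain-of-thought instruction. The prompt wording, policy text, and the call_llm stub are hypothetical placeholders, not HATEGUARD's actual prompts or model.

```python
# Illustrative only: zero-shot hate detection via a policy-carrying CoT prompt.
PROMPT_TEMPLATE = """You are a content moderator.
Policy (edit this text when a new wave of online hate emerges):
{policy}

Post: "{post}"

Reason step by step about whether the post violates the policy,
then answer on the last line with exactly HATEFUL or NOT_HATEFUL."""

def call_llm(prompt: str) -> str:
    # Placeholder for a real LLM API call; returns a canned response so the
    # sketch runs end to end.
    return "Step 1: the post derogates a newly targeted group...\nHATEFUL"

def moderate(post: str, policy: str) -> str:
    response = call_llm(PROMPT_TEMPLATE.format(policy=policy, post=post))
    return response.strip().splitlines()[-1]        # final verdict line

# Updating detection for a new wave only requires editing the policy string.
policy = "Content that attacks or demeans people based on group identity is hateful."
print(moderate("example user post", policy))        # -> HATEFUL (canned)
```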
Based on this project, our paper, "Moderating New Waves of Online Hate with Chain-of-Thought Reasoning in Large Language Models," has been accepted to the 45th IEEE Symposium on Security and Privacy (S&P), one of the "Big 4" computer security conferences, with an acceptance rate of 14.9%.
This project has developed EI2-AICare, an AI-powered, emotionally intelligent interactive agent for informal caregivers, who are an essential component of the healthcare system. Caregivers often face challenges in performing caregiving tasks and are at risk of developing burnout and mental health issues. EI2-AICare leverages a Large Language Model (LLM)-powered human-robot interaction mechanism to provide interventions for caregivers.
These interventions include:
The interactive agent supports caregivers in executing their caregiving tasks while also caring for their personal well-being.