Aoxuan Silvia Zhang 🐰

Pronounced: Ao-Sh-oo-en S-ih-l-v-ee-uh J-ah-ng

(she/her)

Computer Science and Engineering & Mathematics Student

Korea University

KAIST MLAI Lab (incoming)

About Me

I am passionate about pushing the boundaries of how language models, generative techniques, and weight-space representations can be combined to advance agentic AI systems. My undergraduate double major in Computer Science & Engineering and Mathematics has provided a strong technical foundation, and I will join KAIST’s MLAI Lab as a Master’s student starting Spring 2026.

Outside of academic research, I co-founded Cybereality.ai, an independent research and experimentation group where we explore AI-powered applications that connect digital experiences with real-world everyday life. The project serves as a creative sandbox for prototyping ideas around agentic systems, human–AI interaction, and next-generation Life OS concepts.

Education

M.S. (incoming)

KAIST MLAI Lab

B.S. Computer Science and Engineering & Mathematics

Korea University

Exchange Program, Department of Mathematics

The Hong Kong University of Science and Technology (HKUST)

Interests

Large Language Models · Generative Modeling · Weight-Space Learning · Agentic AI
📚 My Research
Welcome! I am Silvia Zhang, a Computer Science and Engineering & Mathematics double major at Korea University and an incoming M.S. student at KAIST’s MLAI Lab. My research interests include large language models, generative modeling, weight-space learning, and agentic AI.
Featured Publications

Cost-Sensitive Freeze-Thaw Bayesian Optimization for Efficient Hyperparameter Tuning

Dong Bok Lee, Aoxuan Silvia Zhang, Byungjoo Kim, Junhyeon Park, Juho Lee, Sung Ju Hwang, Hae Beom Lee
Recent Publications
Projects

Nexus

Nexus is a clarity engine for life in a foreign world — a single interface that turns the chaos of unfamiliar systems, languages, and daily decisions into something intuitive and …

Experience

  1. AI Engineer

    DeepAuto.ai

    Agentic AI Systems

    • Contributed to the development of a three-stage agentic AI workflow (compile, implement, and execute) that supports structured workflow automation.
    • Assisted in building and testing modules that transform high-level workflow plans into atomic functions and execute them with LLM-powered agents.
  2. Research Intern

    KAIST MLAI Lab

    Weight Generation for Large Language Models

    • Conducted a comprehensive literature survey on generative models for weight generation and alternative approaches to weight-space learning in neural networks.
    • Ran and analyzed experimental results from existing codebases, and assisted in debugging and reproducing key experiments to validate methodologies.
    • Co-authored a paper (under review at ICLR 2026): Merging Language Models in Latent Space, proposing a VAE-based framework for merging heterogeneous large language models in a shared latent space.
    • Investigated weight distribution properties (kurtosis, compressibility) and contributed to the design and implementation of latent-space fusion experiments.
  3. AI Researcher

    DeepAuto.ai

    LLM Agent on Hyperparameter Optimization

    • Conducted a literature review on state-of-the-art hyperparameter optimization techniques for large language models (LLMs).
    • Analyzed existing codebases and replicated experiments to understand optimization workflows.
    • Implemented and tested existing optimization methods to assess their impact on model performance.
  4. Research Intern

    KAIST MLAI Lab

    Hyperparameter Optimization

    • Developed and implemented baseline optimization algorithms, including Bayesian Optimization with Hyperband (BOHB), Differential Evolution with Hyperband (DEHB), and Functional Surrogate-Based Optimization (FSBO) to efficiently optimize complex black-box functions.
    • Evaluated the performance of these algorithms on benchmark problems and real-world applications, demonstrating their effectiveness in sample-efficient hyperparameter tuning and optimization.
    • Co-authored a paper (NeurIPS 2025): Cost-Sensitive Freeze-Thaw Bayesian Optimization for Efficient Hyperparameter Tuning, introducing a novel cost-aware strategy to improve resource allocation in freeze-thaw Bayesian optimization.
  5. AWS AI & ML Scholarship Recipient

    Udacity

    AI Programming with Python

    • Participated in the AWS DeepRacer Student League and received the AWS AI & ML Scholarship.
    • Completed a collaborative virtual course covering programming tools and techniques fundamental to machine learning, with support from Udacity instructors in weekly group sessions.
    • Project 1: Use a Pre-trained Image Classifier to Identify Dog Breeds.
    • Project 2: Create an Image Classifier.
  6. Technical Consulting Virtual Intern

    SAP (via Forage)

    Participant in SAP’s virtual experience program, delivered through Forage.

    • Completed practical task modules covering data assembly, data analysis, and presentation of results.

Education

  1. M.S. (incoming)

    KAIST MLAI Lab
    Incoming M.S. student at KAIST’s Machine Learning and Artificial Intelligence (MLAI) Lab, advised by Prof. Sung Ju Hwang. My graduate research will focus on large language models, generative modeling, weight-space learning, and agentic workflows for scalable foundation models.
  2. B.S. Computer Science and Engineering & Mathematics

    Korea University
    B.S. in Computer Science and Engineering with a double major in Mathematics at Korea University.
  3. Exchange Program, Department of Mathematics

    The Hong Kong University of Science and Technology (HKUST)
    Exchange student at HKUST School of Science (Mathematics), where I broadened my academic perspective in applied mathematics, optimization, and machine learning. Completed advanced coursework in statistics, stochastic processes, regression analysis, time series, and machine learning — providing a rigorous theoretical foundation for research on LLMs and generative modeling.
Blog

🌟 Something About Me

A glimpse into my interests, hobbies, and the things that bring me joy outside of research