Show simple item record

Field                      Value                                                                       Language
dc.contributor.author      Huang, Tao
dc.date.accessioned        2025-02-06T23:39:45Z
dc.date.available          2025-02-06T23:39:45Z
dc.date.issued             2025                                                                        en_AU
dc.identifier.uri          https://hdl.handle.net/2123/33598
dc.description.abstract    Deep learning has revolutionized numerous fields, but its success is often hindered by computational inefficiency, reliance on vast labeled datasets, and challenges in designing optimal architectures. This thesis addresses these issues through contributions in four key areas: handcrafted efficient architecture design, automatic neural architecture evolution, effective knowledge distillation, and data-efficient training. First, we propose LightViT, a lightweight vision transformer, and LocalMamba, a visual state-space model, to advance handcrafted architecture design by balancing accuracy and efficiency. Second, we introduce GreedyNASv2, a method to optimize neural architecture search (NAS), and DyRep, a dynamic re-parameterization framework for evolving architectures during training. Third, our work on knowledge distillation includes DIST for improving logits-based distillation, MasKD for feature-level distillation via adaptive masks, and DiffKD, which unifies logit and feature distillation using diffusion models. Lastly, we tackle the challenge of data efficiency with ActGen, an active generation framework for synthesizing hard examples, and MI-MAE, a self-supervised method leveraging mutual information for masked image modeling. Together, these advancements form a cohesive framework for efficient deep learning, addressing computational, data, and architectural challenges to push the boundaries of scalable and practical machine learning systems.    en_AU
dc.language.iso            en                                                                          en_AU
dc.subject                 Deep Learning                                                               en_AU
dc.subject                 Computer Vision                                                             en_AU
dc.subject                 Efficient Machine Learning                                                  en_AU
dc.subject                 Neural Architecture Search                                                  en_AU
dc.subject                 Knowledge Distillation                                                      en_AU
dc.title                   Efficient Deep Neural Architecture Design and Training                      en_AU
dc.type                    Thesis
dc.type.thesis             Doctor of Philosophy                                                        en_AU
dc.rights.other            The author retains copyright of this thesis. It may only be used for the purposes of research and study. It must not be used for any other purposes and may not be transmitted or shared with others without prior permission.    en_AU
usyd.faculty               SeS faculties schools::Faculty of Engineering::School of Computer Science   en_AU
usyd.degree                Doctor of Philosophy Ph.D.                                                  en_AU
usyd.awardinginst          The University of Sydney                                                    en_AU
usyd.advisor               Xu, Chang
usyd.include.pub           No                                                                          en_AU
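Background note: the abstract refers to "logits-based distillation", which the thesis's DIST method improves upon. As context only (this is the classic temperature-scaled formulation attributed to Hinton et al., not the thesis's own method), a minimal self-contained sketch of the standard knowledge-distillation loss, where a student's softened output distribution is matched to a teacher's via KL divergence:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    """Classic logits-based distillation loss: KL(teacher || student)
    on temperature-softened distributions, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# Identical teacher and student logits yield zero loss;
# diverging logits yield a strictly positive loss.
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels; the temperature value of 4.0 here is an illustrative choice, not one taken from the thesis.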

