Show simple item record

Field | Value | Language
dc.contributor.author | Zhou, Bingxin |
dc.date.accessioned | 2022-05-23T01:15:32Z |
dc.date.available | 2022-05-23T01:15:32Z |
dc.date.issued | 2022 | en_AU
dc.identifier.uri | https://hdl.handle.net/2123/28617 |
dc.description.abstract | Deep learning is among the predominant techniques behind the phenomenal success of many modern applications. The demand for massive data analysis in image recognition, speech processing, and text understanding has spawned remarkable advances in deep learning across diverse research areas. These advances have yielded graph neural networks (GNNs), an emerging class of deep neural networks that encode the internal structural relationships of their inputs. Mainstream GNNs seek an adequate numerical representation of graphs, which is vital to the predictive performance of machine learning models. Graph representation learning has many real-world applications, such as drug repurposing, protein classification, epidemic spread control, and social network analysis. The rapid development of GNNs over the last five years has exposed several design flaws, such as over-smoothing, vulnerability to perturbation, limited expressivity, and a lack of explainability. Meanwhile, sustained enthusiasm in this research area has built cumulative experience in solving complicated problems, such as size-variant graph compression and capturing the dynamics of time-varying graphs. This thesis aims to shed mathematical light on several of these issues. The permutation-invariant design of graph compression is supported by manifold learning; robust graph smoothing relies on the principles of convex optimization; and efficient dynamic graph embedding leverages global spectral transforms and power-method singular value decomposition. The author believes that the effectiveness of deep learning designs should not be judged solely by performance on particular datasets, and that modifications to a black-box model should go beyond fine-tuning tricks. Reliable deep learning calls for models designed with rigorous mathematics, so that 'computer science' may one day become an actual science. | en_AU
dc.language.iso | en | en_AU
dc.subject | graph neural networks | en_AU
dc.subject | deep learning | en_AU
dc.subject | spectral transform | en_AU
dc.subject | graph signal processing | en_AU
dc.title | Geometric Signal Processing with Graph Neural Networks | en_AU
dc.type | Thesis |
dc.type.thesis | Doctor of Philosophy | en_AU
dc.rights.other | The author retains copyright of this thesis. It may only be used for the purposes of research and study. It must not be used for any other purposes and may not be transmitted or shared with others without prior permission. | en_AU
usyd.faculty | SeS faculties schools::The University of Sydney Business School::Discipline of Business Analytics | en_AU
usyd.degree | Doctor of Philosophy Ph.D. | en_AU
usyd.awardinginst | The University of Sydney | en_AU
usyd.advisor | Gao, Junbin |
usyd.include.pub | No | en_AU

