For non-Euclidean data, it is vital to learn representations that respect the underlying geometry. Studies have shown that hyperbolic space can effectively embed hierarchical or tree-like data, and the past few years have witnessed rapid development of hyperbolic neural networks. In this Signature Work, we first propose novel architectures and layers that improve the training stability of hyperbolic neural networks. The proposed hyperbolic generative network combines an auto-encoder with a generative adversarial network. It is designed to produce expressive and numerically stable representations in hyperbolic space, and we show that it is effective in generating both tree-like graphs and complex molecular data with state-of-the-art structure-related performance. In addition, we propose HKConv, a novel trainable hyperbolic convolution layer that extracts local features according to hyperbolic geometry and is equivariant to permutations of hyperbolic points and invariant to parallel transport of a local neighborhood. Our approach advances the state of the art on various tasks.
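To give a concrete sense of the hyperbolic operations that layers of this kind build on, the following minimal sketch implements the standard exponential map at the origin, Möbius addition, and geodesic distance on the Poincaré ball. This is a generic illustration of well-known formulas, not the architectures proposed in this work; the curvature parameter `c` and the helper names are illustrative assumptions.

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-8):
    """Exponential map at the origin of the Poincare ball with curvature -c.

    Maps a Euclidean (tangent) vector v to a point inside the ball via
    exp_0(v) = tanh(sqrt(c) * ||v||) * v / (sqrt(c) * ||v||).
    """
    sqrt_c = np.sqrt(c)
    norm = max(np.linalg.norm(v), eps)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def mobius_add(x, y, c=1.0):
    """Mobius addition, the hyperbolic analogue of vector addition on the Poincare ball."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c ** 2 * x2 * y2
    return num / den

def dist(x, y, c=1.0):
    """Geodesic distance between two points on the Poincare ball."""
    sqrt_c = np.sqrt(c)
    diff_norm = np.linalg.norm(mobius_add(-x, y, c))
    return (2.0 / sqrt_c) * np.arctanh(sqrt_c * diff_norm)

# Usage: embed two tangent vectors into the ball and measure their hyperbolic distance.
u = expmap0(np.array([0.3, 0.1]))
w = expmap0(np.array([-0.2, 0.4]))
print(dist(u, w))
```

Hyperbolic layers typically compose such maps with learnable Euclidean operations applied in the tangent space, which is one common route to the numerical stability concerns mentioned above.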