One-shot NAS methods have attracted much interest from the research community due to their remarkable training efficiency and capacity to discover high-performance models. However, the search spaces of previous one-shot works usually relied on hand-crafted design and offered limited flexibility in network topology. In this work, we enhance one-shot NAS by exploring high-performing network architectures in our large-scale Topology Augmented Search Space (i.e., over 3.4 × 10^10 different topological structures). Specifically, the difficulty of architecture search in such a complex space is addressed by the proposed stabilized share-parameter proxy, which employs Stochastic Gradient Langevin Dynamics (SGLD) to enable fast shared-parameter sampling, thereby achieving stable measurement of architecture performance even in a search space with complex topological structures. The proposed method, namely Stabilized Topological Neural Architecture Search (ST-NAS), achieves state-of-the-art performance under the Multiply-Adds (MAdds) constraint on ImageNet. Our lite model ST-NAS-A achieves 76.4% top-1 accuracy with only 326M MAdds, and our moderate model ST-NAS-B achieves 77.9% top-1 accuracy requiring just 503M MAdds. Both models offer superior performance in comparison to other concurrent works on one-shot NAS.
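For context, the standard SGLD update rule (Welling and Teh style) that underlies this kind of shared-parameter sampling can be sketched as follows; this is a generic illustration, not the paper's exact formulation, with \(\theta_t\) denoting the shared supernet weights, \(\eta_t\) the step size, and \(\mathcal{L}\) the training loss:

```latex
\theta_{t+1} = \theta_t - \frac{\eta_t}{2}\,\nabla_\theta \mathcal{L}(\theta_t) + \epsilon_t,
\qquad \epsilon_t \sim \mathcal{N}(0,\, \eta_t I)
```

The injected Gaussian noise \(\epsilon_t\) turns plain SGD into a sampler over the posterior of the shared weights, which is what allows drawing multiple weight samples cheaply to stabilize the performance estimate of each candidate topology.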