Classical Simulation of Non-Clifford Noise Channels

Date:

We introduce the basics of quantum computing and the simulation of quantum systems on classical computers. We then discuss noise in quantum systems and how it is classically modelled, along with the difficulties of simulating quantum noise on classical computers. Our primary question is how to efficiently simulate quantum noise by leveraging existing techniques based on the Gottesman-Knill theorem, which provides efficient simulation of circuits containing only Clifford gates. To this end we develop the theory and practical implementations of the K-Gadget, a method to compress and simulate thousands of instances of damping noise within reasonable memory constraints.
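As a toy illustration of how damping noise is classically modelled (independent of the K-Gadget itself), the sketch below applies the standard amplitude-damping Kraus operators to a single-qubit density matrix with NumPy. The damping parameter gamma and the initial state are arbitrary choices for the example, not values from the text.

```python
import numpy as np

def amplitude_damping(rho: np.ndarray, gamma: float) -> np.ndarray:
    """Apply the amplitude-damping channel rho -> sum_k K_k rho K_k^dagger."""
    K0 = np.array([[1.0, 0.0],
                   [0.0, np.sqrt(1.0 - gamma)]])  # no decay occurred
    K1 = np.array([[0.0, np.sqrt(gamma)],
                   [0.0, 0.0]])                   # |1> decays to |0>
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# Start in the excited state |1><1| and damp with gamma = 0.3.
rho = np.array([[0.0, 0.0],
                [0.0, 1.0]])
rho_out = amplitude_damping(rho, 0.3)

print(rho_out)           # population 0.3 in |0>, 0.7 in |1>
print(np.trace(rho_out)) # the channel is trace-preserving
```

Note that this dense-matrix approach scales exponentially with the number of qubits, which is exactly the obstacle that stabilizer-based methods in the Gottesman-Knill tradition are meant to avoid.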