Location:  Room 149 of the Keck Center

Time and Date:  Wednesday, Nov. 3, at 4pm.

https://chapman.zoom.us/j/99926295635?pwd=Vlgwd0wvM1VwV0xQcHNmMHM1UWdvZz09

Meeting ID: 999 2629 5635


Abstract:  The physical advantages of building a quantum computer out of optical-frequency photons include: they suffer negligible environmental decoherence even at room temperature, there is no crosstalk, they network easily into arbitrary geometries, the relevant physics is not heuristic and is often both efficiently simulatable and verifiable with classical light, and measurements – the critical element for the entropy reduction needed to achieve fault tolerance – are sharp and extremely fast. However, these pale in comparison to the engineering advantages: all parts of the machine can be built in a tier-1 foundry and packaged with the same back-end-of-line processes used to build laptops and cellphones. Thus, with photons we can realistically stare down the sorts of numbers (~1 million qubits) that capture the size of machine required to do useful quantum computation.

So what is the catch? The primary obstacle is that the specific type of entanglement we need between different photons can only be created probabilistically, and is difficult to create in the presence of loss. In this talk I will give an overview of an architecture, Fusion-Based Quantum Computing (FBQC), that sits somewhere between the extremes of matter-based (circuit) and one-way (cluster-state) quantum computing. It requires the production of only fixed-size entangled states regardless of the size of the computation being performed, and these states can have a high probability of loss (or of failing to be produced at all). This allows us to attempt creation of the desired entangled states multiple times in parallel, and then to select out the successful events.
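
The parallel-attempt (multiplexing) idea mentioned at the end of the abstract can be illustrated with a simple probability sketch. The snippet below is not from the talk and uses purely illustrative numbers: if a single attempt to produce the needed entangled state succeeds with probability p, then making n attempts in parallel and selecting any successful one succeeds with probability 1 - (1 - p)^n.

    # Illustrative multiplexing calculation (assumed numbers, not from the talk):
    # probability that at least one of n parallel attempts at producing an
    # entangled resource state succeeds, given per-attempt success probability p.

    def multiplexed_success(p: float, n: int) -> float:
        """Return 1 - (1 - p)**n, the chance that any of n parallel attempts succeeds."""
        return 1.0 - (1.0 - p) ** n

    if __name__ == "__main__":
        p = 0.25  # hypothetical per-attempt success probability
        for n in (1, 4, 16, 64):
            print(f"{n:3d} parallel attempts -> success probability {multiplexed_success(p, n):.4f}")

With these assumed numbers, even a modest degree of parallelism drives the selected-event success probability close to one, which is the sense in which probabilistic state generation can still feed a deterministic computation.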