>>14025144
>why does there seem to be so little research into optical computing
lol
There's lots, dumbass. Boson sampling is a form of quantum computing that does what you suggest, although it only computes specific functions (one being the permanent of the scattering matrix describing how your photons interfere). The problem is that you more or less need all of your photons to be indistinguishable for the output statistics to encode the permanent, and it only beats a classical computer once you've got more than roughly 40 individual photons. People have managed high photon counts, but only with very large tabletop setups. Integrated optics are also being applied to this (small-scale stuff where the waveguides are etched into a chip), but I don't believe those have reached high photon numbers yet. Finding algorithms that go beyond computing matrix functions is also an open problem.
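If you want to see why the permanent is the bottleneck, here's a rough Python sketch of Ryser's formula (toy example, function name and test numbers are mine, not from any boson-sampling paper); the sum over column subsets makes the cost blow up like 2^n, which is why ~40 photons is roughly where classical simulation starts to hurt:

from itertools import combinations

def permanent(a):
    # Ryser's formula: ~O(2^n * n^2) work in this naive form, still exponential.
    # Boson-sampling output probabilities are proportional to
    # |perm(submatrix of the interferometer's unitary)|^2, so this is the
    # quantity a classical simulation has to grind out.
    n = len(a)
    total = 0.0
    for r in range(1, n + 1):                  # all non-empty column subsets
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in a:                      # product over rows of row sums on S
                prod *= sum(row[j] for j in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10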
And if we're just going to talk about new classical computing materials, think about the scale we've reached with silicon transistors: a single transistor has nanometre-scale features, and there are billions of them on a single chip only a couple of square centimetres in area. Shrinking optical devices to that scale is going to be challenging; see the back-of-envelope numbers below.
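Rough arithmetic (my own round numbers: ~10 billion transistors on ~1 cm^2 of die) for the density you'd have to match:

import math

transistors = 10e9                    # assumed: ~10 billion on one die
die_area_nm2 = 1.0 * (1e7) ** 2       # assumed: ~1 cm^2 of die; 1 cm = 1e7 nm
pitch_nm = math.sqrt(die_area_nm2 / transistors)
print(f"~{pitch_nm:.0f} nm centre-to-centre per transistor")  # ~100 nm

So an optical component would need to fit in a ~100 nm footprint to compete on density, which is well below the wavelength of the light itself for telecom-band photons (~1550 nm).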