What will the standard sensor interface be for high data rate sensors in the coming years?
Wuyts: I think the industry is craving a standard, so I don’t think all our customers will like it if every sensor vendor makes their own interface. MIPI is actually getting faster and faster, so I think it could be quite a strong competitor. I do expect more and more efforts to standardize interfaces. But again, it also depends on the exact total bandwidth that is needed.
Hector: For small resolutions, MIPI is really getting more and more popular. But for the very big arrays, our customers love using LVDS. It gives much more flexibility in choosing the surrounding hardware.
Narayanaswamy: MIPI’s progression from where it started to where it’s moving has been phenomenal. The advent of C-PHY shows its proliferation as a standard, and it has definitely taken hold in the industry.
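To put rough numbers on that bandwidth jump, here is an illustrative comparison (the line and symbol rates below are example figures, not tied to any particular sensor): C-PHY encodes 16 bits per 7 symbols, so each 3-wire trio carries about 2.28 bits per symbol, which is where its throughput advantage over D-PHY comes from.

```python
# Rough comparison of MIPI D-PHY vs C-PHY throughput. The line/symbol
# rates are example values chosen only to illustrate the scaling.

DPHY_GBPS_PER_LANE = 2.5   # assumed D-PHY line rate, Gbps per 2-wire lane
CPHY_GSPS_PER_TRIO = 3.0   # assumed C-PHY symbol rate, Gsym/s per 3-wire trio
BITS_PER_SYMBOL = 16 / 7   # C-PHY encoding: 16 bits carried per 7 symbols

dphy_4lane = 4 * DPHY_GBPS_PER_LANE                    # 8 signal wires
cphy_3trio = 3 * CPHY_GSPS_PER_TRIO * BITS_PER_SYMBOL  # 9 signal wires

print(f"D-PHY, 4 lanes: {dphy_4lane:.1f} Gbps")   # 10.0 Gbps
print(f"C-PHY, 3 trios: {cphy_3trio:.1f} Gbps")   # ~20.6 Gbps
```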
Increased bandwidth and resolution are impressive, but what about power consumption?
Wäny: We see more and more that the total system power is approaching thermal limits. But on the processing side we really benefit from mainstream electronics technology progress, with supply voltages going down. We have much more power-efficient vision processing modules that perform many more operations than five years ago. The interfaces are definitely more efficient in terms of power per bit. But all in all, power still increases as you raise resolution and speed.
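A back-of-the-envelope sketch of this point (all figures below are illustrative assumptions, not vendor data): interface power is roughly the total bit rate times the energy per bit, so even a 4x improvement in pJ/bit can be outpaced by the growth in bandwidth.

```python
# Back-of-the-envelope interface power estimate.
# All sensor and link figures are hypothetical, purely for illustration.

def interface_power_watts(width, height, fps, bits_per_pixel, pj_per_bit):
    """Estimated serial-link power: total bit rate times energy per bit."""
    bit_rate = width * height * fps * bits_per_pixel  # bits/s
    return bit_rate * pj_per_bit * 1e-12              # pJ/bit -> W

# An older sensor: 2 MP, 60 fps, 10-bit, ~20 pJ/bit link.
old = interface_power_watts(1920, 1080, 60, 10, 20)

# A newer high-end sensor: 12 MP, 120 fps, 12-bit, more efficient ~5 pJ/bit.
new = interface_power_watts(4096, 3072, 120, 12, 5)

print(f"older sensor link: {old * 1e3:.0f} mW")  # ~25 mW
print(f"newer sensor link: {new * 1e3:.0f} mW")  # ~91 mW: per-bit energy
# improved 4x, yet total link power still rose with bandwidth.
```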
Hector: I think the paradigm is changing a little bit: before, we used to have one sensor, very often big and fast, to fulfill all the needs, and then we adapted at the system level. Today, having sensors dedicated to a specific application opens real space for innovation.
Wuyts: Tying into that, another interesting trend is wafer stacking, which is becoming more and more available for lower-volume applications. It lets you do some processing on the image sensor itself, so you don’t need to transfer all the data to other systems. There are already examples in the industry, such as 3D profile applications, where local processing is done on the images. The question is, of course, what do you do on the image sensor and what do you do more efficiently on an FPGA? Even if you want to keep the sensor generic, the answer is always very application-specific.
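As a minimal sketch of the kind of on-sensor reduction described here, assuming a laser-triangulation 3D profile setup (the function name, threshold, and image sizes are hypothetical): collapsing each frame to one sub-pixel line position per column cuts the off-chip data by roughly the row count, which is why local processing pays off in these applications.

```python
import numpy as np

# Sketch of on-sensor 3D profile extraction: instead of shipping the whole
# frame off-chip, reduce it to one sub-pixel laser-line position per column.
# Setup and numbers are hypothetical.

def extract_profile(frame: np.ndarray, threshold: int = 50) -> np.ndarray:
    """Per-column centre-of-gravity of the laser line, in sub-pixel rows."""
    rows = np.arange(frame.shape[0], dtype=np.float64)[:, None]
    weights = np.where(frame > threshold, frame, 0).astype(np.float64)
    mass = weights.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        profile = (weights * rows).sum(axis=0) / mass
    # Columns with no laser hit are marked NaN.
    return np.where(mass > 0, profile, np.nan)

frame = np.random.randint(0, 40, (1024, 2048), dtype=np.uint16)  # noise floor
frame[500:503, :] = 900                                          # fake laser line

profile = extract_profile(frame)  # one value per column, centred on row ~501

raw_bits = frame.size * 16    # full 16-bit image
out_bits = profile.size * 16  # assume 16-bit fixed-point output per column
print(f"data reduction: {raw_bits / out_bits:.0f}x")  # ~1024x
```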