inVISION: Is Embedded Vision a rather non-industrial topic?
Schmitt: As we know from history, Embedded Vision is open to any type of application, from code reading or quality control to newer ones like facial recognition, cashier-free stores or ITS. The big difference is that 20 years ago Embedded Vision was only known to specialists, whereas now everybody hears about it because it is driven by the consumer market.
Carlson: By applying analytics to the images, the customer is figuring out a way to capture more value. They can do things more efficiently and more intelligently. For these companies the payback is in months, or less than one year.
Scheubel: The origins of Embedded Vision are certainly in the consumer sector. However, due to the increasing performance and reliability of components, Embedded Vision can also be used for industrial applications.
Gross: Because processing power has increased so much, even x86-based Windows Embedded Vision systems can be applied nowadays, e.g. in automatic passport control systems at the airport. There you have one camera inspecting the passport and another camera checking your face, comparing it to what is actually on your passport and verifying the biometrics.
Bach: For us Embedded Vision basically means delivering products and services to the industry. We definitely appreciate the presence of products like the Raspberry Pi, but it is really hard to make a Raspberry Pi part of a product that will be delivered to the industry.
Abel: Everybody in the logistics sector wants a transparent supply chain. Customers like to see where their goods are and in what condition they are. Embedded Vision is the enabler for this.
inVISION: What are you expecting from an Embedded Vision system?
Bach: For us the key requirement for an Embedded system is definitely reliability. We can hardly afford to have systems out in the field that constantly crash or are not available for the tasks they were designed for.
Gross: Embedded Vision systems are usually purpose-built and therefore quite customized. A lot of work is involved in getting them up and running, so we try to make our cameras as easy as possible to integrate. We standardize where possible: for example, all cameras are controlled with the same programming interface and offer the same functionality.
Abel: We like to have a low price, because nobody wants to pay for logistics. But we need standards in the interfaces to pass the information on to an overall system, e.g. a fleet or warehouse management system. We have standards for image transport, but we don’t see standards for Embedded Vision systems at the moment.
Scheubel: We definitely need standardisation. To realise an Embedded Vision application, many components need to interact seamlessly with each other. It begins with the embedded processor and its operating system (OS). On this processor, or a connected hardware accelerator, the neural networks that we develop must be implemented. To generate the images needed for the Embedded Vision task, a compatible imaging module must be integrated. All these components must be seamlessly connected, so standardisation could facilitate development considerably.
Carlson: We believe in workload consolidation. Once you have that CPU power, you can take all this other stuff and put it in there. So in the world of multicore CPUs you can have a six-core CPU where one core is used as a gateway and four cores are doing analytics.
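The gateway/analytics split Carlson describes can be sketched with standard Python multiprocessing; the `analyze` function and the frame IDs below are purely illustrative placeholders, not part of any product mentioned in the interview (on Linux, workers could additionally be pinned to specific cores with `os.sched_setaffinity`):

```python
from multiprocessing import Pool

def analyze(frame_id):
    """Placeholder analytics workload: stands in for per-frame image analysis."""
    return frame_id * 2  # illustrative computation only

if __name__ == "__main__":
    # The main process plays the 'gateway' role, dispatching work, while four
    # worker processes do the analytics -- mirroring the one-gateway /
    # four-analytics core split described above.
    with Pool(processes=4) as pool:
        results = pool.map(analyze, range(8))
    print(results)  # [0, 2, 4, 6, 8, 10, 12, 14]
```

The point of the sketch is the partitioning itself: the dispatching role and the compute-heavy roles are separated onto distinct processes so they can be consolidated on one multicore CPU.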