Panel Discussion: Embedded Vision Everywhere?

inVISION: Is energy consumption a topic?

Schmitt: We have a lot of customers with stand-alone solutions that are battery- or solar-powered, and for those power consumption is of course an issue.

Scheubel: Even if there is enough power, heat can still be a problem in the embedded processor.

Gross: Power consumption is always coupled to heat generation. If you have a small mobile device that you hold in your hand, it should not get warmer than 40°C. Low power consumption helps keep the system cool, which also improves image quality because the noise in the images is lower.

Abel: Power consumption is not the problem for us.

inVISION: Embedded systems usually use Linux, whereas machine vision systems use Windows. Is there a compatibility problem?

Carlson: If your vision system has a six-core chip, you can run four different operating systems on it: one of them Windows, another a real-time OS, or whatever flavour you want. This kind of workload consolidation is absolutely doable. You can have a Microsoft GUI, which is not typically known for real-time performance, alongside a real-time OS. Collaborative robots are a great example, where you have a vision system but also provide real-time control of the robot.

Gross: What makes Linux interesting on embedded platforms is that it can be customized much more than Windows systems.

Bach: Even if you have different platforms, specifications or tunings of the Linux kernel, it is always the same interface. You can actually copy & paste a lot of the actual workflow and service infrastructure to the target platform. It is all given by the same OS, which only varies a little in configuration. Our development team works on Windows machines and our platforms are primarily Linux.

Abel: For an assistance system it does not matter whether it is a Windows or a Linux system; we are looking for a hardware interface such as digital I/O, CANopen or Ethernet with a standardized protocol. On the robotics side, however, this is an issue. If these systems are not really compatible, you will get huge jitter in the communication, which may cause your robot to move somewhere you don't want it to move.

inVISION: Machine vision is mainly based on x86 platforms, whereas embedded systems use different platforms. Is this a problem?

Scheubel: We see both in the market, x86 and ARM processors. The choice of processing platform mainly depends on the customer's history and on which platform is easiest to integrate into their system environment.

Carlson: The software the customer wants to run for the application usually dictates whether it is an ARM or an x86 system. If they have an x86 history and legacy code to carry forward, they are very likely to stay on Windows. If it is an entirely new application, there is no need for Windows and they are at the lower performance end, they will probably lean more towards ARM.

Schmitt: It is not only a question of ARM versus x86 but also of combinations with GPUs, ARM and FPGAs. That makes it a bit more complex, but in the end it depends on the application and the taste of the developers.

Abel: We are working on x86, but we are also seeing all the edge GPUs. For a developer, programming either system is usually not much different. But if we look at machine learning, that is a completely different development process.
