eXtended Reality Applications for Industry 4.0

Pradipta Biswas
This video demonstrates two XR applications: a welding assistant and a manual assembly application.
The increasing adoption of connected and immersive technologies has led to a new reality in manufacturing. Arc welding is a commonly used industrial joining process involving significant hazards and complexity. The intent of this immersive welding assistant is to reduce health risks for human operators by enabling remote welding scenarios, promote welding skill development, simulate the welding process, and enhance productivity. The implementation is explored using Mixed Reality (MR) and Virtual Reality (VR). Comprehensive user studies compared VR and MR interactions on a typical weld path definition task, measuring quantitative parameters such as task completion time and accuracy, qualitative parameters such as NASA TLX and SUS scores, and physiological parameters such as brain activity from EEG and ocular parameters from eye gaze data. Sensor dashboards display live values of welding process parameters, reducing situational impairment of human welders and providing feedback for improving weld quality. The user interacts with the virtual welding scene in a multimodal manner and defines the desired weld path on virtual workpiece models remotely. This path is transmitted to the welding robot in real time. An accurate mapping between the real robot and the virtual welding scene is obtained through linear regression. Thus, the concept combines the intelligence and experience of the human welder with the stability and accuracy of the robot arm.
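The description does not give implementation details of the regression-based mapping between the virtual scene and the robot; the following Python sketch illustrates one way such a calibration could be fitted from paired points and applied to a weld path defined in VR/MR. All variable names and values here are assumptions for illustration, not the authors' code.

```python
# Illustrative sketch (not the authors' implementation): fit a linear map
# from virtual-scene coordinates to robot base-frame coordinates using
# paired calibration points, then transform a weld path defined in VR/MR.
import numpy as np
from sklearn.linear_model import LinearRegression

# Paired calibration samples: the same physical points expressed in the
# virtual scene frame and in the robot frame (hypothetical values).
virtual_pts = np.array([[0.10, 0.20, 0.05],
                        [0.30, 0.20, 0.05],
                        [0.30, 0.40, 0.05],
                        [0.10, 0.40, 0.15]])
robot_pts   = np.array([[0.52, 0.11, 0.30],
                        [0.72, 0.11, 0.30],
                        [0.72, 0.31, 0.30],
                        [0.52, 0.31, 0.40]])

# Multi-output linear regression: robot_xyz ~ A @ virtual_xyz + b
mapping = LinearRegression().fit(virtual_pts, robot_pts)

# Weld path waypoints sketched by the user on the virtual workpiece.
virtual_path = np.array([[0.12, 0.22, 0.05],
                         [0.20, 0.30, 0.05],
                         [0.28, 0.38, 0.05]])

# Coordinates that would be streamed to the welding robot controller.
robot_path = mapping.predict(virtual_path)
print(robot_path)
```

In practice the calibration set would contain many more point pairs than parameters, so the regression also averages out tracking noise rather than fitting it exactly.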
In Industry 4.0, manufacturing entails rapid change in customer demands, which leads to mass customization. The variation in customer requirements results in small batch sizes and several process variations. Hence, a factory floor worker needs a guidance system to assist them through these variations for successful completion of the assembly task. Existing augmented reality-based systems use a marker on each assembly component for detection, which is time consuming and laborious. Existing mixed reality headsets reportedly use spatial mapping to obtain the location of each individual component; this technique is computationally expensive and incurs several seconds of latency to update the spatial map as the scene changes. Overcoming these limitations of current guidance systems, the proposed method uses mixed reality technology to guide the user by overlaying instructions as virtual objects on the workspace. It uses a state-of-the-art deep learning-based object detection technique to detect objects and a regression-based mapping technique to obtain the 3D locations of assembly components. Further, we proposed a multimodal interface involving both eye gaze and hand tracking modalities. We used an eye cursor to guide the user through the task and utilized fingertip distances along with object sizes to detect errors committed during the task. We analyzed the proposed mapping method and found a mean mapping error of 1.842 cm. We evaluated the effectiveness of the proposed multimodal user interface in two user studies. The first study indicated that the interface with the eye cursor enabled participants to perform the task significantly faster than the interface without it. The shop floor workers in the second user study reported that the proposed guidance system is comprehensible and easy to use for completing the assembly task. We demonstrate that the proposed guidance system enabled our participants to finish the assembly of one pneumatic cylinder within 55 seconds, while an industry-grade single Yaskawa robot takes 3 minutes 55 seconds for the same task.
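The error metric and the fingertip-based error check are not spelled out in the description. The sketch below shows one plausible reading, assuming the mean mapping error is the average Euclidean distance between mapped and measured 3D component locations, and that a pick is flagged as an error when the tracked fingertip falls within a component's bounding radius but that component is not the one the current instruction calls for. All function and component names are illustrative assumptions.

```python
# Illustrative sketch only: plausible interpretations of the reported
# mean mapping error and of the fingertip-distance error check.
import numpy as np

def mean_mapping_error(predicted_xyz: np.ndarray, measured_xyz: np.ndarray) -> float:
    """Average Euclidean distance (in the inputs' unit, e.g. cm) between
    mapped and ground-truth 3D component locations."""
    return float(np.linalg.norm(predicted_xyz - measured_xyz, axis=1).mean())

def detect_wrong_pick(fingertip_xyz: np.ndarray,
                      component_centers: dict[str, np.ndarray],
                      component_sizes: dict[str, float],
                      expected: str) -> str | None:
    """Return the name of a wrongly touched component, or None.

    A component counts as 'touched' when the fingertip is within half its
    size (a rough bounding radius) of the component centre."""
    for name, center in component_centers.items():
        if np.linalg.norm(fingertip_xyz - center) <= component_sizes[name] / 2:
            return None if name == expected else name
    return None

# Hypothetical usage: the fingertip is near the 'piston' while the current
# instruction asks for the 'end_cap', so the pick is flagged as an error.
centers = {"piston": np.array([0.10, 0.05, 0.02]),
           "end_cap": np.array([0.25, 0.05, 0.02])}
sizes = {"piston": 0.04, "end_cap": 0.03}
print(detect_wrong_pick(np.array([0.11, 0.05, 0.02]), centers, sizes, "end_cap"))
```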
