EZYPILOT was initially developed for AGV manufacturers who were struggling to implement their automation projects on outdoor or mixed routes while relying on their existing technology based on single-channel LIDAR… The challenge was addressed with an automotive approach: sensor fusion, automotive-grade sensors and SW modules. EZYPILOT extended logistics automation to the outdoors and ensured a whole new level of perception, which significantly improves the efficiency and safety of outdoor Automated Vehicle operations. Based on this experience we created a generic product which can be used in any kind of autonomous vehicle…
Autonomous driving is not about the best individual sensor, it's about the best sensor team! Different sensors perceive the environment differently: each has its sweet spots, and each has its limitations too. Sensor fusion is like building a winning team based on a deep understanding of each individual for a specific game, i.e. the operational conditions and environment in which they are supposed to operate.
We offer engineering services and technical consulting to manufacturers of Autonomous Mobile Robots and automated mobile machinery. We guarantee to shorten the development cycle and optimize the development budget by proposing established software and hardware components from our technology partners. Our customers often opt for a proof of concept (PoC) based on our EzyPilot, which means they can have a functional automated vehicle running within a couple of weeks.
Typical localization solutions include visual SLAM fused with IMU and odometry in indoor environments, or GNSS-RTK fused with IMU and odometry in outdoor environments.
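To illustrate the idea behind such fusion, here is a minimal sketch of blending dead-reckoned odometry with absolute GNSS-RTK-style fixes using a one-dimensional Kalman filter. All names, noise values, and the simulated data are illustrative assumptions, not part of the EZYPILOT implementation:

```python
import numpy as np

def fuse_step(x, P, v_odom, dt, z_gnss, q=0.05, r=0.1):
    """One predict/update cycle of a 1-D Kalman filter (illustrative).

    Predict position from odometry velocity, then correct it with an
    absolute GNSS-style position fix. x: position estimate, P: variance.
    """
    # Predict: dead-reckon with odometry and inflate the uncertainty
    x_pred = x + v_odom * dt
    P_pred = P + q
    # Update: blend in the absolute fix, weighted by the Kalman gain
    K = P_pred / (P_pred + r)
    x_new = x_pred + K * (z_gnss - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Simulate a vehicle moving at 1 m/s with noisy GNSS fixes (dt = 0.1 s)
rng = np.random.default_rng(0)
x, P = 0.0, 1.0
for k in range(1, 51):
    truth = k * 0.1
    z = truth + rng.normal(0.0, 0.3)
    x, P = fuse_step(x, P, v_odom=1.0, dt=0.1, z_gnss=z)
print(round(x, 2))  # close to the true position of 5.0 m
```

Real systems run the same predict/update pattern in 2-D or 3-D (typically as an extended Kalman filter), but the division of labor is identical: odometry and IMU carry the short-term motion, the absolute sensor bounds the long-term drift.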
Mobile robotic applications often require switching between absolute and relative positioning; typical examples are pallet picking and lane following… These applications are enabled by visual data and dedicated SW modules, which we can integrate into your mobile application.
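One simple way to picture the switch between the two positioning modes is a small decision function: absolute (map-frame) positioning drives the long approach, and once the vehicle is close to the target and the visual marker (e.g. a pallet) is detected, control hands over to relative (camera-frame) positioning for precise docking. The function, names, and threshold below are illustrative assumptions only:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float

def choose_mode(robot: Pose, target: Pose, marker_visible: bool,
                switch_radius: float = 1.0) -> str:
    """Pick the positioning mode for the next control cycle (illustrative).

    Returns "relative" only when the vehicle is inside switch_radius of
    the target AND the visual marker is currently detected; otherwise it
    stays on absolute map-frame positioning.
    """
    d2 = (robot.x - target.x) ** 2 + (robot.y - target.y) ** 2
    close = d2 <= switch_radius ** 2
    return "relative" if close and marker_visible else "absolute"

print(choose_mode(Pose(5.0, 0.0), Pose(0.0, 0.0), marker_visible=True))  # absolute
print(choose_mode(Pose(0.6, 0.0), Pose(0.0, 0.0), marker_visible=True))  # relative
```

Requiring marker visibility before the handover means the vehicle never relies on a relative reference it cannot currently see.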
Our main goal is to help our customers improve the safety and performance of their AMRs. We do that by processing visual data with AI-based SW modules, which provide reliable information about the drivable area and detected objects in the environment. Among all classified objects we pay special attention to humans, who can be tracked dynamically with a continuously updated speed vector and risk assessment factor. Such inputs are of the highest value for decision making in vehicle control algorithms, improving the safety and performance of Autonomous Mobile Robots.
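The two quantities mentioned above can be sketched in a few lines: a speed vector derived from two consecutive tracked positions, and a heuristic risk factor that rises as a person gets close or is predicted to close in. The formulas, names, and the 2-second horizon are illustrative assumptions, not the product's actual risk model:

```python
import math

def speed_vector(p_prev, p_curr, dt):
    """Velocity (m/s) of a tracked person from two consecutive positions."""
    return ((p_curr[0] - p_prev[0]) / dt, (p_curr[1] - p_prev[1]) / dt)

def risk_factor(robot_pos, person_pos, person_vel, horizon=2.0):
    """Heuristic risk in (0, 1]: high when the person is near or closing in.

    Extrapolates the person's position `horizon` seconds ahead and scores
    the smaller of the current and predicted distances. Purely illustrative.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    d_now = dist(robot_pos, person_pos)
    p_future = (person_pos[0] + person_vel[0] * horizon,
                person_pos[1] + person_vel[1] * horizon)
    d_min = min(d_now, dist(robot_pos, p_future))
    return 1.0 / (1.0 + d_min)  # approaches 1 as the distance approaches 0

# A person 3.5 m away, walking straight toward the robot at 1 m/s
vel = speed_vector((4.0, 0.0), (3.5, 0.0), dt=0.5)
risk = risk_factor((0.0, 0.0), (3.5, 0.0), vel)
print(vel, round(risk, 2))  # (-1.0, 0.0) 0.4
```

Feeding such per-person velocity and risk values into the vehicle control loop lets the planner slow down or re-route before a predicted conflict, rather than only reacting to current proximity.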