
Tuesday, August 4, 2020

LiDAR SLAM Navigation Resources

https://github.com/teddyluo/LiDAR-SLAM-Nav-RES


LiDAR-SLAM-RES

A page of LiDAR SLAM Navigation Resources (LiDAR-SLAM-Nav-RES) for following current LiDAR-SLAM-based navigation trends, including key papers, books, engineering projects, and valuable blogs.

(Current) Project III — Motion Planning

ROS Research Papers

  • ROS Layered Costmaps
David V. Lu, D. Hershberger and W. D. Smart, "Layered costmaps for context-sensitive navigation," 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, 2014, pp. 709-715. pdf (a toy illustration of the layering idea follows this list)
  • Layered Social Cost Map
Kollmitz, Marina, et al. "Time Dependent Planning on a Layered Social Cost Map for Human-Aware Robot Navigation." 2015 European Conference on Mobile Robots (ECMR). IEEE, 2015. pdf
  • Tuning Cost Functions for Social Navigation
(2019/07/19) David V. Lu, Daniel B. Allan, and William D. Smart. "Tuning Cost Functions for Social Navigation." International Conference on Social Robotics. Springer, Cham, 2013. pdf
  • ROS Navigation Tuning Guide
Kaiyu Zheng, ROS Navigation Tuning Guide. arXiv preprint arXiv:1706.09068, 2017. pdf
  • ROS Navigation: Concepts and Tutorial
Guimarães R L, de Oliveira A S, Fabro J A, et al. ROS navigation: Concepts and tutorial[M]//Robot Operating System (ROS). Springer, Cham, 2016: 121-160. pdf
  • Robotics Engineering 2: ROS-Turtlebot Motion Control and Navigation
AK Assad, Mashruf Chowdhury, and Yanik Porto, Robotics Engineering 2: ROS-Turtlebot Motion Control and Navigation. May 11, 2015. pdf
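
As a toy illustration of the layering idea in the Lu et al. paper above, the Python sketch below combines a static layer, an obstacle layer, and an inflation layer into one master grid by taking an elementwise maximum, so no layer can erase what another layer has marked. The layer functions and cost values are made up for illustration; the real `costmap_2d` implementation updates layers in order over bounded regions and has richer cost semantics.

```python
import numpy as np

LETHAL = 254  # costmap_2d convention: 254 marks a lethal obstacle

def static_layer(grid):
    grid[0, :] = LETHAL      # a wall along one edge, from the static map
    return grid

def obstacle_layer(grid):
    grid[5, 5] = LETHAL      # an obstacle seen by the laser
    return grid

def inflation_layer(grid):
    grid[4:7, 4:7] = 100     # padded cost around the sensed obstacle
    return grid

def combine_layers(layers, shape):
    """Combine independent cost layers into one master costmap.

    Each layer writes its own costs into a blank grid; the elementwise max
    means no layer can lower a cost another layer has already raised.
    """
    master = np.zeros(shape, dtype=np.uint8)
    for layer in layers:
        master = np.maximum(master, layer(np.zeros(shape, dtype=np.uint8)))
    return master

master = combine_layers([static_layer, obstacle_layer, inflation_layer], (10, 10))
print(master[5, 5], master[4, 4], master[9, 9])  # 254 100 0
```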

Books

  • Robotics (Release 1.4)
Jeff McGough, Robotics (Release 1.4), Dec. 2, 2018. pdf
  • A 36-Part History of the Robot Operating System (ROS) (《机器人操作系统(ROS)史话36篇》)
Zhang Xinyu, pdf
  • You Are Here (Chinese edition: 《人类找北史:从罗盘到GPS,导航定位的过去与未来》)
Bray, Hiawatha. You are here: From the compass to GPS, the history and future of how we find ourselves. Basic Books (AZ), 2014. pdf (Chinese translation)

Courses

  • Intelligent Robot Systems (《智能机器人系统》)
College of Intelligence Science and Technology, National University of Defense Technology: Lu Huimin, Zheng Zhiqiang, Wei Qing, Xiao Junhao, Yang Shaowu, Zeng Zhiwen, link
  • Introduction to the Robot Operating System (《机器人操作系统入门》, 2018)
Institute of Software, Chinese Academy of Sciences & 中科重德 Robotics: Chai Changkun, link

Online Resources

Tutorials

  • Robot Operating System (ROS) Summer School (机器人操作系统(ROS)暑期学校), type: video & pdf, link
  • Column series: ROS laser SLAM navigation, with annotated `move_base` parameter configuration (see the sketch after this list), type: blog, link
  • Xiaoqiang (小强) ROS robot tutorials, type: pdf, link
  • A Gentle Introduction to ROS, Chinese translation (机器人操作系统(ROS)浅析), type: pdf, link
  • ROS Classroom (ROS小课堂), type: blog, link
  • Exbot Lab (易科实验室), link
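
The `move_base` parameter-configuration article above pairs well with the ROS Navigation Tuning Guide listed earlier. As a minimal, hedged sketch, assuming a running `move_base` node configured with `dwa_local_planner`, a few of the commonly tuned parameters can be written like this (the values are placeholders, not recommendations for any particular robot):

```python
#!/usr/bin/env python
# Minimal sketch: touch a few commonly tuned move_base parameters.
# Assumes a running move_base node using dwa_local_planner; the values are
# illustrative placeholders, not tuned recommendations.
import rospy

rospy.init_node("move_base_param_sketch")

# DWA local planner velocity limits.
rospy.set_param("/move_base/DWAPlannerROS/max_vel_x", 0.5)
rospy.set_param("/move_base/DWAPlannerROS/max_vel_theta", 1.0)

# Goal tolerances.
rospy.set_param("/move_base/DWAPlannerROS/xy_goal_tolerance", 0.10)
rospy.set_param("/move_base/DWAPlannerROS/yaw_goal_tolerance", 0.15)

# Costmap inflation radius (how far cost is spread around obstacles).
rospy.set_param("/move_base/local_costmap/inflation_layer/inflation_radius", 0.45)

rospy.loginfo("Parameters written to the parameter server.")
```

Most of these are read at startup or changed through dynamic_reconfigure, so in practice they usually live in the planner and costmap YAML files; the tuning guide above explains which knobs matter most.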

Projects

  1. PythonRobotics
  2. ROS Navigation Stack

Robot Hardware

1) Hardware platforms

2) AGV national standards (China)

  • GB/T 30029, General Design Rules for Automated Guided Vehicles (AGV) (《自动导引车(AGV) 设计通则》), pdf, pdf (candidate)
  • GB/T 30030, Terminology for Automated Guided Vehicles (AGV) (《自动导引车(AGV) 术语》), pdf, pdf (candidate)
  • GB/T 20721, General Specifications for Automated Guided Vehicles (《自动导引车 通用技术条件》), pdf

Project I — Hardware Configuration: Laser and IMU Sensors

  1. Laser: Osight LSXXXTM laser sensor configuration & test:
    configuration
  2. IMU:

Project II — Laser-based SLAM (Part 1): Google Cartographer

  1. Google Cartographer
    Hess W, Kohler D, Rapp H, et al. Real-time loop closure in 2D LIDAR SLAM[C]//2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2016: 1271-1278.
  2. Sparse Pose Adjustment (SPA)
    Konolige K, Grisetti G, Kümmerle R, et al. Efficient sparse pose adjustment for 2D mapping[C]//2010 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2010: 22-29.
  3. Correlative Scan Matching (see the sketch after this list)
    Olson E B. Real-time correlative scan matching[C]//2009 IEEE International Conference on Robotics and Automation. IEEE, 2009: 4387-4393.
  4. Ceres Scan Matching
    Kohlbrecher S, Von Stryk O, Meyer J, et al. A flexible and scalable SLAM system with full 3D motion estimation[C]//2011 IEEE International Symposium on Safety, Security, and Rescue Robotics. IEEE, 2011: 155-160.
  5. Branch and Bound Algorithm
    Clausen J. Branch and bound algorithms - principles and examples[J]. Department of Computer Science, University of Copenhagen, 1999: 1-30.
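
As a rough illustration of the correlative scan matching idea cited above, here is a brute-force Python sketch: candidate poses in a small search window are scored against a precomputed hit-likelihood grid, and the best-scoring offset wins. It is a toy for intuition only, with all names and parameters made up; Cartographer's real-time matcher adds multi-resolution search, and its loop closure prunes that search with the branch-and-bound strategy from the last reference.

```python
import numpy as np

def score_pose(scan_xy, lookup, resolution, origin, dx, dy, dtheta):
    """Score one candidate pose: sum of lookup values at transformed scan points.

    scan_xy   : (N, 2) scan points in the sensor frame, in meters
    lookup    : 2D array of hit likelihoods (e.g. a blurred occupancy grid)
    resolution: meters per grid cell
    origin    : (x, y) world coordinate of lookup[0, 0]
    """
    c, s = np.cos(dtheta), np.sin(dtheta)
    pts = scan_xy @ np.array([[c, s], [-s, c]]) + np.array([dx, dy])
    ij = np.floor((pts - np.asarray(origin)) / resolution).astype(int)
    valid = ((ij[:, 0] >= 0) & (ij[:, 0] < lookup.shape[1]) &
             (ij[:, 1] >= 0) & (ij[:, 1] < lookup.shape[0]))
    ij = ij[valid]
    return lookup[ij[:, 1], ij[:, 0]].sum()

def correlative_match(scan_xy, lookup, resolution, origin,
                      linear_window=0.3, angular_window=0.35, steps=7):
    """Brute-force search over a small pose window; returns the best (dx, dy, dtheta)."""
    best, best_score = (0.0, 0.0, 0.0), -np.inf
    for dtheta in np.linspace(-angular_window, angular_window, steps):
        for dx in np.linspace(-linear_window, linear_window, steps):
            for dy in np.linspace(-linear_window, linear_window, steps):
                sc = score_pose(scan_xy, lookup, resolution, origin, dx, dy, dtheta)
                if sc > best_score:
                    best, best_score = (dx, dy, dtheta), sc
    return best, best_score
```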

Project II — Laser-based SLAM (Part 2): LiDAR SLAM Survey

  1. Castellanos, J.A., Neira, J., & Tardós, J.D. (2005). Map Building and SLAM Algorithms.
  2. Santos, J. M., Portugal, D., & Rocha, R. P. (2013, October). An evaluation of 2D SLAM techniques available in robot operating system. In 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR) (pp. 1-6). IEEE.
  3. Mendes, E., Koch, P., & Lacroix, S. (2016, October). ICP-based pose-graph SLAM. In 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR) (pp. 195-200). IEEE.
  4. Yagfarov, R., Ivanou, M., & Afanasyev, I. (2018, November). Map Comparison of Lidar-based 2D SLAM Algorithms Using Precise Ground Truth. In 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV) (pp. 1979-1983). IEEE.
  5. Felipe Jiménez, Miguel Clavijo and Javier Juana. (VEHICULAR 2018). LiDAR-based SLAM algorithm for indoor scenarios.
  6. Kümmerle, R., Steder, B., Dornhege, C., Ruhnke, M., Grisetti, G., Stachniss, C., & Kleiner, A. (2009). On measuring the accuracy of SLAM algorithms. Autonomous Robots, 27(4), 387.
  7. Chen, Y., Tang, J., Jiang, C., Zhu, L., Lehtomäki, M., Kaartinen, H., … & Zhou, H. (2018). The accuracy comparison of three simultaneous localization and mapping (SLAM)-based indoor mapping technologies. Sensors, 18(10), 3228.

Areskey Miiboo: ROS Smart Car Platform




A Systematic Platform for Learning Robot Programming with ROS | ROS Smart Car System | SLAM Map Building | Voice Navigation | Speech Recognition | Speech Synthesis (Package Content: 2)
$99.00
https://www.ienggbdc.com/index.php?main_page=product_info&products_id=486073

https://www.amazon.com/-/es/Platform-construye-Navegaci%C3%B3n-Reconocimiento-S%C3%ADntesis/dp/B07X2HQ23D?th=1

https://www.amazon.com/Platform-Navigation-Recognition-Synthesis-Tutorial/dp/B07X1NMQKT

Brand: Areskey   |   Manufacturer: Miiboo

US$ 528.68
ROS Smart Car Platform | SLAM Map Building | Voice Navigation | Speech Recognition | Speech Synthesis | ROS Getting-Started Tutorial

This one uses a lidar from https://www.ydlidar.com/

https://github.com/miiboo

http://miiboo.cn/







XiaoR GEEK ROS SLAM Robot Car with Laser Radar for Raspberry Pi

https://www.xiaorgeek.net/collections/raspberry-pi/products/xiaor-geek-ros-slam-robot-car-with-laser-radar-for-raspberry-pi-4b

XiaoR GEEK ROS SLAM Robot Car with Laser Radar for Raspberry Pi 4B


XIAOR GEEK

$425.99 USD


RPLIDAR A1 Lidar.


https://www.xiaorgeek.net/blogs/news

http://xiao-r.com/Product/page/id/10  Manual

http://xiao-r.com/


Tuesday, July 14, 2020

Cliff detection with the ESPROS TOF 8x8 lidar


https://www.espros.com/wp-content/uploads/2020/07/ESPROS_TOFframe611ForCliffdetection.pdf

Looks like an 8x8 array good for about 2 meters, for just under $100.

I am working on getting some images of the sensor's output in action.
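
For intuition, here is a minimal sketch of how an 8x8 TOF array might be used for cliff detection, assuming the sensor sits on the robot's front edge and looks down at the floor ahead: if enough pixels report a range noticeably longer than the expected floor distance, or no return at all, treat it as a drop-off. The mounting geometry, thresholds, and frame format are my assumptions, not ESPROS specifications.

```python
import numpy as np

# Assumed geometry: sensor looks down at the floor ahead of the robot, so a flat
# floor should return roughly EXPECTED_FLOOR_M on every pixel. A cliff shows up
# as ranges much longer than that, or as missing returns.
EXPECTED_FLOOR_M = 0.15   # nominal floor distance for this mounting (assumption)
CLIFF_MARGIN_M = 0.10     # how much farther than the floor counts as a drop-off
MIN_CLIFF_PIXELS = 8      # how many of the 64 pixels must agree before we stop

def detect_cliff(frame_8x8):
    """Return True if the 8x8 range image (meters, NaN = no return) looks like a cliff."""
    frame = np.asarray(frame_8x8, dtype=float)
    too_far = frame > (EXPECTED_FLOOR_M + CLIFF_MARGIN_M)
    no_return = np.isnan(frame)
    suspicious = np.count_nonzero(too_far | no_return)
    return suspicious >= MIN_CLIFF_PIXELS

# Example: a flat floor with the two far rows "missing" (over the edge of a stair).
frame = np.full((8, 8), 0.15)
frame[:2, :] = np.nan
print(detect_cliff(frame))  # True
```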


Mobile robots such as vacuum cleaners are safe from falling down stairs - with ESPROS' TOF sensors for cliff detection...




Saturday, September 22, 2018

NaviPack LiDAR Navigation Module


https://www.indiegogo.com/projects/navipack-lidar-navigation-module-reinvented#/

https://www.youtube.com/watch?v=SBhIdXVnoZU&feature=share

https://robot.imscv.com/en/product/3D%20LIDAR


NaviPack makes any device smarter and easier to control. It uses the latest LiDAR technology and powerful APIs to create easy solutions for automated devices.

With its built-in SLAM algorithm chip, NaviPack is the first plug-and-play LiDAR navigation module. It is also the most affordable LiDAR solution for drones, robots and other devices, instantly giving them powerful 360-degree sensing capabilities.

NaviPack integrates the SLAM algorithm with the LiDAR sensor module, making it super easy to use and significantly reducing development time.

NaviPack performs 360-degree scanning of its surroundings, detecting objects up to 15 meters away at 4,000 points per second. With the built-in SLAM module, it starts working immediately after being plugged into your device: scanning the environment, building a detailed map, and enabling autonomous movement.
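
As a small, hedged illustration of the kind of data a module like this hands you, the sketch below converts one 360-degree sweep of (angle, range) samples into Cartesian points and drops anything beyond the quoted 15 m range. The data format here is an assumption for illustration, not the actual NaviPack API.

```python
import math

MAX_RANGE_M = 15.0  # maximum range quoted for the module

def scan_to_points(angles_deg, ranges_m):
    """Convert a 360-degree sweep of (angle, range) samples to (x, y) points.

    Assumed format: parallel lists of bearing in degrees and range in meters,
    with invalid returns reported as 0 or a value beyond the maximum range.
    """
    points = []
    for angle_deg, r in zip(angles_deg, ranges_m):
        if 0.0 < r <= MAX_RANGE_M:
            a = math.radians(angle_deg)
            points.append((r * math.cos(a), r * math.sin(a)))
    return points

# Example: four samples, one of them out of range and discarded.
print(scan_to_points([0, 90, 180, 270], [1.0, 2.5, 30.0, 0.7]))
```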











Friday, April 13, 2018

A Comprehensive List of 3D Sensors Commonly Leveraged in ROS Development

https://rosindustrial.org/3d-camera-survey


The recent availability of affordable, ROS-compatible 3D sensors has been one of the fortunate coincidences that accelerated the spread of ROS. This page is intended to replace the blog post from early 2016 and serve as a single, easier-to-find spot for updates and a more complete list of 3D sensors. We are happy to see so much traffic on this topic and look forward to making this a more consistent resource for the development community.

Microsoft® Kinect™ 2.0

Type: Time of flight
Depth Range: 0.5 to 4.5 m
3D Resolution: 512 x 424
RGB Resolution: 1920 x 1080
Frame Rate: 30 fps
Latency: 20 ms minimum
FOV: 70° H, 60° V
Physical dims: ~250x70x45 mm (head)
Interface: USB 3.0
Link to ROS Driver
Notes: Latency with ROS is multiple frames.
Active cooling.
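
A quick way to compare the entries in this survey is to turn FOV and pixel count into an approximate lateral footprint per pixel at a working distance. The sketch below does that simple trigonometry; it ignores lens distortion and depth noise, and the numbers plugged in are just the Kinect 2.0 values from the table above.

```python
import math

def mm_per_pixel(fov_deg, pixels, range_m):
    """Approximate lateral footprint of one pixel at a given range.

    Width covered at range d is 2 * d * tan(FOV / 2); dividing by the pixel
    count gives a rough per-pixel footprint. Ignores distortion and noise.
    """
    width_m = 2.0 * range_m * math.tan(math.radians(fov_deg) / 2.0)
    return width_m / pixels * 1000.0

# Kinect 2.0: 70 degrees horizontal over 512 depth pixels, at 2 m range.
print(round(mm_per_pixel(70.0, 512, 2.0), 2), "mm/pixel")  # ~5.47 mm/pixel
```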

Intel RealSense R200

Type: Stereo with pattern projector
Depth Range: 0.6 – 3.5 m
3D Resolution: 640 x 480
RGB Resolution: 1920 x 1080
Frame Rate: 60 fps (3D), 30 fps (RGB)
Latency: 1 frame
FOV: 59° H, 46° V
Physical dims: 102x9.5x7 mm
Interface: USB 3.0
Link to ROS Driver
Notes: Outdoors capable.

ASUS® XtionPro™ Live

Type: Structured light
Depth Range: 0.8 to 3.5 m
3D Resolution: 640 x 480
RGB Resolution: 1280 x 1024
Frame Rate: 30 fps
Latency: ~1.5 frames
FOV: 58° H, 45° V
Physical dims: ~180x40x25 mm (head)
Interface: USB 2.0
Link to ROS Driver
Notes: Similar internals to the Xbox Kinect 1.0. Intermittent availability for purchase.

IFM® Efector™ O3D303

Type: Time of flight
3D Resolution: 176 x 132
RGB Resolution: N/A
Depth Range: 0.3 to 8 m
Frame Rate: 25 fps
Latency: 1 frame
FOV: 60° V, 45° H
Physical Dims: 120x95x76 mm
Interface: Ethernet
Link to ROS Driver
Notes: Accuracy +/-4 mm. IP65/67 industrial enclosure.

Stereolabs® ZED™

Type: Embedded stereo
3D Resolution: 2208 x 1242 max
RGB: 2208 x 1242 max
Depth Range: 1.5 to 20 m
Frame Rate: 15 fps at max res., 120 fps at VGA res.
Latency: 1 frame
FOV: 96° H, 54° V
Physical Dims: 175x30x33 mm
Interface: USB 3.0
Link to ROS Driver
Notes: Latency not confirmed.

Carnegie Robotics® MultiSense™ S7

Type: Embedded stereo
3D Resolution: 2048 x 1088
RGB Resolution: 2048 x 1088 max (7.5 fps)
Depth Range: 0.4 m to infinity
Frame Rate: 15 fps at 2048 x 544
Latency: 1 frame
FOV: 80° H, 45° V
Physical Dims: 130x130x65 mm
Interface: Ethernet
Link to ROS Driver
Notes: IP68 enclosure.

Ensenso® N35-606-16-BL

Type: Structured light
3D Resolution: 1280 x 1024
RGB: 1280 x 1024
Frame Rate: 10 fps
Latency: 1 frame
FOV: 58° H, 52° V
Physical Dims: 175x50x52 mm
Interface: Ethernet
Link to PCL/ROS Driver
Notes: Many other resolutions and FOVs available. IP65/67 enclosure available.

SICK® Visionary-T™

Type: Time of flight
3D Resolution: 144 x 176
RGB: N/A
Frame Rate: 30 fps
Latency: 66 msec
FOV: 69° H, 56° V
Physical Dims: 162x93x78 mm
Interface: Ethernet
Link to ROS Driver
Notes: IP67 enclosure

e-Con Systems Tara Stereo Camera

Type: Embedded Stereo Camera
3D Resolution: 752 x 480
RGB: N/A
Frame Rate: 60 fps
Latency: 1 Frame
FOV: 60° H
Physical Dims: 100x30x35 mm
Interface: USB 3.0
Link to ROS Driver
Notes: Inbuilt IMU

Nerian SP1

Type: FPGA Stereo Camera
3D Resolution: 640 x 480
RGB: N/A
Frame Rate: 30 fps
Latency: 1 Frame
FOV: Variable
Physical Dims: 105x76x36 mm
Interface: USB 2.0 to cameras, Gigabit Ethernet to Host
Link to ROS Driver
Notes: Resolution up to 1440 x 1072

Intel® RealSense D415™

Type: Active IR Stereo
3D Resolution: 1280 x 720 max
RGB: 1920 x 1080 max
Depth Range: 0.3 to 10 m
Frame Rate: 90 fps at max depth res., 30 fps at max rgb res.
Latency: not noted
FOV: 69.4° H x 42.5° V x 77° D (+/- 3°)
Physical Dims: 99 mm x 20 mm x 23 mm
Interface: USB 3.0 Type - C
Link to ROS Driver
Notes: Latency not confirmed.

Orbbec® Astra Mini™

Type: Structured Light
3D Resolution: 640 x 480 max
RGB: 640 x 480 max
Depth Range: 0.6 m to 5.0 m
Frame Rate: 30 fps
Latency: 1 frame
FOV: 73° D x 60° H x 49.5° V
Physical Dims: 80 x 20 x 20 mm
Interface: USB (<2.4 W)
Link to ROS Driver
Notes: Optional metal enclosure available. Handle with care without an enclosure, deflection can cause issues with performance. Excessive heat can cause issues. Performance comparable to Asus Xtion.

Photoneo® PhoXi® 3D Scanner L

Type: Structured Light
Depth Map/Point Cloud Resolution: 0.8-3.2M Points
Depth Range: 870-2156 mm
Frame Rate: 2.5-5 fps
Latency: 1 frame
FOV (scanning volume): 1300 x 975 x 1200 mm
Physical Dims: 77 x 68 x 616 mm
Interface: Link to API with ROS Support
Notes: Data Acquisition Time: 2.5-5s. Near metrology grade resolution. Comes in a variety of models. Operates in High Resolution or High Acquisition Modes.

roboception® rc_visard™

Type: Stereo Camera
3D Resolution: 640 x 480 max
RGB: 1280 x 960 max
Depth Range: 0.2 m to 1.0 m for 65 Model, 0.5 m to 3.0 m for 160 Model
Frame Rate: 3-25 Hz
Latency: 1 frame
FOV: 61 H x 48 V
Physical Dims: 135 x 75 x 96 mm for 65 Model, 230 x 75 x 84 mm for 160 Model
Interface: Ethernet
Link to ROS Driver
Notes: Weight 680g for 65 Model, 850g for 160 Model. Optional SLAM module. ROS Driver being developed further thanks to the support of the ROSIN EU project.

duo3d® DUO MC™

Type: Stereo Camera
3D Resolution: 752 x 480 max
RGB: 752 x 480 max
Depth Range: 0.23 m to 2.5 m for M series
Frame Rate: 0.1-3000 fps Max
Latency: 1 frame
FOV: 170° W with 30 mm baseline
Physical Dims: 57 x 30.5 x 14.7mm
Interface: 480 Mbps USB 2.0 Micro-B
Link to API
Link to ROS Driver
Notes: Pixel size 6 x 6 micrometers. Shutter speed 0.3 microseconds to 1 second. Control functions: exposure, shutter, brightness. Enclosure: 6061 aircraft-grade aluminium.