Method for Estimation of Position of an Object in a Room Using Cross-Line Laser

AGVs (Automated Guided Vehicles) are widely used in factories and warehouses, and functions such as position estimation are indispensable for such unmanned transport robots. We have developed a new method for estimating the position of an object using a cross-line laser and a camera. In this study, we formulated the estimation principle, implemented the system, and verified its measurement accuracy. Position estimation at arbitrary measurement points yielded a small measurement error, averaging 26 mm with a maximum of 49 mm, confirming that our system can estimate position accurately.


Introduction
In recent years, the number of autonomous mobile robots in Japan has been on the rise, and it is expected to continue to increase in the future (Fig. 1.1).
Autonomous mobile robots are used in everyday life, for example as drones and cleaning robots. They are also deployed in factories and warehouses as unmanned transport robots, and there are high expectations for their use in unmanned operations.
Most of the AGVs currently in use follow a predetermined travel route (Fig. 1.2), so the layout of a factory or warehouse cannot be changed easily. Position estimation of the robot is indispensable for enabling free path changes, and it must be implemented at low cost and with high accuracy. Because errors accumulate in a position estimation system that relies only on the robot's internal information, accurate position correction is achieved by also incorporating external information, such as the positional relationship with surrounding objects. SLAM and landmarks (e.g., LED markers) are examples of external information added to conventional systems, but these methods suffer from problems such as increased computational load and blind spots in the detection area. We are therefore developing a method that uses a cross-line laser and a monocular camera as a position estimation system with low processing cost that is unaffected by obstacles. In this study, we implemented the system and evaluated its errors based on measurements taken with the camera angle held constant.

Principle
In this chapter, we describe the principles on which our system is based.

Position estimation principle
As shown in Fig. 2.1, a green cross-line laser mounted on top of the robot is projected onto the ceiling, and the intersection point of the laser lines on the ceiling is captured by a camera from a different viewpoint to estimate the position coordinates.
When a camera is used to capture the robot directly for position estimation, there are situations in which the position cannot be estimated because obstacles between the robot and the camera create blind spots. With our system, however, as shown in Fig. 2.2, there are essentially no blind spots because the camera captures the ceiling surface; a wide position estimation range can therefore be achieved with a single camera. In addition, some parts of the ceiling, such as light fixtures, wash out the laser light, producing situations in which the intersection of the laser lines is hidden. Since our system uses a laser composed of two straight lines, the missing information can be interpolated by extending the remaining visible segments even if part of a line is hidden, as shown in Fig. 2.3. For these two reasons, our system can estimate the robot's position at all times.
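As an illustration of this interpolation, the following sketch recovers the cross point by fitting the two line families with OpenCV and intersecting the fitted infinite lines, so the crossing is found even when it is occluded. It assumes the green laser has already been isolated as a binary mask (e.g., by color thresholding); all function and parameter choices are illustrative, not the paper's implementation.

```python
import numpy as np
import cv2

def find_cross_point(mask):
    """Estimate the laser cross point from a binary mask of the laser lines.

    Because the cross is formed by two straight lines, the intersection
    can be recovered by extending the detected segments even when the
    cross point itself is hidden (e.g., by a ceiling light).
    """
    segs = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180, threshold=80,
                           minLineLength=60, maxLineGap=40)
    if segs is None:
        return None
    segs = segs.reshape(-1, 4)

    # Group segments into the two line families by orientation.
    angles = np.arctan2(segs[:, 3] - segs[:, 1], segs[:, 2] - segs[:, 0]) % np.pi
    d = np.abs(angles - angles[0])
    d = np.minimum(d, np.pi - d)              # angular distance on [0, pi)
    fam_a, fam_b = segs[d < np.pi / 4], segs[d >= np.pi / 4]
    if len(fam_a) == 0 or len(fam_b) == 0:
        return None

    def fit_line(fam):
        # Fit one infinite line (point + direction) through all endpoints.
        pts = fam.reshape(-1, 2).astype(np.float32)
        vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
        return np.array([x0, y0]), np.array([vx, vy])

    p_a, v_a = fit_line(fam_a)
    p_b, v_b = fit_line(fam_b)

    # Solve p_a + t * v_a = p_b + s * v_b for the intersection point.
    A = np.column_stack([v_a, -v_b])
    t, _ = np.linalg.solve(A, p_b - p_a)
    return p_a + t * v_a                      # (x, y) in image pixels
```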

Calculation principle for 2D coordinates
In this section, we describe the calculation principle for obtaining the position coordinates of the measurement target from the image after cross-laser extraction. The shooting range $(W, H)$ of the camera at a distance $z$ from the lens can be calculated from the size of the image sensor $(s_w, s_h)$ and the focal length $f$ of the lens as $W = s_w z / f$ and $H = s_h z / f$.

(a) Calculation of the y-axis coordinate
The $y$-intercept obtained by setting $x = 0$ in the line equation (2.10) is the $y$-coordinate of the target. Therefore, the coordinates of point $Q$ can be expressed as in Eqs. (2.11) and (2.12).
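As a numerical check of the shooting-range relation above, the following sketch evaluates $W$ and $H$ with the constants reported in Chapter 3 (the sensor dimensions of the Sony IMX477, the calibrated focal length, and the camera-to-ceiling distance); the symbols follow the reconstruction above.

```python
# Shooting range of the camera at distance z (constants from Chapter 3).
s_w, s_h = 7.564, 5.476   # image sensor size [mm] (Sony IMX477)
f = 5.99                  # calibrated focal length [mm]
z = 1680.0                # camera-to-ceiling distance [mm]

W = s_w * z / f           # horizontal shooting range: ~2121 [mm]
H = s_h * z / f           # vertical shooting range:   ~1536 [mm]
print(W, H)
```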

(b) Calculation of the x-axis coordinate
When calculating the $y$-axis direction, the length of the line segment was used as the reference distance. However, since the $y$-coordinate of the cross point $Q$ has already been obtained, the shooting plane in which point $Q$ actually lies can be treated as the new reference screen. The new reference distance is obtained by adding the cosine component of the vector to the distance $z_0$ used in the calculation of the $y$-axis direction, as shown in Fig. 2.10. This vector and the new reference distance $z_1$ can be expressed as in Eqs. (2.13) and (2.14).
The real coordinate in the $x$-axis direction can then be calculated from the shooting range of the camera and the $x$-coordinate in the image, as expressed in Eq. (2.15), where the $x$-coordinate in the image is taken as a signed fraction.
Through the above process, the 2D coordinates $(x, y)$ of the object can be calculated.
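To make the conversion concrete, the sketch below applies the signed-fraction form of Eq. (2.15). The normalization of the image coordinate and the symbols $z_1$, $s_w$, $f$ follow the reconstruction above and are assumptions rather than the paper's exact notation.

```python
def pixel_to_world_x(u_px, img_w=1280, z1=1680.0, s_w=7.564, f=5.99):
    """Convert an image x-coordinate [pixel] to a real x-coordinate [mm].

    (u_px / img_w - 0.5) is the signed fraction of Eq. (2.15); multiplying
    it by the shooting range s_w * z1 / f at the new reference distance z1
    gives the real-world offset from the optical axis.
    """
    signed_fraction = u_px / img_w - 0.5      # in [-0.5, 0.5]
    return s_w * z1 / f * signed_fraction

# Example: a point 100 pixels to the right of the image center.
x_mm = pixel_to_world_x(640 + 100)            # ~ +166 mm
```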

System Overview
The system configuration used in this study is shown in Fig. 3.1. The proposed model consists of a cross-line laser linearized by a polarizing lens, a camera, and a computer for image processing and camera angle control. The camera is placed at the reference origin, and the cross laser is placed at an arbitrary position. The cross laser is aimed straight up at the ceiling surface, and the camera captures images at an arbitrary angle, with 0 degrees defined as parallel to the ceiling surface.
The laser is a Huepar BOX-1G cross-line laser marker, the camera is a Raspberry Pi HQ Camera, and the computer is a Raspberry Pi 4. In addition, we used a self-made control head to set an arbitrary camera angle in this experiment. Fig. 3.2 shows the specifications of the angle-control head used in this research. The head is driven by a stepping motor, whose angular resolution is assumed to be sufficient.
The head can be controlled in parallel with the position estimation process by specifying an arbitrary angle via I2C communication, with the Raspberry Pi as the master and the Arduino Nano as the slave. The center of the optical axis does not change with the angle. Fig. 3.3 shows the position constants used in the current position estimation system. The camera is positioned at the origin of the x-y plane at a height of 800 [mm] above the floor. Since the height from the floor to the ceiling is 2480 [mm], the camera-to-ceiling distance used in the calculations is 1680 [mm].
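As an illustration of the angle command path, the sketch below sends a target angle from the Raspberry Pi (master) to the Arduino Nano (slave) using the smbus2 library. The slave address, register, and message format are hypothetical, since the paper only states that an arbitrary angle is specified over I2C.

```python
from smbus2 import SMBus

ARDUINO_ADDR = 0x08       # hypothetical I2C address of the Arduino Nano slave

def set_camera_angle(angle_deg):
    """Send a target tilt angle to the angle-control head over I2C.

    The payload (tenths of a degree, two bytes, big-endian) is an assumed
    format for illustration only.
    """
    tenths = int(round(angle_deg * 10))
    hi, lo = (tenths >> 8) & 0xFF, tenths & 0xFF
    with SMBus(1) as bus:                      # I2C bus 1 on the Raspberry Pi
        bus.write_i2c_block_data(ARDUINO_ADDR, 0x00, [hi, lo])

set_camera_angle(40.0)    # fix the camera at 40 degrees, as in Chapter 4
```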
The image sensor of the Raspberry Pi HQ Camera is a Sony IMX477. The size of the image sensor, a camera-internal constant, is therefore $s_h$ = 5.476 [mm] and $s_w$ = 7.564 [mm]. Since the Raspberry Pi HQ Camera uses a variable-focus wide-angle lens, the focal length is not fixed by specification. In addition, the equation for the shooting range of the camera given in the principle cannot be applied directly because of the wide-angle distortion in the acquired images. Therefore, the focus was fixed, the camera was calibrated, and the distorted images were corrected. For calculations involving the focal length, the camera's intrinsic parameters obtained during calibration are used; in this study, f = 5.99 [mm] was used as the focal length.
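The distortion correction described here can be reproduced with OpenCV's standard checkerboard calibration, as in the sketch below; the board size and file paths are placeholders, and the intrinsic matrix K it yields is the source of the calibrated focal length.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)                     # inner checkerboard corners (placeholder)
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib/*.png"):        # calibration shots (placeholder path)
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)
        size = gray.shape[::-1]              # (width, height)

# K holds the intrinsics: K[0, 0] and K[1, 1] are the focal length in
# pixels, convertible to mm via the known sensor and pixel dimensions.
_, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
undistorted = cv2.undistort(cv2.imread("frame.png"), K, dist)
```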
The size of the image captured from the camera is 1280 [pixel] × 960 [pixel], and the frame rate is 5 fps.

Measurement accuracy verification
In order to utilize this system more effectively, it is necessary to investigate the reliable measurement range within the image. Therefore, the camera angle was fixed, and the accuracy was verified by comparing the measurement errors at individual points.
(a) Experimental methods
An overview of this experiment is shown in Fig. 4.1. The camera angle is fixed at 40 degrees for the measurements. The measurement points are arranged so that the cross laser is irradiated at intervals of 200 [mm] in the x- and y-directions, and the errors are compared with the theoretical values. Only at x = 0 [mm], the measurement points are placed at intervals of 100 [mm] in the y-direction.

(b) Experimental results
The measured data and the theoretical values are shown in Figs. 4.2 and 4.3.

(c) Discussion
The theoretical values (blue dots) and the measured data (red dots) were plotted as 2D graphs in the x- and y-axis directions, as shown in Fig. 4.2; the maximum error was 49 mm. The average error was 26 mm, which is small enough for practical position estimation. Since the camera angle was 40 degrees in this experiment, the center of the camera image is located at approximately 0 mm in the x-axis direction and 2000 mm in the y-axis direction.
Looking at the measurement data as a whole, the errors tended to rotate counterclockwise about the image center, which may have been caused by the camera not being exactly parallel to the ceiling surface. Fig. 4.3 shows the position estimation error as a 3D contour graph; the highly accurate region, with errors of less than 20 mm, lies near the center of the upper part of the acquired image. The pixel spacing around the upper center of the image is denser than in other parts of the measurement range, and the higher effective resolution there may explain the higher accuracy. The areas with relatively large errors were concentrated mainly around the four corners of the image, where the effect of the wide-angle lens is greatest; we believe the wide-angle distortion there was not completely corrected.
The ceiling used for this measurement had LED lighting around x = -200 [mm] and y = 2000 [mm], but no noticeable difference in trend was observed compared with the other measurement points. This confirms that our position estimation system is not affected by light sources on the ceiling and can estimate the laser intersection point even when it is hidden.
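For reference, the error statistics quoted above can be computed from paired theoretical and measured coordinates as in the minimal sketch below; the data arrays themselves are not reproduced here.

```python
import numpy as np

def error_stats(theoretical, measured):
    """Mean and maximum Euclidean error [mm] between paired (x, y) points."""
    err = np.linalg.norm(np.asarray(measured, float) -
                         np.asarray(theoretical, float), axis=1)
    return err.mean(), err.max()

# With the data of this experiment, these evaluate to about 26 mm and 49 mm.
```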

Conclusions
In this study, to propose a position estimation system with low processing cost that is unaffected by obstacles, we implemented a system combining a cross-line laser and a camera and verified its errors based on measurements with a fixed camera angle. As shown in the experimental results, the maximum error was 49 mm and the average error was 26 mm. In addition, the position of the laser cross point could be estimated even when ceiling lighting hid the intersection, indicating that the method can serve as a practical indoor position estimation method. In our experiment, the parallelism between the ceiling and the camera was adjusted manually, and this directly affects the error trend. In the future, we plan to further reduce the error and simplify deployment by establishing a method that automatically adjusts the parallelism between the ceiling surface and the camera.