IMPORTANT: the method previously called ASSAOP has been renamed to ASPSI; the name ASPSI is used throughout these instructions.
The user needs to take into account that this method is adapted to the characteristics of our robot, Tibi (Institut de Robòtica i Informàtica Industrial, IRI). The accompaniment distances and the Extended Social Force Model therefore have to be adapted to the size and mass of the robot the method is used on; otherwise the method may not behave as well as it should, since it has not been adapted. The user also needs to understand that this method expects the other person to want to be accompanied by the robot and to adapt slightly to it, just as anyone accompanying another person adapts to that person's maximum velocity (here 1.2 m/s) and waits a little to be accompanied properly; in other words, the person should not run away from the robot. In addition, the robot needs a short initial period to adapt to the person's velocity; after that period, the robot can accompany the person well up to a person velocity of 1 m/s.
The ASPSI implementation follows the theory presented in the contributions [1] and [2] and in the PhD thesis of Ely Repiso, which will be added to her Google Scholar profile in the future.
The people tracker uses the Extended Social Force Model (ESFM) adapted to Tibi's size and to the behavior that follows from its mass. Our ESFM model uses the ideal case of zero mass, but the robot's acceleration and speed management is adapted to our platform to minimize oscillations and speed jumps during the accompaniment. The method is also able to avoid static and dynamic obstacles while it accompanies the person. The ASPSI method uses the ESFM inside the Anticipative Kinodynamic Planner (AKP) [3], adapted to accompany one person. Our planner uses an RRT* planner that propagates the candidate paths with the ESFM inside a window of 5 seconds into the future. These possible paths are computed using the ESFM to accompany the person and to reach the final destination of the person that the robot accompanies; this destination is inferred using the Bayesian Human Motion Intentionality Prediction (BHMIP) [4]. In addition, we sample random goals over this local window from a Gaussian distribution whose mean lies in the direction of the final goal and whose standard deviation increases with the number of obstacles inside the local window. To select the best path we use a cost function that evaluates the distance to the goal, the difference between the robot's orientation and the orientation needed to reach the final goal, the robot's difficulty to follow the path, the work the robot needs to avoid static and dynamic obstacles, and the accompaniment cost. All of this is explained in detail in the papers referenced in this document.
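As a rough sketch of how the best path is selected (the exact formulation and weight values are the ones described in [1]-[3]; the symbols used here are only illustrative), the cost of a candidate path can be thought of as a weighted sum of these terms:

    J(path) = w_d * d_goal + w_o * |theta_robot - theta_goal| + w_p * W_path + w_obs * W_obstacles + w_c * C_companion

where d_goal is the distance left to the goal, |theta_robot - theta_goal| is the orientation difference, W_path is the cost of following the path itself, W_obstacles is the work needed to avoid static and dynamic obstacles, and C_companion is the accompaniment cost; the path with the lowest total cost is selected.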
If you use this software, please cite publication [2].
[1] Repiso, E., Ferrer, G., & Sanfeliu, A. (2017, September). On-line adaptive side-by-side human robot companion in dynamic urban environments. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 872-877). IEEE.
[2] Repiso, E., Garrell, A., & Sanfeliu, A. (2019). Adaptive side-by-side social robot navigation to approach and interact with people. International Journal of Social Robotics, 1-22.
[3] Ferrer, G., & Sanfeliu, A. (2019). Anticipative kinodynamic planning: multi-objective robot navigation in urban and dynamic environments. Autonomous Robots, 43(6), 1473-1488.
[4] Ferrer, G., & Sanfeliu, A. (2014). Bayesian human motion intentionality prediction in urban environments. Pattern Recognition Letters, 44, 134-140.
To be able to use the tracker (iri_people_tracking_mht) and the detector (iri_laser_people_detection) on a robot, you need C++ and ROS. However, you can also use it as a tracker of points; in that case you only have to install the tracking library (enter the tracking library folder and follow only step 3 to install the tracking library).
1. Software and requirements
1.1. Linux Ubuntu
Ubuntu OS (14.04 Trusty or 16.04 Xenial), a Debian-based Linux operating system. Note that the ROS Kinetic version used below officially targets Ubuntu 16.04.
1.2. C++
The leg detector and the tracker were programmed using ROS libraries. If you want to use them or change something, it is recommended to have a C++ editor such as Eclipse installed.
1.3. ROS
We will use ROS. To learn more about ROS, visit wiki.ros.org, where you can find installation instructions, basic concepts, and the ROS tutorials, which are highly recommended if you are starting to learn ROS. We use the ROS Kinetic version, for Ubuntu 16.04. You can install the recommended ros-kinetic-desktop-full version.
***** The following packages are not needed if you only use the leg detector and the tracker *****
Alongside the full-desktop installation of ROS, if you want to use the tracker on a mobile robot that uses a laser to navigate, you also have to install some extra ROS packages. You will need to do the following:
1. Create a catkin workspace if you do not have one already. It is recommended to call it "iri_ws" and to place the ROS nodes inside its src folder, but you can name your workspace as you want, as long as you keep the name in mind.
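A minimal sketch using the standard catkin workflow, assuming the recommended "iri_ws" name:
> mkdir -p ~/iri_ws/src
> cd ~/iri_ws
> catkin_make
> source devel/setup.bash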
2. You need to include the following two lines in your .bashrc:
export ROBOT=tibi
export ROS_MODE=sim
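For example, you can append them to your .bashrc and reload it like this:
> echo "export ROBOT=tibi" >> ~/.bashrc
> echo "export ROS_MODE=sim" >> ~/.bashrc
> source ~/.bashrc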
3. You need to install our laser people detector (iri_laser_people_detection) and our tracker (iri_people_tracking_mht); the installation instructions are provided with those packages. You can also use another detector and tracker, but you will need to adapt them to the ASPSI.
4. Copy the iri_robot_aspsi folder into your ROS workspace (the one created in step 1).
5. Copy the ASPSI library (folder: robot_ASPSI_lib) into your workspace and enter it from a Linux terminal:
> cd robot_ASPSI_lib
> cd build // or create the build folder first.
6. In the build folder of the library, compile the library.
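Assuming the library uses a standard CMake build (a sketch; adapt it if the library ships its own build instructions):
> cmake ..
> make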
7. You can use the launch file shown below to launch all the nodes and libraries that the ASPSI needs. You need to launch it from your workspace. The launch file is set up with the correct parameters to use the ASPSI with our system.
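A sketch, run from the root of your workspace, using the launch file referenced in the examples later in this document:
> source devel/setup.bash
> roslaunch iri_robot_aspsi gazebo_ASPSI_BRL.launch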
To select the nearest person for the robot to accompany with dynamic_reconfigure:
The dynamic_reconfigure window pops up directly when the Gazebo simulator is launched. In the upper-left part of this window, navigate to: tibi -> move_base -> AkpLocalPlanner. Then look for the variables indicated below.
- One-person accompaniment: set the variable select_near_pers_as_companion_one_person to true in dynamic_reconfigure and then set it back to false. You will see that the variable id_person_companion changes to the track identifier of the nearest person. This person becomes the person accompanied by the robot.
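If you close that window, you can reopen the same interface with the standard rqt_reconfigure tool (a sketch):
> rosrun rqt_reconfigure rqt_reconfigure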
To start the ASPSI node, you need to send a goal from RViz. The order does not matter: you can send the goal first and select the person afterwards, or select the person first and send the goal afterwards.
Starting the algorithm therefore requires these two steps: send a goal and select the person to accompany. After that, you can move the accompanied person and the robot will accompany him/her; you can also make other people or obstacles interfere with the group movement by moving them, and the robot will keep accompanying the person while avoiding these static obstacles or other people.
To move each person manually (with the ROS teleop):
Important: you need the ROS teleop_twist_keyboard package. ROS wiki page: http://wiki.ros.org/teleop_twist_keyboard
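If it is not installed yet, it can typically be installed from the ROS Kinetic repositories (a sketch):
> sudo apt-get install ros-kinetic-teleop-twist-keyboard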
- To move the accompanied person, you need to launch a command like the one sketched below in a terminal. The instructions to move this person appear in the terminal. To test the algorithm, this is the first person you need to control. You can send a single command to move the person forward at a constant velocity, or you can control the person more finely (stopping the person, making turns, exploring the environment, etc.).
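A minimal sketch, assuming the simulated accompanied person subscribes to a velocity topic such as /person1/cmd_vel (this topic name is an assumption; check the actual topic with rostopic list):
> rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=/person1/cmd_vel  # topic name is an assumption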
- To move other people who may interfere with the group path (accompanied person and robot), you need to launch similar commands in other terminals, as sketched below. Move these people a little to find out where they are located in the environment. I recommend first sending them a velocity command; they will then move automatically in that direction at a constant velocity. It is more interesting to control the accompanied person carefully than these other people, who only interfere like real pedestrians going to other places and crossing the group's path. Take into account that this simulator has collisions: if any of these people collide with an obstacle, the Gazebo simulator crashes, so you need to stop them before they reach any obstacle.
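A sketch under the same assumption about topic names (one terminal per person; /person2/cmd_vel and /person3/cmd_vel are placeholders):
> rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=/person2/cmd_vel
> rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=/person3/cmd_vel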
You can change the initial positions of these people in the environment by editing them in the launch file (roslaunch iri_robot_aspsi gazebo_ASPSI_BRL.launch).
- Furthermore, you can change the map of the environment by editing this line in the launch file (roslaunch iri_robot_aspsi gazebo_ASPSI_BRL.launch):
<arg name="world" default="master_big"/> <!-- Replace master_big with another Gazebo map name to change the map of the environment. -->
- Also, you may want to include more destinations for the robot's ASPSI. You need to add them in the file /maps/master_big_destinations_Gazebo_sim.txt referenced by tibi_dabo_akp_local_ASPSI.launch:
2 // number of destinations; increase it to match the number of destinations you include.
0, 45.0, 20.5, 0.5, 1, 1 // destination in map coordinates: dest-id: 0, dest.x: 45.0, dest.y: 20.5, probability-to-go: 0.5, number-of-neighbour-destinations: 1, ids-of-the-neighbour-destinations-people-can-go-to-after-arriving-at-destination-0: 1
1, 10.0, 20.5, 0.5, 1, 0
Other examples:
4
0, 45.0, 20.5, 0.25, 3, 1, 2, 3
1, 10.0, 20.5, 0.25, 3, 0, 2, 3
2, 45.0, 20.5, 0.25, 3, 0, 1, 3 // change the x and y positions of destinations 2 and 3 to other possible destinations of the map environment, so that you have 4 distinct destinations.
3, 10.0, 20.5, 0.25, 3, 0, 1, 2
- Also, to see more markers (the people prediction, all the possible paths, etc.), change the dynamic_reconfigure variable vis_mode to 4 or lower; you can change it and see which markers are enabled for each value.
################## Set-up variables with dynamic_reconfigure (already set up for Tibi; this is only needed if you use the system with another robot, other environments, etc.) ##################
gen.add("robot_resultant_force_marker" , bool_t, 0, "enable visualization of robot resultan force marker", True)
gen.add("robot_companion_force_marker" , bool_t, 0, "enable visualization of robot resultan force marker", True)
The following parameters adapt the accompaniment to the robot platform (already set up for Tibi):
#### Parameters to adapt the accompaniment to the robot: distance between the robot and the person, and accompaniment angle ####
- platform_radii: the robot's platform radius, in our case 0.5 m.
- proximity_distance_between_robot_and_person: the accompaniment distance; in this case it is set to 1.5 m. You can reduce it down to 1 m for Tibi if you want, but it is not recommended to reduce it further, because 1 m already leaves only the minimum free space between the accompanied person and the robot.
- real_companion_angle_SideBySide: the accompaniment angle for the case without obstacles, in our case side-by-side, i.e. 90 degrees.
- max_real_speed_out: the limit on the robot's maximum velocity, in our case 1 m/s (at most 1.2 m/s).
- speed_k: only needed for the real robot; in simulation it must be set to 1.
# Parameters to stop the robot slowly (these values seem to work well).
# With the real robot: step_decreace_velocity_stop_robot_ = 0.05 and limit_velocity_stop_robot_ = 0.1; the latter is the limit below which the robot is allowed to decrease its velocity as much as it wants, without the slow reduction.
- step_decreace_velocity_stop_robot_conf: the maximum decrease of the robot's velocity allowed per step when stopping slowly. You can increase it, but then the robot will stop more abruptly.
- limit_velocity_stop_robot_conf: the velocity below which the robot stops directly to 0 m/s. In our case it is set to 0.1 m/s, very close to 0 m/s.
#### Parameters to reduce the computational load of the planner and to configure the planner limits (they may change in other systems) ####
# horizon time of the local planner.
- horizon_time: the horizon time of the local planner. To reduce the computational load you can reduce it, but not below 3 seconds. It is currently set to 4 seconds, but the normal length is 5 seconds.
- number_vertex: total number of vertices used for planning. To reduce the computational load you can reduce it, but not below 200 vertices. The normal size is 500 vertices.
- set_planner_dt: the planner dt, i.e. the iterative loop time; in our case 0.2 seconds.
#### Parameters to adapt to the environment and to reduce the computational load in terms of obstacles ####
- in_change_dynamically_final_dest: if true, the final destination is changed dynamically taking into account the person's orientation.
- distance_to_dynamic_goal_Vform: distance limit from the static goal at which the final goal starts to be changed dynamically using the group's orientation of movement. It depends on the distance between the static final destination and the obstacles that are perpendicular to the normal direction of movement of the people in this environment towards the final destinations. For the BRL it is 6 m; for the FME, 4 m.
Parameters to reduce the computational load due to obstacles:
- detection_laser_obstacle_distances: distance around the robot's position within which laser obstacles are considered by the local planner (reduces the computational load).
- radi_to_detect_other_people_no_companions: radius of the circle around the robot within which dynamic obstacles (other people, not companions) are detected (reduces the computational load).
Parameters to adjust the robot's accelerations and velocities:
# maximum velocities and accelerations used to propagate the robot position
- v_max: robot maximum linear velocity. For Tibi = 1.2 m/s.
- w_max: robot maximum angular velocity. For Tibi = 1.0 rad/s.
- av_max: robot maximum linear acceleration. For Tibi = 0.4 m/s^2.
- av_max_VrobotZero: robot maximum linear acceleration when the robot's velocity is near zero. For Tibi = 0.6 m/s^2.
- lim_VrobotZero: velocity threshold below which the robot is considered to be at (near) zero velocity, used together with av_max_VrobotZero. For Tibi = 0.1 m/s.
- av_max_negativa: robot maximum negative linear acceleration (deceleration). For Tibi = 0.2 m/s^2.
- av_break: robot minimum acceleration (braking). For Tibi = 0.4 m/s^2.
- aw_max: robot maximum angular acceleration. For Tibi = 0.9 rad/s^2.
#### Other needed variables that normally do not have to be changed:
###### They come from the AKP planner of Gonzalo Ferrer [3].
gen.add("move_base", bool_t, 0, "disabled filters cmd_vel and sends zeros", True) # es para parar al robot.
gen.add("distance_mode", int_t, 0, "#0-Euclidean, #1-cost-to-go-erf, #2-c2g-norm, #3-c2g-raw. It is recomnded not to use *erf/*norm methods and not using also *erf/*norm in the global mode, except for c2g-raw", 1, 0, 3)
gen.add("global_mode", int_t, 0, "Designed to be paired with the distance_mode, although it may use a different global mode:#0-Scaliarization, #1-Weighted-sum-erf, #2-Weighted-sum-norm, #3-MO_erf, #4-MO_norm", 1, 0, 4)
# end of SFM params
gen.add("min_v_to_predict", double_t, 0, "Minimum estimaated velocity that the BHMIP requires in order to make aprediction. If velocity is lower than this number, then no prediction is done and the targets remains in place", 0.2, 0.0, 5.0)
gen.add("ppl_collision_mode", int_t, 0, "mode to calculate ppl collisions #0 deterministic, #1 mahalanobis to std, #2 mahalanobis to std 0.5 and Euclidean distance near, #3 Mahalanobis to std 0.3+Eucl", 0, 0, 3)
gen.add("goal_providing_mode", int_t, 0, "mode to provide goals to the local planner: #0 cropping or intersecting the plan with the boundary of local window; #1 Slicing the plan into a set of subgoals at salient points", 0, 0, 1)
gen.add("slicing_diff_orientation", double_t, 0, "Slicing path changes in orientation to select a new subgoal. Only makes sense if the goal_providing_mode is set to #1", 20.0, 10.0, 90.0)
### Weights for the cost of the paths. The companion cost weight is missing here. (They are difficult to remove.) ###
# change the time window used to calculate the people's velocity (propagations)
gen.add("change_time_window_filter_vel_people", double_t, 0, "change the value of the time window used to filter the mean velocity of the people prediction", 4.0, 0.1, 40.0)
gen.add("conf_normal_vel_dampening_parameter", double_t, 0, "to change the normal velocity dampening parameter of the robot, aboid S behaviour", 1.6, -15.0, 15.0)
gen.add("conf_limit_linear_vel_for_dampening_parameter", double_t, 0, "to change the limit_threshold for the linear velocity to change the dampening parameter of the robot, aboid S behaviour(not used at this moment)", 0.15, -15.0, 15.0)
gen.add("conf_limit_angular_vel_for_dampening_parameter", double_t, 0, "to change the limit_threshold for the angular velocity to change the dampening parameter of the robot, aboid S behaviour(not used at this moment)", 0.8, -15.0, 15.0)
gen.add("conf_final_max_v", double_t, 0, "Is minimun velocity to change the goal of the robot to do not it first position", 1.5, 0.0, 3.0)