App - Server Process Flow

The Android-based application lets the user ask for a specific destination using only voice commands, and it also lets the user add a new location based on the position of the tag at the moment the command is issued. The CGI server receives either the name of the desired destination or the name of a new location. In the first case, the server searches a file called "places.txt" for the coordinates of the location and, using the user's current position provided by the UWB tag and the BNO055, calculates the shortest path and, consequently, the immediate instruction the user should follow to start navigating towards the destination. If the user wants to add a new location, the server obtains the user's coordinates at that instant and stores them in "places.txt" under the given name. The messages sent by the client are in JSON format and contain a single key-value pair, where the key is either "add" or "place" and the value is the location name. The server replies with the instruction the user should follow next, and the app synthesizes speech to read this command to the user. Each time a command finishes being spoken, the app sends another request to obtain further instructions.
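
The server script itself is not reproduced here, but its request handling can be sketched as follows. This is only a minimal Python sketch under a few assumptions: a hypothetical "name x y" line format for places.txt, and placeholder helpers (current_position, next_instruction) standing in for the UWB/BNO055 readout and the path planner described in the next section.

```python
#!/usr/bin/env python3
# Minimal sketch of the request handling, assuming a Python CGI script and a
# hypothetical "name x y" line format for places.txt; the helper functions are
# illustrative placeholders, not the project's actual code.
import json
import sys

PLACES_FILE = "places.txt"

def current_position():
    # Placeholder: in the real system this comes from the UWB tag and the BNO055.
    return 1.0, 2.0

def next_instruction(start, goal):
    # Placeholder: in the real system this runs the shortest-path routine
    # described in the next section and returns e.g. "Turn 90 degrees clockwise".
    return "Walk 3 meters"

def load_places():
    places = {}
    with open(PLACES_FILE) as f:
        for line in f:
            name, x, y = line.split()
            places[name] = (float(x), float(y))
    return places

def handle_request(body):
    """body is the JSON sent by the app: {"add": <name>} or {"place": <name>}."""
    msg = json.loads(body)
    if "add" in msg:
        # Store the user's current coordinates under the given name.
        x, y = current_position()
        with open(PLACES_FILE, "a") as f:
            f.write("{} {} {}\n".format(msg["add"], x, y))
        return "Added location " + msg["add"]
    if "place" in msg:
        # Look up the destination and compute the next navigation instruction.
        goal = load_places()[msg["place"]]
        return next_instruction(current_position(), goal)
    return "Unknown command"

if __name__ == "__main__":
    print("Content-Type: text/plain")
    print()
    print(handle_request(sys.stdin.read()))
```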

The Shortest Path Calculation

The shortest-path algorithm applies A* to a visibility graph. Four predefined nodes are placed near the corners of the conference table, which is the main obstacle in the room where we tested the project. The start and goal positions are the user's position at the start of navigation and the destination they specify. The edges of this visibility graph are assigned so that they do not pass through obstacles (the conference table). The four predefined nodes already have their edges specified to guarantee this, but the start and goal nodes do not. To obtain the edges for these two nodes, the algorithm checks whether the line segment from one node to the other crosses the boundary of the conference table, and discards the edge if it does. If the destination point lies inside the area occupied by the conference table, the algorithm simply takes the perimeter point closest to it and generates an edge to that point instead. The A* search uses the distance between each considered node and the goal point as its heuristic.

The server only considers the first leg of the computed shortest path when deciding which command to send next to the user. In this process it always prioritizes orientation commands, which consist of calculating the difference in degrees (clockwise or anticlockwise) between the user's orientation and the direction towards the destination point. The server then sends the app a text string with the required number of meters or degrees. It is important to note that, since the wearable with the UWB tag is usually worn on the user's right arm, a body-offset constant is added to or subtracted from the received position, according to the user's orientation, in order to center the point on the user's body.
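
To make the procedure above concrete, the following Python sketch puts together the visibility-graph construction, the A* search and the command generation. The table coordinates, the corner-node margin, the BODY_OFFSET value and the heading convention are illustrative assumptions, and unlike the actual implementation (which predefines the corner-node edges) the sketch simply runs the same visibility test over every pair of nodes.

```python
import heapq
import math

# Hypothetical geometry: the table footprint, corner-node margin and body offset
# below are made-up values; the project used the measured layout of the test room
# and predefined edges between the four corner nodes.
TABLE = (2.0, 2.0, 5.0, 4.0)    # conference table as (xmin, ymin, xmax, ymax)
MARGIN = 0.3                     # corner nodes sit just outside the table
CORNERS = [(TABLE[0] - MARGIN, TABLE[1] - MARGIN),
           (TABLE[2] + MARGIN, TABLE[1] - MARGIN),
           (TABLE[2] + MARGIN, TABLE[3] + MARGIN),
           (TABLE[0] - MARGIN, TABLE[3] + MARGIN)]
BODY_OFFSET = 0.20               # assumed arm-to-body-center distance in meters

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def segments_cross(p1, p2, p3, p4):
    # Proper intersection test between segments p1-p2 and p3-p4.
    def orient(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    d1, d2 = orient(p3, p4, p1), orient(p3, p4, p2)
    d3, d4 = orient(p1, p2, p3), orient(p1, p2, p4)
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)

def blocked(p, q):
    # An edge is discarded when it crosses any side of the table rectangle.
    xmin, ymin, xmax, ymax = TABLE
    sides = [((xmin, ymin), (xmax, ymin)), ((xmax, ymin), (xmax, ymax)),
             ((xmax, ymax), (xmin, ymax)), ((xmin, ymax), (xmin, ymin))]
    return any(segments_cross(p, q, a, b) for a, b in sides)

def snap_to_perimeter(p):
    # A destination inside the table area is replaced by the closest perimeter point.
    xmin, ymin, xmax, ymax = TABLE
    x, y = p
    if not (xmin < x < xmax and ymin < y < ymax):
        return p
    return min([(x - xmin, (xmin, y)), (xmax - x, (xmax, y)),
                (y - ymin, (x, ymin)), (ymax - y, (x, ymax))])[1]

def shortest_path(start, goal):
    goal = snap_to_perimeter(goal)
    nodes = [start, goal] + CORNERS
    edges = {n: [] for n in nodes}
    for i, p in enumerate(nodes):          # connect every pair of mutually visible nodes
        for q in nodes[i + 1:]:
            if not blocked(p, q):
                edges[p].append(q)
                edges[q].append(p)
    # A* with the straight-line distance to the goal as the heuristic.
    frontier = [(dist(start, goal), 0.0, start, [start])]
    visited = set()
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt in edges[node]:
            if nxt not in visited:
                cost = g + dist(node, nxt)
                heapq.heappush(frontier, (cost + dist(nxt, goal), cost, nxt, path + [nxt]))
    return [start]

def center_on_body(tag_pos, heading_deg):
    # The tag is worn on the right arm, so shift the reading toward the body center
    # (heading is assumed to be measured counterclockwise from the x axis).
    right = math.radians(heading_deg) - math.pi / 2
    return (tag_pos[0] - BODY_OFFSET * math.cos(right),
            tag_pos[1] - BODY_OFFSET * math.sin(right))

def next_command(user_pos, heading_deg, path):
    # Only the first leg of the path is turned into a spoken instruction;
    # orientation corrections take priority over walking.
    target = path[1] if len(path) > 1 else path[0]
    bearing = math.degrees(math.atan2(target[1] - user_pos[1], target[0] - user_pos[0]))
    turn = (bearing - heading_deg + 180) % 360 - 180
    if abs(turn) > 30:   # tolerance matching the stated 30-degree orientation margin
        return "Turn {:.0f} degrees {}".format(
            abs(turn), "anticlockwise" if turn > 0 else "clockwise")
    return "Walk {:.1f} meters".format(dist(user_pos, target))
```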

The Application Interface

As shown in the picture, the interface is mainly voice-command based. When the app starts, it asks the user for a command or destination. The possible commands are "add" and "pause": the first lets the user specify the name of a new location when the app asks for it, and the second stops the app from prompting the user, since the app is designed to continuously ask for a command when there is no response. To reactivate the app's prompts, the user taps the screen twice. If the user specifies a valid destination, the app starts requesting information from the server and simply voices the commands it receives. At this point the user can interrupt navigation by tapping the screen twice, which activates a sort of voice command line where the user can issue the "add" and "pause" commands described earlier, the "return" command, which resumes navigation, or the "cancel" command, which stops the current navigation and asks for a new destination. When the user arrives at the destination, the app announces it and switches to the Bluetooth beacon sensing functionality. It is worth mentioning that the node locations have a margin of error of 40 cm and the user's orientation a margin of 30 degrees.
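
The command flow described above can be condensed into a small state table; the sketch below is only a reading aid expressed in Python, not the app's actual implementation.

```python
# Illustrative summary of the voice-interface states described above, written as a
# Python table purely as a reading aid (the actual app is an Android application).
TRANSITIONS = {
    ("listening", "add"):            "listening",      # store the current position under a name
    ("listening", "pause"):          "paused",         # stop prompting until a double tap
    ("listening", "<destination>"):  "navigating",     # a valid destination starts navigation
    ("paused", "double tap"):        "listening",
    ("navigating", "double tap"):    "command line",
    ("command line", "return"):      "navigating",     # resume the current navigation
    ("command line", "cancel"):      "listening",      # drop navigation, ask for a new destination
    ("command line", "add"):         "command line",
    ("command line", "pause"):       "paused",
    ("navigating", "arrived"):       "beacon search",  # switch to Bluetooth beacon sensing
    ("beacon search", "single tap"): "listening",
}
```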

The Bluetooth Beacon

As measured during testing, the UWB positioning system has an error of roughly 20-30 cm. To compensate for this, once the user is within about 20-40 cm of the destination we rely on a Bluetooth beacon placed on the target appliance/destination. The smartphone (ideally worn on the user's wrist) vibrates with an intensity proportional to the received signal strength (RSSI): the closer the user gets to the destination, the stronger the vibration. This feature lets the user search for and find the destination once within the 20-40 cm range. Rather than striving for utmost precision at this point, we wanted to give the user a sense of the desired object's probable position. To return to the destination specification, the user only needs to tap the screen once.
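
As an illustration of this behaviour, the RSSI-to-vibration mapping could look like the sketch below; the dBm range and the linear mapping are assumptions rather than the app's actual values, and the real app drives the Android vibrator instead of printing numbers.

```python
# Minimal sketch of the proximity feedback, assuming a linear mapping from RSSI to
# vibration intensity; the dBm range is an assumption, and the real app drives the
# Android vibrator instead of printing values.
RSSI_FAR, RSSI_NEAR = -90, -50    # assumed signal strength far from / next to the beacon

def vibration_strength(rssi_dbm):
    """Map a beacon RSSI reading (dBm) to a 0.0-1.0 vibration intensity."""
    t = (rssi_dbm - RSSI_FAR) / float(RSSI_NEAR - RSSI_FAR)
    return max(0.0, min(1.0, t))

# A reading of -60 dBm (close to the beacon) vibrates more strongly than -85 dBm.
print(vibration_strength(-60))    # 0.75
print(vibration_strength(-85))    # 0.125
```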