An indoor augmented-reality evacuation system for the smartphone using personalized pedometry
© Ahn and Han; licensee Springer. 2012
Received: 5 August 2012
Accepted: 30 October 2012
Published: 24 November 2012
There currently exist widely used mobile phone emergency applications for the smartphone, and limited mobile emergency applications for indoor environments. However, the outdoor applications focus primarily on providing accident information to users, and the indoor applications are limited by the unavailability of GPS user-positioning and by WiFi-based access problems. To compensate for these limitations, we propose the RescueMe system, which combines an indoor mobile Augmented Reality application, personalized pedometry, and an optimal exit path algorithm. Together these components comprise a system that can quickly and easily recommend an efficient exit path to mobile phone users in emergency situations. We have developed the mobile-based RescueMe system for use in large-scale buildings that contain complex paths. We show how RescueMe leverages the sensors on a smartphone and utilizes Augmented Reality, cloud information, daily-based user walking patterns, and an adaptive GPS connection method to deliver critical evacuation information to mobile phone users in indoor emergency situations.
Significant loss of human life and injuries attributed to building fires occur annually in the United States. In 2009, 90 civilian lives were lost and 1,500 serious injuries occurred in 89,200 buildings, according to the U.S. Federal Emergency Management Agency (FEMA). The loss of human life or injury during a building fire has many causes, such as asphyxiation or lung damage caused by smoke inhalation, building collapse onto people unable to evacuate in time, and bodily injury caused by trampling in crowds of people trying to evacuate. A primary factor contributing to these scenarios is the time it takes for people to evacuate a building. If people can escape the burning building quickly enough, they can survive such an emergency.
The primary goal of our research is to develop an evacuation system for mobile phone users that can help them survive and escape quickly from inside buildings when they encounter an emergency situation, such as a fire. The RescueMe system was designed to run on the smartphone, using the phone’s existing built-in sensors and shared information from cloud servers. In addition, our system utilizes Augmented Reality on the smartphone, as well as personalized pedometry and an adaptive GPS connection method. RescueMe does not require the installation of any additional building infrastructure. Via cloud servers, it is able to provide users with real-time data about crowded areas or exit doors to avoid when they are seeking a quick exit path from a building in an emergency. Although some existing localization systems, such as RFID or Wi-Fi, can also help to locate and track users, these systems are quite expensive: they require installing a costly infrastructure of hundreds or thousands of RFID or Wi-Fi devices. They also do not provide Augmented Reality (AR) assisted guidance to users, as ours does. RescueMe utilizes existing smartphones, in conjunction with cloud servers, to localize users inside buildings and to provide them with critical information and an optimal evacuation path to follow during emergency situations.
This paper offers the following contributions: a mobile phone-based indoor emergency evacuation system for medium-large buildings that is not infrastructure-enhancement dependent (e.g., requiring additional RFID or Wi-Fi sensors); an indoor image-based localization method, using the smartphone and cloud services; a mobile guidance system using AR on smartphones to direct users to safety; a personalized pedometry algorithm that estimates an individual’s stride length to more accurately determine a user’s distance and walking speed, and which utilizes an adaptive GPS connection method; and a recommendation algorithm for the shortest delay path that reroutes users away from crowded exits to uncrowded exits during emergency situations.
RFID tags have been explored as a means for non-GPS localization. RFID tags placed statically in a building beforehand, with precise location knowledge, can be used as navigational waypoints. The requirement to install such an infrastructure within the environment is a major limitation of this approach. For mobile phone localization, the received signal strength (RSS) of Wi-Fi signals is the current preferred method [5–7]. The methods used are multilateration and fingerprinting. Multilateration requires at least three access points, and precise knowledge of their locations, to triangulate a user’s location through RSS measurements. Fingerprinting requires the user to map the Wi-Fi signal propagation characteristics of the environment beforehand, creating a probabilistic heat map that may be consulted to compare against RSS readings obtained by the user. All of these techniques require prior knowledge of the environment, so their application is less general.
Pedometer dead-reckoning (PDR) techniques provide a more general solution by not requiring any prior modification or knowledge of the navigational environment. Given a known starting location (such as that provided by GPS at the entry point to a building), users can probabilistically determine their navigational pathway through footstep detection and heading estimation. Such techniques are limited by sensor placement and sensor quality. The most accurate results require external sensor placement in the user’s shoes [7–9], e.g. yielding 0.5 m - 0.75 m accuracy in 8,725 m² of 3-story building space. Using only the mobile smartphone offers a more generalized solution, but suffers from limited sensor access, poor MEMS (Micro-Electro-Mechanical Systems) sensor quality, and looser coupling between the user’s movement and the sensing capability. CompAcc can also localize a mobile user by using a map-matching technique along with GPS, an accelerometer, and a compass, but only in outdoor areas. CompAcc cannot be used to localize a user within a building initially, because it requires a GPS connection to obtain the user’s initial position; even if the user were initially localized within the building, it could not be used, because it requires the periodic use of GPS to correct localization error. Stride estimation is done statically beforehand, which contributes errors that depend on the variability of a user’s gait. Dynamic stride estimation has been explored, but the sensors are normally mounted on the foot in order to capture leg swing. Work to compensate for these errors employs the Kalman filter [4, 7, 11], the Weinberg expression, or zero-velocity updates (ZUPT) [7, 8]. These techniques minimize inertial drift and can predict actions based on prior event knowledge. In particular, the Kalman filter has seen wide usage, and ZUPT has shown excellent correction accuracy.
System design and architecture
In this work, our goal is to design and develop an indoor augmented reality system for evacuation, by leveraging the sensing capabilities of smartphones and user behavior. In this section, we first highlight the system design requirements and challenges, then describe the overall system architecture and key components.
System design requirements and assumptions
Supporting indoor augmented reality for evacuation calls for advances in a number of research areas, including accurate and efficient indoor localization, efficient indoor AR rendering and user-friendly interfacing, and effective evacuation functionalities.
Indoor localization is a key design requirement for indoor AR systems. Due to the diversity and dynamics of indoor environments and user activities, we need to identify a user’s indoor location precisely (with fine granularity and robustness), with low latency, and without incurring too much overhead on the mobile device.
Efficient exit path recommendation is also important for evacuating a dangerous place in the building. The system needs to effectively recommend the best exit path, avoiding crowded places.
Our solution should not make unrealistic assumptions about the existence of extensive infrastructure to assist with any of the above tasks. Our system does not assume the existence of elaborate indoor localization systems. Even WiFi connectivity and WiFi localization are not necessary, so long as there exists an external wireless connection, such as 3G/4G. We assume only the capabilities and sensors common to most standard smartphones, e.g. today’s iPhones and Android phones, which commonly have a camera capable of capturing continuous video and accelerometers capable of measuring motion. We assume that the digital compass works indoors, which we have verified to be normally true in typical building settings. We do not assume the existence of gyroscopes on the phone, since not all smartphones support them. We also assume that RescueMe already has a map image of the building, the building’s geographical orientation and actual dimensions, and the locations of room doors (with room numbers) and walking paths marked on the map. RescueMe supports evacuation services in any building without additional cost, provided this information is available.
Our task is then to show how, under these assumed conditions, we can construct a system that successfully supports mobile augmented reality in typical indoor settings.
Mobile component: This component runs on the user’s smartphone and implements four important functions. First, it provides inputs for image-based positioning by transmitting appropriate snapshots of nearby room numbers. Second, it implements a pedometry-based localization algorithm to accurately determine the current position and orientation of the user as the user walks up and down the hallway. Third, it implements 3-D rendering of the building from the current perspective (location, orientation, etc.) using AR tags. Finally, it records the user’s walking patterns, such as walking impact and stride length, in outdoor areas to predict the user’s walking distance in the building.
External image labeling service: RescueMe uses an external image labeling service to identify a room number from a room number snapshot taken by the mobile component. This service may use any of the well-known techniques, such as computer vision and crowdsourcing, for label identification. In our current prototype, we use a commercial image labeling Web service called IQ Engines.
RescueMe Server: This component implements our recommendation algorithm for the best exit path for each user. RescueMe selects the path with the shortest time to evacuate. This is not always the shortest distance path. For example, if the shortest distance path is too crowded, RescueMe observes the delays of other users in the system, and finds another path that is faster, rerouting and notifying each affected user in real time.
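The paper does not detail the internals of the recommendation algorithm at this point. As an illustrative sketch only, a shortest-delay exit search can be expressed as Dijkstra's algorithm over a hallway graph whose edge weights are observed traversal delays (in seconds) rather than distances, so a crowded exit with a large delay is automatically routed around. The function name and the example graph are hypothetical, not taken from the RescueMe implementation.

```python
import heapq

def shortest_delay_path(graph, start, exits):
    """Dijkstra over a hallway graph whose edge weights are estimated
    traversal delays (seconds), not distances. Returns the exit with
    the smallest total delay and the path that reaches it."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, delay in graph.get(node, []):
            nd = d + delay
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Pick the reachable exit with the smallest total delay.
    best = min((e for e in exits if e in dist),
               key=lambda e: dist[e], default=None)
    if best is None:
        return None, []
    path = [best]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return best, path[::-1]
```

Re-running this search whenever other users' observed delays change the edge weights yields the real-time rerouting behavior described above.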
In this section, we describe the design and implementation of each component of the RescueMe system. The RescueMe Server coordinates all interactions among the system components, and is responsible for maintaining a spatial database that contains the building’s map and metadata that will be rendered by the AR component on the mobile client. The RescueMe Server is implemented as a Web server that exposes its functionality externally.
A pedometry-based dead-reckoning (PDR) system for indoor localization has been implemented in Java for an Android Nexus One smartphone running the Android 2.3 (Gingerbread) operating system. The Nexus One employs two tri-axis motion sensors, which we leverage for PDR localization: an accelerometer and a digital compass. We use the accelerometer for both step detection and stride estimation. The digital compass is used to determine the user’s heading direction, from which the direction of motion is estimated.
To detect a step, we follow a multistep signal processing method. First, the x-axis, y-axis, and z-axis accelerometer values are normalized by removing the effect of gravity through a mean removal operation. Second, we calculate a moving average of the normalized accelerometer signal. The moving average serves both to minimize errors induced by varying the user’s orientation of the phone in 3 axes and to remove unwanted high frequencies from the data. Third, we examine both positive and negative peaks in the processed accelerometer signal trace. A genuine footstep will generate both a bound and a rebound phase corresponding to the foot striking and pushing against the ground. A footstep is therefore characterized by a positive peak closely followed by a negative peak in the accelerometer data. If the amplitude difference between a positive and a negative peak is greater than a set threshold, a step is recorded. Because footstep frequency is roughly 2-3 Hz, we require the temporal distance between a positive and the successive negative peak to be ≤ 300 ms. The peak amplitude difference is required to exceed a threshold of 1.0 g. Both values were experimentally determined and verified as well-performing choices. A dynamic threshold adaptation scheme was also tested, but it performed worse than the static scheme, which we attribute to the low maximum Android accelerometer sample rate; we therefore report the static threshold method.
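The pipeline above (mean removal, moving average, paired positive/negative peaks under timing and amplitude constraints) can be sketched as follows. This is a simplified illustration, not the production code: the window size and synthetic trace are placeholders, and because the moving average attenuates peaks, the threshold here is applied to the smoothed signal rather than the raw 1.0 g value.

```python
def detect_steps(samples, fs=10.0, window=3, amp_threshold=1.0, max_gap_s=0.3):
    """Count footsteps in an acceleration trace (in g), following the
    described pipeline: mean removal, moving average, then pairing a
    positive peak with a closely following negative peak."""
    # 1. Remove gravity via mean removal.
    mean = sum(samples) / len(samples)
    sig = [s - mean for s in samples]
    # 2. Trailing moving average to suppress high-frequency jitter.
    smooth = []
    for i in range(len(sig)):
        lo = max(0, i - window + 1)
        smooth.append(sum(sig[lo:i + 1]) / (i + 1 - lo))
    # 3. Find local extrema (index, value).
    peaks = []
    for i in range(1, len(smooth) - 1):
        if smooth[i] > smooth[i - 1] and smooth[i] >= smooth[i + 1]:
            peaks.append((i, smooth[i]))
        elif smooth[i] < smooth[i - 1] and smooth[i] <= smooth[i + 1]:
            peaks.append((i, smooth[i]))
    # 4. A step = positive peak followed by a negative peak within
    #    max_gap_s, with amplitude difference above the threshold.
    steps = 0
    max_gap = max_gap_s * fs
    for (i1, v1), (i2, v2) in zip(peaks, peaks[1:]):
        if v1 > 0 > v2 and (i2 - i1) <= max_gap and (v1 - v2) > amp_threshold:
            steps += 1
    return steps
```

On a synthetic trace containing two impact/rebound pairs, the function counts two steps; on a flat trace it counts none.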
Learned gait pattern between outdoor places
RescueMe measures a daily-based stride length for the user when he is walking in outdoor locations in normal situations, in order to build a personalized stride length that is used in the RescueMe application. For this measurement, the application needs to recognize when the user is walking down a street, hovering around one area, or remaining in the same place. A study of the areas visited by 100,000 subjects determined that people usually visit the same locations at similar times, exhibiting habitual space-time movements with reasonably small variation. We used this finding about human behavior, based on similar space-time patterns, to predict the location of users, and stored the regularly visited places on the phone. When the user stays at the same place, or hovers around it for a certain length of time, we do not measure the user’s stride length. However, when the user walks between outdoor places located a certain distance from each other, RescueMe measures the user’s gait pattern by using the accelerometer and GPS. Walking patterns vary according to a user’s age, sex, height, weight, health, etc. [14–16]; these characteristics affect the amplitude of the accelerometer signal. RescueMe estimates a user’s stride length by using the adjusted amplitude measurement of the accelerometer. This outdoor measurement increases the accuracy of our estimation of the user’s walking distance within the hallway of a large-scale building, without requiring the use of GPS indoors.
Daily-based personalized stride estimation
Our system provides a distance estimation between each detected footstep. In order to estimate stride lengths, we have developed a stride estimation function that is based upon the historical measurements of the impact of a user’s footstep, which corresponds to the detected amplitude measured between the positive and negative peak in an accelerometer signal trace.
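The text does not reproduce the stride estimation function itself. As a hedged, minimal illustration of the idea, one plausible form is a least-squares linear model mapping peak-to-peak impact amplitude to stride length, fit from outdoor training pairs where GPS supplies the ground-truth stride; the function names and numbers below are hypothetical.

```python
def fit_stride_model(amplitudes, stride_lengths):
    """Least-squares fit of stride = a * amplitude + b from outdoor
    training pairs (peak-to-peak impact amplitude, GPS-derived stride)."""
    n = len(amplitudes)
    mx = sum(amplitudes) / n
    my = sum(stride_lengths) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(amplitudes, stride_lengths))
    var = sum((x - mx) ** 2 for x in amplitudes)
    a = cov / var
    b = my - a * mx
    return a, b

def estimate_distance(step_amplitudes, model):
    """Sum the per-step stride estimates to get total walked distance."""
    a, b = model
    return sum(a * amp + b for amp in step_amplitudes)
```

At evacuation time, only the accelerometer amplitudes are needed: the fitted model converts each detected step's impact into a personalized stride length, with no indoor GPS required.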
As part of this personalized pedometric algorithm, we developed and used an adaptive GPS connection algorithm, which adjusts the GPS activation/connection time intervals to match the points at which a user changes his/her walking path in outdoor areas. The usual periodic GPS connection method, which activates the GPS sensor at specified regular time intervals, often provides inaccurate measurements of the total walking length when users change the direction of their walking path. Our adaptive GPS algorithm resolves this limitation; we explain its design and development in the Experimental results section.
Map-matching-based orientation and user’s direction estimation
Upon successful detection of a user’s footstep, the simultaneously polled digital compass data is examined to determine the direction the user is heading at that footstep. For each footstep, we read the compass angle against the building map to locate the user. The true direction the user is heading is not measured correctly by the mobile phone as he walks, because the orientation of the phone varies as it is shaken in the user’s hand. For example, if the user is walking at a direction of 90 degrees, the phone typically reports angles between 60 and 110 degrees. If we used the raw angle data read from the phone, the measured walking path would deviate from the user’s true path. The map-matching technique reduces this problem, and is also used to detect hallway turning points on the building map: when a user turns left or right, RescueMe changes the compass angle to the corresponding map angle. To recognize turning points, we created a buffer of limited window size that saves the previously measured angles on the phone as the user walks. If more than half of the buffered angles differ from the user’s previously determined direction, we store the new angle measurement as the user’s current direction. Part of the map-matching algorithm rounds phone angle measurements that are close to an actual map angle to the map angles observed in the buffer. Also, if a new angle measurement is not a possible angle on the map, or is not within a plausible range of acceptable map angles, RescueMe does not replace the current angle measurement with the new one. We discuss the experiment used to develop this estimation algorithm in the Experimental results section.
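The buffering and majority-vote rule described above can be sketched as follows. The window size, snap tolerance, and the four hallway angles are illustrative assumptions for a rectilinear map, not the deployed parameters.

```python
from collections import deque

MAP_ANGLES = (0, 90, 180, 270)  # hallway directions on an example map

def snap(angle, tolerance=30):
    """Round a raw compass reading to the nearest map angle, or None
    if it is not within tolerance of any plausible hallway direction."""
    for m in MAP_ANGLES:
        diff = abs((angle - m + 180) % 360 - 180)  # circular difference
        if diff <= tolerance:
            return m
    return None

class HeadingTracker:
    def __init__(self, initial, window=6):
        self.direction = initial
        self.buffer = deque(maxlen=window)

    def update(self, raw_angle):
        snapped = snap(raw_angle)
        if snapped is None:          # implausible reading: ignore it
            return self.direction
        self.buffer.append(snapped)
        # Change direction only when a majority of buffered readings
        # agree on a new map angle (a hallway turning point).
        if snapped != self.direction:
            votes = sum(1 for a in self.buffer if a == snapped)
            if votes > len(self.buffer) / 2:
                self.direction = snapped
        return self.direction
```

Jittery readings around 90 degrees leave the heading unchanged, while a sustained run of readings near 180 degrees flips the heading once the new angle wins the buffer majority.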
Solution for incremental measurement problem
RescueMe provides two methods for solving an incremental-error problem that arises when using pedometry. Pedometric measurements are generated by the phone’s accelerometer, whose readings contain noise caused by the shaking or jiggling of the phone as the user walks. Such noise affects the measurements used to calculate the user’s walking distance, and the longer the walking time, the more this error accumulates. We use two different methods to correct for such errors: map-matching combined with the user’s direction-path changes, and manual re-selection of a room number or re-taking a picture of a room number. With the first method, the RescueMe client recognizes the user’s location when the user changes the angle of the path he is walking. RescueMe compares this change with the building map and, if it matches, resets the user’s previously recorded location to the new correct one: the actual hallway turning point on the map. With the second method, the user resets his own location manually by selecting a room number close to where he is standing. He can reset his current location either by re-taking a picture of the room number now closest to him, or by selecting from a list of room numbers already stored on the server. In the latter case, the user selects from a list of room numbers on his current floor that were originally sent to him by RescueMe when he first accessed the server.
Evacuation path recommendation
We have implemented AR on the smartphone to support the overlaying of path-based tags on the hallway. The tags not only give a 3D depth perspective, but are also 3D themselves and can be rotated to face the user regardless of the viewing angle. We accomplish this by exploiting the OpenGL library, as explained below. The application renders the 3D information differently depending on the behavior of users. We describe how we render a view of the textured 3D model using the OpenGL library.
Our AR emergency application provides the user with information about the path to take to evacuate a building, using a 3D presentation of evacuation tags for viable exit paths as in Figure 1(a). The tags are displayed in succession on a recommended exit path within a hallway, thus indicating the direction and distance the user needs to follow. We use a depth-based presentation of arrow tags to help users easily recognize the distances they must walk to follow the designated exit paths. When looking down the hallway, a user will know the walking distance he needs to go, because the application displays multiple arrow tags in succession up to the point in the hallway in which the user needs to turn or exit. Meanwhile, as the user walks down the hallway, the application displays a different tag in a corner of the phone that includes directional symbols, conveying information on which way to turn, along with text showing the remaining walking distance. Thus, the user can identify whether he is following the correct direction while he is walking. The directional arrow tags and the additional symbol/distance tag will enable the user to find a viable exit path and proceed in the right direction on that path until he gets to the exit door.
The RescueMe application also provides a 2D map, which users can access when they press a button, so that they can see their current location within the building and all of the recommended exit paths on this map as in Figure 1(b). Thus users can know where they are headed and where they are located in the building, while walking. This 2D map, which was implemented with this application, is based on OpenGL. The users can use this map anytime and even evacuate the building by using it. Users can also change the size or the position of the map by manipulating the touch screen on the smart phone. When walking in a building that has intricate or complicated paths, users can see their current location and a portion of the map at any time.
RescueMe requires accurate determination of the user’s location in an indoor environment. The Global Positioning System (GPS) cannot be used indoors, since line-of-sight communication between GPS receivers and satellites is not possible there. Radio frequency (RF) positioning systems that use the WiFi and Bluetooth radios on smartphones provide limited accuracy (1 - 3 m) due to the complexities of indoor environments, including a variety of obstacles (people, furniture, equipment, etc.) and sources of interference and noise from other devices. Therefore, we investigated the use of other positioning technology in RescueMe. RescueMe uses a commercial image labeling Web service, called IQ Engines, to determine the user’s initial starting position whenever the user takes a picture of a room number close to him. IQ Engines uses a combination of computer vision and crowdsourcing to tag a photo with a label describing the content of the image. When an image is submitted to IQ Engines, it is first processed by a computer vision system in an effort to provide an accurate label. If the computer vision system cannot identify the image, IQ Engines passes the image to its crowdsourcing network for analysis and tagging. According to IQ Engines, the time to return a label for an image varies from a few seconds for the computer vision system to a few minutes for the crowdsourcing system. In the RescueMe application, a user’s location is determined in the following way. First, the user takes a picture of a room number above a door to provide the RescueMe server with his location. The picture is sent to the IQ Engines server, which identifies all text within the picture and sends the text of the room number back to the RescueMe client, the user’s phone.
If the server finds the exact same number in its database, it sends the room number, location, and other associated metadata for that room back to the user. In the RescueMe application, buildings’ door locations are expressed using the following dimensions: floor level, and the x-axis and y-axis positions of the doors in the building. If the text returned to the phone client contains errors, such as the omission of a letter, the addition of an incorrect letter, or the substitution of a correct letter with an incorrect one, the RescueMe client uses an edit distance algorithm to determine the correct room number. The client then queries the user to confirm the room number. When the user gives an affirmative answer, this information is sent to the RescueMe server, and the correct room number, and thus the location of the user, is determined.
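One standard choice for the edit distance computation described above is the classic Levenshtein distance over insertions, deletions, and substitutions. The sketch below, with a hypothetical room list, shows how an OCR label with one wrong character (the letter O read in place of the digit 0) maps back to the nearest known room number; it is an illustration of the technique, not the RescueMe source.

```python
def edit_distance(a, b):
    """Levenshtein distance: minimum number of single-character
    insertions, deletions, and substitutions turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def correct_room_number(label, known_rooms):
    """Return the known room number closest to the OCR label."""
    return min(known_rooms, key=lambda r: edit_distance(label, r))
```

After the closest match is found, the client still asks the user to confirm it, as described above, so a wrong nearest neighbor cannot silently mislocalize the user.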
Basic experiment for counting footsteps
Counting footsteps depending on the orientation
Training personalized pedometry
A user’s personalized stride length can be continually measured using the smartphone only when a user is in an outdoor area, because the GPS sensor only works outdoors. To determine the stride length of users, we collected GPS sensor measurements in combination with the accelerometer sensor’s measurements from users, as they walked from one outdoor location to another. Our goal was to build a practical, personalized pedometric algorithm for the mobile phone that could be used both in outdoor and indoor areas.
To gather and analyze data for this algorithm, we used a survey approach with 10 different users, in which we could collect data from multiple sensors on the mobile phone that could potentially improve the accuracy of the personalized pedometry data for the RescueMe application. The mobile phone survey data was collected from 10 users who carried the phone with them for approximately one week. The sensor data for each survey participant was collected automatically on the users’ mobile phones during this survey period. This research was approved by the Institutional Review Board (IRB).
We collected daily movement behavior data (e.g., walking, sitting, running) from sensors on mobile phones carried by the 10 survey subjects for a period of one to two weeks. We used four mobile phones for this survey, two HTC Nexus One phones and two HTC Inspire phones, with our survey application installed on them. We paid each of the survey subjects $10 for participation in this study and required them to carry the phone with them at all times during the day, in the same pocket or purse as their own mobile phone, during the data collection period. The data was collected and stored on the phone while they were actively participating in the survey and, at the end of the survey week, was transferred from the subject’s phone to a laptop.
Walking-pattern experiment in an outdoor environment compared to indoor
The RescueMe application was used to run an experiment comparing each user’s personalized pedometric readings measured in outdoor environments to their pedometric readings in indoor environments. Because the GPS on the mobile phone is only continually usable in outdoor areas, we first calculated each mobile user’s personalized pedometry in these areas using the accelerometer and GPS measurements, and then took similar measurements for each user in indoor areas, using the accelerometer and orientation sensors. The aim of this experiment was to see how closely the outdoor personalized pedometry for each user matched their indoor personalized pedometry, so that we could comfortably use outdoor personalized pedometry measurements for mobile users in place of indoor pedometry measurements in our RescueMe application.
Average difference between user’s indoor and outdoor walking-impact measurements
Walking path variation issue in outdoor areas
The RescueMe application trains the mobile user’s pedometry in outdoor areas, using the personalized pedometry algorithm, when the phone has the additional use of the GPS sensor. The algorithm measures the personalized stride length of a mobile user using the GPS and accelerometer sensors when the user is in an outdoor area. It calculates the distance between the user’s position at the time of one GPS connection and the user’s position at a later GPS connection time, using a periodic GPS connection method (e.g., every 3 or 5 minutes). In the time span between the two GPS connections, the user’s footsteps and walking-impact strength are measured by the accelerometer, and the algorithm then estimates the user’s stride length from the walking impact.
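The distance between two GPS fixes can be computed with the standard haversine formula, and dividing by the accelerometer's step count for the interval gives an average stride. This is a hedged sketch of the calculation (the production Android client could equally use the platform's own `Location.distanceTo`); the function names and figures are illustrative.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def average_stride(fix_start, fix_end, steps):
    """Walked distance between two GPS connections divided by the
    number of accelerometer-detected steps in between."""
    d = haversine_m(*fix_start, *fix_end)
    return d / steps
```

For example, two fixes 0.001 degrees of longitude apart on the equator are about 111 m apart; with 150 detected steps in between, the average stride comes out to roughly 0.74 m.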
Adaptive GPS connection method with phone orientation sensor
As noted above, we found that the orientation sensor can successfully be used, in conjunction with the mobile phone’s other sensors, to detect variations in a user’s walking path in outdoor areas. In addition, we also checked the orientation angles of the subjects’ phones recorded when they were walking down the same paths in the same directions. We found that the phone’s orientation varied greatly, depending on the angle-position of the phone while it was carried in either a user’s pocket or purse. Although the phone’s compass angle readings changed considerably depending on the phone’s placement orientation, we found that we can use the compass sensor’s readings regardless of the mobile phone’s placement or orientation. We ran an experiment to check the readings of the phone’s orientation sensor when the phone is carried in different positions, and to see whether the orientation sensor could still accurately detect a user’s walking-path turning points and variation.
Thus we created an adaptive algorithm that efficiently adjusts the GPS connection time to help build a personalized pedometry for users. The periodic GPS connection method was shown to be unable to handle variations in users’ walking paths (from a straight line) in outdoor areas. We were able to use the orientation sensor to identify such variations and, in conjunction with this sensor data, build an adaptive GPS connection algorithm. The algorithm obtains the mobile user’s position using the GPS sensor when the user changes the direction of his/her walking path in an outdoor area. Thus, adaptively accessing the GPS sensor improves the accuracy and efficiency of our personalized pedometric algorithm.
Personalized pedometry-based localization
Accuracy using personalized pedometry
The pedestrian localization system, comprising pedometry and heading estimation, was implemented in Java and tested on a Nexus One smartphone running the Android 2.3 (Gingerbread) operating system. User tests evaluated pedometry step detection, stride estimation, and the combination of step detection and personalized stride length measurement into an overall walked-distance estimate. Additionally, we simulated differing types of users, ranging from an “engaged” user, who wishes to learn how to use the system to obtain the best performance, to a “casual” user, who is not interested in performance and so uses the system in a careless manner.
Step detection accuracy
As Table 3 shows, short strides tend to be under-detected, while long strides are prone to over-detection. This is due to the static threshold used for detection, which is tuned for the normal stride length scenario. An adaptive step-threshold detection scheme was also implemented and tested, but performed worse than the static method. We theorize that this counterintuitive result is due to the accelerometer’s 10 Hz maximum sampling rate on the Nexus One, which does not provide a smooth enough data curve for the adaptive algorithm to leverage effectively.
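The static-threshold scheme discussed above can be sketched as a simple peak detector over the accelerometer magnitude stream. The threshold value and function name here are illustrative assumptions, not the paper’s tuned parameters; a production pedometer would also enforce a refractory period between peaks, which is omitted for brevity.

```python
def detect_steps(accel_magnitude, threshold=11.0):
    """Count steps as local peaks in accelerometer magnitude (m/s^2)
    that rise above a fixed threshold.

    accel_magnitude: sequence of |a| samples (around 9.8 at rest).
    threshold: static peak threshold, tuned for a normal stride; short
    strides produce peaks below it (under-detection) and long strides
    can produce multiple crossings per step (over-detection).
    """
    steps = 0
    for prev, cur, nxt in zip(accel_magnitude,
                              accel_magnitude[1:],
                              accel_magnitude[2:]):
        if cur > threshold and cur >= prev and cur > nxt:
            steps += 1
    return steps
```

On a synthetic trace with two impact peaks above the threshold and one soft peak below it, the detector counts exactly two steps, mirroring how a fixed threshold misses weak (short-stride) impacts.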
Figure 10(b) compares the static and personalized stride length estimation techniques. The resulting stride lengths represent the average stride length over each of the 9 user trials, calculated as the overall distance measured divided by the number of steps detected (rather than steps actually taken). This removes any additional step detection errors that might be present and allows a pure comparison of stride length estimation. The personalized stride estimation produced 2.33 percent error, while the static stride estimation suffered 17.06 percent error. Interestingly, for the medium stride length, for which the static method was tuned, the static method’s average error actually outperforms that of the personalized method on the same data set. Also notable is the extreme accuracy of the long stride under the personalized estimation scheme: the error bars are almost too small to be seen, averaging 99.6 percent stride length accuracy for this stride type. This accuracy is most likely due to the flatness of the alpha correction function for large positive peak amplitudes.
Figure 10(a) addresses the combined error from the step detection and stride estimation techniques. An overall walk distance measured by our system is compared against the ground-truth walked distances. Figure 10(b) shows that in some cases, e.g., personalized trial 1 for a short stride, a step detection error partially cancels the stride estimation error. In most cases, however, when both kinds of error are present they compound, as evidenced by the increase in overall error from stride estimation (2.33 percent) and step detection (3.33 percent) to distance walked (3.43 percent).
Map-matching experiment for RescueMe
We also conducted an experiment on predicting the walking path angle when a user changes direction. In this experiment, we created a continuously refreshed five-window buffer to determine the heading-direction angle, storing one angle measurement per window for each step the user takes. Whenever three of the five measurements in the buffer are similar to one another, RescueMe stores the current direction as the new heading-direction angle of the user, as shown in Figure 11(b). The combined use of these two algorithms successfully applied a map-matching technique with pedometry to localize the user on the map.
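The buffer logic can be sketched as follows. The 15-degree similarity tolerance is our assumption (the paper does not specify how “similar” is defined), and the names and parameters are illustrative.

```python
from collections import deque

def angular_diff(a, b):
    """Smallest absolute difference between two compass angles."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def update_heading(buffer, new_angle, heading, tol_deg=15.0, agree=3):
    """Slide a per-step heading reading into a five-slot buffer; commit a
    new heading only once at least `agree` of the buffered angles agree
    within tol_deg, filtering out transient compass jitter.

    buffer: deque(maxlen=5) of recent per-step angles.
    Returns the (possibly updated) committed heading.
    """
    buffer.append(new_angle)          # oldest reading is evicted automatically
    if len(buffer) < buffer.maxlen:   # not enough evidence yet
        return heading
    for candidate in buffer:
        close = sum(1 for a in buffer if angular_diff(a, candidate) <= tol_deg)
        if close >= agree:
            return candidate
    return heading
```

For example, feeding per-step readings of 90, 92, 180, 88, and 91 degrees commits a heading of 90 degrees: the single 180-degree outlier cannot outvote the four agreeing readings.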
RescueMe simulation for the exit path recommendation
We conducted two simulations to evaluate how long it takes people to evacuate a building in an emergency. We contrasted three scenarios: one with randomization (no algorithm), one using the shortest-exit-path algorithm, and one using the RescueMe algorithm. In the first simulation, 179 people were deployed randomly within the building at the start. In the second, 162 people were deployed in one specific area of the building. The simulated people moved one step per tick, all at the same speed, towards their chosen exit doors. When more than one person arrives at the same place, each person’s exit is delayed by those adjacent to them.
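One way to realize a crowd-aware exit recommendation of the kind RescueMe provides is to add a congestion penalty to each exit’s path cost. The sketch below is our illustrative stand-in, not the paper’s actual optimal-exit-path algorithm: the linear penalty, the greedy assignment order, and the weight are all assumptions, and the per-exit distances would come from a shortest-path computation over the building map.

```python
def choose_exit(person_dist, exit_load, congestion_weight=1.0):
    """Pick the exit minimizing an estimated evacuation cost: walking
    distance to the exit plus a penalty for people already routed there.

    person_dist: {exit_name: distance in steps for this person}.
    exit_load:   {exit_name: number of people already assigned}.
    """
    return min(person_dist,
               key=lambda e: person_dist[e] + congestion_weight * exit_load.get(e, 0))

def assign_crowd(people_dists, congestion_weight=1.0):
    """Greedily assign people to exits, updating loads as we go, so a
    co-located crowd is spread across several doors rather than piling
    up at the single nearest one."""
    load = {}
    assignment = []
    for dists in people_dists:
        e = choose_exit(dists, load, congestion_weight)
        load[e] = load.get(e, 0) + 1
        assignment.append(e)
    return assignment
```

With a crowd sharing the same distances to exits A, B, and C, the pure shortest-path rule would send everyone to A; the congestion penalty instead spills later assignees over to B and C, which is the dispersal effect the second simulation measures.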
Simulation: Randomly distributed people
Table 4 reports the evacuation time (in ticks) for Simulation 1 and Simulation 2 under each scenario.
The second simulation shows how best to improve exit time when people are unevenly distributed throughout a building, such as when they are gathered as a crowd for a presentation in one location. Simulation 2 in Table 4 shows that RescueMe provides the best result for evacuating a crowd from within a building: all of the people evacuated within 368 ticks, whereas the shortest-path method took longer (563 ticks) and the random method longer still. People using the shortest-path method often ended up at the same exit door, since it lay on the shortest path from the shared crowded area, and each individual’s exit time was delayed by the people between them and the door. In the RescueMe scenario, by contrast, most people were able to exit through a door uncrowded by others, as shown in Figure 12(b). In this case, RescueMe recommended one of three exit doors (A, B, and C), so the crowd was dispersed evenly into three groups, allowing people to evacuate more quickly through less crowded exits.
In this paper, we presented RescueMe, a novel emergency evacuation system successfully implemented on the mobile phone for use in large indoor building environments. We explained how we developed and tested the system’s components: personalized pedometry, indoor mobile Augmented Reality, cloud information, and an optimal exit path algorithm. Together, these components efficiently recommend the quickest and shortest evacuation path to users in indoor emergency situations. In addition, our practical personalized pedometry algorithm, which incorporates user stride estimation and an adaptive GPS connection method, was shown to provide high positioning accuracy for the system’s mobile phone users.
RescueMe shows the potential to overcome the limitations of current smartphone emergency applications by leveraging the phone’s sensors, together with a user’s personalized daily walking stride length estimation and user-localization cloud information, to support timely evacuation from medium- to large-scale buildings in emergency situations. At this point, however, the RescueMe application has not been tested in a real emergency evacuation: it was impossible during the testing phase to generate a live emergency (e.g., a fire or bomb scare in a building) in which to fully evaluate its performance. Our hope is that the system can be further tested at larger scale with simulated emergency evacuations, so that it could eventually be offered to the public as a fully tested mobile evacuation application. A further enhancement would be to integrate RescueMe with a building’s existing alarm system or with cloud sources (e.g., Twitter, police databases) to obtain critical emergency information in real time for users trying to evacuate the building.
Written informed consent was obtained from the experimental participant for publication of this report and any accompanying images. Our user-based experiment was approved by the Institutional Review Board (IRB).
This work was supported by a grant from the National Science Foundation, CCF 1048298. The authors wish to thank Youjin Seo, Sua, Jamie Williamson, and Cathy Kerry for helping with some experiments or sharing ideas.
- Federal Emergency Management Agency (FEMA) [http://www.fema.gov/]
- Ahn J, Han R: RescueMe: an indoor mobile augmented-reality evacuation system by personalized pedometry. In Proceedings of IEEE APSCC 2011, pp 70–77. doi:10.1109/APSCC.2011.26
- IQ Engines: image recognition and visual search [http://www.iqengines.com]
- Miller LE, Wilson PF, Bryner NP, Francis MH, Guerrieri JR, Stroup DW, Klein-Berndt L: RFID-assisted indoor localization and communication for first responders. 2006, pp 1–6. doi:10.1109/EUCAP.2006.4584714
- Miluzzo E, Lane ND, Fodor K, Peterson R, Lu H, Musolesi M, Eisenman SB, Zheng X, Campbell AT: Sensing meets mobile social networks: the design, implementation and evaluation of the CenceMe application. In Proceedings of the 6th ACM Conference on Embedded Network Sensor Systems (SenSys ’08). ACM, New York; 2008, pp 337–350
- Miluzzo E, Cornelius C, Ramaswamy A, Choudhury T, Liu Z, Campbell AT: Darwin phones: the evolution of sensing and inference on mobile phones. In Proceedings of MobiSys 2010, pp 5–20. doi:10.1145/1814433.1814437
- Fuchs C, Aschenbruck N, Martini P, Wieneke M: A survey on indoor tracking for mission critical scenarios. Pervasive and Mobile Computing 2010. In press, corrected proof
- Beauregard S: Omnidirectional pedestrian navigation for first responders. In 4th IEEE Workshop on Positioning, Navigation and Communication (WPNC ’07). Hannover, Germany; 2007, pp 33–36
- Woodman O, Harle R: Pedestrian localisation for indoor environments. In Proceedings of the 10th International Conference on Ubiquitous Computing (UbiComp ’08). ACM, New York; 2008, pp 114–123
- Constandache I, Choudhury RR, Rhee I: Towards mobile phone localization without war-driving. In Proceedings of the 29th Conference on Information Communications (INFOCOM ’10). IEEE Press, Piscataway; 2010, pp 2321–2329
- Seco F, Jimenez A, Prieto C, Roa J, Koutsou K: A survey of mathematical methods for indoor localization. In IEEE International Symposium on Intelligent Signal Processing (WISP 2009), pp 9–14
- Ladstaetter S, Luley P, Almer A, Paletta L: Multisensor data fusion for high accuracy positioning on mobile phones. In Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services (MobileHCI ’10). ACM, New York; 2010, pp 395–396
- González MC, Hidalgo CA, Barabási A-L: Understanding individual human mobility patterns. Nature 2008, 453: 779–782. doi:10.1038/nature06958
- Schmitz A, Silder A, Heiderscheit B, Mahoney J, Thelen DG: Differences in lower-extremity muscular activation during walking between healthy older and young adults. J Electromyogr Kinesiol 2009, 19: 1085–1091
- Callisaya ML, Blizzard L, Schmidt MD, McGinley JL, Srikanth VK: Sex modifies the relationship between age and gait: a population-based study of older adults. J Gerontol A Biol Sci Med Sci 2008, 63: 165–170
- Brown LA, Gage WH, Polych MA, Sleik RJ, Winder TR: Central set influences on gait: age-dependent effects of postural threat. Exp Brain Res 2002, 145: 286–296. doi:10.1007/s00221-002-1082-0
- Dijkstra EW: A note on two problems in connexion with graphs. Numerische Mathematik 1959, 1: 269–271. doi:10.1007/BF01386390
- Gu Y, Lo A, Niemegeers I: A survey of indoor positioning systems for wireless personal networks. IEEE Commun Surv Tutorials 2009, 11(1): 13–32
- Levenshtein V: Binary codes capable of correcting deletions, insertions, and reversals. Sov Phys Doklady 1966, 10: 707–710
- Ahn J, Han R: Detecting human behavior and feelings to determine safety level status using the mobile phone in a user’s daily life. Protocol 12-0260, The University of Colorado at Boulder; 2012, pp 798–802
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License(http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.