Sensing spatial and temporal coordination in teams using the smartphone
© Feese et al.; licensee Springer. 2014
Received: 17 March 2014
Accepted: 15 August 2014
Published: 28 September 2014
Teams are at the heart of today’s organizations and their performance is crucial for organizational success. It is therefore important to understand and monitor team processes. Traditional approaches employ questionnaires, which have low temporal resolution, or manual behavior observation, which is labor intensive and thus costly. In this work, we propose to apply mobile behavior sensing to capture team coordination processes in an automatic manner, thereby enabling cost-effective and real-time monitoring of teams. In particular, we use the built-in sensors of smartphones to sense interpersonal body movement alignment and to detect moving sub-groups. We aggregate the data at the team level in the form of networks that capture a) how long team members are together in a sub-group and b) how synchronized team members move. Density and centralization metrics extract team coordination indicators from the team networks. We demonstrate the validity of our approach in firefighting teams performing a realistic training scenario and investigate the link between the coordination indicators and team performance as well as experienced team coordination. Our method enables researchers and practitioners alike to capture temporal and spatial team coordination automatically and objectively in real-time.
Teams and teamwork are essential in today’s organizations. To perform well as a team, members need to share information, coordinate their actions and support each other. These activities are commonly referred to as team processes, which convert inputs such as individual members’ abilities into outcomes such as team performance.
In order to improve team performance, it is mandatory to monitor how team members work and interact with one another. However, current approaches to monitor these team processes in situ and over time are limited. While questionnaires are ill-suited to capture the temporal sequence of interactions, manual behavioral observation, which would be more suitable for that purpose, is notably absent from group research. One reason is that behavioral observation is time-consuming, as the manual encoding of behavior usually takes many times longer than the actual interaction. As a result, most studies are limited to small samples and short observational periods. Consequently, researchers have called for new measurement systems capable of capturing the complexity of team processes.
In our view, ubiquitous computing can help to continuously monitor team processes in realistic environments and provide a new observational tool that can support team researchers and trainers with objective data on how team members interact and work with each other.
We present an approach to use the smartphone as a sensing platform to capture individual and team behaviors. We record body movement of each team member and estimate proximity between team members.
From the sensor data, we extract sub-group and movement alignment networks that summarize a) how long team members are together in a sub-group and b) how synchronized team members move. Further, we propose to summarize the structures of the extracted team networks using density and centralization metrics as used in social network analysis.
We validate our approach in a study with professional firefighting teams performing high fidelity training missions in a firehouse and show how the proposed coordination indicators are correlated with objective and subjective coordination measures.
Team coordination in safety-critical environments has been assessed using different methodologies. A common approach includes behavioral observation. By observing recorded videos of the team interaction and encoding predefined behaviors, researchers can investigate temporal aspects of team processes such as patterns of interaction and changes over time. Behavioral observation, however, is very resource-intensive and impractical for applied settings.
Another approach to team processes focuses on the structural characteristics of teamwork. Crawford et al. recently introduced a theoretical framework that considers structure. By drawing on social network analysis (SNA), their theory proposes different types of networks to provide a more comprehensive explanation of the relationship between team processes and performance. SNA expresses the social environment as patterns or regularities in relationships among interaction units. These relationships (i.e., ties) can be of different types. For example, a communication network could capture which members of a department communicate with each other on a regular basis. Such a network can reveal those members that are central to the dissemination of information within this department. The relationship data that makes up a social network can be represented and analyzed in different ways. For a quantitative analysis, different relationship metrics can be derived from the data. These metrics can describe properties of individual members or of the whole team. The most common team-level metrics include density and centralization. Network density is defined as the ratio of the actual number of ties to the total possible number of ties in a network and is often used as an indicator of cohesion. Centralization refers to the variance in ties per team member; low values indicate a structure in which each member has the same number of ties. Centralization reflects aspects of work organization and hierarchy.
Researchers have applied SNA to teams in organizations. For example, it has been suggested that centralization has a negative impact on team performance in complex tasks. Zohar et al. showed that the density of a military team’s communication network mediated the effects of transformational leadership on climate strength. Likewise, a series of case studies with police and firefighting teams suggests that both teams have different network architectures (distributed vs. split).
Despite the potential of SNA to uncover the underlying structure of team processes, the number of studies using SNA in team research is small. We believe that part of this problem lies in the method itself as SNA, like behavioral observation, is very resource-intensive and often impractical for applied settings. One way to address this issue includes taking advantage of new developments from the field of mobile behavior sensing.
Mobile behavior sensing
Mobile behavior sensing aims at measuring and analyzing human behavior from sensor data recorded with mobile devices. Research in wearable and ubiquitous computing has shown how user context and behavior can be inferred from the smartphone’s sensor data using signal processing and machine learning techniques. The integrated sensors capture device interaction, body movement, location and speech of the user as well as characteristics of the user’s environment such as ambient light and sound. Characteristic features are then extracted from the sensor signals to make inferences about the context, state and behavior of a user.
Farrahi et al. used coarse location information from cell towers and clustered individual location traces to discover daily routines such as “going to work at 10am” or “leaving work at night” .
In order to give semantic meaning to recorded location information, features from ambient sound and video were fused to categorize location into place categories such as “college/education”, “food/restaurant” or “home”.
The audio modality has been analyzed in the mobile setting to detect conversations, recognize speakers and estimate speaking duration, to recognize perceived basic emotions, to detect perceived stress during a street promotion task, and to quantify sociability as one aspect of well-being.
On a macro level, Eagle et al. were the first to show how mobile phones can be used to infer proximity networks of communities. Relying on repeated Bluetooth scans, mobile phones were used to detect other nearby devices to estimate proximity between individuals. On the same dataset, topic models were later used to discover human interactions from the proximity data.
Before the smartphone was available, Choudhury et al. introduced the sociometer, a wearable device, to automatically sense body motion, communication and proximity networks . Extending this line of research, Olguin et al. used a new version of the sociometer to collect behavioral data of nurses in a hospital. The results showed a positive relationship of group motion energy and speaking time with group productivity .
In previous work, we adopted the idea of using motion and speech activity to monitor teams. Our feasibility study showed that speech and motion activity are promising performance indicators in firefighting teams, which motivated us to design, build and distribute our mobile sensing app CoenoFire in a real fire brigade. In this paper, we build on our approach to sense team proximity dynamics with the smartphone.
Sensing spatial and temporal coordination
From the review of related work, we conclude: Firstly, the smartphone can be used to capture the behavior of its user and secondly, teams can be characterized by network metrics as commonly applied in social network analysis. Based on these findings, we propose to automatically sense spatial and temporal aspects of team coordination using the smartphone.
The spatial component of coordination is concerned with how team activities are distributed in space. For this reason, we detect moving sub-groups of team members. Team members within the same sub-group are in close proximity, whereas those of another sub-group are not. The temporal component of coordination is related to how team activities are aligned in time. Instead of detecting concrete activities (e.g. running), we measure the movement activity level that captures how long a team member was physically active during consecutive time intervals. By comparing the motion activity level signals of two team members, we measure how well they aligned their body movement in time. This is especially important for members of first responder teams such as firefighters, who move and work at least in pairs.
Sensor data is recorded on the smartphones carried by each team member. The phone’s sensors capture body motion by sensing acceleration, proximity to others by exchanging radio messages between nearby devices and height information by sensing atmospheric pressure.
Data of each team member is processed to derive the sub-group network which captures who was for how long in a sub-group with another team member, as well as the movement alignment network, which captures dependencies in activity levels between team members.
Network metrics as used in SNA are extracted from the team networks to capture the overall structure of the networks. Network density describes how well the nodes (team members) within the network are connected, whereas centralization measures how heterogeneously the nodes are connected to each other. Depending on the type of network (sub-group network or movement alignment network), connected therefore refers to how long team members were in the same sub-group or to how well they aligned their movement activity levels. We refer to density and centralization of the two team networks as team coordination indicators.
In the example presented in Figure 1, person A is standing still while being in proximity with the running persons B and C. Person D on the other hand is walking behind a wall and is therefore not in any sub-group with another person. This leads to the presented sub-group network. As person A is in sight of persons B and C, the sub-group graph shows them to be in one group, whereas person D is indicated to be alone. The movement alignment network shows persons B and C to be best aligned because they are both running, whereas person A is worst aligned as she is the only person not moving. From the sub-group and movement alignment networks, SNA metrics are derived to characterize overall network structures in order to capture team coordination indicators.
Smartphone sensing platform
For data collection, we used the Sony Xperia Active smartphone which is dust and water-resistant, has a 3-inch capacitive touchscreen and a built-in ANT-radio (http://www.thisisant.com). ANT is a low power wireless protocol that was developed to connect fitness devices such as heart-rate belts and pedometers with sport watches; however, in this work we use it for proximity estimation. We developed an Android app to continuously record data of the phone’s built-in sensors. To this end, we extended the funf-open-sensing-framework to also detect nearby devices by transceiving ANT-radio messages and to save the raw sensor data locally to the memory card.
Data was recorded from the following built-in sensors: acceleration and orientation sensors were used to measure body movement, the barometer measured atmospheric pressure and was used to infer whether individuals were on the same floor level and ANT-radio messages were sent and received to find out which team member was in proximity to another one.
As our goal was to sense team behaviors, all devices carried by the team members needed to be synchronized to allow comparison of the sensor signals across team members. Therefore, we measured the offset between system time and a common reference time every 5 min using the network time protocol. With this approach, we were able to achieve a time synchronization across devices with a maximum time difference of 500 ms. To enable remote monitoring, we configured the framework to upload a subset of calculated features, such as the battery level, to a central server every five minutes. Because we used the smartphone as a sensing platform, we installed our app as the default homescreen and blocked all soft buttons. In this way, our app was always visible and the use of the smartphone was restricted to our data collection.
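The offset-based correction described above can be sketched as follows; the function and data layout are our own illustration (the paper does not show the app’s implementation), assuming one NTP offset measurement per device every 5 min and a step-wise correction using the most recent measurement:

```python
import bisect

def correct_timestamp(local_ts, offset_samples):
    """Map a device-local timestamp onto the shared reference clock.

    offset_samples: chronologically sorted list of (local_time,
    offset_seconds) pairs, one per NTP measurement. The most recent
    measurement at or before local_ts is applied.
    """
    times = [t for t, _ in offset_samples]
    i = bisect.bisect_right(times, local_ts) - 1
    i = max(i, 0)  # before the first sample, fall back to the first offset
    return local_ts + offset_samples[i][1]
```

With per-device corrected timestamps, sensor signals from different phones can be compared on a common time axis within the reported 500 ms bound.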
At the back-end, we ran one web server to receive and store the data from the smartphones in a central database. A second web server provided a web-based user interface that offered real-time monitoring of the system. A screenshot of the web interface showing the battery status of the devices is presented on the right of Figure 2. The interface also allows visualization of real-time data of the firefighters’ movement and speech activity.
In order to validate our approach, we tested it in a sample of firefighting teams completing a training scenario. The study was conducted in cooperation with the Zurich fire department and approved by the Ethics committee of the University of Zurich. Written consent from all participants was obtained prior to data collection.
Setting and procedure
The scenario took place in a burn building, a multi-story training facility that allows for a highly realistic simulation of fire incidents. During training sessions, firefighters were confronted with actual fires, extreme heat, high humidity and thick smoke restricting visibility. Trainings were performed using standard equipment including vehicles, protection suits, and self-contained breathing apparatus (SCBA). We informed the participants about the study two weeks prior to data collection during their morning reports. Upon arrival at the training site, participants were again informed about the study, and completed the consent form and a personal background questionnaire. They also received a briefing about the scenario from a training instructor. Then, the first trial was conducted. Each firefighter carried one smartphone in the left jacket pocket of his protection suit (see Figure 3). We videotaped all trials using two regular cameras to record outside and a thermographic camera to record inside the building. After the scenario, participants completed the coordination questionnaire (see below) and received a technical debriefing about their performance. Team members switched roles and started the next scenario after a short break.
Measurement of perceived coordination
Perceived explicit and implicit coordination were measured via self-report after each trial. To assess explicit coordination we used three items of the German translation of the subscale coordination of the transactive memory scale . A sample item is “Our team worked together in a well-coordinated fashion”. The scale had a high reliability (α=.80). In absence of a validated scale, we developed five items to assess implicit coordination based on its definition. Sample items included “We automatically adjusted our working styles to each other” and “We understood each other blindly”. The scale had a high reliability (α=.87). All items were answered on a 5-point scale ranging from 1 = “strongly disagree” to 5 = “strongly agree”.
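The reported scale reliabilities (Cronbach’s α) can be computed directly from the raw item responses. A minimal sketch in Python; the function name and the respondents-by-items data layout are our assumptions:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: (n_respondents, k_items) array of Likert responses,
    e.g. the three explicit-coordination items on a 1-5 scale.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)    # variance of sum scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```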
In total, 51 professional firefighters from the Zurich Fire Brigade participated in our study. All participants were male, aged 35±10 years. They completed a simulated fire incident in teams of 7-9 members. The data collection was conducted on four consecutive days. We recorded 18 training runs of the described scenario. Most firefighters took part in more than one trial because of the limited overall sample size. However, we made sure that participants switched their roles after each trial to ensure variation. In five trials one smartphone partially failed recording; in three additional runs one firefighter did not participate in the study. This left us with 10 complete runs totaling over 2 h of training data.
Extraction of team coordination indicators
In the following, we present how we extract team coordination indicators from automatically sensed team networks. First, we describe how moving subgroups are detected to derive the sub-group network. Second, we describe how temporal activity alignment is quantified to extract the activity alignment network. Third, we detail how team coordination indicators are extracted from the team networks.
Detection and visualization of moving sub-groups
Having detected moving sub-groups, we are able to calculate a sub-group network that summarizes which team member was for how long in a sub-group with another team member. Thus, the sub-group network captures the overall spatial structure of the team during a mission. In Figure 4a the corresponding sub-group network is presented on the right of the narrative chart. The network graph highlights three sub-groups that belong to the first and second troop that entered the building via the turntable ladder as well as the ground support team that includes the incident commander, turntable ladder operator and the engineer. In the graph, darker links between nodes correspond to team members that were together in a sub-group for longer than 60% of the mission.
In the following we briefly describe our method to detect moving sub-groups using radio-based proximity data. Please refer to  for more details. We follow a two-stage approach to detect moving sub-groups: We first calculate the proximity matrix D_t for consecutive time intervals t of length L=5 s. Each binary element indicates whether device i received any ANT message of device j during time interval t. Considering proximity to be undirected, we further symmetrize the proximity matrix.
In the second stage, moving sub-groups are clustered from the proximity data. Clusters are first identified independently from the symmetrized proximity matrices of each time interval and secondly, the clustering output is smoothed by applying a temporal filter, so that clusters last for at least 10 s. We cluster each symmetrized proximity matrix using the single-link criterion. As a result, if group member A is connected with B and B with C, but C is not connected with A, all three devices are still clustered into one group.
Using only radio based proximity information might lead to individuals on different height levels to be clustered into one group. To address this problem, we added height information derived from the atmospheric pressure sensor. If the absolute atmospheric pressure difference between two devices is greater than a predefined threshold, the two devices are considered to be on different height levels and are thus not clustered to the same sub-group. To obtain the sub-group network, we average the clustering results over all time intervals.
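The per-interval clustering step described above can be sketched as follows. The symmetrization, single-link clustering (equivalent to connected components of the proximity graph) and the pressure-based floor filter follow the text; the function name and the pressure threshold value are our assumptions, and the 10 s temporal smoothing is omitted for brevity:

```python
import numpy as np

def detect_subgroups(received, pressure, max_dp=0.5):
    """Cluster devices into moving sub-groups for one 5-s interval.

    received[i][j] is True if device i heard an ANT message from
    device j; pressure[i] is the barometric reading of device i (hPa);
    max_dp is a hypothetical same-floor threshold. Returns a list of
    clusters as sorted lists of device indices.
    """
    n = len(pressure)
    D = np.asarray(received, dtype=bool)
    A = D | D.T  # proximity is treated as undirected
    # cut ties between devices on different height levels
    for i in range(n):
        for j in range(n):
            if abs(pressure[i] - pressure[j]) > max_dp:
                A[i, j] = A[j, i] = False
    # single-link clusters = connected components (depth-first search)
    seen, clusters = set(), []
    for s in range(n):
        if s in seen:
            continue
        stack, comp = [s], []
        while stack:
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            comp.append(u)
            stack.extend(v for v in range(n) if A[u, v] and v not in seen)
        clusters.append(sorted(comp))
    return clusters
```

Averaging the resulting co-membership over all intervals yields the sub-group network.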
As the ANT-radio protocol operates in the 2.4 GHz band, radio signals are particularly influenced by the surrounding environment. In our experiments, we observed that depending on the relative orientation and environment of the individuals, the maximum transmission distance varied in the range of 1 m to 20 m. In  we evaluated our algorithm to detect moving sub-groups of firefighters during the described training scenario by comparing the results to a manually annotated ground truth. On average, team members were assigned to the correct sub-group with 95% accuracy.
Temporal alignment of activity level
In order to capture the temporal aspect of coordination in teams, we measure and compare activity levels of individual team members. In doing so, we assume that well coordinated team members change their activity level at similar points in time.
We define the activity level to be the fraction of time that an individual is active within a moving window of length L. The activity level increases when individuals become active and decreases as soon as team members stop moving. The window length L determines the slope of the activity level and the minimum time that an individual needs to be active to reach the maximum activity level. The value of L also affects the temporal resolution: a small value requires individuals to change their activity closer in time, whereas a larger value allows for a delay between activity changes, as the activity level is calculated over a longer period.
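A minimal sketch of this computation, assuming a binary per-sample activity signal (1 = moving) and a trailing window of L samples; names are our own:

```python
import numpy as np

def activity_level(active, window):
    """Fraction of active samples within a trailing window.

    active: binary sequence (1 = device detected body movement in that
    sample); window: number of samples corresponding to the window
    length L. Position t averages the samples up to and including t.
    """
    active = np.asarray(active, dtype=float)
    kernel = np.ones(window) / window
    # full convolution, truncated so output has the same length as input
    return np.convolve(active, kernel)[: len(active)]
```

With this definition the signal ramps up over L samples when a member starts moving and ramps down over L samples when they stop, as described above.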
The dependency between the activity levels X and Y of two team members is quantified by their mutual information, I(X;Y) = ∑_{x,y} p(x,y) log [ p(x,y) / (p(x) p(y)) ], which compares the joint distribution p(x,y) to the joint distribution under the assumption of independence, in which case p(x,y) = p(x)p(y). Thus, I(X;Y) is zero if and only if X and Y are independent.
Two examples of activity level alignment that occurred during the firefighting training scenario are presented in Figure 4b. The two activity levels presented in the top graph belong to two team members from the first troop (T1a, T1b), whereas the activity levels shown in the bottom graph belong to team member T1b and the incident commander. While the activity levels of the troop members change often together in time and are well aligned, the activity levels of the troop member T1b and the incident commander are not well aligned in time. Consequently, the observed mutual information is higher between the activity levels of the troop members as opposed to those of troop member T1b and the incident commander.
In order to summarize the temporal alignment for the whole team, the mutual information between all pairs of activity levels are calculated. This results in the activity alignment network. An example is presented on the right side of Figure 4b. As can be seen, troop member T1b had highest activity alignment with troop member T1a and lowest with the incident commander.
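The pairwise mutual-information computation can be sketched as follows. The quantization of the continuous activity levels into a fixed number of bins is our assumption, as the discretization is not specified here:

```python
import numpy as np

def mutual_information(x, y, bins=4):
    """I(X;Y) in bits between two activity-level signals in [0, 1]."""
    edges = np.linspace(0, 1, bins + 1)[1:-1]  # interior bin edges
    x = np.digitize(x, edges)
    y = np.digitize(y, edges)
    joint = np.zeros((bins, bins))
    for a, b in zip(x, y):
        joint[a, b] += 1
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    mi = 0.0
    for i in range(bins):
        for j in range(bins):
            if joint[i, j] > 0:
                mi += joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
    return mi

def alignment_network(levels):
    """levels: dict member -> activity-level signal.
    Returns the pairwise mutual information as edge weights."""
    members = sorted(levels)
    return {(a, b): mutual_information(levels[a], levels[b])
            for i, a in enumerate(members) for b in members[i + 1:]}
```

Identical signals yield the maximum attainable value (the entropy of the quantized signal), while a constant signal carries zero mutual information with any other, matching the T1a/T1b versus incident-commander example above.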
Team coordination indicators
Density of the sub-group network measures how long team members were on average in sub-groups. As the sub-group network captures the spatial distribution of team members, a high density indicates that many team members were together for a long time, whereas a low density indicates that team members were mostly on their own.
Degree centralization of the sub-group network measures how differently team members were part of a sub-group. A high degree centralization indicates a heterogeneous structure, for example one large, well-connected sub-group alongside a smaller group or individuals on their own.
Density of the activity alignment network measures how well team members aligned their activity level on average. It can thus be seen as an overall measure of how coordinated a team moved.
Degree centralization of the activity alignment network measures how differently the team members aligned their activity levels with that of others. It can thus be seen as an overall measure of how differently team members’ motions were coordinated.
Degree centralization is computed as C_D = ∑_i (d* − d_i) / [(N − 1)(N − 2)], with d* being the maximum degree observed in the network, d_i the degree of team member i, and N the number of team members.
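Density and degree centralization of a team network might be computed as in the following sketch; the function names and the straightforward generalization of Freeman’s degree centralization to valued (weighted) ties are our assumptions:

```python
import numpy as np

def density(W):
    """Mean tie strength over all off-diagonal pairs.

    W: square weight matrix of a team network, e.g. fraction of mission
    time two members shared a sub-group, or pairwise activity alignment.
    """
    W = np.asarray(W, dtype=float)
    mask = ~np.eye(W.shape[0], dtype=bool)
    return W[mask].mean()

def degree_centralization(W):
    """Freeman degree centralization: sum of (d* - d_i) over members,
    normalized by (N - 1)(N - 2), the maximum for a binary star graph."""
    W = np.asarray(W, dtype=float)
    n = W.shape[0]
    deg = W.sum(axis=1) - np.diag(W)  # weighted degree per member
    return (deg.max() - deg).sum() / ((n - 1) * (n - 2))
```

For a binary star network (one member tied to all others, no other ties) centralization is 1; for a network in which every member has the same degree it is 0.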
Evaluation of team coordination indicators
In order to evaluate our approach, we correlated the proposed coordination indicators derived from the sensor data with three validation criteria. We used perceived implicit and explicit coordination to validate the indicators. Explicit coordination includes those actions intentionally used for team coordination and is achieved by means of verbal communication. By contrast, implicit coordination refers to the anticipation of other members’ actions and the dynamic adjustment of one’s own actions accordingly, without the need for overt communication.
We additionally used time to complete the training mission as an objective validation criterion. As time is critical in firefighting, our reasoning was that well coordinated teams would be faster.
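The validation amounts to Pearson correlations between each coordination indicator and each outcome measure across the N = 10 complete missions; a minimal sketch (the helper name is our own):

```python
import numpy as np

def pearson_r(indicator, outcome):
    """Pearson correlation between one coordination indicator and one
    outcome measure (e.g. completion time), one value per mission."""
    return float(np.corrcoef(indicator, outcome)[0, 1])
```

A negative r between an indicator and completion time means the indicator was higher in faster teams, which is how the results below should be read.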
Degree centralization of the sub-group network is highly negatively correlated with completion time and positively with implicit team coordination. That is, the centralization of the sub-group network decreased with completion time (compare Figure 6a) and increased with perceived implicit coordination (compare Figure 6c). In other words, faster teams showed a higher degree of centralization in the sub-group network, meaning that team members were more heterogeneously connected, e.g. some firefighters were in well connected sub-groups for a long time, whereas others were longer on their own or part of a small sub-group.
Density of activity alignment networks is highly negatively correlated with completion time and positively with implicit coordination. That is, the density of the movement alignment network decreased with completion time (compare Figure 6f) and increased with perceived implicit coordination (compare Figure 6h). Thus, faster teams showed more alignment of their activity levels and perceived their implicit coordination as better than slower teams.
Linear correlation analysis: relationship between team coordination indicators and outcome measures team performance and experienced coordination (implicit and explicit) ( N = 10, L = 5 s)
The finding that faster teams had a sub-group network with higher degree of centralization makes sense, because the chosen scenario demanded teams to split into at least three sub-groups of different size: the troop that went inside the building, the firefighter on top of the ladder who helped with the fire hose and the remaining team members on the ground outside the building. Thus, faster teams organized their spatial structure more efficiently than slower teams.
In terms of activity alignment, the results showed that faster teams exhibited higher temporal movement coordination. This finding seems reasonable as it indicates that firefighters in faster teams worked well together and aligned their movements accordingly. Thus, faster teams moved on average more synchronously than slower teams.
The correlation analysis did not indicate any significant relationships between the proposed team coordination indicators and perceived explicit coordination. This is likely due to the fact that the proposed coordination indicators measure spatial and temporal aspects and did not include direct verbal communication which is essential for explicit coordination.
Visual analysis of team networks
The aim of the current paper was to introduce smartphone based behavior sensing and data processing as a means to automatically observe team coordination processes in realistic environments. To this end, we described the method, its theoretical background and reported the findings of a validation study. Our method consists of three steps: First, individual activity level and proximity between team members is measured with the integrated sensors of smartphones. Second, individual data streams are processed and compared with each other to derive motion alignment and sub-group networks which capture team coordination processes. Third, to derive team coordination indicators we used social network analysis to quantify temporal and spatial coordination on the team level. In a firefighting training scenario, we have validated the team coordination indicators by investigating their link to team performance and experienced team coordination.
First, we provide a method that is capable of continuously monitoring team coordination processes in a variety of settings. Thereby, we provide a new measurement tool for team research.
Second, the smartphone makes it possible to capture different types of networks that go beyond classical self-report based social networks, which focus on content such as advice, information or friendship. We introduced proximity based sub-groups and motion activity alignment as contents. Moreover, the smartphone data has a high temporal resolution (in the order of seconds) and thus captures changes in network structure over time.
Third, the smartphone-based behavior sensing approach enables easier data collection as no user input is required. In addition, the smartphone-based approach offers a higher degree of anonymity than videos, which potentially increases the willingness to participate in a study.
Finally, our approach bears the potential to open up new settings for team research. The smartphone can be used in settings where traditional behavioral observation is not feasible. For example, firefighters can be monitored during real incidents. In earlier work , we have shown that a smartphone-based data collection approach is feasible in such settings.
Our proposed team sensing approach also has potential implications for practitioners. Instructors of first responder teams can use the smartphone during training to collect objective data on team processes. These data could then serve as additional input for debriefing and thus enable data-supported training feedback, especially as the data can be illustrated using different graphs. For example, the narrative chart (compare Figure 4) can be used to get a quick overview of when sub-groups formed and disbanded over the course of a mission.
As the smartphone allows for continuous, real-time assessment of teams, it could also be used during actual missions to monitor performance and coordination. This has a large potential for error prevention. For example, using radio-based proximity data, mission commanders can detect when a team member moves out of sight of the rest of the team without relying on GPS or any installed infrastructure. Being alone may pose a threat to this person because it will be more difficult for his teammates to recognize potential dangers and to provide timely backup. As smartphones are widely used, it would not be difficult to implement such a monitoring system.
In order to detect moving sub-groups over time, we make use of radio based proximity estimation. The accuracy of nearby device detection depends on architectural constraints and consequently varies across locations. Experiments showed that the maximum detection distance varies between 1 m and 20 m. This accuracy proved to be good enough for the detection of moving sub-groups in the firefighting scenario. We therefore believe our method is also applicable to other first responder teams. Having room-level accuracy, the approach is also useful to track white-collar workers in office buildings in order to capture their co-location networks, which could be used to identify important persons in a social network. However, at a social event in which individuals stand close together, the spatial resolution is likely not sufficient to reliably detect social interaction. In such scenarios, the used technique can only give rough proximity cues.
Further, we measured temporal coordination as simultaneous change in activity level. As the activity level captures the amount of body movement, it is only a rough estimate of action coordination. For firefighting teams and most likely also for other first responder teams, the simultaneous change in body movement is clearly related to team coordination because in such teams it is important to move together to solve the task. For white-collar workers in an office building, this simultaneous change in movement has no clear meaning. In typical office work, body movement itself is not a driving process behind team performance.
We proposed a set of team coordination indicators that can be measured with the built-in sensors of smartphones. We demonstrated the validity of our approach in firefighting teams performing a realistic training scenario and investigated the link between the coordination indicators and team performance as well as experienced team coordination. Our method enables researchers to capture temporal and spatial team coordination automatically and objectively. However, to prove the generality of the approach, future studies have to be carried out in different architectural configurations and with other types of teams.
The authors would like to thank all members of the Zurich fire brigade for their participation and support throughout the experiments. We are grateful to Bert Arnrich and to Bertolt Meyer for their help in designing this study. We thank Anna-Lena Köng, Laura Fischer and Nadja Ott for their help with the data collection and for behavioral coding. This work was funded by the SNSF interdisciplinary project “Micro-level behavior and team performance” (grant agreement no.: CR12I1_137741).
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.