Journal:What is this sensor and does this app need access to it?

From LIMSWiki
Revision as of 17:27, 26 November 2019 by Shawndouglas
Full article title What is this sensor and does this app need access to it?
Journal Informatics
Author(s) Mehrnezhad, Maryam; Toreini, Ehsan
Author affiliation(s) Newcastle University
Primary contact Email: maryam dot mehrnezhad at ncl dot ac dot uk
Year published 2019
Volume and issue 6(1)
Page(s) 7
DOI 10.3390/informatics6010007
ISSN 2227-9709
Distribution license Creative Commons Attribution 4.0 International
Website https://www.mdpi.com/2227-9709/6/1/7/htm
Download https://www.mdpi.com/2227-9709/6/1/7/pdf (PDF)

Abstract

Mobile sensors have already proven helpful in many aspects of people’s everyday lives, such as fitness, gaming, and navigation. However, illegitimate access to these sensors provides a malicious program running in the background with an exploit path. While users benefit from richer and more personalized apps, the growing number of sensors introduces new security and privacy risks to end users and makes the task of sensor management more complex. In this paper, we first discuss the issues around the security and privacy of mobile sensors. We investigate the available sensors on mainstream mobile devices and study the permission policies that Android, iOS, and mobile web browsers offer for them. Second, we reflect on the results of two workshops that we organized on mobile sensor security. In these workshops, the participants were introduced to mobile sensors by working with sensor-enabled apps. We evaluated the risk levels perceived by the participants for these sensors after they understood the sensors’ functionalities. The results showed that working with sensor-enabled apps did not, by itself, immediately improve the users’ inference of the actual security risks of these sensors. However, other factors, such as prior general knowledge about these sensors and their risks, had a strong impact on the users’ perception. We also taught the participants about the ways that they could audit their apps and their permissions. Our findings showed that when mobile users were provided with reasonable choices and intuitive teaching, they could easily self-direct themselves to improve their security and privacy. Finally, we provide recommendations for educators, app developers, and mobile users to contribute toward awareness and education on this topic.

Keywords: mobile sensors, IoT sensors, sensor security, security education, app permission, mobile security awareness, user privacy, user security, sensor attacks

Introduction

According to The Economist[1], smartphones have become the fastest-selling gadgets in history, outselling personal computers (PCs) four to one. Today, about half the adult population owns a smartphone; by 2020, 80% will. Mobile and smart device vendors are increasingly augmenting their products with various types of sensors such as the Hall effect sensor, accelerometer, NFC (near-field communication) sensor, heart rate sensor, and iris scanner, which are connected to each other through the internet of things (IoT). We have observed that approximately 10 new sensors have been added to or become popular in mainstream mobile devices in less than two years, bringing the number of mobile sensors to more than 30. Examples include FaceID, Active Edge, depth cameras (using infrared), thermal cameras, air sensors, laser sensors, haptic sensors, iris scanners, heart rate sensors, and body sensors.

Sensors are added to mobile and other devices to make them smart: to sense the surrounding environment and infer aspects of the context of use, and thus to facilitate more meaningful interactions with the user. Many of these sensors are used in popular mobile apps such as fitness trackers and games. Mobile sensors have also been proposed for security purposes, e.g., authentication[2][3], authorization[4], device pairing[5], and secure contactless payment.[6] However, malicious access to sensor streams provides an installed app running in the background with an exploit path. Researchers have shown that user PINs and passwords can be disclosed through sensors such as the camera and microphone[7], the ambient light sensor[8], and the gyroscope.[9] Sensors such as NFC can also be misused to attack financial payments.[10]

In our previous research[11][12][13][14], we have shown that the sensor management problem is spreading from apps to browsers. We proposed and implemented the first JavaScript-based side-channel attack revealing a wide range of sensitive information about users, such as phone call timing, physical activities (sitting, walking, running, etc.), touch actions (click, hold, scroll, and zoom), and PINs on mobile phones. In this attack, the JavaScript code embedded in the attack web page listens to the motion and orientation sensor streams without needing any permission from the user. By analyzing these streams via machine learning algorithms, the attack infers the user’s touch actions and PINs with an accuracy of over 70% on the first try. This research attracted considerable international media coverage, including by the Guardian[15] and the BBC[16], which underscores the importance of the topic. We disclosed the identified vulnerability to industry. While working with the World Wide Web Consortium (W3C) and browser vendors (Google Chromium, Mozilla Firefox, Apple, etc.) to fix the problem, we came to appreciate the complexity of the sensor management problem in practice and the challenge of balancing security, usability, and functionality.
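The pipeline behind such an attack — buffer sensor readings, reduce each window to a feature vector, and classify it — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the feature set and the nearest-centroid classifier are deliberate simplifications of the machine learning models used in the actual work.

```javascript
// Illustrative sketch of a sensor side-channel pipeline: each window of
// accelerometer magnitudes is summarized as a feature vector, then matched
// against per-action profiles (centroids). All names here are illustrative.

// Feature extraction: summarize one window of magnitude samples.
function features(window) {
  const mean = window.reduce((a, b) => a + b, 0) / window.length;
  const variance =
    window.reduce((a, v) => a + (v - mean) ** 2, 0) / window.length;
  return [mean, variance, Math.max(...window)];
}

// Nearest-centroid classifier: a stand-in for the trained models
// (e.g., neural networks) used in the real attack.
function classify(featureVector, centroids) {
  let best = null;
  let bestDist = Infinity;
  for (const [label, c] of Object.entries(centroids)) {
    const d = c.reduce((a, v, i) => a + (v - featureVector[i]) ** 2, 0);
    if (d < bestDist) { bestDist = d; best = label; }
  }
  return best;
}
```

Intuitively, a tap produces a short, low-variance window while a scroll produces a sustained, higher-variance one; the classifier maps each window's features to the nearest known profile.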

Through a series of user studies over the years[13][14], we concluded that mobile users are not generally familiar with most sensors. In addition, we observed that there is a significant disparity between the actual and perceived risk levels of sensors. In another work[17], Crager et al. reached the same conclusion for motion sensors. We discussed how this observation, along with other factors, renders many academic and industry solutions ineffective at managing mobile sensors.[14] Given that sensors are going beyond mobile devices, e.g., in a variety of IoT devices in smart homes and cities, the sensor security problem has already attracted more attention not only from researchers, but also from hackers. In view of all this, we believe that there is much room for more focus on people’s awareness and education about the privacy and security issues of sensor technology.

Previous research[14][17] has focused on individual user studies to examine the human aspects of sensor security. In this paper, we present the results of a more advanced teaching method—working with sensor-enabled apps—on the risk level that users associate with the PIN discovery scenario for all sensors. We reflect on the results of two interactive workshops that we organized on mobile sensor security. These workshops covered the following: an introduction to mobile sensors and their applications, working with sensor-enabled mobile apps, an introduction to the security and privacy issues of mobile sensors, and an overview of how to manage app permissions on different mobile platforms.

In these workshops, the participants sat in groups and were introduced to mobile sensors by working with sensor-enabled apps. Throughout the workshops, we asked the participants to fill in a few forms in order to evaluate the general knowledge they had about mobile sensors, as well as their perceived risk levels for these sensors after they understood their functionalities. After analyzing these self-declared forms, we also measured the correlation between the knowledge of and perceived risk level for mobile sensors. The results showed that working with sensor-enabled apps did not, by itself, immediately improve the users’ inference of the actual security risks of these sensors. However, other factors, such as prior general knowledge about these sensors and their risks, had a strong impact on the users’ perception. We also taught the participants about the ways that they could audit their apps and their permissions, including the per-app and per-permission models. Our participants found both models useful in different ways. Our findings show that when mobile users are provided with reasonable choices and intuitive teaching, they can easily self-direct themselves to improve their security and privacy.

In the next section, we list the available sensors on mobile devices and categorize them, and then we present the current permission policies for these sensors on Android, iOS, and mobile web browsers. In the subsequent section, we present the structure of these workshops in full detail. Afterwards, we include our analysis of the general knowledge and perceived risk levels that our participants had for sensors, and the correlation between them, followed by our observations of the app and permission review activities in the workshops. We then present a list of our recommendations to different stakeholders. The final two sections cover limitations, future work, and the conclusion.

Mobile sensors

As stated, there are more than 30 sensors on mobile devices. Both iOS and Android, as well as mobile web browsers, allow native apps and JavaScript code in web pages to access most of these sensors. Developers can access mobile sensors by (1) writing native code using mobile OS APIs[18][19], (2) recompiling HTML5 code into a native app[20], or (3) using standard APIs provided by the W3C[21], which are accessible through JavaScript code within a mobile browser.
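As an illustration of route (3), the following sketch shows how a web page's JavaScript might read motion data through the W3C devicemotion event. The event shape (including the accelerationIncludingGravity field) is from the W3C DeviceOrientation Event specification; the helper function and the `samples` buffer are illustrative, not part of any spec.

```javascript
// Sketch of route (3): reading motion data from JavaScript.
// At the time the paper's attacks were demonstrated, no permission
// prompt was required; newer browsers have since added restrictions.

// Pure helper: pull the acceleration vector out of a motion event.
// Works on any object shaped like a DeviceMotionEvent.
function readAcceleration(event) {
  const a = event.accelerationIncludingGravity || { x: 0, y: 0, z: 0 };
  return { x: a.x, y: a.y, z: a.z };
}

// In a browser, a page (or an embedded third-party script) would do:
// const samples = [];
// window.addEventListener('devicemotion',
//   e => samples.push(readAcceleration(e)));
```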

As shown by Taylor and Martinovic[22], the average number of permissions used by Android apps increases over time, in particular for popular apps and free apps. These permissions are requested for having access to the operating system (OS) resources such as contacts and files, as well as sensors such as the GPS and microphone. This has the potential to make apps over-privileged and unnecessarily increase the attack surface.

Mobile sensors' categorization

We first created a list of available sensors on various mobile devices. We prepared this list by inspecting the official websites of mainstream mobile devices such as the iPhone X, Samsung Galaxy S9, and Google Pixel 2, as well as the specifications that W3C[23], Android[18], and Apple[19] provide for developers. We proposed categorizing these sensors into four main groups: identity-related (biometric) sensors, communicational sensors, motion sensors, and ambient (environmental) sensors, as presented in Table 1. Note that this list could be even longer if all mobile brands were included. For example, the Cat S61 smartphone has sensors such as a thermal camera, an air sensor (which measures the quality of the environmental air), and a laser sensor (to measure distance).

Table 1. Categorization of current mobile sensors
Category Sensors
Ambient (environmental) Temperature (ambient, device), Humidity, Pressure (barometer), Light, Proximity, Gravity, Magnetic field, Hall effect sensor
Communication WiFi, Bluetooth, NFC
Identity-related (biometric) GPS, Camera, Microphone, Fingerprint (TouchID), FaceID, Iris scan, Heart rate (HR), Touch screen, Active Edge, Haptic sensor, Body sensors
Motion Gyroscope, Accelerometer, Rotation, Orientation, Motion, Sensor hub

In Appendix A (see the supplemental material at the end), we present a brief description of each sensor. With the growing number of sensors on mobile devices, categorizing them into a few groups is much more difficult than before. Some of these sensors can belong to multiple groups. For example, one might argue that GPS belongs to the environmental category; however, since it is associated with people’s identities, we propose to keep it in the identity-related category. Similarly, the sensor hub monitors the device’s movements, which is associated with the user’s activities. Hence, it is difficult to decide to which category (motion or biometric) it belongs.
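One way to handle this ambiguity in software is to tag each sensor with a set of categories rather than forcing a single label, so GPS can carry both identity-related and environmental tags and the sensor hub can carry both motion and identity-related ones. A minimal sketch, where the sensor names and tag assignments are illustrative:

```javascript
// Sketch: model sensors with tag sets so one sensor can belong to
// several of the groups in Table 1. Tag assignments are illustrative.
const sensorTags = {
  gps: new Set(['identity', 'environmental']),
  sensorHub: new Set(['motion', 'identity']),
  gyroscope: new Set(['motion']),
  nfc: new Set(['communication']),
};

// List every sensor carrying a given tag.
function sensorsIn(category) {
  return Object.keys(sensorTags).filter(s => sensorTags[s].has(category));
}
```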

Sensor management challenges

In Table 2, we present how the Android, iOS, and W3C specs (followed by mobile browsers) treat different sensors in terms of access. We used the Android and Apple developer websites, the W3C specifications, and caniuse.com to build this table.[18][19][23][24] As can be seen, permission policies for having access to different sensors vary across sensors and platforms. We argue that sensing is still unmanaged on existing smartphone platforms. The in-app access to certain sensors including GPS, camera, and microphone requires user permission when installing and running the app. However, as Simon and Anderson have discussed[7], an attacker can easily trick a user into granting permission through social engineering (e.g., presenting it as a free game app). Once the app is installed and the permission approved, usage of the sensor data is not restricted. On the other hand, access to many other sensors such as accelerometers, gyroscopes, and light sensors is unrestricted; any app can have free access to the sensor data without needing any user permission, as these sensors are left unmanaged on mobile operating systems.

Table 2. Current permission policies of sensors on different platforms. ✓: permission required, ✗: permission not required, NA: not supported, and Locked: not open to developers. * NFC should be turned on manually for any program to be able to use it.
Sensor Android iOS W3C/Web Browsers
Proximity
Ambient light
Ambient pressure/barometer NA
Ambient humidity NA NA
Ambient temperature NA NA
Device temperature NA NA
Gravity
Magnetic field
Hall effect sensor NA NA
Wifi
Bluetooth
NFC ✗* ✗*
GPS
Camera
Microphone
Fingerprint/TouchID NA
Touch screen
FaceID NA
Iris scan NA
Heart rate NA
Body sensors NA
Active Edge Locked NA NA
Haptic sensor
Accelerometer
Rotation
Gyroscope
Motion
Orientation
Sensor hub Locked Locked NA

Although the information leakage caused by sensors has been known for years[7][8][9], the problem has remained unsolved in practice. One main reason is the complexity of the problem: keeping the balance between security and usability. Another reason, from the practical perspective, is that all the reported attacks depend on one condition: the user must initiate the downloading and installing of the app. Therefore, users are relied upon to be vigilant and not to install untrusted apps. Furthermore, it is expected that app stores such as the Apple App Store and Google Play will screen the apps and impose severe penalties if an app is found to contain malicious content. However, with the browser-based attack[11][12][13][14], we have demonstrated that these measures are ineffective. Apart from academic efforts, there are industrial solutions (e.g., Navenio) that use some of these sensors, such as the accelerometer, to track users precisely indoors and outdoors. These products can easily be integrated with illegitimate apps and websites and breach users’ privacy and security.

With the growing number of sensors, and more sensitive sensor hardware provisioned with new mobile devices and other IoT devices, the problem of information leakage caused by sensors is becoming more severe. Previous research[14][17] suggests that users are not aware of (i) the data generated by the sensors, (ii) how that data might be used to undermine their security and privacy, and (iii) what precautionary measures they could and should take. Given that, we believe that raising public awareness of sensor technology through education is a timely matter.

Workshop

We ran two rounds of a 90-minute workshop entitled What Your Sensors Say About You, which was hosted by the Thinking Digital conference in November 2016[25] and May 2018 at Newcastle University, U.K. The attendees could find the following description of the workshop on the event page:

“Mobile sensors are everywhere. They’re in our smartphones, our tablets and our wearables. They help our devices to detect movement, sense changes in pressure, and notice when other devices are nearby. The data they provide help us to enjoy richer and more personalized apps. But what are the risks to our phones, and the information that lies within them? Discover how these sensors may introduce new security risks to phone users, and make it more complicated to manage them.”

Pedagogical approach

For these workshops, our teaching approach, incorporating taught and research dissemination activities, embodied the principles of constructive alignment and constructivist learning theory. In particular, we deliberately introduced a number of periods of reflection throughout the workshop. Attendees were supported in considering various preventative measures in relation to permission-granting in sensor-related apps and in extrapolating their future impacts.

A widely adopted theory in the public understanding of scientific research is that of the “deficit model.”[26] The deficit model acknowledges that a lack of available information leads to a lack of popular understanding, which in turn fosters scepticism and hostility. Through our public engagement exercise, and by making available our resources, we seek to equip the public with accessible information, which may inform reasonable precautionary behavior.

We adopt a challenging role, both as researchers active in mobile sensor security and mediators seeking to popularize research findings. This leads to tension between providing layman and specialist explanations, a perennial issue in science communication.[27] As such, we acknowledge the role popularization of science plays in informing future iterations of research.[28][29] Indeed, our observations of participants’ interactions serve to inform future technological interventions to support mobile sensor security.

Participants

In both rounds of the workshop, participation was voluntary, with conference attendees selecting among multiple parallel workshops. We presented the workshop to the audience in both rounds. In the first run in 2016, 27 female and three male participants, aged between 22 and 51, attended the workshop. In the second run, two female and 18 male participants, aged between 21 and 58, attended. This brought the total number of participants to 50 (29 female). In both rounds, the workshop attendees sat at tables of five or six and could interact with each other and the educators during the workshop. The attendees had owned iOS and Android phones for anywhere from one year up to 15 years. Full details of the participants’ demography are presented in Appendix B (see the supplementary material at the end).

Workshop content

We ran the workshops by presenting a PowerPoint file, which is publicly available via the first author’s homepage. These slides contain all the general and technical content delivered to the attendees and the individual/group exercises they were asked to complete. We explicitly explained to the participants whether they needed to complete an activity individually or in a group. We also observed them during the workshop to make sure everyone was following the instructions. We explained to the attendees that their feedback during the workshop, given by completing a few forms, would be used for a research project. The attendees could leave the workshop at any stage without giving any explanation. In both rounds of the workshop, all participants completed the session to the end.

These workshops were organised into three parts, as shown in Figure 1. In Part 1, we went through the current mobile sensors by (a) providing the participants with a description of sensors and (b) working with sensor-enabled apps. In Part 2, we explained the sensor-based attacks that have been performed on sensitive user information such as PINs. Finally, in Part 3, we discussed mobile app permission settings.


Fig1 Mehrnezhad Informatics2019 6-1.png

Figure 1. The workshop structure

Sensor knowledge

After a brief introduction to the workshop, we first asked the participants to fill in a five-point self-rated familiarity questionnaire covering the sensors listed in Table 1 (see Appendix C—borrowed from Mehrnezhad et al.[14]—in the supplementary material at the end). In the first round of the workshop in 2016, this form had 25 sensors, which we had been consistently using in our previous research.[14] However, in the second round in 2018, we added six new sensors (FaceID, iris scan, heart rate, body sensors, Active Edge, and haptic sensors), as popular mobile devices had since been augmented with them.

In this form, we asked the users to express the level of the general knowledge they had of each sensor by choosing one of the following: “I’ve never heard of this”; “I’ve heard of this, but I don’t know what this is”; “I know what this is, but I don’t know how this works”; “I know generally how this works”; and “I know very well how this works.” This was an individual exercise, and the list of sensors was randomly ordered for each user to minimize bias.

For the second activity, after completing the knowledge form, we asked the participants to go through the description of each sensor (see Appendix A) on a printed paper given to everyone. This was a group activity, and the participants could help each other for a better understanding. In case of any difficulty, the attendees were encouraged to interact with the educators. After everyone went through the description page, we gave them examples of the usage of each sensor, e.g., motion sensors for gaming, NFC for contactless payment, and haptic sensors for virtual reality applications.

For the third activity, we then asked the participants to visit the app stores on their devices and download and install a particular sensor-enabled app (sensor app). Sensor apps are those that visually allow the users to choose different sensors on the screen and see their functionality. For Android users, we recommended the participants install Sensor Box for Android[30], as shown in Figure 2, left. This app detects most of the available sensors on the device and visually shows the user how they work. This app supports the accelerometer, gyroscope, orientation, gravity, light, temperature, proximity, pressure, and sound sensors. For iPhone users, we recommended the Sensor Kinetics app[31], as shown in Figure 2, right. This app mainly supports motion sensors (gyroscope, magnetometer, linear accelerometer, gravity, attitude).


Fig2 Mehrnezhad Informatics2019 6-1.png

Figure 2. Android (left) and iOS (right) sensor apps used in the workshop

Both apps were chosen based on popularity, number of installs, rating, and the features they offered. We also had a few extra Android phones with the sensor app installed on them. These phones were offered to participants who were unable to install the app and use their own phones. Since the features offered by the Android sensor app were richer, we made sure that each table had at least one Android phone. This was a group activity, and the attendees could help each other find the app on the store and install it. We observed that all users were able to install the app, except in two cases in Round 1 and one case in Round 2, where participants had connection or storage problems. There was another case in Round 2 where the participant did not wish to install the app on his phone due to security and privacy concerns. We lent the Android phones to these users.

For the next two activities, we invited the participants to work with the installed apps on their devices. We asked everyone to go through each sensor and find out about its functionality by using the app. Meanwhile, the participants were advised to keep the sensor description handout to refer to if necessary. This was a group activity, and the participants could exchange ideas about the app and sensors, as well as help each other understand the sensors better. During this activity, we worked with individuals either separately or in small groups of two or three and reviewed at least two sensors in the app, including one motion sensor, using the Android app. Through this pair-working activity, we made sure all participants had the chance to observe a few different sensors on the Android device, since it offered more features in comparison to the iOS app. At the end of that activity, we asked the participants to review the sensor description page again, ensuring nobody expressed difficulties in understanding the general functionalities of mobile sensors.

Finally, we wanted to assess the effect of teaching about sensors to mobile users—via working with mobile sensor apps—on the perceived risk level for each sensor. Similar to our previous research[14], we described a specific scenario:

“Now that you have more knowledge about the sensors, let us describe a scenario here. Imagine that you own a smartphone which is equipped with all these sensors. You have opened a game app which can have access to all mobile sensors. You leave the game app open in the background, and open your banking app which requires you to enter your PIN. Do you think any of these sensors can help the game app to discover your entered PIN? To what extent are you concerned about each sensor’s risk to your PIN? Please rate them in the table. In this part, please make sure that you know the functionality of all the sensors. If you are unsure, please have another look at the descriptions, or ask us about them.”

Then, we asked each participant to fill in a questionnaire (see Appendix C, supplementary material), which included five different levels of concern: “Not concerned,” “A little concerned,” “Moderately concerned,” “Concerned,” and “Extremely concerned.” At the end of this individual activity, we asked the participants to complete a demography form. This form included age, gender, profession, first language, mobile device brand, and the duration of owning a smartphone (see Appendix C, supplementary material). We explained to the participants that these forms would be used anonymously for research purposes, and that they could refuse to fill them out (partially or completely).

Sensor attacks

After a short break, we presented a few sensor attacks. In particular, we explained the attacks that we have performed on sensitive user information by using motion and orientation sensors via either installed apps or JavaScript code.[11][12][13][14] These attacks could reveal phone call timing, physical activities (e.g., sitting, walking, running), touch actions (e.g., click, hold, scroll, zoom), and PINs. (For the exact content presented in this part, please see the PowerPoint file.)

App permissions

After another short break, we explained the problem of over-privileged apps to the participants. We showed examples of such apps, e.g., Calorie Counter-MyFitnessPal, Zara, and Sensor Box for Android (the one that we used in this workshop). These apps ask for extra permissions; Sensor Box, for example, does not need access to WiFi and phone information to function.

Then we engaged in our seventh activity. This group activity invited the participants to go to the system settings of their mobile phones (or the borrowed ones) and check the permissions of the sensor app that they had installed during the workshop. We also explained that on both Android and iOS devices, it is possible to disable and enable permissions via the system settings (the option of limiting access while using the app was discussed with iPhone users).

At this stage, we began our eighth activity, asking the participants to go through the pre-installed apps on their own devices and choose three apps to review their permissions. We asked them to individually complete a form by naming the app, explaining the purpose of the app, listing the (extra) permissions, and expressing whether they would keep the app or uninstall it and why. This form is provided in Appendix D (see supplementary information at the end).

Note that when we ran the workshop in 2016, most Android users had not yet updated to Android 6 (Marshmallow) and had only one way of accessing permissions, which was through each app’s settings (Figure 3, left). From Android Marshmallow onward, another permission review model was offered; the user could go to the settings app and see which apps can access a certain permission (Figure 3, middle and right). We noticed that in our second workshop in 2018, the participants used both models (explained further in the results section).


Fig3 Mehrnezhad Informatics2019 6-1.png

Figure 3. Android permission models; left: per app, middle and right: per permission

At the end of this workshop, we invited the attendees to discuss their opinions on mobile sensor security with their peers and the educators, giving the attendees an opportunity to gain a few tips to improve their mobile security (explained further in the discussion section).

Results

In this section, we present the results of our analysis of different stages of the two rounds of the workshop, including the general knowledge level about sensors and their perceived risk level, as well as the correlation between them.

General knowledge

Recall that our participants completed the general knowledge form at the beginning of the workshop, before being presented with any information. We present this knowledge level in a stacked bar chart on the left side of Figure 4 for the two rounds. The top bars represent the participants of the first round of the workshop in 2016, and the bottom bars are for the second round in 2018. We categorized these sensors into four groups, as seen in Table 1. In each category, sensors are ordered by the aggregate percentage of participants in the first round of the workshop who declared they knew generally or very well how each sensor works. This aggregate percentage is shown on the right side: the first number for Round 1, the second for Round 2. In the case of a tie, the sensor with the bigger share of participants who knew it very well is shown first. Note that the bars for some of these sensors (FaceID, heart rate, iris scan, body sensors, haptic sensors, and Active Edge) appear alone, since they were studied only in our second workshop. We draw the following observations from Figure 4, left.


Fig4 Mehrnezhad Informatics2019 6-1.png

Figure 4. (Left) Self-declared knowledge of sensors; (right) self-declared perceived risk of sensors. Top bars and left percentages are for Workshop 1 (2016); bottom bars and right percentages are for Workshop 2 (2018).


References

  1. "Planet of the Phones". The Economist. The Economist Newspaper Limited. 26 February 2015. https://www.economist.com/leaders/2015/02/26/planet-of-the-phones. Retrieved 30 November 2018. 
  2. De Luca, A.; Hang, A.; Brudy, F. et al. (2012). "Touch me once and I know it's you!: Implicit authentication based on touch screen patterns". Proceedings of the 2012 SIGCHI Conference on Human Factors in Computing Systems: 987–96. doi:10.1145/2207676.2208544. 
  3. Bo, C.; Zhang, L.; Li, X.-Y. et al. (2013). "SilentSense: Silent user identification via touch and movement behavioral biometrics". Proceedings of the 19th Annual International Conference on Mobile Computing & Networking: 187–90. doi:10.1145/2500423.2504572. 
  4. Li, H.; Ma, D.; Saxena, N. et al. (2013). "Tap-Wave-Rub: Lightweight malware prevention for smartphones using intuitive human gestures". Proceedings of the Sixth ACM Conference on Security and Privacy in Wireless and Mobile Networks: 25–30. doi:10.1145/2462096.2462101. 
  5. Mayrhofer, R.; Gellersen, H. (2007). "Shake Well Before Use: Authentication Based on Accelerometer Data". In LaMarca, A.; Langheinrich, M.; Truong, K.N.. Pervasive Computing - Pervasive 2007. pp. 144–61. doi:10.1007/978-3-540-72037-9_9. ISBN 9783540720379. 
  6. Mehrnezhad, M.; Hao, F.; Shahandashti, S.F. (2015). "Tap-Tap and Pay (TTP): Preventing the Mafia Attack in NFC Payment". In Chen, L.; Matsuo, S.. Security Standardisation Research - SSR 2015. pp. 21–39. doi:10.1007/978-3-319-27152-1_2. ISBN 9783319271521. 
  7. Simon, L.; Anderson, R. (2013). "PIN skimmer: Inferring PINs through the camera and microphone". Proceedings of the Third ACM workshop on Security and Privacy in Smartphones & Mobile Devices: 67–78. doi:10.1145/2516760.2516770. 
  8. Spreitzer, R. (2014). "PIN Skimming: Exploiting the Ambient-Light Sensor in Mobile Devices". Proceedings of the 4th ACM Workshop on Security and Privacy in Smartphones & Mobile Devices: 51–62. doi:10.1145/2666620.2666622. 
  9. Xu, Z.; Bai, K.; Zhu, S. (2012). "TapLogger: Inferring user inputs on smartphone touchscreens using on-board motion sensors". Proceedings of the Fifth ACM Conference on Security and Privacy in Wireless and Mobile Networks: 113–24. doi:10.1145/2185448.2185465. 
  10. Mehrnezhad, M.; Ali, M.A.; Hao, F. et al. (2016). "NFC Payment Spy: A Privacy Attack on Contactless Payments". In Chen, L.; Matsuo, S.. Security Standardisation Research - SSR 2016. pp. Article 4. doi:10.1007/978-3-319-49100-4_4. ISBN 9783319491004. 
  11. Mehrnezhad, M.; Toreini, E.; Shahandashti, S.F. et al. (2016). "TouchSignatures: Identification of user touch actions and PINs based on mobile sensor data via JavaScript". Journal of Information Security and Applications 26: 23–38. doi:10.1016/j.jisa.2015.11.007. 
  12. Mehrnezhad, M.; Toreini, E.; Shahandashti, S.F. et al. (2015). "TouchSignatures: Identification of user touch actions based on mobile sensors data via JavaScript". Proceedings of the 10th ACM Symposium on Information, Computer and Communications Security: 673. doi:10.1145/2714576.2714650. 
  13. Mehrnezhad, M.; Toreini, E.; Shahandashti, S.F. et al. (2016). "Stealing PINs via Mobile Sensors: Actual Risk versus User Perception". Proceedings of the 1st European Workshop on Usable Security, EuroUSEC 2016: 1–14. doi:10.14722/eurousec.2016.23008. https://www.ndss-symposium.org/ndss2016/eurousec-2016-workshop/#session1. 
  14. Mehrnezhad, M.; Toreini, E.; Shahandashti, S.F. et al. (2018). "Stealing PINs via mobile sensors: actual risk versus user perception". International Journal of Information Security 17 (3): 291–313. doi:10.1007/s10207-017-0369-x. 
  15. Hern, A. (11 April 2017). "Tilted device could pinpoint pin number for hackers, study claims". The Guardian. https://www.theguardian.com/technology/2017/apr/11/tilted-device-could-pinpoint-pin-number-for-hackers-study-claims. Retrieved 30 November 2018. 
  16. "The way people tilt their smartphone 'can give away passwords and pins'". BBC Newsbeat. BBC. 11 April 2017. http://www.bbc.co.uk/newsbeat/article/39565372/the-way-people-tilt-their-smartphone-can-give-away-passwords-and-pins. Retrieved 30 November 2018. 
  17. Crager, K.; Maiti, A.; Jadliwala, M. et al. (2017). "Information Leakage through Mobile Motion Sensors: User Awareness and Concerns". Proceedings of the 2nd European Workshop on Usable Security, EuroUSEC 2017: 1–15. doi:10.14722/eurousec.2017.23013. https://www.ndss-symposium.org/eurousec-2017-workshop/. 
  18. "Sensors". Android Developer Guides. Google. 2018. https://developer.android.com/guide/topics/sensors/index.html. Retrieved 30 November 2018. 
  19. "Core Motion". Apple Developer Documentation. Apple. 2018. https://developer.apple.com/documentation/coremotion. Retrieved 30 November 2018. 
  20. Jin, X.; Hu, X.; Ying, K. et al. (2014). "Code Injection Attacks on HTML5-based Mobile Apps: Characterization, Detection and Mitigation". Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security: 66–77. doi:10.1145/2660267.2660275. 
  21. "All Standards and Drafts Tagged as Web API". W3C. https://www.w3.org/TR/?tag=webapi. 
  22. Taylor, V.F.; Martinovic, I. (6 June 2016). "A Longitudinal Study of App Permission Usage Across the Google Play Store". arXiv. https://arxiv.org/abs/1606.01708v1. Retrieved 30 November 2018. 
  23. "Device and Sensors Working Group". W3C. Archived from the original on 01 January 2018. https://web.archive.org/web/20180101221340/https://www.w3.org/2009/dap/. 
  24. "Sensors Overview". Android Developer Guides. Google. 2018. https://developer.android.com/guide/topics/sensors/sensors_overview.html. Retrieved 30 November 2018. 
  25. "Thinking Digital Women 2016". Mediaworks. Mediaworks UK Limited. 2016. https://www.mediaworks.co.uk/insights/news/mediaworks-blog-post-thinking-digital-women-2016/. 
  26. Wynne, B. (1992). "Misunderstood misunderstanding: Social identities and public uptake of science". Public Understanding of Science 1 (3): 281–304. doi:10.1088/0963-6625/1/3/004. 
  27. Sismondo, S. (2010). An Introduction to Science and Technology Studies (2nd ed.). Wiley-Blackwell. pp. 254. ISBN 9781405187657. 
  28. Hilgartner, S. (1990). "The Dominant View of Popularization: Conceptual Problems, Political Uses". Social Studies of Science 20 (3): 519–39. doi:10.1177/030631290020003006. 
  29. Bucchi, M. (2014). Science and the Media: Alternative Routes in Scientific Communication. Routledge. pp. 208. ISBN 9780415510516. 
  30. Nova System Limited (2018). "Sensor Box for Android". Google Play. https://play.google.com/store/apps/details?id=imoblife.androidsensorbox. 
  31. Innoventions, Inc (2018). "Sensor Kinetics". App Store. https://apps.apple.com/us/app/sensor-kinetics/id579040333. 

Notes

This presentation is faithful to the original, with only a few minor changes to presentation. Grammar was cleaned up for smoother reading. In some cases important information was missing from the references, and that information was added. A few of the inline URLs were turned into citations for this version. The inline URL to Altmetric and the May 2018 workshop from the original article were removed for this version because they were dead, unarchived URLs. The W3C Device and Sensors Working Group URL also changed; an archived version of the site was used for the citation in this version.