UMBC’s Rick Forno has brought to light some important concerns about technology users’ experiences with decision-making related to their devices. Forno, senior lecturer and assistant director of the UMBC Center for Cybersecurity, wrote a piece for the Association for Computing Machinery (ACM) Special Interest Groups Design of Communication conference in October 2019 about practices that technology companies employ to coerce users into making decisions against their interests.
These decisions often come in the form of accepting software updates or new security measures on devices without users knowing what those updates will mean for their experience, Forno explains. While installing updates might seem harmless at first, the way users are prompted to do so can fall into a category of behavior that Forno and other researchers call “dark patterns.”
Dark patterns are techniques that companies use to get users to act in a certain way, often without their informed consent or without their realizing the implications of their actions. For example, a dark pattern might encourage a user to blindly click a button that allows a company to do something the user might not otherwise approve. Dark patterns can violate the trust that users place in their technology, but they are often employed because they can be profitable.
One common example of this practice occurs when iPhone users are prompted to install software updates. An iPhone will prompt a user to enter their passcode to “install updates tonight” in a way that makes it appear the user has no choice but to install them. The fine print underneath reads “remind me later,” suggesting that the user will eventually need to install the update, so they might as well do it now. In this case, a user might be misled into believing that updates are mandatory rather than optional, and Apple makes installing the update the more likely choice.
Given Apple’s documented practice of planned obsolescence, that is, intentionally slowing down older iPhones, many users may want to avoid software updates out of concern for the longevity of their devices or fear that an update will cause apps to malfunction, says Forno. In many cases, downloading an Apple software update might run contrary to users’ best interests, at least until the update has been vetted by others using the same devices and applications.
Another example of dark patterns is Facebook’s suggestion that users consent to facial recognition. The company touts the feature’s potential benefits for protecting a user’s account security without warning about its privacy implications.
Forno believes the most practical solution is informing users about how companies may lead them to take actions contrary to their own interests. This is not an easy task, but the first step is encouraging users to read the fine print in any agreement or request for consent. Users can opt out of new updates that have not been properly vetted and refuse services that do not provide them with any value.
Forno recommends that people think hard about their relationship with technology and ask what they are truly giving away when they consent to requests from tech companies. “So it always comes down to this: think before clicking, and know what you’re getting into,” says Forno. “Just because something is new and convenient doesn’t necessarily mean that it’s going to be good for you.”
###