Smartphones have become vital to many children's lives due to increased access and the rise of digital learning, accelerated by the COVID-19 pandemic. Recent research suggests children spend as much as 6-9 hours daily on smartphones. The ongoing expansion of mobile apps for games and online education for children requires examining the consequences of any ethical deviations. Children may not be fully aware of the implications and risks of their actions when using smartphones or apps, specifically ed-tech apps.
Specifically, adolescents are still developing their normative decision-making capacities and may be susceptible to influence by deceptive design patterns. Pew Research found that 54% of teens in the United States (US) say they spend too much time on their cell phones. Adolescents are also more susceptible to anxiety, depression, and behavioral disorders. Further, adolescents have no dedicated app category: apps on marketplaces are classified as "children's apps" only for users aged under 12, and some apps allow users to log in via social media accounts.
On one side, the ethical issues in apps used by children carry physical, economic, emotional, and psychological implications for children; on the other, these issues result from choices made by app developers and platforms to fit the business models of many such popular apps. For instance, an education app may require both general and precise access to location information even though its features do not need it, and then use this location for personalized ads. Recent research by the author examined ethical issues in thirty-nine apps across five app categories: education, communication, social media, games, and dating.
The ethical issues fall under three key categories: privacy (privacy policy, consent, data collection, data shared with third parties, and use of third-party trackers), age-appropriateness (age limits stated on app marketplaces, design, content filtering, and age verification by the apps), and user interface (deceptive design patterns).
Consent is only occasionally sought for specific requirements. For instance, access to location can be coarse (approximate) or precise. However, the permission prompts often do not differentiate: the user consents to sharing location data without knowing whether it is coarse or precise, as the sketch below illustrates.
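As a minimal sketch of how an app can keep that distinction visible, the following uses the standard AndroidX activity-result APIs; the activity and callback names are illustrative, not taken from any of the apps studied. On Android 12+, requesting both permissions together lets the user choose to grant approximate location only, whereas requesting only fine location hides that choice.

```kotlin
import android.Manifest
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts

// Illustrative activity; only the permission APIs below are real platform calls.
class LocationConsentActivity : ComponentActivity() {

    private val locationPermissionRequest = registerForActivityResult(
        ActivityResultContracts.RequestMultiplePermissions()
    ) { grants ->
        when {
            grants[Manifest.permission.ACCESS_FINE_LOCATION] == true ->
                onPreciseLocationGranted()      // user explicitly chose precise
            grants[Manifest.permission.ACCESS_COARSE_LOCATION] == true ->
                onApproximateLocationGranted()  // coarse suffices for most features
            else ->
                onLocationDenied()              // degrade gracefully, no nagging
        }
    }

    fun requestLocation() {
        // Requesting both keeps the "approximate only" option visible to the user.
        locationPermissionRequest.launch(
            arrayOf(
                Manifest.permission.ACCESS_FINE_LOCATION,
                Manifest.permission.ACCESS_COARSE_LOCATION
            )
        )
    }

    private fun onPreciseLocationGranted() { /* enable map-accurate features */ }
    private fun onApproximateLocationGranted() { /* city-level features only */ }
    private fun onLocationDenied() { /* continue without location */ }
}
```

Requesting only ACCESS_FINE_LOCATION would suppress the "approximate only" choice, which is precisely the undifferentiated consent described above.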
In addition, some of the apps examined collect more data than their features require. For instance, certain apps ask for contact access without any feature that uses contacts (a sketch of feature-gated permission requests follows below). Further, third-party trackers in the apps collected information including the user's profile, location, and behavioral signals related to app or device use. These trackers do not always have explicit consent for the data being tracked or collected.
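In the same spirit, here is a hedged data-minimization sketch: contacts access is requested only at the point of use, by the single feature that needs it, rather than at launch. The class and feature names are hypothetical; the permission APIs are the standard AndroidX ones.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts
import androidx.core.content.ContextCompat

// Hypothetical activity: contacts are touched only by the invite feature.
class InviteFriendActivity : ComponentActivity() {

    private val contactsRequest = registerForActivityResult(
        ActivityResultContracts.RequestPermission()
    ) { granted ->
        if (granted) openContactPicker() else showManualInvite()
    }

    // Called from the "Invite a friend" button, the only contacts-using feature.
    fun onInviteClicked() {
        val alreadyGranted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.READ_CONTACTS
        ) == PackageManager.PERMISSION_GRANTED
        if (alreadyGranted) openContactPicker()
        else contactsRequest.launch(Manifest.permission.READ_CONTACTS)
    }

    private fun openContactPicker() { /* read only what the invite needs */ }
    private fun showManualInvite() { /* fall back to typing an address */ }
}
```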
In some cases, the minimum age stated on the marketplace was lower than the minimum age stated in the app's privacy policy, creating confusion about the age for which the app is suitable. Further, some apps with peer-to-peer chat functionality had no content filters, allowing abusive or inappropriate language in these communications.
In all these cases, most apps examined determined the user's age from self-declaration, which can easily be falsified; no validation or age-verification mechanism exists in these apps, except for a few that could block accounts based on search history or violations of their community guidelines.
Deceptive design patterns exploit the user's cognitive biases to persuade them to do things they would not otherwise do, using design tactics and persistent attention-seeking behavior. Several deceptive designs appeared in the apps reviewed. Some of the prominent ones include (a) gaining consent with too little or too much information on privacy, (b) seeking phone numbers without need, (c) prompting users continuously to share more personal data, (d) emotionally steering users to subscribe through visuals or wording (e.g., "last few days left"), (e) continuously syncing contacts in the background while the user understands it to be a one-time exercise, (f) double negatives that influence actions (e.g., "uncheck here to not personalize ads"), (g) data-invasive notification settings enabled by default, and (h) making it difficult to delete data or the account. A sketch of defaults that avoid some of these patterns follows this list.
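As an illustration only, the settings model below avoids patterns (e), (f), and (g): toggles are worded affirmatively, privacy-invasive options default to off, and contact sync is scoped explicitly rather than silently continuous. All names are hypothetical.

```kotlin
// Hypothetical settings model with privacy-respecting defaults.
data class PrivacySettings(
    // Affirmative wording ("Personalize ads"): checked means yes; no double negative.
    val personalizedAds: Boolean = false,          // off until the user opts in
    val marketingNotifications: Boolean = false,   // not enabled by default
    val contactSyncMode: ContactSyncMode = ContactSyncMode.ONE_TIME
)

// Making sync scope an explicit value prevents a silent background job from
// being presented to the user as a one-time action.
enum class ContactSyncMode { NEVER, ONE_TIME, CONTINUOUS }
```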
In March 2022, the European Data Protection Board (EDPB) issued detailed guidance and practical recommendations for designers and developers to avoid deceptive design patterns on social media platforms. The guidance suggested best practices for preventing deceptive design patterns, including easy-to-understand design, ease of navigation, adequate information and appropriate disclosure, coherent wording, use of examples, the ability to spot changes and their consequences, access to the supervisory authority, and cross-device consistency. GDPR also expects privacy notices to be presented in an age-appropriate fashion.
The core underlying issue is that the children's ed-tech app industry is irresponsibly focused on business prospects: it tends not to adequately address privacy, age-appropriateness, and user-interface design that gives the user autonomy over decisions. Further, a lack of sufficient marketplace governance practices (including auditing before apps are placed on marketplaces) adds to the complexity.
While regulatory requirements are setting clear expectations on privacy and children's rights, there is a heightened need to operate responsibly. Ethical considerations in developing and deploying apps for children and adolescents are necessary and cannot be overlooked, given mobile apps' influence on them. Apps that intend to drive responsible user engagement and respect children's best interests should undertake relevant measures while nurturing their intelligence through educational insights. The appropriate measures include addressing privacy concerns (including data minimization), age-appropriate communications, content filters, age verification, and consistent efforts to avoid deceptive designs.
Responsible development and deployment of ed-tech apps used by children and adolescents must move beyond regulatory expectations to establish minimum thresholds for public interest technology, setting the baseline for responsible ed-tech apps.
About the Author:
Sundar Narayanan is an Ethics and Compliance Advisor with over 18 years of experience in ethics and compliance framework development, policy implementation, fact-finding reviews, compliance management, and compelling/creative work for corporates. His papers on AI ethics have been published at NeurIPS 2022, the ASIS&T SIG-SI and SIG-IEP workshop, the Annual Conference of SEAC 2021, the A Better Tech convention and career fair, BHCC 2021, and SweDS 2022. He is currently pursuing his doctoral thesis at the University of Bordeaux, where his research focuses on ethical issues in AutoML tools.