In the news…
It has recently been reported that retailers are pulling smart toy brand CloudPets from sale due to the cybersecurity threats they pose. These threats to families and children’s security are not new: concerns were raised over a year ago when the data of 820,000 toy owners, linked to 2.2 million voice messages, was compromised. Yet retailers are only starting to take action now, well over a year later, when they should have acted much sooner.
These toys have been putting families and children at risk since their launch, and this exposure has been accepted for far too long. Perhaps the GDPR has forced retailers to take the security of smart toys more seriously; hopefully, manufacturers of these toys will now do the same.
So far, Amazon and eBay have removed the CloudPets brand from their stores. Tesco, The Entertainer, Walmart and Target seem to have removed the toys from their shelves too.
The CloudPets range is a line of soft plush smart toys for young children to cuddle and interact with. They record voice messages which are stored and can be played back.
These voice recordings were stored unprotected online and have been accessed by unauthorised individuals and used maliciously since 2017. The toys themselves are also exploitable: hackers can initiate their own recordings and capture anything happening wherever an active toy is present.
‘Unsecured’ by design: smart toys are a serious problem
Thousands of children across the world are playing with ‘unsecured’ by design smart toys. These toys are popping up everywhere and are on many children’s must-have lists. Parents, grandparents, uncles and aunts add a smart toy to a child’s much-loved collection with each celebration. Many are unaware that, as this happens, these toys are introducing an influx of potential vulnerability points, and hackers are exploiting them, stealing confidential data and using it maliciously against families and children. Manufacturers of these toys have not taken privacy and security seriously and are effectively helping hackers gain easy access to this sensitive data and exploit innocent people.
It’s not only CloudPets that poses a cybersecurity threat
You do not need to look too deep to find a company or toy that has been responsible for a data breach of children’s personal information. Some notable ones are VTech (fined substantially for its breach of children’s information), the My Friend Cayla doll and, of course, the big brand CloudPets, marketed as ‘the message you can hug’. The list is only expanding, especially as smart toys grow in popularity all over the world and manufacturers rapidly push out volumes to take advantage of the demand.
Parents readily give these toys to their kids without a second thought
Since 2017, hundreds of smart toys have been released. These toys have their place but, in some cases, users are trading their privacy for the use of, or a subscription to, the toys’ ecosystems.
Their growth in popularity is no surprise, but what is surprising is how readily parents or guardians give these toys to their children with little thought, especially since most parents are otherwise highly protective of their children’s safety in every other respect. Why is this any different?
Many parents do not realise the impact a smart toy can have on their own and their child’s privacy, and how dangerous these toys can actually be. The teddy that responds, the robot that interacts, the duck that reads your child’s emotions… do parents stop to think how these toys are able to accomplish these things, and at what cost?
The massive recent breaches emphasise the need for education around smart toys. Many are riddled with security flaws and are vulnerable to hacks. Frightfully, some do not even need to be hacked to cause a problem: all that is required is an improperly configured database.
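The CloudPets data was reportedly sitting in a MongoDB database that accepted connections without any credentials. As a rough illustration of what ‘improperly configured’ means in practice, the sketch below (Python with the pymongo driver, using a made-up hostname) simply asks a database server whether it will answer queries with no authentication at all:

```python
# Illustrative sketch only: checking whether a MongoDB server answers queries
# without any credentials. The hostname is a hypothetical placeholder.
from pymongo import MongoClient
from pymongo.errors import OperationFailure, ServerSelectionTimeoutError

def is_database_open(host: str) -> bool:
    """Return True if the server at `host` lets an anonymous client read data."""
    client = MongoClient(host, serverSelectionTimeoutMS=3000)
    try:
        # Listing database names forces a real connection attempt.
        names = client.list_database_names()
        print(f"Unauthenticated access succeeded; databases visible: {names}")
        return True
    except OperationFailure:
        # The server demanded authentication: what a properly configured database does.
        return False
    except ServerSelectionTimeoutError:
        # Host unreachable (firewalled or offline), which is also fine from a security standpoint.
        return False

if __name__ == "__main__":
    # "toy-vendor.example.com" is invented purely for illustration.
    print(is_database_open("mongodb://toy-vendor.example.com:27017/"))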
If you want your children to enjoy these toys safely, it’s so important to make sure that you and your child are educated about them.
How these toys do what they do
In order to do what they do and respond as they do, these toys need data and an internet connection. As your child plays with them, they collect data, including other things they see and hear around them, and they record, process and store this data in the cloud. Do you know what this data is or what it is being used for?
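To make that concrete, here is a purely hypothetical sketch (in Python, with an invented endpoint, field names and device details) of the kind of request a smart toy or its companion app might send to the vendor’s servers every time your child speaks to it:

```python
# Hypothetical example of a smart toy uploading a recording to its vendor's cloud.
# The URL, field names and values are invented for illustration only.
import base64
from datetime import datetime, timezone

import requests

def upload_recording(audio_bytes: bytes) -> None:
    payload = {
        "device_id": "TOY-1234-ABCD",                    # identifies the specific toy
        "child_profile": {"nickname": "Sam", "age": 6},  # personal data tied to the child
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "audio_b64": base64.b64encode(audio_bytes).decode("ascii"),  # the voice clip itself
    }
    # Everything above leaves your home network and sits on the vendor's servers.
    # If the connection were plain "http://", anyone on the path could read it too.
    requests.post("https://api.toy-vendor.example.com/v1/recordings",
                  json=payload, timeout=10)

upload_recording(b"placeholder audio bytes")
```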
What’s behind their soft, cuddly exterior
These toys are not always built securely. Many are built on hardware modules produced to specified standards so that they are simple to use with software APIs.
These hardware manufacturers produce the modules at scale, and different developers combine them with APIs and software, adding the ‘smarts’ that differentiate their product. This intersection of hardware and software is fantastic and exciting; however, it has its weaknesses.
Some can be hacked and have known vulnerabilities that have been exploited against their users. Products are meant to be secure by design and by default, yet some of these devices are not secure at all.
Many of these toys carry on conversations, which requires a microphone. That microphone could be listening to other conversations, not just your child’s play, and what it hears is often transmitted to a server in the cloud.
Several attack types are possible
Several attacks are possible. The toys themselves can easily be hacked, with or without physical access. Data in transit is also at risk: toys that stream video, voice and other live information can have that data intercepted. Then there is data at rest, where attackers target what gets stored and processed in the cloud. At every stage, if the data is not protected it is vulnerable to attack and compromise. Unfortunately, many smart toy vendors and manufacturers have, up until now, not taken data security seriously at all.
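Protecting data at these stages is not exotic. As a rough sketch of the storage side only (using Python’s cryptography library; this is an illustration, not a description of any vendor’s actual implementation), encrypting a recording before it is written to the cloud means that a leaked or misconfigured database exposes only ciphertext:

```python
# Illustrative only: encrypt a voice clip before storage, so a leaked database
# yields ciphertext rather than audio.
from cryptography.fernet import Fernet

# In a real system the key would live in a key-management service, never beside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

voice_clip = b"raw audio bytes captured by the toy"
stored_blob = cipher.encrypt(voice_clip)   # this is what should sit in cloud storage
print(stored_blob[:20], "...")             # unreadable without the key

# Only the legitimate service, holding the key, can recover the recording.
assert cipher.decrypt(stored_blob) == voice_clip
```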
The data can also be modelled, processed and mined to add value for the user and for the company. Equally, it could be intercepted and used maliciously if not properly managed and protected.
These technologies that “help” our children and offer hours of fun and entertainment could be used in a way that counters that.
GDPR and our children’s privacy
Many of these smart toys have not been regulated as they should be. More specifically, the services they offer have been largely unregulated from a privacy perspective until recently.
With the enforcement of the GDPR, developers of these devices and services should be designing these toys to be secure and private by default, particularly since the data processed through them relates to children, who merit specific protection under the regulation.
Some of these toys have been set up to gather data on the user by default, without permission and, more importantly, without users understanding what the data is and how it is being used. These practices are no longer acceptable or legal.
Manufacturers and developers often forget that this information does not belong to them, and that they have no right to collect or store it without the user giving the company appropriate permission.
Play with them responsibly
There is nothing wrong with using responsible and secure technology in toy form, but it’s good to know, research and understand what you are bringing into your home and what comes into contact with your children and family.
If you use any of this technology, take the time to think creatively about how the data collected could be used and whether you are comfortable with others having access to this type of information. In many cases, people say they don’t mind, until there is a breach and the data is used against them in some way they never imagined possible.
It’s vital that you understand, and make sure your child understands, the security and privacy risks of smart toys, and that everyone knows how to use them safely. Approach them with caution and handle them as you would any other risk, such as letting a child cross a road on their own: you teach them how to do it safely and put precautions in place (‘look left, look right and look left again’). Precautions likewise need to be in place, and taught, when giving a child a smart toy to use and interact with.
Some precautionary measures to consider
- Only connect and use smart toys in environments with trusted and secured Wi-Fi internet access
- Closely monitor children’s activity with the toys (such as conversations and voice recordings) through the toy’s partner parent application, if such features are available
- Check what personal data the toy is collecting and for what purpose (the more sensitive the data the higher the risk of breach and impact on you and your child)
- Carefully read disclosures, privacy policies and data protection practices
- Check how the personal data is processed and if it is being sent to third parties for processing
- Check the default settings and see if you are able to change the privacy settings
- Check whether the toy is always on (connected) or whether there is an offline option. Otherwise, the toy may be monitoring more than you are aware of, all the time (listening to you and watching you as you go about your business at home)
- Take care when clicking yes to everything you sign up to, and avoid companies that give you no choice over your own data: it’s your data and you should control it.
Many parents don’t see the harm: it’s a toy, after all!
It may be difficult for some parents to believe that such consequences can arise from their child simply playing with toys, but they can, and the risk is real. Data breaches are happening, and children’s (and parents’) sensitive data is being leaked and used inappropriately.
Photos, video and voice data are all vulnerable. These toys are designed to record voice, monitor a child’s location and even their health and wellbeing, analyse personal qualities and interact with children through conversation.
They are meant to have a positive impact on children, but there is a scary side that we need to do our best to guard against in order to protect our own privacy and that of our children.