I guess I’m just naturally curious … or maybe I just like to give my grey cells a workout (though I seriously wonder whether they are depleting in number given some of the choices I make concerning what I want to be curious about!!!).
So, my latest adventure to slake my curiosity is a foray into the world of TikTok.
“Why not?” I said to myself when I saw an offer for a course on how to navigate TikTok.
My grey cells are not thanking me one bit and, really, they’re saying to me that fading into oblivion is a much more attractive option.
Quite apart from the usual embarrassing hooha I go through every time I learn something new that involves technology:
“Which button did you press? Just let me see it again.”
And two seconds later. “Sorry, I’ve forgotten. Which button did you press?”
And “I can’t find it!”
Followed by “Oh yes! I see it now 😬 it was under my thumb.”
And so on…
I wonder whether I’m cut out for this brave (now-not-so-new) world of social media.
Here is my TikTok account in case you (being more up to date than I am and already blithely surfing TikTok) want to check out my painfully produced offerings: @miriamverbeekauthor.
More important than all that – and what motivated me to write to you about my adventures – is that I became uneasy about using TikTok. Here's why:
Upon setting up my account, I was immediately fed short video clips of sexily dressed and over-made-up young people doing gyrating things with their bodies to sounds (that might be music or might be a form of talking). Not really my scene and, because I couldn’t quickly figure out whether I should scroll down, click somewhere or swipe sideways to move to other images, I panicked and clicked out of the app altogether, which meant I was fed the same images when I clicked back in! Over and over … till I – 🥵 – got it!
The second fright was how quickly the algorithm learnt that large-breasted/heavily muscled, voluptuously clad, over-made-up individuals doing gyrating things weren’t my scene, and started feeding me things more to my taste (people talking about books).
Okay. So why did these issues upset me?
As I’ve mentioned a few times in other newsletters, I am deeply immersed in writing another book set in the Si’Empra world. The main character is a Dutch woman named Saskia: 43 years old; her parents own a café in Rotterdam, but she lives in Lyon, France; she is a partner in a finance consultancy; and she is a nerdy dwarf who occasionally goes undercover to help police solve child abuse crimes – abuse images sold over the internet.
To get my head around this story, I’ve researched stuff on the dark web, cryptocurrency, child abuse crime and nerdy stuff (Yes! My grey cells complained endlessly about all of that as well!). What I am now 150% aware of is that the amount of material available depicting children being abused is skyrocketing. Even though specialised units around the world manage to close down sites offering the awful images, arrest perpetrators and rescue children, the problem continues to grow.
So, what does this have to do with TikTok?
Or Instagram, or whatever …?
One of the fastest-growing problems law enforcement faces is the abuse of these platforms by predators. Because children have such ready access to phones, and platforms such as TikTok are so attractive and fun to use, children use them and use them and use them. As a result, something like this can (and does) happen: a girl gets a follower on her feed who poses as a ‘friend of the same age and interests’ or an attractive admirer. The predator provides encouraging feedback. Then comes a request, via direct message, for risqué images. This escalates until the predator threatens to reveal the images to the girl’s mother or father unless she delivers more and more risqué material (blackmail). Meanwhile, without the girl’s permission or even knowledge, the images are shared – even sold – by the predator, and they remain in circulation even if the predator is caught. The victim will never be free of the abuse.
Currently, no legislation requires technology companies to take responsibility for stopping this form of crime (among others).
Another reason I am disturbed by the speed with which the algorithm learnt my preferences is that it demonstrates how “clever” the platform is at feeding me more of whatever I “paused” to look at.
The lack of regulation over how and what these algorithms track leaves a gap in which criminals can operate, and it opens us all to being fed conspiracy theories (and aren’t we all deluged in those just now!).
It’s not only TikTok, Instagram, Facebook and similar platforms that are iffy. Have you ever considered how search engines like Google and Yahoo (and retailers like Amazon) manipulate you with their algorithms (or, they would argue, try to help give you what you want)? When you type in a term such as … ummm … “vaccines”, the algorithm knows the type of information you prefer and will present you with a completely different set of choices to the one it presents to me. (I try to befuddle the algorithm by immediately clicking through to the third “page” of results to see if I can find something there, or I use the privacy settings on the search engine, or I mix up my search engines depending on what I’m looking for. Some people do even cleverer things to secure their privacy.)
Am I being paranoid about all this?
Do you have a tale to tell about your interactions with the technology behemoth we currently use/suffer under/love/ignore?