Pass a Law to Prevent Youngsters from Accessing Social Media. Yep, That Will Work Well
December 2, 2024
This is the work of a dinobaby. Smart software helps me with art, but the actual writing? Just me and my keyboard.
I spotted a very British “real” news story called “It’s So Easy to Lie: A Fifth of Children Use Fake Age on Social Media.” I like the idea that if one picks 100 children at random from a school of 13-year-olds, only 80 percent will allegedly follow the rules.
Thanks, Midjourney. Good enough. I might point out you did not present a young George Washington despite my efforts to feed you words to which you would respond.
Does the 20 percent figure seem low to you? I would suggest that if a TikTok-type video were popular at that school, more than 20 percent would find a way to get access to that video. If the video were about being thin or a fashion tip, the females would be more interested, and they would lie to get that information. The boys might be more interested in other topics, which I shall leave to your imagination.
The write up says:
A newly released survey, conducted by the UK media regulator, indicates 22% of eight to 17 year olds lie that they are 18 or over on social media apps.
I doubt that my hypothetical group of 13-year-olds is much different from those who are four years older. The write up pointed out:
A number of tech firms have recently announced measures to make social media safer for young people, such as Instagram launching “teen accounts.” However, when BBC news spoke to a group of teenagers at Rosshall Academy, in Glasgow, all of them said they used adult ages for their social media accounts. “It’s just so easy to lie about your age”, said Myley, 15.
Australia believes it has a fix: Ban access. I quite like the AU$33 million fine too.
I would suggest that in a group of 100 teens, one will know how to create a fake persona, buy a fake ID from a Telegram vendor, and get an account. Will a Telegram user set up a small online business to sell fake identities or social media accounts to young people? Yep.
Cyber security firms cannot block bad actors. What makes regulators think that social media companies can prevent young people from getting access to their services? Enjoy those meetings. I hope the lunches are good.
My hunch is that the UK is probably going to ban social media access for those under a certain age. Good luck.
Stephen E Arnold, December 2, 2024
FOGINT: Telegram and Its Possible Latent Weakness
December 2, 2024
This write up is the work of a dinobaby. Thanks to Gifer.com for the moving fog!
Cointelegraph gathered some interesting information about Telegram. Let’s take a look at some of the data in Cointelegraph’s “Telegram’s Crypto Holdings Rose to $1.3B in H1 2024,” November 25 or 26, 2024. Verification of the data is difficult, which is a frequent issue where crypto valuations are presented. Let’s assume that the general information reflects Telegram’s business position.
According to the article:
Telegram’s crypto holdings jumped from $400 million to $1.3 billion in H1 2024, driven by Toncoin sales and strategic deals.
Translating: Telegram dumped some crypto, made a profit, and reported the upside to fatten its financial position.
The key point is that Telegram allegedly had about $400 million in digital assets. Now the company has more than $1 billion.
Allegedly Telegram generated about $500 million in revenue in the period from January to June 2024. The boost does not come from advertising and subscriptions. Crypto dealing and other business agreements have bolstered the company’s apparently positive financials. It is worth noting that an audit of Telegram’s finances is rumored to have shown that Telegram lost money in a previous financial period.
The Cointelegraph story reminded its readers that Pavel Durov is confined to France. His legal issues have not been resolved. He has posted bail in the neighborhood of €5 million, an indication of the seriousness of the French charges against him.
The FOGINT’s research team notes:
- Telegram is making significant moves with its tie-ups with organizations like CryptoCasino.com. This is part of the firm’s effort to become the platform for Telegram-centric gaming.
- The Open Network Foundation (which runs on the Telegram platform) continues to promote the TON crypto and the Telegram-centric capabilities on which the Foundation’s services run. One of these initiatives involves investing in TON-centric applications and holding training courses in major international cities.
- Telegram’s “value” as an allegedly secure messaging app is eroding. Mr. Durov has insisted that Telegram cooperates with law enforcement. Those statements mean that Telegram has to kick its online gaming activities and the Foundation’s financial services into high gear.
Net net: Telegram may be strong at first glance, but there may be a latent weakness in the Telegram, Foundation, and TON.Social entities.
Stephen E Arnold, December 2, 2024
Deepfakes: An Interesting and Possibly Pernicious Arms Race
December 2, 2024
As it turns out, deepfakes are a difficult problem to contain. Who knew? As victims from celebrities to schoolchildren multiply exponentially, USA Today asks, “Can Legislation Combat the Surge of Non-Consensual Deepfake Porn?” Journalist Dana Taylor interviewed UCLA’s John Villasenor on the subject. To us, the answer is simple: Absolutely not. As with any technology, regulation is reactive while bad actors are proactive. Villasenor seems to agree. He states:
“It’s sort of an arms race, and the defense is always sort of a few steps behind the offense, right? In other words that you make a detection tool that, let’s say, is good at detecting today’s deepfakes, but then tomorrow somebody has a new deepfake creation technology that is even better and it can fool the current detection technology. And so then you update your detection technology so it can detect the new deepfake technology, but then the deepfake technology evolves again.”
Exactly. So if governments are powerless to stop this horror, what can? Perhaps big firms will fight tech with tech. The professor dreams:
“So I think the longer term solution would have to be automated technologies that are used and hopefully run by the people who run the servers where these are hosted. Because I think any reputable, for example, social media company would not want this kind of content on their own site. So they have it within their control to develop technologies that can detect and automatically filter some of this stuff out. And I think that would go a long way towards mitigating it.”
Sure. But what can be done while we wait on big tech to solve the problem it unleashed? Individual responsibility, baby:
“I certainly think it’s good for everybody, and particularly young people these days to be just really aware of knowing how to use the internet responsibly and being careful about the kinds of images that they share on the internet. … Even images that are sort of maybe not crossing the line into being sort of specifically explicit but are close enough to it that it wouldn’t be as hard to modify being aware of that kind of thing as well.”
Great, thanks. Admitting he may sound naive, Villasenor also envisions education to the (partial) rescue:
“There’s some bad actors that are never going to stop being bad actors, but there’s some fraction of people who I think with some education would perhaps be less likely to engage in creating these sorts of… disseminating these sorts of videos.”
Our view is that digital tools allow the dark side of individuals to emerge and expand.
Cynthia Murrell, December 2, 2024