By Golly, the Gray Lady Will Not Miss This AI Tech Revolution!

November 2, 2023

This essay is the work of a dumb humanoid. No smart software required.

The technology beacon of the “real” newspaper is shining brightly. Flash. The New York Times Online. Flash. Terminating the exclusive with LexisNexis. Flash. The shift to — wait for it — a Web site. Flash. The in-house indexing system. Flash. Buying About.com. Flash. Doing podcasts. My goodness, the flashes have impaired my vision. And where are we today after labor strife, newsroom craziness, and a list of bestsellers that gets data from…? I don’t really know, and I haven’t bothered to do any online poking around.

A real journalist of today uses smart software to write listicles for BuzzFeed, essays for high school students, and feature stories for certain high-profile newspapers. Thanks for the drawing, Microsoft Bing. Trite but okay.

I thought about the technology flashes from the Gray Lady’s beacon high atop its building sort of close to Times Square. Nice branding. I wonder if mobile phone users know why the tourist destination is called Times Square. Since I no longer work in New York, I have forgotten. I do remember the high intensity pinks and greens of a certain type of retail establishment. In fact, I used to know the fellow who created this design motif. Ah, you don’t remember. My hunch is that there are other factoids you and I won’t remember.

For example, what’s the byline on a New York Times story? I thought it was the name or names of the many people who worked long hours, made phone calls, visited specific locations, and sometimes visited the morgue (no, the newspaper morgue, not the “real” morgue where the bodies of compromised sources ended up).

If the information in that estimable source Showbiz411.com is accurate, the Gray Lady may cite zeros and ones. The article is “The New York Times Help Wanted: Looking for an AI Editor to Start Publishing Stories. Six Figure Salary.” Now that’s an interesting assertion. A person like me might ask, “Why not let a recent college graduate crank out machine-generated stories?” My assumption is that most people trying to meet a deadline and in sync with Taylor Swift will know about machine-generated information. But, if the story is true, here’s what’s up:

… it looks like the Times is going [to] let bots do their journalism. They’re looking for “a senior editor to lead the newsroom’s efforts to ambitiously and responsibly make use of generative artificial intelligence.” I’m not kidding. How the mighty have fallen. It’s on their job listings.

The Showbiz411.com story allegedly quotes the Gray Lady’s help wanted ad as saying:

“This editor will be responsible for ensuring that The Times is a leader in GenAI innovation and its applications for journalism. They will lead our efforts to use GenAI tools in reader-facing ways as well as internally in the newsroom. To do so, they will shape the vision for how we approach this technology and will serve as the newsroom’s leading voice on its opportunity as well as its limits and risks.”

There are a bunch of requirements for this job. My instinct is that a few high school students could jump into this role. What’s the difference between a ChatGPT output about crossing the Delaware and a “real” news article about fashion trends seen at Otto’s Shrunken Head?

Several observations:

  • What does this ominous development mean to the accountants who will calculate the cost of “real” journalists versus a license to smart software? My thought is that the general reaction will be positive. Imagine: No vacays, no sick days, and no humanoid protests. The Promised Land has arrived.
  • How will the Gray Lady’s management team explain this cuddling up to smart software? Perhaps it is just one of those newsroom romances? On the other hand, what if something serious develops and the smart software moves in? Yipes.
  • What will “informed” readers think of stories crafted by the intellectual engine behind a high school student’s essay about great moments in American history? Perhaps the “informed” readers won’t care?

Exciting stuff in the world of real journalism down the street from Times Square and the furries, pickpockets, and gawkers from Ames, Iowa. I wonder if the hallucinating smart software will be as clever as the journalist who fabricates a story? Probably not. “Real” journalists do not shape, weaponize, or filter the actual factual. Is John Wiley & Sons ready to take the leap?

Stephen E Arnold, November 2, 2023

Social Media: The Former Big Thing

November 2, 2023

This essay is the work of a dumb humanoid. No smart software required.

It’s a common saying that if you aren’t on social media, you might as well not exist. Social media profiles are supposedly necessary for success in the modern world, but Business Insider claims that many people are spending less time glued to their screens: “Great News: Social Media Is Falling Apart.”

Facebook, Instagram, Twitter, and other social media giants alienated their users with too much sponsored content and too many entertainment hubs. Large social media platforms are less about connections and more about generating revenue via clicks. Users are experiencing social network fatigue, so they’re posting less and even jumping ship. They are now spending time in group chats or on smaller, more intimate social platforms. On these smaller platforms, users are free from curated content and ads. They’re also using platforms dedicated to specific groups or topics.

The current state of social media is a fractured, disconnected mess. New networks pop up and run the popularity gauntlet before they disappear. Users want a social media platform that connects everything with the niche appeal of small networks:

“Mike McCue, Flipboard’s CEO, believes that the next big social platform must bring together the benefits of both worlds, he said: ‘the quality and trust in small, transparent communities with the ability for those quality conversations to reach millions.’ But instead of one platform that manages to appease everyone, the future of social media is looking more like a network of platforms that offer people a customized experience. The ideal system would not only allow you to migrate to new social apps without losing your network or profile but also link them together so that you could post on one and a friend could comment on it from another.”

None of the smaller social media networks are making money yet, but the opportunities are there. Users want a clean, ad-free experience similar to what Facebook and Twitter used to offer. If decentralized social media platforms learn to connect, they’ll give the larger companies a run for their money and end their monopolies.
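How might that cross-platform connection look in practice? Here is a minimal, hypothetical sketch in the spirit of federated protocols such as ActivityPub (which Mastodon and similar networks use); the server names, user handles, and inbox URL are invented for illustration, and real servers require signatures and richer metadata:

import json
import urllib.request

# A minimal ActivityPub-style "Create Note" activity. Every value below is
# illustrative; production servers add HTTP signatures and more fields.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://smallnetwork.example/users/alice",
    "object": {
        "type": "Note",
        "content": "Posted on one platform, readable and replyable on another.",
        "to": ["https://othernetwork.example/users/bob"],
    },
}

def deliver(payload: dict, inbox_url: str) -> int:
    """POST an activity to a follower's inbox on a different server."""
    request = urllib.request.Request(
        inbox_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/activity+json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status

# Hypothetical delivery to an inbox hosted by a second, independent platform:
# deliver(activity, "https://othernetwork.example/users/bob/inbox")

The point of the sketch: the post travels as portable data rather than sitting as a row locked inside one company’s database, which is what would let a user carry a network from one app to another.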

Whitney Grace, November 2, 2023

Telegram: A Super App with Features Al Capone Might Have Liked

November 1, 2023

When I mention in my law enforcement lectures that Telegram, a frisky encrypted super app for thumb typers, is “off the radar” for some analysts, I get more than a few blank looks. Consider this: The “special conflict,” or whatever some in the Land of Tolstoy call it, pivots on Telegram. And why not? It allows encrypted messages, both public and private. A safety-conscious user can include an image or a video snippet and post it to the Musky service with a couple of taps. Those under attack can disseminate location data to a mailing list of Telegram contacts. The app makes it possible to pay for “stuff”; often that stuff is CSAM or information about where to pick up an order containing contraband.

The soldier with the mobile phone says, “Hey, this hot video content is great on Telegram.” The other soldier says, “Jump to the Spies-R-Us service. I will give you the coordinates for the drone assault. Also, order some noodle latkes to Checkpoint Grhriba at 1800 hours.” Thanks, MidJourney. WW2 cartoonists would be proud of you.

Pivot to the Israel-Hamas war. Yep, Telegram is in use. Civilians, war fighters, even those in prison with mobile devices are Telegramming away. The Russian brothers who created the original app may not have anticipated its utility in war zones.

My research team has noted that some Clear Web sites discuss slippery subjects like carding. Then the “buy now” or similar action points to a Telegram “location.” What about the Dark Web? Telegram makes it possible to do “Dark Web things” without the risk and hassle of operating a Dark Web site or service. Pretty innovative, right? And what about that Dark Web traffic? Our analysis suggests that one will find Dark Web bots, law enforcement from numerous countries, and a modest number of human bad actors who cannot use or have not embraced Telegram.
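As a rough illustration of how that Clear Web-to-Telegram hop can be spotted, here is a minimal, hypothetical sketch; the page URL is invented, and the pattern simply looks for Telegram’s public t.me deep links in a page’s HTML:

import re
import urllib.request

# Hypothetical listing page; the URL is illustrative only.
PAGE_URL = "https://example.com/slippery-listing"

# t.me is Telegram's public deep-link domain for channels, groups, and bots.
TELEGRAM_LINK = re.compile(r"https?://t\.me/[A-Za-z0-9_+/]+")

def find_telegram_links(url: str) -> list[str]:
    """Fetch a page and return any Telegram deep links found in its HTML."""
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    return sorted(set(TELEGRAM_LINK.findall(html)))

if __name__ == "__main__":
    for link in find_telegram_links(PAGE_URL):
        print(link)

Nothing exotic is involved: the “buy now” button is just a link, and the transaction moves to a channel or bot that ordinary Web crawlers and takedown processes rarely follow.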

Now the super app is getting some enhancements, if the information in the Gadgets360 article is accurate. “Telegram Update Brings Advanced Reply Options, Link Preview Customizations, Account Colors, More.” Enhancements include:

  • Replying to a message from one chat to another. Will this be useful for certain extremist users doing fundraising or recruiting?
  • Customizable shared links. Will these be useful to CSAM purveyors?
  • Fast forward and rewind for videos in Telegram messages. A winner for some video content vendors.
  • Paid features. Some Telegram users pay for these services. Yep, money. Subscription money.

And the encryption thing? Reasonably good. Possibly less open than the UK Covid information allegedly from WhatsApp.

Stephen E Arnold, November 1, 2023

Cyber Security Professionals May Need Worry Beads. Good Worry Beads

November 1, 2023

This essay is the work of a dumb humanoid. No smart software required.

I read “SEC Charges SolarWinds and Its CISO With Fraud and Cybersecurity Failures.” Let’s assume the write up is accurate or — to hit today’s target for excellence — the article is close enough for horseshoes. Armed with this assumption, will cyber security professionals find that their employers or customers will be taking a closer look at the actual efficacy of the digital fences and news flows that keep bad actors outside the barn?

A very happy bad actor cackles in a Starbucks after penetrating a corporate security system: “Hey, that was easy. When will these people wake up and realize they should not have fired me?” Thanks, MidJourney, not exactly what I wanted but good enough, the new standard of excellence.

The write up suggests that the answer may be a less than quiet yes. I noted this statement in the write up:

According to the complaint filed by the SEC, Austin, Texas-based SolarWinds and Brown [top cyber dog at SolarWinds] are accused of deceiving investors by overstating the company’s cybersecurity practices while understating or failing to disclose known risks. The SEC alleges that SolarWinds misled investors by disclosing only vague and hypothetical risks while internally acknowledging specific cybersecurity deficiencies and escalating threats.

The shoe hit the floor, if the write up is on the money:

A key piece of evidence cited in the complaint is a 2018 internal presentation prepared by a SolarWinds engineer [an employee who stated something senior management does not enjoy knowing] that was shared internally, including with Brown. The presentation stated that SolarWinds’ remote access setup was “not very secure” and that exploiting the vulnerability could lead to “major reputation and financial loss” for the company. Similarly, presentations by Brown in 2018 and 2019 indicated concerns about the company’s cybersecurity posture.

From my point of view, there are several items to jot down on a 4×6 inch notecard and tape on the wall:

  1. The “truth” is often at odds with what senior managers want to believe, think they know, or want to learn. Ignorance is bliss, just not a good excuse after a modest misstep.
  2. There are more companies involved in the foul-up than the news sources have identified. Far be it from me to suggest that highly regarded big-time software companies do a C-minus job engineering their security. Keep in mind that most senior managers — even at high tech firms — are out of the technology loop no matter what the LinkedIn biography says or employees believe. Accountants and MBAs are good at some things, bad at others. Cyber security is in the “bad” ledger.
  3. The marketing collateral for most cyber security, threat intelligence services, and predictive alerting services talks about a sci-fi world, not the here and now of computer science students given penetration assignments from nifty places like Estonia and Romania, among others. There are disaffected employees who want to leave their former employers a digital hickey. There are developers, hired via a respected gig matcher, who will do whatever an anonymous customer requires for hard cash or a crypto payment. Most companies have no idea how or where the problem originates.
  4. Think about insider threats, particularly when insiders include contractors, interns, unloved employees, or a consulting firm with a sketchy wizard gathering data inside a commercial operation.

Sure, cyber security just works. Yeah, right. Maybe this alleged action against a security professional will create some discomfort and a few troubled dreams. Will there be immediate and direct change? Nope. But the PowerPoint decks will be edited. The software will not be fixed as quickly. That’s expensive and may not be possible with a cyber security firm’s current technical staff and financial resources.

Stephen E Arnold, November 1, 2023

How Does One Impede US AI Progress? Have a Government Meeting?

November 1, 2023

This essay is the work of a dumb humanoid. No smart software required.

The Washington Post may be sparking a litigation hoedown. How can a newspaper give legal eagles an opportunity to buy a private island and not worry about the cost of LexisNexis searches? The answer may be in “AI Researchers Uncover Ethical, Legal Risks to Using Popular Data Sets.” The UK’s efforts to get a group to corral smart software are interesting. Lawyers may be the foot on the brake that slows AI traffic on the new Information Superhighway.

The Washington Post reports:

The advent of chatbots that can answer questions and mimic human speech has kicked off a race to build bigger and better generative AI models. It has also triggered questions around copyright and fair use of text taken off the internet, a key component of the massive corpus of data required to train large AI systems. But without proper licensing, developers are in the dark about potential copyright restrictions, limitations on commercial use or requirements to credit a data set’s creators.

There is nothing like jumping in a lake with the local Polar Bears Club to spark investor concern about paying big fines. The chills and thrills of the cold water create a heightened state of awareness.

How’s the water this morning?

Several observations:

  1. A collision between the compulsion to innovate in AI and the risk of legal liability seems likely.
  2. Innovators will forge ahead, and investors will have to figure out the risks by looking for legal eagles and big sharks lurking below the surface.
  3. Whatever happens in North America and Western Europe will not slow the pace of investment in AI in the Middle East and China.
  4. Are there unpopular data sets, perhaps generated by biased smart software?

Uncertainty and risk. Thanks, AI innovators.

Stephen E Arnold, November 1, 2023

China and Russia: Thinking Alike

November 1, 2023

This essay is the work of a dumb humanoid. No smart software required.

China’s authoritarian government went to a new extreme with its social credit system. The social credit system, a.k.a. a social rating system, assigns points to citizens based on arbitrary rules that align with the Chinese government’s ideology. If citizens have a low score, they are denied services and privileges. Gaming Deputy explains that a Russian university is following China’s example: “The Russian State Social University Is Developing A Social Rating System ‘We’.”

The Russian State Social University (RGSU) is developing a social rating system for Russian citizens called “We.” RGSU invited its students and other interested people to participate in We testing. The We social credit platform rates people on numerous factors:

“The pilot rating system will include questions about various aspects of citizens’ lives, such as education, presence of children and dependents, sources of income, benefits, credit history, criminal records, social media accounts, participation in public life, government awards, language skills (especially Chinese), commitment to sports, healthy lifestyle and so on. All these parameters will be used to determine the social status and level of each person.”

People will receive a two-digit scoring code. The first number will be an individual’s social status and the second will be their social level. In order to ensure the We system’s data is accurate, people’s TIN, passport, SNILS, and telephone number will be linked to it. The RGSU developers claim We will be useful for banks and governors who want to classify citizens based on their usefulness.
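A purely hypothetical sketch of what such a two-digit code might look like as a data structure follows; the article does not describe the actual scoring logic, so the field names and the 0-9 scales are assumptions:

from dataclasses import dataclass

@dataclass
class WeScore:
    """Hypothetical representation of the two-digit 'We' code described above."""
    social_status: int  # first digit; a 0-9 scale is an assumption
    social_level: int   # second digit; a 0-9 scale is an assumption

    def code(self) -> str:
        # Concatenate the two single-digit ratings into the published code.
        return f"{self.social_status}{self.social_level}"

# Example: a citizen rated status 4, level 7 would carry the code "47".
print(WeScore(social_status=4, social_level=7).code())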

A social credit system might sound useful, but it doesn’t take long for it to become a tool of nightmares. The article emphasizes that transparency, data protection, and a balance between individual rights and government interests are necessary. Does anyone actually believe the Russian government will be held accountable?

Whitney Grace, November 1, 2023
