Cyber Investigators: Feast, Famine, or Poisoned Data in 2023

January 11, 2023

At this moment, the hottest topic among some cyber investigators is open source intelligence, or OSINT. In 2022, the number of free and for-fee OSINT tools and training sessions grew significantly. Plus, each law enforcement and intelligence conference I attended in 2022 was awash with OSINT experts, exhibitors, and investigators eager to learn about useful sites, Web and command line techniques, and intelware solutions combining OSINT information with smart software. I anticipate that 2023 will be a bumper year for DYOR, or do your own research. No collegial team required, just a Telegram group or a Twitter post with comments. The Ukraine-Russia conflict has become the touchstone for the importance of OSINT.

Over pizza, my team and I have been talking about how the OSINT “revolution” will unwind in 2023. On the benefit side of the cyber investigative ledger, OSINT is going to become even more important. After 30 years in the background, OSINT has become the next big thing for investigators, intelligence professionals, entrepreneurs, and Beltway bandits. Systems developed in the US, Israel, and other countries continue to bundle sophisticated analytics plus content. The approach is to migrate basic investigative processes into workflows. A button click automates certain tasks. Some of the solutions have proven themselves to be controversial. Voyager Lab and the Los Angeles Police Department generated attention in late 2021. The Brennan Center released a number of once-confidential documents revealing the capabilities of a modern intelware system. Many intelware vendors have regrouped and appear to be ready to return to aggressive marketing of their systems, their built-in data, and their smart software. These tools are essential for certain types of investigations, whether in US agencies like Homeland Security or in financial crime investigations at FINCEN. Even state and city entities have embraced the mantra of better, faster, easier, and, in some cases, cheaper investigations.

Another development in 2023 will be more tension between skilled human investigators and increasingly smarter software. The bean counters (accountants) see intelware as a way to reduce the need for headcount (full time equivalents) and to increase reliance on smart software and OSINT information. Investigators will face an increase in cyber crime. Some involved in budgeting will emphasize smart software instead of human officers. The crypto imbroglio is just one of the factors empowering online criminal behavior. Some believe that the Dark Web, CSAM, and contraband have faded from the scene. That’s a false idea. In the last year or so, what my team and I call the “shadow Web” has become a new, robust, yet hard-to-penetrate infrastructure for cyber crime. Investigators now face an environment into which a digital Miracle-Gro has been injected. Its components are crypto, encryption, and specialized software that moves Web sites from Internet host to Internet host at the click of a mouse. Chasing shadows is a task even the most recent intelware systems find difficult to accomplish.

However, my team and I believe that there is another downside for law enforcement and a major upside for bad actors: the wide availability of smart software capable of generating misinformation in the form of text, videos, and audio. Unfortunately, today’s intelware is not yet able to flag and filter weaponized information in real time or in a reliable way. OSINT advocates and marketers unfamiliar with the technical challenges of filtering out “fake” information downplay the risk of weaponized or poisoned information. A smart software system ingesting masses of digital information can, at this time, learn from bogus data and, therefore, output misleading or incorrect recommendations. In 2023, poisoned data will continue to derail many intelware systems as well as traditional investigations when insufficient staff are available to determine provenance and accuracy. Our research has identified 10 widely used mathematical procedures particularly sensitive to bogus information. Few want to discuss these out-of-sight sinkholes in public forums. Hopefully the reluctance to talk about OSINT blind spots will fade in 2023.
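To make the sensitivity concrete, here is a hypothetical toy sketch, far simpler than any production intelware and not one of the procedures my team examined. It flips a fraction of the training labels fed to a simple classifier and reports how accuracy sags as the poison increases.

```python
# Hypothetical toy illustration of data poisoning: flip a fraction of
# training labels and watch a simple classifier's accuracy degrade.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for poison_rate in (0.0, 0.2, 0.4):
    y_poisoned = y_train.copy()
    n_flip = int(poison_rate * len(y_poisoned))
    flip_idx = np.random.default_rng(0).choice(len(y_poisoned), n_flip, replace=False)
    y_poisoned[flip_idx] = 1 - y_poisoned[flip_idx]  # invert the 0/1 labels

    model = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)
    print(f"{poison_rate:.0%} poisoned labels -> test accuracy "
          f"{model.score(X_test, y_test):.2f}")
```

The specific numbers do not matter. What matters is that the model has no way to know the flipped labels are wrong, which is exactly the problem an intelware system faces when weaponized OSINT flows into its inputs.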

The feast? Smart software. Masses of information.

The famine? Funds to expand the hiring of full time (not part time) investigators and the money needed to equip these professionals with high-value, timely instruction about tools, sources, pitfalls, and methods for verification of data.

The poison? ChatGPT and related tools, which can turn anyone with basic scripting expertise into a volcano of misinformation.

Let me suggest four steps to begin to deal with the feast, famine, and poison challenges.

First, individuals, trade groups, and companies marketing intelware to law enforcement and intelligence entities must stick to the facts about their systems. The flowery language and the truth-stretching lingo must be dialed back. Why do intelware vendors experience brutal churn among licensees? Because of the distance between the reality of the system and the assertions made to sell the system.

Second, procurement processes and procurement professionals must become advocates for reform. Vendors often provide “free” trials and then work to get “on the budget.” The present procurement methods can lead to wasted time, money, and contracting missteps. Outside-the-box ideas like a software sandbox require consideration. (If you want to know more about this, message me.)

Third, consulting firms which are often quick to offer higher salaries to cyber investigators need to evaluate the impact of their actions on investigative units. There is no regulatory authority monitoring the behavior of these firms. The Wild West of cyber investigator poaching hampers some investigations. Legislation perhaps? More attention from the Federal Trade Commission maybe? Putting the needs of the investigators ahead of the needs of the partners in the consulting firms?

Fourth, a stepped up recruitment effort is needed to attract investigators to the agencies engaged in dealing with cyber crime. In my years of work for the US government and related entities, I learned that government units are not very good at identifying, enlisting, and retaining talent. This is an administrative function that requires more attention from individuals with senior administrative responsibilities. Perhaps 2023 will generate some progress in this core personnel function.

Don’t get me wrong. I am optimistic about smart software. I believe techniques to identify and filter weaponized information can be improved. I am confident that forward-leaning professionals in government agencies can have a meaningful impact on institutionalized procedures and methods associated with fighting cyber crime.

My team and I are committed to conducting research and sharing our insights with law enforcement and intelligence professionals in 2023. My hope is that others will adopt a similar “give back” and “pay it forward” approach in 2023 in the midst of feasts, famines, and poisoned data.

Thank you for reading. — Stephen E Arnold, January 11, 2023

China Orders AI Into the Courtroom

January 11, 2023

China is simply delighted with the possibilities of AI technology. In fact, it is now going so far as to hand most of its legal services over to algorithms. ZDNet reports, “China Wants Legal Sector to be AI-Powered by 2025.” Yep, once those algorithms are set up, justice in China can be automated. Efficient. Objective. What’s not to like? Writer Eileen Yu explains:

“The country’s highest court said all courts were required to implement a ‘competent’ AI system in three years, according to a report by state-owned newspaper China Daily, pointing to guidelines released by the Supreme People’s Court.  The document stated that a ‘better regulated’ and more effective infrastructure for AI use would support all processes needed in handling legal cases. This should encompass in-depth integration of AI, creation of smart courts, and higher level of ‘digital justice’, the high court said.  A more advanced application of AI, however, should not adversely affect national security or breach state secrets as well as violate personal data security, the document noted, stressing the importance of upholding the legitimacy and security of AI in legal cases.  It added that rulings would remain decisions made by human judges, with AI tapped as supplemental references and tools to improve judges’ efficiency and ease their load in trivial matters. An AI-powered system also would offer the public greater access to legal services and help resolve issues more effectively, the Supreme People’s Court said.”

We’re sure that is exactly how it will work out, making life better for citizens caught up in the legal system. The directive also instructs courts to train their workers on using AI, specifically on learning to spot irregularities. What could go wrong? At least final decisions will be made by humans. For now. To make matters even trickier, the Supreme People’s Court is planning to use blockchain technology to link courts to other sectors in support of socioeconomic development. Because what is more important in matters of justice than how they affect the almighty yuan?

Cynthia Murrell, January 11, 2023

US AI Legal Decisions: Will They Matter?

January 10, 2023

I read an interesting essay called “This Lawsuit against Microsoft Could Change the Future of AI.” It is understandable that the viewpoint is US centric. The technology is the trendy discontinuity called ChatGPT. The issue is harvesting data, lots of it, from any reachable source. The litigation concerns Microsoft’s use of open source software to create a service which generates code automatically in response to human or system requests.

The essay uses a compelling analogy. Here’s the passage with the metaphor:

But there’s a dirty little secret at the core of AI — intellectual property theft. To do its work, AI needs to constantly ingest data, lots of it. Think of it as the monster plant Audrey II in Little Shop of Horrors, constantly crying out “Feed me!” Detractors say AI is violating intellectual property laws by hoovering up information without getting the rights to it, and that things will only get worse from here.

One minor point: I would add the word “quickly” after the final word here.

I think there is another issue which may warrant some consideration. Will other countries — for instance, China, North Korea, or Iran — be constrained in their use of open source or proprietary content when training their smart software? One example is the intake of Galmon open source satellite data to assist in guiding anti-satellite weapons should the need arise. What happens when compromised telecommunications systems allow streams of real time data to be pumped into ChatGPT-like smart systems? Smart systems with certain types of telemetry can take informed, direct action without too many humans in the process chain.

I suppose I should be interested in Microsoft’s use of ChatGPT. I am more interested in weaponized AI operating outside the span of control of the US legal decisions. Control of information and the concomitant lack of control of information is more than adding zest to a Word document.

As a dinobaby, I am often wrong. Maybe what the US does will act like a governor on a 19th-century steam engine? As I recall, some of the governors failed with some interesting consequences. Worrying about Google, Microsoft, or some other US company’s application of constrained information could be worrying about a lesser issue.

Stephen E Arnold, January 10, 2023

Insight about Software and Its Awfulness

January 10, 2023

Software is great, isn’t it? Try to do hanging indents with numbers in Microsoft Word. If you want this function without wasting time with illogical and downright weird controls, call a Microsoft Certified Professional to code what you need. Law firms are good customers. What about figuring out which control in BlackMagic DaVinci delivers the effect you want? No problem. Hire someone who specializes in the mysteries of this sort of free software. No expert in Princeton, Illinois, or Bear Dance, Montana? Do the Zoom thing with a gig worker. That’s efficient. There are other examples; for instance, do you want to put your MP3 on an iPhone? Yeah, no problem. Just ask a 13-year-old. She may do the transfer for less than an Apple Genius.

Why is software awful?

“There Is No Software Maintenance” takes a step toward explaining what’s going on and what’s going to get worse. A lot worse. The write up states:

Software maintenance is simply software development.

I think this means that a minimal viable product is forever. What changes are wrappers, tweaks, and new MVP functions. Yes, that’s user friendly.

The essay reports:

The developers working on the product stay with the same product. They see how it is used, and understand how it has evolved.

My experience suggests that the mindset apparent in this article is the new normal.

The advantages are faster and cheaper development, quicker revenue, and a view of the customer as irrelevant even if he, she, or it pays money.

The downsides? I jotted down a few which occurred to me:

  1. Changes may or may not “work”; that is, printing is killed. So what? Just fix it later.
  2. Users’ needs are secondary to what the product wizards are going to do. Oh, well, let’s take a break and not worry about today. Let’s plan for new features for tomorrow. Software is a moving target for everyone now.
  3. Assumptions about who will stick around to work on a system or software are meaningless. Staff quit, staff are RIFed, and staff are just an entity on the other end of an email, working on a contract in Bulgaria or Pakistan.

What’s being lost with this attitude or mental framing? How about trust, reliability, consistency, and stability?

Stephen E Arnold, January 10, 2023

The EU Has the Google in Targeting Range for 2023

January 10, 2023

Unlike the United States, the European Union does not allow Google to collect user data freely. The EU has passed several laws to protect its citizens’ privacy; however, Google can still deploy tools like Google Analytics under certain stipulations. Tutanota explains how Google operates inside the EU laws in “Is Google Analytics Illegal In The EU? Yes And No, But Mostly Yes.”

Max Schrems is a lawyer who successfully sued Facebook for violating the privacy of Europeans. He won again, this time against Google. France and Austria decided that Google Analytics is illegal to use in Europe, but Denmark’s and Norway’s data protection authorities developed legally compliant ways to use the analytics service.

Organizations were using Google Analytics to collect user information, but that violated Europeans’ privacy rights because it exposed them to American surveillance. The tech industry did not listen to the ruling, so Schrems sued:

“However, the Silicon Valley tech industry largely ignored the ruling. This has now led to the ruling that Google Analytics is banned in Europe. NOYB says:

‘While this (=invalidation of Privacy Shield) sent shock waves through the tech industry, US providers and EU data exporters have largely ignored the case. Just like Microsoft, Facebook or Amazon, Google has relied on so-called ‘standard Contract Clauses’ to continue data transfers and calm its European business partners.’

Now, the Austrian Data Protection Authority strikes the same chord as the European court when declaring Privacy Shield as invalid: It has decided that the use of Google Analytics is illegal as it violates the General Data Protection Regulation (GDPR). Google is ‘subject to surveillance by US intelligence services and can be ordered to disclose data of European citizens to them’. Therefore, the data of European citizens may not be transferred across the Atlantic.”

There are alternatives to Google services such as Gmail and Google Analytics, offered by providers based in Europe, Canada, and the United States. This appears to be one more example of the EU lining up financial missiles to strike the Google.

Whitney Grace, January 10, 2023

The Pain of Prabhakar Becomes a Challenge for Microsoft

January 9, 2023

A number of online “real” news outfits have reported and predicted that ChatGPT will disrupt the Google’s alleged monopoly in online advertising. The excitement is palpable because it is now fashionable to beat up the technology giants once assumed to have feet made of superhero protein.

The financial information service called Seeking Alpha published “Bing & ChatGPT Might Work Together, Could Be Revolutionary.” My mind added “We Hope!” to the headline. Even the allegedly savvy Guardian Newspaper weighed in with “Microsoft Reportedly to Add ChatGPT to Bing Search Engine.”  Among the examples I noted is the article in The Information (registration required, thank you) called “Ghost Writer: Microsoft Looks to Add OpenAI’s Chatbot Technology to Word, Email.”

The origin of this “Bing will kill Google” boomlet may be the You.com Web search system, which includes this statement. I have put in bold face the words and phrases revealing Microsoft’s awareness of You.com:

YouChat does not use Microsoft Bing web, news, video or other Microsoft Bing APIs in any manner. Other Web links, images, news, and videos on you.com are powered by Microsoft Bing. Read Microsoft Bing Privacy Policy

I am not going to comment on the usefulness of the You.com search results. Instead, navigate to www.you.com and run some queries. I am a dinobaby, and I like command line searching. You do not need to criticize me for my preference for Stone Age search tools. I am 78 and will be in one of Dante’s toasty environments. Boolean search? Burn for eternity. Okay with me.

I would not like to be Google’s alleged head of search (maybe the word “nominal” is preferable to some). That individual is a former Verity wizard named Prabhakar Raghavan. His domain of Search, Google Assistant, Ads, Commerce, and Payments has been expanded by the colorful Code Red activity at the Google. Mr. Raghavan’s expertise and that of his staff appears to be ill-equipped to deal with one of the least secret of Microsoft’s activities. Allegedly more Google wizards have been enlisted to deal with this existential threat to Google’s search and online ad business. Well, Google is two decades old, overstaffed, and locked in its aquarium. It presumably watched Microsoft invest a billion into ChatGPT and did not respond. Hello, Prabhakar?

The “value” has looked like adding ChatGPT-like functions and maybe some of its open sourciness to Microsoft’s ubiquitous software. One can envision typing a dot point in PowerPoint and the smart system will create a number of slides. The PowerPoint user fiddles with the words and graphics and rushes to make a pitch at a conference or a recession-proof venture capital firm.

Imagine a Microsoft application which launches ChatGPT-type of smart search in a Word document. This type of function might be useful to crypto bros who want to explain how virtual tokens will become the Yellow Brick Road to one of the seven cities of Cibola. Sixth graders writing an essay and MBAs explaining how their new business will dominate a market will find this type of functionality a must-have. No LibreOffice build offers this type of value…yet.

What if one thinks about Outlook? (I would prefer not to know anything about Outlook, but there are individuals who spend hours each day fiddling around in email.) Writing email can become a task for ChatGPT-like software. Spammers will love this capability, particularly combined with VBScript.

The ultimate, of course, will be the integration of Teams and ChatGPT. The software can generate an instance of a virtual person, and the search function can generate responses to questions directed at the construct presented to others in a Teams session. This capability is worth big bucks.

Let’s step back from the fantasies of killing Google and making Microsoft Office apps interesting.

Microsoft faces a handful of challenges. (I will not mention Microsoft’s excellent judgment in referencing the Federal Trade Commission as unconstitutional. Such restraint.)

First, the company has a somewhat disappointing track record in enterprise security. Enough said.

Second, Microsoft has a fascinating series of questionable engineering decisions. One example is the weirdness of old code in Windows 11. Remember that Windows 10 was to be the last version of Windows. Then there is the chaos of updates to Windows 11, particularly missteps like making printing difficult. Again, enough said.

Third, Google has its own smart software. Either Mr. Raghavan is asleep at the switch and missed the signal from Microsoft’s 2019 one billion dollar investment in OpenAI, or Google’s lawyers have stepped on the smart software brake. Who owns outputs built from the content of Web sites? What happens when content from the European Union appears in outputs? (You know the answer to that question. I think it is even bigger fines which will make Facebook’s recent half a billion dollar invoice look somewhat underweight.)

When my research team and I talked about the You.com-type search and the use of ChatGPT or other OpenAI technology in business, law enforcement, legal, healthcare, and other use cases, we hypothesized that:

  1. Time will be required to get the gears and wheels working well enough to deliver consistently useful outputs
  2. Google has responded and no one noticed much except infinite scrolling and odd “cards” of allegedly accurate information in response to a user’s query.
  3. Legal issues will throw sand in the gears of the machinery once the ambulance chasers tire of Camp Lejeune litigation
  4. Aligning costs of resources with the to-be revenue will put some potholes on this off-ramp of the information superhighway.

Net net: The world of online services is often described as being agile. A company can turn on a dime. New products and services can be issued, and fixes can make a system better over time. I know Boolean works. The ChatGPT thing seems promising. I don’t know if it replaces human thought and actions in certain use cases. Assume you have cancer. Do you want your oncologist to figure out what to do using Bing.com, Google.com, or You.com?

Stephen E Arnold, January 9, 2023

Backups: Slam Dunk? Well, No and Finding That Out Is a Shock to Some

January 9, 2023

Flash back in time: You have an early PC. You have files on floppy discs. In order to copy a file, one had to fiddle around, maybe swapping discs or borrowing a disc duplicator from a friend in the technology game. When one disc is bad, one just slots in the second disc. Oh, oh. That disc is bad too. In the early 1980s, that type of problem on an Eagle computer or DEC Rainbow could force a person back to a manual typewriter and a calculating machine with a handle no less.

Today, life is better, right? There are numbers that explain the mean time between failure of speedy solid state discs. If one pokes around, there are back-in-fashion tape back up systems. Back up software can be had for free or at prices limited only by the expertise of the integrator bundling hardware and software. Too expensive? Lease the hardware and toss in a service plan. What happens when the back up data on the old, reliable magnetic tape cannot be read? Surprise.

The cloud provides numerous back up options. One vendor, which I shall not name, promises automatic back up. The surprise on the face of the customer who stores high-value data in a uniquely named file folder is fascinating. You may be able to see this after a crash, when the cloud believer learns that the uniquely named folder was not backed up. Surprise for sure.

I read “EA Says It Can’t Recover 60% of Players’ Corrupted Madden Franchise Save Files.” I am not into computer games. I don’t understand the hardship created by losing a “saved game.” That’s okay. The main point of the article strikes me as:

EA says that a temporary “data storage issue” led to the corruption of many Madden NFL 23 players’ Connected Franchise Mode (CFM) save files last week. What’s worse, the company now estimates it can recover fewer than half of those corrupted files from a backup.

It is 2023, isn’t it?

What’s clear is that this company did not have a procedure in place to restore lost data.

Some things never change. Here’s an example. Someone calls me and says, “My computer crashed.” I ask, “Do you have a back up?” The person says, “Yes, the system automatically saves data to an external drive.” I ask, “Do you have another copy on a cloud service or a hard drive you keep at a friend’s house?” The person says, “No, why would I need that?”

The answer, gentle reader, is that multiple back ups are necessary even in 2023.
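For the curious, a minimal sketch of the “more than one copy, more than one place” idea appears below. The paths are hypothetical placeholders, and a real setup would verify the copies and include an off-site or cloud target.

```python
# Minimal sketch: copy a source folder to several independent destinations.
# The paths are hypothetical placeholders; a real setup should verify restores.
import shutil
from datetime import date
from pathlib import Path

SOURCE = Path("~/Documents/high_value_data").expanduser()
DESTINATIONS = [
    Path("/Volumes/ExternalDrive/backups"),  # local external drive
    Path("/mnt/offsite_share/backups"),      # second, separate location
]

stamp = date.today().isoformat()
for dest in DESTINATIONS:
    target = dest / f"{SOURCE.name}_{stamp}"
    shutil.copytree(SOURCE, target, dirs_exist_ok=True)
    print(f"Copied {SOURCE} -> {target}")
```

One scheduled copy to one drive is not a strategy; the second, separate location is the part most people skip.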

Some folks are slow learners.

Stephen E Arnold, January 9, 2023

SQL Made Easy: Better Than a Human? In Some Cases

January 9, 2023

Just a short item for anyone who has to formulate Structured Query Language queries. Years ago, SQL queries were a routine task for my research team. Today, the need has decreased. I have noticed that my recollection and muscle memory for SQL queries have eroded. Now there is a solution which seems to work reasonably well. Is the smart software as skilled as our precious Howard? Nope. But Howard lives in DC, and I am in rural Kentucky. Since neither of us likes email or telephones, we communicate via links to data available for download and analysis. Hey, the approach works for us. But SQL queries? Just navigate to TEXT2SQL.AI. Once you sign in using one of the popular privacy invasion methods, you can enter a free text statement and get a well-formed SQL query. Is the service useful? It may be. The downside is the overt data collection approach.
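To illustrate the idea (a hypothetical sketch, not output captured from TEXT2SQL.AI), the round trip looks like handing over a plain-English request such as “show the 2022 invoices with the largest amounts” and getting back a well-formed statement like the one embedded below.

```python
# Hypothetical illustration of the text-to-SQL idea using SQLite.
# The table, data, and generated query are made up for demonstration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER, customer TEXT, amount REAL, paid_on TEXT)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?, ?, ?)",
    [(1, "Acme", 1200.00, "2022-03-01"),
     (2, "Globex", 830.50, "2022-07-15"),
     (3, "Initech", 4100.00, "2021-11-30")],
)

# The kind of well-formed SQL a text-to-SQL service might return
# for "show the 2022 invoices with the largest amounts":
generated_sql = """
SELECT customer, amount
FROM invoices
WHERE paid_on BETWEEN '2022-01-01' AND '2022-12-31'
ORDER BY amount DESC
LIMIT 5;
"""

for row in conn.execute(generated_sql):
    print(row)
```

The convenience is real; the trade-off, as noted above, is what the service learns about your schema and your questions.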

Stephen E Arnold, January 9, 2023

Smart Software: Just One Real Problem? You Wish

January 6, 2023

I read “The One Real Problem with Synthetic Media.” When consulting and publishing outfits offer a “one real problem” analysis, I get goose bumps. Am I cold? Nah, I am frightened. Write ups that claim to deliver the truth frighten me. Life is, no matter what mid-tier consulting outfits say, slightly more nuanced.

What is the one real problem? The write up asserts:

Don’t use synthetic media for your business in any way. Yes, use it for getting ideas, for learning, for exploration. But don’t publish words or pictures generated by AI — at least until there’s a known legal framework for doing so. AI-generated synthetic media is arguably the most exciting realm in technology right now. Some day, it will transform business. But for now, it’s a legal third rail you should avoid.

What’s the idea behind the shocking metaphor? The third rail provides electric power to a locomotive. I think the point is that one will be electrocuted should an individual touch a live third rail.

Okay.

Are there other issues beyond the legal murkiness?

Yes, let me highlight several which strike me as important.

First, the smart software can output weaponized information quickly and economically. Whom can one believe? A college professor funded by a pharmaceutical company or a robot explaining the benefits of an electric vehicle? The hosing of synthetic content and data into a society may prove more corrosive than human outputs alone. Many believe that humans are expert misinformation generators. I submit that smart software will blow the doors off the human content jalopies.

Second, smart software ingests data, whether right or wrong, human generated or machine generated, and outputs results based on these data. What happens when machine generated content reduces the human generated content to tiny rivulets? The machine output is as formidable as Hokusai’s wave. Those humans in the boats: Goners perhaps?

Third, my thought is that in some parts of the US the slacker culture is the dominant mode. Forget that crazy, old-fashioned industrial revolution 9-to-5 work day. Ignore the pressure to move up, earn more, and buy a Buick, not a Chevrolet. Slacker culture dwellers look for the easy way to accomplish what they want. Does this slacker thing explain some FTX-type behavior? What about Amazon’s struggles with third-party resellers’ products? What about Palantir Technology buying advertising space in the Wall Street Journal to convince me that it is the leader in smart software? Yeah, slacker stuff in my opinion. These examples and others mean that the DALL-E and ChatGPT type of razzle dazzle will gain traction.

Where are the legal questions in these three issues? Sure, legal eagles will fly when there is an opportunity to bill.

I think the smart software thing is a good example of “technology is great” thinking. The one real problem is that it is not.

Stephen E Arnold, January 6, 2023

UK Focused on Apple and Google in 2023

January 6, 2023

While there continues to be some market competition among big tech companies, each has its own monopoly in the technology industry. The United States is slow to address these industry monopolies, but the United Kingdom wants to end Google’s and Apple’s control, says Mac Rumors in the article “UK Begins Market Investigation Into Apple and Google’s Mobile Dominance.”

The UK Competition and Markets Authority (CMA) will investigate how Apple and Google dominate the mobile market as well as Apple’s restrictions on cloud gaming through its App Store. Smaller technology and gaming companies stated that Google and Apple are harming their bottom lines and holding back innovation:

“The consultation found 86% of respondents support taking a closer look at Apple and Google’s market dominance. Browser vendors, web developers, and cloud gaming service providers said the tech giants’ mobile ecosystems are harming their businesses, holding back innovation, and adding unnecessary costs.

The feedback effectively justifies the findings of a year-long study by the CMA into Apple and Google’s mobile ecosystems, which the regulatory body called an “effective duopoly” that allows the companies to “exercise a stranglehold over these markets.” According to the CMA, 97% of all mobile web browsing in the UK in 2021 happened on browsers powered by either Apple’s or Google’s browser engine, so any restrictions can have a major impact on users’ experiences.”

The CMA will conduct an eighteen-month-long investigation and will require Apple to share information about its business products. After the investigation, the CMA could legally force Apple to make changes to its business practices. Apple, of course, insists its current practices promote innovation and competition as well as protect users’ privacy and security.

Whitney Grace, January 6, 2023
