Apple and Google: Teaming Up for a Super Great Reason?

April 21, 2020

In a remarkable act of virtue signaling, Apple and Google have joined forces to deal with the coronavirus. The approach is not the invention of a remedy, although both companies have dabbled in health. In DarkCyber’s view, the mechanism is surveillance centric.

“Google Apple Contact Tracing (GACT): A Wolf in Sheep’s Clothes” provides an interesting opinion about the Google Apple Contact Tracing method. The idea seems to be that there are two wolves amongst the sheep. The sheep cooperate because that’s the nature of sheep. The wolves have the system, the data, and the methodology to make the sheep better. Are there other uses of the system? It is too soon to tell. But we can consider what the author asserts:

But the bigger picture is this: it creates a platform for contact tracing that works all across the globe for most modern smart phones (Android Marshmallow and up, and iOS 13 capable devices) across both OS platforms.


The write up states:

Whenever a user tests positive, the daily keys his or her devices used the last 14 days can be retrieved by the app through the GACT API, presumably only after an authorised request from the health authorities. How this exactly works, and in particular how a health authority gets authorised to sign such request or generate a valid confirmation code is not clear (yet). The assumption is that these keys are submitted to a central server set up by the contact tracing app. Other instances of the same app on other people’s phones are supposed to regularly poll this central server to see if new daily keys of phones of recently infected people have been uploaded. Another function in the GACT API allows the app to submit these daily keys to the operating system for analysis. The OS then uses these keys to derive all possible proximity identifiers from them, and compares each of these with the proximity identifiers it has stored in the database of identifiers recently received over Bluetooth. Whenever a match is found, the app is informed, and given the duration and time of contact (where the time may be rounded to daily intervals).
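
The cryptographic details live in the published GACT documents, but the matching step the author describes can be pictured with a short sketch. The Python below is a deliberately simplified illustration, not the specification: the HMAC construction, the 16-byte identifier length, and the 10-minute interval count are assumptions made for readability. In the real system the derivation follows the published key schedule and the comparison happens inside the operating system, with the app told only that a match occurred, plus coarse duration and date information.

    import hmac
    import hashlib

    INTERVALS_PER_DAY = 144  # one identifier per 10-minute window (illustrative assumption)

    def derive_proximity_ids(daily_key: bytes) -> list:
        """Derive a day's short-lived proximity identifiers from a daily key.
        Illustrative HMAC construction, not the real GACT derivation."""
        return [
            hmac.new(daily_key, f"interval-{i}".encode(), hashlib.sha256).digest()[:16]
            for i in range(INTERVALS_PER_DAY)
        ]

    def find_matches(published_daily_keys: list, observed_ids: set) -> list:
        """Check identifiers derivable from infected users' published daily keys
        against the identifiers this phone recorded over Bluetooth."""
        matches = []
        for key in published_daily_keys:
            for proximity_id in derive_proximity_ids(key):
                if proximity_id in observed_ids:
                    matches.append(proximity_id)
        return matches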

The author includes this observation about the procedure:

Google and Apple announced they intend to release the API’s in May and build this functionality into the underlying platforms in the months to follow. This means that at some point in time operating system updates (through Google Play Services updates in the case of Android) will contain the new contact tracing code, ensuring that all users of a modern iPhone or Android smartphone will be tracked as soon as they accept the OS update. (Again, to be clear: this happens already even if you decide not to install a contact tracing app!) It is unclear yet how consent is handled, whether there will be OS settings allowing one to switch on or off contact tracing, what the default will be.

The write up concludes with this statement:

We have to trust Apple and Google to diligently perform this strict vetting of apps, to resist any coercion by governments, and to withstand the temptation of commercial exploitation of the data under their control. Remember: the data is collected by the operating system, whether we have an app installed or not. This is an awful amount of trust….

DarkCyber formulated several observations:

  1. The system appears to be more accessible than existing specialized services now available to some authorities
  2. Apple’s and Google’s cooperation seems mature in terms of operational set up. When did work on this method begin?
  3. Systems operated by private companies on behalf of government agencies rely on existing legal and contractual mechanisms to persist; that is, once funded or otherwise supported, the programs tend to operate in an increasingly seamless manner.

Worth monitoring how this somewhat rapid and slightly interesting tag team duo defeats its opponent.

Stephen E Arnold, April 21, 2020

Cookies and Fingerprints: You Will Be Monitored by Mom

April 15, 2020

Everywhere you go on the Internet, cookies are tracking your movements (even with a VPN). The technology is over a decade old, and trackers range from tracking pixels and content trackers to cross-site tracking cookies, social trackers, and browser fingerprinting. The Next Web explains that browser fingerprinting is becoming more popular with advertisers in the article, “Digital Fingerprints Are The New Cookies-And Advertisers Want Yours.”

Digital fingerprinting refers to a company generating a profile of your device’s characteristics. These can include everything from the operating system down to browser settings. In other words, it is more like an anonymous barcode. Your identity is not attached to the digital fingerprint, but your data is used by advertisers to send targeted ads.
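
To make the “anonymous barcode” idea concrete, here is a minimal Python sketch of how a fingerprinting script might combine device characteristics into a stable identifier. The attribute names and values are illustrative assumptions; real fingerprinters gather dozens of signals, including canvas and audio rendering quirks.

    import hashlib
    import json

    def device_fingerprint(attributes: dict) -> str:
        """Hash a set of device characteristics into a stable identifier.
        No name or account is involved, yet the hash is often unique enough
        to follow one device across sites."""
        canonical = json.dumps(attributes, sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()

    print(device_fingerprint({
        "os": "Windows 10",
        "browser": "Firefox 74",
        "screen": "1920x1080",
        "timezone": "UTC-5",
        "language": "en-US",
        "font_count": 312,
    }))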

Banks use digital fingerprinting as a security measure. Banking Web sites can identify the device you are using; if they cannot, they ask security questions. Advertisers now want the technology to make more money. For users, it is more along the lines of capitalist Big Brother.

There are ways to limit digital fingerprinting. Most of the tracking happens in the browser, so look through your browser’s settings to see whether it offers tracking protection. Even if you turn on tracking protection, it does not entirely hide you:

“While “incognito mode” prevents your browser history from being recorded on your computer and prevents your spouse to spy on you, it does not prevent websites that you visit from collecting data about you and it does nothing to block fingerprinting. Similarly, clearing your browsing history on a regular basis, while a healthy thing to do, does not address fingerprinting either.

While ad blockers block ads from loading, not all ad blockers also block trackers, even less fingerprinters. Trackers can come attached to ads, but quite often they are not part of the ad delivery process itself. Social trackers, tracking pixels and fingerprinters for instance don’t need to piggyback on an ad to track your data.”

To avoid cookies, use a private connection, a decent VPN, and incognito mode. These measures do not work 100 percent of the time, but they are better than handing everything to capitalist Big Brother.

Whitney Grace, April 15, 2020

Startup Gretel Building Anonymized Data Platform

March 19, 2020

There is a lot of valuable but sensitive data out there that developers and engineers would love to get their innovative hands on, but it is difficult to impossible for them to access. Until now.

Enter Gretel, a startup working to anonymize confidential data. We learn about the upcoming platform from Inventiva’s article, “A Group of Ex-NSA And Amazon Engineers Are Building a ‘GitHub for Data’.” Co-founders Alex Watson, John Myers, Ali Golshan, and Laszlo Bock were inspired by the source code sharing platform GitHub. Reporter surbhi writes:

“Often, developers don’t need full access to a bank of user data — they just need a portion or a sample to work with. In many cases, developers could suffice with data that looks like real user data. … ‘We’re building right now software that enables developers to automatically check out an anonymized version of the data set,’ said Watson. This so-called ‘synthetic data’ is essentially artificial data that looks and works just like regular sensitive user data. Gretel uses machine learning to categorize the data — like names, addresses and other customer identifiers — and classify as many labels to the data as possible. Once that data is labeled, it can be applied access policies. Then, the platform applies differential privacy — a technique used to anonymize vast amounts of data — so that it’s no longer tied to customer information. ‘It’s an entirely fake data set that was generated by machine learning,’ said Watson.”
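
Gretel has not published its implementation, so the sketch below shows only the textbook Laplace mechanism that the phrase “differential privacy” usually points to, applied to a simple counting query. The function name, the epsilon value, and the example numbers are illustrative assumptions, not Gretel’s code.

    import math
    import random

    def laplace_mechanism(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
        """Release a count with Laplace noise scaled to sensitivity/epsilon,
        the standard differential privacy mechanism for counting queries."""
        scale = sensitivity / epsilon
        u = random.random() - 0.5
        noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
        return true_count + noise

    # e.g. report roughly how many customer records share a ZIP code without
    # letting the presence of any single customer be inferred from the answer
    print(laplace_mechanism(true_count=1287, epsilon=0.5))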

The founders are not the only ones who see the merit in this idea; so far, the startup has raised $3.5 million in seed funding. Gretel plans to charge users based on consumption, and the team hopes to make the platform available within the next six months.

Cynthia Murrell, March 19, 2020

Banjo: A How To for Procedures Once Kept Secret

March 13, 2020

DarkCyber wrote about BlueDot and how that company makes reasonably clear the steps it takes to derive actionable intelligence from open source and some other types of data. Ten years ago, the processes implemented by BlueDot would have been shrouded in secrecy.

From Secrets to Commercial Systems

Secret and classified information seems to find its way into social media and the mainstream media. DarkCyber noted another example of a company utilizing some interesting methods written up in a free online publication.

DarkCyber can visualize old-school companies depending on sales to law enforcement and the intelligence community asking themselves, “What’s going on? How are commercial firms getting this know-how? Why are how-to and do-it-yourself travel guides to intelligence methods becoming so darned public?”

It puzzles DarkCyber as well.

Let’s take a look at the revelations in “Surveillance Firm Banjo Used a Secret Company and Fake Apps to Scrape Social Media.” The write up explains:

  • A company called Pink Unicorn Labs created apps which obtained information from users. Users did not know their data were gathered, filtered, and cross correlated.
  • Banjo, an artificial intelligence firm that works with police, used a shadow company to create an array of Android and iOS apps that looked innocuous but were specifically designed to secretly scrape social media. The developer of the apps was Pink Unicorn. Banjo CEO Damien Patton created Pink Unicorn.
  • Why create apps that seemed to do one thing while performing data inhalation? Dataminr received an investment from Twitter and has access to the Twitter fire hose. Banjo, the write up says, “did not have that sort of data access.” The fix? Create apps that sucked data.
  • The apps obtained information from Facebook, Twitter, Instagram, Russian social media app VK, FourSquare, Google Plus, and Chinese social network Sina Weibo.
  • The article points out: “Once users logged into the innocent looking apps via a social network OAuth provider, Banjo saved the login credentials, according to two former employees and an expert analysis of the apps performed by Kasra Rahjerdi, who has been an Android developer since the original Android project was launched. Banjo then scraped social media content.”
  • The write up explains, Banjo, via a deal with Utah, has access to the “state’s traffic, CCTV, and public safety cameras. Banjo promises to combine that input with a range of other data such as satellites and social media posts to create a system that it claims alerts law enforcement of crimes or events in real-time.”

Discussion

Why social media? On the surface and to most parents and casual users of Facebook, Twitter, and YouTube, there are quite a few cat posts. But via the magic of math, an analyst or a script can look for data which fills in missing information. The idea is to create a record of a person, leave blanks where desirable information is not yet plugged in, and then rely on software to spot the missing item.

How is this accomplished? The idea is simple. One known fact appears in the profile, and that fact also appears in another, unrelated item of content. The correlated item of content is then scanned by a script, and any information missing from the profile is plugged in. Using this method and content from different sources, a clever system can compile a dossier on an entity. Open source information yields numerous gems; for example, a cute name applied to a boyfriend might become part of a person of interest’s Dark Web handle. Phone numbers, geographic information, friends, and links to other interesting content surface. Scripts work through available data, which can be obtained in many ways. The methods are those which were shrouded in secrecy before the Internet started publishing essays revealing what some have called “tradecraft.”
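
A toy version of that profile-filling loop, assuming nothing about any vendor’s actual code, looks like this: link a document to a profile on any shared known fact, then copy over the fields the profile is still missing. The handle and phone number below are made-up examples.

    def enrich_profile(profile: dict, documents: list) -> dict:
        """Fill blank fields in a person-of-interest profile by correlating any
        document that shares at least one known fact with the profile.
        A toy illustration of open source enrichment, not any vendor's method."""
        known = {value for value in profile.values() if value}
        enriched = dict(profile)
        for doc in documents:
            if known & set(doc.values()):          # one overlapping fact links the document
                for field, value in doc.items():
                    if value and not enriched.get(field):
                        enriched[field] = value    # plug in the missing item
                        known.add(value)
        return enriched

    profile = {"handle": "fluffybear99", "phone": "", "city": ""}
    documents = [
        {"handle": "fluffybear99", "phone": "555-0131"},
        {"phone": "555-0131", "city": "Provo"},
    ]
    print(enrich_profile(profile, documents))
    # {'handle': 'fluffybear99', 'phone': '555-0131', 'city': 'Provo'}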

Net Net

Banjo troubles DarkCyber on a number of levels:

  1. Secrecy has significant benefits. Secrets, once let loose, have interesting consequences.
  2. Users are unaware of the risks apps pose. Cluelessness is in some cases problematic.
  3. The “now” world looks more like an intelligence agency than a social construct.

Stephen E Arnold, March 13, 2020

Eliminalia: Reputation Management and Content Removal

March 12, 2020

One of our readers called our attention to a company called Eliminalia. This firm provides what DarkCyber considers reputation management services. The unique selling proposition for the firm is that it says that it can achieve results quickly. DarkCyber does not have a firm position on the value of reputation management firms. The organizations or individuals who want content removed may feel a compelling need to modify history or take content corrective actions. Because removing content rests in the hands of a third party, often a large indexing company, getting attention and action can be a challenging job. Europa Press asserts that 24 percent of people and businesses want to have data about them removed from “the Internet.” We took a quick look at our files and located some information. Here’s a summary of points we found interesting.


Plus, the firm asserts:

We are the first to guarantee the results or we will refund your money. We will give an answer to your doubts and needs. We will help you and advise you on a global level.

The firm adds:

We delete internet data and information and guarantee your right to be forgotten. Eliminalia is the leading company in the field which guarantees that the information that bothers and harms you is completely deleted from Internet search engines (Google, Bing, etc.), web portals, blogs..

The firm offers three videos on Vimeo. The most recent video is at https://vimeo.com/222670049 and includes this commentary:

Eliminalia is a renowned company with several world headquarters that protects online privacy and reputation of its customers, finding and removing negative contents from the Web.

There are several YouTube videos as well. These may be located at this link.

The company has offices in Brazil, Colombia, Ecuador, Italy, Mexico, Switzerland, and the United Kingdom.


Eliminalia offers a mobile app for iPhones and Android devices.

The firm’s Web site asserts:

  • 99% happy satisfied clients
  • 8260+ success stories
  • 3540 business clients.

The company states:

We delete your name from:

  • Mass media
  • State gazettes
  • Social media

The president of Eliminalia is Dídac Sánchez. The company was founded in 2013. Crunchbase lists the date of the company’s founding as 2011.


There is an interesting, but difficult to verify, article about the Eliminalia process in “Why Is William Hill a Corporate Partner of Alzheimer’s Society?” The assertions about Eliminalia appear toward the end of the WordPress post. These can be located by searching for the term “Eliminalia.” One interesting item in the write up is that the Eliminalia business allegedly shares an address with World Intelligence Ltd. It is also not clear if Eliminalia is headquartered in Manchester at 53 Fountain Street. Note: the William Hill article includes other names allegedly associated with the company.

DarkCyber believes the company focuses on selling its services in countries with data protection regulations. The firm has a strong Spanish flavor.

If you are interested in having content removed from the Internet, consider speaking with Eliminalia. DarkCyber believes that some content can be difficult to remove. Requests for removal can be submitted. Some sites have a “removal request” button, like www.accessify.com. However, there may be backlogs, bureaucracy, and indifference to requests interpreted as trivial or nuisance filings. Some of our research revealed quite interesting details about the firm. DarkCyber can prepare a more robust summary of the company, including information about the methods used to remove content from the Internet.

Stephen E Arnold, March 12, 2020

BOB: A Blockchain Phone

November 29, 2019

Remember the comment by some FBI officials about going dark? The darkness is now spreading. “Meet BOB, World’s First Modular Blockchain-Powered Smartphone” reports that a cryptocurrency-centric phone may become more widely available.

The write up states:

BOB runs on Function X OS, which is an open-source operating system. As it uses the blockchain ecosystem, every task on the phone, be it sending texts, making calls, browsing the web, and file sharing, all happen on a decentralized network, making it highly encrypted and thus secure. Each unit of the BOB is a node that supports the entire Function X blockchain system.

DarkCyber thinks that Mr. Comey was anticipating these types of devices as well as thinking about Facebook’s encrypted message systems.

For more details, consult the TechRadar article.

One important point: The BOB has a headphone jack. Even those concerned about privacy and secrecy like their tunes.

Stephen E Arnold, November 29, 2019

Is Google Thinking about Turkeys?

November 27, 2019

Is Google actually fearful of an authoritarian government? Google is okay with firing people who do not go along. Google exerts considerable force. Is Google a company driven by dollar signs? Is it possible that Google fears anything and anyone that threatens its net profit? The Register explains the cause of Google’s fear in “Google Takes Sole Stand on Privacy, Rejects New Rules For Fear Of ‘Authoritarian’ Review.”

Google, like any company from a capitalist society, is leery of any organization that wishes to restrain its power. Google recently blocked a new draft of the Privacy Interest Group (PING)’s charter. PING is a group within the W3C web standards body. Google blocked the new draft because, in its view, it creates an unchecked, authoritarian review group and will cause “significant unnecessary chaos in the development of the web platform.”

PING exists to enforce the technical specifications the W3C issues to respect people’s Web privacy. The W3C provides horizontal review, in which members share suggestions with specification authors to ensure the documents respect privacy. Since the middle of 2019, PING’s sixty-eight members have tried to rewrite the group’s charter. The first draft was shared with 450 W3C members, one of which is Google, and only twenty-six responded. Of those twenty-six, Google was the only one that objected.

Google supports PING’s horizontal review, but the search giant did not want to invest in the new charter without the group having more experience. There are not many differences between the charter drafts:

“‘The new charter is not dramatically different from the existing one, Doty said in an email. ‘It includes providing input and recommendations to other groups that set process, conduct reviews or approve the progression of standards and mentions looking at existing standards and not just new ones. I think those would all have been possible under the old charter (which I drafted originally); they’re just stated more explicitly in this draft. It includes a new co-chair from Brave, in addition to the existing co-chairs from the Internet Society and Google.’

Doty said he’s not surprised there would be discussion and disagreement about how to conduct horizontal spec reviews. ‘I am surprised that Google chose to formally object to the continued existence of this interest group as a way to communicate those differences,’ he said.”

Doty hopes that Google will invest in PING and Web privacy, but Google’s stance is more adversarial. Google and other tech companies are worried about their business models changing if cookies are blocked. Google does not want to lose the majority of its business, which comes from advertising through its search engine. Google might protect privacy, but only so far as doing so does not interfere with its bottom line.

Whitney Grace, November 27, 2019

Light Bulb On. Consumers Not Thrilled with What They See

November 22, 2019

We cannot say this comes as much of a surprise. Citing a recent Pew survey, Fortune reports, “Americans to Companies: We Don’t Trust You With Our Personal Data.” Any confidence the public had that companies can safeguard personal data has been eroded by news of data breach after data breach. On top of that, many consumers have noticed how eerily accurate targeted ads have become due to unannounced data sharing by the likes of Facebook and Google. Writer Danielle Abril tells us:

“The Pew survey, based on responses from 4,272 U.S. adults between June 3 and June 17, found that most Americans doubt that companies will publicly admit to and take responsibility for mismanaging their data. Seventy-nine percent of respondents said they have little to no confidence that businesses will do the right thing. And even though many continue to exchange their data for services and products, 81% of people feel the risks now outweigh the benefits of the exchange. The sentiments appear have intensified over time, as 70% of those surveyed said they feel that their personal information is less secure than it was five years ago. … The survey found that 83% of respondents frequently or occasionally see ads that appear to be based on profiles companies created using their personal data. And of that group, 61% say that the ads are somewhat or very good at accurately reflecting their interests. But that doesn’t mean that people actually want companies using their data this way. More than eight in 10 people are concerned about the information social media companies and advertisers know about them.”

Pointing to user agreements, companies insist they are playing by the rules. They are not wrong, but they are quite aware how opaque those agreements are to most consumers. Over 80 percent of respondents say they are asked each month to agree to one privacy policy or another, and a third say they do so weekly. However, most only skim the policies, at best. Of those who do read them through, more than 85 percent only partially understand them. While it is true that, legally, it is on the consumers to understand what they are signing, tech companies could certainly make it easier. They won’t, though, as long as they can profit from users’ confusion.

Cynthia Murrell, November 22, 2019

Google and Privacy: Our Way, Please

October 25, 2019

Google has made its privacy stance known. The Register reports, “Google Takes Sole Stand on Privacy, Rejects New Rules for Fear of ‘Authoritarian’ Review.” The company’s solitary “no” vote halted a proposed charter revision at the W3C’s Privacy Interest Group (PING). The proposed revision would have slightly changed the charter to allow recommendations to be made to groups that set processes, conduct reviews, and approve the progression of standards, as well as to require considering existing standards alongside new ones, according to PING member and author of the original charter, Nick Doty. The vote had to be unanimous to pass, and Google says it put its foot down to avoid “unnecessary chaos.” Writer Thomas Claburn reveals:

“As The Register has heard, the issue for Google is that more individuals are participating in PING and there’s been some recent pushback against work in which Google has been involved. In other words, a formerly cordial group has become adversarial. The required context here is that over the past few years, a broad consensus has been building around the need to improve online privacy. Back in 2014, not long after Edward Snowden’s revelations about the scope of online surveillance transformed the privacy debate, the Internet Engineering Task Force published an RFC declaring that pervasive monitoring is an attack on privacy. That concern has become more widespread and has led to legislation like the California Consumer Privacy Act (opposed by Google) and efforts by companies like Apple, Brave, and Mozilla to improve privacy by blocking ad tracking. ‘The strategic problem for Google, with Apple, Brave, Mozilla, Samsung all blocking tracking, is how to preserve their business advantages and share price while appearing to be “pro privacy,”’ said Brendan Eich, CEO of Brave, in a message to The Register.”

In a move some called “privacy gas lighting,” Google proposed a “privacy sandbox,” its plan to change the very way cookies work to preserve privacy without sacrificing advertisers’ tracking ability. Why would it go there before PING got the chance to review other specifications? There are already browser-based privacy protections that need standardization, Eich emphasizes, and the W3C is obliged to do so. Google did not respond to The Register’s request for comment.

Cynthia Murrell, October 25, 2019

Email Phishing: Yes, It Works

September 19, 2019

Phishing scams, often arriving as spam, are arguably the oldest Internet scam. One would think that after almost thirty years of the Internet and email, people would have wised up to phishing, but no. People still fall for it, and ZDNet has an article that explains why: “Phishing Emails: Here’s Why We Are Still Getting Caught After All These Years.” Here is an interesting fact: phishing emails have been the first stage in many of the security breaches and data hacks of the past few years.

Google blocks more than 100 million scam emails a day, and 68% of the messages are new variations of ones already blocked. What is even more interesting is whom the phishing campaigns target. Enterprise users are five times more likely than regular Gmail users to be targeted, education users are two times more likely, government workers are three times more likely, and non-profits are 3.8 times more likely than regular consumers. The scams run only for a limited time to avoid detection, sometimes hours, sometimes only a few minutes. The scams mask themselves:

“While bulk phishing campaigns only last for 13 hours, more focused attacks are even more short lived; what Google terms as a ’boutique campaign’ — something aimed at just a few individuals in a company — lasts just seven minutes. In half of all phishing campaigns, the email pretends to have come from the email provider, in a quarter it claims to be from a cloud services provider; after that it’s most likely masquerading as a message from a financial services company or ecommerce site.”

An even scarier fact is that 45 percent of Internet users do not understand what phishing is. The bad actors play on that naiveté and use psychological tricks, such as urgency and fear, to get people to comply.

People need to wise up and be aware of Internet scams and phishing attacks. Remember that a reputable company will never ask for your password, and always check the sender’s email address to see whether it looks suspicious. If it contains a lot of random numbers and letters or does not come from the company’s official domain, it is a scam.
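
As a minimal illustration of that last check, the sketch below flags a sender address when its domain is not on a short allow list or when its mailbox name is dominated by random-looking characters. The domains and thresholds are illustrative assumptions, not rules from any vendor.

    import re

    TRUSTED_DOMAINS = {"example-bank.com", "example-shop.com"}  # illustrative allow list

    def looks_suspicious(sender: str) -> bool:
        """Flag a sender whose domain is unknown or whose mailbox name is
        dominated by random-looking digits."""
        local, _, domain = sender.lower().partition("@")
        if domain not in TRUSTED_DOMAINS:
            return True
        digit_ratio = sum(ch.isdigit() for ch in local) / max(len(local), 1)
        return digit_ratio > 0.4 or bool(re.search(r"\d{5,}", local))

    print(looks_suspicious("support@example-bank.com"))       # False
    print(looks_suspicious("a83kq29x7@secure-payments.biz"))  # True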

Whitney Grace, September 19, 2019
