OnionScan Checks for Falsely Advertised Anonymous Sites on Dark Web
July 6, 2016
Dark Web sites are not exempt from false advertising about their anonymity. A recently published article from Vice’s Motherboard, “A Tool to Check If Your Dark Web Site Is Really Anonymous,” describes a program called OnionScan, which detects issues that may unmask a site’s servers or reveal its owners. One example is metadata, such as photo location information, left hidden in images on the site. Sarah Jamie Lewis, the independent security researcher who developed OnionScan, told Motherboard:
The first version of OnionScan will be released this weekend, Lewis said. “While doing some research earlier this year I kept coming across the same issues in hidden services—exposed Apache status pages, images not stripped of exif data, pages revealing information about the tools used to build it with, etc. The goal is [to] provide an easy way of testing these things to drive up the security bar,” Lewis added. It works “pretty much the same as any web security scanner, just tailored for deanonymization vectors,” she continued.
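Lewis’s list of recurring slip-ups, such as images not stripped of EXIF data, hints at how simple some of these checks are. As a rough illustration only, and not OnionScan’s actual code, here is a minimal sketch of one such check: scanning a downloaded JPEG’s bytes for the APP1 segment that carries EXIF metadata (camera model, timestamps, GPS coordinates).

```python
def has_exif(jpeg_bytes: bytes) -> bool:
    """Report whether a JPEG byte stream carries an EXIF APP1 segment.

    EXIF metadata lives in an APP1 segment (marker 0xFFE1) whose payload
    begins with the ASCII string "Exif" followed by two null bytes.
    This is a simplified scan, not a full JPEG parser.
    """
    if not jpeg_bytes.startswith(b"\xff\xd8"):  # SOI marker: not a JPEG
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes) and jpeg_bytes[i] == 0xFF:
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:  # start-of-scan: image data follows, stop looking
            break
        # segment length is big-endian and includes its own two bytes
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length
    return False
```

A real scanner would go further and decode the GPS tags themselves; this sketch only flags that EXIF data is present at all, which is the condition a site operator would want to avoid.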
Interestingly, this tool appears designed to protect users from the mistakes of website administrators who do not set up their sites properly. We suppose it is only a matter of time before researchers start publishing counts of truly secure and anonymous Dark Web sites versus those with outstanding issues.
Megan Feil, July 6, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Hacking Team Cannot Sell Spyware
June 27, 2016
I do not like spyware. Once it is downloaded onto your computer, it is a pain to delete, and it can steal personal information. I think making it should be illegal, although some good can come from spyware in the right hands (ideally). Some companies make and sell spyware to government agencies. One of them is Hacking Team, which recently received some bad news, as Naked Security reports in “Hacking Team Loses Global License to Sell Spyware.”
You might remember Hacking Team from 2015, when its systems were hacked and 500 gigabytes of internal files, emails, and product source code were posted online. The security company has spent the past year trying to repair its reputation, but the Italian Ministry of Economic Development dealt it another blow. The ministry revoked Hacking Team’s “global authorization” to sell its Remote Control System spyware suite to forty-six countries. Hacking Team can still sell within the European Union and expects to receive approval to sell outside the EU.
“MISE told Motherboard that it was aware that in 2015 Hacking Team had exported its products to Malaysia, Egypt, Thailand, Kazakhstan, Vietnam, Lebanon and Brazil.
The ministry explained that “in light of changed political situations” in “one of” those countries, MISE and the Italian Foreign Affairs, Interior and Defense ministries decided Hacking Team would require “specific individual authorization.” Hacking Team maintains that it does not sell its spyware to governments or government agencies where there is “objective evidence or credible concerns” of human rights violations.”
Hacking Team says that if it suspects any of its products have been used to cause harm, or that customers have violated contract terms, it immediately suspends support. Privacy International does not believe that Hacking Team’s self-regulation is enough.
It points to the old argument that software is a tool and humans cause the problems.
Whitney Grace, June 27, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Public Opinion of Dark Web May Match Media Coverage
June 17, 2016
A new survey about the Dark Web was released recently. Wired published an article about the research, “Dark Web’s Got a Bad Rep: 7 in 10 People Want It Shut Down, Study Shows.” Canada’s Centre for International Governance Innovation surveyed 24,000 people in 24 countries about their opinions of the Dark Web. The majority of respondents, 71 percent across all countries and 72 percent of Americans, said they believed the “dark net” should be shut down. The article states,
“CIGI’s Jardine argues that recent media coverage, focusing on law enforcement takedowns of child porn sites and bitcoin drug markets like the Silk Road, haven’t improved public perception of the dark web. But he also points out that an immediate aversion to crimes like child abuse overrides mentions of how the dark web’s anonymity also has human rights applications. ‘There’s a knee-jerk reaction. You hear things about crime and its being used for that purpose, and you say, ‘let’s get rid of it,’’ Jardine says.”
We can certainly attest to media coverage zeroing in on the criminal connections of the Dark Web. We cast a wide net in tracking what is published about the darknet, but many stories, especially those in mainstream sources, emphasize cybercrime. Don’t journalists have something to gain from also publishing features on the aspects of the Dark Web that aid investigations and circumvent censorship?
Megan Feil, June 17, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Behind the Google Search Algorithm
June 16, 2016
Trying to reveal the secrets behind Google’s search algorithm is almost harder than breaking into Fort Knox. Google keeps its 200 ranking factors a secret. What we do know is that keywords do not play the same role they used to, and social media plays some undisclosed part. Search Engine Journal’s “Google Released the Top 3 Ranking Factors” offers a little information to help with SEO.
Google Search Quality Senior Strategist Andrey Lipattsev shared that the three factors are links, content, and RankBrain, in no particular order. RankBrain is an artificial intelligence system that relies on machine learning to help Google process queries and push the most relevant results to the top of the list. SEO experts are trying to figure out how this will affect their jobs, but the article notes:
“We’ve known for a long time that content and links matter, though the importance of links has come into question in recent years. For most SEOs, this should not change anything about their day-to-day strategies. It does give us another piece of the ranking factor puzzle and provides content marketers with more ammo to defend their practice and push for growth.”
In reality, there is not much difference, except that few will be able to explain how artificial intelligence ranks particular sites. Nifty play, Google.
Whitney Grace, June 15, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Speculation About Beyond Search
June 2, 2016
If you are curious to learn more about the purveyor of the Beyond Search blog, you should check out Singularity’s interview, “Stephen E Arnold On Search Engine And Intelligence Gathering.” A little background on Arnold: he is an expert in content processing, indexing, and online search, as well as the author of seven books and monographs. His past employers include Booz, Allen, & Hamilton (the company for which Edward Snowden was a contractor), the Courier Journal & Louisville Times, and Halliburton Nuclear. He worked on the US government’s Threat Open Source Intelligence Service and developed the cost analysis, technical infrastructure, and security for FirstGov.gov.
Singularity’s interview covers a variety of topics and, of course, includes Arnold’s direct sense of humor:
“During our 90 min discussion with Stephen E. Arnold we cover a variety of interesting topics such as: why he calls himself lucky; how he got interested in computers in general and search engines in particular; his path from college to Halliburton Nuclear and Booze, Allen & Hamilton; content and web indexing; his who’s who list of clients; Beyond Search and the core of intelligence; his Google Trilogy – The Google Legacy (2005), Google Version 2.0 (2007), and Google: The Digital Gutenberg (2009); CyberOSINT and the Dark Web Notebook; the less-known but major players in search such as Recorded Future and Palantir; Big Brother and surveillance; personal ethics and Edward Snowden.”
When you listen to experts in a field, you always get a different perspective than what the popular news outlets give. Arnold offers a unique take on search as well as the future of Internet security, especially the future of the Dark Web.
Whitney Grace, June 2, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Everyone Rejoice! We Now Have Emoji Search
June 1, 2016
It was only a matter of time after image search became a viable and useful tool that someone would develop GIF search. Someone also thought it would be a keen idea to design an emoji search, and now, ladies and gentlemen, we have it! Tech Viral reports that “Now You Can Search Images On Google Using Emoji.”
Using the Google search engine is a very easy process: type in a few keywords or a question, click search, and then delve into the results. The Internet, though, is a place where people develop content and apps just for “the heck of it,” and Google designed an emoji search option probably for that very reason. Users can type in an emoji, instead of words, to conduct an Internet search.
The new emoji search is based on the same recognition skills as Google image search, but the biggest question is how many emojis Google will support with the new function.
“Google has taken searching algorithm to the next level, as it is now allowing users to search using any emoji icon. Google stated ‘An emoji is worth a thousand words’. This feature may be highly appreciated by lazy Google users, as they now they don’t need to type a complete line instead you just need to use an emoji for searching images.”
It really sounds like search for lazy people, and do not be surprised to get a variety of results that have no relation to the emoji or your intended information need. An emoji might be worth a thousand words, but that is a lot of words with various interpretations.
Whitney Grace, June 1, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Paid Posts and PageRank
May 27, 2016
Google users rely on the search engine’s quality-assurance algorithm, PageRank, to serve up the links most relevant to their query. Blogger and Google engineer Matt Cutts declares, reasonably enough, that “Paid Posts Should Not Affect Search Engines.” His employer, on the other hand, has long disagreed with this stance. Cutts concedes:
“We do take the subject of paid posts seriously and take action on them. In fact, we recently finished going through hundreds of ‘empty review’ reports — thank you for that feedback! That means that now is a great time to send us reports of link buyers or sellers that violate our guidelines. We use that information to improve our algorithms, but we also look through that feedback manually to find and follow leads.”
Well, that’s nice to know. However, Cutts emphasizes, no matter how rigorous the quality assurance, there is good reason users may not want paid posts to make it through PageRank at all. He explains:
“If you are searching for information about brain cancer or radiosurgery, you probably don’t want a company buying links in an attempt to show up higher in search engines. Other paid posts might not be as starkly life-or-death, but they can still pollute the ecology of the web. Marshall Kirkpatrick makes a similar point over at ReadWriteWeb. His argument is as simple as it is short: ‘Blogging is a beautiful thing. The prospect of this young media being overrun with “pay for play” pseudo-shilling is not an attractive one to us.’ I really can’t think of a better way to say it, so I’ll stop there.”
Cynthia Murrell, May 27, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Google Changes Its Algorithm Again
May 26, 2016
As soon as we think we have figured out how to get our content to the top of Google’s search rankings, the search engine goes and changes its algorithms. The Digital Journal offers some insight into “Op-Ed: How Will The Google 2016 Algorithm Change Affect Our Content?”
In early 2016, Google announced an update to its “truth algorithm,” one that carries on many of the changes the company has been pushing. Quality content over quantity is still very important. Keyword-heavy content is demoted in favor of websites that offer relevant, in-depth content and better answer a user’s intent.
SEO took a dramatic turn with a Penguin update and changes to the core algorithm. The biggest game changer involves mobile technologies:
“The rapid advancement of mobile technologies is deeply affecting the entire web scenario. Software developers are shifting towards the development of new apps and mobile websites, which clearly represent the future of information technology. Even the content for mobile websites and apps is now different, and Google had to account for that with the new ranking system changes. The average mobile user is very task oriented and checks his phones just to quickly accomplish a specific task, like finding a nearby café or cinema. Mobile-oriented content must be much shorter and concise than web-oriented one. The average web surfer wants to know, learn and explore things in a much more relaxed setting.”
Google wants to clear its search results of unreliable information and offer users a better-quality search experience on both mobile devices and desktop computers. Good to know that someone wants to deliver a decent product.
Whitney Grace, May 26, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Facebook Biased? Social Media Are Objective, Correct?
May 21, 2016
I am shattered. Imagine. Facebook delivering information services which are subjective. Facebook is social media at its finest. There are humans who “vote” or “like” something. That’s crowdsourcing. A person has built a career on the wisdom of crowds.
My illusory social world crumbled before the information presented in “The Real Bias Built In at Facebook.” The author is an academic and opinion writer. I am confident that the students of Zeynep Tufekci are able to navigate the reefs and shoals of social media because the truth is that algorithms are set up by humans who are, as some people believe, biased. (Note that you may have to pay money to read the write up by the compensated opinion writer Zeynep Tufekci. There is no bias in this approach to information. Heck, there is no bias at the New York Times, correct?)
What is Facebook doing? Here’s a passage I circled in stunned scarlet:
On Facebook the goal is to maximize your engagement with the site and keep it ad friendly.
This suggests that algorithms are set up to deliver these payoffs to Facebook. It follows that algorithms which do not deliver the required outcome are changed by programmers who:
- Either tune or structure the numerical processes to bring home the bacon
- Engage in ongoing tinkering until the suite of algorithms pumps out the likes, the clicks, and the revenue.
My thought is that chatter about “algorithms” is a bit trendy, just like the railroad cars stuffed with baloney explaining how artificial intelligence is the now now now big thing. Big Data, it seems, has fallen to second place in the marketing marathon.
I prefer to believe that Facebook, Google, and the other combines are really trying to be objective. When someone suggests that Google results are not in line with my query or that my deceased dog’s Facebook page displays a stream of relevant information, there is no bias.
My world is a happier place. I like searching for a restaurant when I am standing in front of it. When I look for that restaurant on my smartphone, the restaurant does not appear. That’s objectivity in action. I know I don’t need to know where the restaurant is; I am in front of it.
Stephen E Arnold, May 21, 2016
Parts Unknown of Dark Web Revealed in Study
May 13, 2016
While the parts unknown of the internet are said to be populated by terrorist outreach and propaganda, research paints a different picture. Quartz reports on this in “The dark web is too slow and annoying for terrorists to even bother with, experts say.” The research comes from Thomas Rid and Daniel Moore of the Department of War Studies at King’s College London. They found 140 extremist Tor hidden services; inaccessible or inactive services topped their list at 2,482, followed by 1,021 non-illicit services. Among illicit services, those related to drugs, at 423, far outnumbered extremism. The write-up offers a few explanations for the scarcity of terrorists publishing on the Dark Web:
“So why aren’t jihadis taking advantage of running dark web sites? Rid and Moore don’t know for sure, but they guess that it’s for the same reason so few other people publish information on the dark web: It’s just too fiddly. “Hidden services are sometimes slow, and not as stable as you might hope. So ease of use is not as great as it could be. There are better alternatives,” Rid told Quartz. As a communications platform, a site on the dark web doesn’t do what jihadis need it to do very well. It won’t reach many new people compared to “curious Googling,” as the authors point out, limiting its utility as a propaganda tool. It’s not very good for internal communications either, because it’s slow and requires installing additional software to work on a mobile phone.”
This article provides fascinating research and interesting conclusions. We would, however, add “unreliable” and “insecure” to the list of reasons the Dark Web may not suit such uses.
Megan Feil, May 13, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph