Dark Web Marketplaces Under Assault

February 16, 2018

It seems to be getting more difficult to operate on the Dark Web. We learn of a couple of complications from the DarkWebNews post, “Popular Darknet Markets Back Online After DDoS Attacks.” A string of DDoS attacks has been keeping Dark Web marketplaces on the defensive, with several suffering severe outages. We’re told the attacks have been especially hard on the comparatively long-lived and popular Dream Market, in operation since 2013. Citing a recent Europol report, the Internet Organised Crime Threat Assessment, the post’s writer, Richard, explains:

[These attacks] are implemented with ease on many of the darknet markets, even when such sites have put in place restrictive measures to protect them against DDoS attacks. However, with the recent cases, there seems to be a general increase in the longevity and severity of these attacks. After the collapse of several reputable sites such as Hansa and AlphaBay, there has been a general cloud of fear in the darknet market community, which is now apparently visible on various forums including Reddit. What’s more, the recent increase in DDoS attacks has not done any good to the darknet market industry, with numerous regular users now seeking to find other alternative options. Many of these users have now turned to visiting dedicated vendor shops with others even making use of peer-to-peer possibilities, both of which eradicate the likelihood of a central failure. Nonetheless, even with the future looking uncertain for some darknet markets like Dream, the crisis seems to have opened a way for the emergence of new alternative markets with the likes of OpenBazaar taking full advantage.

OpenBazaar, by the way, is a peer-to-peer proposition. On top of those availability problems, the recent Bitcoin craze has complicated Dark Web users’ lives: by its nature, a cryptocurrency is susceptible to congestion as more and more users attempt to complete transactions. However, the rise of several alternative “coins” (or “altcoins”) may provide some relief for the Dark Web shopper. What to do about those DDoS attacks, though, is another matter.

Cynthia Murrell, February 16, 2018

Investigating Cybercrime

December 29, 2017

The devastating Equifax breach is in the hands of federal investigators who know what they are doing, we learn from the piece, “Cybercrimes Present Unique Challenges for Investigators” at SFGate. AP writer Kate Brumback writes:

The federal investigators looking into the breach that exposed personal information maintained by the Equifax credit report company are used to dealing with high-profile hacks and the challenges they present. The U.S. attorney’s office and FBI in Atlanta have prosecuted developers and promoters of the SpyEye and Citadel malware toolkits, used to infect computers and steal banking information. They’ve helped prosecute a hack into Scottrade and ETrade that was part of an identity theft scheme, and aided the international effort that in July shut down AlphaBay, the world’s largest online criminal marketplace.


The U.S. Attorney’s office has confirmed that, along with the FBI, it is investigating the breach at Atlanta-based Equifax, which the company said lasted from mid-May to July and exposed the data of 145 million Americans.

Though investigators would not tell Brumback anything about this specific investigation, they shared some of what it is like to pursue cybercrime in general. For example, one prosecutor notes that for every conviction there are about ten investigations that dead-end. Aliases and invite-only forums make it difficult to identify perpetrators; often, success is the result of a slip-up on the part of the bad actor. Another complication: the internet transcends borders, and several foreign governments do not extradite to the U.S. (or do, but slowly). Once the bad guys are caught, they can be punished, but restitution tends to be prohibitively complicated. With a focus on prevention, investigators are now working with many companies before breaches occur.

Cynthia Murrell, December 29, 2017

Alexa AI Could Drastically Change Your Shopping Experience

December 25, 2017

Amazon’s Alexa, the voice assistant built into its Wi-Fi-enabled, voice-activated speakers, has become less of a novelty and more of a way of life for millions of owners. With that in mind, the company is aiming to put this exposure to analytic use. But many are not so excited, as we learned from a Wired piece, “Alexa Wants You To Talk to Your Ads.”

According to the story,

These early interactions won’t necessarily provide additional revenue, but for forward-thinking brands they do hold value. No matter how basic the interaction, connecting with a customer through voice provides a trove of data on how consumers are interacting with a product. Collecting information on how Alexa is used will provide a base of knowledge to position brands to build the more sophisticated tech still to come. Once that “killer experience” is discovered and the confusion clears, these early advertising settlers will be set up to succeed.

They are framing this as a great thing for customers, too, but we are a little skeptical. There is a real fear that Amazon is overstepping boundaries in the name of AI and analytics. It has recently come to light that Alexa is always listening and possibly transmitting that data to a data warehouse. Even more unsettling is a recent report that Alexa can be easily hacked and used as an eavesdropping tool. This might not be the ideal time for Amazon to encourage this level of interaction with Alexa.

Patrick Roland, December 25, 2017

Data Analysis Startup Primer Already Well-Positioned

December 22, 2017

A new startup believes it has something unique to add to the AI data-processing scene, we learn from VentureBeat’s article, “Primer Uses AI to Understand and Summarize Mountains of Text.” The company’s software automatically summarizes (what it considers to be) the most important information from huge collections of documents. Filters then allow users to drill into the analyzed data. Of course, the goal is to reduce or eliminate the need for human analysts to produce such a report; whether Primer can soar where others have fallen short on this tricky task remains to be seen. Reporter Blair Hanley Frank observes:

Primer isn’t the first company to offer a natural language understanding tool, but the company’s strength comes from its ability to collate a massive number of documents with seemingly minimal human intervention and to deliver a single, easily navigable report that includes human-readable summaries of content. It’s this combination of scale and human readability that could give the company an edge over larger tech powerhouses like Google or Palantir. In addition, the company’s product can run inside private data centers, something that’s critical for dealing with classified information or working with customers who don’t want to lock themselves into a particular cloud provider.
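
The write-up does not reveal how Primer builds its summaries, of course. For the curious, here is a minimal frequency-based extractive summarizer of our own devising; it illustrates the general technique only and is not Primer’s method. A production system would drop stop words and use learned models rather than raw word counts.

```python
# A minimal frequency-based extractive summarizer -- an illustration of the
# general technique, not Primer's proprietary method, which the article does
# not describe.
import re
from collections import Counter

def summarize(text: str, max_sentences: int = 3) -> str:
    # Naive sentence split on ., !, or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    if len(sentences) <= max_sentences:
        return text

    # Word frequencies across the whole document (lowercased).
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> float:
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / len(tokens) if tokens else 0.0

    # Keep the highest-scoring sentences, presented in their original order.
    ranked = sorted(range(len(sentences)),
                    key=lambda i: score(sentences[i]), reverse=True)
    keep = sorted(ranked[:max_sentences])
    return " ".join(sentences[i] for i in keep)

if __name__ == "__main__":
    sample = ("Analysts face growing volumes of text. "
              "Much of that text repeats the same facts. "
              "Automatic summarization keeps the sentences that carry the "
              "most frequently repeated terms. "
              "The weather was pleasant on Tuesday.")
    print(summarize(sample, max_sentences=2))
```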

Primer is sitting pretty with $14.7 million in funding (from the likes of Data Collective, In-Q-Tel, Lux Capital, and Amplify Partners) and, perhaps more importantly, a contract with In-Q-Tel that connects it with the U.S. intelligence community. We’re told the software is being used by several agencies, though Primer does not know which ones. On the commercial side, retail giant Walmart is now a customer. Primer emphasizes that it is working to enable more complex reports, like automatically generated maps that pinpoint the locations of important events. The company is based in San Francisco and is hiring for several prominent positions as of this writing.

Cynthia Murrell, December 22, 2017

Craigslist Is Shooting Itself in the Foot by Shunning Search

December 6, 2017

Craigslist is legendary as a way to find things, sell things, get jobs, and meet people. But its aim is to do so locally. Recently, some search engines started allowing users to search all of Craigslist, but that won’t last, and that’s a shame. We learned this from a Search Engines List article, “How to Search All of Craigslist.”

According to the story, there are several new search tools on the market:

All these sites work roughly the same way. They provide a simple front end with either a series of selections to choose from or a search engine box. You can use them to search Craigslist, and sometimes other classified advert websites, without having to drill down into your city or area.


Use these services while you can, though. Unfortunately, Craigslist is cracking down on scrapers and websites that crawl its website. It has already blocked a number of the more popular Craigslist crawlers and will likely block more as time goes on. In the meantime, all those websites in the links I provided are currently working fine (as of January 2017).

This is a real shame. Given the national and international reach this technology enables, Craigslist should be embracing it, not shutting it down. Something like this could turn Craigslist into the next eBay.

Patrick Roland, December 6, 2017

Microsoft Bing Has the Last AI Laugh

December 1, 2017

Nobody likes Bing, but because it is a Microsoft product it continues to endure. It chugs along as the second most used search engine in the US, but apparently first is the worst and second is the best when it comes to creating a database of useful information for AI. India News 24 explains in “Microsoft Bing: The Redmond Giant’s Overlooked Tool” that the search engine is worth far more than most people think.

Every day, millions of users feed Bing search queries in the form of basic keywords, questions, and even images. Training and testing an AI algorithm requires huge datasets so the algorithm can learn and discover patterns, and Bing is a key source of exactly those datasets. You also might be using Bing without knowing it: it powers Yahoo search and ships on Amazon tablets.

All of this has helped Microsoft better understand language, images and text at a large scale, said Steve Clayton, who as Microsoft’s chief storyteller helps communicate the company’s AI strategy. It is amazing how Bing serves a dual purpose:

Bing serves dual purposes, he said, as a source of data to train artificial intelligence and a vehicle to be able to deliver smarter services.  While Google also has the advantage of a powerful search engine, other companies making big investments in the AI race – such as IBM or Amazon – do not.

Amazon has access to search queries centered on e-commerce, but it sees little about anything that is not available in one of its warehouses. This is where Bing comes in. Bing’s role in feeding Microsoft’s AI projects has yet to turn a profit, but AI is still a new market, and new projects are always in the works.

Whitney Grace, December 1, 2017

The Future of Visual Search Lies in Surprising Hands

October 12, 2017

The days of text-based search are numbered. At least, that’s what some experts are saying when they discuss visual search engines. But should we be throwing today’s strongest text-based search giants on the scrap heap? It’s not that easy, according to Search Engine Watch in a new article called, “Pinterest, Google, or Bing: Who Has the Best Visual Search Engine?”

Historically, we know that video, images, and articles have been cataloged in text-based systems for search. The keyword-based approach that Google has perfected over the last two decades is, however, more limiting than many anticipated. Static keyword searches ignore a vast swath of search potential, and some surprising players are tapping into the visual search market to exploit it.

According to Search Engine Watch:

Already, specific ecommerce visual search technologies abound: Amazon, Walmart, and ASOS are all in on the act. These companies’ apps turn a user’s smartphone camera into a visual discovery tool, searching for similar items based on whatever is in frame. This is just one use case, however, and the potential for visual search is much greater than just direct ecommerce transactions.


After a lot of trial and error, this technology is coming of age. We are on the cusp of accurate, real-time visual search, which will open a raft of new opportunities for marketers.
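
None of these vendors describes its pipeline in the article, but the standard recipe behind “searching for similar items based on whatever is in frame” is to run each catalog image through an embedding model and answer a camera query with a nearest-neighbor lookup. Here is a minimal sketch of that recipe, with random vectors standing in for real image embeddings:

```python
# Sketch of similarity-based visual search: store one embedding vector per
# catalog item, then return the items whose vectors are closest to the vector
# for the shopper's photo. The embedding model itself is out of scope here;
# random vectors stand in for its output.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class VisualIndex:
    def __init__(self) -> None:
        self.items: list[tuple[str, np.ndarray]] = []

    def add(self, item_id: str, embedding: np.ndarray) -> None:
        self.items.append((item_id, embedding))

    def search(self, query_embedding: np.ndarray, top_k: int = 5):
        scored = [(item_id, cosine_similarity(query_embedding, emb))
                  for item_id, emb in self.items]
        return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    index = VisualIndex()
    for i in range(100):                 # a pretend catalog of 100 products
        index.add(f"product-{i}", rng.normal(size=128))
    photo = rng.normal(size=128)         # stand-in for embedding a camera frame
    print(index.search(photo, top_k=3))
```

A real deployment would swap the brute-force comparison for an approximate nearest-neighbor index once the catalog grows large, but the shape of the feature is the same.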

So, who is going to lead the charge on this visual search frontier? Google, right? They own search today and will probably own it tomorrow, right? Not so fast. According to the piece, Google Lens is still in beta testing and not as robust as the competition. If Google follows its historical trajectory, it will eventually be a leader here, but it is too early to tell.

Instead, the visual search market is currently led by some surprising players. Pinterest and Bing both have platforms that draw, with varying degrees of accuracy, on things like your search history and the pictures you take to inform results. All these companies are still pretty new at visual search, but we like the odds of Bing and Pinterest to stake a serious claim on the future.

Patrick Roland, October 12, 2017

Elsevier Makes a Brave Play to Steal Wikipedia’s Users

October 9, 2017

Is Wikipedia about to be unseated in the world of academic publishing? Elsevier thinks it can give the crowdsourced, yet flawed, info hub a serious run for its money. Money being the key word, according to a recent TechDirt article, “Elsevier Launching Rival to Wikipedia by Extracting Scientific Definitions Automatically from Authors’ Texts.”

According to the piece:

Elsevier is hoping to keep researchers on its platform with the launch of a free layer of content called ScienceDirect Topics, offering an initial 80,000 pages of material relating to the life sciences, biomedical sciences and neuroscience. Each offers a quick definition of a key term or topic, details of related terms and relevant excerpts from Elsevier books.
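
The piece does not explain how those definitions are pulled out of the books. A common baseline is pattern matching on definitional sentences; the toy sketch below uses patterns of our own choosing and should not be mistaken for Elsevier’s actual extraction system.

```python
# A toy illustration of pulling candidate definitions out of running text.
# The patterns below are our own assumption; Elsevier has not published how
# ScienceDirect Topics actually extracts its definitions.
import re

DEFINITION_PATTERNS = [
    r"(?P<term>[A-Z][\w\- ]+?) (?:is|are) defined as (?P<definition>[^.]+)\.",
    r"(?P<term>[A-Z][\w\- ]+?) (?:is|are) an? (?P<definition>[^.]+)\.",
]

def extract_definitions(text: str) -> dict:
    found = {}
    for pattern in DEFINITION_PATTERNS:
        for match in re.finditer(pattern, text):
            term = match.group("term").strip()
            # Keep the first definition seen for each term.
            found.setdefault(term, match.group("definition").strip())
    return found

sample = ("Apoptosis is defined as a form of programmed cell death. "
          "A ribosome is a molecular machine that synthesizes proteins.")
print(extract_definitions(sample))
```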

Seems like it makes sense, right? Elsevier has all this academic information at its fingertips, so why send users elsewhere on the web for definitions? This extraction system, frankly, sounds pretty amazing. However, TechDirt has a beef with it:

It’s typical of Elsevier’s unbridled ambition that instead of supporting a digital commons like Wikipedia, it wants to compete with it by creating its own redundant versions of the same information, which are proprietary. Even worse, it is drawing that information from books written by academics who have given Elsevier a license.

It is a fair question whether Elsevier is taking advantage of its academic sources by edging into Wikipedia’s territory. However, we have a hunch its lawyers will make sure everything is on the up and up. A bigger question is whether Elsevier will make this a free site or put it behind a paywall. The company is in business to make money, so we’d guess paywall. And if that is the case, it had better have a spectacular setup to draw users away from Wikipedia.

Patrick Roland, October 9, 2017

Why the Future of Computing Lies in Natural Language Processing

September 26, 2017

In a blog post, EasyAsk declares, “Cognitive Computing, Natural Language & AI: Game Changers.” We must keep in mind that the “cognitive eCommerce” company has a natural language search engine to sell, so it is a little biased. Still, writer and CEO Craig Bassin makes some good points. He begins by citing research firm Gartner’s assessment that natural-language query “will dramatically change human-computer interaction.” After throwing in a couple of amusing videos, Bassin examines the role of natural language in two areas of business: business intelligence (BI) and customer relationship management (CRM). He writes:

That shift [to natural language and cognitive computing] enables two things. First, it enables users to ask a computer questions the same way they’d ask an associate, or co-worker. Second, it enables the computer to actually answer the question. That’s the game changer. The difference is a robust Natural Language Linguistic Engine. Let’s go back to the examples above for a reexamination of our questions. For BI, what if there was an app that looked beyond the dashboards into the data to answer ad-hoc questions? Instead of waiting days for a report to be generated, you could have it on the fly – right at your fingertips. For CRM, what if that road warrior could ask and answer questions about the current status across prospects in a specific region to deduce where his/her time would be best spent? Gartner and Forrester see the shift happening. In Gartner’s Magic Quadrant Report for Business Intelligence and Analytics Platforms [PDF], strategic planning assumptions incorporate the use of natural language. It may sound like a pipe dream now, but this is the future.
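
To make the idea concrete, here is a toy of our own that maps a narrow class of English questions onto a filter over a tiny in-memory sales table. It is not EasyAsk’s Natural Language Linguistic Engine, which is far more sophisticated, but it shows the ask-a-question, get-an-answer shape Bassin describes.

```python
# A toy "ask the data a question" example -- our own sketch, not EasyAsk's
# proprietary engine. It recognizes a region, a quarter, and an aggregation
# word, then filters an in-memory sales table accordingly.
import re

SALES = [
    {"rep": "Jones", "region": "east", "quarter": "Q3", "amount": 120_000},
    {"rep": "Smith", "region": "west", "quarter": "Q3", "amount": 95_000},
    {"rep": "Lee",   "region": "east", "quarter": "Q2", "amount": 88_000},
]

def answer(question: str):
    q = question.lower()
    rows = SALES

    # A very small "linguistic engine": look for a known region and a quarter.
    region = next((r for r in ("east", "west") if r in q), None)
    match = re.search(r"\bq[1-4]\b", q)
    quarter = match.group(0).upper() if match else None

    if region:
        rows = [r for r in rows if r["region"] == region]
    if quarter:
        rows = [r for r in rows if r["quarter"] == quarter]

    # "How much" or "total" means aggregate; otherwise return matching rows.
    if "how much" in q or "total" in q:
        return sum(r["amount"] for r in rows)
    return rows

print(answer("How much did we sell in the east in Q3?"))   # -> 120000
print(answer("Which reps closed deals in the west?"))      # -> Smith's row
```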

Naturally, readers can find natural-language goodness in EasyAsk’s platform; to be fair, the company has been building its cognitive computing tech for years now. Businesses looking for a more sophisticated search solution would do well to check it out, along with the competition. Based in Burlington, Mass., EasyAsk also maintains a European office in Berkshire, UK. The company was founded in 2000 and was acquired by Progress Software in 2005.

Cynthia Murrell, September 26, 2017

AI Will Build Better Chatbots

September 21, 2017

For better or worse, chatbots have well and truly supplanted the traditional customer service agent. Sure, one can still reach a human at many companies with persistence, but it is the rare (and appreciated!) business that assigns a real person to handle the first point of contact. Geektime ponders, “What is the Future of Chatbot Development and Artificial Intelligence?” Writer Damian Wolf surveys chatbots as they now exist and asserts that it is AI that will bridge the gap between these simple systems and ones that can realistically replicate human responses. He writes:

The future of AI bots looks promising and exciting at the same time. The limitation in regards to accessing big data can be eradicated by using AI techniques. The ultimate aim for the futuristic chatbot is to be able to interact with users as a human would. Computationally, it is a hard problem. With AI evolving every day, the chances of success are already high. The Facebook AI chatbot is already showing promises as it was able to come up with negotiation skills by creating new sentences. E-Commerce will also benefit hugely with a revolution in AI chatbots. The key here is the data  collection and utilization. Once done correctly, the data can be used to strengthen the performance of highly-efficient algorithm, which in turn, will separate the bad chatbots from the good ones. … Automation is upon us, and chatbots are leading the way. With a fully-functional chatbot, e-commerce, or even a healthcare provider can process hundreds of interactions every single minute. This will not only save them money but also enable them to understand their audience better.
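
For readers who want a feel for what even a primitive chatbot involves, here is a deliberately simple keyword intent matcher of our own. The machine learning systems Wolf describes learn their responses from data, but they slot into roughly this request-and-response loop.

```python
# A deliberately simple keyword intent matcher. Production chatbots replace
# the keyword overlap below with a learned intent classifier, but the overall
# request/response loop is the same.
import re

INTENTS = {
    "order_status": (["where", "order", "track"], "Your order is on its way."),
    "returns": (["return", "refund"], "You can start a return from your account page."),
    "hours": (["open", "hours", "close"], "We are open 9am-5pm, Monday through Friday."),
}

def reply(message: str) -> str:
    words = set(re.findall(r"[a-z']+", message.lower()))
    best_intent, best_overlap = None, 0
    for intent, (keywords, _response) in INTENTS.items():
        overlap = len(words & set(keywords))
        if overlap > best_overlap:
            best_intent, best_overlap = intent, overlap
    if best_intent is None:
        # No intent matched: the hand-off point where a human agent steps in.
        return "Sorry, let me connect you with a human agent."
    return INTENTS[best_intent][1]

if __name__ == "__main__":
    print(reply("Where is my order?"))    # matches the order_status intent
    print(reply("Can I get a refund?"))   # matches the returns intent
```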

For this vision to be realized, Wolf insists, companies must invest in machine-learning infrastructure. The article is punctuated with informative links like those in the quotation above; one I’m happy to see is a guide for non-technical journalists who wish to write accurately about AI developments (also good for anyone unfamiliar with the field). See the article for more useful links and for more on chatbots as they currently exist.

Cynthia Murrell, September 21, 2017
