Cybercrime as a Service Drives Cyber Attacks on Uber Accounts and More

January 26, 2016

Several recent articles have shone a light on the dynamics at play in the cybercriminal marketplaces of the Dark Web; “How much is your Uber account worth?,” for example, was recently published by the Daily Mail. Summarizing a report prepared for CNBC by security researchers at Trend Micro, the article explains that this new information extends the research previously done in Intel Security’s The Hidden Data Economy report. Beyond describing the value hierarchy, in which Uber and PayPal logins fetch more than social security numbers and credit cards, the article shares insights on the bigger picture:

“’Like any unregulated, efficient economy, the cybercrime ecosystem has quickly evolved to deliver many tools and services to anyone aspiring to criminal behavior,’ said Raj Samani, chief technology officer for Intel Security EMEA. ‘This “cybercrime-as-a-service” marketplace has been a primary driver for the explosion in the size, frequency, and severity of cyber attacks.

‘The same can be said for the proliferation of business models established to sell stolen data and make cybercrime pay.’”

Moving past the shock value of the going rates, the article draws our attention to the burgeoning business of cybercrime. Just as Google expanded the online ecosystem by serving as a connector, marketplaces in the Dark Web appear to be carving out a similar position. The implications are significant when you consider the size of the Dark Web.


Megan Feil, January 26, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Experts Team for Complex Data Infographic

January 23, 2016

I wish to point out that experts are able to make the complex easy to understand. For an outstanding infographic about the state of complex data, you will want to navigate to “The State of Complex Data in January 2016.” Because it is the 23rd of January 2016 as I write this, you must click tout de suite.

The explanation of the state of complex data is the work of the experts at Ventana Research and the Aberdeen Group. Believe me, these outfits are on top of the data thing.

I noted this statement in the write up:

As for the diversity of the data – 71% of organizations analyze more than 6 data sources, and an astonishing 23% use more than 20. Doubtless, today’s data comes in many shapes and forms, posing new challenges and presenting new opportunities.

Astonishing results? Absolutely.

I am, however, not exactly certain what “complex” data means. But help is at hand, according to the article:

This guide covers the topic of data complexity and presents the distinction between Big, Simple, Diversified and Complex data, and will enable you to understand the complexity of your own data and how to determine the best tools and techniques you’ll need to use in order to prepare, analyze and visualize this data.

Thank goodness there are keen intellects able to explain the differences among the big, simple, diversified, and complex. At the back of my mind is this reminder from one of the glossaries I prepared: Data are facts and statistics collected together for reference or analysis.

Obviously I was too stupid to realize that data can be big, simple, diversified, or complex. Now I know. My life is better.

Stephen E Arnold, January 23, 2016

Data Discrimination Is Real

January 22, 2016

One of the best things about data and numbers is that they do not lie…usually.  According to Slate’s article, “FTC Report Details How Big Data Can Discriminate Against the Poor,” big data does a huge disservice to people of lower socioeconomic status by reinforcing existing negative patterns.  The Federal Trade Commission (FTC), academics, and activists have warned for some time that big data analytics can discriminate:

“At its worst, big data can reinforce—and perhaps even amplify—existing disparities, partly because predictive technologies tend to recycle existing patterns instead of creating new openings. They can be especially dangerous when they inform decisions about people’s access to healthcare, credit, housing, and more. For instance, some data suggests that those who live close to their workplaces are likely to maintain their employment for longer. If companies decided to take that into account when hiring, it could be accidentally discriminatory because of the racialized makeup of some neighborhoods.”

The FTC stresses that big data analytics has positive benefits as well.  It can yield information that creates more job opportunities, transforms health care delivery, extends credit through “non-traditional” methods, and more.

The way big data can avoid reinforcing these problems, and even improve upon them, is to account for biases from the beginning.  Large data sets can make these problems invisible or even harder to recognize.  Companies can use prejudiced data to justify the actions they take and even weaken the effectiveness of consumer choice.

Data is supposed to be an objective tool, but the sources behind the data can be questionable.  It becomes important for third parties and the companies themselves to investigate the data sources, run multiple tests, and confirm that the data is truly objective.  Otherwise we will be dealing with social problems, and more, reinforced by bad data.

Whitney Grace, January 22, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

The Total Patent Counts for 2015 Are in, and IBM Wins (Again)

January 21, 2016

The article on Reuters titled “IBM Granted Most U.S. Patents in 2015, Study Finds” confirms the 23rd consecutive win in this area for IBM. Patents are a key indicator of the direction and focus of a given business, and top companies take these numbers very seriously. Interestingly, 2015 was the first year since 2007 that the total count of U.S. patents fell. Following that trend, Microsoft Corp’s patent count was also 31% lower than in past years, and as a result the company took only tenth place on the list. The article provides some other details on the patent rankings:

“Among the technology giants notable for their intellectual property, Alphabet Inc’s (GOOGL.O) Google stepped up its patent activity, moving to the fifth position from eighth in 2014, while Apple Inc (AAPL.O) stayed at the 11th position. Patents are sometimes the subject of legal battles, and investors, analysts and enthusiasts alike track patents closely to see what companies are looking to develop next. Following IBM, Samsung Electronics Co Ltd (005930.KS) and Canon Inc (7751.T) rounded off the top three spots…”

There are no big surprises here, but one aspect of patents the article does not cover is whether patents translate into revenue. We were under the impression that money did that trick, but the emphasis on patents seems to suggest otherwise.

Chelsea Kerwin, January 21, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph


Hello, Big Algorithms

January 15, 2016

The year has barely started, and it looks like we already have a new buzzword to nestle into our ears: big algorithms.  The term algorithm has been tossed around with big data as one of the driving forces behind powerful analytics.  Big data is an encompassing term that covers privacy, security, search, analytics, organization, and more.  The real power, however, lies in the algorithms.  Benchtec posted the article “Forget Big Data-It’s Time For Big Algorithms” to explain how algorithms are stealing the scene.

Data is useless unless you are able to pull something out of it.  The only way to get the meat off the bone is to use algorithms.  Algorithms might be the powerhouses behind big data, but they are not unique.  What is unique is the individual data belonging to different companies.

“However, not everyone agrees that we’ve entered some kind of age of the algorithm.  Today competitive advantage is built on data, not algorithms or technology.  The same ideas and tools that are available to, say, Google are freely available to everyone via open source projects like Hadoop or Google’s own TensorFlow…infrastructure can be rented by the minute, and rather inexpensively, by any company in the world. But there is one difference.  Google’s data is theirs alone.”

Algorithms are ingrained in our daily lives, from the apps run on smartphones to how retailers gather consumer details.  Algorithms are a massive untapped market, the article says.  One algorithm can be adapted and implemented for different fields.  The article ends on a socially conscious message about using algorithms for good, not evil.  The sentiment feels a bit forced here, but it does spur some thoughts about how algorithms can be used to study issues related to global epidemics, war, disease, food shortages, and the environment.
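The claim that one algorithm can serve many fields is easy to illustrate. The sketch below is hypothetical (the function and data are mine, not Benchtec’s): a single generic smoothing algorithm applied, unchanged, to data from two different domains.

```python
def moving_average(series, window=3):
    """One generic algorithm: smooth any numeric series with a sliding mean."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

# The same code serves different fields -- only the data differs.
daily_temperatures = [21, 23, 22, 25, 24, 26]   # environmental monitoring
weekly_sales = [100, 120, 90, 110, 130, 125]    # retail analytics

print(moving_average(daily_temperatures))
print(moving_average(weekly_sales))
```

The competitive advantage, as the quoted passage argues, is not in the algorithm at all; it is in owning the series fed into it.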

Whitney Grace, January 15, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Big Data Shows Its Return on Investment

January 13, 2016

Big data was the word that buzzed through the IT community and made companies reevaluate their data analytics and consider new ways to use structured and unstructured information to their benefit.  Business2Community shares how big data has affected companies in sixteen case studies: “16 Case Studies Of Companies Proving ROI Of Big Data.”  One of the problems companies faced when implementing a big data plan was whether or not they would see a return on their investment.  Some companies saw an immediate return, but others are still scratching their heads.  Enough time has passed to see how various corporations in different industries have fared.

Companies remain committed to implementing big data plans within their frameworks; most of what they want to derive from big data is how to use it effectively:

  • “91% of marketing leaders believe successful brands use customer data to drive business decisions (source: BRITE/NYAMA)
  • 87% agree capturing and sharing the right data is important to effectively measuring ROI in their own company (BRITE/NYAMA)
  • 86% of people are willing to pay more for a great customer experience with a brand (source: Lunch Pail)”

General Electric uses big data to test its products’ efficiency and crunches the analytics to increase productivity.  The Weather Channel analyzes its users’ behavior patterns along with climate data in individual areas to become an advertising warehouse.  The big retailer Wal-Mart has added machine learning, synonym mining, and text analysis to increase search result relevancy.  Semantic search has also increased online shopping by ten percent.
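Synonym mining of the sort attributed to Wal-Mart can be approximated with a simple query-expansion step. The synonym table, product catalog, and function names below are invented for illustration; they are a sketch of the idea, not Wal-Mart’s implementation.

```python
# Hypothetical synonym table and product catalog -- illustrative only.
SYNONYMS = {"sofa": {"couch", "settee"}, "tv": {"television"}}

PRODUCTS = ["leather couch", "smart television", "sofa bed", "desk lamp"]

def expand(query):
    """Expand each query term with its known synonyms."""
    terms = set()
    for word in query.lower().split():
        terms.add(word)
        terms |= SYNONYMS.get(word, set())
    return terms

def search(query):
    """Return products whose titles contain any expanded term."""
    terms = expand(query)
    return [p for p in PRODUCTS if any(t in p.split() for t in terms)]

print(search("sofa"))  # also matches "leather couch" via the synonym table
```

Without the expansion step, a query for “sofa” would miss every product listed as a “couch”; that recall gain is the point of synonym mining.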

The article highlights many other big brand companies and how big data has become a boon for businesses looking to increase their customer relations, increase sales, and improve their services.


Whitney Grace, January 13, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Reverend Bayes: Still Making Headlines

January 6, 2016

Autonomy, now owned by Hewlett Packard Enterprise, was one of the first commercial search and content processing firms to embrace Bayesian methods. The approach in the 1990s was not as well known and as widely used as it is today. Part of the reason was the shroud of secrecy dropped over the method. Another factor was the skepticism some math folks had about the “judgment” factor required to set up Bayesian methods. That skepticism is still evident today even though Bayesian methods are used by many of the information processing outfits making headlines today.

A good example of the attitude appears in “Bayes’s Theorem: What’s the Big Deal?”

Here’s the quote I noted:

Embedded in Bayes’ theorem is a moral message: If you aren’t scrupulous in seeking alternative explanations for your evidence, the evidence will just confirm what you already believe. Scientists often fail to heed this dictum, which helps explains why so many scientific claims turn out to be erroneous. Bayesians claim that their methods can help scientists overcome confirmation bias and produce more reliable results, but I have my doubts.

Bayesian methods remain among the most heavily used methods at analytics outfits. Will these folks change methods? Nah.
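The theorem the article debates is compact enough to compute directly. A minimal sketch in Python, using illustrative numbers of my own (not figures from the article), shows how heavily the posterior leans on the prior one chooses:

```python
def posterior(prior, likelihood, false_positive_rate):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    # Total probability of the evidence, over both hypotheses.
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Illustrative numbers: a condition with 1% prevalence, a test that is
# 95% sensitive with a 5% false positive rate.
p = posterior(prior=0.01, likelihood=0.95, false_positive_rate=0.05)
# The posterior is only about 16% -- the low prior dominates, which is
# exactly the "judgment" factor the skeptics quoted above worry about.
```

Change the prior and the answer swings; that sensitivity is the crux of the confirmation-bias argument in the quote.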

Stephen E Arnold, January 6, 2016

Dark Web and Tor Investigative Tools Webinar

January 5, 2016

Telestrategies announced on January 4, 2016, a new webinar for active LEA and intel professionals. The one-hour program is focused on tactics, new products, and ongoing developments for Dark Web and Tor investigations. The program is designed to provide an overview of public, open source, and commercial systems and products. These systems may be used as standalone tools or integrated with IBM i2 ANB or Palantir Gotham. More information about the program is available from Telestrategies. There is no charge for the program. In 2016, Stephen E Arnold’s new Dark Web Notebook will be published. More information about the new monograph upon which the webinar is based may be obtained by writing benkent2020 at yahoo dot com.

Stephen E Arnold, January 5, 2016

Big Data but for How Many Firms?

January 2, 2016

Big Data as a marketing buzzword is everywhere. Even search and content processing companies profess to be skilled in the art of Big Data. This statement is ridiculously easy to say. Delivering is, of course, another matter.

There is a disconnect between what failed middle school teachers, unemployed journalists, and self-anointed pundits say and what is semi-accurate.

Navigate to “Only 13% Organizations Have Implemented Big Data Analytics Solution That Impact Business.” The chart suggests that the talk is producing proofs of concept and allocations. The majority of the outfits in the CapGemini sample are exploring and implementing partial Big Data projects.

If one builds Big Data systems, perhaps the users will come.

Stephen E Arnold, January 2, 2016

Data Managers as Data Librarians

December 31, 2015

The tools of a librarian may be the key to better data governance, according to an article at InFocus titled, “What Librarians Can Teach Us About Managing Big Data.” Writer Joseph Dossantos begins by outlining the plight data managers often find themselves in: executives can talk a big game about big data, but want to foist all the responsibility onto their overworked and outdated IT departments. The article asserts, though, that today’s emphasis on data analysis will force a shift in perspective and approach—data organization will come to resemble the Dewey Decimal System. Dossantos writes:

“Traditional Data Warehouses do not work unless there is a common vocabulary and understanding of a problem, but consider how things work in academia.  Every day, tenured professors and students pore over raw material looking for new insights into the past and new ways to explain culture, politics, and philosophy.  Their sources of choice: archived photographs, primary documents found in a city hall, monastery or excavation site, scrolls from a long-abandoned cave, or voice recordings from the Oval Office – in short, anything in any kind of format.  And who can help them find what they are looking for?  A skilled librarian who knows how to effectively search for not only books, but primary source material across the world, who can understand, create, and navigate a catalog to accelerate a researcher’s efforts.”

The article goes on to discuss the influence of the “Wikipedia mindset”; data accuracy and whether it matters; and devising structures to address different researchers’ needs. See the article for details on each of these (especially on meeting different needs). The write-up concludes with a call for data-governance professionals to think of themselves as “data librarians.” Is this approach the key to more effective data search and analysis?

Cynthia Murrell, December 31, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
