
The Watson Update

July 15, 2016

IBM invested substantial resources, time, and money in developing its powerful artificial intelligence computer Watson.  The company has been trying for years to justify the expense as well as make money off the invention, mostly by applying Watson to every conceivable industry that could benefit from big data, from cooking to medicine.  We finally have an update on Watson, says ZDNet in the article “IBM Talks About Progress On Watson, OpenPower.”

Watson is a cognitive computing system that learns, supports natural user interfaces, values user expertise, and evolves with new information.  Evolving is the most important capability, because it allows Watson to keep gaining experience and learning.  When Watson was first developed, IBM fed it general domain knowledge, then built Watson Discovery to find answers to specific questions.  In the medical field, this has been used to digest the information being created and apply it to practice.

The article also reports:

“Most recently IBM has been focused on making Watson available as a set of services for customers that want to build their own applications with natural question-and-answer capabilities. Today it has 32 services available on the Watson Developer Cloud hosted on its Bluemix platform-as-a-service… Now IBM is working on making Watson more human. This includes a Tone Analyzer (think of this as a sort spellchecker for tone before you send that e-mail to the boss), Emotion Analysis of text, and Personality Insights, which uses things you’ve written to assess your personality traits.”

Cognitive computing has come a long way since Watson won Jeopardy.  Soon the technology will be more integrated into our lives.  The bigger question is how it will change society and the way we live.

 

Whitney Grace,  July 15, 2016

There is a Louisville, Kentucky Hidden Web/Dark Web meet up on July 26, 2016. Information is at this link: http://bit.ly/29tVKpx.

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Publicly Available Information Is Considered Leaked When on Dark Web

July 7, 2016

What happens when publicly available information is leaked to the Dark Web? This happened recently with staff contact information from the University of Liverpool, according to the CloudPro article “Five secrets about the Dark Web you didn’t know.” The piece speaks to the perception that the Dark Web is a risky place even for information that is already public. The author reports on how the information was compromised,

“A spokeswoman said: “We detected an automated cyber-attack on one of our departmental online booking systems, which resulted in publically available data – surname, email, and business telephone numbers – being released on the internet. We take the security of all university-related data very seriously and routinely test our systems to ensure that all data is protected effectively. We supported the Regional Organised Crime Unit (TITAN) in their investigations into this issue and reported the case to the Information Commissioner’s Office.”

Data security only continues to grow in importance and as a concern for large enterprises and organizations. This incident is an interesting case, and the only story in the piece we had not seen published again and again, because it illustrates the public perception of the Dark Web as a playground for illicit activity. It raises the question of which online landscapes are considered public and which private.

 

Megan Feil, July 7, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

The Computer Chip Inspired by a Brain

July 6, 2016

Artificial intelligence is humanity’s attempt to replicate the complicated thought processes of its own brain through technology.  IBM is trying to duplicate the human brain and has been successful in many ways with the supercomputer Watson.  TechRepublic reports that IBM has another success under its belt, but to what end?  Check out the article, “IBM’s Brain-Inspired Chip TrueNorth Changes How Computers ‘Think,’ But Experts Question Its Purpose.”

IBM’s TrueNorth is the first computer chip with a one-million-neuron architecture.  The chip is a collaboration between Cornell University and IBM under the DARPA SyNAPSE program, using $100 million in public funding.  Most computer chips use the von Neumann architecture, but the TrueNorth chip better replicates the human brain.  TrueNorth is also more energy efficient.
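The contrast with the von Neumann design can be illustrated with a toy spiking neuron. TrueNorth has been described as implementing leaky integrate-and-fire neurons in silicon; the sketch below is a minimal software version of that model, and the leak, threshold, and input values are illustrative assumptions, not TrueNorth’s actual parameters.

```python
# Toy leaky integrate-and-fire neuron, the general style of spiking unit
# that neuromorphic designs such as TrueNorth implement in hardware.
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return the time steps at which the neuron fires."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # decay, then integrate input
        if potential >= threshold:              # fire and reset
            spikes.append(t)
            potential = 0.0
    return spikes

# A steady weak input accumulates until the neuron fires periodically.
print(simulate_lif([0.3] * 10))  # → [3, 7]
```

Unlike a von Neumann processor shuttling data between separate memory and compute, each such neuron keeps its own state and only emits events (spikes) when needed, which is where the energy savings come from.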

What, however, is the purpose of the TrueNorth chip?  IBM created an elaborate ecosystem that uses many state-of-the-art processes, but people are still wondering what the real-world applications are:

“ ‘…it provides ‘energy-efficient, always-on content generation for wearables, IoT devices, smartphones.’ It can also give ‘real-time contextual understanding in automobiles, robotics, medical imagers, and cameras.’ And, most importantly, he said, it can ‘provide volume-efficient, unprecedented neural network acceleration capability per unit volume for cloud-based streaming processing and provide volume, energy, and speed efficient multi-modal sensor fusion at an unprecedented neural network scale.’”

Other applications include cybersecurity, other defense goals, and large-scale computing and hardware running on the cloud.  While there might be practical applications, people still want to know why IBM made the chip.

“It would be as if Henry Ford decided in 1920 that since he had managed to efficiently build a car, we would try to design a car that would take us to the moon,” [said Nir Shavit, a professor at MIT’s Computer Science and Artificial Intelligence Laboratory]. “We know how to fabricate really efficient computer chips. But is this going to move us towards human quality neural computation?” Shavit fears that it is simply too early to try to build neuromorphic chips. “We should instead try much harder to understand how real neural networks compute.”

Why would a car need to go to the moon?  It would be fun to go to the moon, but it serves no practical purpose (unless we build a civilization there, though we are a long way from that).  It continues:

“The problem is,” Shavit said, “that we don’t even know what the problem is. We don’t know what has to happen to a car to make the car go to the moon. It’s perhaps different technology that you need. But this is where neuromorphic computing is.”

In other words, it is the theoretical physics of computer science.

 

Whitney Grace,  July 6, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

The Cloud: Yep, Flying Blind Is Fun

July 5, 2016

Most of the information technology disasters I know about have a common characteristic. Ready for it? Managers did not do their job. The reasons ranged from a lack of informed decision making (this is a nice way of saying “stupid”) to a desire to leave the details to others (this is a nice way of saying “shirk responsibility”). Example: Portland, Oregon’s incompetence.

I thought about information technology crashes and burns when I read “75 Percent of IT Pros Lack Visibility into Their Hybrid Clouds.” What I think the write up is trying to say is, “Folks with clouds don’t know what’s happening in the mist and haze.” The desire to get computer systems out of the room down the hall is an admirable one. When I fiddled with the IBM mainframe at my third-rate university in the 1960s, no one wanted one of these puppies at home. The cloud is little more, in my opinion, than a return to the mainframe-type approach to computing of a half century ago. Life is just easier with a smartphone.

The write up reports:

The study from cloud governance specialist Netwrix reveals that almost 65 percent of organizations do not have complete visibility into user, IT and third-party activity in their IT infrastructure. In addition 75 percent of respondents have partial or no visibility into their cloud and hybrid IT environments. The survey of over 800 people across 30 industries worldwide shows a large majority of respondents (78 percent) saying they are unaware or only partly aware of what is happening across their unstructured data and file storage.

The painful reality is that people who are supposed to be professionals struggle to know what the heck is going on with their cloud computing systems. MBAs and failed middle school teachers, as well as bright young sprouts from prestigious university computer science programs, share this characteristic.

Understanding the limits of one’s own knowledge is a difficult undertaking. The confidence with which some “pros” talk about nifty technology is remarkable. Escalating costs, annoyed customers, grousing colleagues, and outright failure are highly likely outcomes.

Whether it is the baloney about figuring out the context of a user query or an F-35 aircraft that cannot be serviced by its ground crew, these are examples of how arrogance and human behavior ensure information technology excitement.

Change human behavior, or go with a Google and Facebook style smart system? Interesting choice. Or is it a dilemma?

Stephen E Arnold, July 5, 2016

Amazon AWS Jungle Snares Some Elasticsearch Functions

July 1, 2016

Elastic’s Elasticsearch has become one of the go-to open source search and retrieval solutions. Based on Lucene, the system has put the heat on some of the other open-source-centric search vendors. However, search is a tricky beastie.

Navigate to “AWS Elasticsearch Service Woes” to get a glimpse of some of the snags which can poke holes in one’s rip stop hiking garb. The problems are not surprising. One does not know what issues will arise until a search system is deployed and the lucky users are banging away with their queries or a happy administrator discovers that Button A no longer works.

The write up states:

We kept coming across OOM issues due the JVMMemoryPresure spiking and inturn the ES service kept crapping out. Aside from some optimization work, we’d more than likely have to add more boxes/resources to the cluster which then means more things to manage. This is when we thought, “Hey, AWS have a service for this right? Let’s give that a crack?!”. As great as having it as a service is, it certainly comes with some fairly irritating pitfalls which then causes you to approach the situation from a different angle.

One approach is to use templates to deal with the implementation of shard management in AWS Elasticsearch. Sample templates are provided in the write up. The fix does not address some issues. The article provides a link to a reindexing tool called es-tool.
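The template approach amounts to registering index settings up front so new indices do not fall back to cluster defaults. The fragment below is a minimal sketch of that idea; the pattern name, shard counts, and endpoint are illustrative assumptions, not values from the write up.

```python
import json

# Sketch of an index template that pins shard and replica counts for any
# index matching a naming pattern, so shard management is decided once,
# up front, rather than per index. All names and sizes here are examples.
log_template = {
    "template": "logs-*",         # applies to any index matching this pattern
    "settings": {
        "number_of_shards": 3,    # spread data across the cluster
        "number_of_replicas": 1,  # keep one copy for resilience
    },
}

# In a live cluster this body would be registered via the template API,
# e.g. PUT https://<your-es-endpoint>/_template/logs
body = json.dumps(log_template)
print(body)
```

On AWS Elasticsearch, where node-level tuning is off the table, this kind of declarative control is often the main lever left to the administrator.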

The most interesting comment in the article in my opinion is:

In hindsight I think it may have been worth potentially sticking with and fleshing out the old implementation of Elasticsearch, instead of having to fudge various things with the AWS ES service. On the other hand it has relieved some of the operational overhead, and in terms of scaling I am literally a couple of clicks away. If you have large amounts of data you pump into Elasticsearch and you require granular control, AWS ES is not the solution for you. However if you need a quick and simple Elasticsearch and Kibana solution, then look no further.

My takeaway is to do some thinking about the strengths and weaknesses of Amazon AWS before chopping through the Bezos cloud jungle.

Stephen E Arnold, July 1, 2016

IBM Cloud Powers Comic-Con Channel

June 30, 2016

The San Diego Comic-Con is the biggest geek and pop culture convention in the country, and it needs to be experienced to be believed.  Every year the convention gets bigger and more complex as attendees and guests demand more from the purveyors.  If you are at Comic-Con, you need to think big.  Thinking big requires thinking differently, which is why, as Fortune puts it, “IBM And Comic-Con HQ Make Strange Bedfellows.”

IBM announced that it has teamed with Lionsgate to run a Comic-Con HQ video channel powered by IBM’s cloud.  The on-demand channel will premiere during the 2016 Comic-Con.  Attendees, and those unfortunate enough not to get a ticket, have demanded video streaming services for years, practically ever since it became possible.  Copyright questions, as well as the problem of how to charge attendees for the service, kept video on demand on the back burner, but now it is going to happen, and it is going to be a challenge.

On video streaming, the article notes:

“Video is a demanding application for cloud computing. Storing and shipping massive video files, often shot in ultra-high-definition 4k format, is a useful testbed to show off cloud services.”

Anything new related to Comic-Con always proves to be a hassle.  A case in point: when SDCC launched its digital waiting room for ticket purchases, it got far more traffic than its servers could handle, leaving a lot of angry fans unable to buy tickets.  Another challenge was handling the massive crowds that began flocking to the convention halls in the mid-2000s (attendance swelled around 2011 with the Twilight movies).

Anything that improves the Comic-Con experience, or even gives non-attendees a taste of the magical July event, would be welcome.

 

Whitney Grace, June 30, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

CloudFlare Claims Most Activity from Tor Is Malicious

June 28, 2016

Different sources suggest varying levels of malicious activity on Tor. Tech Insider shared an article responding to recent claims about Tor made by CloudFlare, presenting both CloudFlare’s perspective and that of the Tor Project. CloudFlare reports that most requests from Tor, 94 percent, are “malicious”; the Tor Project has responded by requesting evidence to justify the claim. Those involved in the Tor Project suspect the 94 percent figure stems from CloudFlare labeling as “malicious” any IP address that has ever sent spam. The article continues,

“We’re interested in hearing CloudFlare’s explanation of how they arrived at the 94% figure and why they choose to block so much legitimate Tor traffic. While we wait to hear from CloudFlare, here’s what we know: 1) CloudFlare uses an IP reputation system to assign scores to IP addresses that generate malicious traffic. In their blog post, they mentioned obtaining data from Project Honey Pot, in addition to their own systems. Project Honey Pot has an IP reputation system that causes IP addresses to be labeled as “malicious” if they ever send spam to a select set of diagnostic machines that are not normally in use. CloudFlare has not described the nature of the IP reputation systems they use in any detail.”
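The Project Honey Pot behavior described in the quote, where a single spam-trap hit marks an IP as malicious indefinitely, can be sketched as a toy. The data and the flag-forever rule below are illustrative assumptions meant to show why such a scheme might over-count shared Tor exit IPs; this is not CloudFlare’s or Project Honey Pot’s actual implementation.

```python
# Toy IP reputation system: an IP is flagged "malicious" if it has *ever*
# mailed a trap address, and later legitimate traffic never clears the flag.
def build_reputation(events, trap_addresses):
    """Map each source IP to 'malicious' or 'clean' from (src_ip, dest) events."""
    reputation = {}
    for src_ip, dest in events:
        if dest in trap_addresses:
            reputation[src_ip] = "malicious"   # one trap hit taints the IP forever
        else:
            reputation.setdefault(src_ip, "clean")
    return reputation

traps = {"honeypot@example.com"}
events = [
    ("10.0.0.1", "alice@example.com"),
    ("10.0.0.2", "honeypot@example.com"),  # spam-trap hit
    ("10.0.0.2", "bob@example.com"),       # later legitimate traffic does not clear it
]
rep = build_reputation(events, traps)
print(rep)  # → {'10.0.0.1': 'clean', '10.0.0.2': 'malicious'}
```

Because many Tor users share a handful of exit IPs, one abuser under this rule would taint everyone behind the same exit, which is exactly the inflation the Tor Project suspects.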

This article raises some interesting points, but it also alludes to a more universal problem: making sense of any information published online. Building an epistemology of technology, as of many areas of study, means chasing a moving target, because knowledge about technology is complicated by the relationship between technology and information dissemination. The important questions are what does one know about Tor, and how does one know it?

 

Megan Feil, June 28, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

SLI Systems Hopeful as Losses Narrow and Revenue Grows

June 14, 2016

The article titled “SLI Systems Narrows First-Half Loss” on Scoop reports revenue growth and plans to mitigate losses. SLI Systems is a New Zealand-based software-as-a-service (SaaS) business that provides cloud-based search resources to online retailers. Founded in 2001, SLI Systems has already weathered its share of storms, such as the dot-com crash that threatened to stall its core technology (developed at GlobalBrain). According to a statement from the company, last year’s loss of $502K was an improvement on the loss of $4.1M in 2014. The article states,

“SLI shares have dropped 18 percent in the past 12 months, to trade recently at 76 cents, about half the level of the 2013 initial public offering price of $1.50. The software developer missed its sales forecast for the second half of the 2015 year but is optimistic new chief executive Chris Brennan and Martin Onofrio as chief revenue officer, both Silicon Valley veterans, can drive growth in revenue and earnings.”

The SLI of SLI stands for Search, Learn and (appropriately) Improve. The company hopes to achieve sustainable growth without raising additional capital by continuing to focus on innovation and customer retention rates, which slipped from 90% to 87% recently. Major clients include Lenovo, David Jones, Harvey Norman, and Paul Smith.

 

 

Chelsea Kerwin, June 14, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Axcelerate Focuses on Control and Visibility

June 13, 2016

The article on CMSWire titled “Recommind Adds Muscle to Cloud e-Discovery” describes upgrades to Recommind’s Axcelerate e-discovery platform. The muscle in question is the new Efficiency Scoring feature, which aims to increase transparency in the e-discovery review process by tracking efficiency and enabling consistent assessment. The article explains,

“Axcelerate Cloud is built on Recommind’s interactive business intelligence layer to give legal professionals a depth of insight into the e-discovery process that Recommind says they have previously lacked. Behind all the talk of agility and visibility, there is one goal here: control. The company hopes this release allays the fears of legal firms, who traditionally have been reluctant to use cloud-based software for fear of compromising data.”

Hal Marcus, director of product marketing at Recommind, suggested that despite early hesitancy among legal professionals to embrace the cloud, current legal teams are more open to the possibilities of consolidating discovery requirements in the cloud. According to research, no enterprise legal department is without cloud-based legal resources related to contract management, billing, or e-discovery. Axcelerate Cloud aims to give legal professionals visibility into discovery practices, addressing their major concern: insufficient insight and transparency.

 

 

Chelsea Kerwin, June 13, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

The Unknown Future of Google Cloud Platform

June 10, 2016

While many may perceive Google as dominant in many business sectors, a recently published graph tells a different story when it comes to cloud computing. Datamation ran a story, “Why Google Will Dominate Cloud Computing,” which shows Google in fourth place: Amazon, Microsoft, and IBM all sit above the search giant in cloud infrastructure services, based on fourth-quarter 2015 market share and revenue growth. The article explains why Google appears to be struggling,

“Yet as impressive as its tech prowess is, GCP’s ability to cater to the prosaic needs of enterprise cloud customers has been limited, even fumbling. Google has always focused more on selling its own services rather than hosting legacy applications, but these legacy apps are the engine that drives business. Remarkably, GCP customers don’t get support for Oracle software, as they do on Amazon Web Services. Alas, catering to the needs of enterprise clients isn’t about deep genius – it’s about working with others. GCP has been like the high school student with straight A’s and perfect SAT scores that somehow doesn’t have too many friends.”

Despite the current situation, the article hypothesizes Google Cloud Platform may have an edge in the long-term. This is quite a bold prediction. We wonder if Datamation may approach the goog to sell some ads. Probably not, as real journalists do not seek money, right?

 

Megan Feil, June 10, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
