September 29, 2016
When I first began reading the EasyAsk article, “Search Laboratory: Rock ‘n’ Roll Lab Rats,” it had the typical story about search difficulties and the importance of an accurate, robust search engine. It even includes a video featuring personified search engines and the troubles a user goes through to locate a simple item, although the video refers to Google Analytics. The article pokes fun at EasyAsk employees and how they developed the Search Lab, where they work on improving search functions.
One of the experiments that Search Lab worked on is “sticky search.” What is sticky search? Do you throw a keyword reel covered in honey into the Web pool and see what returns? Is it like Google’s “I’m Feeling Lucky” button? None of these are correct. The Search Lab conducted an experiment where the last search term was loaded into the search box when a user revisited the site. The Search Lab tracked the results and discovered:
As you can see, the sticky search feature was used by close-to one third of the people searching from the homepage, but by a smaller proportion of people on other types of page. Again, this makes sense as you’re more likely to use the homepage as a starting point when your intention is to return to a previously viewed product. We had helped 30% of people searching from our homepage get to where they wanted to go more quickly, but added inconvenience to the other two thirds (and 75% of searchers across the site as a whole) because to perform their searches, rather than just tapping the search box and beginning to type they now had to erase the old (sticky) search term too.
In other words, it was annoying. Search Lab retracted the experiment, but it was a decent effort to try something new, even if the results could have been predicted. Keep experimenting with search options, Search Lab, but keep the search box empty.
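The sticky-search behavior described above is easy to sketch. This is a hypothetical illustration in Python, not EasyAsk's actual implementation; the names `SessionStore` and `prefill_search_box` are my own.

```python
# Hypothetical sketch of the "sticky search" experiment: remember each
# visitor's last query and pre-fill the search box with it on return visits.
# Names and structure are illustrative, not EasyAsk's code.

class SessionStore:
    """Keeps the last search term per visitor, as a cookie might."""
    def __init__(self):
        self._last_query = {}

    def record_search(self, visitor_id, query):
        self._last_query[visitor_id] = query

    def last_search(self, visitor_id):
        return self._last_query.get(visitor_id, "")


def prefill_search_box(store, visitor_id, sticky=True):
    """Return the value to pre-load into the search box.

    With sticky=False (the rollback), the box is always empty and the
    visitor can just tap and type a fresh query.
    """
    return store.last_search(visitor_id) if sticky else ""


store = SessionStore()
store.record_search("visitor-42", "red running shoes")
print(prefill_search_box(store, "visitor-42"))                 # old term returns
print(prefill_search_box(store, "visitor-42", sticky=False))   # empty box
```

The experiment's finding maps directly onto the `sticky` flag: helpful for the minority returning to a previous product, an extra delete step for everyone else.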
August 17, 2016
Search engine optimization is the bane of Web experts. Why? If you know how to use it, you can increase your rankings in search engines and drive more traffic to your pages, but if you are a novice at SEO, you are out of luck. Search Engine Land shares some bad SEO stories in “SEO Is As Dirty As Ever.”
Do not forget other shady techniques like shady sales tactics, removing links, paid links, spam, link networks, building another Web site on a different domain, abusing review sites, and reusing content. One thing to remember is that:
“It’s not just local or niche companies that are doing bad things; in fact, enterprise and large websites can get away with murder compared to smaller sites. This encourages some of the worst practices I’ve ever seen, and some of these companies do practically everything search engines tell them not to do.”
Ugh! The pot is calling the kettle black.
There is a Louisville, Kentucky Hidden/Dark Web meetup on August 23, 2016.
Information is at this link: https://www.meetup.com/Louisville-Hidden-Dark-Web-Meetup/events/233019199/
October 9, 2015
Enterprise search company Attivio has an interesting post in their Data Dexterity Blog titled “3 Questions for the CEO.” We tend to keep a close eye on industry leader Attivio, and for good reason. In this post, the company’s senior director of product marketing Jane Zupan posed a few questions to her CEO, Stephen Baker, about their role in the enterprise search market. Her first question has Baker explaining his vision for the field’s future, “search-based data discovery”; he states:
“With search-based data discovery, you would simply type a question in your natural language like you do when you perform a search in Google and get an answer. This type of search doesn’t require a visualization tool. So, for example, you could ask a question like ‘tell me what type of weather conditions which exist most of the time when I see a reduction in productivity in my oil wells.’ The answer that comes back, such as ‘snow,’ or ‘sleet,’ gives you insights into how weather patterns affect productivity. Right now, search can’t infer what a question means. They match the words in a query, or keywords, with words in a document. But [research firm] Gartner says that there is an increasing importance for an interface in BI tools that extend BI content creation, analysis and data discovery to non-skilled users. You don’t need to be familiar with the data or be a business analyst or data scientist. You can be anyone and simply ask a question in your words and have the search engine deliver the relevant set of documents.”
Yes, many of us are looking forward to that day. Will Attivio be the first to deliver? The interview goes on to discuss the meaning of the company’s slogan, “the data dexterity company.” Part of the answer involves gaining access to “dark data” buried within organizations’ data silos. Finally, Zupan asks what “sets Attivio apart?” Baker’s answers: the ability to quickly access data from more sources; deriving structure from and analyzing unstructured data; and friendliness to “non-technical” users.
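Baker's point that today's engines "match the words in a query, or keywords, with words in a document" can be made concrete with a toy inverted index. This is a generic sketch of keyword matching, not Attivio's technology; the documents and function names are invented for illustration.

```python
# A minimal inverted-index sketch of the keyword matching Baker describes:
# the engine matches query words against document words, it does not infer
# what the question means.
from collections import defaultdict


def build_index(docs):
    """Map each word to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index


def keyword_search(index, query):
    """Return ids of documents containing every query word."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results


docs = {
    "well-report": "snow reduced productivity in the north oil wells",
    "memo": "sleet and wind delayed the drilling crew",
}
index = build_index(docs)
print(keyword_search(index, "snow productivity"))  # → {'well-report'}
```

Ask this index Baker's full natural-language question about weather and oil-well productivity and it matches only the literal words; the "search-based data discovery" he envisions would have to infer the relationship instead.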
Launched in 2008, Attivio is headquartered in Newton, Massachusetts. Their team includes folks with an advantageous combination of backgrounds: in search, database, and business intelligence companies.
Cynthia Murrell, October 9, 2015
August 13, 2015
A new acquisition by CounterTack brings predictive capability to that company’s security offerings, we learn from “CounterTack Acquires ManTech Cyber Solutions” at eWeek. Specifically, it is a division of ManTech International, dubbed ManTech Cyber Solutions International (MCSI), that has been snapped up under undisclosed terms by the private security firm.
CounterTack president and CEO Neal Creighton says the beauty of the deal lies in the lack of overlap between their tech and what MCSI brings to the table; while their existing products can tell users what is happening or has already happened, MCSI’s can tell them what to watch out for going forward. Writer Sean Michael Kerner elaborates:
“MCSI’s technology provides a lot of predictive capabilities around malware that can help enterprises determine how dangerous a malicious payload might be, Creighton said. Organizations often use the MCSI Responder Pro product after an attack has occurred to figure out what has happened. In contrast, the MCSI Active Defense product looks at issues in real time to make predictions, he said. A big area of concern for many security vendors is the risk of false positives for security alerts. With the Digital DNA technology, CounterTack will now have a predictive capability to be able to better determine the risk with a given malicious payload. The ability to understand the potential capabilities of a piece of malware will enable organizations to properly provide a risk score for a security event. With a risk score in place, organizations can then prioritize malware events to organize resources to handle remediation, he said.”
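The triage step Kerner describes, scoring a payload's risk and then prioritizing remediation, can be sketched simply. The signals and weights below are my own illustrative assumptions, not CounterTack's or MCSI's actual scoring model.

```python
# A hypothetical sketch of risk-scored malware triage: assign each event a
# 0-100 score from a few signals, then handle the riskiest first. Weights
# are illustrative assumptions, not CounterTack's model.

def risk_score(event):
    """Combine illustrative signals into a 0-100 risk score."""
    score = 0
    if event.get("known_malicious"):
        score += 50
    if event.get("exfiltration_capability"):
        score += 30
    score += min(event.get("hosts_affected", 0), 20)  # cap host contribution
    return min(score, 100)


def prioritize(events):
    """Order events so remediation resources go to the highest risk first."""
    return sorted(events, key=risk_score, reverse=True)


events = [
    {"id": "evt-1", "known_malicious": False, "hosts_affected": 3},
    {"id": "evt-2", "known_malicious": True,
     "exfiltration_capability": True, "hosts_affected": 12},
]
print([e["id"] for e in prioritize(events)])  # → ['evt-2', 'evt-1']
```

The value of a predictive capability, as the quote argues, is precisely that a score like this can be computed before an attack finishes rather than reconstructed afterward.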
Incorporation of open-source Hadoop means CounterTack can scale to fit any organization, and the products can be deployed on-premises or in the cloud. Creighton notes his company’s primary competitor is security vendor CrowdStrike; we’ll be keeping an eye on both these promising firms.
Based in Waltham, Massachusetts, CounterTack was founded in 2007. The company declares their Sentinel platform to be the only in-progress attack intelligence and response solution on the market (for now). Founded way back in 1968, ManTech International develops and manages solutions for cyber security, C4ISR, systems engineering, and global logistics from their headquarters in Washington, DC. Both companies are currently hiring; click here for opportunities at CounterTack, and here for ManTech’s careers page.
Cynthia Murrell, August 13, 2015
July 7, 2015
RSS feeds and Web page readers curate content from select Web sites tailored to suit a user’s needs. While all of the content is gathered in one spot and the headlines are available to read, sometimes the readers return hundreds of articles and users do not have the time to read all of them. True, sometimes users can glean the facts from the headlines and the small blurb included with them, but sometimes it is not enough.
There are apps that gather and summarize a user’s content, but these are usually geared towards a specific industry or an enterprise system. There is a content reader that was designed for the average user, while at the same time it can be programmed to serve the needs of many professionals. The Context Organizer from Content Discovery Inc. is an application that summarizes Web pages and documents in order to pinpoint relevant information. The Context Organizer works via five basic steps:
“1. Get to the point – Speed-up reading by condensing web pages, emails and documents into keywords and summaries presented in context.
2. Make a Long Story Short – The Short Summary headlines most important sentences – instant information capsules.
3. Accelerate Search – Search the web with relevant keywords. Summarize Google search results for rapid understanding.
4. Take Notes – Quickly collect topics and sentences. Send them to WordPad or Word. Share notes – send them by e-mail.
5. Visualize – View summaries in context as Mindjet MindManager maps.”
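The "make a long story short" step above is, at heart, extractive summarization. Here is a rough frequency-based sketch of that general technique; it is a common textbook approach, not Context Organizer's actual algorithm, and the function names are mine.

```python
# A generic extractive-summarization sketch: score each sentence by the
# average frequency of its words across the whole text, then keep the top
# scorers in their original order. Not Context Organizer's algorithm.
import re
from collections import Counter


def summarize(text, max_sentences=2):
    """Return the highest-scoring sentences, preserving document order."""
    sentences = [s.strip() for s in re.split(r"[.!?]", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z]+", text.lower()))

    def score(sentence):
        words = re.findall(r"[a-z]+", sentence.lower())
        return sum(freq[w] for w in words) / max(len(words), 1)

    top = set(sorted(sentences, key=score, reverse=True)[:max_sentences])
    return [s for s in sentences if s in top]


text = "search is hard. search is very hard. cats nap quietly."
print(summarize(text))  # → ['search is hard', 'search is very hard']
```

Sentences built from the document's most frequent words win, which is why this style of summary tends to surface a text's main topic while dropping asides.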
There are three different Context Organizer versions: one that specifically searches the Web, another that searches the Web and Microsoft products, and a third that combines the prior versions and includes the Mindjet MindManager. The prices range from $60-$120 with a free twenty-one day trial, which we suggest you start with; otherwise you might be throwing away money on an item you do not like. With the amount of content available on the Web, any tool that helps organize and summarize it is worth investigating.
June 11, 2015
Forbes’ article “The 50 Most Innovative Companies Of 2014: Strong Innovators Are Three Times More Likely To Rely on Big Data Analytics” points out how innovation is strongly tied to big data analytics and data mining these days. The Boston Consulting Group (BCG) studies the methodology of innovation. The numbers are astounding when companies that use big data are compared with those that still have not figured out how to use their data: 57% vs. 19%.
Innovation, however, is not entirely defined by big data. Most of the companies that rely on big data as key to their innovation are software companies. Forbes found that 53% see big data as having a huge impact in the future, while BCG found only 41% who saw big data as vital to their innovation.
Big data cannot be and should not be ignored. Forbes and BCG found that big data analytics are useful and can have huge turnouts:
“BCG also found that big-data leaders generate 12% higher revenues than those who do not experiment and attempt to gain value from big data analytics. Companies adopting big data analytics are twice as likely as their peers (81% versus 41%) to credit big data for making them more innovative.”
Measuring innovation proves to be subjective, but one cannot deny the positive effect big data analytics and data mining can have on a company. You have to realize, though, that big data results are useless without a plan to implement and use the data. Also take note that none of the major search vendors are considered “innovative,” when a huge part of big data involves searching for results.
Whitney Grace, June 11, 2015
October 12, 2014
Oracle’s Secure Enterprise Search offered advanced security. Perfect Search stressed its speed. SES has been marginalized. That particular security pitch did not work. Perfect Search also has faded from the scene.
Perhaps pitching both security and speed will yield more together than as separate features.
SRCH2 asserts that it is four times faster than open source search engines. None of the open source search engines is a speed demon. Speed boosts require additional work on the specific subsystem introducing the latency for a particular deployment.
SRCH2’s “Real Time Computer Requires Faster Search” makes a case for the optimization built in to SRCH2’s system. The article states:
SRCH2 offers the world’s fastest search engine. Why is speed so important? After all, the human eye can’t detect the difference between a 10-millisecond and 50-millisecond response time.
Some data backing this assertion would be helpful. In a direct comparison of Lucid Works’ technology with ElasticSearch’s technology, the ArnoldIT team found that one was faster at indexing and the other was faster at query processing. Both could be improved with focused optimization. Perhaps SRCH2 will share some of the data that backs up the “four times faster” claim? (I am not at liberty to release the performance data a client requested my team compile from live tests on my test corpus.)
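The Lucid Works versus ElasticSearch observation above suggests why a single "faster" claim is slippery: indexing throughput and query latency are separate measurements, and an engine can win one while losing the other. A rough harness for keeping them separate, using a toy stand-in engine of my own invention:

```python
# A rough benchmarking sketch: time bulk indexing and query processing
# independently, since an engine can be fast at one and slow at the other.
# ToyEngine is an illustrative stand-in, not any real search product.
import time


class ToyEngine:
    def __init__(self):
        self.index = {}

    def add(self, doc_id, text):
        for word in text.split():
            self.index.setdefault(word, set()).add(doc_id)

    def query(self, word):
        return self.index.get(word, set())


def benchmark(engine, docs, queries):
    """Return (indexing seconds, query seconds) measured separately."""
    t0 = time.perf_counter()
    for doc_id, text in docs:
        engine.add(doc_id, text)
    index_seconds = time.perf_counter() - t0

    t0 = time.perf_counter()
    for q in queries:
        engine.query(q)
    query_seconds = time.perf_counter() - t0
    return index_seconds, query_seconds


docs = [(i, "alpha beta gamma") for i in range(1000)]
idx_s, qry_s = benchmark(ToyEngine(), docs, ["alpha"] * 1000)
print(f"index: {idx_s:.4f}s  query: {qry_s:.4f}s")
```

A "four times faster" claim is only meaningful once it says which of these two numbers it refers to, on what corpus, and against which configuration.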
SRCH2’s “SRCH2 Introduces Access Control Lists to Improve Search Security” addresses the security side. The article states:
SRCH2 took the approach of providing native support of access control to set restrictions on search results. With SRCH2’s ACL feature, developers can restrict user permissions to access either certain records in an index, or specific attributes within a record or set of records.
The approach is useful. However, it is less robust than the Oracle approach, which implemented a wider range of features provided by specialized Oracle subsystems.
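The two ACL granularities the article names, restricting whole records and restricting specific attributes within a record, can be sketched as a post-filter on search results. The record structure below is an assumption for illustration, not SRCH2's actual API.

```python
# A simplified sketch of record- and attribute-level ACLs of the kind the
# article describes: drop records the user may not see, and mask restricted
# attributes on the records that remain. Structure is illustrative only.

def filter_results(results, user_roles):
    """Apply record-level then attribute-level ACLs for a set of roles."""
    visible = []
    for record in results:
        acl = record.get("acl", {})
        allowed_roles = acl.get("record_roles")
        if allowed_roles and not (user_roles & allowed_roles):
            continue  # record-level ACL: user may not see this record at all
        attr_roles = acl.get("attribute_roles", {})
        cleaned = {
            k: v for k, v in record["fields"].items()
            # attribute-level ACL: keep a field only if it is unrestricted
            # or the user holds one of its required roles
            if not attr_roles.get(k) or (user_roles & attr_roles[k])
        }
        visible.append(cleaned)
    return visible


results = [
    {"fields": {"title": "Q3 report", "salary": "120k"},
     "acl": {"record_roles": {"staff"},
             "attribute_roles": {"salary": {"hr"}}}},
    {"fields": {"title": "Press release"}, "acl": {}},
]
print(filter_results(results, {"staff"}))  # sees record, salary masked
print(filter_results(results, {"guest"}))  # only the unrestricted record
```

A native implementation inside the engine, as SRCH2 claims to provide, would apply these checks during retrieval rather than after it, which avoids leaking restricted hits through result counts.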
Will the combination of security and speed pay off for SRCH2? Good question. I do not have an answer.
Stephen E Arnold, October 11, 2014
March 10, 2014
Phil Leggetter is a real-time web software and developer evangelist. On his blog he wrote a post entitled “10 Real Time Web Technology Predictions For 2014.” He says in the post that he based his 2014 predictions on trends in 2013 and what has happened so far in 2014.
He notes that nearly all applications now include a real-time sync component for relevancy and that real time is becoming a common commodity. This means that real-time features will be included in frameworks, but it will not diminish their importance. One can expect to see more real-time APIs, vendors increasing API offerings and adding to their value, and WebHooks gaining more prominence.
Leggetter mentions that open source needs a data sync solution, which comes as a surprise because there is an open source program for nearly everything. Why has this not been made yet?
Video and audio communication are getting even bigger. Real time video and data communication in real time is going to be even more important for applications and it might be time to check out peer-to-peer data sharing. What is even better is real time developer tools are on the horizon.
“The next 10 months of 2014 is going to be very exciting for real time web technology, real time solution providers, real time hosted services, and more importantly for us developers. I expect some serious advancements in existing solutions and some new players to come along. Real time web technology is going to become even easier to integrate into existing applications and we’re going to have a much wider range of choice when building real time apps from the ground up.”
Will real-time technology be the buzzword trend this year? Again, these are only predictions.
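The real-time data sync Leggetter says open source lacks boils down to a publish/subscribe pattern: subscribers register interest in a channel and every published change fans out to them immediately, rather than waiting to be polled. A minimal sketch, with names of my own choosing rather than any specific framework's API:

```python
# A minimal publish/subscribe sketch of real-time data sync: every change
# published to a channel is pushed to all subscribers at once. In a real
# system the push would travel over WebSockets or a WebHook; here each
# callback is invoked synchronously for illustration.

class Channel:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Register a callback to receive every future change."""
        self._subscribers.append(callback)

    def publish(self, change):
        """Fan the change out to every subscriber immediately."""
        for callback in self._subscribers:
            callback(change)


received = []
prices = Channel()
prices.subscribe(received.append)                      # a local cache
prices.subscribe(lambda c: received.append(("ui", c))) # a UI updater
prices.publish({"symbol": "ACME", "price": 42})
print(received)
```

The push model is what distinguishes real-time sync from the request/response Web: the server decides when data moves, not the client.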
January 14, 2013
Dr. Jerry Lucas, founder of TeleStrategies, is an expert in digital information and founder of the ISS World series of conferences. “ISS” is shorthand for “intelligence support systems.” The scope of Dr. Lucas’ interests ranges from the technical innards of modern communications systems to the exploding sectors for real-time content processing. Analytics, fancy math, and online underpin Dr. Lucas’ expertise and form the backbone of the company’s training and conference activities.
What makes Dr. Lucas’ viewpoint of particular value is his deep experience in “lawful interception, criminal investigations, and intelligence gathering.” The perspective of an individual with Dr. Lucas’ professional career offers an important and refreshing alternative to the baloney promulgated by many of the consulting firms explaining online systems.
Dr. Lucas offered a more “internationalized” view of the Big Data trend which is exercising many US marketers’ and sales professionals’ activities. He said:
“Big Data” is an eye catching buzzword that works in the US. But as you go east across the globe, “Big Data” as a buzzword doesn’t get traction in the Middle East, Africa and Asia Pacific Regions if you remove Russia and China. One interesting note is that Russian and Chinese government agencies only buy from vendors based in their countries. The US Intelligence Community (IC) has big data problems because of the obvious massive amount of data gathered that’s now being measured in zettabytes. The data gathered and stored by the US Intelligence Community is growing beyond what typical database software products can handle as well as the tools to capture, store, manage and analyze the data. For the US, Western Europe, Russia and China, “Big Data” is a real problem and not a hyped up buzzword.
Western vendors have been caught in the boundaries between different countries’ requirements. Dr. Lucas observed:
A number of western vendors made a decision because of the negative press attention to abandon the global intelligence gathering market. In the US Congress Representative Chris Smith (R, NJ) sponsored a bill that went nowhere to ban the export of intelligence gathering products period. In France a Bull Group subsidiary, Amesys legally sold intelligence gathering systems to Lybia but received a lot of bad press during Arab Spring. Since Amesys represented only a few percent of Bull Group’s annual revenues, they just sold the division. Amesys is now a UAE company, Advanced Middle East Systems (Ames). My take away here is governments particularly in the Middle East, Africa and Asia have concerns about the long term regional presence of western intelligence gathering vendors who desire to keep a low public profile. For example, choosing not to exhibit at ISS World Programs. The next step by these vendors could be abandoning the regional marketplace and product support.
The desire for federated information access is, based on the vendors’ marketing efforts, high. Dr. Lucas made this comment about the existence of information silos:
Consider the US where you have 16 federal organizations collecting intelligence data plus the oversight of the Office of Director of National Intelligence (ODNI). In addition there are nearly 30,000 local and state police organizations collecting intelligence data as well. Data sharing has been a well identified problem since 9/11. Congress established the ODNI in 2004 and funded the Department of Homeland Security to set up State and Local Data Fusion Centers. To date Congress has not been impressed. DNI James Clapper has come under intelligence gathering fire over Benghazi and the DHS has been criticized in an October Senate report that the $1 Billion spent by DHS on 70 state and local data fusion centers has been an alleged waste of money. The information silo or the information stovepipe problem will not go away quickly in the US for many reasons. Data cannot be shared because one agency doesn’t have the proper security clearances, job security which means “as long as I control access the data I have a job,” and privacy issues, among others.
Stephen E Arnold interviewed Dr. Lucas on January 10, 2013. The full text of that exclusive interview is at http://www.arnoldit.com/search-wizards-speak/telestrategies-2.html, on the ArnoldIT.com subsite “Search Wizards Speak.” The full text of the 2011 interview with Dr. Lucas is at this link.
Donald Anderson, January 14, 2013
July 12, 2012
AtHoc joined forces with Intel and received a $5.6 million investment to improve their technology. Since they are the leader in enterprise-class, network-based mass notification systems for the security, life safety and defense sectors of the United States, one would have to agree that was a wise investment.
Contrary to some beliefs, there is more to search than keywords. The recent press release on AtHoc’s page, “Intel Invests in AtHoc; Chairman of RSA Security Joins AtHoc’s Board,” is a reminder that advances in device technology demand improvements in critical situational-awareness data. Organizations must be able to swiftly analyze and address anomalies because lives may depend on it.
AtHoc does just that with real-life, real-time alerts, as stated:
“AtHoc helps organizations become fully prepared to provide emergency mass communication to all of its constituents. It allows users to provide additional data and responders to remediate the issues at hand, based on the information they receive. AtHoc improves the safety and security of our citizens, first responders, and armed forces personnel around the world.”
Just imagine attempting to get a real-time response from the average search engine during an emergency. The repercussions of scanning pages of possible aid would almost assuredly be life threatening. When considering the outcome from that perspective, real-life, real-time alerts show there is more to search than keywords.
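The contrast drawn above, push notification versus pull search, can be sketched as a simple fan-out: an alert is delivered to every registered constituent at once instead of waiting for each of them to go looking. The structure and channel names are illustrative assumptions, not AtHoc's API.

```python
# A hypothetical mass-notification sketch: fan one alert out to every
# registered constituent over their preferred channel, immediately.
# Names and record structure are illustrative, not AtHoc's system.

def notify_all(constituents, alert):
    """Return one delivery record per constituent, to be pushed at once."""
    return [
        {"to": person["contact"],
         "channel": person["channel"],
         "message": alert}
        for person in constituents
    ]


constituents = [
    {"contact": "ops@example.gov", "channel": "email"},
    {"contact": "+1-555-0100", "channel": "sms"},
]
deliveries = notify_all(constituents, "Shelter in place: chemical spill")
print(len(deliveries))  # → 2: every constituent receives the alert
```

The inversion is the whole point: in an emergency the system initiates contact with everyone simultaneously, rather than each person issuing a query and sifting results.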
Jennifer Shockley, July 12, 2012