January 20, 2017
Just a quick honk about a little Google feature called Popular Times. Lifehacker points out an improvement to the tool in, “Google Will Now Show You How Busy a Business Is in Real Time.” To help users determine the most efficient time to shop or dine, the feature already provided a general assessment of businesses’ busiest times. Now, though, it bases that information on real-time metrics. Writer Thorin Klosowski specifies:
The real time data is rolling out starting today. You’ll see that it’s active if you see a ‘Live’ box next to the popular times when you search for a business. The data is based on location data and search terms, so it’s not perfect, but will at least give you a decent idea of whether or not you’ll easily find a place to sit at a bar or how packed a store might be. Alongside the real-time data comes some other info, including how long people stay at a location on average and hours by department, which is handy when a department like a pharmacy or deli closes earlier than the rest of a store.
Just one more way Google tries to make life a little easier for its users. That using it provides Google with even more free, valuable data is just a side effect, I’m sure.
Cynthia Murrell, January 20, 2017
December 30, 2016
Do you remember where you were or what you searched the first time you used Google? This Investors.com author does and shares that story, along with the story of what may be the last time he used Google. The article, entitled “Google Makes An ‘Historic’ Mistake,” reports on the demise of a search feature on mobile: users may no longer search published dates within a custom range, a tool formerly accessed by clicking “Search tools” and then “Any time.” The article provides Google’s explanation for the elimination of this feature.
On a product forum page where it made this announcement, Google says:
After much thought and consideration, Google has decided to retire the Search Custom Date Range Tool on mobile. Today we are starting to gradually unlaunch this feature for all users, as we believe we can create a better experience by focusing on more highly-utilized search features that work seamlessly across both mobile and desktop. Please note that this will still be available on desktop, and all other date restriction tools (e.g., “Past hour,” “Past 24 hours,” “Past week,” “Past month,” “Past year”) will remain on mobile.
The author critiques Google, saying this move forces users back to the dying desktop for a feature no longer prioritized on mobile. The critique, however, appears to miss the point: the feature was not heavily utilized. With the influx of real-time data, who needs history, and who needs time limits? Certainly not a Google mobile search user.
Megan Feil, December 30, 2016
September 29, 2016
When I first began reading the EasyAsk article “Search Laboratory: Rock ‘n’ Roll Lab Rats,” it seemed to be the typical story about search difficulties and the importance of an accurate, robust search engine. It even includes a video featuring personified search engines and the troubles a user goes through to locate a simple item, although the video refers to Google Analytics. The article pokes fun at EasyAsk employees and describes how they developed the Search Lab, where they work on improving search functions.
One of the experiments that Search Lab worked on is “sticky search.” What is sticky search? Do you throw a keyword reel covered in honey into the Web pool and see what returns? Is it like the Google “I’m Feeling Lucky” button? None of these are correct. The Search Lab conducted an experiment in which the last search term was loaded into the search box when a user revisited the site. The Search Lab tracked the results and discovered:
As you can see, the sticky search feature was used by close-to one third of the people searching from the homepage, but by a smaller proportion of people on other types of page. Again, this makes sense as you’re more likely to use the homepage as a starting point when your intention is to return to a previously viewed product. We had helped 30% of people searching from our homepage get to where they wanted to go more quickly, but added inconvenience to the other two thirds (and 75% of searchers across the site as a whole) because to perform their searches, rather than just tapping the search box and beginning to type they now had to erase the old (sticky) search term too.
In other words, it was annoying. Search Lab retracted the experiment, but it was a decent effort to try something new, even if the results could have been predicted. Keep experimenting with search options, Search Lab, but keep the search box empty.
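The mechanics of the experiment are simple enough to sketch. Below is a minimal, hypothetical Python version of sticky search, assuming a simple per-session store; the class and method names are illustrative, not EasyAsk’s actual code:

```python
# Hypothetical sketch of "sticky search": remember each visitor's last
# query and prefill the search box with it on the next page load.

class StickySearch:
    def __init__(self):
        self._last_query = {}  # session id -> last search term

    def search(self, session_id, query):
        # Record the term so it can be "stuck" into the box later.
        self._last_query[session_id] = query
        return f"results for {query!r}"

    def search_box_value(self, session_id):
        # The downside Search Lab observed: returning users must erase
        # this prefilled text before typing a new query.
        return self._last_query.get(session_id, "")

searches = StickySearch()
searches.search("visitor-1", "red sneakers")
print(searches.search_box_value("visitor-1"))  # prefilled with last term
print(searches.search_box_value("visitor-2"))  # empty for a new visitor
```

The trade-off Search Lab measured falls out of the last method: the prefilled value helps the minority returning to a prior product and costs everyone else a delete step.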
August 17, 2016
Search engine optimization is the bane of Web experts. Why? If you know how to use it, you can increase your rankings in search engines and drive more traffic to your pages; if you are a novice at SEO, you are screwed. Search Engine Land shares some bad SEO stories in “SEO Is As Dirty As Ever.”
Do not forget other shady techniques like the always famous shady sales, paid links, spam, link networks, removing links, building another Web site on a different domain, abusing review sites, and reusing content. One thing to remember is that:
“It’s not just local or niche companies that are doing bad things; in fact, enterprise and large websites can get away with murder compared to smaller sites. This encourages some of the worst practices I’ve ever seen, and some of these companies do practically everything search engines tell them not to do.”
Ugh! The pot is identifying another pot and complaining about its color and cleanliness.
There is a Louisville, Kentucky Hidden /Dark Web meet up on August 23, 2016.
Information is at this link: https://www.meetup.com/Louisville-Hidden-Dark-Web-Meetup/events/233019199/
October 9, 2015
Enterprise search company Attivio has an interesting post in their Data Dexterity Blog titled “3 Questions for the CEO.” We tend to keep a close eye on industry leader Attivio, and for good reason. In this post, the company’s senior director of product marketing Jane Zupan posed a few questions to her CEO, Stephen Baker, about their role in the enterprise search market. Her first question has Baker explaining his vision for the field’s future, “search-based data discovery”; he states:
“With search-based data discovery, you would simply type a question in your natural language like you do when you perform a search in Google and get an answer. This type of search doesn’t require a visualization tool. So, for example, you could ask a question like ‘tell me what type of weather conditions which exist most of the time when I see a reduction in productivity in my oil wells.’ The answer that comes back, such as ‘snow,’ or ‘sleet,’ gives you insights into how weather patterns affect productivity. Right now, search can’t infer what a question means. They match the words in a query, or keywords, with words in a document. But [research firm] Gartner says that there is an increasing importance for an interface in BI tools that extend BI content creation, analysis and data discovery to non-skilled users. You don’t need to be familiar with the data or be a business analyst or data scientist. You can be anyone and simply ask a question in your words and have the search engine deliver the relevant set of documents.”
Yes, many of us are looking forward to that day. Will Attivio be the first to deliver? The interview goes on to discuss the meaning of the company’s slogan, “the data dexterity company.” Part of the answer involves gaining access to “dark data” buried within organizations’ data silos. Finally, Zupan asks what “sets Attivio apart?” Baker’s answers: the ability to quickly access data from more sources; deriving structure from and analyzing unstructured data; and friendliness to “non-technical” users.
Launched in 2008, Attivio is headquartered in Newton, Massachusetts. Their team includes folks with an advantageous combination of backgrounds in search, database, and business intelligence companies.
Cynthia Murrell, October 9, 2015
August 13, 2015
A new acquisition by CounterTack brings predictive capability to that company’s security offerings, we learn from “CounterTack Acquires ManTech Cyber Solutions” at eWeek. Specifically, it is a division of ManTech International, dubbed ManTech Cyber Solutions International (MCSI), that has been snapped up under undisclosed terms by the private security firm.
CounterTack president and CEO Neal Creighton says the beauty of the deal lies in the lack of overlap between their tech and what MCSI brings to the table; while their existing products can tell users what is happening or has already happened, MCSI’s can tell them what to watch out for going forward. Writer Sean Michael Kerner elaborates:
“MCSI’s technology provides a lot of predictive capabilities around malware that can help enterprises determine how dangerous a malicious payload might be, Creighton said. Organizations often use the MCSI Responder Pro product after an attack has occurred to figure out what has happened. In contrast, the MCSI Active Defense product looks at issues in real time to make predictions, he said. A big area of concern for many security vendors is the risk of false positives for security alerts. With the Digital DNA technology, CounterTack will now have a predictive capability to be able to better determine the risk with a given malicious payload. The ability to understand the potential capabilities of a piece of malware will enable organizations to properly provide a risk score for a security event. With a risk score in place, organizations can then prioritize malware events to organize resources to handle remediation, he said.”
Incorporation of the open-source Hadoop means CounterTack can scale to fit any organization, and the products can be deployed on-premises or in the cloud. Creighton notes his company’s primary competitor is security vendor CrowdStrike; we’ll be keeping an eye on both these promising firms.
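The prioritization Kerner describes can be illustrated with a toy example: score each malware event by the capabilities observed in its payload, then handle the highest-risk events first. The capability names and weights below are invented for illustration; they are not CounterTack’s actual model:

```python
# Toy capability-based risk scoring: each observed capability of a
# payload contributes to a score; events are handled highest-risk first.
# Weights and capability names are made up for illustration.

CAPABILITY_WEIGHTS = {
    "keylogging": 4,
    "data_exfiltration": 5,
    "lateral_movement": 3,
    "persistence": 2,
}

def risk_score(capabilities):
    # Unknown capabilities still count a little rather than zero.
    return sum(CAPABILITY_WEIGHTS.get(c, 1) for c in capabilities)

def prioritize(events):
    # events: list of (event_name, capabilities) pairs
    return sorted(events, key=lambda e: risk_score(e[1]), reverse=True)

events = [
    ("sample-a", ["persistence"]),
    ("sample-b", ["keylogging", "data_exfiltration"]),
    ("sample-c", ["lateral_movement"]),
]
for name, caps in prioritize(events):
    print(name, risk_score(caps))
```

With scores in place, remediation resources go to the top of the sorted list first, which is the workflow the quote attributes to risk scoring.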
Based in Waltham, Massachusetts, CounterTack was founded in 2007. The company declares their Sentinel platform to be the only in-progress attack intelligence and response solution on the market (for now). Founded way back in 1968, ManTech International develops and manages solutions for cyber security, C4ISR, systems engineering, and global logistics from their headquarters in Washington, DC. Both companies are currently hiring; click here for opportunities at CounterTack, and here for ManTech’s careers page.
Cynthia Murrell, August 13, 2015
July 7, 2015
RSS feeds and Web page readers curate content from select Web sites tailored to suit a user’s needs. While all of the content is gathered in one spot and the headlines are available to read, sometimes the readers return hundreds of articles and users do not have the time to read all of them. True, sometimes users can glean the facts from the headlines and the small blurb included with them, but sometimes that is not enough.
There are apps that gather and summarize a user’s content, but these are usually geared toward a specific industry or an enterprise system. There is, however, a content reader designed for the average user that can also be programmed to serve the needs of many professionals. The Context Organizer from Context Discovery Inc. is an application that summarizes Web pages and documents in order to pinpoint relevant information. The Context Organizer works via five basic steps:
“1. Get to the point – Speed-up reading by condensing web pages, emails and documents into keywords and summaries presented in context.
2. Make a Long Story Short – The Short Summary headlines most important sentences – instant information capsules.
3. Accelerate Search – Search the web with relevant keywords. Summarize Google search results for rapid understanding.
4. Take Notes – Quickly collect topics and sentences. Send them to WordPad or Word. Share notes – send them by e-mail.
5. Visualize – View summaries in context as Mindjet MindManager maps.”
There are three different Context Organizer versions: one that specifically searches the Web, another that searches the Web and Microsoft products, and a third that combines the prior versions and includes the Mindjet MindManager. Prices range from $60-$120, with a free twenty-one-day trial. We suggest starting with the free trial, because you might be throwing away money on an item you do not like. With the amount of content available on the Web, any tool that helps organize and summarize it is worth investigating.
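Summarizers of this general kind typically work extractively: score each sentence by the frequency of its non-trivial words and keep the highest-scoring ones. The sketch below shows that generic approach in Python; it is not Context Organizer’s actual algorithm, and the stop-word list is a placeholder:

```python
# Generic frequency-based extractive summarizer: score each sentence
# by how many common (non-stop) words it contains, keep the top ones.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "it", "for"}

def summarize(text, max_sentences=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = [w for w in re.findall(r"[a-z']+", text.lower())
             if w not in STOP_WORDS]
    freq = Counter(words)

    def score(sentence):
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower())
                   if w not in STOP_WORDS)

    top = sorted(sentences, key=score, reverse=True)[:max_sentences]
    # Emit the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in top)

text = ("Search tools return too many articles. Summaries condense articles "
        "into keywords. Keywords and summaries help readers find articles fast.")
print(summarize(text, max_sentences=1))
```

The same sentence scores could feed the product’s other steps, such as pulling the highest-frequency terms out as keywords.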
June 11, 2015
Forbes’ article “The 50 Most Innovative Companies Of 2014: Strong Innovators Are Three Times More Likely To Rely on Big Data Analytics” points out how strongly innovation is tied to big data analytics and data mining these days. The Boston Consulting Group (BCG) studies the methodology of innovation. The numbers are striking when companies that use big data are placed against those that still have not figured out how to use their data: 57% vs. 19%.
Innovation, however, is not entirely defined by big data. Most of the companies that rely on big data as key to their innovation are software companies. According to Forbes’ study, they found that 53% see big data as having a huge impact in the future, while BCG only found 41% who saw big data as vital to their innovation.
Big data cannot be and should not be ignored. Forbes and BCG found that big data analytics are useful and can have huge turnouts:
“BCG also found that big-data leaders generate 12% higher revenues than those who do not experiment and attempt to gain value from big data analytics. Companies adopting big data analytics are twice as likely as their peers (81% versus 41%) to credit big data for making them more innovative.”
Measuring innovation proves to be subjective, but one cannot deny the positive effect big data analytics and data mining can have on a company. Realize, though, that big data results are useless without a plan to implement and use the data. Also note that none of the major search vendors is considered “innovative,” even though a huge part of big data involves searching for results.
Whitney Grace, June 11, 2015
October 12, 2014
Oracle’s Secure Enterprise Search offered advanced security. Perfect Search stressed its speed. SES has been marginalized. That particular security pitch did not work. Perfect Search also has faded from the scene.
Perhaps pitching both security and speed will yield more together than as separate features.
SRCH2 asserts that it is four times faster than open source search engines. None of the open source search engines is a speed demon. Speed boosts require additional work on the specific subsystem introducing the latency for a particular deployment.
SRCH2’s “Real Time Computer Requires Faster Search” makes a case for the optimization built in to SRCH2’s system. The article states:
SRCH2 offers the world’s fastest search engine. Why is speed so important? After all, the human eye can’t detect the difference between a 10-millisecond and 50-millisecond response time.
Some data backing this assertion would be helpful. In a direct comparison of Lucid Works’ technology with ElasticSearch’s technology, the ArnoldIT team found that one was faster at indexing and the other was faster at query processing. Both could be improved with focused optimization. Perhaps SRCH2 will share some of the data that backs up the “four times faster” claim? (I am not at liberty to release the performance data a client requested my team compile from live tests on my test corpus.)
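Speed claims of this sort are cheap to check. A minimal harness of the kind such tests imply might time repeated queries and compare medians; the toy engine and queries below are placeholders, not any vendor’s product:

```python
# Minimal sketch of a query-latency harness: run each query several
# times against an engine and report the median wall-clock latency.
import statistics
import time

def median_query_latency(search_fn, queries, runs=5):
    samples = []
    for query in queries:
        for _ in range(runs):
            start = time.perf_counter()
            search_fn(query)
            samples.append(time.perf_counter() - start)
    return statistics.median(samples)

# Placeholder "engine": a linear scan over an in-memory corpus.
corpus = [f"document {i} about search speed" for i in range(1000)]
def toy_engine(query):
    return [doc for doc in corpus if query in doc]

latency = median_query_latency(toy_engine, ["search", "speed"], runs=3)
print(f"median latency: {latency * 1000:.2f} ms")
```

Running the same harness against two engines on the same corpus and query set is the sort of apples-to-apples evidence a “four times faster” claim needs.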
SRCH2’s “SRCH2 Introduces Access Control Lists to Improve Search Security” outlines the company’s security pitch. The article states:
SRCH2 took the approach of providing native support of access control to set restrictions on search results. With SRCH2’s ACL feature, developers can restrict user permissions to access either certain records in an index, or specific attributes within a record or set of records.
The approach is useful. However, it is less robust than the Oracle approach, which implemented a wider range of features provided by specialized Oracle subsystems.
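In general terms, record- and attribute-level ACL filtering works by checking a user’s roles against each matching record before it is returned, then stripping any fields the user may not read. A generic Python sketch (not SRCH2’s implementation; the records and role names are invented) might look like this:

```python
# Generic sketch of ACL-filtered search: drop records the user may not
# see, then strip attributes the user may not read.

RECORDS = [
    {"id": 1, "title": "Quarterly report", "salary": "120k",
     "acl": {"finance", "exec"}},
    {"id": 2, "title": "Press release", "salary": None,
     "acl": {"public"}},
]
RESTRICTED_ATTRIBUTES = {"salary": {"finance"}}  # attribute -> roles allowed

def search(term, user_roles):
    results = []
    for record in RECORDS:
        if term.lower() not in record["title"].lower():
            continue
        if not (record["acl"] & user_roles):
            continue  # record-level ACL: user may not see this record
        # Attribute-level ACL: keep only fields this user may read.
        visible = {key: value for key, value in record.items() if key != "acl"
                   and (key not in RESTRICTED_ATTRIBUTES
                        or RESTRICTED_ATTRIBUTES[key] & user_roles)}
        results.append(visible)
    return results

print(search("report", {"exec"}))     # record visible, salary stripped
print(search("report", {"finance"}))  # record visible with salary
print(search("report", {"public"}))   # empty: no access to the record
```

Doing this filtering natively in the engine, as SRCH2 claims to, avoids the common alternative of over-fetching results and filtering them in the application layer.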
Will the combination of security and speed pay off for SRCH2? Good question. I do not have an answer.
Stephen E Arnold, October 11, 2014
March 10, 2014
Phil Leggetter is a real time web software and developer evangelist, and on his blog he wrote a post entitled “10 Real Time Web Technology Predictions For 2014.” He says in the post that he based his 2014 predictions on trends in 2013 and what has happened so far in 2014.
He notes that nearly all applications have a real time sync in their code for relevancy and that real time is becoming a common commodity. This means that real time features will be included in frameworks, but it will not diminish their importance. One can expect to see more real time APIs, with providers increasing their API offerings and adding to their value, and WebHooks will gain more prominence.
Leggetter mentions that open source needs a data sync solution, which comes as a surprise, because there is an open source program for nearly everything. Why has this not been made yet?
Video and audio communication are getting even bigger. Real time video and data communication is going to be even more important for applications, and it might be time to check out peer-to-peer data sharing. Even better, real time developer tools are on the horizon.
“The next 10 months of 2014 is going to be very exciting for real time web technology, real time solution providers, real time hosted services, and more importantly for us developers. I expect some serious advancements in existing solutions and some new players to come along. Real time web technology is going to become even easier to integrate into existing applications and we’re going to have a much wider range of choice when building real time apps from the ground up.”
Will real time technology be the buzzword trend this year? Again, these are only predictions.