Geofeedia’s Political Action

August 20, 2015

The presidential election is a little over a year away, and potential candidates are hitting the campaign trail.  The Republican and Democratic contests are heating up with the GOP debates, and voters are engaging with the candidates and each other via social media.  The information posted on social media is a gold mine for political candidates who want to learn voters’ opinions and track their approval ratings.  While Twitter and Facebook data are easy to come by with Google Analytics and other software, visual mapping of that social media data is harder to find.

To demonstrate its product capabilities, Geofeedia took Instagram data, fed it into its data platform, and shared the visual results in the blog post, “Instagram Map: Republican Presidential Debate.”  Geofeedia noted that while business mogul Donald Trump may not have fared well during the debate itself, he dominated the social media feeds:

“Of all social content coming out of the Quicken Loans Center, 93% of posts were positive in sentiment. The top keywords were GOP, debate, and first, which was to be expected. Although there was no decided winner, Donald Trump scored the most headlines for a few of his memorable comments. He was, however, the winner of the social sphere. His name was mentioned in social content more than any other candidate.”

One remarkable thing is that social media allows political candidates to gauge voters’ attitudes in real time.  They can alter their answers to debate questions instantaneously to sway approval in their favor.  Geofeedia’s visual data models also produced a heat map of where the most social media activity took place, which happened to be centered in the major US metropolises.  The 2016 election might be the one that harnesses social media to help elect the next president, and Geofeedia has the visual mapping tools to track it.
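Geofeedia’s pipeline is proprietary, but the real-time sentiment figure quoted above (93 percent positive) boils down to a simple share calculation over labeled posts. A minimal sketch, with the post structure and sentiment labels being assumptions for illustration, not Geofeedia’s API:

```python
# Hypothetical sketch: the share of positive posts in a batch, the kind of
# real-time figure Geofeedia reported. Labels would come from an upstream
# sentiment classifier, which is out of scope here.

def positive_share(posts):
    """Return the fraction of posts labeled 'positive', or 0.0 if empty."""
    if not posts:
        return 0.0
    positive = sum(1 for p in posts if p["sentiment"] == "positive")
    return positive / len(posts)

posts = [
    {"text": "Great debate!", "sentiment": "positive"},
    {"text": "Strong answers tonight", "sentiment": "positive"},
    {"text": "Not impressed", "sentiment": "negative"},
]
print(f"{positive_share(posts):.0%} positive")  # 67% positive
```

Recomputing this share over a sliding window of recent posts is what turns it into the “real time” gauge described above.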

Whitney Grace, August 20, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

Search Your Yahoo Mail? Yeah, Right

August 19, 2015

Web site search used to be considered the worst of search until Google released a high-performing search widget; the title now officially goes to email search.  Nobody wants to dig through their inbox to find a missing message, and you are doomed if you even think about using a mail application such as Outlook or Apple Mail.  As part of its rebranding effort, Yahoo is taking measures to fix email search, says the New York Times in “Yahoo Tweaks Email to Make Search More Personal.”

Yahoo has been working for a year to improve email search, and Yahoo Mail has now implemented the changes.  It offers autocomplete and suggestions as a search term is typed into the query box.  It also indexes attachments and links included in emails, so users do not have to track down the actual message in which they appeared.  The sorting options have been updated, and social media accounts can now be synced.

The changes are small, and the autocomplete suggestions usually revert to basic keyword suggestions, but it is a step in the right direction.  Yahoo does not want to overhaul the mail system too quickly because, as anyone knows, too many changes at once upset users.

“Instead, Yahoo is subtly making changes. Last month, for example, it added a small plus button to the bottom right of the window used to compose emails. If you click on that button, you can drag and drop photos and documents from your email archive, pull in an animated GIF from Yahoo’s Tumblr social network, or add the results of a web search.”

Yahoo made a good business choice and is working to improve its email and other applications.  It will be interesting to watch the changes unfold.

Whitney Grace, August 19, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

Wikipedia: The PR Revolution

August 17, 2015

I read “The Covert World of People Trying to Edit Wikipedia—for Pay.” I am an old-fashioned backwoodsperson. I look up stuff. I try to figure out which source is semi-reliable. I read and do some (not much, of course) thinking.

Other folks just whack 2.7 words into the Alphabet—oops, I mean, the Google—click on the first link which is often a pointer to Wikipedia and take the “information” displayed. Easy. Quick. Just right for those who have no time, like social media, and use handheld devices.

The write up points out what seems to me to be an obvious “evolutionary” leap:

How can a site run by volunteers inoculate itself against well-funded PR efforts? And how can those volunteers distinguish between information that’s trustworthy and information that’s suspect?

The write up explores one example of public relations folks cranking out objective articles for Wikipedia.

Why worry? Getting accurate information involves more than relying on Alphabet – oh, there I go again, I mean the Google – and its all time fave number one Wikipedia.

Dialog Information Services pioneered this default top hit. When I logged on, the default database was Education Index or something like that. The clueless would run their query for diamond deposition in that database, thus having an upside for Dialog. Too bad about the system user.

The burden, gentle reader, falls not on Wikipedia, which is fighting a losing battle against the forces of Lucifer – I am sorry, I mean public relations.

The burden falls on the person doing the search to figure out what information is correct. Bummer. That’s real work. Who has time for that anyway?

Stephen E Arnold, August 17, 2015

How to Use Watson

August 17, 2015

While there are many possibilities for cognitive computing, what makes an idea a reality is its feasibility and real-life application.  The Platform explores “The Real Trouble With Cognitive Computing” and the trouble IBM had (and has) trying to figure out what to do with the supercomputer it made.  The article explains that before Watson became a Jeopardy celebrity, the IBM folks came up with 8,000 potential experiments for Watson to do, but pursued only 20 percent of them.

The range is small due to many factors, including bug testing, gauging progress with fuzzy outputs, playing around with algorithmic interactions, testing in isolation, and more.  This leads to a “messy” way of developing the experiments.  Ideally, developers would have one big knowledge model they could query, but that option does not exist.  The messy way instead combines natural language processing, machine learning, and knowledge representation over intact data sources, then distributes the result across an infrastructure.

Here is another key point that makes clear sense:

“The big issue with the Watson development cycle too is that teams are not just solving problems for one particular area. Rather, they have to create generalizable applications, which means what might be good for healthcare, for instance, might not be a good fit—and in fact even be damaging to—an area like financial services. The push and pull and tradeoff of the development cycle is therefore always hindered by this—and is the key barrier for companies any smaller than an IBM, Google, Microsoft, and other giants.”

This is exactly correct.  Engineering is not the same as healthcare, and not all computer algorithms transfer over to different industries.  One thing to keep in mind is that you can apply methods from other industries and come up with new methods or solutions.

Whitney Grace, August 17, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Yahoo: Semantic Search Is the Future

August 16, 2015

I love it when Yahoo explains the future of search. The Xoogler has done the revisionism thing and shifted Yahoo from a directory built by silly humanoids to a leader in search. Please, do not remember that Yahoo bought Inktomi in 2002 and then rolled out a wild and crazy search system in cahoots with IBM in 2006. (By the way, that search solution brought my multi-CPU, DASD-equipped, RAM-stuffed IBM server to its knees. At least the “free” software installed.)


Now to business: I read “The Future of Search Relies on Semantic Technologies.” For me, semantic technologies have been part of search for many years. But never mind reality. Let’s get to the Reddi-wip in the Yahoo confection.

Yahoo asserts:

Search companies are thus investing in information extraction and data fusion, as well as more and more advanced question-answering capabilities on top of the collected information. The need for these technologies is only increasing with mobile search, where providing results as ten blue links leads to a very poor user experience.

I would point out that as lousy as blue links are, these links produce about $60 billion a year for the Alphabet Google thing and enough zeros for the Microsoft wizards to hang on to their online advertising business even as Microsoft loses enthusiasm for other aspects of the Bing thing.

Yahoo adds:

We are a consumer internet company, so for us there is little difference between our internal and external representations.

My comment is a simple question, “What the heck is Yahoo saying?”

I also highlighted this semantic gem:

At Yahoo Labs, we work in advancing the sciences that underlie these approaches, i.e. Natural Language Processing, Information Retrieval and the Semantic Web.

I like the notion of Yahoo advancing science. I wonder if these advances will lead to advances in top line revenue, stabilizing management, and producing search results that are sort of related to the query.


CounterTack Partners with ManTech Cyber Solutions for a More Comprehensive Platform

August 13, 2015

A new acquisition by CounterTack brings predictive capability to that company’s security offerings, we learn from “CounterTack Acquires ManTech Cyber Solutions” at eWeek. Specifically, it is a division of ManTech International, dubbed ManTech Cyber Solutions International (MCSI), that has been snapped up under undisclosed terms by the private security firm.

CounterTack president and CEO Neal Creighton says the beauty of the deal lies in the lack of overlap between the two companies’ technology; while CounterTack’s existing products can tell users what is happening or has already happened, MCSI’s can tell them what to watch out for going forward. Writer Sean Michael Kerner elaborates:

“MCSI’s technology provides a lot of predictive capabilities around malware that can help enterprises determine how dangerous a malicious payload might be, Creighton said. Organizations often use the MCSI Responder Pro product after an attack has occurred to figure out what has happened. In contrast, the MCSI Active Defense product looks at issues in real time to make predictions, he said. A big area of concern for many security vendors is the risk of false positives for security alerts. With the Digital DNA technology, CounterTack will now have a predictive capability to be able to better determine the risk with a given malicious payload. The ability to understand the potential capabilities of a piece of malware will enable organizations to properly provide a risk score for a security event. With a risk score in place, organizations can then prioritize malware events to organize resources to handle remediation, he said.”
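The risk-scoring workflow described in the quote, score each malware event, then use the scores to order remediation, can be sketched in a few lines. The event fields and weights below are illustrative assumptions, not MCSI’s actual Digital DNA model:

```python
# A toy sketch of risk-scored triage: events with more capabilities and
# self-propagating behavior score higher and get remediated first.
# The scoring weights are made up for the example.

def risk_score(event):
    """Weight capability count, plus a bonus if the payload spreads itself."""
    score = 10 * len(event["capabilities"])
    if event["self_propagating"]:
        score += 50
    return score

def prioritize(events):
    """Return events sorted from highest to lowest risk."""
    return sorted(events, key=risk_score, reverse=True)

events = [
    {"name": "dropper-a", "capabilities": ["persistence"], "self_propagating": False},
    {"name": "worm-b", "capabilities": ["persistence", "keylogging"], "self_propagating": True},
]
for event in prioritize(events):
    print(event["name"], risk_score(event))
```

The point of the exercise is the ordering, not the absolute numbers: with any consistent scoring function, an organization can allocate its remediation resources to the riskiest payloads first.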

Incorporation of open-source Hadoop means CounterTack can scale to fit any organization, and the products can be deployed on premises or in the cloud. Creighton notes his company’s primary competitor is security vendor CrowdStrike; we’ll be keeping an eye on both of these promising firms.

Based in Waltham, Massachusetts, CounterTack was founded in 2007. The company declares its Sentinel platform to be the only in-progress attack intelligence and response solution on the market (for now). Founded way back in 1968, ManTech International develops and manages solutions for cyber security, C4ISR, systems engineering, and global logistics from its headquarters in Washington, DC. Both companies are currently hiring; click here for opportunities at CounterTack, and here for ManTech’s careers page.

Cynthia Murrell, August 13, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

SharePoint 2016 Beta is Coming Soon

August 13, 2015

There is a lot of excitement about the future of SharePoint. Microsoft wants to capitalize on the good buzz, but in its excitement the timeline has gotten skewed. It seems that the most recent change is in its favor, however. CMS Wire covers the story in the article, “Cancel Your Plans: SharePoint 2016 Beta is (Almost) Here.”

The author begins:

“For the past couple of years, we IT pros really haven’t known what our place in the world was going to be with SharePoint. But I feel like in the past couple of months I’ve seen the future. At least for me, as an IT pro, part of that future is identity. So you’re going to be hearing a lot more about that from me. But also the reason you’re going to be hearing about a lot of that is because next month — August — we’re going to get our first public beta of SharePoint 2016.”

The beta release will come earlier than projected. Lots of updates will come fast and frequently once the release is available, making it difficult to stay ahead of the curve. In order to sort through the chaos, stay tuned to ArnoldIT.com, a website carefully curated by Stephen E. Arnold. His SharePoint feed is a great way to stay in touch with the latest news, without being overwhelmed by the unnecessary details.

Emily Rae Aldridge, August 13, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Thunderstone Rumbles about Webinator

August 13, 2015

There is nothing more frustrating than being unable to locate a specific piece of information on a Web site using its search function.  Search is supposed to be quick, accurate, and efficient.  Even when Google search is employed as a Web site’s search feature, it does not always yield the best results.  Thunderstone is a company that specializes in proprietary software applications developed specifically for information management, search, retrieval, and filtering.

Thunderstone has a client list that includes, but is not limited to, government agencies, Internet developers, corporations, and online service providers.  The company’s goal is to deliver “product-oriented R&D within the area of advanced information management and retrieval,” which translates to helping clients find information very, very fast and as accurately as possible.  It is the premise of most information management companies.  On the company blog it was announced that “Thunderstone Releases Webinator Web Index And Retrieval System Version 13.”  Webinator makes it easier to integrate high-quality search into a Web site, and it has several appealing new features:

  • “Query Autocomplete, guides your users to the search they want
  • HTML Highlighting, lets users see the results in the original HTML for better contextual information
  • Expanded XML/SOAP API allows integration of administrative interface”

We like the HTML highlighting that offers users the ability to backtrack and see a page’s original information source. It is very similar to old-fashioned research: go back to the original source to check a fact’s veracity.
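Thunderstone has not published Webinator’s internals, but the query autocomplete feature can be illustrated as prefix matching over previously indexed queries. A minimal sketch using Python’s standard bisect module, with the query list invented for the example:

```python
# Hypothetical autocomplete sketch: binary-search a sorted list of known
# queries for the first entry with the typed prefix, then collect matches.
import bisect

def autocomplete(index, prefix, limit=5):
    """Return up to `limit` indexed queries beginning with `prefix`."""
    index = sorted(index)
    start = bisect.bisect_left(index, prefix)
    matches = []
    for query in index[start:]:
        if not query.startswith(prefix):
            break  # sorted order: no later entry can match
        matches.append(query)
        if len(matches) == limit:
            break
    return matches

queries = ["search engine", "search appliance", "semantic web", "sentiment analysis"]
print(autocomplete(queries, "sea"))  # ['search appliance', 'search engine']
```

A production system would rank the matches by popularity rather than alphabetically, but the guiding-the-user idea is the same.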

Whitney Grace, August 13, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Browser Wars: Sparks Fly During Firefox Microsoft Snipe Hunt

August 12, 2015

I read “Firefox Sticks It to Microsoft, Redirects Cortana Searches in Windows 10.” Ah, ha. Windows 10. Some day, maybe. I assume the endless reboots upon updating are merely a rumor. The real story is that a Firefox user can use Cortana to search with something other than Bing.

The write up says:

After blasting Microsoft’s attempts to set Edge as the default browser in Windows 10, Mozilla is enjoying some sweet revenge by steering Firefox users away from Bing. With the newly-released Firefox 40, users no longer have to use Bing for web searches from Cortana on the Windows 10 taskbar. Instead, Firefox will show results from whatever search engine the user has chosen as the default. Using Firefox isn’t the only way to replace Cortana’s Bing searches with Google or another search engine. But Firefox is currently the only browser that does so without the need for third-party extensions. (It wouldn’t be surprising, however, if Google follows suit.)

Poor Microsoft. The company has been trying to build bridges. The result is a dust up.

I long for the good old days. Microsoft would have been careful to avoid getting stuck in a squabble about its Siri and Google Voice killer Cortana.

What impact will this have? My hunch is that as Windows 10 flows into the hands of those who fondly recall Bob, then the issue will become more serious.

For now, this is amusing to me. Recall that I am a person who abandoned my Lumia Windows phone because the silly Cortana feature was in a location which made accidental activation impossible to avoid. I dumped the phone. End of story.

Stephen E Arnold, August 12, 2015

Google Seeks SEO Pro

August 12, 2015

Well, isn’t this interesting. Search Engine Land tells us that “Google Is Hiring an SEO Manager to Improve its Rankings in Google.” The Goog’s system is so objective, even Google needs a search engine optimization expert! That must be news to certain parties in the European Union.

Reporter Barry Schwartz spotted the relevant job posting at the company’s Careers page. Responsibilities are as one might expect: develop and maintain websites; maintain and develop code that will engage search engines; keep up with the latest in SEO techniques; and work with the sales and development departments to implement SEO best practices. Coordination with the search-algorithm department is not mentioned.

Google still stands as one of the most sought-after employers, so it is no surprise they require a lot of anyone hoping to fill the position. Schwartz notes, though, that link-building experience is not specified. He shares the list of criteria:

“The qualifications include:

*BA/BS degree in Computer Science, Engineering or equivalent practical experience.

*4 years of experience developing websites and applications with SQL, HTML5, and XML.

*2 years of SEO experience.

*Experience with Google App Engine, Google Custom Search, Webmaster Tools and Google Analytics and experience creating and maintaining project schedules using project management systems.

*Experience working with back-end SEO elements such as .htaccess, robots.txt, metadata and site speed optimization to optimize website performance.

*Experience in quantifying marketing impact and SEO performance and strong understanding of technical SEO (sitemaps, crawl budget, canonicalization, etc.).

*Knowledge of one or more of the following: Java, C/C++, or Python.

*Excellent problem solving and analytical skills with the ability to dig extensively into metrics and analytics.”

Lest anyone doubt the existence of such an ironic opportunity, the post reproduces a screenshot of the advertisement, “just in case the job is pulled.”
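Of the back-end SEO elements the listing names, robots.txt is the easiest to illustrate. Python’s standard urllib.robotparser can check whether a given crawler may fetch a URL under a site’s rules; the rules and URLs below are made up for the example:

```python
# Checking crawl permissions against a robots.txt policy, one of the
# "back-end SEO elements" the Google job posting mentions.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))   # False
```

An SEO manager’s job, in part, is making sure rules like the Disallow line above do not accidentally hide the pages the site wants ranked.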

Cynthia Murrell, August 12, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
