Real Life Alerts Show There is More to Search than Key Words

July 12, 2012

AtHoc has joined forces with Intel, receiving a $5.6 million investment to improve its technology. Since the company is the leader in enterprise-class, network-based mass notification systems for the security, life safety, and defense sectors of the United States, one would have to agree that was a wise investment.

Contrary to some beliefs, there is more to search than key words. The recent press release on AtHoc’s page, “Intel Invests in AtHoc; Chairman of RSA Security Joins AtHoc’s Board,” is a reminder that advancing device technology demands improvements in critical situational awareness data. Organizations must be able to swiftly analyze and address anomalies because lives may depend on it.

AtHoc does just that with real life, real time alerts, as the release states:

“AtHoc helps organizations become fully prepared to provide emergency mass communication to all of its constituents. It allows users to provide additional data and responders to remediate the issues at hand, based on the information they receive. AtHoc improves the safety and security of our citizens, first responders, and armed forces personnel around the world.”

Just imagine attempting to get a real time response on the average search engine during an emergency. The repercussions of scanning pages of possible aid would almost assuredly be life threatening. When considering the outcome from that perspective, real life, real time alerts show there is more to search than key words.

Jennifer Shockley, July 12, 2012

Sponsored by PolySpot

Recorded Future Suggested for Cyber Attack Prediction

July 12, 2012

Oh, oh, scary marketing. Careful, the goose is easily startled. Sys-Con Media claims our attention with “Recorded Future for Forecasting Cyber Attacks.” Blogger Bob Gourley does a good job, though, of explaining why Recorded Future would be a good tool for predicting cyber attacks.

Already employed by agencies such as the US Southern Command, Recorded Future has been successfully used to anticipate citizen unrest and to analyze intelligence stored on a private cloud (the Bin Laden Letters, no less). The software automates the aggregation and organization of data, leaving more time for human analysts to focus on assessment. The application presents the information collected from articles, blog posts, and/or tweets chronologically, including (this is the best part) a prediction of future events. The software also helps with the analysis stage by mapping relationships and tracking buzz.

Gourley asserts that the company’s technology can also be used in the struggle against international hackers:

“All together, these capabilities allow an organization to forecast more accurately whether they will be the target of a major cyber attack and what threat vectors they should most worry about. Within minutes, analysts could see if there has been a trend of attacks against similar organizations, any threats reported online, or events likely to trigger attacks coming up. They can drill down into coverage by blogs or trade journals if they find the mainstream media insufficient or misleading, and map out the interactions and relationships between hacking groups, companies, government agencies, and law enforcement. While Recorded Future can’t tell you who will attack you and when, it makes open source intelligence analysis for cybersecurity easier, faster, and more effective.”

Still in the start-up phase, Recorded Future has headquarters in Cambridge, MA, and Göteborg, Sweden. Staffed with statisticians, linguists, and technical business pros as well as computer scientists, the company seems well-equipped to deliver what they call “the world’s first temporal analytics engine.”

Cynthia Murrell, July 12, 2012

Sponsored by PolySpot

A SharePoint Search Refiner

July 12, 2012

The SharePoint Blog contained a very informative explanation of SharePoint “refiners,” which, according to Microsoft, “enable end-users to drill down into their search results based on managed properties that are associated with the indexed search items, such as creation date, author, and company names.”

Custom SharePoint 2010 Search Refiner – Displaying Range of Choices presents information which originally appeared in the ShareMuch blog. The write up is, in my opinion, quite useful. It provides a streamlined explanation of how to implement a refiner in a SharePoint 2010 installation, along with an XML snippet which makes the addition of a refiner quick and easy.

The article explains:

MappedProperty maps to an actual managed property that you must define or is already defined in search service application. The SortBy defines, in this case, a custom filter right below the category. The CustomFilters node’s MappingType property means we’ll have a custom filter. In our case, we’re using a range mapper, meaning that whatever value are going to be in the managed property, our filter will display UI based on the range of those and let user toggle the display based on that range. I hope this makes sense. The DataType has only 3 types, so please don’t make the same mistake I did and try to guess the value, it’s limited to “Numeric”, “DateTime”, “String”. The CustomValue inside CustomFilter specifies the user friendly value and the OriginalValue defines the range. In our example, the “Size” property is measured in Bytes so “..1” means range anywhere from 0 bytes to 1 byte. It happens that list items and lists in search results are less than 1 byte in size which means that we can refine by list items and lists results by capturing items with size less than 1 byte. Everything else is a document.
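Based on the explanation above, the refiner entry added to the Refinement Panel’s Filter Category Definition XML would look roughly like the following. This is a sketch reconstructed from the quoted description, not the original ShareMuch snippet; the “Size” managed property and the “..1” range mirror the example, while the Title and CustomValue labels are illustrative placeholders (and the SortBy element described above is omitted for brevity):

```xml
<!-- Illustrative Filter Category Definition entry for the SharePoint 2010
     Refinement Panel. "Size" is the managed property from the example;
     Title, Description, and the CustomValue labels are placeholders. -->
<Category Title="Result Type"
          Description="Refine results by item size range"
          Type="Microsoft.Office.Server.Search.WebControls.ManagedPropertyFilterGenerator"
          MappedProperty="Size">
  <!-- RangeMapping: the refiner UI is built from ranges of the property's values -->
  <CustomFilters MappingType="RangeMapping" DataType="Numeric"
                 ValueReference="Absolute" ShowAllInMore="false">
    <!-- "..1" captures items from 0 to 1 byte: lists and list items -->
    <CustomFilter CustomValue="Lists and List Items">
      <OriginalValue>..1</OriginalValue>
    </CustomFilter>
    <!-- Everything larger is treated as a document -->
    <CustomFilter CustomValue="Documents">
      <OriginalValue>2..</OriginalValue>
    </CustomFilter>
  </CustomFilters>
</Category>
```

Per the quoted explanation, DataType must be one of “Numeric”, “DateTime”, or “String”; guessing other values will fail silently.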

Search Technologies implements “refiners” as well as other advanced features of SharePoint. If you want to extend SharePoint and make the system deliver even greater value to your users, contact Search Technologies.

Iain Fletcher, July 12, 2012

Is Mixpanel Tracking You?

July 11, 2012

A pair of articles highlight the ways in which Mixpanel is taking tracking software to new levels. Whether those levels are highs or lows depends entirely upon your perspective.

VentureBeat exclaims, “Mixpanel Now Lets Apps Target You—Yeah YOU—on a Deeply Personal Level.” Reporter Jolie O’Dell notes that Mixpanel is regarded as a provider of quality analytics. Now the company is working on tying data to individual users, a departure from the don’t-worry-this-data-isn’t-tied-to-you-personally convention we’ve grown used to. The aim, of course, is to target advertising with ever better accuracy. Reminds me of that scene from “Minority Report.”

Search Engine Marketing and Website Optimization’s blog is a bit more matter-of-fact than VentureBeat, stating simply, “Mixpanel is Tracking More Than Actions Now, Introduces User Analytics.” That piece also mentions that Mixpanel ties to individual users, and discusses related analysis:

“Specifically, when customers open up their Mixpanel dashboard, they’ll see a new menu under the ‘actions’ section called ‘people’, where they can get data about all of their visitors, such as gender, age, and country, and then correlate that data with user activity. . . .

“[Mixpanel’s co-founder Suhail Doshi] says the new features should be useful to companies of all sizes. If you’ve got a brand new website and only a few hundred visitors, you can look at the individual profiles. If you’re more established, with millions of users, you can still look for patterns among those users, and also target messages to specific groups.”

Great. I can’t say I’m surprised things are progressing this direction. I also can’t say I don’t appreciate not seeing ads about things I’m not interested in. Still, the whole trend leaves me a bit unsettled. Better get used to it, I suppose.

Cynthia Murrell, July 11, 2012

Sponsored by PolySpot

Is Google Stuffing a Sock in Microsoft’s Horn?

July 11, 2012

Google took the stage recently performing renditions of their personal favorite, ‘how great Google art’ for the press. The search engine superstar sees no reason to hire a bard when they can toot their own horn.

All three articles entertain readers with songs of Google’s greatness, as asserted:

“It’s clear that the two top tech companies in America are currently Apple and Google, no longer Apple and Microsoft.”

“According to all our metrics and everything we see out there, Chrome is the most popular browser. It’s fair to say we are No. 1 or barely No. 2 in all countries in the world.”

“Google provided $80 billion of economic activity directly to advertisers, website publishers and nonprofit organizations across the US last year.”

From one performance to the next, Google sings more of their wonderfulness. The articles present interesting numbers. Still, we have to wonder what’s really up at Google and why all the horn tooting? Maybe they are just trying to stuff a sock in Microsoft’s horn.

Jennifer Shockley, July 11, 2012

Sponsored by PolySpot

SharePoint Needs to Up Mobile Game

July 11, 2012

Microsoft is on a roll with its mobile game, generating a lot of buzz about its new Surface tablet. With Office 365, all systems and software are envisioned to be fully integrated, creating a fully portable and mobile experience. CMS Wire speaks to how this new direction and vision may impact devoted SharePoint users in “SharePoint has Yammer, Now it Needs To Up Its Mobile Game.”

The article states:

Last week’s announcement of the new Surface tablet range is bang on theme. Whilst observers quibble about price points and the merits of a built in kickstand, there is no doubting that Microsoft thinks many of us will spend our future computing time prodding at a screen of some sort . . . So what about the next version of SharePoint?  So can we expect a friendly touch enabled version of SharePoint? A ‘SharePoint: Metro’ to impress all those execs running meetings with their shiny new Surface tablets?   The evidence suggest not.

So if SharePoint is not yet living up to the rest of the Microsoft mobile offerings, what is an organization to do? We suggest looking into the fully mobile capabilities of a trusted third party enterprise solution like Fabasoft Mindbreeze. Fabasoft Mindbreeze Mobile offers full functionality and navigation, losing nothing over the traditional desktop-centered enterprise approach. Working alongside an existing SharePoint deployment, or as a standalone product, we recommend Fabasoft Mindbreeze for a number of reasons, but specifically for its superior mobile functionality.

Emily Rae Aldridge, July 11, 2012

Sponsored by Pandia.com

PLM Interest Group Challenges Designers to Go Beyond English

July 11, 2012

Much like the writers of Star Trek, PLM developers work off the assumption that the entire universe knows English. The fact that all PLM solutions are based in the English language drove the PLM Interest Group (PLMIG) to begin the search for a PLM solution that does more than just translate, according to the MCAD Café article “PLMIG Calls for Research into ‘Own-Language’ PLM.”

The article explains the problem facing PLM solutions around the world:

“The generally-accepted working language for PLM is English, and it has become accepted practice for all non-native English speakers to standardise on English for PLM so that everyone can understand each other. This is a workable solution, but it still represents a tremendous barrier to PLM understanding and adoption in the majority of countries in the world, as well as in large multi-national corporations.”

PLM in general is a complex set of data management techniques and systems that go far deeper than just putting business processes in order. Providers must remain cognizant of cultural differences within industries as well as remain flexible enough to be relevant in countries around the globe. Inforbix, a leader in the industry, works hard to help all their clients, regardless of nationality, make the most of their PLM solutions and find new data management solutions that meet their unique needs.

Catherine Lamsfuss, July 11, 2012

Tibco Performs Well and Axes Sales Exec

July 11, 2012

Infrastructure software outfit Tibco has had a busy time recently. Presumably, the company is happily celebrating its strong quarter; the Wall St. Cheat Sheet informs us, “TIBCO Software Earnings: Beats Analysts’ Estimates.” Meanwhile, we learn it has reshuffled some staff in Reuters’ “Tibco Removes Americas Sales Head.”

Tibco’s second quarter net income rose by 26% over the first quarter, from $21 million to $26.6 million. Their adjusted net income came to 26 cents per share, smashing the expectations of only 17 cents per share. This is the fourth quarter in a row Tibco has exceeded forecasts. Will they surprise the prognosticators again next quarter? Perhaps; the Wall St. Cheat Sheet write up reports that “analysts appear increasingly negative about the company’s results for the next quarter.” You’d think they’d learn.

The Reuters piece illustrates one potential reason for Tibco’s success—they fix things that aren’t working for them. The performance of their former US head of sales did not meet expectations, so they have simply removed him. Regional sales vice presidents in this hemisphere will now report directly to the executive VP of global sales. The article reports:

“‘I was not happy with the way we were executing for the last three or four quarters,’ Chief Executive Vivek Ranadivé said on a conference call with analysts. ‘We felt that execution in other geographies like Europe and Asia was very strong. I felt we were leaving too much money on the table.’

“Sales at Tibco’s Americas business grew 13 percent in the second quarter, compared with a 22 percent growth in Europe and 27 percent in Asia. The Americas accounted for more than half of the company’s overall sales in 2011.”

Tibco’s infrastructure platform focuses on real-time event processing and on easy-to-understand analytics. The company’s headquarters are in Palo Alto, CA, but it has offices around the world. Launched in 1997, the company prides itself on “flawless” execution and delivery, and its workers seem to enjoy helping clients navigate today’s shifting business landscape.

Cynthia Murrell, July 11, 2012

Sponsored by PolySpot

Data Vibrancy: Pursuing a Full Spectrum

July 11, 2012

Vibrant Data has painted a pretty picture, but more detail is needed before it can be called a masterpiece. TED’s blog, “Fellows Friday: The Vibrancy of Data,” talks about the broad span of challenging areas, while repeatedly emphasizing the need for more collaborative research and problem mapping.

In the opinion of Vibrant Data, a few areas need a little brush work to create an efficient data ecosystem:

“From the structure of that network, we identified four Grand Challenge areas for democratizing data: Digital Infrastructure, Digital Trust (protection of personal identity, and reputation systems for accountability), Digital Literacy (widespread access to intuitively asking questions with data and thinking critically about the answers) and Platform Openness (the ability to copy, edit and customize platforms that facilitate finding meaning from data).”

The color image of their grand challenges is quite impressive at first glance; however, Vibrant Data openly admits this is a work in progress. The plan to involve more people in the problem mapping will certainly allow for more input, but to what end? Collaboration can be beneficial, but at the same time, too many artists can clutter up the original vision.

The overall picture that Vibrant Data is trying to create would allow everyday people to directly benefit from the enormous palette of evolving data. With access provided, everyday users could avoid possible negative consequences and paint their own portrait.

Jennifer Shockley, July 11, 2012

Sponsored by PolySpot


Unstructured Data Becomes the New Gold Rush

July 11, 2012

Knowledge is like gold, and today’s Internet prospectors are digging deep for the golden data vein. Big Data offers a gold rush of information that may exceed the shiny nuggets of 1848. The big data cavern glimmers with possibilities, according to Sys-Con’s article “Actuate Unveils ‘Big Data’ Research Results,” and data engineers are creating better tools. Data miners are digging for sophisticated analytics on unstructured content to be metamorphosed into pattern analysis, keyword correlation, incident prediction, and fraud prevention. The purpose is to get data into active hands, to be utilized for customer communications, management, profit, and the organization’s overall evolution.

A study was done to determine the trials that companies face when digging deeper than the surface:

“The findings highlight the contrast between organizations’ internal experience of accessing relevant, up-to-date decision-supporting information and their ability to pinpoint a wealth of data on almost any other subject via the public Internet.”

“As many as 80% of respondents admitted that they had no extended search capability across multiple repositories, while 70% said they found it “harder” or “much harder” (23%) to access key information held on internal systems versus that available on the Web, even though the content exists within their organization.”

Unstructured content was the golden glimmer that caught the organizational eye; now organizations have to access and deploy it efficiently. Rumors about this technology and its benefits will spread quickly, so it will not take long for this new gold rush to get underway.

Jennifer Shockley, July 11, 2012

Sponsored by PolySpot
