US Government and Its New IT Directions

September 14, 2010

The U.S. Government is shedding its old clothes for new ones that fit the new technology. The Obama Administration wants agencies to be transparent and innovative, and it has directed the U.S. General Services Administration (GSA) to implement the “Open Government” initiative, which in turn created the Office of Citizen Services and Innovative Technologies (OCSIT).

The CRMBuyer interview “Making Change Happen Every Day: Q&A With GSA’s David McClure” quotes the OCSIT associate administrator: “OCSIT is rapidly becoming a leader in the use of new media, Web 2.0 technologies, and cloud computing, to proactively make government agencies and services more accessible to the public.” According to Mr. McClure, by operating at the “enterprise level,” the GSA aims to accelerate the adoption of technologies such as mobile applications, improve search capabilities, encourage greater customer interaction, and gain efficiencies. We concur with Mr. McClure that enhancing citizen participation in government will pay dividends on technology investments, but with IBM hired to add agility, we are not sure the GSA will be the swiftest runner on the track team.

Why are there so many separate search systems? Would using a single indexing system be one obvious efficiency?

Is IBM the swiftest cat in the nature preserve?

Leena Singh, September 14, 2010

Freebie

IBM and Its Fall 2010 Marketing Angle

September 14, 2010

I read “IBM’s Big Push to Steal Sales from Rivals”. If the write up is accurate, IBM is slashing prices or buying market share. I am not sure how to position the tactic. Here’s what Business Week says:

Starting this month, Oracle and HP customers that switch to IBM’s latest package of servers, software, and storage, priced at upwards of $75,000, will get trade-in credit and can defer all payments until next year, interest-free. Big Blue also will help finance the cost of taking out a client’s old equipment and transferring the data over to its Power7 system. IBM, which managed to steal 500 customers away from competitors last year, hit that mark in just six months in 2010, says Jeff Howard, the marketing director for Power7. Now it’s hoping the sweetened financing will help keep the momentum going.

How will other companies respond? I anticipate bundling, price cutting, special offers, and quite a bit of love and attention to procurement teams.

What are the implications?

First, I think the companies affected by these tactics, if the write up is accurate, will be second and third tier enterprise vendors.

Second, the bundling will put further pressure on some specialist providers of search, content processing and business intelligence. It takes deep pockets to buy market share Big Blue style.

Third, I think customers may take a closer look at “deals” that are not as free as they appear. Offers from giants like IBM often come with an Iron Maiden, a thumbscrew, and foot chains. There is no free lunch in the rough and tumble world of enterprise software.

Stephen E Arnold, September 14, 2010

Freebie

Quick and Dirty Sentiment Analysis

September 14, 2010

I think “Most Common Words Unique to 1 Star and 5 Star App Store Reviews” provides some insight into how certain sentiment analysis systems work. The article said:

I wrote a script to crawl U.S. App Store customer reviews for the top 100 apps from every category (minus duplicates) and compute the most common words in 1-star and 5-star reviews, excluding words that were also common in 3-star reviews.

A frequency count against a “field,” in other words. Here are the results for positive apps:

awesome, worth, thanks, amazing, simple, perfect, price, everything, ever, must, iPod, before, found, store, never, recommend, done, take, always, touch

How do you know a loser?

waste, money, crashes, tried, useless, nothing, paid, open, deleted, downloaded, didn’t, says, stupid, anything, actually, account, bought, apple, already

“Sentiment” can be discerned by looking for certain words and keeping count. So much for the rocket science of “understanding unstructured text.”
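The mechanics are simple enough to sketch. Below is a minimal, hypothetical Python version of the approach the article describes: tally word frequencies in the one-star and five-star reviews, drop anything that is also common in three-star reviews, then score new text by counting hits against each list. The function names and the cutoff value are my own illustration, not the original author’s script.

    from collections import Counter
    import re

    NEUTRAL_CUTOFF = 50  # hypothetical: how many 3-star words count as "common"

    def tokenize(text):
        """Lowercase review text and split it into simple word tokens."""
        return re.findall(r"[a-z']+", text.lower())

    def distinctive_words(target_reviews, neutral_reviews, top_n=20):
        """Most common words in the target reviews, minus words that are
        also common in the neutral (3-star) reviews."""
        target_counts = Counter(w for r in target_reviews for w in tokenize(r))
        neutral_counts = Counter(w for r in neutral_reviews for w in tokenize(r))
        neutral_common = {w for w, _ in neutral_counts.most_common(NEUTRAL_CUTOFF)}
        return [w for w, _ in target_counts.most_common()
                if w not in neutral_common][:top_n]

    def crude_sentiment(text, positive_words, negative_words):
        """Score text by counting hits against each word list."""
        words = tokenize(text)
        score = sum(w in positive_words for w in words)
        score -= sum(w in negative_words for w in words)
        return score  # > 0 leans positive, < 0 leans negative

    # Toy usage with a few of the words quoted above.
    positive = {"awesome", "worth", "amazing", "perfect", "recommend"}
    negative = {"waste", "money", "crashes", "useless", "deleted"}
    print(crude_sentiment("Awesome app, worth every penny", positive, negative))          # 2
    print(crude_sentiment("Useless. A waste of money, it crashes.", positive, negative))  # -4

That is the whole trick: no parsing, no ontology, just counting.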

Stephen E Arnold, September 14, 2010

Freebie

SharePoint Dual Feature Bonanza

September 13, 2010

Microsoft SharePoint 2010 adds social and search features that intertwine to create an enticing platform for users. This and more is covered in the Able Blue blog post “The MOSS Show Interview”, which points to The MOSS Show’s podcast “Enterprise Social (and Search) in SharePoint 2010”.

In the two-part podcast, Matthew McDermott, a Microsoft SharePoint expert and MVP, discusses the new and improved social features in SharePoint 2010: improved My Sites, the Activity Feed, tagging, rating, managed metadata, taxonomies, and folksonomies. Matthew also stresses the importance of having a search strategy and of getting more from search applications by making search actionable and refined.

SharePoint 2010 can help create a knowledge base that delivers benefits over the long term and can be shared amongst users. Matthew points out, “What makes SharePoint 2010 special is its ability to gather feedback from people participating in the content consumption,” which enhances the value of the content and makes it more important to the enterprise. This is enterprise social, which gains more relevance if content is “made more findable by tagging and using proper metadata.”

Matthew explains that SharePoint 2010 adds strong enterprise social capabilities and makes it easier to integrate third-party applications outside the firewall, such as LinkedIn, Facebook, and Twitter, with SharePoint 2010. These social tools can be used to create business value. SharePoint 2010 also allows internal as well as external URLs to be tagged in the browser and collects the list of all tagged URLs on a tag profile page.

SharePoint 2010’s managed metadata store allows people to create a central repository of data through service applications. There are also features to translate the data into multilingual forms and even to deny the use of certain tags for various reasons. “Activity feed is a feature through which you get your news or tips of the day by just following the tags,” Matthew reports, “and you get the ability to consume content around the organization.” He believes this helps employees connect with each other, nurtures cooperation, and makes them more productive by improving the culture of the workplace.

The beauty of SharePoint 2010, according to Matthew McDermott, is that users themselves can decide upon the governance of the data and thus get complete control of this powerful enterprise social platform with its highly developed search techniques.

Now, how expensive is it to maintain a proprietary system that requires hands-on fiddling to work as advertised? The answer to that question is not in the movie. Maybe the sequel?

Harleena Singh, September 13, 2010

Freebie

Some Social Traction: Grinding Google and Seducing Seniors

September 13, 2010

A recent survey reveals that older Americans are becoming very active in social networking. Once supposed to be the playground of the young, online communities such as Facebook, MySpace, and LinkedIn are now used by “nearly half of the US Internet users ranging in age from 50 to 64, marking an 88 percent increase from the prior year,” according to the Medindia.net article “Older Americans Flocking to Online Social Networks.”

Though older users still rely on email as their primary mode of communication, most were “inclined to reconnect with people from the past, potentially creating support networks for retiring or changing careers,” stated the article, which is based on a Pew Research Center report. Use of social networking has doubled among seniors 65 and older, who mainly use it for blogging and online health discussions. The once broadband-resistant group has embraced the technology as an everyday utility of life.

To add spice to the Facebook story, PCWorld reported “Facebook Now More Popular than Google.” Usage data are controversial, but for those who never look at log files, these seemingly concrete numbers are reality. PCWorld states: “A total of 9.9 percent of consumers’ online time was spent on the site Facebook in August, surpassing time spent on Google which came in with 9.6 percent.”

Is the sky falling on Google? Nope, but this Facebook news makes clear that Google’s “speed” at displaying laundry lists of results does not carry over to its race against Facebook. As any race car driver knows, a few seconds’ lead can make a significant difference in the end-of-race payoff.

Stephen E Arnold, September 13, 2010

Freebie

Bringing Twitter to Life

September 13, 2010

Really Simple Syndication, or RSS, was the leading news aggregation format for a long time. However, with the advent of social networking technologies, people lost interest in RSS, which was then proclaimed dead. Twitter, for its part, disabled Basic Authentication on its site without providing a simple alternative for feed readers.

Russell Beattie suggests a useful way to tap Twitter’s real-time information in his article “Using Roomatic to Get Your Twitter Timeline RSS feed.” Russell runs his own OAuth-enabled Twitter service called Roomatic, and all you have to do is click the link provided in the article. He explains, “It’ll forward you to Twitter to ask for authentication, you click yes, and it’ll re-direct you back to the same url with some params [sic] which passes through the raw xml feed of your subscribers tweets.” Plug that URL into your feed reader to restart the instant Twitter feed.
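If you would rather script against the feed than hand the URL to a reader, a minimal polling sketch might look like the following. It assumes the widely used feedparser library and a placeholder feed URL; substitute the authenticated URL Roomatic hands back.

    # A minimal sketch of polling the feed yourself instead of handing it to a
    # reader. The URL is a placeholder; use the OAuth-signed URL that Roomatic
    # returns after you authenticate with Twitter.
    import time

    import feedparser  # third-party: pip install feedparser

    FEED_URL = "https://example.com/roomatic-timeline.xml"  # placeholder
    seen = set()

    while True:
        feed = feedparser.parse(FEED_URL)
        for entry in feed.entries:
            key = entry.get("id") or entry.get("link")
            if key and key not in seen:
                seen.add(key)
                print(entry.get("title", "(no title)"))
        time.sleep(300)  # poll every five minutes; be polite to the service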

Miracle. RSS resurrected.

Harleena Singh, September 13, 2010

Freebie

SwiftRiver: Open Source Pushes into the Intel Space

September 13, 2010

If you are a social netizen, you know it is not easy to keep track of, manage, and organize the hundreds of Twitter streams, Facebook updates, blog posts, RSS feeds, and SMS messages that keep arriving. Do not feel helpless: SwiftRiver, a free open source intelligence-gathering platform for managing real-time streams of data, comes to your aid. The platform consists of a number of products and technologies, and its goal is to aggregate information from multiple media channels and add context to it using semantic analysis.

SwiftRiver can also be used as a search tool, for email filtering, to monitor numerous blogs, and to verify real-time data from various channels. It offers “Several advanced tools (social graph mining, natural language processing, locations servers, and twitter analytics) for free use via the open API platform Swift Web Services.” According to the parent site Swiftly.org, “This free tool is especially for organizations who need to sort their data by authority and accuracy, as opposed to popularity.” SwiftRiver can act quickly on massive amounts of data, a capability critical for emergency response groups, election monitors, media, and others.

There are multiple Swift Rivers. You want the one at http://swift.ushahidi.com or http://swiftly.org/.

Ushahidi, the company behind this initiative, claims, “The SwiftRiver platform offers organizations an easy way to combine natural language/artificial intelligence process, data-mining for SMS and Twitter, and verification algorithms for different sources of information.” Elaborating further, it states, “SwiftRiver is unique in that there is no singular ‘SwiftRiver’ application. Rather, there are many, that combine plug-ins, APIs, and themes in different ways that are optimized for workflows.”

Presently SwiftRiver uses the Sweeper App, the Kohana MVC UI, the distributed reputation system RiverID, and SwiftWebServices (SWS) as the API platform. The beauty here is that SwiftRiver is just the core; it can sit behind any UI, app, or API. It also has an intuitive, customizable dashboard, and “users of WordPress and Drupal can add features like auto-tagging and more using Swift Web Services.” While you may download SwiftRiver and run it on your own Web server, SWS is a hosted cloud service and does not need to be downloaded or installed.
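To make the “authority over popularity” idea concrete, here is an illustrative sketch, not the SwiftRiver code itself, of aggregating several channels and ranking items by a per-source authority weight. The feed URLs and weights are invented for the example, and the feedparser library stands in for whatever harvesting layer a real deployment would use.

    # Illustrative only -- not the SwiftRiver code base. Pull items from several
    # channels and rank them by a per-source authority weight instead of by
    # popularity. Feed URLs and weights are invented for the example.
    import feedparser  # third-party: pip install feedparser

    SOURCES = {
        "http://example.org/newsroom/rss": 0.9,       # vetted newsroom, high authority
        "http://example.org/field-reports/rss": 0.4,  # unverified citizen reports
    }

    def gather(sources):
        """Collect entries from each feed and attach the source's authority score."""
        items = []
        for url, authority in sources.items():
            for entry in feedparser.parse(url).entries:
                items.append({
                    "title": entry.get("title", ""),
                    "link": entry.get("link", ""),
                    "authority": authority,
                })
        # Sort by how much we trust the source, not by how often an item was shared.
        return sorted(items, key=lambda item: item["authority"], reverse=True)

    for item in gather(SOURCES)[:10]:
        print(f'{item["authority"]:.1f}  {item["title"]}')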

Harleena Singh, September 13, 2010

Freebie

RSS Readers Dead? And What about the Info Flows?

September 13, 2010

Ask.com is an unlikely service to become a harbinger of change in content. Some folks don’t agree with this statement. For example, read “The Death Of The RSS Reader.” The main idea:

There have been predictions since at least 2006, when Pluck shut its RSS reader down that “consumer RSS readers” were a dead market, because, as ReadWriteWeb wrote then, they were “rapidly becoming commodities,” as RSS reading capabilities were integrated into other products like e-mail applications and browsers. And, indeed, a number of consumer-oriented RSS readers, including News Alloy, Rojo, and News Gator, shut down in recent years.

The reason is that users are turning to social services like Facebook and Twitter to keep up with what’s hot, important, newsy, and relevant.

An autumn forest. Death or respite before rebirth?

I don’t dispute that for many folks the sound of the RSS boom has dissipated. However, several factors help explain why the RSS reader has lost its appeal for most Web users. Our work suggests these factors are at work:

  1. RSS setup and management cause the same problems that the original PointCast, BackWeb, and Desktop Data created. There is too much for the average user to do up front and too much ongoing maintenance required to keep the services useful.
  2. The RSS stream outputs a lot of baloney along with the occasional chunk of sirloin. We have coded our own system to manage information on the topics that interest the goose, but most folks don’t want this type of control. After some experience with RSS, my hunch is that many users find the feeds too much work and simply abandon them. End users and consumers are not keen on repetitive work that keeps them from kicking back, playing Farmville, or keeping track of their friends.
  3. The volume of information is itself part of the problem. High-value content moves around, so plugging into a blog today is no guarantee that the source will be consistent, on topic, or rich with information tomorrow. We have learned that lack of follow-through by content creators is an issue. Publishers know how to make content; dabblers don’t. The problem is that publishers can’t generate big money, so their enthusiasm seems to come and go. Individuals are just individuals, and a sick child can cause a blog writer to find better uses for any available time.


Oracle Keeps Pushing into Business Intelligence

September 12, 2010

It’s like following a trail in the woods made by a Hummer. Tough to miss. Deer get the heck out of the way, usually. Smaller critters may not know what’s about to happen when the metal beastie crunches their carapace. Oracle made a big stride into the business intelligence domain with its new Oracle Business Intelligence 11g. Officially released in August 2010, it accelerated into the tender pines and dainty flowers that make up the US business intelligence market.

From the beginning, Oracle BI solutions have retained some of the popular, stable features from Siebel Analytics, and that has not changed with 11g. The newer version does add some great presentation and business analysis features, which are a good improvement over previous versions. However, the real enhancements can only be appreciated by someone who has used OBIEE 10.1.3.4. Steve Callan, in his “The Old and New of Oracle Business Intelligence,” asserted, “There may be lots of changes you can’t or won’t appreciate if you don’t know what it was like in past.”

Just like Hummers. Big, powerful, and sometimes tough to avoid.

Martin Brooke, September 12, 2010

Freebie

Instant Search: Who Is on First?

September 12, 2010

A reader sent me a link to “Back to the Future: Innovation Is Alive in Search.” The point of the write up is to make clear that, for the author of the post, Yahoo was an innovator way back in 2005. In fact, if I understand the Yahooligan’s blog post, Yahoo “invented” instant search. I am an addled goose, but I recall seeing that function in a demo given to me in 1999 or 2000 by a Fast Search & Transfer technology whiz. Doesn’t matter.

Search has lacked innovation for a long, long time. In fact, if you can track down someone who will share the closely guarded TREC results, you will see that precision and recall scores remain an interesting challenge for developers of information retrieval systems. In fact, the reason social curation seems to be “good enough” is that traditional search systems used to suck, still suck, and will continue to suck.

The problem is not the math, the wizards, and the hybrid Rube Goldberg machines that vendors use to work their magic. Nope. The problem with search has several parts. Let me make them explicit, because the English majors who populate the azure chip consulting firms and “real” blogs have done their best to treat technology as a John Donne poem:

First, language. Search involves language, which is a moving target. There’s a reason why poems and secret messages are tough to figure out. Words can be used in ways that allow some to “get it” and others to get an “F” in English 301: The Victorian Novel. At this time, software does better at certain types of language than others. One example is medical lingo. There’s a reason why lots of vendors have nifty STM (scientific, technical, and medical) demos.

Second, humans. Humans usually don’t know exactly what they want. Humans can recognize something that is sort of what they want. If the “something” is close enough for horseshoes, a human can take different fragments and glue them together to create an answer. These skills baffle software systems. The reason social curation works for finding information is that the people in a “circle” may be closer to the mind set of the person looking for information. Even if the social circle is clueless, the placebo effect kicks in and justifies the “good enough” method; that is, use what’s available and “make it work”, just like Project Runway contestants.

Third, smart software. Algorithms and numerical recipes, programmable search engines, fuzzy logic, and the rest of the PhD outputs are quite useful. The problem is that humans who design systems, by definition, are not yet able to create a system that can cope with the oddities that emerge from humans being human. So as nifty as Google is at finding a pizza joint in Alphabet City, Google and other systems may be wildly wrong as humans just go about their lives being unpredictable, idiosyncratic, irrational, and incorrect in terms of the system output.

I think there is innovation in search. Now my definition of innovation is very different from the Yahooligan’s. I am not interested in pointing out that most commercial and open source search systems just keep doing the basics. Hey, these folks went to college and studied the same general subjects. The variants are mostly tweaks to methods others know pretty well. After getting a PhD and going into debt, do you think a search engineer is going to change direction and “invent” new methods? I find that most search engineers are like bowling balls rolling down a gutter. The ball gets to the end of the lane, but the pins are still standing. Don’t believe me? Run a query on Ask.com, Bing.com, Google.com, or any other search system you can tap. How different are the results?

The challenge becomes identifying companies and innovators who have framed a findability problem in such a way that the traditional problems of search become less of an issue. Where does one look for that information? Not in blog posts from established companies whose track record in search is quite clear.

Stephen E Arnold, September 12, 2010

Freebie
