Live Data and Web Site Design Considerations in SharePoint Deployment
June 6, 2012
We’ve been covering Robert Schifreen’s series of posts on his SharePoint 2010 deployment experience. In his eleventh installment, “Countdown to Launch: Importing Data,” he continues the discussion of security and how to manage it alongside live data and a public-facing web site.
Schifreen has this to say about importing data:
There’s a good bulk upload tool, which improves on the facilities in out-of-the-box SharePoint, available for a couple of hundred dollars, which will make it easy for users to copy stuff across. Or they can just use WebDav from Windows. Or I could probably do the whole thing in PowerShell, because there’s enough information in our Active Directory for me to find their current files, check that they are staff rather than a student, and copy the files to the correct SharePoint library.
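Schifreen’s PowerShell idea — use what Active Directory already knows to find each person’s files, confirm they are staff, and route the files to the right SharePoint library — can be sketched in outline. The snippet below is a minimal, hypothetical illustration of the selection-and-mapping step only: the record fields (`employeeType`, `homeDirectory`, `sAMAccountName`), the site URL, and the per-user library naming scheme are all assumptions, and the actual copy would be done with SharePoint cmdlets or WebDAV rather than this mock data.

```python
# Hypothetical sketch of the AD-driven migration logic Schifreen describes:
# read directory records, keep staff only, and map each user's current
# network share to a target SharePoint library path. The field names and
# URL scheme are illustrative assumptions, not a real schema.

def plan_migration(directory_records, site_url="https://sharepoint.example.ac.uk"):
    """Return (source_path, target_library_url) pairs for staff users."""
    plan = []
    for rec in directory_records:
        # Students are skipped: only staff files are migrated.
        if rec.get("employeeType") != "staff":
            continue
        source = rec["homeDirectory"]  # user's current file share
        target = f"{site_url}/personal/{rec['sAMAccountName']}/Documents"
        plan.append((source, target))
    return plan

if __name__ == "__main__":
    records = [
        {"sAMAccountName": "jsmith", "employeeType": "staff",
         "homeDirectory": r"\\filer\home\jsmith"},
        {"sAMAccountName": "astud", "employeeType": "student",
         "homeDirectory": r"\\filer\home\astud"},
    ]
    for src, dst in plan_migration(records):
        print(f"{src} -> {dst}")
```

The point of the sketch is that the decision logic is trivial once the directory data is trustworthy; the hard part of the job is the copy itself and the permissions on the target libraries.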
He goes on to comment on developing a public-facing web site:
A SharePoint site, especially one that isn’t a public-facing website, tends to look like every other SharePoint site on the planet. You can, if you wish, customise it in any way you want. At the simple level you can replace the SharePoint logo with your own.
Importing data and branding your SharePoint site are both important steps in the overall SharePoint deployment. If you’re in the same process, you may want to read the article for some handy tips and guidance. You may also consider a third party solution to extend the capabilities of your SharePoint system. It seems that the experts at Fabasoft Mindbreeze understand the importance of a web site brand and design.
An attractive website serves as an effective digital business card. Surprise your website visitors with an intuitive search. Fabasoft Mindbreeze InSite is user friendly and instantly ready for use as a Cloud service. It turns your website into a knowledge portal for your customers and recognizes correlations and links through semantic and dynamic search processes. This delivers precise “finding experiences” and is the perfect website search for your company.
And with no installation, configuration, or maintenance required, the comprehensive and cost-effective solution will save you valuable time and training resources. Navigate to http://www.mindbreeze.com/ to read more about the full suite of solutions.
Philip West, June 6, 2012
Sponsored by Pandia.com
Oracle Chases Customer Support
June 6, 2012
Computer Business Review recently reported on Oracle integrating RightNow with Fusion in the article “Oracle Integrates RightNow CX Cloud Service With Fusion Sales.”
According to the article, Oracle has integrated its RightNow CX Cloud customer experience suite with Fusion Sales to help organizations facilitate relevant cross-channel customer interactions, improve revenue, and make processes more efficient.
RightNow, a U.S. company that incorporates search technology, acquired Q-Go, a European natural language search system, in 2011. Since this acquisition, the firm has been able to extend and improve its services. The additional $8 million in revenue helped make the CX Cloud customer experience suite possible.
The article states:
“The integrated applications also provides a cross-channel view of the customer to sales, marketing and service, allows sales to review service history in preparation for sales calls and empowers sales and service departments to collaborate to solve customer issues, using opportunities to provide purchase advice at the right time and with the right applications.”
This new suite of products will allow organizations to take a more targeted approach to customer needs.
Jasmine Ashton, June 6, 2012
Sponsored by PolySpot
Innovation Leading the Way Once Again
June 6, 2012
Manufacturing companies have long understood the value of product lifecycle management (PLM) solutions, but now, thanks to the lower costs of PLM implementation, companies need a new edge to gain traction in competitive markets. The Network World article “The Role of IT in Linking Innovation Lifecycle Information Management to ERP and PLM” proposes that companies turn back to good old-fashioned ingenuity, in the form of research and development (R&D), for that winning edge.
According to the article,
“In today’s highly connected information ecosystem, the moment has arrived for R&D to e-enable itself, just as the manufacturing and supply chain side has done with PLM and ERP. Thanks to the advent of cloud computing, service-oriented architecture and the use of Web services and technologies that support advanced search and data mining, innovation management that streamlines R&D, yet respects its complexity, is now a real possibility.”
While large companies with dedicated scientists, statisticians, and other brilliant R&D team members lead innovation, smaller enterprises often struggle to find a single innovator, much less an entire department. For those who find their company in that position, we contend that focusing on smarter PLM solutions is the best move. Innovation should never be ignored, of course, but by adopting next-generation PLM technology, like Inforbix, small and midsized companies can gain a proven edge simply through better data management.
Catherine Lamsfuss, June 6, 2012
Data Visualization Solves One Piece of the Analytics Software Puzzle
June 6, 2012
We came across an interesting summary of visual mining for bioscience in the form of an abstract from PubMed from the U.S. National Library of Medicine. The background, results and conclusions were shared in, “Methods for Visual Mining of Genomic and Proteomic Data Atlases.”
Analytics software is in an interesting place at the moment. Users demand a high level of scientific reasoning within an intuitive, efficient tool. Data visualization is part of the answer, as it enables the user to interact with diverse and complex data by manipulating it directly on screen.
The paper discusses, and provides illustrations of, an approach to developing visual mining tools capable of supporting the mining of petabytes of information. As laypeople, my colleagues and I at Beyond Search stuck to reading the text.
The paper concludes with the following thoughts:
“The mining of massive repositories of biological data requires the development of new tools and techniques. Visual exploration of the large-scale atlas data sets allows researchers to mine data to find new meaning and make sense at scales from single samples to entire populations. Providing linked task specific views that allow a user to start from points of interest (from diseases to single genes) enables targeted exploration of thousands of spectra and genomes.”
While we thank the authors for their work on the subject, they must understand that visualization is not a silver bullet.
Megan Feil, June 6, 2012
Sponsored by PolySpot
IBM on Cloud Control
June 6, 2012
It is good to know IBM is asserting leadership in the cloud. Wired Cloudline (sponsored by IBM, by the way) reports: “IBM to the World: on Cloud Computing, You’ve Got Nothing on Us.” IBM recently put out a lengthy press release filled with details about its cloud services and who uses them. For example, about a million application users work in the IBM Cloud, and that cloud processes 4.5 million transactions daily.
Writer Todd Nielsen shares (and comments upon) some highlights from the press release:
“*A New 99.9% Service Level Agreement for Smart Cloud Enterprise (Wow… not many cloud providers can promise that)
*Developers seem to be lining up with over 30 new ISV’s validated for IBM Smart Cloud (Cloud Providers live and die from applications. It is a strong statement that developers are seeing the value of the platform)
*SmartCloud Enterprise available in North America and Europe with plans for additional global roll-out in Q3 2012. (An important move for global business that not a lot of cloud providers can provide)
*Several tools and services are mentioned to improve and ease migration and transferring applications to the cloud.
*Improved licensing management and support for many operating systems and software packages”
Many more tidbits are available in both the write up and the original press release, including details on how different companies are using the resource. As Nielsen observes, it is clear that IBM has a good handle on this cloud thing. However, he wonders: how will IBM resellers benefit from all of this? Or will they?
We have a suggestion for Big Blue: why not run an instance of Watson in this cloud so we can explore its excellence?
Cynthia Murrell, June 6, 2012
Sponsored by PolySpot
HP Autonomy: The Big Data Arabesque
June 5, 2012
Hewlett Packard has big plans for Autonomy. HP paid $10 billion for the search and content processing company last year. HP faces a number of challenges in its printer and ink business. The personal computer business is okay, but HP is without a strong revenue stream from mobile devices.
“HP Rolls Out Hadoop AppSystem Stack” provided some interesting information about Autonomy and big data. The write up focuses on the big data trend. In order to make sense of large volumes of information, HP wants to build management software and integrate the “Vertica column oriented distributed database and the Autonomy Intelligent Data Operating Layer (IDOL) 10 stack.” The article reports:
On the Autonomy front, HP has announced the capability to put the IDOL 10 engine, which supports over 1,000 file types and connects to over 400 different kinds of data repositories, onto each node in a Hadoop cluster. So you can MapReduce the data and let Autonomy make use of it. For instance, you can use it to feed the Optimost Clickstream Analytics module for the Autonomy software, which also uses the Vertica data store for some parts of the data stream. HP is also rolling out its Vertica 6 data store, and the big new feature is the ability to run the open source R statistical analysis programming language in parallel on the nodes where Vertica is storing data in columnar format. More details on the new Vertica release were not available at press time, but Miller says that the idea is to provide connectors between Vertica, Hadoop, and Autonomy so all of the different platforms can share information.
HP’s idea blends a hot trend, HP’s range of hardware, HP’s system management software, a database, and Autonomy IDOL. In order to make this ensemble play in tune, HP will offer professional services.
InfoWorld’s “HP Extends Autonomy’s Big Data Chops to Hadoop Cloud” added some additional insight. I learned that former Autonomy boss Michael Lynch will leave HP “along with Autonomy’s entire original management team and 20 percent of its staff.”
The story then explained that Autonomy, combined with Vertica:
can now be embedded in Hadoop nodes. From there, users can combine Idol’s 500-plus functions — including automatic categorization, clustering, and hyperlinking — to scour various sources of structured and unstructured data to glean deeper meanings and trends. Sources run the gamut, too, from structured data such as purchase history, services issues, and inventory records to unstructured Twitter streams, and even audio files. IDOL includes 400 connectors, which companies can use to get at external data.
Autonomy moved beyond search many years ago. This current transformation of Autonomy makes marketing sense. I am interested in monitoring this big data approach. IBM had a similar idea when it presented the Vivisimo clustering and deduplication system as a “big data” system. The challenge will be applying text-centric technology to ensembles that generate insights from “big data.”
Will the shift earn back the purchase price of $10 billion and have enough horsepower to pull HP into robust top line growth? Big data and analytics have promise but I don’t know of any single analytics company that has multi-billion dollar product lines. Big data is a hot button, but does it hard wire into the pocketbooks of chief financial officers?
Stephen E Arnold, June 5, 2012
Sponsored by IKANOW
Why Newspapers Fail
June 5, 2012
Quote to note: I don’t want to “shirk” my responsibility to highlight interesting management analysis. My source is the free Wall Street Journal. On the Web site appeared a three-page letter allegedly penned by Warren Buffett, icon of American business and Bill Gates’s pal. The subject is the investment uplift for newspapers. The write up explains that some newspapers have value. But here’s the quote to note:
Times are certainly far tougher today than they used to be for newspapers. Circulation nationally will continue to slip and in some cases plunge. But American papers have only failed when one or more of the following factors was present (1) The town or city had two or more competing dailies; (2) The paper lost its position as the primary source of information important to its readers or (3) The town or city did not have a pervasive self-identity.
The question becomes, “If the formula is so obvious, what has baffled so many newspaper publishers?” I will have to wait for an azure chip consultant, an academic, or an unemployed webmaster to give me the answer. In the meantime, I will check news on my iPad and catch local info via Twitter.
Stephen E Arnold, June 5, 2012
Sponsored by HighGainBlog
Security Concerns and Account Permissions in SharePoint 2010 Explained
June 5, 2012
Robert Schifreen brings us the tenth installment of his SharePoint 2010 series in his ZDNet.co.uk post, “Security on the Farm: Accounts and Permissions.” Schifreen explains that SharePoint’s most important database is SharePoint_config, but that if it breaks, your best bet is to rebuild from your notes and restore backed-up content databases. Why? Schifreen points out that restoring a backup of SharePoint_config isn’t actually supported by Microsoft and rarely works in practice.
The author also has this to share about the nuances of a SharePoint deployment:
When you start building and running a SharePoint farm, you will come across dozens of seemingly unsolvable problems that turn out to be merely down to permissions.
He goes on to say,
Best practice is then to use separate accounts for installing various underlying services, databases, and so on…The most tempting option, of course, is to forget best practice and just use one account for running all the SharePoint internal stuff. The upside is that things will work a little better, with fewer permission-related errors. There are two downsides. First, if a hacker manages to penetrate the account he’ll have access to the entire farm rather than just a half or a third of it. Secondly, splitting everything across multiple accounts can actually aid troubleshooting in some cases because, by glancing at the server’s security log, the account that caused the problem will give you a clue as to why things are going wrong.
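Schifreen’s troubleshooting argument — one account per service makes the security log point straight at the failing service — can be illustrated with a toy sketch. The event dictionaries below are a simplified assumption, not the real Windows Security event schema; on an actual farm these records would come from the event log (failed-logon and access-denied entries).

```python
# Toy illustration of why per-service accounts aid troubleshooting:
# if each SharePoint service runs under its own account, counting
# denied events per account immediately names the failing service.
# The event format here is a simplified assumption, not the real
# Windows Security event log schema.
from collections import Counter

def failures_by_account(events):
    """Count access-denied events per service account."""
    denied = [e["account"] for e in events if e["status"] == "denied"]
    return Counter(denied)

if __name__ == "__main__":
    events = [
        {"account": "sp_search", "status": "denied"},
        {"account": "sp_search", "status": "denied"},
        {"account": "sp_farm",   "status": "ok"},
    ]
    # With separate accounts, sp_search stands out; with one shared
    # account, every failure would be attributed to the same identity.
    print(failures_by_account(events))
```

Under a single shared account, the same tally would attribute every failure to one identity, which is exactly the diagnostic information Schifreen says you give up.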
Schifreen’s focus on security is a valuable one as big data continues to grow across on-premise and cloud platforms. Consider a comprehensive out-of-the-box solution, like Fabasoft Mindbreeze, to extend your SharePoint system with added certified security benefits.
Fabasoft Mindbreeze Enterprise “finds every scrap of information within a very short time, whether document, contract, note, e-mail or calendar entry, in intranet or internet, person- or text-related. The software solution finds all required information, regardless of source, for its users.” Further, Mindbreeze offers certified security and reliability with regular external audits against the relevant standards ISO 27001, ISO 20000, ISO 9001, and SAS 70 Type II. The solution is worth a second look at www.mindbreeze.com.
Philip West, June 5, 2012
Sponsored by Pandia.com
The New Lexi-Portal Version 4 Offers More Options
June 5, 2012
Leximancer has just introduced Lexi-Portal Version 4 to the market. The new service gives users access to the full range of Leximancer’s text analytics capabilities. Market researchers will find that the portal provides fast analysis of qualitative surveys, spreadsheets, and verbatim data.
Leximancer’s technology is proven with customers all around the globe. They are providing new and innovative ways for businesses to benefit, with no strings attached. Basically, you have options in how you utilize the Lexi-Portal.
Several aspects of the portal make it unique, such as the fact that it is an ‘on demand’ service. This means you don’t have to subscribe every month; instead, you are charged for your actual usage, billed on either a time-used or per-service basis. A further convenience of the pay-as-you-go model is that the Lexi-Portal will retain your company’s information for up to two months even if your usage drops for a month.
About Leximancer:
“Leximancer is an Australian company that has been providing leading-edge text analytics technology for almost 10 years.”
“The technology was created following 7 years research and development at the University of Queensland by Dr Andrew Smith. Andrew’s physics and cognitive science background, in conjunction with his working IT application experience, enabled him to envisage and develop an innovative solution to the growing need to readily determine meaning from unstructured, qualitative, textual data.”
You can view sample outputs, such as an interview dashboard, at the Leximancer Chart Gallery.
Jennifer Shockley, June 5, 2012
Sponsored by PolySpot
The Third Industrial Revolution Is Data Management
June 5, 2012
In the manufacturing race, there is only so much companies can do to gain a competitive edge through technology alone. At least that is the premise behind the assertion by PTC President and CEO Jim Heppelmann that a ‘third industrial revolution’ is underway, as reported in the MarketWatch article “PTC CEO Jim Heppelmann Declares New Era of Manufacturing Competitiveness Driven By Product and Service Advantage.”
According to the article, Heppelmann’s vision of the new order of manufacturing includes:
“Fundamentally, PTC technology solutions transform the way companies create and service products by enabling them to make better, smarter, faster strategy and planning decisions. These decisions relate to how products are designed and engineered, how a supply chain is optimized, how quality and compliance is assured throughout the manufacturing process and, ultimately, how service is efficiently delivered against a product once sold.”
Another word for all this transforming and smarter decision-making capability is data management (okay – two words). The PLM provider Inforbix has long understood the role of data management in PLM processes and has long strived to enable its clients to find, share, and reuse data to accomplish everything Heppelmann sees as the future.
Catherine Lamsfuss, June 5, 2012