April 5, 2014
The New York Public Library has a massive collection of beautiful maps, but instead of keeping them locked in an archive, it has released them to the public. Motherboard reports, “The New York Public Library Releases 20,000 Beautiful High Resolution Maps.”
All 20,000 maps are available via open access. What is even more amazing is that the NYPL decided to release the maps under the Creative Commons CC0 1.0 Universal Public Domain Dedication. For anyone unfamiliar with that dedication, it means users are free to download the content and do whatever they want with it.
“Combined with its existing historical GIS program, the NYPL wants its users to engage with the maps, and allows them to warp (fitting together based on corresponding anchor points) and overlay the historic maps with modern geoweb services like Google and Open Street Map. Users can export WMS, KML files, and high-quality TIFFs. The historic map appears side by side with the modern maps, and users are invited to mark corresponding points on each, so you can overlay the historic map over the current day’s.”
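The warping described above, fitting a historic map to a modern one based on corresponding anchor points, boils down to fitting a coordinate transform to control-point pairs. As a minimal sketch (hypothetical helper names, not the NYPL’s actual tooling), three anchor-point pairs are enough to determine an affine mapping from historic-map pixels to modern coordinates:

```python
def affine_from_anchors(src, dst):
    """Derive the affine transform mapping three historic-map anchor points
    (pixel coordinates) onto their modern-map counterparts (e.g. lon/lat).
    src, dst: lists of three (x, y) pairs. Returns (a, b, c, d, e, f) such
    that x' = a*x + b*y + c and y' = d*x + e*y + f."""
    (x1, y1), (x2, y2), (x3, y3) = src

    def solve(t1, t2, t3):
        # Cramer's rule on the 3x3 system [x y 1] @ (p, q, r) = t.
        det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)
        p = (t1 * (y2 - y3) - y1 * (t2 - t3) + (t2 * y3 - t3 * y2)) / det
        q = (x1 * (t2 - t3) - t1 * (x2 - x3) + (x2 * t3 - x3 * t2)) / det
        r = (x1 * (y2 * t3 - y3 * t2) - y1 * (x2 * t3 - x3 * t2)
             + t1 * (x2 * y3 - x3 * y2)) / det
        return p, q, r

    a, b, c = solve(dst[0][0], dst[1][0], dst[2][0])  # x' coefficients
    d, e, f = solve(dst[0][1], dst[1][1], dst[2][1])  # y' coefficients
    return a, b, c, d, e, f

def warp(coeffs, x, y):
    """Apply the fitted transform to any point on the historic map."""
    a, b, c, d, e, f = coeffs
    return a * x + b * y + c, d * x + e * y + f
```

In practice tools fit many anchor points by least squares (and often higher-order polynomial warps), but the principle is the same: marked correspondences determine the overlay.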
Imagine Google Maps-style tools using old maps to explore the world of the past. It is yet another amazing use of modern technology, and it makes one wonder what the people of yesterday would have thought about exploring their world via a small box.
Whitney Grace, April 5, 2014
February 24, 2014
Mind maps can be a valuable tool for the visual among us, and you can easily build your own virtual version with Knowledgebase Builder 2.6 from InfoRapid, based in Waiblingen, Germany. The best part—it’s free for personal use. As with most such business models, the company hopes you’ll try the freeware version and decide you can’t live without the tool in your workplace. The Professional Edition, which lets multiple users work together on the same knowledge base, goes for 99 euros (about $135 as of this writing). The price for the version with all the bells and whistles, the Enterprise Version, varies by company size, but starts at 1,000 euros (about $1,360 as I type) for a small business.
The description tells us:
“InfoRapid KnowledgeBase Builder allows you to easily create complex Mind Maps with millions of interconnected items. One single Mind Map can hold your entire knowledge, all your thoughts and ideas in a clear way. The data is stored securely in a local database file. While traditional Mind Maps don’t offer cross connections, InfoRapid KnowledgeBase Builder can connect any item with each other and label the connection lines. The program contains an archive for documents, images and web pages that may be imported and attached to any chart item or connection line.”
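The cross-connections described in the quote make the mind map a labeled graph rather than a strict tree. A minimal sketch of that idea (a hypothetical class, not InfoRapid’s actual data model): items plus labeled connection lines between arbitrary pairs.

```python
from collections import defaultdict

class KnowledgeMap:
    """Sketch of a mind map that, unlike a strict tree, allows labeled
    cross-connections between any two items."""

    def __init__(self):
        self.items = set()
        # item -> list of (connected item, connection-line label)
        self.links = defaultdict(list)

    def add_item(self, name):
        self.items.add(name)

    def connect(self, a, b, label=""):
        # Any item may be connected to any other, with a label on the line.
        self.items.update((a, b))
        self.links[a].append((b, label))
        self.links[b].append((a, label))

    def neighbors(self, item):
        return self.links[item]
```

Because every connection is stored symmetrically, following links in either direction is a constant-time lookup, which is what makes maps with “millions of interconnected items” navigable.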
The six-minute video on the website demonstrates the Builder’s functionality, using as its example text about the software itself. The connection lines mentioned above, which shift to adjust to new input, are reason enough to switch from pen-and-paper or MSPaint mapping techniques. Another key feature: You can link to documents or web pages from within the map, simplifying follow-through (a weak point for many of us). The Highlighter Analysis is pretty nifty, too. Anyone curious about this tool should check out the site—the (personal use) price can’t be beat.
Cynthia Murrell, February 24, 2014
January 18, 2014
We are familiar with Visual Mining and its range of dashboard and data visualization software. Visual Mining has been working on products that help users better understand and analyze actionable business data. Its enterprise software line NetCharts is compatible across all platforms, including mobile devices and tablets. The company recently released its Winter 2013 Chartline Newsletter.
Along with the usual end-of-year greetings and gratitude, the first item of business the newsletter addresses is the Web site’s redesign.
Among the new features are:
- “Live Demo We would like to invite you to take a virtual test drive of our live NetCharts Performance Dashboards (NCPD) demo to see our newly restyled dashboard KPI’s.
- Blog Among the new items to explore on our site includes our new blog. This developer driven blog features new content with many different topics including tips and simple tricks to help you build and style your charts and dashboards. Keep coming back for lots more new content that will be added each month.
- Chart Gallery We also have a new chart gallery, which features all new examples with many different kinds of chart types to demonstrate some of the countless possibilities. We also added new chart type categories such as Alerting Charts and Showcase Charts. The Alerting Charts include different chart types that use alert zones while the Showcase category features chart examples with new and unusual styling approaches to demonstrate the flexibility of our charts.”
We have to wonder if the redesign came from a lack of Web traffic. Most Web sites are losing traffic, content processing vendors among them. Does Visual Mining hope to generate more traffic and sales based on its new look? We hope so.
Whitney Grace, January 18, 2014
November 24, 2013
I have watched time shrink in the last 50 years. I recall having time in my first job. I did not feel pressured to do the rush rush thing. Now, when I accept an engagement, the work has to be done in half the time, maybe faster.
As a result, reports have to be short. Graphics have to point out one key point. Presentations have to be six or eight PowerPoint slides. Big decisions are made in a heartbeat. The go go years were the slow slow years.
I took a look at Kantar Information Is Beautiful Awards. I think I saw the future of search. Users want information presented with Hollywood style visuals. Does it matter that the visualizations are incomprehensible? I don’t think so. Style takes precedence over clarity. I can visualize senior managers telling their colleagues, “I want graphics like these Kantar winners in my next PowerPoint.”
Here’s a winning visual.
The confusion of clarity with visual zing is interesting. As search vendors struggle to find a formula that generates top line revenue growth and yields net profits, are visualizations like the Kantar winners the future of search? I think the answer may be, “Absolutely.”
Vendors are not sure what they are selling. Whether it is BA Insight’s effort to get LinkedIn search group participants to explain the key attributes of search or other vendors slapping on buzzwords to activate a sales magnet, search is confused, lost maybe. Coveo is search, customer support and more. MarkLogic is XML data management, search, and business intelligence. Amazon, Google, IBM, and Microsoft search does everything one would want in the way of information access. Open source ElasticSearch, LucidWorks, and Searchdaimon are signaling a turn into the path that proprietary Verity blazed in 1988. Vendors do everything in an all out effort to close deals. Visualization may be the secret ingredient that gives search focus, purpose, and money.
Why not skip requiring a user to read, analyze, and synthesize? Boring. Why not present a predigested special effect? Exciting. Everyone will be happier.
Decision making seems to be in a crisis. Pictures instead of words may improve senior managers’ batting averages.
Relying on incomprehensible visuals to communicate will be more fun and prove to be more lucrative. I assume audiences will applaud, cheer, and stomp their feet. Conferences can sell popcorn and soft drinks to accompany the talks.
Go snappy graphics. Will I understand them at a glance? Nope.
Stephen E Arnold, November 24, 2013
November 8, 2013
Data Science is a hot industry. Data scientists have been around for decades, but it is only the proliferation of new devices and data streams that has brought the career into the Internet spotlight. Data Science is more than monitoring reports about data or even the big data revolution. Data Science is an intricate and interesting discipline, and to understand it better, check out the Visual.ly infographic titled “Data Science: More Than Mining.”
The graphic explains that data science has exploded:
“Proliferation of sensors, mobile and social trends provide explosive growth of new types of data. Data scientists are creating the tools that can be used to interpret and help translate the streams of information into innovative new products. Social media platforms such as Facebook depend on data science to create innovative, interactive features that encourage users to get interested and stay that way.”
The basics of data science are data mining, statistics, interpretation, and leveraging. The data scientist interacts with the data by asking questions about how to apply the information in new ways and improve the process. Data scientists are hardly people off the street; the job requires the skills of a hacker, a mathematician, and an artist. Mixing all those together makes a data scientist a very versatile person, able to see how to apply the data in new, unforeseen ways. It is amazing how data science has shaped society from behind the curtain since 1790.
Whitney Grace, November 08, 2013
November 3, 2013
A post at Ushahidi’s blog titled, “With Data, Never Underestimate the Power of a Pretty Picture” reminds us how important the arrangement of data into a palatable format can be. The write-up begins:
“There is power in data. Data can tell important stories, from which politicians are corrupt to which corporations are breaking the law. However, like any good story its power is dependent on being compelling to audiences — being a page-turner.”
Writer Chris R. Albon illustrates the point with two tables. The first presents real data in a bare-bones format that can only be read by those with some basic data science training. I, for one, cannot make heads or tails of it. The second is a polished presentation of gibberish; it really looks quite nice. Albon explains:
“The simple fact is that if both the table and image were placed on the desks of policymakers, journalists, business leaders, and politicians, it would undoubtedly be the image that interested them — that enticed them to examine it and kept their attention all the way through. The image’s ability to be compelling means that at the end of the day, it is going to have a much stronger chance of having a real impact.”
Of course, the piece concludes by noting that Ushahidi is the place to go for all your data-visualization needs. (There are others out there.) The self-promotion, however, does not undercut the message: readers are much more likely to pay attention to, and understand, data if it is presented in a well-designed graphic.
Cynthia Murrell, November 03, 2013
November 1, 2013
Silicon Angle recently reported on a new free solution for processing streaming data in the article “Bottlenose Visualizes Twitter Firehose Trends with Sonar Solo.”
According to the article, the startup uses technology from Lexalytics and its Nerve Center platform, among other things, to help users unpack social data, offering the visibility into personal data that a growing number of today’s customers expect.
The article states:
“Nova Spivack, the co-founder and CEO of Bottlenose, noted that ‘we wanted everyone to experience the power of Sonar’s real-time trend intelligence visualization, without restricting it to our large enterprise customers. Now, anyone with internet access can search for anything from their favorite celebrities to breaking news and current events, and everything in between, for a real-time view of what’s trending in the collective consciousness.’”
Bottlenose offers its customers a customizable product that engages their audience, something very necessary for startups to succeed in today’s competitive market.
Jasmine Ashton, November 01, 2013
October 24, 2013
Considering that Google has a stake in Recorded Future, which has visualization capabilities, this is an interesting development: The Sacramento Bee shares the press release, “Tableau Software Partners with Google to Visualize Big Data at Gartner IT Symposium.” The partnership mixes Tableau’s analytics with the Google Cloud Platform. Recently at Gartner‘s convention in Orlando, attendees were given a demonstration of the project. The write-up tells us:
“Tableau and Google created a series of dashboards to visualize enormous volumes of real-time sensory data gathered at Google I/O 2013, Google’s developers’ conference. Data measuring multiple environmental variables, such as room temperature and volume, was analyzed in Tableau and presented to attendees at the Gartner event. With Tableau’s visual analytics, Gartner attendees could see that from the data created, I/O conference managers could adjust the experience and gain insights in real time, like re-routing air-conditioning to optimize power and cooling when rooms got too warm.”
The project will also be demonstrated at Gartner’s upcoming events around the world; see the article for dates and places (though I’ll go ahead and tell you that Orlando was the only location in North America). We wonder—is this Gartner/Tableau/Google trio a marketing play, or a significant step forward in data visualization?
Founded in 2003 and located in Seattle, Washington, Tableau Software grew from a project begun at Stanford University. Its priority is to help ordinary people use data to solve problems quickly and easily. The company is fully invested in its own philosophy; not only does Tableau use its own products, but it also relies heavily on data analysis for its business decisions.
Cynthia Murrell, October 24, 2013
October 6, 2013
What is visual data mining? I know that data mining involves searching through data with a computer program in search of specific information. I am guessing that visual data mining does the same, except it presents the data using visual patterns. Am I right? Am I dead wrong? I do not know, but I do know the way to find the answer is to read Visual Data Mining: Theory, Techniques and Tools for Visual Analytics by Arturas Mazeika, Michael H. Böhlen, and Simeon Simoff.
Here is the item description from Amazon:
“The importance of visual data mining, as a strong sub-discipline of data mining, had already been recognized in the beginning of the decade. In 2005 a panel of renowned individuals met to address the shortcomings and drawbacks of the current state of visual information processing. The need for a systematic and methodological development of visual analytics was detected. This book aims at addressing this need. Through a collection of 21 contributions selected from more than 46 submissions, it offers a systematic presentation of the state of the art in the field. The volume is structured in three parts on theory and methodologies, techniques, and tools and applications.”
This book usually retails for a whopping $99.00, or $63.91 with the Amazon discount. That is still a hefty chunk of change for a 163-page book, which is why we are pleased to say that if you are a member of ISBN Book Funder or OnlineBooks.com, it is available to you for free. Other books are free for members as well. If that does not appeal to you, check out your local academic library.
Whitney Grace, October 06, 2013
September 5, 2013
If you ever wanted to visualize data sets containing up to one million rows, the impossible just became a reality without a commercial license. PRWeb has the good news: “Tableau Software Extends Tableau Public To 1 Million Rows Of Data.” Tableau Software is a data visualization company that helps its users share, analyze, and visualize their data. The company offers a free public platform, Tableau Public, that also allows users to share their content on blogs and personal Web sites. Users asked to have the row limit increased, and Tableau Software obliged by raising the cap to one million rows on its public platform.
“Since Tableau Public launched in 2010, we’ve seen an explosion in the number of data sets available on the web for public consumption,” said Tableau Public Product Marketing Manager Ben Jones. “It’s becoming more common for these data sets to exceed one hundred thousand records, so this change allows users of our software to share interactive visualizations of these larger data sets with their readers.”
Some big data sets already out in the public include airline on-time statistics and delay causes, US Medicare payments to hospitals, and historical weather station data recorded hourly. As the Internet grows, the amount of space needed will grow proportionally, perhaps even faster. We wonder when they will allow a trillion rows.
Whitney Grace, September 05, 2013