Amazon Updates Sneaker Net

October 8, 2015

I remember creating a document, copying the file to a floppy, and then walking up one flight of steps to give the floppy to my boss. He took the floppy, loaded it into his computer, and made changes. A short time later he would walk down one flight of steps, hand me the floppy with his file on it, and I would review the changes.

I thought this was the cat’s pajamas for two reasons:

  1. I did not have to use ledger paper, sharpen a pencil, and cramp my fingers
  2. Multiple copies existed so I no longer had to panic when I spilled my Fresca across my desk.

Based on the baloney I read every day about the super wonderful high speed, real time cloud technology, I was shocked when I read “Snowball’s Chance in Hell? Amazon Just Launched a Physical Data Transfer Service.” The news struck me as more important than the yap and yammer about Amazon disrupting cloud business and adding partners.

Here’s the main point I highlighted in pragmatic black:

A Snowball device is ordered through the AWS Management Console and is delivered to site within a few days; customers can order multiple devices and devices can be run in parallel. Described as coming in its “own shipping container” (it doesn’t require packing or unpacking) the Snowball is entirely self-contained, complete with 110 Volt power, a 10 GB network connection on the back and an E Ink display/control panel on the front. Once received it’s simply a matter of plugging the device in, connecting it to a network, configuring the IP address, and installing the AWS Snowball client; a job manifest and 25 character unlock code complete the task. When the transfer of data is complete the device is disconnected and a shipping label will automatically appear on the E Ink display; once shipped back to Amazon (currently only the Oregon data center is supporting the service, with others to follow) the data will be decrypted and copied to S3 bucket(s) as specified by the customer.
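The passage above is essentially a checklist. For the curious, here is a minimal sketch of what creating such an import job might look like with the AWS SDK for Python (boto3); the address, bucket, and description values are hypothetical placeholders, not anything Amazon published with the announcement.

    import boto3

    # The article notes only the Oregon data center supports the service at launch.
    snowball = boto3.client("snowball", region_name="us-west-2")

    # Register a shipping address for the appliance (details are made up).
    address = snowball.create_address(Address={
        "Name": "Example Corp",
        "Street1": "123 Example Street",
        "City": "Louisville",
        "StateOrProvince": "KY",
        "PostalCode": "40299",
        "Country": "US",
        "PhoneNumber": "555-555-5555",
    })

    # Create an import job: Amazon ships the appliance, the customer loads it,
    # ships it back, and the data lands in the specified S3 bucket.
    job = snowball.create_job(
        JobType="IMPORT",
        AddressId=address["AddressId"],
        Resources={"S3Resources": [{"BucketArn": "arn:aws:s3:::example-import-bucket"}]},
        Description="Sneaker net, 2015 edition",
    )

    # The job manifest and unlock code mentioned in the write up are API objects.
    manifest = snowball.get_job_manifest(JobId=job["JobId"])
    unlock = snowball.get_job_unlock_code(JobId=job["JobId"])
    print(manifest["ManifestURI"], unlock["UnlockCode"])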

There you go. Sneaker net updated with FedEx, UPS, or another shipping service. Definitely better than carrying an appliance up and down stairs. I was hoping that individuals participating in the Mechanical Turk system would be available to pick up an appliance and deliver it to the Amazon customer and then return the gizmo to Amazon. If Amazon can do Etsy-type stuff, it can certainly do Uber-type functions, right?

When will the future arrive? There is no word on how the appliance will interact with Amazon’s outstanding search system. I wish I knew how to NOT out (that is, exclude) unpublished books or how to locate mysteries by Japanese authors available in English. Hey, there is a sneaker net. Focus on the important innovations.

Stephen E Arnold, October 8, 2015

Compare Cell Phone Usage in Various Cities

October 8, 2015

Ever wonder how cell phone usage varies around the globe? Gizmodo reports on a tool that can tell us, called ManyCities, in the article “This Website Lets You Study Cell Phone Use in Cities Around the World.” The project is a team effort from MIT’s SENSEable City Laboratory and networking firm Ericsson. Writer Jamie Condliffe tells us that ManyCities:

“…compiles mobile phone data — such as text message traffic, number of phone calls, and the amount of data downloaded —from base stations in Los Angeles, New York, London, and Hong Kong between April 2013 and January 2014. It’s all anonymised, so there’s no sensitive information on display, but there is enough data to understand usage patterns, even down the scale of small neighbourhoods. What’s nice about the site is that there are plenty of intuitive interpretations of the data available from the get-go. So, you can see how phone use varies geographically, say, or by time, spotting the general upward trend in data use or how holidays affect the number of phone calls. And then you can dig deeper, to compare data use over time between different neighbourhoods or cities: like, how does the number of texts sent in Hong Kong compare to New York? (It peaks in Hong Kong in the morning, but in the evening in New York, by the way.)”

The software includes some tools that go a little further as well; users can cluster areas by usage patterns or incorporate demographic data. Condliffe notes that this information could help with a lot of tasks: forecasting activity and demand, for example. If only it were available in real time, he laments, though he predicts that will happen soon. Stay tuned.
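The clustering ManyCities offers is, at bottom, a routine exercise in grouping usage time series. As a rough, hypothetical sketch of that kind of analysis (fabricated hourly call counts, not the MIT/Ericsson data), scikit-learn’s KMeans gets the idea across:

    import numpy as np
    from sklearn.cluster import KMeans

    # Rows are neighborhoods, columns are average calls per hour of day (0-23).
    # The numbers are invented purely to illustrate the technique.
    rng = np.random.default_rng(0)
    business_district = rng.poisson(lam=[5] * 8 + [50] * 10 + [10] * 6, size=(10, 24))
    residential = rng.poisson(lam=[5] * 8 + [15] * 10 + [40] * 6, size=(10, 24))
    usage = np.vstack([business_district, residential]).astype(float)

    # Normalize each profile so clusters reflect the shape of the day, not call volume.
    usage /= usage.sum(axis=1, keepdims=True)

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(usage)
    print(labels)  # office-hour profiles and evening-heavy profiles separate cleanly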

Cynthia Murrell, October 8, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Predictions about Technology: Digital Retreading

October 7, 2015

I like it when a person tells me that software or a human can predict the future. My question is, “If the predictions are spot on, why is the owner of the prediction system talking? Why not play fantasy football, pick stocks, or hang out at Keeneland during an auction and buy horses whose value will skyrocket?”

The answer is, “Err, well, hmmm.”

Exactly. Predicting the future is a bit like imagining oneself putting on soccer boots and filling in for the injured Lionel Messi. Easy to imagine. Essentially impossible to do.

The fix is to be fuzzy. Instead of getting into a win-lose situation, there are caveats. I find these predictions and their predictors amusing. Not as enjoyable as the antics of something like IBM cognitive computing marketed by Bob Dylan or the silliness of Hewlett Packard management activities. But close, darned close.

I read “Gartner: Top 10 Strategic Technology Trends For 2016.” I noted this statement from the capitalist tool:

…the evolution of digital business is clearly at the heart of what is covered.

Okay, the write up is going to identify trends which will allow an MBA or a savvy marketer to look at business and understand how “business” will evolve. Darwin to the future, not Darwin from the past, I assume.

The question in my mind is, “Are these retread ideas?”

Here are three “trends” which caught my attention. To get the full intellectual payload, you will need to read the article or, better yet, seek out a Gartner wizard and get the trend thing straight from the horse’s mouth. Yep, right, mouth.

Trend 2: Ambient user experience.

I remember hearing about ambient computing years ago. The idea was that one could walk around and compute. I also ran across Deloitte’s identification of a similar trend months ago. But it was in the late 1990s or early 2000s when an MIT person talked about the concept. Obviously if one is computing whilst walking around, there is an experience involved. With mobile devices outselling tethered devices, it seems disingenuous to talk about this trend. According to Forbes, the capitalist tool:

Gartner posits that the devices and sensors will become so smart that they will be able to organize our lives without our even noticing that they are doing so.

I like posit. The word means “to dispose or set firmly, assume or affirm the existence of, and propose as an explanation.” Yep, posit something that academics and blue chip consulting firms have been saying for a while.

Trend 4: Information of Everything

Now these universal statements are rhetorical tactics which make my tail feathers stand up. “Everything” is a broad concept. A critical reader may want to ask, “Will you provide me with information about line 24 million in Watson’s 100 million lines of code?” Is the “everything” going to provide this answer? Nope. Logical flaw. But here’s how the capitalist tool, a font of logical thought, presents this “information of everything” trend:

According to Gartner, by 2020, 25 billion devices will be generating data about almost every topic imaginable. This is equal parts opportunity and challenge.  There will be a plethora of data, but making sense of it will be the trick. Those companies that harness the power of this tidal wave of information will leapfrog competitors in the process.

I like the plethora. I like the leapfrog. I like the tidal wave. I have a sneaking suspicion that most folks with a computer device have experienced a moment of information confusion. With “every topic imaginable,” confusion is a familiar neighbor. Now how long has this concept of lots of information from lots of devices with communications capability been around? Forbes, the capitalist tool, published “A Very Short History of the Internet of Things” in June 2014. If the Forbes writer had taken the time to look at that article, he would have learned that the concept poked its nose into the world in the early 1930s. Well, that is only about 80 years ago. But it is a trend. Hmm. Trend.

Now my favorite.

Trend 9. Mesh App and Service Architecture

The notion that computer systems should be able to exchange information is a good one. I can’t recall when I learned about this concept. Wait. No, I remember. It was in 1963 when I took my first class in computer programming. The professor, a fine autistic polymath, explained that the mainframe—a 1710—was a collection of components. He said in 1962 that different machines would talk to one another in the future. Well, there you go. A third rate university with dullards like me in class got a prognostication which seems to be true. That was more than half a century ago. Here’s the modern version of this old chestnut:

More apps are being built to be plugged together, and the value of the combination is much greater than the sum of the parts.  As Lyft has integrated with comparable offerings in other countries, its ability to expand its offering for traditional customers traveling abroad and the reverse has meant faster growth with minimal cost implications.

Enough of these walks down memory lane. Three observations:

  1. These trends are recycled concepts
  2. The presentation of the trends is a marketing play, nothing more, nothing less
  3. Mid tier consulting firms are trying really hard to sound very authoritative, important, and substantial.

That would work if footnotes provided pointers to those who offered the ideas before. Whether the source is a blue chip consulting firm like Deloitte or a half wild computer science professor in the Midwest, the trends are not trends.

We are, gentle reader, looking at digital retreaded tires. A recap. A remold. Old stuff made fresh. Just don’t drive too quickly into the future on these babies. Want to bet on this?

Stephen E Arnold, October 7, 2015

Full Text Search Gets Explained

October 6, 2015

Full text search is one of the primary functions of most search platforms. If a search platform cannot get full text search right, then it is useless and should be tossed in the recycle bin. Full text search is such a basic function these days that most people do not know how to explain what it is. So what is full text search?

The Xojo article “Full Text Search With SQLite” provides a thorough definition:

“What is full text searching? It is a fast way to look for specific words in text columns of a database table. Without full text searching, you would typically search a text column using the LIKE command. For example, you might use this command to find all books that have “cat” in the description…But this select actually finds row that has the letters “cat” in it, even if it is in another word, such as “cater”. Also, using LIKE does not make use of any indexing on the table. The table has to be scanned row by row to see if it contains the value, which can be slow for large tables.”
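To make the LIKE versus full text distinction concrete, here is a minimal sketch using Python’s built-in sqlite3 module; it assumes the underlying SQLite library was compiled with the FTS5 extension, which most current builds include.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE books(title TEXT, description TEXT)")
    con.executemany("INSERT INTO books VALUES (?, ?)", [
        ("Feline Friends", "A story about a cat and its owner."),
        ("Party Planning", "How to cater a large event."),
    ])

    # LIKE scans every row and matches substrings, so "cater" is a false hit for "cat".
    like_hits = con.execute(
        "SELECT title FROM books WHERE description LIKE '%cat%'").fetchall()

    # FTS5 builds an inverted index and matches whole words.
    con.execute("CREATE VIRTUAL TABLE books_fts USING fts5(title, description)")
    con.execute("INSERT INTO books_fts SELECT title, description FROM books")
    fts_hits = con.execute(
        "SELECT title FROM books_fts WHERE books_fts MATCH 'cat'").fetchall()

    print("LIKE:", like_hits)  # both rows match
    print("FTS :", fts_hits)   # only the cat book matches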

After the definition, the article turns into an advertising piece for SQLite and how it improves the quality of full text search. It offers some more basic explanations, which are hard to follow without a coding background. The write up is very brief, with some detailed information, but it could explain more about what SQLite is and how it improves full text search.

Whitney Grace, October 6, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Learn One of These Programming Languages, Crunch Big Data. Easy, Right?

October 3, 2015

I read a listicle called “Ten Top Languages for Crunching Big Data.” The list is interesting, but the underlying assumption about the languages and “crunching” Big Data is remarkable.

The core of the write up is a list of 10 programming languages which make it possible (maybe semi easy) to “generate insights.” The list has some old familiar programming languages; for example, SQL or structured query language. There’s the graduate student in psychology fave SAS. Some might argue that SPSS Clem is the way to chop Big Data down to size. There is a toolkit in the list. Remember Matlab, which for a “student” is only $49. For the sportier crowd, I would add Mathematica to the list, but I don’t want to melt the listicle.

Also on the list are Python and R. Both get quite a bit of love from some interesting cyber OSINT outfits.

For fans of Java, the list points to Scala. The open source fan can use HiveQL, Julia, or Pig Latin.

The listicle includes a tip of the hat to Alphabet Google. According to the write up:

Go has been developed by Google and released under an open source license. Its syntax is based on C, meaning many programmers will be familiar with it, which has aided its adoption. Although not specifically designed for statistical computing, its speed and familiarity, along with the fact it can call routines written in other languages (such as Python) to handle functions it can’t cope with itself, means it is growing in popularity for data programming.

Yep, a goodie from the GOOG spells Big Data magic. For how long? Well, I don’t know.

However, the assumption on which the listicle hangs is that a programming language allows Big Data to be crunched.

Whoa, Nellie.

There may be a couple of “trivial” intermediary steps required. Let me mention one. The Big Data cruncher has to code up something to get useful outputs. Now that “code up” step may require some other bothersome tasks; for example, dealing with messy data so that the garbage in, garbage out problem does not arise. The mathematically inclined may also insist that the coded up “script” actually run within available computer time and memory resources. Ignore these details, and the script to crunch Big Data may either not work or may output results which are dead wrong. What if the script implements algorithmic bias?
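As a small, hypothetical illustration of the “messy data” chore (the column names and rules are invented for the example, not taken from the listicle), a chunk of the work happens before any of the ten languages gets to show off:

    import pandas as pd

    # Fabricated raw export: missing keys, non-numeric strings, an impossible value.
    raw = pd.DataFrame({
        "customer_id": ["101", "102", None, "104"],
        "purchase_amount": ["19.99", "n/a", "42.50", "-5.00"],
    })

    cleaned = (
        raw.dropna(subset=["customer_id"])              # drop rows missing the key
           .assign(purchase_amount=lambda d: pd.to_numeric(
               d["purchase_amount"], errors="coerce"))  # "n/a" becomes NaN, not garbage
           .dropna(subset=["purchase_amount"])
           .query("purchase_amount >= 0")               # negative purchases are suspect
    )

    print(cleaned)  # only the rows worth "crunching" survive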

Whoa, whoa, Nellie.

I know that programming languages are important. But some other tasks deserve attention in my experience.

Stephen E Arnold, October 3, 2015

Legacy Servers: Upgrade Excitement

October 2, 2015

Enterprise content management (ECM) systems were supposed to provide an end-all solution for storing and organizing digital data. Data needs to be stored for several purposes: taxes, historical record, research, and audits. Government agencies deployed ECM solutions to manage their huge data loads, but the old information silos are not performing up to modern standards. GCN discusses the upgrade challenge government agencies face in “Migrating Your Legacy ECM Solution.”

When ECM systems first came online, information was stored in silos programmed to support even older legacy solutions with niche applications. The repositories are so convoluted that users cannot find any information, never mind upgrade the beasts:

“Aging ECM systems are incapable of fitting into the new world of consumer-friendly software that both employees and citizens expect.  Yet, modernizing legacy systems raises issues of security, cost, governance and complexity of business rules  — all obstacles to a smooth transition.  Further, legacy systems simply cannot keep up with the demands of today’s dynamic workforce.”

Two solutions present themselves: data can be migrated from the old legacy system to a new one, or the content can simply be moved out of the silo. The barriers are cost and time, but users will reap the benefits of upgrades, especially connectivity, cloud, mobile, and social features. There is also the possibility of leaving the content in place and using interoperability standards or cloud-based management to make the data searchable and accessible.

The biggest problem is actually convincing people to upgrade.  Why fix what is not broken?  Then there is the justification of using taxpayers’ money for the upgrade when the money can be used elsewhere.  Round and round the argument goes.

Whitney Grace, October 2, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

Share By from StoryCloud Reins in Control of Online Content by Content Creators

October 1, 2015

The article titled “Permission Based Publishing Lets Users Keep Control of Content” on Beta News describes an innovative approach to giving online content publishers a tighter grip on how their content is distributed. StoryCloud, the provider of the permission-based publishing platform Share By, explains the myriad potential uses for its platform, from teachers measuring a class’s understanding of the homework assignment to a musical group sharing a song with specific subscribers. The article explains how the platform functions:

“By using permission-based technology that is tightly integrated with social networking, analytics and ecommerce, Share By allows content providers to easily determine who sees their content, when, and from what location. Other permissions include duration, view or download limits and scheduling time periods for sharing and the devices that are permitted. Once content providers upload content to StoryCloud and determine permissions, they receive a unique URL which can be shared with any online audience, including Facebook and Twitter.”
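The article does not publish an API, so the following is only a hedged, hypothetical sketch of what a permission check along the lines described (approved viewers, time window, view limit, allowed devices) might look like; every name in it is invented for illustration, not taken from StoryCloud.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class SharePolicy:
        # Hypothetical model of the permissions the article lists.
        allowed_viewers: set
        not_before: datetime
        not_after: datetime
        max_views: int
        allowed_devices: set
        views: dict = field(default_factory=dict)

        def may_view(self, viewer, device, now):
            if viewer not in self.allowed_viewers or device not in self.allowed_devices:
                return False
            if not (self.not_before <= now <= self.not_after):
                return False
            if self.views.get(viewer, 0) >= self.max_views:
                return False
            self.views[viewer] = self.views.get(viewer, 0) + 1
            return True

    policy = SharePolicy(
        allowed_viewers={"student_a", "student_b"},
        not_before=datetime(2015, 10, 1),
        not_after=datetime(2015, 10, 8),
        max_views=2,
        allowed_devices={"tablet", "laptop"},
    )
    print(policy.may_view("student_a", "tablet", datetime(2015, 10, 2)))  # True
    print(policy.may_view("stranger", "tablet", datetime(2015, 10, 2)))   # False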

Beyond the privacy and control aspects of Share By, content providers also gain the ability to graphically analyze how the content they have released online is consumed. For most individuals, this might just mean checking in on who really spent time with the content, but for companies it means monetization. They can charge per viewing and offer subscriptions without worrying about people getting the content without consent.

Chelsea Kerwin, October 01, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

National Geographic Sells Out 

September 30, 2015

The National Geographic Society is one of the most respected institutions for science and journalism related to nature. For 127 years, National Geographic managed itself as a non-profit organization. Buzzfeed reports that 21st Century Fox purchased National Geographic in the article “Rupert Murdoch Is Buying National Geographic.” Before you get upset that National Geographic has “sold out” in the same manner that Sesame Street has with its new HBO partnership, be aware that 21st Century Fox already owned and operated a joint-venture partnership with the company.

The bulk of National Geographic’s properties are being turned over to 21st Century Fox, which will manage them and allow the National Geographic Society to focus on its mission:

“The National Geographic Society said the deal will let the foundation invest more money in sponsoring explorers and scientists. ‘The value generated by this transaction, including the consistent and attractive revenue stream that National Geographic Partners will deliver, ensures that we will have greater resources for this work, which includes our grant making programs,’ said CEO Gary Knell, in a statement.”

While National Geographic is still popular, it faces stiff competition from other news outlets that generate similar if not more content.  National Geographic wants to have better, modern storytelling “so that we may all know more of the world upon which we live.”

Hopefully this will free up more monies for scientific research, endeavors to protect endangered species, educational programs, and better ways to educate people on the natural world.

 

Whitney Grace, September 30, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

The Many Applications of Predictive Analytics

September 29, 2015

The article on Computerworld titled “Technology that Predicts Your Next Security Fail” covers the current explosion in predictive analytics, the application of past occurrences to predict future ones. The article cites the example of the Kentucky Department of Revenue (DOR), which used predictive analytics to catch fraud. By providing SAS with six years of data, the DOR received a batch of new insights into fraud indicators, such as similar filings from the same IP address. The article imparts words of wisdom from SANS Institute instructor Phil Hagen,

“Even the most sophisticated predictive analytics software requires human talent, though. For instance, once the Kentucky DOR tools (either the existing checklist or the SAS tool) suspect fraud, the tax return is forwarded to a human examiner for review. “Predictive analytics is only as good as the forethought you put into it and the questions you ask of it,” Hagen warns….  Also It’s imperative that data scientists, not security teams, drive the predictive analytics project.”

In addition to helping the IRS avoid major fails like the 2013 fraudulent refunds totaling $5.8 billion, predictive analytics has other applications. Perhaps the most interesting is its use in protecting human assets in regions where kidnappings are common by detecting unrest and alerting organizations to lock their doors. But it is hard to see limitations for technology that so accurately reads the future.
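The “similar filings from the same IP address” indicator is easy to picture in miniature. Here is a toy sketch (fabricated records, not the Kentucky DOR’s data or SAS’s actual method) that simply flags addresses submitting an unusual number of returns:

    from collections import Counter

    # Fabricated tax filings: (filer_id, source_ip)
    filings = [
        ("A-100", "203.0.113.7"),
        ("A-101", "203.0.113.7"),
        ("A-102", "203.0.113.7"),
        ("B-200", "198.51.100.23"),
        ("C-300", "203.0.113.7"),
    ]

    THRESHOLD = 3  # arbitrary cutoff for "too many filings from one address"

    counts = Counter(ip for _, ip in filings)
    suspicious = {ip for ip, n in counts.items() if n >= THRESHOLD}

    # As Hagen notes, flagged returns still go to a human examiner for review.
    for filer, ip in filings:
        if ip in suspicious:
            print(f"review {filer}: {ip} appears in {counts[ip]} filings")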

Chelsea Kerwin, September 29, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

NTENT Has a New CEO

September 29, 2015

NTENT is a leading natural language processing and semantic search company that owns the Convera technology. According to Business Wire’s “NTENT Appoints Dan Stickel As New CEO,” the company has hired Dan Stickel as its new chief executive. NTENT is focused on expanding the company with AltaVista and Google. Using Stickel’s experience, NTENT has big plans and is sure that Stickel will lead the company to success.

“CEO, Stickel’s first objective will be to prioritize NTENT’s planned expansion along geographic, market and technology dimensions. ‘After spending significant time with NTENT’s Board, management team and front-line employees, I’m excited by the company’s opportunities and by the foundation that’s already been laid in both traditional web and newer mobile capabilities. NTENT has clearly built some world-class technology, and is now scaling that out with customers and partners.’”

In his past positions as CEO at Metaforic and Webtrends, as well as head of the enterprise business at AltaVista and the software business at Macrovision, Stickel has transitioned companies to become leaders in their respective industries.

The demand for natural language processing software, and for incorporating it into semantic search, is one of the biggest IT trends at the moment. The field demands innovation, and NTENT believes Stickel will guide the company.

Whitney Grace, September 29, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
