Icann Is an I Won’t
November 16, 2015
Have you ever heard of Icann? Like many people in the United States, you probably have not heard of the non-profit private company. What does Icann do? Icann is responsible for allocating Internet protocol (IP) addresses and coordinating domain names, so the company effectively runs a huge portion of the Internet. According to The Guardian in “The Internet Is Run By An Unaccountable Private Company. This Is A Problem,” the US nominally oversees Icann, but its role is mostly clerical, and by September 30, 2015 it was supposed to hand the reins over to someone else.
The “else” is the biggest question. The Icann community spent countless hours trying to figure out who should govern the company, but it ran into a brick wall. The biggest issue is that the volunteers want Icann to have more accountability, which does not seem feasible: Icann’s directors cannot be fired, except by each other. Finances are another problem, carrying governance risks and the potential for corruption.
A proposed solution is to create a membership organization, a common business model for non-profits that would give power to the community. Icann’s directors are not too happy about the idea and have been allowed to weigh in on it themselves. Decisions are not being made at Icann, and with the upcoming presidential election the entire power shift could be put on hold. That is not the worst that could happen:
“But there’s much more at stake. Icann’s board – as ultimate authority in this little company running global internet resources, and answerable (in fact, and in law) to no one – does have the power to reject the community’s proposals. But not everything that can be done, should be done. If the board blunders on, it will alienate those volunteers who are the beating heart of multi-stakeholder governance. It will also perfectly illustrate why change is required.”
The board has all the power, and there is no one to hold it accountable. Icann’s directors just have to stall long enough to keep things the same, and they will be able to give themselves more raises.
Whitney Grace, November 16, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Expect Disruption from Future Technology
November 13, 2015
A dystopian future where technology has made humanity obsolete is a theme older than the Industrial Revolution. History has shown that while technology phases out some jobs, it creates more; after all, someone needs to build and monitor the machines. But as computing systems become capable of reasoning, startups turn temporary gigs into permanent business models, and 3D printing makes it possible to fabricate almost any object, the obsolete-humanity idea does not seem so far-fetched. Kurzweilai shares a possible future with “The SAP Future Series: Digital Technology’s Exponential Growth Curve Foretells Avalanche Of Business Disruption.”
While technology has improved the lives of countless people, it is also disrupting industries. These figures illustrate just how disruptive:
- In 2015 Airbnb will become the largest hotel chain in the world, launched in 2008, with more than 850,000 rooms, and without owning any hotels.
- From 2012 to 2014, Uber consumed 65% of San Francisco’s taxi business.
- Advances in artificial intelligence and robotics put 47% of US employment — over 60 million jobs — at high risk of being replaced in the next decade.
- 10 million new autonomous vehicles per year may be entering US highways by 2030.
- Today’s sensors are 1 billion times better — 1000x lighter, 1000x cheaper, 1000x the resolution — than only 40 years ago. By 2030, 100 trillion sensors could be operational worldwide.
- DNA sequencing cost dropped precipitously — from $1 billion to $5,000 — in 15 years. By 2020 it could be $0.01.
- In 2000 it took $5,000,000 to launch an internet start-up. Today the cost is less than $5,000.
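That DNA sequencing figure implies a remarkably steep compound annual decline. A quick back-of-the-envelope calculation (our illustration, not SAP’s) shows what exponential cost curves like these look like year over year:

```python
# Back-of-the-envelope: implied compound annual decline in DNA sequencing cost,
# using the figures from the bullet above ($1 billion to $5,000 over 15 years).
start_cost = 1_000_000_000  # dollars, at the start of the 15-year span
end_cost = 5_000            # dollars, 15 years later
years = 15

# If end = start * factor**years, then factor = (end/start)**(1/years)
annual_factor = (end_cost / start_cost) ** (1 / years)
annual_decline_pct = (1 - annual_factor) * 100

print(f"Each year costs fall to {annual_factor:.1%} of the prior year's level")
print(f"That is roughly a {annual_decline_pct:.0f}% decline per year")
```

Run it and the numbers come out to costs dropping by more than half every single year for fifteen years straight, which is the kind of curve no incumbent business model survives unchanged.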
Using a series of videos, SAP explains how disruption will change the job market, project management, learning, and even the prediction of future growth. Rather than continuing the dystopian projections, SAP positions itself to offer hope and ways to adapt. Humanity will face huge changes because of technology in the near future, but our ability to adapt has always helped us evolve.
Whitney Grace, November 13, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Google Takes Aim at Internet Crime
November 12, 2015
Google has a plan to thwart Internet crime: make it too expensive to be worth it. The company’s Online Security Blog examines the issue in “New Research: The Underground Market Fueling For-Profit Abuse.” The research was presented last June at the Workshop on the Economics of Information Security 2015; I recommend those interested check out the full report here.
The post describes the global online black market that has grown over the last ten years or so, where criminals trade in such items as stolen records, exploit kits, scam hosting, and access to compromised computers. The profit centers which transfer the shady funds rest upon an infrastructure, the pieces of which cost money. Google plans to do what it can to increase those costs. The write-up explains:
“Client and server-side security has dominated industry’s response to digital abuse over the last decade. The spectrum of solutions—automated software updates, personal anti-virus, network packet scanners, firewalls, spam filters, password managers, and two-factor authentication to name a few—all attempt to reduce the attack surface that criminals can penetrate. While these safeguards have significantly improved user security, they create an arms race: criminals adapt or find the subset of systems that remain vulnerable and resume operation.
“To overcome this reactive defense cycle, we are improving our approach to abuse fighting to also strike at the support infrastructure, financial centers, and actors that incentivize abuse. By exploring the value chain required to bulk register accounts, we were able to make Google accounts 30–40% more expensive on the black market. Success stories from our academic partners include disrupting payment processing for illegal pharmacies and counterfeit software outlets advertised by spam, cutting off access to fake accounts that pollute online services, and disabling the command and control infrastructure of botnets.”
Each of the links in the above quote goes to an in-depth paper, so there’s plenty of material to check out there. Society has been trying for centuries to put black markets out of business. Will the effort be more successful in the virtual realm?
Cynthia Murrell, November 12, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Amazon Punches Business Intelligence
November 11, 2015
Amazon already gave technology a punch when it launched AWS, and now it is releasing a business intelligence application that will change the face of business operations, or so Amazon hopes. ZDNet describes Amazon’s newest endeavor in “AWS QuickSight Will Disrupt Business Intelligence, Analytics Markets.” The market is already saturated with business intelligence vendors, but Amazon’s new AWS QuickSight will cause another upheaval.
“This month is no exception: Amazon crashed the party by announcing QuickSight, a new BI and analytics data management platform. BI pros will need to pay close attention, because this new platform is inexpensive, highly scalable, and has the potential to disrupt the BI vendor landscape. QuickSight is based on AWS’ cloud infrastructure, so it shares AWS characteristics like elasticity, abstracted complexity, and a pay-per-use consumption model.”
Another monkey wrench for business intelligence vendors is that AWS QuickSight’s prices are not just reasonable, they are borderline scandalous: a standard edition at $9/month per user or an enterprise edition at $18/month per user.
Keep in mind, however, that AWS QuickSight is the newest shiny object on the business intelligence market, so it will have out-of-the-box problems, unknown long-term ramifications, and a reliance on database models and schemas. Do not forget that most business intelligence solutions fail to resolve every issue, including ease of use and comprehensiveness. It might be better to wait until the bugs are worked out of the system, unless you do not mind being a guinea pig.
Whitney Grace, November 11, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Drone and Balloon WiFi Coming to the Sky near You
November 10, 2015
Google and Facebook have put their differences aside to expand Internet access to four billion people. Technology Review explains in “Facebook’s Internet Drone Team Is Collaborating With Google’s Stratospheric Balloons Project” how both companies have filed documents with the US Federal Communications Commission to push international regulators to make it easier for aircraft to fly 12.5 miles, or 20 kilometers, above the Earth, placing them in the stratosphere.
Google has been working on stratospheric balloons that function as aerial cell towers, and Facebook is designing aircraft-sized drones, tethered to the ground, that serve the same purpose. While the companies are working together, they will not say how. Both have pursued similar projects on their own, but the aerial cell towers mark a joint effort in which they are putting aside their differences (for the most part) to improve information access.
“However, even if Google and Facebook work together, corporations alone cannot truly spread Internet access as widely as is needed to promote equitable access to education and other necessities, says Nicholas Negroponte, a professor at MIT’s Media Lab and founder of the One Laptop Per Child Project. ‘I think that connectivity will become a human right,’ said Negroponte, opening the session at which Facebook and Google’s Maguire and DeVaul spoke. Ensuring that everyone gets that right requires the Internet to be operated similar to public roads, and provided by governments, he said.”
Quality Internet access could not only remedy poor education but also improve daily living. People in developing countries would be able to search for information, find solutions to problems, and even combat traditional practices that do more harm than good.
Some of the biggest obstacles will be determining who maintains the aerial cell towers and whether they pose any sort of environmental danger.
Whitney Grace, November 10, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
CEM Platform Clarabridge 7 Supports Silo Elimination
November 10, 2015
The move to eliminate data silos in the corporation has gained another friend, we learn in Direct Marketing News’ piece, “Clarabridge Joins the Burn-Down-the-Silos Movement.” With their latest product release, the customer experience management firm hopes to speed their clients’ incorporation of business intelligence and feedback. The write-up announces:
“Clarabridge today released Clarabridge 7, joining the latest movement among marketing tech companies to speed actionability of data intelligence by burning down the corporate silos. The new release’s CX Studio promises to provide users a route to exploring the full customer journey in an intuitive manner. A new dashboard and authoring capability allows for “massive rollout,” in Clarabridge’s terms, across an entire enterprise.
“Also new are role-based dashboards that translate data in a manner relevant to specific roles, departments, and levels in an organization. The company claims that such personalization lets users take intelligence and feedback and put it immediately into action. CX Engagor expedites that by connecting business units directly with consumers in real time.”
We have to wonder whether this rush to “burn the silos” will mean that classified information will get out; details germane to a legal matter, for example, or health information or financial data. How can security be applied to an open sea of data?
Clarabridge has spent years developing its sentiment and text analytics technology, and asserts it is uniquely positioned to support enterprise-scale customer feedback initiatives. The company maintains offices in Barcelona, London, San Francisco, Singapore, and Washington, DC. They also happen to be hiring as of this writing.
Cynthia Murrell, November 10, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Photo Farming in the Early Days
November 9, 2015
Have you ever wondered what your town looked like while it was still rural and used as farmland? Instead of having to visit your local historical society or library (although we do encourage you to do so), you can explore Photogrammer, a Web-based image platform for organizing, viewing, and searching photos taken for the United States Farm Security Administration and Office of War Information (FSA-OWI for short) from 1935-1945.
Photogrammer uses an interactive map of the United States: users can click on a state, then a city or county within it, to see photos from the period. The archive contains over 170,000 photos, but only 90,000 have a geographic classification. The photos have also been grouped by photographer, although that index is limited to fifteen people. Besides city, photographer, year, and month, the collection can be sorted by collection tags and lot numbers (although these are not discussed in much detail).
While farm photographs from 1935-1945 do not appear to need their own photographic database, the collection’s history is interesting:
“In order to build support for and justify government programs, the Historical Section set out to document America, often at her most vulnerable, and the successful administration of relief service. The Farm Security Administration—Office of War Information (FSA-OWI) produced some of the most iconic images of the Great Depression and World War II and included photographers such as Dorothea Lange, Walker Evans, and Arthur Rothstein who shaped the visual culture of the era both in its moment and in American memory. Unit photographers were sent across the country. The negatives were sent to Washington, DC. The growing collection came to be known as “The File.” With the United States’ entry into WWII, the unit moved into the Office of War Information and the collection became known as the FSA-OWI File.”
While the photos do have historical importance, rather than maintaining a separate database with its small flaws, it would be more useful to incorporate the collection into a larger historical archive, like the Library of Congress, instead of keeping it a pet project.
Whitney Grace, November 9, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Banks Turn to Blockchain Technology
November 9, 2015
Cryptocurrency has come a long way, and now big banks are taking the technology behind Bitcoin very seriously, we learn in “Nine of the World’s Biggest Banks Form Blockchain Partnership” at Re/code. Led by financial technology firm R3, banks are signing on to apply blockchain tech to the financial markets. A few of the banks involved so far include Goldman Sachs, Barclays, JP Morgan, Royal Bank of Scotland, Credit Suisse, and Commonwealth Bank of Australia. The article notes:
“The blockchain works as a huge, decentralized ledger of every bitcoin transaction ever made that is verified and shared by a global network of computers and therefore is virtually tamper-proof. The Bank of England has a team dedicated to it and calls it a ‘key technological innovation.’ The data that can be secured using the technology is not restricted to bitcoin transactions. Two parties could use it to exchange any other information, within minutes and with no need for a third party to verify it. [R3 CEO David] Rutter said the initial focus would be to agree on an underlying architecture, but it had not yet been decided whether that would be underpinned by bitcoin’s blockchain or another one, such as one being built by Ethereum, which offers more features than the original bitcoin technology.”
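The “virtually tamper-proof” property the article describes comes from hash chaining: each ledger entry embeds a hash of the entry before it, so altering any past record invalidates every link after it. A minimal sketch of the idea (our toy illustration, not Bitcoin’s or R3’s actual data structures):

```python
import hashlib
import json

def block_hash(contents: dict) -> str:
    # Hash the block's canonical JSON form so any change to it is detectable
    return hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    # Each new block commits to the hash of the block before it
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev}
    block["hash"] = block_hash({"data": data, "prev_hash": prev})
    chain.append(block)

def verify(chain: list) -> bool:
    # Recompute every hash; editing an earlier block breaks all links after it
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        if block["hash"] != block_hash({"data": block["data"], "prev_hash": prev}):
            return False
        prev = block["hash"]
    return True

ledger = []
append_block(ledger, "Alice pays Bob 5")
append_block(ledger, "Bob pays Carol 2")
print(verify(ledger))                      # True
ledger[0]["data"] = "Alice pays Bob 500"   # tamper with history
print(verify(ledger))                      # False
```

Bitcoin adds proof-of-work and a global network of verifiers on top of this chaining, which is what makes rewriting history prohibitively expensive rather than merely detectable.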
Rutter did mention he expects this tech to be used post-trade, not directly in exchange or OTC trading, at least not soon. It is hoped the use of blockchain technology will increase security while reducing costs and errors.
Cynthia Murrell, November 9, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Pew Report Compares News Sources: Twitter and Facebook
November 6, 2015
As newspapers fall, what is rising to take their place? Why, social media, of course. The Pew Research Center discusses its recent findings on the subject in, “The Evolving Role of News on Twitter and Facebook.” The number of Americans getting their news from these platforms continues to rise, across almost all demographic groups. The article informs us:
“The new study, conducted by Pew Research Center in association with the John S. and James L. Knight Foundation, finds that clear majorities of Twitter (63%) and Facebook users (63%) now say each platform serves as a source for news about events and issues outside the realm of friends and family. That share has increased substantially from 2013, when about half of users (52% of Twitter users, 47% of Facebook users) said they got news from the social platforms.”
The write-up describes some ways the platforms differ in their news delivery. For example, more users turn to Twitter for breaking news, while Facebook now features a “Trending” sidebar, filterable by subject. The article notes that these trends can have an important impact on our society:
“As more social networking sites recognize and adapt to their role in the news environment, each will offer unique features for news users, and these features may foster shifts in news use. Those different uses around news features have implications for how Americans learn about the world and their communities, and for how they take part in the democratic process.”
Indeed. See the article for more differences between Facebook and Twitter news consumers, complete with some percentages. You can also see the data’s barebones results in the report’s final topline. Most of the data comes from a survey conducted across two weekends last March, among 2,035 Americans aged 18 and up.
Cynthia Murrell, November 6, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Google Continues to Improve Voice Search
November 5, 2015
Google’s research arm continues to make progress on voice search. The Google Research Blog updates us in “Google Voice Search: Faster and More Accurate.” The Google Speech Team begins by referring back to 2012, when it announced its Deep Neural Network approach. The team has since built on that concept; it now employs models built on recurrent neural networks and trained with connectionist temporal classification and sequence discriminative training techniques, which it notes are fast and accurate. The write-up goes into detail about how speech recognizers work and what makes the latest iteration the best yet. I found the technical explanation fascinating, but it is too lengthy to describe here; please see the post for those details.
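For the curious, connectionist temporal classification (CTC) lets a network emit a label for every audio frame, including a “blank,” and then collapses those per-frame labels into a phoneme sequence. A toy sketch of the collapsing rule (our illustration, not Google’s code):

```python
from itertools import groupby

BLANK = "-"  # CTC's special "no phoneme here" symbol

def ctc_collapse(frame_labels):
    """Collapse per-frame CTC output: merge consecutive repeats, then drop blanks."""
    merged = [label for label, _ in groupby(frame_labels)]  # "hh-ee" -> "h-e"
    return [label for label in merged if label != BLANK]    # "h-e"   -> "he"

# Eight audio frames' worth of per-frame predictions for "hey"
frames = ["h", "h", "-", "e", "e", "-", "y", "y"]
print(ctc_collapse(frames))  # ['h', 'e', 'y']

# A blank is what lets genuinely repeated phonemes survive the merge step
print(ctc_collapse(["a", "a", "-", "a"]))  # ['a', 'a']
```

The practical payoff is that the network never needs a hand-aligned transcript telling it exactly which frame each phoneme occupies; CTC sums over all frame alignments that collapse to the right answer.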
I am still struck when I see any article mention that an algorithm has taken the initiative. This time, researchers had to rein in their model’s insightful decision:
“We now had a faster and more accurate acoustic model and were excited to launch it on real voice traffic. However, we had to solve another problem – the model was delaying its phoneme predictions by about 300 milliseconds: it had just learned it could make better predictions by listening further ahead in the speech signal! This was smart, but it would mean extra latency for our users, which was not acceptable. We solved this problem by training the model to output phoneme predictions much closer to the ground-truth timing of the speech.”
At least the AI will take direction. The post concludes:
“We are happy to announce that our new acoustic models are now used for voice searches and commands in the Google app (on Android and iOS), and for dictation on Android devices. In addition to requiring much lower computational resources, the new models are more accurate, robust to noise, and faster to respond to voice search queries – so give it a try, and happy (voice) searching!”
We always knew natural-language communication with machines would present huge challenges, ones many said could never be overcome. It seems such naysayers were mistaken.
Cynthia Murrell, November 5, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph