June 8, 2016
Discrimination or wise precaution? Perhaps both? MakeUseOf tells us, “This Is Why Tor Users Are Being Blocked by Major Websites.” A recent study (PDF) by the University of Cambridge; University of California, Berkeley; University College London; and International Computer Science Institute, Berkeley confirms that many sites are actively blocking users who approach through a known Tor exit node. Writer Philip Bates explains:
“Users are finding that they’re faced with a substandard service from some websites, CAPTCHAs and other such nuisances from others, and in further cases, are denied access completely. The researchers argue that this: ‘Degraded service [results in Tor users] effectively being relegated to the role of second-class citizens on the Internet.’ Two good examples of prejudiced hosting and content delivery firms are CloudFlare and Akamai — the latter of which either blocks Tor users or, in the case of Macys.com, infinitely redirects. CloudFlare, meanwhile, presents a CAPTCHA to prove the user isn’t a malicious bot. It identifies large amounts of traffic from an exit node, then assigns a score to an IP address that determines whether the server has a good or bad reputation. This means that innocent users are treated the same way as those with negative intentions, just because they happen to use the same exit node.”
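The scoring mechanism the quote describes can be sketched roughly as follows. This is a minimal illustration, not CloudFlare's actual system; every threshold, field name, and the scoring formula here are assumptions.

```python
from collections import defaultdict

class IPReputation:
    """Toy per-IP reputation tracker: higher score means worse reputation."""

    def __init__(self, captcha_threshold=50, block_threshold=90):
        self.abuse_events = defaultdict(int)   # abuse reports seen per IP
        self.requests = defaultdict(int)       # total requests seen per IP
        self.captcha_threshold = captcha_threshold
        self.block_threshold = block_threshold

    def record(self, ip, abusive=False):
        self.requests[ip] += 1
        if abusive:
            self.abuse_events[ip] += 1

    def score(self, ip):
        # Share of abusive traffic from this address, scaled to 0-100.
        if self.requests[ip] == 0:
            return 0
        return 100 * self.abuse_events[ip] / self.requests[ip]

    def decide(self, ip):
        s = self.score(ip)
        if s >= self.block_threshold:
            return "block"
        if s >= self.captcha_threshold:
            return "captcha"
        return "allow"
```

Because the score attaches to the address rather than the person, every user behind a shared Tor exit node inherits the worst behavior seen from that IP, which is exactly the second-class-citizen effect the study documents.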
The article goes on to discuss legitimate reasons users might want the privacy Tor provides, as well as reasons companies feel they must protect their Websites from anonymous users. Bates notes that there is not much one can do about such measures. He does point to Tor’s own Don’t Block Me project, which is working to convince sites to stop blocking people just for using Tor. It is also developing a list of best practices that concerned sites can follow, instead. One site, GameFAQs, has reportedly lifted its block, and CloudFlare may be considering a similar move. Will the momentum build, or must those who protect their online privacy resign themselves to being treated with suspicion?
Cynthia Murrell, June 8, 2016
June 3, 2016
DuckDuckGo, like a number of other online outfits, has a presence on Tor, the gateway to a part of the Internet which is actually pretty small. I read “Tor Switches to DuckDuckGo Search Results by Default.” I learned:
[F]or a while now Disconnect has no access to Google search results anymore which we used in Tor Browser. Disconnect being more a meta search engine which allows users to choose between different search providers fell back to delivering Bing search results which were basically unacceptable quality-wise. While Disconnect is still trying to fix the situation we asked them to change the fallback to DuckDuckGo as their search results are strictly better than the ones Bing delivers.
The privacy issue looms large. The write up points out:
…DuckDuckGo made a $25,000 donation to Tor which in recent times has been trying to diversify its funding away from reliance on the US government — including launching a crowdfunding campaign which pulled in just over $200,000 at the start of this year.
How private is Tor? No information about this topic appears in the write up.
Stephen E Arnold, June 3, 2016
May 30, 2016
I read “Did Google’s NHS Patient Data Deal Need Ethical Approval?” As I thought about the headline, my reaction was typically Kentucky, “Is this mom talking or what?”
The write up states:
Now, a New Scientist investigation has found that Google DeepMind deployed a medical app called Streams for monitoring kidney conditions without first contacting the relevant regulatory authority. Our investigation also asks whether an ethical approval process that covers this kind of data transfer should have been obtained, and raises questions about the basis under which Royal Free is sharing data with Google DeepMind.
I hear, “Did you clean up your room, dear?”
The notion of mining data has some charm among some folks in the UK. The opportunity to get a leg up on other outfits has some appeal to the Alphabet Google crowd.
The issue is, “Now that the horse has left the barn, what do we do about it?” Good question if you are a mom type. Ask any teenager about Friday night. Guess what you are likely to learn.
The write up continues:
Minutes from the Royal Free’s board meeting on 6 April make the trust’s relationship with DeepMind explicit: “The board had agreed to enter into a memorandum of understanding with Google DeepMind to form a strategic partnership to develop transformational analytics and artificial intelligence healthcare products building on work currently underway on an acute kidney failure application.” When New Scientist asked for a copy of the memorandum of understanding on 9 May, Royal Free pushed the request into a Freedom of Information Act request.
I recall a statement made by a US official. It may be germane to this question about medical data. The statement: “What we say is secret is secret.” Perhaps this applies to the matter in question.
I circled this passage:
The HRA confirmed to New Scientist that DeepMind had not started the approval process as of 11 May. “Google is getting data from a hospital without consent or ethical approval,” claims Smith. “There are ethical processes around what data can be used for, and for a good reason.”
And Alphabet Google’s point of view? I highlighted this paragraph:
“Section 251 assent is not required in this case,” Google said in a statement to New Scientist. “All the identifiable data under this agreement can only ever be used to assist clinicians with direct patient care and can never be used for research.”
I don’t want to draw any comparisons between the thought processes in some Silicon Valley circles and the Silicon Fen. Some questions:
- Where is that horse?
- Who owns the horse?
- What secondary products have been created from the horse?
My inner voice is saying, “Hit the butcher specializing in horse meat maybe.”
Stephen E Arnold, May 30, 2016
May 30, 2016
One of the fears about automation is that human workers will be replaced and there will no longer be jobs for them. Blue-collar jobs are believed to be the first to be automated, but bankers, financial advisors, and other workers in the financial industry have cause to worry as well. Algorithms might replace them, because apparently people are getting faster and better responses from automated bank “workers.”
Perhaps one reason bankers and financial advisors are being replaced can be found in “Big Data And Predictive Analytics: A Big Deal, Indeed” from ABA Banking Journal. One would think the financial sector would be among the first to embrace big data and analytics in order to keep the upper hand on the competition, earn more money, and maintain its relevancy in an ever-changing world. It has, however, been slow to adapt, slower than retail, search, and insurance.
One of the main reasons the financial sector has been holding back:
“There’s a host of reasons why banks have held back spending on analytics, including privacy concerns and the cost for systems and past merger integrations. Analytics also competes with other areas in tech spending; banks rank digital banking channel development and omnichannel delivery as greater technology priorities, according to Celent.”
After the above quote, the article makes a statement about how customers are moving more to online banking over visiting branches, but it is a very insipid observation. Big data and analytics offer the banks the opportunity to invest in developing better relationships with their customers and even offering more individualized services as a way to one up Silicon Valley competition. Big data also helps financial institutions comply with banking laws and standards to avoid violations.
Banks do need to play catch-up, but this is probably a lot of moaning and groaning for nothing. The financial industry will adapt, especially when it is at risk of losing more money. The same goes for all industries: adapt or get left behind. The further we move from the twentieth century, and from generations unaccustomed to digital environments, the more technology integration we will see.
May 27, 2016
Open source software is an excellent idea, because it allows programmers across the globe to share and contribute to the same project. It also creates a think-tank-like environment that can be applied (arguably) to any tech field. There is a downside to open source and creative commons software, however: it is not a sustainable model. Open Source Everything For The 21st Century discusses the issue in the post “Robert Steele: Should Open Source Code Have A PayPal Address & AON Sliding Scale Rate Sheet?”
The post explains that open source delivers an unclear message about how code is generated; it comes from the greater community rather than a few individuals. It is also not sustainable, because people need funds to survive and to maintain the software. Fair Source is one proposed solution: users are charged if the software is used at a company with fifteen or more employees, but it, too, is not sustainable.
Micro-payments, small payments of a few cents, might be the ultimate solution. Robert Steele wrote:
“I see the need for bits of code to have embedded within them both a PayPal-like address able to handle micro-payments (fractions of a cent), and a CISCO-like Application Oriented Network (AON) rules and rate sheet that can be updated globally with financial-level latency (which is to say, instantly) and full transparency. Some standards should be set for payment scales, e.g. 10 employees, 100, 1000 and up; such that a package of code with X number of coders will automatically begin to generate PayPal payments to the individual coders when the package hits N use cases within Z organizational or network structures.”
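The mechanics Steele describes, a sliding-scale rate sheet keyed to organization size, with payments split among a package's coders once usage crosses a threshold, can be sketched as below. The tier boundaries, per-use rates, threshold, and even-split rule are all invented for illustration; they are not Steele's figures.

```python
# Hypothetical sliding-scale rate sheet: (max employees, dollars per use).
# Rates here are fractions of a cent, per the quote; values are made up.
RATE_SHEET = [
    (10, 0.001),
    (100, 0.005),
    (1000, 0.02),
    (float("inf"), 0.05),
]

def rate_for(employees):
    """Pick the per-use rate for an organization of a given size."""
    for limit, rate in RATE_SHEET:
        if employees <= limit:
            return rate

def payments(coders, use_cases, employees, min_use_cases=1000):
    """Split accrued micro-payments evenly among a package's coders,
    but only once the package has hit the minimum number of use cases."""
    if use_cases < min_use_cases:
        return {c: 0.0 for c in coders}
    total = use_cases * rate_for(employees)
    share = total / len(coders)
    return {c: round(share, 6) for c in coders}
```

An even split is the simplest possible rule; a real scheme would presumably weight payouts by each coder's contribution, which is a much harder attribution problem.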
Micro-payments are not a bad idea, and they have occasionally been put into practice, though not widely. No one has really pioneered an effective system for them.
Steele is also an advocate for “…Internet access and individual access to code is a human right, devising new rules for a sharing economy in which code is a cost of doing business at a fractional level in comparison to legacy proprietary code — between 1% and 10% of what is paid now.”
It is the ideal version of the Internet, where people are able to make money from their content and creations, users’ privacy is maintained, and ethics are respected. The current trouble with YouTube channels and copyright comes to mind, as does stolen information sold on the Dark Web and the desire to eradicate online bullying.
May 25, 2016
Some of us consider the movie Minority Report to be a cautionary tale, but apparently the Chinese government sees it as more of a good suggestion. According to eTeknix, that country seems to be planning a crime-prediction unit similar to the one in the movie, except this one will use algorithms instead of psychics. We learn about the initiative from the brief write-up, “China Creating ‘Precrime’ System.” Writer Gareth Andrews informs us:
“The movie Minority Report posed an interesting question to people: if you knew that someone was going to commit a crime, would you be able to charge them for it before it even happens? If we knew you were going to pirate a video game when it goes online, does that mean we can charge you for stealing the game before you’ve even done it?
“China is looking to do just that by creating a ‘unified information environment’ where every piece of information about you would tell the authorities just what you normally do. Decide you want to do something today and it could be an indication that you are about to commit or already have committed a criminal activity.
“With machine learning and artificial intelligence being at the core of the project, predicting your activities and finding something which ‘deviates from the norm’ can be difficult for even a person to figure out. When the new system goes live, being flagged up to the authorities would be as simple as making a few purchases online….”
Indeed. Today’s tech is being used to gradually erode privacy rights around the world, all in the name of security. There is a scene in Minority Report that has stuck with me: citizens in an apartment building are shown pausing their activities to passively accept the intrusion of spider-like spy-bots into their homes, upon their very faces even, then resuming where they left off as if such an incursion were perfectly normal. If we do not pay attention, one day it may become so.
Cynthia Murrell, May 25, 2016
May 24, 2016
The article on GlobeNewsWire titled “Ex-Googler Startup DGraph Labs Raises US$1.1 Million in Seed Funding Round to Build Industry’s First Open Source, Native and Distributed Graph Database” names Bain Capital Ventures and Blackbird Ventures as the main investors in the startup. Manish Jain, founder and CEO of DGraph, worked on Google’s Knowledge Graph Infrastructure for six years. He explains the technology:
“Graph data structures store objects and the relationships between them. In these data structures, the relationship is as important as the object. Graph databases are, therefore, designed to store the relationships as first class citizens… Accessing those connections is an efficient, constant-time operation that allows you to traverse millions of objects quickly. Many companies including Google, Facebook, Twitter, eBay, LinkedIn and Dropbox use graph databases to power their smart search engines and newsfeeds.”
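The idea of storing relationships as first-class records, so that following a connection is a direct lookup rather than a join, can be illustrated with a minimal adjacency-list sketch. This is only a toy; a real graph database such as the one DGraph is building handles persistence, distribution, and indexing, none of which appear here.

```python
from collections import defaultdict

class Graph:
    """Minimal property graph: each edge is stored as a first-class
    (predicate, target) record, so traversing from a node to its
    neighbors is a direct lookup, O(1) per edge followed."""

    def __init__(self):
        self.out_edges = defaultdict(list)  # node -> [(predicate, node)]

    def add_edge(self, subject, predicate, obj):
        self.out_edges[subject].append((predicate, obj))

    def neighbors(self, node, predicate=None):
        return [o for p, o in self.out_edges[node]
                if predicate is None or p == predicate]

g = Graph()
g.add_edge("alice", "follows", "bob")
g.add_edge("bob", "follows", "carol")

# Two-hop traversal: who do the people alice follows follow?
second_degree = [f2 for f1 in g.neighbors("alice", "follows")
                 for f2 in g.neighbors(f1, "follows")]
```

The two-hop query is the kind of operation that powers the newsfeeds and smart search Jain mentions; in a relational store it would require self-joins that grow expensive as the data does.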
Applications of graph databases include the Internet of Things, behavior analysis, medical and DNA research, and AI. So what is DGraph going to do with its fresh funds? Jain wants to focus on forging a talented team of engineers and developing the company’s core technology. He notes in the article that this sort of work is hardly the typical obstacle faced by a startup, but rather the focus of major tech companies like Google or Facebook.
Chelsea Kerwin, May 24, 2016
May 18, 2016
A recent report from SimilarWeb tells us what sorts of people turn to Internet search engine DuckDuckGo, which protects users’ privacy, over a more prominent engine, Microsoft’s Bing. The Search Engine Journal summarizes the results in, “New Research Reveals Who is Using DuckDuckGo and Why.”
The study drew its conclusions by looking at the top five destinations of DuckDuckGo users: Whitehatsec.com, Github.com, NYtimes.com, 4chan.org, and YCombinator.com. Note that four of these five sites have pretty specific audiences, and compare them to the top five, more widely used, sites accessed through Bing: MSN.com, Amazon.com, Reddit.com, Google.com, and Baidu.com.
Writer Matt Southern observes:
“DuckDuckGo users also like to engage with their search engine of choice for longer periods of time — averaging 9.38 minutes spent on DuckDuckGo vs. Bing.
“Despite its growth over the past year, DuckDuckGo faces a considerable challenge when it comes to getting found by new users. Data shows the people using DuckDuckGo are those who already know about the search engine, with 93% of its traffic coming from direct visits. Only 1.5% of its traffic comes from organic search.
“Roy Hinkis of SimilarWeb concludes by saying the loyal users of DuckDuckGo are those who love tech, and they use DuckDuckGo as an alternative because they’re concerned about having their privacy protected while they search online.”
Though Southern agrees DuckDuckGo needs to do some targeted marketing, he notes traffic to the site has been rising by 22% per year. It is telling that the privacy-protecting engine is most popular among those who understand the technology.
Cynthia Murrell, May 18, 2016
May 13, 2016
Did you know that Facebook combs your content for criminal intent? American Intelligence Report reveals, “Facebook Monitors Your Private Messages and Photos for Criminal Activity, Reports them to Police.” Naturally, software is the first entity to scan content, using keywords and key phrases to flag items for human follow-up. Of particular interest are “loose” relationships. Reporter Kristan T. Harris writes:
“Reuters’ interview with the security officer explains, Facebook’s software focuses on conversations between members who have a loose relationship on the social network. For example, if two users aren’t friends, only recently became friends, have no mutual friends, interact with each other very little, have a significant age difference, and/or are located far from each other, the tool pays particular attention.
“The scanning program looks for certain phrases found in previously obtained chat records from criminals, including sexual predators (because of the Reuters story, we know of at least one alleged child predator who is being brought before the courts as a direct result of Facebook’s chat scanning). The relationship analysis and phrase material have to add up before a Facebook employee actually looks at communications and makes the final decision of whether to ping the authorities.
“’We’ve never wanted to set up an environment where we have employees looking at private communications, so it’s really important that we use technology that has a very low false-positive rate,’ Sullivan told Reuters.”
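The two-stage filter the Reuters account describes, a relationship-looseness check combined with phrase matching, with a human reviewing only when both fire, can be sketched as below. Every signal weight, phrase, and threshold here is invented for illustration; this is not Facebook's actual logic.

```python
# Phrases drawn, per the article, from previously obtained criminal chat
# records; these examples are made up.
FLAGGED_PHRASES = {"meet me alone", "don't tell anyone"}

def looseness(rel):
    """Score how 'loose' a relationship is from the signals the
    article lists: friendship, mutual friends, age gap, distance."""
    score = 0
    if not rel["are_friends"]:
        score += 2
    if rel["mutual_friends"] == 0:
        score += 1
    if rel["age_gap_years"] >= 10:
        score += 2
    if rel["distance_km"] > 500:
        score += 1
    return score

def flag_for_review(rel, message, looseness_threshold=3):
    """Queue a conversation for a human reviewer only when the
    relationship is loose AND a flagged phrase appears, which is how a
    system like this keeps its false-positive rate low."""
    loose = looseness(rel) >= looseness_threshold
    suspicious = any(p in message.lower() for p in FLAGGED_PHRASES)
    return loose and suspicious
```

Requiring both signals before any human looks at a message is the design choice Sullivan's quote defends: either signal alone would sweep in far too many innocent conversations.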
Uh-huh. So, one alleged predator has been caught. We’re told potential murder suspects have also been identified this way, with one case awash in 62 pages of Facebook-based evidence. Justice is a good thing, but Harris notes that most people will be uncomfortable with the idea of Facebook monitoring their communications. She goes on to wonder where this will lead; will it eventually be applied to misdemeanors and even, perhaps, to “thought crimes”?
Users of any social media platform must understand that anything they post could eventually be seen by anyone. Privacy policies can be updated without notice, and changes can apply to old as well as new data. And, of course, hackers are always lurking about. I was once cautioned to imagine that anything I post online I might as well be shouting on a public street; that advice has served me well.
Cynthia Murrell, May 13, 2016
May 11, 2016
Strife has plagued the human race since the beginning, but the Pentagon’s research arm thinks it may be able to get to the root of the problem. Defense Systems informs us, “DARPA Looks to Tap Social Media, Big Data to Probe the Causes of Social Unrest.” Writer George Leopold explains:
“The Defense Advanced Research Projects Agency (DARPA) announced this week it is launching a social science research effort designed to probe what unifies individuals and what causes communities to break down into ‘a chaotic mix of disconnected individuals.’ The Next Generation Social Science (NGS2) program will seek to harness steadily advancing digital connections and emerging social and data science tools to identify ‘the primary drivers of social cooperation, instability and resilience.’
“Adam Russell, DARPA’s NGS2 program manager, said the effort also would address current research limitations such as the technical and logistical hurdles faced when studying large populations and ever-larger datasets. The project seeks to build on the ability to link thousands of diverse volunteers online in order to tackle social science problems with implications for U.S. national and economic security.”
The initiative aims to blend social science research with the hard sciences, including computer and data science. Virtual reality, Web-based gaming, and other large platforms will come into play. Researchers hope their findings will make it easier to study large and diverse populations. NGS2 funding will emphasize predictive modeling, experimental structures, and boosting the interpretation and reproducibility of results.
Will it be the Pentagon that finally finds the secret to world peace?
Cynthia Murrell, May 11, 2016