
Alphabet Google: An NHS Explainer

May 30, 2016

I read “Did Google’s NHS Patient Data Deal Need Ethical Approval?” As I thought about the headline, my reaction was typically Kentucky, “Is this mom talking or what?”

The write up states:

Now, a New Scientist investigation has found that Google DeepMind deployed a medical app called Streams for monitoring kidney conditions without first contacting the relevant regulatory authority. Our investigation also asks whether an ethical approval process that covers this kind of data transfer should have been obtained, and raises questions about the basis under which Royal Free is sharing data with Google DeepMind.

I hear, “Did you clean up your room, dear?”

The notion of mining data has some charm among some folks in the UK. The opportunity to get a leg up on other outfits has some appeal to the Alphabet Google crowd.

The issue is, “Now that the horse has left the barn, what do we do about it?” Good question if you are a mom type. Ask any teenager about Friday night. Guess what you are likely to learn.

The write up continues:

Minutes from the Royal Free’s board meeting on 6 April make the trust’s relationship with DeepMind explicit: “The board had agreed to enter into a memorandum of understanding with Google DeepMind to form a strategic partnership to develop transformational analytics and artificial intelligence healthcare products building on work currently underway on an acute kidney failure application.” When New Scientist asked for a copy of the memorandum of understanding on 9 May, Royal Free pushed the request into a Freedom of Information Act request.

I recall a statement made by a US official. It may be germane to this question about medical data. The statement: “What we say is secret is secret.” Perhaps this applies to the matter in question.

I circled this passage:

The HRA confirmed to New Scientist that DeepMind had not started the approval process as of 11 May. “Google is getting data from a hospital without consent or ethical approval,” claims Smith. “There are ethical processes around what data can be used for, and for a good reason.”

And Alphabet Google’s point of view? I highlighted this paragraph:

“Section 251 assent is not required in this case,” Google said in a statement to New Scientist. “All the identifiable data under this agreement can only ever be used to assist clinicians with direct patient care and can never be used for research.”

I don’t want to draw any comparisons between the thought processes in some Silicon Valley circles and the Silicon Fen. Some questions:

  • Where is that horse?
  • Who owns the horse?
  • What secondary products have been created from the horse?

My inner voice is saying, “Hit the butcher specializing in horse meat maybe.”

Stephen E Arnold, May 30, 2016

Financial Institutions Finally Realize Big Data Is Important

May 30, 2016

One of the fears of automation is that human workers will be replaced and there will no longer be jobs for people.  Blue-collar jobs are believed to be the first that will be automated, but bankers, financial advisors, and other workers in the financial industry also have cause to worry.  Algorithms might replace them, because people apparently get faster and better responses from automated bank “workers”.

Perhaps one reason bankers and financial advisors are being replaced is their belated realization that, as ABA Banking Journal puts it, “Big Data And Predictive Analytics: A Big Deal, Indeed.”  One would think the financial sector would be among the first to embrace big data and analytics in order to keep an upper hand on the competition, earn more money, and maintain its relevancy in an ever-changing world.  It has, however, been slow to adapt, slower than retail, search, and insurance.

One of the main reasons banks have been holding back is:

“There’s a host of reasons why banks have held back spending on analytics, including privacy concerns and the cost for systems and past merger integrations. Analytics also competes with other areas in tech spending; banks rank digital banking channel development and omnichannel delivery as greater technology priorities, according to Celent.”

After that quote, the article observes that customers are shifting from branch visits to online banking, but it is a rather insipid observation.  Big data and analytics offer banks the opportunity to build better relationships with their customers and to offer more individualized services as a way to one-up the Silicon Valley competition.  Big data also helps financial institutions comply with banking laws and standards and avoid violations.

Banks do need to play catch-up, but this is probably a lot of moaning and groaning over nothing.  The financial industry will adapt, especially when it is at risk of losing more money.  The same goes for every industry: adapt or get left behind.  The further we move from the twentieth century and from generations unaccustomed to digital environments, the more technology integration we will see.

Whitney Grace, May 30, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Open Source Software Needs a Micro-Payment Program

May 27, 2016

Open source software is an excellent idea because it allows programmers across the globe to share and contribute to the same project.  It also creates a think-tank-like environment that can (arguably) be applied to any tech field.  There is a downside to open source and creative commons software, however: it is not a sustainable model.  Open Source Everything For The 21st Century discusses the issue in its post “Robert Steele: Should Open Source Code Have A PayPal Address & AON Sliding Scale Rate Sheet?”

The post explains that open source delivers an unclear message about how code is generated: it comes from the greater whole rather than from a few people.  It also is not sustainable, because people need funds to survive and to maintain the open source software.  Fair Source is a reasonable alternative: users are charged if the software is used at a company with fifteen or more employees, but it too is not sustainable.

Micro-payments, tiny payments of a cent or less, might be the ultimate solution.  Robert Steele wrote:

“I see the need for bits of code to have embedded within them both a PayPal-like address able to handle micro-payments (fractions of a cent), and a CISCO-like Application Oriented Network (AON) rules and rate sheet that can be updated globally with financial-level latency (which is to say, instantly) and full transparency. Some standards should be set for payment scales, e.g. 10 employees, 100, 1000 and up; such that a package of code with X number of coders will automatically begin to generate PayPal payments to the individual coders when the package hits N use cases within Z organizational or network structures.”

Micro-payments are not a bad idea, and they have occasionally been put into practice, though not very widely.  No one has really pioneered an effective system for them.
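To make the rate-sheet idea concrete, here is a minimal sketch, in Python, of how a package might meter usage and split micro-payments among its coders. The tiers, rates, and function names are invented for illustration; nothing here reflects an actual PayPal or Cisco AON interface.

    # Hypothetical sketch of the rate-sheet idea: an organization-size tier maps
    # to a per-use micro-payment, and accrued payments are split among a
    # package's coders. All tiers, rates, and numbers are invented.
    RATE_SHEET = {        # max employees -> payment per use case, in US dollars
        10: 0.001,
        100: 0.01,
        1000: 0.05,
    }

    def rate_for(employees):
        """Return the per-use rate for the smallest tier that covers the org."""
        for tier in sorted(RATE_SHEET):
            if employees <= tier:
                return RATE_SHEET[tier]
        return 0.10  # catch-all rate for very large organizations

    def split_payments(employees, use_cases, coders):
        """Split the accrued micro-payments evenly among a package's coders."""
        total = rate_for(employees) * use_cases
        share = round(total / len(coders), 4)
        return {coder: share for coder in coders}

    # Example: a 120-person firm logs 5,000 uses of a package with three coders.
    print(split_payments(120, 5000, ["alice", "bob", "carol"]))

In a real system, the rate sheet would presumably live behind something like the AON-style rules layer Steele describes, so rates could be updated globally without touching the code itself.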

Steele also argues that “…Internet access and individual access to code is a human right, devising new rules for a sharing economy in which code is a cost of doing business at a fractional level in comparison to legacy proprietary code — between 1% and 10% of what is paid now.”

It is an idealized version of the Internet, where people are able to make money from their content and creations, users’ privacy is maintained, and ethics are respected.  The current trouble with YouTube channels and copyright comes to mind, as do stolen information sold on the Dark Web and the ongoing effort to eradicate online bullying.

 

Whitney Grace, May 27, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

China Reportedly Planning Its Own Precrime System

May 25, 2016

Some of us consider the movie Minority Report to be a cautionary tale, but apparently the Chinese government sees it as more of a good suggestion. According to eTeknix, that country seems to be planning a crime-prediction unit similar to the one in the movie, except this one will use algorithms instead of psychics. We learn about the initiative from the brief write-up, “China Creating ‘Precrime’ System.” Writer Gareth Andrews informs us:

“The movie Minority Report posed an interesting question to people: if you knew that someone was going to commit a crime, would you be able to charge them for it before it even happens? If we knew you were going to pirate a video game when it goes online, does that mean we can charge you for stealing the game before you’ve even done it?

“China is looking to do just that by creating a ‘unified information environment’ where every piece of information about you would tell the authorities just what you normally do. Decide you want to do something today and it could be an indication that you are about to commit or already have committed a criminal activity.

“With machine learning and artificial intelligence being at the core of the project, predicting your activities and finding something which ‘deviates from the norm’ can be difficult for even a person to figure out. When the new system goes live, being flagged up to the authorities would be as simple as making a few purchases online….”

Indeed. Today’s tech is being used to gradually erode privacy rights around the world, all in the name of security. There is a scene in that movie that has stuck with me: Citizens in an apartment building are shown pausing their activities to passively accept the intrusion of spider-like spy-bots into their homes, upon their very faces even, then resuming where they left off as if such an incursion were perfectly normal. If we do not pay attention, one day it may become so.
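For what it is worth, “deviates from the norm” usually boils down to garden-variety anomaly detection. The Python sketch below is a purely hypothetical illustration of that concept, not a description of any system China has built: a value is flagged when it sits several standard deviations away from a person’s historical average.

    from statistics import mean, stdev

    def deviates_from_norm(history, new_value, z_threshold=3.0):
        """Flag a value more than z_threshold standard deviations from the
        historical mean -- a crude stand-in for 'deviates from the norm'."""
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return new_value != mu
        return abs(new_value - mu) / sigma > z_threshold

    weekly_purchases = [3, 4, 2, 5, 3, 4, 3]          # invented history
    print(deviates_from_norm(weekly_purchases, 40))   # True: an outlier week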

 

Cynthia Murrell, May 25, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

DGraph Labs Startup Aims to Fill Gap in Graph Database Market

May 24, 2016

The GlobeNewsWire article titled “Ex-Googler Startup DGraph Labs Raises US$1.1 Million in Seed Funding Round to Build Industry’s First Open Source, Native and Distributed Graph Database” names Bain Capital Ventures and Blackbird Ventures as the main investors in the startup. Manish Jain, founder and CEO of DGraph, worked on Google’s Knowledge Graph Infrastructure for six years. He explains the technology:

“Graph data structures store objects and the relationships between them. In these data structures, the relationship is as important as the object. Graph databases are, therefore, designed to store the relationships as first class citizens… Accessing those connections is an efficient, constant-time operation that allows you to traverse millions of objects quickly. Many companies including Google, Facebook, Twitter, eBay, LinkedIn and Dropbox use graph databases to power their smart search engines and newsfeeds.”
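For readers unfamiliar with the structure Jain describes, here is a minimal Python sketch of a graph stored as explicit edge lists, where relationships are first-class data and traversal is a cheap lookup. It illustrates the general idea only and has nothing to do with DGraph’s actual implementation.

    from collections import defaultdict

    # A toy graph: each edge is stored explicitly as (predicate, object), so
    # "who follows whom" is a direct lookup rather than a relational join.
    edges = defaultdict(list)

    def add_edge(subject, predicate, obj):
        edges[subject].append((predicate, obj))

    add_edge("alice", "follows", "bob")
    add_edge("bob", "follows", "carol")
    add_edge("alice", "likes", "post42")

    def traverse(start, predicate, depth=2):
        """Walk 'predicate' edges outward from a node, up to 'depth' hops."""
        frontier, seen = {start}, set()
        for _ in range(depth):
            frontier = {obj for node in frontier
                        for pred, obj in edges[node] if pred == predicate}
            seen |= frontier
        return seen

    print(traverse("alice", "follows"))   # {'bob', 'carol'}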

The many applications of graph databases include the Internet of Things, behavior analysis, medical and DNA research, and AI. So what is DGraph going to do with its fresh funds? Jain wants to focus on forging a talented team of engineers and developing the company’s core technology. He notes in the article that this sort of work is hardly the typical challenge faced by a startup; it is more often the province of major tech companies like Google or Facebook.

 

Chelsea Kerwin, May 24, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Tech Savvy Users Turn to DuckDuckGo

May 18, 2016

A recent report from SimilarWeb tells us what sorts of people turn to Internet search engine DuckDuckGo, which protects users’ privacy, over a more prominent engine, Microsoft’s Bing. The Search Engine Journal summarizes the results in, “New Research Reveals Who is Using DuckDuckGo and Why.”

The study drew its conclusions by looking at the top five destinations of DuckDuckGo users: Whitehatsec.com, Github.com, NYtimes.com,  4chan.org, and  YCombinator.com. Note that four of these five sites have pretty specific audiences, and compare them to the top five, more widely used, sites accessed through Bing: MSN.com, Amazon.com, Reddit.com, Google.com, and Baidu.com.

Writer Matt Southern observes:

“DuckDuckGo users also like to engage with their search engine of choice for longer periods of time — averaging 9.38 minutes spent on DuckDuckGo vs. Bing.

“Despite its growth over the past year, DuckDuckGo faces a considerable challenge when it comes to getting found by new users. Data shows the people using DuckDuckGo are those who already know about the search engine, with 93% of its traffic coming from direct visits. Only 1.5% of its traffic comes from organic search.

“Roy Hinkis of SimilarWeb concludes by saying the loyal users of DuckDuckGo are those who love tech, and they use DuckDuckGo as an alternative because they’re concerned about having their privacy protected while they search online.”

Though Southern agrees DuckDuckGo needs to do some targeted marketing, he notes traffic to the site has been rising by 22% per year.  It is telling that the privacy-protecting engine is most popular among those who understand the technology.

 

Cynthia Murrell, May 18, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Facebook and Law Enforcement in Cahoots

May 13, 2016

Did you know that Facebook combs your content for criminal intent? American Intelligence Report reveals, “Facebook Monitors Your Private Messages and Photos for Criminal Activity, Reports them to Police.” Naturally, software is the first entity to scan content, using keywords and key phrases to flag items for human follow-up. Of particular interest are “loose” relationships. Reporter Kristan T. Harris writes:

“Reuters’ interview with the security officer explains, Facebook’s software focuses on conversations between members who have a loose relationship on the social network. For example, if two users aren’t friends, only recently became friends, have no mutual friends, interact with each other very little, have a significant age difference, and/or are located far from each other, the tool pays particular attention.

“The scanning program looks for certain phrases found in previously obtained chat records from criminals, including sexual predators (because of the Reuters story, we know of at least one alleged child predator who is being brought before the courts as a direct result of Facebook’s chat scanning). The relationship analysis and phrase material have to add up before a Facebook employee actually looks at communications and makes the final decision of whether to ping the authorities.

“’We’ve never wanted to set up an environment where we have employees looking at private communications, so it’s really important that we use technology that has a very low false-positive rate,’ Sullivan told Reuters.”

Uh-huh. So, one alleged predator has been caught. We’re told potential murder suspects have also been identified this way, with one case awash in 62 pages of Facebook-based evidence. Justice is a good thing, but Harris notes that most people will be uncomfortable with the idea of Facebook monitoring their communications. She goes on to wonder where this will lead; will it eventually be applied to misdemeanors and even, perhaps, to “thought crimes”?
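Purely as an illustration of the two-stage approach described above, a flagging heuristic might look something like the Python sketch below. The feature names, weights, thresholds, and phrase list are all invented; Facebook has not published its actual rules.

    # Hypothetical two-stage filter: score how "loose" the relationship is,
    # check the text against previously obtained suspect phrases, and only
    # queue the conversation for human review when both signals add up.
    SUSPECT_PHRASES = {"example banned phrase", "another flagged phrase"}

    def relationship_score(pair):
        """Crude looseness score built from the cues mentioned in the article."""
        score = 0
        score += 2 if not pair["are_friends"] else 0
        score += 1 if pair["mutual_friends"] == 0 else 0
        score += 1 if pair["days_since_friending"] < 7 else 0
        score += 1 if pair["age_gap_years"] > 20 else 0
        score += 1 if pair["distance_km"] > 500 else 0
        return score

    def needs_human_review(pair, message, threshold=3):
        phrase_hit = any(p in message.lower() for p in SUSPECT_PHRASES)
        return phrase_hit and relationship_score(pair) >= threshold

    pair = {"are_friends": False, "mutual_friends": 0,
            "days_since_friending": 9999, "age_gap_years": 25,
            "distance_km": 1200}
    print(needs_human_review(pair, "This contains another flagged phrase."))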

Users of any social media platform must understand that anything they post could eventually be seen by anyone. Privacy policies can be updated without notice, and changes can apply to old as well as new data. And, of course, hackers are always lurking about. I was once cautioned to imagine that anything I post online might as well be shouted on a public street; that advice has served me well.

 

Cynthia Murrell, May 13, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

DARPA Seeks Keys to Peace with High-Tech Social Science Research

May 11, 2016

Strife has plagued the human race since the beginning, but the Pentagon’s research arm thinks it may be able to get to the root of the problem. Defense Systems informs us, “DARPA Looks to Tap Social Media, Big Data to Probe the Causes of Social Unrest.” Writer George Leopold explains:

“The Defense Advanced Research Projects Agency (DARPA) announced this week it is launching a social science research effort designed to probe what unifies individuals and what causes communities to break down into ‘a chaotic mix of disconnected individuals.’ The Next Generation Social Science (NGS2) program will seek to harness steadily advancing digital connections and emerging social and data science tools to identify ‘the primary drivers of social cooperation, instability and resilience.’

“Adam Russell, DARPA’s NGS2 program manager, said the effort also would address current research limitations such as the technical and logistical hurdles faced when studying large populations and ever-larger datasets. The project seeks to build on the ability to link thousands of diverse volunteers online in order to tackle social science problems with implications for U.S. national and economic security.”

The initiative aims to blend social science research with the hard sciences, including computer and data science. Virtual reality, Web-based gaming, and other large platforms will come into play. Researchers hope their findings will make it easier to study large and diverse populations. NGS2 funding will emphasize predictive modeling, experimental structures, and boosting the interpretation and reproducibility of results.

Will it be the Pentagon that finally finds the secret to world peace?

 

Cynthia Murrell, May 11, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

Artificial Intelligence Spreading to More Industries

May 10, 2016

According to MIT Technology Review, it has finally happened. No longer is artificial intelligence the purview of data wonks alone: “AI Hits the Mainstream,” the publication declares. Targeted AI software is now being created for fields from insurance to manufacturing to health care. Reporter Nanette Byrnes is curious to see how commercialization will affect artificial intelligence, as well as how this technology will change different industries.

What about the current state of the AI field? Byrnes writes:

“Today the industry selling AI software and services remains a small one. Dave Schubmehl, research director at IDC, calculates that sales for all companies selling cognitive software platforms —excluding companies like Google and Facebook, which do research for their own use—added up to $1 billion last year. He predicts that by 2020 that number will exceed $10 billion. Other than a few large players like IBM and Palantir Technologies, AI remains a market of startups: 2,600 companies, by Bloomberg’s count. That’s because despite rapid progress in the technologies collectively known as artificial intelligence—pattern recognition, natural language processing, image recognition, and hypothesis generation, among others—there still remains a long way to go.”
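For context, growing from roughly $1 billion in 2015 to more than $10 billion by 2020 implies a compound annual growth rate of about 58 percent. The back-of-the-envelope arithmetic (mine, not IDC’s) is below.

    # Implied compound annual growth rate for a market going from $1B (2015)
    # to $10B (2020). Back-of-the-envelope math, not an IDC figure.
    start, end, years = 1.0, 10.0, 5
    cagr = (end / start) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.1%}")   # about 58.5%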

The article examines ways some companies are already using artificial intelligence. For example, insurance and financial firm USAA is investigating its use to prevent identity theft, while GE is now using it to detect damage to airplane engine blades. Byrnes also points to MyFitnessPal, Under Armour’s extremely successful diet and exercise tracking app. Through a deal with IBM, Under Armour is blending data from that site with outside research to help better target potential consumers.

The article wraps up by reassuring us that, despite science fiction assertions to the contrary, machine learning will always require human guidance. If you doubt that, consider recent events: Google’s self-driving car’s errant lane change and Microsoft’s racist chatbot. It is clear the kids still need us, at least for now.

 

Cynthia Murrell, May 10, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

New Criminal Landscape Calls for New Approaches

May 9, 2016

The Oxford University Press’s blog discusses law enforcement’s interest in the shady side of the Internet in its post, “Infiltrating the Dark Web.” Writer Andrew Staniforth observes that the growth of crime on the Dark Web calls for new tactics. He writes:

“Criminals conducting online abuses, thefts, frauds, and terrorism have already shown their capacity to defeat Information Communication Technology (ICT) security measures, as well as displaying an indifference to national or international laws designed to stop them. The uncomfortable truth is that as long as online criminal activities remain profitable, the miscreants will continue, and as long as technology advances, the plotters and conspirators who frequent the Dark Web will continue to evolve at a pace beyond the reach of traditional law enforcement methods.

“There is, however, some glimmer of light amongst the dark projection of cybercrime as a new generation of cyber-cops are fighting back. Nowhere is this more apparent than the newly created Joint Cybercrime Action Taskforce (J-CAT) within Europol, who now provide a dynamic response to strengthen the fight against cybercrime within the European Union and beyond Member States borders. J-CAT seeks to stimulate and facilitate the joint identification, prioritisation, and initiation of cross-border investigations against key cybercrime threats and targets – fulfilling its mission to pro-actively drive intelligence-led actions against those online users with criminal intentions.”

The article holds up J-CAT as a model for fighting cybercrime. It also emphasizes the importance of allocating resources for gathering intelligence, and notes that agencies are increasingly focused on solutions that can operate in mobile and cloud environments. Increased collaboration, however, may make the biggest difference in the fight against criminals operating on the Dark Web.

 

Cynthia Murrell, May 9, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 
