Rare Sighting in Silicon Valley: A Unicorn
July 8, 2016
Unicorns are mythical creatures with a whole slew of folklore surrounding them, but in modern language the horned beast has become a metaphor for a rare occurrence. North Korea’s state-controlled media once claimed that Kim Jong Un spotted a unicorn, but Fortune tells us that a unicorn was spotted in California’s Silicon Valley: “The SEC Wants Unicorns To Stop Bragging About Their Valuations”.
Unicorns in the tech world are Silicon Valley companies valued at more than one billion dollars. In some folklore, unicorns are vain creatures that love to be admired; the same can be said of Silicon Valley companies and startups as they brag to investors about their valuations. Mary Jo White of the SEC said she wanted them to stop blowing hot air.
“ ‘The concern is whether the prestige associated with reaching a sky-high valuation fast drives companies to try to appear more valuable than they actually are,’ she said.”
The SEC cannot regulate private unicorns the way it regulates publicly traded companies, but it still values protecting investors and facilitating capital formation. Silicon Valley unicorns have secondary markets forming around their pre-IPO status, the status they hold before their shares trade on the public market. These secondary markets use derivative contracts, which can contribute to misconceptions about a company’s value. White wants the unicorns to realize that once they go public they will need to protect their investors with better structures and controls for their daily operations.
Another bit of unicorn folklore is that the creatures are recognized as symbols of truth. So while the braggart metaphor is accurate, the truthful aspect is not.
Whitney Grace, July 8 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Publicly Available Information Is Considered Leaked When on Dark Web
July 7, 2016
What happens when publicly available information is leaked to the Dark Web? This happened recently with staff contact information from the University of Liverpool, according to the article “Five secrets about the Dark Web you didn’t know” from CloudPro. The piece speaks to the perception that the Dark Web is a risky place even for information that is already publicly available. The author reports on how the information was compromised,
“A spokeswoman said: “We detected an automated cyber-attack on one of our departmental online booking systems, which resulted in publically available data – surname, email, and business telephone numbers – being released on the internet. We take the security of all university-related data very seriously and routinely test our systems to ensure that all data is protected effectively. We supported the Regional Organised Crime Unit (TITAN) in their investigations into this issue and reported the case to the Information Commissioner’s Office.”
Data security only continues to grow in importance and as a concern for large enterprises and organizations. This incident is an interesting case, and the one story in the piece we had not already seen published again and again, because it illustrates the public perception of the Dark Web as a playground for illicit activity. It also raises the question of which online landscapes are considered public and which private.
Megan Feil, July 7, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Hacking Team Cannot Sell Spyware
June 27, 2016
I do not like spyware. Once it is downloaded onto your computer, it is a pain to delete and it even steals personal information. I think it should be illegal to make, but some good comes from spyware if it is in the right hands (ideally). Some companies make and sell spyware to government agencies. One of them is Hacking Team, which recently received some bad news, says Naked Security in “Hacking Team Loses Global License To Sell Spyware.”
You might remember Hacking Team from 2015, when its systems were hacked and 500 gigabytes of internal files, emails, and product source code were posted online. The security company has spent the past year trying to repair its reputation, but the Italian Ministry of Economic Development dealt it another blow. The ministry revoked Hacking Team’s “global authorization” to sell its Remote Control System spyware suite to forty-six countries. Hacking Team can still sell within the European Union and expects to receive approval to sell outside the EU.
“MISE told Motherboard that it was aware that in 2015 Hacking Team had exported its products to Malaysia, Egypt, Thailand, Kazakhstan, Vietnam, Lebanon and Brazil.
The ministry explained that “in light of changed political situations” in “one of” those countries, MISE and the Italian Foreign Affairs, Interior and Defense ministries decided Hacking Team would require “specific individual authorization.” Hacking Team maintains that it does not sell its spyware to governments or government agencies where there is “objective evidence or credible concerns” of human rights violations.”
Hacking Team said that if it suspects any of its products were used to cause harm, or that customers violated the contract terms, it immediately suspends support. Privacy International does not believe that Hacking Team’s self-regulation is enough.
The dispute points to the old argument that software is just a tool and that it is humans who cause the problems.
Whitney Grace, June 27, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Banks as New Dark Web Educators
June 15, 2016
The Dark Web and the deep web often get misidentified and confused by readers. To take a step back, TransUnion’s blog offers a brief read called “The Dark Web & Your Data: Facts to Know” that helpfully covers some basic information on these topics. First, a definition of the Dark Web: sites accessible only when a computer’s unique IP address is hidden on multiple levels. Specific software is needed to access the Dark Web because that software is what conceals the machine’s IP address. The article continues,
“Certain software programs allow the IP address to be hidden, which provides anonymity as to where, or by whom, the site is hosted. The anonymous nature of the dark web makes it a haven for online criminals selling illegal products and services, as well as a marketplace for stolen data. The dark web is often confused with the “deep web,” the latter of which makes up about 90 percent of the Internet. The deep web consists of sites not reachable by standard search engines, including encrypted networks or password-protected sites like email accounts. The dark web also exists within this space and accounts for approximately less than 1 percent of web content.”
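As a concrete illustration of the “specific software” point: a local Tor client (the Tor Browser bundles one) exposes a SOCKS proxy, usually on port 9050, and applications reach Dark Web sites by routing their traffic through it. The sketch below is mine, not TransUnion’s, and assumes a Tor daemon is running locally and that the Python requests library is installed with SOCKS support.

```python
import requests

# Assumes a local Tor daemon listening on its default SOCKS port (9050).
# The "socks5h" scheme makes DNS resolution happen inside Tor as well,
# which is what lets .onion hostnames resolve at all.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def fetch_via_tor(url: str) -> str:
    """Fetch a URL through the Tor network; works for .onion addresses too."""
    response = requests.get(url, proxies=TOR_PROXIES, timeout=60)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    # check.torproject.org reports whether the request really arrived via Tor.
    print(fetch_via_tor("https://check.torproject.org/")[:200])
```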
For those not reading news about the Dark Web every day, this seems like a fine piece for brushing up on cybersecurity concerns at the individual user level. TransUnion has its finger on the pulse in educating its clients, since banks are an evergreen target for cybercrime and security breaches. The message of this posting to clients can be read as one of the “good luck” variety.
Megan Feil, June 15, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Websites Found to Be Blocking Tor Traffic
June 8, 2016
Discrimination or wise precaution? Perhaps both? MakeUseOf tells us, “This Is Why Tor Users Are Being Blocked by Major Websites.” A recent study (PDF) by the University of Cambridge; University of California, Berkeley; University College London; and International Computer Science Institute, Berkeley confirms that many sites are actively blocking users who approach through a known Tor exit node. Writer Philip Bates explains:
“Users are finding that they’re faced with a substandard service from some websites, CAPTCHAs and other such nuisances from others, and in further cases, are denied access completely. The researchers argue that this: ‘Degraded service [results in Tor users] effectively being relegated to the role of second-class citizens on the Internet.’ Two good examples of prejudice hosting and content delivery firms are CloudFlare and Akamai — the latter of which either blocks Tor users or, in the case of Macys.com, infinitely redirects. CloudFlare, meanwhile, presents CAPTCHA to prove the user isn’t a malicious bot. It identifies large amounts of traffic from an exit node, then assigns a score to an IP address that determines whether the server has a good or bad reputation. This means that innocent users are treated the same way as those with negative intentions, just because they happen to use the same exit node.”
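For context, the Tor Project publishes the addresses of current exit nodes, which is how hosting and delivery firms can recognize Tor traffic at all. The sketch below is my own toy version of that blanket treatment, not code from the study; the list URL and the all-or-nothing rule are assumptions for illustration.

```python
import urllib.request

# The Tor Project publishes a plain-text list of current exit-node IPs.
# (This URL is my assumption of the bulk exit-list endpoint; treat it as illustrative.)
EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"

def load_exit_nodes() -> set:
    """Download the published list of Tor exit-node IP addresses."""
    with urllib.request.urlopen(EXIT_LIST_URL, timeout=30) as resp:
        return {line.strip() for line in resp.read().decode().splitlines() if line.strip()}

def classify_visitor(ip: str, exit_nodes: set) -> str:
    """Toy version of the behavior the study criticizes: every visitor arriving
    from a known exit node gets the same treatment, regardless of intent."""
    return "challenge_with_captcha" if ip in exit_nodes else "allow"

if __name__ == "__main__":
    nodes = load_exit_nodes()
    print(classify_visitor("203.0.113.7", nodes))  # documentation-range IP, almost certainly "allow"
```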
The article goes on to discuss legitimate reasons users might want the privacy Tor provides, as well as reasons companies feel they must protect their Websites from anonymous users. Bates notes that there is not much one can do about such measures. He does point to Tor’s own Don’t Block Me project, which is working to convince sites to stop blocking people just for using Tor. It is also developing a list of best practices that concerned sites can follow, instead. One site, GameFAQs, has reportedly lifted its block, and CloudFlare may be considering a similar move. Will the momentum build, or must those who protect their online privacy resign themselves to being treated with suspicion?
Cynthia Murrell, June 8, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
DuckDuckGo Tor Search
June 3, 2016
DuckDuckGo, like a number of other online outfits, has a presence on Tor, the gateway to the part of the Internet which is actually pretty small. I read “Tor Switches to DuckDuckGo Search Results by Default.” I learned:
[F]or a while now Disconnect has no access to Google search results anymore which we used in Tor Browser. Disconnect being more a meta search engine which allows users to choose between different search providers fell back to delivering Bing search results which were basically unacceptable quality-wise. While Disconnect is still trying to fix the situation we asked them to change the fallback to DuckDuckGo as their search results are strictly better than the ones Bing delivers.
The privacy issue looms large. The write up points out:
…DuckDuckGo made a $25,000 donation to Tor which in recent times has been trying to diversify its funding away from reliance on the US government — including launching a crowdfunding campaign which pulled in just over $200,000 at the start of this year.
How private is Tor? No information about this topic appears in the write up.
Stephen E Arnold, June 3, 2016
Alphabet Google: An NHS Explainer
May 30, 2016
I read “Did Google’s NHS Patient Data Deal Need Ethical Approval?” As I thought about the headline, my reaction was typically Kentucky, “Is this mom talking or what?”
The write up states:
Now, a New Scientist investigation has found that Google DeepMind deployed a medical app called Streams for monitoring kidney conditions without first contacting the relevant regulatory authority. Our investigation also asks whether an ethical approval process that covers this kind of data transfer should have been obtained, and raises questions about the basis under which Royal Free is sharing data with Google DeepMind.
I hear, “Did you clean up your room, dear?”
The notion of mining data has some charm among some folks in the UK. The opportunity to get a leg up on other outfits has some appeal to the Alphabet Google crowd.
The issue is, “Now that the horse has left the barn, what do we do about it?” Good question if you are a mom type. Ask any teenager about Friday night. Guess what you are likely to learn.
The write up continues:
Minutes from the Royal Free’s board meeting on 6 April make the trust’s relationship with DeepMind explicit: “The board had agreed to enter into a memorandum of understanding with Google DeepMind to form a strategic partnership to develop transformational analytics and artificial intelligence healthcare products building on work currently underway on an acute kidney failure application.” When New Scientist asked for a copy of the memorandum of understanding on 9 May, Royal Free pushed the request into a Freedom of Information Act request.
I recall a statement made by a US official. It may be germane to this question about medical data. The statement: “What we say is secret is secret.” Perhaps this applies to the matter in question.
I circled this passage:
The HRA confirmed to New Scientist that DeepMind had not started the approval process as of 11 May. “Google is getting data from a hospital without consent or ethical approval,” claims Smith. “There are ethical processes around what data can be used for, and for a good reason.”
And Alphabet Google’s point of view? I highlighted this paragraph:
“Section 251 assent is not required in this case,” Google said in a statement to New Scientist. “All the identifiable data under this agreement can only ever be used to assist clinicians with direct patient care and can never be used for research.”
I don’t want to draw any comparisons between the thought processes in some Silicon Valley circles and the Silicon Fen. Some questions:
- Where is that horse?
- Who owns the horse?
- What secondary products have been created from the horse?
My inner voice is saying, “Hit the butcher specializing in horse meat maybe.”
Stephen E Arnold, May 30, 2016
Financial Institutions Finally Realize Big Data Is Important
May 30, 2016
One of the fears about automation is that human workers will be replaced and there will no longer be jobs for humanity. Blue-collar jobs are believed to be the first that will be automated, but bankers, financial advisors, and other workers in the financial industry have cause to worry as well. Algorithms might replace them, because apparently people are getting faster and better responses from automated bank “workers”.
Perhaps one reason bankers and financial advisors are being replaced is the industry’s belated realization that “Big Data And Predictive Analytics: A Big Deal, Indeed,” as ABA Banking Journal puts it. One would think the financial sector would be the first to embrace big data and analytics in order to keep an upper hand on the competition, earn more money, and maintain its relevancy in an ever-changing world. It has, however, been slow to adapt, slower than retail, search, and insurance.
One of the main reasons the financial sector has been holding back:
“There’s a host of reasons why banks have held back spending on analytics, including privacy concerns and the cost for systems and past merger integrations. Analytics also competes with other areas in tech spending; banks rank digital banking channel development and omnichannel delivery as greater technology priorities, according to Celent.”
After the above quote, the article makes a statement about how customers are moving to online banking over visiting branches, but it is a very insipid observation. Big data and analytics offer banks the opportunity to invest in developing better relationships with their customers and even to offer more individualized services as a way to one-up Silicon Valley competition. Big data also helps financial institutions comply with banking laws and standards to avoid violations.
Banks do need to play catch-up, but this is probably a lot of moan and groan for nothing. The financial industry will adapt, especially when it is at risk of losing more money. The same holds for every industry: adapt or get left behind. The further we move from the twentieth century and from the generations unaccustomed to digital environments, the more technology integration we will see.
Whitney Grace, May 30, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Open Source Software Needs a Micro-Payment Program
May 27, 2016
Open source software is an excellent idea because it allows programmers across the globe to share and contribute to the same project. It also creates a think-tank-like environment that can (arguably) be applied to any tech field. There is a downside to open source and creative commons software, though: it is not a sustainable model. Open Source Everything For The 21st Century discusses the issue in the post “Robert Steele: Should Open Source Code Have A PayPal Address & AON Sliding Scale Rate Sheet?”
The post explains that open source delivers an unclear message about how code is generated; it comes from the greater whole rather than from a few individuals. It is also not sustainable, because people need funds to survive as well as to maintain the software. Fair Source is one reasonable answer: users are charged if the software is used at a company with fifteen or more employees, but it too is not sustainable.
Micro-payments, small payments of a few cents, might be the ultimate solution. Robert Steele wrote that:
“I see the need for bits of code to have embedded within them both a PayPal-like address able to handle micro-payments (fractions of a cent), and a CISCO-like Application Oriented Network (AON) rules and rate sheet that can be updated globally with financial-level latency (which is to say, instantly) and full transparency. Some standards should be set for payment scales, e.g. 10 employees, 100, 1000 and up; such that a package of code with X number of coders will automatically begin to generate PayPal payments to the individual coders when the package hits N use cases within Z organizational or network structures.”
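To make Steele’s idea concrete, here is a minimal sketch of what an embedded payment address plus sliding-scale rate sheet might look like. The tier boundaries, per-use fees, and names are invented for illustration; only the general shape (payment addresses for coders, organization-size tiers, payouts triggered by use counts) comes from the quote above.

```python
from dataclasses import dataclass

@dataclass
class Coder:
    name: str
    payment_address: str  # PayPal-like micro-payment address shipped with the code

# Hypothetical sliding-scale rate sheet: per-use fee (in cents) by organization size.
# The 10 / 100 / 1000 employee tiers echo the scales Steele mentions; the amounts are made up.
RATE_SHEET = [
    (10, 0.1),            # up to 10 employees: a tenth of a cent per use
    (100, 0.5),           # up to 100 employees
    (1000, 2.0),          # up to 1,000 employees
    (float("inf"), 5.0),  # everyone larger
]

def per_use_fee(employee_count: int) -> float:
    """Look up the per-use fee (in cents) for an organization of a given size."""
    for ceiling, cents in RATE_SHEET:
        if employee_count <= ceiling:
            return cents

def payout(coders, employee_count: int, use_cases: int) -> dict:
    """Split the accumulated micro-payments evenly among the package's coders."""
    total_cents = per_use_fee(employee_count) * use_cases
    share = total_cents / len(coders)
    return {c.payment_address: share for c in coders}

if __name__ == "__main__":
    team = [Coder("alice", "pay:alice@example.org"), Coder("bob", "pay:bob@example.org")]
    # 10,000 uses inside a 250-person organization -> 2.0 cents per use, split two ways.
    print(payout(team, employee_count=250, use_cases=10_000))
```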
Micro-payments are not a bad idea, and they have occasionally been put into practice, but not very widely. No one has really pioneered an effective system for them.
Steele is also an advocate for “…Internet access and individual access to code is a human right, devising new rules for a sharing economy in which code is a cost of doing business at a fractional level in comparison to legacy proprietary code — between 1% and 10% of what is paid now.”
It is the ideal version of the Internet: people are able to make money from their content and creations, users’ privacy is maintained, and ethics are respected. The current trouble with YouTube channels and copyright comes to mind, as does stolen information sold on the Dark Web and the desire to eradicate online bullying.
Whitney Grace, May 27, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
China Reportedly Planning Its Own Precrime System
May 25, 2016
Some of us consider the movie Minority Report to be a cautionary tale, but apparently the Chinese government sees it as more of a good suggestion. According to eTeknix, that country seems to be planning a crime-prediction unit similar to the one in the movie, except this one will use algorithms instead of psychics. We learn about the initiative from the brief write-up, “China Creating ‘Precrime’ System.” Writer Gareth Andrews informs us:
“The movie Minority Report posed an interesting question to people: if you knew that someone was going to commit a crime, would you be able to charge them for it before it even happens? If we knew you were going to pirate a video game when it goes online, does that mean we can charge you for stealing the game before you’ve even done it?
“China is looking to do just that by creating a ‘unified information environment’ where every piece of information about you would tell the authorities just what you normally do. Decide you want to something today and it could be an indication that you are about to commit or already have committed a criminal activity.
“With machine learning and artificial intelligence being at the core of the project, predicting your activities and finding something which ‘deviates from the norm’ can be difficult for even a person to figure out. When the new system goes live, being flagged up to the authorities would be as simple as making a few purchases online….”
Indeed. Today’s tech is being used to gradually erode privacy rights around the world, all in the name of security. There is a scene in Minority Report that has stuck with me: citizens in an apartment building are shown pausing their activities to passively accept the intrusion of spider-like spy-bots into their homes, upon their very faces even, then resuming where they left off as if such an incursion were perfectly normal. If we do not pay attention, one day it may become so.
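As a technical aside, “deviates from the norm” in systems like this usually reduces to ordinary statistical outlier detection rather than anything clairvoyant. The deliberately simple sketch below (a z-score over a person’s own purchase history) is hypothetical and reflects nothing about the actual Chinese system, whose details are not public.

```python
from statistics import mean, stdev

def flag_outliers(weekly_purchases, threshold=2.0):
    """Return indices of weeks whose purchase counts deviate from this person's
    own history by more than `threshold` standard deviations (a plain z-score)."""
    mu, sigma = mean(weekly_purchases), stdev(weekly_purchases)
    if sigma == 0:
        return []
    return [i for i, count in enumerate(weekly_purchases)
            if abs(count - mu) / sigma > threshold]

if __name__ == "__main__":
    history = [3, 4, 2, 5, 3, 4, 3, 41]  # one wildly unusual week
    print(flag_outliers(history))  # -> [7]
```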
Cynthia Murrell, May 25, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph