Remote Access Round Up

July 15, 2015

I received an inquiry about remote access tools. With the mass media frenzy over the hack of an Italian services firm, interest in controlling another computer from a distance—that is, by remote control—seems to be on the uptick. If you want to dabble with remote access, navigate to “9 Free Remote Access Tools.” You can download a few and give them a whirl. The real question is, “How do you get the tool on another computer if that computer is not your mom’s or a helpless neighbor’s machine?” That is the big question, not the RAT technology. Enjoy.

Stephen E Arnold, July 18, 2015

The Skin Search

July 15, 2015

We reported on how billboards in Russia were getting smarter by using facial recognition software to hide ads for illegal products when they recognized police walking by. Now the US government might be working on technology that can identify patterns in tattoos, reports Quartz in “The US Government Wants Software That Can Detect And Interpret Your Tattoos.”

The Department of Justice, the Department of Defense, and the FBI sponsored a competition, held on June 8 by the National Institute of Standards and Technology (NIST), to research ways to identify ink:

“The six teams that entered the competition—from universities, government entities, and consulting firms—had to develop an algorithm that would be able to detect whether an image had a tattoo in it, compare similarities in multiple tattoos, and compare sketches with photographs of tattoos. Some of the things the National Institute of Standards and Technology (NIST), the competition’s organizers, were looking to interpret in images of tattoos include swastikas, snakes, drags, guns, unicorns, knights, and witches.”

The idea is to use visual technology to track tattoos among crime suspects and to map relational patterns. Vision technology, however, is still being perfected. Companies like Google and major universities are researching ways to make headway in the field.

While the visual technology can be used to track suspected criminals, it can also be put to other purposes. One implication is responding to accidents as they happen instead of merely recording them. Tattoo recognition is the perfect place to start, given the inked variety available and its correlation with gangs and crime. The question remains: what will they call the new technology, skin search?

Whitney Grace, July 15, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Short Honk: Saudi Supercomputer

July 14, 2015

In order to crunch text and do large scale computations, a fast computer is a useful tool. Engineering & Technology Magazine reported in “Saudi Machine Makes It on to World’s Top Ten Supercomputer List”:

The Shaheen II is the first supercomputer based in the Middle East to enter the world’s top ten list, debuting at number seven. The Saudi supercomputer is based at King Abdullah University of Science and Technology and is the seventh most powerful computer on the planet, according to the Top 500 organization that monitors high-performance machines. China’s Tianhe-2 kept its position as the most powerful supercomputer in the world in the latest rankings.

If you are monitoring the supercomputer sector, this announcement, if accurate, is important in my opinion. There are implications for content processing, social graph generation, and other interesting applications.

Stephen E Arnold, July 14, 2015

Page Load Speed: Let Us Blame Those in Suits

July 14, 2015

I read “News Sites Are Fatter and Slower Than Ever.” Well, I am not sure about “ever.” I recall when sites simply did not work. Those sites never worked. You can check out the turtles if you can grab a peek at a crawler’s log file. Look for nifty codes like 200, 404, or 500. Your mileage may vary, but the log file tells the tale.
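For the curious, here is a small Python sketch of that log check. The log lines and format are hypothetical combined-log style, invented for illustration; real crawler logs vary, so adjust the regex to your format.

```python
# Hypothetical example: tally HTTP status codes in a crawler log.
import collections
import re

# The status code follows the quoted request string, e.g. ..." 404 0
STATUS_RE = re.compile(r'"\s*(\d{3})\s')

def tally_statuses(lines):
    """Count HTTP status codes (200, 404, 500, ...) across log lines."""
    counts = collections.Counter()
    for line in lines:
        match = STATUS_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

sample = [
    '1.2.3.4 - - [14/Jul/2015] "GET / HTTP/1.1" 200 512',
    '1.2.3.4 - - [14/Jul/2015] "GET /old HTTP/1.1" 404 0',
    '1.2.3.4 - - [14/Jul/2015] "GET /ads HTTP/1.1" 500 0',
]
counts = tally_statuses(sample)
print(counts)
```

A pile of 404s and 500s in the tally is the “turtle” signature the paragraph describes.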

The write up aims at news sites. My hunch is that the definition of a news site is one of those top one percent things: The user is looking for information from a big name and generally clueless outfit like The Daily Whatever or a mash up of content from hither and yon.

Enter latency, lousy code, crazy ads from half baked ad servers, and other assorted craziness.

The write up acknowledges that different sites deliver different response times. Okay.

If you are interested in data, the article presents an interesting chart. You can see home page load times with and without ads. There’s a chart which shows page load times via different mobile connections.

The main point, in my opinion, is a good one:

Since its initial release 22 years ago, the Hyper Text Markup Language (HTML) has gone through many iterations that make web sites richer and smarter than ever. But this evolution also came with loads of complexity and a surfeit of questionable features. It’s time to swing the pendulum back toward efficiency and simplicity. Users are asking for it and will punish those who don’t listen.

My hunch is that speed is a harsh task master. In our work, we have found that with many points in a process, resources are often constrained or poorly engineered. As a result, each new layer of digital plaster contributes to the sluggishness of a system.

Unless one has sufficient resources (money, expertise, and time), lousy performance is the new norm. The Google rails and cajoles because slowdowns end up costing my favorite search engine big bucks.

Most news sites do not get the message and probably never will. The focus is on another annoying overlay, pop up, or inline video.

Click away, gentle reader, click away. Many folks see the browser as the new Windows 3.11. Maybe browsers are the new Windows 3.11?

Stephen E Arnold, July 14, 2015

Algorithmic Art Historians

July 14, 2015

Apparently, creativity itself is no longer subjective. MIT Technology Review announces, “Machine Vision Algorithm Chooses the Most Creative Paintings in History.” Traditionally, art historians judge how creative a work is based on its novelty and its influence on subsequent artists. The article notes that this is a challenging task, requiring an encyclopedic knowledge of art history and the judgment to decide what is novel and what has been influential. Now, a team at Rutgers University has developed an algorithm they say is qualified for the job.

Researchers Ahmed Elgammal and Babak Saleh credit several developments with bringing AI to this point. First, we’ve recently seen several breakthroughs in machine understanding of visual concepts, called classemes, which include recognition of factors from colors to specific objects. Another important factor: there now exist well-populated online artwork databases that the algorithms can, um, study. The article continues:

“The problem is to work out which paintings are the most novel compared to others that have gone before and then determine how many paintings in the future use similar features to work out their influence. Elgammal and Saleh approach this as a problem of network science. Their idea is to treat the history of art as a network in which each painting links to similar paintings in the future and is linked to by similar paintings from the past. The problem of determining the most creative is then one of working out when certain patterns of classemes first appear and how these patterns are adopted in the future. …

“The problem of finding the most creative paintings is similar to the problem of finding the most influential person on a social network, or the most important station in a city’s metro system or super spreaders of disease. These have become standard problems in network theory in recent years, and now Elgammal and Saleh apply it to creativity networks for the first time.”
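To make the network idea concrete, here is a toy Python sketch. This is not the Rutgers team’s published algorithm: the feature vectors merely stand in for classemes, and the novelty-plus-influence scoring rule is a simplification invented for illustration.

```python
# Toy sketch of the creativity network described above. A painting
# scores high when it is unlike earlier paintings (novelty) and like
# later ones (influence).
import math

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def creativity_scores(paintings):
    """paintings: list of (year, feature_vector) pairs."""
    scores = []
    for year_i, feat_i in paintings:
        past = [similarity(feat_i, f) for y, f in paintings if y < year_i]
        future = [similarity(feat_i, f) for y, f in paintings if y > year_i]
        novelty = 1.0 - (sum(past) / len(past) if past else 0.0)
        influence = sum(future) / len(future) if future else 0.0
        scores.append(novelty + influence)
    return scores

# Four toy "paintings": the 1600 work introduces a pattern copied later.
paintings = [
    (1500, [1, 0, 0]),
    (1600, [0, 1, 0]),
    (1650, [0, 1, 0]),
    (1700, [0, 1, 0]),
]
scores = creativity_scores(paintings)  # the 1600 painting scores highest
```

The 1600 painting wins because it resembles nothing before it and everything after it, which is the novelty-plus-influence intuition the quote describes.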

Just what we needed. I have to admit the technology is quite intriguing, but I wonder: Will all creative human endeavors eventually have their algorithmic counterparts and, if so, how will that affect human expression?

Cynthia Murrell, July 14, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Watson Based Tradeoff Analytics Weighs Options

July 13, 2015

IBM’s Watson now lends its considerable intellect to helping users make sound decisions. In “IBM Watson Tradeoff Analytics—General Availability,” the Watson Developer Community announces that the GA release of this new tool can be obtained through the Watson Developer Cloud platform. The release follows an apparently successful Beta run that began last February. The write-up explains that the tool:

“… Allows you to compare and explore many options against multiple criteria at the same time. This ultimately contributes to a more balanced decision with optimal payoff.

“Clients expect to be educated and empowered: ‘don’t just tell me what to do,’ but ‘educate me, and let me choose.’ Tradeoff Analytics achieves this by providing reasoning and insights that enable judgment through assessment of the alternatives and the consequent results of each choice. The tool identifies alternatives that represent interesting tradeoff considerations. In other words: Tradeoff Analytics highlights areas where you may compromise a little to gain a lot. For example, in a scenario where you want to buy a phone, you can learn that if you pay just a little more for one phone, you will gain a better camera and a better battery life, which can give you greater satisfaction than the slightly lower price.”
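The phone example in the quote is essentially a Pareto tradeoff. Here is a minimal sketch of that idea with invented phone data; this is not IBM’s Tradeoff Analytics API, just the underlying dominance concept.

```python
# Minimal sketch of the tradeoff idea in the passage above: keep only
# the options that no other option beats on every objective.
def dominates(a, b, maximize):
    """True if a is at least as good as b everywhere, better somewhere."""
    at_least = all(x >= y if m else x <= y for x, y, m in zip(a, b, maximize))
    strictly = any(x > y if m else x < y for x, y, m in zip(a, b, maximize))
    return at_least and strictly

def pareto_front(options, maximize):
    """Return the options no other option dominates."""
    return [a for a in options
            if not any(dominates(b, a, maximize) for b in options if b != a)]

# (price, camera megapixels, battery hours): minimize price, maximize rest
phones = [
    (200, 8, 10),   # budget
    (230, 12, 14),  # a little more money buys a better camera and battery
    (260, 8, 9),    # dominated: pricier and worse on everything else
]
front = pareto_front(phones, maximize=(False, True, True))
```

The first two phones survive as genuine tradeoffs; the third is filtered out. Highlighting the surviving near-neighbors is exactly the “compromise a little, gain a lot” behavior the write-up describes.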

For those interested in the technical details behind this Watson iteration, the article points you to the Tradeoff Analytics documentation. Those wishing to glimpse the visualization capabilities can navigate to this demo. The write-up also lists post-beta updates and explains pricing, so check it out for more information.

Cynthia Murrell, July 13, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Information Technology: The Myth of Control

July 12, 2015

In the good old days circa 1962, one went to a computer center. In the center was a desk, usually too high for the supplicant to lean on comfortably. There were young people ready to ask the supplicant to sign in, fill out a form to request computer time, and wait. Once in a while, a supplicant would be granted immediate time on a keypunch machine. Most of the time, the supplicant was given a time slot for later. But that was the start of the process.

I won’t bore you with the details of submitting decks of punched cards, returning to get a green bar print out, and the joy or heartbreak of finding out that your program ran or did not.

I figured out quickly that working in the computer center was the sure fire way to get access to the computer, a giant IBM thing which required care and feeding of two or three people plus others on call.

The pain of those experiences has not gone away, gentle reader. If you are fortunate enough to be in a facility with a maybe-is or maybe-isn’t quantum computer, the mainframe mentality is the only way to go. There are research facilities with even more stringent guidelines, but the average mobile phone user thinks that computer use is a democracy. It is not. It never will be. Controls are important. Period. But senior management, not information technology, has the responsibility to steer the good ship Policies & Procedures.

When I read “Cloudy with a Chance of Data Loss: Has Corporate IT Lost Control?” I was not comfortable. The reality is that corporate information technology has control in certain situations. In others, for all practical purposes, there is no organizational information technology department.

MBAs, venture capital types, and those without patience want what they want when they want it. The controls are probably in place, but the attitude of these hyperkinetic history majors with a law degree is that those rules do not apply to them. Toss in a handful of entitled but ineffective middle school teachers and a clueless webmaster, and you have the chemical components of boneheaded information technology behaviors.

The information technology professionals just continue to do their thing, hoping that they can manage the systems in today’s equivalent of a 1960s air conditioned, sealed off, locked, and limited access computer room.

Other stuff is essentially chaos.

The write up assumes that control is a bad thing. The write up uses words like “consumer oriented,” “ease of use,” and “ownership.” The reason a non-mainframe mentality exists among most people with whom I interact is a reptilian memory of the mainframe method. For most people, entitlement and do-your-own-thing are the keys to effective computing.

If an information technology professional suggests a more effective two factor authentication procedure or a more locked down approach to high value content, these professionals are either ignored, terminated, or just worked around.

As a result of organizations’ penchant for hiring those who are friendly and on the team, one gets some darned exciting information technology situations. Management happily cuts budgets. One Fortune 100 company CFO told me, “We are freezing the IT budget. Whatever those guys do, they have to do it with a fixed allocation.” Wonderful reasoning.

The write up concludes with this statement:

Modern IT departments realize that to overcome security challenges they must work together with users, not dictate to them. The advent of the cloud model means that smart users can readily circumvent restrictions if they see no value in abiding by the rules. IT teams must therefore be inclusive and proactive, investing in secure file-sharing solutions that are accepted by users while also providing visibility, compliance and security. Fortunately, there are good alternatives for the 84 per cent of senior IT management who admit they are “concerned” over employee-managed cloud services. The bottom line is this: there are times when we all need to share files. But there is never an occasion when any of us should trust a consumer-grade service with critical business data. It simply presents too many risks.

Nope. The optimal way in my view is for organizations to knock off the shortcuts, focus on specific methods required to deliver functionality and help reduce the risk of a “problem,” and shift from entitlement feeling good attitudes to a more formal, business-centric approach.

It is not a matter of control alone. What matters is common sense and senior management’s willingness to create a work environment in which control exists across business policies and procedures.

The hippy dippy approach to information technology is more risky than some folks realize. As the wall poster in my server room says, “Ignorance is bliss. Hello, happy.”

Stephen E Arnold, July 12, 2015

Holy Cow. More Information Technology Disruptors in the Second Machine Age!

July 11, 2015

I read a very odd write up called “The Five Other Disruptors about to Define IT in the Second Machine Age.”

Whoa, Nellie. The second machine age. I thought we were in the information age. Dorky machines are going to be given an IQ injection with smart software. The era is defined by software, not machines. You know. Mobile phones are pretty much a commodity with the machine part defined by fashion and brand and, of course, software.

So a second machine age. News to me. I am living in the second machine age. Interesting. I thought we had the Industrial Revolution, then the boring seventh grade mantra of manufacturing, the nuclear age, the information age, etc. Now we are doing the software thing.

My hunch is that the author of this strange article is channeling Shoshana Zuboff’s In the Age of the Smart Machine. That’s okay, but I am not convinced that the one, two thing is working for me.

Let’s look at the disruptors which the article asserts are just as common as the wonky key fob I have for my 2011 Kia Soul. A gray Kia Soul. Call me exciting.

Here are the four disruptors that, I assume, are about to remake current information technology models. Note that these four disruptors are “about to define IT.” These are like rocks balanced above Alexander the Great’s troops as they marched through the valleys in what is now Afghanistan. A 12 year old child could push a rock from its perch and crush a handful of Macedonians. Potential and scary enough to help Alexander decide to march in a different direction. Hello, India.

These disruptors are the rocks about to plummet into my information technology department. The department, I wish to point out, works from their hovels and automobiles, dialing in when the spirit moves them.

Here we go:

  • Big Data
  • Cloud
  • Mobile
  • Social

I am not confident that these four disruptors have done much to alter my information technology life, but if one is young, I assume that these disruptors are just part of the everyday experience. I see grade school children poking their smart phones when I take my dogs for their morning constitutional.

But the points which grabbed my attention were the “five other disruptors.” I had to calm down because I assumed I had a reasonable grasp of the disruptors important in my line of work. But, no. These disruptors are not my disruptors.

Let’s look at each:

The Trend to NoOps

What the heck does this mean? In my experience, experienced operations professionals are needed, even at some of the smart outfits I used to work with.

Agility Becomes a First Class Citizen

I did not know that the ability to respond to issues and innovations was not essential for a successful information technology professional.

Identity without Barriers

What the heck does this mean? The innovations in security are focused on ensuring that barriers exist and are not improperly breached. The methods have little to do with an individual’s preferences. The notion of federation is an interesting one. In some cases, federation is one of the unresolved challenges in information technology. Mixing up security, “passwords,” and disparate content from heterogeneous systems is a very untidy serving of fruit salad.

I found myself thinking about information technology after reading Rush’s book of farmer flummoxing poetry. Is it required reading for a mid tier consultant? I wonder if Dave Schubmehl has read it? I wonder if some Gartner or Forrester consultants have dipped into its meaty pages. (No pun intended.)

IT Goes Bi Modal?

What the heck does this mean again? Referencing Gartner is a sure fire way to raise grave concerns about the validity of the assertion. But bi-modal. Two modes. Like zero and one. Organizations have to figure out how to use available technology to meet that organization’s specific requirements. The problem of legacy and next generation systems defines the information landscape. Information technology has to cope with a fuzzy technology environment. Bi-modal? Baloney.

The Second Machine Age

Okay, I think I understand the idea of a machine age. The problem is that we are in a software and information datasphere. The machine thing is important, but it is software that allows legacy systems to coexist with more with-it approaches. This silly numbering of ages makes zero sense and is essentially a subjective, fictional, metaphorical view of the present information technology environment.

Maybe that’s why Gartner hires poets and high profile publications employ folks who might find an hour discussing the metaphorical implications of “bare ruined choirs.”

None of these five disruptions makes much sense to me.

My hunch is that you, gentle reader, may be flummoxed as well.

Stephen E Arnold, July 11, 2015

Researchers Glean Audio from Video

July 10, 2015

Now, this is fascinating. Scary, but fascinating. MIT News explains how a team of researchers from MIT, Microsoft, and Adobe are “Extracting Audio from Visual Information.” The article includes a video in which one can clearly hear the poem “Mary Had a Little Lamb” as extrapolated from video of a potato chip bag’s vibrations filmed through soundproof glass, among other amazing feats. I highly recommend you take four-and-a-half minutes to watch the video.

Writer Larry Hardesty lists some other surfaces from which the team was able to reproduce audio by filming vibrations: aluminum foil, water, and plant leaves. The researchers plan to present a paper on their results at this year’s Siggraph computer graphics conference. See the article for some details on the research, including camera specs and algorithm development.

So, will this tech have any non-spying-related applications? Hardesty quotes MIT grad student Abe Davis, first author on the team’s paper:

“The researchers’ technique has obvious applications in law enforcement and forensics, but Davis is more enthusiastic about the possibility of what he describes as a ‘new kind of imaging.’

“‘We’re recovering sounds from objects,’ he says. ‘That gives us a lot of information about the sound that’s going on around the object, but it also gives us a lot of information about the object itself, because different objects are going to respond to sound in different ways.’ In ongoing work, the researchers have begun trying to determine material and structural properties of objects from their visible response to short bursts of sound.”
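The core principle can be illustrated with a toy Python simulation: tiny per-frame intensity fluctuations, sampled at the camera’s frame rate, form a low-rate audio signal. Everything below is invented for illustration; the actual system recovers sound through sub-pixel motion analysis, not mean brightness.

```python
# Toy simulation: a vibrating object's mean brightness fluctuates at the
# sound's frequency, and sampling it once per frame yields a signal from
# which the tone can be estimated.
import math

FRAME_RATE = 2000  # frames per second (high-speed camera)
TONE_HZ = 100      # vibration frequency of the filmed object
N_FRAMES = 2000    # one second of video

# Simulated per-frame mean brightness: a constant level plus a tiny
# sound-induced oscillation.
frames = [128.0 + 0.5 * math.sin(2 * math.pi * TONE_HZ * t / FRAME_RATE)
          for t in range(N_FRAMES)]

# "Recover" the tone: remove the mean, count rising zero crossings.
mean = sum(frames) / len(frames)
signal = [f - mean for f in frames]
crossings = sum(1 for a, b in zip(signal, signal[1:]) if a < 0 <= b)
estimated_hz = crossings * FRAME_RATE / N_FRAMES  # close to TONE_HZ
```

The take-away is the sampling constraint: the frame rate bounds the recoverable frequencies, which is why the team needed high-speed cameras for intelligible speech.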

That’s one idea. Researchers are confident other uses will emerge, ones no one has thought of yet. This is a technology to keep tabs on, and not just to decide when to start holding all private conversations in windowless rooms.

Cynthia Murrell, July 10, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

IBM Ultradense Computer Chips. Who Will Profit from the Innovation?

July 9, 2015

I don’t want to rain on IBM’s seven nanometer chips. To experience the rah rah wonder of the innovation, navigate to “IBM Announces Computer Chips More Powerful Than Any in Existence.” Note: You may have to purchase a dead tree edition of the Gray Lady or cough up money to deal with the paywall.

The write up reveals, just like an automobile rebuilding television program in which Chip Foose shows the finished vehicle, not a component:

The company said on Thursday that it had working samples of chips with seven-nanometer transistors. It made the research advance by using silicon-germanium instead of pure silicon in key regions of the molecular-size switches. The new material makes possible faster transistor switching and lower power requirements. The tiny size of these transistors suggests that further advances will require new materials and new manufacturing techniques. As points of comparison to the size of the seven-nanometer transistors, a strand of DNA is about 2.5 nanometers in diameter and a red blood cell is roughly 7,500 nanometers in diameter. IBM said that would make it possible to build microprocessors with more than 20 billion transistors.

Okay. Good.

My question is, “Has IBM the capability to manufacture these chips, package them in hardware that savvy information technology professionals will want, and then support the rapidly growing ecosystem?”

Like the pre-Judge Green Bell Labs, IBM can invent or engineer something nifty. But the Bell Labs folks were not the leaders in the productization field. IBM seems to present its “international consortium” and the $3 billion “public private partnership” as evidence that revenue is just around the corner.

Like the Watson PR, IBM’s ability to get its tales of technical prowess in front of me may be greater than the company’s ability to generate substantial top line growth and a healthy pile of cash after taxes.

From my vantage point in rural Kentucky, my hunch is that the outfits which build the equipment, work out the manufacturing processes, and then increase chip yields will be the big winners. The proven ability to make things may have more revenue potential than the achievement, significant as it is, of a seven nanometer chip.

Who will be the winner? The folks at Samsung who could use a win? The contractors involved in the project? IBM?

No answers, but my hunch is that core manufacturing expertise might be the winner going forward. Once a chip is made smaller, others know it can be done, which allows the followers to move forward. IBM, however, has more than an innovator’s dilemma. Will Watson become more of a market force with these new chips? If so, when? One week, one year, 10 years?

Also, IBM has to deal with the allegedly accurate statements about the company which appear in the Alliance@IBM blog.

Stephen E Arnold, July 9, 2015
