Quantum Schmantum
May 25, 2020
What happens when the miasmatic hyperbole about artificial intelligence begins to wane? Another revolutionary, game changing, paradigm shifting technology will arise. Maybe the heiress to AI hoo-hah is waiting in the wings, ready to rush on stage?
One candidate is quantum computing. A couple of years ago, a conference organizer told me, “I’m all in on quantum computing. It’s the next technology revolution.”
My reaction was, “Yeah, okay.”
I noted Intel’s announcement of its horse collar or horse baloney breakthrough. I noted Google’s quantum supremacy PR push. I noted innovations like the value of photons in controlling a quantum interaction.
Got it. Careers are being made. Grants are being obtained. And venture firms are using other people’s money to make the quantum revolution arrive sooner rather than later. “Later” in hyperbole land is rarely defined.
I was interested in a paper by Gil Kalai, whose nominal professional relationship is with the Hebrew University of Jerusalem. The title? “The Argument against Quantum Computers, the Quantum Laws of Nature, and Google’s Supremacy Claim.”
The write up explains some caveats about the technology waiting with anticipation to grab the spotlight from artificial intelligence. The paper is quite interesting. Sure, it includes equations, which are conversation killers at a newly reopened beach front bar on the Jersey Shore. There are also postulates and reasonably easy-to-follow arguments. So read the paper already.
Here’s the conclusion:
I expect that the most important application will eventually be the understanding of the impossibility of quantum error-correction and quantum computation. Overall, the debate over quantum computing is a fascinating one, and I can see a clear silver lining: major advances in human ability to simulate quantum physics and quantum chemistry are expected to emerge if quantum computational supremacy can be demonstrated and quantum computers can be built, but also if quantum computational supremacy cannot be demonstrated and quantum computers cannot be built. Some of the insights and methods characteristic of the area of quantum computation might be useful for classical computation of realistic quantum systems – which is, apparently, what nature does.
This is a good news, bad news conclusion. The research is a journey. The destination may be surprising. So hype on.
Stephen E Arnold, May 25, 2020
Googler Departing: Dr. Eric Schmidt and His Visibility
May 10, 2020
DarkCyber commented on the New York Times’ story about Eric Schmidt, a former Sun Microsystems professional. No, we did not comment about Google and Java. No, we did not remark about our longing for NetWare’s compsurf.
Yes, we did suggest that the purpose of the write up “I Could Solve Most of Your Problems: Eric Schmidt’s Pentagon Offensive” was a PR play by Google.
That may have been part of the motivation. But we learned in “Eric Schmidt, Who Led Google’s Transformation into a Tech Giant, Has Left the Company” that the former “adult” at Google and leader of NetWare departed from the Google in February 2020.
Who knew?
Not the New York Times it seems.
As a result, an alternative motivating factor for the revelations assembled by the NYT could have been publicity for Dr. Schmidt himself.
That NYT story is probably a better job hunting tool than a short item in Microsoft LinkedIn. Just a hunch, of course.
When will that compsurf process be completed? A week, maybe more. By then, Dr. Schmidt may have a new post pandemic job. Is Palantir hiring? Does the White House have a job opening? Is Oracle poking around for an expert to advise the Dolphin Way outfit about Java? What about the Department of Defense as it navigates the Amazon Microsoft worlds of technology?
Opportunities are out there.
Stephen E Arnold, May 10, 2020
TechRepublic: Unintentionally Amusing Non Playing Videos about Videos
May 10, 2020
DarkCyber noted “How to Hold Video Meetings Like a Pro.” We clicked the link to learn what this interesting publication offered for those struggling with the video work from home activity. Here’s what we saw, and we left the page rendering for 10 minutes:
Yep, a video that would not play. But take heart, gentle reader. The write-up includes a link to an audio version of the podcast about video meetings. That worked even though the guest’s audio was subdued. And, if the rich media from the article leaves you disappointed, just read the article itself. It contains some amazing observations; for example:
- There’s a part of the brain that knows when you’re alone in the cave, when you’re a cave person in the dark that there’s someone in the room with you.
- Now, I didn’t pay retail for it. I bought it on Craigslist.
- I believe very strongly in nesting. This isn’t performative. [Interesting word]
- It’s also worth pointing out that I’m looking at a mirror image of myself as you are as well. That’s because people hate looking at themselves as they are seen.
For more insights and maybe the video if you are lucky, this interview is the cat’s pajamas with Lego toys in the background.
Video interviews probably should include video which actually renders. The spinning green thing is interesting for a short time, then it’s boring… just like… video like a pro? Amusing.
Stephen E Arnold, May 10, 2020
Quantum Computing: A Quite Useful Text
April 29, 2020
DarkCyber noted a useful textbook about quantum computing. Quantum Country by Andy Matuschak (former Apple engineer) and Michael Nielsen (a research fellow at Y Combinator) is a series of essays. The authors point out that you will need familiarity with linear algebra and complex numbers. A number of other topics may be useful to the reader. The authors point out that the book “makes it almost effortless to remember what you read.” Like quantum computing, the book is in a new “medium.”
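The linear algebra prerequisite is quite concrete: a qubit is a unit vector in C², and a gate is a unitary matrix. A minimal numpy sketch of that idea (illustrative only, not taken from the book):

```python
import numpy as np

# A qubit is a unit vector in C^2; |0> and |1> are the basis states.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate is a 2x2 unitary matrix.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Applying a gate is just matrix-vector multiplication.
superposition = H @ ket0

# Measurement probabilities are the squared amplitudes (Born rule).
probs = np.abs(superposition) ** 2
print(probs)  # [0.5 0.5]: equal chance of measuring 0 or 1
```

That is the flavor of the math the authors assume the reader can follow.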
Stephen E Arnold, April 29, 2020
In Cobol News: Cloudflare Gets Interested in Revealing That It Is a Time Sharing Company
April 21, 2020
Legacy systems exist. This is perhaps big news for the recently unemployed Silicon Valley types. Some states are struggling to find Cobol programmers. IBM has rolled out Cobol training.
“Cloudflare Workers Now Support Cobol” reports:
COBOL can now be used to write code for Cloudflare’s serverless platform Workers.
The write up provides a number of historical factoids, including sample code and a Game of Life example.
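The cited article’s Game of Life example is, naturally, in COBOL. For readers who want the gist without the COBOL, here is a minimal sketch of the same cellular automaton in Python (purely illustrative; not Cloudflare’s code):

```python
from collections import Counter

def step(live_cells):
    """Advance Conway's Game of Life one generation.

    live_cells is a set of (x, y) coordinates of live cells.
    """
    # Count live neighbors for every cell adjacent to a live cell.
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for x, y in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell lives next generation if it has exactly 3 live neighbors,
    # or exactly 2 and it is already alive.
    return {
        cell
        for cell, count in neighbor_counts.items()
        if count == 3 or (count == 2 and cell in live_cells)
    }

# A "blinker" oscillates between a horizontal and a vertical bar.
blinker = {(0, 1), (1, 1), (2, 1)}
print(step(blinker))  # {(1, 0), (1, 1), (1, 2)}
```

The COBOL version in the article does the same work in a rather different idiom.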
Quick thought: Has the mainframe returned to offer coding opportunities and a career path to the thumb typing millennials?
What’s next for Cloudflare? Lab coats, glass walls, and elevated floors, sign up sheets for keypunch machines, and greenbar paper?
Has cloud computing become a time shared mainframe?
PS. My first programming project relied on Cobol. That was in 1963. I also used Cobol for the Psychology Today / Intellectual Digest readability work I did in the 1970s. Am I relevant again? I miss JCL too.
Stephen E Arnold, April 21, 2020
A Revolution in Management: Efficiency Redefined?
April 14, 2020
I read “How COVID-19 Made Old-School Management Irrelevant: No More Pointless Micro-Management.” I think a more suitable subtitle would have been “A Millennial’s Howl for Me-Ness.”
The essay is interesting for three reasons.
First, it predicts the future. Predictions are easy, but as “now” yields to the future, most turn out only sort of correct. Management may be a tough discipline to change. Why? The notion of organizing tasks and orchestrating the completion of those tasks requires responsibility. That’s an old fashioned concept, but remote control may lack some of the intangibles that traditional management principles rely upon.
Second, the notion of irrelevance is mostly a point of view issue. Who determines relevance? Perhaps the shift is from externally imposed obligations or expectations to an individual deciding whether those obligations or expectations are “relevant.” Reliability, particularly among many colleagues, is a slippery topic. Without reliability, tasks may be difficult to complete. Relevant or irrelevant? The answer depends on whom one asks.
Third, the idea of micro management annoys some people. On the other hand, there are individuals who do their best work within structures and expectations about behavior. One can make generalizations about direct interaction in person. The number of exceptions can undermine what one wants to be true. In fact, the generalization may be an attempt to impose what an individual wants and needs upon others. Arrogance, stupidity, or a certain blindness?
Now the write up. The article asserts:
The need for a manager who “checks on you” has suddenly evaporated.
Interesting but the emergence of new methods for monitoring seem to be a growth industry: Mobile phone surveillance, Slack, and even Zoom meetings are monitoring, control, and directive devices in some ways.
Here’s another interesting mental construct:
In this new world of “work-from-home”, creatives feel free from antagonisms of the old, and the creators of new. Getting people to perform competitively in environments where remote work relies on individual resourcefulness, the in-your-face old school management has died.
The phrase “in your face” reminds me of a bright sprout deeply offended by a grade school teacher’s statement, “Pay attention to the assignment.” The reaction of some people to being told to deliver is rebellion. That’s not a reason to discard some management methods. In fact, I term this type of anti-management behavior as high school science club management methods or HSSCMM. The idea is that a few smart people gather and know what’s better, faster, and cheaper. Does this sound like some of the Silicon bro ethos? It should because this world view has created some interesting challenges; namely, employees who don’t do what’s expected. Employees who protest, leak, strike, and submarine work so it has more flaws than normal.
The write up identifies what has changed since the global pandemic modified some established patterns; for example:
- Work from home will become more common
- We are in a cultural tsunami
- Social distancing is “demolishing age old office hierarchies.”
These sound ominous or life affirming depending upon one’s point of view. The flows of digital information undermine hierarchies. I addressed this subject in my Eagleton Lecture (sponsored by ASIS and Bell Labs) in the late 1980s. As digital information zips around, the “old” patterns are weakened and some collapse; for example, a company’s legal problems, once easily concealed, are now exposed by ubiquitous “publishing.” The cultural tsunami picked up steam in developed countries as newer technologies and tools became widely available. Change does not speed along when certain capabilities are classified and available to a comparatively small number of individuals. Diffusion of tools accelerates diffusion of behaviors. New ideas flourish in such an environment. The datasphere is a hot house. Work from home or WFH is definitely becoming more common, just not for everyone. It is difficult to create certain products from home. It is difficult to reach some decisions from home when a golf outing, lunch, and one to one sizing up is necessary.
I grant that change is taking place, some good, some bad. I agree that in some sectors, the 19th century approach to business will not be successful. I do not support the idea that a 9 to 5 workplace of the “organization man” is the only or best way to build an organization.
However, if one takes even a cursory look at different cultures at different points in the past, interesting commonalities emerge. Practices such as a group’s appointing a leader to provide guidance seem widespread. Specialists perform certain tasks, often working alone or in concert to deliver an artifact that cannot be crafted alone in a different location.
Several observations:
- WFH or work from home is not right for everyone. Multiple methods are needed. Picking the most suitable method to achieve the goal is the job of management. I think a manager from a Roman engineering brigade would agree in part. A stone cutter working in a quarry is of zero value to a team in trans-Alpine Gaul.
- Management evolves. Take a flip through Peter Drucker’s management books. The ideas seem both in tune and out of step. Why? Individuals organizing resources to achieve a goal have to adapt to the cultural environment. A failure to adapt is the ultimate failure of an enterprise.
- Some people need the structure of an organization and a routine which may involve a commute, annoyances like a cube in a bigger space, and people making noise, suggestions, and waves.
Net net: Generalizations which are focused on a narrow slice of those who need to work are interesting but self centered, not objective, and wishful thinking. Parts of life will be like grade school. Suck it up. Deliver something of value.
Stephen E Arnold, April 14, 2020
Not a Joke: More of a Commentary on Allegedly Smart PhDs
April 1, 2020
Trigger warning: This is not about search, cybercrime, intelware, or any of the other hobby horses I flog each day as I have since 2008.
Before I highlight the real news item from the “we beg for dollars” outfit the Guardian, try to answer these questions:
- Did the PhD get his degree online?
- Did the PhD understand the equation F = qvB sin θ?
- Did the PhD think that people would shove ceramic magnets up their nose?
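For reference, the formula in question is the magnitude of the magnetic force on a moving charge, F = qvB sin θ. A quick sanity check with illustrative numbers (the values below are assumptions for demonstration, not from the article):

```python
import math

def magnetic_force(q, v, B, theta):
    """Magnitude of the magnetic force on a charge q (coulombs)
    moving at speed v (m/s) through a field B (teslas), where theta
    is the angle in radians between velocity and field:
    F = q * v * B * sin(theta)."""
    return q * v * B * math.sin(theta)

# An electron-sized charge (1.6e-19 C) moving at 1e6 m/s
# perpendicular to a 0.5 T field:
F = magnetic_force(1.6e-19, 1e6, 0.5, math.pi / 2)
print(F)  # 8e-14 newtons
```

A ceramic magnet in a nostril, alas, is not covered by the equation.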
Okay, now navigate to “Astrophysicist Gets Magnets Stuck Up Nose While Inventing Coronavirus Device.” The allegedly accurate write up states:
Australian Dr Daniel Reardon ended up in hospital after inserting magnets in his nostrils while building a necklace that warns you when you touch your face.
The newspaper provides a number of details. Here’s one:
Before attending the hospital, Reardon attempted to use pliers to pull them out, but they became magnetized by the magnets inside his nose.
You too can get a PhD online, impress your friends, and invent new things. Darwin award nominee?
Stephen E Arnold, April 1, 2020
Strange Attractors: Technology Centers
March 31, 2020
DarkCyber spotted this story: “Indianapolis Tech Firms Hold Their Own Amid Growth Elsewhere.” The write up references a report from the Brookings Institute. Curious, one of the team scanned the article and spotted an interesting paragraph:
“All of this points to the extent to which innovation-sector dynamics compound over time, leaving most places falling further behind,” the report stated.
The “this” is the fact that 90 percent of technology employment growth from 2005 to 2017 was “generated in just five major coastal cities: Seattle, Boston, San Francisco, San Diego, and San Jose.”
The DarkCyber team member was not aware that San Jose had an ocean exposure, but the main point is that “dynamics compound over time.”
Graphic of a strange attractor. Centers emerge. Iterating calculations can cause new attractors to appear.
Is this an example of a human crafted strange attractor? Does the compounding of dynamics exert a magnetic pull on certain types of individuals? Can an excess of compounding trigger a downstream event: Homelessness, for instance?
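The “iterating calculations” idea is easy to demonstrate: repeatedly applying a simple map pulls different starting points onto the same bounded, fractal-shaped attractor. A toy sketch using the Hénon map, a classic strange attractor (illustrative only; no claim that city growth follows this map):

```python
def henon(x, y, a=1.4, b=0.3):
    """One iteration of the Henon map with its classic parameters."""
    return 1 - a * x * x + y, b * x

def iterate(x, y, n):
    """Apply the map n times from starting point (x, y)."""
    for _ in range(n):
        x, y = henon(x, y)
    return x, y

# Two different starting points both end up wandering the same
# bounded attractor region (roughly |x| < 1.3, |y| < 0.4).
p1 = iterate(0.0, 0.0, 1000)
p2 = iterate(0.1, 0.1, 1000)
print(p1, p2)
```

The analogy to tech hubs is loose, but the mechanism, small iterated dynamics compounding into a stable concentration, is the same intuition.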
The DarkCyber team member observed, “Perhaps the absence of concentration provides a way to measure the ‘importance’ or ‘value’ of a particular effort?”
If data were available about a particular technology—for example, Loon balloons or hyper converged infrastructure—would they reveal interesting insights into that “technology’s” potential?
What’s clear is that these centroids of technology can increase disparities in pay, talent, and innovation. What are the signs of attractor failure? Maybe disease, homelessness, a decline in certain demographics?
The implications of innovation centers compounding over time are interesting to consider.
Stephen E Arnold, March 31, 2020
A Hype-Free Look at Quantum Computing
March 31, 2020
How refreshing—The Times of Israel shares a sane view of quantum computing in its post, “Quantum Future: When Will the Super-Fast Computers Be Around?” Alas, such a useful perspective is rare amidst the marketing baloney.
Writer Pabalta Rijal begins by noting that quantum technology does hold great promise. Once the kinks are ironed out, it could solve problems of scale and complexity that are beyond the computers we are used to. Researchers have been working on quantum computing since the early 1980s, and the field has made several breakthroughs in recent years. Last October, Google declared it had reached “quantum supremacy”—the point at which a quantum system is able to solve a problem that traditional computers cannot. This does not mean the tech is anywhere near ready for the mainstream, but it gives researchers hope. What hurdles remain? Rijal writes:
“Today, quantum computers are prone to errors as they are very sensitive to defects in the underlying materials and such defects are highly challenging to control. Such interactions between Qubits (Quantum bits) and defects cause Quantum Decoherence – that is the loss of the quantum behavior of a quantum system (and, in consequence, the loss of information) and this is one of the biggest challenges in quantum computing. Advancements in materials engineering could in the future help address the problem. The error rate is directly related to material quality and new technologies used in the semiconductor manufacturing industry could potentially help improve the quantum computing industry to scale, [Applied Materials’ Nir] Yahav said. ‘To realize a quantum future, we will need collaboration between materials companies, equipment companies, and device companies, as well as academics and government institutes,’ Yahav stressed. ‘Materials engineering can play a critical role in helping push quantum technology into real commercialization,’ he added.”
So it would be premature to fall for the hype that declares quantum computing is just around the corner. The technology does, however, look more and more promising. Let us hope all parties involved can play well together.
Cynthia Murrell, March 31, 2020
Rediscovering What Once Was Taught: Why Software Goes the Wrong Way
March 27, 2020
DarkCyber spotted a link to an essay called “The Expert Blind Spot In Software Development.” The write up states:
I stumbled upon the theory of the expert blind spot…
What’s the blind spot? DarkCyber knows that Microsoft cannot update Windows 10 without creating problems for some users. Google cannot update Chrome without wizards in the office. Apple cannot update the iPhone without breaking things like the hot spot function. In fact, software is pretty much a set of things that don’t work. Some problems are large, some small; most create friction, costing money and slowing down actions.
Modern software development explains why Amazon, Google, IBM, and Microsoft make their cloud technologies complex and opaque. Increasing friction generates revenue, not happy users. The image is from Go Physics’ depiction of entropy.
The article explains that beginners operate with the “illusion of competence.” What’s omitted is that institutional pressure forces beginners to operate as if they were chock full of information germane to a task. Managers don’t want to manage, and most managers know that their responsibilities exceed their competence. But that’s the way the world works: Everyone is an expert, and the leaders lead, or that’s the theory in many organizations. The managerial forces create Brownian motion in which those creating software operate like sailboats, each generally heading in some direction, just with poorly defined rules of the road.
The write up works through an interesting explanation of how “memory” works. But the core of the essay is that “expert blind spots” exist, and those blind spots are problems. The article states:
The best way to be aware of somebody’s level of knowledge in some precise areas is simply to speak with him. In my experience, informal, relaxed conversations, around a cup of coffee, a tea or whatever you like, is the best way to do so.
The idea is that interaction and talking fill in some of the knowledge gaps between those who work together to achieve a goal. There are a number of tips; for example:
- Map your schemata, which seems to edge close to the idea of taking notes
- Write a journal, which seems to be taking notes on a time centric trajectory
- Write a blog, which seems to convert the two previous ideas into a coherent essay.
What’s quite interesting about this write up is that the core idea was well stated in “On a Certain Blindness in Human Beings,” an essay / lecture by William James, yeah, the novelist’s brother.
James wrote:
And now what is the result of all these considerations and quotations? It is negative in one sense, but positive in another. It absolutely forbids us to be forward in pronouncing on the meaninglessness of forms of existence other than our own; and it commands us to tolerate, respect, and indulge those whom we see harmlessly interested and happy in their own ways, however unintelligible these may be to us. Hands off: neither the whole of truth nor the whole of good is revealed to any single observer, although each observer gains a partial superiority of insight from the peculiar position in which he stands. Even prisons and sick-rooms have their special revelations. It is enough to ask of each of us that he should be faithful to his own opportunities and make the most of his own blessings, without presuming to regulate the rest of the vast field.
Several observations:
- A certain blindness defines the human condition
- Technical people are rediscovering why their software sucks but lack the pre-conditioning or early alert about why their work product is half baked or just good enough
- A flawed mechanism for creating the fuel for the 21st century guarantees that the friction will wear down the parts; that is, software becomes more and more of a problem for its users.
What’s the fix? On one hand, there is no fix. On the other, a more comprehensive education might reduce the frustration and time consuming rediscovery of what’s been known for many years.
Now about those new nVidia drivers which cause crashes when a cursor is repositioned…
Stephen E Arnold, March 27, 2020