December 13, 2015
Yes, it might be possible. Navigate to “Delivering Results: A Framework for Federal Government Technology Access & Acquisition.” If the link does not resolve, you will have to scout around. There is no guarantee that this document will remain on a public Web site. The comments apply to almost any government too. Although the write up is focused on the US government, the approach is borderless and a wonderful example of clear thinking about how consultants think about contracting opportunities.
Let me get to the heart of the matter. The way to improve government technology involves the government taking action on several “principles”. Intrigued? Here they are:
- A Common Goal – The Common Good
- Competition and Innovation
- Collaboration
- Contracting Flexibility
- Risks and Rewards
- Workforce
If you have been involved in government work in the US or elsewhere, you may note that the principles omit one of the key drivers: Billing and related matters such as scope change.
There are some other hitches in the git along. Let me highlight one for each of the principles.
1. A common goal is tough to achieve. Government entities want to retain power, headcount, and budgets. The notion of intra and inter agency cooperation or even inter and intra department cooperation is fascinating. The nature of the bureaucratic process is meetings with overt and hidden agendas. The common goal and the common good are easy to describe, just tough to implement.
2. Competition. If you are a vendor in rural Kentucky and you want to bid on a government project, you may have a difficult time achieving your goal. Projects often begin at the appropriate stage, work their way through the consultant driven request for proposal stage, then there is the statement of work stage, and along the way are contracting officers, legal eagles, and assorted procurement professionals. For someone working in Hazard County, the process is definitely expensive, slow, and designed to allow the big dogs to eat the tasty bits. Competition exists but in a meta sort of way.
3. Collaboration. Meetings are collaborative fun fests. The problem is that the objectives of power, headcount, and budget act as fusion power sources among the participants. Talk is the energy of government meetings. Doing results from expanding power, headcount, and budget allocations. Many meetings require a paid consultant or two to provide the catalyst for the talk. The collaboration results in more meetings.
4. Contracting flexibility. Right. There are rules, and if the rules are sidestepped even by some highly placed folks, that wandering off the reservation is rarely a good thing. An outfit called 18F is trying to deliver flexible contracting on a modest scale to some GSA functions. Right. Have you ever heard of 18F? If so, you are one of the lucky few. In the meantime, the established contractors keep doing their thing: capturing major contracts.
5. Risks / rewards. Risk is not something that is highly desirable either for a government professional or for the people and companies capturing major contracts. Risk can be discussed in a “collaborative” meeting. The systems then continuously operate to reduce risk. Want to have an unknown contractor build your next weapons system? Not going to happen in my lifetime. Want an unknown contractor to code a Web page? Well, sure, just fill out the appropriate forms or figure out what 18F is all about. There is a reason some government contractors are big. These outfits know how to deal with risk, government style.
6. Workforce. The governments with which I have worked struggle with the workforce issue. The idea is to find and hire the best and brightest. How is this working out? Some folks from a successful company flow into the government and then flow out. This is the revolving door for some folks. Folks who stick in government operations, regardless of country, like the working environment, enjoy the processes, and revel in the environment.
The principles are well stated. I am not sure that changing how governments operate is going to make much headway. Think about your last interaction with a government entity. What did that reveal to you?
Stephen E Arnold, December 13, 2015
November 26, 2015
It is almost 2016. IDC, an outfit owned by an optimistic outfit, has taken a tiny step forward. The IDC wizards answered this question, “How big will Big Data spending be in 2019?” Yep, that is 36 months in the future. There might be more money in predicting Super Bowl winners, what stock to pick, and the steps to take to minimize risk at a restaurant. But no.
According to the true believers in the Content Loop, “IDC Says Big Data Spending to Hit $48.6 Billion in 2019.” I like that point six, which seems to suggest that real data were analyzed exhaustively.
The write up reports:
The market for big data technology and services will grow at a compound annual growth rate (CAGR) of 23 percent through 2019, according to a forecast issued by research firm International Data Corp. (IDC) on Monday. IDC predicts annual spending will reach $48.6 billion in 2019. IDC divides the big data market into three major submarkets: infrastructure, software and services. The research firm expects all three submarkets to grow over the next five years, with software — information management, discovery and analytics and applications software — leading the charge with a CAGR of 26 percent.
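The quoted figures invite a little arithmetic. IDC does not state the base year in the passage above, so as a sanity check here is a short sketch that works the CAGR formula backwards to see what starting spend the $48.6 billion and 23 percent numbers imply under a couple of plausible horizons (the horizon choices are my assumption, not IDC's):

```python
def implied_base(final_value, cagr, years):
    """Starting value implied by a final value growing at `cagr` for `years` years."""
    return final_value / (1 + cagr) ** years

# IDC's figures: $48.6 billion in 2019 at a 23% CAGR. The base year is not
# stated in the quote, so try a couple of plausible horizons.
for years in (4, 5):
    base = implied_base(48.6, 0.23, years)
    print(f"{years} years back from 2019: ${base:.1f} billion starting spend")
```

Either way, the point six precision on a four-to-five-year forecast rests on a base figure the reader never sees.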
I will go out on a limb. I predict that IDC will offer for sale three reports, maybe more. I hope the company communicates with its researchers to avoid the mess created when IDC wizard Dave Schubmehl tried to pitch eight pages of wonderfulness based on my research for a mere $3,500 without my permission. Oops. Those IDC folks are too busy to do the contract thing, I assume.
A Schubmehl-type IDC wizard offered this observation with only a soupçon of jargon:
The ever-increasing appetite of businesses to embrace emerging big data-related software and infrastructure technologies while keeping the implementation costs low has led to the creation of a rich ecosystem of new and incumbent suppliers…. At the same time, the market opportunity is spurring new investments and M&A activity as incumbent suppliers seek to maintain their relevance by developing comprehensive solutions and new go-to-market paths.– Ashish Nadkarni, program director, Enterprise Servers and Storage, IDC
Yes, ever increasing and go-to spirit. Will the concept apply to IDC’s revenues? Those thrilled with the Big Numbers are the venture folks pumping money into Big Data companies with the type of enthusiastic good cheer Russian ground support troops send off with the Backfires, Bears, and Blackjacks bound for Syria.
Thinking about international tension, my hunch is that the global economy seems a bit dicey, maybe unstable, at this time. I am not too excited at the notion of predicting what will happen in all things digital in the next few days. Years? No way, gentle reader.
Thinking about three years in the future strikes me as a little too bold. I wonder if the IDC predictive methods have been applied to DraftKings and FanDuel games?
Stephen E Arnold, November 26, 2015
November 19, 2015
Want to know what the future will look like? Navigate to “7 Reasons Why the Algorithmic Business Will Change Society.” The changes come via Datafloq via a mid tier consulting firm. I find the predictions oddly out of step with the milieu in which I live. That’s okay, but this list of seven changes raises a number of questions and seems to sidestep some of the social consequences of the world foreshadowed in the predictions. Finding information is, let me say at the outset, not part of the Big Data future.
Here are the seven predictions:
- By 2018, 20% of all business content will be authored by machines, which means a hiring freeze on copywriters in favor of robowriting algorithms;
- By 2020, autonomous software agents, or algorithms, outside human control, will participate in 5% of all economic transactions, thanks to, among others, blockchain. On the other hand, we will need pattern-matching algorithms to detect robot thieves.
- By 2018, more than 3 million workers globally will be supervised by a “roboboss”. These algorithms will determine what work you need to do.
- By 2018, 50% of the fastest growing companies will have fewer employees than smart machines. Companies will become smaller due to the expanding presence of algorithms.
- By 2018, customer digital assistants will recognize individuals by face and voice across channels and partners. Although this will benefit the customer, organizations should prevent the creepiness-factor.
- By 2018, 2 million employees will be required to wear health and fitness tracking devices. The data generated from these devices will be monitored by algorithms, which will inform management of any actions to be taken.
- By 2020, smart agents will facilitate 40% of mobile transactions, and the post-app era will begin to dominate, where algorithms in the cloud guide us through our daily tasks without the need for individual apps.
Fascinating. Who will work? What will people do in a Big Data world? What about social issues? How will one find information? What happens if one or more algorithms drift and deliver flawed outputs?
No answers of course, but that’s the great advantage of talking about a digital future three or more years down the road. I assume folks will have time to plan their Big Data strategy for this predicted world. I suppose one could ask Google, Watson, or one’s roboboss.
Stephen E Arnold, November 19, 2015
October 16, 2015
I noticed two things when we were working through the Overflight news about proprietary vendors of enterprise search systems on October 14, 2015.
First, a number of the enterprise search vendors which the Overflight system monitors are not producing substantive news. Aerotext, Dieselpoint, and even Polyspot are just three firms with no buzz in social media or in traditional public relations channels. Either these outfits are so busy that the marketers have no time to disseminate information or there is not much to report.
Second, no proprietary enterprise search vendor is marketing search and retrieval in the way Autonomy and the now defunct Convera used to market. There were ads, news releases, and conference presentations. Now specialist vendors talk about webinars, business intelligence, Big Data, and customer support solutions. These outfits are mostly selling consulting services. Enterprise search as a concept is not generating much buzz based on the Overflight data.
Imagine my surprise when I read “Enterprise Search Market Expanding at a 12.2% CAGR by 2019.” What a delicious counterpoint to the effective squishing of the market sector which husbanded the Autonomy and Fast Search & Transfer brouhahas. These high profile enterprise search vendors found themselves mired in legal hassles. In fact, the attention given to these once high profile search vendors has made it difficult for today’s vendors to enjoy the apparent success that Autonomy and Fast Search enjoyed prior to their highly publicized challenges.
Open source search solutions have become the popular and rational solution to information access. Companies offering Lucene, Solr, and other non proprietary information access systems have made it difficult for vendors of proprietary solutions to generate Autonomy-scale revenue. The money seems to be in consulting and add ons. The Microsoft SharePoint system supports a hot house of third party components which improve the SharePoint experience. The problem is that none of the add in and component vendors are likely to reach Endeca-scale revenues.
Even IBM with its Watson play seems to be struggling to craft a sustainable, big money revenue stream. Scratch the surface of Watson and you have an open source system complemented with home brew code and technology from acquired companies.
The write up reporting the double digit compound growth rate states:
According to a recent market study published by Transparency Market Research (TMR), titled “Enterprise Search Market – Global Industry Analysis, Size, Share, Growth, Trends and Forecast 2013 – 2019”, the global enterprise search market is expected to reach US$3,993.7 million by 2019, increasing from US$1,777.5 million in 2012 and expanding at a 12.2% CAGR from 2013 to 2019. Enterprise search system makes content from databases, intranets, data management systems, email, and other sources searchable. Such systems enhance the productivity and efficiency of business processes and can save as much as 30% of the time spent by employees searching information. The need to obtain relevant information quickly and the availability of technological applications to obtain it are the main factors set to drive the global enterprise search market.
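Unlike the usual mid tier forecast, this one at least supplies enough numbers to check. A quick sketch compounding TMR's 2012 base at the stated rate for the seven years to 2019 shows the figures are internally consistent:

```python
base_2012 = 1_777.5   # US$ millions, TMR's 2012 figure
cagr = 0.122          # 12.2% per year
years = 7             # 2012 through 2019

projected_2019 = base_2012 * (1 + cagr) ** years
print(f"Implied 2019 market size: US${projected_2019:,.1f} million")
# Lands within about half a percent of TMR's US$3,993.7 million figure.
```

So the arithmetic holds together; whether the 12.2 percent assumption survives contact with open source reality is another matter.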
TMR, like other mid tier consulting firms, will sell some reports to enterprise search vendors who need some good news about the future of the market for their products.
The write up also contains a passage which I found quite remarkable:
To capitalize on opportunities present in the European regional markets, major market players in the U.S. are tying up with European vendors to provide enterprise search solutions.
Interesting. I do not agree. I don’t see too many US outfits tying up with Antidot, Intrafind, or Sinequa and their compatriots. Folks are using Elasticsearch, but I don’t categorize these relationships as tie ups like the no cash merger between Lexalytics and its European partner.
Furthermore, we have the Overflight data and evidence that enterprise search is a utility function increasingly dominated by open source options and niche players. Where are the big brands of a decade ago? Acquired, out of business, discredited, and adorned with jargon.
The problems include sustainable revenue, the ongoing costs of customer support, and the appeal of open source solutions.
Transparency Market Research seems to know more than I do about enterprise search and its growth rate. That’s good. Positive. Happy.
Stephen E Arnold, October 16, 2015
October 9, 2015
I read “Predictive Analytics Are the Future of Big Data.” Who makes this pronouncement? None other than mid tier consultants. According to the write up:
Forrester analysts … believe that predictive analytics have never been more relevant and easier to use, and offer ways for forward-thinking enterprises to succeed in competitive sectors.
Why the sudden flurry of interest in predictive analytics? Well, the answer is not far to seek, gentle reader:
Forrester has authored another blast furnace brick of insight called the Forrester Wave research report “Big Data Predictive Analytics Solutions, Q2 2015.”
What’s the future hold? I like the predictions as a service. This will be useful to those who want to compare Bing predictions with the actual winner of the Kentucky Derby.
What companies are the leaders in predictive analytics? Forrester offers some well known outfits; for example, the free-spending IBM, the acquisition minded Dell, Microsoft, Oracle, and canny SAP, whose enterprise software runs on Oracle, not just HANA. (I bet you knew that.) There are some surprises; for example, Alpine, Alteryx, and Angoss. SAS, essential to graduate students in psychology, is singled out as a leader in predictive analytics. (I am delighted to know that SAS is not a collection of components and programming methodologies with which one can build numerical machines.) There are some outfits not on my radar because I am simply not with the Forrester program; for example, KNIME and Predixion.
But the most interesting leader in predictive analytics is FICO, which is a publicly traded outfit active in 90 countries. How fresh are FICO’s predictive methods? Pretty fresh, based on the Forrester analysis. I noted this passage on the FICO Web site:
Founded in 1956, FICO introduced analytic solutions such as credit scoring that have made credit more widely available, not just in the United States but around the world. We have pioneered the development and application of critical technologies behind decision management. These include predictive analytics, business rules management and optimization. We use these technologies to help businesses improve the precision, consistency and agility of their complex, high–volume decisions.
I am okay with innovations from a company with 59 years of doing math for decision management and, of course, predictive analytics. (Wasn’t that calculator and mainframe centric?)
How can you get a copy of the Forrester report and learn about the companies pioneering in the predictive analytics sector? Due to the ominous legal verbiage on my copy of the high value mid tier report, I did not make a list of the companies nor hint at the wealth of insights the mid tier experts captured in the report. For me to get a copy, I clicked this link and the document became available. For you? I am not sure.
Were any companies omitted from the report? Yep, most of the firms I monitor. I am okay with Microsoft and the other big names, but there are some innovators chugging along off the radar of the wizards in the mid tier consulting company.
If a company is not on the radar of a mid tier consulting firm, those companies are essentially irrelevant. I wonder if anyone from a mid tier consulting firm will share this devaluation of a certain Google and In-Q-Tel investment with the Alphabet Google thing?
This report is a marketing play. I hope many Fortune 50 companies turn to Forrester for guidance in predictive analytics. If I were a betting person, I would take the FanDuel-type and DraftKings-type approach to certainty. Oh, I just asked myself, “Maybe this is how the mid tier consulting reports already work?” Interesting question.
Stephen E Arnold, October 9, 2015
October 9, 2015
“AP’s Robot Journalists Are Writing Their Own Stories Now” suggests that wizards who insist that automation creates jobs may want to rethink their ideas. Remember the good old days? The Associated Press, United Press International, and other “we use humans” news gathering organizations hired people. Now some of the anecdotes about real journalists are derogatory. I never met a journalist who was inebriated at 9:30 am. Noon? Maybe.
In the write up, the Associated Press, which has a fascinating approach to its ownership, rolled out Automated Insights. The idea was that software filtered and assembled real news stories.
Well, how is that working out?
IBM’s CEO believes that automation will not decimate the work force. Gannett is making an effort to buy up more newspapers so these too can be tooled to the tolerances of the Louisville Courier Journal. Fine newspaper. Fine operation.
And the AP itself? Well, the accumulated loss continues to go up. I recall reading “Employment Rates Are Improving For Everyone But Journalism Majors.”
I noted this passage in a NASDAQ write up:
The prospect of technology-driven job destruction is a matter of great debate for many scientists, technologists, and economists, some of whom predict massive losses in the labor market. In the past, new technology has destroyed jobs and created new ones, but some experts wonder if the increasing power of information technology will leave relatively less and less for people to do.
Journalism majors, unemployed “real” journalists, and contract journalists once called stringers—life is only going to get better. Lyft will make it easier for some folks to become taxi drivers. There are plenty of jobs as data scientists, a profession eager for those who can write prose. There are also opportunities to become experts in search and content processing. Hey, words are words.
Stephen E Arnold, October 9, 2015
October 7, 2015
By chance, my Overflight system spit out two articles which I read one after the other.
The first was “Technological Dark Matter.” The second was “The Tyranny of Choice: Why Enterprise Tech Buyers Are Confused.” Information access mavens seem to be drifting into a philosophical mode. Deeper thinking is probably needed. Superficial thinking is not doing a very good job of dealing with issues such as the difficulty of looking for an image in the British Library collection, the dazzling irrelevance of Web search results, and trivial matters such as the online security glitches experienced by outfits which like to think they are the best and brightest around.
The Dark Matter write up confuses me. The notion of Dark Matter is that “something” is there, but it cannot be located. I don’t want to call it a physicist cheat, but darn it, if one can’t find it, maybe the notion is flawed in some way.
The write up informed me that I come into contact with “internal tools.” Well, no. I think internal tools like the other points in the write up are business processes manifested in interactions with other systems and people. These processes, if not worked out correctly, add friction to a system. Who wants to change a mainframe based system into a cloud service for free or for fun? I don’t, and I don’t know too many people who would or could. Pain, gentle reader, pain is migrating an undocumented mainframe system to a cloud hybrid confection. Nope.
The write up’s points include monetization, security, localization (which I don’t understand), long tail features (but I don’t understand the word “bajillion” either), and micro optimizations (again, baffled).
Nevertheless, the write up sparked my thoughts about the invisible, yet cost adding, functions that are not on the users’, customers’, competitors’, or consultants’ radar. Big outfits have big friction. Inefficiency is the name of the game. Now that’s Dark Matter I find interesting.
The second article struck a chord because it focuses on the relationship between complexity and confusion. The write up is more coherent than the first article. I highlighted this passage:
Brazier [a wizard from Canalys] said “rising levels of complexity” were making it “harder for customers to keep up with everything.” This in turn made it harder for customers to make decisions, he concluded. “Prices are going up. That has clearly restricted demand.”
The complexity thing linked with confusion and prices.
The magic of juxtaposition. Technology outfits, particularly those engaged in information access, have a tough time explaining what their products do, what the products’ value is, and why the information access systems anger half or more of their users.
Consultants explain the problem in terms of governance, a term a bit like bajillions. Sounds good, means nothing. Consultants (often failed webmasters trying to get “real” work or art history majors with a knack for Photoshop) guide the helpless procurement team to a decision.
Based on my brushes with these groups, the choices are narrowed to established companies which are pitching software which may not work. Often a deal will be made because someone knows someone. A personal endorsement is better than an Instagram factoid.
I have three notions floating around in my mental mine drainage pond:
- Technology centric companies are faced with rising technology costs and may have fewer and fewer ways to generate more cash. Not good for investors, employees, and customers. Googlers call this the rising cost of technology’s credit card debt.
- The problems which seem to crop up with outfits like Amazon, Facebook, and Google really gum up the lives of users, partners, and others involved with the company. Whether know-how based, like Google’s Belgium glitch, or legal, like the European Commission’s pursuit of monopolists, costs will be driven up.
- The notion of guidance is becoming buck passing and derrière shielding. Those long, inefficient, circular procurement processes defer a decision and accountability.
Net net: Process friction, confusion, complexity, and cost increases. The new hot buttons for information access and other technology centric companies.
Stephen E Arnold, October 7, 2015
October 7, 2015
I like it when a person tells me that software or a human can predict the future. My question is, “If the predictions are spot on, why is the owner of the prediction system talking? Why not play fantasy football, pick stocks, or hang out at Keeneland during an auction and buy horses whose value will skyrocket?”
The answer is, “Err, well, hmmm.”
Exactly. Predicting the future is a bit like imagining oneself putting on soccer boots and filling in for the injured Lionel Messi. Easy to think. Essentially impossible to do.
The fix is to be fuzzy. Instead of getting into a win-lose situation, there are caveats. I find these predictions and their predictors amusing. Not as enjoyable as the antics of something like IBM cognitive computing marketed by Bob Dylan or the silliness of Hewlett Packard management activities. But close, darned close.
I read “Gartner: Top 10 Strategic Technology Trends For 2016.” I noted this statement from the capitalist tool:
…the evolution of digital business is clearly at the heart of what is covered.
Okay, the write up is going to identify trends which will allow an MBA or a savvy marketer to look at business and understand how “business” will evolve. Darwin to the future, not Darwin from the past, I assume.
The question in my mind is, “Are these retread ideas?”
Here are three “trends” which caught my attention. To get the full intellectual payload, you will need to read the article or, better yet, seek out a Gartner wizard and get the trend thing straight from the horse’s mouth. Yep, right, mouth.
Trend 2: Ambient user experience.
I remember hearing about ambient computing years ago. The idea was that one could walk around and compute. I also ran across Deloitte’s identification of a similar trend months ago. But it was in the late 1990s or early 2000s when an MIT person talked about the concept. Obviously if one is computing whilst walking around, there is an experience involved. With mobile devices outselling tethered devices, it seems disingenuous to talk about this trend. According to Forbes, the capitalist tool:
Gartner posits that the devices and sensors will become so smart that they will be able to organize our lives without our even noticing that they are doing so.
I like posit. The word means “to dispose or set firmly, assume or affirm the existence of, and propose as an explanation.” Yep, posit something that academics and blue chip consulting firms have been saying for a while.
Trend 4: Information of Everything
Now these universal statements are rhetorical tactics which make my tail feathers stand up. “Everything” is a broad concept. A critical reader may want to ask, “Will you provide me with information about line 24 million in Watson’s 100 million lines of code?” The “everything” is going to provide this answer. Nope. Logical flaw. But here’s how the capitalist tool, a font of logical thought, presents this “information of everything” trend:
According to Gartner, by 2020, 25 billion devices will be generating data about almost every topic imaginable. This is equal parts opportunity and challenge. There will be a plethora of data, but making sense of it will be the trick. Those companies that harness the power of this tidal wave of information will leapfrog competitors in the process.
I like the plethora. I like the leapfrog. I like the tidal wave. I have a sneaking suspicion that most folks with a computer device have experienced a moment of information confusion. With “every topic imaginable”, confusion is a familiar neighbor. Now how long has this concept of lots of information from lots of devices with communications capability been around? Forbes, the capitalist tool, published in June 2014 “A Very Short History of the Internet of Things.” Had the Forbes writer taken the time to look at that article, he would have discovered that the concept poked its nose into the world in the early 1930s. Well, that is only 80 years ago. But it is a trend. Hmm. Trend.
Now my favorite.
Trend 9. Mesh App and Service Architecture
The notion that computer systems should be able to exchange information is a good one. I can’t recall when I learned about this concept. Wait. No, I remember. It was in 1963 when I took my first class in computer programming. The professor, a fine autistic polymath, explained that the mainframe—a 1710—was a collection of components. He said in 1963 that different machines would talk to one another in the future. Well, there you go. A third rate university with dullards like me in class got a prognostication which seems to be true. That was more than half a century ago. Here’s the modern version of this old chestnut:
More apps are being built to be plugged together, and the value of the combination is much greater than the sum of the parts. As Lyft has integrated with comparable offerings in other countries, its ability to expand its offering for traditional customers traveling abroad and the reverse has meant faster growth with minimal cost implications.
Enough of these walks down memory lane. Three observations:
- These trends are recycled concepts
- The presentation of the trends is a marketing play, nothing more, nothing less
- Mid tier consulting firms are trying really hard to sound very authoritative, important, and substantial.
That would work if footnotes provided pointers to those who offered the ideas before. Whether from a blue chip consulting firm like Deloitte or a half wild computer science professor in the Midwest, the trends are not trends.
We are, gentle reader, looking at digital retreaded tires. A recap. A remold. Old stuff made fresh. Just don’t drive too quickly into the future on these babies. Want to bet on this?
Stephen E Arnold, October 7, 2015
September 27, 2015
The mid tier outfit Forrester has released another report about enterprise business intelligence platforms for the third quarter of 2015. These reports cost about $2,500, so you know the information is red hot, spot on, and objective. Always objective. In the write up “The Forrester Wave: Agile Business Intelligence Platforms 2015”, the report is described as “juicy.” Imagine. Juicy applied to IBM, Microsoft, and Oracle. Let me refresh your memory of juicy’s official definition:
2: rewarding or profitable especially financially : fat <juicy contract> <a juicy dramatic role>
3a : rich in interest : colorful <juicy details>
c : full of vitality : lusty
I am not sure mid tier consulting firms’ reports are “rewarding or profitable especially financially” for the reader. At a couple of thousand per authorized copy of the report, the mid tier firms are likely to be drenched in juiciness. Will this report be lusty, sensational, colorful, and succulent? Nah. This is marketing pulp, gentle reader.
Which are the companies which make the cut? According to this write up, there are a baker’s dozen of agile BI vendors, including:
- Information Builders
- Panorama Software
- Tableau Software
- TIBCO Software.
Scanning this list, I wonder how “agile” IBM, Microsoft, Oracle, SAP, and SAS really are. I know that TIBCO acquired some nifty technology for its analytics functions, and that the founders of Spotfire have moved on to even more interesting analytics at their new company, funded in part by Google and In-Q-Tel. The other firms are ones which have run around the BI bases for years and may have a touch of arthritis; for instance, Information Builders, which kicked off its career in 1975. Qlik was founded in 1993. MicroStrategy flipped on its lights in 1989 and spawned at least one outfit (Clarabridge) which strikes me as slightly more agile than the mother ship. Tableau, now a publicly traded outfit, hung out its shingle in 2003.
GoodData may be the most spry among this group, not because it was founded in 2007, but because the firm landed another $25 million in funding in 2014.
According to the blurb about the report, each of these companies is agile because of several special features these vendors offer their customers. These characteristics are:
First, these 13 vendors’ products allow their business users to be self sufficient. I am not sure I agree; that SAS stuff requires a person to be SAS-sy, which means able to navigate the company’s programming methods with some skill. IBM, Microsoft, and Oracle provide many different ways to skin the business intelligence cat. In my opinion, these companies’ business intelligence technology requires that the business user have the equivalent of a fighter jet maintenance crew to assist them on flights into analysis and visualization.
Second, each company generates knock out visualizations. My thought is that for zippy visualizations, more specialized tools are required. The companies highlighted in this report can deliver slides and graphs which are niftier than those in Excel, but far short of the Hollywood style outputs which come from Palantir and Recorded Future, among other firms not included in the agile list.
Third, each of the 13 companies offers its licensees and customers options and additional features. This is definitely a must have function. Most of the firms in the list of agile BI companies sell services. Some have partners, lots of partners. The business model may be less about being agile and more about selling billable work, but that’s okay. I am not sure inking a six figure services contract delivers agility.
I assume the complete $2,500 report will become available from the companies listed in the report. For now, think agility. Think IBM, Microsoft, and Oracle, along with the 10 other companies.
Remember, these are 13 juicy and agile outfits. Remarkable. Juicy.
Stephen E Arnold, September 27, 2015
September 24, 2015
Gentle reader, I know that knowledge about Spark is as widespread as information about the woes of the Philadelphia Eagles. My understanding of Spark is that it is an open source engine for large scale data processing. It is faster than Hadoop. It is easy to use. It is flexible enough to allow the intrepid Spark aficionado to combine structured query language, streaming, and analytics in one software system. Spark runs “everywhere.” For more about Spark, see this Apache project page.
Spark is one of the next big things, poised to ignite innovation, consulting revenues, and vendor repositionings.
I approached “Game-Changing Real-time Uses for Apache Spark” in order to learn how Spark can change the game for real time data and information work. Game changing means that old school outfits are going to lose because the new game has new rules, new players, and new everything.
The write up identified these ways Spark will change some quite significant markets:
- Credit card fraud detection
- Network security
- Genomic sequencing
- Real time ad processing
My goodness, Spark will become the number one enabling technology for some very problematic market spaces.
Let’s look at what Spark will do to real time ad processing. The write up reports:
One advertising firm uses Spark, on MapR-DB, to build a real-time ad targeting platform. The system looks at user data and decides which ads to show users on the Internet based on demographic data. Since advertising is so time-sensitive, advertisers have to move fast if they want to capture mindshare. Spark Streaming is one way to help them do that.
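The write up does not show any code, so here is a toy, Spark-free Python sketch of the kind of logic such a pipeline implements: take one micro-batch of user events, group by demographic, and pick the ad with the best recent click rate. Everything here is hypothetical for illustration; in a real Spark Streaming job the batch would arrive as a DStream or DataFrame and the running statistics would live in a store such as MapR-DB rather than an in-memory dictionary.

```python
from collections import defaultdict

# Hypothetical micro-batch of user events. In the Spark version these
# would arrive continuously rather than as a Python list.
events = [
    {"user": "u1", "demo": "18-24", "ad": "sneakers", "clicked": True},
    {"user": "u2", "demo": "18-24", "ad": "sneakers", "clicked": False},
    {"user": "u3", "demo": "18-24", "ad": "headphones", "clicked": True},
    {"user": "u4", "demo": "35-44", "ad": "mortgages", "clicked": True},
]

def best_ad_per_demo(batch):
    """For each demographic, pick the ad with the highest click rate in the batch."""
    stats = defaultdict(lambda: [0, 0])  # (demo, ad) -> [clicks, impressions]
    for e in batch:
        key = (e["demo"], e["ad"])
        stats[key][1] += 1
        stats[key][0] += int(e["clicked"])
    best = {}
    for (demo, ad), (clicks, shown) in stats.items():
        rate = clicks / shown
        if demo not in best or rate > best[demo][1]:
            best[demo] = (ad, rate)
    return {demo: ad for demo, (ad, _) in best.items()}

print(best_ad_per_demo(events))
# -> {'18-24': 'headphones', '35-44': 'mortgages'}
```

The point of the sketch is how little of it is Spark: the selection logic is ordinary code that someone still has to write, debug, and keep correct as the data drifts.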
What strikes me is that Spark requires programmers, software engineering, and then integration of different components. If an error manifests itself, the Spark solution may require those who embrace it to perform some old fashioned work.
In a sense, the game hasn’t changed at all. Open source software reduces license fees and provides a developer with some freedom from license restrictions. On the other hand, the difficult task of getting a complex system to work as intended remains.
My hunch is that Spark is an interesting open source project. The consultants and start ups see Spark as an opportunity. The game changing nature of Spark is potential energy, not a sure thing.
Stephen E Arnold, September 23, 2015