Discussion


bpierce

Moderator
Registered:
Posts: 100
#1

For his April/May/June 2015 Visual Business Intelligence Newsletter article, Stephen asks, "What Do Data Analysts Most Need from Their Tools?" Some vendors say data analysts need tools with natural language processing (NLP) and artificial intelligence (AI). Stephen believes data analysts' needs are simpler than that; their needs haven't changed in decades, but have yet to be addressed. In the article, Stephen describes which features are lacking in current tools, examines why they're lacking, and explains what could be done to change this.

What are your thoughts about the article? We invite you to post your comments here.

-Bryan

Jpkelly1

Registered:
Posts: 1
#2
I don't recall ever seeing you comment on MicroStrategy, which is the BI tool my company uses. In their recent version 10 release a couple of weeks ago, they introduced significant changes to their visualization tool (marketed as MicroStrategy Desktop). Do you have any thoughts about this tool?

Thanks
John
sfew

Moderator
Registered:
Posts: 838
#3
John,

I last reviewed MicroStrategy 2.5 years ago. At that time their EDA capabilities were still far from prime time. Do you believe that this is no longer the case?

__________________
Stephen Few
bidelivery

Registered:
Posts: 1
#4
"Spinning pie charts with variable speeds, both in forward and in reverse"....  Thanks for the laugh, Mr. Few!  That's a good one.  Very nice article profiling the top 4 tools doing EDA.  Keep up the great work!

Thanks,

Daniel

__________________
Daniel Smith
heinzel

Registered:
Posts: 16
#5
Stephen,

I've enjoyed reading this article. To highlight just two things: the list of guiding principles for providing good tools at the bottom of page 4 is spot on. It almost makes me want to create a company that designs based on those principles... if I had the money. Also, I am relieved to see that the tool my company's management seems to like (Tableau) makes it into the pack of viable EDA tools. Thank you for discussing those.

On page two, there is a three-item list headed "Data analysis software should never attempt any of the following". I am wondering if that list is driven by a static view of tools. The comparison I have in mind is chess programs. When they emerged in the mass market about 30 years ago, they were all beatable by skilled amateurs. One could have put forth the same list about them. But they have evolved to a level that is now unbeatable by the most skilled chess players on the planet, albeit with the help of extreme amounts of computing power. In the future, why wouldn't a mobile phone beat the chess world champion? And this evolution, even though chess computers do not teach you why they make the moves they make, has most likely not hindered humans from becoming more skilled players ("Analyst awareness and thinking").

Why couldn't data analysis software evolve in the same way? Today, suggesting something that only helps the novice; tomorrow, making best-in-class suggestions that a skilled analyst appreciates? Are we slowing that evolution by pointing out the inadequacy of their early stage in the life cycle, missing the big picture that you have to take step one in order to take step two? I don't have the answers to those questions, but I'd hate not to see them asked.

Best,
Matthias
sfew

Moderator
Registered:
Posts: 838
#6
Matthias,

Playing chess is an ideal activity for a computer because it can be managed by referring to a database of all possible moves and responses and by following a set of rules. This doesn't require thinking as I'm using the term. If computers that can think are ever developed, we can decide at that time if it is appropriate for them to do our data sensemaking for us. As long as computers cannot match our data sensemaking ability, we shouldn't use them in a way that takes us out of the loop of the process in any significant way.

Every cognitive ability that we stop using languishes over time. Many of the abilities that our hunter/gatherer ancestors had, which we no longer use, have diminished. Their highly developed visual thinking ability is one of the losses. Changes in the ways that we use our brains resulting from the agricultural revolution and the industrial revolution (including our current information technologies) have actually rendered us less knowledgeable as individuals than hunter/gatherers. We are much more knowledgeable in the aggregate today (i.e., the combined knowledge of individuals with specialized knowledge), but we have lost a great deal of our ability. We can argue that the tradeoff was worthwhile, which it certainly was in many ways, but where would we be if our agricultural/industrial infrastructure began to break down or be disrupted on purpose by people with ill intent?

As we continue to develop technologies today, we need to do so with greater awareness and forethought than our predecessors. Regarding computers, for every possible application we should ask, "Is this something that we want computers to do?" The ramifications of these decisions are not trivial.

__________________
Stephen Few
jo_m

Registered:
Posts: 1
#7
Hi, I was just wondering if, in your experience, vendors and developers ever adapt and make changes based on your comments, articles and blogs?  I've been an avid follower for several years and attended one of your courses so I know you are consistently sending out these messages.  I don't have much experience with many of these tools but wondered if the companies were listening as well as the Analysts?
If not, what can we do as an Analyst community to get them to take notice?
Many Thanks
Jo

__________________
Jo
sfew

Moderator
Registered:
Posts: 838
#8
Jo,

My fingerprints show up in several products, including those on my list of viable EDA tools, but the degree to which vendors respond to my suggestions is always limited. In general, the suggestions that are implemented are those that are convenient. If a suggestion conflicts with the product's architecture, if the design/development teams have already invested significant time (and/or ego) into an approach that I warn them against, or if the sales team exercises too much influence over product design decisions, my suggestions are always ignored.

Data analysts have little influence over the direction in which their tools are developed. This is because data analysts have little influence in their organizations over purchase decisions. Unless a vendor comes along that is committed to best practices over maximum revenues, data analysts will only gain influence over the development of their tools if they manage to get a seat at the table of influence within their own organizations. This will require a change in culture. It won't be enough for this change to take place in a few organizations. A critical mass must be reached.

Don't let this stark reality discourage you. Sometimes our voices are heard. Sometimes vendor employees who give a damn manage to get things done despite competing interests. We must do what we can to change the culture, despite resistance and setbacks. This is what I'm trying to do. In time our efforts will pay off, or they won't, but if we don't make the effort, the failure will be partially ours.

__________________
Stephen Few
mramoju

Registered:
Posts: 1
#9
I enjoyed reading your article. I worked for several years developing analytical tools. In the process, I encountered several "data analysts" who often expect the tools to do everything at the click of a button, including the real analysis that should be done by the human brain, and to produce a fancy report to share with others. They want everything automated, including the human brain's thought process. In your article, you considered them "novice" analysts. I think that makes sense.
dimitri_b

Registered:
Posts: 1
#10
Great article, thank you.

I am wondering if the vendors have a choice, and if yes - how much choice?

Given that most of them are public companies with shareholders, if a vendor wants to do something that Wall Street doesn't like, will Wall Street allow the vendor to do it?

In other words, maybe the whole system is set up in such a way that it makes the whole R&D process dysfunctional? And if it takes people like Steve Jobs to go against the system, with any chance of success, then there are simply not enough of them to change the industry?

__________________
Regards,
Dimitri.
sfew

Moderator
Registered:
Posts: 838
#11
Dimitri,

When software vendors decide to go public, they rarely do so because it will help them produce better products. They do so to put money in their pockets. The only people who have any real say in the matter are executives who stand to gain the most. The result, as you point out, is a loss in control. Once a company goes public, it must cater to the wishes of its shareholders, and those wishes rarely coincide with optimal product decisions. This process begins long before the actual public offering. It begins as soon as the company sets its sights on going public. From that point on the company's decisions are designed to make them look good to Wall Street, whose interests rarely align with the interests and needs of the company's customers, especially the people who use its products. So yes, it would be difficult for a publicly traded analytics company to prioritize the needs of data analysts. In fact, it would be difficult for any company, public or not, to create great products for data analysts if its leaders are motivated by personal wealth. The best products are created by companies that care above all else for doing good work.

__________________
Stephen Few
CarlCook

Registered:
Posts: 2
#12
I'm a vendor, and I'm listening.  I've just encountered Stephen, and sir, I've ordered your first 3 books and will read them carefully, to be followed later by reading "Signal".  I've been involved in data analytics for more than 20 years, which in our case spans heterogeneous data access, synchronization, cleansing, conversion, visualization, anomaly detection, event detection, modeling, prediction, and optimization, online and in real time, principally in the commercial/industrial and financial sectors.  Part of our time is spent building tools (50% licensing), and part is spent using them ourselves (50% services) to help customers improve their performance.  And so, we eat our own cake :)  Most of what we analyze is time series, and we've built our tools from that perspective.  We are starting our fourth-generation suite of tools, moving to the browser, and so we have the opportunity to do things "more right" than before.  We are a private company, so we have no one to satisfy but our customers and ourselves.  I recognize that much, if not all, of what Stephen says is true, so I'm excited to join his following.
__________________
Thanks,
Carl
ChrisGerrard

Registered:
Posts: 30
#13
Thanks, Stephen, for succinctly and accurately covering the deficiencies with existing data analysis tools.

I, too, wonder why the vendors haven't progressed further with providing better tools. By 'better' I mean providing a more effective, intimate coupling between the human and the tool vis-a-vis the essential elements of data analysis.

As I see it, there are a limited number of things in the front rank of data analysis, led by deciding: which fields to see; how they're sorted (usually nested); how the measures are aggregated (sum, average, standard deviation, etc.); and filtering, i.e. choosing which records to include or exclude.
With these basic functions, one can create analyses that span the vast majority of the data analytical space, at least in terms of covering the basic data-analytical questions people have.
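For concreteness, these four basics can be sketched in a few lines of pandas; the DataFrame and its column names below are invented purely for illustration:

```python
import pandas as pd

# Made-up data standing in for any tabular source.
orders = pd.DataFrame({
    "Region":  ["East", "West", "East", "West"],
    "Product": ["A",    "A",    "B",    "B"],
    "Amount":  [10.0,   20.0,   30.0,   40.0],
})

result = (
    orders[orders["Region"] == "East"]       # filtering: include/exclude records
    [["Product", "Amount"]]                  # choosing which fields to see
    .groupby("Product")["Amount"].sum()      # aggregating the measure (sum)
    .sort_values(ascending=False)            # sorting the result
)
```

Each line corresponds to one of the four basic operations; everything beyond this, in most real analyses, is composition of the same four moves.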

Tools that cover these basics well, by providing intuitive user interfaces that surface them in a way that imposes minimal burdens on the human (in the best case the tool would be invisible), provide a huge amount of value.

There are a couple of examples of tools that have done this.

First: FOCUS was created in the age of COBOL data processing and report programming, providing simple, (largely) non-procedural English language syntax representing the basic functionality. As an example, if you wanted to know the sum of Salaries and Expenses for each Department, you would code:
  TABLE FILE EMPLOYEE
     SUM Salary, Expense
     BY Department
  END
or, if you wanted to show the data across the page, once for each Month, you would code:
  TABLE FILE EMPLOYEE
     SUM Salary, Expense
     BY Department
     ACROSS Month
  END
In its day, FOCUS was beautiful; it elevated data analysis from the realm of technical programming to the realm of non-technical business data analysis. Unfortunately, FOCUS was created as a mainframe tool, and it failed to adapt to the changes in human-computer interaction that erupted in the 1980s and continued into the 1990s.
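For readers who never used FOCUS, the two queries above translate roughly into a group-by and a pivot in pandas; the column names mirror the FOCUS example, and the data is made up for illustration:

```python
import pandas as pd

# Stand-in for the EMPLOYEE file in the FOCUS example; values are invented.
employee = pd.DataFrame({
    "Department": ["Sales", "Sales", "IT", "IT"],
    "Month":      ["Jan",   "Feb",   "Jan", "Feb"],
    "Salary":     [1000,    1100,    1500,  1600],
    "Expense":    [200,     250,     300,   350],
})

# SUM Salary, Expense BY Department
by_dept = employee.groupby("Department")[["Salary", "Expense"]].sum()

# SUM Salary, Expense BY Department ACROSS Month
# (ACROSS spreads values across the page, i.e. a pivot on Month)
across_month = employee.pivot_table(
    index="Department", columns="Month",
    values=["Salary", "Expense"], aggfunc="sum",
)
```

The point of the comparison is how little ceremony either notation requires: both FOCUS's BY/ACROSS and the group-by/pivot pair express the analysis directly, with no procedural code.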

Second: Tableau appeared in the age of big-technology BI, where data analysis had regressed to the realm of highly technical work that could only be done by specialized technical workers. Tableau was a flash of genius in its surfacing of the basic operations as direct operations on fields in the UI. Replicating the examples above:
Sum of Salary & Expense by Department
  - put Department on Rows, either by drag/drop or double-clicking Department in the data window
  - add Salary, via any of a number of mechanisms; Tableau presents 'appropriate' viz types for each
  - add Expense; the available mechanisms and presentation depend upon the Salary viz
In order to show the same information across the page by Month, first do the steps above, then
  - put Month on the Columns shelf
Tableau has some idiosyncrasies based on its built-in, context-sensitive presentation choices that can be confusing to new users, but these are largely marginal compared to its benefits.
As Stephen pointed out, Tableau has not stayed on the path of making data analysis as simple and straightforward as possible. One way to look at it is that Tableau was created to implement a specific concept of what data analysis is, leading to a design space defining what a tool implementing that concept would be. Seen this way, Tableau has suffered, and continues to suffer, from the limitations of its origins, including half-baked functionality and poorly implemented functionality outside that initial space (e.g. bullet graphs built upon existing reference lines/bands, with poor results).

Full disclosure: I worked for IBI, FOCUS's vendor, in the 80s and 90s, first as a consultant, then as a product manager in their PC and Unix Divisions, and got to see first hand the missteps that led to FOCUS falling from being the best product in the business data analysis world to a seat at the me-too table. I've also been using Tableau since 1996, first for my own purposes, and for the past five years as a BI/BDA consultant specializing in Tableau, where I work with clients and try to help them recognize and take advantage of all the opportunities Tableau makes possible.

I'm increasingly worried about Tableau, partly for the reasons Stephen noted: they've strayed from the path of being the best possible tool, for whatever their reasons. A very big worry is the increasing, and accelerating, elevation in the Tableau world of those people who perform amazingly complex and convoluted things with Tableau. As many people have demonstrated, it's possible to coerce Tableau into doing amazing things. But that's the problem: recognizing and applauding feats of complicated data preparation and leveraging of subtle side effects builds and reinforces the status quo, entrenching the way Tableau is as the One True Way, and making it increasingly difficult to change course and leave behind all the hard-won acclaim and the value it represents, if only of the "look how clever I am" kind.

And now that Tableau is worth over $7B it feels very much like it's a case of "well, we're clearly successful, so thanks for your ideas but we're OK."

Maybe someone will come along with the drive to create the next great human-centered data analysis tool, one that recognizes the fundamental lessons of the past, particularly of the best modern tools, and extends the principles of making as much of data analysis as possible as simple and easy as possible for real non-technical people. I hope so.