
Posts: 192
Reply with quote  #1 
Congratulations to everybody who participated in this contest. I understand Stephen's position in not identifying the other six selected dashboards. I also think a real challenge for us would be to identify the other six winning dashboards ourselves from what we will see in this forum. Any feedback would be very much appreciated, for all of us. Here is our proposal in PDF format:
We generated the dashboard only in vector format; you can see a raster version below.

Attached image: Algebra.png


Posts: 247
Reply with quote  #2 
This dashboard's got a nice clean look.
I like how you've labeled the bullet graph axis with the letters, and at the same time kept it proportional to the numeric grades.
There are a lot of explanatory notes at the bottom ... I might move them to a separate 'Help' page.

Posts: 192
Reply with quote  #3 
Thank you for spending time commenting on our dashboard. It is a very nice effort, which I appreciate. I would very much like to add the comments I attached to this solution. I also urge you to load the PDF and view it with the full advantage of a vector image.

Data layout
1. The screen is split into several distinct areas:
   - Header, with titles and a comparative distribution of the percentage of students in several contexts (class, teacher's students, school, district).
   - Main data, with the current term's data: a table of all the students and four behavior values to focus on.
   - Footer, with explanations of the colors and abbreviations used.
2. Students were sorted in descending order of score, allowing the teacher to quickly focus on bad or good results.
3. Possible correlations between scores and other values (absences, tardies, disciplinary referrals, and detentions) were represented differently than in standard XY correlation graphs with a trend line. We propose the following approach for data correlation:
   - Consider one of the measures (score) as the reference measure and display it as a simple line chart in a convenient sort order, either ascending or descending.
   - Display the corresponding values of the second measure in a light color as an overlapped line chart on a convenient scale (a fit scale) that assures maximum visibility.
   - Display a trend curve of the second measure (not necessarily a line; in our example I use a 3rd-degree fitting curve) in an emphasized shade of the light color.
Interpretation is immediate and more versatile than in the standard approach, allowing the correlation to be estimated flexibly over several ranges in one picture.
We had very good feedback from regular users who were trained to interpret both the correlation graph with a trend line and the overlapped line graphs described above.
It made it easier to see that "the results were decreasing while the absences were increasing".
I personally prefer to use a correlation graph to isolate a group of value pairs rather than to interpret a least-squares trend line.
4. Screen areas, tables and chart grids are aligned as much as possible for an aesthetic display.
5. Everywhere, the math scores and their related graphical representations use just one color: soft blue.
6. Red, orange, and green are used only to reflect the level of the math score, in small shapes.
7. Values like absences, tardies, disciplinary referrals, and detentions were represented in a heat-map-like table, with the most intense color assigned to the highest value in each column.
8. Different shades of gray were used for the rest of the graphical representations (bullets or horizontal bars), focusing on size rather than color.
9. Table grid lines are colored light gray; they do not distract attention, yet they contribute to data alignment.
10. Colors were tested on different screens (laptop LCD, LED, tablet AMOLED, at different brightness levels) and on different color printers (the resulting printed colors can look different), and the results were clear everywhere.
11. The design is very compact, yet it contains almost all of the provided data. It fits on one landscape A4 page or on a 1650x1050 or higher-resolution screen. (Student absence and tardy dates are considered details and are not included in the current dashboard.)
12. The dashboard was designed entirely in our application, every figure or graphical representation being the result of a formula, so just by changing the data a new dashboard can be generated. However, the data was stored slightly differently in our datasets.
13. The dashboard is fully vector-designed (Hyper Analyzer drawings are all vector), providing a clear printout on any paper size and a perfectly crisp display on any laptop, tablet, or projector with a decent resolution.
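[Editor's note: a minimal numeric sketch of the overlapped-line correlation approach described in point 3 above, using made-up scores and absences. The function name `overlay_series` and all data here are hypothetical illustrations, not part of Hyper Analyzer; `np.polyfit` stands in for the 3rd-degree fitting curve.]

```python
import numpy as np

def overlay_series(scores, second, degree=3):
    """Sort by the reference measure (score), descending; rescale the
    second measure onto the score axis (the 'fit scale'); and fit a
    polynomial trend curve to the rescaled values."""
    order = np.argsort(scores)[::-1]              # descending by score
    ref = np.asarray(scores, dtype=float)[order]
    sec = np.asarray(second, dtype=float)[order]
    # Fit scale: stretch the second measure over the reference's range,
    # which gives the overlapped line maximum visibility.
    span = sec.max() - sec.min()
    scaled = ref.min() + (sec - sec.min()) / (span or 1.0) * (ref.max() - ref.min())
    x = np.arange(len(ref))
    coeffs = np.polyfit(x, scaled, degree)        # 3rd-degree fitting curve
    trend = np.polyval(coeffs, x)
    return ref, scaled, trend

scores   = [92, 55, 78, 88, 61, 70, 95, 49]
absences = [ 1,  9,  3,  2,  7,  5,  0, 12]
ref, scaled, trend = overlay_series(scores, absences)
print(ref)     # reference line: scores in descending order
print(scaled)  # overlapped line: absences stretched onto the score axis
```

Plotting `ref`, `scaled`, and `trend` against the student index would reproduce the three layers described: the score line, the light overlapped line, and the emphasized fitting curve.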

Posts: 200
Reply with quote  #4 
Since you've put a lot of effort into detailed critiques, let me try to do the same.

At first look, the overall impression of this dashboard is very favorable.
It is aesthetically pleasing, the layout is concise and easy to follow, and the colors are well chosen overall.

First negative that hits me is the scale from green to red.  I've mentioned this in other threads here, but it just seems like an odd choice to use in a competition where the person judging it has continuously spoken loud and clear against the practice.

I agree with it being a bad idea, for the standard reason that it highlights everything instead of highlighting the small handful of items in need of attention.

I like that the current score is first in line.
I like that a bar chart is used to display the score.
I like that there is a bullet graph for comparison....

1) I don't understand why there is both a bar and a bullet graph side by side, duplicating the measure of the current grade (I think...?)
2) What is the target on the bullet graph? Is it goal? Previous year?
3) The grey of the 'f' range is very heavy and overwhelms the grey of the bar
4) again with the green and the red...  the shape of the intersection (or lack thereof) of the target and the measure is sufficient to show whether the target has been met.

The previous assignments section doesn't do much for me at all. The information is all there, but it's impossible to see patterns or trending.
The sparklines for the previous assessments are good, except that the vertical scales are all based on their own min and max, so the slope of one cannot be compared to the slope of any other, and the same amount of variation is shown for small grade differences for one student as for large differences for another.

I like the heatmap matrix concept for the discipline. I feel it needs some work. 
Perhaps the 'previous' columns could be separated.  
The problematic items need something to draw more attention, or some way to differentiate the categories...? 8 tardies = 2 detentions.  I don't know if that's right, I don't know if that's wrong...

And I don't understand placing the special ed and english language section there at all...
Seems to say 'Special Ed is bad!' and 'Non-native English speakers are bad!'
I'm sure that wasn't your intention, but it has that effect...

The comparison graphs.
I don't get them.
I see that it is trying to correlate the behavioral problems with grade performance.
I don't like the dual y axes though, and the false correlation they imply when the lines cross.
I don't understand what the 3rd light gray line is.
And I just think that
1) this is really a statistical analysis that doesn't need to take up space on the front page of a dashboard of this nature.  It should be clear that an increase in behavioral issues correlates to a decrease in grade performance, and that's really a study for a much larger data set.
2) this is better suited for a scatter plot

I am on the fence on the sections to the right.
Highlighting the top offenders is a good thing, but I feel it's repetitive and could be handled by improving the heat map set up for the main discipline section.
This would leave more room for the distribution charts, which would free up vertical space and aid in the scalability.

The distribution chart itself - I think this is decent.  I would prefer to see distribution polygons which are easier to compare across categories.
And I think further aggregate info and distribution info could have been charted and made better use of that space on the left side of the dashboard (as mentioned above).

All in all, I do think it's pretty good work.  But I do also think there is plenty of room for improvement as well.



Posts: 192
Reply with quote  #5 
I am really happy with your remarks; at least, and at last, I can defend my dashboard :).

1. The score (blue) is not the same as the current grade (bullet). They are similar, but they are not the same. Check the Excel file.
The bullet graph is the current grade vs. the goal. The target of the bullet graph is ... the goal, as it should be. Red when the goal/target is not reached, green when it is surpassed. The ranges are the F, D–C, and B–A grades.

2. I agree about the heavy gray. I noticed it myself; the vector version looks better, and the printout is stunning. Check the PDF above.

3. By using red/green for the target, I could see the personal performance of every student immediately. You can also see that more red is at the bottom of the table, where the bad results are.

4. Dot colors next to the student name. It was not my intention to make a color gradient there, but you can easily see the correspondence on the right side. In other words, not only badly scoring students have absences or tardies. Why not see right away that a top student had some absences, to give him a bit of extra attention?

5. Sparklines. The only way of seeing a variation between 95 and 100 is to use a fit scale. Read the range next to it, in gray and black. Our application allows changing the scale per level, row, or column, or to auto or global. None of them gave a better view than auto. The sparklines are not used for comparison between students, but to show the trend of scores for the same student.

6. The heat-map-like area contains all the non-score values and situations. You can easily see that most of the darker colors go toward the bottom, while a few other gray spots are still present in the rest of the table. So everything that could influence the student's level is there; the color rescales itself from zero to the maximum value for each category, the same as the bars on the right side of the dashboard.

7. Correlations. I am very glad that you took the time to analyze and criticize this part of the dashboard. For the correlation graphs, please read the comment above. Every graph shows that the score decreases as the disciplinary counters increase. The first version was the scatter graph, and I presented it to 4 different people with almost no visual education but a good intellectual level. The answer was a clear 4–0 in favor of this version over the scatter plot version. And don't forget that I am very much aware of Stephen's preferences. I have read every single article on this site and 2 of his books. I would be happy to see more debate on this subject; thank you again for drawing attention to it.

8. Repetition can indeed be annoying. 4 was not that repetitive, especially since all of the elements there share the same concept.

9. One last comment relates to the empty space in the header of the page. It is intentionally left empty for teacher comments. This is always a nice option, especially in the current era of tablets. Even if teachers still print the dashboards on paper and look at them from there, it is nice to be able to add some comments in an empty space on the page.

For me, this submission was not an occasion to prove how many techniques I am able to put on one page. Actually, that bothers me a lot about other dashboards. I was aware it was a competition between people with solid knowledge of visual representation, but ultimately such a dashboard has to be a useful tool for a teacher, not a test for him. Having been aware of Stephen's preferences for a very long time already, I am a big supporter of almost every single idea he has. However, there is room for interpretation for all of us...

Thank you very much for your time and suggestions.


Posts: 247
Reply with quote  #6 
Per the red/orange/green dots ... I would like to come to Danz's defense ...

I don't think Few is averse to using colored dots in a dashboard per se. For example, he uses red and orange dots to indicate items that need attention in his dashboard book, on pp. 199 and 197.

In this case, Danz takes it a step further and gives every student a colored dot. Is it more important to see which students are doing badly, or which ones are doing well? In my dashboard, I tended to draw attention only to the students who were doing badly (with red), but one of the people I had proofread my dashboard mentioned he didn't like that: throughout his whole school career, the teachers' attention was always drawn toward the problem students, largely ignoring the students who were doing well (and who perhaps deserve some attention in the form of praise). Danz's dashboard makes it easy to see both, which adds flexibility, imho.


Posts: 200
Reply with quote  #7 
Grasshopper: "I don't think Few is averse to using colored dots in a dashboard per se. For example, he uses red and orange dots to indicate items that need attention in his dashboard book, on pp. 199 and 197.

In this case, Danz takes it a step further and gives every student a colored dot."

I didn't say that Few is opposed to using colored dots.  I said nothing of the sort, in fact.

There are 2 problems.

1) we are once again at the traffic light analogy.  And I'm sorry, but it is just plain bad practice (and again, the judge of the competition has made it clear that he sees it that way)

2) the "step further" of assigning every student a dot, instead of just the small handful needing attention, is exactly the problem I was pointing out.
When every row has a bright colored indicator, bright colored indicators lose their meaning and usefulness.

Such indicators are meant to be used sparingly in order to make them effective at drawing the user's attention.

Posts: 200
Reply with quote  #8 
danz -
"8. Repetition can indeed be annoying. 4 was not that repetitive, especially since all of the elements there share the same concept."

I didn't mean that the 4 elements were repetitive among themselves.
I meant that the elements as a whole were repetitive of the data already listed at the row level for each student.

As for the correlation charts...  again, we have a correlation that is obvious, and doesn't really need to take up space on a dashboard.
It is also the type of analysis that requires a much larger data set to glean anything useful from, which just further supports the idea of not having it here.
On another note, I have a hard time accepting a small sample of anecdotal evidence in the vein of "some people preferred this chart" when it comes to selecting the appropriate way to display data.
On three of the charts you have contact and/or crossing of the lines.  This jumps out immediately as though it means something.
But it doesn't, because the axis scaling relative to the correlation of the two data sets is arbitrary.
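[Editor's note: the arbitrariness of dual-axis crossings can be made concrete with a small sketch. The numbers and helper names (`to_axis`, `crossings`) are hypothetical, not taken from the dashboard's data: the same two series cross once or not at all depending purely on the axis range chosen for the second series.]

```python
import numpy as np

def to_axis(y, lo, hi):
    """Map data values onto a y-axis with range [lo, hi], expressed as a
    fraction of the plot height (0 = axis bottom, 1 = axis top)."""
    return (np.asarray(y, dtype=float) - lo) / (hi - lo)

def crossings(a, b):
    """Count sign changes of (a - b): how often the two plotted lines
    cross when drawn in the same plot area."""
    d = np.sign(a - b)
    return int(np.sum(d[:-1] != d[1:]))

grades   = np.array([95, 90, 80, 70, 60])
absences = np.array([ 0,  2,  3,  6,  9])

g = to_axis(grades, 0, 100)        # left axis fixed at 0-100
tight = to_axis(absences, 0, 10)   # right axis 0-10: lines cross once
loose = to_axis(absences, 0, 40)   # right axis 0-40: no crossing at all
print(crossings(g, tight), crossings(g, loose))  # → 1 0
```

Identical data, different crossing count: the visual "event" where the lines meet is an artifact of the second axis's range, not of the correlation.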

I'm also still unclear what the 3rd line in the background represents.

The sparklines I still can't agree with you on.
I think a big part of the point is that you don't *have* to see much of a difference when the grades range from 95 to 100.  *not* seeing much difference will help make it clear that the student's performance is consistently high.  It will then also help to highlight those students who *do* have large variations in their grades, which will be completely lost with this set up.
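[Editor's note: the scale trade-off in this sparkline debate can be shown numerically. The scores and helper names (`row_scaled`, `global_scaled`) below are hypothetical illustrations: a per-row "auto" scale stretches every student's sparkline to full height, while a shared global scale keeps amplitudes comparable across students.]

```python
import numpy as np

def row_scaled(y):
    """'Auto' (per-row) sparkline scale: each series is stretched to its
    own min/max, so even a tiny variation fills the full height."""
    y = np.asarray(y, dtype=float)
    span = y.max() - y.min()
    return (y - y.min()) / (span or 1.0)

def global_scaled(y, lo=0.0, hi=100.0):
    """Global scale: every sparkline shares one 0-100 axis, so slopes
    and amplitudes are comparable across students."""
    return (np.asarray(y, dtype=float) - lo) / (hi - lo)

steady   = [95, 97, 96, 98, 100]   # consistently high performer
volatile = [40, 85, 55, 90, 60]    # large real swings

# Per-row scaling: both sparklines span the full height (amplitude 1.0),
# hiding the difference between a 5-point and a 50-point range.
print(np.ptp(row_scaled(steady)), np.ptp(row_scaled(volatile)))
# Global scaling: amplitudes of about 0.05 vs. 0.5 keep the contrast visible.
print(np.ptp(global_scaled(steady)), np.ptp(global_scaled(volatile)))
```

This is the crux of both positions: the auto scale reveals the shape of small variations within one student's history, while the global scale reveals which students vary at all.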

Posts: 192
Reply with quote  #9 

I am sorry; I did not understand you regarding the repetitive elements. I find this "repetition" easier to read. As I said before, I am not going to exercise a new technique just because one was already used in the current dashboard.

I understand your point of view regarding the correlation; I just see it as it is. It works for me and for others; clearly it doesn't for you. The evidence you are pointing to can easily be extended to so many other things, so why not skip them all? Again, I found it interesting to have a quite small graphical representation next to the top 5 situations for each disciplinary case.
The light gray line is the counter value. The dark gray is a higher-degree fitting curve, to be able to see the correlation. The "contact" you saw between them means something only if you are trained to think that way. Until then, it means nothing.

Sparklines. It is not my intention to convince you. I just consider this a better approach and a clearer view for the purpose I mentioned. In some cases a global scale is maybe a better idea, but not here.

About the dots. I remind you that I am very much aware of Stephen's preferences. I consider the color dots good enough in this case. Maybe it is not that bad that some of his statements are also interpreted case by case. The bright color is indeed used to see something: I wanted to see the right mark next to every student's name, no matter where it appears on the dashboard.


Posts: 200
Reply with quote  #10 
"I am sorry; I did not understand you regarding the repetitive elements. I find this 'repetition' easier to read. As I said before, I am not going to exercise a new technique just because one was already used in the current dashboard."

I think you still misunderstand me.
Either that or I don't understand your reply.

I am not saying anything about the technique being repetitive....? Not sure what you even mean by that.

I am saying that in the context of your heat map matrix where you are already addressing the behavioral issues, the info in these 4 blocks already exists.
Enhancing the heat map set up, to me, would be a far better way to highlight this information than creating these large blocks that repeat the same information.

As far as crossing lines is concerned - in a proper line chart, lines crossing each other *does* mean something!
It means one value was exceeding the other, and now that has switched.  That's important!
And you aren't "trained" to think that...that's just what it means when that happens.

So creating a chart where that *doesn't* mean something is counterintuitive and dangerous, IMO.

The rest we'll just have to agree to disagree ;)


Posts: 192
Reply with quote  #11 
Finally I can follow you :)

Probably the most useful (for me) comment. Indeed, I repeat the values found in the heat map. It is redundant info, which I realize now. This could have been done better.

Crossing lines. The switching-value interpretation makes sense if one scale is used. If there are 2 scales and the axis colors and value labels have different colors, I would rather ignore that. But as you said, I agree with you that we disagree.

Posts: 851
Reply with quote  #12 


Overall, your dashboard exhibits many good practices. Below are the areas that you could improve.

  1. I agree with Grasshopper about the instructions at the bottom of the dashboard. Most of this information does not need to reside permanently on a dashboard that a teacher would use on a daily basis. Placing them in a separate Help page would work best.
  2. On a true dashboard that is used for rapid performance monitoring, scoring everything with traffic-light colors isn’t necessary. One of the downsides of this practice is that it prevents you from using colors as alerts that will catch the reader’s eye instantly. Overuse of color – in this case, all of the red, yellow, and green traffic-light icons – causes the red icons, which are probably the most important ones, to pop out less. In the case of the icons that appear to the left of the student names, given that you’ve sorted the students by grade, these icons aren’t really needed to spot the highest and lowest grades.
  3. The latest standardized assessment score is useful context, but it doesn’t show how the student is doing in the class, so it doesn’t deserve to be featured as you have with a column of traffic light icons, a column of text values, and a column of horizontal bars.
  4. In the bullet graphs, the bars, which represent the most important data, should stand out more. Making them a little darker would have accomplished this.
  5. The current design only accommodates five assignment scores, but more assignments are coming. How will they be displayed? Actually, it isn’t necessary to show the precise score for each assignment. Your sparklines adequately show the students’ assignment scores as a pattern of change (getting better, getting worse, etc.).
  6. The way that you’ve displayed the Absences, Tardies, Detentions, etc. works well for most of the items, but not for the absences and tardies. This attendance information should be shown through time so the teacher can see when they occurred and if they occurred in clumps (e.g., several days absent in a row).
  7. The assessment scores, which you’ve displayed next to the absences, tardies, etc. on the right side of the dashboard, have no direct relation with attendance and behavior. It would have been more appropriate to show the student’s current score in relation to these.
  8. Also in the sections on the right, a limit of five items per category (e.g., absences) is arbitrary. What happens if six students have significantly high absences?
  9. Rather than commenting on the effectiveness of the graphs on the right side of the dashboard, which others have already done, I'll point out instead that a correlation display would be useful for data analysis, but not for performance monitoring.

Stephen Few

Posts: 192
Reply with quote  #13 
Thank you, Stephen.

This is a very useful answer for me and for most of us.

1. Agreed. Too much text at the bottom. I saw that; there would be less on a day-by-day dashboard.

2. The idea for the color dots was to relate them to the right side, to point out immediately, for instance, that even a student with a high score has absences. My approach makes sense in a context of equal attention to all the students; then the colors would easily identify the level in any part of the dashboard. After I read your comment, I realized that was not the case.

3. Very much agreed! Three representations of the same value is too much.

4. Agreed. The colors should have been more distinct.

5. There are 2 things: the last 5 assignments (5 numbers, which I thought were all from this term) and the last 5 years' history (sparkline). I agree that more assignments would fit better in a sparkline; I just assumed that "Last 5 assignments" was specified intentionally, so the space would always need to accommodate at most 5 values.

6. I understood your point of view after your comment. I simply did not think about the evolution of absences over time. It would have been an absolutely different story if the assignment scores had been provided with dates! Then we could have shown a possible correlation between scores and absences.

7. :( I have to say I come from a different school system; my wrong impression was that the assessment score was the student's current level in Algebra (which is why it was chosen as the sorting criterion), and that the grade reflected the student's general level. That was my second criterion.

8. The limit of 5 is indeed a possible issue. I would probably reconsider by using the (maximum) first 5 absence values with a list of students next to them. However, the designed space had to stay constant, at least from the point of view of the software I use.

9. Data analysis vs. performance monitoring. Sometimes there is quite a thin line between them. Sometimes we need a quick analysis to see the big picture. Here, that might not be the case.

Thank you again; this is a review that will definitely help me improve. For me, that was the main goal of my participation in this competition.


Posts: 192
Reply with quote  #14 
Based on the reviews of my dashboard and the others, I made a different version. Comments?

Attached image: updated_dashboard.png


Posts: 14
Reply with quote  #15 
Sparklines. I like your original implementation better, because it’s easier to track the high/low numbers to their values in the sparkline. It’s similar to Tufte’s financial format, where he uses red and blue numbers to reference red and blue dots on the sparkline.

Abbreviations. I find these disorienting – LaSc, Lt, Ab, Ta, R, Rp, D, Dp – and difficult to remember when returning to the dashboard. I might delete the absence/tardy numbers, replace the discipline columns with a graphic, and find a way to spell out LaSc (which I assume is "last score").

Absence/Tardy Graph. This works for me. More concise than a strip plot. 

Right column. The graphs in this column confuse me, especially the top chart. I think the bottom charts are timelines. Maybe an x-axis label would help clarify (and maybe a line graph). 