Create a Treemap Chart with Highlighting Functionality

i. Drag SUM(Measure) to Size in Marks card
ii. In the Marks card, change the mark type from Automatic to Square

iii. Drag dimension to Label in Marks card

iv. Right click in Data pane and select Create Parameter
a. Choose Name: Select Dimension to Highlight
b. Data type: String
c. Allowable values: List
d. Add from Field: Dimension
e. OK

v. Right click on Select Dimension to Highlight and Show Parameter Control

vi. Right click in Data pane and select Create Calculated Field
a. Give the calculated field a name: Highlight Dimension
b. Add formula: [Dimension] = [Select Dimension to Highlight]
c. OK

vii. Drag Highlight Dimension to Color in Marks card
viii. Click the down arrow on the dimension pill and convert it to Attribute
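The Highlight Dimension field in step vi is just a Boolean comparison between the dimension and the parameter. As a sketch, here is the equivalent logic in Python (the function and variable names are hypothetical, used only for illustration):

```python
def highlight_flags(dimension_values, selected):
    """Mimic the Highlight Dimension field: [Dimension] = [Select Dimension to Highlight].
    Each mark gets True/False, which the Color shelf maps to two colors."""
    return [value == selected for value in dimension_values]

# Example: a Region dimension with "West" chosen in the parameter control.
regions = ["East", "West", "Central", "South"]
print(highlight_flags(regions, "West"))  # [False, True, False, False]
```

Because the field returns only two values, Tableau colors the matching squares one way and everything else another, which is exactly the highlighting effect.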

Your final output will look similar to this:

Have fun vizzing!

Build Statistical Process Control Into Your Tableau Reports

Introduction

“We see first what stands out” (Berinato, 2016, p. 23). One of the books I’ve been reading recently is Good Charts by Scott Berinato. The book’s purpose is to educate data practitioners on the impact good data visualization has on influencing an audience. I’m reading it because one of my biggest challenges as an analytics professional is making the data tell its own story. Charts that tell their own data story matter because the purpose of my visualizations is to influence leadership to act: the clearer the data story, the more likely my audience will take the action I’m encouraging them to take.

Expert Example

ZS Associates’ success depended on visualizing its optimization work. Andris Zoltners, a founder of the global sales consultancy, commented that the sales force territory solution he was proposing wouldn’t have gained traction without the assistance of data visualization. Adoption of the solution increased once companies could see its results (Berinato, 2016, p. 23).

Personal Experience

I recently had success applying statistical process control to same-day lead conversion: it alerts on out-of-control data points, and those points become the starting point for identifying root causes.

What Is Statistical Process Control?

Processes fall into one of four states: 1) the ideal, 2) the threshold, 3) the brink of chaos, and 4) the state of chaos (Berardinelli, n.d.). When a process operates in the ideal state, that process is in statistical control and produces 100 percent conformance. It has proven stability and target performance over time; it is predictable, and its output meets customer expectations. However, if the process is in a state of chaos, then the customer experience is unpredictable and will lead to low customer satisfaction. Statistical process control is a Six Sigma tool that identifies when a process is out of control (the chaos state).

History

Control charting was developed at the Western Electric Company in the 1920s to determine whether process variation represented signals or merely noise. The theory is that less process variation results in fewer defects (Jones, 2014, p. 106). A process that is out of control can lead to an inconsistent, or even worse, poor-quality customer experience. With statistical process control charting, report users can identify issues and their causes sooner through alert-based reporting.

Build It in Tableau

“Using control charts is a great way to find out whether data collected over time has any statistically significant signals, or whether the variation in the data is merely noise.” (Jones, 2013) “Our eyes go right to change and difference – peaks, valleys, intersections, dominant colors, outliers” (Berinato, 2016, p35)

Anatomy of a Control Chart
1. The time series data: NBA shooting conversion data by day for the 2017/2018 season
2. The average line: The average shooting percentage over a period of time
3. The control limits
a. Upper Control Limit (UCL): +3 SD [QM] or N + (3 * mR / 1.128)
b. N: the average of the measure over the period (the center line)
c. mR: the average moving range (absolute period-to-period change)
d. mR_UCL: 3.267 * mR
e. Lower Control Limit (LCL): -3 SD [QM] or N - (3 * mR / 1.128)
4. Signals
a. Outliers: Data points either above the UCL or below the LCL
b. Trends: Six or more points either all ascending or all descending
c. Shifts: Nine or more points either all above or all below the average line
5. Steps
a. Quick method (see there’s an issue)
i. Add a WEEK(Date) Measures pill to Columns shelf
ii. Create a calculated field for Field Goal %
a. Note: FG% in data set won’t aggregate accurately by week so a calculated measure needs to be created
iii. Calculated Field formula: SUM([FGM])/SUM([FGA])
iv. Name field Field Goal %
v. Add Field Goal % pill to Rows shelf
vi. Add Player pill from Dimensions to Filters pane
vii. Filter on Anthony Davis
viii. Show Me then line chart
ix. Add average, UCL, and LCL reference lines
x. Right click on the y axis
xi. Click ‘Add reference line, band, or box’
xii. Add an average line by selecting Value and Average
xiii. Click ‘Add reference line, band, or box’ again
xiv. Distribution then Scope (Per Pane) then Computation (Value: Standard Deviation, -3,3) then Formatting (Line: Dotted, Fill: None) then OK
b. Rigorous method (tell me there’s an issue)
i. Add a WEEK(Date) Measures pill to Columns shelf
ii. Create a calculated field for Field Goal %
a. Note: FG% in data set won’t aggregate accurately by week so a calculated measure needs to be created
iii. Calculated Field formula: SUM([FGM])/SUM([FGA])
iv. Name field Field Goal % and add the Field Goal % pill to the Rows shelf
v. Add Player pill from Dimensions to Filters pane
vi. Filter on Anthony Davis
vii. Show Me then line chart
viii. Add a second Field Goal % pill, but drag to right side of line chart to create a dual axis chart
ix. Go to Marks card then select Field Goal % 2 then circle then Color then orange
x. Create a Moving Range chart
a. Drag another instance of Field Goal % pill next to other pills in rows shelf
b. Quick table calculation then Difference then Edit Table Calculation then Customize then Add ABS() around the equation in the Formula: window then change Name: to Field Goal % mR
c. Add a second Field Goal % mR pill, but drag to right side of Field Goal % mR line chart to create a dual axis chart
d. Go to Marks card then select Field Goal % mR 2 then circle then Color then orange
xi. Add average, UCL, and LCL reference lines
a. Right click on the y axis
b. Click ‘Add reference line, band, or box’
c. Add an average line by selecting Value and Average
d. Create calculated fields for MR_UCL, UCL, and LCL
i. Field Goal % mR UCL: 3.267*WINDOW_AVG([Field Goal % mR])
ii. Field Goal % UCL: [Field Goal %] + 3*WINDOW_AVG([Field Goal % mR]) / 1.128
iii. Field Goal % LCL: [Field Goal %] - 3*WINDOW_AVG([Field Goal % mR]) / 1.128
xii. Add these three calculated fields to the detail button in the All Marks card
a. Right click on the Field Goal % y axis
b. Click ‘Add reference line, band, or box’
c. Line then Line (Value: Field Goal % UCL) then Formatting (Line: Dotted, Fill: None) then OK
d. Click ‘Add reference line, band, or box’
e. Line then Line (Value: Field Goal % LCL) then Formatting (Line: Dotted, Fill: None) then OK
xiii. Add Signals to graphs
a. Create two new Calculated Fields for the signals
a. Field Goal % Signals [code for workbook below]:
IF [Field Goal %] > WINDOW_AVG([Field Goal %]) + 3 * WINDOW_AVG([Field Goal % mR]) / 1.128
THEN 'Outlier'
ELSEIF [Field Goal %] < WINDOW_AVG([Field Goal %]) - 3 * WINDOW_AVG([Field Goal % mR]) / 1.128
THEN 'Outlier'
ELSEIF [Field Goal %] > LOOKUP([Field Goal %], -1) AND
LOOKUP([Field Goal %], -1) > LOOKUP([Field Goal %], -2) AND
LOOKUP([Field Goal %], -2) > LOOKUP([Field Goal %], -3) AND
LOOKUP([Field Goal %], -3) > LOOKUP([Field Goal %], -4) AND
LOOKUP([Field Goal %], -4) > LOOKUP([Field Goal %], -5) AND
LOOKUP([Field Goal %], -5) > LOOKUP([Field Goal %], -6)
THEN 'Trend'
ELSEIF [Field Goal %] < LOOKUP([Field Goal %], -1) AND
LOOKUP([Field Goal %], -1) < LOOKUP([Field Goal %], -2) AND
LOOKUP([Field Goal %], -2) < LOOKUP([Field Goal %], -3) AND
LOOKUP([Field Goal %], -3) < LOOKUP([Field Goal %], -4) AND
LOOKUP([Field Goal %], -4) < LOOKUP([Field Goal %], -5) AND
LOOKUP([Field Goal %], -5) < LOOKUP([Field Goal %], -6)
THEN 'Trend'
ELSEIF [Field Goal %] > WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -1) > WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -2) > WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -3) > WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -4) > WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -5) > WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -6) > WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -7) > WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -8) > WINDOW_AVG([Field Goal %])
THEN 'Shift'
ELSEIF [Field Goal %] < WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -1) < WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -2) < WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -3) < WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -4) < WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -5) < WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -6) < WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -7) < WINDOW_AVG([Field Goal %]) AND
LOOKUP([Field Goal %], -8) < WINDOW_AVG([Field Goal %])
THEN 'Shift'
ELSE 'In Range'
END

b. Field Goal % MR_Signals [code below]:

IF [Field Goal % mR] > 3.267 * WINDOW_AVG([Field Goal % mR])
THEN 'Outlier'
ELSEIF [Field Goal % mR] > LOOKUP([Field Goal % mR], -1) AND
LOOKUP([Field Goal % mR], -1) > LOOKUP([Field Goal % mR], -2) AND
LOOKUP([Field Goal % mR], -2) > LOOKUP([Field Goal % mR], -3) AND
LOOKUP([Field Goal % mR], -3) > LOOKUP([Field Goal % mR], -4) AND
LOOKUP([Field Goal % mR], -4) > LOOKUP([Field Goal % mR], -5) AND
LOOKUP([Field Goal % mR], -5) > LOOKUP([Field Goal % mR], -6)
THEN 'Trend'
ELSEIF [Field Goal % mR] < LOOKUP([Field Goal % mR], -1) AND
LOOKUP([Field Goal % mR], -1) < LOOKUP([Field Goal % mR], -2) AND
LOOKUP([Field Goal % mR], -2) < LOOKUP([Field Goal % mR], -3) AND
LOOKUP([Field Goal % mR], -3) < LOOKUP([Field Goal % mR], -4) AND
LOOKUP([Field Goal % mR], -4) < LOOKUP([Field Goal % mR], -5) AND
LOOKUP([Field Goal % mR], -5) < LOOKUP([Field Goal % mR], -6)
THEN 'Trend'
ELSEIF [Field Goal % mR] > WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -1) > WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -2) > WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -3) > WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -4) > WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -5) > WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -6) > WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -7) > WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -8) > WINDOW_AVG([Field Goal % mR])
THEN 'Shift'
ELSEIF [Field Goal % mR] < WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -1) < WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -2) < WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -3) < WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -4) < WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -5) < WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -6) < WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -7) < WINDOW_AVG([Field Goal % mR]) AND
LOOKUP([Field Goal % mR], -8) < WINDOW_AVG([Field Goal % mR])
THEN 'Shift'
ELSE 'In Range'
END
xiv. Drag these new calculated fields into the color button of AGG(Field Goal %) (2) and Field Goal % mR (2) marks cards
xv. Format y-axis to percentages
xvi. Create color schemes with the following coloring logic
a. Line charts: medium gray
b. In Range: medium gray
c. Outlier: Red
d. Shift: Orange
e. Trend: Blue
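The control limits and signal rules above can be sanity-checked outside Tableau. Below is a minimal Python sketch (function names are hypothetical) of the same logic: individuals-chart limits at the average ± 3 × mean moving range / 1.128, with Outlier, Trend (seven successive rises or falls, mirroring the six LOOKUP comparisons), and Shift (nine points on one side of the average) classifications. One simplification: the Tableau fields use WINDOW_AVG over the pane, while this sketch uses the whole series as the window.

```python
def moving_ranges(series):
    """Absolute period-to-period changes (the Field Goal % mR logic)."""
    return [abs(b - a) for a, b in zip(series, series[1:])]

def control_limits(series):
    """Individuals-chart limits: average +/- 3 * mean moving range / 1.128."""
    center = sum(series) / len(series)
    mr_bar = sum(moving_ranges(series)) / len(moving_ranges(series))
    spread = 3 * mr_bar / 1.128
    return center - spread, center, center + spread

def classify(series):
    """Label each point Outlier / Trend / Shift / In Range,
    mirroring the Field Goal % Signals calculated field."""
    lcl, center, ucl = control_limits(series)
    labels = []
    for i, x in enumerate(series):
        trend_win = series[max(0, i - 6): i + 1]   # current point + 6 lookbacks
        shift_win = series[max(0, i - 8): i + 1]   # current point + 8 lookbacks
        rising = len(trend_win) == 7 and all(a < b for a, b in zip(trend_win, trend_win[1:]))
        falling = len(trend_win) == 7 and all(a > b for a, b in zip(trend_win, trend_win[1:]))
        if x > ucl or x < lcl:
            labels.append("Outlier")
        elif rising or falling:
            labels.append("Trend")
        elif len(shift_win) == 9 and (all(p > center for p in shift_win)
                                      or all(p < center for p in shift_win)):
            labels.append("Shift")
        else:
            labels.append("In Range")
    return labels

# A stable week-by-week FG% series with one extreme final value:
fg_pct = [0.45, 0.46, 0.44, 0.47, 0.45, 0.46, 0.45, 0.90]
print(classify(fg_pct))  # the final point falls outside the UCL
```

Running a sample series through a sketch like this is a quick way to confirm your Tableau calculated fields flag the points you expect before wiring them to the Color shelf.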

Summary

In conclusion, statistical process control is a statistical tool that helps you understand when a process is in control (the ideal state) versus out of control (the chaos state). Statistical process control charting logic can be built into Tableau reporting so users can determine whether organizational processes are in or out of control. Building SPC logic into your reports enables end users to separate statistically significant issues from normal, noisy variance. Highlighting out-of-control weeks and dates pinpoints when issues happened and narrows the date ranges you need to dig into to understand an issue and fix the problem so it won’t recur. I recommend leveraging statistical process control charting to alert the organization to process issues. Here are some tips to reach that goal.

1) Follow the steps above
2) Visit these websites [iSixSigma.com, Ben Jones at Tableau.com]
3) Read these books [Good Charts by Scott Berinato, Communicating Data with Tableau by Ben Jones, Sales and Marketing the Six Sigma Way by Michael Webb]
4) Get sample data sets and workbook examples from https://public.tableau.com/profile/charlesutton#!/vizhome/NBA_PPG_Trellis_Charts/FieldGoalSPC

References

1. Berinato, Scott (2016). Good Charts. New York, NY. Harvard Business Review Press. p35.
2. Berardinelli, Carl (n.d.). A Guide to Control Charts. iSixSigma. Accessed on 3/24/18 from https://www.isixsigma.com/tools-templates/control-charts/a-guide-to-control-charts/#comments
3. Jones, Ben (2013). How to Make Control Charts in Tableau. Tableau Public. Accessed on 2/9/18 from https://public.tableau.com/en-us/s/blog/2013/11/how-make-control-charts-tableau
4. Jones, Ben (2014). Communicating Data with Tableau. Sebastopol, CA. O’Reilly Media.
5. Webb, Michael (2006). Sales and Marketing the Six Sigma Way. Chicago, IL. Kaplan Publishing. pp 63-64.

The Day in the Life of a Data Analytics Professional

You think being a data professional is all glitz and glamour? Well, it’s not. There is a ton of preparation to do before you can ever get to the insight, and “if you ever get to the insight” should be the real expectation. I’ve worked for many companies, and the one common denominator is a lack of the time and resources needed to get to truly insightful information. I’m diligent about capturing the number of hours per day I spend gathering data for insights.

In this article, I want to cover five main themes within the analytics process:
1. Understanding your customer needs is everything
2. Constant feedback throughout the process ensures alignment to expectations
3. Data quality is the difference between adoption and dismissal
4. KISSING (Keep It Simple, Straightforward, and Include Nice Graphs)
5. PowerPoint is an art that needs to be learned

Most of my time is spent in the definition stage of analytics. Early in my career I didn’t spend enough time on this stage, and it never failed that the final deliverable didn’t align with my requestor’s expectations. This stage is critical to delivering a solution the requestor will adopt. The define stage consists of the meetings with the requestor to extract the ‘real’ business question they are trying to answer. By ‘real’ I mean that a requestor usually starts off with a nebulous question that could send you down several different rabbit holes, wasting your time and taking you off track from delivering an actionable insight that drives business performance. For example, your requestor asks questions like, “How can I increase revenue in my department?”, “Which employees are underperforming?”, or “Why is this trend going up?”. As an analyst, your job is to extract the real driver behind the requestor’s question. Starting with the revenue question, my first response might be, “How many revenue segments are available, and which can realistically be expected to increase?” This question is important because if you just took the revenue question and ran with it, during the collect stage you’d find there could be multiple approaches to increasing revenue, or maybe just one, and perhaps your final recommendation would be an unrealistic solution. You’ve spent several hours gathering data, organizing it into a reasonable data story, and converting it into a stunning presentation while still not providing any valuable insight. This happens all the time. The better you get at the problem detective work, the better the insights delivered in the final output of your analysis. I’ve found it helpful to use structured questioning frameworks to dig deep into the requestor’s problems and find the true problem they’re looking to resolve.
The five whys, fishbone diagramming, and root cause analysis are excellent approaches to digging deeper toward a solution. I must also mention that the stages of analytical research are not linear, but cyclical. I’ll meet with an individual, return to my desk, start researching the available data, and realize I need to ask more questions, so I’ll set up more time with the requestor to get even more clarification. Never be afraid to ask more questions to really understand the problem, but I always recommend communicating up front that analytics is an iterative process, so the back-and-forth problem definition will require multiple meetings. The requestor is usually more open to working with you if they know it’s an iterative process, and they’ll come away understanding their business question better than they did before. As a younger analyst, don’t let any individual push you to just go find the answer without enough information. While you may be working with execs with limited time, they are now your customer, and the only way you can deliver a valuable insight is to follow the process. Always trust the process.

Now that we’ve discussed the definition stage of analytics, it’s time to discuss the collection stage: finding the primary and secondary information needed to answer your requestor’s business question. Collecting data covers activities such as writing SQL queries, extracting data from front-end systems, scraping data from the web, adding tags to web content, joining disparate files in Tableau, or merging Excel data. I’ve had to do one or more of these activities in every project I’ve ever worked on. Since this step can be highly technical and very complex, it is critical that enough time is spent during the definition stage. The best way to reduce the turnaround time of any analytics project is to clearly understand your requestor’s goals, capture them in clear and concise business questions, and focus on finding the best source of data to answer each question. During the collection stage, you may find it difficult to find the perfect piece of data to answer every portion of the request. Don’t get stuck in analysis paralysis chasing the perfect data point for every request. Often inferred data can be just as helpful for driving insights, and in a shorter period. For example, your clients may not have voice-of-the-customer data readily available, or a large enough sample size to be confident in, yet plenty of secondary research data exists on the web from which to infer the customer pain points and how to address them. The point is that during data collection there are many rabbit holes that can kill the efficiency of your research process, cause you to miss timelines, and frustrate the customer. When you encounter a rabbit hole, think outside the box and identify multiple sources of data that can drive insights without being held hostage by the data available from your client.

Once the data is collected, it needs to be organized in a way that makes patterns obvious for research and presentation purposes. Organizing data requires working knowledge of some sort of data visualization tool; Excel, Tableau, Power BI, and Python are the ones I’m most familiar with. Data sets range from small data to big data and everything in between, and the amount of data being worked with will most likely dictate which program you’ll need to use. Don’t think that because you’re not using Hadoop, data lakes, or Python you aren’t doing analytics; many analytic insights can be produced in Excel or in free tools such as Tableau Public or Power BI. What you’re looking for are trends in the data set that provide information not easily identified by looking at the data in its raw form. A couple thousand lines of transaction data from a sales team can be aggregated, graphed, and compared to determine which sales associates are dragging down the team’s performance. Please don’t forget to continuously review the strategic business questions defined during the definition stage. Are you starting to notice a trend? The information collected during the definition stage should always be reviewed, and constant communication with your customer through every stage of the analytical process ensures your analytics stay aligned with their original expectations.

You’ll notice in my graph that data quality is the fourth-ranked activity in terms of where I spend my time. While it’s ranked fourth, it’s probably the most important. You’re only as good as your first impression, and a first impression can be destroyed by providing an insight that can be quickly refuted by other reports in the company. Whenever I find data, I also look for sources of truth. Start with the requestor to help identify sources of data quality. Are there other departments you could collaborate with to support your data story? Can segments be aggregated so that performance trends align with high-level data views? Are there secondary resources that can increase confidence in the data story you’re trying to tell? As you look for data, ask yourself whether you have supporting information that increases confidence in every part of your insight. Your customers will always be skeptical of data, and you will help yourself in a major way if you can address any data concern with supporting, quality-assured documentation.

The most powerful insights are delivered simply and are easy for the customer to understand. KISSING is an acronym I picked up during a Six Sigma training that has stayed with me throughout my career as an analytics professional. It stands for Keep It Simple, Straightforward, and Include Nice Graphs. PowerPoint is usually the tool of choice for presenting information in this day and age. PowerPoint is an art, and I suggest getting better at it every day. In fact, it is still my biggest weakness, so I continue to read and reread books like Slide:ology by Nancy Duarte and Speaking PowerPoint by Bruce Gabrielle, and I recommend you read them too. I find so much time is consumed by the previously mentioned activities that there is little time left to polish my deliverables. I follow a simple PowerPoint rule: use as few words as possible and include high-quality images aligned with your message. Go with an executive summary format: tell the audience what you’re going to say, say it, and remind them what you said. The further removed you are from your audience, the more words you’ll need, but don’t kill the slides with vocabulary. Einstein said it best: “If you can’t explain it simply, you don’t understand it well enough.”

In summary, we covered the analytics process from beginning to end. Spend the most time with your customers up front. With any customer, truly understand their needs and how you can deliver a solution to fulfill those needs. Find the data that aligns with their expectations, and always make sure you have supporting evidence that your data is accurate. Keep the explanation simple, but make it impactful and engaging. Deliver a quality product aligned with customer expectations, and you’ll build a client base that leans on you for insights and recommends you to others looking for better information. This is how you’ll build credibility and be viewed as a source of truth within any organization.

Tableau Time – Creating a Fiscal Calendar


Intro – You are the lead reporting analyst, and your boss approaches you about measuring year-over-year business performance. You reply, “Not a problem,” reformat the calendar dates with Tableau’s built-in week format feature, and voila! You’re done. You forward the report to your boss and eagerly await his appreciative response. What happens instead is an email asking several questions about the calendar dates: “Does this align with our March-to-February fiscal year?”, “Is the report calculating a Monday-to-Sunday week or Sunday-to-Saturday?”, “How come this report doesn’t align with the internal Excel reports previously built?”, and so on. No matter how much you scour the internet, there doesn’t seem to be a one-size-fits-all calendar format to match your company’s fiscal year. Welcome to Tableau: as good as it is as an analysis and visualization tool, it, too, has its limitations. Let me provide two workarounds that can help you build the fiscal calendar your company works with. The two approaches either leverage an Excel file or create a calculated field in Tableau for your dates. Let’s start with the Excel approach to show a two-step methodology. My preference is the calculated-field method because it is built into your Tableau workbook rather than requiring an additional file.

To start off with the Excel approach you will need to have an Excel file with all the dates you will want to view in the Tableau report. Determine how you would want to view your data, but in my experience I create a fiscal week, monthly, quarterly, and yearly view. Here are the steps that I would take to complete this task.

  1. Open a new file in Excel
  2. Label cells A1:E1 date, fiscal week, fiscal month, fiscal quarter, fiscal year
  3. In my example, I entered dates from 1/1/2016 through 12/31/2017 in column A and the fiscal week number (1–52) for each date in column B, starting in B2
  4. Next, in cell C2 create an Excel formula that reads the fiscal week you just created and converts it to a fiscal month. For example, I used IF and AND functions to determine that weeks 1–4 are fiscal month 1 while weeks 9–13 are fiscal month 3. I’ve provided the formula below to assist you in defining your own fiscal months for your organization.
    1. =IF(AND(B2>=1,B2<=4),1,IF(AND(B2>=5,B2<=8),2,IF(AND(B2>=9,B2<=13),3,IF(AND(B2>=14,B2<=17),4,IF(AND(B2>=18,B2<=21),5,IF(AND(B2>=22,B2<=26),6,IF(AND(B2>=27,B2<=30),7,IF(AND(B2>=31,B2<=34),8,IF(AND(B2>=35,B2<=39),9,IF(AND(B2>=40,B2<=43),10,IF(AND(B2>=44,B2<=47),11,IF(AND(B2>=48,B2<=52),12,0))))))))))))
  5. Now, in D2 create an Excel formula that reads the fiscal month and converts it to a fiscal quarter. Again, use the IF and AND functions to determine that months 1–3 are Quarter 1. I’ve provided the formula below to assist you once again.
    1. =IF(AND($C2>=1,$C2<=3),1,IF(AND($C2>=4,$C2<=6),2,IF(AND($C2>=7,$C2<=9),3,IF(AND($C2>=10,$C2<=12),4,0))))
  6. There are multiple ways to define years, but I simply entered the year and dragged the fill handle down to populate it through week 52, restarting at week 1 for the following year.
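The nested IF formula above encodes a 4-4-5 week pattern: weeks 1–4 are month 1, weeks 5–8 are month 2, weeks 9–13 are month 3, and so on. As a cross-check, here is the same lookup sketched in Python (function names are hypothetical), which is easier to audit than a twelve-level nested IF:

```python
# Last fiscal week of each fiscal month in the 4-4-5 pattern used above.
MONTH_END_WEEKS = [4, 8, 13, 17, 21, 26, 30, 34, 39, 43, 47, 52]

def fiscal_month(week):
    """Map a fiscal week (1-52) to its fiscal month (1-12)."""
    for month, end_week in enumerate(MONTH_END_WEEKS, start=1):
        if week <= end_week:
            return month
    raise ValueError("fiscal week must be between 1 and 52")

def fiscal_quarter(month):
    """Months 1-3 -> Q1, 4-6 -> Q2, and so on (the second Excel formula)."""
    return (month - 1) // 3 + 1

print(fiscal_month(9))                  # 3, matching weeks 9-13 -> month 3
print(fiscal_quarter(fiscal_month(9)))  # 1
```

If your organization uses a different pattern (say, 5-4-4), you only need to change the `MONTH_END_WEEKS` list, and the same change maps directly back onto the Excel formula’s week boundaries.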

You now have the Excel file needed to merge with your base data set. Assuming you have already uploaded your base data set which should include a date column, it is now time to upload the Excel file to the Tableau workbook and blend your two data sets together. If you’ve never blended multiple data sets together in Tableau, then don’t worry because I’ve provided the steps for you below.

  1. Assuming you’ve already uploaded data from your database, you’ll need to add your new Excel file with the fiscal calendar dates defined
  2. Select the Data tab in the top left corner –> Connect to Data –> Microsoft Excel –> Select the Excel file from its folder
  3. Once the data is loaded, choose the Data tab in the top left corner again and select Edit Relationships
  4. Select the primary data source you want to connect to (this should be your master data set, not the Excel data set) –> Select your secondary data set (the Excel file) –> Check whether the linking columns are automatically identified –> If not, select the Custom radio button –> Select the Add… button –> Choose the date field from your primary source and the date field from your secondary source –> Select OK –> Select OK
  5. Once the two data sources are connected on the date fields, you can drag fiscal week, month, etc. into your report while pulling metric data from your primary source

Now you should be able to view your company data by the fiscal calendar defined in your Excel file. Having covered the steps needed to create a fiscal calendar with Excel, I will now teach you how to create the same view inside your Tableau workbook.

To build the fiscal calendar within Tableau, you’ll use calculated fields to define the date ranges for fiscal week, month, quarter, and year. To create a calculated field, right-click in the Data pane and select Create Calculated Field. In the Name: box, label the calculated field intuitively so it’s easy to locate (for example: Date Fiscal Week, Fiscal Week Date, etc.). In the formula box, use IF and ELSEIF statements to define the date ranges for each fiscal segment you’d like to see. I’ve provided code below showing how a fiscal month can be defined within Tableau.

Fiscal Month Code:

IF  [Date]  >=DATETIME(‘1/6/2015’) AND [Date]  < DATETIME(‘2/3/2015’) THEN  1
ELSEIF [Date] >=DATETIME(‘2/3/2015’) AND [Date] <DATETIME(‘3/3/2015’) THEN 2
ELSEIF [Date] >=DATETIME(‘3/3/2015’) AND [Date] <DATETIME(‘4/7/2015’) THEN 3
ELSEIF [Date] >= DATETIME('4/7/2015') AND [Date] < DATETIME('5/5/2015') THEN 4
ELSEIF [Date] >= DATETIME('5/5/2015') AND [Date] < DATETIME('6/2/2015') THEN 5
ELSEIF [Date] >= DATETIME('6/2/2015') AND [Date] < DATETIME('7/7/2015') THEN 6
ELSEIF [Date] >= DATETIME('7/7/2015') AND [Date] < DATETIME('8/4/2015') THEN 7
ELSEIF [Date] >= DATETIME('8/4/2015') AND [Date] < DATETIME('9/1/2015') THEN 8
ELSEIF [Date] >= DATETIME('9/1/2015') AND [Date] < DATETIME('10/6/2015') THEN 9
ELSEIF [Date] >= DATETIME('10/6/2015') AND [Date] < DATETIME('11/3/2015') THEN 10
ELSEIF [Date] >= DATETIME('11/3/2015') AND [Date] < DATETIME('12/1/2015') THEN 11
ELSEIF [Date] >= DATETIME('12/1/2015') AND [Date] < DATETIME('1/5/2016') THEN 12
ELSEIF [Date] >= DATETIME('1/5/2016') AND [Date] < DATETIME('2/2/2016') THEN 1
ELSEIF [Date] >= DATETIME('2/2/2016') AND [Date] < DATETIME('3/1/2016') THEN 2
ELSEIF [Date] >= DATETIME('3/1/2016') AND [Date] < DATETIME('4/5/2016') THEN 3
ELSEIF [Date] >= DATETIME('4/5/2016') AND [Date] < DATETIME('5/3/2016') THEN 4
ELSEIF [Date] >= DATETIME('5/3/2016') AND [Date] < DATETIME('6/7/2016') THEN 5
ELSEIF [Date] >= DATETIME('6/7/2016') AND [Date] < DATETIME('7/5/2016') THEN 6
ELSEIF [Date] >= DATETIME('7/5/2016') AND [Date] < DATETIME('8/2/2016') THEN 7
ELSEIF [Date] >= DATETIME('8/2/2016') AND [Date] < DATETIME('9/6/2016') THEN 8
ELSEIF [Date] >= DATETIME('9/6/2016') AND [Date] < DATETIME('10/4/2016') THEN 9
ELSEIF [Date] >= DATETIME('10/4/2016') AND [Date] < DATETIME('11/1/2016') THEN 10
ELSEIF [Date] >= DATETIME('11/1/2016') AND [Date] < DATETIME('12/6/2016') THEN 11
ELSEIF [Date] >= DATETIME('12/6/2016') AND [Date] < DATETIME('1/3/2017') THEN 12
ELSEIF [Date] >= DATETIME('1/3/2017') AND [Date] < DATETIME('2/7/2017') THEN 1
ELSEIF [Date] >= DATETIME('2/7/2017') AND [Date] < DATETIME('3/7/2017') THEN 2
ELSEIF [Date] >= DATETIME('3/7/2017') AND [Date] < DATETIME('4/4/2017') THEN 3
ELSEIF [Date] >= DATETIME('4/4/2017') AND [Date] < DATETIME('5/2/2017') THEN 4
ELSEIF [Date] >= DATETIME('5/2/2017') AND [Date] < DATETIME('6/6/2017') THEN 5
ELSEIF [Date] >= DATETIME('6/6/2017') AND [Date] < DATETIME('7/4/2017') THEN 6
ELSEIF [Date] >= DATETIME('7/4/2017') AND [Date] < DATETIME('8/1/2017') THEN 7
ELSEIF [Date] >= DATETIME('8/1/2017') AND [Date] < DATETIME('9/5/2017') THEN 8
ELSEIF [Date] >= DATETIME('9/5/2017') AND [Date] < DATETIME('10/3/2017') THEN 9
ELSEIF [Date] >= DATETIME('10/3/2017') AND [Date] < DATETIME('11/7/2017') THEN 10
ELSEIF [Date] >= DATETIME('11/7/2017') AND [Date] < DATETIME('12/5/2017') THEN 11
ELSEIF [Date] >= DATETIME('12/5/2017') AND [Date] < DATETIME('1/2/2018') THEN 12
END
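If you'd rather generate these fiscal month boundaries programmatically than hand-type them, here's a minimal Python sketch. It assumes, based on the dates in the calculated field above, that each fiscal month begins on the first Tuesday of its calendar month; verify that assumption against your own company's fiscal calendar before relying on it.

```python
from datetime import date, timedelta

def first_tuesday(year, month):
    """Return the first Tuesday of the given month."""
    d = date(year, month, 1)
    # weekday(): Monday=0, Tuesday=1, ...
    offset = (1 - d.weekday()) % 7
    return d + timedelta(days=offset)

def fiscal_month(d):
    """Fiscal month number, assuming each fiscal month begins
    on the first Tuesday of its calendar month (hypothetical rule
    inferred from the boundary dates above)."""
    start = first_tuesday(d.year, d.month)
    if d >= start:
        return d.month
    # Before this month's first Tuesday: still in the prior fiscal month
    return 12 if d.month == 1 else d.month - 1

print(fiscal_month(date(2016, 1, 5)))   # 1  (a boundary date from the calc above)
print(fiscal_month(date(2016, 1, 4)))   # 12 (the day before that boundary)
```

A script like this can also emit the full ELSEIF block as text, which is handy when you need to extend the calculated field by another fiscal year.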

I've provided sample code above to show what a fiscal calendar month calculation looks like. Of course, you'd repeat this step, modifying the code, to create fiscal week, fiscal quarter, and fiscal year calculations. In my experience, it takes about 30 minutes to an hour to create the fiscal calendar dates, regardless of whether you use the Excel method or the calculated field method. Trust me, this is well worth the time investment to ensure your reporting accurately shows the year-over-year performance your superiors will want to see. Best of luck with your new reporting technique, and continue to check back in with White Rabbit Analytics to learn more powerful data visualization techniques.

It Takes a Village – Organizational Buy-In Required for Impactful Analytics

As a new father, I've often heard the phrase that it takes a village to raise a child. I've learned that my wife and I can't do it on our own; we need to rely on family and friends to do the best job we can. As an analytics professional, I've also learned that it takes a village to produce impactful analytical insights. What does that mean exactly? It means that the analytics practitioner can't extract the insights all on their own. They must rely on the rest of the organization to fill in the knowledge gaps within the analytical research to deliver truly impactful insights. This article will focus on the role leadership plays in guiding analytical projects, the role key stakeholders play in adopting analytical insights, and the role the rest of the organization plays in implementing analytical solutions. Now let's take a look at the important role leadership plays in implementing effective analytics.

I can't express enough the importance senior leadership plays in any successful implementation project. Leadership must set the tone for the rest of the organization on why analytical thinking matters. If your CEO's message is that they require data-driven solutions to solve company problems, then the Vice Presidents will require data-driven insights, then Directors, then Managers, and finally front line associates. However, if the CEO never requires data to back decision making, why would the next level ask for it, and so on down the chain? The point is, as an analytical professional it is your duty to communicate to top leadership the importance of analytics and the organization's accountability for the infrastructure required to deliver best-in-class analytics. This means that leadership must not only believe in the power of analytics, but also invest in the infrastructure to deliver it. The proper infrastructure for analytics requires clean, available data, tools to work with the data, and talent that understands how to use the data. Now that you understand the importance of leadership backing analytics projects, let's discuss the role key stakeholders play in adopting analytical insights.

First and foremost, you must understand who your key stakeholders are on any analytical project to ensure impactful insights can be delivered. This means that not only should the requestor of the analysis be involved in the feedback loop, but also all areas of the business impacted by a potential insight. I've found that analytical interviews can be a great way to identify who the key stakeholders are. Start with a list of clarifying business questions for the requestor to understand the potential impact and which people may be affected. Within this list of questions, you should also ask about other departments or individuals who may be working on a similar project or one that preceded this one. These clarifying business questions will identify your list of key stakeholders, all of whom you should involve from the very beginning of the project. Holding an analytical workshop with the identified key stakeholders will make them feel more involved in the process and more willing to work together toward a unified solution. As the analyst, your role will be to provide objective, data-driven information to help support or refute theories about the issue. Using problem-solving models such as fishbone diagrams or the five whys will capture insights in a logical manner and pinpoint the drivers of the problem at hand. Once the list of issue drivers has been refined, the workshop can begin testing process improvement theories and determine the best solution to the problem. Now that you understand the importance of key stakeholders adopting analytical solutions, it's time to cover what the rest of the organization must do to implement them.

Any analytical solution is only as good as its implementation, and implementation can't happen without the help of managers and front line associates. This is why consistent communication from leadership that analytics matters is essential: it establishes the employee mindset that analytical insights will be part of day-to-day operations. The messaging has to be consistent and frequent from the top down. This also brings back the point that the data infrastructure must be sound. If company employees question the integrity of the data sources, they will naturally question the analytical insights. The reporting environment needs to be accurate, optimized, and intuitive. Employees aren't afraid of data; they are afraid of working with antiquated tools that make their jobs more difficult and inefficient. If you provide them data-driven tools that deliver the analytical information leadership wants them to have, they will adopt them. Why? Because those tools show them what their goals are and how close they are to achieving them. If you want productive employees, give them a target, let them know where they stand, and provide them the tools to drive actual KPIs beyond their target KPIs.

In summary, analytics and reporting are powerful tools to align a company with its targets. Everyone from the CEO down to front line associates wants to see the company succeed. Where things go wrong is when there is miscommunication from the top and a lack of transparency throughout the organization about the company's direction. So, communicate to the leadership team the importance of analytics, and let them know how important it is to routinely message the need for it. Get key stakeholders involved by holding workshops that make them feel engaged and impactful. Provide front line associates with interactive, insightful, and intuitive tools that give them visibility into their goals and information about how to reach them. If you do these three things for every analytical project, then you will be closer to hitting or exceeding your goals today than you were yesterday.