# When are Toronto's water mains likely to burst? Predict with Time Series Exploration in University Edition

Posted 03-04-2016 09:54 AM; edited 03-07-2016 10:13 AM


I was going to write a Part II to an earlier Survival Analysis article, "How to Survive Survival Analytics in University Edition," but ran into technical difficulties with my computer, so that article is on hold. Luckily, I started a project at work using the Forecasting tasks, so I decided to take this opportunity to dig into them further. Please note I'm learning this as I go, so if you have any experience, I'd appreciate your insight!

Get the Data

I’m using the City of Toronto’s Open Data set on water main breaks, available here; you can read about the data here.  As I lived in the city for over 20 years, and have worked in it since I was a teenager, water main breaks are something I’ve seen dozens of times. I wanted to see what interesting things I could find by digging into it, so let’s get started!

How to go about getting SAS University Edition

If you don’t already have University Edition, get it here and follow the instructions from the pdf carefully. If you need help with almost any aspect of using University Edition, check out these video tutorials. Additional resources are available in this article.

After I imported the data, I ran a simple SQL script to aggregate the data; each break is a separate row, but I want it as a count by day, so I used this:

```
proc sql;
   /* collapse one row per break into a count of breaks per day */
   create table work.import2 as
   select break_date, count(*) as Total
   from work.import
   group by break_date
   order by break_date;
quit;
```

Our data is now ready to be explored!

The Results

Within the Forecasting group of tasks, there are three different functionalities you can use – Time Series Data Preparation, Time Series Exploration, and Modeling and Forecasting. We’ll go through each in the next couple of articles, but I wanted to start off with Time Series Exploration.

Our first step is to set up the task, which is relatively straightforward: our data is WORK.IMPORT2 (as created above), the Dependent variable is TOTAL, the accumulation method is Sum (we want to sum the counts as the data is grouped), and the Time ID is break_date.
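For readers who prefer code, the task setup above corresponds roughly to a PROC TIMESERIES call like this (a minimal sketch; the exact code the task generates may differ):

```sas
/* Plot the daily break counts accumulated to a yearly series */
proc timeseries data=work.import2 plots=(series);
   id break_date interval=year accumulate=total;  /* sum TOTAL within each year */
   var total;
run;
```

Changing the task's Interval option simply changes INTERVAL= (for example, interval=qtr or interval=month).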

Click on the Analyses tab, and check off the three different Series Plots; we won’t go through everything (as there’s an awful lot!) but we’ll touch on some of the more interesting things in the data.

After you’ve run the code, you’ll get a graph that looks like this:

There was clearly a spike in 1994, and a huge drop in 2012. This is a great start as it tells us there is something very interesting going on in the data; let’s keep digging!

If you refer back to the image with the DATA tab, you'll notice at the bottom we've set the Interval to YEAR. Change that to Quarter (nothing else needs to be modified) and re-run the task.

We can clearly see a huge jump in the fourth quarter of 1994. There also seem to be consistently higher spikes at regular points, so let's see if there's a seasonal pattern. Scrolling down in your Results window, you should see a second graph like this:

Wow, what a clear difference between Quarters 1 and 2! January through March is clearly the period when water main breaks are most likely; this makes sense, as the winter deep freeze has set in, and toward March, freeze/thaw cycles may also cause breaks.

Let’s take it down one more level – change the Interval from the second image to Month, and once again re-run the code.  Scrolling down, you’ll see this graph:

Over 600 water main breaks in a single month? That’s an average of 20 a day!  I feel bad for the poor city workers who would have had to work in the freezing cold.

Once again scrolling down in the Results window, we see even more clearly the January spike, with the drop down into May – August.

The data set does not include Neighbourhood IDs or anything else that easily links to other City of Toronto data sets I've used. Although geographic coordinates are provided, they would need to be converted before any further geospatial analysis can be done. The next article will dig into additional data exploration features and then move on to preparing the data for actual forecasting.

Did you find something else interesting in this data? Share in the comments. I’m glad to answer any questions.

Need data for learning?

The SAS Communities Library has a growing supply of free data sources that you can use in your training to become a data scientist. The easiest way to find articles about data sources is to type "Data for learning" in the communities site search field like so:

We publish all articles about free data sources under the Analytics U label in the SAS Communities Library. Want email notifications when we add new content? Subscribe to the Analytics U label by clicking "Find A Community" in the right nav and selecting SAS Communities Library at the bottom of the list. In the Labels box in the right nav, click Analytics U:

Click Analytics U, then select "Subscribe" from the Options menu.

Happy Learning!

by
on ‎03-05-2016 01:32 AM

Can't imagine the climate to have pipes freezing... wow!

I assume the purple lines in the seasonal charts are for 1994. I was curious to know what year the brown line was as it seems to be consistently higher. Would be interesting to look at this against local weather data too. :smileywink:

by
‎03-05-2016 10:52 AM - edited ‎03-05-2016 02:16 PM

Great points @MichelleHomes – thanks for making me think on a Saturday morning, before I've even had coffee :-).

I should note as well that I’m trying to get the Neighbourhood IDs so that I can narrow down the areas (I’d love to see if there’s a trend for Neighbourhood / Weather / Breaks) but it doesn’t look promising.

I went into the Exploration task again; under Statistics there's a "Seasonal Statistics" option (second image). I used the same settings as before, changing the interval from Year to Month:

Image2:

When I check off this option and re-run the task, I get:

...

Which gives you a sense of where the fluctuations are (the N is the number of days in the quarter with a water main breakage, and the sum is the total number of breaks).
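In code, the task's Seasonal Statistics option corresponds roughly to the OUTSEASON= output table of PROC TIMESERIES (a sketch; the table name work.season_stats is my own, not the task's):

```sas
/* Write seasonal statistics (N, sum, mean, etc. per season) to a table */
proc timeseries data=work.import2 outseason=work.season_stats;
   id break_date interval=month accumulate=total;
   var total;
run;
```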

In the actual application, the graphs have a “mouseover” feature where you can get specific details about the graph; in this case, the brown line points to 1991.

For the climate data, I wondered about this myself.  I found the Government of Canada’s Weather Open Data Page and downloaded the data for 1990-2014 to match the time period for the breakage data.  I then merged everything in Excel and imported the single file into SAS UE.  I should note here that the Temperature data stops for some reason after 2003, so I’m only going to show the graph for the limited dataset.

```
proc sql;
   /* pair each day's break count with that day's temperature readings */
   create table work.import3 as
   select a.mintemp, a.maxtemp,
          b.total, b.break_date,
          year(b.break_date) as yr, qtr(b.break_date) as qtr
   from work.import1 a, work.import2 b
   where b.break_date = a.date_time;  /* inner join on date */
quit;
```

What I’ve done here is created a new table (work.import3) where I’m taking the Minimum and Maximum Temperatures from the Climate Data, and the Total Breaks and Break Date from the Watermain data.  I’m extracting the Year and the Quarter from the Break_date as well, as that’s the time period we want to focus on.  I join the two data sets based on the date (if you recall my articles on PROC SQL, this would be an inner join where each table has a match).
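The same join can also be written with explicit INNER JOIN ... ON syntax, which makes the match condition stand out (an equivalent sketch using the same tables and columns):

```sas
proc sql;
   create table work.import3 as
   select a.mintemp, a.maxtemp,
          b.total, b.break_date,
          year(b.break_date) as yr, qtr(b.break_date) as qtr
   from work.import1 a
        inner join work.import2 b
           on b.break_date = a.date_time;  /* keep only dates present in both tables */
quit;
```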

This next bit of code is a bit tricky and I’ll explain the logic as best as I can.

```
proc sql;
   create table work.import4 as
   select put(yr,4.) || ':' || put(qtr,1.) as YrQtr, /* concatenate year and quarter */
          avg(mintemp) as avg_min,
          avg(maxtemp) as avg_max,
          avg(maxtemp - mintemp) as avg_range,
          sum(total) as break_count
   from work.import3
   where mintemp is not missing   /* exclude rows with missing temperatures */
     and maxtemp is not missing
   group by yr, qtr
   order by yr, qtr;
quit;
```

I'm creating a new table, work.import4, and because we want the quarter as a categorical variable, I combine the year and the quarter (separated by a : for easier reading). I average the minimum and maximum temperatures, but I also have a theory that it's the fluctuation in temperature that causes the breaks: if the temperature drops and freezes the pipes, and then warms up, I hypothesise the cycle stresses the pipes. As I mentioned, temperature data is missing from 2003 Q2 through 2014, so I exclude those rows by filtering out missing values.

After some data exploration, I establish that Q3 and Q4 are the quarters we want to focus on, and that a line/bar graph is the best way to visualise this data. I use a WHERE statement (the % is the wildcard) to limit the output to Q3 and Q4, because these are the periods I want to focus on.
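That WHERE filter looks something like this (a sketch; the output table name work.import5 is mine, and the pattern assumes the YrQtr values built earlier, e.g. '1994:4'):

```sas
/* Keep only Q3 and Q4 rows; % matches any year prefix */
proc sql;
   create table work.import5 as
   select *
   from work.import4
   where YrQtr like '%:3' or YrQtr like '%:4';
quit;
```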

At the very bottom of the DATA screen (not shown), you can choose “SUM” or “MEAN”; because each quarter is a separate row, it doesn’t matter which one we choose.  I should also mention that I went into the Options tab and changed the line width to 3 (not shown).

Here’s the graph:

What I find interesting is that when the Q3 temperature fluctuation is high (more than 8 degrees Celsius) and the Q4 fluctuation is lower (less than 6 degrees Celsius), Q4 has a higher rate of water main breaks. This pattern seems to shift around 1998 and then rapidly drops off in 2000; unfortunately, I don't think I have enough data to reach any further conclusions, but it definitely raises some interesting questions.

Thanks for asking; hopefully this showed a bit more of what you can do within the software, and answers your questions!

Chris

by
on ‎03-05-2016 10:25 PM

Amazing @DarthPathos to see your curiosity piqued! :smileywink: And before your Saturday morning coffee too...

Interesting analysis and your deduction about temperature fluctuation being the most probable cause.

Thanks for exploring this some more and sharing the tasks. Nice to see the variety of transformation options in the Exploration Tasks too.

Hope you are having a good weekend!

Kind Regards,

Michelle

by
on ‎03-05-2016 10:29 PM

I have no shortage of curiosity but can definitely have it pointed in a new direction :-)  There was actually a water main break 2 weeks ago in one of the Emergency Rooms at one of the busiest hospitals in Toronto - massive flooding, and all the patients had to be transferred elsewhere.  It was nuts.  I guess this sort of thing doesn't happen "down under" eh?  (On the other hand, you have spiders that are bigger than a large pizza, so I'll take broken pipes any day :-))

by
on ‎03-05-2016 10:36 PM

Oh my! Certainly is different to here.

Spiders aren't that large, maybe as big as a palm. We get a few snakes around but they tend to go for other wildlife. http://www.abc.net.au/news/2016-02-27/snake-eats-possum-in-broad-daylight/7204928

by
on ‎03-05-2016 10:39 PM

Snakes don't bother me - it's the spiders that would make me scream.  Large pizza, palm, whatever - I see one the size of a bottle cap and I am gone ;-)

by
on ‎03-06-2016 06:54 AM
PMFJI, but after viewing the killer snake story from @MichelleHomes link, I noticed the link to this story, which seemed appropriate to your spider vs snake discussion. http://mobile.abc.net.au/news/2016-02-11/daddy-long-legs-spider-wins-battle-brown-snake/7154766
by
on ‎03-06-2016 07:00 AM

@Quentin *shudder* I always knew spiders were smarter than we gave them credit for......

(See you at SGF?  Haven't seen you since NESUG in Baltimore!)

by
on ‎03-06-2016 07:12 AM
Yes, happy to say I'll be traveling to Vegas. Looking forward to seeing you and many more. I noticed this blog post mentioned your upcoming book, congrats!
http://blogs.sas.com/content/sastraining/2016/03/03/what-do-star-wars-and-survival-analysis-have-in-...
by
on ‎03-06-2016 09:38 AM

Looking forward to seeing you!  Thanks for the congrats - one of the most amazing and stressful things I've worked on in a while :-)

by
on ‎03-06-2016 01:30 PM

I remember reading the story of the daddy long legs when it came out. I didn't want to add to @DarthPathos's fear... Hope you're ok Chris.

by
on ‎03-06-2016 01:52 PM

I....I'll be ok....(will be sharing it online though because why should I be the only one to suffer :P)
